Search results for: retrospective cohort study

48796 Kidney Stones in Individuals Living with Diabetes Mellitus at King Abdul-Aziz Medical City - Tertiary Care Center, Jeddah, Saudi Arabia: A Retrospective Cohort Study

Authors: Suhaib Radi, Ibrahim Basem Nafadi, Abdullah Ahmed Alsulami, Nawaf Faisal Halabi, Abdulrhman Abdullah Alsubhi, Sami Wesam Maghrabi, Waleed Saad Alshehri

Abstract:

Background: Kidney stones impose a substantial burden on affected individuals. How kidney stone characteristics (size, presence of obstruction, and modality of treatment) differ between stone formers with and without diabetes has not been well explored in the literature, to the best of the authors' knowledge. Our goal was to investigate this correlation between diabetes and kidney stones by conducting a retrospective cohort study to evaluate the effects of this condition and the occurrence of complications in adults with diabetes in Saudi Arabia compared with a non-diabetic control group. Methodology: This is a retrospective cohort study evaluating the effects of kidney stones in stone formers among adults diagnosed with type 2 diabetes mellitus and adults without diabetes between 2017 and 2019 in Jeddah, Saudi Arabia. IRB approval was granted for this study. Data were collected from the 1st of December 2022 until the 1st of March 2023 and analyzed using SPSS. Results: A total of 254 individuals diagnosed with kidney stones were included, 127 of whom were adults with type 2 diabetes and 127 of whom were non-diabetics. Individuals with diabetes had larger kidney stones than individuals without diabetes (13.12 mm vs. 10.53 mm, p-value = 0.03). Moreover, individuals with hypertension and dyslipidemia also had significantly larger stones. On the other hand, no significant difference was found in the presence of obstruction or modality of treatment between the two groups. Conclusion: This study, conducted in Saudi Arabia, found that individuals with kidney stones who also had diabetes formed larger kidney stones and more often had other comorbidities such as hypertension, dyslipidemia, obesity, and renal disease. These findings could assist in the future primary and secondary prevention of renal stones.

Keywords: kidney stone, type 2 DM, metabolic syndrome, lithotripsy

Procedia PDF Downloads 83
48795 Total Parenteral Nutrition Wastage: A Retrospective Cohort Study in a Small District General Hospital

Authors: Muhammad Faizan Butt, Maria Ambreen Tahir, Joshua James Pilkington, A. A. Warsi

Abstract:

Background: Total parenteral nutrition (TPN) use within the NHS is crucial in the prevention of malnourishment. TPN prescriptions are tailored to an individual patient’s needs. TPN bags come in fixed sizes, and minimizing wastage has financial and sustainability implications for the health service. The aim of the study was to assess current prescribing practices, quantify the volume of TPN wastage and identify reasons for it. Methodology: A retrospective cohort study of TPN prescriptions over a period of 1 year (Jan-Dec 2022) was performed. All patients prescribed TPN who had been admitted under a general surgery consultant in a small district hospital were included. Data were extracted from hospital electronic records and dietician charts. Data were described, and reasons for TPN wastage were explored. Results: 49 patients were identified. The median length of TPN prescription was 8 days. This totaled 608 prescribed bags. Of the bags prescribed, 258, 169, and 181 were 10g (2500ml), 14g (2000ml), and 18g (2000ml) bags, respectively. The mean volume wasted from each type of bag was 634ml, 634ml, and 648ml, respectively. Reasons for TPN wastage identified were: no loss (25%), smaller bags not available (53.6%), step-down regime (8.1%), and other (12.2%). Conclusion: This study identified that the current stocking and prescribing of TPN within a district general hospital leads to a significant average wastage of 638.2ml per prescribed bag. The commonest reason for wastage is the non-availability of a more appropriately sized bag.
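
As a quick arithmetic check on the reported figure, the 638.2 ml average is the count-weighted mean of the per-bag wastage across the three bag sizes; a minimal sketch using only the numbers quoted in the abstract:

```python
# Count-weighted mean wastage per prescribed bag, using the figures quoted in the abstract
counts = [258, 169, 181]      # number of 10g, 14g and 18g bags prescribed
wasted_ml = [634, 634, 648]   # mean volume wasted per bag, by bag type
overall = sum(c * w for c, w in zip(counts, wasted_ml)) / sum(counts)
print(round(overall, 1))      # 638.2
```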

Keywords: general surgery, TPN, sustainability, wastage

Procedia PDF Downloads 46
48794 Risk of Mortality and Spectrum of Second Primary Malignancies in Mantle Cell Lymphoma before and after Ibrutinib Approval: A Population-Based Study

Authors: Karthik Chamari, Vasudha Rudraraju, Gaurav Chaudhari

Abstract:

Background: Mantle cell lymphoma (MCL) is one of the mature B-cell non-Hodgkin lymphomas (NHL). The course of MCL is moderately aggressive and variable, and it has a median overall survival of 8 to 10 years. Ibrutinib, a Bruton’s tyrosine kinase inhibitor, was approved by the United States (US) Food and Drug Administration in November 2013 for the treatment of MCL patients who have received at least one prior therapy. In this study, we aimed to evaluate whether there has been a change in survival and in the patterns of second primary malignancies (SPMs) among the MCL population in the US after ibrutinib approval. Methods: Using the National Cancer Institute’s Surveillance, Epidemiology, and End Results (SEER)-18 registries, we conducted a retrospective study of patients diagnosed with MCL (ICD-O-3 code 9673/3) between 2007 and 2018. We divided patients into two six-year cohorts, pre-ibrutinib approval (2007-2012) and post-ibrutinib approval (2013-2018), and compared relative survival rates (RSRs) and standardized incidence ratios (SIRs) of SPMs between cohorts. Results: We included 9,257 patients diagnosed with MCL between 2007 and 2018 in the SEER-18 survival and SIR registries. Of these, 4,205 (45%) patients were included in the pre-ibrutinib cohort, and 5,052 (55%) patients were included in the post-ibrutinib cohort. The median follow-up duration was 54 months (range 0 to 143 months) for the pre-ibrutinib cohort and 20 months (range 0 to 71 months) for the post-ibrutinib cohort. There was a significant difference in the five-year RSRs between the pre-ibrutinib and post-ibrutinib cohorts (57.5% vs. 62.6%, p < 0.005). Out of the 9,257 patients diagnosed with MCL, 920 developed SPMs. A higher proportion of SPMs occurred in the post-ibrutinib cohort (63%) than in the pre-ibrutinib cohort (37%). Non-hematological malignancies comprised most of all SPMs. A higher incidence of non-hematological malignancies occurred in the post-ibrutinib cohort (SIR 1.42, 95% CI 1.29 to 1.56) than in the pre-ibrutinib cohort (SIR 1.14, 95% CI 1 to 1.3). There was a statistically significant increase in the incidence of cancers of the respiratory tract (SIR 1.77, 95% CI 1.43 to 2.18) and urinary tract (SIR 1.61, 95% CI 1.23 to 2.06) compared with other non-hematological malignancies in the post-ibrutinib cohort. Conclusions: Our results suggest that relative survival rates have increased since the approval of ibrutinib for mantle cell lymphoma patients. Additionally, for unclear reasons, the incidence of SPMs (non-hematological malignancies), mainly cancers of the respiratory and urinary tracts, has increased in the six years following the approval of ibrutinib. Further studies should be conducted to determine the cause of these findings.
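
For readers unfamiliar with the standardized incidence ratio, it is the observed count of second cancers divided by the count expected from reference population rates, usually reported with an exact Poisson confidence interval. The sketch below is a minimal illustration with hypothetical observed and expected counts; it is not the SEER*Stat procedure the authors used.

```python
from scipy.stats import chi2

def sir_with_ci(observed: int, expected: float, alpha: float = 0.05):
    """Standardized incidence ratio with an exact Poisson confidence interval."""
    sir = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lower, upper

# Hypothetical counts chosen only to illustrate the calculation
print(sir_with_ci(observed=330, expected=232.4))  # SIR ~1.42 with its 95% CI
```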

Keywords: mantle cell lymphoma, Ibrutinib, relative survival analysis, secondary primary cancers

Procedia PDF Downloads 159
48793 Effect of Atrial Flutter on Alcoholic Cardiomyopathy

Authors: Ibrahim Ahmed, Richard Amoateng, Akhil Jain, Mohamed Ahmed

Abstract:

Alcoholic cardiomyopathy (ACM) is a type of acquired cardiomyopathy caused by chronic alcohol consumption. ACM is frequently associated with arrhythmias such as atrial flutter. Our aim was to characterize the patient demographics and investigate the effect of atrial flutter (AF) on ACM. This was a retrospective cohort study using the 2019 Nationwide Inpatient Sample database to identify admissions in adults with principal and secondary diagnoses of alcoholic cardiomyopathy and atrial flutter. Multivariate linear and logistic regression models were adjusted for age, gender, race, household income, insurance status, Elixhauser comorbidity score, hospital location, bed size, and teaching status. The primary outcome was all-cause mortality, and secondary outcomes were length of stay (LOS) and total charge in USD. There were a total of 21,855 admissions with alcoholic cardiomyopathy, of which 1,635 had atrial flutter (AF-ACM). Compared to the Non-AF-ACM cohort, the AF-ACM cohort had fewer females (4.89% vs 14.54%, p<0.001), was older (58.66 vs 56.13 years, p<0.001), included fewer Native Americans (0.61% vs 2.67%, p<0.01), was treated less often in small (19.27% vs 22.45%, p<0.01) and medium-sized hospitals (23.24% vs 28.98%, p<0.01) and more often in large hospitals (57.49% vs 48.57%, p<0.01), was more often Medicare-insured (40.37% vs 34.08%, p<0.05) and less often Medicaid-insured (23.55% vs 33.70%, p<0.001), and had a lower prevalence of hypertension (10.7% vs 15.01%, p<0.05) and more obesity (24.77% vs 16.35%, p<0.001). Compared to the Non-AF-ACM cohort, there was no difference in the AF-ACM cohort's mortality rate (6.13% vs 4.20%, p=0.0998), unadjusted mortality OR 1.49 (95% CI 0.92-2.40, p=0.102), or adjusted mortality OR 1.36 (95% CI 0.83-2.24, p=0.221), but LOS was longer by 1.23 days (95% CI 0.34-2.13, p<0.01) and total charge higher by $28,860.30 (95% CI 11,883.96-45,836.60, p<0.01). In patients admitted with ACM, the presence of AF was not associated with a higher all-cause mortality rate or odds of all-cause mortality; however, it was associated with a 1.23-day increase in LOS and a $28,860.30 increase in total hospitalization charge. Native American race, older age and obesity were risk factors for the presence of AF in ACM.
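
The adjusted odds ratios above come from multivariable logistic regression on discharge records. As a rough sketch of how such a model is commonly fitted (here with statsmodels in Python; the dataset, column names and the omission of NIS survey weights are illustrative assumptions, not the authors' actual analysis):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical admission-level extract; column names are illustrative only
df = pd.read_csv("acm_admissions.csv")

# In-hospital mortality adjusted for the covariates listed in the abstract
# (NIS survey weights and clustering are omitted here for brevity)
model = smf.logit(
    "died ~ atrial_flutter + age + female + C(race) + C(income_quartile) "
    "+ C(insurance) + elixhauser_score + C(bed_size) + teaching_hospital",
    data=df,
).fit()

aor = np.exp(model.params)      # adjusted odds ratios
ci = np.exp(model.conf_int())   # 95% confidence intervals on the OR scale
print(pd.concat([aor.rename("aOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```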

Keywords: alcoholic cardiomyopathy, atrial flutter, cardiomyopathy, arrhythmia

Procedia PDF Downloads 91
48792 Retrieval-Induced Forgetting Effects in Retrospective and Prospective Memory in Normal Aging: An Experimental Study

Authors: Merve Akca

Abstract:

Retrieval-induced forgetting (RIF) refers to the phenomenon that selective retrieval of some information impairs memory for related, but not previously retrieved, information. Although age differences in retrieval-induced forgetting have been documented for retrospective memory, this research aimed to examine age differences in RIF of prospective memory tasks for the first time. Using the retrieval-practice paradigm, this study comparatively examined RIF effects in retrospective memory and event-based prospective memory in young and old adults. In this experimental study, a mixed factorial design was employed, with age group (Young, Old) as a between-subject variable, and memory type (Prospective, Retrospective) and item type (Practiced, Non-practiced) as within-subject variables. Retrieval-induced forgetting was observed in the retrospective but not in the prospective memory task. The results therefore indicate that selective retrieval of past events led to suppression of other related past events in both age groups, but not to suppression of memory for future intentions.

Keywords: prospective memory, retrieval-induced forgetting, retrieval inhibition, retrospective memory

Procedia PDF Downloads 296
48791 Incidence of Cancer in Patients with Alzheimer's Disease: An 11-Year Nationwide Population-Based Study

Authors: Jun Hong Lee

Abstract:

Background: The incidence of Alzheimer's disease (AD) increases with age, and AD is characterized by the premature, progressive loss of neuronal cells. In contrast, cancer cells show inappropriate cell proliferation and resistance to cell death. Objective: We evaluated the association between cancer and AD and also examined specific types of cancer. Patients and Methods/Material and Methods: This retrospective, nationwide, longitudinal study used the National Health Insurance Service – Senior cohort (NHIS-Senior) 2002-2013, released by the KNHIS in 2016, comprising 550,000 subjects randomly selected from those aged over 60. The study included a cohort of 4,408 patients who were first diagnosed with AD between 2003 and 2005. To match each dementia patient, 19,150 subjects were selected from the database by propensity score matching. Results: We enrolled 4,790 patients for analysis in this cohort, and the prevalence of AD was higher in females (19.29%) than in males (17.71%). A higher prevalence of AD was observed in the 70-84 year age group and in the higher income status group. A total of 540 cancers occurred within the observation interval. Overall, cancer was less frequent in those with AD (12.25%) than in the controls (18.46%), with an HR of 0.704 (95% confidence interval (CI) = 0.64-0.775, p-value < 0.0001). Conclusion: Our data showed a decreased incidence of overall cancer in patients with AD, similar to previous studies. Patients with AD had a significantly decreased risk of colon and rectum, lung and stomach cancer. This finding is lower than, but consistent with, reports from Western countries. Further investigation of genetic evidence linking AD to cancer is needed.

Keywords: Alzheimer, cancer, nationwide, longitudinal study

Procedia PDF Downloads 158
48790 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study

Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy

Abstract:

Life expectancy in the United Kingdom (UK) has been nearly constant since 2010, particularly for individuals aged 65 years and older. This trend has also been noted in several other countries. The slowdown in the increase of life expectancy was concurrent with an increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960 inclusive, and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM compared to non-diabetic controls was 1.467 (1.381-1.558) for those diagnosed between 50 and 59 years and 1.38 (1.307-1.457) for those diagnosed between 60 and 74 years. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort) and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics. The steeper mortality hazard slope for the 1950-1960 birth cohort might indicate the sub-population contributing to the slowdown in the growth of life expectancy.
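
The abstract does not spell out the exact specification of the time-variant Gompertz-Cox frailty model, so the following is only a generic sketch of that model class for reference, with the frailty distribution assumed gamma:

$$ h_i(t \mid z_i) = z_i \,\lambda \exp(\gamma t)\, \exp\!\big(\mathbf{x}_i(t)^{\top}\boldsymbol{\beta}\big), \qquad z_i \sim \mathrm{Gamma}(1/\theta,\ 1/\theta), $$

where $\lambda \exp(\gamma t)$ is the Gompertz baseline hazard, $\mathbf{x}_i(t)$ collects (possibly time-varying) covariates such as diabetes status, birth cohort and comorbidities, $\boldsymbol{\beta}$ are log hazard ratios, and $z_i$ is an individual frailty term with mean 1 and variance $\theta$.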

Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy

Procedia PDF Downloads 104
48789 Prevalence of Adverse Events in Children and Adolescents on Antiretroviral Therapy: Examining the Pediatric Cohort in the Eastern Cape

Authors: Shannon Glaspy, Gerald Boon, Jack Lambert

Abstract:

Studies on adverse events (AE) of highly active antiretroviral therapy (HAART) in children and adolescents are rare. The aim of this study was to observe the frequency of treatment-limiting adverse drug reactions in relation to years on ARVs and specific ARV regimens. Methods: A retrospective cohort study was conducted in East London, South Africa. All patient files in the pediatric (0-18 years) ARV cohort were examined, selecting only those patients started on HAART. ARV regimen changes explicitly due to AE, age at ARV treatment onset, age at AE onset, and gender were extracted. Eligible subjects were obtained from patient folders, anonymized and cross-referenced with data obtained from electronic records. A total of 1120 patients [592 male (52.9%) and 528 female (47.1%)] were charted by incidence and year. Additional information was extracted in cases where the patient experienced lipodystrophy or lipoatrophy, to include the number of years on ARVs prior to the onset of the AE. Results: Of the 1120 HIV-infected children in the hospital cohort, a total of 105 AE (9.37% of subjects; 53.3% male) were deemed eligible for the study owing to completeness of medical history and agreement between electronic records and paper files. The AE cited were as follows: lipoatrophy 62 (5.53% of all subjects), lipodystrophy 27 (2.41%), neuropathy 9 (0.8%), anemia 2 (0.17%), Stevens-Johnson syndrome 1 (0.08%), elevated LFTs 1 (0.08%), breast hypertrophy 1 (0.08%), gastritis 1 (0.08%) and rash 1 (0.08%). The ARV regimens most frequently associated with the onset of AE were D4T/3TC/EFV, 72 cases (64.86% of all AE), and D4T/3TC/LOPr, 24 cases (21.62%). Lipoatrophy and lipodystrophy combined represent 84.76% (89 cases) of all adverse events documented in this cohort. Within the 60 cases of lipoatrophy, the average number of years on ARVs before the AE was 3.54, with 14 cases experiencing an AE between 0-2 years of HAART. Within the 29 cases of lipodystrophy, the average number of years on ARVs before the AE was 3.89, with 4 cases experiencing an AE between 0-2 years on HAART. The regimen D4T/3TC/EFV was associated with 43 cases (71.66%) of lipoatrophy and 21 cases (72.41%) of lipodystrophy. D4T/3TC/LOPr was associated with 15 cases (25%) of lipoatrophy and 7 cases (24.14%) of lipodystrophy. The frequency of AE associated with ARV regimens could be misrepresented due to the prevalence of different first-line regimens, which were not captured in this study, particularly with the systematic change of first-line drugs from D4T to ABC in 2010. Conclusion: In this descriptive study, we found that 9.37% of the cohort experienced AE significant enough to be treatment limiting. Lipoatrophy accounted for 59.04% of all documented AE. Overall, D4T/3TC/EFV was associated with 64.86% of all AE, 71.66% of lipoatrophy cases and 72.41% of lipodystrophy cases.

Keywords: ARV, adverse events, HAART, pediatric

Procedia PDF Downloads 170
48788 Relationship between Different Heart Rate Control Levels and Risk of Heart Failure Rehospitalization in Patients with Persistent Atrial Fibrillation: A Retrospective Cohort Study

Authors: Yongrong Liu, Xin Tang

Abstract:

Background: Persistent atrial fibrillation is a common arrhythmia closely related to heart failure. Heart rate control is an essential strategy for treating persistent atrial fibrillation, but the relationship between different heart rate control levels and the risk of heart failure rehospitalization remains poorly understood. Objective: The objective of the study was to determine the relationship between different levels of heart rate control in patients with persistent atrial fibrillation and the risk of readmission for heart failure. Methods: We conducted a retrospective dual-centre cohort study, collecting data from patients with persistent atrial fibrillation who received outpatient treatment at two tertiary hospitals in central and western China from March 2019 to March 2020. The collected data included age, gender, body mass index (BMI), medical history, and hospitalization frequency due to heart failure. Patients were divided into three groups based on their heart rate control levels: Group I with a resting heart rate of less than 80 beats per minute, Group II with a resting heart rate between 80 and 100 beats per minute, and Group III with a resting heart rate greater than 100 beats per minute. Readmission rates due to heart failure within one year after discharge were analyzed after 1:1 propensity score matching. Differences in readmission rates among the groups were compared using one-way ANOVA. The impact of different levels of heart rate control on the risk of readmission for heart failure was assessed using the Cox proportional hazards model. Binary logistic regression analysis was employed to control for potential confounding factors. Results: We enrolled a total of 1136 patients with persistent atrial fibrillation. One-way ANOVA showed differences in readmission rates among groups exposed to different levels of heart rate control. The readmission rates due to heart failure were: Group I (n=432), 31 (7.17%); Group II (n=387), 43 (11.11%); Group III (n=317), 90 (28.50%) (F=54.3, P<0.001). After 1:1 propensity score matching across the groups, 223 pairs were obtained. Analysis using the Cox proportional hazards model showed that, compared to Group I, the risk of readmission was 1.372 (95% CI: 1.125-1.682, P<0.001) for Group II and 2.053 (95% CI: 1.006-5.437, P<0.001) for Group III. Furthermore, binary logistic regression analysis, including digoxin use, hypertension, smoking, coronary heart disease, and chronic obstructive pulmonary disease (COPD) as independent variables, revealed that coronary heart disease and COPD also had a significant impact on readmission due to heart failure (p<0.001). Conclusion: A higher resting heart rate in patients with persistent atrial fibrillation was associated with a higher risk of heart failure rehospitalization. Reasonable heart rate control may significantly reduce the risk of heart failure rehospitalization.
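
As an illustration of the analysis pipeline the abstract describes (1:1 propensity score matching followed by a Cox proportional hazards model), the sketch below uses scikit-learn and lifelines in Python with hypothetical column names; it matches with replacement for simplicity, which is looser than a strict 1:1 match without replacement, and it is not the authors' actual code.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

# Hypothetical patient-level data; column names are illustrative only
df = pd.read_csv("af_cohort.csv")
covariates = ["age", "male", "bmi", "hypertension", "chd", "copd", "smoking", "digoxin"]

# 1. Propensity of being in the high-heart-rate group, given baseline covariates
exposed = df["hr_group"] == "III"
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], exposed)
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Nearest-neighbour matching on the propensity score (with replacement, for brevity)
treated, control = df[exposed], df[~exposed]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]]).copy()
matched["exposed"] = (matched["hr_group"] == "III").astype(int)

# 3. Cox proportional hazards model for time to heart failure readmission
cph = CoxPHFitter()
cph.fit(matched[["time_to_readmission", "readmitted", "exposed"] + covariates],
        duration_col="time_to_readmission", event_col="readmitted")
cph.print_summary()
```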

Keywords: heart rate control levels, heart failure rehospitalization, persistent atrial fibrillation, retrospective cohort study

Procedia PDF Downloads 51
48787 Real World Evidence as a Tool to Overcome the Lack of a Comparative Arm in Drug Evaluation in the Context of Rare Diseases

Authors: Mohamed Wahba

Abstract:

Objective: To build a comparative arm for product (X) in a specific gene-mutated advanced gastrointestinal cancer using real world evidence, in order to fulfill HTA requirements in drug evaluation. Methods: Data for product (X) were collected from a phase II clinical trial, while real world data for products (Y) and (Z) were collected from a US database. Real-world (RW) cohorts were matched to the clinical trial baseline characteristics using the weighting-by-odds method. Outcomes included progression-free survival (PFS) and overall survival (OS) rates. Study location and participants: international (product X, n=80) and from the USA (products Y and Z, n=73). Results: Two comparisons were made: trial cohort 1 (X) versus real-world cohort 1 (Z), and trial cohort 2 (X) versus real-world cohort 2 (Y). For first line, the median OS was 9.7 months (95% CI 8.6-11.5) and the median PFS was 5.2 months (95% CI 4.7-not reached) for real-world cohort 1. For second line, the median OS was 10.6 months (95% CI 4.7-27.3) and the median PFS was 5.0 months (95% CI 2.1-29.3) for real-world cohort 2. Results were statistically significant for the OS analysis but not for the PFS analysis. Conclusion: This study provided the clinical comparative outcomes needed for HTA evaluation.
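
The weighting-by-odds step is the key device here: a propensity model for trial membership is fitted on baseline characteristics, and each real-world patient is then weighted by the odds of trial membership so that the weighted external cohort resembles the trial population. A minimal sketch, with hypothetical column names and a weighted Cox comparison standing in for whatever survival comparison the authors actually ran:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

# Hypothetical pooled data: trial patients (in_trial=1) and real-world controls (in_trial=0)
df = pd.read_csv("pooled_cohorts.csv")
baseline = ["age", "ecog", "prior_lines", "liver_mets"]  # illustrative covariates

# Propensity of trial membership given baseline characteristics
ps = LogisticRegression(max_iter=1000).fit(df[baseline], df["in_trial"]).predict_proba(df[baseline])[:, 1]

# Weighting by odds: trial patients keep weight 1, real-world patients get the odds of trial membership
df["w"] = np.where(df["in_trial"] == 1, 1.0, ps / (1.0 - ps))

# Weighted Cox model comparing overall survival between the two arms
cph = CoxPHFitter()
cph.fit(df[["os_months", "os_event", "in_trial", "w"]],
        duration_col="os_months", event_col="os_event", weights_col="w", robust=True)
cph.print_summary()
```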

Keywords: real world evidence, pharmacoeconomics, HTA agencies, oncology

Procedia PDF Downloads 72
48786 A Retrospective Study on the Age of Onset for Type 2 Diabetes Diagnosis

Authors: Mohamed A. Hammad, Dzul Azri Mohamed Noor, Syed Azhar Syed Sulaiman, Majed Ahmed Al-Mansoub, Muhammad Qamar

Abstract:

There is a progressive increase in the prevalence of early-onset Type 2 diabetes mellitus. Early detection of Type 2 diabetes can enhance length and/or quality of life by reducing the severity and frequency of its long-term complications, or by preventing or delaying them. The study aims to determine the age of onset at first diagnosis of Type 2 diabetes mellitus. A retrospective study was conducted in the endocrine clinic at Hospital Pulau Pinang in Penang, Malaysia, from January to December 2016. Records of 519 patients with Type 2 diabetes mellitus were screened to collect demographic data and determine the age at first diabetes mellitus diagnosis. Patients were classified according to age at diagnosis, gender, and ethnicity. The study included 519 patients with a mean age of (55.6±13.7) years, 265 female (51.1%) and 254 male (48.9%). The ethnicity distribution was Malay 191 (36.8%), Chinese 189 (36.4%) and Indian 139 (26.8%). The mean age at Type 2 diabetes diagnosis was (42±14.8) years. The onset of diabetes mellitus was at age (41.5±13.7) years in females and (42.6±13.7) years in males. The distribution of diabetic onset by ethnicity was Malay (40.7±13.7) years, Chinese (43.2±13.7) years and Indian (42.3±13.7) years. Diabetic onset was classified by age as follows: the ≤20 years cohort comprised 33 (6.4%) cases, the >20-≤40 years group 190 (36.6%) patients, and the >40-≤60 years group 270 (52%) subjects. On the other hand, the >60 years group comprised 22 (4.2%) patients. The range of age at diagnosis was 10 to 73 years. Conclusion: Malays and females had an earlier onset of diabetes than Indians, Chinese and males. More than half of the patients developed diabetes between 40 and 60 years of age. Diabetes mellitus is becoming more common at younger ages (<40 years). The age at diagnosis of Type 2 diabetes mellitus has decreased with time.

Keywords: age of onset, diabetes diagnosis, diabetes mellitus, Malaysia, outpatients, type 2 diabetes, retrospective study

Procedia PDF Downloads 387
48785 Prophylactic Replacement of Voice Prosthesis: A Study to Predict Prosthesis Lifetime

Authors: Anne Heirman, Vincent van der Noort, Rob van Son, Marije Petersen, Lisette van der Molen, Gyorgy Halmos, Richard Dirven, Michiel van den Brekel

Abstract:

Objective: Voice prosthesis leakage significantly impacts the quality of life of patients after laryngectomy, causing insecurity, frequent unplanned hospital visits and additional costs. In this study, the concept of prophylactic voice prosthesis replacement was explored to prevent leakages. Study Design: A retrospective cohort study. Setting: Tertiary hospital. Methods: Device lifetimes and voice prosthesis replacements in a retrospective cohort, including all patients who underwent laryngectomy between 2000 and 2012 at the Netherlands Cancer Institute, were used to calculate the number of voice prostheses needed per patient per year if 70% of leakages were to be prevented by prophylactic replacement. Various strategies for the timing of prophylactic replacement were considered: adaptive strategies based on the individual patient’s replacement history and fixed strategies based on the results of patients with similar voice prosthesis or treatment characteristics. Results: Patients used a median of 3.4 voice prostheses per year (range 0.1-48.1). We found high inter- and intrapatient variability in device lifetime. With prophylactic replacement, this would become a median of 9.4 voice prostheses per year, which means replacement every 38 days and implies more than six additional voice prostheses per patient per year. The individual adaptive model showed that preventing 70% of leakages was impossible for most patients, and only a median of 25% could be prevented. Monte-Carlo simulations showed that prophylactic replacement is not feasible due to the high coefficient of variation (standard deviation/mean) in device lifetime. Conclusion: Based on our simulations, prophylactic replacement of voice prostheses is not feasible due to high inter- and intrapatient variation in device lifetime.
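
The Monte-Carlo argument above can be illustrated with a toy simulation: draw device lifetimes from a skewed distribution with a given coefficient of variation, replace prophylactically at a fixed early quantile of that distribution, and count how many leakages are pre-empted versus how many prostheses per patient-year the schedule consumes. The lognormal lifetime model and every parameter value below are assumptions for illustration, not the authors' fitted data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(median_days=100.0, cv=1.0, replace_quantile=0.30, n_devices=100_000):
    """Prophylactic replacement at a fixed quantile of an assumed lognormal lifetime distribution."""
    sigma = np.sqrt(np.log(1.0 + cv**2))                   # lognormal sigma for the requested CV
    lifetimes = rng.lognormal(np.log(median_days), sigma, n_devices)

    replace_at = np.quantile(lifetimes, replace_quantile)  # fixed replacement schedule
    prevented = np.mean(lifetimes > replace_at)            # leakages pre-empted by early replacement
    device_days = np.minimum(lifetimes, replace_at)        # time each prosthesis actually stays in
    return prevented, 365.0 / device_days.mean()           # fraction prevented, prostheses per year

for q in (0.30, 0.50, 0.70):
    prevented, per_year = simulate(replace_quantile=q)
    print(f"replace at the {q:.0%} lifetime quantile: {prevented:.0%} of leakages prevented, "
          f"{per_year:.1f} prostheses per patient-year")
```

Preventing 70% of leakages forces replacement near the 30% quantile of the lifetime distribution, and with a highly variable (high-CV) distribution that makes the replacement interval very short, which is the mechanism behind the abstract's conclusion.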

Keywords: voice prosthesis, voice rehabilitation, total laryngectomy, prosthetic leakage, device lifetime

Procedia PDF Downloads 109
48784 Good Functional Outcome after Late Surgical Treatment for Traumatic Rotator Cuff Tear: A Retrospective Cohort Study

Authors: Soheila Zhaeentan, Anders Von Heijne, Elisabet Hagert, André Stark, Björn Salomonsson

Abstract:

The recommended treatment for traumatic rotator cuff tear (TRCT) is surgery within a few weeks after injury if the diagnosis is made early, especially if a functional impairment of the shoulder exists. This may lead to the assumption that a poor outcome can then be expected with delayed surgical treatment, when the patient is diagnosed at a later stage. The aim of this study was to investigate whether surgical repair later than three months after injury may still result in successful outcomes and patient satisfaction. There is evidence in the literature that good results of treatment can be expected up to three months after the injury, but little is known about later treatment with cuff repair. 73 patients (75 shoulders), 58 males/17 females, mean age 59 (range 34-72), who had undergone surgical intervention for TRCT between January 1999 and December 2011 at our clinic, were included in this study. Patients were assessed by MRI investigation, clinical examination, the Western Ontario Rotator Cuff index (WORC), the Oxford Shoulder Score, the Constant-Murley Score, EQ-5D and patient subjective satisfaction at follow-up. The patients treated surgically within three months (<12 weeks) after injury (39 cases) were compared with patients treated more than three months (≥12 weeks) after injury (36 cases). WORC was used as the primary outcome measure and the other variables as secondary measures. A senior consultant radiologist, blinded to patient category and clinical outcome, evaluated all MRI images. Rotator cuff integrity, presence of arthritis, fatty degeneration and muscle atrophy were evaluated in all cases. The average follow-up time was 56 months (range 14-149) and the average time from injury to repair was 16 weeks (range 3-104). No statistically significant differences were found for any of the assessed parameters or scores between the two groups. The mean WORC score was 77 for both groups (early group, range 25-100 and late group, range 27-100; p=0.86), Constant-Murley Score (p=0.91), Oxford Shoulder Score (p=0.79), EQ-5D index (p=0.86). Re-tear frequency was 24% for both groups, and the patients with re-tears reported less satisfaction with the outcome. Discussion and conclusion: This study shows that surgical repair of TRCT performed later than three months after injury may result in good functional outcomes and patient satisfaction. However, this does not motivate an intentional delay in surgery when there is an indication for surgical repair, as such a delay may adversely affect the possibility of performing a repair. Our results show that surgeons may safely consider surgical repair even if a delay in diagnosis has occurred. In summary, this retrospective cohort study of 75 shoulders shows good functional results after traumatic rotator cuff tear treated surgically up to one year after the injury.

Keywords: traumatic rotator cuff injury, time to surgery, surgical outcome, retrospective cohort study

Procedia PDF Downloads 199
48783 Medical Radiation Exposure in a Cohort of Children Diagnosed with Solid Tumors: Single Institution Study 1985-2015

Authors: Robin L. Rohrer

Abstract:

Introduction: Pre-natal or early childhood exposure to the medical radiation used in diagnosis or treatment is an identified risk for childhood cancers but can be difficult to document. The author developed a family questionnaire/interview form to identify possible exposures. Aims: This retrospective study examines pre-natal and early childhood medical radiation exposure in a cohort of children diagnosed with a solid tumor, including brain tumors, from 1985-2015 at the Children’s Hospital of Pittsburgh (CHP). The hospital is a tri-state regional referral center which treats about 150-180 new cases of cancer in children per year. About 70% are diagnosed with a solid tumor. Methods: Each consented family so far (approximately 50% of the cohort) has been interviewed in person or by phone. Medical staff and psycho-social staff referred patient families for the interview with the author. Results: Among the families interviewed to date, at least one medical radiation exposure has been identified (pre-conception, pre-natal or early childhood) in over 70% of diagnosed children. These exposures have included pre-conception sinus or chest CT or X-ray in either parent, sinus CT or X-ray in the mother, or diagnostic radiation of the chest or abdomen in children. Conclusions: Exposures to medical radiation for a child later diagnosed with cancer may occur at several critical junctures. These exposures may well contribute to a ‘perfect storm’ in the still elusive causes of childhood cancer. The author plans to expand the study to cover 1975 to the present, to further document these junctures.

Keywords: pediatric, solid tumors, medical radiation, cancer

Procedia PDF Downloads 246
48782 Assessing the Survival Time of Hospitalized Patients in Eastern Ethiopia During 2019–2020 Using the Bayesian Approach: A Retrospective Cohort Study

Authors: Chalachew Gashu, Yoseph Kassa, Habtamu Geremew, Mengestie Mulugeta

Abstract:

Background and Aims: Severe acute malnutrition remains a significant health challenge, particularly in low‐ and middle‐income countries. The aim of this study was to determine the survival time of under-five children with severe acute malnutrition. Methods: A retrospective cohort study was conducted at a hospital, focusing on under-five children with severe acute malnutrition. The study included 322 inpatients admitted to the Chiro hospital in Chiro, Ethiopia, between September 2019 and August 2020, whose data were obtained from medical records. Survival functions were analyzed using Kaplan–Meier plots and log-rank tests. The survival time of severe acute malnutrition was further analyzed using the Cox proportional hazards model and Bayesian parametric survival models, employing integrated nested Laplace approximation (INLA) methods. Results: Among the 322 patients, 118 (36.6%) died as a result of severe acute malnutrition. The estimated median survival time for inpatients was 2 weeks. Model selection criteria favored the Bayesian Weibull accelerated failure time model, which demonstrated that age, body temperature, pulse rate, nasogastric (NG) tube usage, hypoglycemia, anemia, diarrhea, dehydration, malaria, and pneumonia significantly influenced the survival time of severe acute malnutrition. Conclusions: This study revealed that children below 24 months and those with altered body temperature or pulse rate, NG tube usage, hypoglycemia, and comorbidities such as anemia, diarrhea, dehydration, malaria, and pneumonia had shorter survival times when affected by severe acute malnutrition under the age of five. To reduce the death rate of children under 5 years of age, community-based management of acute malnutrition should be designed to ensure early detection and to improve access and coverage for children who are malnourished.
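
The authors fitted the Bayesian Weibull accelerated failure time model with INLA; purely as an illustrative frequentist analogue, the sketch below shows a Kaplan-Meier estimate, a log-rank test and a Weibull AFT fit in Python with lifelines, using hypothetical column names rather than the study's actual dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, WeibullAFTFitter
from lifelines.statistics import logrank_test

# Hypothetical inpatient records: follow-up time in weeks, death indicator, covariates
df = pd.read_csv("sam_cohort.csv")  # columns: weeks, died, age_months, ng_tube, anemia, ...

# Kaplan-Meier survival curve and median survival time
kmf = KaplanMeierFitter()
kmf.fit(df["weeks"], event_observed=df["died"])
print("median survival (weeks):", kmf.median_survival_time_)

# Log-rank test comparing, for example, children with and without an NG tube
ng = df["ng_tube"] == 1
res = logrank_test(df.loc[ng, "weeks"], df.loc[~ng, "weeks"],
                   event_observed_A=df.loc[ng, "died"], event_observed_B=df.loc[~ng, "died"])
print("log-rank p-value:", res.p_value)

# Weibull accelerated failure time model with a few of the covariates named in the abstract
aft = WeibullAFTFitter()
aft.fit(df[["weeks", "died", "age_months", "ng_tube", "anemia", "diarrhea", "pneumonia"]],
        duration_col="weeks", event_col="died")
aft.print_summary()
```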

Keywords: Bayesian analysis, severe acute malnutrition, survival data analysis, survival time

Procedia PDF Downloads 2
48781 Tuberculosis in Patients with HIV Infection in Russia: A Cohort Study over the Period 2015-2016

Authors: Marina Nosik, Irina Rymanova, Konstantin Ryzhov, Joan Yarovaya, Alexander Sobkin

Abstract:

Tuberculosis (TB) associated with HIV is one of the top causes of death worldwide. However, early detection and treatment of TB in HIV-infected individuals significantly reduces the risk of developing severe forms of TB and mortality. The goal of the study was to analyze the peculiarities of TB associated with HIV infection. Over the period 2015-2016, a retrospective cohort study was conducted among 377 patients with TB/HIV co-infection who attended the Moscow Tuberculosis Clinic. The majority of the patients were male (64.5%). The median age was 37.9 years for men (range 24-62) and 35.4 years for women (range 22-72). The most prevalent age group was 30-39 years for both men and women (73.3% and 54.7%, respectively). The proportion of patients aged 50-59 years and older was 3.9%. The socioeconomic status of patients was rather low: only 2.3% of patients had a university degree, and 76.1% were unemployed (of whom 21.7% were disabled). Most patients had disseminated pulmonary tuberculosis in the phase of infiltration/decay (41.5%). Infiltrative TB was detected in 18.9% of patients, and 20.1% of patients had tuberculosis of the intrathoracic lymph nodes. The occurrence of MDR-TB was 16.8% and of XDR-TB 17.9%. The number of HIV-positive patients with newly diagnosed TB was n=261 (69.2%). The proportion with active TB (MBT+) among new TB/HIV cases was 44.7%. The severe clinical forms of TB and the high TB incidence rate among HIV-infected individuals, alongside the large number of cases of newly diagnosed tuberculosis, indicate the need for more intense interaction with TB services for timely diagnosis of TB, which will optimize treatment outcomes.

Keywords: HIV, tuberculosis (TB), TB associated with HIV, multidrug-resistant TB (MDR-TB)

Procedia PDF Downloads 215
48780 Pregnancy Rate and Outcomes after Uterine Fibroid Embolization: Single Centre Experience in the Middle East from the United Arab Emirates at Alain Hospital

Authors: Jamal Alkoteesh, Mohammed Zeki, Mouza Alnaqbi

Abstract:

Objective: To evaluate pregnancy outcomes, complications and neonatal outcomes in women who had previously undergone uterine artery embolization. Design: Retrospective study. In this study, most women opted for UFE as a fertility treatment after failure of myomectomy or in vitro fertilization, or because hysterectomy was the only suggested option. Background: Myomectomy is the standard approach in patients with fibroids desiring a future pregnancy. However, myomectomy may be difficult in cases of numerous interstitial and/or submucous fibroids. In these cases, UFE has the advantage of embolizing all fibroids in one procedure. This procedure is an accepted nonsurgical treatment for symptomatic uterine fibroids. Study Methods: A retrospective study of 210 patients treated with UFE for symptomatic uterine fibroids between 2011 and 2016 was performed. UFE was performed using particles (PVA; Embozen, Beadblock) of 500-900 µm in diameter. Pregnancies were identified using screening questionnaires and the study database. Of the 210 patients who received UFE treatment, 35 women younger than 40 years of age wanted to conceive and had been unable to. All women in our study were advised to wait six months or more after UFE before attempting to become pregnant; the reported time before attempting to conceive ranged from seven to 33 months (average 20 months). Results: In a retrospective chart review of patients younger than 40 years of age (35 patients), 18 patients reported 23 pregnancies, of which five were miscarriages. Two more pregnancies were complicated by premature labor. Of the 23 pregnancies, 16 were normal full-term pregnancies; 15 women had conceived once, and four had become pregnant twice. The remaining patients did not conceive. In the study, there was no reported intrauterine growth retardation in the prenatal period, fetal distress during labor, or problems related to uterine integrity. Two patients reported minor problems during pregnancy, namely borderline oligohydramnios and a low-lying placenta. In the cohort of women who did conceive, 16 out of 18 births proceeded normally without any complications (86%). Eight women delivered by cesarean section, and 10 women had normal vaginal deliveries. In this study of 210 women, UFE was followed by a fertility rate of 47%. Our group of 23 pregnancies was small but confirms that successful pregnancy after UFE is possible. The 45.7% rate of completed term pregnancy in women below 40 years of age compares favorably with that of women who underwent myomectomy via other methods. Of the women in the cohort who did conceive, subsequent births proceeded normally in 86%. Conclusion: Pregnancy after UFE is well documented. The risks of infertility following embolization, premature menopause, and hysterectomy are small, as is the radiation exposure during embolization. Fertility rates appear similar to those of patients undergoing myomectomy. UFE should not be contraindicated in patients who want to conceive, and they should be able to choose between surgical options and UFE.

Keywords: fibroid, pregnancy, therapeutic embolization, uterine artery

Procedia PDF Downloads 213
48779 Continuous Glucose Monitoring Systems and the Improvement in Hypoglycemic Awareness Post-Islet Transplantation: A Single-Centre Cohort Study

Authors: Clare Flood, Shareen Forbes

Abstract:

Background: Type 1 diabetes mellitus (T1DM) is an autoimmune disorder affecting >400,000 people in the UK alone, with the global prevalence expected to double in the next decade. Islet transplantation offers a minimally invasive procedure with very low morbidity and almost no mortality, and is now as effective as whole-pancreas transplantation. The procedure was introduced in the UK in 2011 for patients with the most severe T1DM – those with unstable blood glucose, frequent episodes of severe hypoglycemia and impaired awareness of hypoglycemia (IAH). Objectives: To evaluate the effectiveness of islet transplantation in improving glycemic control, reducing the burden of hypoglycemia and improving awareness of hypoglycemia through a single-centre cohort study at the Royal Infirmary of Edinburgh. Glycemic control and degree of hypoglycemic awareness were determined and monitored pre- and post-transplantation to determine the effectiveness of the procedure. Methods: A retrospective analysis of data collected over three years from the 16 patients who have undergone islet transplantation in Scotland. Glycated haemoglobin (HbA1c) was measured, and continuous glucose monitoring systems (CGMS) were utilised to assess glycemic control, while Gold and Clarke score questionnaires assessed IAH. Results: All patients had improved glycemic control following transplant, with optimal control seen at 3 months post-transplant. Glycemic control improved significantly, as illustrated by the percentage of time in hypoglycemia in the months following transplant (p=0.0211) and by HbA1c (p=0.0426). Improved Clarke (p=0.0034) and Gold (p=0.0001) scores indicate improved hypoglycemia awareness following transplant. Conclusion: While the small sample of islet transplant recipients at the Royal Infirmary of Edinburgh prevents definitive conclusions, our retrospective, single-centre cohort study of 16 patients indicates that islet transplantation can improve glycemic control and reduce the burden of hypoglycemia and IAH post-transplant. Data can be combined with similar trials at other centres to increase statistical power, but from the research in Edinburgh it can be suggested that the minimally invasive procedure of islet transplantation offers selected patients with extremely unstable T1DM the opportunity to regain control of their condition and improve their quality of life.

Keywords: diabetes, islet, transplant, CGMS

Procedia PDF Downloads 250
48778 A Cohort and Empirical Based Multivariate Mortality Model

Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong

Abstract:

This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables directly in the equation system. The model not only provides useful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean square errors in both countries compared to the models in the literature.
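
The abstract does not give the CAP model's exact equation system; as a point of reference only, a generic log-linear age-period-cohort specification for mortality takes the form

$$ \log m_{x,t} = \beta_x + \kappa_t + \gamma_{t-x} + \varepsilon_{x,t}, $$

where $m_{x,t}$ is the central death rate at age $x$ in year $t$, $\beta_x$ is an age effect, $\kappa_t$ a period effect, $\gamma_{t-x}$ a cohort effect indexed by year of birth, and $\varepsilon_{x,t}$ an error term. The authors' empirically based CAP model applies this kind of age-period-cohort structure to multi-population mortality change rates rather than to a Lee-Carter factor decomposition.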

Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management

Procedia PDF Downloads 30
48777 The Efficacy of Pre-Hospital Packed Red Blood Cells in the Treatment of Severe Trauma: A Retrospective, Matched, Cohort Study

Authors: Ryan Adams

Abstract:

Introduction: Major trauma is the leading cause of death in 15-45 year olds and carries significant human, social and economic costs. Resuscitation is a stalwart of trauma management, especially in the pre-hospital environment, and packed red blood cells (pRBC) are being used increasingly with the advent of permissive hypotension. The evidence in this area is lacking, and further research is required to determine its efficacy. Aim: The aim of this retrospective, matched cohort study was to determine whether major trauma patients who received pre-hospital pRBC differ in their initial emergency department cardiovascular status when compared with injury-profile-matched controls. Methods: The trauma databases of the Royal Brisbane and Women's Hospital, the Royal Children's Hospital (Herston) and the Queensland Ambulance Service were accessed, and data on major trauma patients (ISS>12) who received pre-hospital pRBC from January 2011 to August 2014 were collected. Patients were then matched by injury profile against control patients who had not received pRBC. The primary outcome was cardiovascular status, defined by the shock index and the Revised Trauma Score. Results: Data for 25 patients who received pre-hospital pRBC were accessed, and their injury profiles were matched against suitable controls. On admission to the emergency department, a statistically significant difference in shock index was seen between groups (blood = 1.42 vs control = 0.97, p-value = 0.0449). However, the same was not seen for the Revised Trauma Score (blood = 4.15 vs control = 5.56, p-value = 0.291). Discussion: A worse shock index and Revised Trauma Score were associated with pre-hospital administration of pRBC. However, due to the small sample size, limited matching protocol and associated confounding factors, it is difficult to draw any solid conclusions. Further studies with larger patient numbers are required to enable adequate conclusions to be drawn on the efficacy of pre-hospital packed red blood cell transfusion.
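
For context, the two cardiovascular status measures used as outcomes are simple bedside scores: the shock index is heart rate divided by systolic blood pressure, and the Revised Trauma Score is a weighted sum of coded Glasgow Coma Scale, systolic blood pressure and respiratory rate values. The sketch below uses the commonly cited RTS coefficients and coding bands and a made-up patient; it is an illustration, not code from the study.

```python
def shock_index(heart_rate: float, systolic_bp: float) -> float:
    """Shock index = heart rate / systolic blood pressure; values above ~1 suggest shock."""
    return heart_rate / systolic_bp

def _code(value: float, bands) -> int:
    """Return the 0-4 coded value for the first matching (low, high, code) band."""
    for low, high, code in bands:
        if low <= value <= high:
            return code
    return 0

def revised_trauma_score(gcs: float, systolic_bp: float, resp_rate: float) -> float:
    """Revised Trauma Score using the commonly cited coefficients and coding bands."""
    gcs_c = _code(gcs, [(13, 15, 4), (9, 12, 3), (6, 8, 2), (4, 5, 1), (3, 3, 0)])
    sbp_c = _code(systolic_bp, [(90, float("inf"), 4), (76, 89, 3), (50, 75, 2), (1, 49, 1), (0, 0, 0)])
    rr_c = _code(resp_rate, [(10, 29, 4), (30, float("inf"), 3), (6, 9, 2), (1, 5, 1), (0, 0, 0)])
    return 0.9368 * gcs_c + 0.7326 * sbp_c + 0.2908 * rr_c

# A hypothetical tachycardic, borderline-hypotensive trauma patient
print(round(shock_index(heart_rate=128, systolic_bp=90), 2))                  # 1.42
print(round(revised_trauma_score(gcs=13, systolic_bp=90, resp_rate=24), 2))   # 7.84
```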

Keywords: pre-hospital, packed red blood cells, severe trauma, emergency medicine

Procedia PDF Downloads 373
48776 A Retrospective Cohort Study on an Outbreak of Gastroenteritis Linked to a Buffet Lunch Served during a Conference in Accra

Authors: Benjamin Osei Tutu, Sharon Annison

Abstract:

On 21st November, 2016, an outbreak of foodborne illness occurred after a buffet lunch served during a stakeholders’ consultation meeting held in Accra. An investigation was conducted to characterise the affected people, determine the etiologic food, the source of contamination and the etiologic agent, and to implement appropriate public health measures to prevent future occurrences. A retrospective cohort study was conducted via telephone interviews, using a structured questionnaire developed from the buffet menu. A case was defined as any person suffering from symptoms of foodborne illness, e.g. diarrhoea and/or abdominal cramps, after eating food served during the stakeholder consultation meeting in Accra on 21st November, 2016. The exposure status of all members of the cohort was assessed by taking the food history of each respondent during the telephone interview. The data obtained were analysed using Epi Info 7. An environmental risk assessment was conducted to ascertain the source of the food contamination. Risks of foodborne infection from the foods eaten were determined using attack rates and odds ratios. Data were obtained from 54 people who consumed food served during the stakeholders’ meeting. Out of this population, 44 people reported symptoms of food poisoning, representing an overall attack rate of 81.45%. The peak incubation period was seven hours, with minimum and maximum incubation periods of four and 17 hours, respectively. The commonly reported symptoms were diarrhoea (97.73%, 43/44), vomiting (84.09%, 37/44) and abdominal cramps (75.00%, 33/44). From the incubation period, duration of illness and the symptoms, toxin-mediated food poisoning was suspected. The environmental risk assessment of the implicated catering facility indicated a lack of time/temperature control, inadequate knowledge of food safety among workers and sanitation issues. A limited number of food samples was received for microbiological analysis. Multivariate analysis indicated that illness was significantly associated with consumption of the snacks served (OR 14.78, P < 0.001). No stool, blood or etiologic food samples were available for organism isolation; however, the suspected etiologic agent was Staphylococcus aureus or Clostridium perfringens. The outbreak was probably due to the consumption of an unwholesome snack (tuna sandwich or chicken). The contamination and/or growth of the etiologic agent in the snack may be due to a breakdown in cleanliness, time/temperature control and good food handling practices. Training of food handlers in basic food hygiene and safety is recommended.
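
Food-specific attack rates and odds ratios of the kind computed here in Epi Info follow directly from the 2x2 table for each menu item (ate/did not eat versus ill/well). A minimal Python sketch with hypothetical counts, not the investigation's actual line listing:

```python
import math

def attack_rate(ill: int, exposed: int) -> float:
    """Food-specific attack rate = number ill among those who ate the item / number who ate it."""
    return ill / exposed

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Woolf (log) 95% confidence interval.
    a = ate item and ill, b = ate item and well, c = did not eat and ill, d = did not eat and well."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts for one buffet item
print("attack rate:", round(attack_rate(ill=40, exposed=46), 2))
print("OR (95% CI):", odds_ratio_ci(a=40, b=6, c=4, d=4))
```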

Keywords: Accra, buffet, conference, C. perfringens, cohort study, food poisoning, gastroenteritis, office workers, Staphylococcus aureus

Procedia PDF Downloads 202
48775 Gastrointestinal Manifestations and Outcomes in Hospitalized COVID-19 Patients: A Retrospective Study

Authors: Jaylo Abalos, Sophia Zamora

Abstract:

BACKGROUND: Various gastrointestinal (GI) symptoms, including diarrhea, nausea/vomiting and abdominal pain, have been reported in patients with coronavirus disease 2019 (COVID-19). The presence of GI symptoms has been variably associated with poor clinical outcomes in COVID-19. We aimed to determine the outcomes of hospitalized COVID-19 patients with gastrointestinal symptoms. METHODOLOGY: This retrospective cohort study used the medical records of COVID-19 patients admitted from March 2020 to March 2021 to a tertiary hospital in Pangasinan. Records were evaluated for the presence of gastrointestinal manifestations, including diarrhea, nausea, vomiting and abdominal pain, at the time of admission. COVID-19 patients presenting with GI manifestations (cases) were compared with COVID-19 patients without GI manifestations (controls). RESULTS: Four hundred three patients were included in the study. Of these, 22.3% presented with gastrointestinal symptoms, while 77.7% comprised the study controls. Diarrhea was the most common GI symptom (10.4%). No statistically significant differences were observed in comorbidities or laboratory findings. Mortality, the primary outcome of the study, did not differ significantly between cases and controls (13.33% vs. 16.30%, p=0.621). There were also no significant differences in the secondary outcomes of mean length of stay (14 [12-18] days in cases vs. 14 [12-17.5] days in controls, p=0.716) and need for mechanical ventilation (12.22% vs. 16.93%, p=0.329). CONCLUSION: The results of the study revealed no association of GI symptoms with poor outcomes, including mortality, prolonged length of stay and increased need for mechanical ventilation.

Keywords: gastrointestinal symptoms, COVID-19, outcomes, mortality, length of stay

Procedia PDF Downloads 117
48774 Factors Associated with Death during Tuberculosis Treatment of Patients Co-Infected with HIV at a Tertiary Care Setting in Cameroon: An 8-Year Hospital-Based Retrospective Cohort Study (2006-2013)

Authors: A. A. Agbor, Jean Joel R. Bigna, Serges Clotaire Billong, Mathurin Cyrille Tejiokem, Gabriel L. Ekali, Claudia S. Plottel, Jean Jacques N. Noubiap, Hortence Abessolo, Roselyne Toby, Sinata Koulla-Shiro

Abstract:

Background: Contributors to fatal outcomes in patients undergoing tuberculosis (TB) treatment in the setting of HIV co-infection are poorly characterized, especially in sub-Saharan Africa. Our study's aim was to assess factors associated with death in TB/HIV co-infected patients during the first 6 months of their TB treatment. Methods: We conducted a tertiary-care hospital-based retrospective cohort study from January 2006 to December 2013 at the Yaoundé Central Hospital, Cameroon. We reviewed medical records to identify hospitalized co-infected TB/HIV patients aged 15 years and older. Death was defined as any death occurring during TB treatment, as per the World Health Organization's recommendations. Logistic regression analysis identified factors associated with death. Magnitudes of associations were expressed as adjusted odds ratios (aOR) with 95% confidence intervals. A p value < 0.05 was considered statistically significant. Results: The 337 patients enrolled had a mean age of 39.3 (+/- 10.3) years, and more than half (54.3%) were women. TB treatment outcomes included treatment success in 60.8% (n=205), death in 29.4% (n=99), not evaluated in 5.3% (n=18), loss to follow-up in 4.2% (n=14), and failure in 0.3% (n=1). After exclusion of patients lost to follow-up or not evaluated, death in TB/HIV co-infected patients during TB treatment was associated with: a TB diagnosis made before national implementation of guidelines on initiation of antiretroviral therapy (aOR = 2.50 [1.31-4.78]; p = 0.006), the presence of other AIDS-defining infections (aOR = 2.73 [1.27-5.86]; p = 0.010), non-AIDS comorbidities (aOR = 3.35 [1.37-8.21]; p = 0.008), not receiving co-trimoxazole prophylaxis (aOR = 3.61 [1.71-7.63]; p = 0.001), not receiving antiretroviral therapy (aOR = 2.45 [1.18-5.08]; p = 0.016), and CD4 cell counts < 50 cells/mm3 (aOR = 16.43 [1.05-258.04]; p = 0.047). Conclusions: The success rate of anti-tuberculosis treatment among hospitalized TB/HIV co-infected patients in our setting is low. Mortality in the first 6 months of treatment was high and strongly associated with specific clinical factors, including states of greater immunosuppression, highlighting the urgent need for targeted interventions, including provision of antiretroviral therapy and co-trimoxazole prophylaxis, in order to enhance patient outcomes.

Keywords: TB/HIV co-infection, death, treatment outcomes, factors

Procedia PDF Downloads 425
48773 Mild Hypothermia Versus Normothermia in Patients Undergoing Cardiac Surgery: A Propensity Matched Analysis

Authors: Ramanish Ravishankar, Azar Hussain, Mahmoud Loubani, Mubarak Chaudhry

Abstract:

Background and Aims: Currently, there are no strict guidelines on cardiopulmonary bypass temperature management in cardiac surgery not involving the aortic arch. The aim of this study was to compare patient outcomes between mild hypothermia and normothermia in patients undergoing on-pump cardiac surgery not involving the aortic arch. Methods: This was a retrospective cohort study from January 2015 until May 2023. Patients who underwent cardiac surgery with cardiopulmonary bypass temperatures ≥32°C were included and stratified into mild hypothermia (32°C – 35°C) and normothermia (>35°C) cohorts. Propensity matching was applied with the nearest-neighbour method (1:1), using the risk factors detailed in the EuroSCORE, in RStudio. The primary outcome was mortality. Secondary outcomes included post-operative stay, intensive care unit readmission, readmission, stroke, and renal complications. Patients who had major aortic surgery and off-pump operations were excluded. Results: Each cohort had 1675 patients. There was a significant increase in overall mortality in the mild hypothermia cohort (3.59% vs. 2.32%; p=0.04912). There was also a greater incidence of stroke (2.09% vs. 1.13%; p=0.0396) and of transient ischaemic attack (TIA) (3.1% vs. 1.49%; p=0.0027). There was no significant difference in renal complications (9.13% vs. 7.88%; p=0.2155). Conclusions: Patients who underwent mild hypothermia during cardiopulmonary bypass had significantly greater mortality and incidence of stroke and transient ischaemic attack. Mild hypothermia does not appear to provide any benefit over normothermia, nor any neuroprotective benefit. These results differ from those of other major studies; further trials and studies are needed to reach a consensus.

Keywords: cardiac surgery, therapeutic hypothermia, neuroprotection, cardiopulmonary bypass

Procedia PDF Downloads 51
48772 Psychological Wellbeing of Caregivers: Findings from a Large Cohort of Thai Adults

Authors: Vasoontara Yiengprugsawan, Sam-ang Seubsman

Abstract:

As Thais live longer, caregivers will become even more important to social and healthcare systems. As commonly reported in many low- and middle-income countries in Asia, formal social welfare services to support caregivers are lacking, and informal family support will be required for all levels of care. In 2005, 87,151 open-university adults were recruited to the Thai Cohort Study, with the majority aged between 25 and 39 years and residing nationwide. At the 4-year follow-up in 2009 (n=60,569) and the 8-year follow-up in 2013 (n=42,785), prospective cohort participants were asked if they provided care for chronically ill, disabled, or frail family members. Among Thai cohort members reporting between 2009 and 2013, approximately 56% were not caregivers in either year, 24.5% reported providing care in 2009 only, 8.6% in 2013 only, and 10.6% reported providing care at both time points. Caregivers in the cohort reported providing financial support, help with shopping, emotional support, and assistance with daily activities. The Kessler 6 psychological distress scale, measured in both 2009 and 2013, was used as the primary outcome in assessing the relationship between caregiving status and mental health. Using multivariate logistic regression, our 4-year longitudinal findings revealed that cohort members who reported providing care at both time points were 1.4 to 1.6 times more likely to report high psychological distress than non-caregivers, after accounting for potential covariates. With increasing needs for informal care provided by family members, the future health and social welfare system will need to provide adequate support to caregivers (e.g., respite care, clinical support and information for the family, and awareness of mental health among caregivers).
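
A minimal sketch of how the caregiving exposure and distress outcome might be constructed and modelled is shown below; the file name, variable names, covariates, and the Kessler 6 cut-off are assumptions for illustration, not the cohort’s actual coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("thai_cohort_followup.csv")       # hypothetical file

# Combine the 2009 and 2013 caregiving reports into one four-level exposure
df["caregiving"] = (df["care_2009"].astype(str) + "_" + df["care_2013"].astype(str)).map(
    {"0_0": "never", "1_0": "2009_only", "0_1": "2013_only", "1_1": "both_years"})

# High psychological distress defined here as a Kessler 6 score >= 13 (an assumed cut-off)
df["high_distress"] = (df["k6_2013"] >= 13).astype(int)

# Multivariate logistic regression with non-caregivers as the reference group
model = smf.logit(
    "high_distress ~ C(caregiving, Treatment(reference='never')) + age + sex + income",
    data=df).fit()
print(np.exp(model.params).round(2))                # odds ratios relative to non-caregivers
```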

Keywords: family caregivers, psychological distress, prospective cohort, longitudinal study, Thailand

Procedia PDF Downloads 260
48771 Breech Versus Cephalic Elective Caesarean Deliveries – A Comparison of Immediate Neonatal Outcomes

Authors: Genevieve R. Kan, Jolyon Ford

Abstract:

Background: Caesarean section has become the routine route of delivery for breech fetuses, but breech caesarean deliveries are hypothesized to have poorer immediate neonatal outcomes than cephalic deliveries. In accordance with this, in many Australian hospitals, the pediatric team is routinely required to attend every elective breech caesarean section in case urgent resuscitation is required. Our study aimed to determine whether term elective breech deliveries indeed had worse immediate neonatal outcomes at delivery, which would justify the necessity of pediatric staff presence at every elective breech caesarean delivery and influence the workload of the pediatric team. Objective: Elective breech caesarean deliveries were compared to elective cephalic caesarean deliveries at 37 weeks gestation or above to evaluate the immediate neonatal outcomes (Apgar scores <7 at 5 minutes, and Special Care Nursery admissions on Day 1 of life) of each group. Design: A retrospective cohort study. Method: This study examined 2035 elective breech and cephalic singleton caesarean deliveries at term over the 5 years from July 2017 to July 2022 at Frankston Hospital, a metropolitan hospital in Melbourne, Australia. There were 260 breech deliveries and 1775 cephalic deliveries. De-identified patient data were collected retrospectively from the hospital’s electronically integrated pregnancy and birth records to assess demographics and neonatal outcomes. Results: Apgar scores <7 at 5 minutes of life were more frequent in the breech group than in the cephalic group (3.4% vs. 1.6%). Special Care Nursery admissions on Day 1 of life were also higher in the breech cohort than in the cephalic cohort (9.6% vs. 8.7%). Conclusions: Our results support the expected finding that breech deliveries are associated with worse immediate neonatal outcomes. This suggests that routine attendance at elective breech caesarean deliveries by the pediatric team is indeed required to assist with potentially higher needs for neonatal resuscitation and special care nursery admission.
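
The abstract reports proportions without significance tests; the sketch below shows how the Apgar comparison could be tested from a 2x2 table. The counts are reconstructed approximately from the reported percentages (3.4% of 260 ≈ 9; 1.6% of 1775 ≈ 28), so the figures are illustrative rather than the study’s raw data.

```python
from scipy.stats import chi2_contingency, fisher_exact

# Approximate counts reconstructed from the reported percentages
breech_low_apgar, breech_total = 9, 260
cephalic_low_apgar, cephalic_total = 28, 1775

table = [
    [breech_low_apgar, breech_total - breech_low_apgar],
    [cephalic_low_apgar, cephalic_total - cephalic_low_apgar],
]

chi2, p_chi2, _, _ = chi2_contingency(table)
odds_ratio, p_exact = fisher_exact(table)
print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_exact:.3f}, OR = {odds_ratio:.2f}")
```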

Keywords: breech, cesarean section, Apgar scores, special care nursery admission

Procedia PDF Downloads 88
48770 High-Dose-Rate Brachytherapy for Cervical Cancer: The Effect of Total Reference Air Kerma on the Results of Single-Channel and Tri-Channel Applicators

Authors: Hossain A., Miah S., Ray P. K.

Abstract:

Introduction: Single-channel and tri-channel applicators are used in the traditional treatment of cervical cancer. The main objective of this retrospective study was to compare total reference air kerma (TRAK) and treatment outcomes in high-dose-rate brachytherapy for cervical cancer using single-channel and tri-channel applicators. Material and Methods: Patients in the radiotherapy division who received brachytherapy, chemotherapy, and external beam radiotherapy (EBRT) using single-channel and tri-channel applicators were the subjects of a retrospective cohort study from 2016 to 2020. All brachytherapy parameters, including TRAK, were calculated in accordance with the international protocol. The Kaplan-Meier method was used to estimate survival rates, which were compared with a log-rank test. Results and Discussion: Over treatment durations of 15.34 (10-20) days and 21.35 (6.5-28) days, the TRAK was 0.52 cGy·m² for the tri-channel applicator and 0.34 cGy·m² for the single-channel applicator. The Pearson correlations with TRAK for the rectum, bladder, and tumor were 0.082, 0.009, and 0.032, respectively. The 1-specificity and sensitivity were 0.70 and 0.30, respectively, with an area under the curve (AUC) of 0.71. The log-rank test showed a survival rate of 95% for tri-channel applicators versus 85% for single-channel applicators (p=0.565). Conclusions: The relationship between TRAK and treatment duration, together with the Pearson correlations for the tumor, rectum, and bladder, suggests that TRAK should be taken into account for the proper use of single-channel and tri-channel applicators.
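
The survival comparison described above (Kaplan-Meier estimation with a log-rank test) can be sketched as follows; the lifelines package, the data file, and the column names are assumptions used purely for illustration and are not the study’s actual tooling.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("brachytherapy_cohort.csv")        # hypothetical follow-up data
tri = df[df["applicator"] == "tri-channel"]
single = df[df["applicator"] == "single-channel"]

kmf = KaplanMeierFitter()
for group in (tri, single):
    kmf.fit(group["followup_months"], event_observed=group["death"],
            label=group["applicator"].iloc[0])
    print(kmf.survival_function_.tail(1))            # survival estimate at last follow-up

result = logrank_test(tri["followup_months"], single["followup_months"],
                      event_observed_A=tri["death"], event_observed_B=single["death"])
print(f"log-rank p = {result.p_value:.3f}")
```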

Keywords: single-channel, tri-channel, high dose rate brachytherapy, cervical cancer

Procedia PDF Downloads 82
48769 Utility of Optical Coherence Tomography (OCT) and Visual Field Assessment in Neurosurgical Patients

Authors: Ana Ferreira, Ines Costa, Patricia Polónia, Josué Pereira, Olinda Faria, Pedro Alberto Silva

Abstract:

Introduction: Optical coherence tomography (OCT) and visual field assessment are pivotal in evaluating neurological deficits and predicting potential visual improvement following surgical decompression in neurosurgical patients. Despite their clinical significance, a comprehensive understanding of their utility in this context is lacking in the literature. This study aims to elucidate the applications of OCT and visual field assessment, delineating distinct patterns of visual deficit presentation within the studied cohort. Methods: This retrospective analysis considered all adult patients who underwent a single surgery for pituitary adenoma or anterior skull base meningioma with optic nerve involvement, coupled with neuro-ophthalmology evaluation, between July 2020 and January 2023. A minimum follow-up period of 6 months was required. Results: A total of 24 patients, with a median age of 61, were included in the analysis. Three primary patterns emerged: 1) low visual field involvement with compromised OCT, 2) high visual field involvement with relatively unaffected OCT, and 3) significant compromise in both OCT and visual fields. Conclusion: This study delineates various findings in OCT and visual field assessments with illustrative examples. Based on the current findings, a prospective cohort will be systematically collected to further investigate and validate these patterns and their prognostic significance, enhancing our understanding of the utility of OCT and visual fields in neurosurgical patients.

Keywords: OCT, neurosurgery, visual field, optic nerve

Procedia PDF Downloads 36
48768 Long Term Follow-Up, Clinical Outcomes and Quality of Life after Total Arterial Revascularisation versus Conventional Coronary Surgery: A Retrospective Study

Authors: Jitendra Jain, Cassandra Hidajat, Hansraj Riteesh Bookun

Abstract:

Graft patency underpins long-term prognosis after coronary artery bypass grafting (CABG). The effects of the combined use of only the left internal mammary artery and radial artery, referred to as total arterial revascularisation (TAR), on long-term clinical outcomes and quality of life are relatively unknown. The aim of this study was to identify whether there were differences in long-term clinical outcomes between recipients of TAR and a cohort who underwent mostly arterial revascularisation involving the left internal mammary artery, at least one radial artery, and at least one saphenous vein graft. A retrospective analysis was performed on all patients who underwent TAR or were revascularised with a supplementary saphenous vein graft from February 1996 to December 2004. Telephone surveys were conducted to obtain clinical outcome parameters, including major adverse cardiac and cerebrovascular events (MACCE), and Short Form (SF-36v2) Health Survey responses. A total of 176 patients were successfully contacted to obtain post-operative follow-up results. The mean follow-up from the time of surgery was 12.4±1.8 years for TAR and 12.6±2.1 years for the conventional group. The PCS score was 45.9±8.8 for TAR vs. 44.9±9.2 for LIMA/Rad/SVG (p=0.468), and the MCS score was 52.0±8.9 for TAR vs. 52.5±9.3 for LIMA/Rad/SVG (p=0.723). There were no significant differences between the groups for NYHA class 3+ (TAR 9.4% vs. LIMA/Rad/SVG 6.6%) or CCS class 3+ (TAR 2.35% vs. LIMA/Rad/SVG 0%).
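
The reported PCS and MCS p-values suggest a two-sample comparison of means; the sketch below shows how such a test can be reproduced from the summary statistics alone. The split of the 176 responders between the two groups is assumed here for illustration, since the abstract does not state it.

```python
from scipy.stats import ttest_ind_from_stats

# Assumed group sizes for illustration only; the abstract reports 176 responders in total
n_tar, n_conv = 88, 88

t_stat, p_value = ttest_ind_from_stats(
    mean1=45.9, std1=8.8, nobs1=n_tar,    # TAR PCS score
    mean2=44.9, std2=9.2, nobs2=n_conv,   # LIMA/Rad/SVG PCS score
    equal_var=False,                      # Welch's t-test
)
print(f"PCS comparison: t = {t_stat:.2f}, p = {p_value:.3f}")
```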

Keywords: CABG, MACCEs, quality of life, total arterial revascularisation

Procedia PDF Downloads 196
48767 A Clinical Study of Tracheobronchopathia Osteochondroplastica: Findings from a Large Chinese Cohort

Authors: Ying Zhu, Ning Wu, Hai-Dong Huang, Yu-Chao Dong, Qin-Ying Sun, Wei Zhang, Qin Wang, Qiang Li

Abstract:

Background and study aims: Tracheobronchopathia osteochondroplastica (TO) is an uncommon disease of the tracheobronchial system that leads to narrowing of the airway lumen from cartilaginous and/or osseous submucosal nodules. The aim of this study is to perform a detailed review of this rare disease in a large cohort of patients with TO proven by fiberoptic bronchoscopy in China. Patients and Methods: A retrospective chart review was performed on 41,600 patients who underwent bronchoscopy in the Department of Respiratory Medicine of Changhai Hospital between January 2005 and December 2012. Cases of TO were identified based on characteristic features during bronchoscopic examination. Results: 22 cases of bronchoscopic TO were identified, of whom one-half were male; the mean age was 47.45 ± 10.91 years. The most frequent symptoms at presentation were chronic cough (n=14) and increased sputum production (n=10). Radiographic abnormalities were observed in 3/18 patients, and findings on computed tomography consistent with TO, such as beaded intraluminal calcifications and/or increased luminal thickening, were observed in 18/22 patients. Patients were classified into the following categories based on the severity of bronchoscopic findings: Stage I (n=2), Stage II (n=6), and Stage III (n=14). Bronchoscopic improvement observed in 2 patients given inhaled corticosteroids suggests that resolution of this disease is possible. Conclusions: TO is a benign disease with slow progression that can be roughly divided into 3 stages on the basis of characteristic endoscopic features and histopathologic findings. Chronic inflammation appears to be more important in the course of TO than the other proposed mechanisms. Inhaled corticosteroids might have some impact on patients at Stage I/II.

Keywords: airway obstruction, bronchoscopy, etiology, Tracheobronchopathia osteochondroplastica (TO), treatment

Procedia PDF Downloads 436