Search results for: Neurosurgical patients cohort study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 51823

51673 Study of Contrast Induced Nephropathy in Patients Undergoing Cardiac Catheterization: Upper Egypt Experience

Authors: Ali Kassem, Sharf Eldeen-Shazly, Alshemaa Lotfy

Abstract:

Introduction: Contrast-induced nephropathy (CIN) is the third leading cause of hospital-acquired renal failure. Patients with cardiac disease are particularly at risk, especially with repeated injections of contrast media. CIN is generally defined as an increase in serum creatinine concentration of >0.5 mg/dL or >25% above baseline within 48 hours after contrast administration. Aim of work: To examine the frequency of CIN in patients undergoing cardiac catheterization at Sohag University Hospital (Upper Egypt) and to identify possible risk factors for CIN in these patients. Material and methods: The study included 104 patients with a mean age of 56.11 ± 10.03 years; 64 (61.5%) were male and 40 (38.5%) were female. 44 (42.3%) patients were diabetic, 43 (41%) were hypertensive, 6 (5.7%) had congestive heart failure (CHF), 69 (66.3%) were on statins, 74 (71.2%) on ACEIs or ARBs, 19 (15.4%) on metformin, 6 (5.8%) on NSAIDs, and 30 (28.8%) on diuretics. Results: At the end of the study, patients were classified into two groups. Group A comprised 91 patients who did not develop CIN. Group B comprised 13 patients who developed CIN; serum creatinine rose by >0.5 mg/dL in 6 of these patients and by >25% from baseline in all 13. The overall incidence of CIN was 12.5%. CIN increased with older age. The incidence of CIN was higher in diabetic than in non-diabetic patients (20.5% vs. 6.7%; p<0.03), and was significantly higher in patients with CHF than in those without (100% vs. 71%; p<0.0001). Patients on diuretics showed a significantly higher incidence of CIN, representing 61.5% of all patients who developed CIN. Conclusion: Older patients, diabetic patients, patients with CHF, and patients on diuretics are at higher risk of developing CIN during coronary catheterization and should receive reno-protective measures before contrast exposure.
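
The CIN definition quoted above is a mechanical threshold rule, so it can be expressed directly; a minimal sketch (the function name is illustrative and serum creatinine is assumed to be in mg/dL, as in the abstract):

```python
def meets_cin_definition(baseline_cr, cr_within_48h):
    """CIN per the abstract's definition: serum creatinine rises by
    > 0.5 mg/dL absolute, OR by > 25% above baseline, within 48 h
    of contrast administration."""
    rise = cr_within_48h - baseline_cr
    return rise > 0.5 or rise > 0.25 * baseline_cr
```

Note that the two criteria overlap: a patient with a low baseline can meet the relative (25%) criterion without meeting the absolute one, which is why Group B (13 patients by the relative rule) is larger than the 6 patients meeting the 0.5 mg/dL rule.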

Keywords: cardiac diseases, contrast-induced nephropathy, coronary catheterization, CIN

Procedia PDF Downloads 313
51672 Effects of Blood Pressure According to Age on End-Stage Renal Disease Development in Diabetes Mellitus Patients: A Nationwide Population-Based Cohort Study

Authors: Eun Hui Bae, Sang Yeob Lim, Bongseong Kim, Tae Ryom Oh, Su Hyun Song, Sang Heon Suh, Hong Sang Choi, Eun Mi Yang, Chang Seong Kim, Seong Kwon Ma, Kyung-Do Han, Soo Wan Kim

Abstract:

Background: Recent hypertension guidelines have recommended lower blood pressure (BP) targets in high-risk patients. However, there are no specific guidelines based on age or on systolic and diastolic blood pressure (SBP and DBP, respectively). We aimed to assess the effects of age-related BP on the development of end-stage renal disease (ESRD) in patients with diabetes. Methods: A total of 2,563,870 patients with diabetes mellitus (DM) aged >20 years were selected from the Korean National Health Screening Program from 2009 to 2012 and followed up until the end of 2019. Participants were categorized into age and BP groups, and the hazard ratios (HRs) for ESRD were calculated. Results: During a median follow-up of 7.15 years, the incidence rates of ESRD increased with increasing SBP and DBP. The HR for ESRD was highest in patients younger than 40 years of age with DBP ≥ 100 mmHg. The effect of SBP and DBP on ESRD development was attenuated with age (interaction p-value <0.0001 for age and SBP, 0.0022 for age and DBP). Subgroup analyses by sex, anti-hypertensive medication use, and history of chronic kidney disease (CKD) showed higher HRs for ESRD among males, patients younger than 40 years, those not taking anti-hypertensive medication, and those with CKD than among females, those aged 40 years or older, those on anti-hypertensive medication, and those without CKD. Conclusions: Higher SBP and DBP increase the risk of developing ESRD in patients with diabetes, and younger individuals in particular face greater risk. Therefore, intensive BP management is warranted in younger patients to prevent ESRD.
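
The incidence rates underlying the HRs above are conventionally reported per 1,000 person-years; a sketch of that calculation (the group labels and counts below are hypothetical placeholders, not figures from the study):

```python
def incidence_rate_per_1000py(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return 1000 * events / person_years

# Hypothetical illustration only -- these counts are NOT from the study:
groups = {
    "SBP < 120 mmHg": (120, 900_000),
    "SBP 120-139 mmHg": (310, 1_100_000),
    "SBP >= 140 mmHg": (540, 800_000),
}
for label, (events, py) in groups.items():
    print(f"{label}: {incidence_rate_per_1000py(events, py):.2f} per 1,000 PY")
```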

Keywords: hypertension, young adult, end-stage renal disease, diabetes mellitus, chronic kidney disease, blood pressure

Procedia PDF Downloads 129
51671 Analysis of Relative Gene Expression Data of GATA3-AS1 Associated with Resistance to Neoadjuvant Chemotherapy in Locally Advanced Breast Cancer Patients of Luminal B Subtype

Authors: X. Cervantes-López, C. Arriaga-Canon, L. Contreras Espinosa

Abstract:

The goal of this study was to validate the overexpression of the lncRNA GATA3-AS1 associated with resistance to neoadjuvant chemotherapy in female patients with locally advanced mammary adenocarcinoma of the luminal B subtype. The study involved a cohort of 137 samples, for which total RNA was isolated from formalin-fixed, paraffin-embedded (FFPE) tissue. Samples were sectioned using a Microtome Hyrax M25 (Zeiss), and RNA was isolated using the RNeasy FFPE kit with a deparaffinization solution. RNA concentration and quality were then assessed, 18 µg of RNA was treated with DNase I, and cDNA was synthesized from 50 ng of total RNA. Finally, real-time PCR was performed with SYBR Green/ROX qPCR Master Mix to determine relative gene expression, using RPS28 as a housekeeping gene for normalization in a ΔCt fold-change calculation. As a result, we validated by real-time PCR that overexpression of the lncRNA GATA3-AS1 is associated with resistance to neoadjuvant chemotherapy in locally advanced breast cancer patients of the luminal B subtype.
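
The abstract describes a ΔCt fold calculation normalized to RPS28 but does not name the exact formula; assuming the conventional Livak 2^(-ΔΔCt) method, a sketch (Ct values below are illustrative):

```python
def fold_change_ddct(ct_target_sample, ct_ref_sample,
                     ct_target_calibrator, ct_ref_calibrator):
    """Livak relative quantification: ΔCt = Ct(target) - Ct(housekeeping,
    e.g. RPS28) for the sample and for a calibrator; the fold change is
    2^-(ΔCt_sample - ΔCt_calibrator)."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    return 2 ** -(d_ct_sample - d_ct_calibrator)
```

For example, a target gene whose ΔCt is two cycles lower in the sample than in the calibrator corresponds to a 4-fold overexpression.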

Keywords: breast cancer, biomarkers, genomics, neoadjuvant chemotherapy, lncRNAs

Procedia PDF Downloads 55
51670 Frequency of Tube Feeding in Aboriginal and Non-Aboriginal Head and Neck Cancer Patients and the Impact on Relapse and Survival Outcomes

Authors: Kim Kennedy, Daren Gibson, Stephanie Flukes, Chandra Diwakarla, Lisa Spalding, Leanne Pilkington, Andrew Redfern

Abstract:

Introduction: Head and neck cancer and its treatments are known for their profound effect on nutrition, and tube feeding is a common requirement to maintain it. Aim: We aimed to evaluate the frequency of tube feeding in Aboriginal and non-Aboriginal patients, and to examine relapse and survival outcomes in patients who require enteral tube feeding. Methods: We performed a retrospective cohort analysis of 320 head and neck cancer patients from a single centre in Western Australia, identifying 80 Aboriginal patients and 240 non-Aboriginal patients matched on a 1:3 ratio by site, histology, rurality, and age. Data collected included patient demographics, tumour features, treatment details, and cancer and survival outcomes. Results: Aboriginal and non-Aboriginal patients required feeding tubes at similar rates (42.5% vs 46.2%, respectively); however, Aboriginal patients were far more likely to fail to return to oral nutrition, with 26.3% requiring long-term tube feeding versus only 15% of non-Aboriginal patients. In the overall study population, 27.5% required short-term tube feeding, 17.8% required long-term enteral tube nutrition, and 45.3% of patients did not have a feeding tube at any point. Relapse was more common in patients who required tube feeding, occurring in 42.1% of patients requiring long-term tube feeding and 31.8% of those requiring a short-term tube, versus 18.9% in the 'no tube' group. Survival outcomes for patients who required a long-term tube were also significantly poorer than for patients who required only a short-term tube, or none at all. Long-term tube-requiring patients were half as likely to survive (29.8%) as patients requiring a short-term tube (62.5%) or no tube at all (63.5%). Patients requiring a long-term tube were twice as likely to die with active disease (59.6%) as patients with no tube (28%) or a short-term tube (33%).
This may suggest an increased relapse risk in patients who require long-term feeding, due to the consequences of malnutrition on cancer and treatment outcomes, although it may simply reflect that patients with recurrent disease were more likely to have longer-term swallowing dysfunction due to recurrent disease and salvage treatments. Interestingly, long-term tube patients were also more likely to die with no active disease (10.5%, compared with 4.6% of short-term tube patients and 8% of patients with no tube), which likely reflects the increased mortality associated with long-term aspiration and malnutrition. Conclusions: Requirement for tube feeding was associated with a higher rate of cancer relapse; in particular, long-term tube feeding was associated with a higher likelihood of dying from head and neck cancer, but also a higher risk of dying from other causes without cancer relapse. These data reflect the complex effect of head and neck cancer and its treatments on swallowing and nutrition, and ultimately the effects of malnutrition, swallowing dysfunction, and aspiration on overall cancer and survival outcomes. Tube feeding was seen at similar rates in Aboriginal and non-Aboriginal patients; however, failure to return to oral intake with a requirement for a long-term feeding tube was seen far more commonly in the Aboriginal population.

Keywords: head and neck cancer, enteral tube feeding, malnutrition, survival, relapse, Aboriginal patients

Procedia PDF Downloads 102
51669 Delayed Contralateral Prophylactic Mastectomy (CPM): Reasons and Rationale for Patients with Unilateral Breast Cancer

Authors: C. Soh, S. Muktar, C. M. Malata, J. R. Benson

Abstract:

Introduction: Reasons for requesting CPM include prevention of recurrence, peace of mind, and moving on after breast cancer. Some women seek CPM as a delayed procedure, but the factors influencing this are poorly understood. Methods: A retrospective analysis examined patients undergoing CPM as either an immediate or delayed procedure, with or without breast reconstruction (BR), between January 2009 and December 2019. A cross-sectional survey based on validated questionnaires (5-point Likert scale) explored patients' decision-making in terms of the timing of CPM and any BR. Results: A total of 123 patients with unilateral breast cancer underwent CPM, with 39 (32.5%) delayed procedures with or without BR. The response rate among patients receiving questionnaires (n=33) was 22/33 (66%). Within this delayed CPM cohort were three reconstructive scenarios: (1) unilateral immediate BR with CPM (n=12); (2) delayed CPM with concomitant bilateral BR (n=22); (3) delayed bilateral BR after delayed CPM (n=3). Two patients had delayed CPM without BR. The most common reason for delayed CPM was to complete all cancer treatments (including radiotherapy) before surgery on the unaffected breast (score 2.91). The second was unavailability of genetic test results at the time of therapeutic mastectomy (score 2.64), whilst the third most cited reason was a subsequent change in family cancer history. Conclusion: Factors for delayed CPM are patient-driven, with few women spontaneously changing their mind having initially decided against immediate CPM for reasons also including surgical duration. CPM should be offered as a potentially delayed option, with informed discussion of risks and benefits.

Keywords: breast cancer, CPM, prophylactic, rationale

Procedia PDF Downloads 112
51668 Effect of Co-Infection with Intestinal Parasites on COVID-19 Severity: A Prospective Observational Cohort Study

Authors: Teklay Gebrecherkos, Dawit Wolday, Muhamud Abdulkader

Abstract:

Background: COVID-19 symptomatology in Africa appears significantly less serious than in the industrialized world. We hypothesized that a differently regulated, more activated immune system resulting from parasite infections contributes to milder COVID-19 outcomes, and we investigated this hypothesis in an endemic area of sub-Saharan Africa. Methods: Ethiopian COVID-19 patients were enrolled and screened for intestinal parasites between July 2020 and March 2021. The primary outcome was the proportion of patients with severe COVID-19. SARS-CoV-2 infection was confirmed by RT-PCR on nasopharyngeal swab samples, while direct microscopic examination, the modified Ritchie concentration method, and the Kato-Katz method were used to identify parasites and ova from a fresh stool sample. Ordinal logistic regression models were used to estimate the association between parasite infection and COVID-19 severity, adjusted for sex, age, residence, education level, occupation, body mass index, and comorbidities. Data were analyzed using STATA version 14; a p-value <0.05 was considered statistically significant. Results: A total of 751 SARS-CoV-2-infected patients were enrolled, of whom 284 (37.8%) had an intestinal parasitic infection. Only 27/255 (10.6%) severe COVID-19 patients were co-infected with intestinal parasites, while 257/496 (51.8%) non-severe COVID-19 patients were parasite-positive (p<0.0001). Patients co-infected with parasites had lower odds of developing severe COVID-19, with an adjusted odds ratio (AOR) of 0.14 (95% CI 0.09-0.24; p<0.0001) for all parasites, 0.20 (95% CI 0.11-0.38; p<0.0001) for protozoa, and 0.13 (95% CI 0.07-0.26; p<0.0001) for helminths. When stratified by species, co-infection with Entamoeba spp., Hymenolepis nana, and Schistosoma mansoni was associated with a lower probability of developing severe COVID-19. There were 11 deaths (1.5%), all among patients without parasites (p=0.009).
Conclusions: Parasite co-infection is associated with a reduced risk of severe COVID-19 in African patients. Parasite-driven immunomodulatory responses may mute the hyper-inflammation associated with severe COVID-19.
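
The adjusted odds ratios above come from ordinal logistic regression; as a rough consistency check, a crude (unadjusted) odds ratio can be computed from the 2x2 counts reported in the abstract:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with a Woolf (log-normal) 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-1.96 * se_log), or_ * math.exp(1.96 * se_log)

# Counts from the abstract: 27 of 255 severe and 257 of 496 non-severe
# patients were parasite-positive.
crude_or, lo, hi = odds_ratio(27, 257, 255 - 27, 496 - 257)
print(f"crude OR = {crude_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude OR comes out near 0.11, in the same direction as (and close to) the adjusted value of 0.14 reported by the authors.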

Keywords: COVID-19, SARS-COV-2, intestinal parasite, RT-PCR, co-infection

Procedia PDF Downloads 61
51667 Additional Usage of Remdesivir with the Standard of Care in Patients with Moderate and Severe COVID-19: A Tertiary Hospital's Experience

Authors: Pugazhenthan Thangaraju

Abstract:

Background: Since the pandemic began, millions of people have become infected with COVID-19. Globally, researchers are working on safe and effective treatments for this disease. Remdesivir is a drug that has been approved for the treatment of COVID-19, but many aspects that may influence its future use are still being evaluated. Aim: To assess the safety and efficacy of remdesivir in hospitalized adult patients diagnosed with moderate and severe COVID-19. Methods: This was a record-based retrospective cohort study conducted between April 1st, 2020 and June 30th, 2021 at the All India Institute of Medical Sciences (AIIMS), Raipur, a tertiary care teaching hospital. Results: There were a total of 10,559 medical records of COVID-19 patients, of which 1,034 were included in this study. Overall, irrespective of survival status, a statistically significant difference was observed between the WHO scores at admission and at discharge. Clinical improvement among survivors was statistically significant. Conclusion: Evidence of remdesivir's efficacy against coronaviruses has so far been largely limited to in vitro studies and animal models; however, information about COVID-19 is rapidly expanding, and several clinical trials of remdesivir for the treatment of COVID-19 are now underway. The findings of this study support remdesivir as a promising agent in the fight against SARS-CoV-2.

Keywords: Remdesivir, COVID-19, SARS-CoV-2, antiviral, RNA-dependent RNA polymerase, viral pneumonia

Procedia PDF Downloads 65
51666 C-Spine Imaging in a Non-Trauma Centre: Compliance with NEXUS Criteria Audit

Authors: Andrew White, Abigail Lowe, Kory Watkins, Hamed Akhlaghi, Nicole Winter

Abstract:

The timing and appropriateness of diagnostic imaging are critical to the evaluation and management of traumatic injuries. Within the subclass of trauma patients, the prevalence of c-spine injury is less than 4%; however, the incidence of delayed diagnosis within this cohort has been documented as up to 20%, with inadequate radiological examination the most commonly cited cause. To identify patients in whom c-spine injury cannot be fully excluded on clinical examination alone, and who should therefore undergo diagnostic imaging, a set of criteria is used to provide clinical guidance. The NEXUS (National Emergency X-Radiography Utilisation Study) criteria are a validated clinical decision-making tool used to facilitate selective c-spine radiography, allowing clinicians to determine whether cervical spine imaging can be safely avoided in appropriate patients. The NEXUS criteria are widely used within the emergency department setting, given their ease of use and relatively straightforward application, and they appear in the Victorian State Trauma System's guidelines. This audit used retrospective data collection to examine the concordance of c-spine imaging in trauma patients with the NEXUS criteria and to assess compliance with state guidance on diagnostic imaging in trauma. Of the 183 patients who presented with trauma to the head, neck, or face (244 excluded due to incorrect triage), 98 did not undergo imaging of the c-spine. Of those 98, 44% fulfilled at least one of the NEXUS criteria, meaning the c-spine could not be clinically cleared as per the current guidelines. The criterion most commonly met was intoxication, comprising 42% (18 of 43); midline spinal tenderness (or absent documentation of it) was second with 23% (10 of 43). Intoxication being the most commonly met criterion is significant but not unexpected, given the cohort of patients seen at St Vincent's and within many emergency departments generally.
Given that these patients will always meet the NEXUS criteria, an element of clinical judgment is likely needed, or concurrent use of the Canadian C-Spine Rule to exclude the need for imaging. Midline tenderness, when met, was often in the context of poor or absent documentation, emphasizing the importance of clear and accurate assessments. A distracting injury was identified in 7 of the 43 patients; however, only one of these patients exhibited a thoracic injury (T11 compression fracture), the remainder comprising injuries to the extremities. Some studies suggest that c-spine imaging may not be required in the evaluable blunt trauma patient despite distracting injuries in body regions that do not involve the upper chest, which emphasizes the need for standardised definitions of distracting injury, at least at a departmental or regional level. The data highlight the currently poor application of the NEXUS guidelines, with likely common themes throughout emergency departments, underlining the need for further education on implementation and for potential refinement and clarification of the criteria. Of note, there appeared to be no significant difference across levels of clinician experience in inappropriately clearing the c-spine against the guidelines.
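
The NEXUS rule the audit measures compliance against is a simple any-of-five check: imaging can be avoided only if no criterion is met. A sketch (the criterion strings are paraphrased, not official wording):

```python
NEXUS_CRITERIA = (
    "midline cervical tenderness",
    "focal neurological deficit",
    "altered level of alertness",
    "intoxication",
    "painful distracting injury",
)

def can_clear_c_spine(findings):
    """Under NEXUS, imaging can be avoided only when the patient meets
    NONE of the five criteria; any single positive finding (or missing
    documentation, which the audit treats as positive) mandates imaging."""
    return not any(criterion in findings for criterion in NEXUS_CRITERIA)
```

This structure also explains the audit's central observation: an intoxicated patient can never be cleared clinically under NEXUS, regardless of the other findings.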

Keywords: imaging, guidelines, emergency medicine, audit

Procedia PDF Downloads 72
51665 The Use of Coronary Calcium Scanning for Cholesterol Assessment and Management

Authors: Eva Kirzner

Abstract:

Based on outcome studies published over the past two decades, the ACC/AHA published new guidelines in 2018 for the management of hypercholesterolemia that incorporate coronary artery calcium (CAC) scanning as a decision tool for ascertaining which patients may benefit from statin therapy. This use rests on the recognition that the absence of calcium on CAC scanning (i.e., a CAC score of zero) usually signifies the absence of significant atherosclerotic deposits in the coronary arteries. Specifically, in patients at high risk for atherosclerotic cardiovascular disease (ASCVD), initiation of statin therapy is generally recommended to decrease ASCVD risk, whereas among patients at intermediate ASCVD risk, the need for statin therapy is less certain. There remains a need for outcome studies providing evidence that managing hypercholesterolemia according to these new ACC/AHA recommendations is safe for patients. A PubMed and Google Scholar literature search identified four relevant population- or patient-based cohort studies, published between 2017 and 2021, that examined the relationship between CAC scanning, risk assessment or mortality, and statin therapy. In each of these studies, patients were assessed for their baseline ASCVD risk using the Pooled Cohort Equations (PCE), an ACC/AHA calculator that estimates risk from age, sex, ethnicity, and coronary artery disease risk factors. The combined findings of these four studies provided concordant evidence that a zero CAC score defines patients who remain at low clinical risk despite the non-use of statin therapy. These new studies thus confirm CAC scanning as a safe tool for reducing the potential overuse of statin therapy among patients with zero CAC scores.
Incorporating these new data suggests the following best practice: (1) ascertain ASCVD risk according to the PCE in all patients; (2) among patients with elevated ASCVD risk, after an initial trial of risk reduction with optimal diet, initiate statin therapy for those with a high ASCVD risk score; (3) if the ASCVD score is intermediate, refer the patient for CAC scanning; and (4) if the CAC score is zero in an intermediate-risk patient, statin therapy can be safely withheld despite the presence of an elevated serum cholesterol level.
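
The four-step best practice above is essentially a small decision tree; a sketch, with the category labels and return strings chosen for illustration rather than taken from the guidelines:

```python
def statin_decision(ascvd_risk, cac_score=None):
    """Decision sketch for steps (1)-(4): ascvd_risk is 'low',
    'intermediate', or 'high' per the PCE; cac_score is the CAC
    result if a scan has been done, else None."""
    if ascvd_risk == "high":
        return "initiate statin therapy"
    if ascvd_risk == "intermediate":
        if cac_score is None:
            return "refer for CAC scanning"
        return "withhold statin" if cac_score == 0 else "consider statin therapy"
    return "statin not indicated"
```

The key branch is the intermediate-risk one: a zero CAC score converts an uncertain indication into a safe withhold, which is the overuse-reduction the four cohort studies support.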

Keywords: cholesterol, cardiovascular disease, statin therapy, coronary calcium

Procedia PDF Downloads 115
51664 Assessing the Survival Time of Hospitalized Patients in Eastern Ethiopia During 2019–2020 Using the Bayesian Approach: A Retrospective Cohort Study

Authors: Chalachew Gashu, Yoseph Kassa, Habtamu Geremew, Mengestie Mulugeta

Abstract:

Background and Aims: Severe acute malnutrition remains a significant health challenge, particularly in low- and middle-income countries. The aim of this study was to determine the survival time of under-five children with severe acute malnutrition. Methods: A retrospective cohort study was conducted on under-five children with severe acute malnutrition, including 322 inpatients admitted to Chiro Hospital in Chiro, Ethiopia, between September 2019 and August 2020, whose data were obtained from medical records. Survival functions were analyzed using Kaplan-Meier plots and log-rank tests. Survival time was further analyzed using the Cox proportional hazards model and Bayesian parametric survival models, employing integrated nested Laplace approximation (INLA) methods. Results: Among the 322 patients, 118 (36.6%) died of severe acute malnutrition. The estimated median survival time for inpatients was 2 weeks. Model selection criteria favored the Bayesian Weibull accelerated failure time model, which demonstrated that age, body temperature, pulse rate, nasogastric (NG) tube usage, hypoglycemia, anemia, diarrhea, dehydration, malaria, and pneumonia significantly influenced survival time. Conclusions: Children below 24 months of age and those with altered body temperature or pulse rate, NG tube usage, hypoglycemia, or comorbidities such as anemia, diarrhea, dehydration, malaria, and pneumonia had shorter survival times. To reduce deaths among children under 5 years of age, community-based management of acute malnutrition should be designed to ensure early detection and to improve access and coverage for malnourished children.
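
The Kaplan-Meier survival functions and median survival time reported above can be estimated in a few lines; the sketch below uses toy follow-up data in weeks, not the study's records:

```python
def km_curve(times, events):
    """Minimal Kaplan-Meier estimator. times: follow-up in weeks;
    events: 1 = death observed, 0 = censored. Returns the list of
    (event time, survival probability) steps."""
    s, curve = 1.0, []
    for t in sorted({ti for ti, ei in zip(times, events) if ei}):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        at_risk = sum(1 for ti in times if ti >= t)
        s *= 1 - deaths / at_risk
        curve.append((t, s))
    return curve

def median_survival(curve):
    """Earliest time at which the survival probability falls to 0.5 or below."""
    return next((t for t, s in curve if s <= 0.5), None)
```

The study's further modeling (Cox proportional hazards, Bayesian Weibull AFT via INLA) builds on the same time/event data but is fitted with dedicated statistical software rather than a hand-rolled estimator like this.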

Keywords: Bayesian analysis, severe acute malnutrition, survival data analysis, survival time

Procedia PDF Downloads 50
51663 Operating Model of Obstructive Sleep Apnea Patients in North Karelia Central Hospital

Authors: L. Korpinen, T. Kava, I. Salmi

Abstract:

This study aimed to describe the operating model for obstructive sleep apnea at North Karelia Central Hospital. Due to the large number of patients, the role of nurses in the diagnosis and treatment of sleep apnea was important; pulmonary physicians met only a minority of the patients. The 2018 sleep apnea studies included about 800 patients, of whom about 28% had normal findings and 180 were classified as severe (apnea-hypopnea index [AHI] over 30). The operating model has proven workable and appropriate. Patients understand well that they may not be referred to a pulmonary doctor. However, specialized medical follow-up of professional drivers continues every year.

Keywords: sleep, apnea patient, operating model, hospital

Procedia PDF Downloads 132
51662 A Comparative Analysis of Survival in Patients with Node-Positive Cutaneous Head and Neck Squamous Cell Carcinoma as per TNM 7th and TNM 8th Editions

Authors: Petr Daniel Edward Kovarik, Malcolm Jackson, Charles Kelly, Rahul Patil, Shahid Iqbal

Abstract:

Introduction: Recognition of the presence of extracapsular spread (ECS) is a major change in the TNM 8th edition, published by the American Joint Committee on Cancer in 2018. Irrespective of the size or number of lymph nodes, the presence of ECS constitutes pN3b, i.e., stage IV disease. The objective of this retrospective observational study was to compare survival outcomes in patients with lymph-node-positive cutaneous head and neck squamous cell carcinoma (CHNSCC) classified by the TNM 7th and TNM 8th editions. Materials and Methods: From January 2010 to December 2020, 71 patients with CHNSCC treated with radical surgery and adjuvant radiotherapy were identified from our centre's database. All histopathological reports were reviewed, and comprehensive nodal mapping was performed. Data were collected retrospectively, and survival outcomes were compared using the TNM 7th and 8th editions. Results: The median age of the 71 patients was 78 years (range 54-94); 63 were male and 8 female. In total, 2,246 lymph nodes were analysed, of which 195 were positive for cancer. ECS was present in 130 lymph nodes, which led to a change in TNM staging. N stages as per the TNM 7th edition were as follows: pN1 = 23, pN2a = 14, pN2b = 32, pN2c = 0, pN3 = 2. After applying the TNM 8th edition criterion (presence of ECS), N stages were: pN1 = 6, pN2a = 5, pN2b = 3, pN2c = 0, pN3a = 0, pN3b = 57, representing an increase in overall stage. According to the TNM 7th edition, 23 patients had stage III disease and the remaining 48 stage IV; as per the TNM 8th edition, only 6 patients had stage III compared with 65 with stage IV. For all patients, 2-year disease-specific survival (DSS) and overall survival (OS) were 70% and 46%, and 5-year DSS and OS were 66% and 20%, respectively.
Comparing survival between stage III and stage IV using both the TNM 7th and 8th editions, there is a clearly greater survival difference between the stages when TNM 8th staging is used. However, meaningful statistics were not possible, as the majority of patients (n = 65) had stage IV disease and only 6 had stage III in the TNM 8th cohort. Conclusion: Our study provides a comprehensive analysis of lymph node mapping in this specific patient population. It shows better differentiation between stage III and stage IV under the TNM 8th edition than the 7th; however, meaningful statistics were not possible due to the imbalance between the sub-cohorts.
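The TNM 8th edition reclassification driving these results (any ECS-positive node becomes pN3b, hence stage IV) can be sketched as a simplified staging function; the non-ECS branches below compress the size/number rules and are illustrative, not the complete AJCC table:

```python
def n_stage_tnm8(nodes, max_size_cm, ecs_present, contralateral=False):
    """Simplified pathological N staging under the TNM 8th edition:
    any ECS-positive node is pN3b (stage IV) regardless of node size
    or number; otherwise fall through abbreviated size/number rules."""
    if ecs_present:
        return "pN3b"
    if max_size_cm > 6:
        return "pN3a"
    if contralateral:
        return "pN2c"
    if nodes == 1:
        return "pN1" if max_size_cm <= 3 else "pN2a"
    return "pN2b"
```

Because the ECS check takes precedence over everything else, a single small ECS-positive node that was pN1 under TNM 7 jumps straight to pN3b under TNM 8, which is exactly the migration (pN1: 23 to 6; pN3b: 0 to 57) the study observed.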

Keywords: cutaneous head and neck squamous cell carcinoma, extracapsular spread, neck lymphadenopathy, TNM 7th and 8th editions

Procedia PDF Downloads 107
51661 Pediatric Drug Resistance Tuberculosis Pattern, Side Effect Profile and Treatment Outcome: North India Experience

Authors: Sarika Gupta, Harshika Khanna, Ajay K Verma, Surya Kant

Abstract:

Background: Drug-resistant tuberculosis (DR-TB) is a growing challenge to global TB control efforts, and pediatric DR-TB is one of the neglected infectious diseases. In our previously published report, we noted an increased prevalence of DR-TB in the pediatric population at a tertiary health care centre in North India, estimated at 17.4%, 15.1%, 18.4%, and 20.3% in 2018, 2019, 2020, and 2021, respectively. Limited evidence exists about the pattern of drug resistance, the side effect profile, and the programmatic outcomes of pediatric DR-TB treatment; this study was therefore conducted to determine them. Methodology: This was a prospective cohort study conducted at the nodal drug-resistant tuberculosis centre of a tertiary care hospital in North India from January 2021 to December 2022. Subjects were children aged 0-18 years with a diagnosis of DR-TB based on GeneXpert (rifampicin [RIF] resistance detected), line probe assay, and drug sensitivity testing (DST) of M. tuberculosis (MTB) grown in culture of body fluids. Children were classified as having monoresistant TB, polyresistant TB (resistance to more than one first-line anti-TB drug, other than both INH and RIF), MDR-TB, pre-XDR-TB, or XDR-TB, as per the WHO classification. All patients were prescribed DR-TB treatment as per standard guidelines, either the shorter oral DR-TB regimen or a longer all-oral MDR/XDR-TB regimen (modified for children below five years of age), and were followed up monthly for side effects of treatment. Outcomes were categorized as good if the patient had completed treatment and was cured or improving during the course of treatment, and as bad if the patient died or was not improving. Results: Of the 50 pediatric patients included in the study, 34 (66.7%) were female and 16 (31.4%) were male.
Thirty-three patients (64.7%) had pulmonary TB, while 17 (33.3%) had extrapulmonary TB. The proportions of monoresistant TB, polyresistant TB, MDR-TB, pre-XDR-TB, and XDR-TB were 2.0%, 0%, 50.0%, 30.0%, and 18.0%, respectively. A good outcome was reported in 40 patients (80.0%); the 10 bad outcomes comprised 7 deaths (14%) and 3 children (6.0%) who were not improving. Adverse events (single or multiple) were reported in all patients, most of which were mild. The most common adverse events were metallic taste (16; 31.4%), rash and allergic reaction (15; 29.4%), nausea and vomiting (13; 26.0%), arthralgia (11; 21.6%), and alopecia (11; 21.6%). The serious adverse event of QTc prolongation was reported in 4 cases (7.8%), but neither arrhythmias nor symptomatic cardiac side effects occurred. Vestibular toxicity was reported in 2 (3.9%) and psychotic symptoms in 4 (7.8%). Hepatotoxicity, hypothyroidism, peripheral neuropathy, gynaecomastia, and amenorrhea were reported in 2 (4.0%), 4 (7.8%), 2 (3.9%), 1 (2.0%), and 2 (3.9%) patients, respectively. No drug needed to be withdrawn due to uncontrolled adverse events. Conclusion: Pediatric DR-TB treatment achieved favorable outcomes in a large proportion of children, and the regimen drugs were overall well tolerated in this cohort.

Keywords: pediatric, drug-resistant, tuberculosis, adverse events, treatment

Procedia PDF Downloads 66
51660 Text Mining Past Medical History in Electrophysiological Studies

Authors: Roni Ramon-Gonen, Amir Dori, Shahar Shelly

Abstract:

Background and objectives: Healthcare professionals produce abundant textual information in their daily clinical practice. Extracting insights from all the gathered information, which is mainly unstructured and lacking in normalization, is one of the major challenges in computational medicine. Text mining assembles different techniques to derive valuable insights from unstructured textual data, making it especially relevant in medicine. A neurological patient's history allows the clinician to define the patient's symptoms and, along with the results of the nerve conduction study (NCS) and electromyography (EMG) test, assists in formulating a differential diagnosis. Past medical history (PMH) helps to direct the latter. In this study, we aimed to identify relevant PMHs, understand which PMHs are common among patients in the referral cohort and documented by the medical staff, and examine differences by sex and age in a large cohort based on free-text notes. Methods: We retrospectively identified all patients with abnormal NCS between May 2016 and February 2022. Age, gender, and all NCS report attributes were recorded, including the summary text. All patients' histories were extracted from the text reports by a query. Basic text cleansing and data preparation were performed, as well as lemmatization. Highly frequent words (like 'left' and 'right') were removed, and several terms were replaced with their abbreviations. A bag-of-words approach was used to perform the analyses. Visualizations common in text analysis were created to easily grasp the results. Results: We identified 5282 unique patients. Three thousand and five (57%) patients had a documented PMH, of whom 60.4% (n=1817) were males. The overall median age was 62 years (range 0.12-97.2 years), and the majority of patients (83%) presented after the age of forty. The top two documented medical histories were diabetes mellitus (DM) and surgery. 
DM was observed in 16.3% of the patients, and surgery in 15.4%. Other frequent patient histories (among the top 20) were fracture, cancer (ca), motor vehicle accident (MVA), leg, lumbar, discopathy, back and carpal tunnel release (CTR). When separating the data by sex, DM and MVA were more frequent among males, while cancer and CTR were less frequent. The top medical history in females was surgery, followed by DM; other frequent histories among females were breast cancer, fractures and CTR. In the younger population (ages 18 to 26), the most frequent PMHs were surgery, fractures, trauma and MVA. Discussion: By applying text mining approaches to unstructured data, we were able to better understand which medical histories are most relevant in these circumstances and to gain additional insights regarding sex and age differences. These insights might help to collect epidemiological and demographic data as well as raise new hypotheses. One limitation of this work is that each clinician might use different words or abbreviations to describe the same condition, so using a coding system could be beneficial.
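The pipeline the authors describe (text cleansing, abbreviation substitution, removal of very frequent words, bag-of-words counting) can be sketched in a few lines of standard-library Python. The stop-word list, abbreviation map, and sample reports below are hypothetical illustrations, not the study's actual configuration:

```python
import re
from collections import Counter

# Hypothetical mini-pipeline mirroring the described steps.
STOP_WORDS = {"left", "right", "of", "the", "and", "with"}  # very frequent words to drop
ABBREVIATIONS = {"diabetes mellitus": "dm",
                 "carpal tunnel release": "ctr",
                 "motor vehicle accident": "mva"}

def bag_of_words(reports):
    """Count token frequencies across a list of free-text PMH notes."""
    counts = Counter()
    for text in reports:
        text = text.lower()
        for phrase, abbr in ABBREVIATIONS.items():   # normalize multi-word terms
            text = text.replace(phrase, abbr)
        tokens = re.findall(r"[a-z]+", text)          # basic cleansing/tokenization
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return counts

reports = ["PMH: diabetes mellitus, left carpal tunnel release",
           "PMH: motor vehicle accident, fracture of the right leg"]
print(bag_of_words(reports).most_common(3))
```

A production version would add lemmatization (e.g. via an NLP library) before counting, as the abstract notes.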

Keywords: abnormal studies, healthcare analytics, medical history, nerve conduction studies, text mining, textual analysis

Procedia PDF Downloads 96
51659 New Thromboprophylaxis Regime for Knee Arthroplasties

Authors: H. Noureddine, P. Rao, R. Guru, A. Chandratreya

Abstract:

The NICE guidance for elective total knee replacement states that patients should be given mechanical thromboprophylaxis and, if there are no contraindications, chemical thromboprophylaxis in the form of dabigatran etexilate, rivaroxaban, UFH, LMWH or fondaparinux sodium (CG92, 1.5.14, January 2010). In practice, administering oral agents has been dominant, as it reduces nursing needs, shortens hospital stay and is generally better received by patients. However, these agents carry well-documented bleeding risks, and their effects are difficult to reverse in case of major bleeding. Our experience with oral factor Xa inhibitors used for thromboprophylaxis was marked by several patients developing complications necessitating a return to theatre for wound washouts. This led us to try a different thromboprophylaxis protocol, which we applied to our patients undergoing total and unicondylar knee replacements. We applied mechanical thromboprophylaxis in the form of intermittent pneumatic compression devices, and chemical thromboprophylaxis in the form of a prophylactic dose of LMWH pre-operatively, followed by 150 mg of aspirin starting 24 hours after surgery and continuing for 6 weeks, alongside GI cover with PPIs or antihistamines. We also administered local anaesthetics intra-operatively in line with the ERAS protocol, thus encouraging early mobilization. We identified a cohort of 133 patients who underwent one of the aforementioned procedures in the same trust, by the same surgeon, under this protocol, and examined their medical notes retrospectively with a mean follow-up of 14 months to identify the rate and percentage of patients who had thromboembolic events in the post-operative period.

Keywords: aspirin, heparin, knee arthroplasty, thromboprophylaxis

Procedia PDF Downloads 369
51658 Psychological Distress and Quality of Life in Inflammatory Bowel Disease Patients: The Role of Dispositional Mindfulness

Authors: Kelly E. Tow, Peter Caputi, Claudia Rogge, Thomas Lee, Simon R. Knowles

Abstract:

Inflammatory Bowel Disease (IBD) is a serious chronic health condition characterised by inflammation of the gastrointestinal tract. Individuals with active IBD experience severe abdominal symptoms, which can adversely impact their physical and mental health as well as their quality of life (QoL). Given that stress may exacerbate IBD symptoms and is frequently highlighted as a contributing factor in the development of psychological difficulties and poorer QoL, it is vital to investigate stress-management strategies aimed at improving the lives of those with IBD. The present study extends the limited research in IBD cohorts by exploring the role of dispositional mindfulness and its impact on psychological well-being and QoL. The study examined how disease activity and dispositional mindfulness were related to psychological distress and QoL in a cohort of IBD patients. The potential role of dispositional mindfulness as a moderator between stress and anxiety, depression and QoL in these individuals was also examined. Participants included 47 patients with a clinical diagnosis of IBD. Each patient completed a series of psychological questionnaires and was assessed by a gastroenterologist to determine their disease activity level. Correlation analyses indicated that disease activity was not significantly related to psychological distress or QoL in the sample of IBD patients. However, dispositional mindfulness was inversely related to psychological distress and positively related to QoL. Furthermore, moderation analyses demonstrated a significant interaction between stress and dispositional mindfulness on anxiety. These findings demonstrate that increased levels of dispositional mindfulness may be beneficial for individuals with IBD. 
Specifically, the results indicate positive links between dispositional mindfulness, general psychological well-being and QoL, and suggest that dispositional mindfulness may attenuate the negative impacts of stress on levels of anxiety in IBD patients. While further research is required to validate and expand on these findings, the current study highlights the importance of addressing psychological factors in IBD and indicates support for the use of mindfulness-based interventions for patients with the disease.

Keywords: anxiety, depression, dispositional mindfulness, inflammatory bowel disease, quality of life, stress

Procedia PDF Downloads 159
51657 Prevalence of Breast Cancer Molecular Subtypes at a Tertiary Cancer Institute

Authors: Nahush Modak, Meena Pangarkar, Anand Pathak, Ankita Tamhane

Abstract:

Background: Breast cancer is the leading cause of cancer and cancer mortality among women. This study presents a statistical analysis of a cohort of over 250 patients with breast cancer whose molecular subtypes were determined by immunohistochemistry (IHC) using ER, PR, HER2 and Ki-67 antibodies. Materials and methods: Formalin-fixed, paraffin-embedded tissue samples were obtained surgically, and standard protocols were followed for fixation, grossing, tissue processing, embedding, cutting and IHC. The Ventana BenchMark XT machine was used for automated IHC of the samples. Antibodies were supplied by F. Hoffmann-La Roche Ltd. Statistical analysis was performed using SPSS for Windows; the tests performed were the chi-squared test and correlation tests at p<.01. The raw data were collected and provided by the National Cancer Institute, Jamtha, India. Result: A chi-squared test of homogeneity of distribution showed that Luminal B was the most prevalent molecular subtype of breast cancer at our institute. A worse prognosis in breast cancer is indicated by the expression of Ki-67 and HER2 protein in cancerous cells, and at p<.01 a significant dependence between the two was observed in our data. Molecular subtype did not depend on age; similarly, age was independent of Ki-67 expression. A chi-squared test on the human epidermal growth factor receptor 2 (HER2) statuses of patients showed a strong dependence between the percentage of Ki-67 expression and HER2 (+/-) status, i.e., the Ki-67 value depends upon HER2 expression in cancerous cells (p<.01). Surprisingly, dependence was also observed between Ki-67 and PR at p<.01, suggesting that progesterone receptor (PR) proteins are over-expressed when the expression of Ki-67 protein is elevated. 
Conclusion: We conclude that Luminal B is the most prevalent molecular subtype at the National Cancer Institute, Jamtha, India. No significant correlation was found between age and Ki-67 expression in any molecular subtype, and no dependence or correlation exists between patients' age and molecular subtype. We also found that, among the cohort of 257 patients, no patient diagnosed with Luminal A disease showed a Ki-67 value >14%. Statistically, highly significant dependence of PR+HER2- and PR-HER2+ scores on Ki-67 expression was observed (p<.01). HER2 is an important prognostic factor in breast cancer, and the chi-squared test for HER2 and Ki-67 shows that Ki-67 expression depends upon HER2 status. Moreover, Ki-67 cannot be used as a standalone prognostic factor in determining breast cancer prognosis.
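The dependence tests reported above are chi-squared tests of independence on contingency tables (e.g. HER2 status versus high/low Ki-67). A minimal hand-rolled sketch of the statistic follows; the 2x2 counts are invented for illustration and are not the study's data:

```python
# Chi-squared statistic for a contingency table: sum over cells of
# (observed - expected)^2 / expected, where expected = row_total * col_total / N.
def chi_squared(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            expected = r * c / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = HER2-positive / HER2-negative,
# columns = high Ki-67 / low Ki-67.
table = [[30, 10],
         [15, 45]]
print(round(chi_squared(table), 2))  # prints 24.24
```

For a 2x2 table there is one degree of freedom, so a statistic above 6.63 is significant at p<.01; in practice one would use a library routine (e.g. `scipy.stats.chi2_contingency`) that also returns the p-value.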

Keywords: breast cancer molecular subtypes, correlation, immunohistochemistry, Ki-67 and HR, statistical analysis

Procedia PDF Downloads 123
51656 Hypoglycemic Coma in Elderly Patients with Diabetes Mellitus

Authors: D. Furuya, H. Ryujin, S. Takahira, Y. Sekine, Y. Oya, K. Sonoda, H. Ogawa, Y. Nomura, R. Maruyama, H. Kim, T. Kudo, A. Nakano, T. Saruta, S. Sugita, M. Nemoto, N. Tanahashi

Abstract:

Purpose: To study the clinical characteristics of hypoglycemic coma in adult patients with type 1 or type 2 diabetes mellitus (DM). Methods: Participants in this retrospective study comprised 91 patients (54 men, 37 women; mean age ± standard deviation, 71.5 ± 12.6 years; range, 42-97 years) brought to our emergency department by ambulance with disturbance of consciousness over the 7 years from April 2007 to March 2014. Patients with hypoglycemia caused by alcoholic ketoacidosis, nutritional disorders, malignancies or psychological disorders were excluded. Results: Patients with type 1 (8 of 91) or type 2 DM (83 of 91) were analyzed. The mean blood sugar level across all patients was 31.6 ± 10.4. A sulfonylurea (SU) was more commonly used in elderly patients (>75 years old; n=44; 70.5%) than in younger patients (36.2%; p < 0.05). Cases with prolonged unconsciousness (range, 1 hour to 21 days; n=30) included significantly more patients with dementia (13.3% vs. 0.5% without dementia; p < 0.05) and fewer patients with type 1 DM (0% vs. 13.1% in type 2 DM; p < 0.05). Specialists in DM (n=33) used SUs less often (24.2%) than general physicians (69.0%; p < 0.05). Conclusion: In cases of hypoglycemic coma, SUs were frequently used in elderly patients with DM.

Keywords: hypoglycemic coma, diabetes mellitus, unconsciousness, elderly patients

Procedia PDF Downloads 490
51655 Modern Cardiac Surgical Outcomes in Nonagenarians: A Multicentre Retrospective Observational Study

Authors: Laurence Weinberg, Dominic Walpole, Dong-Kyu Lee, Michael D’Silva, Jian W. Chan, Lachlan F. Miles, Bradley Carp, Adam Wells, Tuck S. Ngun, Siven Seevanayagam, George Matalanis, Ziauddin Ansari, Rinaldo Bellomo, Michael Yii

Abstract:

Background: There have been multiple recent advancements in the selection, optimization and management of cardiac surgical patients. However, there are limited data regarding the outcomes of nonagenarians undergoing cardiac surgery, despite this vulnerable cohort increasingly receiving these interventions. This study describes the patient characteristics, management and outcomes of a group of nonagenarians undergoing cardiac surgery in the context of contemporary peri-operative care. Methods: A retrospective observational study was conducted of patients 90 to 99 years of age (i.e., nonagenarians) who had undergone cardiac surgery requiring a classic median sternotomy (i.e., open-heart surgery). All operative indications were included. Patients who underwent minimally invasive surgery, transcatheter aortic valve implantation and thoracic aorta surgery were excluded. Data were collected from four hospitals in Victoria, Australia, over an 8-year period (January 2012 – December 2019). The primary objective was to assess six-month mortality in nonagenarians undergoing open-heart surgery and to evaluate the incidence and severity of postoperative complications using the Clavien-Dindo classification system. The secondary objective was to provide a detailed description of the characteristics and peri-operative management of this group. Results: A total of 12,358 adult patients underwent cardiac surgery at the study centers during the observation period, of whom 18 nonagenarians (0.15%) fulfilled the inclusion criteria. The median (IQR) [min-max] age was 91 years (90.0:91.8) [90-94] and 14 patients (78%) were men. Cardiovascular comorbidities, polypharmacy and frailty were common. The median (IQR) predicted in-hospital mortality by EuroSCORE II was 6.1% (4.1-14.5). All patients were optimized preoperatively by a multidisciplinary team of surgeons, cardiologists, geriatricians and anesthetists. All index surgeries were performed on cardiopulmonary bypass. 
Isolated coronary artery bypass grafting (CABG) and CABG with aortic valve replacement were the most common surgeries, performed in four and five patients, respectively. Half the study group underwent surgery involving two or more major procedures (e.g. CABG and valve replacement). Surgery was undertaken emergently in 44% of patients. All patients except one experienced at least one postoperative complication. The most common complications were acute kidney injury (72%), new atrial fibrillation (44%) and delirium (39%). The highest Clavien-Dindo complication grade was IIIb, occurring once each in three patients. Clavien-Dindo grade IIIa complications occurred in only one patient. The median (IQR) postoperative length of stay was 11.6 days (9.8:17.6). One patient was discharged home and all others to an inpatient rehabilitation facility. Three patients had an unplanned readmission within 30 days of discharge. All patients had follow-up to at least six months after surgery, and mortality over this period was zero. The median (IQR) duration of follow-up was 11.3 months (6.0:26.4), and there were no cases of mortality observed within the available follow-up records. Conclusion: In this group of nonagenarians undergoing cardiac surgery, postoperative six-month mortality was zero. Complications were common but generally of low severity. These findings support carefully selected nonagenarian patients being offered cardiac surgery in the context of contemporary, multidisciplinary perioperative care. Further studies are needed to assess longer-term mortality, functional status and quality of life outcomes in this vulnerable surgical cohort.

Keywords: cardiac surgery, mortality, nonagenarians, postoperative complications

Procedia PDF Downloads 119
51654 One-Stage Conversion of Adjustable Gastric Band to One-Anastomosis Gastric Bypass Versus Sleeve Gastrectomy: A Single-Center Experience With a Short and Mid-Term Follow-Up

Authors: Basma Hussein Abdelaziz Hassan, Kareem Kamel, Philobater Bahgat Adly Awad, Karim Fahmy

Abstract:

Background: The laparoscopic adjustable gastric band was one of the most commonly performed bariatric procedures over the last 8 years. However, the failure rate was very high, with approximately 60% of patients not achieving the desired weight loss, and most patients sought revisional surgery. We therefore compared two of the most common weight loss procedures performed nowadays: laparoscopic sleeve gastrectomy and laparoscopic one-anastomosis gastric bypass. Objective: To compare weight loss and postoperative outcomes among patients undergoing conversion to laparoscopic one-anastomosis gastric bypass (cOAGB) versus laparoscopic sleeve gastrectomy (cSG) after a failed laparoscopic adjustable gastric band (LAGB). Patients and Methods: A prospective cohort study was conducted from June 2020 to June 2022 at a single medical center and included 77 patients undergoing single-stage conversion to cOAGB or cSG. Patients were reassessed for weight loss, comorbidity remission and postoperative complications at 6, 12 and 18 months. Results: There were 77 patients with failed LAGB in our study. Group I comprised 43 patients who underwent cOAGB and Group II comprised 34 patients who underwent cSG. The mean age was 38.58 years in the cOAGB group and 39.47 years in the cSG group (p=0.389). Of the 77 patients, 10 (12.99%) were males and 67 (87.01%) were females. The mean body mass index (BMI) was 41.06 in the cOAGB group and 40.5 in the cSG group (p=0.042). The two groups were compared postoperatively with respect to EBWL%, BMI and comorbidity remission over 18 months of follow-up. BMI was calculated postoperatively at three visits. After 6 months of follow-up, the mean BMI was 34.34 in the cOAGB group and 35.47 in the cSG group (p=0.229). At 12 months, the mean BMI was 32.69 in the cOAGB group and 33.79 in the cSG group (p=0.2). 
Finally, after 18 months of follow-up, the mean BMI was 30.02 in the cOAGB group and 31.79 in the cSG group (p=0.001). The groups did not differ significantly at 6 and 12 months (p=0.229 and p=0.2, respectively), but patients who underwent cOAGB achieved a lower BMI at 18 months than those who underwent cSG. Regarding EBWL%, there was a statistically significant difference between the two groups. After 6 months of follow-up, the mean EBWL% was 35.9% in the cOAGB group and 33.14% in the cSG group. At 12 months, the mean EBWL% was 52.35 in the cOAGB group and 48.76 in the cSG group (p=0.045). Finally, after 18 months of follow-up, the mean EBWL% was 62.06 ±8.68 in the cOAGB group and 55.58 ±10.87 in the cSG group (p=0.005). Regarding comorbidity remission, diabetes mellitus remission was found in 22 (88%) patients in the cOAGB group and 10 (71.4%) patients in the cSG group (p=0.225). Hypertension remission was found in 20 (80%) patients in the cOAGB group and 14 (82.4%) patients in the cSG group (p=1). Dyslipidemia remission was found in 27 (87%) patients in the cOAGB group and 17 (70%) patients in the cSG group (p=0.18). Finally, GERD remission was found in 15 (88.2%) patients in the cOAGB group and 6 (60%) patients in the cSG group (p=0.47). There were no statistically significant differences between the two groups in postoperative outcomes. Conclusion: This study suggests that conversion of LAGB to either cOAGB or cSG can feasibly be performed in a single-stage operation. cOAGB showed significantly better weight loss results than cSG at mid-term follow-up, with no significant difference in postoperative complications or resolution of comorbidities. 
Therefore, cOAGB could provide a reliable alternative but needs to be substantiated in future long-term studies.
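The two outcome metrics reported above, BMI and percent excess body weight loss (EBWL%), are simple arithmetic. A minimal sketch follows; EBWL% conventionally measures weight lost as a fraction of the excess over an "ideal" weight, taken here as the weight at BMI 25, which is an assumption for illustration rather than a definition stated in the abstract:

```python
# BMI = weight (kg) / height (m)^2
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# EBWL% = 100 * (initial - current) / (initial - ideal),
# with ideal weight assumed to be the weight at BMI 25.
def ebwl_percent(initial_kg, current_kg, height_m):
    ideal_kg = 25 * height_m ** 2
    excess = initial_kg - ideal_kg
    return 100 * (initial_kg - current_kg) / excess

print(round(bmi(110, 1.65), 1))               # prints 40.4
print(round(ebwl_percent(110, 85, 1.65), 1))  # prints 59.6
```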

Keywords: laparoscopic, gastric banding, one-anastomosis gastric bypass, sleeve gastrectomy, revisional surgery, weight loss

Procedia PDF Downloads 63
51653 Illness Representations of Injury: A Comparison of Patients and Their Primary Caregivers

Authors: Bih-O Lee, Hsiu-Wan Hsieh, Hsiu-Chen Liu, Mer Yu Pan

Abstract:

Background: Illness perceptions develop when people face health-threatening situations. Previous research suggests that understanding discrepancies between the illness perceptions of patients and caregivers may help to improve the quality of health care. Objective: This study examined the differences between the illness perceptions of injured patients and those of their caregivers. Methods: A comparative study design was used. The study setting was the surgical wards of a teaching hospital in Taiwan. Participants were 127 pairs of injured patients and their caregivers. Participants provided socio-demographic data and completed the Chinese Illness Perception Questionnaire Revised-Trauma, which comprises eight subscales. Clinical data on the injured patients were obtained from medical records. Results: This study found that injured patients were more pessimistic than their caregivers about the injury. There were significant differences between patients and caregivers insofar as patients perceived more physical symptoms, scored higher in terms of reasons for their injury, had more negative emotions and experienced more consequences than caregivers. Elderly caregivers, and caregivers of patients who were over 65, severely injured or admitted to an ICU, held more negative perceptions about the injury. Conclusions: This study indicated that patients and caregivers had negative illness representations several months after injury, although the intensity of their perceptions differed. Interventions should highlight the need to assist both patients and caregivers after injury.

Keywords: illness representations, injury, caregivers, comparative study

Procedia PDF Downloads 376
51652 Anemia Among Pregnant Women in Kuwait: Findings from Kuwait Birth Cohort Study

Authors: Majeda Hammoud

Abstract:

Background: Anemia during pregnancy increases the risk of delivery by cesarean section, low birth weight, preterm birth, perinatal mortality, stillbirth, and maternal mortality. In this study, we aimed to assess the prevalence of anemia in pregnant women and its associated factors in the Kuwait birth cohort study. Methods: The Kuwait birth cohort (N=1108) was a prospective cohort study in which pregnant women were recruited in the third trimester. Data were collected through personal interviews with mothers attending antenatal care visits, including data on socio-economic status and lifestyle factors. Blood samples were taken after recruitment to measure multiple laboratory indicators, and clinical data, including comorbidities, were extracted from the medical records by a clinician. Anemia was defined as a hemoglobin (Hb) level <110 g/L, further classified as mild (100-109 g/L), moderate (70-99 g/L) or severe (<70 g/L). Predictors of anemia were classified as underlying or direct factors, and logistic regression was used to investigate their association with anemia. Results: The mean Hb level in the study group was 115.21 g/L (95%CI: 114.56-115.87 g/L), with significant differences between age groups (p=0.034). The prevalence of anemia was 28.16% (95%CI: 25.53-30.91%), with no significant difference by age group (p=0.164). Of all 1108 pregnant women, 8.75% had moderate anemia and 19.40% had mild anemia; no pregnant woman had severe anemia. In multivariable analysis, becoming pregnant while using contraception (adjusted odds ratio [AOR] 1.73; 95%CI: 1.01-2.96; p=0.046) and current use of supplements (AOR 0.50; 95%CI: 0.26-0.95; p=0.035) were significantly associated with anemia among the underlying factors. Among the direct factors, only iron and ferritin levels were significantly associated with anemia (P<0.001). 
Conclusion: Although the prevalence of severe anemia is low among pregnant women in Kuwait, mild and moderate anemia remain a significant health problem despite free access to antenatal care.
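The grading rule stated in the methods (anemia = Hb <110 g/L; mild 100-109, moderate 70-99, severe <70) can be expressed directly as a small function. This is a sketch of the stated definition, not the study's own code:

```python
# Anemia grading per the thresholds given in the methods (Hb in g/L).
def classify_anemia(hb_g_per_l):
    if hb_g_per_l >= 110:
        return "none"
    if hb_g_per_l >= 100:
        return "mild"
    if hb_g_per_l >= 70:
        return "moderate"
    return "severe"

print(classify_anemia(105))  # prints mild
```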

Keywords: anemia, pregnancy, hemoglobin, ferritin

Procedia PDF Downloads 50
51651 Analgesia in Acute Traumatic Rib Fractures

Authors: A. Duncan, A. Blake, A. O'Gara, J. Fitzgerald

Abstract:

Introduction: Acute traumatic rib fractures carry significant morbidity and mortality and are a commonly seen injury in trauma patients. Rib fracture pain is often acute and can prove challenging to manage. We performed an audit of patients with acute traumatic rib fractures with the aim of composing a referral and treatment pathway for such patients. Methods: From January 2021 to January 2022, the pain medicine service encouraged early referral of all traumatic rib fractures to the pain service for a multi-modal management approach. A retrospective audit of analgesic management was performed on a select cohort of 24 patients with a mean age of 67, of whom 19 had unilateral rib fractures. Results: 17 of 24 patients (71%) underwent a regional block as part of a multi-modal analgesia regime. Only one regional complication was observed: hypotension in one patient with a thoracic epidural. The group who did not undergo a regional block had a length of stay (LOS) 17 days longer than those who did (27 vs. 10 days) and higher rates of pneumonia (29% vs. 18%). Conclusion: Early referral to pain specialists is an important component of the effective management of acute traumatic rib fractures. From our audit, it is evident that regional blocks can be used effectively in these cases as part of a multi-modal analgesia regime and may confer benefits in terms of respiratory complications and length of stay.

Keywords: rib fractures, regional blocks, thoracic epidural, erector spinae block

Procedia PDF Downloads 75
51650 Genetics of Atopic Dermatitis: Role of Cytokines Genes Polymorphisms

Authors: Ghaleb Bin Huraib, Fahad Al Harthi, Misbahul Arfin, Abdulrahman Al-Asmari

Abstract:

Atopic dermatitis (AD), also known as atopic eczema, is a chronic inflammatory skin disease characterized by severe itching and recurrent, relapsing eczema-like skin lesions, affecting up to 20% of children and 10% of adults in industrialized countries. AD is a complex multifactorial disease, and its exact etiology and pathogenesis have not been fully elucidated. The aim of this study was to investigate the impact of gene polymorphisms of the T helper cell subtype Th1 and Th2 cytokines interferon-gamma (IFN-γ), interleukin-6 (IL-6) and transforming growth factor beta 1 (TGF-β1) on AD susceptibility in a Saudi cohort. One hundred four unrelated patients with AD and 195 healthy controls were genotyped for the IFN-γ (874A/T), IL-6 (174G/C) and TGF-β1 (509C/T) polymorphisms using ARMS-PCR and PCR-RFLP techniques. The frequencies of genotypes AA and AT of IFN-γ (874A/T) differed significantly between patients and controls (P 0.001): genotype AT was increased while genotype AA was decreased in AD patients compared to controls. AD patients also had a higher frequency of T-containing genotypes (AT+TT) than controls (P = 0.001). The frequencies of alleles T and A also differed significantly between patients and controls (P = 0.04). The frequencies of genotype GG and allele G of IL-6 (174G/C) were significantly higher, while those of genotype GC and allele C were lower, in AD patients than in controls. There was no significant difference in the frequencies of alleles and genotypes of the TGF-β1 (509C/T) polymorphism between the patient and control groups. These results show that susceptibility to AD is influenced by the genotypes of the IFN-γ (874A/T) and IL-6 (174G/C) polymorphisms. 
It is concluded that the T allele and T-containing genotypes (AT+TT) of the IFN-γ (874A/T) polymorphism and the G allele and GG genotype of the IL-6 (174G/C) polymorphism are associated with susceptibility to AD in Saudis. On the other hand, the TGF-β1 (509C/T) polymorphism may not be associated with AD risk in the Saudi population; however, further studies with larger sample sizes are required to confirm these findings.

Keywords: atopic dermatitis, interferon-γ, interleukin-6, transforming growth factor-β1, polymorphism

Procedia PDF Downloads 118
51649 Hospital Malnutrition and its Impact on 30-day Mortality in Hospitalized General Medicine Patients in a Tertiary Hospital in South India

Authors: Vineet Agrawal, Deepanjali S., Medha R., Subitha L.

Abstract:

Background. Hospital malnutrition is a highly prevalent issue and is known to increase morbidity, mortality, length of hospital stay, and cost of care. In India, studies on hospital malnutrition have been restricted to ICU, post-surgical, and cancer patients. We designed this study to assess the impact of hospital malnutrition on in-hospital and 30-day post-discharge mortality in patients admitted to the general medicine department, irrespective of diagnosis. Methodology. All patients aged above 18 years admitted to the medicine wards, excluding medico-legal cases, were enrolled in the study. Nutritional assessment was done within 72 h of admission using the Subjective Global Assessment (SGA), which classifies patients into three categories: severely malnourished (category C), mildly/moderately malnourished (category B), and normal/well-nourished (category A). Anthropometric measurements including Body Mass Index (BMI), triceps skin-fold thickness (TSF), and mid-upper arm circumference (MUAC) were also performed. Patients were followed up during the hospital stay and 30 days after discharge through telephonic interview, and their final diagnosis, comorbidities, and cause of death were noted. Multivariate logistic regression and a Cox regression model were used to determine whether nutritional status at admission independently impacted mortality at one month. Results. The prevalence of malnourishment by SGA in our study was 67.3% among 395 hospitalized patients, of whom 155 (39.2%) were moderately malnourished and 111 (28.1%) were severely malnourished. Of the 395 patients, 61 (15.4%) died: 30 in hospital and 31 within 1 month of discharge. On univariate analysis, malnourished patients had significantly higher mortality (24.3% among the 111 category C patients) than well-nourished patients (10.1% among the 129 category A patients), with OR 9.17, p-value 0.007. 
On multivariate logistic regression, age and a higher Charlson Comorbidity Index (CCI) were independently associated with mortality. A higher CCI indicates a higher burden of comorbidities on admission, and the CCI in the expired group (mean 4.38) was significantly higher than that of the surviving cohort (mean 2.85). Though malnutrition contributed significantly to higher mortality on univariate analysis, it was not an independent predictor of outcome on multivariate logistic regression. Length of hospitalisation was also longer in the malnourished group (mean 9.4 days) than in the well-nourished group (mean 8.03 days), with a trend towards significance (p=0.061). None of the anthropometric measurements (BMI, MUAC, TSF) showed any association with mortality or length of hospitalisation. Inference. The results of our study highlight the issue of hospital malnutrition in medicine wards and reiterate that malnutrition contributes significantly to patient outcomes. We found that SGA performs better than anthropometric measurements in assessing under-nutrition. We are of the opinion that the heterogeneity of the study population by diagnosis was probably the primary reason why malnutrition by SGA was not found to be an independent risk factor for mortality. Strategies are needed to identify high-risk patients at admission and treat malnutrition in the hospital and post-discharge.

Keywords: hospitalization outcome, length of hospital stay, mortality, malnutrition, subjective global assessment (SGA)

Procedia PDF Downloads 150
51648 Clinical Features of Acute Aortic Dissection Patients Initially Diagnosed with ST-Segment Elevation Myocardial Infarction

Authors: Min Jee Lee, Young Sun Park, Shin Ahn, Chang Hwan Sohn, Dong Woo Seo, Jae Ho Lee, Yoon Seon Lee, Kyung Soo Lim, Won Young Kim

Abstract:

Background: Acute myocardial infarction (AMI) concomitant with acute aortic syndrome (AAS) is rare, but prompt recognition of concomitant AAS is crucial, especially in patients with ST-segment elevation myocardial infarction (STEMI), because misdiagnosis with early thrombolytic or anticoagulant treatment may result in catastrophic consequences. Objectives: This study investigated the clinical features of patients with STEMI concomitant with AAS that may provide diagnostic clues. Method: Between 1 January 2010 and 31 December 2014, 22 patients with an initial diagnosis of both acute coronary syndrome (AMI or unstable angina) and AAS (aortic dissection, intramural hematoma, or ruptured thoracic aneurysm) in our emergency department were reviewed. Among these, we excluded 10 patients who were transferred from other hospitals and 4 patients with non-STEMI, leaving a total of 8 patients with STEMI concomitant with AAS for analysis. Results: The mean age of the study patients was 57.5±16.31 years; five patients had Stanford type A and three had type B aortic dissection. Six patients had ST-segment elevation in anterior leads and two in inferior leads. Most patients had acute-onset, severe chest pain, but none had the tearing chest pain typical of dissection. Serum troponin I was elevated in three patients, but all patients had D-dimer elevation. Aortic regurgitation or regional wall motion abnormality was found in four patients. However, a widened mediastinum was seen in all study patients. Conclusion: When patients with STEMI have an elevated D-dimer and a widened mediastinum, concomitant AAS should be suspected.

Keywords: aortic dissection, myocardial infarction, ST-segment, d-dimer

Procedia PDF Downloads 398
51647 Management of Severe Asthma with Omalizumab in United Arab Emirates

Authors: Shanza Akram, Samir Salah, Imran Saleem, Jassim Abdou, Ashraf Al Zaabi

Abstract:

The estimated prevalence of asthma in the UAE is around 10% (900,000 people). Patients with persistent symptoms despite high-dose ICS plus a second controller, with or without oral steroids, are considered to have severe asthma. Omalizumab (Xolair) is an anti-IgE monoclonal antibody approved as add-on therapy for severe allergic asthma. The objectives of our study were to obtain baseline characteristics of our local cohort, to determine the efficacy of omalizumab based on clinical outcomes before and after 52 weeks of treatment, and to assess safety and tolerability. Medical records of patients receiving omalizumab therapy for asthma at Zayed Military Hospital, Abu Dhabi, were retrospectively reviewed. Patients fulfilling the criteria for severe allergic asthma as per GINA guidelines were included. Asthma control over the 12 months before and after omalizumab was analyzed, taking into account the number of exacerbations, hospitalizations, maintenance medication dosages, the need for reliever therapy, and PFTs. Twenty-one patients (5 females) with a mean age of 41 years were included. The mean duration of therapy was 22 months. Nineteen (91%) patients had allergic rhinitis/sinusitis. The mean serum total IgE level was 648 IU/ml (range 65-1859). Eleven (52%) patients were on oral maintenance steroids pre-treatment; 7 managed to stop steroids on treatment, while 4 were able to decrease the dosage. The mean exacerbation rate decreased from 5 per year pre-treatment to 1.36 on treatment. The number of hospitalizations decreased from a mean of 2 per year to 0.9 per year. Reliever inhaler usage decreased from a mean of 40 to 15 puffs per week. Two patients discontinued therapy: one due to lack of benefit (after 2 doses) and one due to severe persistent side effects. Patient compliance was poor in some cases. Treatment with omalizumab reduced the number of exacerbations, hospitalizations, and maintenance and reliever medications, and was generally well tolerated.
Our results show that there is room for improved documentation in terms of symptom recording and use of rescue medication at our institution. There is also need for better patient education and counseling in order to improve compliance.

Keywords: asthma, exacerbations, omalizumab, IgE

Procedia PDF Downloads 371
51646 HIV Incidence among Men Who Have Sex with Men Measured by Pooling Polymerase Chain Reaction, and Its Comparison with HIV Incidence Estimated by BED-Capture Enzyme-Linked Immunosorbent Assay and Observed in a Prospective Cohort

Authors: Mei Han, Jinkou Zhao, Yuan Yao, Liangui Feng, Xianbin Ding, Guohui Wu, Chao Zhou, Lin Ouyang, Rongrong Lu, Bo Zhang

Abstract:

To compare the HIV incidence estimated using the BED capture enzyme-linked immunosorbent assay (BED-CEIA) and observed in a cohort against the HIV incidence among men who have sex with men (MSM) measured by pooling polymerase chain reaction (pooling-PCR). A total of 617 MSM subjects were included in a respondent-driven sampling survey in Chongqing in 2008. Among the 129 who tested HIV antibody positive, 102 were classified as long-term infections and 27 were assessed for recent HIV infection (RHI) using BED-CEIA. The remaining 488 HIV-negative subjects were enrolled in the prospective cohort and followed up every 6 months to monitor HIV seroconversion. All 488 HIV-negative specimens were also assessed for acute HIV infection (AHI) using pooling-PCR. Among the 488 negative subjects in the open cohort, 214 (43.9%) were followed up for six months, contributing 107 person-years of observation, during which 14 subjects seroconverted. The observed HIV incidence was 12.5 per 100 person-years (95% CI=9.1-15.7). Among the 488 HIV-negative specimens, 5 were identified with acute HIV infection using pooling-PCR, corresponding to an annual rate of 14.02% (95% CI=1.73-26.30). The estimated HIV-1 incidence based on BED-CEIA was 12.02% (95% CI=7.49-16.56). The HIV incidence estimated with the three approaches differed among subgroups. In this highly HIV-prevalent MSM population, it cost US$ 1724 to detect one AHI case, while detecting one RHI case with the BED assay cost only US$ 42. The three approaches generated comparably high HIV incidences; pooling-PCR and the prospective cohort are closer to the true incidence, while BED-CEIA seems to be the most convenient and economical approach for evaluating HIV incidence in at-risk populations at the beginning of an HIV epidemic. HIV-1 incidence was alarmingly high among the MSM population in Chongqing, particularly in the subgroup under 25 years of age and among migrants aged 25 to 34 years.
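The cohort incidence above is a rate per 100 person-years of follow-up. A minimal sketch of that calculation, with a normal-approximation confidence interval and hypothetical counts (not the study's exact figures or CI method), is:

```python
import math

def incidence_per_100py(events, person_years):
    """Incidence rate per 100 person-years, with an approximate
    95% CI based on the Poisson standard error sqrt(events)."""
    rate = events / person_years * 100
    se = math.sqrt(events) / person_years * 100
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

# Hypothetical cohort: 12 seroconversions over 96 person-years of follow-up
rate, ci = incidence_per_100py(12, 96)
print(rate, ci)
```

With small event counts, an exact Poisson interval would be preferable to this normal approximation.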

Keywords: BED-CEIA, HIV, incidence, pooled PCR, prospective cohort

Procedia PDF Downloads 411
51645 Calcitonin Gene-Related Peptide Receptor Antagonists for Chronic Migraine: Real-World Outcomes

Authors: B. J. Mahen, N. E. Lloyd-Gale, S. Johnson, W. P. Rakowicz, M. J. Harris, A. D. Miller

Abstract:

Background: Migraine is a leading cause of disability worldwide. Calcitonin gene-related peptide (CGRP) receptor antagonists offer an approach to migraine prophylaxis by inhibiting the inflammatory and vasodilatory effects of CGRP. In recent years, NICE has approved the use of three CGRP-receptor antagonists: fremanezumab, galcanezumab, and erenumab. Here, we present the outcomes of CGRP-antagonist treatment in a cohort of patients with episodic or chronic migraine who had failed at least three oral prophylactic therapies. Methods: We offered CGRP antagonists to 86 patients who met the NICE criteria to start therapy. We recorded the number of headache days per month (HDPM) at 0 weeks, 3 months, and 12 months. Of those, 26 patients were switched to an alternative agent due to poor response or side effects, giving 112 treatment courses in total. Of these 112 cases, 9 did not sufficiently maintain a headache diary and 5 were not followed up at 3 months, leaving 98 sets of data for analysis. Results: Fremanezumab achieved a 51.7% reduction in HDPM at 3 months (p<0.0001), with 63.7% of patients meeting NICE criteria to continue therapy. Patients trialled on galcanezumab attained a 47.0% reduction in HDPM (p=0.0019), with 51.6% meeting NICE criteria to continue therapy. Erenumab achieved only a 17.0% reduction in HDPM, which was not statistically significant (p=0.29). Furthermore, 34.4%, 9.7%, and 4.9% of patients taking fremanezumab, galcanezumab, and erenumab, respectively, continued therapy beyond 12 months. Of those who attempted drug holidays after 12 months of treatment, migraine symptoms relapsed in 100% of cases. Conclusion: We observed a significant improvement in HDPM among episodic and chronic migraine patients treated with fremanezumab or galcanezumab.
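The outcome measure above, percentage reduction in HDPM from baseline to follow-up, can be sketched as a per-patient paired calculation; the diary values below are hypothetical, not the study's data:

```python
def mean_pct_reduction(baseline_hdpm, followup_hdpm):
    """Mean per-patient percentage reduction in headache days per
    month (HDPM) between baseline and follow-up, paired by patient."""
    deltas = [(b - f) / b * 100 for b, f in zip(baseline_hdpm, followup_hdpm)]
    return sum(deltas) / len(deltas)

# Hypothetical diaries for three patients: HDPM at 0 weeks vs 3 months
print(mean_pct_reduction([20, 15, 24], [10, 9, 12]))
```

A paired significance test (e.g. a paired t-test or Wilcoxon signed-rank test) on the per-patient changes would then yield p-values of the kind reported above.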

Keywords: migraine, CGRP, fremanezumab, galcanezumab, erenumab

Procedia PDF Downloads 95
51644 The Display of Age-Period/Age-Cohort Mortality Trends Using 1-Year Intervals Reveals Period and Cohort Effects Coincident with Major Influenza A Events

Authors: Maria Ines Azambuja

Abstract:

Graphic displays of Age-Period-Cohort (APC) mortality trends generally use data aggregated within 5- or 10-year intervals. Technology now allows one to process much larger amounts of data, and displaying occurrences by 1-year intervals is a logical first step toward higher-quality landscapes of temporal variation. Method: 1) Comparison of UK mortality trends plotted by 10-, 5-, and 1-year intervals; 2) Comparison of UK and US mortality trends (period X age and cohort X age) displayed by 1-year intervals. Source: Mortality data (period, 1x1, males, 1933-2012) were downloaded from the Human Mortality Database into Excel files, where Period X Age and Cohort X Age graphics were produced. The choice of transforming age-specific trends from calendar years to birth-cohort years (cohort = period - age), instead of using the cohort 1x1 data available at the HMD, was made to facilitate comparison of age-specific trends when looking across calendar years and birth cohorts. Yearly live births, males, 1933 to 2012 (UK), were downloaded from the HFD. Influenza references are from the literature. Results: 1) The use of 1-year intervals unveiled previously unsuspected period, cohort, and interacting period x cohort effects upon all-cause mortality. 2) The UK and US figures showed variations associated with particular calendar years (1936, 1940, 1951, 1957-68, 1972) and, most surprisingly, with particular birth cohorts (1889-90 in the US, and 1900, 1918-19, 1940-41, and 1946-47 in both countries). The figures also showed ups and downs in age-specific trends initiated at particular birth cohorts (1900, 1918-19, and 1947-48) or particular calendar years (1968, 1972, 1977-78 in the US), variations at times restricted to a range of ages (cohort x period interacting effects). Importantly, most of the identified "scars" (period and cohort) correlate with the record of Influenza A epidemics since the late 19th century.
Conclusions: The use of 1-year intervals to describe APC mortality trends both increases the amount of information available, enhancing opportunities for pattern recognition, and increases our capacity to interpret those patterns by describing trends across smaller intervals of time (period or birth-cohort). The US and UK mortality landscapes share many, but not all, of the 'scars' and distortions suggested here to be associated with influenza epidemics. Effects of wars of different sizes are evident, both in mortality and in fertility. It would also be realistic to suppose that the predominant influenza A viruses circulating in the UK and US at the beginning of the 20th century may have differed, and that the difference had long-term intergenerational consequences. Compared with the live-birth trend (UK data), birth-cohort scars clearly depend on birth-cohort sizes relative to neighboring ones, which, if causally associated with influenza, would result from influenza-related fetal outcomes/selection. Fetal selection could introduce continuing modifications to population patterns of immune-inflammatory phenotypes that might give rise to 'epidemic constitutions' favoring the occurrence of particular diseases. Comparative analysis of mortality landscapes may help us set straight the record of past circulation of influenza viruses and document associations between influenza recycling and fertility changes.
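The calendar-to-cohort transformation used above (cohort = period - age) can be sketched as a simple reindexing of an age x period rate matrix; the rate values here are hypothetical:

```python
# Reindex an age x period mortality matrix into age x birth-cohort form
# using cohort = period - age. Rates are illustrative, not HMD data.
rates = {  # rates[(age, period)] = death rate
    (60, 1950): 0.020, (60, 1951): 0.021,
    (61, 1950): 0.022, (61, 1951): 0.023,
}

by_cohort = {}
for (age, period), rate in rates.items():
    cohort = period - age  # approximate birth year of this diagonal
    by_cohort[(age, cohort)] = rate

# The 1890 birth cohort, observed at age 60 in calendar year 1950
print(by_cohort[(60, 1890)])
```

Plotting `by_cohort` columns then follows each birth cohort along the diagonals of the original period x age matrix, which is what makes cohort "scars" visible.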

Keywords: age-period-cohort trends, epidemic constitution, fertility, influenza, mortality

Procedia PDF Downloads 230