Search results for: dietary diversity practice and associated factors among hypertension patients at tirunesh beijing hospital
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12224

11144 The Dietary Behavior of Eating Alone in Middle-Aged Populations by Body Mass Index (BMI)

Authors: Pil Kyoo Jo, Youngmee Lee, Jee Young Kim, Yu Jin Oh, Sohyun Park, Young Ha Joo, Hye Suk Kim, Semi Kang

Abstract:

A growing number of people are living alone and eating alone. People may behave differently when eating alone than when eating with others, and this can influence their weight and health. The purpose of this study was to investigate the dietary behavior of eating alone in middle-aged populations in South Korea. We used nationally representative data from the 5th Korea National Health and Nutrition Examination Survey (KNHANES), 2010-2012, and a cross-sectional survey on eating behaviors among adults (N=1318; 530 men, 788 women) aged 20 to 54 years. Results showed that the ‘underweight’ group ate a greater amount of food when eating with others than when eating alone, whereas the ‘overweight’ and ‘obesity’ groups showed the opposite pattern (p<0.05). When having a meal alone, the ‘underweight’ group ate only until they no longer felt hungry, while the ‘overweight’ and ‘obesity’ groups ate leftover food even when they felt full (p<0.01). The ‘overweight’ and ‘obesity’ groups ate alone more often than the ‘underweight’ group did (p<0.05). All groups ate faster when eating alone than when eating with others and usually ate processed foods for convenience when eating alone. Younger people, aged 10-30, ate more processed food than older people did. South Koreans derive nearly 45% of their total food consumption from processed foods. This research was supported by the National Research Foundation of Korea for the 2011 Korea-Japan Basic Scientific Cooperation Program (NRF-2011B00003). This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2015S1A5B6037369).

Keywords: BMI, dietary behavior, eating alone, middle-aged populations

Procedia PDF Downloads 253
11143 Predicting Resistance of Commonly Used Antimicrobials in Urinary Tract Infections: A Decision Tree Analysis

Authors: Meera Tandan, Mohan Timilsina, Martin Cormican, Akke Vellinga

Abstract:

Background: In general practice, many infections are treated empirically without microbiological confirmation. Understanding the susceptibility of antimicrobials during empirical prescribing can help reduce inappropriate prescribing. This study aims to apply a prediction model using a decision tree approach to predict antimicrobial resistance (AMR) in urinary tract infections (UTI) based on non-clinical features of patients over 65 years. Decision tree models are a novel approach to predicting the outcome of AMR at an initial stage. Method: Data were extracted from the database of the microbiological laboratory of the University Hospitals Galway on all antimicrobial susceptibility testing (AST) of urine specimens from patients over the age of 65 from January 2011 to December 2014. The primary endpoint was resistance to common antimicrobials (nitrofurantoin, trimethoprim, ciprofloxacin, co-amoxiclav and amoxicillin) used to treat UTI. A classification and regression tree (CART) model was generated with the outcome ‘resistant infection’. The importance of each predictor (the number of previous samples, age, gender, location (nursing home, hospital, community) and causative agent) for antimicrobial resistance was estimated. Sensitivity, specificity, negative predictive value (NPV) and positive predictive value (PPV) were used to evaluate the performance of the model. Seventy-five percent (75%) of the data were used as a training set, and validation of the model was performed with the remaining 25% of the dataset. Results: A total of 9805 UTI patients over 65 years had a urine sample submitted for AST at least once over the four years. E. coli, Klebsiella and Proteus species were the most commonly identified pathogens among UTI patients without a catheter, whereas Serratia, Staphylococcus aureus and Enterobacter were common with a catheter.
The validated CART model shows slight differences in sensitivity, specificity, PPV and NPV between the models with and without the causative organisms. The sensitivity, specificity, PPV and NPV for the model with non-clinical predictors were between 74% and 88%, depending on the antimicrobial. Conclusion: The CART models developed using non-clinical predictors performed well when predicting antimicrobial resistance. These models predict which antimicrobial may be the most appropriate based on non-clinical factors. Other CART models, prospective data collection and validation, and an increasing number of non-clinical factors will improve model performance. The presented model provides an alternative approach to decision making on antimicrobial prescribing for UTIs in older patients.
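
As a rough illustration of the evaluation described above (a 75/25 train-validation split and confusion-matrix-derived sensitivity, specificity, PPV and NPV), here is a minimal scikit-learn sketch. The data are synthetic stand-ins, and the predictor codings are assumptions for illustration only, not the Galway laboratory dataset:

```python
# Hypothetical sketch of a CART evaluation on synthetic data (not the study's data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 1000
# Assumed non-clinical predictors, mirroring those named in the abstract
X = np.column_stack([
    rng.poisson(2, n),            # number of previous urine samples
    rng.integers(65, 95, n),      # age (study restricted to over 65)
    rng.integers(0, 2, n),        # gender (0 = male, 1 = female)
    rng.integers(0, 3, n),        # location (0 nursing home, 1 hospital, 2 community)
])
y = rng.integers(0, 2, n)         # outcome: resistant infection (0/1), random here

# 75% training / 25% validation, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

# Performance metrics from the validation-set confusion matrix
tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(sensitivity, specificity, ppv, npv)
```

With random labels the metrics hover near chance; on real data they would reflect the tree's actual discriminative power, as in the 74-88% range reported.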

Keywords: antimicrobial resistance, urinary tract infection, prediction, decision tree

Procedia PDF Downloads 237
11142 Quick off the Mark with Achilles Tendon Rupture

Authors: Emily Moore, Andrew Gaukroger, Matthew Solan, Lucy Bailey, Alexandra Boxall, Andrew Carne, Chintu Gadamsetty, Charlotte Morley, Katy Western, Iwona Kolodziejczyk

Abstract:

Introduction: Rupture of the Achilles tendon is common and has a long recovery period. Most cases are managed non-operatively. Foot and Ankle Surgeons advise an ultrasound scan to check the gap between the torn ends. A large gap (with the ankle in equinus) is a relative indication for surgery. The definitive decision regarding surgical versus non-operative management can only be made once an ultrasound scan is undertaken and the patient is subsequently reviewed by a Foot and Ankle surgeon. To get to this point, the patient journey involves several hospital departments. In nearby trusts, patients reattend for a scan and go to the plaster room both before and after the ultrasound for removal and re-application of the cast. At a third visit to the hospital, the surgeon and patient discuss options for definitive treatment. It may take 2-3 weeks from the initial Emergency Department visit before the final treatment decision is made. This “wasted time” is ultimately added to the recovery period for the patient. In this hospital, Achilles rupture patients are seen in a weekly multidisciplinary One-Stop Heel Pain clinic. This pathway was already efficient but subject to occasional frustrating delays if a key staff member was absent. A new pathway was introduced with the goal of reducing delays to a definitive treatment plan. Method: A retrospective series of Achilles tendon ruptures managed according to the 2019 protocol was identified. The time taken from the Emergency Department to have both an ultrasound scan and specialist Foot and Ankle surgical review was calculated. Thirty consecutive patients were treated with our new pathway and prospectively followed. The times to scan and to specialist review were compared with the 30 consecutive cases from the 2019 (pre-COVID) cohort. The new pathway includes: 1. A new contoured splint applied to the front of the injured limb and held with a bandage. This can be removed and replaced (unlike a plaster cast) in the ultrasound department, removing the need for plaster room visits. 2. Urgent triage to a Foot and Ankle specialist. 3. Ultrasound scan for assessment of the rupture gap and a deep vein thrombosis check. 4. An early decision regarding surgery, with transfer to weight bearing in a prosthetic boot in equinus without waiting for the once-a-week clinic. 5. Extended oral VTE prophylaxis. Results: The time taken for a patient to have both an ultrasound scan and specialist review fell by more than 50%. All patients in the new pathway reached a definitive treatment decision within one week. There were no significant differences in patient demographics or in rates of surgical versus non-operative treatment. The mean time from Emergency Department visit to specialist review and ultrasound scan fell from 8.7 days (old protocol) to 2.9 days (new pathway). The maximum time fell from 23 days (old protocol) to 6 days (new pathway). Conclusion: Teamwork and innovation have improved the experience for patients with an Achilles tendon rupture. The new pathway brings many advantages: reduced time in the Emergency Department, fewer hospital visits, less time using crutches and reduced overall recovery time.

Keywords: orthopaedics, achilles rupture, ultrasound, innovation

Procedia PDF Downloads 99
11141 Climate Change and Its Effects on Terrestrial Insect Diversity in Mukuruthi National Park, Nilgiri Biosphere Reserve, Tamilnadu, India

Authors: M. Elanchezhian, C. Gunasekaran, A. Agnes Deepa, M. Salahudeen

Abstract:

In recent years, climate change has become one of the most pressing threats to biodiversity, affecting both animal and plant species. Elevated carbon dioxide and ozone concentrations, extreme temperatures, changes in rainfall patterns, and insect-plant interactions are the main factors through which it affects biodiversity. The present study examines climate change and its effects on terrestrial insect diversity in Mukuruthi National Park, a protected area of the Western Ghats in India. Sampling was done seasonally at three areas using pitfall traps over the period from January to December 2013. Diversity was quantified using the Shannon-Wiener diversity index (H). A significant seasonal variation pattern was detected for total insect diversity at the different study areas. In total, nine orders of insects were recorded. The diversity and abundance of terrestrial insects differed markedly among the natural forest, shola forest and grassland areas.
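
The Shannon-Wiener index H used for the diversity comparisons above can be computed directly from per-order abundance counts. A minimal sketch follows; the nine-order structure mirrors the study, but the counts themselves are hypothetical:

```python
import math

def shannon_index(counts):
    """Shannon-Wiener diversity index: H = -sum(p_i * ln(p_i))."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical pitfall-trap counts per insect order at one site (nine orders)
site_counts = [120, 45, 30, 10, 5, 5, 3, 2, 1]
print(round(shannon_index(site_counts), 3))
```

Higher H indicates a community that is both richer in orders and more evenly distributed among them, which is what drives the reported differences between forest and grassland sites.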

Keywords: biodiversity, climate change, mukuruthi national park, terrestrial invertebrates

Procedia PDF Downloads 499
11140 Contemporary Paradoxical Expectations of the Nursing Profession and Revisiting the ‘Nurses’ Disciplinary Boundaries: India’s Historical and Gendered Perspective

Authors: Neha Adsul, Rohit Shah

Abstract:

Background: The global history of nursing is a history of deep contradictions, as nursing has sought inclusion in an already gendered world. Although a powerful 'clinical gaze' exists, nurses have toiled to re-negotiate and subvert the 'medical gaze' by practicing a 'therapeutic gaze' that tethers 'care' back into nursing practice. This helps address the duality of 'body' and 'mind', wherein the patient is not merely an object of medical inquiry. Nevertheless, there has been a consistent effort over the years to fit 'nursing' into being an art or an emerging science. Especially with advances in hospital-based, techno-centric medical practices, the boundaries between technology and nursing practice are becoming more blurred as technical process becomes synonymous with nursing, eroding the essence of nursing care. Aim: This paper examines the history of nursing and offers insights into how gendered relations and the ideological belief in 'nursing as gendered work' have led to the subjugation of the nursing profession. It further provides insights into the patriarchally imbued techno-centrism that negates the gendered caregiving at the crux of a nurse's work. Method: A literature search was carried out using the Google Scholar, Web of Science and PubMed databases. Search terms included: technology and nursing, medical technology and nursing, history of nursing, sociology and nursing, and nursing care. The history of nursing is presented in a discussion that weaves together the historical events of the 'Birth of the Clinic' and the shift from 'bedside medicine' to 'hospital-based medicine', which legitimized exposing the bodies of patients to the 'medical gaze', while nursing emerged as acquiescent to instrumental, technical, positivist and dominant views of medicine. The resultant power asymmetries leave contemporary nurses constantly struggling to juggle being the physician's "operational right arm" with harboring the subjective understanding of patients needed to refrain from de-humanizing nursing care. Findings: The nursing profession suffers from being rendered invisible due to gendered relations with patrifocal societal roots. This perpetuates a notion, rooted in empiricism, that has resulted in theoretical and epistemological fragmentation, with body and mind understood as separate entities. Nurses operate within this structure while constantly at risk of being pushed beyond legitimate professional boundaries and labeled 'unscientific', as their work does not always align with the existing dominant positivist lines of inquiry. Conclusion: When understood in this broader context of how nursing as a practice has evolved over the years, it provides a particularly crucial testbed for understanding contemporary gender relations; not because nurses like to live in a gendered work trap, but because gendered relations at work are written in a covertly narcissistic patriarchal milieu that fails to recognize the value of the intangible yet utterly necessary 'caring work' in nursing. This research calls for preserving and revering the humane aspect of nursing care alongside the emerging tech-savvy expectations of nursing work.

Keywords: nursing history, technocentric, power relations, scientific duality

Procedia PDF Downloads 130
11139 Association between Healthy Eating Index-2015 Scores and the Probability of Sarcopenia in Community-Dwelling Iranian Elderly

Authors: Zahra Esmaeily, Zahra Tajari, Shahrzad Daei, Mahshid Rezaei, Atefeh Eyvazkhani, Marjan Mansouri Dara, Ahmad Reza Dorosty Motlagh, Andriko Palmowski

Abstract:

Objective: Sarcopenia (SPA) is associated with frailty and disability in the elderly. Adherence to current dietary guidelines, in addition to physical activity, could play a role in the prevention of muscle wasting and weakness. The Healthy Eating Index-2015 (HEI) is a tool to assess diet quality as recommended in the U.S. Dietary Guidelines for Americans. This study aimed to investigate whether there is a relationship between HEI scores and the probability of SPA (PS) among elderly people in Tehran. Method: A previously validated semi-quantitative food frequency questionnaire was used to assess HEI and the dietary intake of randomly selected elderly people living in Tehran, Iran. Handgrip strength (HGS) was measured to evaluate the PS. Statistical evaluation included descriptive analysis and standard test procedures. Results: 201 subjects were included. Those probably suffering from SPA (as determined by HGS) had significantly lower HEI scores (p = 0.02). After adjusting for confounders, HEI scores and HGS remained associated (adjusted R2 = 0.56, slope β = 0.03, p = 0.09). Elderly people with a low probability of SPA consumed more monounsaturated and polyunsaturated fatty acids (p = 0.06) and ingested less added sugar and saturated fat (p = 0.01 and p = 0.02, respectively). Conclusion: In this cross-sectional study, HEI scores were associated with the probability of SPA. Adhering to current dietary guidelines might help preserve muscle strength and mass in aging individuals.

Keywords: aging, HEI-2015, Iranian, sarcopenic

Procedia PDF Downloads 183
11138 Diversity and Distribution of Cytochrome P450 2C9 Genes Related with Medical Cannabis in Thai Patients

Authors: Tanakrit Doltanakarn

Abstract:

Introduction: Cannabis is now being accepted in many countries because it can be used medically. Medical cannabis is used to treat and reduce pain in many conditions, for example neuropathic pain, Parkinson's disease, autism spectrum disorders, cancer pain, the adverse effects of chemotherapy, diabetes, and migraine. Active ingredients in cannabis that modulate patients' perceptions of their conditions include Δ9-tetrahydrocannabinol (THC), cannabidiol (CBD), flavonoids, and terpenes. However, cannabis also has adverse effects, including cardiovascular effects, psychosis, schizophrenia, mood disorders, and cognitive alterations. These effects stem from the THC and CBD content of cannabis. THC is metabolized to 11-OH-delta-9-THC and onward to inactive metabolites, and impaired metabolism of THC may be a cause of adverse effects. Interestingly, the distribution of CYP2C9 variants (CYP2C9*2 and CYP2C9*3, associated with poor metabolism) might affect the incidence of adverse effects in patients treated with medical cannabis. Objective: The aim of this study was to investigate the association between the frequency of CYP2C9 genetic polymorphisms and adverse reactions in Thai patients treated with medical cannabis. Materials and Methods: We recruited sixty-five unrelated Thai patients from the College of Pharmacy, Rangsit University. DNA was extracted using a Genomic DNA Mini Kit. CYP2C9*2 (430C>T, rs1799853) and CYP2C9*3 (1075A>C, rs1057910) were genotyped by TaqMan real-time PCR assay. Results: Among the 31 patients with medical cannabis-induced adverse drug reactions (ADRs), 22 (33.85%) were diagnosed with tachycardia and 3 (4.62%) with arrhythmia. There were 34 (52.31%) medical cannabis-tolerant controls included in this study. Forty (61.53%) of the Thai patients were female and 25 (38.46%) were male, with a median age of 57 (range 27-87) years. We found that none of the patients with medical cannabis-induced ADRs carried the CYP2C9*2 variant, nor did any of the medical cannabis-tolerant control group. The CYP2C9*3 variant (intermediate metabolizer, IM) was found in only one of thirty-one (3.23%) patients with medical cannabis-induced ADRs and in two of thirty-four (5.88%) tolerant controls. Conclusions: The distribution of CYP2C9 alleles offers a view of this pharmacogenomic marker in the Thai population that could be used as a worldwide reference for investigating pharmacogenomic applications.

Keywords: medical cannabis, adverse effect, CYP2C9, thai patients

Procedia PDF Downloads 87
11137 Motor Vehicle Accidents During Pregnancy: Analysis of Maternal and Fetal Outcome at a University Hospital

Authors: Manjunath Attibele, Alsawafi Manal, Al Dughaishi Tamima

Abstract:

Introduction: The purpose of this study was to describe the clinical characteristics and the types and mechanisms of injuries caused by motor vehicle accidents (MVA) during pregnancy, and to analyze the patterns of accidents during pregnancy and their adverse consequences for both maternal and fetal outcomes. Methods: This was a retrospective cohort study of pregnant patients who were involved in MVAs. The study period was from January 1, 2010, to December 31, 2019. All relevant data were retrieved from electronic patient records in the hospital information system and from the antenatal ward admission register. Results: Of 168 women who had motor vehicle accidents during the study period, 39 (23.2%) were pregnant at the time. Twenty-one (53.8%) women were over 30 years old. Thirty-five (89.7%) women were Omanis, and 27 (69.2%) were in their third trimester. Twenty-three (59%) of the accidents happened at night, and 31 (79.5%) of them happened on a weekday. Twenty-two (56.4%) of the women were driving themselves, and 24 (61.5%) of them were not using a seatbelt. Accident-related abdominal and back pain was seen in 23 (59%) women. Regarding the outcome of pregnancy, 23 (74.2%) had a normal vaginal delivery. The mean accident-to-delivery interval was 7 weeks. Thirty (96.7%) of the involved newborns were relatively healthy. One woman (3.2%) had a ruptured uterus leading to fetal death (3.2%). Conclusion: This study showed that the incidence of motor vehicle accidents during pregnancy was around 23.2%. The majority had trauma-associated pain. One woman had a serious injury, a ruptured uterus, which led to fetal death. The majority of involved newborns were relatively healthy. There were no reported maternal deaths.

Keywords: motor vehicle accidents, pregnancy, maternal outcome, fetal outcome

Procedia PDF Downloads 69
11136 Child Feeding Practices of Mothers (Childbearing) and Exploration of Their Household Food Insecurity in a Coastal Region of Bangladesh

Authors: Md Abdullah Al Mamun

Abstract:

Background: Ensuring WHO-recommended feeding practices for infants and young children is becoming a challenge in many developing countries, especially in areas where household food security is at risk. Many households in developing countries often encounter severe food insecurity, so the provision of adequate child nutrition is threatened. Aim: The study aimed to assess the child feeding practices of mothers of children aged 0-24 months and to explore their household food insecurity in a coastal region of Bangladesh. Methods: This study was conducted in Suborno Char (a coastal suburb in Noakhali District, Bangladesh) from October 2019 to April 2020. A total of 400 mothers with children aged 0-24 months were selected following a cross-sectional sampling procedure. Data were collected through a standard questionnaire and analyzed using statistical tests in SPSS version 20.0.0. Results: The frequencies of exclusive breastfeeding, timely initiation of complementary feeding, and giving children foods from four food groups were 53.5%, 75.5%, and 22.2%, respectively. Mothers' level of education showed a strong association with their child feeding practices. Mothers from severely food-insecure households showed lower odds of exclusive breastfeeding (COR 0.233, 95% CI 0.083-0.655; AOR 0.478, 95% CI 0.133-1.713) than mothers from food-secure households. Similar results were also found for timely initiation of complementary feeding and minimum dietary diversity of the children.

Keywords: household food insecurity, exclusive breastfeeding, complementary feeding, maternal education, mothers age, household income

Procedia PDF Downloads 141
11135 NMR-Based Metabolomics Reveals Dietary Effects in Liver Extracts of Arctic Charr (Salvelinus alpinus) and Tilapia (Oreochromis mossambicus) Fed Different Levels of Starch

Authors: Rani Abro, Ali Ata Moazzami, Jan Erik Lindberg, Torbjörn Lundh

Abstract:

The effect of dietary starch level on liver metabolism in Arctic charr (Salvelinus alpinus) and tilapia (Oreochromis mossambicus) was studied using 1H-NMR-based metabolomics. Fingerlings were fed iso-nitrogenous diets containing 0, 10 and 20% starch for two months before liver samples were collected for metabolite analysis. Metabolite profiling was performed on a 600 MHz NMR spectrometer using Chenomx software. In total, 48 metabolites were profiled in liver extracts from both fish species. Following the profiling, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were performed. These revealed that differences in the concentrations of significant metabolites were correlated with the dietary starch level in both species. The most prominent difference in metabolic response to starch feeding between the omnivorous tilapia and the carnivorous Arctic charr was an indication of higher anaerobic metabolism in Arctic charr. The data also indicated that amino acid and pyrimidine metabolism was higher in Arctic charr than in tilapia.
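
The PCA step applied to the metabolite profiles can be sketched as below. The concentration matrix here is synthetic (only its 48-metabolite shape mirrors the study), the autoscaling choice is an assumption common in NMR metabolomics, and OPLS-DA is omitted since it is not available in scikit-learn:

```python
# Sketch of PCA on a (samples x metabolites) concentration matrix; synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_samples, n_metabolites = 30, 48
X = rng.lognormal(mean=0.0, sigma=0.5, size=(n_samples, n_metabolites))

# Autoscale each metabolite to zero mean and unit variance before PCA
X_scaled = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(X_scaled)
scores = pca.transform(X_scaled)          # coordinates for the scores plot
print(scores.shape, pca.explained_variance_ratio_)
```

In a real analysis, samples from the 0, 10 and 20% starch diets would be plotted in the scores space to visualize diet-driven separation.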

Keywords: arctic charr, metabolomics, starch, tilapia

Procedia PDF Downloads 442
11134 Association between Physical Inactivity and Sedentary Behaviours with Risk of Hypertension among Sedentary Occupation Workers: A Cross-Sectional Study

Authors: Hanan Badr, Fahad Manee, Rao Shashidhar, Omar Bayoumy

Abstract:

Introduction: Hypertension is the major risk factor for cardiovascular diseases and stroke and a leading cause worldwide of disability-adjusted life years and mortality. Adopting an unhealthy lifestyle is thought to be associated with developing hypertension regardless of predisposing genetic factors. This study aimed to examine the association of recreational physical activity (RPA) and sedentary behaviors with the risk of hypertension among ministry employees, for whom occupational physical activity (PA) plays no role, and to scrutinize participants' time spent in RPA and sedentary behaviors on working days and weekend days. Methods: A cross-sectional study was conducted among 2562 randomly selected employees working at ten randomly selected ministries in Kuwait. To obtain a representative sample, the proportional allocation technique was used to define the number of participants in each ministry. A self-administered questionnaire was used to collect data about participants' socio-demographic characteristics, health status, and their 24-hour time use during a regular working day and a weekend day. The time use covered a list of 20 different activities practiced daily. The New Zealand Physical Activity Questionnaire-Short Form (NZPAQ-SF) was used to assess the level of RPA. The scale generates three categories according to the number of hours spent in RPA per week: relatively inactive, relatively active, and highly active. Gender-matched trained nurses performed anthropometric measurements (weight and height) and measured blood pressure (two readings) using an automatic blood pressure monitor (95% accuracy relative to a calibrated mercury sphygmomanometer). Results: Participants' mean age was 35.3±8.4 years, with almost equal gender distribution. About 13% of the participants were smokers, and 75% were overweight. Almost 10% reported doctor-diagnosed hypertension.
Among those who did not, the mean systolic blood pressure was 119.9±14.2 and the mean diastolic blood pressure was 80.9±7.3. Moreover, 73.9% of participants were relatively physically inactive and 18% were highly active. Mean systolic and diastolic blood pressure showed a significant inverse association with the level of RPA (mean blood pressure measures were 123.3/82.8 among the relatively inactive, 119.7/80.4 among the relatively active, and 116.6/79.6 among the highly active). Furthermore, RPA occupied 1.6% and 1.8% of working and weekend days, respectively, while sedentary behaviors (watching TV, using electronics for social media or entertainment, etc.) occupied 11.2% and 13.1%, respectively. Sedentary behaviors were significantly associated with high levels of systolic and diastolic blood pressure. Binary logistic regression revealed that physical inactivity (OR=3.13, 95% CI: 2.25-4.35) and sedentary behaviors (OR=2.25, 95% CI: 1.45-3.17) were independent risk factors for high systolic and diastolic blood pressure after adjustment for other covariates. Conclusions: Physical inactivity and a sedentary lifestyle were associated with a high risk of hypertension. Further research examining the independent role of RPA in improving blood pressure levels, and the cultural and occupational barriers to practicing RPA, is recommended. Policies promoting PA in the workplace should be enacted, which might help decrease the risk of hypertension among sedentary occupation workers.
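
Figures such as OR=3.13 (95% CI: 2.25-4.35) above come from multivariable logistic regression; for intuition, a crude (unadjusted) odds ratio and its Woolf 95% confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, not the study's data, and an adjusted OR would additionally require fitting a multivariable model:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
                 outcome+   outcome-
    exposed         a          b
    unexposed       c          d
    """
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) by Woolf's method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: hypertension by physical inactivity
or_, lo, hi = odds_ratio_ci(a=180, b=1200, c=40, d=800)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval excluding 1 (as in both reported CIs) indicates the exposure is associated with the outcome at the 5% level.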

Keywords: physical activity, sedentary behaviors, hypertension, workplace

Procedia PDF Downloads 152
11133 Silent Myocardial Infarction Presented with Homonymous Hemianopia in a Non-Diabetic Middle Aged Man

Authors: Seyed Fakhroddin Hejazi, Mohammad Saleh Sadeghi, Leili Iranirad

Abstract:

Silent myocardial infarction is defined as the appearance of pathological Q waves on the electrocardiogram without objective signs of myocardial infarction and with only minimal or atypical symptoms. Although this condition has been known for a long time, little is known about the phenomenon, and its mechanisms remain unclear. Its coincidence with stroke is also still controversial. This case report presents a middle-aged man with silent myocardial infarction who presented with homonymous hemianopia and who, apart from stage 1 hypertension, had no other major cardiovascular risk factors, including diabetes mellitus, hypercholesterolemia, family history of cardiac disease, or smoking. In conclusion, this case report indicates that even a single cardiovascular risk factor can lead to the development of MI or stroke.

Keywords: silent myocardial infarction, homonymous hemianopia, stroke, hypertension

Procedia PDF Downloads 264
11132 Additional Pathological Findings Using MRI on Patients with First Time Traumatic Lateral Patella Dislocation: A Study of 150 Patients

Authors: Ophir Segal, Daniel Weltsch, Shay Tenenbaum, Ran Thein

Abstract:

Purpose: Patients with lateral patellar dislocation (LPD) are not always referred for an MRI. This might be the case for first-time LPD patients without surgical indications or for patients with recurrent LPD who had an MRI in a previous episode. Unfortunately, in some cases there are additional knee pathologies, including tears of the collateral or cruciate ligaments and injury to the tendons or menisci. These findings might be overlooked during the physical examination or masked by nonspecific clinical findings like knee pain, effusion, or hemarthrosis. The prevalence of these findings, which can be revealed by MRI, is misreported in the literature and is considered rare. In our practice, all patients are sent for an MRI after LPD. We therefore designed a retrospective comparative study to evaluate the prevalence of additional pathological findings in patients with acute traumatic LPD who had undergone MRI, comparing groups of patients according to age, sex, and tibial tuberosity-trochlear groove (TT-TG) distance. Methods: Knee MRIs of patients after traumatic LPD were evaluated by a fellowship-trained senior musculoskeletal radiologist for additional pathological findings such as injuries to ligaments (anterior/posterior cruciate ligament (ACL, PCL), medial/lateral collateral ligament (MCL, LCL)), injuries to tendons (quadriceps, patellar), menisci (medial/lateral meniscus (MM, LM)), and the tibial plateau. A comparison between groups was performed according to age (< 25 years vs. > 25 years), sex (male vs. female), and TT-TG distance (< 15 mm vs. > 15 mm). Descriptive and comparative statistical analyses were performed. Results: 150 patients were included in this study. All suffered an LPD between 2012 and 2017 (mean age 21.3 ± 8.9 years; 86 males).
ACL, PCL, MCL, and LCL complete or partial tears were found in 17 (11.3%), 3 (2%), 22 (14.6%), and 4 (2.7%) of the patients, respectively. MM and LM tears were found in 10 (6.7%) and 3 (2%) of the patients, respectively. A higher prevalence of PCL injury, MM tear, and LM tear was found in the older age group compared with the younger group (10.5% vs. 1.8%, 18.4% vs. 2.7%, and 7.9% vs. 0%, respectively; p<0.05). A higher prevalence of non-displaced MM tear and LCL injury was found in the male group compared with the female group (8.1% vs. 0% and 8.1% vs. 0%, respectively; p<0.05). A higher prevalence of ACL injury was found in the normal TT-TG group compared with the pathologic TT-TG group (17.5% vs. 2.3%, p = 0.0184). Conclusions: Overall, 43 of 150 (28.7%) of the patients' MRIs were positive for additional pathological radiological findings. Interestingly, a higher prevalence of additional pathologies was found in the groups of patients at lower risk for recurrent LPD, including males, patients older than 25, and patients with a TT-TG distance lower than 15 mm, who therefore might not be referred for an MRI scan. We thus recommend a strict physical examination, awareness of the high prevalence of additional pathological findings, and consideration of MRI in all patients after LPD.

Keywords: additional findings, lateral patellar dislocation (LPD), MRI scan, traumatic patellar dislocation, cruciate ligaments injuries, menisci injuries, collateral ligaments injuries

Procedia PDF Downloads 124
11131 The Implementation of a Nurse-Driven Palliative Care Trigger Tool

Authors: Sawyer Spurry

Abstract:

Problem: Palliative care providers at an academic medical center in Maryland stated that medical intensive care unit (MICU) patients are often referred late in their hospital stay. The MICU has performed well below the hospital quality performance metric that 80% of patients who expire with expected outcomes should have received a palliative care consult within 48 hours of admission. Purpose: The purpose of this quality improvement (QI) project is to increase palliative care utilization in the MICU through the implementation of a Nurse-Driven Palliative Trigger Tool to prompt the need for specialty palliative care consults. Methods: MICU nursing staff and providers received education concerning the implications of underused palliative care services and the literature supporting the use of nurse-driven palliative care tools as a means of increasing utilization of palliative care. An MICU-population-specific set of palliative trigger criteria (the Palliative Care Trigger Tool) was formulated by the QI implementation team, the palliative care team, and the patient care services department. Nursing staff were asked to assess patients daily for the presence of palliative triggers using the Palliative Care Trigger Tool and to present findings during bedside rounds. MICU providers were asked to consult palliative medicine, given the presence of palliative triggers, following interdisciplinary rounds. Rates of palliative consults, given the presence of triggers, were collected via an electronic medical record data pull, de-identified, and recorded in the data collection tool. Preliminary Results: Over 140 MICU registered nurses were educated on the palliative trigger initiative, along with 8 nurse practitioners, 4 intensivists, 2 pulmonary critical care fellows, and 2 palliative medicine physicians. Over 200 patients were admitted to the MICU and screened for palliative triggers during the 15-week implementation period.
Primary outcomes showed an increase in palliative care consult rates for patients presenting with triggers, a decreased mean time from admission to palliative consult, and increased recognition of unmet palliative care needs by MICU nurses and providers. Conclusions: The anticipated findings of this QI project suggest a positive correlation between utilizing palliative care trigger criteria and decreased time to palliative care consult. Effective palliative care results in decreased length of stay, healthcare costs, and moral distress, as well as improved symptom management and quality of life (QOL).

Keywords: palliative care, nursing, quality improvement, trigger tool

Procedia PDF Downloads 172
11130 Management Tools for Assessment of Adverse Reactions Caused by Contrast Media at the Hospital

Authors: Pranee Suecharoen, Ratchadaporn Soontornpas, Jaturat Kanpittaya

Abstract:

Background: Contrast media play an important role in disease diagnosis through the detection of pathologies. Contrast media can, however, cause adverse reactions after administration. Although the incidence of adverse events with the commonly used non-ionic contrast media is relatively low, the most common reactions found (10.5%) were mild and manageable and/or preventable. Pharmacists can play an important role in evaluating adverse reactions, including awareness of the specific preparation and the type of adverse reaction. As the most common types of adverse reactions are idiosyncratic or pseudo-allergic reactions, common standards need to be established to prevent and control adverse reactions promptly and effectively. Objective: To measure the effect of using tools for symptom evaluation in order to reduce the severity, or prevent the occurrence, of adverse reactions to contrast media. Methods: A retrospective descriptive study with data collected on adverse reaction assessments and Naranjo's algorithm between June 2015 and May 2016. Results: 158 patients (10.53%) had adverse reactions. Of the 1,500 participants with an adverse event evaluation, 137 (9.13%) had a mild adverse reaction, including hives, nausea, vomiting, dizziness, and headache. These types of symptoms can be treated (i.e., with antihistamines or anti-emetics), and the patient recovers completely within one day. The group with moderate adverse reactions, numbering 18 cases (1.2%), had hypertension or hypotension and shortness of breath. Severe adverse reactions numbered 3 cases (0.2%) and included swelling of the larynx, cardiac arrest, and loss of consciousness, requiring immediate treatment. No other complications under close medical supervision were recorded (i.e., corticosteroid use, epinephrine, dopamine, atropine, or life-saving devices).
Using the guideline, therapies are divided into general and specific and are performed according to the severity, risk factors, and the contrast media agent administered. Patients with high-risk factors, especially those with renal failure, were screened and treated (i.e., with prophylactic premedication) to prevent severe adverse reactions. Thus, awareness of the need for prescreening for different risk factors is necessary for early recognition and prompt treatment. Conclusion: Studying adverse reactions can be used to develop a model for reducing the level of severity and setting a guideline for a standardized, multidisciplinary approach to adverse reactions.

Keywords: role of pharmacist, management of adverse reactions, guideline for contrast media, non-ionic contrast media

Procedia PDF Downloads 285
11129 Retrospective/Prospective Analysis of Guideline Implementation and Transfusion Rates

Authors: B. Kenny

Abstract:

The complications associated with transfusions are well documented, with the serious hazards of transfusion (SHOT) reporting system continuing to report deaths and serious morbidity due to the transfusion of allogenic blood. Many different sources, including the TRICC trial, the NHMRC, and Cochrane, recommend similar transfusion triggers/guidelines. Recent studies found that the rate of infection (deep infection, wound infection, chest infection, urinary tract infection, and others) followed a dose-response relationship, with a relative risk of 3.44. It was also noted that each transfused patient stayed in hospital for one additional day. We hypothesised that providing an approved, standardised guideline with a graphical summary of decision pathways for anaemic patients would reduce unnecessary transfusions. We retrospectively assessed 1,459 patients undergoing primary knee or hip arthroplasties over a 4-year period. Of these, 339 (23.24%) patients received allogenic blood transfusions and 858 units of blood were transfused; 9.14% of patients transfused had haemoglobin levels above 100 g/L, 7.67% of patients were transfused without the haemoglobin level being known in the 24 hours prior to transfusion initiation, and 4.5% had possible transfusion reactions. Overall, 17% of allogenic transfusions to patients admitted to the Orthopaedic department within the 4-year period were outside NHMRC and Cochrane guidelines/recommendations. When our transfusion frequency is compared with that of other authors/hospitals, our transfusion rates were consistently high. We subsequently implemented a simple guideline for transfusion initiation, which was then assessed. We found the transfusion rate after guideline implementation to be significantly lower, without an increase in patient morbidity or mortality (p < 0.001). Transfusion rates and patient outcomes can be optimized by a simple graphical aid for decision making.

Keywords: transfusion, morbidity, mortality, neck of femur, fracture, arthroplasty, rehabilitation

Procedia PDF Downloads 226
11128 Reducing the Risk of Alcohol Relapse after Liver-Transplantation

Authors: Rebeca V. Tholen, Elaine Bundy

Abstract:

Background: Liver transplantation (LT) is considered the only curative treatment for end-stage liver disease (ESLD). The effects of alcoholism can cause irreversible liver damage, cirrhosis, and subsequent liver failure. Alcohol relapse after transplant occurs in 20-50% of patients and increases the risk for recurrent cirrhosis, organ rejection, and graft failure. Alcohol relapse after transplant has been identified as a problem among liver transplant recipients at a large urban academic transplant center in the United States. Transplantation will reverse the complications of ESLD, but it does not treat underlying alcoholism or reduce the risk of relapse after transplant. The purpose of this quality improvement project is to implement and evaluate the effectiveness of a High-Risk Alcoholism Relapse (HRAR) Scale to screen and identify patients at high risk for alcohol relapse after receiving an LT. Methods: The HRAR Scale is a predictive tool designed to determine the severity of alcoholism and the risk of relapse after transplant. The scale consists of the three variables identified as having the highest predictive power for early relapse: daily number of drinks, history of previous inpatient treatment for alcoholism, and number of years of heavy drinking. All adult liver transplant recipients at a large urban transplant center were screened with the HRAR Scale prior to hospital discharge. An ordinal score of zero to two is assigned for each variable, and the total score ranges from zero to six; scores of three to six are considered high risk. Results: Descriptive statistics revealed 25 patients were newly transplanted and discharged from the hospital during an 8-week period. 40% of patients (n=10) were identified as being at high risk for relapse and 60% (n=15) at low risk. The daily number of drinks was determined by alcohol content (1 drink = 15 g of ethanol) and the number of drinks per day. 60% of patients reported drinking 9-17 drinks per day, and 40% reported ≤ 9 drinks. 50% of high-risk patients reported drinking for ≥ 25 years, 40% for 11-25 years, and 10% for ≤ 11 years. For number of inpatient treatments for alcoholism, 50% received inpatient treatment one time, 20% more than once, and 30% reported never receiving inpatient treatment. These findings reveal the importance and value of a validated screening tool as a more efficient method than other screening methods alone. Integration of a structured clinical tool will help guide the drinking history portion of the psychosocial assessment, and targeted interventions can be implemented for all high-risk patients. Conclusions: Our findings validate the effectiveness of utilizing the HRAR Scale to screen and identify patients at high risk for alcohol relapse post-LT. Recommendations to help maintain post-transplant sobriety include starting a transplant support group within the organization for all high-risk patients.
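The three-variable scoring described in the Methods can be sketched in code. The per-variable cut-points below are assumptions inferred from the buckets reported in the Results; the abstract does not state them explicitly:

```python
def hrar_score(drinks_per_day, years_heavy_drinking, inpatient_treatments):
    """Sketch of the three-variable HRAR scale: each variable scores 0-2,
    the total ranges 0-6, and totals of 3-6 are flagged as high risk.
    Cut-points are illustrative, inferred from the result buckets above."""
    drinks = 0 if drinks_per_day < 9 else (1 if drinks_per_day <= 17 else 2)
    years = 0 if years_heavy_drinking < 11 else (1 if years_heavy_drinking <= 25 else 2)
    treatment = min(inpatient_treatments, 2)  # never=0, once=1, more than once=2
    total = drinks + years + treatment
    return total, total >= 3  # (score, high_risk)
```

For example, a patient drinking 20 drinks per day for 30 years with two prior inpatient treatments would score the maximum of 6 and be flagged high risk.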

Keywords: alcoholism, liver transplant, quality improvement, substance abuse

Procedia PDF Downloads 99
11127 Molecular Epidemiology of Circulating Adenovirus Types in Acute Conjunctivitis Cases in Chandigarh, North India

Authors: Mini P. Singh, Jagat Ram, Archit Kumar, Tripti Rungta, Jasmine Khurana, Amit Gupta, R. K. Ratho

Abstract:

Introduction: Human adenovirus is the most common agent involved in viral conjunctivitis. The clinical manifestations vary with different serotypes. The identification of the circulating strains, followed by phylogenetic analysis, can be helpful in understanding the origin and transmission of the disease. The present study aimed to carry out molecular epidemiology of the adenovirus types in patients with conjunctivitis presenting to the eye centre of a tertiary care hospital in North India. Materials and Methods: Conjunctival swabs were collected from 23 suspected adenoviral conjunctivitis patients between April and August 2014 and transported in viral transport media. The samples were subjected to nested PCR targeting the hexon gene of human adenovirus. The band of 956 bp was eluted, and 8 representative positive samples were subjected to sequencing. The sequences were analyzed using CLUSTALX2.1 and MEGA 5.1 software. Results: The male-to-female ratio was found to be 3.6:1. The mean age of presenting patients was 43.95 years (±17.2). Approximately 52.1% (12/23) of patients presented with bilateral involvement, while 47.8% (11/23) presented with unilateral involvement of the eye. Human adenovirus DNA could be detected in 65.2% (15/23) of the patients. The phylogenetic analysis revealed the presence of serotype 8 in 7 patients and serotype 4 in one patient. The serotype 8 sequences showed 99-100% identity with Tunisian, Indian, and Japanese strains. The adenovirus serotype 4 strains had 100% identity with strains from Tunisia, China, and the USA. Conclusion: Human adenovirus was found to be an important etiological agent of conjunctivitis in our setup. The phylogenetic analysis showed that the predominant circulating strains in our epidemic keratoconjunctivitis cases were serotypes 8 and 4.

Keywords: conjunctivitis, human adenovirus, molecular epidemiology, phylogenetics

Procedia PDF Downloads 261
11126 Surgical Management of Distal Femur Fracture Using Locking Compression Plate: Our Experience in a Rural Tertiary Care Centre in India

Authors: Pagadaplly Girish, P. V. Manohar

Abstract:

Introduction: Management of distal femur fractures is challenging. Recently, treatment has evolved towards indirect reduction and minimally invasive techniques. Objectives: To assess fracture union and functional outcome following open reduction and internal fixation of distal femur fractures with a locking compression plate, and to achieve restoration of the anatomical alignment of fracture fragments and stable internal fixation. Methodology: Patients with distal femur fractures treated with a locking compression plate from October 2011 to April 2013 were assessed prospectively. Patients below 18 years and those with neuro-vascular deficits were excluded. Age, sex of the patient, type of fracture, mechanism of injury, type of implant used, operative time, and postoperative complications were analysed. The Neer's scale was used to assess patient outcomes. Results: The total number of patients was 30: 28 males and 2 females; the mean age was 41.53 years. Road traffic accidents were the major cause of injury, followed by falls. The average duration of hospital stay was 21.3 days. The overall complication rate noted was 23.33%. The mean range of movement of the knee joint after 6 months of follow-up was 114.33°. The average time to radiological union was 14 weeks. Excellent to good results were noted in 26 patients (86.6%), and average to poor results were observed in 4 (13.33%) patients. Conclusions: The locking compression plate gives rigid fixation for the fracture. It also provides good purchase in osteoporotic bone. The LCP is a simple and reliable implant appropriate for fixation of femoral fractures, with promising results.

Keywords: distal femur fractures, locking compression plate, Neer’s criteria, neuro-vascular deficits

Procedia PDF Downloads 229
11125 Carbapenem Usage in Medical Wards: An Antibiotic Stewardship Feedback Project

Authors: Choon Seong Ng, P. Petrick, C. L. Lau

Abstract:

Background: Carbapenem-resistant isolates have been increasingly reported recently. Carbapenem stewardship is designed to optimize carbapenem usage, particularly in medical wards with a high prevalence of carbapenem prescriptions, to combat such emerging resistance. Carbapenem stewardship programmes (CSP) can reduce antibiotic use, but the clinical outcomes of such measures need further evaluation. We examined this in a prospective manner using a feedback mechanism. Methods: Our single-center prospective cohort study involved all carbapenem prescriptions across the medical wards (including medical patients admitted to the intensive care unit) in a tertiary university hospital setting. The impact of the stewardship programme was analysed according to the accepted and rejected groups. The primary endpoint was safety; the safety measure applied in this study was death at 1 month. Secondary endpoints included length of hospitalisation and readmission. Results: Over the 19-month period, input from 144 carbapenem prescriptions was analysed on the basis of acceptance of our CSP recommendations on the use of carbapenems. Recommendations made were as follows: de-escalation of carbapenem; stopping the carbapenem; use for a short duration of 5-7 days; prolonged duration in the case of carbapenem-sensitive extended-spectrum beta-lactamase bacteremia; dose adjustment; and surgical intervention for removal of septic foci. De-escalation, shortened duration of carbapenem, and carbapenem cessation comprised 79% of the recommendations. The acceptance rate was 57%. Those who accepted CSP recommendations had no increase in mortality (p = 0.92), had a shorter length of hospital stay (LOS), and had cost savings. Infection-related deaths were found to be higher in the rejected group. Moreover, three rejected cases (6%) among all non-indicated cases (n = 50) were found to have developed carbapenem-resistant isolates.
Lastly, Pitt's bacteremia score appeared to be a key element affecting carbapenem prescribing behaviour in this trial. Conclusions: A carbapenem stewardship program in the medical wards not only saves money but, most importantly, is safe and does not harm patients, with the added benefit of reducing the length of hospital stay. However, more time is needed to engage the primary clinical teams through formal clinical presentations and immediate personal feedback by senior Infectious Disease (ID) personnel to increase acceptance.

Keywords: audit and feedback, carbapenem stewardship, medical wards, university hospital

Procedia PDF Downloads 192
11124 The Bespoke ‘Hybrid Virtual Fracture Clinic’ during the COVID-19 Pandemic: A Paradigm Shift?

Authors: Anirudh Sharma

Abstract:

Introduction: The Covid-19 pandemic necessitated a change in the manner in which outpatient fracture clinics are conducted due to the need to reduce footfall in the hospital. While studies regarding virtual fracture clinics have shown these to be useful and effective, they focus exclusively on remote consultations. However, our service was bespoke to the patient, offering either a face-to-face or a telephone consultation depending on patient need: a 'hybrid virtual clinic (HVC).' We report patient satisfaction and outcomes with this novel service. Methods: Patients booked onto our fracture clinics during the first 2 weeks of national lockdown were retrospectively contacted to assess the mode of consultation (virtual, face-to-face, or hybrid), patient experience, and outcome. Patient experience was assessed using the net promoter (NPS), customer effort (CES), and customer satisfaction scores (CSS), and patients' likelihood of using the HVC in the absence of a pandemic. Patient outcomes were assessed using the components of the EQ5D score. Results: Of 269 possible patients, 140 responded to the questionnaire. Of these, 66.4% had 'hybrid' consultations, 27.1% had only virtual consultations, and 6.4% had only face-to-face consultations. The mean overall NPS, CES, and CSS (on a scale of 1-10) were 7.27, 7.25, and 7.37, respectively. The mean likelihood of patients using the HVC in the absence of a pandemic was 6.5/10. Patients who had 'hybrid' consultations showed better effort scores and greater overall satisfaction than those with virtual consultations only, and also reported superior EQ5D outcomes (mean 79.27 vs. 72.7). Patients who did not require surgery reported increased satisfaction (mean 7.51 vs. 7.08) and were more likely to use the HVC in the absence of a pandemic. Conclusion: Our study indicates that a bespoke HVC has good overall patient satisfaction and outcomes and is a better format of fracture clinic service than virtual consultations alone.
It may be the preferred mode for fracture clinics in similar situations in the future. Further analysis needs to be conducted in order to explore the impact on resources and clinician experience of HVC in order to appreciate this new paradigm shift.

Keywords: hybrid virtual clinic, coronavirus, COVID-19, fracture clinic, remote consultation

Procedia PDF Downloads 119
11123 Role of Vitamin-D in Reducing Need for Supplemental Oxygen Among COVID-19 Patients

Authors: Anita Bajpai, Sarah Duan, Ashlee Erskine, Shehzein Khan, Raymond Kramer

Abstract:

Introduction: This research explores the beneficial effects, if any, of Vitamin-D in reducing the need for supplemental oxygen among hospitalized COVID-19 patients. Two questions are investigated: Q1) Does having a healthy baseline level of Vitamin-D 25-OH (≥ 30 ng/ml) help, and Q2) does administering Vitamin-D therapy after the fact, during inpatient hospitalization, help? Methods/Study Design: This is a comprehensive, retrospective, observational study of all inpatients at RUHS from March through December 2020 who tested positive for COVID-19 based on real-time reverse transcriptase-polymerase chain reaction assay of nasal and pharyngeal swabs and rapid assay antigen tests. To address Q1, we looked at all N1 = 182 patients whose baseline plasma Vitamin-D 25-OH was known and who needed supplemental oxygen. Of these, 121 patients had a healthy Vitamin-D level of ≥ 30 ng/ml, while the remaining 61 patients had a low or borderline (≤ 29.9 ng/ml) level. Similarly, for Q2, we looked at a total of N2 = 893 patients who were given supplemental oxygen, of whom 713 were not given Vitamin-D and 180 were given Vitamin-D therapy. The numerical value of the maximum oxygen flow rate (dependent variable) administered was recorded for each patient. The mean values and associated standard deviations for each group were calculated. These two sets of independent data served as the basis for independent two-sample t-test statistical analysis. To be accommodative of any reasonable benefit of Vitamin-D, a p-value of 0.10 (α < 10%) was set as the cutoff point for statistical significance. Results: Given the large sample sizes, the calculated statistical power for both our studies exceeded the customary norm of 80% or better (β < 0.2).
For Q1, the mean maximum oxygen flow rate for the group with a healthy baseline level of Vitamin-D was 8.6 L/min vs. 12.6 L/min for those with low or borderline levels, yielding a p-value of 0.07 (p < 0.10), with the conclusion that those with a healthy baseline level of Vitamin-D needed statistically significantly lower levels of supplemental oxygen. For Q2, the mean maximum oxygen flow rate for those not administered Vitamin-D was 12.5 L/min vs. 12.8 L/min for those given Vitamin-D, yielding a p-value of 0.87 (p > 0.10). We therefore concluded that there was no statistically significant difference in the use of oxygen therapy between those who were or were not administered Vitamin-D after the fact in the hospital. Discussion/Conclusion: We found that patients who had healthy levels of Vitamin-D at baseline needed statistically significantly lower levels of supplemental oxygen. Vitamin-D is well documented, including in a recent article in the Lancet, for its anti-inflammatory role as an adjuvant in the regulation of cytokines and immune cells. Interestingly, we found no statistically significant advantage to giving Vitamin-D to hospitalized patients. It may be a case of "too little, too late". A randomized clinical trial reported in JAMA also did not find any reduction in the hospital stay of patients given Vitamin-D. Such conclusions come with a caveat: any delayed marginal benefits may not have materialized promptly in the presence of a significant inflammatory condition. Since Vitamin-D is a low-cost, low-risk option, it may still be useful on an inpatient basis until more definitive findings are established.
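The group comparison above is a standard independent two-sample t-test on group summaries. A minimal sketch of the Welch (unequal-variance) form is below; the standard deviations in the usage example are hypothetical, since the abstract reports only means and group sizes:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's two-sample t statistic and degrees of freedom, computed
    from group means (m), standard deviations (s), and sizes (n)."""
    v1, v2 = s1**2 / n1, s2**2 / n2          # squared standard errors
    t = (m1 - m2) / math.sqrt(v1 + v2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Q1 groups (121 vs. 61 patients); SDs of 4.0 L/min are hypothetical
t, df = welch_t(8.6, 4.0, 121, 12.6, 4.0, 61)
```

The p-value is then read from the t distribution with `df` degrees of freedom and compared against the α = 0.10 threshold the study adopted.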

Keywords: COVID-19, vitamin-D, supplemental oxygen, vitamin-D in primary care

Procedia PDF Downloads 135
11122 Genetic Diversity of Sorghum bicolor (L.) Moench Genotypes as Revealed by Microsatellite Markers

Authors: Maletsema Alina Mofokeng, Hussein Shimelis, Mark Laing, Pangirayi Tongoona

Abstract:

Sorghum is one of the most important cereal crops grown for food, feed, and bioenergy. Knowledge of genetic diversity is important for the conservation of genetic resources and the improvement of crop plants through breeding. The objective of this study was to assess the level of genetic diversity among sorghum genotypes using microsatellite markers. A total of 103 sorghum accessions obtained from the Department of Agriculture, Forestry and Fisheries, the African Centre for Crop Improvement, and the Agricultural Research Council-Grain Crops Institute collections in South Africa were genotyped using 30 microsatellite markers. Across all loci analysed, 306 polymorphic alleles were detected, with a mean of 6.4 per locus. The polymorphic information content had an average value of 0.50, with a mean heterozygosity of 0.55, suggesting important genetic diversity within the sorghum genotypes used. Unweighted pair group method with arithmetic mean clustering based on Euclidean coefficients revealed two major distinct groups, without allocating genotypes based on the source of collection or origin. The genotypes 4154.1.1.1, 2055.1.1.1, 4441.1.1.1, 4442.1.1.1, 4722.1.1.1, and 4606.1.1.1 were the most diverse. The sorghum genotypes with high genetic diversity could serve as important sources of novel alleles for breeding and strategic genetic conservation.
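The heterozygosity and polymorphic information content (PIC) values reported above are standard per-locus summaries of allele frequencies. A minimal sketch using the usual formulas follows; the allele frequencies in the example are illustrative, not taken from the study:

```python
def expected_heterozygosity(freqs):
    """Expected heterozygosity at one locus: He = 1 - sum(p_i^2),
    where freqs are the allele frequencies (summing to 1)."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Polymorphic information content for one locus:
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    n = len(freqs)
    cross = sum(2 * freqs[i]**2 * freqs[j]**2
                for i in range(n) for j in range(i + 1, n))
    return 1.0 - sum(p * p for p in freqs) - cross
```

For a locus with two equally frequent alleles, He = 0.5 and PIC = 0.375; averaging these statistics over the 30 markers yields the study-level means quoted above.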

Keywords: genetic diversity, genotypes, microsatellites, sorghum

Procedia PDF Downloads 356
11121 Study of Serum Tumor Necrosis Factor Alpha in Pediatric Patients with Hemophilia A

Authors: Sara Mohammad Atef Sabaika

Abstract:

Background: The development of factor VIII (FVIII) inhibitors and hemophilic arthropathy in patients with hemophilia A (PWHA) is a great challenge for hemophilia care. Both genetic and environmental factors lead to complications in PWHA. The development of inhibitory antibodies is usually induced by the immune response, and polymorphisms in the gene for tumor necrosis factor α (TNF-α), one of the cytokines, might contribute to it. Aim: To study the association between TNF-α level and genotypes in pediatric patients with hemophilia A and its relation to inhibitor development and joint status. Methods: A cross-sectional study was conducted among a sufficient number of PWHA attending the Pediatric Hematology and Oncology Unit, Pediatric Department, Menoufia University Hospital. The clinical parameters, FVIII, FVIII inhibitor, and serum TNF-α level were assessed. Genotyping of the −380G > A TNF-α gene polymorphism was performed using real-time polymerase chain reaction. Results: Among the 50 PWHA, 28 (56%) were identified as severe PWHA. The FVIII inhibitor was identified in 6/28 (21.5%) of severe PWHA. There was a significant correlation between serum TNF-α level and the development of inhibitors (p = 0.043). There was no significant correlation between the −380G > A TNF-α gene polymorphism and hemophilic arthropathy development (p = 0.645). Conclusion: The prevalence of FVIII inhibitors in severe PWHA in Menoufia was 21.5%. The frequency of replacement therapy is a risk factor for inhibitor development. Serum TNF-α level and its gene polymorphism might be used to predict inhibitor development and joint status in pediatric patients with hemophilia A.

Keywords: hemophilic arthropathy, TNF-α, patients with hemophilia A (PWHA), inhibitor

Procedia PDF Downloads 70
11120 Real-World Prevalence of Musculoskeletal Disorders in Nigeria

Authors: F. Fatoye, C. E. Mbada, T. Gebrye, A. O. Ogunsola, C. Fatoye, O. Oyewole

Abstract:

Musculoskeletal disorders (MSDs) are a major cause of pain and disability and are likely to become a greater, and largely unnecessary, economic and public health burden. Thus, reliable prevalence figures are important for both clinicians and policy-makers to plan healthcare for those affected. This study estimated the hospital-based real-world prevalence of MSDs in Nigeria. A review of the medical charts of adult patients attending the Physiotherapy Outpatient Clinic at the Obafemi Awolowo University Teaching Hospitals Complex, Osun State, Nigeria, between 2009 and 2018 was carried out to identify common MSDs, including low back pain (LBP), cervical spondylosis (CSD), post-immobilization stiffness (PIS), sprain, osteoarthritis (OA), and other conditions. The occupational class of the patients was determined using the International Labour Organization (ILO) classification. Data were analysed using descriptive statistics of frequency and percentages. Overall, the medical charts of 3,340 patients were reviewed over the span of ten years (2009 to 2018). The majority of the patients (62.8%) were in the middle class, with the remainder in the low-class (25.1%) and high-class (10.5%) categories. An overall prevalence of 47.35% of MSDs was found within the span of ten years. Of this, the prevalence of LBP, CSD, PIS, sprain, OA, and other conditions was 21.6%, 10%, 18.9%, 2%, 6.3%, and 41.3%, respectively. The highest (14.2%) and lowest (10.5%) prevalence of MSDs was recorded in 2012 and 2018, respectively. The prevalence of MSDs is considerably high among Nigerian patients attending an outpatient physiotherapy clinic. This high prevalence underscores the need for clinicians and decision-makers to put in place appropriate strategies to reduce the prevalence of these conditions. In addition, they should plan and evaluate healthcare services to improve the health outcomes of patients with MSDs.
Further studies are required to determine the economic burden of the condition and examine the clinical and cost-effectiveness of physiotherapy interventions for patients with MSDs.
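The percentages reported above can be turned back into approximate case counts. A minimal sketch, assuming the 47.35% overall prevalence applies to the 3,340 reviewed charts and that the condition-specific percentages are shares of the MSD cases (the source reports rounded figures, so these reconstructed counts are approximate):

```python
# Hypothetical reconstruction of approximate case counts from the
# rounded percentages reported in the abstract.
charts_reviewed = 3340
overall_prevalence = 0.4735  # 47.35% of reviewed charts had an MSD

msd_cases = round(charts_reviewed * overall_prevalence)

# Condition-specific shares of the MSD cases (percent, as reported).
shares = {"LBP": 21.6, "CSD": 10.0, "PIS": 18.9,
          "sprain": 2.0, "OA": 6.3, "other": 41.3}

counts = {dx: round(msd_cases * pct / 100) for dx, pct in shares.items()}
print(msd_cases, counts)
```

Note that the shares sum to roughly 100% (100.1% here), which is consistent with rounding in the source rather than overlapping diagnoses.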

Keywords: musculoskeletal disorders, Nigeria, prevalence, real world

11119 Follicular Thyroid Carcinoma in a Developing Country: A Retrospective Study of 10 Years

Authors: Abdul Aziz, Muhammad Qamar Masood, Saadia Sattar, Saira Fatima, Najmul Islam

Abstract:

Introduction: Thyroid cancer is the most common endocrine tumor. Follicular thyroid carcinoma (FTC) accounts for 5%–10% of all thyroid cancers. Patients with FTC frequently present with more advanced-stage disease and a higher occurrence of distant metastases because of the propensity for vascular invasion. FTC is mainly treated with surgery, with radioactive iodine therapy as the main adjuvant therapy per ATA guidelines. In many developing countries, surgical facilities and radioactive iodine are in short supply; therefore, understanding follicular thyroid cancer trends may help developing countries plan and use resources more effectively. Methodology: This was a retrospective observational study of FTC patients aged 18 years and above conducted at Aga Khan University Hospital, Karachi, from 1st January 2010 to 31st December 2019. Results: There were 404 patients with thyroid carcinoma, of whom forty (10.1%) had FTC. Fifty percent of the patients were in the 41-60 years age group, and the female-to-male ratio was 1.5:1. Twenty-four patients (60%) presented with a complaint of neck swelling, followed by metastasis (20%) and compressive symptoms (20%). The most common site of metastasis was bone (87.5%), followed by lung (12.5%). The pre-operative thyroglobulin level was measured in six of the eight metastatic patients (75%) and was elevated in these patients. This emphasizes the importance of checking the thyroglobulin level in patients with neck swelling and an unusual presentation (bone pain, fractures), and its value in establishing the primary source of the tumor. There was no complete documentation of the ultrasound features of the thyroid gland in all patients, although ultrasound is an important investigation in the initial evaluation of a thyroid nodule. On FNAC, 50% (20 patients) had Bethesda category III-IV nodules, while 10% (4 patients) had Bethesda category II. In sixteen patients, FNAC was not done, as they presented with compressive symptoms or metastasis.
Fifty percent had a total thyroidectomy, and 50% had a subtotal thyroidectomy followed by completion thyroidectomy; in addition, ten patients had lymph node dissection, of whom seven had histopathological lymph node involvement. On histopathology, twenty-three patients (57.5%) had minimally invasive, while seventeen (42.5%) had widely invasive follicular thyroid carcinoma. Capsular invasion was present in thirty-three patients (82.5%); one patient had no capsular invasion but did have vascular invasion, and six patients' histopathology reports had no record of capsular invasion. Lymphovascular invasion was present in twenty-six patients (65%). In this study, 65% of the patients had clinical stage 1 disease, while 25% had stage 2 and 10% had clinical stage 4. Seventeen patients (42.5%) received RAI 30-100 mCi, while ten patients (25%) received more than 100 mCi. Conclusion: The demographic and clinicopathological presentation of FTC in Pakistan is similar to that in other countries. Surgery followed by RAI is the mainstay of treatment. Thus, understanding the trends of FTC, together with proper planning and utilization of resources, will help developing countries treat FTC effectively.

Keywords: thyroid carcinoma, follicular thyroid carcinoma, clinicopathological features, developing countries

11118 Antibiotic Susceptibility Pattern of the Pathogens Isolated from Hospital Acquired Acute Bacterial Meningitis in a Tertiary Health Care Centre in North India

Authors: M. S. Raza, A. Kapil, Sonu Tyagi, H. Gautam, S. Mohapatra, R. Chaudhry, S. Sood, V. Goyal, R. Lodha, V. Sreenivas, B. K. Das

Abstract:

Background: Acute bacterial meningitis remains a major cause of mortality and morbidity. More than half of survivors develop significant lifelong neurological abnormalities. Diagnosis of hospital-acquired acute bacterial meningitis (HAABM) is challenging, as it appears either in post-operative patients or in patients who acquire the organisms from the hospital environment. In both situations, pathogens are exposed to high doses of antibiotics, so the chances of encountering multidrug-resistant organisms are very high. We performed this study to identify the etiological agents of HAABM and their antibiotic susceptibility patterns. Methodology: A prospective study was conducted at the Department of Microbiology, All India Institute of Medical Sciences, New Delhi. From March 2015 to April 2018, a total of 400 cerebrospinal fluid (CSF) samples were collected aseptically. Samples were processed for cell count, Gram staining, and culture. Culture plates were incubated at 37°C for 18-24 hours. Organisms grown on blood and MacConkey agar were identified by MALDI-TOF Vitek MS (bioMérieux, France), and antibiotic susceptibility tests were performed by the Kirby-Bauer disc diffusion method as per the CLSI 2015 guideline. Results: Of the 400 CSF samples processed, 43 (10.75%) were culture positive for different bacteria. Of the 43 isolates, the most prevalent Gram-positive organisms were S. aureus, 4 (9.30%), followed by E. faecium, 3 (6.97%), and CoNS, 2 (4.65%). Similarly, E. coli, 13 (30.23%), was the commonest Gram-negative isolate, followed by A. baumannii, 12 (27.90%), K. pneumoniae, 5 (11.62%), and P. aeruginosa, 4 (9.30%). The Gram-negative isolates were resistant to most of the antibiotics tested. Colistin was most effective, followed by meropenem and imipenem, against all Gram-negative HAABM isolates. Similarly, S. aureus and CoNS were susceptible to most of the antibiotics tested, whereas E. faecium isolates (100%) were susceptible only to vancomycin and teicoplanin.
Conclusion: Hospital-acquired acute bacterial meningitis (HAABM) is becoming an emerging challenge, as most isolates show resistance to commonly used antibiotics. Gram-negative organisms are emerging as the major cause of HAABM. Great care needs to be taken, especially in tertiary care hospitals. Antibiotic stewardship should be followed, and antibiotic susceptibility testing (AST) should be performed regularly to keep antibiotic patterns up to date and to prevent the emergence of resistance. Updated AST information will help in the better management of meningitis patients.
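The descriptive statistics above reduce to two calculations: a culture-positivity rate and per-organism shares of the isolates. A minimal sketch using the counts reported in the abstract (rounded to two decimal places; the source's own rounding may differ in the last digit for some organisms):

```python
# Culture-positivity rate and per-organism shares of the 43 isolates,
# using the counts reported in the abstract.
csf_samples = 400
culture_positive = 43

positivity_pct = round(100 * culture_positive / csf_samples, 2)  # 10.75

isolates = {"S. aureus": 4, "E. faecium": 3, "CoNS": 2,
            "E. coli": 13, "A. baumannii": 12,
            "K. pneumoniae": 5, "P. aeruginosa": 4}

shares = {org: round(100 * n / culture_positive, 2)
          for org, n in isolates.items()}
print(positivity_pct, shares)
```

The per-organism counts sum to 43, confirming the reported breakdown accounts for all culture-positive samples.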

Keywords: CSF, MALDI-TOF, hospital acquired acute bacterial meningitis, AST

11117 Diversability and Diversity: Toward Including Disability/Body-Mind Diversity in Educational Diversity, Equity, and Inclusion

Authors: Jennifer Natalya Fink

Abstract:

Since the racial reckoning of 2020, almost every major educational institution has incorporated diversity, equity, and inclusion (DEI) principles into its administrative, hiring, and pedagogical practices. Yet these DEI principles rarely incorporate explicit language or critical thinking about disability. Despite the fact that, according to the World Health Organization, one in five people worldwide is disabled, making disabled people the largest minority group in the world, disability remains the neglected stepchild of DEI. Drawing on disability studies and crip theory frameworks, the underlying causes of this exclusion of disability from DEI are examined, including stigma, shame, invisible disabilities, institutionalization/segregation/delineation from family, and competing models and definitions of disability. This paper explores both the ideological and practical shifts necessary to include disability in university DEI initiatives, offering positive examples as well as conceptual frameworks such as 'diversability' for doing so. Using Georgetown University's 2020-2022 DEI initiatives as a case study, this paper describes how curricular infusion, accessibility, identity, community, and diversity administration infused one university's DEI initiatives with concrete disability-inclusive measures. It concludes with a consideration of how the very framework of DEI itself might be challenged and transformed if disability were included.

Keywords: diversity, equity, inclusion, disability, crip theory, accessibility

11116 The Burden of Leptospirosis in Terms of Disability Adjusted Life Years in a District of Sri Lanka

Authors: A. M. U. P. Kumari, J. Vidanapathirana, J. Amarasekara, L. Karunanayaka

Abstract:

Leptospirosis is a zoonotic infection with significant morbidity and mortality. As an occupational disease, it has become a global concern due to its disease burden in endemic countries and rural areas. The aim of this study was to assess the disease burden of leptospirosis in terms of disability-adjusted life years (DALYs). A hospital-based descriptive cross-sectional study was conducted on 450 clinically diagnosed leptospirosis patients admitted to base-level and higher hospitals in Monaragala district, Sri Lanka, using a pretested interviewer-administered questionnaire. The patients were followed up after discharge until they returned to normal day-to-day life. Estimation of DALYs was done using laboratory-confirmed leptospirosis patients. The leptospirosis disease burden in Monaragala district was 44.9 DALYs per 100,000 population, comprising 33.18 YLLs and 10.9 YLDs. The incidence of leptospirosis in Monaragala district during the study period was 59.8 per 100,000 population, and the case fatality rate (CFR) was 1.5%, attributed to delays in health-seeking behaviour; 75% of deaths were among males and were due to multi-organ failure. The disease burden of leptospirosis in Monaragala district was significantly high, and urgent efforts to control and prevent leptospirosis should be a priority.
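The DALY metric used above is, by definition, the sum of years of life lost to premature mortality (YLL) and years lived with disability (YLD). A minimal sketch applying that definition to the per-100,000 components reported for Monaragala district; note that the source's rounded total (44.9) differs slightly from the sum of its rounded components:

```python
# DALY = YLL + YLD (standard burden-of-disease definition).
def dalys_per_100k(yll: float, yld: float) -> float:
    """Disability-adjusted life years per 100,000 population."""
    return yll + yld

# Components as reported in the abstract, per 100,000 population.
burden = dalys_per_100k(yll=33.18, yld=10.9)
print(round(burden, 2))  # 44.08 from the rounded components
```

The small gap between 44.08 and the reported 44.9 is consistent with the components having been rounded independently of the total.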

Keywords: human leptospirosis, disease burden, disability-adjusted life years, Sri Lanka

11115 Hemoglobin Levels at a Standalone Dialysis Unit

Authors: Babu Shersad, Partha Banerjee

Abstract:

A reduction in haemoglobin levels has been implicated as a cause of reduced exercise tolerance and of the cardiovascular complications of chronic renal disease. Trends in haemoglobin levels in patients on haemodialysis can serve as an indicator of the efficacy of haemodialysis and of quality of life in haemodialysis patients. In the UAE, the number of patients on dialysis is growing at 10 to 15 per cent per year. The primary mode of haemodialysis in the region is in-patient, hospital-based haemodialysis units. An increased risk of cardiovascular and cerebrovascular morbidity and mortality in pre-dialysis chronic renal disease has been reported. However, data on the health burden of haemodialysis in standalone dialysis facilities are very scarce, mainly due to the paucity of ambulatory centres for haemodialysis in the region. AMSA is the first centre to offer standalone dialysis in the UAE, and a study over a one-year period was performed. Patient data were analyzed using a questionnaire for 45 patients, with an average of 2.5 dialysis sessions per week. All patients were on chronic haemodialysis as outpatients. Trends in haemoglobin levels as an independent variable were evaluated and interpreted in comparison with other parameters of renal function (creatinine, uric acid, blood pressure, and ferritin). The trends indicate an increase in haemoglobin levels with increased supplementation of iron and erythropoietin over time, with concomitant improvement in the adequacy of haemodialysis. This, in turn, correlates with better patient outcomes and has a direct impact on morbidity and mortality. This is a pilot study, and further studies are indicated so that objective parameters can be studied and validated for haemodialysis in the region.

Keywords: haemodialysis, haemoglobin in haemodialysis, haemodialysis parameters, erythropoietic agents in haemodialysis
