Search results for: diagnosis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1953

93 Mycophenolate-Induced Disseminated TB in a PPD-Negative Patient

Authors: Megan L. Srinivas

Abstract:

Individuals with underlying rheumatologic diseases such as dermatomyositis may not adequately respond to tuberculin (PPD) skin tests, creating false-negative results. These illnesses are frequently treated with immunosuppressive therapy, making proper identification of TB infection imperative. A 59-year-old Filipino man was diagnosed with dermatomyositis on the basis of rash, electromyography, and muscle biopsy. He was initially treated with IVIG infusions and transitioned to oral prednisone and mycophenolate. The patient’s symptoms improved on this regimen. Six months after starting mycophenolate, the patient began having fevers, night sweats, and productive cough without hemoptysis. He had moved from the Philippines 5 years prior to his dermatomyositis diagnosis, denied sick contacts, and was PPD negative both at immigration and immediately prior to starting mycophenolate treatment. A third PPD was negative following the onset of these new symptoms. He was treated for community-acquired pneumonia, but symptoms worsened over 10 days and he developed watery diarrhea and a growing non-tender, non-mobile mass on the left side of his neck. A chest x-ray demonstrated a cavitary lesion in the right upper lobe suspicious for TB that had not been present one month earlier. Chest CT corroborated this finding, also exhibiting necrotic hilar and paratracheal lymphadenopathy. Neck CT demonstrated the left-sided mass to be cervical chain lymphadenopathy. Expectorated sputum and stool samples contained acid-fast bacilli (AFB), with cultures growing TB bacteria. Fine-needle biopsy of the neck mass (scrofula) also exhibited AFB. Brain MRI showed nodular enhancement suspected to be a tuberculoma. Mycophenolate was discontinued and dermatomyositis treatment was switched to oral prednisone with a 3-day course of IVIG. The patient’s infection showed sensitivity to standard RIPE (rifampin, isoniazid, pyrazinamide, and ethambutol) treatment. Within a week of starting RIPE, the patient’s diarrhea subsided, the scrofula diminished, and symptoms significantly improved. By the end of treatment week 3, the patient’s sputum no longer contained AFB; he was removed from isolation and was discharged to continue RIPE at home. He was discharged on oral prednisone, which effectively addressed his dermatomyositis. This case illustrates the unreliability of PPD tests in patients with long-term inflammatory diseases such as dermatomyositis. Other immunosuppressive therapies (adalimumab, etanercept, and infliximab) have been associated with conversion of latent TB to disseminated TB. Mycophenolate is another immunosuppressive agent with similar mechanistic properties. Thus, it is imperative that patients with long-term inflammatory diseases and high-risk TB factors who are initiating immunosuppressive therapy receive a TB blood test (such as a QuantiFERON Gold assay) prior to the initiation of therapy to ensure that latent TB is unmasked before it can evolve into a disseminated form of the disease.

Keywords: dermatomyositis, immunosuppressant medications, mycophenolate, disseminated tuberculosis

Procedia PDF Downloads 181
92 The Technique of Mobilization of the Colon for Pull-Through Procedure in Hirschsprung's Disease

Authors: Medet K. Khamitov, Marat M. Ospanov, Vasiliy M. Lozovoy, Zhenis N. Sakuov, Dastan Z. Rustemov

Abstract:

With a high rectosigmoid transitional zone in children with Hirschsprung’s disease, the upper rectal, sigmoid, and left colon arteries are ligated during the pull-through of the descending part of the colon. As a result, the inferior mesenteric artery ceases to participate in the blood supply to the descending part of the colon, and the pulled-through colon is supplied with blood only by the middle colon artery, which originates from the superior mesenteric artery. Insufficient blood supply to the pulled-through colon causes chronic hypoxia of the intestinal wall or necrosis of the pulled-through descending colon. Some surgeons prefer to preserve the left colon artery; however, this can stretch the mesentery, which may lead to bowel retraction, anastomotic leaks, and stenosis. Chronic hypoxia of the pulled-through colon, in turn, is the cause of acquired (secondary) aganglionosis. The highest frequency of anastomotic leaks is observed in children older than five years. The purpose is to reduce the risk of complications in the pull-through procedure of the descending part of the colon in patients with Hirschsprung’s disease by ensuring its sufficient mobility and maintaining blood supply from the inferior mesenteric artery. Methodology: Two children aged 5 and 7 years with Hirschsprung’s disease were operated on in the hospital in Nur-Sultan. The diagnosis was made using an x-ray contrast enema and histological examination. Operational technique: After revision of the left part of the colon and assessment of the architectonics of its blood vessels, parietal mobilization of the affected sigmoid and rectum was performed via laparotomy access, while maintaining the arterial and venous terminal arcades of the sigmoid vessels. Then, the descending branch of the left colon artery was crossed (if the length of the pulled-through intestine is insufficient, the left colon artery itself may also be crossed). This manipulation provides additional mobility of the pulled-through descending part of the colon. The resulting "windows" in the mesentery of the pulled-through intestine were sutured to prevent the development of an internal hernia. A well-perfused, sufficiently long graft was formed from the transverse colon loops at the splenic angle and the descending part of the colon, with blood supply from both the superior and inferior mesenteric arteries, and was brought down freely, without tension, to the rectal zone with a coloanal anastomosis 1.5 cm above the dentate line. Results: The postoperative period was uneventful. Patients were discharged on the 7th day. The observation was carried out for six months. In no case was there bowel retraction, anastomotic leak, anastomotic stenosis, or other complications. Conclusion: The presented technique of mobilization of the colon for the pull-through procedure in a high rectosigmoid transitional zone of Hirschsprung’s disease makes it possible to maintain normal blood supply to the distal part of the colon and to avoid tension on the colon. The technique reduces the risk of anastomotic leak, bowel necrosis, and chronic ischemia, and excludes colon retraction and anastomotic stenosis.

Keywords: blood supply, children, colon mobilization, Hirschsprung's disease, pull-through

Procedia PDF Downloads 122
91 Structural and Functional Correlates of Reaction Time Variability in a Large Sample of Healthy Adolescents and Adolescents with ADHD Symptoms

Authors: Laura O’Halloran, Zhipeng Cao, Clare M. Kelly, Hugh Garavan, Robert Whelan

Abstract:

Reaction time (RT) variability on cognitive tasks provides an index of the efficiency of executive control processes (e.g., attention and inhibitory control) and is considered to be a hallmark of clinical disorders, such as attention-deficit/hyperactivity disorder (ADHD). Increased RT variability is associated with structural and functional brain differences in children and adults with various clinical disorders, as well as poorer task performance accuracy. Furthermore, the strength of functional connectivity across various brain networks, such as the negative relationship between the task-negative default mode network and task-positive attentional networks, has been found to reflect differences in RT variability. Although RT variability may provide an index of attentional efficiency, as well as being a useful indicator of neurological impairment, the brain substrates associated with RT variability remain relatively poorly defined, particularly in a healthy sample. Method: Firstly, we used the intra-individual coefficient of variation (ICV) as an index of RT variability from “Go” responses on the Stop Signal Task. We then examined the functional and structural neural correlates of ICV in a large sample of 14-year-old healthy adolescents (n=1719). Of these, a subset had elevated symptoms of ADHD (n=80) and was compared to a matched non-symptomatic control group (n=80). Brain activity during successful and unsuccessful inhibitions and gray matter volume were each compared with ICV. A mediation analysis was conducted to examine whether specific brain regions mediated the relationship between ADHD symptoms and ICV. Lastly, we looked at functional connectivity across various brain networks and quantified both positive and negative correlations during “Go” responses on the Stop Signal Task. Results: The brain data revealed that higher ICV was associated with increased structural and functional brain activation in the precentral gyrus in the whole sample and in adolescents with ADHD symptoms. Lower ICV was associated with lower activation in the anterior cingulate cortex (ACC) and medial frontal gyrus in the whole sample and in the control group. Furthermore, our results indicated that activation in the precentral gyrus (Brodmann area 4) mediated the relationship between ADHD symptoms and behavioural ICV. Conclusion: This is the first study to investigate the functional and structural correlates of ICV collectively in a large adolescent sample. Our findings demonstrate a concurrent increase in brain structure and function within task-active prefrontal networks as a function of increased RT variability. Furthermore, structural and functional activation patterns in the ACC and medial frontal gyrus play a role in optimizing top-down control in order to maintain task performance. Our results also evidenced clear differences in brain morphometry between adolescents with symptoms of ADHD but without clinical diagnosis and typically developing controls. Our findings shed light on specific functional and structural brain regions that are implicated in ICV and yield insights into effective cognitive control in healthy individuals and in clinical groups.
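
As a point of reference for the RT variability measure used above, the sketch below shows how the intra-individual coefficient of variation is conventionally computed (standard deviation of Go-trial reaction times divided by their mean). The abstract does not spell out its exact computation, so the formula and the example reaction times here are illustrative assumptions.

```python
import numpy as np

def intra_individual_cv(go_rts_ms):
    """Intra-individual coefficient of variation (ICV) of reaction times.

    Assumes the conventional definition ICV = SD(RT) / mean(RT), computed
    over correct "Go" responses; the abstract does not state the exact formula.
    """
    rts = np.asarray(go_rts_ms, dtype=float)
    return rts.std(ddof=1) / rts.mean()

# Hypothetical Go-trial reaction times (ms) for one participant
example_rts = [412, 389, 455, 501, 398, 420, 610, 377, 433, 462]
print(f"ICV = {intra_individual_cv(example_rts):.3f}")
```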

Keywords: ADHD, fMRI, reaction-time variability, default mode, functional connectivity

Procedia PDF Downloads 227
90 Targeting Apoptosis by Novel Adamantane Analogs as an Emerging Therapy for the Treatment of Hepatocellular Carcinoma Through EGFR, Bcl-2/BAX Cascade

Authors: Hanan M. Hassan, Laila Abouzeid, Lamya H. Al-Wahaibi, George S. G. Shehatou, Ali A. El-Emam

Abstract:

Cancer is a major public health problem and the second leading cause of death worldwide. In 2020, cancer diagnosis and treatment were negatively affected by the coronavirus disease 2019 (COVID-19) pandemic. During the quarantine, because of limited access to healthcare and the need to avoid exposure to COVID-19 as a contagious disease, cancer patients suffered deferments in follow-up and treatment regimens, leading to substantial worsening of disease, death, and increased healthcare costs. Thus, this study was designed to investigate the molecular mechanisms by which adamantane derivatives attenuate hepatocellular carcinoma experimentally and theoretically. There is a close association between increased resistance to anticancer drugs and defective apoptosis, which is considered a causative factor for oncogenesis. Cancer cells use different molecular pathways to inhibit apoptosis; BAX and Bcl-2 proteins have essential roles in the progression or inhibition of intrinsic apoptotic pathways triggered by mitochondrial dysfunction. Therefore, their balance ratio can determine the cellular apoptotic fate. In this study, the in vitro cytotoxic effects of seven synthetic adamantyl isothiourea derivatives were evaluated against five human tumor cell lines by MTT assay. Compounds 5 and 6 showed the best results, mostly against hepatocellular carcinoma (HCC). Hence, in vivo studies were performed in male Sprague-Dawley (SD) rats in which experimental hepatocellular carcinoma was induced with thioacetamide (TAA) (200 mg/kg, i.p., twice weekly) for 16 weeks. The most promising compounds, 5 and 6, were administered to treat the liver cancer rats at a dose of 10 mg/kg/day for an additional two weeks, and the effects were compared with doxorubicin (DR), the anticancer drug. Hepatocellular carcinoma was evidenced by a dramatic increase in liver indices, oxidative stress markers, and immunohistochemical findings that were accompanied by a plethora of inflammatory mediators and alterations in the apoptotic cascade. Our results showed that treatment with adamantane derivatives 5 and 6 significantly suppressed fibrosis, inflammation, and other histopathological insults, resulting in diminished hepatocyte tumorigenesis. Moreover, administration of the tested compounds resulted in amelioration of EGFR protein expression, upregulation of BAX, and lowering of Bcl-2 levels, proving their role as apoptosis inducers. Also, the docking simulations performed for adamantane showed a good fit and binding to the EGFR protein through hydrogen bond formation with conserved amino acids, which gives strong evidence for its hepatoprotective effect. In most analyses, the effects of compound 6 were more comparable to DR than those of compound 5. Our findings suggest that adamantane derivatives 5 and 6 have cytotoxic activity against HCC in vitro and in vivo by more than one mechanism, possibly by inhibiting the TLR4-MyD88-NF-κB pathway and targeting EGFR signaling.

Keywords: adamantane, EGFR, HCC, apoptosis

Procedia PDF Downloads 126
89 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes

Authors: Madushani Rodrigo, Banuka Athuraliya

Abstract:

In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming traditional approaches to healthcare. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, accurately identifying fracture locations and types is necessary. Interpretation of X-ray images relies on the expertise and experience of medical professionals, and radiographic images are sometimes of low quality, leading to potential misidentification. Therefore, a proper approach is needed to accurately localize and classify fractures in real time. The research has revealed that the optimal approach needs to address the stated problem and employ appropriate radiographic image processing techniques and object detection algorithms. These algorithms should effectively localize and accurately classify all types of fractures with high precision and in a timely manner. In order to overcome the challenges of misidentifying fractures, a distinct model for fracture localization and classification has been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16. In parallel, a fracture segmentation model has been implemented using an enhanced U-Net architecture. Combining the results of these two models, the FracXpert system can accurately localize exact fracture locations along with fracture types from the 12 available fracture patterns, which include avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intraarticular, longitudinal, oblique, pathological, and spiral. The system also generates a confidence score indicating the degree of confidence in the predicted result. The fracture segmentation model, based on the enhanced U-Net architecture, achieved a high accuracy of 99.94%, demonstrating its precision in identifying fracture locations, while the classification ensemble model built on ResNet18 and VGG16 achieved an accuracy of 81.0%, showcasing its ability to categorize various fracture patterns, which is instrumental in the fracture treatment process. In conclusion, FracXpert is a promising ML application in sports medicine, demonstrating its potential to revolutionize fracture detection processes. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.
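
For readers less familiar with ensemble classifiers, the following is a minimal PyTorch sketch of how a two-backbone ensemble like the one described (ResNet18 and VGG16 voting over 12 fracture classes) might be assembled. The soft-voting fusion, the randomly initialized weights, and all other details are illustrative assumptions; the abstract does not describe FracXpert's actual fusion rule or training setup.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 12  # avulsion, comminuted, ..., spiral (per the abstract)

class FractureEnsemble(nn.Module):
    """Averages class probabilities from ResNet18 and VGG16 backbones.

    Simple soft-voting fusion; FracXpert's real fusion strategy is not
    reported in the abstract, so this is an illustrative assumption.
    """
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.resnet = models.resnet18()  # randomly initialized here
        self.resnet.fc = nn.Linear(self.resnet.fc.in_features, num_classes)
        self.vgg = models.vgg16()
        self.vgg.classifier[6] = nn.Linear(self.vgg.classifier[6].in_features, num_classes)

    def forward(self, x):
        p1 = torch.softmax(self.resnet(x), dim=1)
        p2 = torch.softmax(self.vgg(x), dim=1)
        return (p1 + p2) / 2  # soft voting over the 12 fracture classes

if __name__ == "__main__":
    model = FractureEnsemble().eval()
    dummy = torch.randn(1, 3, 224, 224)  # one radiograph-sized RGB tensor
    with torch.no_grad():
        probs = model(dummy)
    print(probs.argmax(dim=1).item(), probs.max().item())
```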

Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16

Procedia PDF Downloads 43
88 Dental Caries among Children in Bazartete, Timor-Leste and the Impact of the Maluk Timor School Outreach Program

Authors: A. Flaviana da Silva, B. Jenifer Apriani Ximenes, C. Efigenia dos Santos Pereira

Abstract:

The World Health Organization's 2022 Global Oral Health Status Report reveals a staggering statistic: more than 3.5 billion people, or half of the world's population, currently suffer from untreated oral diseases, encompassing issues like tooth loss, gum disease, and oral cancers. Among these, dental caries, commonly known as tooth decay, affects over 2.5 billion people globally. Dental caries results from acid erosion of teeth due to plaque build-up and consumption of free sugars. Despite being preventable through basic measures such as regular tooth-brushing with fluoride toothpaste and reduced sugar intake, untreated dental caries poses a significant and growing public health crisis. In children, dental caries stands as the most prevalent non-communicable disease worldwide, affecting 60-90% of school children to some extent. This condition severely impacts their physical, emotional, and social well-being, hindering essential activities and overall quality of life. Timor-Leste, a small nation in South-east Asia, grapples with the escalating problem of childhood dental caries, exacerbated by its unique challenges, including poor access to healthcare services and limited resources. Methods: This study analysed secondary, cross-sectional data collected by Maluk Timor in 2022 during the School Outreach Program. A total of 1,008 children aged 4-16 from eight primary schools in the Bazartete administrative post were examined for dental caries in their primary and permanent teeth. All students were invited to participate, and consent was obtained from parents and children. A team comprising one dentist and two dental nurses conducted health promotion sessions, dental examinations, and SDF treatment. A screening form based on WHO guidelines collected demographic data and caries diagnosis, categorized as decayed or healthy. Data analysis involved entering the data into Google Sheets, verifying its accuracy, and importing it into Microsoft Excel for analysis. Variables were created to identify students with carious lesions, and prevalence tables were generated, stratified by age group, gender, and location. Results: Among the 1,008 children analysed, 58.3% had dental caries. Caries prevalence was higher in primary teeth (36.7%) compared to permanent teeth (29.5%). Conclusion: This report highlights the alarming prevalence of dental caries among children in Timor-Leste and the efforts of Maluk Timor's School Outreach Program in addressing this critical issue. The results emphasize the need for effective preventive measures and improved access to oral healthcare in this region.

Keywords: dental caries, timor-leste, oral health, children, public health, primary health care, teeth

Procedia PDF Downloads 12
87 Epidemiology of Healthcare-Associated Infections among Hematology/Oncology Patients: Results of a Prospective Incidence Survey in a Tunisian University Hospital

Authors: Ezzi Olfa, Bouafia Nabiha, Ammar Asma, Ben Cheikh Asma, Mahjoub Mohamed, Bannour Wadiaa, Achour Bechir, Khelif Abderrahim, Njah Mansour

Abstract:

Background: In hematology/oncology, healthcare improvement has allowed increasingly aggressive management in diagnostic and therapeutic procedures. Nevertheless, these intensified procedures have been associated with a higher risk of healthcare-associated infections (HAIs). We undertook this study to estimate the burden of HAIs in cancer patients in an onco-hematology unit in a Tunisian university hospital. Materials/Methods: A prospective, observational study, based on active surveillance over a period of 6 months from March through September 2016, was undertaken in the department of onco-hematology in a university hospital in Tunisia. Patients who stayed in the unit for ≥ 48 h were followed until hospital discharge. The Centers for Disease Control and Prevention (CDC) criteria for site-specific infections were used as standard definitions for HAIs. Results: One hundred fifty patients were included in the study. The gender distribution was 33.3% girls and 66.6% boys. They had a mean age of 23.12 years (SD = 18.36 years). The main patient diagnosis was Acute Lymphoblastic Leukemia (ALL): 48.7% (n=73). The mean length of stay was 21 days +/- 18 days. Almost 8% of patients had an implantable port (n=12), 34.9% (n=52) had a lumbar puncture, and 42.7% (n=64) had a medullary puncture. Chemotherapy was instituted in 88% of patients (n=132). Eighty (53.3%) patients had neutropenia at admission. The incidence rate of HAIs was 32.66% per patient; the incidence density was 15.73 per 1000 patient-days in the unit. The mortality rate was 9.3% (n=14), and 50% of the deaths were caused by HAIs. The most frequent episodes of infection were: infection of skin and superficial mucosa (5.3%), pulmonary aspergillosis (4.6%), healthcare-associated pneumonia (HAP) (4%), central venous catheter-associated infection (4%), digestive infection (5%), and primary bloodstream infection (2.6%). Finally, the incidence rate of fever of unknown origin (FUO) was 14%. In cases of skin and superficial infection (n=8), 4 episodes were documented, and the organisms implicated were Escherichia coli, Geotrichum capitatum and Proteus mirabilis. For pulmonary aspergillosis, 6 cases were diagnosed clinically and radiologically, and one was proved by a positive Aspergillus antigen in bronchial aspiration. Only one patient died due to this infection. For HAP (6 cases), four episodes were diagnosed clinically and radiologically. No bacterial etiology was established in these cases. Two patients died due to HAP. For primary bloodstream infection (4 cases), the implicated organisms were Enterobacter cloacae, Geotrichum capitatum, Klebsiella pneumoniae, and Streptococcus pneumoniae. Conclusion: This type of prospective study is an indispensable tool for internal quality control. It is necessary to evaluate preventive measures and design control guides and strategies aimed at reducing the HAI rate and the morbidity and mortality associated with infection in a hematology/oncology unit.

Keywords: cohort prospective studies, healthcare associated infections, hematology oncology department, incidence

Procedia PDF Downloads 358
86 The Macrophage Migration Inhibitory Factor and Stem Cell Factor Levels in Serum of Adolescent and Young Adults with Mood Disorders: A Two Year Follow-Up Study

Authors: Aleksandra Rajewska-Rager, Maria Skibinska, Monika Dmitrzak-Weglarz, Natalia Lepczynska, Pawel Kapelski, Joanna Pawlak, Joanna Hauser

Abstract:

Introduction: Inflammation and cytokines have emerged as a promising target in mood disorders research; however, there is still a very limited number of studies regarding inflammatory alterations among adolescents and young adults with mood disorders. The macrophage migration inhibitory factor (MIF) and stem cell factor (SCF) are pleiotropic cytokines which may play an important role in the pathophysiology of mood disorders. The aim of this study was to investigate the serum levels of these factors in adolescents and young adults with mood disorders compared to healthy controls. Subjects: We enrolled 79 patients aged 12-24 years in a 2-year follow-up study with a primary diagnosis of mood disorders: bipolar disorder (BP) and unipolar disorder with BP spectrum. The study group included 23 males (mean age 19.08, SD 3.3) and 56 females (18.39, SD 3.28). The control group consisted of 35 persons: 7 males (20.43, SD 4.23) and 28 females (21.25, SD 2.11). Clinical diagnoses according to DSM-IV-TR criteria were assessed using the Kiddie Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version (K-SADS-PL) and the Structured Clinical Interview for the Diagnostic and Statistical Manual (SCID) in adolescents and young adults, respectively. Clinical assessment included evaluation of clinical factors and symptom severity (rated using the Hamilton Depression Rating Scale and Young Mania Rating Scale). Clinical and biological evaluations were made at control visits at baseline (week 0), euthymia (at month 3 or 6), and after 12 and 24 months. Methods: Serum protein concentrations were determined by the enzyme-linked immunosorbent assay (ELISA) method. Human MIF and SCF DuoSet ELISA kits were used. Non-parametric tests were used in the analyses: Mann-Whitney U test, Kruskal-Wallis ANOVA, Friedman’s ANOVA, Wilcoxon signed-rank test, and Spearman correlation. We defined statistical significance as p < 0.05. Results: Comparing MIF and SCF levels between the acute episode of depression/hypo/mania at baseline and euthymia (at month 3 or 6), we did not find any statistical differences. At baseline, patients above 18 years of age had decreased MIF levels compared to patients younger than 18 years. MIF level at baseline positively correlated with age (p=0.004). Positive correlations of SCF level at months 3 and 6 with depression or mania occurrence at month 24 (p=0.03 and p=0.04, respectively) were detected. Strong correlations between MIF and SCF levels at baseline (p=0.0005) and month 3 (p=0.03) were observed. Discussion: Our results did not show any differences in MIF and SCF levels between acute episodes of depression/hypo/mania and euthymia in young patients. Further studies on larger groups are recommended. The grant was funded by the National Science Center in Poland, no. 2011/03/D/NZ5/06146.
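
A brief sketch of the kind of non-parametric analysis listed above: a Mann-Whitney U comparison of patients versus controls and a Spearman correlation of MIF level with age, using SciPy. The data, variable names, and units are hypothetical; only the choice of tests mirrors the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical serum MIF levels (pg/mL) in patients (n=79) vs. controls (n=35)
mif_patients = rng.lognormal(mean=6.0, sigma=0.4, size=79)
mif_controls = rng.lognormal(mean=5.8, sigma=0.4, size=35)

# Mann-Whitney U test: patients vs. controls (two-sided)
u_stat, p_group = stats.mannwhitneyu(mif_patients, mif_controls, alternative="two-sided")

# Spearman correlation: MIF level vs. age within the patient group
ages = rng.integers(12, 25, size=79)
rho, p_corr = stats.spearmanr(mif_patients, ages)

print(f"Mann-Whitney U p = {p_group:.3f}; Spearman rho = {rho:.2f} (p = {p_corr:.3f})")
```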

Keywords: cytokines, MIF, mood disorders, SCF

Procedia PDF Downloads 177
85 The Role of Uterine Artery Embolization in the Management of Postpartum Hemorrhage

Authors: Chee Wai Ku, Pui See Chin

Abstract:

As an emerging alternative to hysterectomy, uterine artery embolization (UAE) has been widely used in the management of fibroids and in controlling postpartum hemorrhage (PPH) unresponsive to other therapies. Research has shown UAE to be a safe, minimally invasive procedure with few complications and minimal effects on future fertility. We present two cases highlighting the use of UAE in preventing PPH in a patient with a large fibroid at the time of cesarean section and in the treatment of secondary PPH refractory to other therapies in another patient. The first case is a 36-year-old primiparous woman who booked at 18+6 weeks gestation with a 13.7 cm subserosal fibroid at the lower anterior wall of the uterus near the cervix and a 10.8 cm subserosal fibroid in the left wall. Prophylactic internal iliac artery occlusion balloons were placed prior to the planned classical midline cesarean section. The balloons were inflated once the baby was delivered, and both uterine arteries were embolized subsequently. The estimated blood loss (EBL) was 400 mls and hemoglobin (Hb) remained stable at 10 g/dL. An ultrasound scan 2 years postnatally showed stable uterine fibroids of 10.4 and 7.1 cm, significantly smaller than before. The second case is a 40-year-old G2P1 with a previous cesarean section for failure to progress. There were no antenatal problems, and there was no placenta previa. She presented in labour at term and underwent an emergency cesarean section for failed vaginal birth after cesarean. Intraoperatively, extensive adhesions were noted with the bladder drawn high, and EBL was 300 mls. Postpartum recovery was uneventful. She presented with secondary PPH 3 weeks later, complicated by hypovolemic shock. She underwent an emergency examination under anesthesia and evacuation of the uterus, with EBL of 2500 mls. Histology showed decidua with chronic inflammation. She was discharged well with no further PPH. She subsequently returned one week later with secondary PPH. Bedside ultrasound showed that the endometrium was thin with no evidence of retained products of conception. Uterotonics were administered, and examination under anesthesia was performed, with uterine Bakri balloon and vaginal pack insertion afterwards. EBL was 1000 mls. There was no definite cause of PPH, with no uterine atony or retained products of conception. To evaluate a potential cause, a pelvic angiogram and superselective left uterine arteriogram were performed, which showed profuse contrast extravasation and acute bleeding from the left uterine artery. Superselective embolization of the left uterine artery was performed. No gross contrast extravasation from the right uterine artery was seen. These two cases demonstrate the efficacy of UAE: firstly, the prophylactic use of intra-arterial balloon catheters in pregnant patients with large fibroids, and secondly, the diagnosis and management of secondary PPH refractory to uterotonics and uterine tamponade. In both cases, the need for laparotomy and hysterectomy was avoided, resulting in the preservation of future fertility. UAE should be a consideration for hemodynamically stable patients in centres with access to interventional radiology.

Keywords: fertility preservation, secondary postpartum hemorrhage, uterine embolization, uterine fibroids

Procedia PDF Downloads 165
84 Global Health Student Selected Components in Undergraduate Medical Education: Analysis of Student Feedback and Reflective Writings

Authors: Harriet Bothwell, Lowri Evans, Kevin Jones

Abstract:

Background: The University of Bristol provides all medical students the opportunity to undertake student selected components (SSCs) at multiple stages of the undergraduate programme. SSCs enable students to explore areas of interest that are not necessarily covered by the curriculum. Students are required to produce a written report, and most use SSCs as an opportunity to undertake an audit or small research project. In 2013, Swindon Academy, based at the Great Western Hospital, offered eight students the opportunity of a global health SSC which included a two-week trip to a rural hospital in Uganda. This SSC has since expanded, and in 2017 a total of 20 students had the opportunity to undertake small research projects at two hospitals in rural Uganda. 'Tomorrow's Doctors' highlights the importance of understanding healthcare from a 'global perspective', and student feedback from previous SSCs suggests that self-assessed knowledge of global health increases as a result of this SSC. Through the most recent version of this SSC, students had the opportunity to undertake projects in a wide range of specialties, including paediatrics, palliative care, surgery and medical education. Methods: An anonymous online questionnaire was made available to students following the SSC. There was a response rate of 80%, representing 16 out of the 20 students. This questionnaire surveyed students' satisfaction and experience of the SSC, including the level of academic, project and spiritual support provided, as well as perceived challenges in completing the project and barriers to healthcare delivery in the low-resource setting. The survey had multiple open questions, allowing the collection of qualitative data. Further qualitative data were collected from the students' project reports. The suggested format included a reflection, and all students completed these. All qualitative data underwent thematic analysis. Results: All respondents rated the overall experience of the SSC as 'good' or 'excellent'. Preliminary data suggest that students' confidence in their knowledge of global health, diagnosis of tropical diseases and management of tropical diseases improved after completing this SSC. Thematic analysis of students' reflections is ongoing but suggests that students gain far more than improved knowledge of tropical diseases. Students reflect positively on having the opportunity to conduct research in a low-resource setting and feel that by completing these projects they will be 'useful' to the hospital. Several students reflect on the stark contrast with healthcare delivery in the UK and recognise the 'privilege' of having a healthcare system that is free at the point of access. Some students noted the different approaches that clinicians in Uganda had to training, 'taking ownership' of their own learning. Conclusions: Students completing this SSC report increased knowledge of global health and tropical medicine. However, their reflections reveal much broader learning outcomes and demonstrate considerable insight into multiple topics, including conducting research in the low-resource setting, training, and healthcare inequality.

Keywords: global health, medical education, student feedback, undergraduate

Procedia PDF Downloads 104
83 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribonucleic acid (mRNA) molecules encode proteins. These protein-encoding mRNA molecules (which collectively constitute the transcriptome), when analyzed by RNA sequencing (RNAseq), unveil the nature of gene expression. The obtained gene expression profiles provide clues to cellular traits and their dynamics, which can be studied in relation to function and responses. RNAseq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present avenues for exposition of the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, autoimmune diseases, and hematopoietic-based diseases, among others, from investigated biological tissue samples. Single cell transcriptomics helps conduct a direct assessment of each building unit of tissues (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNAseq), which enables high-throughput gene expression studies. However, this technique generates expression data for many cells while lacking the cells’ positional coordinates within the tissue. As science develops, the complementary use of pre-established tissue reference maps built with molecular and bioinformatics techniques has emerged and is now used to resolve this setback, producing both levels of data in one shot of scRNAseq analysis. This is an emerging conceptual approach to methodology for integrative and progressively dependable transcriptomics analysis. It can support in-situ fashioned analysis for a better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and for therapeutic targets in drug development, and exposit the nature of cell-to-cell interactions. These are also vital genomic signatures and characterizations for clinical applications. Over the past decades, RNAseq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. On the other side, spatial transcriptomics is tissue-level based and is utilized to study biological specimens with heterogeneous features. It exposits the gross identity of investigated mammalian tissues, which can then be used to study cell differentiation, track cell line trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures that will be assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, studying varying quantities of cell lines with avenues for appropriate resolution, both have made the study of gene expression from mRNA molecules interesting, progressive, and developmental, helping to tackle health challenges head-on.

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 97
82 Management of Femoral Neck Stress Fractures at a Specialist Centre and Predictive Factors to Return to Activity Time: An Audit

Authors: Charlotte K. Lee, Henrique R. N. Aguiar, Ralph Smith, James Baldock, Sam Botchey

Abstract:

Background: Femoral neck stress fractures (FNSF) are uncommon, making up 1 to 7.2% of stress fractures in healthy subjects. FNSFs are prevalent in young women, military recruits, endurance athletes, and individuals with energy deficiency syndrome or the female athlete triad. Presentation is often non-specific, and the condition is often misdiagnosed following the initial examination. There is limited research addressing the return-to-activity time after FNSF. Previous studies have demonstrated prognostic time predictions based on various imaging techniques. Here, (1) OxSport clinic FNSF practice standards are retrospectively reviewed, (2) FNSF cohort demographics are examined, and (3) regression models are used to predict return-to-activity prognosis and consequently determine bone stress risk factors. Methods: Patients with a diagnosis of FNSF attending the OxSport clinic between 01/06/2020 and 01/01/2020 were selected from the Rheumatology Assessment Database Innovation in Oxford (RhADiOn) and the OxSport Stress Fracture Database (n = 14). (1) Clinical practice was audited against five criteria based on local and National Institute for Health and Care Excellence guidance, with a 100% standard. (2) Demographics of the FNSF cohort were examined with Student's t-test. (3) Lastly, linear regression and Random Forest regression models were used on this patient cohort to predict return-to-activity time. Consequently, an analysis of feature importance was conducted after fitting each model. Results: OxSport clinical practice met the standard (100%) in 3/5 criteria. The criteria not met were patient waiting times and documentation of all bone stress risk factors. Importantly, analysis of patient demographics showed that, of the population with complete bone stress risk factor assessments, 53% were positive for modifiable bone stress risk factors. Lastly, linear regression analysis was utilized to identify demographic factors that predicted return-to-activity time [R2 = 79.172%; average error 0.226]. This analysis identified four key variables that predicted return-to-activity time: vitamin D level, total hip DEXA T value, femoral neck DEXA T value, and history of an eating disorder/disordered eating. Furthermore, Random Forest regression models were employed for this task [R2 = 97.805%; average error 0.024]. Analysis of the importance of each feature again identified a set of 4 variables, 3 of which matched the linear regression analysis (vitamin D level, total hip DEXA T value, and femoral neck DEXA T value), and the fourth: age. Conclusion: OxSport clinical practice could be improved by more comprehensively evaluating bone stress risk factors. The importance of this evaluation is demonstrated by the proportion of the population found positive for these risk factors. Using this cohort, potential bone stress risk factors that significantly impacted return-to-activity prognosis were identified using regression models.
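
A minimal sketch of the two model families described above: an ordinary linear regression and a Random Forest regression predicting return-to-activity time, followed by feature-importance inspection. The feature names mirror the predictors identified in the audit, but the data, preprocessing, and hyperparameters are assumptions, as the abstract does not report them.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 14  # cohort size reported in the abstract

# Hypothetical predictor matrix with the features identified in the audit
X = pd.DataFrame({
    "vitamin_d_level": rng.normal(60, 20, n),
    "total_hip_dexa_t": rng.normal(-0.5, 1.0, n),
    "femoral_neck_dexa_t": rng.normal(-0.8, 1.0, n),
    "eating_disorder_history": rng.integers(0, 2, n),
    "age": rng.integers(18, 40, n),
})
y = rng.normal(16, 4, n)  # return-to-activity time in weeks (illustrative)

# Fit both model families on the same predictors
lin = LinearRegression().fit(X, y)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

print("Linear R^2:", lin.score(X, y))
print("Random Forest R^2:", rf.score(X, y))
print("Random Forest feature importances:")
for name, imp in sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"  {name}: {imp:.2f}")
```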

Keywords: eating disorder, bone stress risk factor, femoral neck stress fracture, vitamin D

Procedia PDF Downloads 159
81 Neuroanatomical Specificity in Reporting & Diagnosing Neurolinguistic Disorders: A Functional & Ethical Primer

Authors: Ruairi J. McMillan

Abstract:

Introduction: This critical analysis aims to ascertain how well neuroanatomical aetiologies are communicated within 20 case reports of aphasia. Neuroanatomical visualisations based on dissected brain specimens were produced and combined with white matter tract and vascular taxonomies of function in order to address the most consistently underreported features found within the aphasic case study reports. Together, these approaches are intended to integrate aphasiological knowledge from the past 20 years with aphasiological diagnostics, and to act as prototypal resources for both researchers and clinical professionals. The medico-legal precedent for aphasia diagnostics under Canadian, US and UK case law and the neuroimaging/neurological diagnostics relative to the functional capacity of aphasic patients are discussed in relation to the major findings of the literature analysis, neuroimaging protocols in clinical use today, and the neuroanatomical aetiologies of different aphasias. Basic Methodology: Literature searches of relevant scientific databases (e.g., Ovid MEDLINE) were carried out using search terms such as aphasia case study (year) and stroke-induced aphasia case study. A series of 7 diagnostic reporting criteria were formulated, and the resulting case studies were scored out of 7 alongside clinical stroke criteria. In order to focus on the diagnostic assessment of the patient's condition, only the case report proper (not the discussion) was used to quantify results. Statistical testing established whether specific reporting criteria were associated with higher overall scores and potentially inferable increases in quality of reporting. Statistical testing of whether criteria scores were associated with an unclear/adjusted diagnosis was also performed, as well as of the probability of a given criterion deviating from an expected estimate. Major Findings: The quantitative analysis of neuroanatomically driven diagnostics in case studies of aphasia revealed particularly low scores in the connection of neuroanatomical functions to aphasiological assessment (10%) and in the inclusion of white matter tracts within neuroimaging or assessment diagnostics (30%). Case studies which included clinical mention of white matter tracts within the report itself were distributed among higher scoring cases, as were case studies which (as clinically indicated) related the affected vascular region to the brain parenchyma of the language network. Concluding Statement: These findings indicate that certain neuroanatomical functions are integrated less often within the patient report than others, despite a precedent for well-integrated neuroanatomical aphasiology also being found among the case studies sampled, and despite these functions being clinically essential in diagnostic neuroimaging and aphasiological assessment. Therefore, ultimately, the integration and specificity of aetiological neuroanatomy may contribute positively to the capacity and autonomy of aphasic patients as well as their clinicians. The integration of a full aetiological neuroanatomy within the reporting of aphasias may improve patient outcomes and sustain autonomy in the event of medico-ethical investigation.

Keywords: aphasia, language network, functional neuroanatomy, aphasiological diagnostics, medico-legal ethics

Procedia PDF Downloads 35
80 Separation of Urinary Proteins with Sodium Dodecyl Sulphate Polyacrylamide Gel Electrophoresis in Patients with Secondary Nephropathies

Authors: Irena Kostovska, Katerina Tosheska Trajkovska, Svetlana Cekovska, Julijana Brezovska Kavrakova, Hristina Ampova, Sonja Topuzovska, Ognen Kostovski, Goce Spasovski, Danica Labudovic

Abstract:

Background: Proteinuria is an important feature of secondary nephropathies. The quantitative and qualitative analysis of proteinuria plays an important role in determining the type of proteinuria (glomerular, tubular or mixed) and in the diagnosis and prognosis of secondary nephropathies. Damage to the glomerular basement membrane is responsible for a proteinuria characterized by the presence of large amounts of proteins with high molecular weights, such as albumin (69 kilodaltons, kD), transferrin (78 kD) and immunoglobulin G (150 kD). An insufficiency of proximal tubular function is the cause of a proteinuria characterized by the presence of proteins with low molecular weight (LMW), such as retinol-binding protein (21 kD) and α1-microglobulin (31 kD). In some renal diseases, a mixed glomerular and tubular proteinuria is frequently seen. Sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) is the most widely used method of analyzing urine proteins for clinical purposes. The main aim of the study was to determine the type of proteinuria in the most common secondary nephropathies, such as diabetic nephropathy, hypertensive nephropathy and preeclampsia. Material and methods: This study included 90 subjects: subjects with diabetic nephropathy (n=30), subjects with hypertensive nephropathy (n=30) and pregnant women with preeclampsia (n=30). We divided all subjects according to the urinary microalbumin/creatinine ratio (UM/CR) into three subgroups: macroalbuminuric (UM/CR > 300 mg/g), microalbuminuric (UM/CR 30-300 mg/g) and normoalbuminuric (UM/CR < 30 mg/g). In all subjects, we measured microalbumin and creatinine in urine with standard biochemical methods. Separation of urinary proteins was performed by SDS-PAGE in several stages: linear gel preparation (4-22%), treatment of urinary samples before their application on the gel, electrophoresis, gel fixation, staining with Coomassie blue, and identification of the separated protein fractions based on standards with exactly known molecular weights. Results: According to the UM/CR, in the group of subjects with diabetic nephropathy, nine patients were macroalbuminuric, while 21 subjects were microalbuminuric. In the group of subjects with hypertensive nephropathy, we found macroalbuminuria (n=4), microalbuminuria (n=20) and normoalbuminuria (n=6). All pregnant women with preeclampsia were macroalbuminuric. Electrophoretic separation of urinary proteins showed that among macroalbuminuric patients with diabetic nephropathy, 56% had mixed proteinuria, 22% had glomerular proteinuria and 22% had tubular proteinuria. In the subgroup of subjects with diabetic nephropathy and microalbuminuria, 52% had glomerular proteinuria, 8% had tubular proteinuria, and 40% had normal electrophoretic findings. All patients with macroalbuminuria and hypertensive nephropathy had mixed proteinuria. In the subgroup of patients with microalbuminuria and hypertensive nephropathy, we found: 32% with mixed proteinuria, 27% with normal findings, 23% with tubular, and 18% with glomerular proteinuria. In all normoalbuminuric patients with hypertensive nephropathy, we detected normal electrophoretic findings. In the group of pregnant women with preeclampsia, we found: 81% with mixed proteinuria, 13% with glomerular, and 8% with tubular proteinuria. Conclusion: By the SDS-PAGE method, we detected that in patients with secondary nephropathies the most common type of proteinuria is mixed proteinuria, indicating both loss of glomerular permeability and loss of tubular function. We can conclude that SDS-PAGE is a highly sensitive method for the detection of renal impairment in patients with secondary nephropathies.

Keywords: diabetic nephropathy, preeclampsia, hypertensive nephropathy, SDS PAGE

Procedia PDF Downloads 119
79 A Genetic Identification of Candida Species Causing Intravenous Catheter-Associated Candidemia in Heart Failure Patients

Authors: Seyed Reza Aghili, Tahereh Shokohi, Shirin Sadat Hashemi Fesharaki, Mohammad Ali Boroumand, Bahar Salmanian

Abstract:

Introduction: Intravenous catheter-associated fungal infection as a nosocomial infection continues to be a serious problem among hospitalized patients, decreasing quality of life and adding healthcare costs. The role of catheters in the spread of candidemia in heart failure patients is obvious. The aim of this study was to evaluate the prevalence and genetic identification of Candida species in patients with heart disorders. Material and Methods: This study was conducted at the Tehran Hospital of Cardiology Center (Tehran, Iran, 2014) over 1.5 years on patients hospitalized for at least 7 days who had a central or peripheral vein catheter. Cultures of catheters, blood, and skin at the catheter insertion site were used to detect Candida colonies in 223 patients. Identification of Candida species was made on the basis of a combination of various phenotypic methods and confirmed by sequencing the ITS1-5.8S-ITS2 region amplified from genomic DNA using PCR and the NCBI BLAST. Results: Of the 223 patients tested, we identified a total of 15 Candida isolates obtained from 9 (4.04%) catheter cultures, 3 (1.35%) blood cultures and 2 (0.90%) skin cultures of the catheter insertion areas. On the basis of ITS region sequencing, out of nine Candida isolates from catheters, 5 (55.6%) C. albicans, 2 (22.2%) C. glabrata, 1 (11.1%) C. membranifaciens and 1 (11.1%) C. tropicalis were identified. Among the three Candida isolates from blood culture, C. tropicalis, C. carpophila and C. membranifaciens were identified. The non-Candida yeast isolated from one blood culture was Cryptococcus albidus. One case of C. glabrata and one case of C. albicans were isolated from skin cultures of the catheter insertion areas in patients with positive catheter cultures. In these patients, the ITS region of the rDNA sequence showed a similarity between the Candida isolated from the skin and the catheter. However, the blood samples of these patients were negative for fungal growth. We report two cases of catheter-related candidemia caused by C. membranifaciens and C. tropicalis on the basis of the genetic similarity of the species isolated from blood and catheter, which were treated successfully with intravenous fluconazole and catheter removal. With phenotypic identification methods, we could only identify C. albicans and C. tropicalis, and the other yeast isolates were diagnosed as Candida sp. Discussion: Although more than 200 species of Candida have been identified, only a few cause disease in humans. There is some evidence that non-albicans infections are increasing. Many risk factors, including prior antibiotic therapy, use of a central venous catheter, surgery, and parenteral nutrition, are considered to be associated with candidemia in hospitalized heart failure patients. Identifying the route of infection in candidemia is difficult. Non-albicans Candida as the cause of candidemia is increasing dramatically. Using conventional methods, many non-albicans isolates remain unidentified. Thus, using more sensitive and specific molecular genetic sequencing to clarify aspects of the epidemiology of unknown Candida species infections is essential. The positive blood and catheter cultures for Candida isolates and the high percentage of similarity of their ITS regions of rDNA sequence in these two patients confirmed the diagnosis of intravenous catheter-associated candidemia.

Keywords: catheter-associated infections, heart failure patient, molecular genetic sequencing, ITS region of rDNA, Candidemia

Procedia PDF Downloads 304
78 Prevalence of Occupational Asthma Diagnosed by Specific Challenge Test in 5 Different Working Environments in Thailand

Authors: Sawang Saenghirunvattana, Chao Saenghirunvattana, Maria Christina Gonzales, Wilai Srimuk, Chitchamai Siangpro, Kritsana Sutthisri

Abstract:

Introduction: Thailand is one of the fastest growing countries in Asia. It has shifted from an agricultural to an industrialized economy. Workplaces have shifted from farms to factories, offices and streets, where employees are exposed to certain chemicals and pollutants causing occupational diseases, particularly asthma. Work-related diseases are a major concern, and many studies have been published to demonstrate certain professions and their exposures that elevate the risk of asthma. Workers who exhibit coughing, wheezing and difficulty of breathing are brought to a healthcare setting where a Pulmonary Function Test (PFT) is performed and, based on the results, they are then diagnosed with asthma. These patients, known to have occupational asthma, eventually get well when removed from the exposure of the environment. Our study focused on performing PFT, or a specific challenge test, in diagnosing workers with occupational asthma by having them perform the test within their workplace, maintaining the environment and their daily exposure to certain levels of chemicals and pollutants. This has provided us with an understanding and reliable diagnosis of occupational asthma. Objective: To identify the prevalence of Thai workers who develop asthma caused by exposure to pollutants and chemicals from their working environment by conducting interviews and performing PFT, or a specific challenge test, in their workplaces. Materials and Methods: This study was performed from January-March 2015 in Bangkok, Thailand. The percentage of abnormal symptoms among 940 workers in 5 different areas (plastic, fertilizer, and animal food factories, offices, and streets) was collected through a questionnaire. The demographic information, occupational history, and state of health were determined using a questionnaire and checklists. PFT was executed in their workplaces, and the results were measured and evaluated. Results: The pulmonary function test was performed by 940 participants. The specific challenge test was done in plastic, fertilizer, and animal food factories, office environments, and on the streets of Thailand. Of the 100 participants working in the plastic industry, 65% complained of having respiratory symptoms. None of them had an abnormal PFT. Of the 200 participants who worked with fertilizers and are exposed to sulfur dioxide, 20% complained of having symptoms and 8% had an abnormal PFT. Of the 300 subjects working with animal food, 45% complained of respiratory symptoms and 15% had abnormal PFT results. In the office environment, where there is indoor pollution, out of 140 subjects, 7% had symptoms and 4% had an abnormal PFT. Of the 200 workers exposed to traffic pollution, 24% reported respiratory symptoms and 12% had an abnormal PFT. Conclusion: We were able to identify and diagnose participants with occupational asthma through their abnormal lung function tests done at their workplaces. The chemical agents and exposures were determined; therefore, as part of effective management, workers with occupational asthma were advised to avoid further exposure for better chances of recovery. Further studies identifying the risk factors and causative agents of asthma in workplaces should be developed to encourage interventional strategies and programs that will prevent occupation-related diseases, particularly asthma.

Keywords: occupational asthma, pulmonary function test, specific challenge test, Thailand

Procedia PDF Downloads 281
77 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. The hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes and lithotripters) and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. The choice of the source is due to the fact that the database contains complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering, and the k-medoids algorithm was applied. The sampled objects were used as the centers of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was selected using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The obtained predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of the hospital by medical personnel. The research reveals strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential, which makes it possible to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
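
A compact sketch of the clustering-then-forecast-evaluation workflow outlined above: country-level time series are partitioned with a small PAM-style k-medoids routine, the number of clusters is chosen with the silhouette coefficient, and forecast error is scored with MAPE. The data, the Euclidean distance choice, and the k-medoids implementation are illustrative assumptions rather than the authors' actual code.

```python
import numpy as np
from sklearn.metrics import silhouette_score

def k_medoids(dist, k, n_iter=100, seed=0):
    """Tiny PAM-style k-medoids on a precomputed distance matrix."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(dist.shape[0], size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members) == 0:
                continue
            # pick the member minimizing total distance to the cluster
            costs = dist[np.ix_(members, members)].sum(axis=1)
            new_medoids[c] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(dist[:, medoids], axis=1), medoids

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs((actual - predicted) / actual)) * 100

# Hypothetical panel: 28 countries x 10 years of a demand indicator
rng = np.random.default_rng(1)
series = rng.normal(100, 10, size=(28, 10)).cumsum(axis=1)

# Euclidean distances between raw time series (the series themselves are
# the clustering objects, as in the abstract)
dist = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=2)

# Choose the number of clusters with the silhouette coefficient
best_k, best_score = None, -1.0
for k in range(2, 7):
    labels, _ = k_medoids(dist, k)
    score = silhouette_score(dist, labels, metric="precomputed")
    if score > best_score:
        best_k, best_score = k, score
print(f"chosen k = {best_k}, silhouette = {best_score:.2f}")

# Example forecast error for one country (naive last-value forecast)
actual, forecast = series[0, -3:], np.repeat(series[0, -4], 3)
print(f"MAPE = {mape(actual, forecast):.2f}%")
```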

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 114
76 A Patient-Centered Approach to Clinical Trial Development: Real-World Evidence from a Canadian Medical Cannabis Clinic

Authors: Lucile Rapin, Cynthia El Hage, Rihab Gamaoun, Maria-Fernanda Arboleda, Erin Prosk

Abstract:

Introduction: Sante Cannabis (SC), a Canadian group of clinics dedicated to medical cannabis, based in Montreal and in the province of Quebec, has served more than 8000 patients seeking cannabis-based treatment over the past five years. As randomized clinical trials with natural medical cannabis are scarce, real-world evidence offers the opportunity to fill research gaps between scientific evidence and clinical practice. Data on the use of medical cannabis products from SC patients were prospectively collected, leading to a large real-world database on the use of medical cannabis. The aim of this study was to report information on the profiles of both patients and prescribed medical cannabis products at SC clinics, and to assess the safety of medical cannabis among Canadian patients. Methods: This is an observational retrospective study of 1342 adult patients who were authorized medical cannabis products between October 2017 and September 2019. Information regarding demographic characteristics, therapeutic indications for medical cannabis use, patterns in dosing and dosage form of medical cannabis, and adverse effects over a one-year follow-up (initial and 4 follow-up (FUP) visits) was collected. Results: 59% of SC patients were female, with a mean age of 56.7 years (SD = 15.6, range 19-97). Cannabis products were authorized mainly for patients with a diagnosis of chronic pain (68.8% of patients), cancer (6.7%), neurological disorders (5.6%), and mood disorders (5.4%). At the initial visit, a large majority (70%) of patients were authorized exclusively medical cannabis products, 27% were authorized a combination of pharmaceutical cannabinoids and medical cannabis, and 3% were prescribed only pharmaceutical cannabinoids. This pattern was recurrent over the one-year follow-up. Overall, oil was the preferred formulation (average over visits 72.5%), followed by a combination of oil and dry (average 19%); other routes of administration accounted for less than 4%. Patients were predominantly prescribed products with a balanced THC:CBD ratio (59%-75% across visits). 28% of patients reported at least one adverse effect (AE) at the 3-month follow-up visit and 12% at the six-month FUP visit. 84.8% of total AEs were mild and transient. No serious AE was reported. Overall, the most common side effects reported were dizziness (11.95% of total AEs), drowsiness (11.4%), dry mouth (5.5%), nausea (4.8%), headaches (4.6%), cough (4.4%), anxiety (4.1%), and euphoria (3.5%). Other adverse effects accounted for less than 3% of total AEs. Conclusion: Our results confirm that the primary area of clinical use for medical cannabis is pain management. Patients in this cohort are largely utilizing plant-based cannabis oil products with a balanced ratio of THC:CBD. Reported adverse effects were mild and included dizziness and drowsiness. This real-world data confirms the tolerable safety profile of medical cannabis and suggests medical indications not yet validated in controlled clinical trials. Such data offers an important opportunity for the investigation of the long-term effects of cannabinoid exposure in real-life conditions. Real-world evidence can be used to direct clinical trial research efforts on specific indications and dosing patterns for product development.

Keywords: medical cannabis, safety, real-world data, Canada

Procedia PDF Downloads 102
75 Recurrent Neural Networks for Classifying Outliers in Electronic Health Record Clinical Text

Authors: Duncan Wallace, M-Tahar Kechadi

Abstract:

In recent years, Machine Learning (ML) approaches have been successfully applied to the analysis of patient symptom data in the context of disease diagnosis, at least where such data is well codified. However, much of the data present in Electronic Health Records (EHR) is unlikely to prove suitable for classic ML approaches. Furthermore, as such data are widely spread across both hospitals and individuals, a decentralized, computationally scalable methodology is a priority. The focus of this paper is to develop a method to predict outliers in an out-of-hours healthcare provision center (OOHC). In particular, our research is based upon the early identification of patients who have underlying conditions which will cause them to repeatedly require medical attention. An OOHC acts as an ad-hoc provider of triage and treatment, where interactions occur without recourse to a full medical history of the patient in question. Medical histories relating to patients contacting an OOHC may reside in several distinct EHR systems in multiple hospitals or surgeries, which are unavailable to the OOHC in question. As such, although a local solution is optimal for this problem, it follows that the data under investigation is incomplete, heterogeneous, and composed mostly of noisy textual notes compiled during routine OOHC activities. Through the use of Deep Learning methodologies, the aim of this paper is to provide the means to identify patient cases, upon initial contact, which are likely to relate to such outliers. To this end, we compare the performance of Long Short-Term Memory, Gated Recurrent Units, and combinations of both with Convolutional Neural Networks. A further aim of this paper is to elucidate the discovery of such outliers by examining the exact terms which provide a strong indication of positive and negative case entries. While free text is the principal data extracted from EHRs for classification, EHRs also contain normalized features. Although the specific demographic features treated within our corpus are relatively limited in scope, we examine whether it is beneficial to include such features among the inputs to our neural network, or whether these features are more successfully exploited in conjunction with a different form of classifier. We therefore compare the performance of randomly generated regression trees and support vector machines and determine the extent to which our classification program can be improved by using either of these machine learning approaches in conjunction with the output of our Recurrent Neural Network application. The output of our neural network is also used to help determine the most significant lexemes present within the corpus for identifying high-risk patients. By combining the confidence of our classification program in relation to lexemes within true positive and true negative cases with the inverse document frequency of the lexemes related to these cases, we can determine which features act as the primary indicators of frequent-attender and non-frequent-attender cases, providing a human-interpretable appreciation of how our program classifies cases.
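
A minimal sketch of the kind of recurrent text classifier compared above is shown below. It is an illustration under assumptions, not the authors' implementation: the Keras layer choices, vocabulary size, and the frequent-attender framing of the output are placeholders. It stacks a convolutional layer in front of an LSTM, mirroring the LSTM/GRU/CNN combinations mentioned in the abstract, and outputs a probability that a free-text OOHC note belongs to a frequent-attender case.

```python
# Hedged sketch: a CNN + LSTM binary classifier over tokenised clinical notes.
# Hyperparameters and the frequent-attender labelling are illustrative assumptions.
import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary size after tokenisation

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),          # token embeddings
    tf.keras.layers.Conv1D(64, 5, activation="relu"),     # local n-gram features
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.LSTM(64),                             # could be GRU(64) instead
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # P(frequent attender)
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])

# x_train: integer-encoded, padded note sequences; y_train: 1 = frequent attender.
# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=32)
```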

Keywords: artificial neural networks, data-mining, machine learning, medical informatics

Procedia PDF Downloads 103
74 A Retrospective Study: Correlation between Enterococcus Infections and Bone Carcinoma Incidence

Authors: Sonia A. Stoica, Lexi Frankel, Amalia Ardeljan, Selena Rashid, Ali Yasback, Omar Rashid

Abstract:

Introduction: Enterococcus is a vast genus of lactic acid bacteria comprising gram-positive cocci species. They are common commensal organisms in the intestines of humans: E. faecalis (90–95%) and E. faecium (5–10%). Rare groups of infections can occur with other species, including E. casseliflavus, E. gallinarum, and E. raffinosus. The most common infections caused by Enterococcus include urinary tract infections, biliary tract infections, subacute endocarditis, diverticulitis, meningitis, septicemia, and spontaneous bacterial peritonitis. The treatment for sensitive strains of these bacteria includes ampicillin, penicillin, cephalosporins, or vancomycin, while the treatment for resistant strains includes daptomycin, linezolid, tigecycline, or streptogramins. Enterococcus faecalis CECT7121 is an encouraging candidate probiotic strain. E. faecalis CECT7121 enhances and skews the profile of cytokines toward the Th1 phenotype in situations such as vaccination, anti-tumoral immunity, and allergic reactions. It also enhances the secretion of high levels of IL-12, IL-6, TNF alpha, and IL-10. Cytokines have previously been associated with the development of cancer. The intention of this study was therefore to evaluate the correlation between Enterococcus infections and the incidence of bone carcinoma. Methods: A retrospective cohort study (2010-2019) was conducted through a Health Insurance Portability and Accountability Act (HIPAA) compliant national database using International Classification of Disease (ICD) 9th and 10th codes for bone carcinoma diagnosis in a previously Enterococcus-infected population. Patients were matched for age range and Charlson Comorbidity Index (CCI). Access to the database was granted by Holy Cross Health for academic research. The chi-squared test was used to assess statistical significance. Results: A total of 17,056 patients was obtained in the Enterococcus-infected group as well as in the control population (matched by age range and CCI score). Subsequent bone carcinoma development was seen at a rate of 1.07% (184) in the Enterococcus-infected group and 3.42% (584) in the control group, respectively. The difference was statistically significant (p = 2.2x10⁻¹⁶), with an Odds Ratio of 0.355 (95% CI 0.311-0.404). Treatment for Enterococcus infection was analyzed and controlled for in both the Enterococcus-infected and noninfected populations. 78 out of 6,624 (1.17%) patients with a prior Enterococcus infection who were treated with antibiotics were compared to 202 out of 6,624 (3.04%) patients with no history of Enterococcus infection (control) who received antibiotic treatment. Both populations subsequently developed bone carcinoma. Results remained statistically significant (p<2.2x10-), Odds Ratio = 0.456 (95% CI 0.396-0.525). Conclusion: This study shows a statistically significant correlation between Enterococcus infection and a decreased incidence of bone carcinoma. The immunologic response of the organism to Enterococcus infection may exert a protective mechanism against developing bone carcinoma. Further exploration is needed to identify the potential mechanism of Enterococcus in reducing bone carcinoma incidence.
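
The headline comparison can be reproduced in outline from the counts reported above. The sketch below is illustrative only: it builds a simple 2x2 table from 184/17,056 exposed and 584/17,056 control patients, so the crude odds ratio it prints will not exactly match the published matched-cohort estimate of 0.355.

```python
# Hedged sketch: chi-squared test and crude odds ratio from the reported counts.
# The published analysis used a matched cohort, so the figures will differ slightly.
from scipy.stats import chi2_contingency

exposed_cases, exposed_total = 184, 17056      # prior Enterococcus infection
control_cases, control_total = 584, 17056      # matched controls

table = [
    [exposed_cases, exposed_total - exposed_cases],
    [control_cases, control_total - control_cases],
]

chi2, p_value, dof, _ = chi2_contingency(table)
crude_or = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])

print(f"chi2 = {chi2:.1f}, p = {p_value:.3g}, crude OR = {crude_or:.3f}")
```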

Keywords: anti-tumoral immunity, bone carcinoma, enterococcus, immunologic response

Procedia PDF Downloads 158
73 Knowledge and Attitude Towards Strabismus Among Adult Residents in Woreta Town, Northwest Ethiopia: A Community-Based Study

Authors: Henok Biruk Alemayehu, Kalkidan Berhane Tsegaye, Fozia Seid Ali, Nebiyat Feleke Adimassu, Getasew Alemu Mersha

Abstract:

Background: Strabismus is a visual disorder in which the eyes are misaligned and point in different directions. Untreated strabismus can lead to amblyopia, loss of binocular vision, and social stigma due to its appearance. Since knowledge is assumed to be pertinent to the early screening and prevention of strabismus, the main objective of this study was to assess knowledge of and attitudes toward strabismus in Woreta town, Northwest Ethiopia. Providing data in this area is important for planning health policies. Methods: A community-based cross-sectional study was done in Woreta town from April to May 2020. The sample size was determined using a single population proportion formula, taking a 50% proportion of good knowledge, a 95% confidence level, a 5% margin of error, and a 10% non-response rate. Accordingly, the final computed sample size was 424. All four kebeles were included in the study. There were 42,595 people in total, with 39,684 adults and 9,229 households. A sampling fraction 'k' was obtained by dividing the number of households by the calculated sample size of 424. Systematic random sampling with proportional allocation was used to select the participating households, with a sampling interval (k) of 21, i.e., every 21st household was included in the study. One individual was selected randomly from each household with more than one adult, using the lottery method, to obtain the final sample. The data were collected through face-to-face interviews with a pretested, semi-structured questionnaire, which was translated from English to Amharic and back to English to maintain its consistency. Data were entered using Epi Data version 3.1, then processed and analyzed via SPSS version 20. Descriptive and analytical statistics were employed to summarize the data. A p-value of less than 0.05 was used to declare statistical significance. Results: A total of 401 individuals aged over 18 years participated, with a response rate of 94.5%. Of those who responded, 56.6% were male. Of all the participants, 36.9% were illiterate. The proportion of people with poor knowledge of strabismus was 45.1%. It was shown that 53.9% of the respondents had a favorable attitude. Older age, higher educational level, having a history of eye examination, and having a family history of strabismus were significantly associated with good knowledge of strabismus. A higher educational level, older age, and having heard about strabismus were significantly associated with a favorable attitude toward strabismus. Conclusion and recommendation: The proportions of good knowledge and favorable attitude towards strabismus were lower than previously reported in Gondar City, Northwest Ethiopia. There is a need to provide health education and promotion campaigns on strabismus to the community: what strabismus is, its possible treatments, and the need to bring children to the eye care center for early diagnosis and treatment. The study advocates for future research to employ qualitative designs and suggests exploring studies that investigate cause-effect relationships.
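
The sample-size arithmetic described above can be checked in a few lines. This is a sketch of the standard single population proportion formula, not code from the study; the household count and the 10% non-response adjustment are taken from the abstract, and rounding up at each step reproduces the reported 424 participants and sampling interval of 21.

```python
# Hedged sketch: single population proportion sample size, plus the systematic
# sampling interval, using the figures reported in the abstract.
import math

z = 1.96        # 95% confidence level
p = 0.50        # assumed proportion of good knowledge
d = 0.05        # margin of error

n0 = math.ceil((z ** 2) * p * (1 - p) / d ** 2)   # 385 (384.16 rounded up)
n = math.ceil(n0 * 1.10)                          # add 10% non-response -> 424
households = 9229
k = households // n                               # systematic sampling interval -> 21

print(n0, n, k)                                   # 385 424 21
```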

Keywords: strabismus, knowledge, attitude, Woreta

Procedia PDF Downloads 33
72 An Epidemiological Study on Cutaneous Melanoma, Basocellular and Epidermoid Carcinomas Diagnosed in a Sunny City in Southeast Brazil in a Five-Year Period

Authors: Carolina L. Cerdeira, Julia V. F. Cortes, Maria E. V. Amarante, Gersika B. Santos

Abstract:

Skin cancer is the most common cancer in several parts of the world; in a tropical country like Brazil, the situation is no different. The Brazilian population is exposed to high levels of solar radiation, increasing the risk of developing cutaneous carcinoma. To encourage prevention measures and the early diagnosis of these tumors, a study was carried out analyzing data on cutaneous melanomas and basal cell and epidermoid carcinomas, using as the primary data source the medical records of 161 patients registered at a pathology service that performs skin biopsies in a city of Minas Gerais, Brazil. All patients diagnosed with skin cancer at this service from January 2015 to December 2019 were included. The incidence of skin carcinoma cases was correlated with histological type, sex, age group, and topographic location. Correlation between variables was verified by Fisher's exact test at a nominal significance level of 5%, with statistical analysis performed in the R software. A significant association was observed between age group and type of cancer (p = 0.0085); age group and sex (p = 0.0298); and type of cancer and body region affected (p < 0.01). The 161 cases analyzed comprised 93 basal cell carcinomas, 66 epidermoid carcinomas, and only two cutaneous melanomas. In the group aged 19 to 30 years, the epidermoid form was most prevalent; from 31 to 45 and from 46 to 59 years, the basal cell form prevailed; in those aged 60 or over, both types had higher frequencies. Associating age group and sex, in the groups aged 18 to 30 and 46 to 59 years, women were the most affected. In the 31-to-45-year-old group, men predominated. There was a gender balance in the group aged 60 or over. As for topography, there was a high prevalence in the head and neck, followed by the upper limbs. Relating histological type and topography, there was a prevalence of basal cell and epidermoid carcinomas in the head and neck. In the chest, the basal cell form was most prevalent; in the upper limbs, the epidermoid form prevailed. Cutaneous melanoma affected only the chest and upper limbs. About 82% of patients aged 60 or over had head and neck cancer; in the groups aged 46 to 59 and 60 or over, the head and neck region and upper limbs were predominantly affected; the distribution was balanced in the 31-to-45-year-old group. In conclusion, basal cell carcinoma was predominant, whereas cutaneous melanoma was the rarest among the types analyzed. Patients aged 60 or over were the most affected, with a gender balance. In young adults, there was a prevalence of the epidermoid form; in middle-aged patients, basal cell carcinoma was predominant; in the elderly, both forms presented with higher frequencies. There was a higher incidence of head and neck cancers, followed by malignancies affecting the upper limbs. The epidermoid type manifested significantly in the upper limbs. Body regions such as the thorax and lower limbs were less affected, which is explained by the lower exposure of these areas to incident solar radiation.
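
As an illustration of the association test named above, the sketch below runs Fisher's exact test on a hypothetical 2x2 collapse of the data (scipy only handles 2x2 tables, whereas the study's R analysis can test larger cross-tabulations). The counts are invented placeholders that merely respect the reported totals of 93 basal cell and 66 epidermoid cases; they are not figures from the study.

```python
# Hedged sketch: Fisher's exact test on a 2x2 table, mirroring the kind of
# association test (type of cancer vs. body region) reported in the abstract.
# The counts below are placeholders, NOT the study's actual cross-tabulation.
from scipy.stats import fisher_exact

#                 head & neck   other sites
table = [[60, 33],     # basal cell carcinoma (assumed split of the 93 cases)
         [35, 31]]     # epidermoid carcinoma (assumed split of the 66 cases)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")   # significant if p < 0.05
```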

Keywords: basal cell carcinoma, cutaneous melanoma, skin cancer, squamous cell carcinoma, topographic location

Procedia PDF Downloads 104
71 Improving the Uptake of Community-Based Multidrug-Resistant Tuberculosis Treatment Model in Nigeria

Authors: A. Abubakar, A. Parsa, S. Walker

Abstract:

Despite advances made in the diagnosis and management of drug-sensitive tuberculosis (TB) over the past decades, treatment of multidrug-resistant tuberculosis (MDR-TB) remains challenging and complex, particularly in high-burden countries, including Nigeria. Treatment of MDR-TB is cost-prohibitive, with success rates generally lower than for drug-sensitive TB, and if care is not taken, it may become the dominant form of TB in the future, with many treatment uncertainties and substantial morbidity and mortality. Addressing these challenges requires collaborative efforts through sustained research to evaluate the current treatment guidelines, particularly in high-burden countries, and to prevent the progression of resistance. To the best of our knowledge, there has been no research exploring the acceptability, effectiveness, and cost-effectiveness of a community-based MDR-TB treatment model in Nigeria, which is among the high-burden countries. A previous, similar qualitative study looked at the home-based management of MDR-TB in rural Uganda. This research aimed to explore patients' views and acceptability of the community-based MDR-TB treatment model and to evaluate and compare the effectiveness and cost-effectiveness of the community-based versus hospital-based MDR-TB treatment models of care from the Nigerian perspective. Knowledge of patients' views and acceptability of the community-based MDR-TB treatment approach would help in designing future treatment recommendations and in health policymaking. Accordingly, knowledge of effectiveness and cost-effectiveness is part of the evidence needed to inform a decision about whether and how to scale up MDR-TB treatment, particularly in poor-resource settings with limited knowledge of TB. Mixed methods using qualitative and quantitative approaches were employed. Qualitative data were obtained using in-depth semi-structured interviews with 21 MDR-TB patients in Nigeria to explore their views and acceptability of the community-based MDR-TB treatment model. Qualitative data collection followed an iterative process, which allowed adaptation of topic guides until data saturation. In-depth interviews were analyzed using thematic analysis. Quantitative data on treatment outcomes were obtained from the medical records of MDR-TB patients to determine effectiveness; direct and indirect costs were obtained from the patients using a validated questionnaire, and health system costs from the donor agencies, to determine the cost-effectiveness difference between the community-based and hospital-based models from the Nigerian perspective. Findings: Several themes emerged from the patients' perspectives, indicating a preference for and high acceptability of the community-based MDR-TB treatment model, together with mixed feelings about the risk of MDR-TB transmission within the community due to poor infection control. The modeling of the quantitative data is still in progress. Community-based MDR-TB care was seen as the acceptable and most preferred model of care by the majority of the participants because of its convenience, which in turn enhanced recovery, enabled social interaction, offered more psychosocial benefits, and averted productivity loss. However, there is a need to strengthen this model of care through enhanced strategies that ensure guideline compliance and infection control in order to prevent the progression of resistance and curtail community transmission.
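
The quantitative comparison described above is, in essence, an incremental cost-effectiveness calculation. The sketch below shows that calculation in its generic form only; the cost and outcome figures are invented placeholders, since the study's own modeling results were still in progress, and the ICER framing is a standard assumption rather than something the abstract names explicitly.

```python
# Hedged sketch: incremental cost-effectiveness ratio (ICER) comparing
# community-based with hospital-based MDR-TB care. All numbers are placeholders.

def icer(cost_a, effect_a, cost_b, effect_b):
    """Extra cost per extra unit of effect of strategy A over strategy B."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Placeholder inputs: mean cost per patient (USD) and treatment success rate.
community = {"cost": 4000.0, "success": 0.78}
hospital = {"cost": 9000.0, "success": 0.72}

value = icer(community["cost"], community["success"],
             hospital["cost"], hospital["success"])
# A negative value means community care is both cheaper and more effective
# (i.e., it dominates hospital care) under these placeholder inputs.
print(f"ICER = {value:,.0f} USD per additional treatment success")
```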

Keywords: acceptability, cost-effectiveness, multidrug-resistant TB treatment, community and hospital approach

Procedia PDF Downloads 98
70 Development of the Drug Abuse Health Information System in Thai Community

Authors: Waraporn Boonchieng, Ekkarat Boonchieng, Sivaporn Aungwattana, Decha Tamdee, Wongamporn Pinyavong

Abstract:

Drug addiction represents one of the most important public health issues in both developed and developing countries. The purpose of this study was to develop a drug abuse health information system in a community in Northern Thailand using a developmental research design. The developmental researchers performed four phases to develop the drug abuse health information system: 1) synthesizing knowledge related to drug abuse prevention and identifying the components of the drug abuse health information system; 2) developing the system as a mobile application and website; 3) implementing the system in the rural community; and 4) evaluating the feasibility of the system. Data collection involved both qualitative and quantitative procedures. The qualitative and quantitative data were analyzed using content analysis and descriptive statistics, respectively. The findings of this study showed that the drug abuse health information system consisted of five sections: drug-related prevention knowledge for teens, drug-related knowledge for adults and professionals, a database of drug dependence treatment centers, self-administered questionnaires, and a supportive counseling section. First, for drug-related prevention knowledge for teens, the developmental researchers designed four infographics and an animation to provide drug-related prevention knowledge, including types of illegal drugs, causes of drug abuse, consequences of drug abuse, drug abuse diagnosis and treatment, and drug abuse prevention. Second, for drug-related knowledge for adults and professionals, the developmental researchers developed many documents in the form of PDF files to provide drug-related knowledge, including types of illegal drugs, causes of drug abuse, drug abuse prevention, and a relapse prevention guideline. Third, the database of drug dependence treatment centers included the location, direction map, operating hours, and contact information for all drug dependence treatment centers in Thailand. Fourth, the self-administered questionnaires comprised a preventive drug behavior questionnaire, a drug abuse knowledge questionnaire, the stages of change readiness and treatment eagerness to drug use scale, a substance use behaviors questionnaire, a tobacco use behaviors questionnaire, stress screening, and depression screening. Finally, for supportive counseling, the developmental researchers designed a chat box through which each user could write and send their concerns to counselors individually. Results from the evaluation process showed that 651 participants used the drug abuse health information system via the mobile application and website. Among all users, 48.8% were male and 51.2% were female. More than half (55.3%) were 15-20 years old, and most of them (88.0%) were Buddhists. Most users reported having received knowledge related to drugs (86.1%) and having drunk alcohol (94.2%), while some of them (6.9%) reported ever using tobacco. Regarding satisfaction with the drug abuse health information system, more than half of the users rated the contents at a high level as interesting (59%), up-to-date (61%), and highly useful for their self-study (59%). In addition, half of them were satisfied with the design in terms of infographics (54%) and animation (51%). Thus, this drug abuse health information system can be adopted to explore the drug abuse situation and serve as a tool to prevent drug abuse and addiction among Thai community people.
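
To make the structure of the third section concrete, the sketch below models a treatment-center record with the fields listed above (location, direction map, operating hours, contact). The field names and the example entry are assumptions for illustration, not the system's actual schema.

```python
# Hedged sketch: a possible record structure for the drug dependence treatment
# center database described above. Field names and the sample entry are assumed.
from dataclasses import dataclass

@dataclass
class TreatmentCenter:
    name: str
    location: str          # place / address shown to the user
    map_url: str           # link that opens the direction map
    operation_time: str    # operating hours
    contact: str           # phone number or other contact channel

centers = [
    TreatmentCenter(
        name="Example Drug Dependence Treatment Center",   # placeholder entry
        location="Chiang Mai, Thailand",
        map_url="https://maps.example.org/center-1",
        operation_time="Mon-Fri 08:30-16:30",
        contact="+66-00-000-0000",
    ),
]

# The mobile application / website would render this list and let users open
# the map link for directions or the contact channel for an appointment.
for c in centers:
    print(c.name, "-", c.operation_time)
```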

Keywords: drug addiction, health informatics, big data, development research

Procedia PDF Downloads 87
69 COVID-19’s Impact on the Use of Media, Educational Performance, and Learning in Children and Adolescents with ADHD Who Engaged in Virtual Learning

Authors: Christina Largent, Tazley Hobbs

Abstract:

Objective: A literature review was performed to examine the existing research on the COVID-19 lockdown as it relates to children and adolescents with ADHD, media use, and the impact on educational performance and learning. It was surmised that with the COVID-19 shutdown and transition to remote learning, a less structured learning environment and increased screen time, in addition to potential difficulty accessing school resources, would impair the performance and learning of individuals with ADHD. A resulting increase in the number of youths diagnosed and treated for ADHD would be expected. As of yet, there has been little to no published data on the incidence of ADHD as it relates to COVID-19, outside of reports from several nonprofit agencies such as CHADD (Children and Adults with Attention-Deficit/Hyperactivity Disorder), which reported an increased number of calls to its helpline, the New York-based Child Mind Institute, which reported an increased number of appointments to discuss medications, and research released by Athenahealth showing an increase in the number of patients receiving a new diagnosis of ADHD and new prescriptions for ADHD medications. Methods: A literature search for articles published between 2020 and 2021 in PubMed, Google Scholar, and PsycInfo was performed. Search phrases and keywords included "covid, adhd, child, impact, remote learning, media, screen". Results: Studies primarily utilized parental reports, with very few from the perspective of the individuals with ADHD themselves. Most findings thus far show that with the COVID-19 quarantine and transition to online learning, individuals with ADHD experienced a decreased ability to stay focused or adhere to a daily routine, as well as increased inattention-related problems, such as careless mistakes or lack of completion of homework, which in turn translated into more difficulty overall with remote learning. Compounding the problem, one study (evaluating just two different sites within the US) showed that school-based services for these individuals decreased with the shift to online learning. Increased screen time, television, social media, and gaming were noted among individuals with ADHD. One study further differentiated the degree of digital media use, identifying individuals with "problematic" or "non-problematic" use. Children with ADHD and problematic digital media use suffered from more severe core symptoms of ADHD, negative emotions, executive function deficits, damage to the family environment, pressure from life events, and a lower motivation to learn. Conclusions and Future Considerations: Studies found that not only was online learning difficult for individuals with ADHD, but it, together with greater use of digital media, was associated with worsening ADHD symptoms that impaired schoolwork, as well as secondary findings of worsening mood and behavior. Currently, data on the number of new ADHD cases, as well as data on the prescription and usage of stimulants during COVID-19, have not been well documented or studied; such work would be well warranted out of concern for over-diagnosing or over-prescribing in youth. It would also be worth studying how reversible or long-lasting these negative impacts may be.

Keywords: COVID-19, remote learning, media use, ADHD, child, adolescent

Procedia PDF Downloads 108
68 Extremism among College and High School Students in Moscow: Diagnostics Features

Authors: Puzanova Zhanna Vasilyevna, Larina Tatiana Igorevna, Tertyshnikova Anastasia Gennadyevna

Abstract:

In this day and age, extremism in its various forms of manifestation is a real threat to the world community, the national security of a state and its territorial integrity, as well as to the constitutional rights and freedoms of citizens. Extremism is generally described as a commitment to extreme views and actions, radically denying the existing social norms and rules. Supporters of extremism in ideological and political struggles often adopt methods and means of psychological warfare, appealing not to reason and logical arguments but to the emotions and instincts of people, to prejudices, biases, and a variety of mythological constructs. They are dissatisfied with the established order and aim at increasing this dissatisfaction among the masses. Youth extremism holds a specific place among the existing forms and types of extremism. In this context, in 2015, we conducted a survey among Moscow college and high school students. The aim of this study was to determine how great or small the differences are in the understanding of and attitudes towards manifestations of extremism, in the inclination and readiness to take part in extremist activities, and in what causes this predisposition, if it exists. We performed a multivariate analysis and assessed Russian college and high school students' opinions about the extremism and terrorism situation in our country, as well as their knowledge of these topics. Among other things, we showed that the level of aggressiveness of young people was not above the average for the whole population. The survey was conducted using the questionnaire method. The sample included college and high school students in Moscow (642 and 382, respectively), selected randomly. The questionnaire was developed by specialists of the RUDN University Sociological Laboratory and included both original questions (projective questions, the technique of incomplete sentences) and the standard S. Dayhoff test to determine the level of internal aggressiveness. As an experiment, the technique of using FACS and SPAFF to determine psychotypes and non-verbal manifestations of emotions was also employed. The study confirmed the hypothesis that, in the respondents' opinion, the level of aggression is higher today than a few years ago. Differences were found between the two age groups of young people in the understanding of and attitude toward such social phenomena as extremism and terrorism, their danger, and their appeal. The theory of psychotypes, SPAFF (Specific Affect Coding System), and FACS (Facial Action Coding System) are considered additional techniques for diagnosing a tendency toward extreme views. Thus, it is established that the diagnosis of acceptance of extreme views among young people is possible thanks to the simultaneous use of knowledge from different fields of the social sciences and humanities. The results of the research can be used in a comparative context with other countries and as a starting point for further research in the field, taking into account its extreme relevance.

Keywords: extremism, youth extremism, diagnostics of extremist manifestations, forecast of behavior, sociological polls, theory of psychotypes, FACS, SPAFF

Procedia PDF Downloads 318
67 Integration of an Evidence-Based Medicine Curriculum into Physician Assistant Education: Teaching for Today and the Future

Authors: Martina I. Reinhold, Theresa Bacon-Baguley

Abstract:

Background: Medical knowledge continuously evolves, and evidence-based medicine (EBM) has emerged as a model to help health care providers stay up to date. The practice of EBM requires new skills of the health care provider, including directed literature searches, the critical evaluation of research studies, and the direct application of the findings to patient care. This paper describes the integration and evaluation of an evidence-based medicine course sequence in a Physician Assistant curriculum. This course sequence teaches students to manage and use the best clinical research evidence to competently practice medicine. A survey was developed to assess the outcomes of the EBM course sequence. Methodology: The cornerstone of the three-semester EBM sequence is interactive small-group discussions designed to introduce students to the most clinically applicable skills for identifying, managing, and using the best clinical research evidence to improve the health of their patients. During the three-semester sequence, the students are assigned each semester to participate in small-group discussions facilitated by faculty with varying backgrounds and expertise. Prior to the start of the first EBM course in the winter semester, PA students complete a knowledge-based survey developed by the authors to assess the effectiveness of the course series. The survey consists of 53 Likert-scale questions addressing the nine objectives of the course series. At the end of the three-semester course series, the same survey was given to all students in the program, and the results from before and after the EBM course sequence were compared. Specific attention was paid to the overall performance of students on the nine course objectives. Results: We find that students from the Classes of 2016 and 2017 consistently improved (as measured by the percentage of correct responses on the survey tool) after the EBM course series (Class of 2016: Pre 62%, Post 75%; Class of 2017: Pre 61%, Post 70%). The biggest increase in knowledge was observed in the areas of finding and evaluating the evidence, namely asking concise clinical questions (Class of 2016: Pre 61%, Post 81%; Class of 2017: Pre 61%, Post 75%) and searching the medical database (Class of 2016: Pre 24%, Post 65%; Class of 2017: Pre 35%, Post 66%). Questions requiring students to analyze, evaluate, and report on the available clinical evidence regarding diagnosis showed improvement, but to a lesser extent (Class of 2016: Pre 56%, Post 77%; Class of 2017: Pre 56%, Post 61%). Conclusions: Outcomes indicated that students did gain skills that will allow them to apply EBM principles. In addition, the outcomes of the knowledge-based survey allowed the faculty to focus on areas needing improvement, specifically the translation of best evidence into patient care. To address this area, the clinical faculty developed case scenarios that were incorporated into the lecture and discussion sessions, allowing students to better connect the research studies with patient care. Students commented that 'class discussion and case examples' contributed most to their learning and that 'it was helpful to learn how to develop research questions and how to analyze studies and their significance to a potential client'. As evidenced by the outcomes, the EBM courses achieved their goals and were well received by the students.

Keywords: evidence-based medicine, clinical education, assessment tool, physician assistant

Procedia PDF Downloads 106
66 “Self-Torturous Thresholds” in Post-WWII Japan: Three Thresholds to Queer Japanese Futures

Authors: Maari Sugawara

Abstract:

This arts-based research is about "self-torture": the interplay of seemingly opposing elements of pain, pleasure, submission, and power. It asserts that "self-torture" can be considered a nontrivial mediation between the aesthetic and the sociopolitical. It explores what the author calls queered self-torture; "self-torture" marked by an ambivalence that allows the oppressed to resist, and their counter-valorization occasionally functions as therapeutic solutions to the problems they highlight and condense. The research goal is to deconstruct normative self-torture and propose queered self-torture as a fertile ground for considering the complexities of desire that allow the oppressed to practice freedom. While “self-torture” manifests in many societies, this research focuses on cultural and national identity in post-WWII Japan using this lens of self-torture, as masochism functions as the very basis for Japanese cultural and national identity to ensure self-preservation. This masochism is defined as an impulse to realize a sense of pride and construct an identity through the acceptance of subordination, shame, and humiliation in the face of an all-powerful Other; the dominant Euro-America. It could be argued that this self-torture is a result of Japanese cultural annihilation and the trauma of the nation's defeat to the US. This is the definition of "self-torturous thresholds," the author’s post-WWII Japan psycho-historical diagnosis; when this threshold is crossed, the oppressed begin to torture themselves; the oppressors no longer need to do anything to maintain their power. The oppressed are already oppressing themselves. The term "oppressed" here refers to Japanese individuals and residents of Japan who are subjected to oppressive “white” heteropatriarchal supremacist structures and values that serve colonialist interests. There are three stages in "self-torturous thresholds": (1) the oppressors no longer need to oppress because the oppressed voluntarily commit to self-torture; (2) the oppressed find pleasure in self-torture; and (3) the oppressed achieve queered self-torture, to achieve alternative futures. Using the conceptualization of "self-torture," this research examines and critiques pleasure, desire, capital, and power in postwar Japan, which enables the discussion of the data-colonizing “Moonshot Research and Development program”. If the oppressed want to divest from the habits of normative self-torture, which shape what is possible in both our present and future, we need methods to feel and know that the alternative results of self-torture are possible. Phase three will be enacted using Sarah Ahmed's queer methodology to reorient national and cultural identity away from heteronormativity. Through theoretical analysis, textual analysis, archival research, ethnographic interviews, and digital art projects, including experimental documentary as a method to capture the realities of the individuals who are practicing self-torture, this research seeks to reveal how self-torture may become not just a vehicle of pleasure but also a mode of critiquing power and achieving freedom. It seeks to encourage the imaginings of queer Japanese futures, where the marginalized survive Japan’s natural and man-made disasters and Japan’s Imperialist past and present rather than submitting to the country’s continued violence.

Keywords: arts-based research, Japanese studies, interdisciplinary arts, queer studies, cultural studies, popular culture, BDSM, sadomasochism, sexuality, VR, AR, digital art, visual arts, speculative fiction

Procedia PDF Downloads 36
65 Malaria Menace in Pregnancy; Hard to Ignore

Authors: Nautiyal Ruchira, Nautiyal Hemant, Chaudhury Devnanda, Bhargava Surbhi, Chauhan Nidhi

Abstract:

Introduction: The South East Asian region contributes 2.5 million cases of malaria each year to the global burden of 300 to 500 million, of which 76% is reported from India. The Government of India launched a national program almost half a century ago, yet malaria remains a major public health challenge. Pregnant women are more susceptible to severe malaria and its fetomaternal complications. Inadequate surveillance and under-reporting underestimate the problem. Aim: The present study aimed to analyze the clinical course and pattern of malaria during pregnancy and to study the feto-maternal outcome. Methodology: This is a prospective observational study carried out at the Himalayan Institute of Medical Sciences, a tertiary care center in the sub-Himalayan state of Uttarakhand, Northern India. All pregnant women with malaria and its complications were recruited into the study from 2009 to 2014, including cases referred from the state of western Uttar Pradesh. A thorough history and clinical examination were carried out to assess maternal and fetal condition. Relevant investigations, including haemogram, platelet count, LFT, RFT, and USG, were done. Blood slides and rapid diagnostic tests were done to diagnose the type of malaria. The primary outcomes measured were the type of malaria infection, maternal complications associated with malaria, the outcome of pregnancy, and the effect on the fetus. Results: 67 antenatal cases with malaria infection were studied. 71% of patients were diagnosed with Plasmodium vivax infection, 25% of cases were Plasmodium falciparum positive, and in 3% of cases a mixed infection was found. 38 (56%) patients were primigravida and 29 (43%) were multiparous. Most of the patients had already received some treatment from their local doctors and presented with severe malaria and its complications. Thrombocytopenia was the commonest manifestation, seen in 35 (52%) patients, jaundice in 28%, severe anemia in 18%, severe oligohydramnios in 10%, and renal failure in 6% of cases. Regarding pregnancy outcome, there were 44% preterm deliveries, 22% had IUFD, and abortions occurred in 6% of cases. 20% of newborns were low birth weight and 6% were IUGR. There was only one maternal death, which occurred due to ARDS in falciparum malaria. Although Plasmodium vivax was the main parasite, considering the severity of the clinical presentation, all the patients received intensive care. As most of the patients had already received chloroquine therapy, they were treated with IV artesunate followed by oral artemisinin combination therapy. Other therapies in the form of packed RBC and platelet transfusions, dialysis, and ventilator support were provided when required. Conclusion: Even in areas with an annual parasite index (API) of less than 2, like ours, malaria in pregnancy can be an alarming problem. Vivax malaria cannot be considered benign in pregnancy because of the high incidence of morbidity. Prompt diagnosis and aggressive treatment can reduce morbidity and mortality significantly. Increased community-level research, integrating ANC checkups with the distribution of insecticide-treated nets in areas of high endemicity, and imparting education and awareness will strengthen the existing control strategies.

Keywords: severe malaria, pregnancy, plasmodium vivax, plasmodium falciparum

Procedia PDF Downloads 252
64 The Interactive Wearable Toy "+Me", for the Therapy of Children with Autism Spectrum Disorders: Preliminary Results

Authors: Beste Ozcan, Valerio Sperati, Laura Romano, Tania Moretta, Simone Scaffaro, Noemi Faedda, Federica Giovannone, Carla Sogos, Vincenzo Guidetti, Gianluca Baldassarre

Abstract:

+me is an experimental interactive toy with the appearance of a soft, pillow-like panda. Its shape and consistency are designed to elicit emotional attachment in young children: a child can wear it around his/her neck and treat it as a companion (i.e., a transitional object). When caressed on its paws or head, the panda emits appealing, interesting outputs such as colored lights or amusing sounds, thanks to embedded electronics. Such sensory patterns can be modified through a wirelessly connected tablet: in this way, an adult caregiver can adapt the responses of +me to a child's reactions or requests, for example, changing the light hue or the type of sound. Control of the toy is therefore shared, as it depends on both the child (who handles the panda) and the adult (who manages the tablet and mediates the sensory input-output contingencies). These features make +me a potential tool for therapy with children with Neurodevelopmental Disorders (ND) characterized by impairments in the social area, such as Autism Spectrum Disorders (ASD) and Language Disorders (LD): as a proposal, the toy could be used together with a therapist in rehabilitative play activities aimed at encouraging simple social interactions and reinforcing basic relational and communication skills. +me was tested in two pilot experiments, the first involving 15 Typically Developed (TD) children aged 8-34 months, the second involving 7 children with ASD and 7 with LD, aged 30-48 months. In both studies, during a one-to-one, ten-minute activity, a researcher/caregiver played with the panda and encouraged the child to do the same. The purpose of both studies was to ascertain the general acceptability of the device as an interesting toy, that is, an object able to capture the child's attention and to maintain a high motivation to interact with it and with the adult. Behavioral indexes estimating the interplay between the child, +me, and the caregiver were rated from the video recordings of the experimental sessions. Preliminary results show that, on average, participants from the three groups exhibited good engagement: they touched, caressed, and explored the panda and showed enjoyment when they managed to trigger the luminous and sound responses. During the experiments, children tended to imitate the caregiver's actions on +me, often looking (and smiling) at him/her. Interesting behavioral differences between the TD, ASD, and LD groups were scored: for example, ASD participants produced fewer smiles both to the panda and to the caregiver compared with the TD group, while LD scores fell between those of the ASD and TD subjects. These preliminary observations suggest that the interactive toy +me is able to raise and maintain the interest of toddlers, and therefore it can reasonably be used as a supporting tool during therapy to stimulate pivotal social skills such as imitation, turn-taking, eye contact, and social smiles. Interestingly, the young age of the participants, along with the behavioral differences between groups, seems to suggest a further potential use of the device: a tool for early differential diagnosis (the average age of a child
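
The shared-control loop described above (the child touches the panda, while the tablet decides which output a touch produces) can be sketched in a few lines. This is a hypothetical illustration of the input-output contingency idea, not the toy's firmware; the sensor names, output fields, and the wireless update call are all assumptions.

```python
# Hedged sketch of the +me input-output contingency: a caregiver's tablet updates
# the output configuration, and touches on the paws or head trigger that output.
# All names (sensors, colours, sounds) are illustrative assumptions.

output_config = {"light_hue": "green", "sound": "giggle", "enabled": True}

def update_from_tablet(new_config: dict) -> None:
    """Apply settings received wirelessly from the caregiver's tablet."""
    output_config.update(new_config)

def on_touch(sensor: str) -> str:
    """Map a touch on a paw or the head to the currently configured output."""
    if not output_config["enabled"] or sensor not in {"left_paw", "right_paw", "head"}:
        return "no output"
    return f"light={output_config['light_hue']}, sound={output_config['sound']}"

update_from_tablet({"light_hue": "blue", "sound": "chime"})   # caregiver adapts the toy
print(on_touch("head"))    # -> light=blue, sound=chime
```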

Keywords: autism spectrum disorders, interactive toy, social interaction, therapy, transitional wearable companion

Procedia PDF Downloads 93