Search results for: nursing college and hospital under The Royal Thai Army
125 Health Care Teams during COVID-19: Roles, Challenges, Emotional State and Perceived Preparedness to the Next Pandemic
Authors: Miriam Schiff, Hadas Rosenne, Ran Nir-Paz, Shiri Shinan Altman
Abstract:
To examine (1) the level, predictors, and subjective perception of professional quality of life (ProQOL), posttraumatic growth, roles, task changes during the pandemic, and perceived preparedness for the next pandemic. These variables were examined as part of an international study, in which we took part along with Australia, Canada, China, Hong Kong, Singapore, and Taiwan, on stress, resilience, and perceived preparedness among social workers in healthcare. (2) The extent to which background variables, rate of exposure to the virus, working in COVID wards, profession, personal resilience, and resistance to organizational change predict posttraumatic growth, perceived preparedness, and ProQOL (the latter was examined among social workers only). (3) The teams' perceptions of how the pandemic impacted them at the personal, professional, and organizational levels and what assisted them. Methodologies: Mixed quantitative and qualitative methods were used. 1039 hospital healthcare workers from various professions participated in the quantitative study, while 32 participated in in-depth interviews. The same methods were used in six other countries. Findings: The level of ProQOL was moderate, with higher burnout and secondary traumatization levels than during routine times. Differences between countries in the level of ProQOL were found as well. Perceived preparedness for the next pandemic at the personal level was moderate and similar among the different health professions. Higher exposure to the virus was associated with lower perceived preparedness of the hospitals. Compared to other professions, doctors and nurses perceived hospitals as significantly less prepared for the next pandemic. The preparedness of the State of Israel for the next pandemic is perceived as low by all healthcare professionals. A moderate level of posttraumatic growth was found. Staff who worked in the COVID ward reported a greater level of growth. Doctors reported the lowest level of growth.
The staff's resilience was high, with no differences among professions or levels of exposure. Working in the COVID ward and resilience predicted better preparedness, while resistance to organizational change predicted worse preparedness. Findings from the qualitative part of the study revealed that healthcare workers reported challenges at the personal, professional, and organizational levels during the different waves of the pandemic. They also reported internal and external resources they either owned or obtained during that period. Conclusion: Exposure to the COVID-19 virus is associated with secondary traumatization on the one hand and personal posttraumatic growth on the other. Personal and professional discoveries and a sense of mission helped in coping with the pandemic, which was perceived as a historical event, a war, or a mass casualty event. Personal resilience, along with the support of colleagues, family, and direct management, was seen as a significant component of coping. Hospitals should plan ahead and improve their preparedness for the next pandemic.
Keywords: covid-19, health-care, social workers, burnout, preparedness, international perspective
Procedia PDF Downloads 74
124 Sustainable Agricultural and Soil Water Management Practices in Relation to Climate Change and Disaster: A Himalayan Country Experience
Authors: Krishna Raj Regmi
Abstract:
A “Climate change adaptation and disaster risk management for sustainable agriculture” project was implemented in Nepal, a Himalayan country, during 2008 to 2013, sponsored jointly by the Food and Agriculture Organization (FAO) and the United Nations Development Programme (UNDP), Nepal. The paper is based on the results and findings of this joint pilot project. Climate change events such as the increased intensity of erratic rains in short spells, a trend of prolonged drought, a gradual rise in temperature at the higher elevations, and the occurrence of cold and hot waves in the Terai (lower plains) have led to flash floods, massive erosion in the hills, particularly in the Churia range, and the drying of water sources. These recurring natural and climate-induced disasters are causing heavy damage through sedimentation and inundation of agricultural lands, crops, livestock, infrastructure and rural settlements in the downstream plains, thus reducing agricultural productivity and food security in the country. About 65% of the cultivated land in Nepal is rainfed with drought-prone characteristics, and stabilization of agricultural production and productivity in these tracts will be possible through the adoption of rainfed and drought-tolerant technologies as well as efficient soil-water management by the local communities. The adaptation and mitigation technologies and options identified by the project for soil erosion, flash flood and landslide control are on-farm watershed management, sloping agricultural land technologies (SALT), agro-forestry practices, agri-silvi-pastoral management, hedge-row contour planting, bio-engineering along slopes and river banks, plantation of multi-purpose trees, and management of degraded wasteland, including sandy river-bed flood plains.
The stress-tolerant technologies with respect to drought, flood and temperature stress for efficient utilization of nutrient, soil, water and other resources for increased productivity are the adoption of stress-tolerant crop varieties and breeds of animals, indigenous proven technologies, mixed and inter-cropping systems, the system of rice/wheat intensification (SRI), direct rice seeding, double transplanting of rice, off-season vegetable production, and regular management of nurseries, orchards and animal sheds. The alternative energy use options and resource conservation practices for use by local communities are the installation of bio-gas plants and clean stoves (Chulla range) for mitigation of greenhouse gas (GHG) emissions, use of organic manures and bio-pesticides, jatropha cultivation, green manuring in rice fields, and minimum/zero tillage practices for marshy lands. The efficient water management practices for increasing the productivity of crops and livestock are the use of micro-irrigation practices, construction of water conservation and water harvesting ponds, use of overhead water tanks and Thai jars for rainwater harvesting, and rehabilitation of on-farm irrigation systems. Initiation of work on a community-based early warning system, strengthening of met stations and disaster database management has made genuine efforts in providing disaster-tailored early warning, meteorological and insurance services to the local communities. Contingency planning is recommended to develop coping strategies and the capacities of local communities to adopt necessary changes in cropping patterns and practices in relation to adverse climatic and disaster risk conditions.
Finally, awareness-raising and capacity development activities (technical and institutional) and networking on climate-induced disasters and risks, through training, visits and knowledge-sharing workshops, dissemination of technical know-how and technologies, the conduct of farmers' field schools, and the development of extension materials and their display, are being promoted. However, there is still a need for strong coordination and linkage between agriculture, environment, forestry, meteorology, irrigation, climate-induced pro-active disaster preparedness and research at the ministry, department and district levels for the up-scaling, implementation and institutionalization of climate change and disaster risk management activities and adaptation-mitigation options in agriculture for the sustainable livelihoods of the communities.
Keywords: climate change adaptation, disaster risk management, soil-water management practices, sustainable agriculture
Procedia PDF Downloads 510
123 Female Frontline Health Workers in High-Risk Workplaces: Legal Protection in Bangladesh amid the Covid-19 Pandemic
Authors: Nabila Farhin, Israt Jahan
Abstract:
Despite the feminisation of the global health workforce, women mostly work in nursing, midwifery and community health roles, while posts such as surgeons, doctors and specialists are generally male-dominated. This is also prominent in Bangladesh, where female health workers (HWs) witness systematic workplace inequalities, discrimination and underpayment. The Covid-19 pandemic put insurmountable pressure on HWs, as they had to serve in high-risk workplaces as frontliners. The already disadvantaged female HWs shouldered the same burden, were overworked without adequate occupational health and safety (OSH) measures, and risked their lives. Acknowledging their vulnerable workplace conditions, the World Health Organization (WHO) and the International Labour Organization (ILO) circulated a few specialised guidelines amid the peril. Bangladesh tried to adhere to international guidelines while formulating pandemic management strategies. In reality, the already weak and understaffed health sector collapsed under the patient influx, and many HWs got infected and died in the line of duty, exposing the high-risk nature of the work. Unfortunately, gender-segregated data on infected HWs are absent. This qualitative research investigates whether the existing laws of Bangladesh are adequate to protect female HWs as frontliners in high-risk workplaces during the Covid-19 pandemic. The paper first examines international labour laws safeguarding female frontline HWs. It also analyses the specialised Covid-19 pandemic guidelines protecting their interests. Finally, the research investigates the compliance of Bangladesh with international legal guidance during the pandemic. In doing so, it explores domestic laws, professional guidelines for HWs and pandemic response strategies. The paper critically examines primary sources such as international and national statutes, rules, regulations and guidelines.
Secondary sources such as authoritative journal articles, books and newspaper reports are contextually analysed in line with the objective of the paper. The definition of an HW is ambiguous in the labour laws of Bangladesh. This leads to confusion regarding the extent of legal protection rendered to female HWs at private hospitals in high-risk situations. The labour laws are not applicable to public hospitals, as their employees follow the public service rules. Unfortunately, the country has no specialised law to protect HWs in high-risk workplaces, and the professional guidelines for HWs also remain inadequate in this regard. Even though the pandemic management strategies highlight some protective measures for high-risk situations, they only deal with HWs who are pregnant or have underlying health issues. No specialised protective guidelines can be found for female HWs as frontliners. Therefore, the laws are insufficient and failed to render adequate legal protection to female frontline HWs during the pandemic. The country also lacks comprehensive health legislation and uniform institutional and professional guidelines, preventing HWs from accessing grievance mechanisms. Hence, female HWs felt victimised while duty-bound to serve in high-risk workplaces without adequate safeguards. Bangladesh should clarify the definition of HWs and standardise the service rules for providing medical care in high-risk workplaces. The research also recommends adequate health legislation and specialised legal protection to safeguard female HWs in future emergencies.
Keywords: female health workers (HWs), high-risk workplaces, Covid-19 pandemic, Bangladesh
Procedia PDF Downloads 78
122 The Role of Intraluminal Endoscopy in the Diagnosis and Treatment of Fluid Collections in Patients With Acute Pancreatitis
Authors: A. Askerov, Y. Teterin, P. Yartcev, S. Novikov
Abstract:
Introduction: Acute pancreatitis (AP) is a socially significant public health problem and continues to be one of the most common causes of hospitalization of patients with pathology of the gastrointestinal tract. It is characterized by high mortality rates, reaching 62-65% in infected pancreatic necrosis. Aims & Methods: The study group included 63 patients who underwent transluminal drainage (TLD) of fluid collections (FC). All patients underwent transabdominal ultrasound, computed tomography of the abdominal cavity and retroperitoneal organs, and endoscopic ultrasound (EUS) of the pancreatobiliary zone. EUS was used as the final diagnostic method to determine the characteristics of the FC. The indications for TLD were: a distance between the wall of the hollow organ and the FC of not more than 1 cm, the absence of large vessels (more than 3 mm) on the puncture trajectory, and a formation size of more than 5 cm. When a homogeneous cavity with clear, even contours was detected, a plastic stent with rounded ends (“double pigtail”) was installed. The indication for the installation of a fully covered self-expanding stent was the detection of a nonhomogeneous anechoic FC with hyperechoic inclusions and cloudy purulent contents. In patients with necrotic forms, after drainage of the purulent cavity, a cystonasal drain with a diameter of 7 Fr was installed in its lumen under X-ray control to sanitize the cavity with a 0.05% aqueous solution of chlorhexidine. Endoscopic necrectomy was performed every 24-48 hours. The plastic stent was removed 6 months, and the fully covered self-expanding stent 1 month, after the patient was discharged from the hospital. Results: Endoscopic TLD was performed in 63 patients. An FC corresponding to interstitial edematous pancreatitis was detected in 39 (62%) patients, who underwent TLD with the installation of a plastic stent with rounded ends.
In 24 (38%) patients with necrotic forms of FC, a fully covered self-expanding stent was placed. Communication with the ductal system of the pancreas was found in 5 (7.9%) patients, who underwent pancreaticoduodenal stenting. A complicated postoperative period was noted in 4 (6.3%) cases and was manifested by bleeding from the zone of pancreatogenic destruction. In 2 (3.1%) cases, this required angiography and endovascular embolization of a. gastroduodenalis; in 1 (1.6%) case, endoscopic hemostasis was performed by filling the cavity with 4 ml of Hemoblock hemostatic solution. A combination of both methods was used in 1 (1.6%) patient. There was no evidence of recurrent bleeding in these patients. A lethal outcome occurred in 4 patients (6.3%). In 3 (4.7%) patients, the cause of death was multiple organ failure; in 1 (1.6%), severe nosocomial pneumonia that developed on the 32nd day after drainage. Conclusions: 1. EUS is not only the most important method for diagnosing FC in AP but also allows determination of further tactics for their intraluminal drainage. 2. Endoscopic intraluminal drainage of fluid zones is, in 45.8% of cases, the final minimally invasive method of surgical treatment of large-focal pancreatic necrosis. Disclosure: Nothing to disclose.
Keywords: acute pancreatitis, fluid collection, endoscopy surgery, necrectomy, transluminal drainage
Procedia PDF Downloads 109
121 Assessing the Efficiency of Pre-Hospital Scoring System with Conventional Coagulation Tests Based Definition of Acute Traumatic Coagulopathy
Authors: Venencia Albert, Arulselvi Subramanian, Hara Prasad Pati, Asok K. Mukhophadhyay
Abstract:
Acute traumatic coagulopathy is an endogenous dysregulation of the intrinsic coagulation system in response to injury, is associated with a three-fold risk of poor outcome, and is more amenable to corrective interventions subsequent to early identification and management. Multiple definitions for stratification of patients' risk for early acute coagulopathy have been proposed, with considerable variation in the defining criteria, including several trauma-scoring systems based on prehospital data. We aimed to develop a clinically relevant definition of acute coagulopathy of trauma based on conventional coagulation assays and to assess its efficacy in comparison to recently established prehospital prediction models. Methodology: Retrospective data of all trauma patients (n = 490) presented to our level I trauma center in 2014 were extracted. Receiver operating characteristic curve analysis was done to establish cut-offs for conventional coagulation assays for the identification of patients with acute traumatic coagulopathy. Prospectively, data of 100 adult trauma patients were collected; the cohort was stratified by the established definition, classified as "coagulopathic" or "non-coagulopathic", and correlated with the Prediction of Acute Coagulopathy of Trauma score and the Trauma-Induced Coagulopathy Clinical Score for identifying trauma coagulopathy and the subsequent risk of mortality. Results: Data of 490 trauma patients (average age 31.85±9.04; 86.7% males) were extracted. 53.3% had head injury, 26.6% had fractures, and 7.5% had chest and abdominal injury. Acute traumatic coagulopathy was defined as international normalized ratio ≥ 1.19; prothrombin time ≥ 15.5 s; activated partial thromboplastin time ≥ 29 s. Of the 100 adult trauma patients (average age 36.5±14.2; 94% males), 63% had early coagulopathy based on our conventional coagulation assay definition.
The overall prediction of acute coagulopathy of trauma score was 118.7±58.5, and the trauma-induced coagulopathy clinical score was 3 (0-8). Both scores were higher in coagulopathic than non-coagulopathic patients (prediction of acute coagulopathy of trauma score 123.2±8.3 vs. 110.9±6.8, p-value = 0.31; trauma-induced coagulopathy clinical score 4 (3-8) vs. 3 (0-8), p-value = 0.89), but not statistically significantly so. Overall mortality was 41%. The mortality rate was significantly higher in coagulopathic than non-coagulopathic patients (75.5% vs. 54.2%, p-value = 0.04). A high prediction of acute coagulopathy of trauma score was also significantly associated with mortality (134.2±9.95 vs. 107.8±6.82, p-value = 0.02), whereas the trauma-induced coagulopathy clinical score did not vary between survivors and non-survivors. Conclusion: Early coagulopathy was seen in 63% of trauma patients and was significantly associated with mortality. Acute traumatic coagulopathy defined by conventional coagulation assays (international normalized ratio ≥ 1.19; prothrombin time ≥ 15.5 s; activated partial thromboplastin time ≥ 29 s) demonstrated good ability to identify coagulopathy and subsequent mortality, in comparison to the prehospital parameter-based scoring systems. The prediction of acute coagulopathy of trauma score may be more suited to predicting mortality than early coagulopathy. In emergency trauma situations, where immediate corrective measures need to be taken, complex multivariable scoring algorithms may cause delay, whereas coagulation parameters and conventional coagulation tests will give highly specific results.
Keywords: trauma, coagulopathy, prediction, model
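The conventional-assay definition above is a simple threshold rule. As a minimal sketch only (the function name and the assumption that meeting any one cut-off suffices are ours; the abstract lists the cut-offs without stating the combination rule):

```python
def is_coagulopathic(inr: float, pt_s: float, aptt_s: float) -> bool:
    """Classify acute traumatic coagulopathy using the study's
    conventional coagulation assay cut-offs: INR >= 1.19,
    prothrombin time >= 15.5 s, aPTT >= 29 s.
    Assumes any single threshold being met is sufficient."""
    return inr >= 1.19 or pt_s >= 15.5 or aptt_s >= 29.0
```

For example, a patient with INR 1.25 but normal PT and aPTT would be flagged, whereas one below all three cut-offs would not.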
Procedia PDF Downloads 176
120 Insights on the Halal Status of Antineoplastic and Immunomodulating Agents and Nutritional and Dietary Supplements in Malaysia
Authors: Suraiya Abdul Rahman, Perasna M. Varma, Amrahi Buang, Zhari Ismail, Wan Rosalina W. Rosli, Ahmad Rashidi M. Tahir
Abstract:
Background: Muslims have the obligation to ensure that everything they consume, including medicines, is halal. With the growing demand for halal medicines, in October 2012 Malaysia launched the world's first halal pharmaceutical standard, Malaysian Standard MS 2424:2012 Halal Pharmaceuticals - General Guidelines, to serve as a basic requirement for halal pharmaceuticals in Malaysia. However, the biggest challenge faced by pharmaceutical companies in complying is finding the origin or source of the ingredients and determining their halal status. Aim: This study aims to determine the halal status of antineoplastic and immunomodulating agents and nutritional and dietary supplements by analysing the origin of their active pharmaceutical ingredients (API) and excipients, to provide insight into the common sources and halal status of pharmaceutical ingredients and an indication of the adjustments required for halal compliance. Method: The ingredients of each product available in a government hospital in central Malaysia, and their sources, were determined from the product package leaflets, information obtained from manufacturers, reliable websites and standard pharmaceutical references. The ingredients were categorised as halal, mushbooh or haram based on the definitions set in MS 2424. Results: There were 162 medications included in the study, of which 123 (76%) were in the antineoplastic and immunomodulating agents group, while 39 (24%) were nutritional and dietary supplements. In terms of medication halal status, the proportions of halal, mushbooh and haram were 40.1% (n=65), 58.6% (n=95) and 1.2% (n=2), respectively. With regard to the APIs, there were 89 (52%) different active ingredients identified for antineoplastic and immunomodulating agents, of which 89.9% (n=80) were halal and 10.1% (n=9) were mushbooh.
There were 83 (48%) active ingredients from the nutritional and dietary supplements group, with proportions of halal and mushbooh of 89.2% (n=74) and 10.8% (n=9), respectively. No haram APIs were identified in any therapeutic class. A total of 176 excipients were identified across the product ranges. It was found that the majority of excipients are halal, with proportions of halal, mushbooh and haram of 82.4% (n=145), 17% (n=30) and 0.6% (n=1), respectively. With regard to the sources of the excipients, most mushbooh excipients (76.7%, n=23) were classified as mushbooh because they have multiple possible origins, comprising animal, plant or others. The remaining 13.3% and 10% were classified as mushbooh due to their ethanol and land-animal origin, respectively. The one haram excipient was gelatine of bovine-porcine origin. Mushbooh ingredients found in this research were glycerol, tallow, lactose, polysorbate, dibasic sodium phosphate, stearic acid and magnesium stearate. Ethanol, gelatine, glycerol and magnesium stearate were the most common ingredients classified as mushbooh. Conclusion: This study shows that most APIs and excipients are halal. However, the majority of the medicines in these product categories are mushbooh due to certain excipients only, which could be replaced with halal alternative excipients. This insight should encourage pharmaceutical product manufacturers to pursue halal certification to meet the increasing demand for halal-certified medications for the benefit of mankind.
Keywords: antineoplastic and immunomodulation agents, halal pharmaceutical, MS2424, nutritional and dietary supplements
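The categorisation logic described above (porcine-derived sources are haram; multiple possible origins, ethanol, or land-animal origin render an ingredient mushbooh) can be sketched as a simple decision rule. This is only an illustration of the classification criteria as summarised in this study; the function name and origin labels are ours, and a real MS 2424 assessment involves far more than this:

```python
def halal_status(possible_origins: set) -> str:
    """Rough sketch of the ingredient classification used in the study:
    porcine source -> haram; ambiguous (multiple possible origins),
    ethanol, or land-animal origin -> mushbooh; otherwise halal."""
    if "porcine" in possible_origins:
        return "haram"
    if len(possible_origins) > 1 or possible_origins & {"ethanol", "land animal"}:
        return "mushbooh"
    return "halal"
```

Under this sketch, an excipient documented only as "animal or plant" comes out mushbooh, matching the study's treatment of multi-origin ingredients.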
Procedia PDF Downloads 302
119 Managing Human-Wildlife Conflicts Compensation Claims Data Collection and Payments Using a Scheme Administrator
Authors: Eric Mwenda, Shadrack Ngene
Abstract:
Human-wildlife conflicts (HWCs) are the main threat to conservation in Africa. This is because wildlife needs overlap with those of humans. In Kenya, about 70% of wildlife occurs outside protected areas. As a result, wildlife and human ranges overlap, causing HWCs. HWCs in Kenya occur in the drylands adjacent to protected areas. The top five counties with the highest incidences of HWC are Taita Taveta, Narok, Lamu, Kajiado, and Laikipia. The wildlife species most commonly responsible for HWCs are elephants, buffaloes, hyenas, hippos, leopards, baboons, monkeys, snakes, and crocodiles. To ensure that individuals affected by the conflicts are compensated, Kenya has developed a model of HWC compensation claims data collection and payment. We collected data on HWC from all eight Kenya Wildlife Service (KWS) Conservation Areas from 2009 to 2019. Additional data were collected from stakeholders' consultative workshops held in the Conservation Areas and from a literature review regarding payment for injuries and ongoing insurance schemes being practiced in the areas. This was followed by a description of the claims administration process and calculation of the pricing of the compensation claims. We further developed a digital platform for data capture and processing of all reported conflict cases and payments. Our product recognized four categories of HWC (i.e., human death and injury, property damage, crop destruction, and livestock predation). Compensation for personal bodily injury and human death was based on the Continental Scale of Benefits. We proposed a maximum of Kenya Shillings (KES) 3,000,000 for death. Medical, pharmaceutical, and hospital expenses were capped at a maximum of KES 150,000, and funeral costs at KES 50,000. Pain and suffering were proposed to be paid for 12 months at the rate of KES 13,500 per month. Crop damage was to be based on farm input costs, at a maximum of KES 150,000 per claim.
Livestock predation leading to death was based on the Tropical Livestock Unit (TLU), which is equivalent to KES 30,000: Cattle (1 TLU = KES 30,000), Camel (1.4 TLU = KES 42,000), Goat (0.15 TLU = KES 4,500), Sheep (0.15 TLU = KES 4,500), and Donkey (0.5 TLU = KES 15,000). Property destruction (buildings, outside structures and harvested crops) was capped at KES 150,000 per claim. We conclude that it is possible to use an administrator to collect data on HWC compensation claims and make payments using technology. The success of the new approach will depend on a piloting program. We recommend that a pilot scheme be initiated for eight months in Taita Taveta, Kajiado, Baringo, Laikipia, Narok, and Meru Counties. This will test the claims administration process as well as harmonize data collection methods. The results of this pilot will be crucial in adjusting the scheme before country-wide roll-out.
Keywords: human-wildlife conflicts, compensation, human death and injury, crop destruction, predation, property destruction
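The predation tariff above is a linear rate of KES 30,000 per Tropical Livestock Unit; each figure in the schedule is the species' TLU value multiplied by that rate (e.g. a camel at 1.4 TLU gives 1.4 × 30,000 = KES 42,000). A minimal sketch of the lookup, with the table taken from the abstract (the function and dictionary names are illustrative, not part of the scheme):

```python
# TLU values per animal, from the proposed tariff (KES 30,000 per TLU).
TLU = {"cattle": 1.0, "camel": 1.4, "goat": 0.15, "sheep": 0.15, "donkey": 0.5}
KES_PER_TLU = 30_000

def predation_compensation(species: str, head_count: int) -> int:
    """Compensation in KES for livestock killed in a predation incident."""
    return round(TLU[species] * KES_PER_TLU * head_count)
```

So a claim for one camel pays KES 42,000 and a claim for two goats pays KES 9,000, consistent with the schedule.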
Procedia PDF Downloads 55
118 Diagnostic Yield of CT PA and Value of Pre Test Assessments in Predicting the Probability of Pulmonary Embolism
Authors: Shanza Akram, Sameen Toor, Heba Harb Abu Alkass, Zainab Abdulsalam Altaha, Sara Taha Abdulla, Saleem Imran
Abstract:
Acute pulmonary embolism (PE) is a common disease and can be fatal. The clinical presentation is variable and nonspecific, making accurate diagnosis difficult. Testing of patients with suspected acute PE has increased dramatically. However, the overuse of some tests, particularly CT and D-dimer measurement, may not improve care while potentially leading to patient harm and unnecessary expense. CTPA is the investigation of choice for PE. Its easy availability, accuracy and ability to provide an alternative diagnosis have lowered the threshold for performing it, resulting in its overuse. Guidelines recommend the use of clinical pretest probability tools such as the Wells score to assess the risk of suspected PE. Unfortunately, implementation of guidelines in clinical practice is inconsistent. This has led to low-risk patients being subjected to unnecessary imaging, exposure to radiation and possible contrast-related complications. Aim: To study the diagnostic yield of CTPA and the clinical pretest probability of patients according to the Wells score, and to determine whether there was overuse of CTPA in our service. Methods: CT scans done on patients with suspected PE in our hospital from 1st January 2014 to 31st December 2014 were retrospectively reviewed. Medical records were reviewed to study demographics, clinical presentation and final diagnosis, and to establish whether the Wells score and D-dimer were used correctly in predicting the probability of PE and the need for subsequent CTPA. Results: 100 patients (51 male) underwent CTPA in the time period. Mean age was 57 years (range 24-91). The majority of patients presented with shortness of breath (52%). Other presenting symptoms included chest pain (34%), palpitations (6%), collapse (5%) and haemoptysis (5%). A D-dimer test was done in 69%. Overall, the Wells score was low (<2) in 28%, moderate (2-6) in 47% and high (>6) in 15% of patients. The Wells score was documented in the medical notes of only 20% of patients.
PE was confirmed in 12% (8 male). 4 had bilateral PEs. In the high-risk group (Wells >6) (n=15), there were 5 diagnosed PEs. In the moderate-risk group (Wells 2-6) (n=47), there were 6, and in the low-risk group (Wells <2) (n=28), one case of PE was confirmed. CT scans negative for PE showed pleural effusion in 30, consolidation in 20, atelectasis in 15 and a pulmonary nodule in 4 patients. 31 scans were completely normal. Conclusion: The yield of CT for pulmonary embolism was low in our cohort at 12%. A significant number of our patients who underwent CTPA had a low Wells score. This suggests that CTPA is over-utilized in our institution. The Wells score was poorly documented in medical notes. CTPA was able to detect alternative pulmonary abnormalities explaining the patients' clinical presentation. CTPA requires concomitant pretest clinical probability assessment to be an effective diagnostic tool for confirming or excluding PE. Clinicians should use validated clinical prediction rules to estimate pretest probability in patients in whom acute PE is being considered. Combining Wells scores with clinical and laboratory assessment may reduce the need for CTPA.
Keywords: CTPA, D-dimer, pulmonary embolism, Wells score
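The three-tier stratification used in this cohort maps a Wells score to a risk band. A minimal sketch of that mapping (the function name is illustrative; the bands follow the cut-offs stated in the abstract: <2 low, 2-6 moderate, >6 high):

```python
def wells_risk_band(score: float) -> str:
    """Three-tier clinical pretest probability band for suspected PE
    from the Wells score: <2 low, 2-6 moderate, >6 high."""
    if score < 2:
        return "low"
    if score <= 6:
        return "moderate"
    return "high"
```

For example, a patient scoring 4 falls in the moderate band; one scoring 7 falls in the high band, where this cohort's PE yield was greatest (5 of 15).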
Procedia PDF Downloads 232
117 Correlation between the Levels of Some Inflammatory Cytokines/Haematological Parameters and Khorana Scores of Newly Diagnosed Ambulatory Cancer Patients
Authors: Angela O. Ugwu, Sunday Ocheni
Abstract:
Background: Cancer-associated thrombosis (CAT) is a cause of morbidity and mortality among cancer patients. Several risk factors for developing venous thromboembolism (VTE), such as chemotherapy and immobilization, coexist in cancer patients, contributing to their higher risk of VTE compared to non-cancer patients. This study aimed to determine whether there is any correlation between the levels of some inflammatory cytokines/haematological parameters and the Khorana scores of newly diagnosed chemotherapy-naïve ambulatory cancer patients (CNACP). Methods: This was a cross-sectional analytical study carried out from June 2021 to May 2022. Eligible newly diagnosed cancer patients aged 18 years and above (case group) were enrolled consecutively from the adult Oncology Clinics of the University of Nigeria Teaching Hospital, Ituku/Ozalla (UNTH). The control group comprised blood donors at the UNTH Ituku/Ozalla, Enugu blood bank and healthy members of the Medical and Dental Consultants Association of Nigeria (MDCAN), UNTH Chapter. Blood samples collected from the participants were assayed for interleukin-6 (IL-6), tumor necrosis factor-α (TNF-α), and haematological parameters such as haemoglobin, white blood cell count (WBC), and platelet count. Data were entered into an Excel worksheet and analyzed using Statistical Package for Social Sciences (SPSS) software, version 21.0 for Windows. A P value of < 0.05 was considered statistically significant. Results: A total of 200 participants (100 cases and 100 controls) were included in the study. The overall mean age of the participants was 47.42±15.1 years (range 20-76). The sociodemographic characteristics of the two groups, including age, sex, educational level, body mass index (BMI), and occupation, were similar (P > 0.05). Following one-way ANOVA, there were significant differences between the mean levels of IL-6 (p = 0.036) and TNF-α (p = 0.001) across the three Khorana score groups of the case group.
Pearson's correlation analysis showed a significant positive correlation between the Khorana scores and IL-6 (r=0.28, p=0.031), TNF-α (r=0.254, p=0.011), and the platelet-to-lymphocyte ratio (PLR) (r=0.240, p=0.016). The mean serum level of IL-6 was significantly higher in CNACP than in the healthy controls [8.98 (8-12) pg/ml vs. 8.43 (2-10) pg/ml, P=0.0005]. There were also significant differences in the mean haemoglobin (Hb) level (P < 0.001), white blood cell (WBC) count (P < 0.001), and platelet (PL) count (P = 0.005) between the two groups of participants. Conclusion: There is a significant positive correlation between the serum levels of IL-6, TNF-α, and PLR and the Khorana scores of CNACP. The mean serum levels of IL-6, TNF-α, PLR, WBC, and PL count were significantly higher in CNACP than in the healthy controls. Ambulatory cancer patients with high-risk Khorana scores may benefit from anti-inflammatory drugs because of the positive correlation with inflammatory cytokines. Recommendations: Ambulatory cancer patients with Khorana scores of 2 or more may benefit from thromboprophylaxis, given their higher risk. A multicenter study with a heterogeneous population and a larger sample size is recommended in the future to further elucidate the relationship between IL-6, TNF-α, PLR, and the Khorana scores among cancer patients in the Nigerian population.
Keywords: thromboprophylaxis, cancer, Khorana scores, inflammatory cytokines, haematological parameters
Procedia PDF Downloads 82
116 Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals
Authors: Jonathan Sahu, Jill Aylott
Abstract:
Individuals with Autism, Intellectual and Developmental disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations in understanding and interacting with the wider society, but also by societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enable such individuals to live and prosper in society. This research project was designed as an organisational case study, in a large secondary health care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy in reducing health inequalities in these individuals. Methods: The research methodology adopted was that of an “insider researcher”. Data collection included both quantitative and qualitative data, i.e., a mixed-methods approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers, to assess their understanding and awareness of the systems, processes and evidence-based practice used to offer a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with “case study” as a primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of the different types of advocacy present in the organisation, gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near compliance of the organisation with the current national benchmarking standard (Monitor Compliance Framework).
However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005 and poor knowledge of the organisational systems, processes and evidence-based practice applied to people with AIDD. In addition, frontline health care workers showed poor knowledge and awareness of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to delivering quality care, optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance in a way that will stand the test of professional and public scrutiny. Keywords: advocacy, autism, health inequalities, intellectual developmental disabilities, quality of care
Procedia PDF Downloads 217
115 The Potential Impact of Big Data Analytics on Pharmaceutical Supply Chain Management
Authors: Maryam Ziaee, Himanshu Shee, Amrik Sohal
Abstract:
Big Data Analytics (BDA) in supply chain management has recently drawn the attention of academics and practitioners. Big data refers to a massive amount of data from different sources, in different formats, generated at high speed through transactions in business environments and supply chain networks. Traditional statistical tools and techniques find it difficult to analyse such massive data. BDA can assist organisations to capture, store, and analyse data, specifically in the field of supply chain. Currently, there is a paucity of research on BDA in the pharmaceutical supply chain context. In this research, the Australian pharmaceutical supply chain was selected as the case study. This industry is highly significant since the right medicine must reach the right patients, at the right time, in the right quantity, in good condition, and at the right price to save lives. However, drug shortages remain a substantial problem for hospitals across Australia, with implications for patient care, staff resourcing, and expenditure. Furthermore, a massive volume and variety of data is generated at high speed from multiple sources in the pharmaceutical supply chain, which needs to be captured and analysed to benefit operational decisions at every stage of supply chain processes. As the pharmaceutical industry lags behind other industries in using BDA, this raises the question of whether the use of BDA can improve transparency across the pharmaceutical supply chain by enabling the partners to make informed decisions across their operational activities. This presentation explores the impacts of BDA on supply chain management. An exploratory qualitative approach was adopted to analyse data collected through interviews. This study also explores the potential of BDA in the whole pharmaceutical supply chain rather than focusing on a single entity.
Twenty semi-structured interviews were undertaken with top managers in fifteen organisations (five pharmaceutical manufacturers, five wholesalers/distributors, and five public hospital pharmacies) to investigate their views on the use of BDA. The findings revealed that BDA can enable pharmaceutical entities to have improved visibility over the whole supply chain and the market; it enables entities, especially manufacturers, to monitor consumption and the demand rate in real time and make accurate demand forecasts, which reduces drug shortages. Timely and precise decision-making can allow the entities to source and manage their stocks more effectively, which is likely to address drug demand at hospitals and enable responses to unanticipated issues such as drug shortages. Earlier studies explored BDA in the context of clinical healthcare; this presentation, by contrast, investigates the benefits of BDA in the Australian pharmaceutical supply chain. Furthermore, this research enhances managers’ insight into the potential of BDA at every stage of supply chain processes and helps to improve decision-making in their supply chain operations. The findings will help turn the rhetoric of data-driven decision-making into a reality, where managers may opt for analytics for improved decision-making in supply chain processes. Keywords: big data analytics, data-driven decision, pharmaceutical industry, supply chain management
Procedia PDF Downloads 106
114 Functionalization of Sanitary Pads with Probiotic Paste
Authors: O. Sauperl, L. Fras Zemljic
Abstract:
The textile industry is gaining increasing importance in the field of medical materials. The presented research is therefore focused on textile materials for external (out-of-body) use. Such materials include various hygienic textile products (diapers, tampons, sanitary napkins, incontinence products, etc.), protective textiles, various hospital linens (surgical covers, masks, gowns, cloths, bed linens, etc.), wound pillows, bandages, orthopedic socks, and more. The function of tampons and sanitary napkins is not only to provide protection during the menstrual cycle; they can also take care of physiological or pathological vaginal discharge. In general, women's intimate areas are protected against infection by the low pH value of the vaginal flora. This low pH inhibits the development of harmful microorganisms, as they reproduce poorly in an acidic environment. The normal vaginal flora in healthy women is highly colonized by lactobacilli, and the lactic acid produced by these organisms maintains the constant acidity of the vagina. If this balance of natural protection breaks down, infections can occur. Probiotic tampons exist on the market as a medical product supplying the vagina with beneficial probiotic lactobacilli. However, many users have concerns about the use of tampons due to the possible dry-out of the vagina as well as the possibility of toxic shock syndrome, which is why they mainly use sanitary napkins during the menstrual cycle. Functionalization of sanitary napkins with probiotics is therefore interesting with regard to maintaining a healthy vaginal flora and offering users the added value of health- and environmentally-friendly products.
For this reason, the presented research is oriented toward functionalization of sanitary napkins with a probiotic paste, so that the lactic acid bacteria present in the core of the functionalized napkin are activated on contact with the menstrual fluid. In this way, lactobacilli could penetrate into the vagina and, by maintaining a healthy vaginal flora, reduce the risk of vaginal disorders. In regard to the targeted research problem, the influence of the probiotic paste applied onto cotton hygienic napkins on selected properties was studied. The aim of the research was to determine whether sanitary napkins with the applied probiotic paste can assure a suitable vaginal pH to maintain a healthy vaginal flora during the use of this product. In addition, the sorption properties of probiotic-functionalized sanitary napkins were evaluated and compared to untreated ones. The research itself was carried out on the basis of tracking and controlling the input parameters currently defined as most important by the Slovenian producer (Tosama d.o.o.). Successful functionalization of the sanitary pads with the probiotic paste was confirmed by ATR-FTIR spectroscopy. Results of the methods used within the presented research show that the absorption of the pads treated with the probiotic paste deteriorates compared to non-treated ones. The coating shows a 6-month stability. Functionalization of sanitary pads with probiotic paste is believed to have commercial potential for lowering the probability of infection during the menstrual cycle. Keywords: functionalization, probiotic paste, sanitary pads, textile materials
Procedia PDF Downloads 191
113 Scaling up Small and Sick Newborn Care Through the Establishment of the First Human Milk Bank in Nepal
Authors: Prajwal Paudel, Shreeprasad Adhikari, Shailendra Bir Karmacharya, Kalpana Upadhyaya
Abstract:
Background: Human milk banks have been recommended by the World Health Organization (WHO) for newborn and child nourishment, providing optimum nutrition as an alternative in circumstances where direct breastfeeding is inaccessible. Vulnerable babies, mainly preterm, low birth weight, and sick newborns, are at a greater risk of mortality and may benefit from the safe use of donated human milk through milk banks. In this study, we aimed to shed light on the process involved in setting up the nation’s first milk bank and its vitality in small and sick newborn nutrition and care. Methods: The study was conducted in Paropakar Maternity and Women’s Hospital, where the first human milk bank (HMB) was established. The establishment involved a stepwise process: a needs assessment meeting, formation of the HMB committee, a learning visit to an HMB in India, studying the strengths and weaknesses of promoting breastfeeding and HMB system integration, procurement, installation and setting up of the infrastructure, development of technical competency, and launch of the HMB. After the initiation of HMB services, information regarding the recruited donor mothers and the volume of milk pasteurized and consumed by the recipient babies was recorded. Descriptive statistics with frequencies and percentages were used to describe the utilization of HMB services. Results: During the study period, a total of 506113 ml of milk was collected, while 49930 ml of milk was pasteurized. Of the pasteurized milk, 381248 ml was dispensed. The milk was received from a total of 883 donor mothers after proper routine screening tests. Similarly, the total number of babies who received the donated human milk (DHM) was 912, with different neonatal conditions. Among the babies who received DHM, 527 (57.7%) were born via CS, and 385 (42.21%) were delivered normally.
In the birth weight category, 9 (1%) of the babies weighed less than 1000 grams, 75 (8.2%) less than 1500 grams, and 405 (44.4%) between 1500 and 2500 grams, whereas 423 (46.4%) of the babies who received DHM were of normal weight. Among the sick newborns, perinatal asphyxia accounted for 166 (18.2%), preterm with other complications 372 (40.7%), preterm 23 (2.02%), respiratory distress 140 (15.35%), neonatal jaundice 150 (16.44%), sepsis 94 (10.30%), meconium aspiration syndrome 9 (1%), seizure disorder 28 (3.07%), congenital anomalies 13 (1.42%), and others 33 (3.61%). The neonatal mortality rate dropped from 7.5/1000 live births in the previous year to 6.2/1000 live births in the first year of establishment. Conclusion: The establishment of the first HMB in Nepal involved a comprehensive approach to integrating a new system with existing newborn care in the provision of safe DHM. Premature babies with complications, babies born via CS, babies with perinatal asphyxia, and babies with sepsis consumed the greater proportion of DHM. Rigorous research is warranted to assess the impact of DHM on small and sick newborns who would otherwise be fed formula milk. Keywords: human milk bank, sick-newborn, mortality, neonatal nutrition
Procedia PDF Downloads 11
112 Integrated Care on Chronic Diseases in Asia-Pacific Countries
Authors: Chang Liu, Hanwen Zhang, Vikash Sharma, Don Eliseo Lucerno-Prisno III, Emmanuel Yujuico, Maulik Chokshi, Prashanthi Krishnakumar, Bach Xuan Tran, Giang Thu Vu, Kamilla Anna Pinter, Shenglan Tang
Abstract:
Background and Aims: Globally, many health systems focus on hospital-based healthcare models targeting acute care and disease treatment, which are not effective in addressing the challenges of ageing populations, chronic conditions, multi-morbidities, and increasingly unhealthy lifestyles. Recently, integrated care programs on chronic diseases have been developed, piloted, and implemented to meet such challenges. However, integrated care programs in the Asia-Pacific region vary in their level of integration, from linkage to coordination to full integration. This study aims to identify and analyze existing cases of integrated care in the Asia-Pacific region and to identify their facilitators and barriers, in order to improve existing cases and inform future ones. Methods: This is a comparative study combining desk-based research and key informant interviews. The selected countries represent a good mix of lower-middle income countries (the Philippines, India, Vietnam, and Fiji), an upper-middle income country (China), and a high-income country (Singapore) in the Asia-Pacific region. Existing integrated care programs were identified through a scoping review. The trigger, history, general design, beneficiaries, and objectives of each program were summarized, along with the barriers and facilitators of integrated care identified in key informant interviews. Representative cases in each country were selected and comprehensively analyzed through deep-dive case studies. Results: A total of 87 existing integrated care programs on chronic diseases were found across all countries: 44 in China, 21 in Singapore, 12 in India, 5 in Vietnam, 4 in the Philippines, and 1 in Fiji. Nine representative cases of integrated care were selected for in-depth description and analysis: two each in China, the Philippines, and Vietnam, and one each in Singapore, India, and Fiji.
Population aging and the rising chronic disease burden were identified as key drivers in almost all six countries. Among them, Singapore has the longest history of integrated care, followed by Fiji, the Philippines, and China, while India and Vietnam have shorter histories. Incentives, technologies, education, and performance evaluation will be crucial for developing strategies to implement future programs and improve existing ones. Conclusion: Integrated care is important for addressing challenges surrounding the delivery of long-term care. To date, there is an increasing trend of integrated care programs on chronic diseases in the Asia-Pacific region, and all six countries in our study have set integrated care as a direction for their health system transformation. Keywords: integrated healthcare, integrated care delivery, chronic diseases, Asia-Pacific region
Procedia PDF Downloads 135
111 Expanding Behavioral Crisis Care: Expansion of Psychiatric and Addiction-Care Services through a 23/7 Behavioral Crisis Center
Authors: Garima Singh
Abstract:
Objectives: The Behavioral Crisis Center (BCC) is a community solution to a community problem. There has been an exponential increase in the incidence and prevalence of mental health crises around the world. These crises negatively impact our patients and their families and strain law enforcement and emergency rooms. The goal of this multi-disciplinary care model is to break the crisis cycle and provide 24/7 rapid access to care and crisis stabilization. We opened our first BCC in 2020, in the midst of the COVID pandemic, and have seen a remarkable improvement in patient care and positive financial outcomes. Background: Mental illnesses are common in the United States. Nearly one in five U.S. adults lives with a mental illness (52.9 million in 2020), representing 21.0% of all U.S. adults. To address some of these challenges and help our community, in May 2020 we opened our first Behavioral Crisis Center. Since then, we have served more than 2500 patients at southwest Missouri’s first 24/7 facility for crisis-level behavioral health and substance use needs. It has proven to be a more effective setting than emergency departments, jails, or local law enforcement. Methods: The BCC was started in 2020 to serve an unmet need in the community and provide the access to behavioral health and substance use services identified in the community. Funding was made possible by significant investment from the county and the Missouri Foundation for Health, with contributions from medical partners. It is a multi-disciplinary care center consisting of physicians, nurse practitioners, nurses, behavioral technicians, peer support specialists, clinical intake specialists, clinical coordinators, and hospitality specialists. The center provides services including psychiatry care, outpatient therapy, community support services, primary care, and peer support and engagement.
It is connected to a residential treatment facility for substance use treatment, providing continuity of care and bridging the gap, which has resulted in higher treatment completion and better outcomes. Results: The BCC has proven to be a great resource to the community, and the Missouri Health Coalition is providing funding to replicate the model in other regions and to develop a similar model for children and adolescents. Overall, 29% of the patients seen at the BCC are stabilized and discharged with outpatient care, 50% need acute stabilization in a hospital setting, and 21% require long-term admission, mostly for substance use treatment. The local emergency room had a 42% reduction in behavioral health encounters compared to the previous 3 years. Also, through quick transfer to the BCC, the average stay in the ER was reduced by 10 hours, and the time to follow-up behavioral health assessment decreased by an average of 4 hours. Uninsured patients are also provided Medicaid application assistance, which has benefited 55% of individuals receiving care at the BCC. Conclusions: The BCC is impacting community health and improving access to quality care and substance use treatment. It is a great investment for our patients and families. Keywords: BCC, behavioral health, community health care, addiction treatment
Procedia PDF Downloads 76
110 Analyzing the Effectiveness of Elderly Design and the Impact on Sustainable Built Environment
Authors: Tristance Kee
Abstract:
With an unprecedented increase in the elderly population around the world, the severe lack of quality housing and health-and-safety provisions to serve this cohort cannot be ignored any longer. Many elderly citizens, especially singletons, live in unsafe housing conditions with poorly executed planning and design. Some suffer from deteriorating mobility, sight, and general alertness, and their sub-standard living conditions further hinder their daily existence. This research explains how concepts such as Universal Design and Co-Design operate in a high-density city such as Hong Kong, China, where innovative design can become an alternative solution when government and the private sector fail to provide quality elderly-friendly facilities that promote sustainable urban development. Unlike other elderly research, which focuses more on housing policies, nursing care, and theories, this research takes a more progressive approach by providing an in-depth impact assessment of how innovative design can offer practical solutions for creating a more sustainable built environment. The research objectives are to: 1) explain the relationship between innovative design for the elderly and a healthier, more sustainable environment; 2) evaluate the impact of human ergonomics with the use of Universal Design; and 3) explain how innovation can enhance the sustainability of a city by improving citizens’ sight, sound, walkability, and safety within an ageing population. The research adopts both qualitative and quantitative methodologies to examine ways to improve the elderly population’s relationship to our built environment. In particular, the research utilizes data collected from a questionnaire survey and focus group discussions to obtain input from various stakeholders, including designers, operators, and managers related to public housing, community facilities, and overall urban development.
In addition to feedback from end-users and stakeholders, a thorough analysis of existing elderly housing facilities and Universal Design provisions was carried out to evaluate their adequacy. To echo the theme of this conference on Innovation and Sustainable Development, this research examines the effectiveness of innovative design through a risk-benefit factor assessment. To test the hypothesis that innovation can support sustainable development, the research evaluated the health improvement of a sample of 150 elderly people over a period of eight months. Their health performance, including mobility, speech, and memory, was monitored and recorded on a regular basis to assess whether the use of innovation improves health and home safety for an elderly cohort. This study was supported by district community centers under the auspices of the Home Affairs Bureau, which provided respondents for the questionnaire survey, a standardized evaluation mechanism, and professional health care staff for evaluating the performance impact. The research findings will be integrated to formulate design solutions, such as innovative home products, to improve elderly daily experience and safety, with a particular focus on the enhancement of sight, sound, and mobility safety. Policy recommendations and architectural planning recommendations related to Universal Design will also be incorporated into the research output for future planning of elderly housing and amenity provisions. Keywords: elderly population, innovative design, sustainable built environment, universal design
Procedia PDF Downloads 228
109 Case Report: Opioid Sparing Anaesthesia with Dexmedetomidine in General Surgery
Authors: Shang Yee Chong
Abstract:
Perioperative pain is a complex mechanism activated by various nociceptive, neuropathic, and inflammatory pathways. Opioids have long been a mainstay for analgesia in this period, even as we continuously move towards a multimodal model to improve pain control while minimising side effects. Dexmedetomidine, a potent alpha-2 agonist, is a useful sedative and hypnotic agent. Its use in the intensive care unit has been well described, and it is increasingly used as an intraoperative adjunct for its opioid-sparing effects and to decrease pain scores. We describe a case of a general surgical patient in whom minimal opioids were required with dexmedetomidine use. The patient was a 61-year-old Indian gentleman with a history of hyperlipidaemia and type 2 diabetes mellitus, presenting with rectal adenocarcinoma detected on colonoscopy. He was scheduled for a robotic ultra-low anterior resection. The patient was induced with intravenous fentanyl 75 mcg, propofol 160 mg and atracurium 40 mg. He was intubated conventionally and mechanically ventilated. Anaesthesia was maintained with inhalational desflurane, and anaesthetic depth was measured with the Masimo EEG SedLine brain function monitor. An initial intravenous dexmedetomidine bolus of 1 mcg/kg over 10 minutes was given prior to anaesthetic induction and thereafter an infusion of 0.2-0.4 mcg/kg/hr to the end of surgery. In addition, a bolus dose of intravenous lignocaine 1.5 mg/kg followed by an infusion at 1 mg/kg/hr throughout the surgery was administered. A total of 10 mmol of magnesium sulphate and intravenous paracetamol 1000 mg were also given for analgesia. There were no significant episodes of bradycardia or hypotension. A total of 650 mcg of intravenous phenylephrine was given throughout to maintain the patient’s mean arterial pressure within 10-15 mmHg of baseline. The surgical time was 5 hours and 40 minutes. Postoperatively the patient was reversed and extubated successfully.
He was alert and comfortable, and pain scores were minimal in the immediate postoperative period in the recovery unit. Time to first analgesia was 4 hours postoperatively, when paracetamol 1 g was administered. This was given at strict 6-hourly intervals for 5 days post-surgery, along with celecoxib 200 mg BD as prescribed by the surgeon regardless of pain scores. Oral oxycodone was prescribed as a rescue analgesic for pain scores > 3/10, but the patient did not require any dose. Neither was there nausea or vomiting. The patient was discharged on postoperative day 5. This case has reinforced the use of dexmedetomidine as an adjunct in general surgery cases, highlighting its excellent opioid-sparing effects. In the patient’s entire hospital stay, the only dose of opioid he received was 75 mcg of fentanyl at the time of anaesthetic induction. The patient suffered no opioid adverse effects such as nausea, vomiting or postoperative ileus, and pain scores varied from 0-2/10. However, an intravenous lignocaine infusion was also used in this instance, which would have helped improve pain scores. Paracetamol, lignocaine, and dexmedetomidine are thus an effective, opioid-sparing combination of multimodal analgesia for major abdominal surgery cases. Keywords: analgesia, dexmedetomidine, general surgery, opioid sparing
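The weight-based dexmedetomidine dosing described in this case (1 mcg/kg loading, then 0.2-0.4 mcg/kg/hr) can be sanity-checked with simple arithmetic. The sketch below is illustrative only; the 4 mcg/ml dilution in the rate helper is an assumption, not a detail from the case report:

```python
# Illustrative weight-based dose arithmetic; not from the case report.

def dexmed_loading_dose_mcg(weight_kg, dose_mcg_per_kg=1.0):
    """Total loading dose in mcg (default 1 mcg/kg, as in the case)."""
    return weight_kg * dose_mcg_per_kg

def infusion_rate_ml_per_hr(weight_kg, dose_mcg_per_kg_hr, conc_mcg_per_ml=4.0):
    """Pump rate in ml/hr for a maintenance infusion.
    The 4 mcg/ml concentration is an assumed dilution for illustration."""
    return weight_kg * dose_mcg_per_kg_hr / conc_mcg_per_ml
```

For a hypothetical 70 kg patient, the loading dose would be 70 mcg given over 10 minutes, and a 0.4 mcg/kg/hr maintenance infusion at the assumed dilution would run at 7 ml/hr.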
Procedia PDF Downloads 135
108 External Validation of Established Pre-Operative Scoring Systems in Predicting Response to Microvascular Decompression for Trigeminal Neuralgia
Authors: Kantha Siddhanth Gujjari, Shaani Singhal, Robert Andrew Danks, Adrian Praeger
Abstract:
Background: Trigeminal neuralgia (TN) is a heterogeneous pain syndrome characterised by short paroxysms of lancinating facial pain in the distribution of the trigeminal nerve, often triggered by usually innocuous stimuli. TN has a low prevalence of less than 0.1%, and in 80% to 90% of cases it is caused by compression of the trigeminal nerve by an adjacent artery or vein. The root entry zone of the trigeminal nerve is most sensitive to neurovascular conflict (NVC), causing dysmyelination. Whilst microvascular decompression (MVD) is an effective treatment for TN with NVC, not all patients achieve long-term pain relief. Pre-operative scoring systems by Panczykowski and Hardaway have been proposed but have not been externally validated. These pre-operative scoring systems are composite scores calculated according to the subtype of TN, the presence and degree of neurovascular conflict, and the response to medical treatments. There is discordance between neurosurgeons and radiologists in the assessment of NVC identified on pre-operative magnetic resonance imaging (MRI). To the best of our knowledge, the prognostic impact for MVD of this difference of interpretation has not previously been investigated in the form of a composite scoring system such as those suggested by Panczykowski and Hardaway. Aims: This study aims to identify prognostic factors and externally validate the scoring systems proposed by Panczykowski and Hardaway for TN. A secondary aim is to investigate the prognostic difference between a neurosurgeon's interpretation of NVC on MRI and a radiologist’s. Methods: This retrospective cohort study included 95 patients who underwent de novo MVD in a single neurosurgical unit in Melbourne. Data were recorded from patients’ hospital records and the neurosurgeon’s correspondence from perioperative clinic reviews.
Patient demographics, the type and distribution of TN, the response to carbamazepine, and the neurosurgeon's and radiologist's interpretations of NVC on MRI were clearly described prospectively and preoperatively in the correspondence. The scoring systems published by Panczykowski et al. and Hardaway et al. were used to determine composite scores, which were compared with the recurrence of TN recorded during follow-up over 1 year. Categorical data were analysed using Pearson chi-square testing; independent numerical and nominal data were analysed with logistic regression. Results: Logistic regression showed that a Panczykowski composite score of greater than 3 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 1.81 (95% CI 1.41-2.61, p = 0.032). The composite score using the neurosurgeon’s impression of NVC had an OR of 2.96 (95% CI 2.28-3.31, p = 0.048). A Hardaway composite score of greater than 2 points was associated with a higher likelihood of a pain-free outcome 1 year post-MVD, with an OR of 3.41 (95% CI 2.58-4.37, p = 0.028). The composite score using the neurosurgeon’s impression of NVC had an OR of 3.96 (95% CI 3.01-4.65, p = 0.042). Conclusion: The composite scores developed by Panczykowski and Hardaway were validated for the prediction of response to MVD in TN. A composite score based on the neurosurgeon’s interpretation of NVC on MRI had a greater correlation with pain-free outcomes 1 year post-MVD than one based on the radiologist’s. Keywords: de novo microvascular decompression, neurovascular conflict, prognosis, trigeminal neuralgia
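The odds ratios reported above come from logistic regression, where the OR for a predictor is exp(β) for the fitted coefficient β, and the 95% CI is exp(β ± 1.96·SE). A minimal sketch of that conversion (the coefficient and standard error fed in below are illustrative, not this study's fitted values):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a (default 95%) confidence interval."""
    or_ = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return or_, lower, upper

# Illustrative use: a coefficient of ln(1.81) yields an OR of 1.81,
# the same magnitude as the Panczykowski result reported above.
or_, lower, upper = odds_ratio_ci(math.log(1.81), 0.15)
```

Because exp() is monotonic, the CI bounds always bracket the OR, and a coefficient of 0 corresponds to an OR of exactly 1 (no association).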
Procedia PDF Downloads 74
107 The Dark History of American Psychiatry: Racism and Ethical Provider Responsibility
Authors: Mary Katherine Hoth
Abstract:
Despite racial and ethnic disparities in American psychiatry being well documented, there remains an apathetic attitude among nurses and providers within the field toward engaging in active antiracism and providing equitable, recovery-oriented care. It is insufficient to be a “colorblind” nurse or provider and claim that all care provided is identical for every patient. Maintaining an attitude of “colorblindness” perpetuates the racism prevalent throughout healthcare and leads to negative patient outcomes. The purpose of this literature review is to highlight how the historical beginnings of psychiatry have evolved into the disparities seen in today’s practice, as well as to provide some insight into methods that providers and nurses can employ to actively challenge these racial disparities. Background: The application of psychiatric medicine to White people versus Black, Indigenous, and other People of Color has been distinctly different as a direct result of chattel slavery and the development of pseudoscientific “diagnoses” in the 19th century. This weaponization of the mental health of Black people continues to this day. Population: The populations discussed are Black, Indigenous, and other People of Color, with a primary focus on Black people’s experiences with their mental health and the field of psychiatry. Methods: A literature review was conducted using the CINAHL, EBSCO, MEDLINE, and PubMed databases with the following terms: psychiatry, mental health, racism, substance use, suicide, trauma-informed care, disparities, and recovery-oriented care. Articles were further filtered based on meeting the criteria of peer review, full-text availability, being written in English, and publication between 2018 and 2023. Findings: Black patients are more likely to be diagnosed with psychotic disorders and prescribed antipsychotic medications compared to White patients, who are more often diagnosed with mood disorders and prescribed antidepressants.
This same disparity is also seen in children and adolescents, where Black children are more likely to be diagnosed with behavior problems such as Oppositional Defiant Disorder (ODD), while White children with the same presentation are more likely to be diagnosed with Attention-Deficit/Hyperactivity Disorder. Medication advertisements for antipsychotics like Haldol as recently as 1974 portrayed a Black man labeled as “agitated” and “aggressive”, a trope we still see today in police violence cases. The majority of nursing and medical school programs do not provide education on racism and how to actively combat it in practice, leaving many healthcare professionals acutely uneducated and unaware of their own biases and racism, as well as of structural and institutional racism. Conclusions: Racism will continue to grow wherever it is given time, space, and energy. Providers and nurses have an ethical obligation to educate themselves, actively deconstruct their personal racism and bias, and continuously engage in active antiracism by dismantling racism wherever it is encountered, be it structural, institutional, or scientific racism. Agents of change at the patient care level will not only improve the outcomes of Black patients but also lead the way in ensuring Black, Indigenous, and other People of Color are included in research on methods and medications in psychiatry in the future. Keywords: disparities, psychiatry, racism, recovery-oriented care, trauma-informed care
Procedia PDF Downloads 129
106 Prevalence and Associated Risk Factors of Age-Related Macular Degeneration in the Retina Clinic at a Tertiary Center in Makkah Province, Saudi Arabia: A Retrospective Record Review
Authors: Rahaf Mandura, Fatmah Abusharkh, Layan Kurdi, Rahaf Shigdar, Khadijah Alattas
Abstract:
Introduction: Age-related macular degeneration (AMD) is a serious health issue in older individuals that severely impacts the quality of life of millions globally. In 2020, AMD was the fourth leading cause of blindness worldwide. The global prevalence of AMD is estimated to be around 8.7%. AMD is a progressive disease involving the macular region of the retina, and it has a complex pathophysiology. Dysfunction of retinal pigment epithelium (RPE) cells is a crucial step in the pathway leading to irreversible degeneration of photoreceptors, with yellowish lipid-rich, protein-containing drusen deposits accumulating between Bruch's membrane and the RPE. Furthermore, lipofuscinogenesis, drusogenesis, inflammation, and neovascularization are the four main processes responsible for the formation of the two types of AMD: the wet (exudative, neovascular) and dry (non-exudative, geographic atrophy) types. We retrospectively evaluated the prevalence of AMD among patients visiting the retina clinic at King Abdulaziz University Hospital (Jeddah, Makkah Province, Saudi Arabia) to identify the risk factors commonly associated with AMD. Methods: The records of 3,067 individuals from 2017 to 2021 were reviewed. Of these, 1,935 satisfied the inclusion criteria and were included in this study. We excluded all patients below 18 years of age, as well as those who did not undergo fundus imaging or attend their booked appointments, follow-ups, treatments, and referrals. Results: The prevalence of AMD among the patients was 4%. The age of patients with AMD was significantly greater than that of those without AMD (72.4 ± 9.8 years vs. 57.2 ± 15.5 years; p < 0.001). Participants with a family history of AMD tended to have the disease more than those without such a history (85.7% vs. 45%; p = 0.043). Ex- and current smokers were more likely to have AMD than non-smokers (34% and 18.6% vs. 7.2%; p < 0.001).
Patients with hypertension were at a higher risk of developing AMD than those without hypertension (5.5% vs. 2.8%; p = 0.002), as were patients without type 1 diabetes compared with those with type 1 diabetes (4.2% vs. 0.8%; p = 0.040). In contrast, sex, nationality, type 2 diabetes, and abnormal lipid profile were not significantly associated with AMD. Regarding the clinical characteristics of AMD cases, most cases (70.4%) were of the dry type and affected both eyes (77.2%). The disease duration was ≥5 years in 43.1% of the patients. The most frequent chronic diseases associated with AMD were type 2 diabetes (69.1%), hypertension (61.7%), and dyslipidemia (18.5%). Conclusion: In summary, our single tertiary center study showed that AMD is widely prevalent in Jeddah, Saudi Arabia (4%) and linked to a wide range of risk factors. Some of these are modifiable risk factors that can be adjusted to help reduce AMD occurrence. Furthermore, this study has shown the importance of screening and follow-up of family members of patients with AMD to promote early detection and intervention. We recommend conducting further research on AMD in Saudi Arabia. Concerning the study design, a community-based cross-sectional study would be more helpful for assessing the disease's prevalence. Finally, recruiting a larger sample size is required for more accurate estimation. Keywords: age-related macular degeneration, prevalence, risk factor, dry AMD
Procedia PDF Downloads 42
105 Risk Factors Associated with Increased Emergency Department Visits and Hospital Admissions Among Child and Adolescent Patients
Authors: Lalanthica Yogendran, Manassa Hany, Saira Pasha, Benjamin Chaucer, Simarpreet Kaur, Christopher Janusz
Abstract:
Children and adolescent patients visit the Psychiatric Emergency Department (ED) for multiple reasons. Visiting the Psychiatric ED can itself be a traumatic experience that affects an adolescent's mental well-being, regardless of a history of mental illness. Despite this, limited research exists in this domain. Prospective studies have correlated adverse psychosocial determinants among adolescents with risk factors for poor well-being and unfavorable behavior outcomes. Studies have also shown that physiological stress is a contributor to the development of health problems and an increase in substance abuse in adolescents. This study aimed to retrospectively determine which psychosocial factors are associated with an increase in psychiatric ED visits. 600 charts of patients who had a psychiatric ED visit and inpatient admission from January 2014 through December 2014 were reviewed. Sociodemographics, diagnoses, ED visits, and inpatient admissions were collected. Descriptive statistics, chi-square tests, and independent t-test analyses were used to examine differences in the sample and determine which factors affected ED visits and admissions. The sample was 50% female and 35.2% self-identified Black, with a mean age of 13 years. The majority, 85%, attended public school, and 17% were in special education. Attention Deficit Hyperactivity Disorder was the most common admitting diagnosis, found in 132 (23%) responders. Most patients came from a single-parent household (305; 53%). The mean ages of patients who were sexually active, had legal issues, or reported marijuana abuse were 15, 14.35, and 15 years, respectively. Patients from two-biological-parent households had significantly fewer ED visits (1.2 vs. 1.7, p < 0.01) and admissions (0.09 vs. 0.26, p < 0.01). Among social factors, those who reported sexual, physical, or emotional abuse had a significantly greater number of ED visits (2.1 vs. 1.5, p < 0.01) and admissions (0.61 vs. 0.14, p < 0.01) than those who did not. Patients who were sexually active, had legal issues, or abused marijuana had a significantly greater number of admissions (0.43 vs. 0.17, p < 0.01), (0.54 vs. 0.18, p < 0.01), and (0.46 vs. 0.18, p < 0.01), respectively. These data support the theory of the stability of a two-parent home. Dual parenting plays a role in creating a safe space where a child can develop; this is shown by the associated decreases in psychiatric ED visits and admissions and may highlight the psychologically protective role of a two-parent household. Abuse can exacerbate existing psychiatric illness or initiate the onset of new disease. Substance abuse and legal issues result in early induction into the criminal system; the results show that these are associated with an increased frequency of visits and severity of symptoms. Only marijuana, but not other illicit substances, correlated with a higher incidence of psychiatric ED visits. This may speak to the psychotropic nature of tetrahydrocannabinols and their role in mental illness. This study demonstrates the array of psychosocial factors that lead to increased ED visits and admissions in children and adolescents. Keywords: adolescent, child psychiatry, emergency department, substance abuse
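The chart-review analysis above pairs chi-square tests (for categorical associations such as household type) with independent t-tests (for mean ED visit counts). A minimal sketch of both computations is given below; the function names and all counts/values are hypothetical illustrations, not the study's data.

```python
# Hedged sketch of the analyses described above: a chi-square test of
# independence for a 2x2 contingency table, and a pooled two-sample
# t statistic. All numbers are illustrative placeholders, not study data.
from statistics import mean, variance

def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

def t_independent(a, b):
    """Pooled two-sample t statistic (equal variances assumed)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical 2x2 table: household type x repeat ED visit
table = [[40, 60], [55, 45]]
# Hypothetical ED visit counts: abuse reported vs. not reported
abused = [2, 3, 1, 2, 4, 2]
not_abused = [1, 1, 2, 1, 1, 2]
print(round(chi_square_2x2(table), 3), round(t_independent(abused, not_abused), 3))
```

The resulting statistics would then be compared against chi-square and t distributions to obtain the p-values reported in the abstract.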
Procedia PDF Downloads 333
104 Assessment of Sleeping Patterns of Saudis with Type 2 Diabetes Mellitus in Ramadan and Non-Ramadan Periods Using a Wearable Device and a Questionnaire
Authors: Abdullah S. Alghamdi, Khaled Alghamdi, Richard O. Jenkins, Parvez I. Haris
Abstract:
Background: Quantity and quality of sleep have been reported to be significant risk factors for obesity and the development of metabolic disorders such as type 2 diabetes mellitus (T2DM). The relationship between diabetes and sleep quantity has been reported to be U-shaped, meaning that both increased and decreased sleeping hours can raise the risk of diabetes. Plasma glucagon levels were found to decrease continuously during night-time sleep in healthy individuals, independently of blood glucose and insulin levels. Disturbance of the circadian rhythm is also important and has been linked with an increased risk of diabetes. There is a lack of research on the sleep patterns of Saudis with T2DM and how these are affected by Ramadan fasting. Aim: To assess the sleeping patterns of Saudis with T2DM (before, during, and after Ramadan) using two different techniques, and to relate these to their HbA1c levels. Method: This study recruited 82 Saudis with T2DM, who chose to fast during Ramadan, from the Endocrine and Diabetic Centre of Al Iman General Hospital, Riyadh, Saudi Arabia. Ethical approvals for the study were obtained from De Montfort University and the Saudi Ministry of Health. Sleeping patterns were assessed by a self-administered questionnaire (before, during, and after Ramadan). The assessment included the daily total sleeping hours (DTSH) and total night-time sleeping hours (TNTSH) of the participants. In addition, the sleeping patterns of 36 patients, randomly selected from the 82 participants, were further tracked during and after Ramadan using a Fitbit Flex 2™ accelerometer. Blood samples were collected in each period for measuring HbA1c. Results: Questionnaire analysis revealed that sleeping patterns changed significantly between the periods, with shorter hours during Ramadan (p < 0.001 for DTSH, and p < 0.001 for TNTSH).
These findings were confirmed by the Fitbit data, which also indicated significantly shorter sleeping hours for both DTSH and TNTSH during Ramadan (p < 0.001 and p < 0.001, respectively). Although there were no significant correlations between the questionnaire and Fitbit data, the TNTSH were shorter among the participants in all periods by both techniques. The mean HbA1c varied significantly between periods, with the lowest level during Ramadan. Although the statistical tests did not show significant differences in mean HbA1c between groups of participants with different hours of sleep, the lowest mean HbA1c was observed in the group of participants who slept for 6-8 hours and had longer night-time sleeping hours. Conclusion: A short sleep duration and an absence of night-time sleep were observed among the majority of the study population during Ramadan, which could suppress the full benefits of Ramadan fasting for diabetic patients. This study showed that there is good agreement between the findings of the questionnaire and the Fitbit device for evaluating sleeping patterns in a Saudi population. A larger study is needed in the future to investigate the impact of Ramadan fasting on sleep quality and quantity and its relationship with health and disease. Keywords: Diabetes, Fasting, Fitbit, HbA1c, IPAQ, Ramadan, Sleep
Procedia PDF Downloads 113
103 Estimating the Effect of a Newly Developed Portable Innovative Balance Room System with a Digital Game Program on Falls and Incontinence Symptoms in the Elderly
Authors: Özge Çeliker Tosun, Melda Başer Secer, İsmail Düşmez, Sedat Çapar, İlkay Kozak, Melahat Aktaş, Furkan Can Şimşek, Gökhan Tosun
Abstract:
Purpose: The portable innovative balance room system with a digital game program was created so that it can be set up in small areas, such as inside the house, in the garden, or on a balcony, allowing the person to enter and safely perform both assessment and exercise, and so that the results can be stored and sent to the therapist, either live or later, as desired. The aim is to compare the effectiveness of the exercise program applied by the elderly within this system with that of an exercise program implemented under the supervision of a physiotherapist, in terms of balance and urinary incontinence symptoms. Materials and Methods: The study was conducted in a randomized controlled manner on 63 people with urinary incontinence (mean age: 75.5 years) at Narlıdere Nursing Home Elderly Care and Rehabilitation Center. The elderly people participating in the study were divided into 3 groups: Group 1 received an exercise program consisting of pelvic floor muscle training and OTAGO exercises; Group 2 received only pelvic floor muscle training; and Group 3 performed pelvic floor muscle training and OTAGO exercises as a digital game program in the portable balance room system (self-administered), each for 12 weeks. The Pelvic Floor Distress Inventory (PTDE-20) and a bladder diary were used to evaluate the incontinence symptoms of the cases. Pelvic floor muscle function was evaluated with surface EMG. The Berg Balance Scale, the Falls Efficacy Scale (FES), and functional status evaluations (Chair Stand Test, Eight-Foot Up and Go Test, Chair Sit and Reach Test, Two-Minute Step Test) were used to evaluate balance. The existence of differences between groups was analyzed using the Kruskal-Wallis analysis of variance, and the difference between before and after exercise was analyzed with Wilcoxon tests. Results: After treatment, PTDE-20, daily urinary incontinence, and toilet visit values decreased significantly in all three groups (p < 0.001).
While there was a statistically significant increase in pelvic floor muscle EMG values in the second and third groups after treatment, there was no change in the other group (Group 2 PFM mean EMG before-after: 5.5 (4.15-10.95) - 10.95 (8.68-13.68), p = 0.05; Group 3 PFM mean EMG before-after: 6.5 (4.28-11.55) - 11.75 (8.67-14.26), p = 0.04). While the Berg score, Chair Stand Test, Eight-Foot Up and Go Test, and Two-Minute Step Test values increased in all groups (p < 0.05), Falls Efficacy Scale (FES) values did not change after treatment. Conclusion: Although pelvic floor muscle training combined with balance exercises reduces symptoms, it may not lead to a positive improvement in the functions of the pelvic floor muscles. For this reason, recovery may be short-lived, and symptoms may recur in the future. However, thanks to the new system, when balance exercises are combined with a game program for the pelvic floor muscles, a double effect can be achieved with a single application, and both incontinence and balance problems can be treated in a safe environment where the person can exercise independently. Nevertheless, more work needs to be done on this subject. Keywords: fall, urinary incontinence, balance, elderly
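The between-group comparison described above uses a Kruskal-Wallis analysis of variance on the three groups' scores. A minimal sketch of the H statistic is shown below, assuming three independent groups; the scores are hypothetical placeholders, not the trial's measurements.

```python
# Hedged sketch of the Kruskal-Wallis H statistic used for the
# three-group comparison above (no tie correction applied).
# The group scores are illustrative placeholders, not study data.

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic over k independent groups."""
    pooled = sorted(v for g in groups for v in g)
    # Average rank per value handles simple ties.
    ranks = {}
    for v in set(pooled):
        positions = [i + 1 for i, x in enumerate(pooled) if x == v]
        ranks[v] = sum(positions) / len(positions)
    n = len(pooled)
    h = 0.0
    for g in groups:
        rank_sum = sum(ranks[v] for v in g)
        h += rank_sum * rank_sum / len(g)
    return 12 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical post-treatment scores for the three exercise groups
g1 = [12, 15, 14, 10]
g2 = [18, 21, 17, 19]
g3 = [11, 9, 13, 16]
print(round(kruskal_wallis_h(g1, g2, g3), 3))
```

The H statistic would then be compared against a chi-square distribution with k - 1 degrees of freedom to obtain the p-values reported above.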
Procedia PDF Downloads 75
102 Atypical Intoxication Due to Fluoxetine Abuse with Symptoms of Amnesia
Authors: Ayse Gul Bilen
Abstract:
Selective serotonin reuptake inhibitors (SSRIs) are commonly prescribed antidepressants used clinically for the treatment of anxiety disorders, obsessive-compulsive disorder (OCD), panic disorders, and eating disorders. The first SSRI, fluoxetine (sold under the brand names Prozac and Sarafem, among others), had an adverse effect profile better than that of any other available antidepressant when it was introduced, because of its selectivity for serotonin receptors. SSRIs have been considered almost free of side effects and have become widely prescribed; however, questions about their safety and tolerability have emerged with continued use. Most SSRI side effects are dose-related and can be attributed to serotonergic effects such as nausea. Continuous use might trigger adverse effects such as hyponatremia, tremor, nausea, weight gain, sleep disturbance, and sexual dysfunction. Moderate toxicity can be safely observed in the hospital for 24 hours, and mild cases of intentional overdose can be safely discharged from the emergency department (if asymptomatic) after 6 to 8 hours of observation, once cleared by Psychiatry. Although fluoxetine is relatively safe in overdose, it might still be cardiotoxic and inhibit platelet secretion, aggregation, and plug formation. Clinical cases of seizures, cardiac conduction abnormalities, and even fatalities associated with fluoxetine ingestion have been reported. While the medical literature strongly suggests that most fluoxetine overdoses are benign, emergency physicians need to remain cognizant that intentional, high-dose fluoxetine ingestions may induce seizures and can even be fatal due to cardiac arrhythmia. Our case is a 35-year-old female patient who was sent to the ER with symptoms of confusion, amnesia, and loss of orientation to time and place after being found wandering the streets in a disoriented state by police, who notified emergency services (112).
On laboratory examination, no pathological finding was noted except sinus tachycardia on the ECG and high levels of aspartate transaminase (AST) and alanine transaminase (ALT). Diffusion MRI and computed tomography (CT) of the brain were normal. On physical and sexual examination, no signs of abuse or trauma were found. Test results for narcotics, stimulants, and alcohol were negative as well. The presence of dysrhythmia required admission to the intensive care unit (ICU). The patient regained full consciousness after 24 hours. It was discovered from her history afterward that she had been taking fluoxetine for post-traumatic stress disorder (PTSD) for 6 months and that she had attempted suicide by taking 3 boxes of fluoxetine following the loss of a parent. She was then transferred to the psychiatric clinic. Our study aims to highlight the need to consider toxicologic drug use, in particular the abuse of selective serotonin reuptake inhibitors (SSRIs), which have been widely prescribed due to presumed safety and tolerability, in the diagnosis of patients presenting to the emergency room (ER). Keywords: abuse, amnesia, fluoxetine, intoxication, SSRI
Procedia PDF Downloads 199
101 A Second Chance to Live and Move: Lumbosacral Spinal Cord Ischemia-Infarction after Cardiac Arrest and the Artery of Adamkiewicz
Authors: Anna Demian, Levi Howard, L. Ng, Leslie Simon, Mark Dragon, A. Desai, Timothy Devlantes, W. David Freeman
Abstract:
Introduction: Out-of-hospital cardiac arrest (OHCA) carries a high mortality. For survivors, the most common complication is hypoxic-ischemic brain injury (HIBI). Rarely, lumbosacral and/or other spinal cord artery ischemia can occur due to anatomic variation and variable mean arterial pressure after the return of spontaneous circulation. We present a case of an OHCA survivor who later woke up with bilateral leg weakness with preserved sensation (ASIA grade B, L2 level). Methods: We describe the clinical, radiographic, and laboratory presentation, along with a National Library of Medicine (NLM) search methodology characterizing the incidence/prevalence of this entity. A 70-year-old male, a longtime smoker and alcohol user, suddenly collapsed at a bar surrounded by friends. He had complained of chest pain before collapsing. 911 was called. EMS arrived and found the patient in pulseless electrical activity (PEA); cardiopulmonary resuscitation (CPR) was initiated, the patient was intubated, and a LUCAS device was applied for continuous, high-quality CPR in the field. In the ED, central lines were placed, and thrombolysis was administered for a suspected pulmonary embolism (PE). It was a prolonged code that lasted 90 minutes, with eventual return of spontaneous circulation. The patient was placed on epinephrine and norepinephrine drips to maintain blood pressure. An echocardiogram was performed and showed a “D-shaped” ventricle worrisome for PE, as well as an ejection fraction of around 30%. A CT with PE protocol was performed and confirmed bilateral PE. Results: The patient woke up 24 hours later, following commands, and was extubated. He was found to be paraplegic below L2 with preserved sensation, with hypotonia and areflexia consistent with “spinal shock” or anterior spinal cord syndrome. MRI of the thoracic and lumbar spine showed a spinal cord infarction at the level of the conus medullaris.
The patient was given IV steroids upon initial discovery of the cord infarct. An NLM search using “cardiac arrest” and “spinal cord infarction” revealed 57 results, with only 8 review articles. Risk factors include age, atherosclerotic disease, and intraaortic balloon pump placement. Anatomic variation of the artery of Adamkiewicz (AoA), along with existing atherosclerotic factors and low perfusion, is also a known risk factor. Conclusion: Acute paraplegia from anterior spinal cord infarction of the AoA territory after cardiac arrest is rare. Larger prospective, multicenter trials are needed to examine potential interventions such as hypothermia, lumbar drains (which are sometimes used in aortic surgery to reduce ischemia), and/or other neuroprotectants. Keywords: cardiac arrest, spinal cord infarction, artery of Adamkiewicz, paraplegia
Procedia PDF Downloads 189
100 Mesalazine-Induced Myopericarditis in a Professional Athlete
Authors: Tristan R. Fraser, Christopher D. Steadman, Christopher J. Boos
Abstract:
Myopericarditis is an inflammatory syndrome characterised by the clinical diagnostic criteria for pericarditis, such as chest pain, combined with evidence of myocardial involvement, such as elevation of biomarkers of myocardial damage, e.g., troponins. It can rarely be a complication of therapeutics used for dysregulated immune-mediated diseases such as inflammatory bowel disease (IBD), for example, mesalazine. The infrequency of mesalazine-induced myopericarditis adds to the challenge of its recognition. Rapid diagnosis and the early introduction of treatment are crucial. This case report follows a 24-year-old professional footballer with a past medical history of ulcerative colitis, recently started on mesalazine for disease control. Three weeks after mesalazine was initiated, he was admitted with fever, shortness of breath, chest pain worse whilst supine and on deep inspiration, and an elevated venous blood cardiac troponin T level (cTnT, 288 ng/L; normal: <13 ng/L). Myocarditis was confirmed on initial inpatient cardiac MRI, which revealed florid myocarditis with preserved left ventricular systolic function and an ejection fraction of 67%. This was a longitudinal case study following the progress of a single individual with myopericarditis over four acute hospital admissions across nine weeks, with admissions ranging from two to five days. Parameters examined included clinical signs and symptoms, serum troponin, transthoracic echocardiography, and cardiac MRI. Serial measurements of cardiac function, including cardiac MRI and transthoracic echocardiography, showed progressive deterioration of cardiac function whilst mesalazine was continued. Prior to cessation of mesalazine, transthoracic echocardiography revealed a small global pericardial effusion of <1 cm and worsening left ventricular systolic function with an ejection fraction of 45%.
After recognition of mesalazine as a potential cause and consequent cessation of the drug, symptoms resolved, with outpatient cardiac MRI showing resolution of the myocardial oedema. The patient plans to return to competitive sport. Patients suffering from myopericarditis are advised to refrain from competitive sport for at least six months in order to reduce the risk of cardiac remodelling and sudden cardiac death. Additional considerations must be made for individuals for whom competitive sport is an essential component of their livelihood, such as professional athletes. Myopericarditis is an uncommon but potentially serious medical condition with a wide variety of aetiologies, including viral, autoimmune, and drug-related causes. Management is mainly supportive and relies on prompt recognition and removal of the aetiological process. Mesalazine-induced myopericarditis is a rare condition; as such, increasing awareness of mesalazine as a precipitant of myopericarditis is vital for optimising the management of these patients. Keywords: myopericarditis, mesalazine, inflammatory bowel disease, professional athlete
Procedia PDF Downloads 135
99 Usage of the Transpedicular Screw Fixation Method in the Treatment of Pediatric Patients with Injuries of the Thoracic and Lumbar Spine
Authors: S. D. Zalepugin, A. E. Murzich, D. G. Satskevich, A. B. Palivanov
Abstract:
Introduction: The incidence of spinal injuries in patients under 18 years of age has increased significantly in recent years, which represents a significant economic, social, and medical problem. The most common method of surgical stabilization of spinal fractures in pediatric patients is transpedicular posterior spinal fusion, which is widely used by spinal neurosurgeons in adult patients. Purpose of the study: This study evaluates the results of treatment of thoracolumbar spine injuries in children using the transpedicular screw fixation method. Materials and methods: From 2019 to 2024, 35 children with injuries to the thoracic and lumbar spine underwent surgical treatment using the transpedicular screw fixation method. Girls prevailed among the injured (21 cases, 60%). The age of the patients ranged from 9 to 17 years. The main causes of injury were: fall from height (19 cases), road traffic accident (5 cases), sports injury (6 cases), and other causes (5 cases). In 5 cases, the injuries resulted from suicide attempts. Combined injury was observed in most cases (20 patients, or 57%), which is typical of high-energy trauma. Vertebro-spinal injury with neurological disorders was observed in 13 patients; the disorders ranged from mild lower paraparesis (4 children) to moderate/severe paraparesis (5 patients) and lower paraplegia (4 children). 6 children had pelvic organ dysfunction in the form of urinary and fecal retention or incontinence. All thirty-five patients, within a period of 1 to 57 days after the injury, underwent surgical intervention via a posterior approach using the screw fixation method (posterior decompression + spinal fusion). In 12 cases, it was necessary to perform a second stage of surgical treatment: anterior decompression of the spinal cord or its roots. Verticalization of the patients was carried out within 1 to 5 days after surgery. Results: Short-term results, up to 1 year, were evaluated in all patients.
In children operated on in 2019-2021, the results were studied over a period of 3 to 5 years. The procedures used, the clinical results, and the quality of hardware placement were assessed. Positive results were achieved in all patients. The use of internal fixation made it possible to carry out early verticalization of the children, eliminate pain syndrome, and achieve a regression of neurological disorders in most patients (especially in cases when the operation was performed early after injury, from 1 to 3 days). Within the first month, the ability to self-care was fully restored. Bone fusion was observed within 6-12 months after surgery. There were no complications after surgery. The analysis of postoperative radiographs, CT, and MRI images revealed correct positioning of the screws in all cases. Conclusion: Posterior spinal fusion using this method of screw fixation in pediatric patients achieves durable stabilization of the injury, enables early rehabilitation of patients, and reduces the duration of hospital treatment by a factor of 2-3. Thus, we recommend the use of a transpedicular fixator in children as a reliable, technically feasible method for restoring spinal stability with a low risk of intra- and postoperative complications. Keywords: pediatric patients, spinal injuries, transpedicular stabilization, operative treatment
Procedia PDF Downloads 8
98 Developmental Difficulties Prevalence and Management Capacities among Children Including Genetic Disease in a North Coastal District of Andhra Pradesh, India: A Cross-sectional Study
Authors: Koteswara Rao Pagolu, Raghava Rao Tamanam
Abstract:
The present study aimed to determine the prevalence of developmental difficulties (DDs) in Visakhapatnam, one of the north coastal districts of Andhra Pradesh, India, over a span of five years. A cross-sectional investigation was conducted at the District Early Intervention Center (DEIC), Visakhapatnam, from 2016 to 2020. To identify the pattern and trend of different DDs, including seasonal variations, a retrospective analysis of the health center's inpatient database for the past 5 years was done. Male and female children aged 2 months to 18 years were included in the study with the prior permission of the concerned medical officer. The screening tool developed by the Ministry of Health and Family Welfare, India, was used for the study. Among 26,423 cases admitted during the study period, 962 children had birth defects, 2,229 had deficiencies, 7,516 had diseases, and 15,716 had disabilities. Among birth defects, congenital deafness was the most frequent (22.66%), while neural tube defects were observed in a small number of cases (0.83%). Among deficiencies, severe acute malnutrition was the most common (66.80%), and a small number of children were affected by goiter (1.70%). Among the diseases, dental caries (67.97%) was most frequently found, and these cases peaked during the years 2016 and 2019. Among disabilities, children with vision impairment (20.55%) most often approached the center. Over the past 5 years, the admission rate of Down's syndrome and congenital deafness cases showed a rising trend up to 2019 and then declined. Hearing impairment, motor delay, and learning disorder showed a steep rise followed by a gradual decline, whereas severe anemia, vitamin D deficiency, otitis media, reactive airway disease, and attention deficit hyperactivity disorder showed a declining trend. However, the admission rates of congenital heart diseases, dental caries, and vision impairment showed a zigzag pattern over the past 5 years.
The center had inadequate diagnostic facilities for genetic disease management. For advanced confirmation, cases were referred to the district government hospital or to private diagnostic laboratories in the city for genetic tests. Information regarding the overall burden and pattern of admissions in the health center was obtained by review of the DEIC records. Through this study, it was observed that the incidence of birth defects, as well as the genetic disease burden, is high in the Visakhapatnam district. Hence, there is a need to strengthen management services for these diseases in this region. Keywords: child health screening, developmental delays, district early intervention center, genetic disease management, infrastructural facility, Visakhapatnam district
Procedia PDF Downloads 213
97 Optimization of Multi-Disciplinary Expertise and Resource for End-Stage Renal Failure (ESRF) Patient Care
Authors: Mohamed Naser Zainol, P. P. Angeline Song
Abstract:
Over the years, the profile of end-stage renal patients enrolled in The National Kidney Foundation Singapore (NKFS) dialysis program has evolved, with a gradual rise in the number of patients with behavior-related issues. With these challenging profiles, social workers and counsellors are often expected to oversee behavior management through referrals from partnering colleagues. Because of the segregation of tasks common in hospital-based multi-disciplinary settings, social workers' and counsellors' interventions are often seen as an endpoint, limiting the involvement of other stakeholders who could otherwise be crucial in managing such patients. While patients' contact with local hospitals usually ends in discharge, NKFS patients are mostly long-term. Notably, these patients are regularly seen by a team of NKFS professionals that includes doctors, nurses, dietitians, and exercise specialists. The dynamism of these relationships presents an opportunity for any of these professionals to take ownership of their potential in leading interventions helpful to patients. It is therefore important to have a framework that incorporates the strengths of these professionals and channels empowerment across the multi-disciplinary team toward holistic patient care. This paper proposes a new framework for NKFS's multi-disciplinary team in which group synergy and dynamics are used to encourage ownership and promote empowerment. The social worker and counsellor use group work skills and their knowledge of team members' strengths to generate constructive solutions centered on the patient's growth.
Using key ideas from Karl Tomm's interpersonal communication approach, the Coordinated Management of Meaning, and Motivational Interviewing, the social worker and counsellor, through a series of guided meetings with other colleagues, facilitate shared understanding, responsibility sharing, and the tapping of team resources for patient care. As a result, the patient experiences a personal and concerted approach and begins to move in a helpful direction. The social worker and counsellor applied this framework over a period of six months to seven case studies of identified patients with behavioral issues. Patients' overall improvement through framework-guided interventions was recorded using an AB single-case design, with baseline measured three months before referral. Interviews with patients and their families, as well as with colleagues outside the multi-disciplinary team, were solicited at the mid-point and end-point to gather their experiences of patients' progress as a by-product of the framework. Expert interviews will be conducted with each member of the multi-disciplinary team to study their observations and experience with this new framework. This exploratory framework thus aims to establish its usefulness in managing patients with behavior-related issues and to indicate how it could be improved when applied to a larger population.
Keywords: behavior management, end-stage renal failure, satellite dialysis, multi-disciplinary team
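In an AB single-case design like the one described above, each patient's baseline phase (A) is compared against the intervention phase (B). A minimal illustrative sketch, using entirely hypothetical weekly behavior scores (the abstract does not publish raw data), assuming a simple mean-shift comparison between phases:

```python
# Hypothetical weekly behavior scores for one patient (higher = better).
baseline_A = [3, 4, 3, 2, 3]      # phase A: three months before referral
intervention_B = [4, 5, 6, 6, 7]  # phase B: under the proposed framework

def phase_mean(scores):
    """Mean score for one phase of an AB single-case design."""
    return sum(scores) / len(scores)

# A positive shift suggests improvement after the intervention began.
improvement = phase_mean(intervention_B) - phase_mean(baseline_A)
print(f"mean shift A->B: {improvement:+.1f}")  # prints "mean shift A->B: +2.6"
```

Real single-case analyses would typically also inspect trend and overlap between phases, not just the mean shift; this sketch only shows the basic A-versus-B comparison.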
Procedia PDF Downloads 146
96 One Pot Synthesis of Cu–Ni–S/Ni Foam for the Simultaneous Removal and Detection of Norfloxacin
Authors: Xincheng Jiang, Yanyan An, Yaoyao Huang, Wei Ding, Manli Sun, Hong Li, Huaili Zheng
Abstract:
Residual antibiotics in the environment pose a threat to both the environment and human health. Efficient removal and rapid detection of norfloxacin (NOR) in wastewater are therefore very important. The main sources of NOR pollution are agricultural, pharmaceutical-industry, and hospital wastewater. Total consumption of NOR in China reaches 5,440 tons per year. Neither animals nor humans can fully absorb and metabolize NOR, so NOR is excreted into the environment and has been detected in water bodies. The hazards of NOR in wastewater lie in three aspects: (1) the capacity of wastewater treatment plants to remove NOR is limited (the average removal efficiency of NOR in wastewater treatment plants is reportedly only 68%); (2) NOR entering the environment leads to the emergence of drug-resistant strains; (3) NOR is toxic to many aquatic species. At present, removal and detection technologies for NOR are applied separately, which makes the overall process cumbersome. Developing simultaneous adsorption-flocculation removal and FTIR detection of pollutants has three advantages: (1) adsorption-flocculation promotes detection, because enrichment on the material surface improves detection sensitivity; (2) integrating removal and detection reduces material cost and simplifies operation; (3) FTIR detection endows the water treatment agent with molecular recognition and semi-quantitative detection of pollutants. It is thus of great significance to develop a smart water treatment material with both high removal capacity and detection ability. This study explored the feasibility of combining a NOR removal method with a semi-quantitative detection method.
A magnetic Cu-Ni-S/Ni foam was synthesized by in-situ loading of Cu-Ni-S nanostructures on the surface of Ni foam. The novelty of this material lies in combining adsorption-flocculation technology with semi-quantitative detection. Batch experiments showed that Cu-Ni-S/Ni foam achieves a high NOR removal rate (96.92%), wide pH adaptability (pH 4.0-10.0), and strong resistance to ion interference (0.1-100 mmol/L). According to the Langmuir fitting model, the removal capacity reaches 417.4 mg/g at 25 °C, much higher than that of most water treatment agents reported to date. Characterization analysis indicated that the main removal mechanisms are surface complexation, cation bridging, electrostatic attraction, precipitation, and flocculation. Transmission FTIR detection experiments showed that NOR on Cu-Ni-S/Ni foam has easily recognizable FTIR fingerprints, and the intensity of the characteristic peaks roughly reflects concentration. This semi-quantitative detection method has a wide linear range (5-100 mg/L) and a low limit of detection (4.6 mg/L). These results show that Cu-Ni-S/Ni foam has excellent removal performance and semi-quantitative detection ability for NOR molecules. This paper provides a new approach to designing and preparing multi-functional water treatment materials for simultaneous removal and semi-quantitative detection of organic pollutants in water.
Keywords: adsorption-flocculation, antibiotics detection, Cu-Ni-S/Ni foam, norfloxacin
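The Langmuir capacity quoted above comes from fitting equilibrium uptake data to the Langmuir isotherm, q_e = q_max · K_L · C_e / (1 + K_L · C_e). A minimal sketch of evaluating this model, using the reported q_max and a purely hypothetical affinity constant K_L (the abstract does not report K_L or raw isotherm data):

```python
# Langmuir isotherm: q_e = q_max * K_L * C_e / (1 + K_L * C_e)
Q_MAX = 417.4  # mg/g, fitted maximum capacity reported in the abstract (25 C)
K_L = 0.05     # L/mg, hypothetical affinity constant, for illustration only

def langmuir_qe(c_e, q_max=Q_MAX, k_l=K_L):
    """Predicted equilibrium uptake (mg/g) at equilibrium concentration c_e (mg/L)."""
    return q_max * k_l * c_e / (1 + k_l * c_e)

# Uptake approaches q_max as the equilibrium concentration rises.
for c in (10, 50, 200):
    print(f"C_e = {c:>3} mg/L -> q_e = {langmuir_qe(c):.1f} mg/g")
```

In practice q_max and K_L would both be estimated from batch equilibrium data by nonlinear least squares; the point of the sketch is only the saturating shape that makes q_max a meaningful capacity figure.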
Procedia PDF Downloads 76