Search results for: severe morbidity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2274


1914 Potential Impacts of Climate Change on Hydrological Droughts in the Limpopo River Basin

Authors: Nokwethaba Makhanya, Babatunde J. Abiodun, Piotr Wolski

Abstract:

Climate change is likely to intensify hydrological droughts and reduce water availability in river basins. Despite this, most research on climate change effects in southern Africa has focused on meteorological droughts. This study projects the potential impact of climate change on the future characteristics of hydrological droughts in the Limpopo River Basin (LRB). The study uses regional climate model (RCM) simulations (from the Coordinated Regional Climate Downscaling Experiment, CORDEX) combined with hydrological simulations (using the Soil and Water Assessment Tool Plus model, SWAT+) to project the impacts at four global warming levels (GWLs: 1.5℃, 2.0℃, 2.5℃, and 3.0℃) under the RCP8.5 future climate scenario. The SWAT+ model was calibrated and validated with a streamflow dataset observed over the basin, and the sensitivity of model parameters was investigated. The performance of the SWAT+LRB model was verified using the Nash-Sutcliffe efficiency (NSE), Percent Bias (PBIAS), Root Mean Square Error (RMSE), and coefficient of determination (R²). The Standardized Precipitation Evapotranspiration Index (SPEI) and the Standardized Precipitation Index (SPI) were used to detect meteorological droughts. The Standardized Soil water Index (SSI) was used to define agricultural drought, while the Water Yield Drought Index (WYLDI), the Surface Run-off Index (SRI), and the Streamflow Index (SFI) were used to characterise hydrological drought. The performance of the SWAT+ model simulations over the LRB is sensitive to the parameters CN2 (initial SCS runoff curve number for moisture condition II) and ESCO (soil evaporation compensation factor). The best simulation generally performed better during the calibration period than the validation period. In the calibration and validation periods, NSE is ≤ 0.8, while PBIAS is ≥ −80.3%, RMSE ≥ 11.2 m³/s, and R² ≤ 0.9.
The simulations project a future increase in temperature and potential evapotranspiration over the basin, but no significant future trend in precipitation or hydrological variables. However, the spatial distribution of precipitation reveals a projected increase in the southern part of the basin and a decline in the northern part, with the region of reduced precipitation projected to expand with increasing GWLs. A decrease in all hydrological variables is projected over most of the basin, especially its eastern part. The simulations project that meteorological droughts (i.e., SPEI and SPI), agricultural droughts (i.e., SSI), and hydrological droughts (i.e., WYLDI and SRI) would become more intense and severe across the basin. SPEI-drought shows a greater magnitude of increase than SPI-drought, with agricultural and hydrological droughts falling between the two. As a result, this research suggests that future hydrological droughts over the LRB could be more severe than the SPI-drought projection predicts but less severe than the SPEI-drought projection suggests. These findings can inform efforts to mitigate the effects of climate change on hydrological drought in the basin.
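The four verification metrics named in the abstract (NSE, PBIAS, RMSE, R²) have standard closed forms; a minimal sketch of computing them for observed and simulated streamflow series (the data below are illustrative, not the study's) might look like:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, values < 0 are worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

def pbias(obs, sim):
    """Percent bias: positive values indicate underestimation (SWAT convention)."""
    return 100 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def rmse(obs, sim):
    """Root mean square error, in the units of the data (here m³/s)."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def r2(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    vo = sum((o - mo) ** 2 for o in obs)
    vs = sum((s - ms) ** 2 for s in sim)
    return cov ** 2 / (vo * vs)

observed = [12.0, 30.5, 22.1, 45.3, 18.7]   # hypothetical streamflow, m³/s
simulated = [14.2, 28.0, 25.0, 40.1, 20.3]
print(nse(observed, simulated), pbias(observed, simulated))
```

The inequalities quoted in the abstract (NSE ≤ 0.8, PBIAS ≥ −80.3%) are bounds over all calibration/validation runs, so each run's metrics would be computed exactly as above and then summarised.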

Keywords: climate change, CORDEX, drought, hydrological modelling, Limpopo River Basin

Procedia PDF Downloads 104
1913 Comparison of Analgesic Efficacy of Paracetamol and Tramadol for Pain Relief in Active Labor

Authors: Krishna Dahiya

Abstract:

Introduction: Labour pain has been described as the most severe pain experienced by women in their lives. Pain management in labour is one of the most important challenges faced by the obstetrician. Opioids are the primary treatment for patients with moderate and severe pain, but these drugs are not always tolerated and are associated with dose-dependent side effects. Nonsteroidal anti-inflammatory drugs, too, are associated with variable adverse effects. Considering these factors, our study compared the efficacy and side effects of intravenous tramadol and paracetamol. Objective: To evaluate the efficacy and adverse effects of an intravenous infusion of 1000 mg of paracetamol as compared with an intravenous injection of 50 mg of tramadol for intrapartum analgesia. Methods: In a randomized prospective study at Pt. BDS PGIMS, 200 women in active labor were allocated to receive either paracetamol (n=100) or tramadol (n=100). The primary outcome was the efficacy of the drug in providing adequate analgesia, as measured by the change in the visual analog scale (VAS) pain intensity score at various times after drug administration. The secondary outcomes included the need for additional rescue analgesia and the presence of adverse maternal or fetal events. Results: The mean ages of the two groups were 25.55 ± 3.849 years and 25.60 ± 3.655 years, respectively. As recorded by the VAS score, there was significant pain reduction at 30 minutes and at 1 and 2 hours in both groups (P<0.01). Comparing groups I and II, there was a significantly higher rate of nausea and vomiting in the tramadol group (14% vs 8%; P<0.03). Similarly, drowsiness (0% vs 11%; P<0.01), dry mouth (0% vs 8%; P<0.04), and dizziness (0% vs 9%; P<0.02) were significantly more frequent in group II. Conclusion: Given the difficulty of administering epidural analgesia to all parturients, paracetamol or tramadol infusion is a simple and less invasive alternative for analgesia.
In the present study, both paracetamol and tramadol were equally effective for labour analgesia, but paracetamol emerged as the safer alternative owing to its lower incidence of side effects.

Keywords: paracetamol, tramadol, labor, analgesia

Procedia PDF Downloads 267
1912 Clinical Audit on the Introduction of Apremilast into Ireland

Authors: F. O’Dowd, G. Murphy, M. Roche, E. Shudell, F. Keane, M. O’Kane

Abstract:

Introduction: Apremilast (Otezla®) is an oral phosphodiesterase-4 (PDE4) inhibitor indicated for the treatment of adult patients with moderate to severe plaque psoriasis who have failed, have contraindications to, or are intolerant of standard systemic therapy and/or phototherapy, and of adult patients with active psoriatic arthritis. Apremilast influences the intracellular regulation of inflammatory mediators. Two randomized, placebo-controlled trials evaluating apremilast in 1426 patients with moderate to severe plaque psoriasis (ESTEEM 1 and 2) demonstrated that the commonest adverse reactions (AEs) leading to discontinuation were nausea (1.6%), diarrhoea (1.0%), and headaches (0.8%). The overall proportion of subjects discontinuing due to adverse reactions was 6.1%. At week 16, these trials demonstrated that significantly more apremilast-treated patients (33.1%) achieved the primary end point, PASI-75, than placebo-treated patients (5.3%). We began prescribing apremilast in July 2015. Aim: To evaluate the efficacy and tolerability of apremilast in an Irish teaching hospital psoriasis population. Methods: A proforma documenting clinical evaluation parameters, prior treatment experience, and AEs was completed prospectively on all patients commenced on apremilast between July 2015 and July 2017. Data were collected at weeks 0, 6, 12, 24, 36, and 52, with 20/71 patients having passed week 52. Efficacy was assessed using the Psoriasis Area and Severity Index (PASI) and the Dermatology Life Quality Index (DLQI). AEs documented included GI effects, infections, and changes in weight and mood. Retrospective chart review and telephone review were used for missing data. Results: A total of 71 adult subjects (38 male, 33 female; age range 23-57), with moderate to severe psoriasis, were evaluated. Prior treatment: 37/71 (52%) were systemic/biologic/phototherapy naïve; 14/71 (20%) had prior phototherapy alone; 20/71 (28%) had previous systemic/biologic exposure; 12/71 (17%) had both psoriasis and psoriatic arthritis.
PASI responses: mean baseline PASI was 10.1 and mean baseline DLQI was 15. Week 6: N=71, n=15 (21%) achieved PASI 75. Week 12: N=48, n=6 (13%) achieved PASI 100; n=16 (34.5%) achieved PASI 75. Week 24: N=40, n=10 (25%) achieved PASI 100; n=15 (37.5%) achieved PASI 75. Week 52: N=20, n=4 (20%) achieved PASI 100; n=16 (80%) achieved PASI 75. (N = number of patients having passed the time point indicated; n = number of patients, out of N, achieving PASI or DLQI responses at that time.) DLQI responses: week 24: N=40, n=30 (75%) achieved a DLQI score of 0; n=5 (12.5%) achieved a DLQI score of 1; n=1 (2.5%) had a DLQI score of 10 (due to lack of efficacy). Adverse events: the proportion of patients who discontinued treatment due to AEs was n=7 (9.8%). One patient experienced nausea alleviated by dose reduction; another developed significant dysgeusia for certain foods; both continued therapy. Two patients lost 2-3 kg. Conclusion: Initial Irish patient experience of apremilast appears comparable to that observed in trials, with good efficacy and tolerability.

Keywords: Apremilast, introduction, Ireland, clinical audit

Procedia PDF Downloads 133
1911 Increasing Prevalence of Multi-Allergen Sensitivities in Patients with Allergic Rhinitis and Asthma in Eastern India

Authors: Sujoy Khan

Abstract:

There is rising concern about increasing allergies affecting both adults and children in rural and urban India. A recent report on adults in a densely populated North Indian city showed sensitization rates for house dust mite, parthenium, and cockroach of 60%, 40%, and 18.75%, now comparable to allergy prevalence in cities in the United States. Data from patients residing in the eastern part of India are scarce. A retrospective study (over 2 years) was done on patients with allergic rhinitis and asthma in whom allergen-specific IgE levels were measured, to determine the aero-allergen sensitization pattern in a large metropolitan city of East India. Total IgE and allergen-specific IgE levels were measured using ImmunoCAP (Phadia 100, Thermo Fisher Scientific, Sweden) with region-specific aeroallergens: Dermatophagoides pteronyssinus (d1); Dermatophagoides farinae (d2); cockroach (i206); grass pollen mix (gx2) consisting of Cynodon dactylon, Lolium perenne, Phleum pratense, Poa pratensis, Sorghum halepense, and Paspalum notatum; tree pollen mix (tx3) consisting of Juniperus sabinoides, Quercus alba, Ulmus americana, Populus deltoides, and Prosopis juliflora; food mix 1 (fx1) consisting of peanut, hazel nut, Brazil nut, almond, and coconut; mould mix (mx1) consisting of Penicillium chrysogenum, Cladosporium herbarum, Aspergillus fumigatus, and Alternaria alternata; animal dander mix (ex1) consisting of cat, dog, cow, and horse dander; and weed mix (wx1) consisting of Ambrosia elatior, Artemisia vulgaris, Plantago lanceolata, Chenopodium album, and Salsola kali, following the manufacturer's instructions. As the IgE levels were not uniformly distributed, median values were used to represent the data. Ninety-two patients with allergic rhinitis and asthma (united airways disease), including 21 children (age < 12 years), had total IgE and allergen-specific IgE levels measured over the 2 years.
The median IgE level was higher in 2016 than in 2015, with 60% of patients (adults and children) sensitized to house dust mite (dual positivity for Dermatophagoides pteronyssinus and farinae). Of 11 children in 2015, whose total IgE ranged from 16.5 to >5000 kU/L, 36% were polysensitized (≥4 allergens) and 55% were sensitized to dust mites. Of 10 children in 2016, whose total IgE levels ranged from 37.5 to 2628 kU/L, 20% were polysensitized and 60% were sensitized to dust mites. Mould sensitivity was 10% in both years in the children studied. A consistent finding was that ragweed sensitization (molecular homology to Parthenium hysterophorus) appeared to be increasing across all age groups and throughout the year, consistent with our previous report in which 25% of patients were sensitized. In the study sample overall, sensitization to dust mite, cockroach, and parthenium was an important risk in patients with moderate to severe asthma, which reinforces the importance of controlling indoor exposure to these allergens. Sensitizations to dust mite, cockroach, and parthenium allergens are important predictors of asthma morbidity not only among children but also among adults in Eastern India.

Keywords: aeroallergens, asthma, dust mite, parthenium, rhinitis

Procedia PDF Downloads 171
1910 The Impact of Covid-19 on Anxiety Levels in the General Population of the United States: An Exploratory Survey

Authors: Amro Matyori, Fatimah Sherbeny, Askal Ali, Olayiwola Popoola

Abstract:

Objectives: The study evaluated the impact of COVID-19 on anxiety levels in the general population of the United States. Methods: The study used an online questionnaire adopting the Generalized Anxiety Disorder Assessment (GAD-7) instrument, a self-administered scale with seven items used as a screening tool and severity measure for generalized anxiety disorder. The participants rated the frequency of anxiety symptoms in the last two weeks on a Likert scale ranging from 0-3; the item points are then summed to give the total score. Results: Thirty-two participants completed the questionnaire, among them 24 (83%) females and 5 (17%) males. The 18-24-year age group accounted for the most respondents. Only one of the participants tested positive for COVID-19, while 39% reported that a family member, friend, or colleague had tested positive for the coronavirus. Moreover, 10% had lost a family member, a close friend, or a colleague because of COVID-19. Among the respondents, ten scored approximately five points on the GAD-7 scale, indicating mild anxiety. Furthermore, eight participants scored 10 to 14 points, placing them in the moderate-anxiety category, and one individual scored 15 points, placing them in the severe-anxiety category. Conclusions: Most respondents scored in the mild-anxiety category during the COVID-19 pandemic. Severe anxiety was the least common among the participants, and people who tested positive and/or whose family members, close friends, or colleagues tested positive were more likely to experience anxiety. Additionally, participants who had lost friends or family members were at high risk of anxiety. The outcomes of COVID-19, and excessive rumination about the pandemic, evidently placed people under stress that led to anxiety.
Therefore, continuous assessment and monitoring of psychological outcomes during pandemics will help to establish early well-informed interventions.
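The GAD-7 scoring described above (seven items rated 0-3, summed, then banded) can be sketched as follows; the severity cut-offs (0-4 minimal, 5-9 mild, 10-14 moderate, 15-21 severe) are the standard published GAD-7 bands:

```python
def gad7_score(items):
    """Sum seven Likert responses (each 0-3) into a total GAD-7 score."""
    if len(items) != 7 or not all(0 <= i <= 3 for i in items):
        raise ValueError("GAD-7 requires exactly seven responses rated 0-3")
    return sum(items)

def gad7_severity(total):
    """Map a total score (0-21) onto the standard GAD-7 severity bands."""
    if total >= 15:
        return "severe"
    if total >= 10:
        return "moderate"
    if total >= 5:
        return "mild"
    return "minimal"

responses = [1, 1, 0, 2, 0, 1, 0]  # hypothetical participant, total = 5
print(gad7_severity(gad7_score(responses)))  # prints: mild
```

This makes the abstract's groupings concrete: the participant who scored 15 falls exactly on the severe threshold, while the eight who scored 10-14 fall in the moderate band.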

Keywords: anxiety and covid-19, covid-19 and mental health outcomes, influence of covid-19 on anxiety, population and covid-19 impact on mental health

Procedia PDF Downloads 184
1909 Inhibition of Food Borne Pathogens by Bacteriocinogenic Enterococcus Strains

Authors: Neha Farid

Abstract:

Due to the abuse of antimicrobial medications in animal feed, the occurrence of multi-drug resistant (MDR) pathogens in foods is a growing public health concern worldwide. MDR infections can enter the food chain, posing a serious risk to both consumers and animals. Food pathogens are biological agents that tend to cause disease in the host upon ingestion. The major reservoirs of foodborne pathogens are food-producing animals such as cows, pigs, goats, sheep, and deer, whose intestines are heavily colonized with several different types of food pathogens. Bacterial food pathogens are the main cause of foodborne disease in humans; almost 66% of the food illness cases reported each year are caused by bacterial food pathogens. When ingested, these pathogens survive and reproduce, or form various toxins inside host cells, causing severe infections. The genus Listeria consists of gram-positive, rod-shaped, non-spore-forming bacteria. The disease caused by Listeria monocytogenes is listeriosis, a gastroenteritis that induces fever, vomiting, and severe diarrhea in the affected person. Campylobacter jejuni is a gram-negative, curved-rod-shaped bacterium causing foodborne illness. The major sources of Campylobacter jejuni are livestock and poultry; chicken in particular is highly colonized with Campylobacter jejuni. The widespread growth of antibiotic-resistant bacteria and the slowing discovery of new classes of medicines are serious public health concerns. The objective of this study was to assess the antibacterial activity of certain broad-spectrum antibiotics and of bacteriocins from specific Enterococcus faecium strains against foodborne pathogens, with the aim of safeguarding food by reducing deterioration, contamination, and foodborne illness.
The food pathogens were isolated from various dairy products and meat samples. The isolates were tested for the presence of Listeria and Campylobacter by Gram staining and biochemical testing, then sub-cultured on selective media enriched with growth supplements for Listeria and Campylobacter. All six strains of Listeria and Campylobacter were tested against ten antibiotics. Campylobacter strains showed resistance against all the antibiotics, whereas Listeria was resistant only to nalidixic acid and erythromycin. The strains were then tested against the two bacteriocins isolated from Enterococcus faecium, which showed better antimicrobial activity against the food pathogens and could be used as potential antimicrobials for food preservation. The study thus concluded that natural antimicrobials could serve as alternatives to synthetic antimicrobials to overcome food spoilage and severe foodborne disease.

Keywords: food pathogens, listeria, campylobacter, antibiotics, bacteriocins

Procedia PDF Downloads 45
1908 Gamma-Hydroxybutyrate (GHB): A Review for the Prehospital Clinician

Authors: Theo Welch

Abstract:

Background: Gamma-hydroxybutyrate (GHB) is a central nervous system depressant with euphoric effects. It is increasingly used recreationally in the United Kingdom (UK) despite associated morbidity and mortality. Owing to a lack of evidence, healthcare professionals remain unsure of the optimal management of GHB acute toxicity. Methods: A literature review of its pharmacology and the emergency management of its acute toxicity was undertaken. Findings: GHB is inexpensive and readily available over the Internet. Treatment of GHB acute toxicity is supportive. Clinicians should pay particular attention to the airway, as emesis is common; intubation is required in a minority of cases. Polydrug use is common and worsens prognosis. Conclusion: GHB acute toxicity can be difficult to identify and is generally treated conservatively. Further research is needed to ascertain the indications, benefits, and risks of intubating patients with GHB acute toxicity.

Keywords: GHB, gamma-hydroxybutyrate, prehospital, emergency, toxicity, management

Procedia PDF Downloads 177
1907 Behavioral and Cultural Risk Factor of Cardiovascular Disease in India: Evidence from SAGE-Study

Authors: Sunita Patel

Abstract:

Cardiovascular diseases (CVDs) are a leading cause of morbidity and mortality in India. The objective of this study is to examine CVD prevalence and identify behavioral and cultural risk factors using the SAGE-2007 data collected in six states of India. Findings reveal that 18.3% of people in India were diagnosed with CVDs. The odds of disease rise with age, being 2.45 times higher (OR 2.45; CI: 1.66-3.63) at ages 30-39 and 7.45 times higher (OR 7.45; CI: 4.82-11.49) at ages 70+, compared with the 18-29 age group. By wealth quintile, the odds of CVD were 60% higher in the third quintile (CI: 1.16-2.21) and 58% higher in the richest quintile (CI: 1.13-2.21) than in the lowest quintile. Relative risks indicated that those performing moderate activity had a 22.4% lower chance of disease, and those performing vigorous activity a 44% lower chance, compared with those who performed no activity or consumed alcohol. These findings suggest that policy measures targeting these risk factors would benefit public awareness and future health.

Keywords: behavioral risk, cultural risk, cardio-vascular diseases, wealth quintile

Procedia PDF Downloads 377
1906 A Perspective on Emergency Care of Gunshot Injuries in Northern Taiwan

Authors: Liong-Rung Liu, Yu-Hui Chiu, Wen-Han Chang

Abstract:

Firearm injuries are high-energy injuries. The ballistic pathways can cause severe burns or chemical damage to vessels, the musculoskeletal system, or other major organs, and the high mortality rate is accompanied by complications such as sepsis. As laws prohibit gun possession, civilian gunshot wounds (GSW) are relatively rare in Taiwan. Our hospital, Mackay Memorial Hospital, located in the center of Taipei city, is surrounded by nightclubs and red-light districts. Due to this unique location, our hospital is a first-line trauma center managing gunshot victims in Taiwan. To the authors' best knowledge, there are few published research articles regarding this situation. We hereby analyze the distinct characteristics and length of stay (LOS) of GSW patients in the emergency department (ED) at Mackay Memorial Hospital. A 6-year retrospective analysis of 27 patients treated for GSW injuries from January 2012 to December 2017 was performed. The patients' records were reviewed for the following analyses: 1) wound position and the correlated clinical presentations; 2) the ED LOS of patients receiving emergency surgery for major organ or vascular injuries. We found that males (96.3%) were injured by guns more often than females (3.7%) in all age groups. The most commonly injured sites were the extremities. With regard to ED LOS, the average time was 72.2 ± 34.5 minutes for patients with triage level I and 207.4 ± 143.9 minutes for patients with triage level II. The ED LOS was 59.9 ± 25.6 minutes for patients whose ISS was more than 15, and 179.4 ± 119.8 minutes for patients whose ISS was between 9 and 15. Among these 27 patients, 10 underwent emergency surgery, with an average ED stay of 104.5 ± 33.3 minutes; this was shortened to 88.8 ± 32.3 minutes in the 5 patients with trauma team activation.
In conclusion, trauma team activation for severe GSW patients indeed shortens ED LOS and may improve the quality of initial patient care. This is the result of better trauma systems, including advances in care from emergency medical services and acute care surgical management.

Keywords: gunshot, length of stay, trauma, mortality

Procedia PDF Downloads 110
1905 Combining Patients Pain Scores Reports with Functionality Scales in Chronic Low Back Pain Patients

Authors: Ivana Knezevic, Kenneth D. Candido, N. Nick Knezevic

Abstract:

Background: While pain intensity scales remain a generally accepted assessment tool, the numeric pain rating score is highly subjective; we nevertheless rely on it to judge treatment effects. Misinterpretation of pain can lead practitioners to underestimate or overestimate the patient's condition. The purpose of this study was to analyze how the numeric rating pain scores given by patients with low back pain correlate with their functional activity levels. Methods: After Institutional Review Board (IRB) approval, we included 100 consecutive patients with radicular low back pain (LBP). Numeric rating scale (NRS) pain scores at rest and during movement, and Oswestry Disability Index (ODI) questionnaire answers, were collected 10 times over 12 months. The ODI questionnaire targets a patient's activities and physical limitations as well as the ability to manage stationary everyday duties. Statistical analysis was performed using SPSS Software version 20. Results: The average duration of LBP was 14±22 months at the beginning of the study. Patients were between 24 and 78 years old (average 48.85±14); 56% were women and 44% men. Differences between ODI and pain scores in the range from -10% to +10% were considered "normal". Discrepancies were graded as mild between -30% and -11% or +11% and +30%; moderate between -50% and -31% or +31% and +50%; and severe if differences exceeded -50% or +50%. Our data showed that pain scores at rest correlated well with ODI in 65% of patients. In 30% of patients, mild discrepancies were present (negative in 21% and positive in 9%); 4% of patients had moderate and 1% severe discrepancies. "Negative discrepancy" means that patients graded their pain scores much higher than their functional ability, and most likely exaggerated their pain.
"Positive discrepancy" means that patients graded their pain scores much lower than their functional ability, and most likely underrated their pain. Comparisons between ODI and pain scores during movement showed normal correlation in only 39% of patients. Mild discrepancies were present in 42% (negative in 39% and positive in 3%), moderate in 14% (all negative), and severe in 5% (all negative); 58% of patients thus unknowingly exaggerated their pain during movement. Inconsistencies were equal in male and female patients (p=0.606 and p=0.928). Our results showed a negative correlation between patients' satisfaction and the degree of inconsistency in pain reporting. Furthermore, patients taking opioids showed more discrepancies in reported pain intensity scores than patients taking non-opioid analgesics or no medications for LBP (p=0.038). There was a highly statistically significant correlation between morphine-equivalent doses and the level of discrepancy (p<0.0001). Conclusion: We emphasize patient education in pain evaluation as a vital step toward accurate pain reporting, which correlated directly with patients' satisfaction. Furthermore, we must identify other parameters for defining our patients' chronic pain conditions, such as functionality scales and quality of life questionnaires, and should move away from an overly simplistic subjective rating scale.
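The discrepancy grading used in this study can be sketched as follows; scaling the 0-10 NRS to a percentage by multiplying by 10 is our assumption, since the abstract does not state how the two scales were put on a common footing:

```python
def discrepancy_grade(nrs_0_10, odi_pct):
    """Grade the gap between an ODI score (%) and an NRS pain score.

    The NRS (0-10) is scaled to 0-100% here (an assumption). A negative
    difference means pain was rated higher than functional loss
    ("negative discrepancy" in the study's terminology).
    """
    diff = odi_pct - nrs_0_10 * 10
    magnitude = abs(diff)
    if magnitude <= 10:
        grade = "normal"
    elif magnitude <= 30:
        grade = "mild"
    elif magnitude <= 50:
        grade = "moderate"
    else:
        grade = "severe"
    return diff, grade

# Hypothetical patient: pain 8/10 but only 40% disability on the ODI.
print(discrepancy_grade(8, 40))  # prints: (-40, 'moderate')
```

A negative moderate or severe grade under this scheme flags a patient who likely exaggerated their pain relative to function, which is the pattern the study found in 58% of movement scores.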

Keywords: pain score, functionality scales, low back pain, lumbar

Procedia PDF Downloads 210
1904 Air Pollution on Stroke in Shenzhen, China: A Time-Stratified Case Crossover Study Modified by Meteorological Variables

Authors: Lei Li, Ping Yin, Haneen Khreis

Abstract:

Stroke was the second leading cause of death and the third leading cause of death and disability combined worldwide in 2019. Given the significant role of environmental factors in stroke development and progression, it is essential to investigate the effect of air pollution on stroke occurrence while considering the modifying effects of meteorological variables. This study aimed to evaluate the association between short-term exposure to air pollution and the incidence of stroke subtypes in Shenzhen, China, and to explore potential interactions of meteorological factors with air pollutants. The study analyzed data from January 1, 2006, to December 31, 2014, covering 88,214 cases of ischemic stroke and 30,433 cases of hemorrhagic stroke among residents of Shenzhen. Using a time-stratified case-crossover design with conditional quasi-Poisson regression, the study estimated the percentage changes in stroke morbidity associated with short-term exposure to nitrogen dioxide (NO₂), sulfur dioxide (SO₂), particulate matter less than 10 μm in aerodynamic diameter (PM10), carbon monoxide (CO), and ozone (O₃). A five-day moving average of air pollution was applied to capture its cumulative effects. The estimates were further stratified by sex, age, education level, and season. Additive and multiplicative interactions between air pollutants and meteorological variables were assessed by the relative excess risk due to interaction (RERI) and by adding an interaction term to the main model, respectively. The study found that NO₂ was positively associated with ischemic stroke occurrence throughout the year and in the cold season (November through April), with a stronger effect observed among men. Each 10 μg/m³ increment in the five-day moving average of NO₂ was associated with a 2.38% (95% confidence interval: 1.36% to 3.41%) increase in the risk of ischemic stroke over the whole year and a 3.36% (2.04% to 4.69%) increase in the cold season.
The harmful effect of CO on ischemic stroke was observed only in the cold season, with each 1 mg/m³ increment in the five-day moving average of CO increasing the risk by 12.34% (3.85% to 21.51%). There was no statistically significant additive interaction between individual air pollutants and temperature or relative humidity, as assessed by the RERI. The interaction term in the model showed a multiplicative antagonistic effect between NO₂ and temperature (p-value=0.0268). For hemorrhagic stroke, no effect of any individual air pollutant was found in the whole population; however, the RERI indicated statistically significant additive and multiplicative interactions of temperature with the effects of PM10 and O₃ on hemorrhagic stroke onset, so this null finding should be interpreted with caution. The study suggests that environmental NO₂ and CO might increase the morbidity of ischemic stroke, particularly during the cold season. These findings could help inform policy decisions aimed at reducing air pollution levels to prevent stroke and other health conditions. Additionally, the study provides valuable insight into the interaction between air pollution and meteorological variables, underscoring the need for further research into the complex relationship between environmental factors and health.
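In a (quasi-)Poisson model of this kind, the percentage change per exposure increment follows from the fitted log-rate coefficient; a minimal sketch of that conversion (the coefficient value below is hypothetical, back-derived from the 2.38% figure for illustration) is:

```python
import math

def percent_change(beta, increment):
    """Percentage change in risk for a given exposure increment,
    where beta is the fitted log-relative-risk per unit of exposure."""
    return (math.exp(beta * increment) - 1) * 100

# Hypothetical coefficient per 1 ug/m3 of NO2; a 10 ug/m3 increment then
# maps to a 2.38% rise, matching the form of the estimates quoted above.
beta_no2 = math.log(1.0238) / 10
print(round(percent_change(beta_no2, 10), 2))  # prints: 2.38
```

The confidence interval endpoints quoted in the abstract are obtained the same way from the endpoints of the coefficient's interval.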

Keywords: air pollution, meteorological variables, interactive effect, seasonal pattern, stroke

Procedia PDF Downloads 58
1903 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units

Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz

Abstract:

Sepsis is a syndrome involving physiological and biochemical abnormalities induced by severe infection, and it carries high mortality and morbidity; the severity of the condition must therefore be assessed quickly. After a patient's admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information collected into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, machine learning techniques applied to data from a population sharing a common characteristic could yield customized mortality prediction scores with better performance. This study presents the development of a score for one-year mortality prediction for patients admitted to an ICU with a sepsis diagnosis. A total of 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics, and clinical information from the first 24 hours after ICU admission were used to develop the mortality prediction score. LASSO (least absolute shrinkage and selection operator) and SGB (stochastic gradient boosting) variable-importance methodologies were used to select the set of variables that make up the score. Each of these variables was dichotomized at a cut-off point that divides the population into two groups with different mean mortalities; if the patient falls in the higher-mortality group, the variable is assigned a one, otherwise a zero. These binary variables were used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by the corresponding binary variables and summed.
The one-year mortality probability was estimated using the score as the only variable in an LR model. The predictive power of the score was evaluated on the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the results obtained with the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS), and Simplified Acute Physiology Score II (SAPS II) scores on the same validation subset. Observed and predicted mortality rates within deciles of estimated probability were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; the number of events (deaths) also increases steadily from the decile with the lowest probabilities to the decile with the highest. Sepsis is a syndrome that carries a high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians quickly and accurately predict a worse prognosis are needed. This work demonstrates the importance of customizing mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
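The point-assignment procedure described above (round LR coefficients to integers, sum points per patient, then map the score to a probability with a second score-only LR) can be sketched as follows. The variable names, coefficients, and intercept are illustrative assumptions, not the study's fitted values.

```python
import math

# Hypothetical coefficients from the first logistic regression on the
# dichotomized variables (illustrative values only, not the study's).
coefs = {"age_flag": 0.92, "lactate_flag": 1.41, "comorbidity_flag": 0.58}

# Round each coefficient to the nearest integer -> point value per variable.
points = {k: round(v) for k, v in coefs.items()}

def score(flags):
    """Sum the points for every binary variable the patient is positive on."""
    return sum(points[k] * flags[k] for k in points)

# Hypothetical intercept/slope of the second, score-only logistic regression.
b0, b1 = -2.1, 0.65

def mortality_prob(flags):
    """Map the integer score to a one-year mortality probability."""
    s = score(flags)
    return 1 / (1 + math.exp(-(b0 + b1 * s)))

patient = {"age_flag": 1, "lactate_flag": 1, "comorbidity_flag": 0}
p = mortality_prob(patient)  # score of 2 -> probability ≈ 0.31
```

The appeal of the integer-point form is that clinicians can compute the score by hand at the bedside, while the second regression preserves a calibrated probability estimate.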

Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting

Procedia PDF Downloads 187
1902 Co-Factors of Hypertension and Decomposition of Inequalities in Its Prevalence in India: Evidence from NFHS-4

Authors: Ayantika Biswas

Abstract:

Hypertension remains one of the most important preventable contributors to adult mortality and morbidity and a major public health challenge worldwide. Studying regional and rural-urban differences in its prevalence, and assessing the contributions of different indicators, is essential for determining the drivers of this condition. The 2015-16 National Family Health Survey data have been used for the study. Bivariate analysis, multinomial regression analysis, concentration indices, and decomposition of concentration indices to assess the contribution of factors have been undertaken in the present study. An overall concentration index of 0.003 has been found for the hypertensive population, showing its concentration among the richer wealth quintiles. Factors such as age 45 to 49 years and 5 to 9 years of schooling are important contributors to inequality in hypertension occurrence. Studies should be conducted to find approaches to prevent or delay the onset of the condition.
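The concentration index reported above can be computed in the standard Wagstaff form, twice the covariance between the health indicator and the fractional wealth rank, divided by the mean of the indicator. The sketch below is a minimal illustration with made-up data, not the NFHS-4 computation or its decomposition.

```python
import numpy as np

def concentration_index(health, wealth):
    """Concentration index: 2*cov(h, fractional wealth rank) / mean(h).
    Positive values indicate concentration among the richer ranks."""
    order = np.argsort(wealth)                 # sort individuals poorest-first
    h = np.asarray(health, dtype=float)[order]
    n = len(h)
    rank = (np.arange(1, n + 1) - 0.5) / n     # fractional rank in (0, 1)
    # bias=True gives the population (1/n) covariance used in this formula.
    return 2 * np.cov(h, rank, bias=True)[0, 1] / h.mean()

# Toy example: hypertension (1/0) present only in the two richest individuals,
# so the index comes out positive (pro-rich concentration).
ci = concentration_index([0, 0, 0, 1, 1], wealth=[1, 2, 3, 4, 5])
```

An index near zero, as in the abstract's 0.003, indicates an almost equal distribution across wealth quintiles, with a slight tilt toward the richer ones.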

Keywords: hypertension, decomposition, inequalities, India

Procedia PDF Downloads 113
1901 Seroprevalence of Cytomegalovirus among Pregnant Women in Islamabad, Pakistan

Authors: Hassan Waseem

Abstract:

Cytomegalovirus (CMV) is a ubiquitously distributed viral agent responsible for different clinical manifestations that vary according to the immunologic status of the patient. CMV can cause morbidity and mortality among fetuses and patients with compromised immune systems. A cross-sectional study was carried out in Islamabad to investigate the prevalence of, and risk factors associated with, CMV infection among pregnant women. Blood samples were taken from 172 pregnant women visiting Mother and Child Healthcare, Pakistan Institute of Medical Sciences (PIMS), Islamabad. The serum samples were checked for CMV-specific IgG and IgM antibodies by enzyme-linked immunosorbent assay (ELISA). Clinical, obstetrical, and socio-demographic characteristics of the women were collected using structured questionnaires. Of the 172 pregnant women included in the study, 171 (99.4%) were positive for CMV-specific IgG and 30 (17.4%) were positive for CMV-IgM antibodies. CMV is endemic in Pakistan, so routine screening of pregnant women for CMV is recommended.

Keywords: Cytomegalovirus, blood transfusion, ELISA, seroprevalence

Procedia PDF Downloads 339
1900 Factors Associated with Self-Rated Health among Persons with Disabilities: A Korean National Survey

Authors: Won-Seok Kim, Hyung-Ik Shin

Abstract:

Self-rated health (SRH) is a subjective assessment of individual health and has been identified as a strong predictor of mortality and morbidity. However, few studies have examined the factors associated with SRH in persons with disabilities (PWD). We used data from the 7th Korean national survey of 5307 PWD in 2008. Multiple logistic regression analysis was performed to identify independent risk factors for poor SRH in PWD. Indicators of physical condition (poor instrumental ADL), socioeconomic disadvantage (poor education, economic inactivity, low self-rated social class, medicaid health insurance, presence of unmet need for hospital use), and social participation and networks (no use of internet services) were selected as independent risk factors for poor SRH in the final model. The findings of the present study would be helpful in designing programs to promote health and narrow the gap in health status among the PWD.

Keywords: disabilities, risk factors, self-rated health, socioeconomic disadvantages, social networks

Procedia PDF Downloads 372
1899 Prevalence and Effect of Substance Use and Psychological Co-Morbidities in Medical and Dental Students of a Medical University of Nepal

Authors: Nidesh Sapkota, Garima Pudasaini, Dikshya Agrawal, Binav Baral, Umesh Bhagat, Dharanidhar Baral

Abstract:

Background: Medical and dental students are vulnerable to higher levels of psychological distress than other age-matched peers. Many studies reveal a high prevalence of psychoactive substance use and psychiatric co-morbidities among them. Objectives: -To study the prevalence of substance use among medical and dental students of a medical university. -To study the prevalence of depression and anxiety in medical and dental students of a medical university. Materials and Method: A cross-sectional descriptive study in which simple random sampling was done. A semi-structured questionnaire, AUDIT for alcohol use, the Fagerstrom test for nicotine dependence, the Cannabis Abuse Screening Test (CAST), Beck's Depression Inventory (BDI), and Beck's Anxiety Inventory (BAI) were used for the assessment. Results: The total sample size was 588, with a mean participant age of 22±2 years. The prevalence of alcohol use was 47.75% (281), of whom 32% (90) were harmful users. Among the 19.55% (115) nicotine users, 56.5% (65), 37.4% (43), and 6.1% (7) had low, low-to-moderate, and moderate dependence respectively. The prevalence of cannabis use was 9% (53), with 45.3% (24) having low and 18.9% (10) high addiction. Depressive symptoms were recorded in 25.3% (149), of whom 12.6% (74), 6.5% (38), 5.3% (31), 0.5% (3), and 0.5% (3) had mild, borderline, moderate, severe, and extreme depressive symptoms respectively. Similarly, anxiety was recorded in 7.8% (46) students, 42 with moderate and 4 with severe anxiety symptoms. Among them, 6.3% (37) had suicidal thoughts and 4 (0.7%) had attempted suicide in the last one year. A statistically significant association was noted between harmful alcohol use, depression, and suicide attempts; a similar association was noted between depression and suicide with moderate nicotine use. Conclusion: There is a high prevalence of psychoactive substance use and psychiatric co-morbidities in the studied sample, with a statistically significant association between psychiatric co-morbidities and substance use.

Keywords: alcohol, cannabis, dependence, depression, medical students

Procedia PDF Downloads 445
1898 Mental Well-Being and Quality of Life: A Comparative Study of Male Leather Tannery and Non-Tannery Workers of Kanpur City, India

Authors: Gyan Kashyap, Shri Kant Singh

Abstract:

Improved mental health goes hand in hand with good physical health and quality of life, and mental health plays an important role in anyone's survival. Today, people live with stress due to personal matters, health problems, unemployment, work environment, living environment, substance use, lifestyle, and many other reasons, and many studies confirm that the proportion of people with mental health problems is increasing in India. This study focuses on the mental well-being of male leather tannery workers in Kanpur city, India. Both the work and living environments pose important health risks for these workers, who are exposed to many chemical and physical hazards and to hazardous materials and processes during tanning. The aim of this study is to determine the level of mental health disorder and quality of life among male leather tannery and non-tannery workers in Kanpur city, India. The study utilized primary data from a cross-sectional household survey of tannery and non-tannery workers conducted from January to June 2015, as part of a PhD program, in the Jajmau area of Kanpur city, India. A sample of 286 tannery and 295 non-tannery workers was collected from the study area, comprising workers aged 15-70 who had been working for at least one year at the time of survey. The study used the General Health Questionnaire (GHQ-12) and a work-related stress scale to assess the mental well-being of male tannery and non-tannery workers; polychoric factor analysis was applied to derive the best thresholds and scoring.
Quality of life was measured with Likert-scale items such as 'How would you rate your overall quality of life', along with questions on earnings, education, family size, living conditions, household assets, media exposure, health expenditure, treatment-seeking behavior, and food habits. Results revealed that around one third of tannery workers had severe mental health problems, more than non-tannery workers. Mental health problems showed a statistically significant association with wealth quintile: 56 percent of tannery workers in the medium wealth quintile had severe mental health problems, and 42 percent in the low wealth quintile had moderate mental health problems. The work-related stress scale showed statistically significant results for tannery workers. A large proportion of tannery and non-tannery workers reported being unable to meet their basic needs from their earnings and living in the worst conditions. Importantly, 58% of tannery workers involved in beam-house work had severe mental health problems. The study found a statistically significant association between tannery work and mental health problems among tannery workers.

Keywords: GHQ-12, mental well-being, factor analysis, quality of life, tannery workers

Procedia PDF Downloads 365
1897 Risk Factors for Significant Obstetric Anal Sphincter Injury in a District General Hospital

Authors: A. Wahid Uddin

Abstract:

Obstetric anal sphincter injury carries significant morbidity for a woman and affects quality of life, to the extent of permanent damage to the anal sphincter musculature. The study was undertaken in a district general hospital by retrospectively reviewing 63 random case notes of patients diagnosed with a significant third- or fourth-degree perineal tear admitted between 2015 and 2018. Observations were collected with a pre-designed questionnaire, and all variables were expressed as percentages. The major risk factors noted were nulliparity (37%), instrumental delivery (25%), and birth weight of more than 4 kg (14%). Forceps delivery, with or without episiotomy, was the major contributing factor (75%). In the majority of cases (71%), there was no record of any perineal protection measures being undertaken. The study concluded that recommended perineal protection measures should be adopted as routine practice.

Keywords: forceps, obstetrics, perineal, sphincter

Procedia PDF Downloads 110
1896 A Lightning Strike Mimic: The Abusive Use of Dog Shock Collar Presents as Encephalopathy, Respiratory Arrest, Cardiogenic Shock, Severe Hypernatremia, Rhabdomyolysis, and Multiorgan Injury

Authors: Merrick Lopez, Aashish Abraham, Melissa Egge, Marissa Hood, Jui Shah

Abstract:

A 3-year-old male with unknown medical history presented initially with encephalopathy, was intubated for respiratory failure, and was admitted to the pediatric intensive care unit (PICU) with refractory shock. During resuscitation in the emergency department, he was found to be in severe metabolic acidosis with a pH of 7.03 and was escalated on vasopressor drips for hypotension. His initial sodium was 174. He was noted to have burn injuries to his scalp, forehead, right axilla, bilateral arm creases, and lower legs. He had rhabdomyolysis (initial creatine kinase 5,430 U/L with peak levels of 62,340; normal <335 U/L), cardiac injury (initial troponin 88 ng/L with peak at 145 ng/L; normal <15 ng/L), hypernatremia (peak 174; normal 140), hypocalcemia, liver injury, acute kidney injury, and neuronal loss on magnetic resonance imaging (MRI). Soft restraints and a shock collar were found in the home. He was critically ill for 8 days but was gradually weaned off drips, extubated, and started on feeds. Discussion: Electrical injury, specifically lightning injury, is an uncommon but devastating cause of injury in pediatric patients. This patient, with suspected abusive use of a dog shock collar, presented similarly to a lightning strike. Common entrance points include the hands and head, as in our patient with linear wounds on his forehead. When current enters, it passes through the tissues with the least resistance. Nerves, blood vessels, and muscles have high fluid and electrolyte content and are commonly affected. Exit points are the extremities, as in our child, who had circumferential burns around his arm creases and ankles. Linear burns preferentially follow areas of high sweat concentration and are thought to be due to vaporization of water on the skin's surface. The most common cause of death from a lightning strike is cardiopulmonary arrest: massive depolarization of the myocardium can result in arrhythmias and myocardial necrosis.
The patient presented in cardiogenic shock with evident cardiac damage. Electricity passing through vessels can vaporize intravascular water, which may explain his severe hypernatremia. He also sustained other internal organ injuries (adrenal glands, pancreas, liver, and kidneys). Electrical discharge also causes direct skeletal muscle injury in addition to prolonged muscular spasm. Rhabdomyolysis, the acute damage of muscle, releases potentially toxic components into the circulation, which can lead to acute renal failure; the patient had severe rhabdomyolysis and renal injury. Early hypocalcemia has been consistently demonstrated in patients with rhabdomyolysis; it was present in this patient and led to increased vasopressor needs. Central nervous system injuries are also common and can include encephalopathy, hypoxic injury, and cerebral infarction; the patient had evidence of brain injury on MRI. Conclusion: Electrical injuries due to lightning strikes and abusive use of a dog shock collar are rare, but both can present with respiratory failure, shock, hypernatremia, rhabdomyolysis, brain injury, and multiorgan damage. Although rare, early identification and prompt management of acute and chronic complications in these children are essential.

Keywords: cardiogenic shock, dog shock collar, lightning strike, rhabdomyolysis

Procedia PDF Downloads 65
1895 Improvement of Healthcare Quality and Psychological Stress Relieve for Transition Program in Intensive Care Units

Authors: Ru-Yu Lien, Shih-Hsin Hung, Shu-Fen Lu, Shu-I Chin, Wen-Ju Yang, Wan Ming-Shang, Chien-Ying Wang

Abstract:

Background: Upon recovery from critical condition, patients are normally transferred from the intensive care unit (ICU) to the general wards. However, transfer to a new environment is a stressful experience for both patients and their families, so communication with patients and families is necessary to reduce psychological stress and unplanned returns. Methods: This study was performed in the general ICUs of Taipei Veterans General Hospital from January 1, 2021, to December 31, 2021. Patients evaluated by doctors and liaison nurses for transfer to the general wards were selected as the research subjects and enrolled in the Critical Care Transition Program (CCTP). The program was applied to 40 patients in a study group, while a control group of 40 patients received usual care. The psychological condition of patients was evaluated with a migration stress scale and the hospital anxiety and depression scale, and the rate of return to the ICU was also measured. Results: A total of 63 of 80 patients (78.8%) experienced moderate to severe anxiety, and 42 (52.6%) experienced moderate to severe depression, before transfer. Changes in anxiety and depression after transfer were smaller when the transition program was applied than without it. The return-to-ICU rate in the study group was lower than in the usual-care group, with an adjusted odds ratio of 0.21 (95% confidence interval: 0.05-0.888, P=0.034). Conclusion: Our study found that the transition program could reduce patients' anxiety and depression and the associated stress on their families during transfer out of the ICU. Before transfer, healthcare providers need to assess patients' needs to set care-plan goals and perform patient-centered decision-making with multidisciplinary support.

Keywords: ICU, critical care transition program, healthcare, transition program

Procedia PDF Downloads 52
1894 Molecular Identification and Genotyping of Human Brucella Strains Isolated in Kuwait

Authors: Abu Salim Mustafa

Abstract:

Brucellosis is a zoonotic disease endemic in Kuwait. Human brucellosis can be caused by several Brucella species, with Brucella melitensis causing the most severe and Brucella abortus the least severe disease. Furthermore, relapses are common after successful chemotherapy. The classical biochemical methods of culture and serology provide information about species and serotypes only; to differentiate relapse from reinfection and for epidemiological investigations, identification of genotypes by molecular methods is essential. In this study, four molecular methods [16S rRNA gene sequencing, real-time PCR, enterobacterial repetitive intergenic consensus (ERIC)-PCR, and multilocus variable-number tandem-repeat analysis (MLVA)-16] were evaluated for the identification and typing of 75 strains of Brucella isolated in Kuwait. 16S rRNA gene sequencing suggested that all the strains were B. melitensis, and real-time PCR confirmed their species identity as B. melitensis. The ERIC-PCR band profiles produced a dendrogram of 75 branches, suggesting each strain to be of a unique type. Cluster classification based on ~80% similarity divided the ERIC genotypes into two clusters, A and B. Cluster A consisted of 9 ERIC genotypes (A1-A9) corresponding to 9 individual strains. Cluster B comprised 13 ERIC genotypes (B1-B13), with B5 forming the largest cluster of 51 strains. MLVA-16 identified all isolates as B. melitensis and divided them into 71 MLVA types. Cluster analysis of the MLVA types suggested that most of the strains in Kuwait originated from the East Mediterranean region, a few from the African group, and one new genotype closely matched the West Mediterranean region. In conclusion, this work demonstrates that B. melitensis, the most pathogenic species of Brucella, is prevalent in Kuwait.
Furthermore, MLVA-16 was the most informative molecular method, as it can identify Brucella species and genotypes and determine their origin in a global context. Supported by Kuwait University Research Sector grants MI04/15 and SRUL02/13.
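The dendrogram clustering above works from pairwise similarities between band profiles. The abstract does not state which similarity coefficient was used; the sketch below uses the Dice band-matching coefficient, a common choice for binary ERIC-PCR band profiles, purely as an illustration of how a ~80% similarity cut separates genotypes.

```python
def dice_similarity(a, b):
    """Dice band-matching coefficient between two binary band profiles
    (1 = band present at that position, 0 = absent)."""
    shared = sum(x and y for x, y in zip(a, b))
    return 2 * shared / (sum(a) + sum(b))

# Two hypothetical 4-band profiles sharing 2 of their bands.
s = dice_similarity([1, 1, 0, 1], [1, 1, 1, 0])  # 2*2/(3+3) ≈ 0.67

# At a ~80% similarity threshold, these two profiles would fall into
# different clusters; identical profiles score 1.0 and cluster together.
same_cluster = s >= 0.8
```

In practice, the full pairwise similarity matrix would be fed to a hierarchical clustering routine (e.g. UPGMA) to build the dendrogram, and the tree is cut at the chosen similarity level.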

Keywords: Brucella, ERIC-PCR, MLVA-16, RT-PCR, 16S rRNA gene sequencing

Procedia PDF Downloads 354
1893 Too Well to Die; Too Ill to Live

Authors: Deepak Jugran

Abstract:

The last century witnessed rapid scientific growth and social policies aimed mainly at increasing people's life expectancy. As a result, the aging as well as ailing population is increasing every day. Despite the increase in life expectancy, we have not recorded a compression of morbidity, as the age of onset of most health issues has not increased substantially. In recent years, the prevalence of chronic diseases, along with improved treatment, has also increased the number of people living with chronic diseases. Whereas past social policies focused on increasing life expectancy, social policies and biomedical research are now gradually shifting toward the potential of increasing healthy life, or healthspan. In this article, we review the existing framework of lifespan and healthspan and wish to ignite a discussion among social scientists and public health experts toward a holistic framework that balances the trade-offs between social policies for lifespan and for healthspan.

Keywords: lifespan, healthspan, chronic diseases, social policies

Procedia PDF Downloads 78
1892 Seroprevalence of Toxoplasmosis among Hemato-Oncology Patients in Tertiary Hospital of East Cost Malaysia

Authors: Aisha Khodijah Kholib Jati, Suharni Mohamad, Azlan Husin, Wan Suriana Wan Ab Rahman

Abstract:

Introduction: Toxoplasmosis is caused by an obligate intracellular parasite, Toxoplasma gondii (T. gondii). It is commonly asymptomatic in normal individuals, but it can be fatal to immunocompromised patients, leading to severe complications such as encephalitis, chorioretinitis, and myocarditis. Objective: The aim of the study was to determine the seroprevalence of toxoplasmosis and its association with socio-demographic and behavioral characteristics among hemato-oncology patients in Hospital USM. Methods: In this cross-sectional study, 56 hemato-oncology patients were screened for immunoglobulin M (IgM) antibodies, immunoglobulin G (IgG) antibodies, and IgG avidity of T. gondii using an ELISA kit (BioRad, USA). An anti-T. gondii IgG titer ≥ 9 IU/ml was considered positive, and an IgM ratio ≥ 1.00 was considered reactive. A low avidity index indicates recent infection within 20 weeks, while high avidity indicates past infection. T. gondii exposure and socio-demographic and behavioral characteristics were assessed with a questionnaire and interview. Results: A total of 28 (50.0%) hemato-oncology patients were seropositive for T. gondii antibodies. Of these, 27 (48.21%) were IgG+/IgM- and one patient (1.79%) was IgG+/IgM+ with a high avidity index. Univariate analysis showed that age, gender, ethnicity, marital status, educational level, employment status, stem cell transplant, blood transfusion, close contact with cats, water supply, and consumption of undercooked meat were not significantly associated with Toxoplasma seropositivity. Discussion: The seropositivity rate of anti-T. gondii IgG was high among hemato-oncology patients in Hospital USM. With an impaired immune system, these patients might face severe consequences if the infection reactivates. Therefore, screening for anti-T. gondii may be considered in the future. Moreover, health programmes promoting safe food and good hygiene practices need to be implemented.

Keywords: immunocompromised, seroprevalence, socio-demographic, toxoplasmosis

Procedia PDF Downloads 132
1891 The Use of STIMULAN Resorbable Antibiotic Beads in Conjunction with Autologous Tissue Transfer to Treat Recalcitrant Infections and Osteomyelitis in Diabetic Foot Wounds

Authors: Hayden R Schott, John M Felder III

Abstract:

Introduction: Chronic lower extremity wounds in the diabetic and vasculopathic populations are associated with a high degree of morbidity. When wounds require more extensive treatment than can be offered by wound care centers, more aggressive solutions involve local tissue transfer and microsurgical free tissue transfer to achieve definitive soft tissue coverage. These procedures of autologous tissue transfer (ATT) offer resilient soft tissue coverage of limb-threatening wounds and confer promising limb salvage rates. However, chronic osteomyelitis and recalcitrant soft tissue infections are common in severe diabetic foot wounds and significantly complicate ATT procedures. STIMULAN is a resorbable calcium sulfate antibiotic carrier, and the use of its antibiotic beads to treat chronic osteomyelitis is well established in the orthopedic and plastic surgery literature. In these procedures, the beads are placed beneath the skin flap to deliver antibiotics directly to the infection site. The purpose of this study was to quantify the success of STIMULAN antibiotic beads in treating recalcitrant infections in patients with diabetic foot wounds receiving ATT. Methods: A retrospective review of clinical and demographic information was performed on patients who underwent ATT with placement of STIMULAN antibiotic beads for attempted limb salvage from 2018-21. Patients were analyzed for preoperative wound characteristics, demographics, infection recurrence, and adverse outcomes resulting from product use. The primary endpoint was 90-day infection recurrence; secondary endpoints included 90-day complications. Outcomes were compared using basic statistics and Fisher's exact tests. Results: In this time span, 14 patients were identified. At the time of surgery, all patients exhibited clinical signs of active infection, including positive cultures and erythema.
57% of patients (n=8) had chronic osteomyelitis prior to surgery, and 71% (n=10) had exposed bone at the wound base. STIMULAN beads were placed beneath a free tissue flap in 57% of patients (n=8) and beneath a pedicled tissue flap in 42% (n=6); in all patients, the beads were applied only once. Recurrent infections were observed in 28% of patients (n=4) at 90 days post-op, and flap nonadherence in 7% (n=1); these were the only STIMULAN-related complications observed. Ultimately, lower limb salvage was successful in 85% of patients (n=12). Notably, there was no significant association between the preoperative presence of osteomyelitis and recurrent infections. Conclusions: The use of STIMULAN antibiotic beads to treat recalcitrant infections in patients receiving definitive skin coverage of diabetic foot wounds does not appear to introduce unnecessary risk. Furthermore, the lack of a significant association between preoperative osteomyelitis and recurrent infections indicates successful use of STIMULAN to suppress infection in patients with osteomyelitis, consistent with the literature. Further research with larger cohort and case-control studies is needed to confirm STIMULAN as a significant contributor to infection treatment. Nonetheless, the use of STIMULAN antibiotic beads in patients with diabetic foot wounds demonstrates successful infection suppression and maintenance of definitive soft tissue coverage.

Keywords: wound care, stimulan antibiotic beads, free tissue transfer, plastic surgery, wound, infection

Procedia PDF Downloads 62
1890 Outcomes of Pain Management for Patients in Srinagarind Hospital: Acute Pain Indicator

Authors: Chalermsri Sorasit, Siriporn Mongkhonthawornchai, Darawan Augsornwan, Sudthanom Kamollirt

Abstract:

Background: Although knowledge of pain and pain management is improving, pain management remains inadequate for many patients. The Nursing Division of Srinagarind Hospital is responsible for the pain management system, including work-instruction development and pain management indicators. We developed an information technology program for monitoring pain quality indicators, implemented in all nursing departments in April 2013. Objective: To study the outcomes of acute pain management in terms of process and outcome indicators. Method: This is a retrospective descriptive study. The sample population was patients who had acute pain 24-48 hours after a procedure while admitted to Srinagarind Hospital in 2014. Data were collected from the information technology program; 2709 patients with acute pain from 10 nursing departments were recruited. The research tools were 1) a demographic questionnaire, 2) a pain management questionnaire for process indicators, and 3) a pain management questionnaire for outcome indicators. Data were analyzed and presented as percentages and means. Results: The process indicators show that nurses used a pain assessment tool and recorded it in 99.19% of cases, and pain was reassessed after intervention in 96.09%. 80.15% of the patients received opioids for pain medication, and the most frequently used non-pharmacological intervention was positioning (76.72%). For the outcome indicators, nearly half of the patients (49.90%) had moderate to severe pain; the mean score for worst pain was 6.48 and for overall pain 4.08. Patient satisfaction with pain management was good (49.17%) or very good (46.62%). Conclusion: Nurses used pain assessment tools and pain documentation, meeting the goal of the pain management process, and patient satisfaction with pain management was high. However, patients still had moderate to severe pain.
Nurses should adhere more strictly to acute pain management guidelines, especially when pain intensity is moderate to high, and should develop and practice a non-pharmacological pain management program to continually improve the quality of pain management. The information technology program should include more details about non-pharmacological pain techniques.

Keywords: outcome, pain management, acute pain, Srinagarind Hospital

Procedia PDF Downloads 204
1889 The Efficacy of Pre-Hospital Packed Red Blood Cells in the Treatment of Severe Trauma: A Retrospective, Matched, Cohort Study

Authors: Ryan Adams

Abstract:

Introduction: Major trauma is the leading cause of death in 15-45 year olds and carries significant human, social, and economic costs. Resuscitation is a cornerstone of trauma management, especially in the pre-hospital environment, and packed red blood cells (pRBC) are being used increasingly with the advent of permissive hypotension. The evidence in this area is lacking, and further research is required to determine its efficacy. Aim: The aim of this retrospective, matched cohort study was to determine whether major trauma patients who received pre-hospital pRBC differ in their initial emergency department cardiovascular status from injury-profile-matched controls. Methods: The trauma databases of the Royal Brisbane and Women's Hospital, Royal Children's Hospital (Herston), and Queensland Ambulance Service were accessed, and data on major trauma patients (ISS > 12) who received pre-hospital pRBC from January 2011 to August 2014 were collected. Patients were matched by injury profile against controls who had not received pRBC. The primary outcome was cardiovascular status, defined by the shock index and the Revised Trauma Score (RTS). Results: Data for 25 patients who received pre-hospital pRBC were accessed and their injury profiles matched against suitable controls. On admission to the emergency department, a statistically significant difference in shock index was seen between the groups (Blood = 1.42 vs. Control = 0.97, p-value = 0.0449); the same was not seen with the RTS (Blood = 4.15 vs. Control = 5.56, p-value = 0.291). Discussion: A worsening shock index and revised trauma score were associated with pre-hospital administration of pRBC. However, given the small sample size, limited matching protocol, and associated confounding factors, it is difficult to draw solid conclusions. Further studies with larger patient numbers are required before adequate conclusions can be drawn on the efficacy of pre-hospital packed red blood cell transfusion.
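The shock index used as the primary outcome above has a standard definition, heart rate divided by systolic blood pressure; the vital-sign values in the sketch below are illustrative, not taken from the study's patients.

```python
def shock_index(heart_rate_bpm, systolic_bp_mmhg):
    """Shock index = heart rate / systolic blood pressure.
    Roughly 0.5-0.7 is typical; higher values suggest hemodynamic compromise."""
    return heart_rate_bpm / systolic_bp_mmhg

# Illustrative tachycardic, hypotensive patient: 120 bpm over 85 mmHg
si = shock_index(120, 85)  # ≈ 1.41, in the range reported for the pRBC group
```

Because it needs only two routinely recorded vital signs, the shock index is a convenient summary of cardiovascular status for comparing matched groups on emergency department arrival.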

Keywords: pre-hospital, packed red blood cells, severe trauma, emergency medicine

Procedia PDF Downloads 372
1888 Transcriptome Analysis for Insights into Disease Progression in Dengue Patients

Authors: Abhaydeep Pandey, Shweta Shukla, Saptamita Goswami, Bhaswati Bandyopadhyay, Vishnampettai Ramachandran, Sudhanshu Vrati, Arup Banerjee

Abstract:

Dengue virus infection is now considered one of the most important mosquito-borne infections in humans. The virus is known to promote vascular permeability and cerebral edema, leading to dengue hemorrhagic fever (DHF) or dengue shock syndrome (DSS). Dengue infection has been endemic in India for over two centuries as a benign and self-limited disease. In recent years, however, the disease presentation has changed, manifesting severe secondary complications. Delhi has experienced 12 outbreaks of dengue virus infection since 1997, the most recent reported in 2014-15. Without specific antivirals, the case management of high-risk dengue patients relies entirely on supportive care, involving constant monitoring and timely fluid support to prevent hypovolemic shock. Nonetheless, the diverse clinical spectrum of dengue disease, as well as its initial similarity to other viral febrile illnesses, presents a challenge in the early identification of this high-risk group. WHO recommends the use of warning signs to identify high-risk patients, but warning signs generally appear during, or just one day before, the development of severe illness, thus providing only a narrow window for clinical intervention. The ability to predict which patients may develop DHF or DSS could improve triage and treatment. High-throughput RNA sequencing now allows us to understand disease progression at the genomic level. Here, we collate the results of RNA-sequencing data obtained recently from the PBMCs of different categories of dengue patients from India and discuss the possible role of deregulated genes and the long non-coding RNA NEAT1 in disease progression.

Keywords: long non-coding RNA (lncRNA), dengue, peripheral blood mononuclear cell (PBMC), nuclear enriched abundant transcript 1 (NEAT1), dengue hemorrhagic fever (DHF), dengue shock syndrome (DSS)

Procedia PDF Downloads 285
1887 Prevalence and Patterns of Hearing Loss among the Elderly with Hypertension in Southwest, Nigeria

Authors: Ayo Osisanya, Promise Ebuka Okonkwo

Abstract:

Reduced hearing sensitivity among the elderly has been attributed to several risk factors and to the influence of age-related degenerative conditions such as diabetes, cardiovascular disease, Alzheimer’s disease, bipolar disorder, and hypertension. Hearing loss, especially the age-related type (presbycusis), has been reported as a global burden affecting the general well-being and quality of life of the elderly with hypertension. Hearing loss has been observed to be associated with hypertension and functional decline in the elderly, as the condition leads to poor communication, fatigue, reduced social functioning, mood swings, and withdrawal syndrome. Emerging research outcomes indicate a strong relationship between hypertension and reduced auditory performance among the elderly. Therefore, this study determined the prevalence, types, and patterns of hearing loss associated with hypertension, with a view to suggesting comprehensive management strategies and a model for creating awareness towards promoting healthy living among the elderly in Nigeria. One hundred and seventy-two elderly people with hypertension, aged 65–85, were purposively selected from patients undergoing treatment for hypertension in tertiary hospitals in southwest Nigeria. Participants were subjected to Pure-Tone Audiometry (PTA) using a Maico 53 diagnostic audiometer to determine the degree, types, and patterns of hearing loss among the elderly with hypertension. Results showed that 148 (86.05%) of the elderly with hypertension presented with varying degrees, types, and patterns of hearing loss. Of this number, 123 (83.11%) presented with bilateral hearing loss, while 25 (16.89%) had unilateral hearing loss. By degree of hearing loss, 74 had moderate, 118 moderately severe, and 50 severe hearing loss. 36% of the hearing losses appeared as flat audiometric configurations, 24% were sloping, 19% were rising, and 21% were trough-shaped. The findings showed a high prevalence of hearing loss among the elderly with hypertension in southwest Nigeria. Based on the findings, management of the elderly with hypertension should include regular audiological rehabilitation, total adherence to hearing conservation principles, otological management, regulation of blood pressure, and adequate counselling and follow-up services.

Keywords: auditory performance, elderly, hearing loss, hypertension

Procedia PDF Downloads 272
1886 A Data-Driven Platform for Studying the Liquid Plug Splitting Ratio

Authors: Ehsan Atefi, Michael Grigware

Abstract:

Respiratory failure secondary to surfactant deficiency in respiratory distress syndrome is a major cause of morbidity in preterm infants. Surfactant replacement treatment (SRT) is considered an effective treatment for this disease. Here, we introduce an AI-mediated approach for estimating the distribution of surfactant in the lung airways of a newborn infant during SRT. Our approach uses machine learning to estimate the splitting ratio of a liquid plug at an airway bifurcation for different injection velocities and patient orientations. This technique can be used to calculate the surfactant residue remaining on the airway wall during the injection process. Our model works by minimizing the pressure drop difference between the two airway branches at each generation, subject to mass and momentum conservation. The platform can generate feedback for adjusting the injection velocity and patient orientation in real time during SRT.
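The core idea, choosing the split so the pressure drop is equal across both daughter branches, can be illustrated with a deliberately simplified model: treat each daughter as a Poiseuille resistance plus a hydrostatic head that depends on patient orientation, then solve the resulting 2×2 linear system. This is only a sketch under those assumptions; the abstract's actual model enforces full mass and momentum conservation and adds a machine-learning layer on top.

```python
import math

def poiseuille_resistance(mu, length, radius):
    """Hydraulic resistance of a cylindrical branch under Poiseuille flow."""
    return 8.0 * mu * length / (math.pi * radius ** 4)

def split_ratio(r1, r2, rho=1000.0, g=9.81, dh1=0.0, dh2=0.0, q_total=1.0):
    """Fraction of total flow entering branch 1 when the pressure drops
    (viscous + hydrostatic) in the two daughters are forced to be equal:
        R1*q1 + rho*g*dh1 = R2*q2 + rho*g*dh2,   q1 + q2 = q_total
    dh1, dh2 are the elevation gains along each branch (set by orientation).
    """
    q1 = (r2 * q_total + rho * g * (dh2 - dh1)) / (r1 + r2)
    return q1 / q_total
```

With symmetric branches and a level orientation the plug splits evenly; tilting the patient (raising `dh1`) or narrowing branch 1 (raising `r1`) shifts liquid toward the other daughter, which is the lever the feedback loop described above would exploit.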

Keywords: respiratory failure, surfactant deficiency, surfactant replacement, machine learning

Procedia PDF Downloads 94
1885 Association of the Time in Targeted Blood Glucose Range of 3.9–10 Mmol/L with the Mortality of Critically Ill Patients with or without Diabetes

Authors: Guo Yu, Haoming Ma, Peiru Zhou

Abstract:

BACKGROUND: In addition to hyperglycemia, hypoglycemia, and glycemic variability, a decrease in the time in the targeted blood glucose range (TIR) may be associated with an increased risk of death in critically ill patients. However, the relationship between the TIR and mortality may be influenced by the presence of diabetes and by glycemic variability. METHODS: A total of 998 diabetic and non-diabetic patients with severe diseases in the ICU were selected for this retrospective analysis. The TIR is defined as the percentage of time spent within the target blood glucose range of 3.9–10.0 mmol/L within 24 hours. The relationship between TIR and in-hospital death in diabetic and non-diabetic patients was analyzed, as was the effect of glycemic variability. RESULTS: The binary logistic regression model showed a significant association between the TIR as a continuous variable and the in-hospital death of severely ill non-diabetic patients (OR=0.991, P=0.015). As a classification variable, TIR≥70% was significantly associated with in-hospital death (OR=0.581, P=0.003); specifically, TIR≥70% was a protective factor against in-hospital death in severely ill non-diabetic patients. The TIR of severely ill diabetic patients was not significantly associated with in-hospital death; however, glycemic variability was significantly and independently associated with in-hospital death (OR=1.042, P=0.027). Binary logistic regression analysis of composite indices showed that for non-diabetic patients, the C3 index (low TIR & high CV) was a risk factor for increased mortality (OR=1.642, P<0.001). In addition, for diabetic patients, the C3 index was an independent risk factor for death (OR=1.994, P=0.008), and the C4 index (low TIR & low CV) was independently associated with increased survival.
CONCLUSIONS: The TIR of non-diabetic patients during ICU hospitalization was associated with in-hospital death even after adjusting for disease severity and glycemic variability. There was no significant association between the TIR and mortality of diabetic patients. However, for both diabetic and non-diabetic critically ill patients, the combined effect of high TIR and low CV was significantly associated with ICU mortality. Diabetic patients seem to experience larger blood glucose fluctuations and can tolerate a wider TIR range. Both diabetic and non-diabetic critically ill patients should maintain blood glucose levels within the target range to reduce mortality.
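The two exposure variables in this analysis are directly computable from a glucose time series: TIR as the percentage of readings falling in the 3.9–10.0 mmol/L band, and glycemic variability as the coefficient of variation (CV = SD/mean). A minimal sketch, assuming evenly spaced readings so that the fraction of readings in range approximates the fraction of time in range (function names are ours):

```python
def time_in_range(glucose_mmol, low=3.9, high=10.0):
    """Percentage of readings inside the target band; with evenly
    spaced samples this approximates TIR over the observation window."""
    in_range = sum(low <= g <= high for g in glucose_mmol)
    return 100.0 * in_range / len(glucose_mmol)

def coefficient_of_variation(glucose_mmol):
    """Glycemic variability as CV = sample SD / mean, in percent."""
    n = len(glucose_mmol)
    mean = sum(glucose_mmol) / n
    sd = (sum((g - mean) ** 2 for g in glucose_mmol) / (n - 1)) ** 0.5
    return 100.0 * sd / mean
```

Classifying a patient into the composite indices discussed above then reduces to thresholding these two numbers, e.g. the study's TIR≥70% cut-off combined with a CV cut-off (the abstract does not state the CV threshold, so any choice here would be an assumption).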

Keywords: severe disease, diabetes, blood glucose control, time in targeted blood glucose range, glycemic variability, mortality

Procedia PDF Downloads 197