Search results for: epidemiology of meningitis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 213


33 Challenging Clinical Scenario of Blood Stream Candida Infections – An Indian Experience

Authors: P. Uma Devi, S. Sujith, K. Rahul, T. S. Dipu, V. Anil Kumar, Vidya Menon

Abstract:

Introduction: Candida is an important cause of bloodstream infections (BSIs), causing significant morbidity and mortality. The epidemiology of Candida infection is also changing, mainly in relation to the growing number of episodes caused by non-albicans Candida species. However, in India, the true burden of candidemia is not clear. This study was therefore conducted to evaluate the clinical characteristics, species distribution, antifungal susceptibility and outcome of candidemia at our hospital. Methodology: Between January 2012 and April 2014, adult patients with at least one positive blood culture for Candida species were identified through the microbiology laboratory database (for each patient, only the first episode of candidemia was recorded). Patient data were collected by retrospective chart review of clinical characteristics, including demographic data, risk factors, species distribution, resistance to antifungals and survival. Results: A total of 165 episodes of Candida BSI were identified, with 115 episodes occurring in adult patients. Most of the episodes occurred in males (69.6%). Overall, 82.6% of patients were between 41 and 80 years of age, and the majority (65.2%) were in the intensive care unit at the time of diagnosis. On admission, 26.1% and 18.3% of patients had pneumonia and urinary tract infection, respectively. Most candidemia episodes occurred in the general medicine department (23.5%), followed by gastrointestinal surgery (13.9%) and medical oncology and haematology (13%). Risk factors identified were prior hospitalization within one year (83.5%), antibiotic therapy within the previous month (64.3%), indwelling urinary catheter (63.5%), central venous catheter use (59.1%), diabetes mellitus (53%), severe sepsis (45.2%), mechanical ventilation (43.5%) and surgery (36.5%). C. tropicalis (30.4%) was the leading cause of infection, followed by C. parapsilosis (28.7%) and C. albicans (13%). Other non-albicans species isolated included C. haemulonii (7.8%), C. glabrata (7%), C. famata (4.3%) and C. krusei (1.7%). Susceptibility to fluconazole was 87.9% for C. parapsilosis, 100% for C. tropicalis and 93.3% for C. albicans. Mortality was noted in 51 patients (44.3%): early mortality (within 7 days) in 32 patients and late mortality (between 7 and 30 days) in 19 patients. Conclusion: In recent years, candidemia has become increasingly common in critically ill patients. Comparison with data from our own hospital from 2005 shows a doubling of the incidence. Rapid changes in the rate of infection, potential risk factors, and the emergence of non-albicans Candida demand continued surveillance of this serious BSI. A high index of suspicion and sensitive diagnostics are essential to improve outcomes in resource-limited settings where non-albicans Candida is emerging.

Keywords: antifungal susceptibility, candida albicans, candidemia, non-albicans candida

Procedia PDF Downloads 429
32 Development and Psychometric Validation of the Hospitalised Older Adults Dignity Scale for Measuring Dignity during Acute Hospital Admissions

Authors: Abdul-Ganiyu Fuseini, Bernice Redley, Helen Rawson, Lenore Lay, Debra Kerr

Abstract:

Aim: The study aimed to develop and validate a culturally appropriate patient-reported outcome measure for measuring dignity for older adults during acute hospital admissions. Design: A three-phased mixed-method sequential exploratory design was used. Methods: Concept elicitation and generation of items for the scale was informed by older adults’ perspectives about dignity during acute hospitalization and a literature review. Content validity evaluation and pre-testing were undertaken using standard instrument development techniques. A cross-sectional survey design was conducted involving 270 hospitalized older adults for evaluation of construct and convergent validity, internal consistency reliability, and test–retest reliability of the scale. Analysis was performed using Statistical Package for the Social Sciences, version 25. Reporting of the study was guided by the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) checklist. Results: We established the 15-item Hospitalized Older Adults’ Dignity Scale that has a 5-factor structure: Shared Decision-Making (3 items); Healthcare Professional-Patient Communication (3 items); Patient Autonomy (4 items); Patient Privacy (2 items); and Respectful Care (3 items). Excellent content validity, adequate construct and convergent validity, acceptable internal consistency reliability, and good test-retest reliability were demonstrated. Conclusion: We established the Hospitalized Older Adults Dignity Scale as a valid and reliable scale to measure dignity for older adults during acute hospital admissions. Future studies using confirmatory factor analysis are needed to corroborate the dimensionality of the factor structure and external validity of the scale. Routine use of the scale may provide information that informs the development of strategies to improve dignity-related care in the future. Impact: The development and validation of the Hospitalized Older Adults Dignity Scale will provide healthcare professionals with a feasible and reliable scale for measuring older adults’ dignity during acute hospitalization. Routine use of the scale may enable the capturing and incorporation of older patients’ perspectives about their healthcare experience and provide information that informs the development of strategies to improve dignity-related care in the future.
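The abstract reports internal consistency reliability for the 15-item scale and its five subscales; the statistic conventionally used for this is Cronbach's alpha, although the abstract does not name it explicitly, and the study's analysis was run in SPSS. The sketch below shows the calculation on a small, entirely hypothetical set of item responses, written in Python for brevity.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from 6 patients to a 3-item subscale (1-5 Likert scale).
scores = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
    [3, 4, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```

The same function applies to a full 15-item matrix; values around 0.7 or above are conventionally read as acceptable internal consistency, though the thresholds the study used are not stated in the abstract.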

Keywords: dignity, older adults, hospitalisation, scale, patients, dignified care, acute care

Procedia PDF Downloads 67
31 The Impact of Online Visit Practice by Midwifery Students on Child-Rearing Midwives during The COVID-19 Pandemic: A Qualitative Descriptive Study

Authors: Mari Murakami, Hiromi Kawasaki, Saori Fujimoto, Yoko Ueno

Abstract:

Background: In Japan, one of the goals of midwifery education is to develop the ability to comprehensively support the child-rearing generation in collaboration with professionals from other disciplines. However, in order to prevent the spread of COVID-19, it became extremely difficult to provide face-to-face support for mothers and children. Early in the pandemic, we sought help from three child-rearing midwives as an alternative and attempted online visits. Midwives who are raising children respond to the training both as mothers receiving care and as midwives providing care. We therefore attempted to verify the usefulness of midwives experiencing the training as mothers by clarifying its effects on child-rearing midwives who took part in online visit training conducted by students. Methods: The online visits were conducted in June 2020. The collaborators were three midwives engaged in child-rearing. For the analysis we used, with the collaborators' permission, the written feedback on their questions that they gave to the students during the online visit training, from which a verbatim record was created. Qualitative descriptive analysis was used, and subcategories and categories were extracted. This study was approved by the Ethical Committee for Epidemiology of Hiroshima University. Results: The average age of the three midwives was 36.3 years, with an average of 12.3 years of experience after graduation. Each was raising multiple children (between two and four). Their youngest infants were, on average, 6.7 months old. Five categories emerged: contributing to the development of midwifery students as a senior colleague; the joy of having one's efforts as a child-rearing mother acknowledged; recalling the humility of a beginner through the integrity of the midwifery students; learning opportunities regarding the benefits of online visits; and identifying further challenges for online visits. Conclusion: The online visit training was an opportunity for midwives raising their own children to reinforce an honest and humble approach prompted by the students' attitude, to pursue self-improvement, and to reflect on the practice of midwifery from another person's viewpoint. The midwives also contributed to the education of midwifery students. Furthermore, they supported the use of online visits and considered their advantages and disadvantages from the perspectives of both mothers and midwives. Online visits were seen to empower midwives on childcare leave, as their child-rearing was acknowledged and admired. Online visits by students were considered an opportunity not only to experience fulfillment as a recipient of care but also to think concretely, during childcare leave, about career advancement and about how midwifery training and teaching should be conducted.

Keywords: child-rearing midwife, COVID-19 pandemic, online visit practice, qualitative descriptive study

Procedia PDF Downloads 117
30 Time of Death Determination in Medicolegal Death Investigations

Authors: Michelle Rippy

Abstract:

Medicolegal death investigation has historically received little research attention or advancement, as all of its subjects are deceased. Public health threats, drug epidemics and contagious diseases are typically recognized in decedents first, and thorough, accurate death investigations can assist epidemiological research and prevention programs. One vital component of medicolegal death investigation is determining the decedent's time of death. An accurate time of death can assist in corroborating alibis, determining the sequence of death in multiple-casualty circumstances and providing vital facts in civil matters. Popular television portrays an unrealistic forensic ability to give the exact time of death, to the minute, for someone found deceased with no witnesses present. In reality, the time of death of an unattended decedent can generally only be narrowed to a 4-6 hour window. In the mid- to late-20th century, liver temperature measurement was an invasive procedure used by death investigators to determine the decedent's core temperature. The core temperature was entered into an equation to estimate an approximate time of death. Owing to inconsistencies in thermometer placement and other variables, the accuracy of liver temperatures was called into question, and this once commonplace practice lost scientific support. Currently, medicolegal death investigators rely on three major post-mortem changes at a death scene. Many factors enter into the subjective determination of time of death, including cooling of the body, stiffness of the muscles, internal settling of blood, clothing, ambient temperature, disease and recent exercise. Current research is using non-invasive, hospital-grade tympanic thermometers to measure the temperature in each of the decedent's ears. This tool can be used at the scene and, in conjunction with scene indicators, may provide a more accurate time of death. The research is significant for investigations because it can bring accuracy to a historically imprecise determination, considerably improving criminal and civil death investigations. The goal of the research is to put time of death estimation in unwitnessed deaths on a scientific basis, rather than the art that the determination currently is. The research is in progress, with expected completion in December 2018. There are currently 15 completed case studies with vital information including the ambient temperature; decedent height, weight, sex and age; layers of clothing; position in which the body was found; whether medical intervention occurred; and whether the death was witnessed. These data will be analyzed across the multiple variables studied and will be available for presentation in January 2019.
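The abstract mentions that core temperature was once entered into an equation to approximate the time of death. Purely as an illustration of that kind of calculation, the sketch below uses a textbook rule of thumb sometimes attributed to Glaister (a drop of roughly 1.5 degrees Fahrenheit per hour from an assumed starting temperature of about 98.4 degrees Fahrenheit); the constants are generic approximations and are not the equation, thermometer, or data used in this study.

```python
def estimate_hours_since_death(body_temp_f: float,
                               normal_temp_f: float = 98.4,
                               cooling_rate_f_per_hr: float = 1.5) -> float:
    """Rule-of-thumb estimate: hours elapsed ~= temperature drop / cooling rate.

    Ignores ambient temperature, clothing, body habitus and the other
    confounders listed in the abstract, so at best it brackets a wide window.
    """
    temperature_drop = max(normal_temp_f - body_temp_f, 0.0)
    return temperature_drop / cooling_rate_f_per_hr

# Example: a measured core temperature of 92.4 F suggests roughly 4 hours,
# on the order of the 4-6 hour window the abstract describes.
print(estimate_hours_since_death(92.4))  # -> 4.0
```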

Keywords: algor mortis, forensic pathology, investigations, medicolegal, time of death, tympanic

Procedia PDF Downloads 85
29 Is Obesity Associated with CKD-(unknown) in Sri Lanka? A Protocol for a Cross Sectional Survey

Authors: Thaminda Liyanage, Anuga Liyanage, Chamila Kurukulasuriya, Sidath Bandara

Abstract:

Background: The burden of chronic kidney disease (CKD) is growing rapidly around the world, particularly in Asia. Over the last two decades, Sri Lanka has experienced an epidemic of CKD, with an ever-growing number of patients seeking medical care for CKD and its complications, especially in the "Mahaweli" river basin in the north central region of the island nation. This is apparently a new form of CKD that is not attributable to conventional risk factors such as diabetes mellitus, hypertension or infection, and it is widely termed "CKD-unknown" or "CKDu". In the past decade, a number of small-scale studies were conducted to determine the aetiology, prevalence and complications of CKDu in the North Central region. These hospital-based studies did not provide an accurate estimate of the problem, as merely 10% or less of people with CKD are aware of their diagnosis even in developed countries with better access to medical care. Interestingly, similar observations have been made on the changing epidemiology of obesity in the region, but no formal study has been conducted to date to determine the magnitude of the obesity burden. Moreover, whether increasing obesity in the region is associated with the CKD epidemic is yet to be explored. Methods: We will conduct an area-wide cross-sectional survey among all adult residents of the "Mahaweli" development project area 5, in the North Central Province of Sri Lanka. We will collect relevant medical history, anthropometric measurements, and blood and urine samples for haematological and biochemical analysis. We expect a participation rate of 75%-85% of all eligible participants. Participation in the study is voluntary, and no incentives will be provided for participation. All analyses will be conducted in a central laboratory, and data will be stored securely. We will calculate the prevalence of obesity and chronic kidney disease, overall and by stage, using the total number of participants as the denominator, and report it per 1,000 population. The association between obesity and CKD will be assessed with regression models, adjusted for potential confounding factors and stratified by potential effect modifiers where appropriate. Results: This study will provide accurate information on the prevalence of obesity and CKD in the region. Furthermore, it will explore the association between obesity and CKD, although causation cannot be confirmed. Conclusion: Obesity and CKD are increasingly recognized as major public health problems in Sri Lanka. Clearly, documenting the magnitude of the problem is the essential first step. Our study will provide this vital information, enabling the government to plan a coordinated response to tackle both obesity and CKD in the region.

Keywords: BMI, chronic kidney disease, obesity, Sri Lanka

Procedia PDF Downloads 242
28 Nursing Students’ Learning Effects of Online Visits for Mothers Rearing Infants during the COVID-19 Pandemic

Authors: Saori Fujimoto, Hiromi Kawasaki, Mari Murakami, Yoko Ueno

Abstract:

Background: Coronavirus disease (COVID-19) has been spreading throughout the world. In Japan, many nursing universities have conducted clinical practice online to secure students' learning opportunities. In the field of women's health nursing, online practice will remain worthwhile even after the pandemic ends, given the declining birthrate and the need to reduce the burden on mothers. This study examined the learning effects of online visits conducted by nursing students for mothers with infants during the COVID-19 pandemic, with the aim of enabling students to carry out online practice effectively even in ordinary times. Methods: Students were divided into groups of three, assessed information on the mothers, and planned the visits. After role-play conducted by the students and teachers, an online visit was carried out. The analysis target was the self-evaluation scores of nine students who conducted online visits in June 2020 and consented to participate. The evaluation comprised three items on assessment, two items on planning, one item on ethical consideration, five items on nursing practice, and two items on evaluation. The self-evaluation score ranged from 4 ('can do with a little advice') to 1 ('cannot do with a little advice'). A univariate statistical analysis was performed. This study was approved by the Ethical Committee for Epidemiology of Hiroshima University. Results: The items with the highest mean (standard deviation) scores were 'advocates for the dignity and the rights of mothers' (3.89 (0.31)) and 'communication behavior needed to create a trusting relationship' (3.89 (0.31)). Next were 'individual nursing practice tailored to mothers' (3.78 (0.42)) and 'review own practice and work on own tasks' (3.78 (0.42)). The mean (standard deviation) scores by item type were as follows: the three assessment items, 3.26 (0.70); the two planning items, 3.11 (0.49); the one ethical consideration item, 3.89 (0.31); the five nursing practice items, 3.56 (0.54); and the two evaluation items, 3.67 (0.47). Conclusion: The highest self-evaluations were for 'advocates for the dignity and the rights of mothers' and 'communication behavior needed to create a trusting relationship.' These findings suggest that the students were able to form good relationships with the mothers by communicating effectively and presenting a positive attitude, even when conducting health visits online. However, the self-evaluation scores for assessment and planning were lower than those for ethical consideration, nursing practice, and evaluation. This was most likely due to a lack of opportunities and time to gather information and the need to modify and add to plans in a short amount of time during a single online visit. Methods for conducting online visits need further consideration from the following viewpoints: ways of gathering information and the ability to make changes across multiple visits.

Keywords: infants, learning effects, mothers, online visit practice

Procedia PDF Downloads 108
27 Financial Burden of Occupational Slip and Fall Incidences in Taiwan

Authors: Kai Way Li, Lang Gan

Abstract:

Slips and falls are common in Taiwan. They can result in injuries and even fatalities. Official statistics indicate that more than 15% of all occupational incidents were slip/fall related. All workers in Taiwan are required by law to join the workers' insurance program administered by the Bureau of Labor Insurance (BLI). The BLI is a government agency under the supervision of the Ministry of Labor. Workers file claims with the BLI for insurance compensation when they suffer injuries or fatalities at work. Injury statistics based on workers' compensation claims have rarely been studied. The objective of this study was to quantify the injury statistics and financial cost of slip and fall incidents based on BLI compensation records. BLI compensation records from 2007 to 2013 were retrieved. All the original application forms, approval opinions and compensation decisions were in hardcopy and stored in BLI warehouses. Photocopies of the claims, excluding the personal information of the applicants (or of the victims, if deceased), were obtained. The content of the filing forms was coded in an Excel worksheet for further analysis. Descriptive statistics were used to analyze the data. There were a total of 35,024 slip/fall-related claims, including 82 deaths, 878 disabilities, and 34,064 injuries/illnesses. The average loss for the death cases was 40 months. The total amount paid for these cases was 86,913,195 NTD. For the disability cases, the average loss was 367.36 days, and the total amount paid was roughly 2.6 times that of the death cases (233,324,004 NTD). For the injury/illness cases, the average loss was 58.78 days, and the total amount paid was approximately 13 times that of the death cases (1,134,850,821 NTD). Of the applicants/victims, 52.3% were males. There were more males than females among the death, disability, and injury/illness cases. Most (57.8%) of the female victims were between 45 and 59 years old, whereas most of the male victims (62.6%) were between 25 and 39 years old. Most of the victims were in the manufacturing industry (26.41%), followed by the construction industry (22.20%) and the retail industry (13.69%). For the fatality cases, head injury was the main cause of immediate or eventual death (74.4%). For the disability cases, foot (17.46%) and knee (9.05%) injuries were the leading problems. The compensation claims other than fatality and disability were mainly associated with injuries of the foot (18%), hand (12.87%), knee (10.42%), back (8.83%), and shoulder (6.77%). The slip/fall cases studied indicate that the ratios among the death, disability, and injury/illness counts were 1:10:415, and the ratios of the amounts paid by the BLI for the three categories were 1:2.6:13. These results indicate the significance of slip and fall incidents across different levels of severity, and this information should be incorporated into slip and fall prevention programs in industry.
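As a quick arithmetic check of the payout ratios quoted in the conclusion, the totals reported above can be divided through by the death-case total; the snippet below (Python, for brevity) reproduces the approximate 1 : 2.6 : 13 relationship.

```python
# Totals paid by the BLI for slip/fall claims, as reported in the abstract (NTD).
payouts = {
    "death":      86_913_195,
    "disability": 233_324_004,
    "injury":     1_134_850_821,
}

baseline = payouts["death"]
for category, amount in payouts.items():
    print(f"{category:10s} {amount / baseline:5.1f} x the death-case total")
# -> death 1.0, disability ~2.7, injury ~13.1, in line with the 1 : 2.6 : 13
#    ratio reported (small differences are down to rounding in the abstract).
```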

Keywords: epidemiology, slip and fall, social burden, workers’ compensation

Procedia PDF Downloads 302
26 A Comparison between Five Indices of Overweight and Their Association with Myocardial Infarction and Death, 28-Year Follow-Up of 1000 Middle-Aged Swedish Employed Men

Authors: Lennart Dimberg, Lala Joulha Ian

Abstract:

Introduction: Overweight (BMI 25-30) and obesity (BMI 30+) have consistently been associated with cardiovascular (CV) risk and death since the Framingham heart study in 1948, and BMI was included in the original Framingham risk score (FRS). Background: Myocardial infarction (MI) poses a serious threat to the patient's life. In addition to BMI, several other indices of overweight have been presented and argued to replace FRS as more relevant measures of CV risk. These indices include waist circumference (WC), waist/hip ratio (WHR), sagittal abdominal diameter (SAD), and sagittal abdominal diameter to height (SADHtR). Specific research question: The research question of this study is to evaluate the interrelationship between the various body measurements, BMI, WC, WHR, SAD, and SADHtR, and which measurement is strongly associated with MI and death. Methods: In 1993, 1,000 middle-aged Caucasian, randomly selected working men of the Swedish Volvo-Renault cohort were surveyed at a nurse-led health examination with a questionnaire, EKG, laboratory tests, blood pressure, height, weight, waist, and sagittal abdominal diameter measurements. Outcome data of myocardial infarction over 28 years come from Swedeheart (the Swedish national myocardial infarction registry) and the Swedish death registry. The Aalen-Johansen and Kaplan–Meier methods were used to estimate the cumulative incidences of MI and death. Multiple logistic regression analyses were conducted to compare BMI with the other four body measurements. The risk for the various measures of obesity was calculated with outcomes of accumulated first-time myocardial infarction and death as odds ratios (OR) in quartiles. The ORs between the 4th and the 1st quartile of each measure were calculated to estimate the association between the body measurement variables and the probability of cumulative incidences of myocardial infarction (MI) over time. Double-sided P values below 0.05 will be considered statistically significant. Unadjusted odds ratios were calculated for obesity indicators, MI, and death. Adjustments for age, diabetes, SBP, and the ratio of total cholesterol/HDL-C and blue/white collar status were performed. Results: Out of 1000 people, 959 subjects had full information about the five different body measurements. Of those, 90 participants had a first MI, and 194 persons died. The study showed that there was a high and significant correlation between the five different body measurements, and they were all associated with CVD risk factors. All body measurements were significantly associated with MI, with the highest (OR=3.6) seen for SADHtR and WC. After adjustment, all but SADHtR remained significant with weaker ORs. As for all-cause mortality, WHR (OR=1.7), SAD (OR=1.9), and SADHtR (OR=1.6) were significantly associated, but not WC and BMI. However, after adjustment, only WHR and SAD were significantly associated with death, but with attenuated ORs.
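The abstract describes quartile-based odds ratios from multiple logistic regression (4th versus 1st quartile of each body measurement). The sketch below shows that analysis pattern on simulated data; the variable names, the simulated cohort, and the use of Python/statsmodels (rather than whatever software the authors used) are all assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000

# Simulated stand-in for the cohort: one adiposity index (here called 'sad')
# and a binary myocardial infarction outcome.
df = pd.DataFrame({"sad": rng.normal(22, 4, n)})
logit_p = -2.5 + 0.12 * (df["sad"] - 22)
df["mi"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Quartiles of the body measurement, with Q1 as the reference category.
df["sad_q"] = pd.qcut(df["sad"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

model = smf.logit("mi ~ C(sad_q)", data=df).fit(disp=False)
or_q4_vs_q1 = np.exp(model.params["C(sad_q)[T.Q4]"])
print(f"Unadjusted OR, 4th vs 1st quartile: {or_q4_vs_q1:.2f}")
# Adjusted ORs, as in the study, would add age, diabetes, SBP, the cholesterol
# ratio and collar status as extra terms in the formula.
```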

Keywords: BMI, death, epidemiology, myocardial infarction, risk factor, sagittal abdominal diameter, sagittal abdominal diameter to height, waist circumference, waist-hip ratio

Procedia PDF Downloads 62
25 Incidence and Risk Factors of Traumatic Lumbar Puncture in Newborns in a Tertiary Care Hospital

Authors: Heena Dabas, Anju Paul, Suman Chaurasia, Ramesh Agarwal, M. Jeeva Sankar, Anurag Bajpai, Manju Saksena

Abstract:

Background: Traumatic lumbar puncture (LP) is a common occurrence and causes substantial diagnostic ambiguity. There is a paucity of data regarding its epidemiology. Objective: To assess the incidence and risk factors of traumatic LP in newborns. Design/Methods: In a prospective cohort study, all inborn neonates admitted to the NICU and scheduled to undergo LP for a clinical indication of sepsis were included. Neonates with diagnosed intraventricular hemorrhage (IVH) of grade III or IV were excluded. The LP was done by an operator, often a fellow or resident, assisted by a bedside nurse. The unit has a policy of not routinely using sedation/analgesia during the procedure. LP is performed with a 26 G, 0.5-inch hypodermic needle inserted in the third or fourth lumbar interspace while the infant is in the lateral position. The infants were monitored clinically and by continuous measurement of vital parameters using a multiparameter monitor during the procedure. The occurrence of a traumatic tap, along with CSF parameters and other operator and assistant characteristics, was recorded at the time of the procedure. A traumatic tap was defined as the presence of visible blood or more than 500 red blood cells on microscopic examination. Microscopic trauma was defined as CSF without visible blood but with numerous RBCs. The institutional ethics committee approved the study protocol. Written informed consent was obtained from the parents and the health care providers involved. Neonates were followed up until discharge or death, and the final diagnosis was assigned together with the treating team. Results: A total of 362 (21%) of the 1,726 neonates born at the hospital were admitted during the study period (July 2016 to January 2017). Among these neonates, 97 (26.7%) were suspected of having sepsis. A total of 54 neonates who met the eligibility criteria and whose parents consented to participate were enrolled. The mean (SD) birthweight was 1,536 (732) grams and gestational age 32.0 (4.0) weeks. All LPs were indicated for late-onset sepsis, at a median (IQR) age of 12 (5-39) days. Traumatic LP occurred in 19 neonates (35.1%; 95% CI 22.6% to 49.3%). Frank blood was observed in 7 (36.8%), and in the remaining 12 (63.1%) the CSF showed microscopic trauma. The preliminary risk factor analysis, including birth weight, gestational age, and operator/assistant and other characteristics, did not identify clinically relevant predictors. Conclusion: Neonates requiring lumbar puncture in our study had a high incidence of traumatic taps, and we were not able to identify modifiable risk factors. There is a need to understand the reasons for this and to reduce traumatic taps in order to improve management in NICUs.
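The incidence and confidence interval reported above can be reproduced from the raw counts. The abstract does not say which interval method was used; the sketch below (Python/statsmodels, an assumption for illustration) uses the exact Clopper-Pearson interval, which lands very close to the reported 22.6%-49.3%.

```python
from statsmodels.stats.proportion import proportion_confint

events, n = 19, 54  # traumatic taps among the 54 enrolled neonates (from the abstract)
incidence = events / n
low, high = proportion_confint(events, n, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"incidence {incidence:.1%}, 95% CI {low:.1%} to {high:.1%}")
# -> roughly 35.2%, 22.7% to 49.4%, essentially the 35.1% (22.6%-49.3%) reported
```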

Keywords: incidence, newborn, traumatic, lumbar puncture

Procedia PDF Downloads 271
24 Basal Cell Carcinoma: Epidemiological Analysis of a 5-Year Period in a Brazilian City with a High Level of Solar Radiation

Authors: Maria E. V. Amarante, Carolina L. Cerdeira, Julia V. Cortes, Fiorita G. L. Mundim

Abstract:

Basal cell carcinoma (BCC) is the most prevalent type of skin cancer in humans. It arises from the basal cells of the epidermis and cutaneous appendages. The role of sunlight exposure as a risk factor for BCC is well established, owing to its capacity to induce genetic mutations and its suppressive effect on the skin's immune system. Despite showing low metastasis and mortality rates, the tumor is locally infiltrative, aggressive, and destructive. Considering the high prevalence of this carcinoma and the importance of early detection, a retrospective study was carried out in order to correlate the available clinical data on BCC, characterize it epidemiologically, and thereby enable effective prevention measures for the population. Data on the period from January 2015 to December 2019 were collected from the medical records of patients registered at a pathology service in the southeast region of Brazil, known as SVO, which delivers skin biopsy results. The study aimed to correlate the variables sex, age, and subtype. Data analysis was performed using the chi-square test at a nominal significance level of 5% in order to verify the independence of the variables of interest. Fisher's exact test was applied in cases where the absolute frequency in the cells of the contingency table was less than or equal to five. The statistical analysis was performed using R software. Ninety-three basal cell carcinomas were analyzed. The frequency in the 31- to 45-year-old age group was 5.8 times higher in men than in women, whereas from 46 to 59 years the frequency was 2.4 times higher in women than in men. Between the ages of 46 and 59 years, the sclerodermiform subtype appeared more often than the solid one, with a difference of 7.26 percentage points. Conversely, the solid form appeared more frequently in individuals aged 60 years or more, with a difference of 8.57 percentage points. Among women, the frequency of the solid subtype was 9.93 percentage points higher than that of the sclerodermiform subtype. In men, the same percentage difference was observed, but sclerodermiform was the more prevalent subtype. We conclude that, in general, basal cell carcinoma predominates in females and in individuals aged 60 years and over, consistent with the known tendency of this tumor. In the rarer cases found in younger individuals, however, males predominated. The most prevalent subtype overall was the solid one. It is worth noting that the sclerodermiform subtype, which is more aggressive, was seen more frequently in males and in the 46- to 59-year-old range.
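The analysis described above tests independence in contingency tables, switching from the chi-square test to Fisher's exact test when a cell count is at most five; the study ran this in R. The sketch below shows the same decision rule in Python on an entirely hypothetical table.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 table: BCC subtype (rows) by sex (columns); not the study data.
table = np.array([[18, 25],   # solid:           women, men
                  [25, 18]])  # sclerodermiform: women, men

if (table <= 5).any():            # the abstract's rule: any cell frequency <= 5
    _, p = fisher_exact(table)
    print(f"Fisher's exact test: p = {p:.3f}")
else:
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"Chi-square test: chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```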

Keywords: basal cell carcinoma, epidemiology, sclerodermiform basal cell carcinoma, skin cancer, solar radiation, solid basal cell carcinoma

Procedia PDF Downloads 114
23 Development of Loop-Mediated Isothermal Amplification (LAMP) Assay for the Diagnosis of Ovine Theileriosis

Authors: Muhammad Fiaz Qamar, Uzma Mehreen, Muhammad Arfan Zaman, Kazim Ali

Abstract:

Ovine theileriosis is a worldwide concern, especially in tropical and subtropical areas where ticks are abundant, yet it has received little attention in both developed and developing regions because of the low economic value placed on sheep and the low-to-moderate levels of infection in small ruminant herds. Across Asia, prevalence surveys have been conducted to provide comparable estimates of flock-level and animal-level prevalence of theileriosis. Timely diagnosis and control of theileriosis is a challenge for veterinarians and farmers because of the nature of the organism and the inadequacy of existing control plans. Much of the current work is therefore aimed at developing a technique that is farmer-friendly, inexpensive, and easy to perform in the field. Timely diagnosis of this disease will reduce the irrational use of drugs. A further aim was to determine the prevalence of theileriosis in District Jhang using the conventional method (Giemsa staining), PCR, qPCR, and LAMP. We quantified the molecular epidemiology of T. lestoquardi in sheep from Jhang district, Punjab, Pakistan. In this study, the overall prevalence of theileriosis in sheep was 9.1% (32/350) by the Giemsa staining technique, 13% (48/350) by PCR, 16% (56/350) by qPCR, and 17.1% (60/350) by LAMP. The specificity and sensitivity of LAMP were also calculated in comparison with PCR. More animals tested positive when the diagnosis was made with LAMP, there was little difference between the positive results of PCR and qPCR, and the fewest positive animals were found with the conventional Giemsa staining method. Cross-tabulation against PCR gave a LAMP sensitivity of 94.4% and a specificity of 78%. Advances in science must build on practical ideas that can remove gaps and hurdles in research; LAMP is one such technique and has added considerable value as a biological diagnostic tool, helping in the proper diagnosis and treatment of certain diseases. Other diagnostic methods, such as culture and serological techniques, have exposed workers to considerable risk, whereas molecular diagnostic techniques like LAMP avoid such exposure to pathogens, and a prompt presumptive diagnosis can be made. Compared with LAMP, PCR has several disadvantages: it is relatively expensive, time-consuming, and complicated, while LAMP is relatively cheap, easy to perform, less time-consuming, and more accurate. The LAMP technique has removed hurdles in scientific research and molecular diagnostics, making it accessible to poor and developing countries.
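Sensitivity and specificity here come from cross-tabulating LAMP results against PCR as the comparator. The sketch below shows that calculation in Python on a hypothetical 2x2 table; the counts are invented and will not reproduce the study's 94.4%/78% figures.

```python
import pandas as pd

# Hypothetical LAMP vs PCR cross-tabulation (not the study's raw counts).
crosstab = pd.DataFrame(
    {"PCR_positive": [45, 3], "PCR_negative": [15, 287]},
    index=["LAMP_positive", "LAMP_negative"],
)

tp = crosstab.loc["LAMP_positive", "PCR_positive"]
fn = crosstab.loc["LAMP_negative", "PCR_positive"]
fp = crosstab.loc["LAMP_positive", "PCR_negative"]
tn = crosstab.loc["LAMP_negative", "PCR_negative"]

sensitivity = tp / (tp + fn)   # share of PCR-positives that LAMP also calls positive
specificity = tn / (tn + fp)   # share of PCR-negatives that LAMP also calls negative
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```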

Keywords: distribution, theileria, LAMP, primer sequences, PCR

Procedia PDF Downloads 78
22 Epidemiological Data of Schistosoma haematobium Bilharzia in Rural and Urban Localities in the Republic of Congo

Authors: Jean Akiana, Digne Merveille Nganga Bouanga, Nardiouf Sjelin Nsana, Wilfrid Sapromet Ngoubili, Chyvanelle Ndous Akiridzo, Vishnou Reize Ampiri, Henri-Joseph Parra, Florence Fenollar, Didier Raoult, Oleg Mediannikov, Cheikh Sadhibou Sokhna

Abstract:

Schistosoma haematobium schistosomiasis is an endemic disease for which the level of human exposure, incidence, and attributable mortality unfortunately remain high worldwide. The construction of hydroelectric infrastructure is a major factor in the emergence of this disease. In the Republic of the Congo, which considers industrialization and modernization two essential pillars of development, the construction of the Liouesso hydroelectric dam (19 MW) and feasibility studies for the Chollet (600 MW, in the Sangha), Sounda (1,000 MW, in Kouilou) and Kouembali (150 MW, on the Lefini) dams are necessary to increase the country's energy capacity. Likewise, the urbanization of formerly endemic localities should take into account the persistence of contamination points. However, health impact studies on schistosomiasis epidemiology in general, and urinary bilharzia in particular, have never been carried out in these areas, either before or after the construction of these dams. Participants completed an investigative questionnaire and underwent urinalysis both by dipstick and by microscopic examination of the urine filtrate. Assessment of the genetic diversity of Schistosoma populations was considered, as well as PCR analysis to confirm the dipstick and microscopy results. A total of 405 participants were registered in five localities. The sample was balanced in terms of the male/female ratio, which was around 1. The prevalence rate was 45% (55/123) in Nkayi and 10.4% (11/106) in Loudima; one case, probably imported, was found in Mbomo (West Cuvette), and none in Liouesso or Kabo. The highest oviuria (number of eggs per volume of urine) was 150 S. haematobium eggs/10 ml, in Nkayi, apart from the imported Mbomo case (from Gabon), which had 160 S. haematobium eggs/10 ml. The lowest oviuria was 2 S. haematobium eggs/10 ml. Prevalence rates are still high in semi-urban areas (Nkayi). As praziquantel treatment is available and effective, it is important to step up the mass treatment campaigns in high-risk areas already largely initiated by the National Schistosomiasis Control Program.

Keywords: Bilharzia, Schistosoma haematobium, oviuria, urbanization, Congo

Procedia PDF Downloads 123
21 Mental Health on Three Continents: A Comparison of Mental Health Disorders in the USA, India and Brazil

Authors: Henry Venter, Murali Thyloth, Alceu Casseb

Abstract:

Historically, mental and substance use disorders were not a global health priority. Since the 1993 World Development Report, the contribution of mental health and substance abuse to the global burden of disease has been recognized, with 300 million people worldwide suffering from depression alone. This led to an international effort to improve the mental health of populations around the world. Despite these efforts, some countries remain at the top of the list of countries with the highest rates of mental illness. Important research questions were asked: Would there be commonalities regarding mental health between these countries? Would there be common factors leading to the high prevalence of mental illness? And how prepared are these countries to deliver mental health care? Findings from this research can help organizations and institutions that prepare mental health service providers to focus training and preparation on the specific needs revealed by the study. Methods: The researchers compared three distinctly different countries at the top of the list of countries with the highest rates of mental illness, the USA, India and Brazil, situated on three different continents with different economies and lifestyles. Data were collected using an archival research methodology, reviewing records and findings of international and national health and mental health studies to extract and compare data and findings. Results: The findings indicated that India is the most depressed country in the world, followed by the USA (and China), while Brazil has the greatest number of depressed individuals in Latin America. By 2020, roughly 20% of India, a country of over one billion citizens, will suffer from some form of mental illness, yet there are fewer than 4,000 experts available. In the USA, 164.8 million people were substance abusers, and an estimated 47.6 million adults aged 18 or older had a mental illness in 2018. That means that about one in five adults in the USA experiences some form of mental illness each year, but only 41% of those affected received mental health care or services in the past year. Adults living in the Sao Paulo megacity have a prevalence of mental disorders greater than that found in similar surveys conducted in other areas of the world, with more than one million adults at serious impairment levels. Discussion: The results show that, despite the vast socioeconomic differences between the three countries, there are commonalities regarding mental health prevalence and the difficulty of providing adequate services, including a lack of awareness of how serious mental illness is, stigma around seeking help for mental illness, comorbidity as a common phenomenon, and a lack of partnership between different levels of service providers, which weakens mental health service delivery. The findings also indicate that mental health training institutions face a monumental task in preparing personnel to address future mental health needs in each of the countries compared, which will constitute the next phase of the research.

Keywords: mental health epidemiology, mental health disorder, mental health prevalence, mental health treatment

Procedia PDF Downloads 86
20 Multilevel Regression Model: Evaluating the Relationship between Early Years' Activities of Daily Living and Alzheimer's Disease Onset, Accounting for the Influence of Key Sociodemographic Factors, Using Longitudinal Household Survey Data

Authors: Linyi Fan, C.J. Schumaker

Abstract:

Background: Biomedical efforts to treat Alzheimer's disease (AD) have typically produced mixed to poor results, while more lifestyle-focused treatments such as exercise may fare better than existing biomedical treatments. A few promising studies have indicated that activities of daily living (ADL) may be a useful way of predicting AD. However, the existing cross-sectional studies fail to show how function-related issues such as early years' ADL predict AD, and how social factors influence health either in addition to or in interaction with individual risk factors. This study may help improve screening and early treatment for the elderly population and inform healthcare practice. The findings have academic and practical significance in terms of creating positive social change. Methodology: The purpose of this quantitative, historical, correlational study was to examine the relationship between early years' ADL and the development of AD in later years. The study included 4,526 participants derived from the RAND HRS dataset. The Health and Retirement Study (HRS) is a longitudinal household survey dataset that is available for research on retirement and health among the elderly in the United States. The sample was selected based on completion of the survey questionnaire about AD and dementia. The variable indicating whether the participant had been diagnosed with AD was the dependent variable. The ADL indices and changes in ADL were the independent variables. A four-step multilevel regression modelling approach was used to address the research questions. Results: Among the 4,526 participants who completed the AD and dementia questionnaire, 144 (3.1%) were diagnosed with AD. Of the 4,526 participants, 3,465 (76.6%) had a high school or higher education, and 4,074 (90.0%) were above the poverty threshold. The model evaluated the effect of ADL and change in ADL on the onset of AD in later years while allowing the intercept of the model to vary by level of education. The results suggested that the only significant predictor of the onset of AD was change in early years' ADL (b = 20.253, z = 2.761, p < .05). However, the sensitivity analysis (b = 7.562, z = 1.900, p = .058), which included more control variables and a longer ADL observation period, did not support this finding. The model also estimated whether the random-effect variances varied by Level-2 variables. The variances associated with the random slopes were approximately zero, suggesting that the relationship between early years' ADL and AD onset was not influenced by sociodemographic factors. Conclusion: The findings indicate that an increase in early years' ADL change is associated with an increased probability of AD onset in the future. However, this finding was not supported in the model with the broader observation period. The study also failed to reject the hypothesis that sociodemographic factors explain a significant amount of the random-effect variance. Recommendations were then made for future research and practice based on these limitations and the significance of the findings.
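For readers unfamiliar with the modelling setup, a two-level random-intercept logistic model consistent with the description above (AD onset as the outcome, the intercept varying by education level) can be written as follows; the notation is ours, not the authors'.

```latex
\begin{aligned}
\operatorname{logit} \Pr(\mathrm{AD}_{ij} = 1) &= \beta_0 + \beta_1\,\Delta\mathrm{ADL}_{ij}
  + \boldsymbol{\beta}_2^{\top}\mathbf{x}_{ij} + u_j, \\
u_j &\sim \mathcal{N}(0, \sigma_u^2),
\end{aligned}
```

Here i indexes participants, j indexes education level, ΔADL_ij is the early years' change in ADL, x_ij collects the sociodemographic controls, and u_j is the education-level random intercept. Adding a random slope on ΔADL_ij is what lets the model test whether the ADL effect itself varies across sociodemographic groups, which is the component the results report as having near-zero variance.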

Keywords: alzheimer’s disease, epidemiology, moderation, multilevel modeling

Procedia PDF Downloads 103
19 Valorization of Surveillance Data and Assessment of the Sensitivity of a Surveillance System for an Infectious Disease Using a Capture-Recapture Model

Authors: Jean-Philippe Amat, Timothée Vergne, Aymeric Hans, Bénédicte Ferry, Pascal Hendrikx, Jackie Tapprest, Barbara Dufour, Agnès Leblond

Abstract:

The surveillance of infectious diseases is necessary to describe their occurrence and to help plan, implement and evaluate risk mitigation activities. However, the exact number of detected cases may remain unknown when surveillance is based on serological tests, because identifying seroconversion may be difficult. Moreover, incomplete detection of cases or outbreaks is a recurrent issue in the field of disease surveillance. This study addresses these two issues. Using a viral animal disease as an example (equine viral arteritis), the goals were to establish suitable rules for identifying seroconversion in order to estimate the number of cases and outbreaks detected by a surveillance system in France between 2006 and 2013, and to assess the sensitivity of this system by estimating the total number of outbreaks that occurred during this period (including unreported outbreaks) using a capture-recapture model. Data from horses which exhibited at least one positive serological result by viral neutralization test between 2006 and 2013 were used for analysis (n=1,645). Data consisted of the annual antibody titers and the location of the subjects (towns). A consensus among multidisciplinary experts (specialists in the disease and its laboratory diagnosis, and epidemiologists) was reached to consider seroconversion as a change in antibody titer from negative to at least 32, or as a three-fold or greater increase. The number of seroconversions was counted for each town and modeled using a unilist zero-truncated binomial (ZTB) capture-recapture model with R software. The binomial denominator was the number of horses tested in each infected town. Using the defined rules, 239 cases located in 177 towns (outbreaks) were identified from 2006 to 2013. Subsequently, the sensitivity of the surveillance system was estimated as the ratio of the number of detected outbreaks to the total number of outbreaks that occurred (including unreported outbreaks) estimated using the ZTB model. The total number of outbreaks was estimated at 215 (95% credible interval, CrI95%: 195-249) and the surveillance sensitivity at 82% (CrI95%: 71-91). The rules proposed for identifying seroconversion may serve future research. Such rules, adjusted to the local environment, could conceivably be applied in other countries with surveillance programs dedicated to this disease. More generally, defining ad hoc algorithms for interpreting antibody titers could be useful for other human and animal diseases and zoonoses when accurate information about the serological response in naturally infected subjects is lacking in the literature. This study shows how capture-recapture methods may help to estimate the sensitivity of an imperfect surveillance system and to valorize surveillance data. The sensitivity of the surveillance system for equine viral arteritis is relatively high and supports its relevance for preventing the spread of the disease.
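The study fitted a Bayesian unilist zero-truncated binomial model in R; as a rough frequentist sketch of the same idea, one can estimate a common per-horse "detection" probability from the towns where at least one seroconversion was found, then weight each detected town by its probability of being detected at all. The data, the pooled-probability assumption, and the Horvitz-Thompson-style total below are illustrative simplifications, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Hypothetical detected towns: (seroconversions found, horses tested in that town).
detected = [(1, 4), (2, 6), (1, 3), (3, 10), (1, 5), (2, 8)]
y = np.array([d[0] for d in detected])
n = np.array([d[1] for d in detected])

def neg_loglik(pi: float) -> float:
    """Negative log-likelihood of a zero-truncated binomial over detected towns."""
    log_pmf = binom.logpmf(y, n, pi)
    log_p_detected = np.log1p(-(1 - pi) ** n)   # log P(at least one seroconversion found)
    return -(log_pmf - log_p_detected).sum()

pi_hat = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded").x

# Horvitz-Thompson-style estimate of the total number of infected towns,
# including those where no seroconversion was detected.
p_detect = 1 - (1 - pi_hat) ** n
total_outbreaks = (1 / p_detect).sum()
sensitivity = len(detected) / total_outbreaks
print(f"pi = {pi_hat:.2f}, estimated outbreaks = {total_outbreaks:.1f}, "
      f"surveillance sensitivity = {sensitivity:.0%}")
```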

Keywords: Bayesian inference, capture-recapture, epidemiology, equine viral arteritis, infectious disease, seroconversion, surveillance

Procedia PDF Downloads 265
18 Budget Impact Analysis of a Stratified Treatment Cascade for Hepatitis C Direct Acting Antiviral Treatment in an Asian Middle-Income Country through the Use of Compulsory and Voluntary Licensing Options

Authors: Amirah Azzeri, Fatiha H. Shabaruddin, Scott A. McDonald, Rosmawati Mohamed, Maznah Dahlui

Abstract:

Objective: A scaled-up treatment cascade with direct-acting antiviral (DAA) therapy is necessary to achieve the global WHO targets for hepatitis C virus (HCV) elimination in Malaysia. Recently, limited access to sofosbuvir/daclatasvir (SOF/DAC) has become available through compulsory licensing, with future access to sofosbuvir/velpatasvir (SOF/VEL) expected through voluntary licensing due to recent agreements. SOF/VEL has superior clinical outcomes, particularly for cirrhotic stages, but has higher drug acquisition costs compared to SOF/DAC. It has been proposed that a stratified treatment cascade might be the most cost-efficient approach for Malaysia, whereby all HCV patients are treated with SOF/DAC except for patients with cirrhosis, who are treated with SOF/VEL. This study aimed to conduct a five-year budget impact analysis, from the provider perspective, of the proposed stratified treatment cascade for HCV treatment in Malaysia. Method: A disease progression model developed from model-predicted HCV epidemiology data in Malaysia was used for the analysis. In scenario A, all HCV patients were treated with SOF/DAC at all disease stages, while in scenario B, SOF/DAC was used only for non-cirrhotic patients and SOF/VEL was used for cirrhotic patients. The model projections estimated the annual numbers of patients in care and the numbers of patients to be initiated on DAA treatment nationally. Healthcare costs associated with DAA therapy and disease stage monitoring were included to estimate the downstream cost implications. For scenario B, the estimated treatment uptake of SOF/VEL for cirrhotic patients was 25%, 50%, 75%, 100% and 100% for 2018, 2019, 2020, 2021 and 2022, respectively. Healthcare costs were estimated based on standard clinical pathways for DAA treatment described in recent guidelines. All costs are reported in US dollars (conversion rate US$1 = RM4.09, price year 2018). A scenario analysis was conducted for 5% and 10% reductions in the SOF/VEL acquisition cost anticipated from competitive market pricing of generic DAAs in Malaysia. Results: The stratified treatment cascade with SOF/VEL in scenario B was found to be cost-saving compared to scenario A. A substantial portion of the cost reduction was due to the costs associated with DAA therapy, which resulted in savings of USD 40 thousand (year 1) to USD 443 thousand (year 5) annually, with cumulative savings of USD 1.1 million after 5 years. Cost reductions for disease stage monitoring were seen from year three onwards, resulting in cumulative savings of USD 1.1 thousand. The scenario analysis estimated cumulative savings of USD 1.24 to USD 1.35 million when the acquisition cost of SOF/VEL was reduced. Conclusion: A stratified treatment cascade with SOF/VEL is expected to be cost-saving and can result in a reduction in overall healthcare expenditure in Malaysia compared to treatment with SOF/DAC alone. The better clinical efficacy of SOF/VEL is expected to halt patients' HCV disease progression and may reduce the downstream costs of treating advanced disease stages. The findings of this analysis may be useful to inform healthcare policies for HCV treatment in Malaysia.
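A budget impact analysis of this kind boils down to costing each scenario year by year and differencing the totals. The sketch below shows that mechanics with the uptake schedule quoted above but with entirely placeholder patient counts and per-course costs (chosen, purely for illustration, so that the cirrhotic regimen in the comparator scenario costs more per course); it is not the study's model or its inputs.

```python
# Illustrative budget impact mechanics; every number below is a placeholder.
years = [2018, 2019, 2020, 2021, 2022]
cirrhotic_initiations = [400, 450, 500, 550, 600]   # hypothetical treatment starts
sof_vel_uptake = [0.25, 0.50, 0.75, 1.00, 1.00]     # uptake schedule from the abstract
cost_cirrhotic_comparator = 600    # placeholder per-course cost under scenario A (USD)
cost_sof_vel = 450                 # placeholder per-course SOF/VEL cost (USD)

cumulative_saving = 0.0
for year, patients, uptake in zip(years, cirrhotic_initiations, sof_vel_uptake):
    switched = patients * uptake
    annual_saving = switched * (cost_cirrhotic_comparator - cost_sof_vel)
    cumulative_saving += annual_saving
    print(f"{year}: annual drug-cost saving {annual_saving:,.0f} USD")
print(f"Cumulative saving over five years: {cumulative_saving:,.0f} USD")

# The abstract's scenario analysis (5% and 10% cuts in the SOF/VEL price) amounts
# to re-running the loop with cost_sof_vel * 0.95 or cost_sof_vel * 0.90.
```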

Keywords: Malaysia, direct acting antiviral, compulsory licensing, voluntary licensing

Procedia PDF Downloads 141
17 Carbohydrate Intake and Physical Activity Levels Modify the Association between FTO Gene Variants and Obesity and Type 2 Diabetes: First Nutrigenetics Study in an Asian Indian Population

Authors: K. S. Vimal, D. Bodhini, K. Ramya, N. Lakshmipriya, R. M. Anjana, V. Sudha, J. A. Lovegrove, V. Mohan, V. Radha

Abstract:

Gene-lifestyle interaction studies have been carried out in various populations. However, to date there are no studies in an Asian Indian population. Hence, we examined whether lifestyle factors such as diet and physical activity modify the association between fat mass and obesity–associated (FTO) gene variants and obesity and type 2 diabetes (T2D) in an Asian Indian population. We studied 734 unrelated T2D and 884 normal glucose-tolerant (NGT) participants randomly selected from the Chennai Urban Rural Epidemiology Study (CURES) in Southern India. Obesity was defined according to the World Health Organization Asia Pacific Guidelines (non-obese, BMI < 25 kg/m2; obese, BMI ≥ 25 kg/m2). Six single nucleotide polymorphisms (SNPs) in the FTO gene (rs9940128, rs7193144, rs8050136, rs918031, rs1588413 and rs11076023) identified from recent genome-wide association studies for T2D were genotyped by polymerase chain reaction-restriction fragment length polymorphism and direct sequencing. Dietary assessment was carried out using a validated food frequency questionnaire and physical activity was based upon the self-report. Interaction analyses were performed by including the interaction terms in the model. A joint likelihood ratio test of the main SNP effects and the SNP-diet/physical activity interaction effects was used in the linear regression analyses to maximize statistical power. Statistical analyses were performed using STATA version 13. There was a significant interaction between FTO SNP rs8050136 and carbohydrate energy percentage (Pinteraction=0.04) on obesity, where the ‘A’ allele carriers of the SNP rs8050136 had 2.46 times higher risk of obesity than those with ‘CC’ genotype (P=3.0x10-5) among individuals in the highest tertile of carbohydrate energy percentage. Furthermore, among those who had lower levels of physical activity, the ‘A’ allele carriers of the SNP rs8050136 had 1.89 times higher risk of obesity than those with ‘CC’ genotype (P=4.0x10-5). We also found a borderline interaction between SNP rs11076023 and carbohydrate energy percentage (Pinteraction=0.08) on T2D, where the ‘A’ allele carriers in the highest tertile of carbohydrate energy percentage, had 1.57 times higher risk of T2D than those with ‘TT’ genotype (P=0.002). There was also a significant interaction between SNP rs11076023 and physical activity (Pinteraction=0.03) on T2D. No further significant interactions between SNPs and macronutrient intake or physical activity on obesity and T2D were observed. In conclusion, this is the first study to provide evidence for a gene-diet and gene-physical activity interaction on obesity and T2D in an Asian Indian population. These findings suggest that the association between FTO gene variants and obesity and T2D is influenced by carbohydrate intake and physical activity levels. Greater understanding of how FTO gene influences obesity and T2D through dietary and exercise interventions will advance the development of behavioral intervention and personalised lifestyle strategies predicted to reduce the development of metabolic diseases in ‘A’ allele carriers of both SNPs in this Asian Indian population.
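The analysis above fits logistic models with SNP-by-lifestyle interaction terms and uses a joint likelihood ratio test of the SNP main effect and the interaction (the study used STATA). The sketch below reproduces that pattern in Python on simulated data; the variable names, effect sizes, and cohort are invented for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 1500

# Simulated stand-in data: risk-allele count for an FTO SNP, carbohydrate-energy
# tertile (1 = lowest, 3 = highest) and a binary obesity outcome.
df = pd.DataFrame({
    "risk_allele": rng.binomial(2, 0.35, n),
    "carb_tertile": rng.integers(1, 4, n),
    "age": rng.normal(45, 10, n),
})
lp = (-1.5 + 0.2 * df["risk_allele"] + 0.1 * (df["carb_tertile"] - 2)
      + 0.25 * df["risk_allele"] * (df["carb_tertile"] == 3))
df["obese"] = rng.binomial(1, 1 / (1 + np.exp(-lp)))

full = smf.logit("obese ~ risk_allele * C(carb_tertile) + age", data=df).fit(disp=False)
null = smf.logit("obese ~ C(carb_tertile) + age", data=df).fit(disp=False)

# Joint likelihood ratio test of the SNP main effect plus the SNP x diet interaction.
lr_stat = 2 * (full.llf - null.llf)
df_diff = full.df_model - null.df_model
p_value = chi2.sf(lr_stat, df_diff)
print(f"LR = {lr_stat:.2f}, df = {df_diff:.0f}, p = {p_value:.3g}")
```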

Keywords: dietary intake, FTO, obesity, physical activity, type 2 diabetes, Asian Indian.

Procedia PDF Downloads 504
16 Shocks and Flows - Employing a Difference-In-Difference Setup to Assess How Conflicts and Other Grievances Affect the Gender and Age Composition of Refugee Flows towards Europe

Authors: Christian Bruss, Simona Gamba, Davide Azzolini, Federico Podestà

Abstract:

In this paper, the authors assess the impact of different political and environmental shocks on the size and on the age and gender composition of asylum-related migration flows to Europe. With this paper, the authors contribute to the literature by looking at the impact of different political and environmental shocks on the gender and age composition of migration flows in addition to the size of these flows. Conflicting theories predict different outcomes concerning the relationship between political and environmental shocks and the composition of migration flows. Analyzing the relationship between the causes of migration and the composition of migration flows could yield more insights into the mechanisms behind migration decisions. In addition, this research may contribute to better informing the national authorities in charge of receiving these migrants, as women, children and the elderly require different assistance than young men. To be prepared to offer the correct services, the relevant institutions have to be aware of changes in composition depending on the shock in question. The authors analyze the effect of different types of shocks on the number and on the gender and age composition of first-time asylum seekers originating from 154 sending countries. Among the political shocks, the authors consider violence between combatants, violence against civilians, infringement of political rights and civil liberties, and state terror. Concerning environmental shocks, natural disasters (such as droughts, floods, epidemics, etc.) have been included. The data on asylum seekers applying to any of the 32 Schengen Area countries between 2008 and 2015 are monthly. Data on asylum applications come from Eurostat; data on shocks are retrieved from various sources: georeferenced conflict data come from the Uppsala Conflict Data Program (UCDP), data on natural disasters from the Centre for Research on the Epidemiology of Disasters (CRED), data on civil liberties and political rights from Freedom House, data on state terror from the Political Terror Scale (PTS), GDP and population data from the World Bank, and georeferenced population data from the Socioeconomic Data and Applications Center (SEDAC). The authors adopt a Difference-in-Differences identification strategy, exploiting the different timing of several kinds of shocks across countries. The highly skewed distribution of the dependent variable is taken into account by using count data models; in particular, a Zero-Inflated Negative Binomial model is adopted. Preliminary results show that different shocks, such as armed conflict and epidemics, exert weak immediate effects on asylum-related migration flows and almost non-existent effects on the gender and age composition. However, this result is certainly affected by the fact that no time lags have been introduced so far. Finding the correct time lags depends on a great many variables, not limited to distance alone; therefore, finding the appropriate time lags is still a work in progress. Considering the ongoing refugee crisis, this topic is more important than ever. The authors hope that this research contributes to a less emotionally led debate.
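The estimation strategy combines a difference-in-differences term (shock-affected origin country interacted with post-shock period) with a zero-inflated negative binomial count model. The sketch below illustrates that combination on simulated data using the ZeroInflatedNegativeBinomialP class available in recent statsmodels versions; the panel, the covariates, and the specification details are assumptions for illustration, not the authors' Eurostat-based model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(2)
n = 2000

# Simulated origin-country x month panel (not the Eurostat data): 'shock' marks
# countries hit by a shock, 'post' marks months after it, and the outcome is the
# monthly count of first-time asylum applications.
df = pd.DataFrame({
    "shock": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
    "log_pop": rng.normal(16, 1, n),
})
df["did"] = df["shock"] * df["post"]                      # difference-in-differences term
mu = np.exp(0.5 + 0.4 * df["did"] + 0.1 * df["shock"] + 0.1 * df["post"])
overdispersion = rng.gamma(shape=1.0, scale=1.0, size=n)  # makes the counts NB-like
df["applications"] = rng.poisson(mu * overdispersion) * rng.binomial(1, 0.7, n)

exog = sm.add_constant(df[["shock", "post", "did", "log_pop"]])
exog_infl = sm.add_constant(df[["shock"]])                # inflation (excess-zero) part
model = ZeroInflatedNegativeBinomialP(df["applications"], exog, exog_infl=exog_infl)
result = model.fit(method="bfgs", maxiter=1000, disp=False)
print(result.params["did"])   # DiD effect on the log of expected applications
```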

Keywords: age, asylum, Europe, forced migration, gender

Procedia PDF Downloads 232
15 Biomaterials Solutions to Medical Problems: A Technical Review

Authors: Ashish Thakur

Abstract:

This technical paper focuses on biomaterials and their various applications in modern industries. The author covers not only medical uses but, in fact, a wide range of applications in other industries. The scope of the research area covers the wide range of physical, biological and chemical sciences that underpin the design of biomaterials and the clinical disciplines in which they are used. A biomaterial is now defined as a substance that has been engineered to take a form which, alone or as part of a complex system, is used to direct, by control of interactions with components of living systems, the course of any therapeutic or diagnostic procedure. Biomaterials are invariably in contact with living tissues. Thus, interactions between the surface of a synthetic material and the biological environment must be well understood. This paper reviews the benefits and challenges associated with surface modification of metals in biomedical applications. The paper also elaborates how the surface characteristics of metallic biomaterials, such as surface chemistry, topography, surface charge, and wettability, influence protein adsorption and subsequent cell behavior in terms of adhesion, proliferation, and differentiation at the biomaterial–tissue interface. The paper also highlights various techniques required for surface modification and coating of metallic biomaterials, including physicochemical and biochemical surface treatments and calcium phosphate and oxide coatings. In this review, attention is focused on biomaterial-associated infections, from which the need for anti-infective biomaterials originates. Biomaterial-associated infections differ markedly in epidemiology, aetiology and severity, depending mainly on the anatomic site, on the time of biomaterial application, and on the depth of the tissues harbouring the prosthesis. Here, the diversity and complexity of the different scenarios where medical devices are currently utilised are explored, providing an overview of the emblematic applicative fields and of the requirements for anti-infective biomaterials. In addition, the paper introduces nanomedicine and the use of both natural and synthetic polymeric biomaterials, focuses on specific current polymeric nanomedicine applications and research, and concludes with the challenges of nanomedicine research. Infection is currently regarded as the most severe and devastating complication associated with the use of biomaterials. Osteoporosis is a worldwide disease with a very high prevalence in humans older than 50. Its main clinical consequences are bone fractures, which often lead to patient disability or even death. A number of commercial biomaterials are currently used to treat osteoporotic bone fractures, but most of these have not been specifically designed for that purpose. Many drug- or cell-loaded biomaterials have been proposed in research laboratories, but very few have received approval for commercial use. Polymeric nanomaterial-based therapeutics play a key role in the field of medicine in treatment areas such as drug delivery, tissue engineering, cancer, diabetes, and neurodegenerative diseases. Advantages in the use of polymers over other materials for nanomedicine include increased functionality, design flexibility, improved processability, and, in some cases, biocompatibility.

Keywords: nanomedicine, tissue, infections, biomaterials

Procedia PDF Downloads 238
14 Epidemiology of Healthcare-Associated Infections among Hematology/Oncology Patients: Results of a Prospective Incidence Survey in a Tunisian University Hospital

Authors: Ezzi Olfa, Bouafia Nabiha, Ammar Asma, Ben Cheikh Asma, Mahjoub Mohamed, Bannour Wadiaa, Achour Bechir, Khelif Abderrahim, Njah Mansour

Abstract:

Background: In hematology/oncology, health care improvement has allowed increasingly aggressive management in diagnostic and therapeutic procedures. Nevertheless, these intensified procedures have been associated with a higher risk of healthcare-associated infections (HAIs). We undertook this study to estimate the burden of HAIs in cancer patients in an onco-hematology unit in a Tunisian university hospital. Materials/Methods: A prospective, observational study, based on active surveillance for a period of six months from March through September 2016, was undertaken in the department of onco-hematology in a university hospital in Tunisia. Patients who stayed in the unit for ≥ 48 h were followed until hospital discharge. The Centers for Disease Control and Prevention (CDC) criteria for site-specific infections were used as standard definitions for HAIs. Results: One hundred fifty patients were included in the study. The gender distribution was 33.3% female and 66.6% male. They had a mean age of 23.12 years (SD = 18.36 years). The main diagnosis was Acute Lymphoblastic Leukemia (ALL): 48.7% (n=73). The mean length of stay was 21 days +/- 18 days. Almost 8% of patients had an implantable port (n=12), 34.9% (n=52) had a lumbar puncture and 42.7% (n=64) had a medullary puncture. Chemotherapy was instituted in 88% of patients (n=132). Eighty (53.3%) patients had neutropenia at admission. The incidence rate of HAIs was 32.66% per patient; the incidence density was 15.73 per 1000 patient-days in the unit. The mortality rate was 9.3% (n=14), and 50% of the deaths were caused by HAIs. The most frequent episodes of infection were: infection of skin and superficial mucosa (5.3%), pulmonary aspergillosis (4.6%), healthcare-associated pneumonia (HAP) (4%), central venous catheter-associated infection (4%), digestive infection (5%), and primary bloodstream infection (2.6%). Finally, the incidence rate of fever of unknown origin (FUO) was 14%. In the case of skin and superficial infections (n=8), 4 episodes were documented, and the organisms implicated were Escherichia coli, Geotrichum capitatum and Proteus mirabilis. For pulmonary aspergillosis, 6 cases were diagnosed clinically and radiologically, and one was proven by a positive Aspergillus antigen in bronchial aspiration. Only one patient died due to this infection. For HAP (6 cases), four episodes were diagnosed clinically and radiologically. No bacterial etiology was established in these cases. Two patients died due to HAP. For primary bloodstream infection (4 cases), the implicated organisms were Enterobacter cloacae, Geotrichum capitatum, Klebsiella pneumoniae, and Streptococcus pneumoniae. Conclusion: This type of prospective study is an indispensable tool for internal quality control. It is necessary to evaluate preventive measures and to design control guides and strategies aimed at reducing the HAI rate and the morbidity and mortality associated with infection in a hematology/oncology unit.
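The two incidence measures reported above can be reproduced directly from the counts in the abstract; a minimal sketch follows, in which the episode count and the patient-day total are illustrative values back-calculated from the reported rates rather than the study's raw data.

```python
# Cumulative incidence (episodes per 100 patients) and incidence density
# (episodes per 1,000 patient-days) for an HAI survey. The inputs below are
# assumed values consistent with the reported 32.66% and ~15.7/1,000 figures.
def cumulative_incidence(episodes: int, patients: int) -> float:
    """HAI episodes per 100 patients followed."""
    return 100.0 * episodes / patients

def incidence_density(episodes: int, patient_days: float) -> float:
    """HAI episodes per 1,000 patient-days at risk."""
    return 1000.0 * episodes / patient_days

episodes = 49              # assumed: ~32.66% of 150 patients
patients = 150
patient_days = 150 * 21    # assumed: mean stay of 21 days per patient

print(f"cumulative incidence: {cumulative_incidence(episodes, patients):.2f} per 100 patients")
print(f"incidence density:    {incidence_density(episodes, patient_days):.2f} per 1,000 patient-days")
```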

Keywords: cohort prospective studies, healthcare associated infections, hematology oncology department, incidence

Procedia PDF Downloads 354
13 The Effect of Intimate Partner Violence Prevention Program on Knowledge and Attitude of Victims

Authors: Marzieh Nojomi, Azadeh Mottaghi, Arghavan Haj-Sheykholeslami, Narjes Khalili, Arash Tehrani Banihashemi

Abstract:

Background and objectives: Domestic violence is a global problem with severe consequences throughout the life of the victims. Iran's Ministry of Health has launched an intimate partner violence (IPV) prevention program, integrated into the primary health care services since 2016. The present study is a part of this national program's evaluation. In this section, we aimed to examine spousal abuse victims' knowledge and attitude towards domestic violence before and after receiving these services. Methods: To assess the knowledge and attitudes of victims, a questionnaire designed by Ahmadzad and colleagues in 2013 was used. This questionnaire includes 15 questions regarding knowledge in the fields of definition, epidemiology, effects on children, outcomes, and prevention of domestic violence. To assess attitudes, this questionnaire has 10 questions regarding attitudes toward the causes, effects, and legal or protective support services of domestic violence. To assess satisfaction and the effect of the program on the prevention or reduction of spousal violence episodes, two more questions were also added. Since the prevalence of domestic violence differs in different parts of the country, we chose nine areas with the highest, the lowest, and moderate prevalence of IPV for the study. The link to the final electronic version of the questionnaire was sent to randomly selected public rural or urban health centers in the nine chosen areas. Since the study had to be completed in one month, we used newly identified victims as the pre-intervention group and people who had received at least one related service from the program (such as psychiatric consultation, education about safety measures, supporting organizations, etc.) during the previous year as our post-intervention group. Results: A hundred and ninety-two newly identified IPV victims and 267 victims who had received at least one related program service during the previous year entered the study. All of the victims were female. Basic characteristics of the two groups, including age, education, occupation, addiction, spouses' age, spouses' addiction, duration of the current marriage, and number of children, were not statistically different. In the knowledge questions, the post-intervention group had statistically better scores in the fields of domestic violence outcomes and its effects on children; however, in the remaining areas, the scores of both groups were similar. The only significant difference in attitude across the two groups was in the field of legal or protective support services. Of the 267 women who had ever received a service from the program, 91.8% were satisfied with the services, and 74% reported a decrease in the number of violent episodes. Conclusion: The national IPV prevention program integrated into the primary health care services in Iran is effective in improving the knowledge of victims about domestic violence outcomes and its effects on children. Improving the attitude and knowledge of domestic violence victims about its causes and preventive measures needs more effective interventions. This program can reduce the number of IPV episodes between spouses, and satisfaction among the service users is high.

Keywords: intimate partner violence, assessment, health services, efficacy

Procedia PDF Downloads 110
12 Early Biological Effects in Schoolchildren Living in an Area of Salento (Italy) with High Incidence of Chronic Respiratory Diseases: The IMP.AIR. Study

Authors: Alessandra Panico, Francesco Bagordo, Tiziana Grassi, Adele Idolo, Marcello Guido, Francesca Serio, Mattia De Giorgi, Antonella De Donno

Abstract:

In the Province of Lecce (Southeastern Italy), an area with an unusually high incidence of chronic respiratory diseases, including lung cancer, was recently identified. The causes of this health emergency are still not entirely clear. In order to determine the risk profile of children living in five municipalities included in this area, an epidemiological-molecular study was performed in the years 2014-2016: the IMP.AIR. (Impact of air quality on health of residents in the Municipalities of Sternatia, Galatina, Cutrofiano, Sogliano Cavour and Soleto) study. 122 children aged 6-8 years attending primary school in the study area were enrolled to evaluate the frequency of micronuclei (MNs) in their buccal exfoliated cells. The samples were collected in May 2015 by rubbing the oral mucosa with a soft-bristle disposable toothbrush. At the same time, a validated questionnaire was administered to parents to obtain information about the health, lifestyle and eating habits of the children. In addition, information on airborne pollutants, routinely detected by the Regional Environmental Agency (ARPA Puglia) in the study area, was acquired. A multivariate analysis was performed to detect any significant association between the frequency of MNs (dependent variable) and behavioral factors (independent variables). The presence of MNs was highlighted in the buccal exfoliated cells of about 42% of the recruited children, with a mean frequency of 0.49 MN/1000 cells, greater than in other areas of Salento. The survey on individual characteristics and lifestyles showed that one in three children was overweight and that most of them had unhealthy eating habits, with frequent consumption of foods considered ‘risky’. Moreover, many parents (40% of fathers and 12% of mothers) were smokers, and about 20% of them admitted to smoking in the house where the children lived. Information regarding atmospheric contaminants was poor. Of the few substances routinely detected by the single monitoring station located in the study area (PM2.5, SO2, NO2, CO, O3), only ozone showed high concentrations, exceeding the legal limits 67 times in 2015. The study showed that the level of early biological effect markers in children was not negligible. This critical condition could be related to some individual factors and lifestyles such as overweight, unhealthy eating habits and exposure to passive smoking. At present, no relationship with airborne pollutants can be established due to the lack of information on many substances. Therefore, it would be advisable to modify incorrect behaviors and to intensify the monitoring of airborne pollutants (e.g. including detection of PM10, heavy metals, polycyclic aromatic hydrocarbons and benzene), given the epidemiology of chronic respiratory diseases registered in this area.
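The multivariate analysis of MN frequency against behavioural factors could, for instance, be set up as a count regression with the number of scored cells as an offset; the sketch below is only one plausible specification, and the file and column names are hypothetical.

```python
# Sketch of a multivariate model relating micronucleus (MN) counts to
# behavioural factors, assuming one row per child with the number of MN
# observed and the number of cells scored. All names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

children = pd.read_csv("impair_children.csv")   # hypothetical input file

# A Poisson model with log(cells scored) as offset models MN per cell,
# so exponentiated coefficients are frequency ratios for each factor.
model = smf.glm(
    "mn_count ~ overweight + risky_foods + passive_smoking",
    data=children,
    family=sm.families.Poisson(),
    offset=np.log(children["cells_scored"]),
)
result = model.fit()
print(result.summary())
print(np.exp(result.params))   # MN frequency ratios per behavioural factor
```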

Keywords: chronic respiratory diseases, environmental pollution, lifestyle, micronuclei

Procedia PDF Downloads 180
11 A Genetic Identification of Candida Species Causing Intravenous Catheter-Associated Candidemia in Heart Failure Patients

Authors: Seyed Reza Aghili, Tahereh Shokohi, Shirin Sadat Hashemi Fesharaki, Mohammad Ali Boroumand, Bahar Salmanian

Abstract:

Introduction: Intravenous catheter-associated fungal infections are nosocomial infections that continue to be a serious problem among hospitalized patients, decreasing quality of life and adding healthcare costs. The role of catheters in the spread of candidemia in heart failure patients is clear. The aim of this study was to evaluate the prevalence of, and genetically identify, Candida species in patients with heart disorders. Material and Methods: This study was conducted at the Tehran Hospital of Cardiology Center (Tehran, Iran, 2014) over 1.5 years on patients hospitalized for at least 7 days who had a central or peripheral vein catheter. Cultures of catheters, blood and skin at the catheter insertion site were used to detect Candida colonies in 223 patients. Identification of Candida species was made on the basis of a combination of various phenotypic methods and confirmed by sequencing the ITS1-5.8S-ITS2 region amplified from genomic DNA using PCR and the NCBI BLAST. Results: Of the 223 patient samples tested, we identified a total of 15 Candida isolates obtained from 9 (4.04%) catheter cultures, 3 (1.35%) blood cultures and 2 (0.90%) skin cultures of the catheter insertion areas. On the basis of ITS region sequencing, of the nine Candida isolates from catheters, 5 (55.6%) C. albicans, 2 (22.2%) C. glabrata, 1 (11.1%) C. membranifaciens and 1 (11.1%) C. tropicalis were identified. Among the three Candida isolates from blood cultures, C. tropicalis, C. carpophila and C. membranifaciens were identified. The non-Candida yeast isolated from one blood culture was Cryptococcus albidus. One case of C. glabrata and one case of C. albicans were isolated from skin cultures of the catheter insertion areas in patients with positive catheter cultures. In these patients, the ITS region of the rDNA sequence showed similarity between the Candida isolated from the skin and the catheter. However, the blood samples of these patients were negative for fungal growth. We report two cases of catheter-related candidemia caused by C. membranifaciens and C. tropicalis, on the basis of the genetic similarity of the species isolated from blood and catheter, which were treated successfully with intravenous fluconazole and catheter removal. Using phenotypic identification methods, we could only identify C. albicans and C. tropicalis; the other yeast isolates were reported as Candida sp. Discussion: Although more than 200 species of Candida have been identified, only a few cause disease in humans. There is some evidence that non-albicans infections are increasing. Many risk factors, including prior antibiotic therapy, use of a central venous catheter, surgery, and parenteral nutrition, are considered to be associated with candidemia in hospitalized heart failure patients. Identifying the route of infection in candidemia is difficult. Non-albicans Candida species as causes of candidemia are increasing dramatically. Using conventional methods, many non-albicans isolates remain unidentified. Therefore, using more sensitive and specific molecular genetic sequencing to clarify the epidemiology of infections by these less familiar Candida species is essential. The positive blood and catheter cultures for Candida isolates and the high percentage of similarity of their ITS rDNA sequences in these two patients confirmed the diagnosis of intravenous catheter-associated candidemia.
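Species confirmation from ITS1-5.8S-ITS2 sequences against NCBI, as described above, could be scripted along the following lines; the Biopython remote-BLAST call and the FASTA file name are illustrative assumptions, not the authors' pipeline.

```python
# Illustrative identification of a yeast isolate from its ITS1-5.8S-ITS2
# Sanger sequence by remote BLAST against the NCBI nt database.
from Bio import SeqIO
from Bio.Blast import NCBIWWW, NCBIXML

record = SeqIO.read("isolate_ITS.fasta", "fasta")     # hypothetical ITS sequence
handle = NCBIWWW.qblast("blastn", "nt", record.seq)   # remote BLAST (can be slow)
blast = NCBIXML.read(handle)

# Report the top hits; a species call would typically require high identity
# (e.g. >= 98-99%) over most of the ITS amplicon.
for aln in blast.alignments[:5]:
    hsp = aln.hsps[0]
    identity = 100.0 * hsp.identities / hsp.align_length
    print(f"{aln.title[:60]:60s}  identity={identity:.1f}%  E={hsp.expect:.1e}")
```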

Keywords: catheter-associated infections, heart failure patient, molecular genetic sequencing, ITS region of rDNA, Candidemia

Procedia PDF Downloads 301
10 The Pigeon Circovirus Evolution and Epidemiology under Conditions of One Loft Race Rearing System: The Preliminary Results

Authors: Tomasz Stenzel, Daria Dziewulska, Ewa Łukaszuk, Joy Custer, Simona Kraberger, Arvind Varsani

Abstract:

Viral diseases, especially those leading to impairment of the immune system, are among the most important problems in avian pathology. However, not much data are available on this subject for bird species other than commercial poultry. Recently, increasing attention has been paid to racing pigeons, which have been selectively bred for many years for their ability to return to their place of origin. Currently, these birds are used for races at distances from 100 to 1000 km, and winning pigeons are highly valuable. The rearing system of racing pigeons contradicts the principles of biosecurity, as birds originating from various breeding facilities are commonly transported and reared in “One Loft Race” (OLR) facilities. This favors the spread of multiple infections and provides conditions for the development of novel variants of various pathogens through recombination. One of the most significant viruses occurring in this avian species is the pigeon circovirus (PiCV), which is detected in ca. 70% of pigeons. Circoviruses are characterized by vast genetic diversity, which is due, among other things, to recombination: an exchange of fragments of genetic material among various strains of the virus during infection of a single host. The rate and intensity of the development of PiCV recombinants have not been determined so far. For this reason, an experiment was performed to investigate the frequency of development of novel PiCV recombinants in racing pigeons kept in OLR-type conditions. 15 racing pigeons originating from 5 different breeding facilities, subclinically infected with various PiCV strains, were housed in one room for eight weeks, which was intended to mimic the conditions of OLR rearing. Blood and swab samples were collected from the birds every seven days to recover complete PiCV genomes, which were amplified through Rolling Circle Amplification (RCA), cloned, sequenced, and subjected to bioinformatic analyses aimed at determining the genetic diversity and the dynamics of recombination among the viruses. In addition, the virus shedding rate, level of viremia, expression of IFN-γ and interferon-related genes, and anti-PiCV antibodies were determined to enable a complete analysis of the course of infection in the flock. Initial results showed that 336 full PiCV genomes were obtained, exhibiting nucleotide similarity ranging from 86.6% to 100%, and 8 of those were recombinants originating from viruses from different lofts of origin. The first recombinant appeared after seven days of the experiment, but most of the recombinants appeared after 14 and 21 days of joint housing. The level of viremia and virus shedding was highest in the second week of the experiment and gradually decreased towards the end of the experiment, which partially corresponded with Mx1 gene expression and antibody dynamics. The results show that the OLR pigeon-rearing system could play a significant role in spreading infectious agents such as circoviruses and in contributing to PiCV evolution through recombination. Therefore, it is worth considering whether a popular gambling game such as pigeon racing is sensible from both an animal welfare and an epidemiological point of view.
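The pairwise nucleotide similarity range quoted above (86.6-100%) can be summarised from a multiple alignment of the recovered genomes; the sketch below only reproduces that summary, with a hypothetical alignment file name, while recombination detection itself would normally rely on dedicated tools such as RDP.

```python
# Pairwise nucleotide identity summary across aligned PiCV genomes.
# Assumes the genomes are already in one multiple sequence alignment.
from itertools import combinations
from Bio import AlignIO

aln = AlignIO.read("picv_genomes_aligned.fasta", "fasta")   # hypothetical file

def pairwise_identity(a: str, b: str) -> float:
    """Percent identity over columns where neither sequence has a gap."""
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    matches = sum(1 for x, y in pairs if x.upper() == y.upper())
    return 100.0 * matches / len(pairs)

identities = [pairwise_identity(str(r1.seq), str(r2.seq))
              for r1, r2 in combinations(aln, 2)]
print(f"pairwise identity range: {min(identities):.1f}% - {max(identities):.1f}%")
```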

Keywords: pigeon circovirus, recombination, evolution, one loft race

Procedia PDF Downloads 47
9 Phenotypical and Molecular Characterization of Burkholderia mallei from Horses with Glanders: Preliminary Data

Authors: A. F. C. Nassar, D. K. Tessler, L. Okuda, C. Del Fava, D. P. Chiebao, A. H. C. N. Romaldini, A. P. Alvim, M. J. Sanchez-Vazquez, M. S. Rosa, J. C. Pompei, R. Harakava, M. C. S. Araujo, G. H. F. Marques, E. M. Pituco

Abstract:

Glanders is a zoonotic disease of Equidae caused by the bacterium Burkholderia mallei, presenting in acute or chronic clinical forms with inflammatory nodules in the respiratory tract, lymphangitis and caseous lymph nodes. There is no veterinary drug treatment for this life-threatening disease; thus, its occurrence must be notified to official animal health services and any infected animal must be eliminated. This study aims to detect B. mallei from horses euthanized in outbreaks of glanders in Brazil, providing a better understanding of the bacterial characteristics and determining a proper protocol for isolation. The work was carried out with the collaboration of the Ministry of Agriculture and the Sao Paulo State Animal Health Department, while its procedures were approved by the Committee of Ethics in Animal Experimentation of the Instituto Biologico (CETEA n°156/2017). To date, 16 horses from farms with outbreaks of glanders detected by the complement fixation test (CFT) serology method have been analyzed. During necropsy, samples of possibly affected organs (lymph nodes, lungs, heart, liver, spleen, kidneys and trachea) were collected for bacterial isolation, molecular tests and pathology. Isolation was performed using two enriched media: a potato infusion agar with 5% sheep blood, 4% glycerol and antibiotic (penicillin 100 U/mL), and another with the same ingredients except the antibiotic. A PCR protocol was modified for this study using primers designed to identify a region of the fliP gene of B. mallei. Through isolation, 12.5% (2/16) of animals were confirmed positive, using only the enriched medium with antibiotic and confirmed by PCR: from the mediastinal and submandibular lymph nodes and lungs in one animal, and from the mediastinal lymph node in the other. Detection of the bacterium using PCR showed positivity in 100% (16/16) of horses across 144 organ samples. Macroscopic lesions observed at pathology were catarrhal nasal discharge, fetlock ulcers, emaciation, lymphangitis in the limbs, suppurative lymphangitis, lymph node enlargement, star-shaped liver and spleen scars, adherence of the renal capsule, pulmonary hemorrhage, and miliary nodules. Microscopic lesions were suppurative bronchopneumonia with microabscesses and Langhans giant cells in the lungs; lymph nodes with abscesses and an intense lymphoid reaction; and hemosiderosis and abscesses in the spleen. PCR-positive samples will be sequenced later and analyzed by comparison with previous records in the literature. A thorough description of the recent acute cases of glanders occurring in Brazil and characterization of the bacterium involved will contribute to advances in knowledge of the pathogenicity, clinical symptoms, and epidemiology of this zoonotic disease. Acknowledgment: This project is sponsored by FAPESP.

Keywords: equines, bacterial isolation, zoonosis, PCR, pathology

Procedia PDF Downloads 104
8 The Effects of the GAA15 (Gaelic Athletic Association 15) on Lower Extremity Injury Incidence and Neuromuscular Functional Outcomes in Collegiate Gaelic Games: A 2 Year Prospective Study

Authors: Brenagh E. Schlingermann, Clare Lodge, Paula Rankin

Abstract:

Background: Gaelic football, hurling and camogie are highly popular field games in Ireland. Research into the epidemiology of injury in Gaelic games has revealed that approximately three quarters of the injuries in these games occur in the lower extremity. These injuries can have player, team and institutional impacts due to multiple factors, including financial burden and time loss from competition. Research has shown it is possible to record injury data consistently with the GAA through a closed online recording system known as the GAA injury surveillance database. It has been established that determining the incidence of injury is the first step of injury prevention. The goals of this study were to create a dynamic GAA15 injury prevention programme which addressed five key components: avoiding positions associated with a high risk of injury, enhancing flexibility, enhancing strength, optimizing plyometrics and addressing sport-specific agilities. These key components are internationally recognized through the Prevent Injury, Enhance Performance (PEP) programme, which has been shown to reduce ACL injuries by 74%. In national Gaelic games, the programme is known as the GAA15, which has been devised from the principles of the PEP. No such injury prevention strategies have been published on this cohort in Gaelic games to date. This study will investigate the effects of the GAA15 on injury incidence and neuromuscular function in Gaelic games. Methods: A total of 154 players (mean age 20.32 ± 2.84 years) were recruited from the GAA teams within the Institute of Technology Carlow (ITC). Preseason and post-season testing involved two objective screening tests: the Y Balance Test and the Three Hop Test. Practical workshops, with ongoing liaison, were provided to the coaches on the implementation of the GAA15. The programme was performed before every training session and game, and the existing GAA injury surveillance database was accessed by the college sports rehabilitation athletic therapist to monitor players' injuries. Retrospective analysis of the ITC clinic records was performed in conjunction with the database analysis as a means of tracking injuries that may have been missed. The effects of the programme were analysed by comparing the intervention group's Y Balance and Three Hop Test scores to those of an age/gender-matched control group. Results: Year 1 results revealed significant increases in neuromuscular function as a result of the GAA15. Y Balance Test scores for the intervention group increased in both the posterolateral (p=.005 and p=.001) and posteromedial reach directions (p=.001 and p=.001). A decrease in performance was determined for the Three Hop Test (p=.039). Overall, twenty-five injuries were reported during the season, resulting in an injury rate of 3.00 injuries/1000 hrs of participation: 1.25 injuries/1000 hrs in training and 4.25 injuries/1000 hrs in match play. Non-contact injuries accounted for 40% of the injuries sustained. Year 2 results are pending and expected in April 2016. Conclusion: It is envisaged that implementation of the GAA15 will continue to reduce the risk of injury and improve neuromuscular function in collegiate Gaelic games athletes.
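The injury-rate figures above follow the standard exposure-based calculation of injuries per 1,000 player-hours; the short sketch below illustrates the arithmetic with an assumed exposure-hour total, since the study's actual exposure hours are not given in the abstract.

```python
# Injury incidence expressed per 1,000 player-hours of exposure.
def injury_rate_per_1000h(injuries: int, exposure_hours: float) -> float:
    """Injuries per 1,000 player-hours."""
    return 1000.0 * injuries / exposure_hours

# 25 injuries over an assumed ~8,333 player-hours reproduces the reported
# overall rate of ~3.00 injuries per 1,000 hours of participation.
print(f"{injury_rate_per_1000h(25, 8333):.2f} injuries per 1,000 h")
```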

Keywords: GAA15, Gaelic games, injury prevention, neuromuscular training

Procedia PDF Downloads 315
7 Absolute Quantification of the Bexsero Vaccine Component Factor H Binding Protein (fHbp) by Selected Reaction Monitoring: The Contribution of Mass Spectrometry in Vaccinology

Authors: Massimiliano Biagini, Marco Spinsanti, Gabriella De Angelis, Sara Tomei, Ilaria Ferlenghi, Maria Scarselli, Alessia Biolchi, Alessandro Muzzi, Brunella Brunelli, Silvana Savino, Marzia M. Giuliani, Isabel Delany, Paolo Costantino, Rino Rappuoli, Vega Masignani, Nathalie Norais

Abstract:

The gram-negative bacterium Neisseria meningitidis serogroup B (MenB) is an exclusively human pathogen representing the major cause of meningitis and severe sepsis in infants and children, but also in young adults. This pathogen is carried by about 30% of the healthy population, which acts as a reservoir, spreading it through saliva and respiratory fluids during coughing, sneezing and kissing. Among the surface-exposed protein components of this diplococcus, factor H binding protein is a lipoprotein proven to be a protective antigen and used as a component of the recently licensed Bexsero vaccine. fHbp is a highly variable meningococcal protein: to reflect its remarkable sequence variability, it has been classified into three variants (or two subfamilies), with poor cross-protection among the different variants. Furthermore, the level of fHbp expression varies significantly among strains, and this has also been considered an important factor for predicting MenB strain susceptibility to anti-fHbp antisera. Different methods have been used to assess fHbp expression in meningococcal strains; however, all of these methods use anti-fHbp antibodies, and for this reason the results are affected by the different affinities that antibodies can have for different antigenic variants. To overcome the limitations of antibody-based quantification, we developed a quantitative Mass Spectrometry (MS) approach. Selected Reaction Monitoring (SRM) has recently emerged as a powerful MS tool for detecting and quantifying proteins in complex mixtures. SRM is based on the targeted detection of proteotypic peptides (PTPs), which are unique signatures of a protein that can be easily detected and quantified by MS. This approach, proven to be highly sensitive, quantitatively accurate and highly reproducible, was used to quantify the absolute amount of fHbp antigen in total extracts derived from 105 clinical isolates, evenly distributed among the three main variant groups and selected to be representative of the fHbp subvariants circulating around the world. We extended the study to the genetic level, investigating the correlation between the differential levels of expression and polymorphisms present within the genes and their promoter sequences. The implications of fHbp expression for the susceptibility of strains to killing by anti-fHbp antisera are also presented. To date, this is the first comprehensive fHbp expression profiling in a large panel of Neisseria meningitidis clinical isolates driven by an antibody-independent, MS-based methodology, opening the door to new applications in vaccine coverage prediction and reinforcing the molecular understanding of licensed vaccines.
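Absolute SRM quantification is commonly anchored to a known amount of a stable-isotope-labelled ("heavy") version of each proteotypic peptide spiked into the extract, with the light/heavy peak-area ratio converting signal into amount. The abstract does not detail the exact scheme used, so the sketch below, its peptide names and its numbers are illustrative assumptions only.

```python
# Illustrative absolute quantification from SRM data using spiked
# heavy-labelled proteotypic peptide standards. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class SrmMeasurement:
    peptide: str
    light_area: float   # summed transition peak area, endogenous peptide
    heavy_area: float   # summed transition peak area, spiked heavy standard
    heavy_fmol: float   # known amount of heavy standard spiked into the sample

def endogenous_fmol(m: SrmMeasurement) -> float:
    """Light/heavy ratio times the spiked amount gives fmol of endogenous peptide."""
    return m.light_area / m.heavy_area * m.heavy_fmol

measurements = [
    SrmMeasurement("fHbp_PTP_1", light_area=8.4e5, heavy_area=2.1e6, heavy_fmol=50.0),
    SrmMeasurement("fHbp_PTP_2", light_area=7.9e5, heavy_area=1.9e6, heavy_fmol=50.0),
]

# Averaging over the proteotypic peptides gives one estimate per extract.
estimate = sum(endogenous_fmol(m) for m in measurements) / len(measurements)
print(f"fHbp in extract: {estimate:.1f} fmol")
```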

Keywords: quantitative mass spectrometry, Neisseria meningitidis, vaccines, bexsero, molecular epidemiology

Procedia PDF Downloads 277
6 Antibiotic Prophylaxis Habits in Oral Implant Surgery in the Netherlands: A Cross-Sectional Survey

Authors: Fabio Rodriguez Sanchez, Josef Bruers, Iciar Arteagoitia, Carlos Rodriguez Andres

Abstract:

Background: Oral implants are a routine treatment to replace lost teeth. Although they have a high rate of success, implant failures do occur. Perioperative antibiotics have been suggested to prevent postoperative infections and dental implant failures, but they remain a controversial treatment in healthy patients. The objective of this study was to determine whether antibiotic prophylaxis is a common treatment in the Netherlands among general dentists, maxillofacial surgeons, periodontists and implantologists in conjunction with oral implant surgery in healthy patients, and to assess the nature of antibiotic prescriptions in order to evaluate whether any consensus has been reached and whether current recommendations are being followed. Methodology: An observational cross-sectional study based on a web survey, reported according to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines. A validated questionnaire, developed by Deeb et al. (2015), was translated and slightly adjusted to circumstances in the Netherlands. It was used with the explicit permission of the authors. This questionnaire contained both closed-ended and some open-ended questions on the following topics: demographics, qualification, antibiotic type, prescription duration and dosage. An email was sent in February 2018 to a sample of 600 general dentists and all 302 oral implantologists, periodontists and maxillofacial surgeons recognized by the Dutch Association of Oral Implantology (NVOI) as oral health care providers placing oral implants. The email included a brief introduction to the study objectives and a link to the web questionnaire, which could be filled in anonymously. Overall, 902 questionnaires were sent. However, 29 questionnaires were not correctly received due to incorrect email addresses, so a total of 873 professionals were reached. Collected data were analyzed using SPSS (IBM Corp., released 2012, Armonk, NY). Results: The questionnaire was returned by a total of 218 participants (response rate = 24.2%), 45 female (20.8%) and 171 male (79.2%). Two respondents were excluded from the study group because they were not currently working as oral health providers. Overall, 151 (69.9%) placed oral implants on a regular basis. Of these participants, 79 (52.7%) prescribed antibiotics only in certain situations, 66 (44.0%) always prescribed antibiotics, and 5 dentists (3.3%) did not prescribe antibiotics at all when placing oral implants. Of the participants who prescribed antibiotics, 83 (58.5%) did so both pre- and postoperatively, 12 (8.5%) exclusively postoperatively, and 47 (33.1%) followed an exclusively preoperative regimen. A single oral dose of 2,000 mg amoxicillin 1 hour prior to treatment was the most commonly prescribed preoperative regimen. The most frequently prescribed postoperative regimen was 500 mg amoxicillin three times daily for 7 days after surgery. On average, oral health professionals prescribed 6,923 mg of antibiotics in conjunction with oral implant surgery, varying from 500 to 14,600 mg. Conclusions: Antibiotic prophylaxis in conjunction with oral implant surgery is prescribed in the Netherlands on a rather large scale. Dutch professionals might prescribe antibiotics more cautiously than those in other countries, and there seems to be a narrower range of antibiotic types and regimens being prescribed. Nevertheless, recommendations based on the latest published evidence are frequently not followed.
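The per-respondent antibiotic amount reported above is simply the sum of the preoperative dose and the cumulative postoperative course; the sketch below illustrates that arithmetic for the two most frequently reported regimens (other regimens would be handled the same way).

```python
# Total prescribed dose = single preoperative dose + postoperative course.
def total_dose_mg(preop_mg: float, post_dose_mg: float,
                  doses_per_day: int, days: int) -> float:
    return preop_mg + post_dose_mg * doses_per_day * days

# Most frequent regimens reported: 2,000 mg amoxicillin 1 h before surgery,
# then 500 mg three times daily for 7 days.
print(total_dose_mg(2000, 500, 3, 7))   # -> 12500.0 mg for this combination
```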

Keywords: clinical decision making, infection control, antibiotic prophylaxis, dental implants

Procedia PDF Downloads 118
5 Seroprevalence of Middle East Respiratory Syndrome Coronavirus (MERS-Cov) Infection among Healthy and High Risk Individuals in Qatar

Authors: Raham El-Kahlout, Hadi Yassin, Asmaa Athani, Marwan Abou Madi, Gheyath Nasrallah

Abstract:

Background: Since its first isolation in September 2012, Middle East respiratory syndrome coronavirus (MERS-CoV) has spread across 27 countries, infecting more than two thousand individuals with a high case fatality rate. MERS-CoV-specific antibodies are widely found in dromedary camels, and shedding of similar viruses has been detected in humans in the same regions, suggesting that camels may play a central role in MERS epidemiology. Interestingly, MERS-CoV infection has also been reported to be asymptomatic or to cause mild influenza-like illness. Therefore, in a country like Qatar (bordering Saudi Arabia), where camels are widespread, serological surveys are important to explore the role of camels in MERS-CoV transmission. However, widespread strategic serological surveillance of MERS-CoV among populations, particularly in endemic countries, is infrequent. In the absence of a clear epidemiological picture, cross-sectional MERS antibody surveys in human populations are of global importance. Method: We performed a comparative serological screening of 4719 healthy blood donors, 135 baseline case contacts (high-risk individuals), and four PCR-confirmed MERS patients for the presence of anti-MERS IgG. Initially, samples were screened using the Euroimmun anti-MERS-CoV IgG ELISA kit, the only commercial kit available on the market and recommended by the CDC as a screening kit. To confirm the ELISA results, further serological testing was performed on all borderline and positive samples using two assays: the anti-MERS-CoV IgG and IgM Euroimmun indirect immunofluorescence test (IIFT) and a pseudoviral particle neutralization assay (PPNA). Additionally, to test the cross-reactivity of anti-MERS-CoV antibodies with other members of the coronavirus family, borderline and positive samples were tested for the presence of IgG antibodies against the following viruses: SARS, HCoV-229E and HKU1, using the Euroimmun IIFT for SARS and HCoV-229E and ELISA for HKU1. Results: Of all 4858 individuals screened, 15 samples [10 donors (0.21%, 10/4719), 1 case contact (0.77%, 1/130), and 3 patients (75%, 3/4)] were anti-MERS IgG reactive or borderline by ELISA. However, only 7 (0.14%) of them were positive by IIFT, and only 3 (0.06%) were confirmed by the specific anti-MERS PPNA. One interesting finding was a donor, selected for the control group on the basis of a negative anti-MERS IgG ELISA, who was reactive for anti-MERS IgM by IIFT and was confirmed by the PPNA. Further, our preliminary results showed strong cross-reactivity of anti-MERS-CoV IgG with both anti-HCoV-229E and anti-HKU1 IgG, yet no cross-reactivity with SARS was found. Conclusions: Our findings suggest that MERS-CoV is not circulating widely in the population of Qatar, which is also indicated by the low number of confirmed cases (only 18) since 2012. Additionally, the presence of antibodies to other pathogenic human coronaviruses may cause false-positive results in both ELISA and IIFT, which stresses the need for further evaluation studies of the available serological assays. In summary, this study provides insight into the epidemiology of MERS-CoV in the Qatari population. It also provides a performance evaluation of the available serological tests for MERS-CoV in light of serological status to other human coronaviruses.
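For the confirmed positives above (3 of 4,858 screened), the seroprevalence point estimate and an exact binomial confidence interval can be computed as follows; the choice of a Clopper-Pearson interval is an illustrative assumption rather than the authors' stated method.

```python
# Seroprevalence with an exact (Clopper-Pearson) 95% confidence interval
# for 3 PPNA-confirmed positives out of 4,858 screened individuals.
from statsmodels.stats.proportion import proportion_confint

confirmed, screened = 3, 4858
prevalence = confirmed / screened
low, high = proportion_confint(confirmed, screened, alpha=0.05, method="beta")
print(f"seroprevalence {100 * prevalence:.3f}% "
      f"(95% CI {100 * low:.3f}%-{100 * high:.3f}%)")
```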

Keywords: seroprevalence, MERS-CoV, healthy individuals, Qatar

Procedia PDF Downloads 246
4 CLOUD Japan: Prospective Multi-Hospital Study to Determine the Population-Based Incidence of Hospitalized Clostridium difficile Infections

Authors: Kazuhiro Tateda, Elisa Gonzalez, Shuhei Ito, Kirstin Heinrich, Kevin Sweetland, Pingping Zhang, Catia Ferreira, Michael Pride, Jennifer Moisi, Sharon Gray, Bennett Lee, Fred Angulo

Abstract:

Clostridium difficile (C. difficile) is the most common cause of antibiotic-associated diarrhea and infectious diarrhea in healthcare settings. Japan has an aging population; the elderly are at increased risk of hospitalization, antibiotic use, and C. difficile infection (CDI). Little is known about the population-based incidence and disease burden of CDI in Japan, although limited hospital-based studies have reported a lower incidence than in the United States. To understand CDI disease burden in Japan, CLOUD (Clostridium difficile Infection Burden of Disease in Adults in Japan) was developed. CLOUD will derive population-based incidence estimates of the number of CDI cases per 100,000 population per year in Ota-ku (population 723,341), one of the districts in Tokyo, Japan. CLOUD will include approximately 14 of the 28 Ota-ku hospitals, including Toho University Hospital, a 1,000-bed tertiary care teaching hospital. During the 12-month patient enrollment period, which is scheduled to begin in November 2018, Ota-ku residents > 50 years of age who are hospitalized at a participating hospital with diarrhea (> 3 unformed stools (Bristol Stool Chart 5-7) in 24 hours) will be actively ascertained, consented, and enrolled by study surveillance staff. A stool specimen will be collected from enrolled patients and tested at a local reference laboratory (LSI Medience, Tokyo) using QUIK CHEK COMPLETE® (Abbott Laboratories), which simultaneously tests specimens for the presence of glutamate dehydrogenase (GDH) and C. difficile toxins A and B. A frozen stool specimen will also be sent to the Pfizer Laboratory (Pearl River, United States) for analysis using a two-step diagnostic testing algorithm that is based on detection of C. difficile strains/spores harboring the toxin B gene by PCR, followed by detection of free toxins (A and B) using a proprietary cell cytotoxicity neutralization assay (CCNA) developed by Pfizer. Positive specimens will be anaerobically cultured, and C. difficile isolates will be characterized by ribotyping and whole genome sequencing. CDI patients enrolled in CLOUD will be contacted weekly for 90 days following diarrhea onset to describe clinical outcomes, including recurrence, reinfection, and mortality, and patient-reported economic, clinical and humanistic outcomes (e.g., health-related quality of life, worsening of comorbidities, and patient and caregiver work absenteeism). Studies will also be undertaken to fully characterize the catchment area to enable population-based estimates. The 12-month active ascertainment of CDI cases among hospitalized Ota-ku residents with diarrhea in CLOUD, and the characterization of the Ota-ku catchment area, including estimation of the proportion of all hospitalizations of Ota-ku residents that occur in the CLOUD-participating hospitals, will yield CDI population-based incidence estimates, which can be stratified by age group, risk group, and source (hospital-acquired or community-acquired). These incidence estimates will be extrapolated, following age standardization using national census data, to yield CDI disease burden estimates for Japan. CLOUD also serves as a model for studies in other countries that can use the CLOUD protocol to estimate CDI disease burden.
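The extrapolation described above (scaling ascertained cases by the share of residents' hospitalizations captured, dividing by the catchment population, and age-standardizing with census weights) can be sketched as follows; every number and age band below is a hypothetical placeholder, not a CLOUD estimate.

```python
# Sketch of a population-based CDI incidence calculation with catchment
# adjustment and age standardization. All inputs are hypothetical.
def crude_incidence_per_100k(cases: int, capture_fraction: float,
                             population: int) -> float:
    """Cases scaled up by the hospitalization share captured, per 100,000."""
    adjusted_cases = cases / capture_fraction
    return 100_000 * adjusted_cases / population

def age_standardized(rates_by_age: dict[str, float],
                     census_weights: dict[str, float]) -> float:
    """Weight age-specific rates by the national census age distribution."""
    return sum(rates_by_age[age] * census_weights[age] for age in rates_by_age)

print(crude_incidence_per_100k(cases=180, capture_fraction=0.5,
                               population=723_341))   # ~49.8 per 100,000

rates = {"50-64": 20.0, "65-79": 60.0, "80+": 150.0}   # per 100,000 (hypothetical)
weights = {"50-64": 0.55, "65-79": 0.33, "80+": 0.12}  # census shares (hypothetical)
print(age_standardized(rates, weights))                # ~48.8 per 100,000
```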

Keywords: Clostridium difficile, disease burden, epidemiology, study protocol

Procedia PDF Downloads 227