Search results for: 10-year retrospective study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 48887

48287 Premature Departure of Active Women from the Working World: One Year Retrospective Study in the Tunisian Center

Authors: Lamia Bouzgarrou, Amira Omrane, Malika Azzouzi, Asma Kheder, Amira Saadallah, Ilhem Boussarsar, Kamel Rejeb

Abstract:

Introduction: Increasing women’s labor force participation is a policy priority both in developed economies and in countries with low growth prospects. In the labor market, however, women continue to face several obstacles, both in entering and in remaining at work. This study aims to assess the prevalence of premature withdrawal from working life, due to invalidity or medically justified early retirement, among active women in the Tunisian center, and to identify its determinants. Material and methods: We conducted a cross-sectional study over one year, focusing on agreements for invalidity or for early retirement due to premature wear of the body delivered by the medical commission of the National Health Insurance Fund (CNAM) in the central Tunisian district. We exhaustively selected women's files. Socio-demographic, professional, and medical data were collected from the CNAM's administrative and medical files. Results: Over the one-year period, 222 women received an agreement for premature departure from professional activity: 149 (67.11%) benefited from an invalidity agreement and 20.27% from a favorable decision for early retirement. The average age was 50 ± 6 years, with extremes of 23 and 62 years, and 18.9% of the women were under 45 years. Married women accounted for 69.4% of the sample, and 59.9% had at least one dependent child. The average professional seniority in the sector was 23 ± 8 years. The textile-clothing sector was the most affected, accounting for 70.7% of premature departures. The medical reasons for withdrawal from working life were mainly neuro-degenerative diseases (46.8% of cases), rheumatic diseases (35.6%) and cardiovascular diseases (22.1%). Psychiatric and endocrine disorders motivated 17.1% and 13.5% of these departures, respectively.
Evaluation of the sequelae of these pathologies yielded an average permanent partial incapacity (PPI) of 61.4 ± 17.3%. The analytical study showed that the agreement for disability or early retirement was correlated with the insured's age (p = 10⁻³), professional seniority (p = 0.003) and the PPI rate assessed by the expert physician (p = 0.04). No other social or professional factor was correlated with this decision. Conclusion: Despite many advances in Tunisian labour law and legal texts on employability, women are still exposed to several social and professional inequalities (pay inequality, precarious work, etc.). Indeed, women are often pushed to accept adverse working conditions, making them more vulnerable to premature bodily wear and to forced early departure from the world of work. These premature withdrawals from active life are not only harmful to the women concerned but are also associated with considerable costs for the insurance fund and for society. To keep women in work, political commitment to implementing global prevention strategies and improving working conditions is imperative, particularly in our socio-cultural context.

Keywords: active women, early retirement, invalidity, maintenance at work

Procedia PDF Downloads 136
48286 Automatic Adult Age Estimation Using Deep Learning of the ResNeXt Model Based on CT Reconstruction Images of the Costal Cartilage

Authors: Ting Lu, Ya-Ru Diao, Fei Fan, Ye Xue, Lei Shi, Xian-e Tang, Meng-jun Zhan, Zhen-hua Deng

Abstract:

Accurate adult age estimation (AAE) is a significant and challenging task in the forensic and archaeological fields. Attempts have been made to identify optimal adult age metrics, and the rib is considered a potential age marker. The traditional approach is to extract age-related features, designed by experts, from macroscopic or radiological images, followed by classification or regression analysis. The results still do not meet the high-level requirements of practice, and a limitation of manual feature design and extraction is loss of information, since such features are likely not designed explicitly to capture age-relevant information. Deep learning (DL) has recently garnered much interest in image learning and computer vision. It enables learning of important features without a prior bias or hypothesis and could therefore support AAE. This study aimed to develop DL models for AAE based on CT images and to compare their performance with a manual visual scoring method. Chest CT data were reconstructed using volume rendering (VR). Retrospective data of 2500 patients aged 20.00-69.99 years were obtained between December 2019 and September 2021. Five-fold cross-validation was performed, with datasets randomly split into training and validation sets in a 4:1 ratio for each fold. Before being fed into the networks, all images were augmented with random rotation and vertical flip, normalized, and resized to 224×224 pixels. ResNeXt was chosen as the DL baseline because of its efficiency and accuracy in image classification. Mean absolute error (MAE) was the primary parameter. Independent data from 100 patients, acquired between March and April 2022, were used as a test set. The manual method followed a prior study that reported the lowest MAEs (5.31 years in males and 6.72 in females) among similar studies. CT data and VR images were used.
The radiation density of the first costal cartilage was recorded from the CT data on the workstation. The osseous and calcified projections of the first to seventh costal cartilages were scored on the VR images using an eight-stage staging technique. Following the prior study, the optimal models were a decision-tree regression model in males and a stepwise multiple linear regression equation in females. Predicted ages for the test set were calculated separately by sex using the corresponding models. A total of 2600 patients (training and validation sets, mean age = 45.19 ± 14.20 [SD] years; test set, mean age = 46.57 ± 9.66 years) were evaluated in this study. In ResNeXt model training, MAEs of 3.95 years in males and 3.65 years in females were obtained. On the test set, DL achieved MAEs of 4.05 in males and 4.54 in females, far better than the MAEs of 8.90 and 6.42, respectively, for the manual method. These results show that DL with the ResNeXt model outperformed the manual method in AAE based on CT reconstruction of the costal cartilage, and the developed system may be a supportive tool for AAE.
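The study's evaluation protocol, five folds with a 4:1 train/validation split and MAE as the primary metric, can be sketched in plain Python. This is an illustrative sketch only: the function names are not from the study, and the ResNeXt network itself is omitted.

```python
import random

def five_fold_splits(n_samples, seed=42):
    """Yield (train, val) index lists for five-fold cross-validation,
    giving a 4:1 train/validation ratio per fold, as in the study design."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[k::5] for k in range(5)]  # five disjoint folds
    for k in range(5):
        val = folds[k]
        train = [i for j in range(5) if j != k for i in folds[j]]
        yield train, val

def mean_absolute_error(y_true, y_pred):
    """MAE in years, the study's primary evaluation parameter."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# For 2500 patients, each fold keeps 2000 for training and 500 for validation.
for train, val in five_fold_splits(2500):
    assert len(train) == 2000 and len(val) == 500
```

Each predicted-versus-chronological age pair would then be scored with `mean_absolute_error`, separately by sex.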

Keywords: forensic anthropology, age determination by the skeleton, costal cartilage, CT, deep learning

Procedia PDF Downloads 55
48285 A Foodborne Cholera Outbreak in a School Caused by Eating Contaminated Fried Fish: Hoima Municipality, Uganda, February 2018

Authors: Dativa Maria Aliddeki, Fred Monje, Godfrey Nsereko, Benon Kwesiga, Daniel Kadobera, Alex Riolexus Ario

Abstract:

Background: Cholera is a severe gastrointestinal disease caused by Vibrio cholerae and has caused several pandemics. On 26 February 2018, a suspected cholera outbreak, with one death, occurred in School X in Hoima Municipality, western Uganda. We investigated to identify the scope and mode of transmission of the outbreak and to recommend evidence-based control measures. Methods: We defined a suspected case as onset of diarrhea, vomiting, or abdominal pain in a student or staff member of School X or their family members during 14 February–10 March. A confirmed case was a suspected case with V. cholerae cultured from stool. We reviewed medical records at Hoima Hospital and searched for cases at School X. We conducted descriptive epidemiologic analysis and hypothesis-generating interviews of 15 case-patients. In a retrospective cohort study, we compared attack rates between exposed and unexposed persons. Results: We identified 15 cases among the 75 students, staff, and family members of School X (attack rate = 20%), with onset from 25-28 February. One patient died (case-fatality rate = 6.6%). The epidemic curve indicated a point-source exposure. On 24 February, a student brought fried fish from her home in a fishing village where a cholera outbreak was ongoing. Of the 21 persons who ate the fish, 57% developed cholera, compared with 5.6% of the 54 persons who did not (RR = 10; 95% CI = 3.2-33). None of the 4 persons who recooked the fish before eating developed cholera, compared with 71% of the 17 who did not recook it (RR = 0.0; 95% CI, Fisher exact = 0.0-0.95). Of 12 stool specimens cultured, 6 yielded V. cholerae. Conclusion: This cholera outbreak was caused by eating fried fish, which might have been contaminated with V. cholerae in a village with an ongoing outbreak. Insufficient cooking of the fish might have facilitated the outbreak. We recommended thoroughly cooking fish before consumption.
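The reported risk ratio and confidence interval can be reproduced from the abstract's figures using a standard log-scale Wald interval. The case counts below (12 of 21 fish eaters, 3 of 54 non-eaters) are back-calculated from the reported attack rates of 57% and 5.6%, so they are an assumption, not stated in the abstract.

```python
import math

def relative_risk(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Risk ratio for a cohort study, with a 95% Wald CI on the log scale."""
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    # Standard error of log(RR) for a 2x2 cohort table
    se_log = math.sqrt(1 / cases_exp - 1 / n_exp
                       + 1 / cases_unexp - 1 / n_unexp)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

rr, lo, hi = relative_risk(12, 21, 3, 54)
print(f"RR = {rr:.1f}, 95% CI = {lo:.1f}-{hi:.1f}")
```

With these counts the result is close to the reported RR = 10 with 95% CI 3.2-33 (differences are rounding only).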

Keywords: cholera, disease outbreak, foodborne, global health security, Uganda

Procedia PDF Downloads 181
48284 Strengthening Service Delivery to Improve Cervical Cancer Screening in Southwestern Nigeria: A Pilot Project

Authors: Afolabi K. Esther, Kuye Tolulope, Babafemi L. Olayemi, Omikunle Yemisi

Abstract:

Background: Cervical cancer is a potentially preventable disease of public health significance. All sexually active women are at risk of cervical cancer; however, uptake and coverage of screening are low in low- and middle-resource countries. The programme therefore explored the feasibility of an innovative, low-cost systems approach to cervical cancer screening service delivery among reproductive-aged women in low-resource settings in Southwestern Nigeria, with the aim of promoting uptake and quality improvement of cervical cancer screening services. Methods: This was an intervention project in three senatorial districts of Osun State with primary, secondary and tertiary health facilities. The project ran in three phases, pre-intervention, intervention, and post-intervention, and used the existing infrastructure, facilities and staff in the project settings. The study population comprised nurse-midwives, community health workers and reproductive-aged women (30-49 years). The intervention phase used innovative, culturally appropriate strategies to create awareness of cervical cancer and preventive health-seeking behaviour among women aged 30-49 years. Service providers (community health workers, nurses, and midwives) were trained in screening methods and the treatment of pre-cancerous lesions, and essential equipment and supplies for cervical cancer screening services were provided at health facilities. In addition, advocacy and engagement with relevant stakeholders were undertaken to integrate cervical cancer screening into related reproductive health services and to secure greater allocation of resources. Pre- and post-intervention results were compared using baseline and process indicators, and the effect of the intervention phase on screening coverage was assessed using a plausibility assessment design.
The project lasted 12 months: visual inspection with acetic acid (VIA) screening for six months, and follow-up over six months for women receiving treatment. Results: The pre-intervention phase assessed baseline service delivery statistics for the previous 12 months, drawn from retrospective data collected as part of routine monitoring and reporting systems. Uptake of cervical cancer screening services was low: 156 women had been screened in the previous 12 months. The competency level of service personnel was fair (54%), and the availability of essential equipment and supplies for cervical cancer screening services was limited. In the post-intervention phase, uptake increased markedly: 1586 women were screened within six months in the study settings, roughly a ten-fold increase over the baseline assessment. The competency of service delivery personnel also increased, to 86.3%, indicating quality improvement of cervical cancer screening service delivery. Conclusion: The findings from the study demonstrate an effective approach to strengthening and improving cervical cancer screening service delivery in Southwestern Nigeria. The intervention promoted a positive attitude and health-seeking behaviour among the target population, significantly influencing the uptake of cervical cancer screening services.

Keywords: cervical cancer, screening, Nigeria, health system strengthening

Procedia PDF Downloads 79
48283 Improving the Weekend Handover in General Surgery: A Quality Improvement Project

Authors: Michael Ward, Eliana Kalakouti, Andrew Alabi

Abstract:

Aim: The handover process is a recognized point of vulnerability in the patient care pathway where errors are likely to occur; it is a major preventable cause of patient harm arising from the human factors of poor communication and systematic error. The aim of this study was to audit the general surgery department's weekend handover process against the criteria for safe handover set out by the Royal College of Surgeons (RCS). Method: We conducted a retrospective audit of the general surgery department's Friday patient lists and the patient medical notes used for weekend handover in a London-based district general hospital (DGH). Medical notes were analyzed against the RCS's suggested handover criteria. A standardized paper weekend handover proforma was then developed in accordance with the guidelines and circulated in the department, and a post-intervention audit was conducted using the same methods (cycle 1). For cycle 2, we introduced an electronic weekend handover tool alongside the Electronic Patient Record (EPR). After one month, a second post-intervention audit was conducted. Results: Following cycle 1, the paper weekend handover proforma was used in only 23% of patient notes. When it was used, 100% of notes included a plan for the weekend, diagnosis, and location, but only 40% documented potential discharge status and 40% ceiling-of-care status. Qualitative feedback was that the proforma was time-consuming to complete. Better results were achieved following cycle 2, with 100% of patient notes containing the electronic proforma. Every patient had a documented ceiling of care, discharge status, and location. Only 55% of patients had a past surgical history documented; however, this was still an increase compared with the paper proforma (45%). Comparing the electronic and paper proformas, documentation increased in every handover domain outlined by the RCS, with an average relative increase of 1.72 times (p < 0.05).
Qualitative feedback was that the autofill function made the electronic proforma easy to use and simple to view. Conclusion: These results demonstrate that implementing an electronic autofill handover proforma significantly improved handover compliance with RCS guidelines, thereby improving the transmission of information from weekday to weekend teams.

Keywords: surgery, handover, proforma, electronic handover, weekend, general surgery

Procedia PDF Downloads 139
48282 Thyroid Malignancy Concurrent with Hyperthyroidism: Variations with Thyroid Status and Age

Authors: N. J. Nawarathna, N. R. Kmarasinghe, D. Chandrasekara, B. M. R. S. Balasooriya, R. A. A. Shaminda, R. J. K. Senevirathne

Abstract:

Introduction: Thyroid malignancy associated with hyperthyroidism is considered rare; retrospective studies have reported a low incidence of thyroid malignancy in hyperthyroid patients (0.7-8.5%). To assess the clinical relevance of this association, thyroid status was analyzed in a cohort of patients with thyroid malignancy. Method: Thyroid malignancies diagnosed histologically in 56 patients over an 18-month period beginning in April 2013, in a single surgical unit at Teaching Hospital Kandy, were included. Preoperative patient details and the progression of thyroid status were assessed with thyroid-stimulating hormone, free thyroxine and free triiodothyronine levels. Results: Among the 56 patients, papillary carcinoma was diagnosed in 44 (78.6%), follicular carcinoma in 7 (12.5%), and medullary or anaplastic carcinoma in the remaining 5 (8.9%). Twelve patients (21.4%) were male and 44 (78.6%) female. Twenty (35.7%) were under 40 years of age, 29 (51.8%) were between 40 and 59 years, and 7 (12.5%) were over 59 years. Cross-tabulation of carcinoma type with gender yielded a likelihood ratio of 6.908 (p = 0.032). Biochemically, 12 patients (21.4%) were hyperthyroid; of these, 5 (41.7%) had primary and 7 (58.3%) secondary hyperthyroidism. The mean age of euthyroid patients was 43.77 years (SD 10.574) and of hyperthyroid patients 53.25 years (SD 16.057); independent-samples t = -2.446, two-tailed p = 0.018. Cross-tabulating thyroid status with age group gave a likelihood ratio of 9.640 (p = 0.008). Conclusion: Papillary carcinoma is seen more often in females. Among patients with thyroid carcinoma, those with biochemically proven hyperthyroidism were older than those who were euthyroid. Careful evaluation of elderly hyperthyroid patients, to select the most suitable therapeutic approach, is therefore justified.
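The reported t statistic can be reproduced from the summary statistics alone, assuming a pooled-variance independent-samples t-test and group sizes of 44 euthyroid versus 12 hyperthyroid patients. The group sizes are inferred (12 of the 56 patients were hyperthyroid), so treat this as a consistency check, not the authors' exact computation.

```python
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t statistic with pooled variance,
    computed from group means, SDs and sizes."""
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se

# Euthyroid: mean 43.77, SD 10.574, n = 44 (inferred);
# hyperthyroid: mean 53.25, SD 16.057, n = 12
t = pooled_t(43.77, 10.574, 44, 53.25, 16.057, 12)
print(round(t, 3))  # close to the reported t = -2.446
```

That the reported value falls out of the pooled formula with these group sizes supports the inferred 44/12 split.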

Keywords: age, hyperthyroidism, thyroid malignancy, thyroid status

Procedia PDF Downloads 383
48281 Predictors of Motor and Cognitive Domains of Functional Performance after Rehabilitation of Individuals with Acute Stroke

Authors: A. F. Jaber, E. Dean, M. Liu, J. He, D. Sabata, J. Radel

Abstract:

Background: Stroke is a serious health care concern and a major cause of disability in the United States. The condition affects an individual’s functional ability to perform daily activities, and predicting the functional performance of people with stroke helps health care professionals optimize the delivery of health services to affected individuals. The purpose of this study was to identify significant predictors of Motor FIM and Cognitive FIM subscores among individuals with stroke after discharge from inpatient rehabilitation (typically 4-6 weeks after stroke onset). A second purpose was to explore the relation among personal characteristics, health status, and functional performance of daily activities within 2 weeks of stroke onset. Methods: This study used a retrospective chart review to conduct a secondary analysis of data obtained from the Healthcare Enterprise Repository for Ontological Narration (HERON) database. The HERON database integrates de-identified clinical data from seven regional sources, including the hospital electronic medical record systems of the University of Kansas Health System. The initial HERON data extract encompassed 1192 records; the final sample consisted of 207 participants, who were mostly white (74%) males (55%) with a diagnosis of ischemic stroke (77%). The outcome measures collected from HERON included performance scores on the National Institutes of Health Stroke Scale (NIHSS), the Glasgow Coma Scale (GCS), and the Functional Independence Measure (FIM). The data analysis plan included descriptive statistics, Pearson correlation analysis, and stepwise regression analysis. Results: Significant predictors of discharge Motor FIM subscores included age, baseline Motor FIM subscores, discharge NIHSS scores, and comorbid electrolyte disorder (R² = 0.57, p < 0.026).
Significant predictors of discharge Cognitive FIM subscores were age, baseline Cognitive FIM subscores, client cooperative behavior, comorbid obesity, and the total number of comorbidities (R² = 0.67, p < 0.020). Functional performance on admission was significantly associated with age (p < 0.01), stroke severity (p < 0.01), and length of hospital stay (p < 0.05). Conclusions: Our findings show that younger age, good motor and cognitive abilities on admission, mild stroke severity, fewer comorbidities, and a positive client attitude all predict favorable functional outcomes after inpatient stroke rehabilitation. This study provides health care professionals with evidence for evaluating predictors of favorable functional outcomes early in stroke rehabilitation, tailoring individualized interventions to each client's anticipated prognosis, and educating clients about the benefits of lifestyle changes that may improve their rate of functional recovery.
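The analysis plan's Pearson correlation step (e.g., admission functional performance versus age) amounts to the product-moment formula, sketched below in plain Python. The data values are invented for illustration and are not from the HERON extract.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: admission FIM tends to fall with age,
# giving a strong negative correlation.
ages = [55, 62, 70, 78, 81]
fim_admission = [110, 102, 95, 80, 72]
print(round(pearson_r(ages, fim_admission), 2))
```

In practice one would also report the associated p-value; stepwise regression then builds on these pairwise associations to select predictors.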

Keywords: functional performance, predictors, stroke, recovery

Procedia PDF Downloads 130
48280 Retrospective Demographic Analysis of Patients Lost to Follow-Up from Antiretroviral Therapy in Mulanje Mission Hospital, Malawi

Authors: Silas Webb, Joseph Hartland

Abstract:

Background: Long-term retention of patients on ART has become a major health challenge in Sub-Saharan Africa (SSA). In 2010, a systematic review of 39 papers found that 30% of patients were no longer taking their ARTs two years after starting treatment. The same review noted a paucity of data on why patients become lost to follow-up (LTFU) in SSA. This project was performed at Mulanje Mission Hospital (MMH) in Malawi as part of Swindon Academy's Global Health eSSC. The HIV prevalence for Malawi is 10.3%, one of the highest rates in the world, and prevalence rises to 18% in Mulanje. It is therefore essential that patients at risk of being LTFU are identified early and managed appropriately to help them continue to engage with the service. Methodology: All patients on adult antiretroviral formulations at MMH who were classified as 'defaulters' (patients missing a scheduled follow-up visit by more than two months) over the previous 12 months were included in the study. Demographic variables were collected from Mastercards for data analysis. A comparison group of patients not lost to follow-up was created from all patients who attended the HIV clinic between 18 and 22 July 2016 and had never defaulted from ART. Data were analysed using the chi-squared (χ²) test, as the data collected were categorical, with alpha levels set at 0.05. Results: Overall, 136 patients had defaulted from ART over the previous 12 months at MMH. Of these, 43 had missing Mastercards, so 93 defaulter datasets were analysed, together with 93 comparison datasets. A higher proportion of men was noted in the defaulting group (p = 0.034), and defaulters tended to be younger (p = 0.052). Of the patients who defaulted, 94.6% were taking tenofovir, lamivudine and efavirenz, the standard first-line ART therapy in Malawi.
The mean length of time on ART was 39.0 months (RR: -22.4-100.4) in the defaulters group and 47.3 months (RR: -19.71-114.23) in the control group, a mean difference of 8.3 months less in the defaulters group (p = 0.056). Discussion: The findings in this study echo the literature; this study expands on it by showing that the demographic at most risk of defaulting and being LTFU is a young male who has missed more than 4 doses of ART and is within his first year of treatment. For the hospital, these data are important as they identify significant areas for public health focus. For instance, fear of disclosure and stigma may disproportionately affect younger men, so interventions can be aimed specifically at them to improve their health outcomes. The mean length of time on medication was 8.3 months less in the defaulters group (p = 0.056), emphasising the need for more intensive follow-up in the early stages of treatment, when patients are at the highest risk of defaulting.
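The demographic comparisons above rest on the 2×2 chi-squared test. A minimal version is sketched below, with the df = 1 p-value obtained via the complementary error function. The counts in the example are hypothetical, since the abstract reports only p-values, not the underlying tables.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared test for a 2x2 table [[a, b], [c, d]].
    Returns (statistic, p); for df = 1, p = erfc(sqrt(chi2 / 2))."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return stat, math.erfc(math.sqrt(stat / 2))

# Hypothetical counts: men vs women among 93 defaulters and 93 controls
stat, p = chi2_2x2(52, 41, 37, 56)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")
```

A perfectly balanced table gives a statistic of 0 and p = 1, which is a quick sanity check on the implementation.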

Keywords: anti-retroviral therapy, ART, HIV, lost to follow up, Malawi

Procedia PDF Downloads 169
48279 Prevalence, Median Time, and Associated Factors with the Likelihood of Initial Antidepressant Change: A Cross-Sectional Study

Authors: Nervana Elbakary, Sami Ouanes, Sadaf Riaz, Oraib Abdallah, Islam Mahran, Noriya Al-Khuzaei, Yassin Eltorki

Abstract:

Major Depressive Disorder (MDD) requires therapeutic intervention during the initial month after diagnosis for better disease outcomes. International guidelines recommend a duration of 4–12 weeks for an initial antidepressant (IAD) trial at an optimized dose to obtain a response. If depressive symptoms persist after this duration, guidelines recommend switching, augmenting, or combining strategies as the next step. Most patients with MDD in the mental health setting have been labeled incorrectly as treatment-resistant when in fact they have not received an adequate trial of guideline-recommended therapy. Premature discontinuation of the IAD due to ineffectiveness can have unfavorable consequences; avoiding irrational practices, such as subtherapeutic IAD doses, premature switching between antidepressants, and unjustified polypharmacy, can help the disease go into remission. We aimed to determine, as the primary outcome, the prevalence and patterns of the strategies applied after an IAD was changed because of a suboptimal response. Secondary outcomes included the median survival time on the IAD before any change and the predictors associated with IAD change. This was a retrospective cross-sectional study conducted in the Mental Health Services in Qatar. A dataset covering January 1, 2018, to December 31, 2019, was extracted from the electronic health records, and predefined inclusion and exclusion criteria were applied. The required sample size was calculated to be at least 379 patients. Descriptive statistics were reported as frequencies and percentages, as well as means and standard deviations. The median time from IAD initiation to any change strategy was calculated using survival analysis, and associated predictors were examined using unadjusted and adjusted Cox regression models. A total of 487 patients met the inclusion criteria of the study. The average age of the participants was 39.1 ± 12.3 years.
Patients experiencing a first MDD episode (255; 52%) constituted the larger part of the sample compared with the relapse group (206; 42%). In total, 431 patients (88%) had their IAD changed to another strategy before the end of the study. Almost half of the sample (212; 49%; 95% CI [44–53%]) had their IAD changed within 30 days. Switching was consistently more common than combination or augmentation at every time point. The median time to IAD change was 43 days (95% CI [33.2–52.7]). Five independent variables (age, bothersome side effects, non-optimization of the dose before any change, comorbid anxiety, and first-onset episode) were significantly associated with the likelihood of IAD change in the unadjusted analysis. The factors statistically associated with a higher hazard of IAD change in the adjusted analysis were younger age, non-optimization of the IAD dose before any change, and comorbid anxiety. Because almost half of the patients in this study had their IAD changed as early as within the first month, efforts to avoid premature treatment failure are needed to ensure patient-treatment targets are met. The findings of this study offer direct clinical guidance for health care professionals, since optimized, evidence-based use of antidepressant medication can improve the clinical outcomes of patients with MDD, and they help identify high-risk factors, such as young age and comorbid anxiety, that could shorten the survival time on the IAD.
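The median time-to-change reported above comes from survival analysis; with the Kaplan-Meier estimator, the median is the first time point at which the survival curve drops to 0.5 or below. The sketch below uses invented follow-up data, not the study's records.

```python
def km_median(times, events):
    """Kaplan-Meier median survival time.
    times: follow-up days; events: True if the IAD was changed at that
    time, False if the observation was censored."""
    survival = 1.0
    for t in sorted({t for t, e in zip(times, events) if e}):
        at_risk = sum(1 for tt in times if tt >= t)            # still on IAD at t
        changed = sum(1 for tt, e in zip(times, events) if tt == t and e)
        survival *= 1 - changed / at_risk                      # KM product step
        if survival <= 0.5:
            return t
    return None  # the curve never reaches 0.5

# Hypothetical follow-up data (days until IAD change; False = censored)
times = [10, 25, 25, 43, 60, 90, 120, 120]
events = [True, True, False, True, True, False, True, False]
print(km_median(times, events))
```

Cox regression then models how covariates such as age or comorbid anxiety shift the hazard underlying this curve; that step needs a statistics library and is omitted here.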

Keywords: initial antidepressant, dose optimization, major depressive disorder, comorbid anxiety, combination, augmentation, switching, premature discontinuation

Procedia PDF Downloads 130
48278 Beneficial Effect of Autologous Endometrial Stromal Cell Co-Culture on Day 3 Embryo Quality

Authors: I. Bochev, A. Shterev, S. Kyurkchiev

Abstract:

One of the factors associated with poor success rates in human in vitro fertilization (IVF) is the suboptimal culture conditions in which fertilization and early embryonic growth occur. Co-culture systems with helper cell lines appear to enhance the in vitro conditions and allow embryos to demonstrate improved in vitro development. Co-culture of human embryos with monolayers of autologous endometrial stromal cells (EnSCs) results in increased blastocyst development with a larger number of blastomeres, a lower incidence of fragmentation, and higher pregnancy rates in patients with recurrent implantation failure (RIF). The aim of this study was to examine the influence of autologous EnSC co-culture on day 3 embryo quality by comparing the morphological status of embryos from the same patients undergoing consecutive IVF/intracytoplasmic sperm injection (ICSI) cycles without and with EnSC co-culture. This retrospective randomized study (2015-2017) included 20 couples and a total of 46 IVF/ICSI cycles; each couple had at least two IVF/ICSI procedures, one with and one without autologous EnSC co-culture. Embryo quality was assessed at 68 ± 1 hours in culture, according to the Istanbul consensus criteria (2010), and day 3 embryos were classified into three groups: good (grade 1), fair (grade 2), and poor (grade 3). First, embryos from all cycles were divided into two groups (A, co-cultivated; B, not co-cultivated) and analyzed; second, for each couple, embryos from matched IVF/ICSI cycles (with and without co-culture) were analyzed separately. In the analysis of co-cultivated day 3 embryos from all cycles (n = 137; group A), 43.1% of the embryos were graded as 'good', which was not significantly different from the rate of 42.2% (p = NS) in group B (n = 147) with non-co-cultivated embryos.
The proportions of fair- and poor-quality embryos in groups A and B were similar as well: 11.7% vs 10.2% and 45.2% vs 47.6% (p = NS), respectively. Nevertheless, the separate analysis of matched cycles for each couple revealed that in 65% of cases the proportion of morphologically better embryos was higher in cycles with co-culture than in those without. A decrease in this proportion after EnSC co-cultivation was found in 30% of cases, and no difference was observed in only one couple. The results demonstrate that there is no marked difference in overall morphological quality between co-cultured and non-co-cultured day 3 embryos. However, in a significantly greater percentage of couples, autologous EnSC co-culture increased the proportion of morphologically improved day 3 embryos. By mimicking the in vivo relationship between the embryo and the maternal environment, co-culture in an autologous EnSC system represents a promising approach to improving embryo quality in cases with an elevated risk of developing embryos with impaired morphology.

Keywords: autologous endometrial stromal cells, co-culture, day 3 embryo, morphological quality

Procedia PDF Downloads 208
48277 Applying the Global Trigger Tool in German Hospitals: A Retrospective Study in Surgery and Neurosurgery

Authors: Mareen Brosterhaus, Antje Hammer, Steffen Kalina, Stefan Grau, Anjali A. Roeth, Hany Ashmawy, Thomas Gross, Marcel Binnebosel, Wolfram T. Knoefel, Tanja Manser

Abstract:

Background: The identification of critical incidents in hospitals is an essential component of improving patient safety. To date, various methods have been used to measure and characterize such incidents. These methods are often viewed by physicians and nurses as external quality assurance, which creates obstacles to the reporting of events and the implementation of recommendations in practice. One way to overcome this problem is to use tools that directly involve staff in measuring indicators of the quality and safety of care in their department. One such instrument is the Global Trigger Tool (GTT), which helps physicians and nurses identify adverse events by systematically reviewing randomly selected patient records. Based on so-called 'triggers' (warning signals), indications of adverse events can be identified. While the tool is already used internationally, its implementation in German hospitals has been very limited. Objectives: This study aimed to assess the feasibility and potential of the Global Trigger Tool for identifying adverse events in German hospitals. Methods: A total of 120 patient records were randomly selected from two surgical departments and one neurosurgery department of three university hospitals in Germany, over a period of two months per department, between January and July 2017. The records were reviewed using an adaptation of the German version of the Institute for Healthcare Improvement Global Trigger Tool to identify triggers and adverse event rates per 1000 patient-days and per 100 admissions. The severity of adverse events was classified using the classification of the National Coordinating Council for Medication Error Reporting and Prevention. Results: A total of 53 adverse events were detected in the three departments, corresponding to adverse event rates of 25.5 to 72.1 per 1000 patient-days and 25.0 to 60.0 per 100 admissions across the three departments.
Of the identified adverse events, 98.1% were associated with non-permanent harm, either without (Category E, 71.7%) or with (Category F, 26.4%) the need for prolonged hospitalization. One adverse event (1.9%) was associated with potentially permanent harm to the patient. We also identified practical challenges in the implementation of the tool, such as the need to adapt the global trigger tool to the respective department. Conclusions: The global trigger tool is a feasible and effective instrument for quality measurement when adapted to departmental specifics. Based on our experience, we recommend continuous use of the tool, thereby directly involving clinicians in quality improvement.
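The two rates reported above follow directly from the event counts and the two denominators. A minimal sketch of that arithmetic, using invented counts rather than the study's per-department data:

```python
def adverse_event_rates(events, patient_days, admissions):
    """Return (rate per 1000 patient-days, rate per 100 admissions)."""
    per_1000_patient_days = events / patient_days * 1000
    per_100_admissions = events / admissions * 100
    return per_1000_patient_days, per_100_admissions

# Illustrative numbers only: 18 events over 600 patient-days and 40 admissions.
rate_pd, rate_adm = adverse_event_rates(18, 600, 40)
print(rate_pd, rate_adm)  # 30.0 45.0
```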

Keywords: adverse events, global trigger tool, patient safety, record review

Procedia PDF Downloads 232
48276 Creating and Questioning Research-Oriented Digital Outputs to Manuscript Metadata: A Case-Based Methodological Investigation

Authors: Diandra Cristache

Abstract:

The transition of traditional manuscript studies into the digital framework closely affects the methodological premises upon which manuscript descriptions are modeled, created, and questioned for the purposes of research. This paper explores the issue through a methodological investigation of the process of modeling, creating, and questioning manuscript metadata. The investigation is founded on a close observation of the Polonsky Greek Manuscripts Project, a collaboration between the Universities of Cambridge and Heidelberg. More than just providing realistic ground for methodological exploration, along with a complete metadata set for computational demonstration, the case study also contributes to a broader purpose: outlining general methodological principles for making the most of manuscript metadata by means of research-oriented digital outputs. The analysis focuses mainly on the scholarly approach to manuscript descriptions in the specific instance where the act of metadata recording does not have a programmatic research purpose. Close attention is paid to the encounter of 'traditional' practices in manuscript studies with the formal constraints of the digital framework: does the shift in practices (especially from the straight narrative of free writing towards the hierarchical constraints of the TEI encoding model) impact the structure of metadata and its capability to respond to specific research questions? It is argued that the flexible structure of TEI, combined with traditional approaches to manuscript description, leads to a proliferation of markup: does an 'encyclopedic' descriptive approach ensure the epistemological relevance of the digital outputs to metadata? To provide further insight into the computational approach to manuscript metadata, the metadata of the Polonsky project are processed with techniques of distant reading and data networking, resulting in a new group of digital outputs (relational graphs, geographic maps).
The computational process and the digital outputs are thoroughly illustrated and discussed. Finally, a retrospective analysis evaluates how the digital outputs respond to the scientific expectations of research and, conversely, how the requirements of research questions feed back into the creation and enrichment of metadata in an iterative loop.
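Querying TEI-encoded manuscript descriptions computationally, as described above, can be sketched with standard-library XML tooling. The tiny msDesc fragment and the two fields chosen below are hypothetical, not drawn from the Polonsky project's actual encoding:

```python
import xml.etree.ElementTree as ET

TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}

# A minimal, invented TEI fragment standing in for a real manuscript record.
sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <msDesc>
    <msIdentifier><settlement>Cambridge</settlement></msIdentifier>
    <origDate>s. XII</origDate>
  </msDesc>
</TEI>"""

def extract_fields(xml_text):
    """Pull a couple of queryable fields out of a TEI manuscript description."""
    root = ET.fromstring(xml_text)
    settlement = root.find(".//tei:settlement", TEI_NS)
    orig_date = root.find(".//tei:origDate", TEI_NS)
    return {
        "settlement": settlement.text if settlement is not None else None,
        "date": orig_date.text if orig_date is not None else None,
    }

print(extract_fields(sample))  # {'settlement': 'Cambridge', 'date': 's. XII'}
```

Records extracted this way can then feed the relational graphs and geographic maps mentioned in the abstract.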

Keywords: digital manuscript studies, digital outputs to manuscript metadata, metadata interoperability, methodological issues

Procedia PDF Downloads 122
48275 Clinical and Molecular Characterization of Ichthyosis at King Abdulaziz Medical City, Riyadh KSA

Authors: Reema K. AlEssa, Sahar Alshomer, Abdullah Alfaleh, Sultan ALkhenaizan, Mohammed Albalwi

Abstract:

Ichthyosis is a disorder of abnormal keratinization, characterized by excessive scaling, and comprises more than twenty subtypes that vary in severity, mode of inheritance, and the genes involved. There is insufficient data in the literature about the epidemiology and characteristics of ichthyosis locally. Our aim is to identify the histopathological features and genetic profile of ichthyosis. Method: This is an observational retrospective case series study, conducted in March 2020, that included all patients diagnosed with ichthyosis, confirmed by histological and molecular findings, over the last 20 years at King Abdulaziz Medical City (KAMC), Riyadh, Saudi Arabia. Molecular analysis was performed by testing genomic DNA and checking genetic variations using the AmpliSeq panel. All disease-causing variants were checked against the HGMD, ClinVar, Genome Aggregation Database (gnomAD), and Exome Aggregation Consortium (ExAC) databases. Result: A total of 60 cases of ichthyosis were identified, with a mean age of 13 ± 9.2 years. There was an almost equal distribution between female patients, 29 (48%), and males, 31 (52%). The majority of them, 94%, were Saudis. More than half of the patients presented with generalized scaling, 33 (55%), followed by dryness and coarse skin, 19 (31.6%), and hyperlinearity, 5 (8.33%). Family history and history of consanguinity were seen in 26 (43.3%) and 13 (22%) cases, respectively. A history of collodion baby was found in 6 (10%) cases of ichthyosis. The genes identified were ALOX12B, ALOXE3, CERS3, CYP4F22, DOLK, FLG2, GJB2, PNPLA1, SLC27A4, SPINK5, STS, SUMF1, TGM1, TGM5, and VPS33B. The most frequent variations were detected in CYP4F22, in 16 cases (26.6%), followed by ALOXE3, 6 (10%), and STS, 6 (10%), then TGM1, 5 (8.3%), and ALOX12B, 5 (8.3%). Molecular genetic analysis identified 23 different genetic variations in the genes of ichthyosis, of which 13 were novel mutations.
Homozygous mutations were detected in the majority of ichthyosis cases, 54 (90%), and only 1 case was heterozygous. A few cases, 4 (6.6%), had an unknown type of ichthyosis with a negative genetic result. Conclusion: Thirteen novel mutations were discovered, and about half of the ichthyosis patients had a positive history of consanguinity.

Keywords: ichthyosis, genetic profile, molecular characterization, congenital ichthyosis

Procedia PDF Downloads 179
48274 Evaluation of Prehabilitation Prior to Surgery for an Orthopaedic Pathway

Authors: Stephen McCarthy, Joanne Gray, Esther Carr, Gerard Danjoux, Paul Baker, Rhiannon Hackett

Abstract:

Background: The Go Well Health (GWH) platform is a web-based programme that gives patients access to personalised care plans and resources aimed at prehabilitation prior to surgery. The online digital platform (ODP) delivers essential patient education and support (PES) to patients prior to total hip replacements (THR) and total knee replacements (TKR). This study evaluated the impact of the ODP on functional health outcomes, health-related quality of life, and hospital length of stay (LOS) following surgery. Methods: A retrospective cohort study compared a cohort of patients who used the ODP to receive patient education and support prior to THR and TKR surgery with a cohort of patients who did not access the ODP and received usual care. Routinely collected Patient Reported Outcome Measures (PROMs) data were obtained on 2,406 patients who underwent a knee replacement (n=1,160) or a hip replacement (n=1,246) between 2018 and 2019 in a single surgical centre in the United Kingdom. The Oxford Hip and Knee Scores and the European Quality of Life Five-Dimensional tool (EQ-5D-5L) were obtained both pre- and post-surgery (at 6 months), along with hospital LOS. Linear regression was used to estimate the impact of GWH on both health outcomes, and negative binomial regression was used to estimate its impact on LOS. All analyses adjusted for age, sex, Charlson Comorbidity Score, and either pre-operative Oxford Hip/Knee scores or pre-operative EQ-5D scores. Fractional polynomials were used to represent potential non-linear relationships between the factors included in the regression model. Findings: For patients who underwent a knee replacement, GWH had a statistically significant impact on Oxford Knee Scores and EQ-5D-5L utility post-surgery (p=0.039 and p=0.002, respectively). GWH did not have a statistically significant impact on hospital length of stay.
For patients who underwent a hip replacement, GWH had a statistically significant impact on Oxford Hip Scores and EQ-5D-5L utility post-surgery (p<0.001 and p=0.009, respectively). GWH was also associated with a statistically significant reduction in hospital length of stay (p<0.001). Conclusion: Health outcomes were better for patients who used the GWH platform before undergoing THR and TKR than for those who received usual care prior to surgery. Patients who underwent a hip replacement and used GWH also had a reduced hospital LOS. These findings are important for health policy and decision makers, as they suggest that prehabilitation via an ODP can maximise health outcomes for patients following surgery while potentially making efficiency savings through reductions in LOS.

Keywords: digital prehabilitation, online digital platform, orthopaedics, surgery

Procedia PDF Downloads 175
48273 Analysis of the Evolution of Techniques and Review in Cleft Surgery

Authors: Tomaz Oliveira, Rui Medeiros, André Lacerda

Abstract:

Introduction: Cleft lip and/or palate are the most frequent congenital craniofacial anomalies, affecting mainly the middle third of the face and manifesting through functional and aesthetic changes. Bilateral cleft lip represents a reconstructive surgical challenge, not only for the labial component but also for the associated nasal deformation. Recently, the paradigm of the approach to this pathology has changed, placing the focus on muscle reconstruction and anatomical repositioning of the nasal cartilages in order to obtain the best aesthetic and functional results. The aim of this study is to carry out a systematic review of the surgical approach to bilateral cleft lip, retrospectively analysing the case series of the Plastic Surgery Service at Hospital Santa Maria (Lisbon, Portugal) regarding this pathology, with a global assessment of the characteristics of the operated patients and a study of the different surgical approaches and their complications over the last 20 years. Methods: This is a retrospective, descriptive study of patients who underwent at least one reconstructive surgery for cleft lip and/or palate in the service between January 1, 1997 and December 31, 2017. Data relating to 361 individuals were analyzed; after applying the exclusion criteria, these constituted a sample of 212 participants. The variables analyzed were the year of the first surgery, gender, age, type of orofacial cleft, surgical approach, and its complications. Results: There was a higher overall prevalence in males: cleft lip and cleft lip with palate occurred in greater proportion in males, while isolated cleft palate was more common in females. The most frequently recorded malformation was cleft lip and palate, complete in most cases. Regarding laterality, alterations with a unilateral labial component were the most commonly observed, with the left side described as the most affected.
It was found that the vast majority of patients underwent primary intervention by 12 months of age. The surgical techniques used in the approach to this pathology showed an important chronological variation over the years. Discussion: Cleft lip and/or palate is a medical condition associated with high aesthetic and functional morbidity, which requires early treatment in order to optimize the long-term outcome. The existence of a nasolabial component, and its surgical correction, plays a central role in the treatment of this pathology. High rates of post-surgical complications and unconvincing aesthetic results have motivated an evolution of the surgical technique, increasingly evident in recent years, which today makes it possible to achieve satisfactory aesthetic results even in bilateral cleft lips of high deformational complexity. The introduction of techniques that favor nasolabial reconstruction based on anatomical principles has been producing increasingly convincing results. The analyzed sample shows that most of the results obtained in this study are, in general, compatible with those published in the literature. Conclusion: This work showed that small variations in surgical technique can bring significant improvements in the functional and aesthetic results of the treatment of bilateral cleft lip.

Keywords: cleft lip, cleft palate, congenital abnormalities, craniofacial malformations

Procedia PDF Downloads 94
48272 An Assessment of Inferior Dental Nerve (IDN) and Lingual Nerve (LN) Injuries Following Third Molar Removal Under LA, IVS, and GA - An Audit and Case-Series

Authors: Aamna Tufail, Catherine Anyanwu

Abstract:

Introduction/Aims: Neurosensory deficits following third molar removal markedly affect quality of life. The purpose of this audit was to evaluate the incidence of IDN and LN damage and to compare departmental rates to an established standard. A secondary objective was to provide a descriptive summary of the identified cases for clinical learning. Materials and Methods: A retrospective audit was conducted by telephone survey of 101 patients who had third molar extractions performed under LA, IVS, or GA from January 2019 to June 2020 at a District General Hospital. The results were compared to a clinical standard identified as Cheng et al. Data collection included mode of surgery, mode of anaesthesia, grade of clinician, assessment of difficulty, and severity and duration of symptoms. Results/Statistics: A total of 101 patients had 136 third molars extracted. The age range was 18-84 years. 44% of extractions were under LA, 52% under GA, and 4% under IV sedation. 30% were simple extractions, 68% were surgical removals, and 2% were unspecified. 89% of extractions were performed by an Associate Specialist, 5% by a consultant, and 6% by an unspecified grade of clinician. The rate of IDN injuries was 2.9% (n=4), higher than the standard (0.3%). The rate of LN injuries was 0.7% (n=1), the same as the standard (0.7%). The 5 cases of neurosensory deficits are discussed in detail. Conclusions/Clinical Relevance: The rate of IDN injuries was higher than the standard; the rate of LN complications was comparable to the standard.

Keywords: inferior dental nerve, lingual nerve, nerve injuries, third molars

Procedia PDF Downloads 77
48271 Effects of Parental Socio-Economic Status and Individuals' Educational Achievement on Their Socio-Economic Status: A Study of South Korea

Authors: Eun-Jeong Jang

Abstract:

Inequality has been considered a core issue in public policy. Korea is categorized as a country with a high level of inequality, which matters not only to the current generation but also to future ones. The relationship between individuals' origin and destination has implications for intergenerational inequality. To our knowledge, previous work on this topic was mostly conducted at the macro level using panel data. However, at this level, there is no room to track what happened during the time between origin and destination. Individuals' origin is represented by their parents' socio-economic status, and, in the same way, destination is translated into their own socio-economic status. The first research question is how origin is related to destination. Certainly, destination is highly affected by origin; in this view, people's destination is already set to be more or less a reproduction of the previous generation. However, educational achievement is widely believed to be a factor independent of origin, so educational attainment offers a possibility of changing the path given by one's parents. Hence, the second research question is how education is related to destination, and which factor, origin or education, is more influential. The third research question concerns the mediation of education between origin and destination. Socio-economic status in this study refers to class, as a sociological term, as well as to wealth, including labor and capital income, as an economic term. The combination of class and wealth is expected to give a more accurate picture of the hierarchy in a society: in some non-manual and professional occupations, even though workers are categorized into a relatively high class, their income is much lower than that of others in the same class. Moreover, it is one way to overcome the limitation of the retrospective view during the survey.
Education is measured in absolute terms, as years of schooling, and in relative terms, as the rank of the school attended. Moreover, all respondents were asked about their effort, scaled by time intensity and self-motivation before and during college, based on a standard questionnaire provided by an academic achievement model. This research is based on a survey at the individual level. The sampling target is individuals who have a job, regardless of gender, including income-earners and self-employed people, aged in their thirties and forties, because this age group is considered to have reached the stage of job stability. In most cases, the researcher met respondents in person, visiting their workplace or home, and had a chance to interview some of them. One hundred and forty individual responses were collected from May to August 2017. The data will be analyzed by multiple regression (Q1, Q2) and structural equation modeling (Q3).

Keywords: class, destination, educational achievement, effort, income, origin, socio-economic status, South Korea

Procedia PDF Downloads 251
48270 A Joinpoint Regression Analysis of Trends in Tuberculosis Notifications in Two Urban Regions in Namibia

Authors: Anna M. N. Shifotoka, Richard Walker, Katie Haighton, Richard McNally

Abstract:

An analysis of trends in Case Notification Rates (CNR) can be used to monitor the impact of tuberculosis (TB) control interventions over time, in order to inform the implementation of current and future TB interventions. A retrospective analysis of trends in TB CNR was conducted for two urban regions in Namibia, namely the Khomas and Erongo regions. TB case notification data were obtained from the annual TB reports of the national TB programme, Ministry of Health and Social Services, covering the period from 1997 to 2015. Joinpoint regression was used to analyse trends in CNR for different types of TB groups. A trend was considered statistically significant when the p-value was less than 0.05. During the period under review, the crude CNR for all forms of TB declined from 808 to 400 per 100,000 population in Khomas, and from 1,051 to 611 per 100,000 population in Erongo. In both regions, significant change points in trends were observed for all types of TB groups examined. In the Khomas region, the trend for new smear-positive pulmonary TB increased significantly at an annual rate of 4.1% (95% Confidence Interval (CI): 0.3% to 8.2%) during the period 1997 to 2004, and thereafter declined significantly by -6.2% (95% CI: -7.7% to -4.3%) per year until 2015. Similarly, the trend for smear-negative pulmonary TB increased significantly by 23.7% (95% CI: 9.7% to 39.5%) per year from 1997 to 2004 and thereafter declined significantly at an annual rate of -26.4% (95% CI: -33.1% to -19.8%). The trend for all forms of TB CNR in the Khomas region increased significantly by 8.1% (95% CI: 3.7% to 12.7%) per year from 1997 to 2004 and thereafter declined significantly at a rate of -8.7% (95% CI: -10.6% to -6.8%). In the Erongo region, the trend for smear-positive pulmonary TB increased at a rate of 1.2% (95% CI: -1.2% to 3.6%) annually during the earlier years (1997 to 2008), and thereafter declined significantly by -9.3% (95% CI: -13.3% to -5.0%) per year from 2008 to 2015.
Also in Erongo, the trend for all forms of TB CNR increased significantly at an annual rate of 4.0% (95% CI: 1.4% to 6.6%) between 1997 and 2006 and thereafter declined significantly by -10.4% (95% CI: -12.7% to -8.0%) per year during 2006 to 2015. The trend for extra-pulmonary TB CNR declined but did not reach statistical significance in either region. In conclusion, CNRs declined for all types of TB examined in both regions. Further research is needed to study trends in other TB dimensions, such as treatment outcomes and the notification of drug-resistant TB cases.
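Joinpoint regression reports each segment's trend as an annual percent change (APC), obtained by regressing the natural log of the rate on calendar year and transforming the slope. A minimal sketch of that calculation for a single segment, with invented rates rather than the Namibian data:

```python
import math

def annual_percent_change(years, rates):
    """Estimate the APC for one joinpoint segment by regressing
    ln(rate) on calendar year (ordinary least squares slope),
    then converting the slope: APC = (exp(slope) - 1) * 100."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs)) / \
            sum((x - mean_x) ** 2 for x in years)
    return (math.exp(slope) - 1) * 100

# Illustrative segment: a rate growing exactly 5% per year gives APC = 5.
years = [1997, 1998, 1999, 2000]
rates = [100 * 1.05 ** i for i in range(4)]
print(round(annual_percent_change(years, rates), 2))  # 5.0
```

Joinpoint software additionally searches for the change points between segments; the sketch above covers only the per-segment APC.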

Keywords: epidemiology, Namibia, temporal trends, tuberculosis

Procedia PDF Downloads 129
48269 HRCT of the Chest and the Role of Artificial Intelligence in the Evaluation of Patients with COVID-19

Authors: Parisa Mansour

Abstract:

Introduction: Early diagnosis of coronavirus disease (COVID-19) is extremely important in order to isolate and treat patients in time, thus preventing the spread of the disease, improving prognosis, and reducing mortality. High-resolution computed tomography (HRCT) chest imaging and artificial intelligence (AI)-based analysis of HRCT chest images can play a central role in the management of patients with COVID-19. Objective: To investigate the chest HRCT findings at different stages of COVID-19 pneumonia and to evaluate the potential role of artificial intelligence in the quantitative assessment of lung parenchymal involvement. Materials and Methods: This retrospective observational study was conducted between May 1, 2020 and August 13, 2020. The study included 2,169 patients with COVID-19 who underwent chest HRCT. HRCT images showed the presence and distribution of lesions such as ground-glass opacity (GGO) and consolidation, and any special patterns such as septal thickening and the reversed halo sign. Chest HRCT findings were assessed at different stages of the disease (early: <5 days, intermediate: 6-10 days, and late stage: >10 days). A CT severity score (CTSS) was calculated based on the extent of lung involvement on HRCT, which was then correlated with clinical disease severity. An artificial intelligence 'CT Pneumonia Analysis' algorithm was used to quantify the extent of pulmonary involvement by calculating the percentage of opacity (PO) and the percentage of high opacity (PHO). Depending on the type of variables, statistical tests such as chi-square, analysis of variance (ANOVA), and post hoc tests were applied when appropriate. Results: Radiological findings were observed on chest HRCT in 1,438 patients. A typical pattern of COVID-19 pneumonia, i.e., bilateral peripheral GGO with or without consolidation, was observed in 846 patients. About 294 asymptomatic patients were radiologically positive.
Chest HRCT in the early stage of the disease mostly showed GGO. The late stage was indicated by features such as reticulation, septal thickening, and the presence of fibrous bands. Approximately 91.3% of cases with a CTSS ≤ 7 were asymptomatic or clinically mild, while 81.2% of cases with a score ≥ 15 were clinically severe. Mean PO and PHO (30.1 ± 28.0 and 8.4 ± 10.4, respectively) were significantly higher in the clinically severe categories. Conclusion: Because COVID-19 pneumonia progresses rapidly, radiologists and physicians should become familiar with the typical chest CT findings in order to treat patients early, ultimately improving prognosis and reducing mortality. Artificial intelligence can be a valuable tool in the management of patients with COVID-19.

Keywords: chest, HRCT, COVID-19, artificial intelligence, chest HRCT

Procedia PDF Downloads 49
48268 Application and Utility of the RALE Score for Assessment of Clinical Severity in COVID-19 Patients

Authors: Naridchaya Aberdour, Joanna Kao, Anne Miller, Timothy Shore, Richard Maher, Zhixin Liu

Abstract:

Background: COVID-19 has been, and continues to be, a strain on healthcare globally, with the number of patients requiring hospitalization exceeding the level of medical support available in many countries. As chest x-rays are the primary respiratory radiological investigation, the Radiographic Assessment of Lung Edema (RALE) score was used to quantify the extent of pulmonary infection on baseline imaging. The reproducibility of the RALE score and its associations with clinical outcome parameters were then evaluated to determine implications for patient management and prognosis. Methods: A retrospective study was performed including patients who tested positive for COVID-19 on nasopharyngeal swab within a single Local Health District in Sydney, Australia, with baseline x-ray imaging acquired between January and June 2020. Two independent radiologists viewed the studies and calculated the RALE scores. Clinical outcome parameters were collected, and statistical analysis was performed to assess the reproducibility of the RALE score and its possible associations with clinical outcomes. Results: A total of 78 patients met the inclusion criteria, with an age range of 4 to 91 years. RALE score concordance between the two independent radiologists was excellent (intraclass correlation coefficient = 0.93, 95% CI = 0.88-0.95, p<0.005). Binomial logistic regression identified a positive correlation with hospital admission (OR 1.87, 95% CI = 1.3-2.6, p<0.005), oxygen requirement (OR 1.48, 95% CI = 1.2-1.8, p<0.005), and invasive ventilation (OR 1.2, 95% CI = 1.0-1.3, p<0.005) for each 1-point increase in RALE score. For each one-year increase in age, there was a negative correlation with recovery (OR 0.95, 95% CI = 0.92-1.0, p<0.01). RALE scores above three were positively associated with hospitalization (Youden index 0.61, sensitivity 0.73, specificity 0.89), and scores above six were positively associated with ICU admission (Youden index 0.67, sensitivity 0.91, specificity 0.78).
Conclusion: The RALE score can be used as a surrogate to quantify the extent of COVID-19 infection and has excellent inter-observer agreement. The RALE score could be used for prognostication and to identify patients at high risk of deterioration. Threshold values may also be applied to predict the likelihood of hospital and ICU admission.
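The threshold selection above relies on Youden's J statistic, J = sensitivity + specificity - 1, maximised over candidate cut-offs. A minimal sketch with invented operating points (the study's own values come from its ROC analysis, and its published J values are rounded from the underlying data):

```python
def youden_index(sensitivity, specificity):
    """Youden's J statistic for a single threshold."""
    return sensitivity + specificity - 1

def best_threshold(candidates):
    """Pick the cut-off with the highest J from (threshold, sens, spec) triples."""
    return max(candidates, key=lambda t: youden_index(t[1], t[2]))

# Invented operating points for illustration only.
points = [(2, 0.95, 0.40), (3, 0.73, 0.89), (5, 0.50, 0.95)]
thr, sens, spec = best_threshold(points)
print(thr, round(youden_index(sens, spec), 2))  # 3 0.62
```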

Keywords: chest radiography, coronavirus, COVID-19, RALE score

Procedia PDF Downloads 162
48267 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach

Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz

Abstract:

Births occurring before the 37th week of gestation are considered preterm births. A threat of preterm labour is defined as the beginning of regular uterine contractions, dilation, and cervical effacement between 23 and 36 gestational weeks. To the authors' best knowledge, the factors that determine the onset of birth are not yet completely defined. In particular, the effect of sleep habits on preterm births is weakly studied. The aim of this study is to develop a model to predict preterm delivery based on potential risk factors, including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal, and sleep habits). The study was approved by the Research Ethics Committee of the Principado de Asturias (Spain). An observational, retrospective, and descriptive study was performed on 481 births between January 1, 2015 and May 10, 2016 in the University Central Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term deliveries: chi-square tests were applied for qualitative variables and t-tests for quantitative variables. Statistically significant differences (p < 0.05) between preterm and term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes, and interruptions during the night. In addition to the statistical analysis, machine learning methods were tested in search of a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest, and tree bag models were analysed using the caret R package.
Ten-fold cross-validation and parameter tuning were applied to optimize the methods. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with accuracy 0.91, sensitivity 0.93, specificity 0.89, and precision 0.91. Some well-known preterm birth factors were identified: cervical dilation, maternal BMI, premature rupture of membranes, and nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Monday to Friday, or a change of sleeping habits reflected in the number of hours slept, the depth of sleep, or the lighting of the room. 'IF dilation <= 2.95 AND usage of electronic devices before sleeping from Monday to Friday = YES AND change of sleeping habits = YES, THEN preterm' is one of the predictive rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques; the method maximizing performance is the one selected. The model shows the influence of variables related to sleep habits on preterm prediction.
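The quoted C5.0 rule can be read as a simple predicate. A literal transcription of that single rule (variable names are paraphrased, and this is only the one rule quoted in the abstract, not the full tree):

```python
def preterm_rule(dilation_cm, devices_before_sleep_weekdays, changed_sleep_habits):
    """One C5.0 rule from the abstract: IF dilation <= 2.95 AND
    electronic-device use before sleeping on weekdays AND a change
    in sleeping habits, THEN predict preterm."""
    return (dilation_cm <= 2.95
            and devices_before_sleep_weekdays
            and changed_sleep_habits)

print(preterm_rule(2.0, True, True))   # True
print(preterm_rule(3.5, True, True))   # False
```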

Keywords: machine learning, noise reduction, preterm birth, sleep habit

Procedia PDF Downloads 127
48266 Risk Assessment Tools Applied to Deep Vein Thrombosis Patients Treated with Warfarin

Authors: Kylie Mueller, Nijole Bernaitis, Shailendra Anoopkumar-Dukie

Abstract:

Background: Vitamin K antagonists, particularly warfarin, are the most frequently used oral medications for deep vein thrombosis (DVT) treatment and prophylaxis. Time in therapeutic range (TITR) of the international normalised ratio (INR) is widely accepted as a measure of the quality of warfarin therapy. Multiple factors can affect warfarin control and the subsequent adverse outcomes, including thromboembolic and bleeding events. Predictive models have been developed to assess potential contributing factors and measure an individual's risk of these adverse events. These models have been validated in atrial fibrillation (AF) patients; however, there is a lack of literature on whether they can be successfully applied to other warfarin users, including DVT patients. Therefore, the aim of the study was to assess the ability of these risk models (HAS-BLED and CHADS2) to predict haemorrhagic and ischaemic incidences in DVT patients treated with warfarin. Methods: A retrospective analysis of DVT patients receiving warfarin management by a private pathology clinic was conducted. Data were collected from November 2007 to September 2014 and included demographics, medical and drug history, INR targets, and test results. Patients receiving continuous warfarin therapy with an INR reference range between 2.0 and 3.0 were included in the study, with mean TITR calculated using the Rosendaal method. Bleeding and thromboembolic events were recorded and reported as incidences per patient. The haemorrhagic risk model HAS-BLED and the ischaemic risk model CHADS2 were applied to the data, and patients were stratified into low, moderate, or high-risk categories. The analysis was conducted to determine whether a correlation existed between the risk assessment tools and patient outcomes. Data were analysed using GraphPad InStat Version 3, with a p-value of <0.05 considered statistically significant.
Patient characteristics are reported as mean and standard deviation for continuous data and as number and percentage for categorical data. Results: Of the 533 patients included in the study, there were 268 (50.2%) female and 265 (49.8%) male patients, with a mean age of 62.5 years (±16.4). The overall mean TITR was 78.3% (±12.7), with an overall haemorrhagic incidence of 0.41 events per patient. For the HAS-BLED model, there was a haemorrhagic incidence of 0.08, 0.53, and 0.54 events per patient in the low, moderate, and high-risk categories, respectively, showing a statistically significant increase in incidence with increasing risk category. The CHADS2 model showed an increase in ischaemic events with risk category, with no ischaemic events in the low category, an ischaemic incidence of 0.03 in the moderate category, and 0.47 in the high-risk category. Conclusion: An increasing haemorrhagic incidence correlated with an increasing HAS-BLED risk score in DVT patients treated with warfarin. Furthermore, a greater incidence of ischaemic events occurred in patients in higher CHADS2 categories. In an Australian population of DVT patients, HAS-BLED and CHADS2 accurately predict incidences of haemorrhagic and ischaemic events, respectively.
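The Rosendaal method mentioned above linearly interpolates the INR between successive tests and counts the interpolated person-time spent inside the target range. A minimal sketch of that calculation (the INR series is invented, not patient data):

```python
def rosendaal_titr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range (%) by linear interpolation between INR tests.

    days: test dates as day numbers; inrs: matching INR values.
    Each between-test day is assigned an interpolated INR; TITR is the
    percentage of those days whose interpolated INR lies in [low, high].
    """
    in_range = total = 0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        for step in range(span):
            inr = i0 + (i1 - i0) * step / span
            total += 1
            if low <= inr <= high:
                in_range += 1
    return 100 * in_range / total

# Invented series: INR drifts from 1.5 into range and back out again.
print(round(rosendaal_titr([0, 10, 20], [1.5, 2.5, 3.5]), 1))  # 55.0
```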

Keywords: anticoagulant agent, deep vein thrombosis, risk assessment, warfarin

Procedia PDF Downloads 252
48265 Promoting Couple HIV Testing among Migrants for HIV Prevention: Learnings from Integrated Counselling and Testing Centre (ICTC) in Odisha, India

Authors: Sunil Mekale, Debasish Chowdhury, Sanchita Patnaik, Amitav Das, Ashok Agarwal

Abstract:

Background: Odisha is a low HIV prevalence state in India (ANC HIV positivity of 0.42% as per HIV sentinel surveillance 2010-2011); however, it is an important source-migration state, with 3.2% of male migrants reporting to be PLHIV. The USAID-Public Health Foundation of India PIPPSE project is piloting a source-destination corridor programme between Odisha and Gujarat. In Odisha, the focus has been on developing a comprehensive strategy to reach out-migrants and their spouses in their place of origin, based on their availability. The project has made concerted attempts to identify vulnerable districts with high out-migration and high positivity rates. Description: 48 out of 97 ICTCs were selected from the nine districts with the highest out-migration through multistage sampling. A retrospective descriptive analysis of HIV-positive male migrants and their spouses over two years (April 2013-March 2015) was conducted. A total of 3,645 HIV-positive records were analysed. Findings: Among the 34.2% detected HIV positive in the ICTCs, 23.3% were male migrants and 11% were spouses of male migrants, together almost 50% of total ICTC attendees. More than 70% of the PLHIV male migrants and their spouses were less than 45 years old. Conclusions: A couple HIV testing approach may be considered for male migrants and their spouses. ICTC data analysis could guide the identification of locations with high HIV positivity among male migrants and their spouses.

Keywords: HIV testing, migrants, spouse of migrants, Integrated Counselling and Testing Centre (ICTC)

Procedia PDF Downloads 361
48264 Medical Authorizations for Cannabis-Based Products in Canada: Sante Cannabis Data on Patient’s Safety and Treatment Profiles

Authors: Rihab Gamaoun, Cynthia El Hage, Laura Ruiz, Erin Prosk, Antonio Vigano

Abstract:

Introduction: Santé Cannabis (SC), a Canadian group of clinics specializing in medical cannabis, based in Montreal and the province of Québec, has served more than 5,000 patients seeking prescription of cannabis-based treatment for medical indications over the past five years. Within a research framework, data on the use of medical cannabis products from all the above patients were prospectively collected, yielding a large real-world database on the use of medical cannabis. The aim of this study was to gather information on the profiles of both patients and prescribed medical cannabis products at SC clinics and to assess the safety of medical cannabis among Canadian patients. Methods: Using a retrospective analysis of the database, records of 2,585 patients who were prescribed medical cannabis products for therapeutic purposes between 01 November 2017 and 04 September 2019 were included. Patients' demographics, primary diagnosis, route of administration, and chemovars recorded at the initial visits were investigated. Results: At baseline, 9% of SC patients were female, with a mean age of 57 (SD=15.8, range=[18-96]). Cannabis products were prescribed mainly for patients with a diagnosis of chronic pain (65.9% of patients), cancer (9.4%), neurological disorders (6.5%), mood disorders (5.8%), and inflammatory diseases (4.1%). Routes of administration and chemovars of prescribed cannabis products were as follows: 96% of patients received cannabis oil (51% CBD-rich, 42.5% CBD:THC); 32.1% dried cannabis (21.3% CBD:THC, 7.4% THC-rich, 3.4% CBD-rich); and 2.1% oral spray cannabis (1.1% CBD:THC, 0.8% CBD-rich, 0.2% THC-rich). Most patients were simultaneously prescribed a combination of products with different administration routes and chemovars. Safety analysis is ongoing. Conclusion: Our results provide initial information on the profile of medical cannabis products prescribed in a Canadian population and the adverse events experienced over the past three years.
The Santé Cannabis database represents a unique opportunity for comparing clinical practices in prescribing and titrating cannabis-based medications across different centers. Ultimately, real-world data, including information about safety and effectiveness, will help to create standardized and validated guidelines for choosing the dose, route of administration, and chemovar type for cannabis-based medications in different diseases and indications.

Keywords: medical cannabis, real-world data, safety, pharmacovigilance

Procedia PDF Downloads 91
48263 Outcomes Following Overcorrecting Minus Lens Therapy for Intermittent Distance Exotropia

Authors: Alasdair Warwick, Luna Dhir

Abstract:

Aim: To ascertain the efficacy of overcorrecting minus lens therapy in intermittent distance exotropia. Methods: Retrospective audit of all intermittent distance exotropia patients seen in the Chelsea and Westminster Hospital pediatric eye clinic between 1st January 2014 and 1st March 2016. Change in LogMAR visual acuity, stereopsis, and near and distance angles of deviation, as well as the proportions of patients converting to exophoria or undergoing strabismus surgery, were recorded. Results: 22 patients were identified, 45% male, mean age 5 years (range 0.6 to 18.5 years). The median overminus prescription was -1.0 dioptres (range -0.5 to -1.75 dioptres) and mean follow-up was 15 months (range 3 to 54 months). Visual acuity and near and distance angles of deviation improved, but not statistically significantly: -0.15 LogMAR, -0.2 prism dioptres, and -1.2 prism dioptres respectively (p>0.05). However, a significant change in stereopsis was observed: -74'' (p<0.01). 27% underwent strabismus surgery and 36% converted to exophoria whilst wearing their overminus prescription. Conclusions: Overcorrecting minus lens therapy is an effective therapy for intermittent distance exotropia. There was no deterioration in visual acuity, and a significant improvement in stereopsis was seen in our cohort, with many patients converting to exophoria. The proportion of patients requiring strabismus surgery was comparable to other studies. Further follow-up is needed to ascertain long-term outcomes.

Keywords: exotropia, overcorrecting minus lens, refraction, strabismus

Procedia PDF Downloads 230
48262 Emergency Multidisciplinary Continuing Care Case Management

Authors: Mekroud Amel

Abstract:

Emergency departments are known for their heavy workload, the variety of pathologies encountered, and the difficulties of management under a continuous influx of patients. We demonstrate the role of our service in the management of patients presenting with two or three mild-to-moderate organ failures involving several disciplines at the same time, and the effect of this management on the skills and efficiency of our team. We describe borderline cases between two, three, or more disciplines, with instability of a vital function, that were successfully managed in the emergency room, the therapeutic procedures adopted, the consequences on the quality and level of care delivered by our team, and the logistical and pedagogical consequences. These consequences were positive for the emergency teams, and negative only in rare situations. Clinically, these situations involved the entanglement of haemodynamic distress (right, left, or global heart involvement, tamponade, low output with acute pulmonary oedema, and/or shock) with respiratory distress (more or less profound hypoxaemia, haematosis disorders related to bacterial or viral lung infection, pleurisy, pneumothorax, bronchoconstrictive crisis),
with neurological disorders (such as recent stroke, comatose state, or others), with metabolic disorders (such as hyperkalaemia, renal insufficiency, severe ionic disorders, accidents with vitamin K antagonists), and with or without septate effusion of one or more serous membranes, with or without tamponade. This is a retrospective, monocentric, descriptive study covering the period 05.01.2022 to 10.31.2022. The purpose of our work was to search for a statistically significant link between the type of moderate-to-severe, multivisceral pathology managed in the emergency room and the efficiency of the healthcare team and the level of care offered to patients. Statistical test used: the chi-squared (Chi2) test, to assess the link between the resolution of serious multidisciplinary cases in the emergency room and the effectiveness of the team in the management of complicated cases. The management of the most difficult clinical cases for organ specialties gave general practitioner emergency teams a broad perspective and improved their efficiency in the face of the emergencies received.
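The chi-squared test named above tests for association in a contingency table, e.g. case type versus successful resolution. A minimal sketch using the shortcut formula for a 2×2 table (the counts below are invented purely for illustration, not the study's data; at one degree of freedom the 5% critical value is 3.841):

```python
def chi2_2x2(a, b, c, d):
    """Chi-squared statistic for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented counts: rows = multidisciplinary vs single-discipline cases,
# columns = resolved vs unresolved in the emergency room.
stat = chi2_2x2(30, 10, 20, 20)
print(round(stat, 3), stat > 3.841)  # association significant at alpha = 0.05 if True
```

With real data one would typically use a library routine (e.g. `scipy.stats.chi2_contingency`) that also returns the p-value and handles larger tables.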

Keywords: emergency care teams, management of patients with dysfunction of more than one organ, learning curve, quality of care

Procedia PDF Downloads 63
48261 A Case of Borderline Personality Disorder: An Explanatory Study of Unconscious Conflicts through Dream-Analysis

Authors: Mariam Anwaar, Kiran B. Ahmad

Abstract:

Borderline Personality Disorder (BPD) is characterized by a pervasive pattern of affect instability, disturbance in self-concept, and attachment difficulties in relationships. The most profound indicator is a dichotomous approach to the world, in which the ego categorizes individuals, especially significant others, as either secure or threatening beings, leaving little room for a complex combination of characteristics in one person. This defense mechanism of splitting the world has been described through the explanatory model of unconscious conflict theorized in Sigmund Freud's Electra complex of the phallic stage. The central role is that of the father, with whom the daughter experiences penis envy, thus identifying with the mother's characteristics to receive the father's attention. However, Margaret Mahler, an object relations theorist, elucidates the central role of the mother, holding that the split occurs during the pre-Electra-complex stage. Between 14 and 24 months of age, having developed milestones such as crawling, the infant acknowledges a world away from the mother. In this novelty, the infant crawls away from the mother, creating a sense of independence (individuation). On the other hand, being distant causes anxiety, making it return to its original object of security (separation). In BPD, the separation-individuation stage is disrupted due to contradictory actions of the caregiver, which results in splitting the object into negative and positive aspects, repressing the former and adhering to the latter for survival. Thus, with time, the ego distorts reality into dichotomous categories using splitting defenses, and the mental representation of the self is distorted by the internalization of the negative objects. The explanatory model was recognized in the case study of Fizza, a 21-year-old Pakistani female residing in Karachi. Her marital status is single, and her occupation is dental student.
Fizza lives in a nuclear family but is surrounded by her extended family, as they all live in close vicinity. She presented with complaints of depressive symptoms of two years' duration, along with self-harm due to severe family conflicts. Through the intervention of Dialectical Behavior Therapy (DBT), the self-harming actions were reduced; however, this libidinal energy transformed into claustrophobic symptoms, and, alongside this, Fizza has always experienced vivid dreams. A retrospective method of Jungian dream-analysis was applied to locate the origins of the splitting in the unconscious. The result was the revelation of a sexual harassment trauma at the age of six years, which had been displaced in the form of self-harm. In addition, a conflict at the separation-individuation stage was detected during the dream-analysis, and it was the underlying explanation of the claustrophobic symptoms. This qualitative case study demonstrates the use of a patient's subjective experiences, such as dreams, to journey through the spiral of the unconscious, not only to detect repressed memories but to use them in psychotherapy as a means of healing the patient.

Keywords: borderline personality disorder, dream-analysis, Electra complex, separation-individuation, splitting, unconscious

Procedia PDF Downloads 140
48260 Early Outcomes and Lessons from the Implementation of a Geriatric Hip Fracture Protocol at a Level 1 Trauma Center

Authors: Peter Park, Alfonso Ayala, Douglas Saeks, Jordan Miller, Carmen Flores, Karen Nelson

Abstract:

Introduction: Hip fractures account for more than 300,000 hospital admissions every year. Many present as fragility fractures in geriatric patients with multiple medical comorbidities. Standardized protocols for the multidisciplinary management of this patient population have been shown to improve patient outcomes. A hip fracture protocol was implemented at a Level I Trauma center with a focus on pre-operative medical optimization and early surgical care. This study evaluates the efficacy of that protocol, including the early transition period. Methods: A retrospective review was performed of all patients aged 60 and older with isolated hip fractures who were managed surgically between 2020 and 2022. This included patients one year prior to and one year following the implementation of a hip fracture protocol at a Level I Trauma center. Results: 530 patients were identified: 249 patients were treated before, and 281 patients were treated after, the protocol was instituted. There was no difference in mean age (p=0.35), gender (p=0.3), or Charlson Comorbidity Index (p=0.38) between the cohorts. Following the implementation of the protocol, there were observed increases in time to surgery (27.5h vs. 33.8h, p=0.01), hospital length of stay (6.3d vs. 9.7d, p<0.001), and ED length of stay (5.1h vs. 6.2h, p<0.001). There were no differences in in-hospital mortality (2.01% pre vs. 3.20% post, p=0.39) or complication rates (25% pre vs. 26% post, p=0.76). A trend towards improved outcomes was seen after the early transition period but did not reach statistical significance. Conclusion: Early medical management and surgical intervention are key factors affecting outcomes following fragility hip fractures. The implementation of a hip fracture protocol at this institution has not yet significantly affected these parameters. This could in part be due to the restrictions placed on this institution during the COVID-19 pandemic.
Despite this, the time to OR pre- and post-implementation was quicker than figures reported elsewhere in the literature. Further longitudinal data will be collected to determine the final influence of this protocol. Significance/Clinical Relevance: Given the increasing number of elderly people and the high morbidity and mortality associated with hip fractures in this population, finding cost-effective ways to improve outcomes in the management of these injuries has the potential to have an enormous positive impact for both patients and hospital systems.
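The mortality comparison above (2.01% of 249 pre vs. 3.20% of 281 post, p=0.39) is consistent with a two-proportion z-test. A sketch of that test follows; the underlying death counts of roughly 5 and 9 are inferred from the reported percentages, not stated in the abstract:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided, via normal CDF
    return z, p_value

# Inferred counts: ~5/249 in-hospital deaths pre-protocol, ~9/281 post.
z, p = two_proportion_z(5, 249, 9, 281)
print(round(z, 2), round(p, 2))  # p lands close to the reported 0.39
```

The normal approximation is marginal at such small event counts; with real data Fisher's exact test would be the more defensible choice.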

Keywords: hip fracture, geriatric, treatment algorithm, preoperative optimization

Procedia PDF Downloads 59
48259 The Effect of Fish and Krill Oil on Warfarin Control

Authors: Rebecca Pryce, Nijole Bernaitis, Andrew K. Davey, Shailendra Anoopkumar-Dukie

Abstract:

Background: Warfarin is an oral anticoagulant widely used in the prevention of strokes in patients with atrial fibrillation (AF) and in the treatment and prevention of deep vein thrombosis (DVT). Regular monitoring of the International Normalised Ratio (INR) is required to ensure therapeutic benefit, with time in therapeutic range (TTR) used to measure warfarin control. A number of factors influence TTR, including diet, concurrent illness, and drug interactions. Extensive literature exists regarding the effect of conventional medicines on warfarin control, but documented interactions relating to complementary medicines are limited. It has been postulated that fish oil and krill oil supplementation may affect warfarin because of their association with bleeding events. However, to date little is known as to whether fish and krill oil significantly alter the incidence of bleeding with warfarin or impact warfarin control. Aim: To assess the influence of fish oil and krill oil supplementation on warfarin control in AF and DVT patients by determining the effect of these supplements on TTR and bleeding events. Methods: A retrospective cohort analysis was conducted utilising patient information from a large private pathology practice in Queensland. AF and DVT patients receiving warfarin management by the pathology practice were identified and their TTR calculated using the Rosendaal method. Concurrent medications were analysed; patients taking no other interacting medicines were identified and divided into users of fish oil or krill oil supplements and those taking no supplements. Study variables included TTR and the incidence of bleeding, with less than 30 days of treatment with warfarin as the exclusion criterion. Subject characteristics were reported as mean and standard deviation for continuous data and as number and percentage for nominal or categorical data.
Data were analysed using GraphPad InStat Version 3, with a p value of <0.05 considered statistically significant. Results: Of the 2081 patients assessed for inclusion in this study, a total of 573 warfarin users met the inclusion criteria. Of these, 416 (72.6%) were AF patients and 157 (27.4%) DVT patients; overall there were 316 (55.1%) male and 257 (44.9%) female patients. 145 patients were included in the fish oil/krill oil (supplement) group and 428 in the control group. The mean TTR was 86.9% for supplement users and 84.7% for the control group, with no significant difference between the groups. Control patients experienced 1.6 times the number of minor bleeds per person compared to supplement patients, and 1.2 times the number of major bleeds per person; however, this was not statistically significant, nor was the comparison of thrombotic events. Conclusion: No significant difference was found between supplement and control patients in terms of mean TTR, number of bleeds, or thrombotic events. Fish oil and krill oil supplements used concurrently with warfarin do not significantly affect warfarin control as measured by TTR and bleeding incidence.

Keywords: atrial fibrillation, deep vein thrombosis, fish oil, krill oil, warfarin

Procedia PDF Downloads 281
48258 Automated Prediction of HIV-associated Cervical Cancer Patients Using Data Mining Techniques for Survival Analysis

Authors: O. J. Akinsola, Yinan Zheng, Rose Anorlu, F. T. Ogunsola, Lifang Hou, Robert Leo-Murphy

Abstract:

Cervical cancer (CC) is the second most common cancer among women living in low- and middle-income countries, with no associated symptoms during its formative period. Despite advances in innovative medical research and the numerous preventive measures in use, the incidence of cervical cancer cannot be curbed by the application of screening tests alone. The mortality associated with invasive cervical cancer can, however, be greatly reduced through early-stage detection. This study applied an array of top feature-selection techniques with the aim of developing a model that could validly identify the risk factors of cervical cancer. A retrospective clinic-based cohort study was conducted on 178 HIV-associated cervical cancer patients at Lagos University Teaching Hospital, Nigeria (U54 data repository) in April 2022. The outcome measure was the automated prediction of HIV-associated cervical cancer cases, while the predictor variables included demographic information, reproductive history, birth control, sexual history, and cervical cancer screening history for invasive cervical cancer. The proposed technique was implemented in R and Python, utilizing classification algorithms for the detection and diagnosis of cervical cancer. Four machine learning classification algorithms were used: Logistic Regression (LR), Decision Tree (DT), Random Forest (RF), and K-Nearest Neighbor (KNN). The dataset was split into training and testing sets in an 80:20 ratio; the numerical features were standardized, and hyperparameter tuning was carried out before training and testing the models.
Fitting features were selected for the detection and diagnosis of cervical cancer from the characteristics in the dataset, using the contributions of various selection methods for the classification of cervical cancer into healthy or diseased status. The mean age of patients was 49.7±12.1 years, mean age at pregnancy was 23.3±5.5 years, mean age at first sexual experience was 19.4±3.2 years, and mean BMI was 27.1±5.6 kg/m2. A larger percentage of the patients were married (62.9%), and most had at least two sexual partners (72.5%). Age of patients (OR=1.065, p<0.001), marital status (OR=0.375, p=0.011), number of pregnancy live-births (OR=1.317, p=0.007), and use of birth control pills (OR=0.291, p=0.015) were found to be significantly associated with HIV-associated cervical cancer. On the top 10 features (variables) considered in the analysis, RF gave the best overall model performance, with an accuracy of 72.0%, precision of 84.6%, recall of 84.6%, and F1-score of 74.0%, while LR achieved an accuracy of 74.0%, precision of 70.0%, recall of 70.0%, and F1-score of 70.0%. The RF model identified 10 features predictive of developing cervical cancer. Age of patients was the most important risk factor, followed by the number of pregnancy live-births, marital status, and use of birth control pills. The study shows that data mining techniques could be used to identify women living with HIV at high risk of developing cervical cancer in Nigeria and other sub-Saharan African countries.
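The accuracy, precision, recall, and F1 figures reported for RF and LR all derive from the confusion matrix on the held-out 20% test split. A minimal, library-free sketch of that computation (toy labels only; in practice scikit-learn's `classification_report` produces the same metrics):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall and F1 from paired label lists."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Toy test-set labels: 1 = cervical cancer, 0 = healthy.
y_true = [1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 1]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, round(prec, 2), round(rec, 2), round(f1, 2))
```

Comparing models on precision and recall together, as the abstract does, guards against a classifier that scores well on accuracy simply by favouring the majority class.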

Keywords: associated cervical cancer, data mining, random forest, logistic regression

Procedia PDF Downloads 68