Search results for: criminal records
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1393

373 Combined Tarsal Coalition Resection and Arthroereisis in Treatment of Symptomatic Rigid Flat Foot in Pediatric Population

Authors: Michael Zaidman, Naum Simanovsky

Abstract:

Introduction. Symptomatic tarsal coalition with rigid flat foot often demands an operative solution. An isolated coalition resection does not guarantee pain relief; correction of the co-existing foot deformity may be required. The objective of this study was to analyze the results of combining tarsal coalition resection with arthroereisis. Patients and methods. We retrospectively reviewed the medical records and radiographs of children operatively treated in our institution for symptomatic calcaneonavicular or talocalcaneal coalition between 2019 and 2022. Eight patients (twelve feet), 4 boys and 4 girls with a mean age of 11.2 years, were included in the study. In six patients (10 feet) a calcaneonavicular coalition was diagnosed; two patients (two feet) had a talocalcaneal coalition. To quantify the degree of foot deformity, we used the calcaneal pitch angle, the lateral talar-first metatarsal (Meary's) angle, and the talonavicular coverage angle. Clinical results were assessed using the American Orthopaedic Foot and Ankle Society (AOFAS) Ankle Hindfoot Score. Results. The mean follow-up was 28 months. The preoperative mean talonavicular coverage angle was 17.75° compared with a postoperative mean of 5.4°. The calcaneal pitch angle improved from a mean of 6.8° to 16.4°. The mean preoperative Meary's angle of -11.3° improved to a mean of 2.8°. The mean AOFAS score improved from 54.7 points preoperatively to 93.1 points postoperatively. In nine of twelve feet the overall clinical outcome judged by the AOFAS scale was excellent (90-100 points); in three feet it was good (80-90 points). Six patients (ten feet) clearly improved their subtalar range of motion. Conclusion. For symptomatic stiff or rigid flat feet associated with tarsal coalition, the combination of coalition resection and arthroereisis normalizes radiographic parameters and leads to clinical and functional improvement with good patient satisfaction, and is likely more effective than either procedure alone.

Keywords: rigid flat foot, tarsal coalition resection, arthroereisis, outcome

Procedia PDF Downloads 56
372 Demographic Profile, Risk Factors and In-hospital Outcomes of Acute Coronary Syndrome (ACS) in Young Population, in Pakistan-Single Center Real World Experience

Authors: Asma Qudrat, Abid Ullah, Rafi Ullah, Ali Raza, Shah Zeb, Syed Ali Shan Ul-Haq, Shahkar Ahmed Shah, Attiya Hameed Khan, Saad Zaheer, Umama Qasim, Kiran Jamal, Zahoor khan

Abstract:

Objectives: Coronary artery disease (CAD) is a major public health issue associated with high mortality and morbidity worldwide. Young patients with ACS have unique characteristics, with distinct demographic profiles and risk factors. Precise diagnosis and early risk stratification are important in guiding treatment and predicting prognosis in young patients with ACS. The objective was to evaluate the demographics, risk factors, and outcome profile of ACS in young patients. Methods: This single-centre retrospective study included patients diagnosed with a first event of ACS at a young age (>18 and <40 years). Data collection covered the demographic profiles, risk factors, and in-hospital outcomes of young ACS patients. Patient data were retrieved from the Electronic Medical Records (EMR) of the Peshawar Institute of Cardiology (PIC), and all characteristics were assessed. Results: In this study, 77% of patients were male and 23% were female. The assessed risk factors showed a significant association with CAD (P < 0.01). The most common presentation in young ACS patients was STEMI (45%). The angiographic pattern showed single vessel disease (SVD) in 49%, double vessel disease (DVD) in 17%, and triple vessel disease (TVD) in 10%; the left anterior descending artery (LAD) was the most commonly involved vessel (54%). Conclusion: Male sex was predominant among young ACS patients, and SVD was the most common coronary angiographic finding. Risk factors showed a significant association with CAD and with the common presentations.

Keywords: coronary artery disease, Non-ST elevation myocardial infarction, ST elevation myocardial infarction, unstable angina, acute coronary syndrome

Procedia PDF Downloads 146
371 Exploring Factors Related to Unplanned Readmission of Elderly Patients in Taiwan

Authors: Hui-Yen Lee, Hsiu-Yun Wei, Guey-Jen Lin, Pi-Yueh Lee Lee

Abstract:

Background: Unplanned hospital readmissions increase healthcare costs and have been considered a marker of poor healthcare performance. The elderly face a higher risk of unplanned readmission because of age-specific characteristics such as deteriorating body functions and the relatively high incidence of complications after treatment of acute diseases. Purpose: The aim of this study was to explore the factors related to unplanned readmission of elderly patients within 14 days of discharge at our hospital in southern Taiwan. Methods: We retrospectively reviewed the medical records of patients aged ≥65 years who had been re-admitted between January 2018 and December 2018. The Charlson Comorbidity score was calculated using previously published methods. Factors affecting the rate of unplanned readmission within 14 days of discharge were screened and analyzed using the chi-squared test and logistic regression analysis. Results: This study enrolled 829 subjects aged 65 years or older. There were 318 cases of unplanned readmission within 14 days and 511 cases without unplanned readmission; in 2018, the 14-day unplanned readmission rate among elderly patients was 38.4%. Among the readmitted patients, the majority were female (166 cases, 52.2%), the mean age was 77.6 ± 7.90 years (range 65-98), and the mean Charlson Comorbidity score was 4.42 ± 2.76. Using logistic regression analysis, we found that gastric or peptic ulcer (OR = 1.917, P < 0.002), diabetes (OR = 0.722, P < 0.043), hemiplegia (OR = 2.292, P < 0.015), metastatic solid tumor (OR = 2.204, P < 0.025), hypertension (OR = 0.696, P < 0.044), and skin ulcer/cellulitis (OR = 2.747, P < 0.022) were significantly associated with 14-day readmission. Conclusion: The results of the present study may help healthcare teams understand the factors that affect unplanned readmission in the elderly. We recommend that these teams adopt efficient approaches in their medical practice, provide timely health education for the elderly, and deliver integrative healthcare for chronic diseases in order to reduce unplanned readmissions.
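
The logistic regression step described above can be illustrated with a short sketch. The example below fits a model of 14-day readmission on binary comorbidity flags and reports odds ratios; the file name and column names are hypothetical stand-ins, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical extract of the medical-record data; column names are illustrative.
df = pd.read_csv("elderly_admissions.csv")   # one row per patient aged >= 65

predictors = ["peptic_ulcer", "diabetes", "hemiplegia",
              "metastatic_tumor", "hypertension", "cellulitis"]
X = sm.add_constant(df[predictors])          # binary comorbidity indicators (0/1)
y = df["readmit_14d"]                        # 1 = unplanned readmission within 14 days

fit = sm.Logit(y, X).fit(disp=False)
summary = pd.DataFrame({"OR": np.exp(fit.params),   # odds ratios comparable to those reported
                        "p_value": fit.pvalues})
print(summary.round(3))
```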

Keywords: unplanned readmission, elderly, Charlson comorbidity score, logistic regression analysis

Procedia PDF Downloads 126
370 Model of Application of Blockchain Technology in Public Finances

Authors: M. Vlahovic

Abstract:

This paper presents a model of public finances that combines three concepts: participatory budgeting, crowdfunding, and blockchain technology. Participatory budgeting is defined as a process in which community members decide how to spend a part of the community's budget. Crowdfunding is the practice of funding a project by collecting small monetary contributions from a large number of people via an Internet platform. Blockchain technology is a distributed ledger that enables efficient and reliable transactions that are secure and transparent. In this hypothetical model, the government or authorities at the local/regional level would set up a platform where they would propose public projects to citizens. Citizens would browse through the projects and support or vote for those which they consider justified and necessary. In return, they would be entitled to a tax relief in the amount of their monetary contribution. Since blockchain technology enables tracking of transactions, it can be used to mitigate corruption, money laundering and lack of transparency in public finances. Models of its application have already been created for e-voting, health records and land registries. By presenting a model of application of blockchain technology in public finances, this paper takes into consideration the potential of blockchain technology to disrupt governments and make processes more democratic, secure, transparent and efficient. The framework for this paper consists of multiple streams of research, including key concepts of direct democracy, public finance (especially the voluntary theory of public finance), information and communication technology, especially blockchain technology, and crowdfunding. The framework defines the rules of the game, the basic conditions for implementation of the model, its benefits, potential problems and development perspectives. As a simplified map of a new form of public finances, the proposed model identifies the primary factors that influence the feasibility of implementation and that could be tracked, measured and controlled if the model were tested experimentally.
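
To make the ledger idea concrete, the minimal sketch below chains citizen contributions so that each record commits to the previous one; the class, field names and values are purely illustrative assumptions, not part of the proposed model's specification.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class Contribution:
    citizen_id: str      # pseudonymous identifier (hypothetical)
    project_id: str      # public project proposed on the platform (hypothetical)
    amount: float        # monetary contribution, also the tax-relief entitlement
    prev_hash: str       # hash of the previous ledger entry, chaining the records

def entry_hash(entry: Contribution) -> str:
    payload = json.dumps(asdict(entry), sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Append-only chain: each entry commits to its predecessor, so past contributions
# (and the tax relief they entitle) cannot be silently altered.
genesis = Contribution("c-001", "park-renovation", 150.0, prev_hash="0" * 64)
second = Contribution("c-002", "school-library", 75.0, prev_hash=entry_hash(genesis))
print(entry_hash(second))
```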

Keywords: blockchain technology, distributed ledger, participatory budgeting, crowdfunding, direct democracy, internet platform, e-government, public finance

Procedia PDF Downloads 140
369 Prevalence of Pretreatment Drug HIV-1 Mutations in Moscow, Russia

Authors: Daria Zabolotnaya, Svetlana Degtyareva, Veronika Kanestri, Danila Konnov

Abstract:

An adequate choice of the initial antiretroviral treatment determines treatment efficacy. In the clinical guidelines in Russia, non-nucleoside reverse transcriptase inhibitors (NNRTIs) are still considered an option for first-line treatment, while pretreatment drug resistance (PDR) testing is not routinely performed. We conducted a retrospective cohort study of HIV-positive, treatment-naïve patients of the H-clinic (Moscow, Russia) who underwent PDR testing from July 2017 to November 2021. All information was obtained from medical records anonymously. We analyzed mutations in the reverse transcriptase and protease genes. RT sequences were obtained with the AmpliSens HIV-Resist-Seq kit. Drug resistance was defined using the HIVdb Program v. 8.9-1, and PDR was estimated using the Stanford algorithm. Descriptive statistics were performed in Excel (Microsoft Office, 2019). A total of 261 HIV-1 infected patients were enrolled in the study, including 197 (75.5%) males and 64 (24.5%) females. The mean age was 34.6±8.3 years, and the median CD4 count was 521 cells/µl (IQR 367-687 cells/µl). Data on risk factors for HIV infection were scarce. In total, 75 strains (28.7%) contained mutations in the reverse transcriptase gene; of these, 5 (1.9%) carried mutations associated with PDR to nucleoside reverse transcriptase inhibitors (NRTIs) and 30 (11.5%) carried mutations associated with PDR to NNRTIs. Forty-three strains (16.5%) had mutations in the protease gene; of these, only 3 (1.1%) carried mutations associated with resistance to protease inhibitors. For NNRTIs, the most prevalent PDR mutations were E138A and V106I. Most HIV variants exhibited a single PDR mutation; two mutations were found in 3 samples. Most HIV variants with a PDR mutation displayed resistance mutations for a single drug class; 2/37 (5.4%) strains had both NRTI and NNRTI mutations. No strains were identified with PDR mutations to all three drug classes. Although earlier data demonstrated a lower level of PDR in the treatment-naïve HIV population in Russia, and our cohort may not be fully representative because it was drawn from a private clinic, it reflects a trend of increasing PDR, especially to NNRTIs. We therefore consider either pretreatment resistance testing or giving priority to other drugs for first-line treatment to be necessary.

Keywords: HIV, resistance, mutations, treatment

Procedia PDF Downloads 83
368 Effect of Calving Season on the Economic and Production Efficiency of Dairy Production Breeds

Authors: Eman. K. Ramadan, Abdelgawad. S. El-Tahawy

Abstract:

The objective of this study was to evaluate the effects of calving season on the production and economic efficiency of dairy farms in Egypt. Our study was performed at dairy production farms in the Alexandria, Behera, and Kafr El-Sheikh provinces of Egypt from summer 2010 to winter 2013. The randomly selected dairy farms had herds consisting of Baladi, Holstein-Friesian, or cross-bred (Baladi × Holstein-Friesian) cows. The data were collected from production records and responses to a structured questionnaire. The average total return differed significantly (P < 0.05) between the different cattle breeds and calving seasons. The average total return was highest for the Holstein-Friesian cows that calved in the winter (29106.42 EGP/cow/year), and it was lowest for Baladi cows that calved in the summer (12489.79 EGP/cow/year). Differences in milk yield between the cows that calved in the winter or summer and between the foreign and native breeds, as well as variations in calf prices, might have contributed to the differences in total returns. The average net profit per cow differed significantly (P < 0.05) between the cattle breeds and calving seasons. The average net profit values for the Baladi cows that calved in the winter or summer were 2413 and 2994.96 EGP/cow/year, respectively, those for the Holstein-Friesian cows were 10744.17 and 7860.56 EGP/cow/year, respectively, and those for the cross-bred cows were 10174.86 and 7571.33 EGP/cow/year, respectively. The variations in net profit might have resulted from variation in the availability or price of feed materials, milk prices, or sales volumes. Our results show that the breed and calving season of dairy cows significantly affected the economic efficiency of dairy farms in Egypt. The cows that calved in the winter produced more milk than those that calved in the summer, which may have been the result of seasonal influences, such as temperature, humidity, management practices, and the type of feed or green fodder available.

Keywords: calving season, economic, production, efficiency, dairy

Procedia PDF Downloads 418
367 Long-Term Outcomes of Dysphagia in Children with Severe Cerebral Palsy Using Videofluoroscopic Evaluation

Authors: Eun Jae Ko, In Young Sung, Eui Soo Joeng

Abstract:

Oropharyngeal dysphagia is prevalent in children with cerebral palsy (CP). There are many studies concerning this problem; however, studies examining the long-term outcomes of dysphagia using videofluoroscopic study (VFSS) are very rare. The aim of this study was to investigate the long-term outcomes of dysphagia in children with severe CP using the initial VFSS. This was a retrospective study, and a chart review was performed from January 2000 to December 2013. Thirty-one patients under 18 years of age who had been diagnosed with CP in the outpatient clinic of Rehabilitation Medicine and who underwent VFSS were included. Long-term outcomes such as feeding method, height percentile, weight percentile, and body mass index (BMI) were tracked for at least 3 years through medical records, and significant differences between initial and follow-up data were investigated. The patients consisted of 18 males and 13 females, with a mean age of 31.0±18.0 months. 64.5% of patients were on an oral diet, and 25.8% were on a non-oral diet. When comparing initial VFSS findings among patients with oral feeding, combined oral and non-oral feeding, and non-oral feeding, dysphagia severity, supraglottic penetration, and subglottic aspiration showed significant differences. Most patients who could feed orally at the initial assessment had the same feeding method at follow-up, but among the eight patients who required non-oral feeding initially, three became able to feed orally and one was using combined oral and non-oral feeding at follow-up. The feeding method at follow-up correlated with dysphagia severity on the initial VFSS. The weight percentile was decreased at follow-up in patients with GMFCS level V, which may represent poor nutritional status due to severe dysphagia compared with other patients. Initial VFSS severity can therefore play a significant role in predicting the future diet of children with severe CP. Patients with GMFCS level V appear to have serious dysphagia at follow-up and develop nutritional deficiency over time; therefore, more careful nutritional support is suggested in children with severe CP.

Keywords: cerebral palsy, child, dysphagia, videofluoroscopic study

Procedia PDF Downloads 241
366 Management of Severe Asthma with Omalizumab in United Arab Emirates

Authors: Shanza Akram, Samir Salah, Imran Saleem, Jassim Abdou, Ashraf Al Zaabi

Abstract:

The estimated prevalence of asthma in the UAE is around 10% (900,000 people). Patients with persistent symptoms despite using high-dose ICS plus a second controller, with or without oral steroids, are considered to have severe asthma. Omalizumab (Xolair) is an anti-IgE monoclonal antibody approved as add-on therapy for severe allergic asthma. The objectives of our study were to obtain baseline characteristics of our local cohort, to determine the efficacy of omalizumab based on clinical outcomes before and after 52 weeks of treatment, and to assess safety and tolerability. Medical records of patients receiving omalizumab therapy for asthma at Zayed Military Hospital, Abu Dhabi, were retrospectively reviewed. Patients fulfilling the criteria for severe allergic asthma as per GINA guidelines were included. Asthma control over the 12 months before and after starting omalizumab was analyzed by taking into account the number of exacerbations, hospitalizations, maintenance medication dosages, the need for reliever therapy, and pulmonary function tests (PFTs). Twenty-one patients (5 females) with a mean age of 41 years were included. The mean duration of therapy was 22 months. Nineteen (91%) patients had allergic rhinitis/sinusitis. The mean serum total IgE level was 648 IU/ml (range 65-1859). Eleven (52%) patients were on oral maintenance steroids pre-treatment; 7 patients managed to stop steroids on treatment, while 4 were able to decrease the dosage. The mean exacerbation rate decreased from 5 per year pre-treatment to 1.36 on treatment. The number of hospitalizations decreased from a mean of 2 per year to 0.9 per year. Reliever inhaler usage decreased from a mean of 40 to 15 puffs per week. Two patients discontinued therapy, one due to lack of benefit (after 2 doses) and the second due to severe persistent side effects. Patient compliance was poor in some cases. Treatment with omalizumab reduced the number of exacerbations, hospitalizations, and maintenance and reliever medications, and was generally well tolerated. Our results show that there is room for improved documentation in terms of symptom recording and use of rescue medication at our institution. There is also a need for better patient education and counselling in order to improve compliance.

Keywords: asthma, exacerbations, omalizumab, IgE

Procedia PDF Downloads 364
365 Nutritional Impact in Patients Who Underwent Sleeve-Type Bariatric Surgery

Authors: Melissa Mattos, Camila Lima, Ibraim Castro, Augusto Carioca, Saulo Magalhães, Paula Freitas, Keciany Oliveira

Abstract:

Obesity is a chronic, multifactorial, relapsing disease whose prevalence has increased dramatically over the years. Its control is considered a public health issue, and more and more treatments and interventions are being studied to reduce its prevalence. When lifestyle interventions and drug therapy do not produce lasting results, bariatric procedures emerge as a resource for obesity control. The main guidelines for the treatment of obesity emphasize the need for pre- and post-procedure nutritional monitoring to avoid the nutritional deficiencies that may occur. The individual who undergoes bariatric surgery needs to understand the lifelong changes that will be necessary in view of the intense anatomical and metabolic changes that result from the surgical techniques. To assess the nutritional profile of patients undergoing bariatric surgery, we analyzed data from the medical records of all people who underwent sleeve-type bariatric surgery from January to June 2022 at a clinic in the city of Fortaleza. Thirty-eight patients (32 women and 6 men) were analyzed in the pre-surgical period and at 6 and 12 months after surgery. The data showed an average weight loss of 24.45% at 6 months and 30.85% at 12 months, with a reduction of 21.32% and 30.41%, respectively, in the fat percentage. The data also indicated that 13.15% of patients used weight-loss drugs during this period, prompting reflection on the long-term efficacy of bariatric surgery in isolation and on the need for multidisciplinary follow-up to achieve lifestyle change. Only 12 individuals (31.57%) reached a eutrophic BMI 12 months after surgery, 20 individuals (52.63% of the sample) remained overweight, and 6 individuals (15.78%) remained in obesity class I by BMI. As for body composition, there was a 52.39% reduction in fat mass and a 12.82% reduction in muscle mass, and 21% of individuals underwent cholecystectomy. Sleeve-type bariatric surgery promoted significant weight loss 1 year after the procedure, with a reduction in body fat percentage and fat mass. Most patients, however, were still overweight and had a significant reduction in muscle mass.
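
The percentage weight-loss figures quoted above follow the usual percentage-of-initial-weight arithmetic. The sketch below shows that calculation with made-up example weights; the study reports only group means, so the numbers are illustrative.

```python
def percent_weight_loss(pre_kg: float, post_kg: float) -> float:
    """Percentage of initial body weight lost between two time points."""
    return (pre_kg - post_kg) / pre_kg * 100

# Illustrative values only (not individual patients from the study):
print(round(percent_weight_loss(120.0, 90.7), 2))   # ~24.4%, comparable to the 6-month mean
print(round(percent_weight_loss(120.0, 83.0), 2))   # ~30.8%, comparable to the 12-month mean
```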

Keywords: bariatric surgery, sleeve gastrectomy, obesity, sleeve

Procedia PDF Downloads 55
364 The Nursing Rounds System: Effect of Patient's Call Light Use, Bed Sores, Fall and Satisfaction Level

Authors: Bassem Saleh, Hussam Nusair, Nariman Al Zubadi, Shams Al Shloul, Usama Saleh

Abstract:

The nursing round system (NRS) means checking patients on an hourly basis during the A shift (0700–2200 h) and once every 2 h during the B shift (2200–0700 h) by the assigned nursing staff. The overall goal of this prospective study was to implement an NRS in a major rehabilitation centre, Sultan Bin Abdulaziz Humanitarian City, in the Riyadh area of the Kingdom of Saudi Arabia. The purposes of this study were to measure the effect of the NRS on: (i) the use of the patient call light; (ii) the incidence of patient falls; (iii) the incidence of hospital-acquired bed sores; and (iv) the level of patient satisfaction. All patients hospitalized in the male stroke unit were involved in this study. For a period of 8 weeks (17 December 2009–17 February 2010), all nursing staff on the unit recorded each call light and the patient's need. Implementation of the NRS started on 18 February 2010 and lasted for 8 weeks, until 18 April 2010. Data collected throughout this period were compared with data collected during the 8-week period immediately preceding the implementation of the NRS (17 December 2009–17 February 2010) in order to measure the impact on call light use. The following information was collected on all subjects involved in the study: (i) the Demographic Information Form; (ii) the authors' developed NRS Audit Form; (iii) the Patient Call Light Audit Form; (iv) the Patient Fall Audit Record; (v) the Hospital-Acquired Bed Sores Audit Form; and (vi) the hospital-developed Patient Satisfaction Records. The findings showed a significant reduction in call bell use (P < 0.001) and a significant reduction in fall incidence (P < 0.01), while pressure ulcers were reduced by 50% between the periods before and after the implementation of the NRS. In addition, the implementation of the NRS increased patient satisfaction by 7/5 (P < 0.05).

Keywords: call light, patient-care management, patient safety, patient satisfaction, rounds

Procedia PDF Downloads 358
363 Evaluation of Weather Risk Insurance for Agricultural Products Using a 3-Factor Pricing Model

Authors: O. Benabdeljelil, A. Karioun, S. Amami, R. Rouger, M. Hamidine

Abstract:

A model for managing the risks related to climate conditions in the agricultural sector is presented. It determines the optimum yearly premium to be paid by a producer in order to reach his required turnover. The model is based on both climatic stability and the 'soft' response of commonly grown species to average climate variations at the same place, inside a safety ball that can be determined from past meteorological data. This allows a linear regression expression to be used for the dependence of the production result on the driving meteorological parameters, the main ones being daily average sunlight, rainfall, and temperature. By a simple best-parameter fit against the expert table drawn up with professionals, an optimal representation of yearly production is determined from records of previous years, and the yearly payback is evaluated from the minimum yearly turnover produced. The model also requires accurate pricing of the commodity at year N+1. Therefore, a pricing model is developed using 3 state variables, namely the spot price, the difference between the mean-term and the long-term forward price, and the long-term structure of the model. The use of historical data enables calibration of the state-variable parameters and allows the commodity to be priced. Application to beet sugar underlines the pricer's precision: the agreement between the computed result and real-world prices is 99.5%. The optimal premium is then deduced and gives the producer a useful bound for negotiating an offer from insurance companies to effectively protect the harvest. The application to beet production in the French Oise department illustrates the reliability of the present model, with as little as 6% difference between predicted and real data. The model can be adapted to almost any agricultural field by changing the state parameters and calibrating their associated coefficients.
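
A minimal sketch of the regression step is given below, assuming made-up yearly weather and yield records; it is not the paper's calibrated model, only an illustration of fitting production against the three meteorological drivers named above.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical yearly records: [mean daily sunlight (h), rainfall (mm), temperature (°C)]
X = np.array([[7.1, 620.0, 11.2],
              [6.8, 540.0, 12.0],
              [7.5, 700.0, 10.8],
              [6.9, 580.0, 11.5],
              [7.3, 660.0, 11.0]])
y = np.array([82.0, 74.0, 90.0, 78.0, 86.0])    # yearly yield (t/ha), illustrative values

model = LinearRegression().fit(X, y)             # linear dependence on the weather drivers
predicted = model.predict([[7.0, 600.0, 11.3]])  # expected production for a forecast year
print(model.coef_, model.intercept_, predicted)
```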

Keywords: agriculture, production model, optimal price, meteorological factors, 3-factor model, parameter calibration, forward price

Procedia PDF Downloads 362
362 Enhancing Healthcare Data Protection and Security

Authors: Joseph Udofia, Isaac Olufadewa

Abstract:

Every day, the size of Electronic Health Records data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019) and $20 billion (2021). These attacks usually involve the exposure of personally identifiable information (PII). Health data is considered PII, and its exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially among patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data are used, it is expedient to review and revamp policies and standards like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR) standard, all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data is shared across different systems from customers to health insurance and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to exfiltration and loss of protected health information (PHI). Though HIPAA hones in on the security and accuracy of protected health information and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards. With advancements in technology and the emergence of new tools, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and ever-increasing threats in cyberspace. Healthcare data in the wrong hands is a recipe for disaster, and we must enhance its protection and security to protect the mental health of the current and future generations.

Keywords: cloud security, healthcare, cybersecurity, policy and standard

Procedia PDF Downloads 71
361 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare

Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl

Abstract:

Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
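
As a rough illustration of the BBPP idea (not the authors' implementation), the sketch below builds a beta-binomial posterior predictive distribution for the next batch of cases and extracts discrete HPD-style limits; the prior, the observed counts, and the batch size are all assumptions.

```python
import numpy as np
from scipy.stats import betabinom

# Posterior for the underlying CI rate after observing x events in m cases,
# with a Beta(a0, b0) prior (all values below are illustrative assumptions).
a0, b0 = 1.0, 1.0
x, m = 12, 400
a, b = a0 + x, b0 + (m - x)

n_next = 50                                   # size of the next monitoring sample
dist = betabinom(n_next, a, b)                # beta-binomial posterior predictive

def hpd_limits(dist, n, mass=0.99):
    """Smallest set of counts holding `mass` predictive probability (discrete HPD)."""
    ks = np.arange(n + 1)
    pmf = dist.pmf(ks)
    order = np.argsort(pmf)[::-1]             # highest-probability counts first
    cutoff = np.searchsorted(np.cumsum(pmf[order]), mass) + 1
    kept = ks[order[:cutoff]]
    return kept.min(), kept.max()

print(hpd_limits(dist, n_next))               # counts outside these limits signal a change
```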

Keywords: average run length (ARL), bernoulli cusum (BC) chart, beta binomial posterior predictive (BBPP) distribution, clinical indicator (CI), healthcare organization (HCO), highest posterior density (HPD) interval

Procedia PDF Downloads 195
360 Absolute Lymphocyte Count as Predictor of Pneumocystis Pneumonia in Patients With Unknown HIV Status at a Private Tertiary Hospital

Authors: Marja A. Bernardo, Coreena A. Bueser, Cybele Lara R. Abad, Raul V. Destura

Abstract:

Pneumocystis jirovecii pneumonia (PCP) is the most common opportunistic infection among people with HIV. Early consideration of PCP should be made even in patients whose HIV status is unknown, as a delay in treatment may be fatal. The use of the absolute lymphocyte count (ALC) has been suggested as an alternative predictor of PCP, especially in resource-limited settings where PCR testing is costly or delayed. Objective: To determine whether the absolute lymphocyte count (ALC) can be used as a screening tool to predict Pneumocystis pneumonia in patients with unknown HIV status admitted to a private tertiary hospital. Methods: A retrospective cross-sectional study was conducted at a private tertiary medical center. Inpatient medical records of patients aged 18 years and above from January 2012 to May 2014, in whom a clinical diagnosis of Pneumocystis jirovecii pneumonia was made, were reviewed for inclusion. Demographic data, clinical features, hospital course, PCP PCR and HIV results were recorded. Independent t-tests and chi-square analysis were used to determine statistical differences between the PCP-positive and PCP-negative groups, and the Mann-Whitney U-test was used for comparison of hospital stay. Results: There were no statistically significant differences in baseline characteristics between the PCP-positive and PCP-negative groups. While both the percent lymphocyte count (0.14 ± 0.13 vs 0.21 ± 0.16) and the ALC (1160 ± 528.67 vs 1493.70 ± 988.61) were lower in the PCP-positive group, only the percent lymphocyte count reached a statistically significant difference (p = 0.042, vs p = 0.067 for the ALC). Conclusion: A quick determination of the ALC may be useful as an additional parameter to help screen for and diagnose Pneumocystis pneumonia. In our study, the ALC of patients with PCP appeared to be lower than in patients without PCP. A low ALC (e.g. below 1200) may help with the decision regarding empiric treatment; however, it should be used in conjunction with the patient's clinical presentation as well as other diagnostic tests. Larger, prospective studies incorporating the ALC with other clinical predictors are necessary to optimally predict those who would benefit from empiric or expedited management for potential PCP.
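
The group comparisons described above can be sketched in a few lines. The example below compares ALC between PCP-positive and PCP-negative patients with an independent t-test and a non-parametric check, and flags a hypothetical screening threshold; all values and the 1200 cells/µL cut-off are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

# Illustrative ALC values (cells/uL); these are made-up samples, not study data.
alc_pcp_pos = np.array([900, 1150, 1300, 850, 1400, 1100, 1250])
alc_pcp_neg = np.array([1600, 1200, 2100, 1450, 1800, 1350, 1700])

t_stat, p_t = stats.ttest_ind(alc_pcp_pos, alc_pcp_neg)       # independent t-test
u_stat, p_u = stats.mannwhitneyu(alc_pcp_pos, alc_pcp_neg)    # non-parametric comparison
print(round(p_t, 3), round(p_u, 3))

# A simple screening rule of the kind discussed above (threshold is an assumption):
flag_for_empiric_treatment = alc_pcp_pos < 1200
print(flag_for_empiric_treatment)
```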

Keywords: Pneumocystis carinii pneumonia, Absolute Lymphocyte Count, infection, PCP

Procedia PDF Downloads 344
359 The Abnormality of Blood Cells Parasitized by Plasmodium vivax

Authors: Manas Kotepui, Kwuntida Uthaisar, Phiman Thirarattanasunthon, Bhukdee PhunPhuech, Nuoil Phiwklam

Abstract:

Introduction: Malaria due to Plasmodium vivax has placed huge burdens on the health, longevity, and general prosperity of large sections of the human population. This study aimed to collect information on the clinical profile of Plasmodium vivax in subjects acutely infected with P. vivax residing in some of the highest malaria transmission regions in Thailand. Methods: A retrospective study of malaria cases hospitalized between 2013 and 2015 was performed. Clinical characteristics, diagnosis, parasitological results on admission, age, and gender were mined from medical records at Phop Phra Hospital, located in an endemic area of Tak Province, Thailand. Venous blood samples had been collected at the time of admission to determine the presence of parasites and the parasite count by thick and thin film examination, as well as complete blood count (CBC) parameters. Results: Patients infected with Plasmodium vivax (276 cases) had a high monocyte count (mean = 390 cells/µL) during the initial stage of infection, which became progressively lower during the later stage (any stage with gametocytes, mean = 230 cells/µL) (P = 0.021), whereas the basophil count was low (mean = 20 cells/µL) during the initial stage and became progressively higher during the later stage (mean = 70 cells/µL at the gametocyte stage) (P = 0.033). In addition, patients with more than one parasite stage present tended to have a lower lymphocyte count (mean = 1180 cells/µL) than patients with only one stage (mean = 1350 cells/µL) (P = 0.011), and also a lower basophil count (mean = 60 vs 80 cells/µL) (P = 0.01). Conclusion: Patients infected with Plasmodium vivax had a high monocyte count and a low basophil count during the initial stage of infection, with the monocyte count falling and the basophil count rising during the later stage. Patients with more than one parasite stage present tended to have lower lymphocyte and basophil counts than patients with only one stage. This information contributes to a better understanding of the pathological characteristics of Plasmodium vivax infection.

Keywords: plasmodium vivax, Thailand, asexual erythrocytic stages, hematological parameters

Procedia PDF Downloads 196
358 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis

Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith

Abstract:

Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on the utilisation of better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes in healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard Patient Related Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians' perspectives of their hospital's Information Technology (IT). The survey was distributed via the institution's intranet to all contracted doctors, and the survey's qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, with 51% from senior doctors (consultant grades) and the rest from junior grades; the largest group of respondents (52%) came from medicine specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the commonest opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians after two decades of so-called progress towards a paperless healthcare system. Wider use of the survey would provide further insights and could help focus development and delivery to improve the quality and effectiveness of IT for clinicians and their patients.

Keywords: information technology, electronic patient records, digitisation, paperless healthcare

Procedia PDF Downloads 73
357 Palliative Orthovoltage Radiotherapy and Subcutaneous Infusion of Carboplatin for Treatment of Appendicular Osteosarcoma in Dogs

Authors: Kathryn L. Duncan, Charles A. Kuntz, Alessandra C. Santamaria, James O. Simcock

Abstract:

Access to megavoltage radiation therapy for small animals is limited in many locations around the world. This can preclude the use of palliative radiation therapy for the treatment of appendicular osteosarcoma in dogs. The objective of this study was to retrospectively assess the adverse effects and survival times of dogs with appendicular osteosarcoma that were treated with hypofractionated orthovoltage radiation therapy and adjunctive carboplatin chemotherapy administered via a single subcutaneous infusion. Medical records were reviewed retrospectively to identify client-owned dogs with spontaneously occurring appendicular osteosarcoma that was treated with palliative orthovoltage radiation therapy and a single subcutaneous infusion of carboplatin. Data recorded included signalment, tumour location, results of diagnostic imaging, haematologic and serum biochemical analyses, adverse effects of radiation therapy and chemotherapy, and survival times. Kaplan-Meier survival analysis was performed, and log-rank analysis was used to determine the impact of specific patient variables on survival time. Twenty-three dogs were identified that met the inclusion criteria. Median survival time for dogs was 182 days. Eleven dogs had adverse haematologic effects, 3 had adverse gastrointestinal effects, 6 had adverse effects at the radiation site and 7 developed infections at the carboplatin infusion site. No statistically significant differences were identified in survival times based on sex, tumour location, development of infection, or pretreatment serum alkaline phosphatase. Median survival time and incidence of adverse effects were comparable to those previously reported in dogs undergoing palliative radiation therapy with megavoltage or cobalt radiation sources and conventional intravenous carboplatin chemotherapy. The use of orthovoltage palliative radiation therapy may be a reasonable alternative to megavoltage radiation in locations where access is limited.
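
The survival analysis described above (Kaplan-Meier estimation with log-rank comparisons of patient variables) can be sketched briefly. The file name, column names, and the choice of sex as the compared variable are illustrative assumptions, not the authors' dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical case records; columns: survival_days, died (1 = event, 0 = censored), sex
df = pd.read_csv("osteosarcoma_cases.csv")   # one row per dog

kmf = KaplanMeierFitter()
kmf.fit(df["survival_days"], event_observed=df["died"])
print(kmf.median_survival_time_)             # compare with the reported 182 days

males, females = df[df["sex"] == "M"], df[df["sex"] == "F"]
res = logrank_test(males["survival_days"], females["survival_days"],
                   event_observed_A=males["died"], event_observed_B=females["died"])
print(res.p_value)                           # log-rank comparison of survival by sex
```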

Keywords: radiotherapy, veterinary oncology, chemotherapy, osteosarcoma

Procedia PDF Downloads 62
356 Analyzing the Street Pattern Characteristics on Young People’s Choice to Walk or Not: A Study Based on Accelerometer and Global Positioning Systems Data

Authors: Ebru Cubukcu, Gozde Eksioglu Cetintahra, Burcin Hepguzel Hatip, Mert Cubukcu

Abstract:

Obesity and overweight cause serious health problems. Public and private organizations aim to encourage walking in various ways in order to cope with the problem of obesity and overweight. This study aims to understand how the spatial characteristics of the urban street pattern, its connectivity and complexity, influence young people's choice to walk or not. 185 public university students in Izmir, the third largest city in Turkey, participated in the study. Each participant wore an accelerometer and a global positioning system (GPS) device for a week. The accelerometer records the intensity of the participant's activity at a specified time interval, and the GPS device records the locations of those activities. Combining the two datasets, activity maps are derived. These maps are then used to differentiate the participants' walk trips from their motor vehicle trips. Given that, the frequency of walk and motor vehicle trips is calculated at the street segment level, and the street segments are then categorized into two groups: 'preferred by pedestrians' and 'preferred by motor vehicles'. Graph Theory-based accessibility indices are calculated to quantify the spatial characteristics of the streets in the sample. Six different indices are used: (I) edge density, (II) edge sinuosity, (III) eta index, (IV) node density, (V) order of a node, and (VI) beta index. T-tests show that the index values for the 'preferred by pedestrians' and 'preferred by motor vehicles' segments are significantly different. The findings indicate that the spatial characteristics of the street network have a measurable effect on young people's choice to walk or not. Policy implications are discussed. This study is funded by the Scientific and Technological Research Council of Turkey, Project No: 116K358.
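
Several of the indices listed above can be computed from a street graph in a few lines. The sketch below uses the common textbook definitions (beta = edges/nodes, eta = mean segment length, sinuosity = measured length over straight-line distance); the toy graph, edge attributes, and study-area size are assumptions, not the paper's data or exact formulas.

```python
import networkx as nx

# Hypothetical street graph: nodes are intersections, edges are street segments,
# each carrying its measured length and straight-line (Euclidean) distance in metres.
G = nx.Graph()
G.add_edge("A", "B", length=120.0, euclid=110.0)
G.add_edge("B", "C", length=95.0,  euclid=95.0)
G.add_edge("C", "A", length=160.0, euclid=130.0)
G.add_edge("C", "D", length=80.0,  euclid=78.0)

area_km2 = 0.5                                            # study-area size (assumed)
n, e = G.number_of_nodes(), G.number_of_edges()
total_len = sum(d["length"] for _, _, d in G.edges(data=True))

beta_index   = e / n                                      # connectivity: edges per node
node_density = n / area_km2
edge_density = total_len / area_km2
eta_index    = total_len / e                              # mean segment length
sinuosity    = sum(d["length"] / d["euclid"] for _, _, d in G.edges(data=True)) / e

print(beta_index, eta_index, round(sinuosity, 3))
```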

Keywords: graph theory, walkability, accessibility, street network

Procedia PDF Downloads 211
355 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires specification of appropriate earthquake ground motions to be used in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in the performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match the target spectra is a commonly used technique for the estimation of nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among the developed methodologies to select and scale ground motions with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on minimizing the error between the scaled median and the target spectrum, while the dispersion of the earthquake shaking is preserved over the period interval. The impact of spectral variability on the nonlinear response distribution is investigated at the level of inelastic single degree of freedom systems. In order to see the effect of different selection and scaling methodologies on fragility curve estimates, the results are compared with those obtained by a CMS-based scaling methodology. The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
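
A compressed sketch of the scaling step is given below, under the common log-space least-squares convention (not necessarily the exact error measure used by the authors); the spectral ordinates are made up. A single scale factor shifts the suite median onto the target while the record-to-record dispersion about that median is left unchanged, which is the behaviour the abstract describes.

```python
import numpy as np

# Illustrative response spectra (g) at a common set of periods; values are assumptions.
periods       = np.array([0.2, 0.5, 1.0, 1.5, 2.0])       # period interval of interest (s)
target_median = np.array([0.80, 0.55, 0.30, 0.20, 0.14])  # target spectrum
suite_median  = np.array([0.60, 0.42, 0.25, 0.17, 0.12])  # median of the selected records

# Least-squares scale factor in log space over the period interval.
log_resid = np.log(target_median) - np.log(suite_median)
scale = float(np.exp(log_resid.mean()))
print(round(scale, 3), np.round(scale * suite_median, 3))
```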

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 577
354 Real World Cancer Pain Incidence and Treatment in Daily Hospital

Authors: Alexandru Grigorescu, Alexandra Protesanu

Abstract:

Background: Approximately 34-67 percent of cancer patients experience an episode of uncontrolled pain during the course of their disease, depending on the stage. The aim was to provide evidence-based data on pain prevalence, diagnosis, and treatment recommendations within an integrative model of medical oncology and palliative care for patients with a cancer diagnosis in a day hospital. Patients and method: Consultation registers and electronic records of 166 patients (Pts) were studied from April 2022 to March 2023, and Pts with pain syndrome were selected. Pain was objectified using the visual pain scale. To elucidate the causes of the pain, investigations were carried out: bone scintigraphy, CT scan, and PET-CT. The analgesic treatments comprised weak and strong opioids, radiotherapy, and bisphosphonates. Result: During the period mentioned, 166 oncological patients (74 women and 92 men) were treated in the oncology day hospitalization service. There were 1,500 consultations, 40 of which were only for pain. The neoplastic locations were gynecological, malignant melanoma, breast, gastric, bronchopulmonary, colorectal, liver, pancreatic, bladder, and kidney. Seventy Pts presented with pain syndrome. The causes of the pain were bone metastases, compressive tumors, and post-surgical status. Drug treatment: 47 Pts received Tramadol, of whom 10 switched to a major opioid (Oxycodonum, morphine sulfate); 20 Pts were treated with Oxycodonum as first intention; in 5 patients morphine rotation was required; 20 Pts received palliative radiotherapy; and 10 Pts were treated with bisphosphonates. Two Pts required neurosurgery consultation for an antalgic intervention. Five Pts had significant adverse reactions to morphine. All patients and their families were advised by a medical oncologist and a psychologist on lifestyle change. Conclusions: The prevalence of pain was similar to that described in the literature. In most cases, the pain could be managed in the day hospital. Weak and strong opioids represented the main pain therapy. Palliative radiotherapy was the second most effective therapy. Treatment with bisphosphonates was useful. Surgical interventions were rarely indicated. Discussions with patients and their families regarding lifestyle change were important.

Keywords: cancer pain, opioids, medical oncology, palliative care

Procedia PDF Downloads 54
353 Principles of Risk Management in Surgery Department

Authors: Mohammad H. Yarmohammadian, Masoud Ferdosi, Abbas Haghshenas, Fatemeh Rezaei

Abstract:

Surgical procedures aim to preserve human life and improve patients' quality of life. However, there are many potential sources of risk that can cause serious harm to patients. For centuries, managers believed that the technical competence of the surgeon was the only key to a successful surgery, but over the past decade risks have come to be considered in terms of process-based safety procedures, teamwork and inter-departmental communication. Aims: This study aims to determine how process-based surgical risk management should be carried out using the project-management tool known as the Activity Breakdown Structure (ABS). Settings and Design: This study was conducted in two stages. First, a literature review and meetings with professors were carried out to determine the principles and framework of surgical risk management. Next, the teams responsible for the surgical patient journey were involved in subsequent meetings to develop the process-based surgical risk management. Methods and Material: This is a qualitative study in which focus groups with an inductive approach were used. Sampling was performed to achieve representativeness through intensity sampling based on experience and seniority. Analysis method: content analysis of interviews and consensus themes extracted from the focus group discussions were the analysis tools. Results: We developed the patient journey process in 5 main phases, 24 activities and 108 tasks. Then the responsible teams, transposition, and allocated places for performing each task were determined. Some activity and task themes, such as patient identification and records review, were repeated in each phase because of their importance. Conclusions: Risk management of surgical departments is significant, as this facility is the hospital's largest cost and revenue center. Good communication between the surgical team and other clinical teams outside the surgery department, approached through a process-based perspective, could improve the safety of patients undergoing surgery.

Keywords: risk management, activity breakdown structure (ABS), surgical department, medical sciences

Procedia PDF Downloads 286
352 Dietary Micronutritient and Health among Youth in Algeria

Authors: Allioua Meryem

Abstract:

Similar to much of the developing world, Algeria is currently undergoing an epidemiological transition. While malnutrition, under-nutrition and infectious diseases used to be the main causes of poor health, today there is a higher proportion of chronic, non-communicable diseases (NCDs), including cardiovascular disease, diabetes mellitus and cancer. According to World Health Organization (WHO) estimates for Algeria, NCDs accounted for 63% of all deaths in 2010. The objective of this study was to assess the eating habits and anthropometric characteristics of a group of youths aged 15 to 19 years in Tlemcen. This descriptive cross-sectional study was conducted on a total of 806 youths. Nutritional status was classified according to the international IOTF standards: youths were defined as obese if they had a BMI ≥ 95th percentile and as overweight if the BMI was between the 85th and 95th percentiles. Waist circumference (Wc) was classified by the HD criteria, with moderate risk at ≥ the 90th percentile and high risk at ≥ the 95th percentile. The dietary assessment was based on a 24-hour dietary recall assisted by food records, and the USDA nutrient database in the Nutrinux® program was used to analyze dietary intake. The nutrient adequacy ratio (NAR) was calculated by dividing the individual daily intake by the dietary recommended intake (DRI) for each nutrient. In the sample, 9% of the youths were overweight, 3% were obese, and 7.5% had abdominal obesity. Chips, cookies and chocolate were eaten 1-3 times/day, consumption of fried foods during the week was high, and almost half of the youths consumed sugary drinks more than 3 times per week. We observed a decreased intake of energy and protein (P < 0.001, P = 0.003) and of SFA (P = 0.018), and low NARs for phosphorus, iron, magnesium, vitamin B6, vitamin E, folate, niacin and thiamin, reflecting low consumption of fruit, vegetables, milk and milk products. The youths surveyed have eating habits that put them at risk of developing obesity and chronic disease.
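
The NAR computation stated above is a simple per-nutrient ratio. The sketch below shows it with made-up intakes and recommended values; the numbers are illustrative, not the study's data or official DRIs.

```python
# Nutrient Adequacy Ratio (NAR) = individual daily intake / dietary recommended intake (DRI).
# Intakes and DRIs below are illustrative assumptions.
intake = {"iron_mg": 9.5, "folate_ug": 260.0, "vitamin_b6_mg": 1.0}
dri    = {"iron_mg": 15.0, "folate_ug": 400.0, "vitamin_b6_mg": 1.3}

nar = {nutrient: round(intake[nutrient] / dri[nutrient], 2) for nutrient in intake}
print(nar)   # values below 1.0 indicate intake short of the recommendation
```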

Keywords: food intake, health, anthropometric characteristics, Algeria

Procedia PDF Downloads 536
351 Montelukast Doesn’t Decrease the Risk of Cardiovascular Disease in Asthma Patients in Taiwan

Authors: Sheng Yu Chen, Shi-Heng Wang

Abstract:

Aim: Based on human and animal experiments and genetic studies, the cysteinyl leukotrienes LTC4, LTD4 and LTE4 are inflammatory mediators metabolized from arachidonic acid by 5-lipoxygenase, and these substances trigger asthma. In addition, the synthetic pathway of the cysteinyl leukotrienes is relevant to the increase in cardiovascular diseases such as myocardial ischemia and stroke. Given this, we aimed to investigate whether the cysteinyl leukotriene receptor antagonist (LTRA) montelukast, which is used to treat asthma, has potential protective effects against cardiovascular disease. Method: We conducted a cohort study and enrolled participants newly diagnosed with asthma (ICD-9-CM code 493.X) between 2002 and 2011. The data source was the Taiwan National Health Insurance Research Database. Patients with a previous history of myocardial infarction or ischemic stroke were excluded. Among the remaining participants, every montelukast user was matched with two randomly selected non-users by sex and age. Incident cardiovascular diseases, including myocardial infarction and ischemic stroke, were regarded as the outcomes, and participants were followed until the outcome occurred or the end of the follow-up period, whichever came first. To explore the protective effect of montelukast on the risk of cardiovascular disease, we used multivariable Cox regression to estimate the hazard ratio with adjustment for potential confounding factors. Result: There were 55876 newly diagnosed asthma patients who had at least one inpatient claim or at least three outpatient claims. We enrolled 5350 montelukast users and 10700 non-users in this cohort study. The mean (±SD) follow-up time was 5 (±2.19) years in the montelukast group and 5.47 (±2.641) years in the non-user group. Multivariable Cox regression indicated that the risk of incident cardiovascular disease was approximately equal between montelukast users (n=43, 0.8%) and non-users (n=111, 1.04%) (adjusted hazard ratio 0.992; P = 0.9643). Conclusion: In this population-based study, we found that the use of montelukast was not associated with a decrease in incident myocardial infarction or ischemic stroke.
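
The Cox regression step described above can be sketched in a few lines. The example below estimates the adjusted hazard ratio for montelukast use; the file name, column names, and the short covariate list are illustrative assumptions, not the study's analysis file.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis file built from the matched cohort; column names are illustrative.
df = pd.read_csv("asthma_cohort.csv")
# columns: followup_years, cvd_event (1 = incident MI/stroke, 0 = censored),
#          montelukast (1 = user, 0 = matched non-user), age, sex (coded 0/1), ...

cph = CoxPHFitter()
cph.fit(df[["followup_years", "cvd_event", "montelukast", "age", "sex"]],
        duration_col="followup_years", event_col="cvd_event")
print(cph.hazard_ratios_["montelukast"])   # adjusted HR; the study reports ~0.992
```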

Keywords: asthma, inflammation, montelukast, insurance research database, cardiovascular diseases

Procedia PDF Downloads 72
350 Executive Functions Directly Associated with Severity of Perceived Pain above and beyond Depression in the Context of Medical Rehabilitation

Authors: O. Elkana, O. Heyman, S. Hamdan, M. Franko, J. Vatine

Abstract:

Objective: To investigate whether a direct link exists between perceived pain (PP) and executive functions (EF), above and beyond the influence of depression symptoms, in the context of medical rehabilitation. Design: Cross-sectional study. Setting: Rehabilitation hospital. Participants: 125 medical records of hospitalized patients were screened against our inclusion criteria. Only 60 patients were found eligible and were asked to participate; 19 declined to participate for personal reasons. The 41 neurologically intact patients (mean age 46, SD 14.96) who participated in this study were in the sub-acute stage of recovery, spoke fluent Hebrew, had an intact upper limb (to neutralize influences on psychomotor performance), and had no organic brain damage. Main Outcome Measures: EF were assessed using the Wisconsin Card Sorting Test (WCST) and the Stop-Signal Test (SST). PP was measured using 3 well-known pain questionnaires: the Pain Disability Index (PDI), the Short-Form McGill Questionnaire (SF-MPQ), and the Pain Catastrophizing Scale (PCS). A perceived pain index (PPI) was calculated as the mean composite score of the 3 pain questionnaires. Depression symptoms were assessed using the Patient Health Questionnaire (PHQ-9). Results: The results indicate that irrespective of the presence of depression symptoms, PP is directly correlated with response inhibition (SST partial correlation: r = 0.5; p = 0.001) and mental flexibility (WCST partial correlation: r = -0.37; p = 0.021), suggesting decreased performance in EF as PP severity increases. High correlations were found between the 3 pain measurements: SF-MPQ with PDI (r = 0.62, p < 0.001), SF-MPQ with PCS (r = 0.58, p < 0.001), and PDI with PCS (r = 0.38, p = 0.016), and each questionnaire alone was also significantly associated with EF; thus, no specific questionnaire 'pulled' the results obtained by the general index (PPI). Conclusion: Examining the direct association between PP and EF, beyond the contribution of depression symptoms, provides further clinical evidence suggesting that EF and PP share underlying mediating neuronal mechanisms. Clinically, the importance of assessing patients' EF abilities as well as PP severity during rehabilitation is underscored.
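
The partial correlations reported above (pain vs. executive-function scores, controlling for depression) can be sketched with a standard statistics package. The file name and column names below are illustrative stand-ins for the measures described, not the study's dataset.

```python
import pandas as pd
import pingouin as pg

# Hypothetical scored dataset; columns are stand-ins for the measures above:
# ppi (perceived-pain index), sst (stop-signal score), wcst (card-sorting score),
# phq9 (depression symptoms).
df = pd.read_csv("rehab_scores.csv")

# Partial correlation between perceived pain and each EF measure,
# controlling for depression symptoms (PHQ-9).
print(pg.partial_corr(data=df, x="ppi", y="sst", covar="phq9"))
print(pg.partial_corr(data=df, x="ppi", y="wcst", covar="phq9"))
```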

Keywords: depression, executive functions, mental-flexibility, neuropsychology, pain perception, perceived pain, response inhibition

Procedia PDF Downloads 233
349 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models that identify those at risk is very helpful for reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one association-rule model were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential in facilitating the management of patients with a specific disease. Therefore, health interventions or lifestyle changes can be guided by these models to improve the health of individuals at risk.
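
A hedged sketch of the kind of model comparison reported above, using scikit-learn in place of SPSS/Clementine; the data file and feature names are placeholders for the 350-record dataset.

```python
# Compare a decision tree and a neural network on hypothetical MI risk-factor data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

df = pd.read_csv("mi_records.csv")                 # placeholder for the hospital dataset
X = df[["hypertension", "dlp", "smoking", "diabetes", "blood_group_a_pos"]]
y = df["mi"]                                       # 1 = myocardial infarction, 0 = no MI

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "neural network": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)                   # ability to detect true MI cases
    specificity = tn / (tn + fp)
    print(f"{name}: sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```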

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 416
348 Breech Versus Cephalic Elective Caesarean Deliveries – A Comparison of Immediate Neonatal Outcomes

Authors: Genevieve R. Kan, Jolyon Ford

Abstract:

Background: Caesarean section has become the routine route of delivery for breech fetuses, but breech cesarean deliveries are hypothesized to have poorer immediate neonatal outcomes than cephalic deliveries. Accordingly, in many Australian hospitals the pediatric team is routinely required to attend every elective breech cesarean section in case urgent resuscitation is required. Our study aimed to determine whether term elective breech deliveries indeed have worse immediate neonatal outcomes at delivery, which would justify the necessity of pediatric staff presence at every elective breech cesarean delivery and influence the workload of the pediatric team. Objective: Elective breech cesarean deliveries were compared to elective cephalic cesarean deliveries at 37 weeks' gestation or above to evaluate the immediate neonatal outcomes (Apgar scores <7 at 5 minutes, and Special Care Nursery admissions on day 1 of life) of each group. Design: A retrospective cohort study. Method: This study examined 2035 elective breech and cephalic singleton cesarean deliveries at term over 5 years, from July 2017 to July 2022, at Frankston Hospital, a metropolitan hospital in Melbourne, Australia. There were 260 breech deliveries and 1775 cephalic deliveries. De-identified patient data were collected retrospectively from the hospital's electronically integrated pregnancy and birth records to assess demographics and neonatal outcomes. Results: Apgar scores <7 at 5 minutes of life were more frequent in the breech group than in the cephalic group (3.4% vs 1.6%). Special Care Nursery admissions on day 1 of life were also higher in the breech cohort than in the cephalic cohort (9.6% vs 8.7%). Conclusions: Our results support the expected finding that breech deliveries are associated with worse immediate neonatal outcomes. This therefore suggests that routine attendance at elective breech cesarean deliveries by the pediatric team is indeed required to assist with the potentially higher need for neonatal resuscitation and special care nursery admission.
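
For illustration, the outcome proportions reported above can be tabulated and compared with a standard two-proportion test; the abstract itself does not report a significance test, and the event counts below are back-calculated approximately from the stated percentages, so this is only a sketch of how such counts might be analyzed.

```python
# Two-proportion comparison of immediate neonatal outcomes (breech vs cephalic).
# Counts are approximate reconstructions from the reported percentages; illustrative only.
from scipy.stats import chi2_contingency

outcomes = {
    # outcome: (breech events, breech n, cephalic events, cephalic n)
    "Apgar <7 at 5 min": (9, 260, 28, 1775),      # ~3.4% vs ~1.6%
    "SCN admission day 1": (25, 260, 154, 1775),  # ~9.6% vs ~8.7%
}
for name, (b_ev, b_n, c_ev, c_n) in outcomes.items():
    table = [[b_ev, b_n - b_ev], [c_ev, c_n - c_ev]]
    chi2, p, _, _ = chi2_contingency(table)
    print(f"{name}: breech {b_ev/b_n:.1%} vs cephalic {c_ev/c_n:.1%} (chi-square p={p:.3f})")
```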

Keywords: breech, cesarean section, Apgar scores, special care nursery admission

Procedia PDF Downloads 95
347 Men's Intimate Violence: Theory and Practice Relationship

Authors: Omer Zvi Shaked

Abstract:

Intimate Partner Violence (IPV) is a widespread social problem. Since the 1970s, and due to political changes driven by the feminist movement, western society has been changing its attitude towards the phenomenon and has been taking an active approach to reducing its magnitude. Efforts in the form of legislation, awareness and prevention campaigns, women's shelters, and community intervention programs became more prevalent as the years progressed. Although many initiatives were found to be productive, the effectiveness of one has remained questionable throughout the years: intervention programs for men's intimate violence. Surveys outline two main intervention models for men's intimate violence. The first is the Duluth model, which argues that men are socialized to be dominant while women are socialized to be subordinate, and that men are therefore required by social imperative to enforce their dominance, physically if necessary. The Duluth model became the chief authorized intervention program, and some states in the US even regulated it as the standard criminal justice program for men's intimate violence. However, meta-analytic findings demonstrated that, based on partners' reports, Duluth treatment completers have a 44% recidivism rate, with dropout rates ranging between 40% and 85%. The second model is the Cognitive-Behavioral Model (CBT), which is a widely accepted intervention worldwide. The model argues that cognitive misrepresentations of intimate situations frequently precede violent behavior when a predisposition to anger exists. Since anger dysregulation mediates between one's cognitive schemas and the violent response, anger regulation became the chief purpose of the intervention. Yet a meta-analysis found only a 56% risk reduction for CBT interventions. It is, therefore, crucial to understand the background behind the dominance of both the Duluth model and CBT interventions. This presentation will discuss the ways in which theoretical conceptualizations of men's intimate violence, as well as ideologies, have contributed to the wide acceptance of the above-mentioned interventions despite the known lack of scientific and empirical support. First, the presentation will review the prominent interventions for male intimate violence, the Duluth model and CBT. Second, it will review the prominent theoretical models explaining men's intimate violence: the Patriarchal model, the Abusive Personality model, and the Post-Traumatic Stress model. Third, it will discuss the interrelation between theory and practice and the nature of the affinity between research and practice regarding men's intimate violence. Finally, the presentation will set new directions for further research, aiming to improve the effectiveness of interventions with men's intimate violence and to advance social work practice in the field.

Keywords: intimate partner violence, theory and practice relationship, Duluth, CBT, abusive personality, post-traumatic stress

Procedia PDF Downloads 122
346 Do Interventions for Increasing Minorities' Access to Higher Education Work? The Case of Ethiopians in Israel

Authors: F. Nasser-Abu Alhija

Abstract:

In many countries, much effort and many resources are devoted to empowering and integrating minorities within the mainstream population. Major ventures along this route are crafted in higher education institutions, where different outreach programs and methods, such as lenient entry requirements, monetary incentives, learning-skills workshops, tutoring, and mentoring, are utilized. Although there is some information regarding these programs, their effectiveness still needs to be thoroughly examined. The Ethiopian community in Israel is one of the minority groups that has been targeted by sponsoring foundations and higher education institutions with the aim of easing the access, persistence, and success of its young people in higher education and later in the job market. The evaluation study we propose to present focuses on the implementation of a program designed for this purpose. This program offers relevant candidates for study at a prestigious university a variety of generous incentives that include tuition, a living allowance, tutoring, mentoring, skills and empowerment workshops, and cultural meetings. Ten students were selected for the program, and they started their studies in different subject areas three and a half years ago. A longitudinal evaluation has been conducted since the implementation of the program. Data were collected from different sources: participating students, the program coordinator, mentors, tutors, program documents, and university records. Questionnaires and interviews were used to collect data on the different components of the program and on participants' perceptions of their effectiveness. Participants indicate that the lenient entry requirements and the monetary incentives were critical for starting their studies. During the first year, skills and empowerment workshops, tutoring, and mentoring were evaluated as very important for persistence and success in studies. Tutoring was also perceived as very important in the second year, but less importance was attributed to mentoring. Mixed results regarding integration into Israeli culture emerged. The results are discussed with reference to findings from different settings around the world.

Keywords: access to higher education, minority groups, monetary incentives, tutoring, mentoring

Procedia PDF Downloads 366
345 Assessing the Survival Time of Hospitalized Patients in Eastern Ethiopia During 2019–2020 Using the Bayesian Approach: A Retrospective Cohort Study

Authors: Chalachew Gashu, Yoseph Kassa, Habtamu Geremew, Mengestie Mulugeta

Abstract:

Background and Aims: Severe acute malnutrition remains a significant health challenge, particularly in low- and middle-income countries. The aim of this study was to determine the survival time of under-five children with severe acute malnutrition. Methods: A retrospective cohort study was conducted at a hospital, focusing on under-five children with severe acute malnutrition. The study included 322 inpatients admitted to Chiro Hospital in Chiro, Ethiopia, between September 2019 and August 2020, whose data were obtained from medical records. Survival functions were analyzed using Kaplan–Meier plots and log-rank tests. The survival time of children with severe acute malnutrition was further analyzed using the Cox proportional hazards model and Bayesian parametric survival models, employing integrated nested Laplace approximation (INLA) methods. Results: Among the 322 patients, 118 (36.6%) died as a result of severe acute malnutrition. The estimated median survival time for inpatients was 2 weeks. Model selection criteria favored the Bayesian Weibull accelerated failure time model, which demonstrated that age, body temperature, pulse rate, nasogastric (NG) tube usage, hypoglycemia, anemia, diarrhea, dehydration, malaria, and pneumonia significantly influenced survival time. Conclusions: This study revealed that children below 24 months of age, those with altered body temperature or pulse rate, NG tube usage, hypoglycemia, and comorbidities such as anemia, diarrhea, dehydration, malaria, and pneumonia had shorter survival times when affected by severe acute malnutrition. To reduce the death rate of children under 5 years of age, it is necessary to design community-based management of acute malnutrition to ensure early detection and to improve access and coverage for children who are malnourished.
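
A minimal sketch of the survival workflow described above (a Kaplan–Meier estimate plus a Weibull accelerated failure time model), using lifelines rather than the Bayesian INLA machinery of the study; the data file and column names are hypothetical.

```python
# Kaplan-Meier estimate and a (frequentist) Weibull AFT model for time to death
# among children admitted with severe acute malnutrition. The study fit a
# Bayesian Weibull AFT via INLA; lifelines is used here only as an illustration.
import pandas as pd
from lifelines import KaplanMeierFitter, WeibullAFTFitter

df = pd.read_csv("sam_cohort.csv")          # hypothetical: one row per admitted child

kmf = KaplanMeierFitter()
kmf.fit(df["weeks"], event_observed=df["died"])
print(kmf.median_survival_time_)            # the study reports a median of ~2 weeks

aft = WeibullAFTFitter()
aft.fit(
    df[["weeks", "died", "age_months", "temperature", "pulse", "ng_tube",
        "hypoglycemia", "anemia", "diarrhea", "dehydration", "malaria", "pneumonia"]],
    duration_col="weeks",
    event_col="died",
)
aft.print_summary()                         # covariate effects on (log) survival time
```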

Keywords: Bayesian analysis, severe acute malnutrition, survival data analysis, survival time

Procedia PDF Downloads 23
344 Milk Yield and Fingerprinting of Beta-Casein Precursor (CSN2) Gene in Some Saudi Camel Breeds

Authors: Amr A. El Hanafy, Yasser M. Saad, Saleh A. Alkarim, Hussein A. Almehdar, Elrashdy M. Redwan

Abstract:

Camels are substantial providers of transport, milk, sport, meat, shelter, fuel, security, and capital in many countries, particularly Saudi Arabia. Identification of animal breeds has progressed rapidly during the last decade, and advanced molecular techniques are playing a significant role in breed and strain protection laws. Moreover, fingerprinting molecular markers related to productive traits in farm animals is, to our knowledge, among the most important lines of study, as it aims to conserve local genetic resources and to genetically improve local breeds through marker-assisted selection programs. Milk records were taken on two days each week from female camels of the Majahem, Safara, Wathaha, and Hamara breeds on different private farms in the northern Jeddah, Riyadh, and Alwagh governorates, and average weekly yields were calculated. DNA sequencing of the CSN2 gene was used to evaluate genetic variation and to calculate genetic distance values among four Saudi camel populations: Hamra (R), Safra (Y), Wadha (W), and Majaheim (M). In addition, this marker was analyzed to reconstruct the neighbor-joining tree among the evaluated camel breeds. With respect to milk yield during the winter season, results indicated that the average weekly milk yield of the Safara breed (30.05 kg/week) was significantly (p < 0.05) lower than that of the other three breeds, which ranged from 39.68 kg/week for Hamara to 42.42 kg/week for Majahem, while there were no significant differences among these three breeds. The neighbor-joining analysis reconstructed from the DNA variation showed that the samples clustered into two distinct clades. The first clade includes Y (Y4 to Y18) and M (M1 to M9), while the second clade includes all R (R1 to R6) and W (W1 to W6) samples. The genetic distance was 0.0068 between the two groups (M&Y vs. R&W) and 0 within each group.
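
To illustrate the tree-building step, a neighbor-joining tree can be reconstructed from a pairwise genetic distance matrix with Biopython; the distances below reuse the values reported in the abstract (0.0068 between the M/Y and R/W groups, 0 within groups), while treating each breed as a single taxon is a simplification of the per-sample analysis.

```python
# Neighbor-joining tree from CSN2-based genetic distances for four Saudi camel
# breeds (R = Hamra, Y = Safra, W = Wadha, M = Majaheim). Illustrative sketch only.
from Bio.Phylo.TreeConstruction import DistanceMatrix, DistanceTreeConstructor
from Bio import Phylo

# Lower-triangular distance matrix (diagonal included), taxon order: M, Y, R, W
dm = DistanceMatrix(
    names=["M", "Y", "R", "W"],
    matrix=[
        [0],
        [0.0, 0],                  # M-Y within-group distance
        [0.0068, 0.0068, 0],       # R vs M and Y (between groups)
        [0.0068, 0.0068, 0.0, 0],  # W vs M, Y (between) and R (within)
    ],
)
tree = DistanceTreeConstructor().nj(dm)
Phylo.draw_ascii(tree)             # expected: M+Y and R+W fall into two clades
```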

Keywords: milk yield, beta-casein precursor (CSN2), Saudi camel, molecular markers

Procedia PDF Downloads 205