Search results for: logistic
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 815

155 Rural Livelihood under a Changing Climate Pattern in the Zio District of Togo, West Africa

Authors: Martial Amou

Abstract:

This study assessed households' livelihood situation under a changing climate pattern in the Zio district of Togo, West Africa. The study examined three aspects: (i) assessment of households' livelihood situation under a changing climate pattern, (ii) farmers' perception and understanding of local climate change, and (iii) determinants of adaptation strategies undertaken in cropping patterns in response to climate change. To this end, secondary data sources and survey data collected from 235 farmers in four villages in the study area were used. A conceptual framework adapted from DFID's Sustainable Livelihood Framework, a two-step binary logistic regression model, and descriptive statistics were the methodological approaches used. Based on the Sustainable Livelihood Approach (SLA), the factors underpinning the livelihoods of the rural community were grouped into social, natural, physical, human, and financial capital. The study found that the overall livelihood index in the study area (34%) is below the standard average household livelihood security index (50%). Natural capital was the poorest asset (13%), which will severely affect the sustainability of livelihoods in the long run. The descriptive statistics and the first-step regression (selection model) indicated that most farmers in the study area have a clear understanding of climate change, even though they have no notion of greenhouse gases as the main cause of the issue. In the second-step regression (outcome model), education, farming experience, access to credit, access to extension services, cropland size, membership of a social group, and distance to the nearest input market were the significant determinants of adaptation measures undertaken in cropping patterns by farmers in the study area.
Based on these results, recommendations are made to farmers, policy makers, institutions, and development service providers to better target interventions that build, promote, or facilitate the adoption of adaptation measures with the potential to build resilience to climate change and thereby improve rural livelihoods.
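The two-step estimation described above (a selection model for perceiving climate change, then an outcome model for adaptation fitted only on perceivers) can be sketched in plain Python. Everything below is illustrative: the farmer records, features, and labels are invented, and a simple gradient-descent fit stands in for the study's actual estimator.

```python
import math

def fit_logistic(X, y, lr=0.1, iters=5000):
    """Fit a logistic regression by plain gradient descent.
    X: rows of features (no intercept column); y: 0/1 labels."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)  # w[0] is the intercept
    for _ in range(iters):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical farmers: [years_of_education, access_to_credit]
X = [[0, 0], [2, 0], [4, 1], [6, 1], [8, 1], [1, 0], [7, 1], [3, 0]]
perceives = [0, 0, 1, 1, 1, 0, 1, 0]   # step 1 label: perceives climate change
adapts    = [0, 0, 0, 1, 1, 0, 1, 0]   # step 2 label: adapts cropping pattern

# Step 1 (selection model): who perceives climate change
w1 = fit_logistic(X, perceives)

# Step 2 (outcome model): adaptation, fitted only among perceivers
sub = [(xi, a) for xi, p, a in zip(X, perceives, adapts) if p == 1]
w2 = fit_logistic([xi for xi, _ in sub], [a for _, a in sub])
```

The key design point is that the second-step sample is conditioned on the first-step outcome, which is why the study reports the two regressions separately.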

Keywords: climate change, rural livelihood, cropping pattern, adaptation, Zio District

Procedia PDF Downloads 300
154 A Case-Control Study on Dietary Heme/Nonheme Iron and Colorectal Cancer Risk

Authors: Alvaro L. Ronco

Abstract:

Background and purpose: Although our country is a developing one, it has a typical Western meat-rich dietary style. Based on estimates of heme and nonheme iron contents in representative foods, we carried out the present epidemiologic study with the aim of accurately analyzing dietary iron and its role in CRC risk. Subjects/methods: Patients (611 CRC incident cases and 2394 controls, all belonging to public hospitals of our capital city) were interviewed through a questionnaire covering socio-demographic, reproductive, and lifestyle variables, and a 64-item food frequency questionnaire that asked about food intake 5 years before the interview. The sample included 1937 men and 1068 women. Controls were matched to cases by sex and age (± 5 years). Food-derived nutrients were calculated from available databases. Total dietary iron was calculated and classified by heme or nonheme source, following data from specific Dutch and Canadian studies, and additionally adjusted for energy. Odds ratios (OR) and 95% confidence intervals were calculated through unconditional logistic regression, adjusting for relevant potential confounders (education, body mass index, family history of cancer, energy, infusions, and others). A heme/nonheme (H/NH) ratio was created, and the variables of interest were categorized into tertiles for analysis purposes. Results: The following risk estimates correspond to the highest tertiles. Total iron intake showed no association with CRC risk either among men (OR=0.83, ptrend =.18) or among women (OR=1.48, ptrend =.09). Heme iron was positively associated among men (OR=1.88, ptrend < .001) and in the overall sample (OR=1.44, ptrend =.002); however, it was not associated among women (OR=0.91, ptrend =.83). Nonheme iron showed an inverse association among men (OR=0.53, ptrend < .001) and in the overall sample (OR=0.78, ptrend =.04), but was not associated among women (OR=1.46, ptrend =.14).
Regarding the H/NH ratio, risk increased only among men (OR=2.12, ptrend < .001) but showed no association among women (OR=0.81, ptrend =.29). Conclusions: We observed different types of associations between CRC risk and high dietary heme iron, nonheme iron, and the H/NH ratio. Therefore, the source of the available iron might be important as a link to colorectal carcinogenesis, perhaps suggesting a need to reconsider the animal/plant proportions of this vital mineral within the diet. Nevertheless, the different associations observed for each sex demand further study to clarify these points.
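The tertile-based odds ratios with 95% confidence intervals reported above can be illustrated with a small Python helper. The 2x2 counts below are hypothetical, not the study's data, and the Wald interval here is an unadjusted sketch, whereas the study's ORs come from unconditional logistic regression with confounder adjustment.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: highest vs. lowest heme-iron tertile
or_, lo, hi = odds_ratio_ci(120, 380, 80, 420)
```

An interval excluding 1.0 corresponds to a statistically significant association at the 5% level, which is how the reported ORs are read.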

Keywords: chelation, colorectal cancer, heme, iron, nonheme

Procedia PDF Downloads 146
153 Adapting to Rural Demographic Change: Impacts, Challenges and Opportunities for Ageing Farmers in Prachin Buri Province, Thailand

Authors: Para Jansuwan, Kerstin K. Zander

Abstract:

Most people in rural Thailand still depend on agriculture. Rural areas are undergoing changes in their demographic structure, with a growing older population, out-migration of younger people, and a shift away from agricultural work towards manufacturing and services. These changes may lead to a decline in agricultural productivity and to food insecurity. Our research aims to examine older farmers' perceptions of how rural demographic change affects them, to investigate how farmers may change their agricultural practices to cope with their ageing, and to explore the factors affecting these changes, including the opportunities and challenges arising from them. Data were collected through a household survey of 368 farmers in Prachin Buri province in central Thailand, a main area of agricultural production. A series of binomial logistic regression models was applied to analyse the data. We found that most farmers suffered from age-related diseases, which compromised their working capacity. Most farmers attempted to reduce labour-intensive work, by stopping farming and transferring farmland to their children (41%), stopping farming and passing the land to others (e.g., selling or leasing it out) (28%), or continuing farming while making some changes (e.g., changing crops, employing additional workers) (24%). Farmers' health and having a potential farm successor were positively associated with the probability of stopping farming by transferring the land to the children. Farmers with a successor were also less likely to stop farming by passing the land to others. Farmers' age was negatively associated with the likelihood of continuing farming while making some changes. The results show that most farmers base their decisions on the hope that their children will take over the farm, and that without a successor, farmers lease out or sell the land.
Without a successor, they also no longer invest in expanding and improving their farm production, especially in adopting innovative technologies that could help them maintain farm productivity. To improve farmers' quality of life and sustain their farm productivity, policies are needed that support the viability of farms, access to a pension system, and the smooth and successful transfer of the land to a successor.

Keywords: rural demographic change, older farmer, stopping farming, continuing farming, health and age, farm successor, Thailand

Procedia PDF Downloads 75
152 A Cross-Sectional Study on Clinical Self-Efficacy of Final Year School of Nursing Students among Universities of Tigray Region, Northern Ethiopia

Authors: Awole Seid, Yosef Zenebe, Hadgu Gerensea, Kebede Haile Misgina

Abstract:

Background: Clinical competence is one of the ultimate goals of nursing education. Clinical skill is more than successfully performing tasks; it incorporates client assessment, identification of deficits, and the ability to think critically to provide solutions. Assessing clinical competence, particularly identifying gaps that need improvement and determining the educational needs of nursing students, is of great importance in nursing education. This study therefore aims to determine the clinical self-efficacy of final-year nursing students at three universities of the Tigray Region. Methods: A cross-sectional study was conducted on 224 final-year students from the departments of nursing, psychiatric nursing, and midwifery at three universities of the Tigray region. An anonymous self-administered questionnaire was administered to generate the data, collected in June 2017. The data were analyzed using SPSS version 20, and the results are described using tables and charts as required. Logistic regression was employed to test associations. Results: The mean age of students was 22.94 ± 1.44 years. Overall, 21% of students had been placed in a department in which they were not interested. The study demonstrated that 28.6% had poor and 71.4% had good perceived clinical self-efficacy. In addition, 43.8% of psychiatric nursing and 32.6% of comprehensive nursing students had poor clinical self-efficacy. Among the four domains, 39.3% and 37.9% had poor clinical self-efficacy with regard to 'Professional development' and 'Management of care', respectively. Location of the institution [AOR=3.480 (1.333 - 9.088), p=0.011], interest during department selection [AOR=2.202 (1.045 - 4.642), p=0.038], and the theory-practice gap [AOR=0.224 (0.110 - 0.457), p<0.001] were significantly associated with perceived clinical self-efficacy. Conclusion: The proportion of students with poor clinical self-efficacy was high.
Location of the institution, the theory-practice gap, and students' interest in the discipline were the significant predictors of clinical self-efficacy. Students from the younger universities had good clinical self-efficacy. During department selection, students' interest should be respected. The universities and other stakeholders should improve the capacity of surrounding affiliate teaching hospitals to set and improve care standards in order to narrow the theory-practice gap. School faculties should provide training to hospital staff and monitor standards of clinical procedures.
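The adjusted odds ratios reported in abstracts like this one (e.g., AOR=2.202 with CI 1.045-4.642) are the exponentiated coefficients of the fitted logistic model. A minimal sketch of that conversion, using a made-up coefficient and standard error rather than the study's fitted values:

```python
import math

def aor_with_ci(beta, se, z=1.96):
    """Convert a fitted logit coefficient and its standard error
    into an adjusted odds ratio with a Wald 95% CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for 'interest during department selection'
aor, lo, hi = aor_with_ci(0.789, 0.380)
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the AOR, which matches the pattern of the intervals reported above.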

Keywords: clinical self-efficacy, nursing students, Tigray, northern Ethiopia

Procedia PDF Downloads 138
151 Delays for Emergency Cesarean Sections and Neonatal Outcomes in Three Rural District Hospitals in Rwanda: A Retrospective Cross-Sectional Study

Authors: J. Niyitegeka, G. Nshimirimana, A. Silverstein, J. Odhiambo, Y. Lin, T. Nkurunziza, R. Riviello, S. Rulisa, P. Banguti, H. Magge, M. Macharia, J. P. Dushime, R. Habimana, B. Hedt-Gauthier

Abstract:

In low-resource settings, women needing an emergency cesarean section experience various delays in both reaching and receiving care, delays that are often linked to poor neonatal outcomes. In this study, we quantified different measures of delay and assessed the association between these delays and neonatal outcomes at three rural district hospitals in Rwanda. This retrospective study included 441 neonates and their mothers who underwent emergency cesarean sections in 2015 at Butaro, Kirehe, and Rwinkwavu District Hospitals. Four possible delays were measured: time from start of labor to district hospital admission, travel time from a health center to the district hospital, time from admission to surgical incision, and time from the decision for the emergency cesarean section to surgical incision. Neonatal outcomes were categorized as unfavorable (APGAR < 7 or death) or favorable (APGAR ≥ 7). We assessed the relationship between each type of delay and neonatal outcomes using multivariate logistic regression. In our study, 38.7% (108 of 279) of neonates' mothers labored for 12 to 24 hours before hospital admission, and 44.7% (159 of 356) of mothers were transferred from health centers that required 30 to 60 minutes of travel time to reach the district hospital. 48.1% (178 of 370) of cesarean sections started within five hours after admission, and 85.2% (288 of 338) started more than thirty minutes after the decision for the emergency cesarean section was made. Neonatal outcomes were significantly worse among mothers with more than 90 minutes of travel time from the health center to the district hospital, compared to health centers attached to the hospital (OR = 5.12, p = 0.02). Neonatal outcomes also differed significantly by decision-to-incision interval; neonates whose cesarean deliveries started more than thirty minutes after the decision had better outcomes than those started immediately (OR = 0.32, p = 0.04).
Interventions that decrease barriers to accessing maternal health care services can improve neonatal outcomes after emergency cesarean section. Triaging could explain the inverse relationship between decision-to-incision time and neonatal outcome; this warrants further study.

Keywords: Africa, emergency obstetric care, rural health delivery, maternal and child health

Procedia PDF Downloads 199
150 Prediction of Alzheimer's Disease Based on Blood Biomarkers and Machine Learning Algorithms

Authors: Man-Yun Liu, Emily Chia-Yu Su

Abstract:

Alzheimer's disease (AD) is a public health crisis of the 21st century. AD is a degenerative brain disease and the most common cause of dementia, a costly burden on the healthcare system. Unfortunately, the cause of AD is poorly understood; furthermore, the treatments for AD so far can only alleviate symptoms rather than cure or stop the progress of the disease. Currently, there are several ways to diagnose AD: medical imaging can be used to distinguish between AD, other dementias, and early-onset AD, and cerebrospinal fluid (CSF) can be analyzed. Compared with other diagnostic tools, a blood (plasma) test has advantages as an approach to population-based disease screening because it is simpler, less invasive, and cost-effective. In our study, we used the blood biomarkers dataset of The Alzheimer's Disease Neuroimaging Initiative (ADNI), funded by the National Institutes of Health (NIH), for data analysis and to develop a prediction model. We used independent analysis of datasets to identify plasma protein biomarkers predicting early-onset AD. First, to compare basic demographic statistics between the cohorts, we used SAS Enterprise Guide for data preprocessing and statistical analysis. Second, we used logistic regression, a neural network, and a decision tree to validate the biomarkers in SAS Enterprise Miner. The data generated from ADNI contained 146 blood biomarkers from 566 participants, including cognitively normal (healthy) individuals, individuals with mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD). Participants' samples were separated into two groups: healthy versus MCI, and healthy versus AD. We used the two groups to compare important biomarkers of AD and MCI. In preprocessing, we used a t-test to filter 41/47 features between the two groups (healthy and AD, healthy and MCI) before applying machine learning algorithms. We then built models with four machine learning methods; the best AUCs for the two groups were 0.991 and 0.709, respectively.
We want to stress that a simple, less invasive, common blood (plasma) test may also enable early diagnosis of AD. In our opinion, the results provide evidence that blood-based biomarkers might be an alternative diagnostic tool before further examination with CSF analysis and medical imaging. A comprehensive study of the differences in blood-based biomarkers between AD patients and healthy subjects is warranted. Early detection of AD progression will allow physicians the opportunity for early intervention and treatment.
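The t-test filtering step can be sketched without any statistics library: compute Welch's t statistic for each biomarker between the two cohorts and keep only features above a cutoff. The rows and the cutoff below are invented for illustration; the study's actual filtering used its own thresholds on the ADNI data.

```python
import math

def welch_t(xs, ys):
    """Welch's t statistic for two independent samples."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)  # sample variances
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

def filter_features(healthy, ad, t_cut=2.0):
    """Keep indices of features whose |t| exceeds the cutoff."""
    return [j for j in range(len(healthy[0]))
            if abs(welch_t([r[j] for r in healthy],
                           [r[j] for r in ad])) > t_cut]

# Hypothetical biomarker rows: feature 0 differs between groups, feature 1 does not
healthy = [[1.0, 5.1], [1.2, 4.9], [0.9, 5.0], [1.1, 5.2]]
ad      = [[2.0, 5.0], [2.2, 5.1], [1.9, 4.8], [2.1, 5.0]]
kept = filter_features(healthy, ad)
```

Only the surviving feature columns would then be fed to the downstream classifiers, mirroring the 41/47-feature reduction described above.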

Keywords: Alzheimer's disease, blood-based biomarkers, diagnostics, early detection, machine learning

Procedia PDF Downloads 296
149 Drug Therapy Problem and Its Contributing Factors among Pediatric Patients with Infectious Diseases Admitted to Jimma University Medical Center, South West Ethiopia: Prospective Observational Study

Authors: Desalegn Feyissa Desu

Abstract:

Drug therapy problems are a significant challenge to providing high-quality health care for patients. They are associated with morbidity, mortality, increased hospital stay, and reduced quality of life; moreover, pediatric patients are especially susceptible to them. This study therefore aimed to assess drug therapy problems and their contributing factors among pediatric patients diagnosed with infectious disease admitted to the pediatric ward of Jimma University Medical Center from April 1 to June 30, 2018. A prospective observational study was conducted among pediatric patients with infectious disease admitted during that period. Drug therapy problems were identified using Cipolle and Strand's classification of drug-related problems. Patients' written informed consent was obtained after explaining the purpose of the study, and patient-specific data were collected using a structured questionnaire. Data were entered into EpiData version 4.0.2 and then exported to the statistical software package version 21.0 for analysis. To identify predictors of drug therapy problem occurrence, multiple stepwise backward logistic regression analysis was done. The 95% CI was used to show the accuracy of the data analysis, and statistical significance was set at p < 0.05. A total of 304 pediatric patients were included in the study. Of these, 226 (74.3%) had at least one drug therapy problem during their hospital stay, with a total of 356 drug therapy problems identified among them. Non-compliance (28.65%) and dose too low (27.53%) were the most common types of drug therapy problems, while disease comorbidity [AOR=3.39, 95% CI= (1.89-6.08)], polypharmacy [AOR=3.16, 95% CI= (1.61-6.20)], and a hospital stay of more than six days [AOR=3.37, 95% CI= (1.71-6.64)] were independent predictors of drug therapy problem occurrence. Drug therapy problems were common in pediatric patients with infectious disease in the study area.
The presence of comorbidity, polypharmacy, and prolonged hospital stay were the predictors of drug therapy problems in the study area. Therefore, to close the significant gaps in pediatric pharmaceutical care, clinical pharmacists, pediatricians, and other health care professionals have to work in collaboration.

Keywords: drug therapy problem, pediatric, infectious disease, Ethiopia

Procedia PDF Downloads 127
148 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components

Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura

Abstract:

This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain-activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict the eight finger-movement directions using sparse logistic regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied in two settings, namely the single-participant level and the group level. At the single-participant level, the EEG datasets used in the first and second stages originated from the same participant. At the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination without repetition of the EEG datasets of five of the six participants, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% at the single-participant level, significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% at the group level, also significantly higher than the chance level (12.49 ± 0.01%).
The classification accuracy within [−45°, 45°] of the true direction was 70.03 ± 8.14% at the single-participant level and 62.63 ± 6.07% at the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
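The "within [−45°, 45°] of the true direction" metric can be made concrete: with eight directions spaced 45° apart (coded 0-7), a prediction counts as a hit if it is the true direction or one of its two circular neighbours, so the lenient chance level is 3/8 rather than 1/8. A minimal sketch with invented trial labels:

```python
def within_45(pred, true, n_dirs=8):
    """True if the predicted direction is within one 45-degree step
    of the true direction on the 8-direction circle."""
    diff = (pred - true) % n_dirs
    return min(diff, n_dirs - diff) <= 1

def lenient_accuracy(preds, trues):
    """Fraction of predictions within [-45, 45] degrees of the truth."""
    hits = sum(within_45(p, t) for p, t in zip(preds, trues))
    return hits / len(preds)

# Hypothetical trial labels (directions coded 0..7)
preds = [0, 1, 3, 7, 4, 4]
trues = [0, 2, 6, 0, 4, 1]
acc = lenient_accuracy(preds, trues)
```

The modular difference is what makes the metric wrap correctly, e.g., directions 7 and 0 are adjacent on the circle.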

Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding

Procedia PDF Downloads 115
147 “Environmental-Friendly” and “People-Friendly” Project for a New North-East Italian Hospital

Authors: Emanuela Zilli, Antonella Ruffatto, Davide Bonaldo, Stefano Bevilacqua, Tommaso Caputo, Luisa Fontana, Carmelina Saraceno, Antonio Sturaroo, Teodoro Sava, Antonio Madia

Abstract:

The new hospital in Cittadella - ULSS 6 Euganea Health Trust, in the North-East of Italy (400 beds, project completion date in 2026), will partially take the place of the existing building. Interesting features have been proposed in order to design a modern, "environmental-friendly" and "people-friendly" building. Specific multidisciplinary meetings (involving stakeholders and professionals with different backgrounds) have been organized on a periodic basis to guarantee the appropriate implementation of logistic and organizational solutions related to eco-sustainability, integration with the context, and the concepts of "design for all" and "humanization of care." The resulting building will be composed of organic shapes determined by the external environment (sun movement, climate, landscape, pre-existing buildings, roads) and the needs of the internal environment (areas of care and diagnostic-treatment paths reorganized with experience gained during the pandemic), with extensive use of renewable energy, solar panels, a 4th-generation heating system, and sanitised, maintainable surfaces. Particular attention is paid to the quality of the staff areas, which include areas dedicated to psycho-physical well-being (relaxation points, a yoga gym), study rooms, and a centralized conference room. Outdoor recreational spaces and gardens for music and watercolour therapy will be included; a tai-chi gym is dedicated to oncology patients. Integration in the urban and social context is emphasized through window placement toward the gardens (maternal-infant, mental health, and rehabilitation wards). Service areas such as dialysis, radiology, and the labs have views of the medieval walls, the symbol of the city's history.
The new building has been designed to pursue the maximum level of eco-sustainability, harmony with the environment, and integration with the historical, urban, and social context; the concept of humanization of care has been considered in all the phases of the project management.

Keywords: environmental-friendly, humanization, eco-sustainability, new hospital

Procedia PDF Downloads 72
146 Factors Associated with Death during Tuberculosis Treatment of Patients Co-Infected with HIV at a Tertiary Care Setting in Cameroon: An 8-Year Hospital-Based Retrospective Cohort Study (2006-2013)

Authors: A. A. Agbor, Jean Joel R. Bigna, Serges Clotaire Billong, Mathurin Cyrille Tejiokem, Gabriel L. Ekali, Claudia S. Plottel, Jean Jacques N. Noubiap, Hortence Abessolo, Roselyne Toby, Sinata Koulla-Shiro

Abstract:

Background: Contributors to fatal outcomes in patients undergoing tuberculosis (TB) treatment in the setting of HIV co-infection are poorly characterized, especially in sub-Saharan Africa. Our study's aim was to assess factors associated with death in TB/HIV co-infected patients during the first 6 months of their TB treatment. Methods: We conducted a tertiary-care hospital-based retrospective cohort study from January 2006 to December 2013 at the Yaoundé Central Hospital, Cameroon. We reviewed medical records to identify hospitalized co-infected TB/HIV patients aged 15 years and older. Death was defined as any death occurring during TB treatment, as per the World Health Organization's recommendations. Logistic regression analysis identified factors associated with death; magnitudes of association were expressed as adjusted odds ratios (aOR) with 95% confidence intervals. A p value < 0.05 was considered statistically significant. Results: The 337 patients enrolled had a mean age of 39.3 (+/- 10.3) years, and more than half (54.3%) were women. TB treatment outcomes included: treatment success in 60.8% (n=205), death in 29.4% (n=99), not evaluated in 5.3% (n=18), loss to follow-up in 4.2% (n=14), and failure in 0.3% (n=1). After exclusion of patients lost to follow-up or not evaluated, death in TB/HIV co-infected patients during TB treatment was associated with: a TB diagnosis made before national implementation of guidelines on initiation of antiretroviral therapy (aOR = 2.50 [1.31-4.78]; p = 0.006), the presence of other AIDS-defining infections (aOR = 2.73 [1.27-5.86]; p = 0.010), non-AIDS comorbidities (aOR = 3.35 [1.37-8.21]; p = 0.008), not receiving co-trimoxazole prophylaxis (aOR = 3.61 [1.71-7.63]; p = 0.001), not receiving antiretroviral therapy (aOR = 2.45 [1.18-5.08]; p = 0.016), and CD4 cell counts < 50 cells/mm3 (aOR = 16.43 [1.05-258.04]; p = 0.047).
Conclusions: The success rate of anti-tuberculosis treatment among hospitalized TB/HIV co-infected patients in our setting is low. Mortality in the first 6 months of treatment was high and strongly associated with specific clinical factors, including states of greater immunosuppression, highlighting the urgent need for targeted interventions, including the provision of antiretroviral therapy and co-trimoxazole prophylaxis, to enhance patient outcomes.

Keywords: TB/HIV co-infection, death, treatment outcomes, factors

Procedia PDF Downloads 421
145 Storage Assignment Strategies to Reduce Manual Picking Errors with an Emphasis on an Ageing Workforce

Authors: Heiko Diefenbach, Christoph H. Glock

Abstract:

Order picking, i.e., the order-based retrieval of items in a warehouse, is an important time- and cost-intensive process in many logistic systems. Despite the ongoing trend of automation, most order picking systems are still manual picker-to-parts systems, in which human pickers walk through the warehouse to collect ordered items. Human work in warehouses is not free from errors, and order pickers may at times pick the wrong item or the incorrect number of items. Errors can cause additional costs and significant correction efforts. Moreover, age might increase a person's likelihood of making mistakes; hence, the negative impact of picking errors might increase for the aging workforce currently witnessed in many regions globally. A significant amount of research has focused on making order picking systems more efficient. Among other factors, storage assignment, i.e., the assignment of items to storage locations (e.g., shelves) within the warehouse, has been subject to optimization, usually with the objective of assigning items to storage locations such that order picking times are minimized. Surprisingly, there is a lack of research concerned with picking errors and approaches to preventing them. This paper hypothesizes that the storage assignment of items can affect the probability of picking errors. For example, storing similar-looking items apart from one another might reduce confusion. Moreover, storing items that are hard to count or require a lot of counting at easy-to-access and easy-to-comprehend shelf heights might reduce the probability of picking the wrong number of items. Based on this hypothesis, the paper discusses how to incorporate error-prevention measures into mathematical models for storage assignment optimization. Various approaches, with their respective benefits and shortcomings, are presented and mathematically modeled. To investigate the newly developed models further, they are compared to conventional storage assignment strategies in a computational study.
The study specifically investigates how the importance of error prevention increases as pickers become more prone to errors, due to age, for example. The results suggest that considering error-prevention measures in storage assignment can reduce error probabilities with only minor decreases in picking efficiency. The results might be especially relevant for an aging workforce.
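One of the error-prevention ideas above, placing hard-to-count, error-prone items at the easiest-to-access shelf heights, can be sketched as a greedy assignment. This is an illustrative heuristic with invented item and slot data, not the paper's mathematical model:

```python
def assign_storage(items, slots):
    """Greedy assignment: most error-prone items first, each to the
    easiest-to-access free slot; ties broken by pick frequency.
    items: (name, error_proneness, pick_frequency) tuples.
    slots: (slot_id, access_difficulty) tuples; lower difficulty
    means the ergonomic 'golden zone'."""
    free = sorted(slots, key=lambda s: s[1])            # easiest slots first
    order = sorted(items, key=lambda i: (-i[1], -i[2])) # error-prone first
    return {name: free.pop(0)[0] for name, _, _ in order}

items = [("screws_small", 0.9, 40),     # hard to count -> error-prone
         ("box_large", 0.1, 80),        # easy to pick correctly
         ("labels_similar", 0.7, 20)]   # easily confused
slots = [("top_shelf", 3), ("eye_level", 1), ("mid_shelf", 2)]
plan = assign_storage(items, slots)
```

A real model would trade this error criterion off against travel time, which is exactly the tension the computational study above investigates.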

Keywords: aging workforce, error prevention, order picking, storage assignment

Procedia PDF Downloads 175
144 Factors Associated with Involvement in Physical Activity among Children (Aged 6-18 Years) Training at Excel Soccer Academy in Uganda

Authors: Syrus Zimaze, George Nsimbe, Valley Mugwanya, Matiya Lule, Edgar Watson, Patrick Gwayambadde

Abstract:

Physical inactivity is a growing global epidemic, recognised as a major public health challenge. Globally, there are alarming rates of children reported with cardiovascular disease and obesity, with limited interventions. In Sub-Saharan Africa, there is limited information about involvement in physical activity, especially among children aged 6 to 18 years. The aim of this study was to explore factors associated with involvement in physical activity among children in Uganda. Methods: We included all parents with children aged 6 to 18 years training with the Excel Soccer Academy between January 2017 and June 2018. Physical activity was defined as time spent participating in routine soccer training at the academy for more than 30 days. Each child's attendance was recorded, and parents provided demographic and socioeconomic data. Data on predictors of involvement in physical activity were collected using a standardized questionnaire, and descriptive statistics and frequencies were used. Binary logistic regression was used at the multivariable level, adjusting for education, residence, means of transport, and access to information technology. Results: Overall, 356 parents were interviewed; boys (318, 89.3%) engaged more in physical activity than girls. The median age was 13 years (IQR: 6-18) among children and 42 years (IQR: 37-49) among parents. The median time spent at the Excel Soccer Academy was 13.4 months (IQR: 4.6-35.7). Most of the children attended formal education (p < 0.001). Factors associated with involvement in physical activity included: owning a permanent house compared to a rented house (odds ratio [OR]: 2.84, 95% CI: 2.09-3.86, p < 0.0001), owning a car compared to using public transport (OR: 5.64, CI: 4.80-6.63, p < 0.0001), a parent having received formal education compared to non-formal education (OR: 2.93, CI: 2.47-3.46, p < 0.0001), and daily access to information technology (OR: 0.40, CI: 0.25-0.66, p < 0.001).
Parents' age and gender were not associated with involvement in physical activity. Conclusions: Socioeconomic factors were positively associated with involvement in physical activity, with boys participating more than girls in soccer activities. More interventions are required geared towards increasing girls' participation in physical activity, as well as interventions targeting children from less privileged homes.

Keywords: physical activity, Sub-Saharan Africa, social economic factors, children

Procedia PDF Downloads 132
143 Determinants of Household Food Security in Addis Ababa City Administration

Authors: Estibe Dagne Mekonnen

Abstract:

In recent years, the prevalence of undernourishment was 30 percent for sub-Saharan Africa, compared with 16 percent for Asia and the Pacific (Ali, 2011). In Ethiopia, almost 40 percent of the total population and 57 percent of the population of Addis Ababa live below the international poverty line of US$ 1.25 per day (UNICEF, 2009). This study aims to analyze the determinants of household food security in the Addis Ababa city administration. Primary data were collected from a survey of 256 households in the selected sub-cities, namely Addis Ketema, Arada, and Kolfe Keranio, in the year 2022. Both purposive and multi-stage cluster random sampling procedures were employed to select study areas and respondents. Descriptive statistics and an ordered logistic regression model were used to test the formulated hypotheses. The result reveals that out of the total sampled households, 25% were food secure, 13% were mildly food insecure, 26% were moderately food insecure and 36% were severely food insecure. The study indicates that household family size, house ownership, household income, household food source, household asset possession, household awareness of inflation, household access to social protection programs, household access to credit and saving, and household access to training and supervision on food security have a positive and significant effect on the likelihood of household food security. However, the marital status of the household head, the employment sector of the household head, the dependency ratio and the household's non-food expenditure have a negative and significant influence on household food security status. The study finally suggests that the government, in collaboration with financial institutions and NGOs, should work on sustaining household food security by creating awareness, providing credit, facilitating rural-urban linkages between producers and consumers, and improving urban infrastructure.
Moreover, the government should also work closely with and monitor consumer goods suppliers and, if possible, find a way to subsidize consumable goods for more insecure households so that they become food secure. Last but not least, keeping the country's peace will play a crucial role in sustaining food security.
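The ordered logistic (proportional-odds) model used above assigns each household to one of four ordered food-security categories. A minimal sketch of how such a model turns a linear predictor and cutpoints into category probabilities (the coefficients and cutpoints here are hypothetical, not the study's estimates):

```python
import math

def ordered_logit_probs(xb, cuts):
    """Category probabilities under a proportional-odds (ordered logit) model.
    xb   : linear predictor for one household (hypothetical coefficients)
    cuts : increasing cutpoints separating the K ordered categories
    P(Y <= k) = logistic(cut_k - xb); category probabilities are differences.
    """
    logistic = lambda t: 1.0 / (1.0 + math.exp(-t))
    cum = [logistic(c - xb) for c in cuts] + [1.0]
    probs = [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]
    return probs

# Four ordered categories: severe, moderate, mild insecurity, food secure
probs = ordered_logit_probs(xb=0.3, cuts=[-0.5, 0.4, 1.2])
print([round(p, 3) for p in probs])
```

A positive coefficient on, say, household income raises xb and shifts probability mass toward the more food-secure categories.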

Keywords: determinants, household, food security, order logit model, Addis Ababa

Procedia PDF Downloads 28
142 Medical Student's Responses to Emotional Content in Doctor-Patient Communication: To Explore Differences in Communication Training of Medical Students and Its Impact on Doctor-Patient Communication

Authors: Stephanie Yun Yu Law

Abstract:

Background: This study investigates communication between trainee doctors and patients, especially how the doctor's reaction to emotional issues the patient expresses in the consultation affects patient satisfaction. Objectives: There are three aims in this study: 1) How do trainee doctors react to patients' emotional cues in an OSCE station? 2) Are there differences in the type of response to emotional cues between first-year and third-year students? 3) Is response type (reducing space) related to OSCE outcome (patient satisfaction and expert rating)? Methods: Fifteen OSCE stations were videotaped, of which 9 were stations with first-year students and 6 were with third-year students. OSCE outcomes were measured by the Communication Assessment Tool and the Examiners Checklist. Analyses: All patients' cues/concerns and students' reactions were coded using the Verona Coding Definitions of Emotional Sequences. Descriptive data were gathered from Observer XT, and (two-level) logistic regression was carried out to see whether the occurrence of reducing-space responses could be predicted by OSCE outcomes. Results: Reducing-space responses from all students made up slightly less than half of the total responses to patients' cues. The mean percentage of reducing-space behaviours was lower among first-year students than among third-year students. Patient satisfaction significantly (p<0.05) and negatively predicted reducing-space behaviours. Conclusions: Most of the medical students, to some extent, did not provide adequate responses to patients' emotional cues. However, first-year students did provide more space for patients to talk about their emotional issues compared to third-year students. Lastly, patients felt less satisfied when trainee doctors used more reducing-space responses in reaction to patients' expressed emotional cues/concerns.
Practical implications: Firstly, medical training programmes can be tailored to teach students how to detect and respond appropriately to emotional cues, in order to improve underperforming students' communication skills in healthcare settings. Furthermore, trainee doctors' relationships with patients in clinical practice can also be improved by reacting appropriately to patients' emotive cues in consultations (for example, by limiting the use of reducing-space behaviours).

Keywords: doctors-patients communication, applied clinical psychology, health psychology, healthcare professionals

Procedia PDF Downloads 196
141 Food Intake Pattern and Nutritional Status of Preschool Children of Chakma Ethnic Community

Authors: Md Monoarul Haque

Abstract:

Nutritional status is a sensitive indicator of community health and nutrition among preschool children, especially the prevalence of undernutrition, which affects all dimensions of human development and leads to growth faltering in early life. The present study is an attempt to assess the food intake pattern and nutritional status of preschool Chakma tribe children. It was a cross-sectional, community-based study. The subjects were selected purposively. This study was conducted at Savar Upazilla of Rangamati. Rangamati is located in the Chittagong Division. Anthropometric data (height and weight) of the study subjects were collected by standard techniques. Nutritional status was measured using Z scores according to the WHO classification. The χ² test, independent t-test, Pearson's correlation, multiple regression and logistic regression were performed at the P<0.05 level of significance. Statistical analyses were performed using appropriate univariate and multivariate techniques in SPSS for Windows 11.5. Moderate (-3SD to <-2SD) to severe underweight (<-3SD) was found in 23.8% of study subjects, and 76.2% had normal weight for their age. Moderate (-3SD to <-2SD) to severe (<-3SD) stunting was found in only 25.6% of children, and 74.4% were normal; moderate to severe wasting was found in 14.7%, whereas 85.3% were normal. A significant association was found between child nutritional status and monthly family income, mother's education and the occupations of the father and mother. Age, sex, family income, mother's education and father's occupation were significantly associated with the WAZ and HAZ of the study subjects (P=0.0001, P=0.025, P=0.001 and P=0.0001, P=0.003, P=0.031, P=0.092, P=0.008). Most study subjects ate local small fish and some traditional tribal foods, such as bashrool, jhijhipoka and pork, which are very popular among tribal children. Energy, carbohydrate and fat intake were significantly associated with HAZ, WAZ, BAZ and MUACZ.
This study demonstrates that malnutrition among tribal children in Bangladesh is much less severe than the national scenario suggests. A significant association was found between child nutritional status and family monthly income, mother's education and the occupations of the father and mother. Most of the study subjects ate local small fish and some traditional tribal foods. A significant association was also found between child nutritional status and dietary intake of energy, carbohydrate and fat.
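The WHO Z-score cutoffs applied above (moderate: -3SD to <-2SD; severe: <-3SD) amount to a simple classifier over standardized anthropometric scores. A sketch follows; the reference median and SD used in the example call are illustrative values, not WHO reference data:

```python
def classify_z(z):
    """Classify a WHO anthropometric Z score (e.g. weight-for-age)
    using the cutoffs applied in the abstract above."""
    if z < -3:
        return "severe"
    elif z < -2:          # -3SD to < -2SD
        return "moderate"
    else:
        return "normal"

def z_score(value, median, sd):
    """Z score of an observed value relative to a reference median and SD."""
    return (value - median) / sd

# Illustrative child weight of 10.0 kg against a reference median of 14.0 kg
print(classify_z(z_score(10.0, 14.0, 1.6)))
```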

Keywords: food intake pattern, nutritional status, preschool children, Chakma ethnic community

Procedia PDF Downloads 477
140 Effects of Renin Angiotensin Pathway Inhibition on Efficacy of Anti-PD-1/PD-L1 Treatment in Metastatic Cancer

Authors: Philip Friedlander, John Rutledge, Jason Suh

Abstract:

Inhibition of programmed death-1 (PD-1) or its ligand PD-L1 confers therapeutic efficacy in a wide range of solid tumor malignancies. Primary or acquired resistance can develop through activation of immunosuppressive immune cells such as tumor-associated macrophages. The renin angiotensin system (RAS) systemically regulates fluid and sodium hemodynamics, but its components are expressed on, and regulate the activity of, immune cells, particularly those of myeloid lineage. We hypothesized that inhibition of RAS would improve the efficacy of PD-1/PD-L1 treatment. A retrospective analysis was performed through a chart review of patients with solid metastatic malignancies treated with a PD-1/PD-L1 inhibitor between 1/2013 and 6/2019 at Valley Hospital, a community hospital in New Jersey, USA. Efficacy was determined by medical oncologist documentation of clinical benefit in visit notes and by the duration of time on immunotherapy treatment. The primary endpoint was the determination of efficacy differences in patients treated with an inhibitor of RAS (ACE inhibitor, ACEi, or angiotensin receptor blocker, ARB) compared to patients not treated with these inhibitors. To control for broader antihypertensive effects, efficacy as a function of treatment with beta blockers was assessed. 173 patients treated with PD-1/PD-L1 inhibitors were identified, of whom 52 were also treated with an ACEi or ARB. Chi-square testing revealed a statistically significant relationship between being on an ACEi or ARB and efficacy of PD-1/PD-L1 therapy (p=0.001). No statistically significant relationship was seen between patients taking or not taking beta blocker antihypertensives (p=0.33). Kaplan-Meier analysis showed a statistically significant improvement in the duration of therapy favoring patients concomitantly treated with an ACEi or ARB compared to patients not exposed to antihypertensives and to those treated with beta blockers.
Logistic regression analysis revealed that age, gender, and cancer type did not have significant effects on the odds of experiencing clinical benefit (p=0.74, p=0.75, and p=0.81, respectively). We conclude that retrospective analysis of patients with solid metastatic tumors treated with anti-PD-1/PD-L1 in a community setting demonstrates greater clinical benefit in the context of concomitant ACEi or ARB treatment, irrespective of gender or age. These data support the development of prospective assessment through randomized clinical trials.
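The chi-square test reported above compares therapy efficacy between ACEi/ARB users and non-users. A minimal sketch of the Pearson chi-square statistic for a 2x2 table follows; only the group sizes (52 ACEi/ARB users, 121 others) are taken from the abstract, and the cell splits are hypothetical:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 d.f.) for a 2x2 table
    [[a, b], [c, d]], using the shortcut formula
    n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical cell counts: ACEi/ARB use (rows) vs clinical benefit (cols)
stat = chi_square_2x2(35, 17, 55, 66)
print(round(stat, 2))
```

A statistic above 3.84 (the 0.05 critical value at 1 d.f.) would indicate a significant association.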

Keywords: angiotensin, cancer, immunotherapy, PD-1, efficacy

Procedia PDF Downloads 47
139 Improvements of the Difficulty in Hospital Acceptance at the Scene by the Introduction of Smartphone Application for Emergency-Medical-Service System: A Population-Based Before-And-After Observation Study in Osaka City, Japan

Authors: Yusuke Katayama, Tetsuhisa Kitamura, Kosuke Kiyohara, Sumito Hayashida, Taku Iwami, Takashi Kawamura, Takeshi Shimazu

Abstract:

Background: Recently, the number of ambulance dispatches has been increasing in Japan, and it is therefore difficult to accept emergency patients to hospitals smoothly and appropriately because of limited hospital capacity. To facilitate requests for patient transport by ambulances and hospital acceptance, emergency information systems using information technology have been built and introduced in various communities. However, their effectiveness has not been sufficiently demonstrated in Japan. In 2013, we developed a smartphone application system that enables emergency-medical-service (EMS) personnel to share information about the on-scene ambulance and hospital situation. The aim of this study was to assess the effect of introducing this application into the EMS system in Osaka City, Japan. Methods: This was a retrospective study with population-based ambulance records of the Osaka Municipal Fire Department. The study period was six years, from January 1, 2010 to December 31, 2015. We enrolled emergency patients for whom on-scene EMS personnel conducted hospital selection. The main endpoint was difficulty in hospital acceptance at the scene, defined as EMS personnel making >=5 phone calls to hospitals at the scene before a decision to transport was reached. The smartphone application group was defined as emergency patients transported in the period 2013-2015, after the introduction of this application, and we assessed the effect of the introduction of the smartphone application with a multivariable logistic regression model. Results: A total of 600,526 emergency patients for whom EMS personnel selected hospitals were eligible for our analysis. There were 300,131 patients in the non-smartphone application group (2010-2012, 50.0%) and 300,395 in the smartphone application group (2013-2015, 50.0%).
The proportion of difficulty in hospital acceptance was 14.2% (42,585/300,131) in the non-smartphone application group and 10.9% (32,819/300,395) in the smartphone application group, and difficulty in hospital acceptance significantly decreased with the introduction of the smartphone application (adjusted odds ratio 0.730; 95% confidence interval 0.718-0.741; P<0.001). Conclusions: Sharing information between ambulances and hospitals through a smartphone application at the scene was associated with a decrease in difficulty in hospital acceptance. Our findings may be considerably useful for developing emergency medical information systems using IT in other areas of the world.
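The abstract's adjusted odds ratio (0.730) comes from a multivariable logistic model. The crude (unadjusted) odds ratio implied by the two reported proportions can be reproduced directly from the counts, and it comes out close to the adjusted estimate:

```python
def crude_odds_ratio(cases_a, n_a, cases_b, n_b):
    """Unadjusted odds ratio of group A relative to group B,
    from event counts and group sizes."""
    odds_a = cases_a / (n_a - cases_a)
    odds_b = cases_b / (n_b - cases_b)
    return odds_a / odds_b

# The two proportions reported in the abstract:
# 32,819/300,395 difficult acceptances vs 42,585/300,131
or_crude = crude_odds_ratio(32819, 300395, 42585, 300131)
print(round(or_crude, 3))
```

The gap between this crude value and the adjusted 0.730 reflects the covariate adjustment in the multivariable model.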

Keywords: difficulty in hospital acceptance, emergency medical service, information technology, smartphone application

Procedia PDF Downloads 249
138 A Small-Scale Survey on Risk Factors of Musculoskeletal Disorders in Workers of Logistics Companies in Cyprus and on the Early Adoption of Industrial Exoskeletons as Mitigation Measure

Authors: Kyriacos Clerides, Panagiotis Herodotou, Constantina Polycarpou, Evagoras Xydas

Abstract:

Background: Musculoskeletal disorders (MSDs) in the workplace are a very common problem in Europe and are caused by multiple risk factors. In recent years, wearable devices and exoskeletons for the workplace have been trying to address the various risk factors associated with strenuous tasks. The logistics sector is a huge sector that includes warehousing, storage, and transportation. However, the tasks associated with logistics are not well studied in terms of MSD risk. This study looked into the MSDs affecting workers of logistics companies. It compares the prevalence of MSDs among workers and evaluates multiple risk factors that contribute to the development of MSDs. Moreover, this study seeks to obtain user feedback on the adoption of exoskeletons in such a work environment. Materials and Methods: The study was conducted among workers in logistics companies in Nicosia, Cyprus, from July to September 2022. A set of standardized questionnaires was used for collecting different types of data. Results: A high proportion of logistics professionals reported MSDs in one or more body regions, the lower back being the most commonly affected area. Working in the same position for long periods, working in awkward postures, and handling excessive loads were the most commonly reported job risk factors that contributed to the development of MSDs in this study. A significant number of participants consider the back region the most likely to benefit from a wearable exoskeleton device. Half of the participants would like at least a 50% reduction in their daily effort. The most important characteristics for the adoption of exoskeleton devices were found to be how comfortable the device is and its weight. Conclusion: Lower back strain and posture were the highest risk factors among the logistics professionals assessed in this study.
A larger-scale study using quantitative analytical tools may give a more accurate estimate of MSDs, which would pave the way for more precise recommendations to eliminate the risk factors and thereby prevent MSDs. A follow-up study using exoskeletons in the workplace should be done to assess whether they assist in MSD prevention.

Keywords: musculoskeletal disorders, occupational health, safety, occupational risk, logistics companies, workers, Cyprus, industrial exoskeletons, wearable devices

Procedia PDF Downloads 73
137 Moderating Effect of Owner's Influence on the Relationship between the Probability of Client Failure and Going Concern Opinion Issuance

Authors: Mohammad Noor Hisham Osman, Ahmed Razman Abdul Latiff, Zaidi Mat Daud, Zulkarnain Muhamad Sori

Abstract:

The problem of Malaysian auditors not issuing going concern opinions (GC opinions) to seriously financially distressed companies is still a pressing issue. Policy makers, particularly the Financial Statement Review Committee (FSRC) of the Malaysian Institute of Accountants, have raised this issue as early as 2009. Similar problems have occurred in the US, the UK, and many developing countries. It is important for auditors to issue GC opinions properly because such opinions are a signal of the viability of a company much needed by stakeholders. There are at least two unanswered questions or research gaps in the literature on the determinants of GC opinions. Firstly, is a client's probability of failure associated with GC opinion issuance? Secondly, to what extent do influential owners (management, family, and institutions) moderate the association between client probability of failure and GC opinion issuance? The objective of this study is, therefore, twofold: (1) to examine the extent of the relationship between the probability of client failure and the issuance of GC opinions, and (2) to examine the extent to which management, family, and institutional ownership moderate this association. This study is quantitative in nature, and the sources of data are secondary (mainly companies' annual reports). A total of four hypotheses have been developed and tested on data accumulated from the annual reports of seriously financially distressed Malaysian public listed companies. Data from 2006 to 2012, on a sample of 644 observations, have been analyzed using panel logistic regression. It is found that certainty (rather than probability) of client failure affects the issuance of GC opinions. In addition, it is found that only the level of family ownership positively moderates the relationship between client probability of failure and GC opinion issuance.
This study contributes to the auditing literature, as its findings can enhance our understanding of audit quality, particularly of the variables associated with the issuance of GC opinions. The findings shed light on the role of family owners in the GC opinion issuance process, and this opens the way for researchers to suggest measures that can be used to tackle the problem of auditors being unwilling to issue GC opinions to financially distressed clients. The suggested measures can be useful to policy makers in formulating future promulgations.
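Moderation of the kind tested above is typically modelled as an interaction term in the logistic regression: the failure-probability effect on GC opinion issuance is allowed to differ by ownership level. A sketch with entirely hypothetical coefficients (the study's estimates are not reported in the abstract):

```python
import math

def p_gc_opinion(prob_failure, family_own, b0=-2.0, b1=1.5, b2=0.4, b3=2.0):
    """Predicted probability of a GC opinion under a logistic model with a
    moderation (interaction) term. All coefficients are hypothetical:
        logit(p) = b0 + b1*failure + b2*family + b3*(failure * family)
    """
    xb = b0 + b1 * prob_failure + b2 * family_own \
         + b3 * prob_failure * family_own
    return 1.0 / (1.0 + math.exp(-xb))

# A positive interaction (b3 > 0) means family ownership strengthens the
# effect of failure probability on GC opinion issuance.
low_fam = p_gc_opinion(0.8, family_own=0.0)
high_fam = p_gc_opinion(0.8, family_own=1.0)
print(round(low_fam, 3), round(high_fam, 3))
```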

Keywords: audit quality, auditing, auditor characteristics, going concern opinion, Malaysia

Procedia PDF Downloads 228
136 Tuberculosis and Associated Transient Hyperglycaemia in Peri-Urban South Africa: Implications for Diabetes Screening in High Tuberculosis/HIV Burden Settings

Authors: Mmamapudi Kubjane, Natacha Berkowitz, Rene Goliath, Naomi S. Levitt, Robert J. Wilkinson, Tolu Oni

Abstract:

Background: South Africa remains a high tuberculosis (TB) burden country globally, and the burden of diabetes, a TB risk factor, is growing rapidly. As an infectious disease, TB also induces transient hyperglycaemia. Therefore, screening for diabetes in newly diagnosed tuberculosis patients may result in misclassification of transient hyperglycaemia as diabetes. Objective: The objective of this study was to determine and compare the prevalence of hyperglycaemia (diabetes and impaired glucose regulation (IGR)) in TB patients and to assess the cross-sectional association between TB and hyperglycaemia at enrolment and after three months of follow-up. Methods: Consecutive adult TB and non-TB participants presenting at a TB clinic in Cape Town were enrolled in this cross-sectional study with follow-up between July 2013 and August 2015. Diabetes was defined as self-reported diabetes, fasting plasma glucose (FPG) ≥ 7.0 mmol·L⁻¹ or glycated haemoglobin (HbA1c) ≥ 6.5%. IGR was defined as FPG 5.5–<7.0 mmol·L⁻¹ or HbA1c 5.7–<6.5%. TB patients initiated treatment. After three months, all participants were followed up and screened for diabetes again. The association between TB and hyperglycaemia was assessed using logistic regression, adjusting for potential confounders including sex, age, income, hypertension, waist circumference, previous imprisonment, marital status, work status, and HIV status. Results: Diabetes screening was performed in 852 participants (414 TB and 438 non-TB) at enrolment and in 639 (304 TB and 335 non-TB) at three-month follow-up. The prevalence of HIV-1 infection was 69.6% (95% confidence interval (CI), 64.9-73.8%) among TB patients and 58.2% (95% CI, 53.5-62.8%) among non-TB participants. Glycaemic levels were much higher in TB patients than in non-TB participants but decreased over time.
Among TB patients, the prevalence of IGR was 65.2% (95% CI, 60.1-69.9%) at enrolment and 21.5% (95% CI, 17.2-26.5%) at follow-up; among non-TB participants, it was 50% (95% CI, 45.1-54.9%) and 32% (95% CI, 27.9-38.0%), respectively. The prevalence of diabetes in TB patients was 12.5% (95% CI, 9.69-16.12%) at enrolment and 9.2% (95% CI, 6.43-13.03%) at follow-up; among non-TB participants, it was 10.04% (95% CI, 7.55-13.24%) and 8.06% (95% CI, 5.58-11.51%), respectively. The association between TB and IGR was significant at enrolment (adjusted odds ratio (OR) 2.26; 95% CI, 1.55-3.31) but disappeared at follow-up (OR 0.84; 95% CI, 0.53-1.36). However, the TB-diabetes association remained positive and significant both at enrolment (OR 2.41; 95% CI, 1.3-4.34) and at follow-up (OR 3.31; 95% CI, 1.5-7.25). Conclusion: Transient hyperglycaemia exists during tuberculosis. This has implications for diabetes screening in TB patients and suggests a need for diabetes confirmation tests during or after TB treatment. Nonetheless, the association between TB and diabetes noted at enrolment persisted at 3 months, highlighting the importance of diabetes control and prevention for TB control. Further research is required to investigate the impact of hyperglycaemia (transient or otherwise) on TB outcomes to ascertain the clinical significance of hyperglycaemia at enrolment.
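Prevalence estimates with 95% confidence intervals, as reported throughout the abstract, can be reproduced with a Wilson score interval. The counts below are hypothetical but chosen to roughly match the reported diabetes prevalence among TB patients at enrolment:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Prevalence k/n with its Wilson score 95% confidence interval."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return p, centre - half, centre + half

# Hypothetical: 52 of 414 TB patients with diabetes (~12.6%)
p, lo, hi = wilson_ci(52, 414)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")
```

The Wilson interval behaves better than the naive normal approximation when prevalences are far from 50% or samples are small.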

Keywords: diabetes, impaired glucose regulation, transient hyperglycaemia, tuberculosis

Procedia PDF Downloads 132
135 Burnout among Healthcare Workers in Poland during the COVID-19 Pandemic

Authors: Zbigniew Izdebski, Alicja Kozakiewicz, Maciej Białorudzki, Joanna Mazur

Abstract:

Work is an extremely important part of everyone's life and affects functioning in daily life. Healthcare workers (HCWs) suffer from negative experiences in and out of the workplace, such as harassment, abuse, long working hours, mental suffering, exhaustion, and professional burnout. Staff burnout is detrimental not only to individual employees but also to work with patients and to the healthcare institution as a whole. The purpose of this study was to explore the level of professional burnout among HCWs working in medical institutions during the COVID-19 pandemic in Poland. The extent to which selected sociodemographic factors and perceived stress increase the risk of professional burnout was assessed. In addition, the frequency of use of professional psychological help and of less formal support groups by HCWs in relation to the level of professional burnout was examined. The survey was conducted as part of a larger project on the humanization of medicine and clinical communication from February to April 2022. This study used a self-administered online survey (CAWI) technique and a PAPI (pen and paper interview) technique. The BAT-12 scale was used to measure burnout, the PSS-4 scale was used to measure stress, and questions formulated by the research team were also used. For the purpose of analysis, the sample was limited to 2196 HCWs who worked on a daily basis with patients during the COVID-19 pandemic. Frequency distributions were analyzed, and multivariate logistic regression was performed. The mean scores of job burnout as measured by the BAT-12 scale ranged among the professional groups from 2.15 (0.69) to 2.30 (0.69) and remained highest for the nurses. The groups differed significantly in levels of burnout (chi-sq=17.719; d.f.=8; p<0.023). In the final model, raised stress most strongly increased the risk of burnout (OR=3.88; 95% CI 3.13-3.81; p<0.001).
Other significant predictors of burnout included: traumatic work-related experience (OR=1.91, p<0.001), mobbing (OR=1.83, p<0.001), and a higher workload than before the pandemic (OR=1.41, p=0.002). Only 7% of respondents decided to use various forms of psychological support during the pandemic. HCWs experienced challenges in dealing with an unpredictable pandemic. Limited preparedness can lead to physical and psychological problems such as high stress levels, anxiety, fear, helplessness, hopelessness, anger and stigma. The workload can lead to professional burnout, as well as threaten patient safety.

Keywords: burnout, work, healthcare, healthcare worker, stress

Procedia PDF Downloads 45
134 Effects of the Affordable Care Act On Preventive Care Disparities

Authors: Cagdas Agirdas

Abstract:

Background: The Affordable Care Act (ACA) requires non-grandfathered private insurance plans, starting with plan years on or after September 23rd, 2010, to provide certain preventive care services without any cost sharing in the form of deductibles, copayments or co-insurance. This requirement may affect racial and ethnic disparities in preventive care, as it provides the largest copay reduction in preventive care. Objectives: We ask whether the ACA's free preventive care benefits are associated with a reduction in racial and ethnic disparities in the utilization of four preventive services: cholesterol screenings, colonoscopies, mammograms, and pap smears. Methods: We use a data set of over 6,000 individuals from the 2009, 2010, and 2013 Medical Expenditure Panel Surveys (MEPS). We restrict our data set to individuals who are old enough to be eligible for each preventive service. Our difference-in-differences logistic regression model classifies privately-insured Hispanics, African Americans, and Asians as the treatment groups and 2013 as the after-policy year. Our control group consists of non-Hispanic whites on Medicaid, as this program already covered preventive care services for free or at low cost before the ACA. Results: After controlling for income, education, marital status, preferred interview language, self-reported health status, employment, having a usual source of care, age and gender, we find that the ACA is associated with increases in the probability of the median privately-insured Hispanic person getting a colonoscopy by 3.6% and a mammogram by 3.1%, compared to a non-Hispanic white person on Medicaid. Similarly, we find that the median privately-insured African American person's probability of receiving these two preventive services improved by 2.3% and 2.4%, compared to a non-Hispanic white person on Medicaid. We do not find any significant improvements for any racial or ethnic group for cholesterol screenings or pap smears.
Furthermore, our results do not indicate any significant changes for Asians compared to non-Hispanic whites in utilizing the four preventive services. These reductions in racial/ethnic disparities are robust to reconfigurations of time periods, previous diagnosis, and residential status. Conclusions: Early effects of the ACA’s provision of free preventive care are significant for Hispanics and African Americans. Further research is needed for the later years as more individuals became aware of these benefits.
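The difference-in-differences design above compares the pre/post change in utilization for each treatment group against the control group's change. The point estimate reduces to a double difference of proportions; the rates below are hypothetical, not the MEPS estimates:

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences in utilization rates: the change in the
    treatment group minus the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical colonoscopy utilization proportions:
# privately-insured Hispanics (treatment) vs whites on Medicaid (control)
did = did_estimate(treat_pre=0.42, treat_post=0.49,
                   ctrl_pre=0.51, ctrl_post=0.54)
print(f"DiD = {did:+.3f}")
```

The study's version embeds this comparison in a logistic regression with covariates, which is why its estimates are reported as changes in predicted probabilities for the median person rather than raw rate differences.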

Keywords: preventive care, Affordable Care Act, cost sharing, racial disparities

Procedia PDF Downloads 119
133 Antigen Stasis can Predispose Primary Ciliary Dyskinesia (PCD) Patients to Asthma

Authors: Nadzeya Marozkina, Joe Zein, Benjamin Gaston

Abstract:

Introduction: We have observed that many patients with Primary Ciliary Dyskinesia (PCD) benefit from asthma medications. In healthy airways, ciliary function is normal: antigens and irritants are rapidly cleared, and NO enters the gas phase normally to be exhaled. In PCD airways, however, antigens, such as Dermatophagoides, are not as well cleared. This defect leads to oxidative stress, marked by increased DUOX1 expression and decreased superoxide dismutase (SOD) activity (manuscript under revision). H₂O₂, at high concentrations in the PCD airway, injures the airway. NO is oxidized rather than being exhaled, forming cytotoxic peroxynitrous acid. Thus, antigen stasis on the PCD airway epithelium leads to airway injury and may predispose PCD patients to asthma. Indeed, recent population genetics suggest that PCD genes may be associated with asthma. We therefore hypothesized that PCD patients would be predisposed to having asthma. Methods: We analyzed our database of 18 million individual electronic medical records (EMRs) in the Indiana Network for Patient Care research database (INPCR). There is no ICD10 code for PCD itself; code Q34.8 is most commonly used clinically. To validate analysis of this code, we queried patients who had an ICD10 code for both bronchiectasis and situs inversus totalis in INPCR. We also studied a validation cohort using the IBM Explorys® database (over 80 million individuals). Analyses were adjusted for age, sex and race using a 1 PCD : 3 controls matching method in INPCR and multivariable logistic regression in the IBM Explorys® database. Results: The prevalence of asthma ICD10 codes in subjects with code Q34.8 was 67%, vs 19% in controls (P < 0.0001) (Regenstrief Institute). Similarly, in IBM Explorys®, the OR [95% CI] for having asthma if a patient also had ICD10 code Q34.8, relative to controls, was 4.04 [3.99; 4.09].
For situs inversus alone, the OR [95% CI] was 4.42 [4.14; 4.71]; for bronchiectasis alone, 10.68 [10.56; 10.79]; and for bronchiectasis and situs inversus together, 28.80 [23.17; 35.81]. Conclusions: PCD causes antigen stasis in the human airway (under review), likely predisposing to asthma in addition to oxidative and nitrosative stress and airway injury. Here, we show that, by several different population-based metrics and using two large databases, patients with PCD appear to have between a three- and 28-fold increased risk of having asthma. These data suggest that additional studies should be undertaken to understand the role of ciliary dysfunction in the pathogenesis and genetics of asthma. Decreased antigen clearance caused by ciliary dysfunction may be a risk factor for asthma development.

Keywords: antigen, PCD, asthma, nitric oxide

Procedia PDF Downloads 70
132 Locus of Control and Self-Esteem as Predictors of Maternal and Child Healthcare Services Utilization in Nigeria

Authors: Josephine Aikpitanyi, Friday Okonofua, Lorretta Ntoimo, Sandy Tubeuf

Abstract:

Every day, 800 women die from conditions related to pregnancy and childbirth, resulting in an estimated 300,000 maternal deaths worldwide per year. Over 99 percent of all maternal deaths occur in developing countries, with more than half of them occurring in sub-Saharan Africa. Nigeria, the most populous nation in sub-Saharan Africa, bears a significant burden of worsening maternal and child health outcomes, with a maternal mortality rate of 917 per 100,000 live births and a child mortality rate of 117 per 1,000 live births. While several studies have documented that financial barriers disproportionately discourage poor women from seeking needed maternal and child healthcare, other studies have indicated otherwise. Evidence shows that there are instances where health facilities with skilled healthcare providers exist and yet maternal and child health outcomes remain abysmally poor, indicating the presence of non-cognitive and behavioural factors that may affect the utilization of healthcare services. This study investigated the influence of locus of control and self-esteem on the utilization of maternal and child healthcare services in Nigeria. Specifically, it explored the differences in utilization of antenatal care, skilled birth care, postnatal care, and child vaccination between women having an internal versus external locus of control and between women having high versus low self-esteem. We collected information on the non-cognitive traits of 1411 randomly selected women, along with information on utilization of the various indicators of maternal and child healthcare. Estimating logistic regression models for various components of healthcare services utilization, we found that women's internal locus of control was a significant predictor of utilization of antenatal care, skilled birth care, and completion of child vaccination.
We also found that having high self-esteem was a significant predictor of utilization of antenatal care, postnatal care, and completion of child vaccination after adjusting for other control variables. By improving our understanding of non-cognitive traits as possible barriers to maternal and child healthcare utilization, our findings offer important insights for enhancing participant engagement in intervention programs that are initiated to improve maternal and child health outcomes in low-and-middle-income countries.
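The abstract's core method, a logistic regression of a binary utilization outcome on a non-cognitive trait, can be sketched as follows. This is a minimal illustration on simulated data, not the authors' model: the variable names (`locus_internal`, `used_care`), the effect size, and the gradient-descent fitting routine are all assumptions.

```python
import numpy as np

# Minimal sketch (simulated data): logistic regression of a binary
# utilization outcome on a hypothetical binary predictor such as an
# internal locus of control.  Names and effect sizes are invented.
rng = np.random.default_rng(0)
n = 1411                                # sample size reported in the abstract
locus_internal = rng.integers(0, 2, n)  # 1 = internal, 0 = external (simulated)
true_logits = -0.5 + 1.2 * locus_internal
used_care = (rng.random(n) < 1 / (1 + np.exp(-true_logits))).astype(float)

def fit_logistic(x, y, lr=0.1, steps=2000):
    """Fit a one-predictor logistic regression by gradient descent."""
    X = np.column_stack([np.ones(len(y)), x])   # intercept + predictor
    w = np.zeros(2)
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)        # mean-gradient step
    return w

w = fit_logistic(locus_internal.astype(float), used_care)
odds_ratio = float(np.exp(w[1]))                # exp(coefficient) = odds ratio
print(f"estimated odds ratio: {odds_ratio:.2f}")
```

In practice one would use a maximum-likelihood routine that also reports standard errors (e.g. `statsmodels.api.Logit`) rather than plain gradient descent; the sketch only shows the model's shape.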

Keywords: behavioural economics, health-seeking behaviour, locus of control, self-esteem, maternal and child healthcare, non-cognitive traits, healthcare utilization

Procedia PDF Downloads 128
131 Pre-Operative Psychological Factors Significantly Add to the Predictability of Chronic Narcotic Use: A Two Year Prospective Study

Authors: Dana El-Mughayyar, Neil Manson, Erin Bigney, Eden Richardson, Dean Tripp, Edward Abraham

Abstract:

Use of narcotics to treat pain has increased over the past two decades and is a contributing factor to the current public health crisis. Understanding the pre-operative risks of chronic narcotic use may be aided by investigating psychological measures. The objective of the reported study is to determine predictors of narcotic use two years post-surgery in a thoracolumbar spine surgery population, including an array of psychological factors. A prospective observational study of 191 consecutively enrolled adult patients who underwent thoracolumbar spine surgery is presented. Baseline measures of interest included the Pain Catastrophizing Scale (PCS), Tampa Scale for Kinesiophobia, Multidimensional Scale of Perceived Social Support (MSPSS), Chronic Pain Acceptance Questionnaire (CPAQ-8), Oswestry Disability Index (ODI), Numeric Rating Scales for back and leg pain (NRS-B/L), the SF-12 Mental Component Summary (MCS), narcotic use, and demographic variables. The post-operative measure of interest is narcotic use at 2-year follow-up, collapsed into binary categories of use and no use. Descriptive statistics are run; chi-square analysis is used for categorical variables and ANOVA for continuous variables. Significant variables are entered into a hierarchical logistic regression to determine predictors of post-operative narcotic use. Significance is set at α < 0.05. Results: A total of 27.23% of the sample were using narcotics two years after surgery. The regression model included ODI, NRS-Leg, time with condition, chief complaint, pre-operative drug use, gender, MCS, the PCS helplessness subscale, and the CPAQ pain willingness subscale, and was significant, χ²(13, N=191) = 54.99, p < .001. The model accounted for 39.6% of the variance in narcotic use and correctly predicted in 79.7% of cases. Psychological variables accounted for 9.6% of the variance over and above the other predictors.
Conclusions: Managing chronic narcotic usage is central to the patient’s overall health and quality of life. Psychological factors in the preoperative period are significant predictors of narcotic use 2 years post-operatively. The psychological variables are malleable, potentially allowing surgeons to direct their patients to preventative resources prior to surgery.
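The hierarchical step described above (entering psychological variables after the other predictors and reporting the extra variance explained) can be sketched on simulated data as follows. The predictors, effect sizes, and the use of McFadden's pseudo-R² as the "variance" measure are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

# Sketch of blockwise (hierarchical) logistic regression on simulated data:
# fit a base model, add a psychological block, compare McFadden pseudo-R^2.
rng = np.random.default_rng(1)
n = 191                              # cohort size from the abstract
disability = rng.normal(size=n)      # stand-in for a clinical score (e.g. ODI)
helplessness = rng.normal(size=n)    # stand-in for PCS helplessness
logits = -1.0 + 0.8 * disability + 0.9 * helplessness
narcotic_use = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def loglik(cols, y, lr=0.1, steps=5000):
    """Fit logistic regression by gradient descent; return log-likelihood."""
    M = np.column_stack([np.ones(len(y))] + cols)
    w = np.zeros(M.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-M @ w))
        w -= lr * M.T @ (p - y) / len(y)
    p = np.clip(1 / (1 + np.exp(-M @ w)), 1e-9, 1 - 1e-9)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

ll_null = loglik([], narcotic_use)                          # intercept only
ll_base = loglik([disability], narcotic_use)                # step 1: clinical block
ll_full = loglik([disability, helplessness], narcotic_use)  # step 2: + psych block
r2_base = 1 - ll_base / ll_null                             # McFadden pseudo-R^2
r2_full = 1 - ll_full / ll_null
print(f"pseudo-R2 gain from psychological block: {r2_full - r2_base:.3f}")
```

The gain in pseudo-R² from step 1 to step 2 plays the role of the "9.6% of the variance over and above the other predictors" figure.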

Keywords: narcotics, psychological factors, quality of life, spine surgery

Procedia PDF Downloads 112
130 Farmers’ Perception, Willingness and Capacity in Utilization of Household Sewage Sludge as Organic Resources for Peri-Urban Agriculture around Jos Nigeria

Authors: C. C. Alamanjo, A. O. Adepoju, H. Martin, R. N. Baines

Abstract:

Peri-urban agriculture in Jos, Nigeria, serves as a major means of livelihood for both the urban and peri-urban poor and constitutes a substantial commercial activity, with a target market that extends beyond Plateau State. Yet the sustainability of this sector is threatened by the intensive application of urban refuse ash contaminated with heavy metals, a result of the highly heterogeneous materials used in ash production. Hence, this research aimed to understand the fertilizers currently employed by farmers, their perception of and willingness to utilize household sewage sludge for agricultural purposes, and their capacity to mitigate the risks associated with such practice. A mixed-methods approach was adopted; the data collection tools included a survey questionnaire, focus group discussions with farmers, and participant and field observation. The study identified that farmers maintain a complex mixture of organic and chemical fertilizers, with a mixture composition that depends on fertilizer availability and affordability. Farmers have also decreased their use of urban refuse ash due to increased labor and logistics costs and are keen to utilize household sewage sludge for soil fertility improvement but are mainly constrained by the accessibility of this waste product. Nevertheless, farmers near sewage disposal points have commenced utilizing household sewage sludge to improve soil fertility. Farmers were knowledgeable about composting but found their own method of dewatering and sun drying more convenient. Irrigation farmers were not enthusiastic about treatment, as they desired both the water and the sludge. Secondly, the household sewage sludge observed in the field is heterogeneous because its disposal point lies close to that of urban refuse, which raises concern about possible cross-contamination of pollutants and also reflects a lack of extension guidance on the treatment and management of household sewage sludge for agricultural purposes.
Hence, farmers' concerns need to be addressed, particularly by providing extension advice and establishing decentralized household sewage sludge collection centers for the continuous availability of liquid and concentrated sludge. The Federal Government of Nigeria also urgently needs to increase its commitment to empowering its agencies to discharge their responsibilities efficiently.

Keywords: ash, farmers, household, peri-urban, refuse, sewage, sludge, urban

Procedia PDF Downloads 100
129 Factors Associated with Pesticide Use and Plasma Cholinesterase Level among Agricultural Workers in Rural Area, Thailand

Authors: Pirakorn Sukonthaman, Paphitchaya Temphattharachok, Warangkana Thammasanya, Kraichart Tantrakarnarpa, Tanongson Tientavorn

Abstract:

Agriculture is the main occupation in Thailand. Excessive amounts of pesticides are used to increase yields, but they are toxic to the human body. In 2009, the Bureau of Epidemiology received reports of 1,691 cases of pesticide toxicity (2.66 per 100,000), of which 10.61% were caused by organophosphates. The purposes of this study were to identify factors associated with pesticide use and plasma cholinesterase level, and to explore emerging issues that previous studies had not explained, among agricultural workers in Baan Na Yao, Chachoengsao, Thailand. This research was an exploratory mixed-methods study. Qualitative interviews and quantitative questionnaires were used together to gather information from agricultural workers (mainly in cassava and rice farming) who had been directly exposed to pesticides within the previous 2 months. Qualitative participants were selected by purposive sampling, and a total survey was used for the quantitative component. The quantitative data were analyzed using a multiple logistic regression model; the qualitative data were transcribed verbatim and thematically analyzed. For the qualitative study, 15 participants were interviewed; 300 of 323 participants (92.88%) completed questionnaires, of whom 175 were male and 125 female, and 113 were spraymen. The prevalence of abnormal plasma cholinesterase level was 92.28% (safe 7.72%, risky 49.33%, unsafe 42.95%). Participants with inappropriate behaviors during spraying had significantly higher odds of an abnormal plasma cholinesterase level (95% CI: 1.399-14.858), but other factors such as age, gender, education, attitude, and knowledge showed no association. Participants also reported various symptoms from pesticides, such as fatigue (61%), vertigo (59.67%), and headache (58.86%). Although they had high knowledge and positive attitudes, they still had poor behaviors. Moreover, the qualitative component showed that although they wore personal protective equipment (PPE) regularly, their PPE was not up to standard.
Beyond substandard PPE, there were obstacles to wearing it, such as the hot climate and inconvenience. Workers misunderstood their symptoms from pesticide use as allergy and therefore did not seek proper medical check-ups and treatment. This research revealed that almost all participants had abnormal plasma cholinesterase levels, especially those with poor behaviors. They wore PPE, but inadequately, and because they mistook the symptoms produced by organophosphate use for allergy, they did not seek medical treatment. Occupational health education, modification of PPE, and periodic medical check-ups are ways to raise agricultural workers' awareness and detect any progression over the long term.
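The behavior-cholinesterase association above is reported only as a confidence interval, presumably around an odds ratio. For illustration, an odds ratio and Wald 95% CI can be computed from a 2x2 table as below; the counts are invented, not the study's data.

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table (illustrative counts only):
#                          abnormal ChE   normal ChE
# inappropriate behavior       a=130          b=20
# appropriate behavior         c=40           d=30
a, b, c, d = 130, 20, 40, 30

or_est = (a * d) / (b * c)                   # cross-product (odds) ratio
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of log odds ratio (Woolf)
lo = math.exp(math.log(or_est) - 1.96 * se_log)
hi = math.exp(math.log(or_est) + 1.96 * se_log)
print(f"OR = {or_est:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A CI that excludes 1, as in the abstract's 1.399-14.858, is what makes the association statistically significant at the 5% level.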

Keywords: pesticides, plasma cholinesterase level, spraymen, agricultural workers

Procedia PDF Downloads 330
128 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. It is the process of examining large amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. Using advanced analytical approaches, the field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. Predictive analysis using historical patient data is a major area of interest in healthcare data mining: it enables doctors to intervene early to prevent problems or improve outcomes, and it assists in early disease detection and customized treatment planning. Doctors can tailor a patient's care by looking at their medical history, genetic profile, and current and previous therapies, so treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves hospital efficiency, for example by helping determine how many beds or doctors are required for the number of patients expected. This project used models such as logistic regression, random forests, and neural networks to predict diseases and analyze medical images. Clustering algorithms such as k-means grouped patients, association rule mining identified connections between treatments and patient responses, and time series techniques supported resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment.
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, large and complicated datasets, model reliability, bias, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently; it comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
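Of the techniques listed above, k-means patient grouping is easy to sketch in a few lines. The two-feature "patients" below are synthetic, and the implementation is a bare-bones illustration, not the project's code.

```python
import numpy as np

# Bare-bones k-means on synthetic two-feature "patients"
# (e.g. standardized age and a lab value); data are invented.
rng = np.random.default_rng(2)
patients = np.vstack([rng.normal(0, 0.5, (50, 2)),    # synthetic group 1
                      rng.normal(3, 0.5, (50, 2))])   # synthetic group 2

def kmeans(X, k, iters=20):
    """Lloyd's algorithm: assign points to nearest center, recompute means."""
    centers = X[rng.choice(len(X), k, replace=False)]  # init from data points
    for _ in range(iters):
        dists = ((X[:, None] - centers) ** 2).sum(-1)  # squared distances
        labels = np.argmin(dists, axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(patients, k=2)
print("cluster sizes:", np.bincount(labels))
```

A production analysis would use a vetted implementation (e.g. `sklearn.cluster.KMeans`) with multiple restarts and a method for choosing k.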

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 32
127 Impact of Diabetes Mellitus Type 2 on Clinical In-Stent Restenosis in First Elective Percutaneous Coronary Intervention Patients

Authors: Leonard Simoni, Ilir Alimehmeti, Ervina Shirka, Endri Hasimi, Ndricim Kallashi, Verona Beka, Suerta Kabili, Artan Goda

Abstract:

Background: Diabetes Mellitus type 2, small vessel calibre, stented vessel length, complex lesion morphology, and prior bypass surgery have been reported as risk factors for In-Stent Restenosis (ISR). However, there are contradictory results about body mass index (BMI) as a risk factor for ISR. Purpose: To identify clinical, lesional, and procedural factors that can predict clinical ISR in our patients. Methods: We enrolled 759 patients who underwent first-time elective PCI with Bare Metal Stents (BMS) from September 2011 to December 2013 in our Department of Cardiology and followed them for at least 1.5 years, with a median of 862 days (2 years and 4 months). Only patients re-admitted with ischemic heart disease underwent control coronary angiography; no routine angiographic control was performed. Patients were categorized into ISR and non-ISR groups and compared. Multivariate analysis (binary logistic regression, forward conditional method) was used to identify independent predictive risk factors; p < 0.05 was considered statistically significant. Results: ISR compared to non-ISR individuals had a significantly lower BMI (25.7±3.3 vs. 26.9±3.7, p=0.004), higher-risk anatomy (LM + 3-vessel CAD) (23% vs. 14%, p=0.03), a higher number of stents per person (2.1±1.1 vs. 1.75±0.96, p=0.004), a greater total stent length per person (39.3±21.6 vs. 33.3±18.5, p=0.01), and lower combined use of clopidogrel and ASA (95% vs. 99%, p=0.012). They also had a higher, although not statistically significant, prevalence of Diabetes Mellitus (42% vs. 32%, p=0.072) and a greater number of treated vessels (1.36±0.5 vs. 1.26±0.5, p=0.08). In the multivariate analysis, Diabetes Mellitus type 2 and multiple stents were independent predictive risk factors for In-Stent Restenosis, OR 1.66 [1.03-2.68], p=0.039, and OR 1.44 [1.16-1.78], p=0.001, respectively.
On the other hand, higher BMI and the combined use of clopidogrel and ASA were protective factors, OR 0.88 [0.81-0.95], p=0.001, and OR 0.2 [0.06-0.72], p=0.013, respectively. Conclusion: Diabetes Mellitus and multiple stents are strong predictive risk factors, whereas the combined use of clopidogrel and ASA is protective against clinical In-Stent Restenosis. Paradoxically, higher BMI is a protective factor for In-Stent Restenosis, probably related to a larger vessel diameter and consequently a larger diameter of the stents implanted in these patients. Further studies are needed to clarify this finding.
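The forward conditional method cited above enters candidate predictors one at a time, keeping each only if it improves the model's likelihood enough. A rough sketch on simulated data (invented predictors and effect sizes, with a plain likelihood-ratio entry test standing in for SPSS's conditional statistic) might look like this:

```python
import numpy as np

# Sketch of forward stepwise selection for a binary logistic model.
# Data, predictor names, and effect sizes are all simulated.
rng = np.random.default_rng(3)
n = 759                                           # cohort size from the abstract
X = {"diabetes": rng.integers(0, 2, n).astype(float),
     "n_stents": rng.poisson(1.8, n).astype(float),
     "noise":    rng.normal(size=n)}              # an irrelevant candidate
true_logits = -2.0 + 0.8 * X["diabetes"] + 0.4 * X["n_stents"]
restenosis = (rng.random(n) < 1 / (1 + np.exp(-true_logits))).astype(float)

def loglik(cols, y, lr=0.2, steps=3000):
    """Fit logistic regression by gradient descent; return log-likelihood."""
    M = np.column_stack([np.ones(len(y))] + cols)
    w = np.zeros(M.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-M @ w))
        w -= lr * M.T @ (p - y) / len(y)
    p = np.clip(1 / (1 + np.exp(-M @ w)), 1e-9, 1 - 1e-9)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

selected, remaining = [], dict(X)
current = loglik([], restenosis)                  # intercept-only model
while remaining:
    gains = {k: loglik([X[j] for j in selected] + [v], restenosis) - current
             for k, v in remaining.items()}
    best = max(gains, key=gains.get)
    # Enter the best candidate only if the likelihood-ratio statistic
    # exceeds the chi-square(1) critical value at alpha = 0.05.
    if 2 * gains[best] < 3.84:
        break
    selected.append(best)
    current += gains[best]
    del remaining[best]

print("selected predictors:", selected)
```

Real stepwise procedures also re-test already-entered terms for removal at each step; this sketch shows only the entry loop.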

Keywords: body mass index, diabetes mellitus, in-stent restenosis, percutaneous coronary intervention

Procedia PDF Downloads 177
126 Shades of Violence – Risks of Male Violence Exposure for Mental and Somatic Disorders and Risk-Taking Behavior: A Prevalence Study

Authors: Dana Cassandra Winkler, Delia Leiding, Rene Bergs, Franziska Kaiser, Ramona Kirchhart, Ute Habel

Abstract:

Background: Violence is a multidimensional phenomenon affecting people of every age, socio-economic status, and gender. Nevertheless, most studies primarily focus on men perpetrating violence against women. The aim of the present study is to identify the likelihood of mental and somatic disorders and risk-taking behavior in violence-affected males. In addition, the relationship between the age of violence experience and the risk of health-related problems was analyzed. Method: On the basis of current evidence, a questionnaire was developed focusing on demographic background, health status, risk-taking behavior, and active and passive violence exposure. In total, 5,221 males (mean: 56.1 years, SD: 17.6) were surveyed. To account for the time of violence experience efficiently, the age clusters '0-12 years', '13-20 years', '21-35 years', '36-65 years', and 'over 65 years' were defined. A binary logistic regression was calculated to reveal differences between violence-affected and non-violence-affected males regarding health and risk-taking factors. Males who experienced violence on a daily or almost daily basis were compared, with respect to health factors and risk-taking behavior, with males who reported violence occurring once or several times a month or year. Data from males who indicated both active and passive violence exposure were analyzed with a chi-square analysis to investigate a possible relation between the age of victimization and the age of violence perpetration. Findings: Results imply that general violence experience, independent of active or passive violence exposure, increases the likelihood of somatic, psychosomatic, and mental disorders as well as risk-taking behavior in males. Experiencing violence on a daily or almost daily basis in childhood and adolescence may serve as a predictor of increased health problems and risk-taking behavior. Furthermore, violence experience and perpetration occur significantly within the same age cluster.
This underlines the importance of near-term intervention to minimize the risk that victims become perpetrators later. Conclusion: The present study reveals predictors concerning health risk factors as well as risk-taking behavior in males with violence exposure. The results may underscore the benefit of intervention and regular healthcare approaches for violence-affected males and underline, for further research, the importance of acknowledging the overlap between violence experience and perpetration.
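The age-cluster overlap above was tested with a chi-square analysis of independence. As an illustration of the computation (with an invented 2x2 table; the study's five age clusters would give a larger table, computed the same way):

```python
import numpy as np

# Chi-square test of independence on an invented 2x2 table relating
# age cluster of victimization to age cluster of perpetration.
observed = np.array([[80, 20],    # victimized as child: perpetrated as child / later
                     [15, 85]])   # victimized later:    perpetrated as child / later

row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row * col / observed.sum()           # under independence
chi2 = float(((observed - expected) ** 2 / expected).sum())
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi2 = {chi2:.1f} on {dof} df")         # compare to 3.84 at alpha = 0.05
```

In practice `scipy.stats.chi2_contingency` performs the same computation and returns the p-value directly.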

Keywords: health disease, male, mental health, prevalence, risk-taking behavior, violence

Procedia PDF Downloads 187