Search results for: Nothing Gold Can Stay

30 Promotion of Healthy Food Choices in School Children through Nutrition Education

Authors: Vinti Davar

Abstract:

Introduction: Childhood overweight increases the risk for certain medical and psychological conditions. Millions of school-age children worldwide are affected by serious yet easily treatable and preventable illnesses that inhibit their ability to learn. Healthier children stay in school longer, attend more regularly, learn more and become healthier and more productive adults. Schools are an important setting for nutrition education because one can reach most children, teachers and parents. These years offer a key window for shaping their lifetime habits, which have an impact on their health throughout life. Against this background, an attempt was made to impart nutrition education to school children in Haryana state, India, to promote healthy food choices and to assess the effectiveness of this program. Methodology: This study was completed in two phases. During the first phase, a pre-intervention anthropometric and dietary survey was conducted, the teaching materials for the nutrition intervention program were developed and tested, and the questionnaire was validated. In the second phase, an intervention was implemented in two schools of Kurukshetra, Haryana for six months through personal visits once a week. A total of 350 children in the age group of 6-12 years were selected. Of these, 279 children (153 boys and 126 girls) completed the study. The subjects were divided into four groups, namely underweight, normal, overweight and obese, based on body mass index-for-age categories. A colorful PowerPoint presentation was used to improve the quality of tiffin, snacks and meals, emphasizing inclusion of all food groups, especially vegetables every day and fruits at least 3-4 days per week. An extra 20 minutes of aerobic exercise daily was likewise organized and a healthy school environment created. Provision of clean drinking water by school authorities was ensured. Selling of soft drinks and energy-dense snacks in the school canteen, as well as advertisements for soft drinks and snacks on the school walls, were banned. Post intervention, anthropometric indices and food selections were reassessed. Results: The results of this study reiterate the critical role of nutrition education and promotion in improving healthy food choices by school children. It was observed that normal, overweight and obese children participating in the nutrition education intervention program significantly (p≤0.05) increased their daily seasonal fruit and vegetable consumption. Fat and oil consumption was significantly reduced by overweight and obese subjects. Fast food intake was controlled by obese children. The nutrition knowledge of school children improved significantly (p≤0.05) from pre to post intervention. A highly significant increase (p≤0.00) was noted in the nutrition attitude score after intervention in all four groups. Conclusion: This study has shown that a well-planned nutrition education program can improve nutrition knowledge and promote positive changes in healthy food choices. A nutrition program inculcates wholesome eating and active lifestyle habits in children and adolescents that could not only protect them from chronic diseases and early death but also reduce healthcare costs and enhance the quality of life of citizens and thereby nations.
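
For illustration, a minimal sketch of the BMI-for-age grouping step described above; the percentile cut-offs shown are commonly used reference values, not necessarily those applied in the study, and real classification relies on age- and sex-specific growth charts.

```python
# Illustrative sketch only: computing BMI and assigning the four
# BMI-for-age groups used in the study. Cut-offs are the commonly used
# 5th/85th/95th percentiles, given here as an assumption.
def bmi(weight_kg, height_m):
    return weight_kg / height_m**2

def bmi_for_age_group(bmi_percentile):
    """bmi_percentile: the child's BMI percentile from a growth reference."""
    if bmi_percentile < 5:
        return "underweight"
    if bmi_percentile < 85:
        return "normal"
    if bmi_percentile < 95:
        return "overweight"
    return "obese"

print(round(bmi(32.0, 1.35), 1))   # about 17.6 kg/m^2
print(bmi_for_age_group(90))       # "overweight"
```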

Keywords: children, eating habits, healthy food, obesity, school-going, fast foods

Procedia PDF Downloads 182
29 Use of Machine Learning Algorithms on Pediatric MR Images for Tumor Classification

Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos

Abstract:

Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancer in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist’s visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, so histopathological examination of biopsy samples is currently considered the gold standard for obtaining a definite diagnosis. Machine learning is the study of computational algorithms that can use mathematical relationships and patterns, simple or complex, from empirical and scientific data to make reliable decisions. Accordingly, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. A further purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric subjects was used: 60 patients with a diagnosis of posterior fossa tumor (20 ependymomas, 20 astrocytomas, 20 medulloblastomas) and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, which was manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed following a number of steps, including noise reduction, bias-field correction, thresholding, coregistration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of features was chosen for investigation, including age, tumor shape characteristics, image intensity characteristics and texture features. After selecting the features that achieved the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MATLAB platforms, respectively. Results-Conclusions: The classification results and accuracy for each type of glioma with the four different algorithms are still in progress.
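
As a rough illustration of the classification stage (the study itself uses WEKA and MATLAB rather than Python), a scikit-learn sketch that compares three of the named classifiers on a stand-in feature matrix; the data and parameter choices below are hypothetical.

```python
# Illustrative sketch only: cross-validated comparison of three of the
# classifiers named above on a feature matrix (age, shape, intensity and
# texture features). The data here are random stand-ins, not study data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 40))         # 80 subjects, 40 extracted features
y = rng.integers(0, 4, size=80)       # 4 classes: 3 tumor types + healthy

classifiers = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "DecisionTree": DecisionTreeClassifier(max_depth=5),
}
for name, clf in classifiers.items():
    # scale, keep the 10 most informative features, then classify
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```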

Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology

Procedia PDF Downloads 124
28 Impact of Six-Minute Walk or Rest Break during Extended Gameplay on Executive Function in First-Person Shooter Esport Players

Authors: Joanne DiFrancisco-Donoghue, Seth E. Jenny, Peter C. Douris, Sophia Ahmad, Kyle Yuen, Hillary Gan, Kenney Abraham, Amber Sousa

Abstract:

Background: Guidelines for the maintenance of health of esports players and the cognitive changes that accompany competitive gaming are understudied. Executive functioning is an important cognitive skill for an esports player. The relationship between executive functions and physical exercise has been well established. However, the effects of prolonged sitting, regardless of physical activity level, have not been established. Prolonged uninterrupted sitting reduces cerebral blood flow, and reduced cerebral blood flow is associated with lower cognitive function and fatigue. This decrease in cerebral blood flow has been shown to be offset by frequent, short walking breaks. These breaks can be as short as 2 minutes at the 30-minute mark and 6 minutes following 60 minutes of prolonged sitting. The rationale is the increase in blood flow and the positive effects this has on metabolic responses. The primary purpose of this study was to evaluate executive function changes following 6-minute bouts of walking and complete rest mid-session, compared to no break, during prolonged gameplay in competitive first-person shooter (FPS) esports players. Methods: This study was conducted virtually due to the COVID-19 pandemic and was approved by the New York Institute of Technology IRB. Twelve competitive FPS participants signed written consent to participate in this randomized pilot study. All participants held a gold ranking or higher. Participants were asked to play for 2 hours on three separate days. Outcome measures to test executive function included the Color Stroop and the Tower of London tests, which were administered online each day prior to gaming and at the completion of gaming. All participants completed the tests beforehand for familiarization. One day of testing consisted of a 6-minute walk break after 60-75 minutes of play, during which the Rating of Perceived Exertion (RPE) was recorded; the participant then continued to play for another 60-75 minutes and completed the tests again. On another day, the participants repeated the same methods, replacing the 6-minute walk with lying down and resting for 6 minutes. On the last day, the participant played continuously with no break for 2 hours and repeated the outcome tests pre and post play. A Latin square was used to randomize the treatment order. Results: Using descriptive statistics, the largest change in mean reaction time for incorrect congruent trials from pre to post play was seen following the 6-minute walk (662.0 (609.6) ms pre to 602.8 (539.2) ms post), followed by the 6-minute rest condition (681.7 (618.1) ms pre to 666.3 (607.9) ms post), with minimal change in the continuous condition (594.0 (534.1) ms pre to 589.6 (552.9) ms post). The mean solution time was fastest in the resting condition (7774.6 (6302.8) ms), followed by the walk condition (7929.4 (5992.8) ms), with the continuous condition being slowest (9337.3 (7228.7) ms). Conclusion: Short walking breaks improve blood flow and reduce the risk of venous thromboembolism during prolonged sitting. This pilot study demonstrated that a low-intensity 6-minute walk break, following 60 minutes of play, may also improve executive function in FPS gamers.

Keywords: executive function, FPS, physical activity, prolonged sitting

Procedia PDF Downloads 180
27 Mobile App versus Website: A Comparative Eye-Tracking Case Study of Topshop

Authors: Zofija Tupikovskaja-Omovie, David Tyler, Sam Dhanapala, Steve Hayes

Abstract:

The UK is leading in online retail and mobile adoption. However, there is a dearth of information relating to mobile apparel retail, and developing an understanding of consumer browsing and purchase behaviour in the m-retail channel would provide apparel marketers, mobile website and app developers with the necessary understanding of consumers’ needs. Despite the rapid growth of mobile retail businesses, no published study has examined shopping behaviour on fashion mobile websites and apps. A mixed-method approach helped to understand why fashion consumers prefer websites on mobile devices when mobile apps are also available. The following research methods were employed: survey, eye-tracking experiments, observation, and interview with retrospective think-aloud. The mobile gaze-tracking device by SensoMotoric Instruments was used to understand frustrations in navigation and other issues facing consumers in the mobile channel. This method helped to validate and complement other traditional user-testing approaches in order to optimize user experience and enhance the development of the mobile retail channel. The study involved eight participants: females aged 18 to 35 years old who are existing mobile shoppers. The participants used the Topshop mobile app and website on a smartphone to complete a task according to a specified scenario leading to a purchase. The comparative study was based on: duration and time spent at different stages of the shopping journey, number of steps involved and product pages visited, search approaches used, layout and visual cues, as well as consumer perceptions and expectations. The results from the data analysis show significant differences in consumer behaviour when using a mobile app or a website on a smartphone. Moreover, two types of problems were identified, namely technical issues and human errors. Having a mobile app does not guarantee success in satisfying mobile fashion consumers. The differences in layout and visual cues seem to influence the overall shopping experience on a smartphone. The layout of search results on the website was different from the mobile app, and therefore participants, in most cases, behaved differently on the different platforms. The number of product pages visited on the mobile app was triple the number visited on the website due to the limited visibility of products in the search results. Although the data on traffic trends held by retailers to date, including retail sector breakdowns for visits and views and data on device splits and duration, might seem a valuable source of information, it cannot explain why consumers visit many product pages, stay longer on the website or mobile app, or abandon the basket. A comprehensive list of pros and cons was developed highlighting issues for the website and mobile app, and recommendations were provided. The findings suggest that fashion retailers need to be aware of actual consumer behaviour on the mobile channel and consumer expectations in order to offer a seamless shopping experience. Added to this is the challenge of retaining existing and acquiring new customers. There seem to be differences in the way fashion consumers search and shop on mobile, which need to be explored in further studies.

Keywords: consumer behavior, eye-tracking technology, fashion retail, mobile app, m-retail, smartphones, Topshop, user experience, website

Procedia PDF Downloads 423
26 Grisotti Flap as Treatment for Central Tumors of the Breast

Authors: R. Pardo, P. Menendez, M. A. Gil-Olarte, S. Sanchez, E. García, R. Quintana, J. Martín

Abstract:

Introduction: Within oncoplastic breast techniques, there is increased interest in immediate partial breast reconstruction. The volume resected is greater than that of conventional conservative techniques. Central tumours of the breast have classically been treated with a mastectomy with regard to oncological safety and the cosmetic secondary effects after wide central resection of the nipple and the breast tissue beneath. Oncological results for central quadrantectomy show a recurrence rate, disease-free period and survival identical to mastectomy. The Grisotti flap is an oncoplastic surgical technique that allows the surgeon to perform a safe central quadrantectomy with excellent cosmetic results. Material and methods: The Grisotti flap is a glandular cutaneous advancement-rotation flap that can fill the defect in the central portion of the excised breast. If the inferior border is affected by tumour and further surgery is decided upon at the multidisciplinary team meeting, it will be necessary to perform a mastectomy. All patients undergoing surgery with a Grisotti flap since 2009 were reviewed, obtaining the following data: age, histopathological diagnosis, size, operating time, volume of tissue resected, postoperative admission time, re-excisions due to margins affected by tumour, wound dehiscence, complications and recurrence. Analysis and results of sentinel node biopsy were also obtained. Results: 12 patients underwent surgery between 2009-2015. The mean age was 54 years (34-67). All had a preoperative diagnosis of infiltrative ductal carcinoma of less than 2 cm. Diagnosis was made with ultrasound, mammography or both; magnetic resonance was used in 5 cases. No patient had a preoperative positive axilla after ultrasound exploration. Mean operating time was 104 minutes (84-130). Postoperative stay was 24 hours. Mean volume resected was 159 cc (70-286). In one patient the surgical border was affected by tumour, and a further procedure with resection of the affected border was performed as ambulatory surgery. The sentinel node biopsy was positive for micrometastasis in only two cases. In one case, treated in 2009, lymphadenectomy was performed. In the other, treated in 2015, no lymphadenectomy was performed, as the patient had a favourable histopathological prognosis and the multidisciplinary team meeting agreed that lymphadenectomy was not required. No recurrence has been diagnosed in any of the patients who underwent surgery, and they are all disease-free at present. Conclusions: Conservative surgery for retroareolar central tumours of the breast results in good local control of the disease with free surgical borders, including resection of the nipple-areola complex and pectoralis major muscle fascia. Reconstructive surgery with the inferior Grisotti flap adequately fills the defect after central quadrantectomy, with creation of a new cutaneous disc where a new nipple-areola complex is reconstructed with a local flap or micropigmentation. This avoids the need for contralateral symmetrization. Sentinel node biopsy can be performed without added morbidity. When feasible, the Grisotti flap will avoid skin-sparing mastectomy for central breast tumours that would require the use of an expander, prosthesis or myocutaneous flap, with all the complications of a more complex operation.

Keywords: Grisotti flap, oncoplastic surgery, central tumours, breast

Procedia PDF Downloads 295
25 Multilocus Phylogenetic Approach Reveals Informative DNA Barcodes for Studying Evolution and Taxonomy of Fusarium Fungi

Authors: Alexander A. Stakheev, Larisa V. Samokhvalova, Sergey K. Zavriev

Abstract:

Fusarium fungi are among the most devastating plant pathogens, distributed all over the world. Significant reduction of grain yield and quality caused by Fusarium leads to multi-billion dollar annual losses to world agricultural production. These organisms can also cause infections in immunocompromised persons and produce a wide range of mycotoxins, such as trichothecenes, fumonisins, and zearalenone, which are hazardous to human and animal health. Identification of Fusarium fungi based on the morphology of spores and spore-forming structures, colony color and appearance on specific culture media is often very complicated due to the high similarity of these features for closely related species. Modern Fusarium taxonomy increasingly uses data from crossing experiments (biological species concept) and genetic polymorphism analysis (phylogenetic species concept). A number of novel Fusarium sibling species have been established using DNA barcoding techniques. Species recognition is best made with the combined phylogeny of intron-rich protein-coding genes and ribosomal DNA sequences. However, the internal transcribed spacer (ITS) region, which is considered to be the universal DNA barcode for Fungi, is not suitable for the genus Fusarium because of its insufficient variability between closely related species and the presence of non-orthologous copies in the genome. Nowadays, the translation elongation factor 1 alpha (TEF1α) gene is the “gold standard” of Fusarium taxonomy, but the search for novel informative markers is still needed. In this study, we used two novel DNA markers, frataxin (FXN) and heat shock protein 90 (HSP90), to discover phylogenetic relationships between Fusarium species. Multilocus phylogenetic analysis based on partial sequences of TEF1α, FXN, HSP90, as well as the intergenic spacer of ribosomal DNA (IGS), beta-tubulin (β-TUB) and phosphate permease (PHO) genes, has been conducted for 120 isolates of 19 Fusarium species from different climatic zones of Russia and neighboring countries using maximum likelihood (ML) and maximum parsimony (MP) algorithms. Our analyses revealed that the FXN and HSP90 genes could be considered informative phylogenetic markers, suitable for evolutionary and taxonomic studies of the genus Fusarium. It has been shown that the PHO gene possesses more variable (22%) and parsimony-informative (19%) characters than other markers, including TEF1α (12% and 9%, respectively), when used for elucidating phylogenetic relationships between F. avenaceum and its closest relatives – F. tricinctum, F. acuminatum and F. torulosum. Application of the novel DNA barcodes confirmed that F. arthrosporioides does not represent a separate species but only a subspecies of F. avenaceum. Phylogeny based on partial PHO and FXN sequences revealed the presence of a separate cluster of four F. avenaceum strains which were closer to F. torulosum than to the major F. avenaceum clade. The strain F-846 from Moldova, morphologically identified as F. poae, formed a separate lineage in all the constructed dendrograms and could potentially be considered a separate species, but more information is needed to confirm this conclusion. Variable sites in PHO sequences were used for the first time to develop specific qPCR-based diagnostic assays for F. acuminatum and F. torulosum. This work was supported by the Russian Foundation for Basic Research (grant № 15-29-02527).
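
To make the quoted percentages concrete, a short sketch of how variable and parsimony-informative sites can be counted in an alignment; the sequences below are toy examples, not the study's data.

```python
# Minimal sketch (not the authors' pipeline): counting variable and
# parsimony-informative sites in an aligned set of sequences, the two
# statistics reported for PHO, TEF1-alpha and the other markers.
from collections import Counter

def site_stats(aligned_seqs):
    """aligned_seqs: list of equal-length, gap-aligned DNA strings."""
    length = len(aligned_seqs[0])
    variable = informative = 0
    for i in range(length):
        column = [s[i] for s in aligned_seqs if s[i] not in "-N"]
        counts = Counter(column)
        if len(counts) > 1:
            variable += 1
            # parsimony-informative: at least 2 states, each in >= 2 taxa
            if sum(1 for c in counts.values() if c >= 2) >= 2:
                informative += 1
    return variable / length, informative / length

# toy alignment with hypothetical sequences
seqs = ["ATGCATGC", "ATGCATGA", "ATGTATGA", "ATGTATGC"]
var_frac, inf_frac = site_stats(seqs)
print(f"variable: {var_frac:.0%}, parsimony-informative: {inf_frac:.0%}")
```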

Keywords: DNA barcode, fusarium, identification, phylogenetics, taxonomy

Procedia PDF Downloads 293
24 Sensitivity and Specificity of Some Serological Tests Used for Diagnosis of Bovine Brucellosis in Egypt on a Bacteriological and Molecular Basis

Authors: Hosein I. Hosein, Ragab Azzam, Ahmed M. S. Menshawy, Sherin Rouby, Khaled Hendy, Ayman Mahrous, Hany Hussien

Abstract:

Brucellosis is a highly contagious bacterial zoonotic disease with a worldwide spread and different names: infectious or enzootic abortion and Bang's disease in animals, and Mediterranean or Malta fever, undulant fever and rock fever in humans. It is caused by the different species of the genus Brucella, a Gram-negative, aerobic, non-spore-forming, facultative intracellular bacterium. Brucella affects a wide range of mammals, including bovines, small ruminants, pigs, equines, rodents and marine mammals, as well as humans, resulting in serious economic losses in animal populations. In humans, Brucella causes a severe illness representing a great public health problem. The disease was reported in Egypt for the first time in 1939; since then, the disease has remained endemic at high levels among cattle, buffalo, sheep and goats and still represents a public health hazard. The annual economic losses due to brucellosis were estimated at about 60 million Egyptian pounds yearly, but accurate estimates are still missing despite almost 30 years of implementation of the Egyptian control programme. Despite being the gold standard, bacterial isolation has been reported to show poor sensitivity for samples with low levels of Brucella and is impractical for regular screening of large populations. Thus, serological tests still remain the cornerstone for routine diagnosis of brucellosis, especially in developing countries. In the present study, a total of 1533 cows (256 from Beni-Suef Governorate, 445 from Al-Fayoum Governorate and 832 from Damietta Governorate) were employed for estimation of the relative sensitivity, relative specificity, positive predictive value and negative predictive value of the buffered acidified plate antigen test (BPAT), rose bengal test (RBT) and complement fixation test (CFT). The overall seroprevalence of brucellosis was 19.63%. The relative sensitivity, relative specificity, positive predictive value and negative predictive value of BPAT, RBT and CFT were estimated as (96.27%, 96.76%, 87.65% and 99.10%), (93.42%, 96.27%, 90.16% and 98.35%) and (89.30%, 98.60%, 94.35% and 97.24%), respectively. BPAT showed the highest sensitivity among the three employed serological tests. RBT was less specific than BPAT. CFT showed the lowest sensitivity (89.30%) among the three employed serological tests but the highest specificity. Different tissue specimens from 22 seropositive cows (spleen, udder, and retropharyngeal and supra-mammary lymph nodes) were subjected to bacteriological studies for isolation and identification of Brucella organisms. Brucella melitensis biovar 3 could be recovered from 12 (54.55%) cows. The remaining 10 cases (45.45%) were culture negative and could not be classified bacteriologically. Bruce-ladder PCR was carried out for molecular identification of the 12 Brucella isolates at the species level. Three fragments of 587 bp, 1071 bp and 1682 bp were amplified, indicating Brucella melitensis. The results indicate the importance of using several procedures to overcome the problem of some infected animals escaping diagnosis. Bruce-ladder PCR is an important tool for diagnosis and epidemiologic studies, providing relevant information for identification of Brucella spp.
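
For reference, a minimal sketch showing how the four reported metrics are computed from a 2×2 comparison against the reference (gold-standard) classification; the counts below are hypothetical and only illustrate the arithmetic.

```python
# Minimal sketch (not the authors' code): diagnostic-test metrics of the
# kind reported for BPAT, RBT and CFT, computed from a 2x2 table where the
# "true" status comes from the gold-standard comparison.
def diagnostic_metrics(tp, fp, fn, tn):
    """tp/fp/fn/tn: counts of true/false positives and negatives."""
    sensitivity = tp / (tp + fn)   # relative sensitivity
    specificity = tn / (tn + fp)   # relative specificity
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# hypothetical counts, for illustration only
sens, spec, ppv, npv = diagnostic_metrics(tp=258, fp=36, fn=10, tn=1229)
print(f"Se={sens:.2%}  Sp={spec:.2%}  PPV={ppv:.2%}  NPV={npv:.2%}")
```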

Keywords: brucellosis, relative sensitivity, relative specificity, Bruce-ladder, Egypt

Procedia PDF Downloads 314
23 When It Wasn’t There: Understanding the Importance of High School Sports

Authors: Karen Chad, Louise Humbert, Kenzie Friesen, Dave Sandomirsky

Abstract:

Background: The COVID-19 pandemic presented many historical challenges to the sporting community. For organizations and individuals, sport was put on hold, resulting in social, economic, physical, and mental health consequences for all involved. High school sports are seen as an effective and accessible pathway for students to receive health, social, and academic benefits. Studies examining sport cessation due to COVID-19 found substantial negative outcomes for the physical and mental well-being of participants in the high school setting. However, the pandemic afforded an opportunity to examine sport participation and the value people place upon their engagement in high school sport. Study objectives: (1) Examine the experiences of students, parents, administrators, officials, and coaches during a year without high school sports; (2) Understand why participants are involved in high school sports; and (3) Learn what supports are needed for future involvement. Methodology: A mixed-method design was used, including semi-structured interviews and a survey (SurveyMonkey software), which was disseminated electronically to high school students, coaches, school administrators, parents, and officials. Results: 1222 respondents completed the survey. Findings showed: (1) 100% of students participate in high school sports to improve their mental health, with >95% saying it keeps them active and healthy, helps them make friends and teaches teamwork, builds confidence and positive self-perceptions, teaches resiliency, enhances connectivity to their school, and supports academic learning; (2) The top three reasons teachers coach are their desire to make a difference in the lives of students, enjoyment and love of the sport, and giving back. Teachers said what they enjoy most is contributing to and watching athletes develop, direct involvement with student sport success, and the competitive atmosphere; (3) 90% of parents believe playing sports is a valuable experience for their child, 95% said it enriches student academic learning and educational experiences, and 97% encouraged their child to play school sports; (4) Officials participate because of their enjoyment and love of the sport, their experience and expertise, the desire to make a difference in the lives of children, the competitive/sporting atmosphere, and growing the sport; 4% of officials said participation was financially motivated; (5) 100% of administrators said high school sports are important for everyone, and 80% believed the pandemic would decrease the number of teachers coaching and negatively affect student mental health and well-being. When there was no sport, many athletes got a part-time job and tried to stay active, with limited success. Coaches, officials, and parents spent more time with family. All participants did little physical activity, were bored, and struggled with mental health and poor physical health. Respondents recommended better communication, promotion, and branding of high school sport benefits, equitable funding for all sports, athlete development, compensation and recognition for coaching, and simple processes to strengthen the high school sport model. Conclusions: High school sport is an effective vehicle for athletes, parents, coaches, administrators, and officials to derive many positive outcomes. When it is taken away, serious consequences follow. Paying attention to important success factors will be important for the effectiveness of high school sports.

Keywords: physical activity, high school, sports, pandemic

Procedia PDF Downloads 106
22 Bio-Inspired Information Complexity Management: From Ant Colony to Construction Firm

Authors: Hamza Saeed, Khurram Iqbal Ahmad Khan

Abstract:

Effective information management is crucial for any construction project and its success. The primary areas of information generation are the construction site and the design office. Different types of information are required at different stages of construction, involving various stakeholders and creating complexity. There is a need for effective management of information flows to reduce the uncertainty that creates complexity. Nature provides a unique perspective in terms of dealing with complexity, in particular information complexity. System dynamics methodology provides tools and techniques to address complexity, involving modeling and simulation techniques. Nature has been dealing with complex systems since its creation 4.5 billion years ago. It has perfected its systems through evolution, resilience towards sudden changes, and the extinction of unadaptable and outdated species that are no longer fit for the environment. Nature has been accommodating changing factors and handling complexity forever. Humans have started to look at their natural counterparts for inspiration and solutions to their problems. This brings forth the possibility of using a biomimetic approach to improve the management practices used in the construction sector. Ants inhabit diverse habitats: Cataglyphis and Pogonomyrmex live in deserts, leafcutter ants reside in rainforests, and pharaoh ants are native to urban developments of tropical areas. Detailed studies have been done on fifty species out of the fourteen thousand discovered, providing the opportunity to study interactions in diverse environments that generate collective behavior. Animals evolve to better adapt to their environment. The collective behavior of ants emerges from feedback through interactions among individuals, based on a combination of three basic factors: the patchiness of resources in time and space, the operating cost, and environmental stability together with the threat of rupture. If resources appear in patches through time and space, the response is accelerating and non-linear; if resources are scattered, the response follows a linear pattern. If the acquisition of energy through food is faster than the energy spent to get it, the default is to continue with an activity unless it is halted for some reason; if the energy spent is higher than the energy gained, the default changes to staying put unless activated. Finally, if the environment is stable and the threat of rupture is low, the activation and amplification rate is slow but steady; otherwise, it is fast and sporadic. To further study these effects and to eliminate environmental bias, the behavior of four different ant species was studied, namely red harvester ants (Pogonomyrmex barbatus), Argentine ants (Linepithema humile), turtle ants (Cephalotes goniodontus), and leafcutter ants (genus Atta). This study aims to improve the information system in the construction sector by providing a guideline inspired by nature with a systems-thinking approach, using system dynamics as a tool. Identified factors and their interdependencies were analyzed in the form of a causal loop diagram (CLD), and construction industry professionals were interviewed based on the developed CLD, which was validated with a significant response. These factors and interdependencies in the natural system correspond with those in man-made systems, providing a guideline for effective use and flow of information.
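
To make the system-dynamics step concrete, a toy sketch of one reinforcing and one balancing information-flow loop integrated with simple Euler steps; the variables and rates are invented for illustration and do not come from the authors' CLD.

```python
# Illustrative sketch only (not the authors' model): a tiny system-dynamics
# style simulation of an information backlog with a reinforcing loop
# (uncertainty breeds rework requests) and a balancing loop (processing
# capacity). All names and rates are hypothetical.
def simulate(steps=50, dt=1.0):
    backlog = 100.0          # unresolved information requests on site
    capacity = 20.0          # requests the design office can clear per step
    generation_rate = 18.0   # new requests generated per step
    uncertainty = 0.2        # fraction of backlog that spawns rework requests
    history = []
    for _ in range(steps):
        inflow = generation_rate + uncertainty * backlog   # reinforcing loop
        outflow = min(capacity, backlog)                   # balancing loop
        backlog += dt * (inflow - outflow)
        history.append(backlog)
    return history

trace = simulate()
print(f"backlog after 50 steps: {trace[-1]:.1f}")
```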

Keywords: biomimetics, complex systems, construction management, information management, system dynamics

Procedia PDF Downloads 113
21 Comparative Assessment of the Thermal Tolerance of Spotted Stemborer, Chilo partellus Swinhoe (Lepidoptera: Crambidae) and Its Larval Parasitoid, Cotesia sesamiae Cameron (Hymenoptera: Braconidae)

Authors: Reyard Mutamiswa, Frank Chidawanyika, Casper Nyamukondiwa

Abstract:

Under stressful thermal environments, insects adjust their behaviour and physiology to maintain key life-history activities and improve survival. For interacting species, mutual or antagonistic, thermal stress may affect the participants in differing ways, which may then affect the outcome of the ecological relationship. In agroecosystems, this may be the fate of relationships between insect pests and their antagonistic parasitoids under acute and chronic thermal variability. Against this background, we therefore investigated the thermal tolerance of different developmental stages of Chilo partellus Swinhoe (Lepidoptera: Crambidae) and its larval parasitoid Cotesia sesamiae Cameron (Hymenoptera: Braconidae) using both dynamic and static protocols. In laboratory experiments, we conducted lethal temperature assays (upper and lower lethal temperatures) using direct plunge protocols in programmable water baths (Systronix Scientific, South Africa), determined the effects of ramping rate on critical thermal limits following standardized protocols using insulated double-jacketed chambers ('organ pipes') connected to a programmable water bath (Lauda Eco Gold, Lauda Dr. R. Wobser GmbH and Co. KG, Germany), measured supercooling points (SCPs) following dynamic protocols using a Pico logger connected to a programmable water bath, and recorded heat knock-down time (HKDT) and chill-coma recovery time (CCRT) following static protocols in climate chambers (HPP 260, Memmert GmbH + Co. KG, Germany) connected to a camera (HD Covert Network Camera, DS-2CD6412FWD-20, Hikvision Digital Technology Co., Ltd, China). When exposed for two hours to a static temperature, lower lethal temperatures ranged from -9 to 6, -14 to -2 and -1 to 4°C, while upper lethal temperatures ranged from 37 to 48, 41 to 49 and 36 to 39°C for C. partellus eggs, larvae and C. sesamiae adults, respectively. Faster heating rates improved critical thermal maxima (CTmax) in C. partellus larvae and in adult C. partellus and C. sesamiae. Lower cooling rates improved critical thermal minima (CTmin) in C. partellus and C. sesamiae adults while compromising CTmin in C. partellus larvae. The mean SCPs for C. partellus larvae, pupae and adults were -11.82±1.78, -10.43±1.73 and -15.75±2.47°C, respectively, with adults having the lowest SCPs. Heat knock-down time and chill-coma recovery time varied significantly between C. partellus larvae and adults: larvae had a higher HKDT than adults, while the latter recovered significantly faster following chill coma. The current results suggest developmental-stage differences in C. partellus thermal tolerance (with respect to lethal temperatures and critical thermal limits) and a compromised temperature tolerance of the parasitoid C. sesamiae relative to its host, suggesting potential asynchrony between host and parasitoid population phenology and, consequently, biocontrol efficacy under global change. These results have broad implications for biological pest management and insect-natural enemy interactions under rapidly changing thermal environments.

Keywords: chill-coma recovery time, climate change, heat knock-down time, lethal temperatures, supercooling point

Procedia PDF Downloads 212
20 Calpoly Autonomous Transportation Experience: Software for Driverless Vehicle Operating on Campus

Authors: F. Tang, S. Boskovich, A. Raheja, Z. Aliyazicioglu, S. Bhandari, N. Tsuchiya

Abstract:

The Calpoly Autonomous Transportation Experience (CATE) is a driverless vehicle that we are developing to provide safe, accessible, and efficient transportation of passengers throughout the Cal Poly Pomona campus for events such as orientation tours. Unlike other self-driving vehicles, which are usually developed to operate alongside other vehicles and reside only on road networks, CATE will operate exclusively on the walk-paths of the campus (potentially narrow passages) with pedestrians traveling from multiple locations. Safety becomes paramount as CATE operates within the same environment as pedestrians. As driverless vehicles assume greater roles in today’s transportation, this project will contribute to autonomous driving with pedestrian traffic in a highly dynamic environment. The CATE project requires significant interdisciplinary work: researchers from mechanical engineering, electrical engineering and computer science are working together to attack the problem from different perspectives (hardware, software and system). In this abstract, we describe the software aspects of the project, with a focus on the requirements and the major components. CATE shall provide a GUI for the average user to interact with the car and access its available functionalities, such as selecting a destination from any origin on campus. We have developed an interface that provides an aerial view of the campus map, the current car location, routes, and the goal location. Users can interact with CATE through audio or manual inputs. CATE shall plan routes from the origin to the selected destination for the vehicle to travel. We will use an existing aerial map of the campus and convert it to a spatial graph configuration where the vertices represent landmarks and edges represent paths that the car should follow with some designated behaviors (such as staying on the right side of the lane or following an edge). Graph search algorithms such as A* will be implemented as the default path-planning algorithm, and D* Lite will be explored to efficiently recompute the path when there are any changes to the map. CATE shall avoid any static obstacles and walking pedestrians within some safe distance. Unlike traveling along traditional roadways, CATE’s route directly coexists with pedestrians. To ensure the safety of the pedestrians, we will use sensor fusion techniques that combine data from both lidar and stereo vision for obstacle avoidance while also allowing CATE to operate along its intended route. We will also build prediction models for pedestrian traffic patterns. CATE shall improve its localization and work in GPS-denied situations. CATE relies on its GPS to give its current location, which has a precision of a few meters. We have implemented an Unscented Kalman Filter (UKF) that allows the fusion of data from multiple sensors (such as GPS, IMU, and odometry) in order to increase the confidence of localization. We also noticed that GPS signals can easily get degraded or blocked on campus due to high-rise buildings or trees; the UKF can also help here to generate a better state estimate. In summary, CATE will provide an on-campus transportation experience that coexists with dynamic pedestrian traffic. In future work, we will extend it to multi-vehicle scenarios.
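
As an illustration of the default path-planning approach described above, a small A* sketch over a spatial graph of landmarks; the node names, coordinates, and distances are invented for the example.

```python
# Minimal sketch (not the CATE codebase): A* search over a spatial graph of
# campus landmarks, with a straight-line distance heuristic.
import heapq, math

coords = {"gate": (0, 0), "library": (2, 1), "quad": (3, 3), "gym": (5, 2)}
graph = {  # edges with walking distances between landmarks (hypothetical)
    "gate": {"library": 2.2},
    "library": {"gate": 2.2, "quad": 2.2, "gym": 3.2},
    "quad": {"library": 2.2, "gym": 2.2},
    "gym": {"library": 3.2, "quad": 2.2},
}

def heuristic(a, b):                       # admissible straight-line estimate
    (x1, y1), (x2, y2) = coords[a], coords[b]
    return math.hypot(x1 - x2, y1 - y2)

def a_star(start, goal):
    frontier = [(heuristic(start, goal), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        for nxt, step in graph[node].items():
            new_cost = cost + step
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                priority = new_cost + heuristic(nxt, goal)
                heapq.heappush(frontier, (priority, new_cost, nxt, path + [nxt]))
    return None, float("inf")

print(a_star("gate", "gym"))   # (['gate', 'library', 'gym'], 5.4)
```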

Keywords: driverless vehicle, path planning, sensor fusion, state estimate

Procedia PDF Downloads 114
19 Household Water Practices in a Rapidly Urbanizing City and Its Implications for the Future of Potable Water: A Case Study of Abuja, Nigeria

Authors: Emmanuel Maiyanga

Abstract:

Access to sufficient good-quality freshwater has been a global challenge, most notably in low-income countries, particularly the Sub-Saharan countries, of which Nigeria is one. Urban populations are soaring, especially in many low-income countries; the existing centralised water-supply infrastructures are ageing and inadequate; moreover, household lifestyles have become more water-demanding. So people mostly devise coping strategies where municipal supply is perceived to have failed. This development threatens the future of groundwater and calls for a review of management strategy and research approach. The various issues associated with water demand management in low-income countries, and Nigeria in particular, are well documented in the literature. However, the way people use water daily in households, the reasons they do so, and how this situation is constructing demand among the middle-class population in Abuja, Nigeria, are poorly understood. This is what this research aims to unpack. This is achieved by using the social practices research approach (which is based on the Theory of Practices) to understand how this situation impacts the shared groundwater resource. A qualitative method was used for data gathering, involving audio-recorded interviews of householders and water professionals in the private and public sectors, as well as observation, note-taking, and document study. The data were analysed thematically using NVivo software. The research reveals the major household practices that draw on water at the domestic scale: water sourcing, body hygiene and sanitation, laundry, kitchen, and outdoor practices (car washing, domestic livestock farming, and gardening). Among all the practices, water sourcing, body hygiene, kitchen, and laundry practices are identified as impacting groundwater most, with the scale of impact varying with household peculiarities. Water-sourcing practices involve people sourcing mostly from personal boreholes because the municipal water supply is perceived as inadequate and unreliable in terms of service delivery and water quality, and people prefer the easier, unlimited access and control that boreholes give them. Body hygiene practices reveal that every respondent prefers bucket bathing at least once daily, and the majority bathe twice or more every day; frequency is determined by the feeling of heat and dirt on the skin. Thus, people bathe to cool down, stay clean, and satisfy perceived social, religious, and hygiene demands. Kitchen practice consumes water significantly, as people run the tap for vegetable washing in daily food preparation and for dishwashing after each meal. Laundry practice reveals that most people wash clothes most frequently (twice a week) during hot and dusty weather, and washing by hand in basins and buckets is the most prevalent method and the most water-wasting, due to soap overdose. The research also reveals poor water governance as a major cause of the current inadequate municipal water delivery. The implication of poor governance and the widespread use of boreholes is uncontrolled abstraction of groundwater to satisfy desired household practices, putting the future of the shared aquifer at great risk of total depletion, with attendant multiplying effects on the people and the environment as the population continues to soar.

Keywords: boreholes, groundwater, household water practices, self-supply

Procedia PDF Downloads 95
18 The Need for a More Defined Role for Psychologists in Adult Consultation Liaison Services in Hospital Settings

Authors: Ana Violante, Jodie Maccarrone, Maria Fimiani

Abstract:

In the United States, over 30 million people are hospitalized annually for conditions that require acute, 24-hour, supervised care. The experience of hospitalization can be traumatic, exposing the patient to loss of control, autonomy, and productivity. Furthermore, 40% of patients admitted to hospitals for general medical illness have a comorbid psychiatric diagnosis. Research suggests individuals admitted with psychiatric comorbidities experience poorer health outcomes, higher utilization rates, and increased overall cost of care. Empirical work suggests hospital settings that include a consultation liaison (CL) service report reduced length of stay, lower costs per patient, improved medical staff and patient satisfaction, and reduced readmission after 180 days. Despite the overall positive impact CL services can have on patient care, it is estimated that only 1% - 2.8% of hospital admissions receive these services, and most research has been conducted by the field of psychiatry. Health psychologists could play an important role in increasing access to this valuable service, though the extent to which health psychologists participate in CL settings is not well known. Objective: Outline the preliminary findings from an empirical study of how many APPIC internship training programs nationally offer adult consultation liaison rotations within inpatient hospital settings, and describe the specific nature of these training experiences. Research Method/Design: Data for sites categorized as "health psychology" were exported into Excel from the 2022-2023 APPIC Directory, initially returning a total of 537 health training programs out of 1518 total programs (35% of all APPIC programs). A full review included a comprehensive quantitative and qualitative review of the APPIC program summary, the site website, and program brochures. The quantitative review extracted the number of training positions, stipend amount, program location (state), patient population, and rotation; the qualitative review examined the nature of the training experience. Results: 29 programs offering CL internship training were identified, representing 5% of all APPIC health psychology internship training programs and 2% of all APPIC internship training programs. Of the 29 internship training programs, 16 were exclusively within a pediatric setting (55%), 11 were exclusively within an adult setting (38%), and two were a mix of pediatric and adult settings (7%). CL training sites were located in 19 states, offering a total of 153 positions nationally, with Florida containing the largest number of programs (4). Only six programs offered 12-month training opportunities, while the rest offered CL as a major (6-month) to minor (3-4 month) rotation. Program stipends for CL training positions ranged from $25,000 to $62,400, with an average of $32,056. Conclusions: These preliminary findings suggest CL training and services are currently limited. Training opportunities that do exist are mostly limited to minor, short rotations and governed by psychiatry. Health psychologists are well-positioned to better define the role of psychology in consultation liaison services and to enhance and formalize existing training protocols. Future research should explore in more detail the empirical outcomes of CL services that employ psychology and delineate the contributions of psychology from those of psychiatry and other disciplines within an inpatient hospital setting.

Keywords: consultation liaison, health psychology, hospital setting, training

Procedia PDF Downloads 43
17 Production, Characterisation, and in vitro Degradation and Biocompatibility of a Solvent-Free Polylactic-Acid/Hydroxyapatite Composite for 3D-Printed Maxillofacial Bone-Regeneration Implants

Authors: Carlos Amnael Orozco-Diaz, Robert David Moorehead, Gwendolen Reilly, Fiona Gilchrist, Cheryl Ann Miller

Abstract:

The current gold standard for maxillofacial reconstruction surgery (MRS) utilizes auto-grafted cancellous bone as a filler. This study was aimed at developing a polylactic-acid/hydroxyapatite (PLA-HA) composite suitable for fused-deposition 3D printing. Functionalization of the polymer through the addition of HA was directed at promoting bone-regeneration properties so that the material can rival the performance of cancellous bone grafts in terms of bone-lesion repair. This kind of composite enables the production of MRS implants based on 3D reconstructions from image studies – namely computed tomography – for anatomically correct fitting. The present study encompassed in-vitro degradation and in-vitro biocompatibility profiling for 3D-printed PLA and PLA-HA composites. PLA filament (Verbatim Co.) and Captal S micro-scale hydroxyapatite powder (Plasma Biotal Ltd) were used to produce PLA-HA composites at 5, 10, and 20%-by-weight HA concentration. These were extruded into 3D-printing filament and processed in a BFB-3000 3D printer (3D Systems Co.) into tensile specimens, which were mechanically tested as per ASTM D638-03. Furthermore, tensile specimens were subjected to accelerated degradation in phosphate-buffered saline solution at 70°C for 23 days, as per ISO 10993-13:2010. This included monitoring of mass loss (through dry weighing), crystallinity (through thermogravimetric analysis/differential thermal analysis), molecular weight (through gel-permeation chromatography), and tensile strength. The in-vitro biocompatibility analysis included cell viability and extracellular matrix deposition, which were assessed both on flat surfaces and on 3D constructs – both produced through 3D printing. Discs of 1 cm in diameter and cubic 3D meshes of 1 cm³ were 3D printed in PLA and PLA-HA composites (n = 6). The samples were seeded with 5000 MG-63 osteosarcoma-like cells, with cell viability extrapolated throughout 21 days via resazurin reduction assays. As evidence of osteogenicity, collagen and calcium deposition were indirectly estimated through Sirius Red and Alizarin Red staining, respectively. Results have shown that 3D-printed PLA loses structural integrity as early as the first day of accelerated degradation, significantly faster than the literature suggests. This was reflected in the loss of tensile strength down to untestable brittleness. During degradation, mass loss, molecular weight, and crystallinity behaved similarly to results found in comparable studies of PLA. All composite versions and pure PLA were found to perform equivalently to tissue-culture plastic (TCP) in supporting the seeded cell population. Significant differences (p = 0.05) were found in collagen deposition for higher HA concentrations, with composite samples performing better than pure PLA and TCP. Additionally, per-cell calcium deposition on the 3D meshes was significantly lower when comparing 3D meshes to discs of the same material (p = 0.05). These results support the idea that 3D-printable PLA-HA composites are a viable resorbable material for artificial grafts for bone regeneration. The degradation data suggest that 3D printing of these materials – as opposed to other manufacturing methods – might result in faster resorption than currently used PLA implants.
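
As a minimal example of the mass-loss monitoring step in the accelerated-degradation protocol, a short sketch computing percentage dry-mass loss; the weights below are invented, not the study's measurements.

```python
# Illustrative sketch only (hypothetical numbers): percentage mass loss from
# dry weights recorded before and after accelerated degradation, as in the
# ISO 10993-13 monitoring described above.
initial_dry_mass = {"PLA": 1.250, "PLA-HA 5%": 1.310, "PLA-HA 20%": 1.420}  # g
day23_dry_mass = {"PLA": 1.102, "PLA-HA 5%": 1.188, "PLA-HA 20%": 1.301}    # g

for material, m0 in initial_dry_mass.items():
    m_t = day23_dry_mass[material]
    mass_loss_pct = 100.0 * (m0 - m_t) / m0
    print(f"{material}: {mass_loss_pct:.1f}% mass loss after 23 days")
```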

Keywords: bone regeneration implants, 3D-printing, in vitro testing, biocompatibility, polymer degradation, polymer-ceramic composites

Procedia PDF Downloads 128
16 Electromyographic Analysis of Biceps Brachii during Golf Swing and Review of Its Impact on Return to Play Following Tendon Surgery

Authors: Amin Masoumiganjgah, Luke Salmon, Julianne Burnton, Fahimeh Bagheri, Gavin Lenton, S. L. Ezekial Tan

Abstract:

Introduction: The incidence of proximal biceps tenodesis and acute distal biceps repair is increasing, and rehabilitation protocols following both are variable. Golf is a popular sport within Australia, and the Gold Coast has become a Mecca for golfers, with more courses per capita than anywhere else in the world. Currently, there are no clear guidelines regarding return to golf play following biceps procedures. The aim of this study was to determine biceps brachii activation during the golf swing through electromyographic analysis and, subsequently, to aid rehabilitation guidelines and return to golf following tenodesis and repair. Methods: Subjects were amateur golfers with no previous upper limb surgery. Surface electromyography (EMG) and high-speed video recording were used to analyse activation of the left and right biceps brachii and the anterior deltoid during the golf swing. Each participant’s maximum voluntary contraction (MVC) was recorded, and they were then required to hit a golf ball aiming for specific distances of 2, 50, 100 and 150 metres at a driving range. Noraxon myoResearch and MATLAB were used for data analysis. Mean %MVC was calculated for the leading and trailing arms during the full swing and its 4 phases: backswing, acceleration, early follow-through and late follow-through. Results: 12 golfers (2 female and 10 male) participated in the study. Median age was 27 (25-38), and all were right-handed. Over all distances, the mean activation of the short and long heads of biceps brachii was < 10% through the full swing. When breaking down the 50, 100 and 150 m swings into phases, mean MVC activation was lowest in the backswing (5.1%), followed by acceleration (9.7%), early follow-through (9.2%), and late follow-through (21.4%). There was more variation and slightly higher activation in the right biceps (trailing arm) in the backswing, acceleration, and early follow-through, with higher activation in the leading arm in late follow-through (25.4% leading, 17.3% trailing). 2 m putts resulted in low MVC values (3.1%) with little variation across swing phases. There was considerable individual variation in results – one tense subject averaged 11.0% biceps MVC through the 2 m putting stroke, and others recorded peak mean MVC biceps activations of 68.9% at 50 m, 101.3% at 100 m, and 111.3% at 150 m. Discussion: Previous studies have investigated the role of the rotator cuff, spine, and hip muscles during the golf swing; however, to our knowledge, this is the first study to investigate the activation of biceps brachii. Many rehabilitation programs following a biceps tenodesis or repair allow active range of motion against gravity and restrict strengthening exercises until 6 weeks, and this does not appear to be associated with any adverse outcome. Previous studies demonstrate that a range of < 10% MVC is similar to the unloaded biceps brachii during walking (1), that active elbow flexion with the hand positioned either in pronation or supination produces MVC < 20% throughout range (2), and that elbow flexion with a 4 kg dumbbell can produce mean MVCs of around 40% (3). Our study demonstrates that increasing activation is associated with the leading arm, increasing shot distance, and the late follow-through phase. Although the cohort mean MVC of the biceps brachii is < 10% through the full swing, variability is high, and biceps activation reached peak mean MVCs of over 100% in different swing phases for some individuals. Given these EMG values, caution is advised when advising patients, after biceps procedures, about returning to long-distance golf shots, particularly when the leading arm is involved. Even though putting would appear to be as safe as having an unloaded hand out of a sling following biceps procedures, the variability of activation patterns across golfers leads us to caution against accelerated golf rehabilitation in those who may be particularly tense golfers. The 50 m short iron shot was too long to be considered a chip shot, and more work can be done in this area to determine the safety of chipping.
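
For illustration, a minimal sketch of the kind of %MVC normalisation and per-phase averaging described in the methods (the study used Noraxon myoResearch and MATLAB); the sampling rate, envelope window, signal, and phase boundaries below are all assumptions.

```python
# Minimal sketch (not the study's pipeline): normalising a rectified EMG
# envelope to the participant's MVC and averaging per swing phase.
import numpy as np

fs = 1500                                  # sampling rate in Hz (assumed)
emg = np.abs(np.random.randn(3 * fs))      # stand-in for a rectified biceps signal
mvc_amplitude = 1.8                        # peak amplitude from the MVC trial

# 100 ms moving-RMS envelope
win = int(0.1 * fs)
envelope = np.sqrt(np.convolve(emg**2, np.ones(win) / win, mode="same"))
pct_mvc = 100.0 * envelope / mvc_amplitude

# swing phases as (start_s, end_s), e.g. taken from the synchronised video
phases = {"backswing": (0.0, 1.0), "acceleration": (1.0, 1.3),
          "early follow-through": (1.3, 1.7), "late follow-through": (1.7, 3.0)}
for name, (t0, t1) in phases.items():
    mean_pct = pct_mvc[int(t0 * fs):int(t1 * fs)].mean()
    print(f"{name}: mean {mean_pct:.1f}% MVC")
```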

Keywords: electromyographic analysis, biceps brachii rupture, golf swing, tendon surgery

Procedia PDF Downloads 54
15 Multiple Freezing/Thawing Cycles Influence the Internal Structure and Mechanical Properties of the Achilles Tendon

Authors: Martyna Ekiert, Natalia Grzechnik, Joanna Karbowniczek, Urszula Stachewicz, Andrzej Mlyniec

Abstract:

Tendon grafting is a common procedure performed to treat tendon rupture. Before the surgical procedure, tissues intended for grafts (e.g., the Achilles tendon) are stored at ultra-low temperatures for a long time and may also be subjected to unfavorable conditions, such as repetitive freezing (F) and thawing (T). Such storage protocols may strongly influence the graft's mechanical properties, decrease its functionality and thus increase the risk of complications during the transplant procedure. Literature reports on the influence of multiple F/T cycles on the internal structure and mechanical properties of tendons remain inconclusive, simultaneously confirming and denying a negative influence of multiple F/T cycles. Inconsistent research methodology and the lack of a clear limit on the number of F/T cycles that disqualifies tissue for surgical graft purposes encouraged us to investigate the issue of multiple F/T cycles by means of biomechanical tensile tests supported by scanning electron microscope (SEM) imaging. The study was conducted on male bovine Achilles tendons derived from a local abattoir. Fresh tendons were cleaned of excess membranes and then sectioned to obtain fascicle bundles. Collected samples were randomly assigned to 6 groups subjected to 1, 2, 4, 6, 8 and 12 freezing/thawing (F/T) cycles, respectively. Each F/T cycle included deep freezing at -80°C followed by thawing at room temperature. After the final thawing, thin slices from the side of samples subjected to 1, 4, 8 and 12 F/T cycles were collected for SEM imaging. The width and thickness of all samples were then measured to calculate the cross-sectional area. Biomechanical tests were performed using a universal testing machine (model Instron 8872, INSTRON®, Norwood, Massachusetts, USA) with a load cell with a maximum capacity of 250 kN under standard atmospheric conditions. Both ends of each fascicle bundle were manually clamped in grasping clamps using abrasive paper and wet cellulose wadding swabs to prevent tissue slipping during clamping and testing. Samples were subjected to a testing procedure including pre-loading, pre-cycling, loading, holding and unloading steps to obtain stress-strain curves representing tendon stretching and relaxation. The stiffness of the Achilles tendon fascicle bundle samples was evaluated in terms of the modulus of elasticity (Young's modulus), calculated from the slope of the linear region of the stress-strain curves. SEM imaging was preceded by chemical sample preparation including 24 h fixation in 3% glutaraldehyde buffered with 0.1 M phosphate buffer, washing with 0.1 M phosphate buffer solution, and dehydration in a graded ethanol series. SEM images (Merlin Gemini II microscope, ZEISS®) were taken at 30,000× magnification, which allowed the diameter of collagen fibrils to be measured. The results show a decrease in the fascicle bundles' Young's modulus as well as a decrease in the diameter of collagen fibrils, confirming the negative influence of multiple F/T cycles on the mechanical properties of tendon tissue.
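
As a simple illustration of the stiffness calculation described above, a sketch that fits the slope of an assumed linear region of a stress-strain curve to estimate Young's modulus; the data and strain window are hypothetical.

```python
# Minimal sketch (not the authors' analysis code): Young's modulus estimated
# as the slope of the linear region of a stress-strain curve. The curve and
# the 2-5% strain window below are invented for illustration.
import numpy as np

strain = np.linspace(0.0, 0.06, 200)                 # dimensionless
stress = 400.0 * strain**2 + 150.0 * strain          # MPa, toy toe-plus-linear curve
stress += np.random.normal(0.0, 0.2, strain.size)    # measurement noise

mask = (strain >= 0.02) & (strain <= 0.05)           # assumed quasi-linear window
slope, intercept = np.polyfit(strain[mask], stress[mask], 1)
print(f"estimated Young's modulus: {slope:.1f} MPa")
```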

Keywords: biomechanics, collagen, fascicle bundles, soft tissue

Procedia PDF Downloads 101
14 Delivering Safer Clinical Trials: Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials

Authors: Claire Williams

Abstract:

Randomised Controlled Trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched in the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time- and resource-intensive. Recent research into the quality of AE reporting during clinical trials concluded that it is substandard and inconsistent. Investigators commonly send sponsors reports that are incorrectly categorised and lack critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world's first prospective, phase 3b real-world trial, 'The Salford Lung Study,' which enabled robust safety monitoring and reporting processes to be accomplished by remote monitoring of patients' EHRs. The technology enables safety alerts pre-defined by the protocol to be detected from data extracted directly from the patient's EHR. Based on study-specific criteria, created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system alerts the investigator or study team to the safety alert. Each safety alert requires a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure. This is achieved in near real-time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug's safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, patient cohorts at much higher risk can be enrolled into trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product's risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on investigator and study teams. With data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
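As a rough illustration of how protocol-defined safety alerts might be raised from extracted EHR data, the sketch below flags hypothetical events (hospital admission, death, neutropenia, hepatotoxicity) against simple rules for clinical review. The field names, thresholds and rule set are assumptions for illustration only and are not the actual logic of the system described above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EHREvent:
    patient_id: str
    kind: str           # e.g. "admission", "lab", "death"
    code: str           # e.g. lab test name or admission type
    value: float = 0.0  # lab value where applicable

def alert_rules(event: EHREvent) -> List[str]:
    """Hypothetical protocol-defined alert rules (illustrative cut-offs only)."""
    alerts = []
    if event.kind == "admission":
        alerts.append("hospital admission")
    if event.kind == "death":
        alerts.append("death")
    if event.kind == "lab" and event.code == "neutrophils" and event.value < 0.5:
        alerts.append("neutropenia")              # x10^9/L, assumed cut-off
    if event.kind == "lab" and event.code == "ALT" and event.value > 120:
        alerts.append("possible hepatotoxicity")  # U/L, assumed cut-off
    return alerts

def screen_events(events: List[EHREvent]):
    """Return (patient_id, alert) pairs for clinical review by the investigator."""
    return [(e.patient_id, a) for e in events for a in alert_rules(e)]

events = [
    EHREvent("P001", "lab", "neutrophils", 0.3),
    EHREvent("P002", "admission", "emergency"),
]
print(screen_events(events))
```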

Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials

Procedia PDF Downloads 50
13 Photosynthesis Metabolism Affects Yield Potentials in Jatropha curcas L.: A Transcriptomic and Physiological Data Analysis

Authors: Nisha Govender, Siju Senan, Zeti-Azura Hussein, Wickneswari Ratnam

Abstract:

Jatropha curcas, a well-described bioenergy crop, has been widely accepted as a future fuel source, especially in tropical regions. However, ideal planting material for large-scale plantation is still lacking, and breeding programmes for improved J. curcas varieties are rendered difficult by limitations in genetic diversity. Using combined transcriptome and physiological data, we investigated the molecular and physiological differences between high- and low-yielding Jatropha curcas to address plausible heritable variations underpinning these differences with regard to photosynthesis, a key metabolism affecting yield potential. A total of 6 individual Jatropha plants from 4 accessions described as high- and low-yielding planting materials were selected from Experimental Plot A, Universiti Kebangsaan Malaysia (UKM), Bangi. Inflorescences and shoots were collected for the transcriptome study. For the physiological study, individual plants (n=10) from the high- and low-yielding populations were screened for agronomic traits, chlorophyll content and stomatal patterning. The J. curcas transcriptomes are available under BioProject PRJNA338924 and BioSamples SAMN05827448-65. Each transcriptome was subjected to functional annotation of sequence datasets using the BLAST2GO suite: BLASTing, mapping, annotation, statistical analysis and visualization. Large-scale phenotyping of the number of fruits per plant (NFPP) and fruits per inflorescence (FPI) classified the high-yielding Jatropha accessions as averaging NFPP = 60 and FPI > 10, whereas the low-yielding accessions averaged NFPP = 10 and FPI < 5. Next-generation sequencing revealed genes differentially expressed in the high-yielding Jatropha relative to the low-yielding plants, with distinct differences at the transcript level associated with photosynthesis metabolism. The collection of differentially expressed genes (DEGs) in the low-yielding population showed comparable CAM photosynthetic metabolism and photorespiration, evident as follows: a phosphoenolpyruvate/phosphate translocator, chloroplastic-like isoform, with 2.5-fold change (FC), and malate dehydrogenase (2.03 FC). Green leaves have the most pronounced photosynthetic activity in the plant body due to their significant accumulation of chloroplasts; in most plants, the leaf is the dominant photosynthesizing organ. A large number of the DEGs in the high-yielding population were attributable to the chloroplast and chloroplast-associated events: STAY-GREEN chloroplastic, chlorophyllase-1-like (5.08 FC), beta-amylase (3.66 FC), chlorophyllase-chloroplastic-like (3.1 FC), thiamine thiazole chloroplastic-like (2.8 FC), 1,4-alpha-glucan branching enzyme, chloroplastic/amyloplastic (2.6 FC), photosynthetic NDH subunit (2.1 FC) and protochlorophyllide chloroplastic (2 FC). These results paralleled a significant increase in chlorophyll a content in the high-yielding population. In addition to the chloroplast-associated transcript abundance, TOO MANY MOUTHS (TMM) at 2.9 FC, which codes for distant stomatal distribution and patterning, may explain a higher concentration of CO2 in the high-yielding population; this is in agreement with the known role of TMM, as clustered stomata cause back-diffusion in the presence of gaps localized closely to one another. We conclude that the high-yielding Jatropha population corresponds to a collective function of C3 metabolism with a low degree of CAM photosynthetic fixation. From the physiological descriptions, high chlorophyll a content and an even distribution of stomata in the leaf contribute to better photosynthetic efficiency in the high-yielding Jatropha compared to the low-yielding population.
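The phenotypic cut-offs reported above (high yield: average NFPP = 60 and FPI > 10; low yield: average NFPP = 10 and FPI < 5) lend themselves to a simple classification rule. The sketch below is only an illustrative reading of those thresholds, with invented accession means; it is not the authors' screening pipeline.

```python
def classify_accession(mean_nfpp: float, mean_fpi: float) -> str:
    """Label an accession from its mean number of fruits per plant (NFPP)
    and fruits per inflorescence (FPI), using the cut-offs reported in the
    abstract; intermediate values are left unclassified."""
    if mean_nfpp >= 60 and mean_fpi > 10:
        return "high-yielding"
    if mean_nfpp <= 10 and mean_fpi < 5:
        return "low-yielding"
    return "unclassified"

# Illustrative accession means (hypothetical values)
for name, nfpp, fpi in [("Acc-1", 64, 12), ("Acc-2", 9, 4), ("Acc-3", 30, 7)]:
    print(name, classify_accession(nfpp, fpi))
```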

Keywords: chlorophyll, gene expression, genetic variation, stomata

Procedia PDF Downloads 210
12 Impact of Elevated Temperature on Spot Blotch Development in Wheat and Induction of Resistance by Plant Growth Promoting Rhizobacteria

Authors: Jayanwita Sarkar, Usha Chakraborty, Bishwanath Chakraborty

Abstract:

Plants are constantly interacting with various abiotic and biotic stresses. In the changing climate scenario, plants continuously modify physiological processes to adapt to changing environmental conditions, which profoundly affects plant-pathogen interactions. Spot blotch in wheat is a fast-rising disease in the warmer plains of South Asia, where the rise in minimum average temperature over most of the year is already affecting wheat production. Hence, this study was undertaken to explore the role of elevated temperature in spot blotch disease development and the modulation of antioxidative responses by plant growth promoting rhizobacteria (PGPR) for biocontrol of spot blotch at high temperature. Elevated temperature significantly increases the susceptibility of wheat plants to the spot blotch pathogen Bipolaris sorokiniana. Two PGPR, Bacillus safensis (W10) and Ochrobactrum pseudogrignonense (IP8), isolated from wheat (Triticum aestivum L.) and blady grass (Imperata cylindrica L.) rhizospheres respectively and showing in vitro antagonistic activity against Bipolaris sorokiniana, were tested for growth promotion and induction of resistance against spot blotch in wheat. GC-MS analysis showed that Bacillus safensis (W10) and Ochrobactrum pseudogrignonense (IP8) produced antifungal and antimicrobial compounds in culture. Seed priming with these two bacteria significantly increased growth, modulated antioxidative signaling, induced resistance and eventually reduced disease incidence in wheat plants at optimum as well as elevated temperature, which was further confirmed by indirect immunofluorescence assay using a polyclonal antibody raised against Bipolaris sorokiniana. Application of the PGPR led to enhanced activities of the plant defense enzymes phenylalanine ammonia lyase, peroxidase, chitinase and β-1,3-glucanase in infected leaves. Immunolocalization of chitinase and β-1,3-glucanase in PGPR-primed and pathogen-inoculated leaf tissue was further confirmed by transmission electron microscopy using polyclonal antibodies against chitinase and β-1,3-glucanase with gold-labelled conjugates. Activities of ascorbate-glutathione redox cycle enzymes such as ascorbate peroxidase, superoxide dismutase and glutathione reductase, along with antioxidants such as carotenoids, glutathione and ascorbate, and accumulation of osmolytes such as proline and glycine betaine, also increased during disease development in PGPR-primed plants in comparison to unprimed plants at high temperature. Real-time PCR analysis revealed enhanced expression of the defense genes chalcone synthase and phenylalanine ammonia lyase. Overexpression of heat shock proteins such as HSP 70 and small HSP 26.3, and of the heat shock factor HsfA3, in PGPR-primed plants effectively protected plants against spot blotch infection at elevated temperature as compared with control plants. Our results reveal a dynamic biochemical crosstalk between elevated temperature and spot blotch disease development and, furthermore, highlight a PGPR-mediated array of antioxidative and molecular alterations responsible for induction of resistance against spot blotch disease at elevated temperature, which seems to be associated with up-regulation of defense genes, heat shock proteins and heat shock factors, reduced ROS production and membrane damage, increased expression of redox enzymes, and accumulation of osmolytes and antioxidants.
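The abstract does not state how relative expression was quantified from the real-time PCR data; a common choice for this kind of analysis is the 2^-ΔΔCt method, sketched below with hypothetical Ct values and an assumed reference gene for normalisation.

```python
def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Fold change of a defense gene (e.g. PAL) in PGPR-primed vs unprimed
    plants by the 2^-ddCt method, normalised to a reference gene."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical Ct values, for illustration only
fold = relative_expression(ct_target_treated=24.1, ct_ref_treated=18.0,
                           ct_target_control=26.5, ct_ref_control=18.2)
print(f"Relative expression (fold change): {fold:.2f}")
```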

Keywords: antioxidative enzymes, defense enzymes, elevated temperature, heat shock proteins, PGPR, Real-Time PCR, spot blotch, wheat

Procedia PDF Downloads 139
11 Bicycle Tourism and Sharing Economy (C2C-Tourism): Analysis of the Reciprocity Behavior in the Case of Warmshowers

Authors: Jana Heimel, Franziska Drescher, Lauren Ugur, Graciela Kuchle

Abstract:

Sharing platforms are a widely investigated field; however, there is a research gap, with a lack of focus on 'real' (non-profit-oriented) sharing platforms. This research project addresses that gap by conducting an empirical study on a private peer-to-peer (P2P) network to investigate cooperative behavior from a socio-psychological perspective. In recent years, the shift from possession to access has increasingly influenced different sectors, particularly the travel industry. The number of people participating in hospitality exchange platforms such as Airbnb, Couchsurfing, and Warmshowers (WS) is growing rapidly. WS is an increasingly popular online community linking cycling tourists and locals. It builds on the idea of the 'sharing economy' as a not-for-profit hospitality network for bicycle tourists. Hosts not only provide a sleeping berth and warm shower free of charge but also offer additional services to their guests, such as cooking and washing clothes for them. According to previous studies, they are motivated by the idea of promoting cultural experience and forming new friendships. Trust and reciprocity are assumed to play major roles in the success of such platforms. The objective of this research project is to analyze reciprocity behavior within the WS community. Reciprocity is the act of giving and taking among each other; individuals feel obligated to return a favor and often expect to increase their own chances of receiving future benefits. Consequently, the drivers that incite giving and taking, as well as the motivations of hosts and guests, are examined. The project thus investigates a particular tourism offer that contributes to sustainable tourism by analyzing P2P (or cyclist-to-cyclist, C2C) tourism. C2C tourism is characterized by special hospitality and generosity. To find out what motivations drive the hosts and which determinants drive the sharing cycling economy, an empirical study was conducted globally through an online survey. The data were gathered through the WS community and comprised responses from more than 10,000 cyclists around the globe. In addition to general information, mostly quantitative data on bicycle tourism (year/tour distance, duration and budget), qualitative information on traveling with WS as well as hosting was collected. The most important motivations for a traveler are to explore the local culture, to save money, and to make friends. The main reasons to host a guest are to promote the use of bicycles and to make friends, but also to give back and pay it forward. WS members prefer to stay with and host fellow cyclists. The results indicate that C2C tourists share homogeneous characteristics and a similar philosophy, which is crucial for building mutual trust; members of WS are generally extremely trustful. The study promotes an ecological form of tourism by combining sustainability, regionality, health, experience and local communities' cultures. The empirical evidence found and analyzed, despite evident limitations, enabled us to shed light especially on the issues of motivation and social capital and on the functioning of 'sharing' platforms. The final research results are intended to promote C2C tourism around the globe to further replace conventional tourism with sustainable tourism.

Keywords: bicycle tourism, homogeneity, reciprocity, sharing economy, trust

Procedia PDF Downloads 90
10 The Procedural Sedation Checklist Manifesto, Emergency Department, Jersey General Hospital

Authors: Jerome Dalphinis, Vishal Patel

Abstract:

The Bailiwick of Jersey is an island British Crown Dependency situated off the coast of France. Jersey General Hospital's emergency department sees approximately 40,000 patients a year; the hospital sits outside the NHS, with secondary care free at the point of care. Sedation is a continuum extending from a normal conscious level to being fully unresponsive. Procedural sedation produces a minimally depressed level of consciousness in which the patient retains the ability to maintain an airway and responds appropriately to physical stimulation. Its goals are to improve patient comfort and tolerance of the procedure and to alleviate associated anxiety. Indications can be stratified by acuity: emergency (cardioversion for life-threatening dysrhythmia) and urgent (joint reduction). In the emergency department, this is most often achieved using a combination of opioids and benzodiazepines. Some departments also use ketamine to produce dissociative sedation, a cataleptic state of profound analgesia and amnesia. The response to pharmacological agents is highly individual, and the drugs used occasionally have unpredictable pharmacokinetics and pharmacodynamics, which can result in progression between levels of sedation irrespective of the intention. Practitioners must therefore be able to 'rescue' patients from deeper sedation; these practitioners need to be senior clinicians with advanced airway skills (AAS) training. Incorrectly undertaken, procedural sedation can lead to adverse effects such as dangerous hypoxia and unintended loss of consciousness; studies by the National Confidential Enquiry into Patient Outcome and Death (NCEPOD) have reported avoidable deaths. The Royal College of Emergency Medicine, UK (RCEM) released updated 'Safe Sedation of Adults in the Emergency Department' guidance in 2017, detailing a series of standards for staff competencies and the environment and equipment required for each target sedation depth. The emergency department in Jersey undertook an audit in 2018 to assess its current practice, which showed gaps in clinical competency and the need for uniform care and improved documentation. This spurred the development of a checklist incorporating the above RCEM standards, including contraindications to procedural sedation and difficult airway assessment, which was approved following discussion with the relevant heads of departments and the patient safety directorates. A second audit was then carried out in 2019 with 17 completed checklists (11 relocations of joints, 6 cardioversions). Data were obtained from the controlled resuscitation drugs book containing documented use of ketamine, alfentanil, and fentanyl; TrakCare, the patient electronic record system, was then referenced to obtain further information. The results showed dramatic improvement compared to 2018 and were subdivided into six categories: pre-procedure assessment recording of significant medical history and ASA grade (2-fold increase), informed consent (100% documentation), pre-oxygenation (88%), staff (90% were AAS practitioners), monitoring during the procedure (92% use of non-invasive blood pressure, pulse oximetry, capnography, and cardiac rhythm monitoring), and discharge instructions including the documented return of normal vitals and consciousness (82%). This procedural sedation checklist is a safe intervention that identifies pertinent information about the patient and standardises the delivery of gold-standard care.

Keywords: advanced airway skills, checklist, procedural sedation, resuscitation

Procedia PDF Downloads 90
9 The Distribution of Prevalent Supplemental Nutrition Assistance Program-Authorized Food Store Formats Differ by U.S. Region and Rurality: Implications for Food Access and Obesity Linkages

Authors: Bailey Houghtaling, Elena Serrano, Vivica Kraak, Samantha Harden, George Davis, Sarah Misyak

Abstract:

United States (U.S.) Department of Agriculture Supplemental Nutrition Assistance Program (SNAP) participants are low-income Americans receiving federal dollars for supplemental food and beverage purchases. Participants use a variety of traditional and non-traditional SNAP-authorized stores for household dietary purchases; these stores also represent food access points for all Americans. Importantly, consumers' food and beverage purchases from non-traditional store formats tend to be higher in saturated fats, added sugars, and sodium when compared to purchases from traditional (e.g., grocery/supermarket) formats. Overconsumption of energy-dense, low-nutrient food and beverage products contributes to high obesity rates and adverse health outcomes that differ in severity among urban and rural U.S. locations and high- and low-income populations. Little is known about the SNAP-authorized food store format landscape nationally, regionally, or by urban-rural status, as traditional formats are currently used as the gold standard in food access research. This research utilized publicly available U.S. databases to fill this large literature gap and to provide insight into modes of food access for vulnerable U.S. populations: (1) the SNAP Retailer Locator, which provides a list of all authorized food stores in the U.S., and (2) Rural-Urban Continuum Codes (RUCC), which categorize U.S. counties as urban (RUCC 1-3) or rural (RUCC 4-9). Frequencies were determined for the highest occurring food store formats nationally and within two regionally diverse U.S. states, Virginia in the east and California in the west. Store format codes were assigned (e.g., grocery, drug, convenience, mass merchandiser, supercenter, dollar, club, or other). RUCC was applied to investigate state-level urban-rural differences in prevalent food store formats, and a Chi-Square test of independence was used to determine whether food store format distributions significantly (p < 0.05) differed by region or rurality. The resulting research sample, representing highly prevalent SNAP-authorized food stores nationally, included 41.25% of all SNAP stores in the U.S. (N=257,839), comprised primarily of convenience formats (31.94%), followed by dollar (25.58%), drug (19.24%), traditional (10.87%), supercenter (6.85%), non-food store or restaurant (1.81%), mass merchandiser (1.62%), and club formats (1.09%). Results also indicated that the distribution of prevalent SNAP-authorized formats significantly differed by state. California had a lower proportion of traditional (9.96%) and a higher proportion of drug (28.92%) formats than Virginia (11.55% and 19.97%, respectively; p < 0.001). Virginia also had a higher proportion of dollar formats (26.11%) than California (10.64%) (p < 0.001). Significant differences were also observed for rurality variables (p < 0.001); prominently, rural Virginia had a significantly higher proportion of dollar formats (41.71%) than urban Virginia (21.78%) and rural California (21.21%). Non-traditional SNAP-authorized formats are highly prevalent and differ significantly in distribution by U.S. region and rurality. The largest proportional difference was observed for dollar formats, for which the least nutritious consumer purchases are documented in the literature. Researchers and practitioners should investigate non-traditional food stores at the local level using these findings and similar applied methodologies to determine how access to various store formats impacts obesity prevalence. For example, dollar stores may be prime targets for interventions to enhance nutritious consumer purchases in rural Virginia, while targeting drug formats may be more appropriate in California.
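To make the statistical comparison concrete, the sketch below runs a Chi-Square test of independence on a store-format-by-state contingency table; the counts are invented for illustration and are not the study data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: Virginia, California; columns: convenience, dollar, drug, traditional.
# Counts are hypothetical, chosen only to illustrate the test.
observed = np.array([
    [3200, 2600, 2000, 1150],   # Virginia
    [6400, 2100, 5700, 1960],   # California
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.3g}")
if p_value < 0.05:
    print("Store format distribution differs significantly by state.")
```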

Keywords: food access, food store format, nutrition interventions, SNAP consumers

Procedia PDF Downloads 110
8 Revolutionizing Oil Palm Replanting: Geospatial Terrace Design for High-precision Ground Implementation Compared to Conventional Methods

Authors: Nursuhaili Najwa Masrol, Nur Hafizah Mohammed, Nur Nadhirah Rusyda Rosnan, Vijaya Subramaniam, Sim Choon Cheak

Abstract:

Replanting in oil palm cultivation is vital as it enables the introduction of new planting materials and provides an opportunity to improve the road network, drainage, terrace design, and planting density. Oil palm replanting is fundamentally necessary every 25 years. Adoption of a digital replanting blueprint is imperative, as it can assist the Malaysian oil palm industry in addressing challenges such as labour shortages and limited expertise in replanting tasks. Effective replanting planning should commence at least 6 months prior to the actual replanting process; this study therefore helps to plan and design a replanting blueprint with high-precision translation onto the ground. With the advancement of geospatial technology, thoroughly researched planning is now feasible and can help maximize the potential yield. A blueprint designed before replanting enhances management's ability to optimize the planting programme, address manpower issues, and even increase productivity. In terrace planting blueprints, geographic tools were utilized to design the roads, drainages, terraces, and planting points based on the ARM standards; these designs are mapped with location information and undergo statistical analysis. The geospatial approach is essential in precision agriculture and in ensuring an accurate translation of the design onto the ground by implementing high-accuracy technologies. In this study, geospatial and remote sensing technologies played a vital role: LiDAR data were employed to derive the Digital Elevation Model (DEM), enabling precise selection of terraces, while ortho imagery was used for validation. Throughout the design process, Geographical Information System (GIS) tools were extensively utilized. To assess the design's reliability on the ground compared with the current conventional method, high-precision GPS instruments such as the EOS Arrow Gold and HIPER VR GNSS, both offering accuracy levels between 0.3 cm and 0.5 cm, were used. A nearest-distance analysis was generated to compare the design with actual planting on the ground. The analysis could not be applied to the roads due to discrepancies between the actual roads and the blueprint design, which resulted in minimal variance. In contrast, the terraces closely adhered to the GPS markings, with the largest variance being less than 0.5 meters from the terraces as constructed. Considering the required slope for terrace planting, which must be greater than 6 degrees, the study found that approximately 65% of the terracing was constructed on a 12-degree slope, while over 50% of the terracing was constructed on slopes exceeding the minimum. Blueprint-based replanting offers promising strategies for optimizing land utilization in agriculture; this approach harnesses technology and meticulous planning to yield advantages including increased efficiency, enhanced sustainability, and cost reduction. Practical implementation of this technique can lead to tangible and significant improvements in the agricultural sector. To boost efficiency further, future initiatives will require more sophisticated techniques and the incorporation of precision GPS devices for upcoming blueprint replanting projects, with strategic progression aimed at guaranteeing the precision of both the blueprint design stages and their subsequent implementation in the field. Looking ahead, automating digital blueprints is necessary to reduce time, workforce, and costs in commercial production.
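As a rough sketch of the terrace-suitability screening described above (slopes must exceed 6 degrees), the code below derives a slope map in degrees from a DEM grid and flags suitable cells. The DEM array, cell size and threshold handling are assumptions for illustration, not the study's GIS workflow.

```python
import numpy as np

def slope_degrees(dem: np.ndarray, cell_size: float) -> np.ndarray:
    """Slope map (degrees) from a DEM grid using finite differences.
    dem       : 2-D array of elevations (m)
    cell_size : grid spacing (m)
    """
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Hypothetical 1 m DEM tile: a gentle plane plus a steeper bank
x, y = np.meshgrid(np.arange(50), np.arange(50))
dem = 0.05 * x + np.where(y > 30, 0.3 * (y - 30), 0.0)

slope = slope_degrees(dem, cell_size=1.0)
terrace_candidates = slope > 6.0   # minimum slope for terrace planting
print(f"{terrace_candidates.mean():.0%} of cells exceed the 6-degree threshold")
```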

Keywords: replanting, geospatial, precision agriculture, blueprint

Procedia PDF Downloads 46
7 Optical Coherence Tomography in Differentiation of Acute and Non-Healing Wounds

Authors: Ananya Barui, Provas Banerjee, Jyotirmoy Chatterjee

Abstract:

The application of optical technology in medicine and biology has a long track record. OCT has attracted both engineers and biologists to work together in the field of photonics, establishing a striking non-invasive imaging technology. In contrast to other in vivo imaging modalities such as Raman imaging, confocal imaging and two-photon microscopy, which can image up to only 100-200 microns in depth due to limitations in numerical aperture or scattering, OCT can achieve high-resolution imaging up to a few millimeters into tissue structures, depending on their refractive index at different anatomical locations. This tomographic system depends on the interference of two light waves in an interferometer to produce a depth profile of the specimen. In wound healing, frequent collection of biopsies for follow-up of the repair process could be avoided by such an imaging technique. Real-time skin OCT (the 'optical biopsy') offers deeper and faster illumination of cutaneous tissue to acquire high-resolution cross-sectional images of its internal micro-structure. Swept-Source OCT (SS-OCT), a novel imaging technique, can generate a high-speed depth profile (~2 mm) of a wound at the sweeping rate of the laser, with micron-level resolution and an optimum coherence length of 5-6 mm. Multi-layered skin tissue normally exhibits different optical properties, along with variation in thickness, refractive index and composition (i.e., keratin layer, water, fat, etc.), according to its anatomical location. For instance, the stratum corneum, the uppermost and relatively dehydrated layer of the epidermis, reflects more light and produces a sharper demarcation line against the rest of the hydrated epidermal region. During wound healing or regeneration, the optical properties of cutaneous tissue are continuously altered with maturation of the wound bed. More mature and less hydrated tissue components reflect more light and appear as brighter areas, in comparison to immature regions containing higher amounts of water or fat, which appear as darker areas in OCT images. Non-healing wounds show prolonged inflammation and an inhibited nascent proliferative stage, and accumulation of necrotic tissue also prevents their repair. Owing to its high resolution and its ability to reflect the compositional aspects of tissues through their optical properties, this tomographic method may facilitate differentiating non-healing from acute wounds in addition to clinical observations. Non-invasive OCT offers better insight into the specific biological status of tissue in health and in pathological conditions, and OCT images can be associated with the histo-pathological 'gold standard'. Correlated SS-OCT and microscopic evaluation of the wound edges can provide information regarding progressive healing and maturation of the epithelial components. In the context of searching for an analogy between the two imaging modalities, their relative performance in imaging the healing bed was estimated to probe an alternative approach. The present study validated the utility of SS-OCT in revealing micro-anatomic structure in the healing bed with new information. Exploring the precise correspondence of OCT image features with histo-chemical findings related to the epithelial integrity of regenerated tissue could have great implications; it could establish the 'optical biopsy' as a potent non-invasive diagnostic tool for cutaneous pathology.
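The micron-level depth resolution mentioned above is set by the source bandwidth rather than the numerical aperture. For context (a standard relation not stated in the abstract, with hypothetical example values), the axial resolution for a Gaussian source spectrum is commonly written as:

```latex
% Axial (depth) resolution of OCT for a Gaussian source spectrum:
% \delta z grows with the square of the centre wavelength \lambda_0
% and shrinks with the spectral bandwidth \Delta\lambda.
\delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}
% Example (hypothetical values): \lambda_0 = 1300\,\text{nm},
% \Delta\lambda = 100\,\text{nm} \;\Rightarrow\; \delta z \approx 7.5\,\mu\text{m}.
```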

Keywords: histo-pathology, non invasive imaging, OCT, wound healing

Procedia PDF Downloads 253
6 Modern Cardiac Surgical Outcomes in Nonagenarians: A Multicentre Retrospective Observational Study

Authors: Laurence Weinberg, Dominic Walpole, Dong-Kyu Lee, Michael D’Silva, Jian W. Chan, Lachlan F. Miles, Bradley Carp, Adam Wells, Tuck S. Ngun, Siven Seevanayagam, George Matalanis, Ziauddin Ansari, Rinaldo Bellomo, Michael Yii

Abstract:

Background: There have been multiple recent advancements in the selection, optimization and management of cardiac surgical patients. However, there is limited data regarding the outcomes of nonagenarians undergoing cardiac surgery, despite this vulnerable cohort increasingly receiving these interventions. This study describes the patient characteristics, management and outcomes of a group of nonagenarians undergoing cardiac surgery in the context of contemporary peri-operative care. Methods: A retrospective observational study was conducted of patients 90 to 99 years of age (i.e., nonagenarians) who had undergone cardiac surgery requiring a classic median sternotomy (i.e., open-heart surgery). All operative indications were included. Patients who underwent minimally invasive surgery, transcatheter aortic valve implantation and thoracic aorta surgery were excluded. Data were collected from four hospitals in Victoria, Australia, over an 8-year period (January 2012 – December 2019). The primary objective was to assess six-month mortality in nonagenarians undergoing open-heart surgery and to evaluate the incidence and severity of postoperative complications using the Clavien-Dindo classification system. The secondary objective was to provide a detailed description of the characteristics and peri-operative management of this group. Results: A total of 12,358 adult patients underwent cardiac surgery at the study centers during the observation period, of whom 18 nonagenarians (0.15%) fulfilled the inclusion criteria. The median (IQR) [min-max] age was 91 years (90.0:91.8) [90-94] and 14 patients (78%) were men. Cardiovascular comorbidities, polypharmacy and frailty, were common. The median (IQR) predicted in-hospital mortality by EuroSCORE II was 6.1% (4.1-14.5). All patients were optimized preoperatively by a multidisciplinary team of surgeons, cardiologists, geriatricians and anesthetists. All index surgeries were performed on cardiopulmonary bypass. Isolated coronary artery bypass grafting (CABG) and CABG with aortic valve replacement were the most common surgeries being performed in four and five patients, respectively. Half the study group underwent surgery involving two or more major procedures (e.g. CABG and valve replacement). Surgery was undertaken emergently in 44% of patients. All patients except one experienced at least one postoperative complication. The most common complications were acute kidney injury (72%), new atrial fibrillation (44%) and delirium (39%). The highest Clavien-Dindo complication grade was IIIb occurring once each in three patients. Clavien-Dindo grade IIIa complications occurred in only one patient. The median (IQR) postoperative length of stay was 11.6 days (9.8:17.6). One patient was discharged home and all others to an inpatient rehabilitation facility. Three patients had an unplanned readmission within 30 days of discharge. All patients had follow-up to at least six months after surgery and mortality over this period was zero. The median (IQR) duration of follow-up was 11.3 months (6.0:26.4) and there were no cases of mortality observed within the available follow-up records. Conclusion: In this group of nonagenarians undergoing cardiac surgery, postoperative six-month mortality was zero. Complications were common but generally of low severity. These findings support carefully selected nonagenarian patients being offered cardiac surgery in the context of contemporary, multidisciplinary perioperative care. 
Further studies are needed to assess longer-term mortality as well as functional and quality-of-life outcomes in this vulnerable surgical cohort.

Keywords: cardiac surgery, mortality, nonagenarians, postoperative complications

Procedia PDF Downloads 91
5 A Prospective Neurosurgical Registry Evaluating the Clinical Care of Traumatic Brain Injury Patients Presenting to Mulago National Referral Hospital in Uganda

Authors: Benjamin J. Kuo, Silvia D. Vaca, Joao Ricardo Nickenig Vissoci, Catherine A. Staton, Linda Xu, Michael Muhumuza, Hussein Ssenyonjo, John Mukasa, Joel Kiryabwire, Lydia Nanjula, Christine Muhumuza, Henry E. Rice, Gerald A. Grant, Michael M. Haglund

Abstract:

Background: Traumatic Brain Injury (TBI) is disproportionately concentrated in low- and middle-income countries (LMICs), with the odds of dying from TBI in Uganda more than 4 times higher than in high-income countries (HICs). The disparities in injury incidence and outcome between LMICs and resource-rich settings have led to increased health outcomes research on TBIs and their associated risk factors in LMICs. While TBI studies in LMICs have increased over the last decade, there is still a need for more robust prospective registries. In Uganda, a trauma registry implemented in 2004 at the Mulago National Referral Hospital (MNRH) showed that road traffic injury (RTI) is the major contributor (60%) to overall mortality in the casualty department. While that registry provides information on injury incidence and burden, it is limited in scope, does not follow patients longitudinally throughout their hospital stay, and does not focus specifically on TBIs. Although such retrospective analyses are helpful for benchmarking TBI outcomes, they make it hard to identify specific quality improvement initiatives. The relationships among epidemiology, patient risk factors, clinical care, and TBI outcomes are still relatively unknown at MNRH. Objective: The objectives of this study are to describe the processes of care and determine risk factors predictive of poor outcomes for TBI patients presenting to a single tertiary hospital in Uganda. Methods: Prospective data were collected for 563 TBI patients presenting to a tertiary hospital in Kampala from 1 June to 30 November 2016. Research Electronic Data Capture (REDCap) was used to systematically collect variables spanning 8 categories. Univariate and multivariate analyses were conducted to determine significant predictors of mortality. Results: Of the 563 TBI patients enrolled, 102 (18%) received surgery, 29 (5.1%) were intended for surgery but failed to receive it, and 251 (45%) received non-operative management. Overall mortality was 9.6%, ranging from 4.7% for mild and moderate TBI to 55% for severe TBI patients with GCS 3-5. Within each TBI severity category, mortality differed by management pathway. Variables predictive of mortality were TBI severity, more than one intracranial bleed, failure to receive surgery, high dependency unit admission, ventilator support outside of surgery, and hospital arrival delayed by more than 4 hours. Conclusions: The overall mortality rate of 9.6% for TBI in Uganda is high and likely underestimates the true TBI mortality. Furthermore, the wide-ranging mortality (3-82%), high ICU fatality, and negative impact of care delays suggest shortcomings in current triaging practices. Lack of surgical intervention when needed was highly predictive of mortality in TBI patients. Further research into the determinants of surgical intervention, the quality of step-up care, and prolonged care delays is needed to better understand the complex interplay of variables affecting patient outcome. These insights will guide the development of future interventions and resource allocation to improve patient outcomes.
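The abstract does not name the regression model behind the multivariate analysis; logistic regression on mortality is one standard approach to identifying such predictors, and the sketch below illustrates it with invented data and assumed variable names (not the registry's actual variables or results).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical registry extract (values invented for illustration).
# Columns: severe TBI, >1 intracranial bleed, indicated surgery not received,
#          hospital arrival delayed by more than 4 hours.
X = np.array([
    [1, 1, 1, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
])
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])  # in-hospital death (1 = died)

# Regularised multivariable logistic regression; exponentiated coefficients
# give approximate (regularised) odds ratios for each predictor.
model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(["severe_tbi", "multiple_bleeds", "no_surgery", "delay_gt_4h"],
                      model.coef_[0]):
    print(f"{name}: OR ~ {np.exp(coef):.2f}")
```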

Keywords: care continuum, global neurosurgery, Kampala Uganda, LMIC, Mulago, prospective registry, traumatic brain injury

Procedia PDF Downloads 204
4 Restoring Total Form and Function in Patients with Lower Limb Bony Defects Utilizing Patient-Specific Fused Deposition Modelling: A Neoteric Multidisciplinary Reconstructive Approach

Authors: Divya SY. Ang, Mark B. Tan, Nicholas EM. Yeo, Siti RB. Sudirman, Khong Yik Chew

Abstract:

Introduction: The importance of amalgamating technological and engineering advances with surgical principles of reconstruction cannot be overemphasized. Earlier detection of cancer, together with the consequences of high-speed living and neglect, such as traumatic injuries and infection, is resulting in increasingly younger patients with bone defects. These may result in malformations and suboptimal function that are more noticeable and palpable in a younger, active demographic. Our team proposes a technique that brings together multidisciplinary effort, tissue engineering and reconstructive principles. Methods/Materials: Our patient was a young competitive footballer in his early 30s who was diagnosed with submandibular adenoid cystic carcinoma with bony involvement. He was therefore counselled for a right hemimandibulectomy, floor-of-mouth resection, right selective neck dissection, tracheostomy, and free fibular flap reconstruction of his mandible, and required post-operative radiotherapy. Being young and in his prime sporting years, he was unable to accept the morbidities associated with using his fibula to reconstruct his mandible despite it being the gold-standard reconstructive option. The fibula is an ideal vascularized bone flap because it is reliable and easily shaped, with relatively minimal impact on functional outcomes. The fibula contributes about 30% of weight-bearing and is the attachment for the lateral compartment muscles; it is stronger in footballers with respect to lateral bending. When harvesting the fibula, the distal 6-8 cm and up to 10% of the total length are preserved to maintain ankle stability, thus minimizing the impact on daily activities. Nevertheless, studies have noted post-operative gait variability, so returning to a premorbid competitive level may be doubtful. To improve his functional outcomes, the decision was made to try to restore the fibula's form and function. Using the concept of Fused Deposition Modelling (FDM), our team, comprising Plastics, Otolaryngology, Orthopedics and Radiology, worked with Osteopore to design a 3D bioresorbable implant to regenerate the fibula defect (14.5 cm). Bone marrow was harvested by reaming the contralateral hip prior to the wide resection, and 30 ml of the patient's blood was obtained for extracting platelet-rich plasma. These were packed into the Osteopore 3D-printed bone scaffold, which was then secured into the fibula defect with titanium plates and screws. The flexor hallucis longus and soleus were anchored along the construct and the interosseous membrane, all in a single setting. Results: He was reviewed closely as an outpatient over 10 months post-operatively and reported no discernible loss or difference in ankle function. He is satisfied and back in training, and our team has video and photographs that substantiate his progress. Conclusion: FDM allows regeneration of long bone defects. However, we aimed also to restore the eversion and inversion that are imperative for footballers, and hence reattached his previously dissected muscles along the length of the Osteopore implant. We believe that reattachment of the muscles not only stabilizes the construct but also allows optimum muscle tensioning when moving the ankle. This is a simple but effective technique for restoring complete form and function in a young patient whose fine muscle control is integral to his life.

Keywords: fused deposition modelling, functional reconstruction, lower limb bony defects, regenerative surgery, 3D printing, tissue engineering

Procedia PDF Downloads 44
3 Adequate Nutritional Support and Monitoring in Post-Traumatic High Output Duodenal Fistula

Authors: Richa Jaiswal, Vidisha Sharma, Amulya Rattan, Sushma Sagar, Subodh Kumar, Amit Gupta, Biplab Mishra, Maneesh Singhal

Abstract:

Background: Adequate nutritional support and daily patient monitoring have an independent therapeutic role in the successful management of high-output fistulae and early recovery after abdominal trauma. Case presentation: An 18-year-old girl was brought to the AIIMS emergency department with an alleged history of a heavy weight (an electric motor) falling onto her abdomen. She was evaluated as per Advanced Trauma Life Support (ATLS) protocols and diagnosed with significant abdominal trauma. After stabilization, she was referred to the trauma center. The abdomen was guarded, and focused assessment with sonography for trauma (FAST) was positive. Complete duodenojejunal (DJ) junction transection was found at laparotomy, and end-to-end repair was done. However, the patient was re-explored for biliary peritonitis on post-operative day 3, and an anastomotic leak was found with sloughing of the duodenal end. Resection of non-viable segments was done, followed by side-to-side anastomosis. Unfortunately, the anastomosis leaked again, this time due to a post-anastomotic kink diagnosed on dye study. Given the hostile abdomen, the patient was planned for supportive care, with nutritional build-up and delayed definitive surgery. Percutaneous transhepatic biliary drainage (PTBD) and a split-thickness skin graft (STSG) were also required during the course. Nutrition: In the intensive care unit (ICU), the major goals of nutritional therapy were to improve wound healing, optimize nutrition, minimize enteral-feed-associated complications, reduce biliary fistula output, and prepare the patient for definitive surgery. Feeding jejunostomy (FJ) was started on day 4 at 30 ml/h along with total parenteral nutrition (TPN) and intravenous (IV) micronutrient support. Due to the high bile output, bile refeeding was started on day 13. After 23 days in the ICU, the patient was transferred to the general ward with a body mass index (BMI) < 11 kg/m2 and serum albumin of 1.5 g%. She was received in the ward in a catabolic phase at high risk of refeeding syndrome and was kept on FJ bolus feeds at 30-50 ml/h. After 3-4 days, while maintaining the patient's diet log, it was observed that she refused feeds at night and was becoming less responsive with every passing day. After a few minutes of conversation with the patient over a couple of days, she complained of enteral feed discharge in her urine, mild pain, and signs of dumping syndrome. A dye study was done, which ruled out any enterovesical fistula, and conservative management was planned. At this time, the decision was taken for continuous slow-rate feeding through a commercial feeding pump at 2-3 ml/min. Drastic improvement in gastrointestinal symptoms and the patient's general condition was observed from the second day. The nutritional composition of feed, TPN and diet ranged between 800 and 2100 kcal and 50-95 g protein. After STSG, TPN was stopped. Periodic diet counselling was given to improve oral intake. At the time of discharge, the serum albumin level was 2.1 g%, weight 38.6 kg, and BMI 15.19 kg/m2, and the patient was discharged on an oral diet. Conclusion: Successful management of post-traumatic proximal high-output fistulae is a challenging task due to impaired nutrient absorption and enteral-feed-associated complications. Strategic, goal-based nutrition support can salvage such critically ill patients, as demonstrated in the present case.

Keywords: nutritional monitoring, nutritional support, duodenal fistula, abdominal trauma

Procedia PDF Downloads 232
2 Improving Sanitation and Hygiene Using a Behavioral Change Approach in Public and Private Schools in Kampala, Uganda

Authors: G. Senoga, D. Nakimuli, B. Ndagire, B. Lukwago, D. Kyamagwa

Abstract:

Background: The COVID-19 epidemic affected the education sector, with some private schools closing while other children missed schooling for fear of contracting COVID-19. Post COVID-19, PSIU collaborated with the Kampala City Council Authority (KCCA) Directorate of Education and Social Science, the Water and Sanitation department, and the Directorate of Public Health and Environment to improve sanitation and hygiene among pupils and staff in 50 public and private schools in Kampala city. The "Be Clean, Stay Healthy Campaign" used a behavioral change approach in educating, reinforcing and engaging learners on proper handwashing behaviors, proper toilet usage and garbage disposal. In April 2022, 40 Washa lots were constructed to reduce the pupil-to-handwashing-station ratio, KCCA-approved printed materials were distributed, and 50 teachers and WASH committees were oriented to execute and implement hygiene promotion. To ensure sustainability, WASH messages were memorized and practiced through handwashing songs, a pledge, prayer, poems, skits, music, dance and drama, coupled with participatory, practical demonstrations using a peer-to-peer approach and guest speakers at assemblies and in classes. This improved hygiene and sanitation practices. Premised on this, PSI conducted an endline assessment to explore the impact of the handwashing campaign with regard to improvements in handwashing practices and hand hygiene among pupils, and the accessibility, functionality and usage of the constructed hygiene and sanitation facilities. Method: A cross-sectional post-intervention assessment using a mixed-methods approach was conducted, targeting headteachers, WASH committee members and pupils under 17 years. Quantitative approaches with a mix of open-ended questions were used with purposively selected respondents in 50 schools. Primary three to primary seven pupils were randomly selected, and data were analyzed using the Statistical Package for the Social Sciences (SPSS). Outcomes and Findings: 46,989 pupils (51% female) and 1,127 teaching and 524 non-teaching staff were reached by the intervention. Of the schools trained on sanitation, sustainable water usage and hygiene, 96% constituted 17-member school WASH committees with teacher, parent and pupil representatives; 31% of the WASH committees developed workplans, and 78% held monthly WASH meetings. This resulted in improved sanitation, water usage, waste management and proper use of toilets, and improved pupils' health, with reduced occurrences of stomach upsets and diarrhoea initially attributed to improper use of latrines and poor general waste management. Teachers reported reduced school absenteeism due to improved hygiene and general waste management at school, especially proper management of sanitary pads. School administrations' responsiveness in purchasing hygiene equipment and detergents such as soap improved. Regular WASH meetings in classes, together with teacher and community supervision, ensured that WASH facilities were used appropriately. Conclusion and Recommendations: Practical behaviour change innovations improve pupils' knowledge and understanding of hygiene messages and usage; over 70% of pupils had clear recall of key WASH messages. There is a need for continuous water flow in the Washa lots; harvesting rainwater would reduce water bills while complementing the national water supply, coupled with increasing the number of Washa lots in densely populated schools.

Keywords: handwashing, hygiene, sanitation, behaviour change

Procedia PDF Downloads 53
1 Advancing Dialysis Care Access And Health Information Management: A Blueprint For Nairobi Hospital

Authors: Kimberly Winnie Achieng Otieno

Abstract:

Nairobi Hospital plays a pivotal role in healthcare provision in East and Central Africa, yet it faces challenges in providing accessible dialysis care and managing health information efficiently. This paper explores strategic interventions to enhance dialysis care access and streamline health information management, fostering an integrated and patient-centered healthcare system. Challenges at Nairobi Hospital: The Nairobi Hospital currently grapples with insufficient dialysis machines, resulting in extended turnaround times between dialysis sessions. This issue stems from both staffing bottlenecks and infrastructural limitations, given the growing demand for renal care services. Paper-based records and fragmented information systems hinder the hospital's ability to manage health data effectively, and a lack of integration between the hospital's systems and those of other facilities jeopardizes patient access to care. These inefficiencies hinder collaborative efforts within the healthcare network. Investment in expanding Nairobi Hospital's dialysis facilities into communities is crucial given the high number of new cases of chronic kidney disease. Setting up satellite clinics closer to people who live in areas far from the main hospital will ensure better access; this includes acquiring physical space within the greater Nairobi region and incorporating mobile dialysis units to reach underserved areas. By decentralizing services, Nairobi Hospital can extend its reach and cater to a larger patient population. Community Outreach and Education: Implementing educational programs on kidney health within local communities is vital for early detection and prevention. Collaborating with local leaders and organizations can establish a proactive approach to renal health, hence reducing the demand for acute dialysis interventions; the hospital can amplify this effort by expanding its corporate social responsibility outreach program. Increasing the hospital's footprint would also require an equal ramp-up in staff recruitment, and support for continuous training programs will ensure that healthcare providers stay abreast of evolving practices, contributing to improved patient outcomes and service quality. Streamlining Health Information Management: Fully embracing a shift to 100% Electronic Health Records (EHRs) is a transformative step toward efficient health information management. Customizing these systems to Nairobi Hospital's specific needs allows for seamless data recording, retrieval, and sharing among healthcare professionals, and will help the hospital guarantee a continuum of care for patients transferring from other facilities. A full transition to digital records will also pose its own security threats; ensuring robust security measures protects patient data and builds trust. Adherence to healthcare data privacy regulations is non-negotiable, and a comprehensive strategy for encryption, access controls, and regular audits should be implemented. Integrating systems to enable interoperability with other healthcare providers facilitates a cohesive healthcare network; shared information promotes a holistic understanding of patients' medical history, minimizing redundancies and enhancing overall care quality. Implementation Strategies: To manage the transition to community-based care and EHRs effectively, a phased implementation approach is recommended. Prioritizing dialysis care improvements at a local level in the initial stages allows the hospital to address immediate patient needs, followed by the integration of health information management changes. Engaging hospital staff, patients, and local communities is paramount, and collaboration with government agencies, non-governmental organizations (NGOs), and international partners enhances support and resources for successful implementation. Conclusion: By strategically enhancing dialysis care access and streamlining health information management, Nairobi Hospital can strengthen its position as a leading healthcare institution in East and Central Africa. This comprehensive approach aligns with the hospital's commitment to providing high-quality, accessible, and patient-centered care in the evolving landscape of healthcare delivery.

Keywords: Africa, urology, dialysis, healthcare

Procedia PDF Downloads 18