Search results for: access to care
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6348

18 Resilience Compendium: Strategies to Reduce Communities' Risk to Disasters

Authors: Caroline Spencer, Suzanne Cross, Dudley McArdle, Frank Archer

Abstract:

Objectives: This paper traces the evolution of the Victorian Compendium of Community-Based Resilience Building Case Studies and its capacity to help communities implement activities that encourage adaptation to disaster risk reduction and promote community resilience in rural and urban locations. Background: Between 2012 and 2019, the Monash University Disaster Resilience Initiative (MUDRI) 'Advancing Community Resilience Annual Forums' provided opportunities for community groups to present local resilience activities, share how they solved challenges and what they learned unexpectedly, and be considered for inclusion in the Compendium. A key tenet of the Compendium is the compiling and sharing of grass-roots resilience-building activities to help communities before, during, and after unexpected emergencies. The online Compendium provides free access for anyone wanting to help communities build expertise, reduce program duplication, and save valuable community resources. Identifying case study features across the emergency phases and analyzing critical success factors helps communities understand what did and did not work, achieve success, and avoid known barriers. International exemplars inform the Compendium, which represents an Australian first and enhances Victorian community resilience initiatives. Emergency Management Victoria provided seed funding for the Compendium; MUDRI matched this support and continues to fund the project. A joint steering committee with broad-based user input and human ethics approval guide its continued growth. Methods: A thematic analysis of the Compendium identified case study features, including critical success factors. Results: The Compendium comprises 38 case studies, representing all eight Victorian regions. Case studies addressed the emergency phases before (29), during (7), and after (17) events.
Case studies addressed all hazards (23), bushfires (11), heat (2), fire safety (1), and house fires (1). Twenty case studies used a framework. Thirty received funding, of which nine received less than $20,000 and five received more than $100,000. Twenty-nine addressed a whole-of-community perspective. Case studies revealed unique and valuable learning in diverse settings. Critical success factors included strong governance; board support, leadership, and trust; partnerships; commitment, adaptability, and stamina; and community-led initiatives. Other success factors included a paid facilitator and local government support, external funding, and celebrating success. Anecdotally, we are aware that community groups reference the Compendium and that it adds value to community resilience planning. Discussion: The Compendium offers an innovative contribution to resilience research and practice. It augments the seven resilience characteristics to strengthen and encourage communities outlined in the Statewide Community Resilience Framework for Emergency Management; brings together people from across sectors to deliver distinct yet connected actions to strengthen resilience as part of the Rockefeller-funded Resilient Melbourne Strategy; and supports communities and economies to be resilient when a shock occurs, as identified in the recently published Australian National Disaster Risk Reduction Framework. Each case study offers learning about connecting with communities, increasing their resilience to disaster risks, and keeping them safe from unexpected emergencies. Conclusion: The Compendium enables diverse communities to adopt or adapt proven resilience activities, thereby preserving valuable community resources, and offers the opportunity to extend to a national or international Compendium.

Keywords: case study, community, compendium, disaster risk reduction, resilience

17 Burkholderia cepacia ST 767 Causing a Three-Year Nosocomial Outbreak in a Hemodialysis Unit

Authors: Gousilin Leandra Rocha Da Silva, Stéfani T. A. Dantas, Bruna F. Rossi, Erika R. Bonsaglia, Ivana G. Castilho, Terue Sadatsune, Ary Fernandes Júnior, Vera L. M. Rall

Abstract:

Kidney failure causes decreased diuresis and accumulation of nitrogenous substances in the body. To increase patient survival, hemodialysis is used as a partial substitute for renal function. However, contamination of the water used in this treatment, causing bacteremia in patients, is a worldwide concern. The Burkholderia cepacia complex (Bcc), a group of more than 20 bacterial species, is frequently isolated from hemodialysis water samples and comprises opportunistic bacteria that affect immunosuppressed patients. Its wide variety of virulence factors, together with innate resistance to several antimicrobial agents, contributes to its persistence in the hospital environment and to pathogenesis in the host. The objective of the present work was to molecularly and phenotypically characterize Bcc isolates collected from the water and dialysate of a hemodialysis unit and from the blood of patients at a public hospital in Botucatu, São Paulo, Brazil, between 2019 and 2021. We used 33 Bcc isolates previously obtained from blood cultures of patients with bacteremia undergoing hemodialysis treatment (2019-2021) and 24 isolates obtained from water and dialysate samples in the hemodialysis unit over the same period. The recA gene was sequenced to identify the specific species within the Bcc group. All isolates were tested for the presence of genes that encode virulence factors, including cblA, esmR, zmpA, and zmpB. Considering the epidemiology of the outbreak, the Bcc isolates were molecularly characterized by multilocus sequence typing (MLST) and by pulsed-field gel electrophoresis (PFGE). Biofilm formation on polystyrene microplates was verified and quantified by incubating the isolates at two temperatures (20°C, the average water temperature, and 35°C, the optimal growth temperature for the group).
The antibiogram was performed with disc diffusion tests on agar, using discs impregnated with cefepime (30 µg), ceftazidime (30 µg), ciprofloxacin (5 µg), gentamicin (10 µg), imipenem (10 µg), amikacin (30 µg), sulfamethoxazole/trimethoprim (23.75/1.25 µg), and ampicillin/sulbactam (10/10 µg). The zmpB gene was identified in all isolates, and zmpA in 96.5% of them, while none presented the cblA or esmR genes. The antibiogram of the 33 human isolates indicated that all were resistant to gentamicin, colistin, ampicillin/sulbactam, and imipenem. Sixteen (48.5%) isolates were resistant to amikacin, and lower rates of resistance were observed for meropenem, ceftazidime, cefepime, ciprofloxacin, and piperacillin/tazobactam (6.1%). All isolates were sensitive to sulfamethoxazole/trimethoprim, levofloxacin, and tigecycline. Among the water isolates, resistance was observed only to gentamicin (34.8%) and imipenem (17.4%). According to the PFGE results, all isolates obtained from humans and water belonged to the same pulsotype (1), which was identified by recA sequencing as B. cepacia, belonging to sequence type ST-767. The recovery of a single pulsotype over three years indicates the persistence of this isolate in the pipeline, contaminating patients undergoing hemodialysis despite the routine disinfection of water with peracetic acid. This persistence is probably due to biofilm production, which protects the bacteria from disinfectants. Making this scenario more critical, several isolates proved to be multidrug-resistant (resistant to at least three groups of antimicrobials), making patient care even more difficult.
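The multidrug-resistance criterion used above (resistance to at least three groups of antimicrobials) can be sketched as a simple classification rule. The class groupings and the isolate profile below are illustrative, not the authors' data:

```python
# Map each tested agent to its antimicrobial class (illustrative groupings).
ANTIBIOTIC_CLASS = {
    "gentamicin": "aminoglycoside",
    "amikacin": "aminoglycoside",
    "imipenem": "carbapenem",
    "meropenem": "carbapenem",
    "ceftazidime": "cephalosporin",
    "cefepime": "cephalosporin",
    "ciprofloxacin": "fluoroquinolone",
    "levofloxacin": "fluoroquinolone",
    "ampicillin/sulbactam": "penicillin combination",
    "colistin": "polymyxin",
    "sulfamethoxazole/trimethoprim": "folate pathway inhibitor",
}

def is_multidrug_resistant(resistant_to):
    """True if the isolate resists agents from >= 3 antimicrobial classes."""
    classes = {ANTIBIOTIC_CLASS[drug] for drug in resistant_to}
    return len(classes) >= 3

# Hypothetical isolate matching the profile reported for the human isolates
# (gentamicin, colistin, ampicillin/sulbactam, imipenem): four classes -> MDR.
profile = ["gentamicin", "colistin", "ampicillin/sulbactam", "imipenem"]
print(is_multidrug_resistant(profile))  # -> True
```

Counting distinct classes rather than distinct drugs matters here: resistance to both gentamicin and amikacin, for instance, still counts as one class.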

Keywords: hemodialysis, Burkholderia cepacia, PFGE, MLST, multidrug resistance

16 Improving Data Completeness and Timely Reporting: A Joint Collaborative Effort between Partners in Health and Ministry of Health in Remote Areas, Neno District, Malawi

Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Moses Banda Aron, Julia Higgins, Manuel Mulwafu, Kondwani Mpinga, Mwayi Chunga, Grace Momba, Enock Ndarama, Dickson Sumphi, Atupere Phiri, Fabien Munyaneza

Abstract:

Background: Data is key to supporting health service delivery, as stakeholders, including NGOs, rely on it for effective service delivery, decision-making, and system strengthening. Several studies have generated debate on data quality from national health management information systems (HMIS) in sub-Saharan Africa. Poor quality limits the utilization of data in resource-limited settings, which already struggle to meet standards set by the World Health Organization (WHO). We aimed to evaluate data quality improvement of the Neno district HMIS over a 4-year period (2018-2021) following quarterly data reviews introduced in January 2020 by the district health management team and Partners In Health. Methods: An exploratory mixed-methods design was used to examine reporting rates, followed by in-depth inquiry using key informant interviews (KIIs) and focus group discussions (FGDs). We used the WHO desk-review module to assess the quality of HMIS data in the Neno district captured from 2018 to 2021. The metrics assessed were the completeness and timeliness of 34 reports. Completeness was measured as the percentage of non-missing reports. Timeliness was measured as the percentage of reports submitted by the expected date. We computed t-tests and recorded p-values, summaries, and percentage changes using R and Excel 2016. We analyzed demographics for the key informant interviews in Power BI. We developed themes from 7 FGDs and 11 KIIs using Dedoose software, from which we drew healthcare workers' perceptions, interventions implemented, and suggestions for improvement. The study was reviewed and approved by the Malawi National Health Science Research Committee (IRB: 22/02/2866). Results: Overall, the average reporting completeness rate was 83.4% before and 98.1% after the intervention, while timeliness was 68.1% and 76.4%, respectively. Completeness of reports increased over time: 78.8% in 2018, 88% in 2019, 96.3% in 2020, and 99.9% in 2021 (p < 0.004).
The trend for timeliness had been declining until 2021, when it improved: 68.4% in 2018, 68.3% in 2019, 67.1% in 2020, and 81% in 2021 (p < 0.279). Comparing 2021 reporting rates to the mean of the three preceding years, completeness increased from 88% to 99%, while timeliness increased from 68% to 81%. Sixty-five percent of reports consistently met the national standard of at least 90% for completeness, compared with only 24% for timeliness; thirty-two percent of reports met the national standard overall. Only 9% improved on both completeness and timeliness: the cervical cancer, nutrition care support and treatment, and youth-friendly health services reports. Fifty percent of reports did not improve to standard in timeliness, and only one did not in completeness. Factors associated with improvement included improved communications and reminders through internal channels, data quality assessments, checks, and reviews. Decentralizing data entry to the facility level was suggested to improve timeliness. Conclusion: Findings suggest that data quality in the district HMIS has improved following collaborative efforts. We recommend maintaining such initiatives to identify remaining quality gaps, and that results be shared publicly to support increased use of data. These results can inform the Ministry of Health and its partners about these interventions and guide initiatives for improving data quality.
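A minimal sketch of the two reporting metrics described above: completeness as the percentage of non-missing reports, and timeliness as the percentage submitted on time. The report records below are hypothetical, not the study's data:

```python
def reporting_rate(reports, key):
    """Percentage of reports for which the boolean field `key` is True."""
    return 100.0 * sum(1 for r in reports if r[key]) / len(reports)

# Hypothetical monthly facility reports: flags for whether each expected
# report was submitted at all, and whether it arrived by the deadline.
reports = [
    {"submitted": True, "on_time": True},
    {"submitted": True, "on_time": False},
    {"submitted": True, "on_time": True},
    {"submitted": False, "on_time": False},
]

completeness = reporting_rate(reports, "submitted")  # -> 75.0
timeliness = reporting_rate(reports, "on_time")      # -> 50.0
print(f"completeness {completeness:.1f}%, timeliness {timeliness:.1f}%")
```

Note that timeliness is bounded above by completeness under these definitions: a missing report cannot be on time, which is consistent with the study's timeliness rates trailing its completeness rates.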

Keywords: data quality, data utilization, HMIS, collaboration, completeness, timeliness, decision-making

15 An Engaged Approach to Developing Tools for Measuring Caregiver Knowledge and Caregiver Engagement in Juvenile Type 1 Diabetes

Authors: V. Howard, R. Maguire, S. Corrigan

Abstract:

Background: Type 1 Diabetes (T1D) is a chronic autoimmune disease, typically diagnosed in childhood. T1D puts an enormous strain on families; controlling blood glucose in children is difficult, and the consequences of poor control for patient health are significant. Successful illness management and better health outcomes can depend on the quality of caregiving. On diagnosis, parent-caregivers face a steep learning curve, as T1D care requires a significant level of knowledge to inform complex decision-making throughout the day. The majority of illness management is carried out in the home setting, independent of clinical health providers. Parent-caregivers vary in their level of knowledge and in how far they engage in applying this knowledge in the practice of illness management. Enabling researchers to quantify these aspects of the caregiver experience is key to identifying targets for psychosocial support interventions, which are desirable for reducing stress and anxiety in this highly burdened cohort and for supporting better health outcomes in children. Currently, few tools are available that are designed to capture this information, and those that do exist are not comprehensive and do not adequately capture the lived experience. Objectives: To develop quantitative tools, informed by lived experience, that enable researchers to gather data on parent-caregiver knowledge and engagement, accurately represent the experience of the cohort, and enable exploration of questions that are of real-world value to the cohort themselves. Methods: This research employed an engaged approach to address the problem of quantifying two key aspects of caregiver diabetes management: knowledge and engagement. The research process was multi-staged and iterative.
Stage 1: Working from a constructivist standpoint, the literature was reviewed to identify relevant questionnaires, scales, and single-item measures of T1D caregiver knowledge and engagement, and to harvest candidate questionnaire items. Stage 2: Aggregated findings from the review were circulated among a PPI (patient and public involvement) expert panel of caregivers (n=6) for discussion and feedback. Stage 3: In collaboration with the expert panel, data were interpreted through the lens of lived experience to create a long-list of candidate items for the novel questionnaires, categorized as either 'knowledge' or 'engagement'. Stage 4: A Delphi-method process (iterative surveys) was used to prioritize question items and generate novel questions that further captured the lived experience. Stage 5: Both questionnaires were piloted to refine wording, increase accessibility, and limit socially desirable responding. Stage 6: The tools were then piloted using an online survey distributed through an online peer-support group for caregivers of juveniles with T1D. Ongoing Research: A total of 123 parent-caregivers completed the survey. Data analysis is ongoing to establish face and content validity, both qualitatively and through exploratory factor analysis. Reliability will be established using an alternative-form method, and Cronbach's alpha will assess internal consistency. Work will be completed by early 2024. Conclusion: These tools will enable researchers to gain deeper insights into caregiving practices among parents of juveniles with T1D. Development was driven by lived experience, illustrating the value of engaged research at all levels of the research process.
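The internal-consistency statistic the authors plan to report, Cronbach's alpha, has a closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical Likert-style responses (rows are respondents, columns are questionnaire items; not study data):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha from item-level responses.

    scores: list of rows, one per respondent; each row holds one numeric
    response per questionnaire item.
    """
    k = len(scores[0])                       # number of items
    def var(xs):                             # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly correlated items give the maximum internal consistency:
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

In practice the ~0.7-0.9 range is usually read as acceptable-to-good consistency for a new scale, which is the kind of threshold the planned analysis would check against.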

Keywords: caregiving, engaged research, juvenile type 1 diabetes, quantified engagement and knowledge

14 Transitioning towards a Circular Economy in the Textile Industry: Approaches to Address Environmental Challenges

Authors: Mozhdeh Khalili Kordabadi

Abstract:

Textiles play a vital role in human life, particularly in the form of clothing. However, the alarming rate at which textiles end up in landfills presents a significant environmental risk. With approximately one garbage truck per second being filled with discarded textiles, urgent measures are required to mitigate this trend. Governments and responsible organizations are calling upon various stakeholders to shift from a linear economy to a circular economy model in the textile industry. This article highlights several key approaches that can be undertaken to address this pressing issue. These approaches include the creation of renewable raw material sources, rethinking production processes, maximizing the use and reuse of textile products, implementing reproduction and recycling strategies, exploring redistribution to new markets, and finding innovative means to extend the lifespan of textiles. By adopting these strategies, the textile industry can contribute to a more sustainable and environmentally friendly future. Introduction: Textiles, particularly clothing, are essential to human existence. However, the rapid accumulation of textiles in landfills poses a significant threat to the environment. This article explores the urgent need for the textile industry to transition from a linear economy model to a circular economy model. The linear model, characterized by the creation, use, and disposal of textiles, is unsustainable in the long term. By adopting a circular economy approach, the industry can minimize waste, reduce environmental impact, and promote sustainable practices. This article outlines key approaches that can be undertaken to drive this transition. Approaches to Address Environmental Challenges: Creation of Renewable Raw Material Sources: Exploring and promoting the use of renewable and sustainable raw materials, such as organic cotton, hemp, and recycled fibers, can significantly reduce the environmental footprint of textile production.
Rethinking Production Processes: Implementing cleaner production techniques, optimizing resource utilization, and minimizing waste generation are crucial steps in reducing the environmental impact of textile manufacturing. Maximizing Use and Reuse of Textile Products: Encouraging consumers to prolong the lifespan of textile products through proper care, maintenance, and repair services can reduce the frequency of disposal and promote a culture of sustainability. Reproduction and Recycling Strategies: Investing in innovative technologies and infrastructure to enable efficient reproduction and recycling of textiles can close the loop and minimize waste generation. Redistribution of Textiles to New Markets: Exploring opportunities to redistribute textiles to new and parallel markets, such as resale platforms, can extend their lifecycle and prevent premature disposal. Innovative Means to Extend Textile Lifespan: Encouraging design practices that prioritize durability, versatility, and timeless aesthetics can contribute to prolonging the lifespan of textiles. Conclusion: The textile industry must urgently transition from a linear economy to a circular economy model to mitigate the adverse environmental impact caused by textile waste. By implementing the outlined approaches, such as sourcing renewable raw materials, rethinking production processes, promoting reuse and recycling, exploring new markets, and extending the lifespan of textiles, stakeholders can work together to create a more sustainable and environmentally friendly textile industry. These measures require collective action and collaboration between governments, organizations, manufacturers, and consumers to drive positive change and safeguard the planet for future generations.

Keywords: textiles, circular economy, environmental challenges, renewable raw materials, production processes, reuse, recycling, redistribution, textile lifespan extension

13 Exploring Factors That May Contribute to the Underdiagnosis of Hereditary Transthyretin Amyloidosis in African American Patients

Authors: Kelsi Hagerty, Ami Rosen, Aaliyah Heyward, Nadia Ali, Emily Brown, Erin Demo, Yue Guan, Modele Ogunniyi, Brianna McDaniels, Alanna Morris, Kunal Bhatt

Abstract:

Hereditary transthyretin amyloidosis (hATTR) is a progressive, multi-systemic, and life-threatening disease caused by disruption of the TTR protein, which is produced mainly in the liver and transports thyroxine and retinol. This disruption causes the protein to misfold into amyloid fibrils, which accumulate in the heart, nerves, and GI tract. Over 130 variants in the TTR gene are known to cause hATTR. The Val122Ile variant is the most common in the United States and is seen almost exclusively in people of African descent. TTR variants are inherited in an autosomal dominant fashion and show incomplete penetrance and variable expressivity. Individuals with hATTR may exhibit symptoms from as early as 30 years to as late as 80 years of age. hATTR is characterized by a wide range of clinical symptoms, such as cardiomyopathy, neuropathy, carpal tunnel syndrome, and GI complications. Without treatment, hATTR leads to progressive disease and can ultimately lead to heart failure. hATTR disproportionately affects individuals of African descent; the estimated prevalence of hATTR among Black individuals in the US is 3.4%. Unfortunately, hATTR is often underdiagnosed and misdiagnosed because many symptoms of the disease overlap with those of other cardiac conditions. Given the progressive nature of the disease, its multi-systemic manifestations that can shorten lifespan, and the availability of free genetic testing and promising FDA-approved therapies, early identification of individuals with a pathogenic hATTR variant is important, as it can significantly impact medical management for patients and their relatives. Furthermore, recent literature suggests that TTR genetic testing should be performed in all patients with suspected TTR-related cardiomyopathy, regardless of age, and that follow-up with genetic counseling services is recommended.
Relatives of patients with hATTR benefit from genetic testing because it can identify carriers early and allow relatives to receive regular screening and management. Despite the striking prevalence of hATTR among Black individuals, the disease remains underdiagnosed in this patient population, and germline genetic testing for hATTR among Black individuals appears to be underutilized, though the reasons for this have not yet been brought to light. Historically, Black patients have experienced a number of barriers to seeking healthcare that have been hypothesized to perpetuate the underdiagnosis of hATTR, such as lack of access and mistrust of healthcare professionals. Prior research has described a myriad of factors that shape an individual's decision about whether to pursue presymptomatic genetic testing for a familial pathogenic variant, such as family closeness and communication, family dynamics, and a desire to inform other family members about potential health risks. This study explores these factors through 10 in-depth interviews with patients with hATTR about what may be contributing to the underdiagnosis of hATTR in the Black population. Participants were selected from the Emory University Amyloidosis Clinic based on having a molecular diagnosis of hATTR. Interviews were recorded and transcribed verbatim, then coded using MAXQDA software. Thematic analysis was completed to identify commonalities between participants. On preliminary analysis, several themes have emerged. Barriers identified include i) misdiagnosis and a prolonged diagnostic odyssey, ii) family communication and dynamics surrounding health issues, iii) perceptions of healthcare and one's own health risks, and iv) the need for closer provider-patient relationships and communication. Overall, this study gleaned valuable insight from members of the Black community about possible factors contributing to the underdiagnosis of hATTR, as well as potential solutions for resolving this issue.

Keywords: cardiac amyloidosis, heart failure, TTR, genetic testing

12 Assessing Diagnostic and Evaluation Tools for Use in Urban Immunisation Programming: A Critical Narrative Review and Proposed Framework

Authors: Tim Crocker-Buque, Sandra Mounier-Jack, Natasha Howard

Abstract:

Background: Due to the increasing scale and speed of urbanisation, urban areas in low- and middle-income countries (LMICs) host increasingly large populations of under-immunised children, with the additional associated risks of rapid disease transmission in high-density living environments. Multiple interdependent factors are associated with these coverage disparities in urban areas, and most evidence comes from relatively few countries, predominantly India, Kenya, and Nigeria, with some from Pakistan, Iran, and Brazil. This study aimed to identify, describe, and assess the main tools used to measure or improve coverage of immunisation services in poor urban areas. Methods: Authors used a qualitative review design, including academic and non-academic literature, to identify tools used to improve coverage of public health interventions in urban areas. Authors selected and extracted sources that provided good examples of specific tools, or categories of tools, used in a context relevant to urban immunisation. Diagnostic tools (e.g., for data collection, analysis, and insight generation), programme tools (e.g., for investigating or improving ongoing programmes), and interventions (e.g., multi-component or stand-alone with evidence) were included to provide a range of types and availability of relevant tools. These were then prioritised using a decision-analysis framework, and a tool-selection guide for programme managers was developed. Results: Authors reviewed tools used in urban immunisation contexts and tools designed for (i) non-immunisation and/or non-health interventions in urban areas and (ii) immunisation in rural contexts that had relevance for urban areas (e.g., Reaching Every District/Child/Zone). Many approaches combined several tools and methods, which authors categorised as diagnostic, programme, and intervention.
The most common diagnostic tools were cross-sectional surveys, key informant interviews, focus group discussions, secondary analysis of routine data, and geographical mapping of outcomes, resources, and services. Programme tools involved multiple stages of data collection, analysis, insight generation, and intervention planning, and included guidance documents from WHO (World Health Organisation), UNICEF (United Nations Children's Fund), USAID (United States Agency for International Development), and governments, as well as articles reporting on diagnostics, interventions, and/or evaluations to improve urban immunisation. Interventions involved service improvement, education, reminder/recall, incentives, outreach, and mass media, or were multi-component. The main gaps in existing tools were assessment of macro/policy-level factors, exploration of effective immunisation communication channels, and measurement of in/out-migration. The proposed framework uses a problem-tree approach to suggest tools for five common challenges (i.e., identifying populations, understanding communities, issues with service access and use, improving services, and improving coverage) based on context and available data. Conclusion: This study identified many tools relevant to evaluating urban LMIC immunisation programmes, with significant crossover between tools. This was encouraging in terms of identifying common areas, but problematic in that data volumes, instructions, and activities could overwhelm managers, and tools are not always applied in suitable contexts. Further research is needed on how best to combine tools and methods to suit local contexts. The authors' initial framework can be tested and developed further.

Keywords: health equity, immunisation, low and middle-income countries, poverty, urban health

11 Development of an Omaha System-Based Remote Intervention Program for Work-Related Musculoskeletal Disorders (WMSDs) Among Front-Line Nurses

Authors: Tianqiao Zhang, Ye Tian, Yanliang Yin, Yichao Tian, Suzhai Tian, Weige Sun, Shuhui Gong, Limei Tang, Ruoliang Tang

Abstract:

Introduction: Healthcare workers, especially nurses, are highly vulnerable worldwide to work-related musculoskeletal disorders (WMSDs), experiencing high rates of neck, shoulder, and low back injuries due to unfavorable working conditions. To reduce WMSDs among nursing personnel, many workplace interventions have been developed and implemented. Unfortunately, the ongoing Covid-19 (SARS-CoV-2) pandemic has posed great challenges to ergonomic practices and interventions in healthcare facilities, particularly hospitals, since current Covid-19 mitigation measures, such as social distancing and remote working, have substantially reduced in-person gatherings and trainings. At the same time, hospitals throughout the world have been short-staffed, disturbing shift scheduling and, more importantly, increasing the job demands on the available caregivers, particularly doctors and nurses. With the latest developments in communication technology, remote intervention measures have been developed as an alternative that does not require in-person meetings. The Omaha System (OS) is a standardized classification system for nursing practice, comprising a problem classification system, an intervention system, and an outcome evaluation system. This paper describes the development of an OS-based ergonomic intervention program. Methods: First, a comprehensive literature search was performed in worldwide electronic databases, including PubMed, Web of Science, Cochrane Library, and the China National Knowledge Infrastructure (CNKI), from journal inception to May 2020, yielding a total of 1,418 scientific articles. After two independent screening processes, the final knowledge pool comprised eleven randomized controlled trials, which were used to draft the intervention program with the Omaha intervention subsystem as the framework.
After determination of the sample size needed for statistical power, allowing for potential loss to follow-up, a total of 94 nurses from eight clinical departments provided written informed consent to participate in the study and were randomly assigned to two groups (intervention vs. control). A subgroup of twelve nurses was randomly selected to participate in semi-structured interviews, during which their general understanding and awareness of musculoskeletal disorders and potential interventions were assessed. The first draft was then modified to reflect the findings from these interviews, and the tentative program schedule was also assessed. Next, two rounds of consultation were conducted among experts in nursing management, occupational health, psychology, and rehabilitation to further adjust and finalize the intervention program. The control group had access to all the information and exercise modules at baseline, while an interdisciplinary research team was formed to supervise the implementation of the online intervention program through multiple social media groups. Outcome measures of this comparative study included biomechanical load assessed by the Quick Exposure Check and stresses due to awkward body postures. Results and Discussion: Modifications to the draft included (1) supplementing traditional Chinese medicine practices, (2) adding the use of assistive patient handling equipment, and (3) revising the online training method. The information module is delivered once a week for 20 to 30 minutes over a total of 6 weeks, while the exercise module is delivered 5 times a week, each session lasting 15 to 20 minutes, for 6 weeks.
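The sample-size determination mentioned above can be approximated with the standard normal-approximation formula for comparing two group means, n = 2((z_{1-α/2} + z_{power})/d)² per group. This is a generic sketch, not the authors' actual calculation, and the effect size used is hypothetical:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means, where effect_size is the standardized difference d = delta/sigma.
    Uses the normal approximation (no t-distribution correction)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided alpha
    z_beta = NormalDist().inv_cdf(power)            # desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Hypothetical medium-to-large effect (d = 0.6), 5% two-sided alpha, 80% power:
print(n_per_group(0.6))  # -> 44 per group
```

Inflating such an estimate for anticipated loss to follow-up (e.g., dividing by one minus the expected dropout rate) is the usual next step, which is consistent with the allowance the abstract describes.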

Keywords: ergonomic interventions, musculoskeletal disorders (MSDs), Omaha System, nurses, Covid-19

Procedia PDF Downloads 131
10 Multimodal Integration of EEG, fMRI and Positron Emission Tomography Data Using Principal Component Analysis for Prognosis in Coma Patients

Authors: Denis Jordan, Daniel Golkowski, Mathias Lukas, Katharina Merz, Caroline Mlynarcik, Max Maurer, Valentin Riedl, Stefan Foerster, Eberhard F. Kochs, Andreas Bender, Ruediger Ilg

Abstract:

Introduction: So far, clinical assessments that rely on behavioral responses to differentiate coma states or even predict outcome in coma patients are unreliable, e.g., because of some patients’ motor disabilities. The present study aimed to provide prognosis in coma patients using markers from the electroencephalogram (EEG), blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI) and [18F]-fluorodeoxyglucose (FDG) positron emission tomography (PET). Unsupervised principal component analysis (PCA) was used for multimodal integration of the markers. Methods: With approval from the local ethics committee of the Technical University of Munich (Germany), 20 patients (aged 18-89) with severe brain damage were recruited through intensive care units at the Klinikum rechts der Isar in Munich and at the Therapiezentrum Burgau (Germany). On the day of EEG/fMRI/PET measurement (date I), patients (<3.5 months in coma) were classified as being in the minimally conscious state (MCS) or vegetative state (VS) on the basis of their clinical presentation (Coma Recovery Scale-Revised, CRS-R). Follow-up assessment (date II) was also based on the CRS-R, in a period of 8 to 24 months after date I. At date I, 63-channel EEG (Brain Products, Gilching, Germany) was recorded outside the scanner, and subsequently simultaneous FDG-PET/fMRI was acquired on an integrated Siemens Biograph mMR 3T scanner (Siemens Healthineers, Erlangen, Germany). Power spectral densities, permutation entropy (PE) and symbolic transfer entropy (STE) were calculated in/between frontal, temporal, parietal and occipital EEG channels. PE and STE are based on symbolic time series analysis and have previously been introduced as robust markers separating wakefulness from unconsciousness in EEG during general anesthesia.
While PE quantifies the regularity structure of the neighboring order of signal values (a surrogate of cortical information processing), STE reflects information transfer between two signals (a surrogate of directed connectivity in cortical networks). fMRI analysis was carried out using SPM12 (Wellcome Trust Centre for Neuroimaging, University College London, UK). Functional images were realigned, segmented, normalized and smoothed. PET was acquired for 45 minutes in list mode. For absolute quantification of the brain’s glucose consumption rate in FDG-PET, kinetic modelling was performed with Patlak’s plot method. BOLD signal intensity in fMRI and glucose uptake in PET were calculated in 8 distinct cortical areas. PCA was performed over all markers from EEG/fMRI/PET. Prognosis (persistent VS and deceased patients vs. recovery to MCS/awake from date I to date II) was evaluated using the area under the curve (AUC), including bootstrap confidence intervals (CI; *: p<0.05). Results: Prognosis was reliably indicated by the first component of the PCA (AUC=0.99*, CI=0.92-1.00), which showed a higher AUC than the best single markers (EEG: AUC<0.96*, fMRI: AUC<0.86*, PET: AUC<0.60). The CRS-R did not predict outcome (AUC=0.51, CI=0.29-0.78). Conclusion: In a multimodal analysis of EEG/fMRI/PET in coma patients, PCA led to a reliable prognosis. The impact of this result is evident, as clinical estimates of prognosis are currently unreliable and could be supported by quantitative biomarkers from EEG, fMRI and PET. Due to the small sample size, further investigations are required, in particular ones allowing supervised learning instead of the basic approach of unsupervised PCA.
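The core of the pipeline described above — projecting standardized multimodal markers onto the first principal component and scoring that projection with the AUC — can be sketched schematically as follows. This is not the authors' code; the toy data, marker count, and effect size are invented for illustration only.

```python
import numpy as np

def first_pc_scores(X):
    # Standardize each marker column, then project patients onto the first
    # principal component, computed via SVD of the standardized matrix.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[0]

def auc(scores, labels):
    # AUC as the normalized Mann-Whitney U statistic (ties counted as 0.5).
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Toy example: 20 patients, 24 markers (e.g. 8 regions x 3 modalities);
# label 1 = recovery to MCS/awake, 0 = persistent VS/deceased.
rng = np.random.default_rng(42)
y = np.array([0] * 10 + [1] * 10)
X = rng.normal(size=(20, 24)) + 0.8 * y[:, None]  # outcome shifts the markers
score = first_pc_scores(X)
a = auc(score, y)
a = max(a, 1.0 - a)  # the sign of a principal component is arbitrary
```

In practice one would add bootstrap resampling over patients to obtain the confidence intervals reported in the abstract.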

Keywords: coma states and prognosis, electroencephalogram, entropy, functional magnetic resonance imaging, machine learning, positron emission tomography, principal component analysis

Procedia PDF Downloads 310
9 Synthesis of Chitosan/Silver Nanocomposites: Antibacterial Properties and Tissue Regeneration for Thermal Burn Injury

Authors: B.L. España-Sánchez, E. Luna-Hernández, R.A. Mauricio-Sánchez, M.E. Cruz-Soto, F. Padilla-Vaca, R. Muñoz, L. Granados-López, L.R. Ovalle-Flores, J.L. Menchaca-Arredondo, G. Luna-Bárcenas

Abstract:

The treatment of burn injuries is considered an important clinical problem due to fluid control and the presence of microorganisms during the healing process. Conventional treatment includes antiseptic techniques, topical medication and surgical removal of damaged skin to avoid bacterial growth. In order to accelerate this process, different alternatives for tissue regeneration have been explored, including artificial skin, polymers, hydrogels and hybrid materials. The requirements include a nonreactive organic polymer with high biocompatibility and skin adherence that avoids bacterial infection. Chitin-derived biopolymers such as chitosan (CS) have been used in skin regeneration following third-degree burns. The biological interest of CS is associated with its improvement of tissue cell stimulation, its biocompatibility and its antibacterial properties. In particular, the antimicrobial properties of CS can be significantly increased when it is blended with nanostructured materials. Silver-based nanocomposites have gained attention in medicine due to their high antibacterial activity against pathogens, related to their high surface area/volume ratio at nanomolar concentrations. Silver nanocomposites can be blended or synthesized with chitin-derived biopolymers in order to obtain a biodegradable/antimicrobial hybrid with improved physico-mechanical properties. In this study, nanocomposites based on chitosan/silver nanoparticles (CS/nAg) were synthesized by an in situ chemical reduction method, improving their antibacterial properties against pathogenic bacteria and enhancing the healing process in thermal burn injuries produced in an animal model. CS/nAg was prepared in solution by the chemical reduction method, using AgNO₃ as the precursor. CS was dissolved in acetic acid and mixed with different molar concentrations of AgNO₃: 0.01, 0.025, 0.05 and 0.1 M. Solutions were stirred at 95°C for 20 hours in order to promote nAg formation.
CS/nAg solutions were placed in Petri dishes and dried to obtain films. Structural analyses by UV-Vis and TEM confirmed the synthesis of silver nanoparticles (nAg) with an average size of 7.5 nm and spherical morphology. FTIR analyses showed complex formation through the interaction of hydroxyl and amine groups with the metallic nanoparticles, and surface chemical analysis (XPS) showed a low concentration of Ag⁰/Ag⁺ species. Topographic surface analyses by AFM showed that hydrated CS forms a mesh with an average diameter of 10 µm. Antibacterial activity against S. aureus and P. aeruginosa was improved in all evaluated conditions, across nAg loadings and interaction times. CS/nAg nanocomposite films did not release Ag⁰/Ag⁺ in saline buffer or rat serum after 7 days of exposure. The healing process was significantly enhanced by the presence of CS/nAg nanocomposites, which induced the production of myofibroblasts, collagen remodeling, blood vessel neoformation and epidermis regeneration after 7 days of injury treatment, as shown by histological and immunohistochemistry assays. The present work suggests that hydrated CS/nAg nanocomposites can form a mesh, improving bacterial penetration and contact with the embedded nAg and producing complete growth inhibition after 1.5 hours. Furthermore, CS/nAg nanocomposites improve cell tissue regeneration in thermal burn injuries induced in rats. The synthesis of antibacterial, non-toxic, and biocompatible nanocomposites is an important advance for tissue engineering and health care applications.

Keywords: antibacterial, chitosan, healing process, nanocomposites, silver

Procedia PDF Downloads 259
8 Feasibility and Acceptability of an Emergency Department Digital Pain Self-Management Intervention: A Randomized Controlled Trial Pilot Study

Authors: Alexandria Carey, Angela Starkweather, Ann Horgas, Hwayoung Cho, Jason Beneciuk

Abstract:

Background/Significance: Over 3.4 million acute axial low back pain (aLBP) cases are treated annually in United States (US) emergency departments (EDs). ED patients with aLBP receive varying verbal and written discharge routine care (RC), leading to ineffective patient self-management. Ineffective self-management increases the risk of transition to chronic low back pain (cLBP), a chief cause of disability worldwide, with associated costs >$60 million annually. This research addresses this significant problem by evaluating an ED digital pain self-management intervention (EDPSI) focused on improving self-management through improved knowledge retention, skills, and self-efficacy (confidence) (KSC), thus reducing the aLBP-to-cLBP transition in ED patients discharged with aLBP. The research has significant potential to increase self-efficacy, one of the most potent mechanisms of behavior change, and to improve health outcomes. By focusing on accessibility and usability, the intervention may reduce discharge disparities in aLBP self-management, especially among patients with low health literacy. Study Questions: This research will answer the following questions: 1) Will an EDPSI focused on improving KSC advance patient self-management behaviors and health status? 2) Is the EDPSI sustainable in improving pain severity, interference, and pain recurrence? 3) Will an EDPSI reduce the aLBP-to-cLBP transition in patients discharged with aLBP? Aims: This pilot randomized controlled trial (RCT) assesses the effects of a 12-week digital self-management discharge tool in patients with aLBP. We aim to 1) primarily assess the feasibility (recruitment, enrollment, and retention) and the acceptability and sustainability of the EDPSI for participants' pain self-management; 2) determine the effectiveness and sustainability of the EDPSI on pain severity/interference among participants; and 3) explore patient preferences, health literacy, and changes among participants experiencing the transition to cLBP.
We anticipate that the EDPSI will increase the likelihood of achieving self-management milestones and significantly improve pain-related symptoms in aLBP. Methods: The study uses a two-group pilot RCT to enroll 30 individuals who have been seen in the ED with aLBP. Participants are randomized into RC (n=15) or RC + EDPSI (n=15) and receive follow-up surveys for 12 weeks post-intervention. The EDPSI's innovative content 1) highlights discharge education; 2) provides self-management treatment options; 3) includes actor demonstrations of ergonomics, range-of-motion movements, safety, and sleep; 4) covers complementary and alternative medicine (CAM) options including acupuncture, yoga, and Pilates; and 5) presents combination therapies including thermal application, spinal manipulation, and physical therapy treatments. The intervention group receives booster sessions via Zoom in weeks two and eight to assess and reinforce knowledge retention and to provide return demonstrations reinforcing ergonomics. Outcome Measures: All participants are followed for 12 weeks, with pain severity/interference assessed using the Brief Pain Inventory short form (BPI-sf), self-management (measuring KSC) using the short 13-item Patient Activation Measure (PAM), and self-efficacy using the Pain Self-Efficacy Questionnaire (PSEQ), at weeks 1, 6, and 12. Feasibility is measured by recruitment, enrollment, and retention percentages. Acceptability and education satisfaction are measured using the Education-Preference and Satisfaction Questionnaire (EPSQ) post-intervention. Self-management sustainment is measured using the PSEQ, the PAM, and a patient satisfaction and healthcare utilization (PSHU) measure capturing overall satisfaction, additional healthcare utilization, and pain management related to continued back pain or complications post-injury.

Keywords: digital, pain self-management, education, tool

Procedia PDF Downloads 3
7 Modern Day Second Generation Military Filipino Amerasians and Ghosts of the U.S. Military Prostitution System in West Central Luzon's 'AMO Amerasian Triangle'

Authors: P. C. Kutschera, Elena C. Tesoro, Mary Grace Talamera-Sandico, Jose Maria G. Pelayo III

Abstract:

Second generation military Filipino Amerasians comprise a formidable contemporary segment of the estimated 250,000-plus biracial Amerasians in the Philippines today. Overall, they are a stigmatized and socioeconomically marginalized diaspora; historically, they were abandoned or estranged by U.S. military personnel fathers assigned during the century-long colonial, post-World War II and Cold War era of permanent military basing (1898-1992). Indeed, U.S. military personnel remain stationed in smaller numbers in the Philippines today. This inquiry is an outgrowth of two recent small-sample studies. The first surfaced the impact of the U.S. military prostitution system on the formation of the ‘Derivative Amerasian Family Construct’ among first generation Amerasians; the second, a qualitative case study, suggested the continued destructive impetus of the prostitution system on second generation Amerasians. The intent of the current qualitative, multiple-case study was to actively seek out second generation sex industry toilers. The purpose was to focus further on this human phenomenon in the post-basing and post-military prostitution system eras. As background, the former military prostitution apparatus has transformed into a modern dynamic of rampant sex tourism and prostitution nationwide. This is characterized by hotels and resorts offering unrestricted carnal access; urban and provincial brothels (casas); discos, bars and pickup clubs; massage parlors; local barrio karaoke bars; and street prostitution. A small case study sample (N = 4) of female and male second generation Amerasians was selected. Sample formation employed a non-probability ‘snowball’ technique, drawing respondents from the notorious Angeles, Metro Manila, Olongapo City ‘AMO Amerasian Triangle’, where most former U.S. military installations were sited and modern sex tourism thrives.
A six-month study and analysis of in-depth interviews with female and male sex laborers, their families and peers revealed a litany of disturbing and troublesome experiences. Results showed profiles of debilitating human poverty, histories of family disorganization, stigmatization, social marginalization, and the ghost of the military prostitution system and its harmful legacy on Amerasian family units. Emerging were testimonials of wayward young people ensnared in a maelstrom of deep economic deprivation, familial dysfunction, psychological desperation and societal indifference. The paper recommends that more study is needed and that the unstudied psychosocial and socioeconomic experiences of distressed younger generations of military Amerasians require specific research. Heretofore apathetic or disengaged U.S. institutions need to confront the issue and formulate activist, solution-oriented social welfare, human services and immigration easement policies and alternatives. These institutions specifically include academic and social science research agencies, corporate foundations, the U.S. Congress, and the Departments of State, Defense, Health and Human Services, and Homeland Security (i.e., Citizenship and Immigration Services). It is these institutions that continue to endorse a laissez-faire policy of non-involvement on the entire Filipino Amerasian question. Such apathy, the paper concludes, relegates this consequential but neglected blood progeny to a status of humiliating destitution and exploitation. Amerasians thus remain entrapped in their former colonial and neo-colonial habitat. Ironically, they are unwitting victims of a U.S. American homeland that fancies itself geo-politically a strong and strategic military treaty ally of the Philippines in the Western Pacific.

Keywords: Asian Americans, diaspora, Filipino Amerasians, military prostitution, stigmatization

Procedia PDF Downloads 456
6 Ultra-Rapid and Efficient Immunomagnetic Separation of Listeria Monocytogenes from Complex Samples in High-Gradient Magnetic Field Using Disposable Magnetic Microfluidic Device

Authors: L. Malic, X. Zhang, D. Brassard, L. Clime, J. Daoud, C. Luebbert, V. Barrere, A. Boutin, S. Bidawid, N. Corneau, J. Farber, T. Veres

Abstract:

The incidence of infections caused by foodborne pathogens such as Listeria monocytogenes (L. monocytogenes) poses a great potential threat to public health and safety. These issues are further exacerbated by legal repercussions due to the “zero tolerance” food safety standards adopted in developed countries. Unfortunately, a large number of related disease outbreaks are caused by pathogens present in extremely low counts that are currently undetectable by available techniques. The development of highly sensitive and rapid detection methods for foodborne pathogens is therefore crucial and requires robust and efficient pre-analytical sample preparation. Immunomagnetic separation is a popular approach to sample preparation. Microfluidic chips combined with external magnets have emerged as viable high-throughput methods. However, external magnets alone are not suitable for the capture of nanoparticles, as very strong magnetic fields are required. Devices that combine an externally applied magnetic field with microstructures of a soft magnetic material have thus been used for local field amplification. Unfortunately, the very complex and costly fabrication processes used to integrate soft magnetic materials in the reported proof-of-concept devices would prohibit their use as disposable tools for food and water safety or diagnostic applications. We present a sample preparation magnetic microfluidic device implemented in low-cost thermoplastic polymers using fabrication techniques suitable for mass production. The developed magnetic capture chip (M-chip) was employed for rapid capture and release of L. monocytogenes conjugated to immunomagnetic nanoparticles (IMNs) in buffer and beef filtrate. The M-chip relies on a dense array of nickel-coated high-aspect-ratio pillars for capture with controlled magnetic field distribution and a microfluidic channel network for sample delivery, waste, wash and recovery.
The developed nickel-coating and passivation process allows the generation of switchable local perturbations within the uniform magnetic field produced by a pair of permanent magnets placed at opposite edges of the chip. This leads to a strong and reversible trapping force, wherein high local magnetic field gradients allow efficient capture of IMNs conjugated to L. monocytogenes flowing through the microfluidic chamber. Experimental optimization of the M-chip was performed using commercially available magnetic microparticles and fabricated silica-coated iron-oxide nanoparticles. The fabricated nanoparticles were optimized to achieve the desired magnetic moment, and their surface functionalization was tailored to allow efficient immobilization of the capture antibody. The integration, validation and further optimization of the capture and release protocol are demonstrated using both dead and live L. monocytogenes through fluorescence microscopy and the plate-culture method. The capture efficiency of the chip was found to vary as a function of the Listeria-to-nanoparticle concentration ratio. A maximum capture efficiency of 30% was obtained, and the 24-hour plate-culture method allowed detection from an initial sample concentration of only 16 cfu/ml. The device was also very efficient in concentrating the sample from a 10 ml initial volume: specifically, a concentration efficiency of 280% was achieved in only 17 minutes, demonstrating the suitability of the system for food safety applications. In addition, the flexible design and low-cost fabrication process will allow rapid sample preparation for applications beyond food and water safety, including point-of-care diagnosis.

Keywords: array of pillars, bacteria isolation, immunomagnetic sample preparation, polymer microfluidic device

Procedia PDF Downloads 250
5 A Comprehensive Study of Spread Models of Wildland Fires

Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, the allocation of resources for firefighting efforts, and planning for preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing a fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and advances our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, and cellular automata models, among others. The key characteristics that these models consider include weather (factors such as wind speed and direction), topography (factors such as landscape elevation), and fuel availability (factors such as vegetation type), among others. The models discussed are physics-based, data-driven, or hybrid; some also utilize machine learning techniques such as attention-based neural networks to enhance model performance. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
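Among the model families surveyed, a cellular automaton is the simplest to sketch. The following toy grid model, with hypothetical ignition probabilities that do not correspond to any of the published models named above, illustrates how fuel state, neighborhood, and a wind bias enter such a simulation:

```python
import numpy as np

# Cell states: 0 = unburnt fuel, 1 = burning, 2 = burnt out.
def step(grid, rng, p_base=0.45, upwind=(0, -1), p_wind=0.35):
    # One synchronous update of the automaton; upwind is the offset from a
    # cell to its upwind neighbor (here the wind blows west-to-east).
    new = grid.copy()
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != 0:
                continue  # only fuel cells can ignite
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == 1:
                    # A burning upwind neighbor ignites this cell more easily.
                    p = p_base + (p_wind if (dr, dc) == upwind else 0.0)
                    if rng.random() < p:
                        new[r, c] = 1
                        break
    new[grid == 1] = 2  # burning cells burn out after one time step
    return new

rng = np.random.default_rng(7)
grid = np.zeros((15, 15), dtype=int)
grid[7, 7] = 1  # single ignition point in the center
for _ in range(6):
    grid = step(grid, rng)
burnt = int((grid == 2).sum())  # total burnt area, in cells
```

Real models replace the flat `p_base` with terms for fuel type, slope, and moisture; the structure, a local update rule applied over a landscape grid, is the same.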

Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling

Procedia PDF Downloads 33
4 Translation of Self-Inject Contraception Training Objectives Into Service Performance Outcomes

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Simeon Christian Chukwu, Fidelis Edet, Anthony Nwala, Mopelola Raji

Abstract:

Background: Health service providers are offered in-service training periodically to strengthen their ability to deliver services that are ethical, quality, timely and safe. Not all capacity-building courses have successfully resulted in the intended service delivery outcomes, because of poor training content, design, approach, or ambiance. The Delivering Innovations in Selfcare (DISC) project developed the Moment of Truth innovation, a proven training model focused on improving consumer/provider interaction that leads to an increase in the voluntary uptake of subcutaneous depot medroxyprogesterone acetate (DMPA-SC) self-injection among women who opt for injectable contraception. Methodology: Six months after training on the Moment of Truth (MoT) training manual, the project conducted two intensive rounds of qualitative data collection and triangulation that included provider, client, and community mobilizer interviews, facility observations, and routine program data collection. Respondents were sampled according to a convenience sampling approach, and the data collected were analyzed using a codebook and Atlas.ti. Providers and clients were interviewed to understand their experience, perspective, attitude, and awareness of DMPA-SC self-injection. Data were collected from 12 health facilities in three states: eight directly trained and four cascade-trained. The research team members came together for a participatory analysis workshop to explore and interpret emergent themes. Findings: Quality of service delivery and performance outcomes were observed to be significantly better in facilities whose providers were directly trained by the DISC project than in sites that received indirect training through master trainers. Directly trained facilities recorded self-injection proportions twice those of cascade-trained sites.
Direct training comprised full-day, standalone didactic and interactive sessions constructed to evoke commitment, passion and conviction, as well as to eliminate provider bias and misconceptions, utilizing human interest stories and values clarification exercises. Sessions also built compelling arguments using evidence and national guidelines. The training prioritized demonstration sessions; utilized job aids, particularly videos; strengthened empathetic counseling, allaying client fears and concerns about self-injection; and trained providers to position self-injection first and to manage side effects. Role plays and practicums were particularly useful in enabling providers to retain and internalize new knowledge. These sessions provided experiential learning and the opportunity to apply one's expertise in a supervised environment where supportive feedback was provided in real time. Cascade training was often a shorter, abridged form of the MoT training that leveraged training already planned by master trainers. It was held over a four-hour period and was less emotive, focusing more on foundational DMPA-SC knowledge, such as a reorientation to DMPA-SC, comparison of DMPA-SC variants, the counseling framework and skills, and data reporting and commodity tracking/requisition, with no facility practicums. Training on self-injection was not as robust, presumably because it was not directed at methods in the contraceptive mix that align with state/organizational sponsored objectives; in this instance, fostering LARC services. Conclusion: To achieve better performance outcomes, consideration should be given to providing training that prioritizes practice-based and emotive content. Furthermore, a firm understanding of, and conviction about, the value a training offers improves motivation and commitment to accomplish and surpass service-related performance outcomes.

Keywords: training, performance outcomes, innovation, family planning, contraception, DMPA-SC, self-care, self-injection

Procedia PDF Downloads 52
3 Tackling the Decontamination Challenge: Nanorecycling of Plastic Waste

Authors: Jocelyn Doucet, Jean-Philippe Laviolette, Ali Eslami

Abstract:

The end-of-life management and recycling of polymer wastes remains a key environmental issue in ongoing efforts to increase resource efficiency and attain GHG emission reduction targets. Half of all the plastics ever produced were made in the last 13 years, and only about 16% of plastic waste is collected for recycling, while 25% is incinerated, 40% is landfilled, and 19% is unmanaged and leaks into the environment and waterways. In addition to the collection issue, the UN recently published a report on chemicals in plastics, which adds another layer of challenge when integrating recycled content containing toxic products into new products. Tackling these important issues requires innovative solutions. Chemical recycling of plastics provides new complementary alternatives to the current recycled plastic market by converting waste material into high-value chemical commodities that can be reintegrated into a variety of applications, making the total market size of the output (virgin-like, high-value products) larger than the market size of the input (plastic waste). Access to high-quality feedstock also remains a major obstacle, primarily due to material contamination. Pyrowave approaches this challenge with its innovative nano-recycling technology, which purifies polymers at the molecular level, removing undesirable contaminants and restoring the resin to its virgin state without having to depolymerize it. This breakthrough approach expands the range of plastics that can be effectively recycled, including mixed plastics with contaminants such as lead, inorganic pigments, and flame retardants. The technology achieves residual contaminant levels below 100 ppm, and purity can be adjusted to an infinitesimal level depending on the customer's specifications.
The separation of polymer and contaminants in Pyrowave's nano-recycling process offers the unique ability to tailor the solution to targeted additives and contaminants, which are removed based on differences in molecular size. This precise control enables a final polymer purity equivalent to virgin resin. The patented process involves dissolving the contaminated material in a specially formulated solvent, purifying the mixture at the molecular level, and subsequently extracting the solvent to yield a purified polymer resin that can be directly reintegrated into new products without further treatment. Notably, this technology offers simplicity, effectiveness, and flexibility while minimizing environmental impact and preserving valuable resources in the manufacturing circuit. Pyrowave has successfully applied this nano-recycling technology to decontaminate polymers and supply purified, high-quality recycled plastics to critical industries, including food-contact-compliant applications. The technology is low-carbon, electrified, and provides 100% traceable resins with properties identical to those of virgin resins. Additionally, low recycling rates and the limited market for traditionally hard-to-recycle plastic waste have fueled the need for new complementary alternatives. Chemical recycling, such as Pyrowave's microwave depolymerization, presents a sustainable and efficient solution by converting plastic waste into high-value commodities. By employing microwave catalytic depolymerization, Pyrowave enables a truly circular economy of plastics, particularly in treating polystyrene waste to produce virgin-like styrene monomers. This revolutionary approach boasts low energy consumption, high yields, and a reduced carbon footprint. Pyrowave offers a portfolio of sustainable, low-carbon, electric solutions to give plastic waste a second life and paves the way to the new circular economy of plastics.
Here, focusing on polystyrene, we show that styrene monomer yields from Pyrowave's polystyrene microwave depolymerization reactor are 1.5 to 2.2 times higher than those of conventional thermal pyrolysis. In addition, we provide a detailed understanding of microwave-assisted depolymerization by analyzing the effects of microwave power, pyrolysis time, microwave receptor, and temperature on styrene product yields. Furthermore, we present a life-cycle environmental impact assessment of microwave-assisted pyrolysis of polystyrene at commercial scale. Finally, it is worth pointing out that Pyrowave is able to treat several tons of polystyrene to produce virgin styrene monomers and to manage waste and contaminated polymeric materials in a truly circular economy.

Keywords: nanorecycling, nanomaterials, plastic recycling, depolymerization

Procedia PDF Downloads 38
2 A Study on the Use Intention of Smart Phone

Authors: Zhi-Zhong Chen, Jun-Hao Lu, Jr., Shih-Ying Chueh

Abstract:

Based on the Unified Theory of Acceptance and Use of Technology (UTAUT), this study investigates people's intention to use smartphones, additionally incorporating two new variables: 'self-efficacy' and 'attitude toward using'. Data were collected by questionnaire survey, yielding 240 valid responses. After correlation analysis, reliability testing, ANOVA, t-tests, and multiple regression analysis, the study finds that social impact and self-efficacy have a positive effect on use intention, and that use intention in turn has a positive effect on use behavior.
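The multiple regression step reported above can be sketched in plain Python via the normal equations; the Likert-style data and resulting coefficients below are hypothetical illustrations, not the study's actual 240 survey responses.

```python
# Minimal two-predictor OLS fit; data are hypothetical stand-ins for survey scores.
def ols(X, y):
    """Solve (X'X) b = X'y with Gaussian elimination and back substitution."""
    n, p = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    for col in range(p):  # forward elimination with partial pivoting
        piv = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):  # back substitution
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j] for j in range(i + 1, p))) / xtx[i][i]
    return beta

# Each row: [intercept, social impact score, self-efficacy score] -> use intention
X = [[1, 3, 4], [1, 5, 5], [1, 2, 3], [1, 4, 4], [1, 5, 4], [1, 1, 2]]
y = [4, 5, 3, 4, 5, 2]
b0, b_social, b_efficacy = ols(X, y)
print(b_social > 0 and b_efficacy > 0)  # both slopes come out positive for this toy data
```

Positive coefficients on both predictors mirror the direction of effect the abstract reports; a real analysis would also test significance.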

Keywords: UTAUT, self-efficacy, attitude toward using, use intention, smartphone

Procedia PDF Downloads 375
1 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI

Authors: Brennan Lodge

Abstract:

Cybersecurity professionals have long been embroiled in a digital arms race, confronting increasingly sophisticated threats with innovative solutions. The field of cybersecurity is in an unending race against malicious adversaries. As threats evolve in complexity, the tools used to defend against them need to advance even faster. Burdened with a vast arsenal of tools and an expansive scope of threat intelligence, analysts frequently navigate a complex web, trying to discern patterns amidst information overload. Herein lies the potential of Retrieval Augmented Generation (RAG). By combining the capabilities of Large Language Models (LLMs) with a generative AI facet, RAG brings to the table an unparalleled ability for real-time cross-referencing, bridging the gap between raw data and actionable insights. Imagine an analyst named Sarah working at a global Fortune 500 company. Every day, Sarah navigates a maze of diverse knowledge bases, real-time threat intelligence, and her company's vast proprietary data, from network specifics to intricate technical blueprints. One day, she's challenged by a potential breach through a personal device due to the company's global "Bring Your Own Device" policy. With the clock ticking, Sarah has mere minutes to trace the malware's origin, all while considering complex regional regulations. As she races against the benchmark of Mean Time To Resolution (MTTR), she wonders: Could "Cozy Bear" with its notorious malware tactic, HAMMERTOSS, be behind this? Balancing policy intricacies, global network considerations, and ever-emerging cyber threats, Sarah's role epitomizes the intense challenges faced by today's cybersecurity analysts. While analysts grapple with this array of intricate, time-sensitive challenges, the necessity for precision and efficiency is key. RAG technology—a cutting-edge advancement in Gen AI—is a promising solution. 
Designed to assimilate diverse data sources such as cyber advisory notices, phishing email sentiment, secure and insecure code examples, information security policy documentation, and the MITRE ATT&CK framework, RAG equips analysts with real-time querying capabilities through a vector database and a concise, cross-referenced response from a Gen AI model. Traditional relational databases often necessitate a tedious process of filtering through numerous entries. Now, with the synergy of vector databases and Gen AI models, analysts can rapidly access contextually and semantically akin data points. This augmented approach equips analysts with a comprehensive understanding of the prevailing cyber threats, elevating the robustness of cybersecurity defenses while also upskilling the analyst and the team. Vector databases underpin the knowledge translation in Gen AI. They bridge the gap between raw data and meaningful insights, ensuring that analysts are equipped with comprehensive and relevant information. This capability of the RAG framework, with its depth and precision, finds application across a broad spectrum of cybersecurity challenges. Let's delve into some use cases where its potential becomes particularly evident: Phishing Email Sentiment Analysis: Phishing remains a predominant vector for cybersecurity breaches. Leveraging RAG's capabilities, analysts can not only assess the potential malevolence of an email but also understand the context behind it. By cross-referencing patterns from varied data sources in real time, the detection process evolves from a mere content evaluation to a holistic understanding of attacker tactics, behaviors, and evolving profiles. This allows for the identification of nuanced phishing strategies that might otherwise go undetected. Insecure Code Analysis: Software vulnerabilities form a critical entry point for cyber adversaries. With RAG, the process of code evaluation undergoes a transformation.
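The similarity lookup that a vector database performs can be sketched in a few lines; the advisory snippets and three-dimensional embeddings below are illustrative stand-ins for a real embedding model and database.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "vector database": advisory snippets paired with hand-made embeddings.
# In practice the vectors would come from an embedding model, not be hard-coded.
ADVISORIES = [
    ("HAMMERTOSS uses social media handles to relay C2 instructions.", [0.9, 0.1, 0.0]),
    ("Kerberoasting extracts service-account ticket hashes for offline cracking.", [0.1, 0.9, 0.1]),
    ("Phishing emails often spoof internal HR addresses.", [0.0, 0.2, 0.9]),
]

def retrieve(query_vector, k=1):
    """Return the k advisory snippets most similar to the query embedding."""
    ranked = sorted(ADVISORIES,
                    key=lambda item: cosine_similarity(query_vector, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding close to the Kerberoasting advisory retrieves it first.
print(retrieve([0.2, 0.8, 0.1], k=1)[0])
```

Because ranking is by semantic proximity rather than exact keyword match, this is the step that replaces filtering through numerous relational-database entries.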
Instead of manual code reviews, the system pulls insights from vector databases and historical code snippets marked as insecure, enabling detection of vulnerabilities based on historical patterns, emerging threat vectors, and even predictive threat modeling. This ensures that even the most obfuscated or embedded vulnerabilities are identified, and corrective measures can be promptly implemented. Vulnerability and Upskill Advisory: In the fast-paced world of cybersecurity, staying updated is paramount. Through RAG's capabilities, analysts are not only made aware of real-time vulnerabilities but are also guided on the necessary skills and tools needed to combat them. By dynamically sourcing data through vulnerability advisories, news on advanced persistent threats, and tactics to defend, RAG ensures that analysts are not only reactive to threats but are also proactively upskilled, thereby bolstering their defense mechanisms. Information Security Policies for Compliance Teams: Compliance remains at the heart of many organizational cybersecurity strategies. However, with ever-shifting regulatory landscapes, staying compliant becomes a moving target. RAG's ability to source real-time data ensures that compliance teams always have access to the latest policy changes, guidelines, and best practices. This not only facilitates adherence to current standards but also anticipates future shifts, assists with audits, and ensures that organizations remain ahead of the compliance curve. Fusing a RAG architecture with platforms like Slack amplifies its practical utility. Slack, known for its real-time communication prowess, seamlessly evolves into more than just a messaging platform in this context. Cybersecurity analysts can pose intricate queries within Slack and, almost instantaneously, receive comprehensive feedback powered by the harmonious interplay of RAG and Gen AI. 
This integration effectively transforms Slack into an AI-augmented, chatbot-like assistant for cybersecurity professionals, always ready to provide informed insights on demand, making it an indispensable ally in the ever-evolving cyber battlefield. Navigating the vast landscape of cybersecurity, analysts often encounter unfamiliar terminologies and techniques; they require tools that not only detect or inform them of threats, such as CISA (U.S. Cybersecurity and Infrastructure Security Agency) advisories, but also interpret and communicate them effectively. Consider a junior cybersecurity analyst named Alex, who comes across the term "Kerberoasting" while reviewing a network log. Unfamiliar with its intricacies, Alex turns to Slack to pose a query: "explain what Kerberoasting is, using CISA advisories." Almost instantaneously, Slack, powered by the harmonious interplay of RAG and Gen AI, provides a detailed response, cross-referencing a recent cyber advisory on the technique. It explains how attackers can exploit the Kerberos Ticket Granting Service to decipher service account passwords, potentially compromising a network. In this dynamic realm of cybersecurity, the blend of RAG and generative AI represents more than just a technological leap. It embodies a paradigm shift, promising a future where human expertise and AI-driven precision join forces. As cyber threats continue their relentless advance, this synergy ensures that defenders are equipped with an arsenal that is not just reactive but also profoundly insightful. No longer should analysts be submerged in a deluge of data without direction. Instead, they should be empowered to discern, act, and preempt with unparalleled clarity and confidence. By harmoniously intertwining human discernment with AI capabilities, we can chart a path towards a future where cybersecurity is not just about defense but about achieving a strategic advantage, paving the way to a safer, more informed, and more secure digital horizon.
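The retrieve-then-generate loop behind such an assistant can be sketched as follows; `stub_retriever` and `stub_llm` are toy stand-ins for the vector-database lookup and the Gen AI model, and the advisory text is illustrative, not an actual CISA excerpt.

```python
def build_prompt(question, retrieved_snippets):
    """Assemble a grounded prompt: retrieved advisory text plus the analyst's question."""
    context = "\n".join(f"- {s}" for s in retrieved_snippets)
    return (
        "Answer using only the advisory excerpts below.\n"
        f"Advisories:\n{context}\n"
        f"Question: {question}\n"
    )

def answer(question, retriever, llm):
    """Minimal RAG loop: retrieve context, then let the model generate a grounded reply."""
    snippets = retriever(question)
    return llm(build_prompt(question, snippets))

# Stub components standing in for the vector-database search and the Gen AI model.
def stub_retriever(question):
    if "kerberoasting" in question.lower():
        return ["Advisory (illustrative): Kerberoasting abuses the Kerberos Ticket "
                "Granting Service to obtain crackable service-account password hashes."]
    return []

def stub_llm(prompt):
    # A real model would summarize; the stub just echoes the retrieved context.
    context = prompt.split("Advisories:\n")[1].split("\nQuestion:")[0].strip()
    return "Per the cited advisory: " + context

reply = answer("What is Kerberoasting?", stub_retriever, stub_llm)
print(reply)
```

In a deployed version, the stubs would be replaced by an embedding-based vector search and an LLM call, with the Slack bot forwarding analyst queries to `answer` and posting the reply back to the channel.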

Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies

Procedia PDF Downloads 49