Search results for: border literature
47 A Comparative Evaluation of Cognitive Load Management: Case Study of Postgraduate Business Students
Authors: Kavita Goel, Donald Winchester
Abstract:
In a world of information overload and work complexities, academics often struggle to create an online instructional environment enabling efficient and effective student learning. Research has established that students' learning styles differ; some learn faster when taught using audio and visual methods. Attributes like prior knowledge and mental effort affect their learning. Cognitive load theory holds that learners have limited processing capacity. Cognitive load depends on the learner's prior knowledge, the complexity of content and tasks, and the instructional environment. Hence, the proper allocation of cognitive resources is critical for students' learning. Consequently, a lecturer needs to understand the limits and strengths of human learning processes and the various learning styles of students, and accommodate these requirements while designing online assessments. As acknowledged in the cognitive load theory literature, visual and auditory explanations of worked examples potentially reduce cognitive load (effort) and facilitate learning when compared to conventional sequential text problem solving, helping learners utilize both subcomponents of their working memory. Instructional design changes were introduced at the case site for the delivery of the postgraduate business subjects. To make effective use of auditory and visual modalities, video-recorded lectures and key concept webinars were delivered to students. Videos were prepared to free students' limited working memory from irrelevant mental effort, since all elements on a visual screen can be viewed simultaneously and processed quickly, facilitating greater psychological processing efficiency. Most case study students in the postgraduate programs are adults, working full-time at higher management levels and studying part-time. Their learning styles and needs differ from those of other tertiary students. The purpose of the audio and visual interventions was to lower the students' cognitive load and provide an online environment supportive of their efficient learning. These changes were expected to favourably impact the students' learning experience, academic performance, and retention. This paper posits that these changes to instructional design help students integrate new knowledge into their long-term memory. A mixed methods case study methodology was used in this investigation. Primary data were collected from interviews and surveys of students and academics. Secondary data were collected from the organisation's databases and reports. Some evidence was found that students' academic performance improves when the new instructional design changes are introduced, although the effect was not statistically significant. However, the overall grade distribution of students' academic performance changed and skewed higher, which suggests deeper understanding of the content. Feedback received from students indicated that recorded webinars served as better learning aids than text-only material, especially for more complex content. The recorded webinars on subject content and assessments give students the flexibility to access the material from repositories at any time and as many times as needed, which suits their learning styles. Visual and audio information enters students' working memory more effectively. 
Also, as each assessment included the application of the concepts, conceptual knowledge interacted with the pre-existing schema in long-term memory and lowered students' cognitive load.
Keywords: cognitive load theory, learning style, instructional environment, working memory
Procedia PDF Downloads 143
46 Effect of Black Cumin (Nigella sativa) Extract on Damaged Brain Cells
Authors: Batul Kagalwala
Abstract:
The nervous system is made up of complex, delicate structures such as the spinal cord, peripheral nerves and the brain. These are prone to various types of injury, ranging from neurodegenerative diseases to trauma, leading to diseases like Parkinson's, Alzheimer's, multiple sclerosis, amyotrophic lateral sclerosis (ALS), multiple system atrophy, etc. Unfortunately, because of the complicated structure of the nervous system, spontaneous regeneration, repair and healing are seldom seen, due to which brain damage, peripheral nerve damage and paralysis from spinal cord injury are often permanent and incapacitating. Hence, innovative and standardized approaches are required for the advanced treatment of neurological injury. Nigella sativa (N. sativa), an annual flowering plant native to regions of southern Europe and Asia, has been suggested to have neuroprotective and anti-seizure properties. Neuroregeneration is found to occur in damaged cells when treated using extract of N. sativa. Due to its proven health benefits, many experiments are being conducted to extract all the benefits from the plant. The flowers are delicate and are usually pale blue and white in color, with small black seeds. These seeds are the source of active components such as 30–40% fixed oils, 0.5–1.5% essential oils, and pharmacologically active components including thymoquinone (TQ), dithymoquinone (DTQ) and nigellin. In traditional medicine, this herb was identified to have healing properties and was extensively used in the Middle East and Far East for treating conditions such as headache, back pain, asthma, infections, dysentery, hypertension, obesity and gastrointestinal problems. Literature studies have confirmed that the extract of N. sativa seeds and TQ have inhibitory effects on inducible nitric oxide synthase and the production of nitric oxide, as well as anti-inflammatory and anticancer activities. Experimental investigation will be conducted to understand which ingredient of N. sativa causes neuroregeneration and underlies its healing property. An aqueous/alcoholic extract of N. sativa will be made. Seed oil has also been used by researchers to prepare such extracts. For the alcoholic extracts, the seeds need to be powdered and soaked in alcohol for a period of time, and the alcohol must be evaporated using a rotary evaporator. For aqueous extracts, the powder must be dissolved in distilled water to obtain a pure extract. The mobile phase will be the extract, while a suitable stationary phase (a substance that is a good adsorbent, e.g., silica gel, alumina, cellulose, etc.) will be selected. Different ingredients of N. sativa will be separated using High Performance Liquid Chromatography (HPLC) for treating damaged cells. Damaged brain cells will be treated individually and with different combinations of two or three compounds for different intervals of time. The most suitable compound or combination of compounds for the regeneration of cells will be determined using DOE methodology. Later, the gene will also be determined and, using Polymerase Chain Reaction (PCR), it will be replicated in a plasmid vector. This plasmid vector shall be inserted into the brain of the organism used and replicated within it. The gene insertion can also be done by the gene gun method: the gene in question can be coated on a micro bullet of tungsten and bombarded into the area of interest, and gene replication and coding shall be studied. 
Whether the gene replicates in the organism or not will also be examined.
Keywords: black cumin, brain cells, damage, extract, neuroregeneration, PCR, plasmids, vectors
Procedia PDF Downloads 656
45 Amphiphilic Compounds as Potential Non-Toxic Antifouling Agents: A Study of Biofilm Formation Assessed by Micro-titer Assays with Marine Bacteria and Eco-toxicological Effect on Marine Algae
Authors: D. Malouch, M. Berchel, C. Dreanno, S. Stachowski-Haberkorn, P-A. Jaffres
Abstract:
Biofilm is a predominant lifestyle chosen by bacteria. Whether it develops on an immersed surface or as mobile aggregates known as flocs, bacteria within this form of life show properties different from their planktonic counterparts. Within the biofilm, the self-formed matrix of Extracellular Polymeric Substances (EPS) offers hydration, resource capture, enhanced resistance to antimicrobial agents, and allows cell communication. Biofouling is a complex natural phenomenon that involves biological, physical and chemical properties related to the environment, the submerged surface and the living organisms involved. Bio-colonization of artificial structures can cause various economic and environmental impacts. The increase in costs associated with the over-consumption of fuel by biocolonized vessels has been widely studied. Measurement drift from submerged sensors, as well as obstructions in heat exchangers and deterioration of offshore structures, are major difficulties that industries are dealing with. Therefore, surfaces that inhibit biocolonization are required in different areas (water treatment, marine paints, etc.), and many efforts have been devoted to producing efficient and eco-compatible antifouling agents. The different steps of surface fouling are widely described in the literature. Studying the biofilm and its stages provides a better understanding of how to elaborate more efficient antifouling strategies. Several approaches are currently applied, such as the use of biocide anti-fouling paints (mainly with copper derivatives) and super-hydrophobic coatings. While these two processes are proving to be the most effective, they are not entirely satisfactory, especially in a context of changing legislation. Nowadays, the challenge is to prevent biofouling with non-biocide compounds, offering a cost-effective solution but with no toxic effects on marine organisms. Since the micro-fouling phase plays an important role in the regulation of the following steps of biofilm formation, it is desirable to reduce or delay biofouling of a given surface by inhibiting micro-fouling at its early stages. In our recent works, we reported that some amphiphilic compounds exhibited bacteriostatic or bactericidal properties at a concentration that did not affect eukaryotic cells. These remarkable properties invited us to assess this type of bio-inspired phospholipid to prevent the colonization of surfaces by marine bacteria. Of note, other studies reported that amphiphilic compounds interacted with bacteria, leading to a reduction of their development. An amphiphilic compound is a molecule consisting of a hydrophobic domain and a polar head (ionic or non-ionic). These compounds appear to have interesting antifouling properties: some ionic compounds have shown antimicrobial activity, and zwitterions can reduce nonspecific adsorption of proteins. Herein, we investigate the potential of amphiphilic compounds as inhibitors of bacterial growth and marine biofilm formation. The aim of this study is to compare the efficacy of four synthetic phospholipids that feature a cationic charge (BSV36, KLN47) or a zwitterionic polar-head group (SL386, MB2871) in preventing microfouling by marine bacteria. 
We also study the toxicity of these compounds in order to identify the most promising compound, which must combine high anti-adhesive properties with low cytotoxicity towards two representative links of coastal marine food webs: phytoplankton and oyster larvae.
Keywords: amphiphilic phospholipids, bacterial biofilm, marine microfouling, non-toxic antifouling
Procedia PDF Downloads 145
44 Laying the Proto-Ontological Conditions for Floating Architecture as a Climate Adaptation Solution for Rising Sea Levels: Conceptual Framework and Definition of a Performance Based Design
Authors: L. Calcagni, A. Battisti, M. Hensel, D. S. Hensel
Abstract:
Since the beginning of the 21st century, we have seen a dynamic growth of water-based (WB) architecture, mainly due to the increasing threat of floods caused by sea level rise and heavy rains, all correlated with climate change. At the same time, the shortage of land available for urban development also led architects, engineers, and policymakers to reclaim the seabed or to build floating structures. Furthermore, the drive to produce energy from renewable resources has expanded the sector of offshore research, mining, and energy industry, which seeks new types of WB structures. In light of these considerations, the time is ripe to consider floating architecture as a full-fledged building typology. Currently, there is no universally recognized academic definition of a floating building. Research on floating architecture lacks a proper, commonly shared vocabulary and typology distinction. Moreover, there is no global international legal framework for urban development on water, and there is no structured performance based building design (PBBD) approach for floating architecture in most countries, let alone national regulatory systems. Thus, first of all, the research intends to overcome the semantic and typological issues through the conceptualization of floating architecture, laying the proto-ontological conditions for floating development, and secondly to identify the parameters to be considered in the definition of a specific PBBD framework, setting the scene for national planning strategies. The theoretical overview and re-semanticization process involve the attribution of a new meaning to the term floating architecture. This terminological work of semantic redetermination is carried out through a systematic literature review and involves quantitative and historical research as well as logical argumentation methods. As it is expected that floating urban development is most likely to take place as an extension of coastal areas, the needs and design criteria are definitely more similar to those of the urban environment than to those of the offshore industry. Therefore, the identification and categorization of parameters, looking towards the potential formation of a PBBD framework for floating development, takes the urban and architectural guidelines and regulations as the starting point, drawing the missing aspects, such as hydrodynamics (i.e., stability and buoyancy), from the offshore and shipping regulatory frameworks. This study is carried out through an evidence-based assessment of regulatory systems that are effective in different countries around the world, addressing on-land and on-water architecture as well as the offshore and shipping industries. It involves evidence-based research and logical argumentation methods. Overall, inhabiting water is proposed not only as a viable response to the problem of rising sea levels, and thus as a resilient frontier for urban development, but also as a response to energy insecurity, clean water and food shortages, environmental concerns, and urbanization, in line with Blue Economy principles and the Agenda 2030. This review shows how floating architecture is, to all intents and purposes, an urban adaptation measure and a solution towards self-sufficiency and energy-saving objectives. Moreover, the adopted methodology is open to further improvements and integrations, and thus not rigid or completely predetermined. 
Along with new designs and functions that will come into play in practice, life on water will eventually seem no more unusual than life on land, especially by virtue of the multiple advantages it provides not only to users but also to the environment.
Keywords: adaptation measures, building typology, floating architecture, performance based building design, rising sea levels
Procedia PDF Downloads 96
43 Working at the Interface of Health and Criminal Justice: An Interpretative Phenomenological Analysis Exploration of the Experiences of Liaison and Diversion Nurses – Emerging Findings
Authors: Sithandazile Masuku
Abstract:
Introduction: Public health approaches to offender mental health are driven by international policies and frameworks in response to the disproportionately large representation of people with mental health problems within the offender pathway compared to the general population. Public health service innovations include mental health courts in the US, restorative models in Singapore, and liaison and diversion services in Australia, the UK, and some other European countries. Mental health nurses are at the forefront of offender health service innovations. In the UK context, police custody has been identified as an early point within the offender pathway where nurses can improve outcomes by offering assessments and sharing information with criminal justice partners. This scope of nursing practice has introduced challenges related to the skills and support required for nurses working at the interface of health and the criminal justice system. Parallel literature exploring the experiences of nurses working in forensic settings suggests the presence of compassion fatigue, burnout and vicarious trauma, which may pose a risk of harm to the nurses in these settings. Published research explores mainly service-level outcomes, including monitoring of figures indicative of a reduction in offending behavior. There is minimal research exploring the experiences of liaison and diversion nurses, who are situated away from a supportive clinical environment and engaged in complex autonomous decision-making. Aim: This paper will share qualitative findings (in progress) from a PhD study that aims to explore the experiences of liaison and diversion nurses in one service in the UK. Methodology: This is a qualitative interview study conducted using an Interpretative Phenomenological Analysis to gain an in-depth analysis of lived experiences. Methods: A purposive sampling technique was used to recruit n=8 mental health nurses, registered with the UK professional body, the Nursing and Midwifery Council, from one UK liaison and diversion service. All participants were interviewed online via video call using a semi-structured interview topic guide. Data were recorded and transcribed verbatim. Data were analysed using the seven steps of the Interpretative Phenomenological Analysis data analysis method. Emerging Findings: Analysis to date has identified pertinent themes: • Difficulties of meaning-making for nurses because of the complexity of their boundary-spanning role. • Emotional burden experienced in a highly emotive and fast-changing environment. • Stress and difficulties with role identity impacting on individual nurses' ability to be resilient. • Challenges to wellbeing related to a sense of isolation when making complex decisions. Conclusion: Emerging findings have highlighted the lived experiences of nurses working in liaison and diversion as challenging. The nature of the custody environment has an impact on role identity and decision making. Nurses left feeling isolated and unsupported are less resilient and may go on to experience compassion fatigue. The findings from this study thus far point to a need to connect nurses working in these boundary-spanning roles with a supportive infrastructure where the complexity of their role is acknowledged and they can be connected with a health agenda. In doing this, the nurses would be protected from harm, and the likelihood of sustained positive outcomes for service users is optimised.
Keywords: liaison and diversion, nurse experiences, offender health, staff wellbeing
Procedia PDF Downloads 133
42 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of the development of tumors. Tumor development prognosis can now be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to Computational Biology research studies with application to clinical actions in Medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. These aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to be explored up to its computational limits. There have been some approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improved performance of these models by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, …) that holds crucial data in simulations. In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up achieved by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in Java and C++ on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Polesczuk and Enderling model (normally used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. 
Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
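To make the executor (thread-pool) pattern described in this abstract more concrete, the following minimal Python sketch updates a toy stochastic proliferation CA synchronously, dispatching row strips to a pool of workers. It is only an illustration of the parallelization pattern, not the authors' Java/C++ implementation; the proliferation rule, strip handling and all parameters are assumptions.

```python
# Minimal sketch (assumed rule and parameters, not the authors' model): a toy
# stochastic proliferation CA updated synchronously, with row strips handed to a
# thread pool in the spirit of Java executors. For CPU-bound pure-Python rules,
# a ProcessPoolExecutor would avoid the GIL and give real speed-up.
import random
from concurrent.futures import ThreadPoolExecutor

import numpy as np

P_DIVIDE = 0.3  # assumed probability that an occupied cell attempts division


def update_strip(grid, row_start, row_end):
    """Compute the next state of rows [row_start, row_end) from the current grid."""
    n_cols = grid.shape[1]
    new_strip = grid[row_start:row_end].copy()
    for i in range(row_start, row_end):
        for j in range(n_cols):
            if grid[i, j] == 1 and random.random() < P_DIVIDE:
                # try to place a daughter cell in a random free von Neumann neighbor
                neighbors = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                random.shuffle(neighbors)
                for ni, nj in neighbors:
                    if (0 <= nj < n_cols and row_start <= ni < row_end
                            and grid[ni, nj] == 0):
                        new_strip[ni - row_start, nj] = 1  # write only inside own strip
                        break
    return row_start, new_strip


def step(grid, executor, n_workers):
    """One synchronized time step: strips are computed in parallel, then merged."""
    bounds = np.linspace(0, grid.shape[0], n_workers + 1, dtype=int)
    futures = [executor.submit(update_strip, grid, bounds[k], bounds[k + 1])
               for k in range(n_workers)]
    new_grid = grid.copy()
    for fut in futures:
        start, strip = fut.result()
        new_grid[start:start + strip.shape[0]] = np.maximum(
            new_grid[start:start + strip.shape[0]], strip)
    return new_grid


if __name__ == "__main__":
    grid = np.zeros((200, 200), dtype=np.uint8)
    grid[100, 100] = 1  # a single transformed cell seeds the tumor
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(50):
            grid = step(grid, pool, n_workers=4)
    print("tumor cells after 50 steps:", int(grid.sum()))
```

The barrier after each step (collecting all futures before the next update) is the kind of synchronization point the abstract refers to when discussing speed-up limits.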
Procedia PDF Downloads 242
41 Internet of Assets: A Blockchain-Inspired Academic Program
Authors: Benjamin Arazi
Abstract:
Blockchain is the technology behind cryptocurrencies like Bitcoin. It revolutionizes the meaning of trust in the sense of offering total reliability without relying on any central entity that controls or supervises the system. The Wall Street Journal states: "Blockchain Marks the Next Step in the Internet's Evolution". Blockchain was listed as #1 in LinkedIn's The Learning Blog list of "most in-demand hard skills needed in 2020". As stated there: "Blockchain's novel way to store, validate, authorize, and move data across the internet has evolved to securely store and send any digital asset". GSMA, a leading telco organization of mobile communications operators, declared that "Blockchain has the potential to be for value what the Internet has been for information". Motivated by these seminal observations, this paper presents the foundations of a Blockchain-based "Internet of Assets" academic program that joins under one roof leading application areas characterized by the transfer of assets over communication lines. Two such areas, which are pillars of our economy, are Fintech (financial technology) and mobile communications services. The next application in line is healthcare. These challenges are met based on available extensive professional literature. Blockchain-based asset communication extends the principle of Bitcoin, starting with the basic question: if digital money that travels across the universe can 'prove its own validity', can this principle be applied to digital content? A groundbreaking positive answer here led to the concept of the "smart contract" and consequently to DLT - Distributed Ledger Technology, where the word 'distributed' relates to the non-existence of reliable central entities or trusted third parties. The terms Blockchain and DLT are frequently used interchangeably in various application areas. The World Bank Group compiled comprehensive reports analyzing the contribution of DLT/Blockchain to Fintech. The European Central Bank and Bank of Japan are engaged in Project Stella, "Balancing confidentiality and auditability in a distributed ledger environment". 130 DLT/Blockchain-focused Fintech startups are now operating in Switzerland. Blockchain's impact on mobile communications services is treated in detail by leading organizations. The TM Forum is a global industry association in the telecom industry, with over 850 member companies, mainly mobile operators, that generate US$2 trillion in revenue and serve five billion customers across 180 countries. From their perspective: "Blockchain is considered one of the digital economy's most disruptive technologies". Samples of Blockchain contributions to Fintech (taken from a World Bank document): decentralization and disintermediation; greater transparency and easier auditability; automation and programmability; immutability and verifiability; gains in speed and efficiency; cost reductions; enhanced cyber security resilience. Samples of Blockchain contributions to the telco industry: establishing identity verification; record of transactions for easy cost settlement; automatic triggering of roaming contracts, which enables near-instantaneous charging and a reduction in roaming fraud; decentralized roaming agreements; settling accounts per costs incurred in accordance with agreement tariffs. This clearly demonstrates an academic education structure where fundamental technologies are studied in classes together with these two application areas. Advanced courses treating specific implementations then follow separately. 
All are under the roof of "Internet of Assets".
Keywords: blockchain, education, financial technology, mobile telecommunications services
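As a concrete illustration of the "prove its own validity" principle the abstract builds on, here is a minimal, hypothetical Python sketch of hash-linked blocks. The field names and toy asset transfers are invented for illustration; this is not a production blockchain or any specific course material.

```python
# Minimal sketch of hash linking: each block commits to its payload and to the
# previous block's hash, so tampering anywhere breaks verification downstream.
import hashlib
import json


def block_hash(block):
    # Serialize deterministically before hashing
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def append_block(chain, asset_transfer):
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "payload": asset_transfer, "prev_hash": prev_hash})


def chain_is_valid(chain):
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))


ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "asset": "10 tokens"})
append_block(ledger, {"from": "bob", "to": "carol", "asset": "roaming settlement"})
print(chain_is_valid(ledger))                    # True
ledger[0]["payload"]["asset"] = "1000 tokens"    # tamper with history
print(chain_is_valid(ledger))                    # False
```

Real DLT systems add consensus, signatures and smart-contract execution on top of this linking idea; the sketch only shows why a digital asset's history can be checked without a central authority.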
Procedia PDF Downloads 179
40 Utilizing Extended Reality in Disaster Risk Reduction Education: A Scoping Review
Authors: Stefano Scippo, Damiana Luzzi, Stefano Cuomo, Maria Ranieri
Abstract:
Background: In response to the rise in natural disasters linked to climate change, numerous studies on Disaster Risk Reduction Education (DRRE) have emerged since the '90s, mainly using a didactic transmission-based approach. Effective DRRE should align with an interactive, experiential, and participatory educational model, which can be costly and risky. A potential solution is using simulations facilitated by eXtended Reality (XR). Research Question: This study aims to conduct a scoping review to explore educational methodologies that use XR to enhance knowledge among teachers, students, and citizens about environmental risks, natural disasters (including climate-related ones), and their management. Method: A search string of 66 keywords was formulated, spanning three domains: 1) education and target audience, 2) environment and natural hazards, and 3) technologies. On June 21st, 2023, the search string was used across five databases: EBSCOhost, IEEE Xplore, PubMed, Scopus, and Web of Science. After deduplication and removing papers without abstracts, 2,152 abstracts (published between 2013 and 2023) were analyzed and 2,062 papers were excluded, followed by the exclusion of 56 papers after full-text scrutiny. Excluded studies focused on unrelated technologies, non-environmental risks, and lacked educational outcomes or accessible texts. Main Results: The 34 reviewed papers were analyzed for context, risk type, research methodology, learning objectives, XR technology use, outcomes, and educational affordances of XR. Notably, since 2016, there has been a rise in scientific publications, focusing mainly on seismic events (12 studies) and floods (9), with a significant contribution from Asia (18 publications), particularly Japan (7 studies). Methodologically, the studies were categorized into empirical (26) and non-empirical (8). Empirical studies involved user or expert validation of XR tools, while non-empirical studies included systematic reviews and theoretical proposals without experimental validation. Empirical studies were further classified into quantitative, qualitative, or mixed-method approaches. Six qualitative studies involved small groups of users or experts, while 20 quantitative or mixed-method studies used seven different research designs, with most (17) employing a quasi-experimental, one-group post-test design, focusing on XR technology usability over educational effectiveness. Non-experimental studies had methodological limitations, making their results hypothetical and in need of further empirical validation. Educationally, the learning objectives centered on knowledge and skills for surviving natural disaster emergencies. All studies recommended XR technologies for simulations or serious games but did not develop comprehensive educational frameworks around these tools. XR-based tools showed potential superiority over traditional methods in teaching risk and emergency management skills. However, conclusions were more valid in studies with experimental designs; otherwise, they remained hypothetical without empirical evidence. The educational affordances of XR, mainly user engagement, were confirmed by the studies. Authors’ Conclusions: The analyzed literature lacks specific educational frameworks for XR in DRRE, focusing mainly on survival knowledge and skills. 
There is a need to expand educational approaches to include uncertainty education, developing competencies that encompass knowledge, skills, and attitudes like risk perception.
Keywords: disaster risk reduction education, educational technologies, scoping review, XR technologies
Procedia PDF Downloads 22
39 Autologous Blood for Conjunctival Autograft Fixation in Primary Pterygium Surgery: A Systematic Review and Meta-Analysis
Authors: Mohamed Abdelmongy
Abstract:
Hossam Zein, Ammar Ismail, Mohamed Abdelmongy, Sherif Elsherif, Ahmad Hassanen, Basma Muhammad, Fathy Assaf, Ahmed Elsehili, Ahmed Negida, Shin Yamane, Mohamed M. Abdel-Daim and Kazuaki Kadonosono (https://www.ncbi.nlm.nih.gov/pubmed/30277146) BACKGROUND: Pterygium is a benign ocular lesion characterized by triangular fibrovascular growth of conjunctival tissue over the cornea. Patients complain of the poor cosmetic appearance, ocular surface irritation and decreased visual acuity if the pterygium is large enough to cause astigmatism or encroach on the pupil. The definitive treatment of pterygium is surgical removal. However, outcomes are compromised by recurrence. The aim of the current study is to systematically review the current literature to explore the efficacy and safety of fibrin glue, sutures and autologous blood coagulum for conjunctival autograft fixation in primary pterygium surgery. OBJECTIVES: To assess the effectiveness of fibrin glue compared to sutures and autologous blood coagulum in conjunctival autografting for the surgical treatment of pterygium. METHODS: While preparing this manuscript, we followed the steps described in the Cochrane Handbook for Systematic Reviews of Interventions version 5.3 and reported it according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement guidelines. We searched PubMed, Ovid (both through Medline), ISI Web of Science, and the Cochrane Central Register of Controlled Trials (CENTRAL) through January 2017, using the following keywords: "Pterygium AND (blood OR glue OR suture)". SELECTION CRITERIA: We included all randomized controlled trials (RCTs) that met the following criteria: 1) comparing autologous blood vs fibrin glue for conjunctival autograft fixation in primary pterygium surgery; 2) comparing autologous blood vs sutures for conjunctival autograft fixation in primary pterygium surgery. DATA COLLECTION AND ANALYSIS: Two review authors independently screened the search results, assessed trial quality, and extracted data using standard methodological procedures expected by Cochrane. The extracted data included A) study design, sample size, and main findings; B) baseline characteristics of patients included in this review, including their age, sex, pterygium site and grade, and graft size; C) study outcomes comprising 1) primary outcome: recurrence rate; 2) secondary outcomes: graft stability outcomes (graft retraction, graft displacement), operation time (min) and postoperative symptoms (pain, discomfort, foreign body sensation, tearing). MAIN RESULTS: We included 7 RCTs, and the review included 662 eyes (blood: 293; glue: 198; suture: 171). We assessed the 1) primary outcome: recurrence rate; 2) secondary outcomes: graft stability outcomes (graft retraction, graft displacement), operation time (min) and postoperative symptoms (pain, discomfort, foreign body sensation, tearing). CONCLUSIONS: Autologous blood for conjunctival autograft fixation in pterygium surgery is associated with lower graft stability than fibrin glue or sutures. It was not inferior to fibrin glue or sutures regarding recurrence rate. The overall quality of evidence is low. Further well-designed RCTs are needed to fully explore the efficacy of this new technique.
Keywords: pterygium, autograft, ophthalmology, cornea
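For readers unfamiliar with how such reviews combine effect sizes across trials, the following hypothetical Python sketch shows inverse-variance (fixed-effect) pooling of a recurrence risk ratio. The event counts are invented placeholders and do not reproduce the data of the seven included RCTs.

```python
# Illustrative inverse-variance (fixed-effect) pooling of log risk ratios.
# The event counts below are invented, NOT the trial data of this review.
import math

trials = [  # (events_blood, n_blood, events_glue, n_glue), e.g. recurrences per arm
    (2, 40, 1, 38),
    (3, 55, 2, 50),
    (1, 30, 1, 32),
]

weights, log_rrs = [], []
for a, n1, c, n2 in trials:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # approximate variance of the log risk ratio
    weights.append(1 / var)
    log_rrs.append(log_rr)

pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}-{math.exp(pooled + 1.96 * se):.2f})")
```

A random-effects model (as often used when trials are heterogeneous) would add a between-study variance term to each weight; the fixed-effect version above is the simplest case.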
Procedia PDF Downloads 160
38 Rationally Designed Dual PARP-HDAC Inhibitor Elicits Striking Anti-leukemic Effects
Authors: Amandeep Thakur, Yi-Hsuan Chu, Chun-Hsu Pan, Kunal Nepali
Abstract:
The transfer of ADP-ribose residues onto target substrates from nicotinamide adenine dinucleotide (NAD) (PARylation) is catalyzed by Poly (ADP-ribose) polymerases (PARPs). Amongst the PARP family members, the DNA damage response in cancer is majorly regulated by PARP1 and PARP2. The blockade of DNA repair by PARP inhibitors leads to the progression of DNA single-strand breaks (induced by some triggering factors) to double-strand breaks. Notably, PARP inhibitors are remarkably effective in cancers with defective homologous recombination repair (HRR). In particular, cancer cells with BRCA mutations are responsive to therapy with PARP inhibitors. The aforementioned requirement for PARP inhibitors to be effective confers a narrow activity spectrum to PARP inhibitors, which hinders their clinical applicability. Thus, the quest to expand the application horizons of PARP inhibitors beyond BRCA mutations is the need of the hour. Literature precedents reveal that HDAC inhibition induces BRCAness in cancer cells and can broaden the therapeutic scope of PARP inhibitors. Driven by such disclosures, dual inhibitors targeting both PARP and HDAC enzymes were designed by our research group to extend the efficacy of PARP inhibitors beyond BRCA-mutated cancers to cancers with induced BRCAness. The design strategy involved the installation of veliparib, an investigational PARP inhibitor, as a surface recognition part in the HDAC inhibitor pharmacophore model. The chemical architecture of veliparib was deemed appropriate as a starting point for the generation of dual inhibitors by virtue of its size and structural flexibility. A validatory docking study was conducted at the outset to predict the binding mode of the designed dual modulatory chemical architectures. Subsequently, the designed chemical architectures were synthesized via a multistep synthetic route and evaluated for antitumor efficacy. Delightfully, one compound manifested impressive anti-leukemic effects (HL-60 cell lines) mediated via dual inhibition of PARP and class I HDACs. The outcome of the western blot analysis revealed that the compound could downregulate the expression levels of PARP1 and PARP2 and the HDAC isoforms (HDAC1, 2, and 3). Also, the dual PARP-HDAC inhibitor upregulated the protein expression of acetyl histone H3, confirming its abrogation potential for class I HDACs. In addition, the dual modulator could arrest the cell cycle at the G0/G1 phase and induce autophagy. Further, a polymer-based nanoformulation of the dual inhibitor was prepared to afford targeted delivery at the cancer site. Transmission electron microscopy (TEM) results indicate that the nanoparticles were monodispersed and spherical. Moreover, the polymeric nanoformulation exhibited an appropriate particle size. Delightfully, pH-sensitive behavior was manifested by the polymeric nanoformulation, which led to selective antitumor effects towards the HL-60 cell lines. In light of the magnificent anti-leukemic profile of the identified dual PARP-HDAC inhibitor, in vivo studies (pharmacokinetics and pharmacodynamics) are currently being conducted. Notably, the optimistic findings of the aforementioned study have spurred our research group to initiate several medicinal chemistry campaigns to create bifunctional small molecule inhibitors addressing PARP as the primary target.
Keywords: PARP inhibitors, HDAC inhibitors, BRCA mutations, leukemia
Procedia PDF Downloads 21
37 Multiple Primary Pulmonary Meningiomas: A Case Report
Authors: Wellemans Isabelle, Remmelink Myriam, Foucart Annick, Rusu Stefan, Compère Christophe
Abstract:
Primary pulmonary meningioma (PPM) is a very rare tumor, and its occurrence has been reported only sporadically. Multiple PPMs are even more exceptional, and herein we report, to the best of our knowledge, the fourth case, focusing on the clinicopathological features of the tumor. Moreover, the possible relationship between the use of progesterone-only contraceptives and the development of these neoplasms will be discussed. Case Report: We report a case of a 51-year-old female presenting with three solid pulmonary nodules, with the following localizations: right upper lobe, middle lobe, and left lower lobe, described as incidental findings on computed tomography (CT) during a pre-bariatric surgery check-up. The patient reported no drinking or smoking history. The physical exam was unremarkable except for obesity. The lesions ranged in size between 6 and 24 mm and presented as solid nodules with lobulated contours. The largest lesion, situated in the middle lobe, had mild fluorodeoxyglucose (FDG) uptake on F-18 FDG positron emission tomography (PET)/CT, highly suggestive of a primary lung neoplasm. For pathological assessment, video-assisted thoracoscopic middle lobectomy and wedge resection of the right upper nodule were performed. Histological examination revealed a relatively well-circumscribed solid proliferation of bland meningothelial cells growing in whorls and lobular nests, presenting intranuclear pseudo-inclusions and psammoma bodies. No signs of anaplasia were observed. The meningothelial cells diffusely expressed vimentin, focally expressed progesterone receptors, and were negative for epithelial markers (cytokeratin (CK) AE1/AE3, CK7, CK20, Epithelial Membrane Antigen (EMA)), neuroendocrine markers (synaptophysin, chromogranin, CD56) and estrogen receptors. The proliferation labelling index Ki-67 was low (<5%). Metastatic meningioma was ruled out by brain and spine magnetic resonance imaging (MRI) scans. The third lesion, localized in the left lower lobe, was followed up and resected three years later because of its slow but significant growth (14 mm to 16 mm), alongside two new infracentimetric lesions. Those three lesions showed a morphological and immunohistochemical profile similar to the previously resected lesions. The patient was disease-free one year after the last surgery. Discussion: Although PPMs are mostly benign and slow-growing tumors with an excellent prognosis, they do not present specific radiological characteristics, and it is difficult to differentiate them from other lung tumors, histopathologic examination being essential. Aggressive behavior is associated with atypical or anaplastic features (WHO grades II–III). The etiology is still uncertain, and different mechanisms have been proposed. A causal connection between sexual hormones and meningothelial proliferation has long been suspected, and the few studies examining progesterone-only contraception and meningioma risk have all suggested an association. In line with this, our patient was treated with levonorgestrel, a progesterone agonist, via an intra-uterine device (IUD). Conclusions: PPM, defined by the typical histological and immunohistochemical features of meningioma in the lungs and the absence of central nervous system lesions, is an extremely rare neoplasm, mainly solitary and associated with indolent growth. Because of the unspecific radiologic findings, it should always be considered in the differential diagnosis of lung neoplasms. 
Regarding multiple PPM, only three cases are reported in the literature, and to the best of our knowledge, this is the first described in a woman treated with a progesterone-only IUD.
Keywords: pulmonary meningioma, multiple meningioma, meningioma, pulmonary nodules
Procedia PDF Downloads 112
36 Implementation of Green Deal Policies and Targets in Energy System Optimization Models: The TEMOA-Europe Case
Authors: Daniele Lerede, Gianvito Colucci, Matteo Nicoli, Laura Savoldi
Abstract:
The European Green Deal is the first internationally agreed set of measures to counter climate change and environmental degradation. Besides the main target of reducing emissions by at least 55% by 2030, it sets the target of accompanying European countries through an energy transition, making the European Union into a modern, resource-efficient, and competitive net-zero emissions economy by 2050, decoupling growth from the use of resources and ensuring a fair adaptation of all social categories to the transformation process. While the general purpose of realizing the Green Deal objectives already dates back to 2019, strategies and policies keep being developed to cope with recent circumstances and achievements. However, general long-term measures like the Circular Economy Action Plan, the proposals to shift from fossil natural gas to renewable and low-carbon gases, in particular biomethane and hydrogen, and to end the sale of gasoline and diesel cars by 2035, will all have significant effects on energy supply and demand evolution across the next decades. The interactions between energy supply and demand over long-term time frames are usually assessed via energy system models to derive useful insights for policymaking and to address technological choices and research and development. TEMOA-Europe is a newly developed energy system optimization model instance based on the minimization of the total cost of the system under analysis, adopting a technologically integrated, detailed, and explicit formulation and considering the evolution of the system in partial equilibrium in competitive markets with perfect foresight. TEMOA-Europe is developed on the TEMOA platform, an open-source modeling framework fully implemented in Python, therefore ensuring third-party verification even of large and complex models. TEMOA-Europe is based on a single-region representation of the European Union and EFTA countries on a time scale between 2005 and 2100, relying on a set of assumptions for socio-economic developments based on projections by the International Energy Outlook and a large technological dataset including 7 sectors: the upstream and power sectors for the production of all energy commodities, and the end-use sectors, including industry, transport, residential, commercial and agriculture. TEMOA-Europe also includes an updated hydrogen module considering its production, storage, transportation, and utilization. Besides, it can rely on a wide set of innovative technologies, ranging from nuclear fusion and electricity plants equipped with CCS in the power sector to electrolysis-based steel production processes in the industrial sector, with a techno-economic characterization based on public literature, to produce insightful energy scenarios and especially to cope with the very long analyzed time scale. The aim of this work is to examine in detail the scheme of measures and policies for the realization of the Green Deal objectives and to transform them into a set of constraints and new socio-economic development pathways. Based on them, TEMOA-Europe will be used to produce and comparatively analyze scenarios to assess the consequences of Green Deal-related measures on the future evolution of the energy mix over the whole energy system in an economic optimization environment.
Keywords: European Green Deal, energy system optimization modeling, scenario analysis, TEMOA-Europe
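To give a feel for how Green Deal measures enter a cost-minimization model as constraints, here is a toy linear program in Python (using scipy). It is a sketch only: it is not TEMOA's actual formulation, and all technology names, costs, capacities and emission factors are invented for illustration.

```python
# Toy least-cost supply problem in the spirit of an energy system optimization
# model: choose generation x (TWh) per technology to meet demand at minimum cost,
# subject to an emission cap. All numbers are illustrative, not TEMOA-Europe data.
from scipy.optimize import linprog

techs = ["gas", "wind", "solar"]
cost = [60.0, 45.0, 50.0]        # EUR/MWh, i.e. MEUR per TWh (assumed)
emis = [0.35, 0.0, 0.0]          # tCO2/MWh, i.e. MtCO2 per TWh (assumed)
capacity = [500.0, 300.0, 250.0] # maximum TWh per technology (assumed)
demand = 600.0                   # TWh of demand to be met
co2_cap = 100.0                  # MtCO2 allowed (Green Deal-style cap)

res = linprog(
    c=cost,                            # minimize total generation cost
    A_ub=[emis], b_ub=[co2_cap],       # emission cap constraint
    A_eq=[[1, 1, 1]], b_eq=[demand],   # demand balance constraint
    bounds=list(zip([0, 0, 0], capacity)),
    method="highs",
)
for name, x in zip(techs, res.x):
    print(f"{name:6s}: {x:6.1f} TWh")
print("total cost (MEUR):", round(res.fun, 1))
```

Tightening the cap, or adding constraints such as a phase-out year for a technology, changes the optimal mix in the same way policy constraints steer scenarios in the full model.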
Procedia PDF Downloads 104
35 Health and Climate Changes: "Ippocrate" a New Alert System to Monitor and Identify High Risk
Authors: A. Calabrese, V. F. Uricchio, D. di Noia, S. Favale, C. Caiati, G. P. Maggi, G. Donvito, D. Diacono, S. Tangaro, A. Italiano, E. Riezzo, M. Zippitelli, M. Toriello, E. Celiberti, D. Festa, A. Colaianni
Abstract:
Climate change has a severe impact on human health. There is a vast literature demonstrating that temperature increase is causally related to cardiovascular problems and represents a high risk for human health, but there are no studies that propose a solution. In this work, we study how climate influences human health parameters through the analysis of climatic conditions in an area of the Apulia Region: the Municipality of Capurso. At the same time, the medical personnel involved identified a set of variables useful to define an index describing health condition. These scientific studies are the basis of an innovative alert system, IPPOCRATE, whose aim is to assess climate risk and share information with the population at risk to support prevention and mitigation actions. IPPOCRATE is an e-health system designed to provide technological support to the analysis of health risk related to climate and to provide tools for prevention and management of critical events. It is the first integrated system for the prevention of human risk caused by climate change. IPPOCRATE calculates risk by weighting meteorological data with the vulnerability of monitored subjects and uses mobile and cloud technologies to acquire and share information on different data channels. It is composed of four components: the Multichannel Hub, the WeHeart wearable with its Smart Application, EasyBox, and the Territorial Registry. The Multichannel Hub is the ICT infrastructure used to feed the IPPOCRATE cloud with different types of data coming from remote monitoring devices or imported from meteorological databases. Such data are ingested, transformed and elaborated in order to be dispatched towards the mobile app and VoIP phone systems. The IPPOCRATE Multichannel Hub uses open communication protocols to create a set of APIs useful to interface IPPOCRATE with third-party applications. Internally, it uses a non-relational paradigm to create a flexible and highly scalable database. The wearable device WeHeart is equipped with sensors designed to measure the following biometric variables: heart rate, systolic and diastolic blood pressure, blood oxygen saturation, body temperature, and blood glucose for diabetic subjects. WeHeart is designed to be easy to use and non-invasive. For data acquisition, users need only to wear it and connect it to the Smart Application via the Bluetooth protocol. EasyBox was designed to take advantage of new technologies related to e-health care. EasyBox allows the user to fully exploit all IPPOCRATE features. Its name reveals its purpose as a container for the various devices that may be included depending on user needs. The Territorial Registry is the IPPOCRATE web module reserved for medical personnel for monitoring, research and analysis activities. The Territorial Registry allows access to all information gathered by IPPOCRATE, using a GIS system in order to execute spatial analysis combining geographical data (climatological information and monitored data) with information regarding the clinical history of users and their personal details. The Territorial Registry was designed for different types of users: control rooms managed by wide-area health facilities, single health care centers or single doctors. The Territorial Registry manages this hierarchy by diversifying access to system functionalities. IPPOCRATE is the first e-health system focused on climate risk prevention.
Keywords: climate change, health risk, new technological system
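The following minimal Python sketch illustrates the kind of hazard-weighted-by-vulnerability scoring the abstract describes. All thresholds, weights and alert levels here are invented for illustration and are not IPPOCRATE's actual parameters.

```python
# Illustrative risk score: a heat-related hazard level is weighted by an
# individual's vulnerability derived from monitored vitals. Thresholds and
# weights are assumptions, not the system's real configuration.
def heat_hazard(temp_c, humidity):
    """Crude heat-stress level in [0, 1] from forecast temperature and humidity."""
    heat_index = temp_c + 0.1 * humidity          # simplistic proxy
    return max(0.0, min(1.0, (heat_index - 27.0) / 15.0))


def vulnerability(age, systolic_bp, spo2, diabetic):
    score = 0.0
    if age >= 65:
        score += 0.3
    if systolic_bp >= 140:
        score += 0.3
    if spo2 < 94:
        score += 0.2
    if diabetic:
        score += 0.2
    return min(score, 1.0)


def climate_health_risk(weather, subject):
    h = heat_hazard(weather["temp_c"], weather["humidity"])
    v = vulnerability(subject["age"], subject["systolic_bp"],
                      subject["spo2"], subject["diabetic"])
    risk = h * v                                   # hazard weighted by vulnerability
    return "ALERT" if risk > 0.4 else ("WATCH" if risk > 0.2 else "OK")


print(climate_health_risk({"temp_c": 38, "humidity": 60},
                          {"age": 72, "systolic_bp": 150, "spo2": 93, "diabetic": True}))
```

In the real system, the hazard would come from meteorological feeds through the Multichannel Hub and the vitals from the WeHeart device, with the resulting alerts surfaced to the Territorial Registry and the mobile app.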
Procedia PDF Downloads 867
34 Evaluating Forecasting Strategies for Day-Ahead Electricity Prices: Insights From the Russia-Ukraine Crisis
Authors: Alexandra Papagianni, George Filis, Panagiotis Papadopoulos
Abstract:
The liberalization of the energy market and the increasing penetration of fluctuating renewables (e.g., wind and solar power) have heightened the importance of the spot market for ensuring efficient electricity supply. This is further emphasized by the EU's goal of achieving net-zero emissions by 2050. The day-ahead market (DAM) plays a key role in European energy trading, accounting for 80-90% of spot transactions and providing critical insights for next-day pricing. Therefore, short-term electricity price forecasting (EPF) within the DAM is crucial for market participants to make informed decisions and improve their market positioning. Existing literature highlights out-of-sample performance as a key factor in assessing EPF accuracy, with influencing factors such as predictors, forecast horizon, model selection, and strategy. Several studies indicate that electricity demand is a primary price determinant, while renewable energy sources (RES) like wind and solar significantly impact price dynamics, often lowering prices. Additionally, incorporating data from neighboring countries, due to market coupling, further improves forecast accuracy. Most studies predict up to 24 steps ahead using hourly data, while some extend forecasts using higher-frequency data (e.g., half-hourly or quarter-hourly). Short-term EPF methods fall into two main categories: statistical and computational intelligence (CI) methods, with hybrid models combining both. While many studies use advanced statistical methods, particularly through different versions of traditional AR-type models, others apply computational techniques such as artificial neural networks (ANNs) and support vector machines (SVMs). Recent research combines multiple methods to enhance forecasting performance. Despite extensive research on EPF accuracy, a gap remains in understanding how forecasting strategy affects prediction outcomes. While iterated strategies are commonly used, they are often chosen without justification. This paper contributes by examining whether the choice of forecasting strategy impacts the quality of day-ahead price predictions, especially for multi-step forecasts. We evaluate both iterated and direct methods, exploring alternative ways of conducting iterated forecasts on benchmark and state-of-the-art forecasting frameworks. The goal is to assess whether these factors should be considered by end-users to improve forecast quality. We focus on the Greek DAM using data from July 1, 2021, to March 31, 2022. This period is chosen due to significant price volatility in Greece, driven by its dependence on natural gas and limited interconnection capacity with larger European grids. The analysis covers two phases: pre-conflict (January 1, 2022, to February 23, 2022) and post-conflict (February 24, 2022, to March 31, 2022), following the Russia-Ukraine conflict that initiated an energy crisis. We use the mean absolute percentage error (MAPE) and symmetric mean absolute percentage error (sMAPE) for evaluation, as well as the Direction of Change (DoC) measure to assess the accuracy of price movement predictions. Our findings suggest that forecasters need to apply all strategies across different horizons and models. Different strategies may be required for different horizons to optimize both accuracy and directional predictions, ensuring more reliable forecasts.
Keywords: short-term electricity price forecast, forecast strategies, forecast horizons, recursive strategy, direct strategy
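The following compact Python sketch contrasts the iterated and direct strategies the study compares, using a simple least-squares autoregression as a stand-in for the benchmark and state-of-the-art models, and computes MAPE, sMAPE and a direction-of-change hit rate on synthetic hourly prices. The data and model are illustrative assumptions, not the Greek DAM series or the paper's models.

```python
# Sketch of iterated vs. direct multi-step forecasting with a simple AR(p)
# fitted by least squares, plus the evaluation metrics mentioned in the paper.
import numpy as np


def fit_ar(y, p, horizon=1):
    """Least-squares AR(p) that predicts y[t + horizon - 1] from y[t - p:t]."""
    X = np.array([y[t - p:t] for t in range(p, len(y) - horizon + 1)])
    target = y[p + horizon - 1:]
    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), target, rcond=None)
    return coef


def iterated_forecast(y, p, H):
    coef = fit_ar(y, p, horizon=1)          # one 1-step model, fed back on itself
    hist = list(y[-p:])
    out = []
    for _ in range(H):
        nxt = coef[0] + coef[1:] @ np.array(hist[-p:])
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)


def direct_forecast(y, p, H):
    out = []
    for h in range(1, H + 1):
        coef = fit_ar(y, p, horizon=h)      # one dedicated model per horizon h
        out.append(coef[0] + coef[1:] @ y[-p:])
    return np.array(out)


def mape(a, f):
    return 100 * np.mean(np.abs((a - f) / a))


def smape(a, f):
    return 100 * np.mean(2 * np.abs(f - a) / (np.abs(a) + np.abs(f)))


def doc(a, f, last):
    """Direction-of-change hit rate relative to the last observed price."""
    return np.mean(np.sign(a - last) == np.sign(f - last))


rng = np.random.default_rng(0)
prices = 80 + 20 * np.sin(np.arange(400) * 2 * np.pi / 24) + rng.normal(0, 3, 400)
train, actual = prices[:-24], prices[-24:]
for name, fc in [("iterated", iterated_forecast(train, 24, 24)),
                 ("direct", direct_forecast(train, 24, 24))]:
    print(f"{name:8s} MAPE={mape(actual, fc):5.2f}%  sMAPE={smape(actual, fc):5.2f}%  "
          f"DoC={doc(actual, fc, train[-1]):.2f}")
```

The iterated strategy reuses one short-horizon model and feeds its own predictions back in, while the direct strategy trains a separate model per horizon; which works better can differ by horizon, which is exactly the comparison the study carries out.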
Procedia PDF Downloads 4
33 Anti-Infective Potential of Selected Philippine Medicinal Plant Extracts against Multidrug-Resistant Bacteria
Authors: Demetrio L. Valle Jr., Juliana Janet M. Puzon, Windell L. Rivera
Abstract:
From the various medicinal plants available in the Philippines, crude ethanol extracts of twelve (12) Philippine medicinal plants, namely: Senna alata L. Roxb. (akapulko), Psidium guajava L. (bayabas), Piper betle L. (ikmo), Vitex negundo L. (lagundi), Mitrephora lanotan (Blanco) Merr. (Lanotan), Zingiber officinale Roscoe (luya), Curcuma longa L. (Luyang dilaw), Tinospora rumphii Boerl (Makabuhay), Moringga oleifera Lam. (malunggay), Phyllanthus niruri L. (sampa-sampalukan), Centella asiatica (L.) Urban (takip kuhol), and Carmona retusa (Vahl) Masam (tsaang gubat) were studied. In vitro methods of evaluation against selected Gram-positive and Gram-negative multidrug-resistant (MDR) bacteria were performed on the plant extracts. Although five of the plants showed varying antagonistic activities against the test organisms, only Piper betle L. exhibited significant activities against both Gram-negative and Gram-positive multidrug-resistant bacteria, showing wide zones of growth inhibition in the disk diffusion assay and the lowest concentrations of the extract required to inhibit the growth of the bacteria, as supported by the minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) assays. Further antibacterial studies of Piper betle L. leaf extracts, obtained by three extraction methods (ethanol, methanol, supercritical CO2), revealed similar inhibitory activities against a multitude of Gram-positive and Gram-negative MDR bacteria. Thin layer chromatography (TLC) assay of the leaf extract revealed a maximum of eight compounds with Rf values of 0.92, 0.86, 0.76, 0.53, 0.40, 0.25, 0.13, and 0.013, best visualized when inspected under UV-366 nm. TLC-agar overlay bioautography of the isolated compounds showed that the compounds with Rf values of 0.86 and 0.13 have inhibitory activities against Gram-positive MDR bacteria (MRSA and VRE). The compound with an Rf value of 0.86 also possesses inhibitory activity against Gram-negative MDR bacteria (CRE Klebsiella pneumoniae and MBL Acinetobacter baumannii). Gas Chromatography-Mass Spectrometry (GC-MS) was able to identify six volatile compounds, four of which are new compounds that have not been mentioned in the medical literature. The chemical compounds isolated include 4-(2-propenyl)phenol and eugenol; the four new compounds were ethyl diazoacetate, tris(trifluoromethyl)phosphine, heptafluorobutyrate, and 3-fluoro-2-propynenitrite. Phytochemical screening and investigation of its antioxidant, cytotoxic, and possible hemolytic activities, as well as its mechanisms of antibacterial activity, were also done. The results showed that the local variant of Piper betle leaf extract possesses significant antioxidant, anti-cancer and antimicrobial properties, attributed to the presence of bioactive compounds, particularly flavonoids (condensed tannin, leucoanthocyanin, gamma benzopyrone), anthraquinones, steroids/triterpenes and 2-deoxysugars. Piper betle L. is also traditionally known to enhance wound healing, which could be primarily due to its antioxidant, anti-inflammatory and antimicrobial activities. In vivo studies on mice using 2.5% and 5% ethanol leaf extract cream formulations in excised wound models showed significantly accelerated wound healing, with results on par with the current antibacterial cream (mupirocin). From the results of the series of studies, we have definitely proven the value of Piper betle L. 
as a source of bioactive compounds that could be developed into therapeutic agents against MDR bacteria.
Keywords: Philippine herbal medicine, multidrug-resistant bacteria, Piper betle, TLC-bioautography
Procedia PDF Downloads 765
32 Towards Better Integration: Qualitative Study on Perceptions of Russian-Speaking Immigrants in Australia
Authors: Oleg Shovkovyy
Abstract:
This research was conducted in response to one of the most pressing questions on the agenda of many public administration offices around the world: “What could be done for better integration and assimilation of immigrants into hosting communities?” In the author’s view, the answer could be suggested by immigrants themselves. Often ‘bogged down in the past’ and snared by their own idols and demons, they perceive things differently, which, in turn, may result in their inability to integrate smoothly into hosting communities. A brief literature review suggests that the perceptions of immigrants are largely neglected in current research on migrants, which is often based on opinion polls of members of hosting communities themselves or on superficial data from various research organizations. Even those studies that include the voices of immigrants are unlikely to shed additional light on the problem, simply because certain things are not spoken out loud, especially to those in whose hands immigrants’ fate rests (the authorities). In this regard, this qualitative study, conducted by an insider to several Russian-speaking communities, represents a unique opportunity for all stakeholders to look at the question of integration through the eyes of immigrants, from a different perspective, and thus makes the research findings especially valuable for a better understanding of the problem. The case study research employed ethnographic methods of gathering data: approximately 200 Russian-speaking immigrants of the first and second generations were closely observed by the Russian-speaking researcher in their usual settings, for eight months, and at different venues. Informal interviews were conducted with 27 key informants with whom the researcher managed to establish a good rapport and who were keen to share their experiences voluntarily. Field notes were taken at 14 locations (study sites) within the Brisbane region of Queensland, Australia. Moreover, throughout this time, the researcher lived in the dwelling of one of the immigrants and was an active participant in the social life (worship, picnics, dinners, weekend schools, concerts, cultural events, social gatherings, etc.) of the observed communities, whose members, to a large extent, belong to various religious lines of the Russian and Protestant churches. It was found that the majority of immigrants had experienced some discrimination in matters of hiring, employment, and recognition of educational qualifications from home countries, and simply felt a degree of dislike from society in various everyday situations. Many noted a complete absence of, or very limited, state assistance in terms of employment, training, education, and housing. For instance, the Australian Government Department of Human Services not only fails to stimulate job search but, on the contrary, encourages the refusal of short-term work and employment. On the other hand, the free courses offered on adaptation and the English language proved ineffective and unpopular amongst immigrants. Many interviewees reported overstated requirements for English proficiency and local work experience, even where these were not critical for the given task or job. Based on the results of long-term monitoring, the researcher also ventured to assert the negative and decelerating role of immigrants’ communities, particularly religious communities, in processes of integration and assimilation.
The findings suggest that governments should either toughen current immigration policies or take a more proactive and responsible role in dealing with immigrant-related issues; for instance, by increasing assistance and support to all immigrants and, probably, paying more attention to and taking a stake in managing and organizing the life of immigrants’ communities rather than simply leaving it all to chance. Keywords: Australia, immigration, integration, perceptions
Procedia PDF Downloads 220
31 Stabilizing Additively Manufactured Superalloys at High Temperatures
Authors: Keivan Davami, Michael Munther, Lloyd Hackel
Abstract:
The control of properties and material behavior by implementing thermal-mechanical processes is based on mechanical deformation and annealing according to a precise schedule that will produce a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) that controls this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhance the material’s resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover by mechanisms such as climbing and recombining rapidly at high temperatures. Furthermore, precipitates coarsen, and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive, and the microstructure is stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, dislocations formed as a result of LP and precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. This research could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex processes of interactions between dislocations, solute atoms, and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the elucidation of the actual mechanisms involved in the novel cyclic LP/annealing processes is pursued through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help with the validation of a novel laser processing technique for high-temperature applications.
This will greatly expand the applications of the laser peening technology, originally devised only for temperatures lower than half of the melting temperature. Keywords: laser shock peening, mechanical properties, indentation, high temperature stability
Procedia PDF Downloads 148
30 Case Report: Ocular Helminth - In Unusual Site (Lens)
Authors: Chandra Shekhar Majumder, Md. Shamsul Haque, Khondaker Anower Hossain, Md. Rafiqul Islam
Abstract:
Introduction: Ocular helminths are parasites that infect the eye or its adnexa. They can be either motile worms or sessile worms that form cysts. These parasites require two hosts for their life cycle, a definitive host (usually a human) and an intermediate host (usually an insect). While there have been reports of ocular helminths infecting various structures of the eye, including the anterior chamber and subconjunctival space, there is no previous record of such a case involving the lens. Research Aim: The aim of this case report is to present a rare case of ocular helminth infection in the lens and to contribute to the understanding of this unusual site of infection. Methodology: This study is a case report presenting the details and findings of an 80-year-old retired policeman who presented with severe pain, redness, and vision loss in the left eye. The patient had a history of diabetes mellitus and hypertension. The examination revealed the presence of a thread-like helminth in the lens. The patient underwent treatment and follow-up, and the helminth specimen was sent for identification to the Department of Parasitology. Case report: An 80-year-old retired policeman attended the OPD, Faridpur Medical College Hospital, with complaints of severe pain, redness and gross dimness of vision of the left eye for 5 days. He had a history of diabetes mellitus and hypertension for 3 years. On examination, L/E visual acuity was PL only, with moderate ciliary congestion, KP 2+, cells 2+ and posterior synechia from the 5 to 7 o’clock position. The lens was opaque. A thread-like helminth was found under the anterior portion of the lens. The worm was moving and changing its position during examination. On examination of the R/E, visual acuity was 6/36 unaided, 6/18 with pinhole. There was lenticular opacity. Slit-lamp and fundus examinations were within normal limits. The patient was admitted to Faridpur Medical College Hospital. Diabetes mellitus was controlled with insulin. ICCE with PI was done on the same day of admission under depomedrol coverage. The helminth was recovered from the lens. It was thread-like, about 5 to 6 mm in length, 1 mm in width and pinkish in colour. At follow-up after 7 days, VA was HM, with mild ciliary congestion and a few KPs and cells present. The media was hazy due to vitreous opacity. The worm was sent to the Department of Parasitology, NIPSOM, Dhaka, for identification. Findings: The findings of this case report highlight the presence of a helminth in the lens, which has not been previously reported. The helminth was successfully removed from the lens, but the patient experienced complications such as anterior uveitis and vitreous opacity. The exact mechanism by which the helminth enters the lens remains unclear. Theoretical Importance: This case report contributes to the existing literature on ocular helminth infections by reporting a unique case involving the lens. It highlights the need for further research to understand the pathogenesis and mechanism of entry of helminths into the lens. Data Collection and Analysis Procedures: The data for this case report were collected through clinical examination and the medical records of the patient. The findings are described and presented in a descriptive manner. No statistical analysis was conducted. Question Addressed: This case report addresses the question of whether ocular helminth infections can occur in the lens, which has not been previously reported.
Conclusion: To the best of our knowledge, this is the first reported case of ocular helminth infection in the lens. The presence of the helminth in the lens raises interesting questions regarding its pathogenesis and entry mechanism. Further study and research are needed to explore these aspects. Ophthalmologists and parasitologists should be aware of the possibility of ocular helminth infections in unusual sites like the lens. Keywords: ocular, helminth, unusual site, lens
Procedia PDF Downloads 63
29 An Engaged Approach to Developing Tools for Measuring Caregiver Knowledge and Caregiver Engagement in Juvenile Type 1 Diabetes
Authors: V. Howard, R. Maguire, S. Corrigan
Abstract:
Background: Type 1 Diabetes (T1D) is a chronic autoimmune disease, typically diagnosed in childhood. T1D puts an enormous strain on families; controlling blood-glucose in children is difficult and the consequences of poor control for patient health are significant. Successful illness management and better health outcomes can depend on the quality of caregiving. On diagnosis, parent-caregivers face a steep learning curve, as T1D care requires a significant level of knowledge to inform complex decision making throughout the day. The majority of illness management is carried out in the home setting, independent of clinical health providers. Parent-caregivers vary in their level of knowledge and their level of engagement in applying this knowledge in the practice of illness management. Enabling researchers to quantify these aspects of the caregiver experience is key to identifying targets for psychosocial support interventions, which are desirable for reducing stress and anxiety in this highly burdened cohort, and supporting better health outcomes in children. Currently, there are limited tools available that are designed to capture this information. Where tools do exist, they are not comprehensive and do not adequately capture the lived experience. Objectives: Development of quantitative tools, informed by lived experience, to enable researchers to gather data on parent-caregiver knowledge and engagement, which accurately represents the experience of the cohort and enables exploration of questions that are of real-world value to the cohort themselves. Methods: This research employed an engaged approach to address the problem of quantifying two key aspects of caregiver diabetes management: knowledge and engagement. The research process was multi-staged and iterative. Stage 1: Working from a constructivist standpoint, literature was reviewed to identify relevant questionnaires, scales and single-item measures of T1D caregiver knowledge and engagement, and harvest candidate questionnaire items. Stage 2: Aggregated findings from the review were circulated among a PPI (patient and public involvement) expert panel of caregivers (n=6) for discussion and feedback. Stage 3: In collaboration with the expert panel, data were interpreted through the lens of lived experience to create a long-list of candidate items for novel questionnaires. Items were categorized as either ‘knowledge’ or ‘engagement’. Stage 4: A Delphi-method process (iterative surveys) was used to prioritize question items and generate novel questions that further captured the lived experience. Stage 5: Both questionnaires were piloted to refine the wording, increase accessibility and limit socially desirable responding. Stage 6: The tools were piloted using an online survey that was deployed via an online peer-support group for caregivers of juveniles with T1D. Ongoing Research: 123 parent-caregivers completed the survey. Data analysis is ongoing to establish face and content validity qualitatively and through exploratory factor analysis. Reliability will be established using an alternative-form method, and Cronbach’s alpha will assess internal consistency. Work will be completed by early 2024. Conclusion: These tools will enable researchers to gain deeper insights into caregiving practices among parents of juveniles with T1D.
Development was driven by lived experience, illustrating the value of engaged research at all levels of the research process. Keywords: caregiving, engaged research, juvenile type 1 diabetes, quantified engagement and knowledge
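To illustrate the internal-consistency check mentioned in the abstract above, the following is a minimal sketch of how Cronbach's alpha could be computed for a set of questionnaire items. The simulated responses, item names, and Likert range are hypothetical assumptions for demonstration only, not the authors' instrument or code.

```python
# Illustrative sketch only: Cronbach's alpha for internal consistency of a
# questionnaire. Data and column names are hypothetical.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one row per respondent, one column per questionnaire item."""
    items = items.dropna()
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 123 caregivers answering 10 Likert-type items (1-5)
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(123, 1))
responses = pd.DataFrame(
    np.clip(base + rng.integers(-1, 2, size=(123, 10)), 1, 5),
    columns=[f"item_{i}" for i in range(1, 11)],
)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```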
Procedia PDF Downloads 55
28 Design and 3D-Printout of the Stack-Corrugate-Sheet Core Sandwiched Decks for the Bridging System
Authors: K. Kamal
Abstract:
Structural sandwich panels with cores of advanced composite laminates, honeycombs or PU foams are used in aerospace applications and are now also fabricated for some civil engineering applications. An all-advanced-composites Foot Over Bridge (FOB) system, designed and developed earlier for pedestrian traffic, may be cited as one such example. During the development of this FOB, the profile of its decks was conceived as a single corrugated sheet core sandwiched between two Glass Fibre Reinforced Plastics (GFRP) flat laminates. Once successfully fabricated and used, these decks also proved suitable for forming other structures on assembly, such as temporary shelters. Sandwich panels with the same corrugated sheet core profile were then also attempted using conventional construction materials, but conventional methods of construction posed difficulties in achieving the required core profile monolithically within the sandwiched slabs, and the attempt was therefore abandoned. Such monolithic construction was, however, subsequently demonstrated by dispensing a building materials mix through a suitably designed multi-dispenser system attached to a 3D printer. This lab-level study was reported earlier; it included the in-house fabrication of a 3D printer, ‘3DcMP’, its functional operation, and the 3D printing of the required sandwich core profiles to produce panel hardware. Once a number of these single-corrugated-sheet-core sandwich panels had been monolithically printed out, the panels were subjected to load tests in an experimental set-up, their structural behavior was studied analytically, and the results were correlated with those reported in the literature. To achieve greater depths and to create stronger sandwiched decks with better structural and mechanical behavior, a more complex core configuration, a stacked corrugated sheet core with a flat mid-plane, was considered the better sandwiched core. Such a profile is obtained merely by stacking two of the separately printed monolithic single-corrugated-sheet-core units developed earlier and bonding them together, maintaining a different orientation. A sequential understanding of the structural behavior of such complex-profile core sandwiched decks, with special emphasis on the effect of varying the corrugation orientation in each distinct tier of the core, obviously calls for an analytical study first. Rectangular, simply supported decks have therefore been considered for analysis adopting Advanced Composite Technology (ACT); some numerical results, along with some fruitful findings, were obtained and are presented in this paper. From these numerical results, it has been observed that a flat mid layer, which is created monolithically, in addition to eliminating the bonding step during fabrication, offers more effective bending resistance in such decks subjected to a UDL. This is understood to result from the presence of the required shear-resisting layer at the middle of the core in this profile, unlike in other bending elements.
As an addendum to the efforts covered above and published earlier, this unique stacked-corrugated-sheet-core profile sandwiched structural deck, which can be constructed monolithically with ease at the site itself, has been printed out on a 3D printer. Employing 3DcMP together with innovative building construction materials holds future promise for such research and development work, since the key benefits of 3D printing in construction (reduced construction time, cost-effective solutions, and freedom to design complex shapes) can now be widely realized by the modern construction industry. Keywords: advanced composite technology (ACT), corrugated laminates, 3DcMP, foot over bridge (FOB), sandwiched deck units
Procedia PDF Downloads 170
27 The Impact of Supporting Productive Struggle in Learning Mathematics: A Quasi-Experimental Study in High School Algebra Classes
Authors: Sumeyra Karatas, Veysel Karatas, Reyhan Safak, Gamze Bulut-Ozturk, Ozgul Kartal
Abstract:
Productive struggle entails a student's cognitive exertion to comprehend mathematical concepts and uncover solutions not immediately apparent. The significance of productive struggle in learning mathematics is accentuated by influential educational theorists, emphasizing its necessity for learning mathematics with understanding. Consequently, supporting productive struggle in learning mathematics is recognized as a high-leverage and effective mathematics teaching practice. In this study, the investigation into the role of productive struggle in learning mathematics led to the development of a comprehensive rubric for productive struggle pedagogy through an exhaustive literature review. The rubric consists of eight primary criteria and 37 sub-criteria, providing a detailed description of teacher actions and pedagogical choices that foster students' productive struggles. These criteria encompass various pedagogical aspects, including task design, tool implementation, allowing time for struggle, posing questions, scaffolding, handling mistakes, acknowledging efforts, and facilitating discussion/feedback. Utilizing this rubric, a team of researchers and teachers designed eight 90-minute lesson plans, employing a productive struggle pedagogy, for a two-week unit on solving systems of linear equations. Simultaneously, another set of eight lesson plans on the same topic, featuring identical content and problems but employing a traditional lecture-and-practice model, was designed by the same team. The objective was to assess the impact of supporting productive struggle on students' mathematics learning, defined by the strands of mathematical proficiency. This quasi-experimental study compares the control group, which received traditional lecture-and-practice instruction, with the treatment group, which experienced the productive struggle pedagogy. Sixty-six 10th and 11th-grade students from two algebra classes, taught by the same teacher at a high school, underwent either the productive struggle pedagogy or the lecture-and-practice approach over eight 90-minute class sessions across two weeks. To measure students' learning, an assessment was created and validated by a team of researchers and teachers. It comprised seven open-response problems assessing the strands of mathematical proficiency: procedural and conceptual understanding, strategic competence, and adaptive reasoning on the topic. The test was administered at the beginning and end of the two weeks as pre- and post-tests. Students' solutions were scored using an established rubric, subjected to expert validation and an inter-rater reliability process involving multiple criteria for each problem based on their steps and procedures. An analysis of covariance (ANCOVA) was conducted to examine the differences between the control group, which received traditional pedagogy, and the treatment group, exposed to the productive struggle pedagogy, on the post-test scores while controlling for the pre-test. The results indicated a significant effect of treatment on post-test scores for procedural understanding (F(2, 63) = 10.47, p < .001), strategic competence (F(2, 63) = 9.92, p < .001), adaptive reasoning (F(2, 63) = 10.69, p < .001), and conceptual understanding (F(2, 63) = 10.06, p < .001), controlling for pre-test scores. This demonstrates the positive impact of supporting productive struggle in learning mathematics. In conclusion, the results revealed the significance of the role of productive struggle in learning mathematics.
The study further explored the practical application of productive struggle through the development of a comprehensive rubric describing the pedagogy of supporting productive struggle. Keywords: effective mathematics teaching practice, high school algebra, learning mathematics, productive struggle
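As a rough illustration of the analysis reported above, the sketch below shows how an ANCOVA comparing post-test scores between a treatment and a control group, controlling for pre-test scores, could be run with statsmodels. The simulated data, column names, and effect sizes are assumptions for demonstration only and do not reproduce the study's results.

```python
# A minimal sketch of an ANCOVA: post-test scores compared between groups
# while controlling for pre-test scores. Data are simulated, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
n = 66  # 66 students, matching the study design
group = np.repeat(["control", "treatment"], n // 2)
pre = rng.normal(50, 10, n)
post = pre + np.where(group == "treatment", 8, 2) + rng.normal(0, 5, n)
df = pd.DataFrame({"group": group, "pre": pre, "post": post})

# Post-test ~ pre-test covariate + group factor
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(anova_lm(model, typ=2))  # F-test for the group effect, adjusted for pre
```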
Procedia PDF Downloads 51
26 Nigeria Rural Water Supply Management: Participatory Process as the Best Option
Authors: E. O. Aluta, C. A. Booth, D. G. Proverbs, T. Appleby
Abstract:
Challenges in the effective management of potable water have attracted global attention in recent years and remain a major priority in many world regions. Scarcity and unavailability of potable water may escalate poverty, obviate democratic expression of views and militate against inter-sectoral development. These challenges run counter to the inherent potential of the resource. Thus, while the creation of poverty may be regarded as a broad-based problem, it can manifest as life-shortening diseases, frictions of interest escalating into threats and warfare, the relegation of democratic principles in favour of authoritarianism, and human rights abuse. These challenges may be identified as manifestations of ineffective management of the potable water resource and are therefore regarded as major problems in environmental protection. In reaction, some nations have re-examined their laws and policies, while others have developed innovative projects that seek to ameliorate the difficulties of providing sustainable potable water. The problems resonate in Nigeria, where the legal framework supporting the supply and management of potable water has been criticized as ineffective. This has had a greater impact on rural community members, often regarded as ‘voiceless’. At that level, the participation of non-state actors has been identified as an effective strategy that can improve water supply. However, there are indications that this is not applied pragmatically, resulting in over-centralization and top-down management. Thus, this study focuses on how the participatory process may enable the development of a participatory water governance framework for use in Nigerian rural communities. The Rural Advisory Board (RAB) is proposed as a governing body to promote proximal relationships and institute democratisation borne out of participation, while enabling effective accountability and information. The RAB establishes mechanisms for effectiveness, taking into consideration Transparency, Accountability and Participation (TAP), advocated as guiding principles for decision-makers. Other tools that may be explored in achieving these are laws and policies supporting the water sector, under the direction of the ministries and law courts, which ensure that laws are not violated. Community norms and values, consisting of the Nigerian traditional belief system, perceptions, attitudes and realities (often undermined in favour of legislation), are relied on to pave the way for enforcement. Task forces consist of community members with specific designations of duties, which ensure compliance and enforceability, and a cross-section of community members are assigned duties; thus, the principle of participation is pragmatically reflected. A review of the literature provided information on the potential of the participatory process in potable water governance. A qualitative methodology was adopted, using semi-structured interviews as the strategy for inquiry. A purposive sampling strategy, consisting of homogeneous, heterogeneous and criterion techniques, was applied. The samples, sourced from diverse walks of life, were drawn from the study area of Delta State of Nigeria, involving the three local governments of Oshimili South, Uvwie and Warri South. The findings indicate that the application of the participatory process empowers rural community members to make legitimate demands for TAP.
This includes doing away with unilateral decision-making for the supply and management of potable water, and is capable of restructuring top-down management into a combined top-down/bottom-up system. Keywords: participation, participatory process, participatory water governance, rural advisory board
Procedia PDF Downloads 382
25 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions
Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer
Abstract:
The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point and non-point source eutrophication and shifting climate paradigms being blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy and less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs have the potential to produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or more diligently observe those waterbodies. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors of CyanoHABs based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percent of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite and multispectral instrument sensor at a 10-meter ground resolution. Seventeen explanatory variables were calculated for each watershed utilizing the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious, and waterbody area emerged as the strongest predictors of cyanobacteria cell density with an adjusted R-squared value of 0.31 and a p-value ~ 0. The final regression equation was used to make a normalized cyanobacteria cell density index, and a Jenks Natural Break classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed that there are significant relationships between free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and the area of a waterbody with cyanobacteria cell density. This data analytics approach to CyanoHAB risk assessment corroborates the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States. Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping
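The modelling workflow described above (a multiple linear regression on watershed predictors, a normalized cell-density index, and a Jenks Natural Breaks split into low/medium/high risk) could be sketched as follows. The synthetic data, variable names, coefficients, and the choice of the jenkspy package for the Jenks step are illustrative assumptions, not the study's actual code or datasets.

```python
# Hedged sketch: regression-based risk index + three-class Jenks split.
# All values below are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
import jenkspy  # one possible implementation of Jenks Natural Breaks

rng = np.random.default_rng(7)
n = 771  # number of waterbodies in the study
df = pd.DataFrame({
    "max_summer_temp": rng.normal(31, 2, n),
    "pct_agriculture": rng.uniform(0, 60, n),
    "pct_forest": rng.uniform(0, 80, n),
    "pct_impervious": rng.uniform(0, 30, n),
    "waterbody_area": rng.lognormal(2, 1, n),
})
df["cell_density"] = (
    2000 * df["max_summer_temp"] + 500 * df["pct_agriculture"]
    - 300 * df["pct_forest"] + rng.normal(0, 20000, n)
)

model = smf.ols(
    "cell_density ~ max_summer_temp + pct_agriculture + pct_forest"
    " + pct_impervious + waterbody_area", data=df).fit()
pred = model.fittedvalues
df["risk_index"] = (pred - pred.min()) / (pred.max() - pred.min())  # 0-1 index

breaks = jenkspy.jenks_breaks(df["risk_index"].tolist(), 3)  # low/medium/high
df["risk_class"] = pd.cut(df["risk_index"], bins=breaks,
                          labels=["low", "medium", "high"], include_lowest=True)
print(model.rsquared_adj)
print(df["risk_class"].value_counts())
```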
Procedia PDF Downloads 210
24 The Impact of Right to Repair Initiatives on Environmental and Financial Performance in European Consumer Electronics Firms: An Econometric Analysis
Authors: Daniel Stabler, Anne-Laure Mention, Henri Hakala, Ahmad Alaassar
Abstract:
In Europe, 2.2 billion tons of waste annually generate severe environmental damage and economic burdens, and negatively impact human health. A stark illustration of the problem is found within the consumer electronics industry, which reflects one of the most complex global waste streams. Of the 5.3 billion globally discarded mobile phones in 2022, only 17% were properly recycled. To address these pressing issues, Europe has made significant strides in developing waste management strategies, Circular Economy initiatives, and Right to Repair policies. These endeavors aim to make product repair and maintenance more accessible, extend product lifespans, reduce waste, and promote sustainable resource use. European countries have introduced Right to Repair policies, often in conjunction with extended producer responsibility legislation, repair subsidies, and consumer repair indices, to varying degrees of regulatory rigor. Changing societal trends emphasizing sustainability and environmental responsibility have driven consumer demand for more sustainable and repairable products, benefiting repair-focused consumer electronics businesses. In academic research, much of the literature in management studies has examined the European Circular Economy and the Right to Repair from firm-level perspectives. These studies frequently employ a business-model lens, emphasizing innovation and strategy frameworks. However, this study takes an institutional perspective, aiming to understand the adoption of Circular Economy and repair-focused business models within the European consumer electronics market. The concepts of the Circular Economy and the Right to Repair align with institutionalism as they reflect evolving societal norms favoring sustainability and consumer empowerment. Regulatory institutions play a pivotal role in shaping and enforcing these concepts through legislation, influencing the behavior of businesses and individuals. Compliance and enforcement mechanisms are essential for their success, compelling actors to adopt sustainable practices and consider product life extension. Over time, these mechanisms create a path for more sustainable choices, underscoring the influence of institutions and societal values on behavior and decision-making. Institutionalism, particularly 'neo-institutionalism,' provides valuable insights into the factors driving the adoption of Circular and repair-focused business models. Neo-institutional pressures can manifest through coercive regulatory initiatives or normative standards shaped by socio-cultural trends. The Right to Repair movement has emerged as a prominent and influential idea within academic discourse and sustainable development initiatives. Therefore, understanding how macro-level societal shifts toward the Circular Economy and the Right to Repair trigger firm-level responses is imperative. This study aims to answer a crucial question about the impact European Right to Repair initiatives have had on the financial and environmental performance of European consumer electronics companies at the firm level. A quantitative and statistical research design will be employed. The study will encompass an extensive sample of consumer electronics firms in Northern and Western Europe, analyzing their financial and environmental performance in relation to the implementation of Right to Repair mechanisms.
The study's findings are expected to provide valuable insights into the broader implications of the Right to Repair and Circular Economy initiatives for the European consumer electronics industry. Keywords: circular economy, right to repair, institutionalism, environmental management, european union
Procedia PDF Downloads 79
23 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop
Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen
Abstract:
Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without annoying waiting times. Thereby, they enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation compared to the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate the ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. 
The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development. Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis
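For readers unfamiliar with the algorithm being benchmarked above, the following is a minimal CPU-side sketch of one Lenia update step using NumPy with FFT-based convolution, assuming the canonical ring kernel and Gaussian growth mapping. Parameter values and kernel shape are illustrative defaults, not those of the authors' CUDA C++ implementation.

```python
# Minimal Lenia step: world in [0, 1], ring kernel, Gaussian growth mapping.
# Values and kernel profile are illustrative assumptions.
import numpy as np

SIZE, R, DT = 256, 13, 0.1          # grid size, kernel radius, time step
MU, SIGMA = 0.15, 0.015             # growth function parameters

# Ring-shaped kernel, normalized so its weights sum to 1
y, x = np.ogrid[-SIZE // 2:SIZE // 2, -SIZE // 2:SIZE // 2]
r = np.sqrt(x * x + y * y) / R
kernel = np.exp(-((r - 0.5) ** 2) / (2 * 0.15 ** 2)) * (r < 1)
kernel /= kernel.sum()
kernel_fft = np.fft.fft2(np.fft.fftshift(kernel))  # precompute once

def growth(u):
    """Gaussian growth mapping, rescaled to [-1, 1]."""
    return 2.0 * np.exp(-((u - MU) ** 2) / (2 * SIGMA ** 2)) - 1.0

def step(world):
    """Advance the continuous CA by one time step (toroidal boundaries)."""
    u = np.real(np.fft.ifft2(kernel_fft * np.fft.fft2(world)))  # K * A
    return np.clip(world + DT * growth(u), 0.0, 1.0)

world = np.random.default_rng(1).random((SIZE, SIZE))  # random initial state
for _ in range(100):
    world = step(world)
print(world.mean())
```

The same loop is what a GPU implementation parallelizes: the convolution, growth mapping and clipping are all element-wise or FFT-friendly, which is why CUDA with FFT-based convolution pays off on large grids.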
Procedia PDF Downloads 40
22 Consecration from the Margins: El Anatsui in Venice and the Turbine Hall
Authors: Jonathan Adeyemi
Abstract:
Context: This study focuses on El Anatsui and his global acclaim in the art world despite his origins at the margins of the global art world. It addresses the disparities in the treatment of Western and non-Western artists and questions whether Anatsui’s consecration is a result of exoticism or the growing consensus on decolonization. Research Aim: The aim of this study is to investigate how El Anatsui achieved global acclaim from the margins of the art world and determine if his consecration represents a mark of decolonization or the typical Western desire for exoticism. Methodology: The study utilizes a case study approach, literature analysis, and in-depth interviews. The artist, the organizers of the Venice Biennale, the relevant curators at Tate Modern in London, the October Gallery in London, and other galleries in Nigeria that represent the artist were interviewed for data collection. Findings: The study seeks to determine the authenticity of the growing consensus on decolonization, inclusion, and diversity in the global artistic field. Preliminary findings show that domestic socio-economic and political factors debilitated the mechanisms for local validation in Nigeria, weakening the domestic foundation for international engagement. However, alternative systems of exhibition, especially in London and the USA, contributed critically to providing the initial international visibility, which formed the foundation for his global acclaim. Out of the 21 winners of the Golden Lion for Lifetime Achievement since its inception at the 47th Venice Biennale in 1997, American artists have dominated with 10 recipients, followed by 8 recipients from Europe, 2 from Africa (2007 and 2015) and 1 from Asia. This aligns with Bourdieu’s concept of cultural and economic capital, which prevented African countries from participating until recently. Moreover, while the average age of recipients is 76 years, Anatsui received the award at the age of 71, and Malick Sidibé (Mali) was awarded at 72. Thus, the Venice Biennale award for El Anatsui inclines more towards a commitment to decolonisation than exoticism. Theoretical Importance: This study contributes to the field by examining the dynamics of the art world’s monopoly of legitimation and the role of national, ethnic and cultural differences in the promotion of artists. It aims to challenge the Westernized hierarchy of valorization and consecration in the art world. The research supports Bourdieu’s artistic field theory, which emphasises the importance of cultural, economic and social capital in determining agents’ position and access to the field’s resources (symbolic capital). Bourdieu also established that dominated agents can change their position in the field’s hierarchy either by establishing or navigating alternative systems. Data Collection and Analysis Procedures: The opacity of the art world’s operations places the required information within the purview of insiders (agents). Thus, the study collects data through in-depth interviews with relevant and purposively selected individuals and organizations. The data was/will be analyzed using qualitative methods, such as thematic analysis and content analysis. The interpretive analytical approach adopted facilitated the construction of meanings that may not be apparent in the data or responses.
Questions Addressed: The study addresses how El Anatsui achieved global acclaim despite being from the margins, whether his consecration represents decolonization or exoticism, and the extent to which the global artistic field embraces decolonization, inclusion, and diversity. Conclusion: The study will contribute to knowledge by providing insights into the extent of commitment to decolonization, inclusion, and diversity in the global artistic field. It will also shed light on the mechanisms behind El Anatsui's rise to global acclaim and challenge Western-dominated artistic hierarchies. Keywords: decolonisation, exoticism, artistic field, culture game
Procedia PDF Downloads 59
21 Unidentified Remains with Extensive Bone Disease without a Clear Diagnosis
Authors: Patricia Shirley Almeida Prado, Selma Paixão Argollo, Maria De Fátima Teixeira Guimarães, Leticia Matos Sobrinho
Abstract:
Skeletal differential diagnosis is essential in forensic anthropology in order to differentiate skeletal trauma from normal osseous variation and pathological processes. Thus, part of the forensic anthropological task is to differentiate criminal skeletal injuries from normal skeletal variation (bone fusion or nonunion, transitional vertebrae and other non-metric traits), and non-traumatic skeletal pathology (myositis ossificans, arthritis, bone metastasis, osteomyelitis) from traumatic skeletal pathology (myositis ossificans traumatica), avoiding misdiagnosis. This case shows the importance of an effective pathological diagnosis in order to accelerate the identification process of skeletonized human remains. THE CASE: The unidentified skeletal remains, held at the Medico-Legal Institute Nina Rodrigues, Salvador, of a young adult male (estimated 29 to 40 years) show a massive heterotopic ossification on the right tibia at the upper epiphysis and the adjacent articular femur surface; an extensive ossification on the right clavicle (at the sternal extremity); and heterotopic ossifications on the right scapula (upper third of the lateral margin and infraglenoid tubercle) and on the head of the right humerus at the shoulder joint. Curiously, this case also shows unusual porosity in the bodies of certain vertebrae and in some tarsal and carpal bones. Likewise, the fifth metacarpal bones (right and left) show healed fractures which left both bones distorted. Based on the literature and protocols for the identification of pathological conditions in human skeletal remains, these alterations can be misdiagnosed, and this skeleton may present more than one pathological process. The forensic anthropology laboratory at the Medico-Legal Institute Nina Rodrigues in Salvador (Brazil) adopts international protocols for ancestry, sex, age and stature estimation and has also implemented well-established conventions for identifying pathological disease and skeletal alterations. The diagnosis most compatible with this case is hematogenous osteomyelitis, due to the following findings: 1: the healed fracture pattern of the clavicle shows a cloaca, which is pathognomonic for osteomyelitis; 2: the healed metacarpal fractures do not present cloacae, although they developed periosteal formation; 3: the superior articular surface of the right tibia shows an extensive inflammatory healing process that extends to the adjacent femoral articular surface, with some cloacae in the tibial lesion; 4: the uncommon porosities may result from a hematogenous infectious process. The fractures probably occurred at different moments, based on the healing process; the tibial injury is more extensive and has not been remodelled, while the metacarpal and clavicle fractures are properly healed. We suggest that the clavicle and tibia fractures were infected through an existing infectious disease (syphilis, tuberculosis, brucellosis) or an existing syndrome (Gorham’s disease), which led to the development of osteomyelitis. This hypothesis is supported by the fact that different bones are affected to different degrees, like the metacarpals, which do not show a cloaca but rather periosteal new bone formation; likewise, the unusual porosities do not show the classical findings of osteoarthritic processes, such as marginal osteophytes, pitting and new bone formation, but only an erosive process without bone formation or osteophytes.
To confirm our hypothesis, we are working on different clinical approaches, such as DNA analysis, histopathology and other imaging exams, to reach the correct diagnosis. Keywords: bone disease, forensic anthropology, hematogenous osteomyelitis, human identification, human remains
Procedia PDF Downloads 324
20 Exploring Symptoms, Causes and Treatments of Feline Pruritus Using Thematic Analysis of Pet Owner Social Media Posts
Authors: Sitira Williams, Georgina Cherry, Andrea Wright, Kevin Wells, Taran Rai, Richard Brown, Travis Street, Alasdair Cook
Abstract:
Social media sources (50) were identified, and keywords were defined by veterinarians and organised into 6 topics known to be indicative of feline pruritus: body areas, behaviors, symptoms, diagnosis, and treatments. These were augmented using academic literature, a cat owner survey, synonyms, and Google Trends. The content was collected using a social intelligence solution, with keywords tagged and filtered. Data were aggregated and de-duplicated. SL content matching body areas, behaviors and symptoms was reviewed manually, and posts were marked relevant if: posted by a pet owner, identifying an itchy cat and not duplicated. A sub-set of 493 posts published from 2009-2022 was used for reflexive thematic analysis in NVIVO (Burlington, MA) to identify themes. Five themes were identified: allergy, pruritus, additional behaviors, unusual or undesirable behaviors, diagnosis, and treatment. Most (258) posts reported the cat was excessively licking, itching, and scratching. The majority were indoor cats and were less playful and friendly when itchy. Half of these posts did not indicate a known cause of pruritus. Bald spots and scabs (123) were reported, often with swelling and fur loss, and 56 reported bumps, lumps, and dry patches. Other impacts on the cat's quality of life were ear mites, cat self-trauma and stress. Seven posts reported their cats' symptoms caused them ongoing anxiety and depression. Cats with food allergies (often to chicken and beef) causing bald spots featured in 23 posts. Veterinarians advised switching to a raw food diet and/or changing their bowls. Some cats got worse after switching, leaving owners' needs unmet. Allergic reactions to flea bites causing excessive itching, red spots, scabs, and fur loss were reported in 13 posts. Some (3) posts indicated allergic reactions to medication. Cats with seasonal and skin allergies, causing sneezing, scratching, headshaking, watery eyes, and nasal discharge, were reported 17 times. Eighty-five posts identified additional behaviors. Of these, 13 reported their cat's burst pimple or insect bite. Common behaviors were headshaking, rubbing, pawing at their ears, and aggressively chewing. In some cases, bites or pimples triggered previously unseen itchiness, making the cat irritable. Twenty-four reported their cat had anxiety: overgrooming, itching, losing fur, hiding, freaking out, breathing quickly, sleeplessness, hissing and vocalising. Most reported these cats as having itchy skin, fleas, and bumps. Cats were commonly diagnosed with an ear infection, ringworm, acne, or kidney disease. Acne was diagnosed in cats with an allergy flare-up or overgrooming. Ear infections were diagnosed in itchy cats with mites or other parasites. Of the treatments mentioned, steroids were most frequently used, then anti-parasitics, including flea treatments and oral medication (steroids, antibiotics). Forty-six posts reported distress following poor outcomes after medication or additional vet consultations. SL provides veterinarians with unique insights. Verbatim comments highlight the detrimental effects of pruritus on pet and owner quality of life. This study demonstrates the need for veterinarians to communicate management and treatment options more effectively to relieve owner frustrations. Data analysis could be scaled up using machine learning for topic modeling. Keywords: content analysis, feline, itch, pruritus, social media, thematic analysis, veterinary dermatology
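The authors note that the manual thematic coding described above could be scaled up with machine learning for topic modeling. A hedged sketch of one possible approach, Latent Dirichlet Allocation over a bag-of-words representation of owner posts, is shown below; the example posts, topic count, and choice of scikit-learn are assumptions for illustration only.

```python
# Illustrative sketch: LDA topic modeling over hypothetical owner posts.
# Not the study's pipeline; posts and parameters are invented for the example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "my cat keeps itching and scratching, bald spots on her back",
    "indoor cat overgrooming, vet suspects flea allergy",
    "scabs and fur loss after switching to chicken-based food",
    "ear mites again, constant head shaking and pawing at ears",
    "steroid shot helped the itchy skin but symptoms came back",
]

vectorizer = CountVectorizer(stop_words="english", min_df=1)
dtm = vectorizer.fit_transform(posts)            # document-term matrix

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```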
Procedia PDF Downloads 188
19 Using the UK as a Case Study to Assess the Current State of Large Woody Debris Restoration as a Tool for Improving the Ecological Status of Natural Watercourses Globally
Authors: Isabelle Barrett
Abstract:
Natural watercourses provide a range of vital ecosystem services, notably freshwater provision. They also offer highly heterogeneous habitat which supports an extreme diversity of aquatic life. Exploitation of rivers, changing land use and flood prevention measures have led to habitat degradation and subsequent biodiversity loss; indeed, freshwater species currently face a disproportionate rate of extinction compared to their terrestrial and marine counterparts. Large woody debris (LWD) encompasses the trees, large branches and logs which fall into watercourses, and is responsible for important habitat characteristics. Historically, natural LWD has been removed from streams under the assumption that it is not aesthetically pleasing and is thus ecologically unfavourable, despite extensive evidence contradicting this. Restoration efforts aim to replace lost LWD in order to reinstate habitat heterogeneity. This paper aims to assess the current state of such restoration schemes for improving fluvial ecological health in the UK. A detailed review of the scientific literature was conducted alongside a meta-analysis of 25 UK-based projects involving LWD restoration. Projects were chosen for which sufficient information was attainable for analysis, covering a broad range of budgets and scales. The most effective strategies for river restoration encompass ecological success, stakeholder engagement and scientific advancement, however few projects surveyed showed sensitivity to all three; for example, only 32% of projects stated biological aims. Focus tended to be on stakeholder engagement and public approval, since this is often a key funding driver. Consequently, there is a tendency to focus on the aesthetic outcomes of a project, however physical habitat restoration does not necessarily lead to direct biodiversity increases. This highlights the significance of rivers as highly heterogeneous environments with multiple interlinked processes, and emphasises a need for a stronger scientific presence in project planning. Poor scientific rigour means monitoring is often lacking, with varying, if any, definitions of success which are rarely pre-determined. A tendency to overlook negative or neutral results was apparent, with unjustified focus often put on qualitative results. The temporal scale of monitoring is typically inadequate to facilitate scientific conclusions, with only 20% of projects surveyed reporting any pre-restoration monitoring. Furthermore, monitoring is often limited to a few variables, with biotic monitoring often fish-focussed. Due to their longer life cycles and dispersal capability, fish are usually poor indicators of environmental change, making it difficult to attribute any changes in ecological health to restoration efforts. Although the potential impact of LWD restoration may be positive, this method of restoration could simply be making short-term, small-scale improvements; without addressing the underlying symptoms of degradation, for example water quality, the issue cannot be fully resolved. Promotion of standardised monitoring for LWD projects could help establish a deeper understanding of the ecology surrounding the practice, supporting movement towards adaptive management in which scientific evidence feeds back to practitioners, enabling the design of more efficient projects with greater ecological success. 
By highlighting LWD, this study hopes to address the difficulties faced within river management, and emphasise the need for a more holistic international and inter-institutional approach to tackling problems associated with degradation. Keywords: biological monitoring, ecological health, large woody debris, river management, river restoration
Procedia PDF Downloads 215
18 Development of a Core Set of Clinical Indicators to Measure Quality of Care for Thyroid Cancer: A Modified-Delphi Approach
Authors: Liane J. Ioannou, Jonathan Serpell, Cino Bendinelli, David Walters, Jenny Gough, Dean Lisewski, Win Meyer-Rochow, Julie Miller, Duncan Topliss, Bill Fleming, Stephen Farrell, Andrew Kiu, James Kollias, Mark Sywak, Adam Aniss, Linda Fenton, Danielle Ghusn, Simon Harper, Aleksandra Popadich, Kate Stringer, David Watters, Susannah Ahern
Abstract:
BACKGROUND: There are significant variations in the management, treatment and outcomes of thyroid cancer, particularly in the role of: diagnostic investigation and pre-treatment scanning; optimal extent of surgery (total or hemi-thyroidectomy); use of active surveillance for small low-risk cancers; central lymph node dissections (therapeutic or prophylactic); outcomes following surgery (e.g. recurrent laryngeal nerve palsy, hypocalcaemia, hypoparathyroidism); post-surgical hormone, calcium and vitamin D therapy; and provision and dosage of radioactive iodine treatment. A proven strategy to reduce variations in outcomes and to improve survival is to measure and compare them using high-quality clinical registry data. Clinical registries provide the most effective means of collecting high-quality data and are a tool for quality improvement. Where they have been introduced at a state or national level, registries have become one of the most clinically valued tools for quality improvement. To benchmark clinical care, clinical quality registries require systematic measurement at predefined intervals and the capacity to report back information to participating clinical units. OBJECTIVE: The aim of this study was to develop a core set of clinical indicators that enable measurement and reporting of quality of care for patients with thyroid cancer. We hypothesise that measuring clinical quality indicators, developed to identify differences in quality of care across sites, will reduce variation and improve patient outcomes and survival, thereby lessening costs and healthcare burden to the Australian community. METHOD: Preparatory work and scoping were conducted to identify existing high-quality clinical guidelines and best practice for thyroid cancer both nationally and internationally, as well as relevant literature. A bi-national panel was invited to participate in a modified Delphi process. Panelists were asked to rate each proposed indicator on a Likert scale of 1–9 in a three-round iterative process. RESULTS: A total of 236 potential quality indicators were identified. One hundred and ninety-two indicators were removed to reflect the data capture by the Australian and New Zealand Thyroid Cancer Registry (ANZTCR) (from diagnosis to 90 days post-surgery). The remaining 44 indicators were presented to the panelists for voting. A further 21 indicators were later added by the panelists, bringing the total number of potential quality indicators to 65. Of these, 21 were considered the most important and feasible indicators to measure quality of care in thyroid cancer, of which 12 were recommended for inclusion in the final set. The consensus indicator set spans the spectrum of care, including: preoperative care; surgery; surgical complications; staging and post-surgical treatment planning; and post-surgical treatment. CONCLUSIONS: This study provides a core set of quality indicators to measure quality of care in thyroid cancer. This indicator set can be applied as a tool for internal quality improvement, comparative quality reporting, public reporting and research. Inclusion of these quality indicators into monitoring databases such as clinical quality registries will enable opportunities for benchmarking and feedback on best practice care to clinicians involved in the management of thyroid cancer. Keywords: clinical registry, Delphi survey, quality indicators, quality of care
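As an illustration of how the Likert ratings from such a modified-Delphi round might be summarized per indicator, the sketch below computes a median rating and the share of panelists scoring 7-9, a commonly used consensus rule. The indicator names, ratings, and the 70% cut-off are hypothetical assumptions and are not drawn from the study.

```python
# Hedged sketch: per-indicator summary of one Delphi round's Likert (1-9) ratings.
# Indicator labels, scores, and the consensus threshold are invented examples.
import pandas as pd

ratings = pd.DataFrame({
    "indicator": ["pre-op ultrasound documented"] * 6 + ["RLN palsy recorded"] * 6,
    "rating":    [8, 9, 7, 8, 6, 9,                      5, 6, 7, 4, 6, 5],
})

summary = ratings.groupby("indicator")["rating"].agg(
    median="median",
    pct_7_to_9=lambda s: (s >= 7).mean() * 100,
)
summary["retain"] = summary["pct_7_to_9"] >= 70  # hypothetical consensus cut-off
print(summary)
```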
Procedia PDF Downloads 179