Search results for: combined active and reactive market
131 Regenerating Habitats. A Housing Based on Modular Wooden Systems
Authors: Rui Pedro de Sousa Guimarães Ferreira, Carlos Alberto Maia Domínguez
Abstract:
Despite the ambitions to achieve climate neutrality by 2050, to fulfill the Paris Agreement's goals, the building and construction sector remains one of the most resource-intensive and greenhouse gas-emitting industries in the world, accounting for 40% of worldwide CO ₂ emissions. Over the past few decades, globalization and population growth have led to an exponential rise in demand in the housing market and, by extension, in the building industry. Considering this housing crisis, it is obvious that we will not stop building in the near future. However, the transition, which has already started, is challenging and complex because it calls for the worldwide participation of numerous organizations in altering how building systems, which have been a part of our everyday existence for over a century, are used. Wood is one of the alternatives that is most frequently used nowadays (under responsible forestry conditions) because of its physical qualities and, most importantly, because it produces fewer carbon emissions during manufacturing than steel or concrete. Furthermore, as wood retains its capacity to store CO ₂ after application and throughout the life of the building, working as a natural carbon filter, it helps to reduce greenhouse gas emissions. After a century-long focus on other materials, in the last few decades, technological advancements have made it possible to innovate systems centered around the use of wood. However, there are still some questions that require further exploration. It is necessary to standardize production and manufacturing processes based on prefabrication and modularization principles to achieve greater precision and optimization of the solutions, decreasing building time, prices, and waste from raw materials. In addition, this approach will make it possible to develop new architectural solutions to solve the rigidity and irreversibility of buildings, two of the most important issues facing housing today. Most current models are still created as inflexible, fixed, monofunctional structures that discourage any kind of regeneration, based on matrices that sustain the conventional family's traditional model and are founded on rigid, impenetrable compartmentalization. Adaptability and flexibility in housing are, and always have been, necessities and key components of architecture. People today need to constantly adapt to their surroundings and themselves because of the fast-paced, disposable, and quickly obsolescent nature of modern items. Migrations on a global scale, different kinds of co-housing, or even personal changes are some of the new questions that buildings have to answer. Designing with the reversibility of construction systems and materials in mind not only allows for the concept of "looping" in construction, with environmental advantages that enable the development of a circular economy in the sector but also unleashes multiple social benefits. In this sense, it is imperative to develop prefabricated and modular construction systems able to address the formalization of a reversible proposition that adjusts to the scale of time and its multiple reformulations, many of which are unpredictable. We must allow buildings to change, grow, or shrink over their lifetime, respecting their nature and, finally, the nature of the people living in them. 
The ability to anticipate the unexpected, adapt to social factors, and take account of demographic shifts in society in order to stabilize communities is the foundation of truly innovative sustainability.
Keywords: modular, timber, flexibility, housing
Procedia PDF Downloads 80
130 Regulation of Cultural Relationship between Russia and Ukraine after Crimea’s Annexation: A Comparative Socio-Legal Study
Authors: Elena Sherstoboeva, Elena Karzanova
Abstract:
This paper explores the impact of the annexation of Crimea on the regulation of live performances and tour management of Russian pop music performers in Ukraine and of Ukrainian performers in Russia. Without a doubt, the cultural relationship between Russia and Ukraine is not limited to this issue. Yet concert markets tend to respond particularly rapidly to political, economic, and social changes, especially in Russia and Ukraine, where the high level of digital piracy means that the music businesses mainly depend upon income from performances rather than from digital rights sales. This paper argues that the rules formed in both countries after Russia’s annexation of Crimea in 2014 have contributed to the separation of a single cultural space that had existed in Soviet and Post-Soviet Russia and Ukraine before the annexation. These rules have also facilitated performers’ self-censorship and increased the politicisation of the music businesses in the two neighbouring countries. This study applies a comparative socio-legal approach to study Russian and Ukrainian live events and tour regulation. A qualitative analysis of Russian and Ukrainian national and intergovernmental legal frameworks is applied to examine formal regulations. Soviet and early post-Soviet laws and policies are also studied, but only to the extent that they help to track the changes in the Russian–Ukrainian cultural relationship. To identify and analyse the current informal rules, the study design includes in-depth semi-structured interviews with 30 live event or tour managers working in Russia and Ukraine. A case study is used to examine how the Eurovision Song Contest, an annual international competition, has played out within the Russian–Ukrainian conflict. The study suggests that modern Russian and Ukrainian frameworks for live events and tours have developed Soviet regulatory traditions when cultural policies served as a means of ideological control. At the same time, contemporary regulations mark a considerable perspective shift, as the previous rules have been aimed at maintaining close cultural connections between the Russian and Ukrainian nations. Instead of collaboration, their current frameworks mostly serve as forms of repression, implying that performers must choose only one national market in which to work. The regulatory instruments vary and often impose limitations that typically exist in non-democratic regimes to restrict foreign journalism, such as visa barriers or bans on entry. The more unexpected finding is that, in comparison with Russian law, Ukrainian regulations have created more obstacles to the organisation of live tours and performances by Russian artists in Ukraine. Yet this stems from commercial rather than political factors. This study predicts that the more economic challenges the Russian or Ukrainian music businesses face, the harsher the regulations will be regarding the organisation of live events or tours in the other country. This study recommends that international human rights organisations and non-governmental organisations develop and promote specific standards for artistic rights and freedoms, given the negative effects of the increasing politicisation of the entertainment business and cultural spheres to freedom of expression and cultural rights and pluralism.Keywords: annexation of Crimea, artistic freedom, censorship, cultural policy
Procedia PDF Downloads 118
129 Mental Health of Caregivers in Public Hospital Intensive Care Department: A Multicentric Cross-Sectional Study
Authors: Lamia Bouzgarrou, Amira Omrane, Naima Bouatay, Chaima Harrathi, Samia Machroughl, Ahmed Mhalla
Abstract:
Background and Aims: Professionals in the health care sector are exposed to psychosocial constraints such as stress, harassment, and violence, which can lead to mental health problems such as depression, addictive behavior, and burnout. Moreover, it is well established that caregivers assigned to intensive care units are more likely to experience such constraints and mental health problems. For these caregivers, mental health status may affect care quality and patient safety. This study aims to identify occupational psychosocial constraints and their mental health consequences among paramedical and medical caregivers assigned to intensive care units in Tunisian public hospitals. Methods: An exhaustive three-month cross-sectional study was conducted among the medical and paramedical staff of intensive care units in three Tunisian university hospitals. After collecting informed consent, we evaluated work-related stress, workplace harassment, depression, anxiety disorders, addictive behavior, and self-esteem through an anonymous self-completed questionnaire. Five validated questionnaires and scales were included in this form: Karasek's Job Content Questionnaire, the Negative Acts Questionnaire, the Rosenberg Self-Esteem Scale, the Beck Depression Inventory, and the Hamilton Anxiety Scale. Results: We included 129 intensive care unit caregivers, with a mean age of 36.1 ± 1.1 years and a sex ratio of 0.58. Among these caregivers, 30% were specialists or doctors in specialty training. The average seniority in intensive care was 6.1 ± 1.2 years (range 1 to 40 years). Atypical working schedules were noted among 36.7% of the subjects, imposed rather than chosen in 52.4% of cases. During the 12 months preceding the survey, 51.7% of care workers were absent from work because of a health problem, with sick leave exceeding 15 days in 11.7% of cases. Job strain was identified among 15% of caregivers, and 38.33% of them were victims of moral harassment. Low or very low self-esteem was noted among 40% of respondents. Moreover, active smoking was reported by 20% of subjects, alcohol consumption by 13.3%, and psychotropic substance use by 1.7%. According to the Beck inventory and the Hamilton Anxiety scale, 61.7% of intensive care providers were depressed, with 'severe' depression in 13.3% of cases, and 49.9% of them presented anxiety disorders. Multivariate analysis showed that job strain was correlated with young age (p=0.005) and shorter work seniority (p=0.001). Workplace moral harassment was more prevalent among females (p=0.009), doctors in specialty training (p=0.021), and those assigned to atypical schedules (p=0.008). Depression was more prevalent among staff in a job strain situation (p = 0.004), among smoking caregivers (p = 0.048), and among those with no leisure activity (p < 0.001). Anxiety disorders were positively correlated with a history of chronic disease (p = 0.001) and exposure to workplace bullying (p = 0.004). Conclusions: Our findings reflect a high frequency of caregivers who are under stress at work or victims of moral harassment. These health professionals are at increased risk of developing psychiatric illness such as depressive and anxiety disorders and addictive behavior. Our results suggest the necessity of preventive strategies against occupational psychosocial constraints in order to preserve professionals' mental health and maximize patient safety and quality of care.
Keywords: health care sector, intensive care units, mental health, psychosocial constraints
Procedia PDF Downloads 156
128 A Review on Cyberchondria Based on Bibliometric Analysis
Authors: Xiaoqing Peng, Aijing Luo, Yang Chen
Abstract:
Background: Cyberchondria, an "emerging risk" of the information era, is an abnormal behavioral pattern characterized by excessive or repeated online searches for health-related information and escalating health anxiety, which endangers people's physical and mental health and poses a substantial threat to public health. Objective: To explore and discuss the research status, hotspots, and trends of Cyberchondria. Methods: Based on a total of 77 articles on "Cyberchondria" extracted from the Web of Science up to October 2019, literature trends, countries, institutions, and hotspots are analyzed by bibliometric analysis; the concept definition of Cyberchondria, instruments, relevant factors, and treatment and intervention are discussed as well. Results: Since "Cyberchondria" was first put forward in 2001, the last two decades have witnessed a noticeable increase in the amount of literature; during 2014-2019 the output quadrupled to 62 articles, compared with only 15 before 2014, which shows that Cyberchondria has become a new theme and hot topic in recent years. The United States was the most active contributor with the largest number of publications (23), followed by England (11) and Australia (11), while the leading institutions were Baylor University (7) and the University of Sydney (7), followed by Florida State University (4) and the University of Manchester (4). The WoS categories "Psychiatry/Psychology" and "Computer/Information Science" were the areas of greatest influence. The concept definition of Cyberchondria is not yet fully unified worldwide, but it is generally considered an abnormal behavioral pattern and emotional state and has been invoked to refer to the anxiety-amplifying effects of online health-related searches. The first and most frequently cited scale for measuring the severity of Cyberchondria, the Cyberchondria Severity Scale (CSS), was developed in 2014; it conceptualized Cyberchondria as a multidimensional construct consisting of compulsion, distress, excessiveness, reassurance, and mistrust of medical professionals, although the last dimension was later shown not to be necessary for the construct. Since then, Brazilian, German, Turkish, Polish, and Chinese versions have been developed, improved, and culturally adjusted, while the CSS was optimized into a simplified version (CSS-12) in 2019; all of these are worthy of further verification. The research hotspots of Cyberchondria mainly focus on relevant factors such as intolerance of uncertainty, anxiety sensitivity, obsessive-compulsive disorder, internet addiction, abnormal illness behavior, the Whiteley Index, and problematic internet use, trying to clarify the role played by "associated factors" and "anxiety-amplifying factors" in the development of Cyberchondria and to better understand the aetiological links and pathways in the relationships between hypochondriasis, health anxiety, and online health-related searches. Although the treatment and intervention of Cyberchondria are still at an initial stage of exploration, there have been meaningful attempts to seek effective strategies from different angles, such as online psychological treatment, network technology management, health information literacy improvement, and public health services. Conclusion: Research on Cyberchondria is in its infancy but deserves more attention.
A conceptual consensus on Cyberchondria, a refined assessment tool, prospective studies conducted in various populations, and targeted treatments would be the main research directions in the near future.
Keywords: cyberchondria, hypochondriasis, health anxiety, online health-related searches
Procedia PDF Downloads 124
127 Celebrity Culture and Social Role of Celebrities in Türkiye during the 1990s: The Case of Türkiye Newspaper, Radio, Television (TGRT) Channel
Authors: Yelda Yenel, Orkut Acele
Abstract:
In a media-saturated world, celebrities have become ubiquitous figures, encountered both in public spaces and within the privacy of our homes, seamlessly integrating into daily life. From Alexander the Great to contemporary media personalities, the image of celebrity has persisted throughout history, manifesting in various forms and contexts. Over time, as the relationship between society and the market evolved, so too did the roles and behaviors of celebrities. These transformations offer insights into the cultural climate, revealing shifts in habits and worldviews. In Türkiye, the emergence of private television channels brought an influx of celebrities into everyday life, making them a pervasive part of daily routines. To understand modern celebrity culture, it is essential to examine the ideological functions of media within political, economic, and social contexts. Within this framework, celebrities serve as both reflections and creators of cultural values and, at times, act as intermediaries, offering insights into the society of their era. Starting its broadcasting life in 1992 with religious films and religious conversation, Türkiye Newspaper, Radio, Television channel (TGRT) later changed its appearance, slogan, and the celebrities it featured in response to the political atmosphere. Celebrities played a critical role in transforming from the existing slogan 'Peace has come to the screen' to 'Watch and see what will happen”. Celebrities hold significant roles in society, and their images are produced and circulated by various actors, including media organizations and public relations teams. Understanding these dynamics is crucial for analyzing their influence and impact. This study aims to explore Turkish society in the 1990s, focusing on TGRT and its visual and discursive characteristics regarding celebrity figures such as Seda Sayan. The first section examines the historical development of celebrity culture and its transformations, guided by the conceptual framework of celebrity studies. The complex and interconnected image of celebrity, as introduced by post-structuralist approaches, plays a fundamental role in making sense of existing relationships. This section traces the existence and functions of celebrities from antiquity to the present day. The second section explores the economic, social, and cultural contexts of 1990s Türkiye, focusing on the media landscape and visibility that became prominent in the neoliberal era following the 1980s. This section also discusses the political factors underlying TGRT's transformation, such as the 1997 military memorandum. The third section analyzes TGRT as a case study, focusing on its significance as an Islamic television channel and the shifts in its public image, categorized into two distinct periods. The channel’s programming, which aligned with Islamic teachings, and the celebrities who featured prominently during these periods became the public face of both TGRT and the broader society. In particular, the transition to a more 'secular' format during TGRT's second phase is analyzed, focusing on changes in celebrity attire and program formats. This study reveals that celebrities are used as indicators of ideology, benefiting from this instrumentalization by enhancing their own fame and reflecting the prevailing cultural hegemony in society.Keywords: celebrity culture, media, neoliberalism, TGRT
Procedia PDF Downloads 32
126 An Initial Assessment of the Potential Contribution of 'Community Empowerment' to Mitigating the Drivers of Deforestation and Forest Degradation in Giam Siak Kecil-Bukit Batu Biosphere Reserve
Authors: Arzyana Sunkar, Yanto Santosa, Siti Badriyah Rushayati
Abstract:
Indonesia has experienced annual forest fires that have rapidly destroyed and degraded its forests. Fires in the peat swamp forests of Riau Province have set the stage for problems to worsen, this being the ecosystem most prone to fires (which are also the most difficult to extinguish). Despite various efforts to curb deforestation and forest degradation processes, severe forest fires are still occurring. To find an effective solution, the basic causes of the problems must be identified. It is therefore critical to have an in-depth understanding of the underlying causal factors that have contributed to deforestation and forest degradation as a whole, in order to attain reductions in their rates. An assessment of the drivers of deforestation and forest degradation was carried out in order to design and implement measures that could slow these destructive processes. Research was conducted in Giam Siak Kecil–Bukit Batu Biosphere Reserve (GSKBB BR), in the Riau Province of Sumatera, Indonesia. A biosphere reserve was selected as the study site because such reserves aim to reconcile conservation with sustainable development. A biosphere reserve should promote a range of local human activities, together with development values that are aligned spatially and economically with the area's conservation values, through the use of a zoning system. Moreover, GSKBB BR is an area with vast peatlands and experiences forest fires annually. Various factors were analysed to assess the drivers of deforestation and forest degradation in GSKBB BR; data were collected from focus group discussions with stakeholders, key informant interviews with key stakeholders, field observation and a literature review. Landsat satellite imagery was used to map forest-cover changes for various periods. Analysis of Landsat images taken during the period 2010-2014 revealed that, within the non-protected area of the core zone, there was a trend towards decreasing peat swamp forest areas, increasing land clearance, and increasing areas of community oil-palm and rubber plantations. Fire was used for land clearing, and most of the forest fires occurred in the most populous area (the transition area). The study found a relationship between the deforested/degraded areas and certain distance variables, i.e., distance from roads, villages and the borders between the core area and the buffer zone. The further the distance from the core area of the reserve, the higher the degree of deforestation and forest degradation. Research findings suggested that agricultural expansion may be the direct cause of deforestation and forest degradation in the reserve, whereas socio-economic factors were the underlying driver of forest cover changes; such factors consist of a combination of socio-cultural, infrastructural, technological, institutional (policy and governance), demographic (population pressure) and economic (market demand) considerations. These findings indicated that local factors/problems were the critical causes of deforestation and degradation in GSKBB BR. This research therefore concluded that reductions in deforestation and forest degradation in GSKBB BR could be achieved through 'local actor'-tailored approaches such as community empowerment.
Keywords: actor-led solution, community empowerment, drivers of deforestation and forest degradation, Giam Siak Kecil – Bukit Batu Biosphere Reserve
Procedia PDF Downloads 348
125 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors proved to be an excellent practical tool and, as such, have established their today's wide use in low background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where otherwise they would superimpose within a single-energy peak and, as such, could potentially scathe analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is being practiced where high precision comes as a necessity. In measurements of this nature, in order to be able to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the used equipment. However, experimental determination of the response, i.e., efficiency curves for a given detector-sample configuration and its geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, a lot of researches turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient enough for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if they are not properly taken into account. In this study, the optimisation method of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal’s void dimensions, etc.). Detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). Optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. Acquired results of both detectors displayed good agreement with experimental data that falls under an average statistical uncertainty of ∼ 4.6% for XtRa and ∼ 1.8% for BEGe detector within the energy range of 59.4−1836.1 [keV] and 59.4−1212.9 [keV], respectively.Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method
Procedia PDF Downloads 121
124 Overview of Research Contexts about XR Technologies in Architectural Practice
Authors: Adeline Stals
Abstract:
The transformation of architectural design practices has been underway for almost forty years due to the development and democratization of computer technology. New and more efficient tools are constantly being proposed to architects, amplifying a technological wave that sometimes stimulates them, sometimes overwhelms them, depending essentially on their digital culture and the context (socio-economic, structural, organizational) in which they work on a daily basis. Our focus is on VR, AR, and MR technologies dedicated to architecture. The commercialization of affordable headsets like the Oculus Rift, the HTC Vive or more low-tech like the Google CardBoard, makes it more accessible to benefit from these technologies. In that regard, researchers report the growing interest of these tools for architects, given the new perspectives they open up in terms of workflow, representation, collaboration, and client’s involvement. However, studies rarely mention the consequences of the sample studied on results. Our research provides an overview of VR, AR, and MR researches among a corpus of papers selected from conferences and journals. A closer look at the sample of these research projects highlights the necessity to take into consideration the context of studies in order to develop tools truly dedicated to the real practices of specific architect profiles. This literature review formalizes milestones for future challenges to address. The methodology applied is based on a systematic review of two sources of publications. The first one is the Cumincad database, which regroups publications from conferences exclusively about digital in architecture. Additionally, the second part of the corpus is based on journal publications. Journals have been selected considering their ranking on Scimago. Among the journals in the predefined category ‘architecture’ and in Quartile 1 for 2018 (last update when consulted), we have retained the ones related to the architectural design process: Design Studies, CoDesign, Architectural Science Review, Frontiers of Architectural Research and Archnet-IJAR. Beside those journals, IJAC, not classified in the ‘architecture’ category, is selected by the author for its adequacy with architecture and computing. For all requests, the search terms were ‘virtual reality’, ‘augmented reality’, and ‘mixed reality’ in title and/or keywords for papers published between 2015 and 2019 (included). This frame time is defined considering the fast evolution of these technologies in the past few years. Accordingly, the systematic review covers 202 publications. The literature review on studies about XR technologies establishes the state of the art of the current situation. It highlights that studies are mostly based on experimental contexts with controlled conditions (pedagogical, e.g.) or on practices established in large architectural offices of international renown. However, few studies focus on the strategies and practices developed by offices of smaller size, which represent the largest part of the market. Indeed, a European survey studying the architectural profession in Europe in 2018 reveals that 99% of offices are composed of less than ten people, and 71% of only one person. The study also showed that the number of medium-sized offices is continuously decreasing in favour of smaller structures. In doing so, a frontier seems to remain between the worlds of research and practice, especially for the majority of small architectural practices having a modest use of technology. 
This paper constitutes a reference for the next step of the research and for further studies worldwide by facilitating their contextualization.
Keywords: architectural design, literature review, SME, XR technologies
Procedia PDF Downloads 111
123 Evaluation of the Biological Activity of New Antimicrobial and Biodegradable Textile Materials for Protective Equipment
Authors: Safa Ladhari, Alireza Saidi, Phuong Nguyen-Tri
Abstract:
During health crises such as COVID-19, the use of disposable protective equipment (PE) (masks, gowns, etc.) causes long-term problems, increasing the volume of hazardous waste that must be handled safely and at great expense. Therefore, producing antimicrobial and reusable textile materials is highly desirable to decrease the use of disposable PE that must be treated as hazardous waste. In addition, if these items are used regularly in the workplace or for daily activities by the public, they will most likely end up in household waste. Furthermore, they may pose a high risk of contagion to waste collection workers if contaminated. Therefore, to protect the whole population in times of sanitary crisis, it is necessary to equip these materials with properties that make them resilient to the challenges of carrying out daily activities without compromising public health and the environment, and without depending on external technologies and producers. In addition, the materials frequently used for PE are plastics of petrochemical origin. The subject of the present work is replacing petroplastics with a bioplastic, since it offers better biodegradability. The chosen polymer is polyhydroxybutyrate (PHB), a member of the polyhydroxyalkanoate family synthesized by different bacteria. It has similar properties to conventional plastics. However, it is renewable, biocompatible, and has attractive barrier properties compared to other polyesters. These characteristics make it ideal for PE applications. The current research topic focuses on the preparation and rapid evaluation of the biological activity of nanotechnology-based antimicrobial agents to treat textile surfaces used for PE. This work will be carried out to provide antibacterial solutions that can be transferred to a workplace application in the fight against short-term biological risks. Three main objectives are proposed for this research: 1) the development of suitable methods for the deposition of antibacterial agents on the surface of textiles; 2) the development of a method for measuring the antibacterial activity of the prepared textiles; and 3) the study of the biodegradability of the prepared textiles. The studied textile is a non-woven fabric based on a biodegradable polymer manufactured by the electrospinning method. Indeed, nanofibers are increasingly studied due to their unique characteristics, such as high surface-to-volume ratio, improved thermal, mechanical, and electrical properties, and confinement effects. The electrospun film will be surface modified by plasma treatment and then loaded with hybrid antibacterial silver and titanium dioxide nanoparticles by the dip-coating method. This work uses simple methods with emerging technologies to fabricate nanofibers with suitable size and morphology to be used as components for protective equipment. The antibacterial agents generally used are based on silver, zinc, copper, etc. However, to our knowledge, few researchers have used hybrid nanoparticles to ensure antibacterial activity with biodegradable polymers. Also, we will exploit visible light to improve the antibacterial effectiveness of the fabric, which differs from the traditional contact mode of killing bacteria and represents an innovation in active protective equipment.
Finally, this work will allow for the innovation of new antibacterial textile materials through a simple and ecological method.
Keywords: protective equipment, antibacterial textile materials, biodegradable polymer, electrospinning, hybrid antibacterial nanoparticles
Procedia PDF Downloads 82
122 Medical Workforce Knowledge of Adrenaline (Epinephrine) Administration in Anaphylaxis in Adults Considerably Improved with Training in a UK Hospital from 2010 to 2017
Authors: Jan C. Droste, Justine Burns, Nithin Narayan
Abstract:
Introduction: Life-threatening detrimental effects of inappropriate adrenaline (epinephrine) administration, e.g., by giving the wrong dose, in the context of anaphylaxis management is well documented in the medical literature. Half of the fatal anaphylactic reactions in the UK are iatrogenic, and the median time to a cardio-respiratory arrest can be as short as 5 minutes. It is therefore imperative that hospital doctors of all grades have active and accurate knowledge of the correct route, site, and dosage of administration of adrenaline. Given this time constraint and the potential fatal outcome with inappropriate management of anaphylaxis, it is alarming that surveys over the last 15 years have repeatedly shown only a minority of doctors to have accurate knowledge of adrenaline administration as recommended by the UK Resuscitation Council guidelines (2008 updated 2012). This comparison of survey results of the medical workforce over several years in a small NHS District General Hospital was conducted in order to establish the effect of the employment of multiple educational methods regarding adrenaline administration in anaphylaxis in adults. Methods: Between 2010 and 2017, several education methods and tools were used to repeatedly inform the medical workforce (doctors and advanced clinical practitioners) in a single district general hospital regarding the treatment of anaphylaxis in adults. Whilst the senior staff remained largely the same cohort, junior staff had changed fully in every survey. Examples included: (i) Formal teaching -in Grand Rounds; during the junior doctors’ induction process; advanced life support courses (ii) In-situ simulation training performed by the clinical skills simulation team –several ad hoc sessions and one 3-day event in 2017 visiting 16 separate clinical areas performing an acute anaphylaxis scenario using actors- around 100 individuals from multi-disciplinary teams were involved (iii) Hospital-wide distribution of the simulation event via the Trust’s Simulation Newsletter (iv) Laminated algorithms were attached to the 'crash trolleys' (v) A short email 'alert' was sent to all medical staff 3 weeks prior to the survey detailing the emergency treatment of anaphylaxis (vi) In addition, the performance of the surveys themselves represented a teaching opportunity when gaps in knowledge could be addressed. Face to face surveys were carried out in 2010 ('pre-intervention), 2015, and 2017, in the latter two occasions including advanced clinical practitioners (ACP). All surveys consisted of convenience samples. If verbal consent to conduct the survey was obtained, the medical practitioners' answers were recorded immediately on a data collection sheet. Results: There was a sustained improvement in the knowledge of the medical workforce from 2010 to 2017: Answers improved regarding correct drug by 11% (84%, 95%, and 95%); the correct route by 20% (76%, 90%, and 96%); correct site by 40% (43%, 83%, and 83%) and the correct dose by 45% (27%, 54%, and 72%). Overall, knowledge of all components -correct drug, route, site, and dose-improved from 13% in 2010 to 62% in 2017. Conclusion: This survey comparison shows knowledge of the medical workforce regarding adrenaline administration for treatment of anaphylaxis in adults can be considerably improved by employing a variety of educational methods.Keywords: adrenaline, anaphylaxis, epinephrine, medical education, patient safety
Procedia PDF Downloads 129
121 Experimental Study of the Antibacterial Activity and Modeling of Non-isothermal Crystallization Kinetics of Sintered Seashell Reinforced Poly(Lactic Acid) and Poly(Butylene Succinate) Biocomposites Planned for 3D Printing
Authors: Mohammed S. Razali, Kamel Khimeche, Dahah Hichem, Ammar Boudjellal, Djamel E. Kaderi, Nourddine Ramdani
Abstract:
The use of additive manufacturing technologies has revolutionized various aspects of our daily lives. In particular, 3D printing has greatly advanced biomedical applications. While fused filament fabrication (FFF) technologies have made it easy to produce or prototype various medical devices, it is crucial to minimize the risk of contamination. New materials with antibacterial properties, such as those containing compounded silver nanoparticles, have emerged on the market. In a previous study, we prepared a newly sintered seashell filler (SSh) from bio-based seashells found along the Mediterranean coast using a suitable heat treatment process. We then prepared a series of polylactic acid (PLA) and polybutylene succinate (PBS) biocomposites filled with these SSh particles using a melt mixing technique with a twin-screw extruder, to use them as feedstock filaments for 3D printing. The study consisted of two parts: in the first, the antibacterial activity of the newly prepared biocomposites made of PLA and PBS reinforced with sintered seashell was evaluated; in the second, an experimental and modeling analysis of the non-isothermal crystallization kinetics of these biocomposites was carried out. In the first part, the bactericidal activity of the biocomposites was examined against three different bacteria: the Gram-negative bacteria E. coli and Pseudomonas aeruginosa and the Gram-positive bacterium Staphylococcus aureus. The PLA-based biocomposite containing 20 wt.% SSh particles exhibited inhibition zones with radial diameters of 8 mm and 6 mm against E. coli and Pseudomonas aeruginosa, respectively, while no antibacterial activity was observed against Staphylococcus aureus. In the second part, the focus was on investigating the effect of the sintered seashell filler particles on the non-isothermal crystallization kinetics of PLA and PBS 3D-printing composite materials. The objective was to understand the impact of the filler particles on the crystallization mechanism of both PLA and PBS during the cooling process of a melt-extruded filament in FFF, in order to manage the dimensional accuracy and mechanical properties of the final printed part. We conducted a non-isothermal melt crystallization kinetic study of a series of PLA-SSh and PBS-SSh composites using differential scanning calorimetry at various cooling rates. We analyzed the obtained kinetic data using different crystallization kinetic models, such as the modified Avrami, Ozawa, and Mo methods. In dynamic mode, which describes the relative crystallinity as a function of temperature, the crystallization half-time (t1/2) of neat PLA decreased from 17 min to 7.3 min for PLA + 5 wt.% SSh, and the t1/2 of virgin PBS was reduced from 3.5 min to 2.8 min for the composite containing 5 wt.% SSh. We found that the stearic acid-coated SSh particles acted as nucleating agents and showed nucleation activity, as observed through polarized optical microscopy. Moreover, we evaluated the effective energy barrier of the non-isothermal crystallization process using the isoconversional methods of Flynn-Wall-Ozawa (F-W-O) and Kissinger-Akahira-Sunose (K-A-S). The study provides significant insights into the crystallization behavior of PLA and PBS biocomposites.
Keywords: Avrami model, bio-based reinforcement, DSC, Gram-negative bacteria, Gram-positive bacteria, isoconversional methods, non-isothermal crystallization kinetics, poly(butylene succinate), poly(lactic acid), antibacterial activity
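For readers unfamiliar with the kinetic models named in this abstract, a minimal sketch of the standard textbook forms (general literature equations, not restated by the authors here) is:

% Relative crystallinity from the DSC exotherm at cooling rate \beta,
% with time recovered from temperature as t = (T_0 - T)/\beta:
X(T) = \frac{\int_{T_0}^{T} (\mathrm{d}H_c/\mathrm{d}T)\,\mathrm{d}T}{\int_{T_0}^{T_\infty} (\mathrm{d}H_c/\mathrm{d}T)\,\mathrm{d}T}

% Avrami form used to extract the exponent n and rate constant Z_t
% (t_{1/2} is the time at which X = 0.5):
1 - X(t) = \exp\!\left(-Z_t\, t^{\,n}\right)

% Kissinger-Akahira-Sunose (K-A-S) isoconversional estimate of the
% effective energy barrier E_a at a fixed conversion:
\ln\!\left(\frac{\beta}{T^{2}}\right) = \mathrm{const} - \frac{E_a}{R\,T}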
Procedia PDF Downloads 81
120 Effect of Metarhizium robertsii in Rhipicephalus microplus hemocytes
Authors: Jessica P. Fiorotti, Maria C. Freitas, Caio J. B. Coutinho-Rodrigues, Mariana G. Camargo, Emily S. Mesquita, Amanda R. C. Corval, Ricardo O. B. Bitencourt, Allan F. Marciano, Diva D. Spadacci-Morena, Patricia S. Golo, Isabele C. Angelo, Vania R. E. P. Bittencourt
Abstract:
The bovine tick, Rhipicephalus microplus, is an arthropod of great importance in veterinary medicine, leading to anemia, weight loss, and depreciation of the animals' leather, and also acting as a vector of many pathogens. This parasitism causes a loss of 3.24 billion dollars per year in Brazil. Entomopathogenic fungi are known to act as natural controllers of some arthropods, acting mainly by active penetration through the cuticle; however, they can also act on the hemolymph and through the production of mycotoxins. Hemocytes are responsible for the cellular immune response and participate in the processes of phagocytosis, nodulation and encapsulation, and may undergo changes when challenged by pathogens. The aim of the present study was to evaluate changes in R. microplus hemocytes after inoculation of Metarhizium robertsii using transmission electron microscopy. The isolate ARSEF 2575 and 200 engorged R. microplus females were used. The groups were divided into a control group, in which the females were inoculated with 5 μL of a sterile distilled water and 0.1% Tween 80 solution, and a group inoculated with 5 μL of fungal suspension at a concentration of 10⁷ conidia mL⁻¹. The experiment was performed in duplicate, and each group contained 50 females. Twenty-four hours after fungal inoculation, hemolymph was collected through perforation of the dorsal surface of the cuticle of the tick females. After collection, the hemolymph samples were centrifuged at 500 x g for 3 minutes at 4 °C, the plasma was discarded, and the hemocyte pellet was resuspended in 50 μl of PBS. The suspended material was fixed in 2% glutaraldehyde in Millonig buffer for three hours. After fixation, the material was centrifuged at 500 x g for 3 minutes, the supernatant was discarded, and the cells were resuspended in a wash solution. Subsequently, the cells were post-fixed with 1% osmium tetroxide in phosphate buffer for one hour at room temperature, dehydrated in increasing concentrations of ethanol, and then embedded in Epon resin. The ultrathin sections were examined under a LEO EM 906E transmission electron microscope at 80 kV. The ultrastructural results revealed that in the control group the cells were intact: granulocytes were observed with granules of different electrodensities, intact mitochondria and cytoplasm without vacuolization. In addition, granulocytes showed plasma membrane projections similar to pseudopodia. Plasmatocytes presented as irregularly shaped cells with an eccentric nucleus and agranular cytoplasm, and some cells presented pseudopodia. In the group exposed to the fungus, by contrast, most of the cells were degenerating. The granulocytes had fewer granules and more vacuoles in the cytoplasm. Plasmatocytes after treatment also presented many vacuoles in the cytoplasm, and the lysosomes contained a great amount of electrodense material. Thus, the results suggest that the fungus has a depressant action on the immune system of the tick, not only through cell degranulation but also by inducing morphological changes in the hemocytes, which may even trigger processes such as phagocytosis.
Keywords: bovine tick, cellular defense, entomopathogenic fungi, immune response
Procedia PDF Downloads 189
119 A Comparative Study of Efficacy and Safety of Salicylic Acid, Trichloroacetic Acid and Glycolic Acid in Various Facial Melanosis
Authors: Shivani Dhande, Sanjiv Choudhary, Adarshlata Singh
Abstract:
Introduction: Chemical peeling is a popular, relatively inexpensive, and generally safe day procedure for the treatment of pigmentary skin disorders and for skin rejuvenation. Chemical peels are classified by their depth of action into superficial, medium, and deep peels. Various facial pigmentary conditions have a significant impact on quality of life, causing psychological stress and necessitating safe and effective treatment. Aim & Objectives: To compare the efficacy of salicylic acid, trichloroacetic acid and glycolic acid in facial melanosis (melasma, photomelanosis and post-acne pigmentation), and to study the side effects of the above-mentioned peeling agents. Method and Materials: This was a randomized, parallel-control, single-blind study consisting of a total of 36 cases, 12 cases each of melasma, photomelanosis and post-acne pigmentation, within the age group 20-50 years and having Fitzpatrick skin type 4. Wood's lamp examination was done to confirm the type of melasma. Patients with a keloidal tendency, active herpes infection or a past history of hypersensitivity to salicylic acid, trichloroacetic acid or glycolic acid, as well as patients on systemic isotretinoin, were excluded. Clinical photographs were taken at the beginning of therapy and then serially to assess the clinical response. Prior to application, a written informed consent was obtained. A post-auricular test peel was performed. Patients were divided into 3 groups, containing 12 patients each of melasma, photomelanosis and post-acne pigmentation. All three peels, SA 20% (done once in 2 weeks), GA 50% (done once in 3 weeks) and TCA 15% (done once in 3 weeks), were used, with a total of six sittings for each patient. Before application of the peel, patients were counseled to wash the face with soap and water. The face was then dried and cleaned with spirit and acetone to remove all cutaneous oils. GA, TCA and SA were applied with cotton buds/gauze with mild strokes. After a contact period of 5-10 minutes, neutralization was done with cold water. Post-peel topical sunscreen application was mandatory. MASI was used pre- and post-treatment to assess melasma. The Investigator's Global Improvement Scale for overall hyperpigmentation (4 - significant, 3 - moderate, 2 - mild, 1 - minimal, 0 - no change) and a patient satisfaction grading scale (>70% - excellent response, 50-70% - good response, <50% - average response) were used to assess improvement in all three facial melanoses. Results: In our study of 12 patients with melasma, 4 (33.33%) patients showed excellent results: 3 (25%) with GA and 1 (8.33%) with TCA. A good response was seen in 4 (33.33%) patients: 1 (8.33%) each for GA and SA and 2 (16.66%) for TCA. A poor response was seen in 4 (33.33%) patients: 1 (8.33%) for TCA and 3 (25%) for SA. Of 12 patients with photomelanosis, excellent results were seen in 3 (25%) patients treated with TCA. A good response was seen in 4 (33.33%) patients: 1 (8.33%) each for TCA and SA and 2 (16.66%) for GA. A poor response was seen in 5 (41.66%) patients: 3 (25%) for SA and 2 (16.66%) for GA. Of 12 patients with post-acne pigmentation, an excellent response was seen in 3 (25%) patients: 2 (16.66%) with SA and 1 (8.33%) with TCA. A good response was seen in 5 (41.66%) patients: 2 (16.66%) each with SA and GA and 1 (8.33%) with TCA. A poor response was seen in 4 (33.33%) patients: 2 (16.66%) each for SA and TCA. No major side effects in the form of scarring or persistent pigmentation were seen. Transient blackening of the skin with a burning sensation was seen in cases treated with TCA and SA. Post-procedural itching and redness were noted with the GA peel.
Conclusion: In our study, GA (50%), TCA (15%) and SA (20%) peels showed excellent responses in melasma, photomelanosis and post-acne pigmentation, respectively. All three peeling agents were well tolerated without any significant side effects at the above-specified concentrations.
Keywords: facial melanosis, glycolic acid, salicylic acid, trichloroacetic acid
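As a point of reference for the MASI scoring mentioned in this abstract, the commonly used formula from the dermatology literature (standard form and notation, not restated by the authors here) combines darkness (D, 0-4), homogeneity (H, 0-4) and area (A, 0-6) scores for the forehead (f), left malar (lm), right malar (rm) and chin (c) regions:

% Regional weights are 30% forehead, 30% per malar region and 10% chin (maximum score 48).
\mathrm{MASI} = 0.3\,A_f(D_f + H_f) + 0.3\,A_{lm}(D_{lm} + H_{lm}) + 0.3\,A_{rm}(D_{rm} + H_{rm}) + 0.1\,A_c(D_c + H_c)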
Procedia PDF Downloads 260
118 Addressing the Gap in Health and Wellbeing Evidence for Urban Real Estate Brownfield Asset Management Social Needs and Impact Analysis Using Systems Mapping Approach
Authors: Kathy Pain, Nalumino Akakandelwa
Abstract:
The study explores the potential to fill a gap in health and wellbeing evidence for purposeful urban real estate asset management to make investment a powerful force for societal good. Part of a five-year programme investigating the root causes of unhealthy urban development funded by the United Kingdom Prevention Research Partnership (UKPRP), the study pilots the use of a systems mapping approach to identify drivers and barriers to the incorporation of health and wellbeing evidence in urban brownfield asset management decision-making. Urban real estate not only provides space for economic production but also contributes to the quality of life in the local community. Yet market approaches to urban land use have, until recently, insisted that neo-classical technology-driven efficient allocation of economic resources should inform acquisition, operational, and disposal decisions. Buildings in locations with declining economic performance have thus been abandoned, leading to urban decay. Property investors are recognising the inextricable connection between sustainable urban production and quality of life in local communities. The redevelopment and operation of brownfield assets recycle existing buildings, minimising embodied carbon emissions. It also retains established urban spaces with which local communities identify and regenerate places to create a sense of security, economic opportunity, social interaction, and quality of life. Social implications of urban real estate on health and wellbeing and increased adoption of benign sustainability guidance in urban production are driving the need to consider how they affect brownfield real estate asset management decisions. Interviews with real estate upstream decision-makers in the study, find that local social needs and impact analysis is becoming a commercial priority for large-scale urban real estate development projects. Evidence of the social value-added of proposed developments is increasingly considered essential to secure local community support and planning permissions, and to attract sustained inward long-term investment capital flows for urban projects. However, little is known about the contribution of population health and wellbeing to socially sustainable urban projects and the monetary value of the opportunity this presents to improve the urban environment for local communities. We report early findings from collaborations with two leading property companies managing major investments in brownfield urban assets in the UK to consider how the inclusion of health and wellbeing evidence in social valuation can inform perceptions of brownfield development social benefit for asset managers, local communities, public authorities and investors for the benefit of all parties. Using holistic case studies and systems mapping approaches, we explore complex relationships between public health considerations and asset management decisions in urban production. Findings indicate a strong real estate investment industry appetite and potential to include health as a vital component of sustainable real estate social value creation in asset management strategies.Keywords: brownfield urban assets, health and wellbeing, social needs and impact, social valuation, sustainable real estate, systems mapping
Procedia PDF Downloads 70
117 Analyzing Spatio-Structural Impediments in the Urban Trafficscape of Kolkata, India
Authors: Teesta Dey
Abstract:
Integrated transport development with proper traffic management leads to sustainable growth of any urban sphere. Appropriate mass transport planning is essential for populous cities in third-world countries like India. The exponential growth of motor vehicles on an unplanned road network is now a common feature of major urban centres in India. Kolkata, the third largest mega city in India, is no exception. The imbalance between demand and supply of unplanned transport services in this city is manifested in the high economic and environmental costs borne by the associated society. With the passage of time, the growth and extent of passenger demand for rapid urban transport have outstripped proper infrastructural planning and caused severe transport problems in the overall urban realm. Hence, Kolkata stands out as one of the most crisis-ridden metropolises in the world. The urban transport crisis of this city involves severe traffic congestion, disparity in mass transport services across changing peripheral land uses, route overlapping, lowered travel speeds and faulty implementation of governmental plans, mostly induced by the rapid growth of private vehicles on limited road space with a huge carbon footprint. Therefore, the paper will critically analyze the extant road network pattern for improving regional connectivity and accessibility, assess the degree of congestion, identify the deviation from the demand-supply balance and finally evaluate the emerging alternative transport options promoted by the government. For this purpose, the linear, nodal and spatial transport networks have been assessed based on selected indices, viz. Road Degree, Traffic Volume, Shimbel Index, Direct Bus Connectivity, Average Travel and Waiting Time Indices, Route Variety, Service Frequency, Bus Intensity, Concentration Analysis, Delay Rate, Quality of Traffic Transmission, Lane Length Duration Index and Modal Mix. A total of 20 Traffic Intersection Points (TIPs) have been selected for the measurement of nodal accessibility. Critical Congestion Zones (CCZs) are delineated based on one-km buffer zones around each TIP for congestion pattern analysis. A total of 480 bus routes are assessed to identify deficiencies in network planning. Apart from bus services, the combined effects of other mass and para-transit modes, comprising metro rail, auto, cab and ferry services, are also analyzed. Based on a systematic random sampling method, the perceptions of a total of 1,500 daily urban passengers were studied to check the ground realities. The outcome of this research identifies the spatial disparity among the 15 boroughs of the city, with severe route overlapping and congestion problems. Mass transport services based in north and central Kolkata exceed the transport strength of south and peripheral Kolkata. Faulty infrastructural conditions, service inadequacy, economic loss and workers' inefficiency are the most dominant reasons behind the defective mass transport network plan. Hence, there is an urgent need to revive the extant road-based mass transport system of this city through a holistic management approach: upgrading traffic infrastructure, designing new roads, better cooperation among different mass transport agencies, better coordination of transport and changing land-use policies, a large increase in funding and, finally, general passengers' awareness.
Keywords: carbon footprint, critical congestion zones, direct bus connectivity, integrated transport development
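To illustrate one of the nodal accessibility measures named in this abstract, the following is a minimal sketch (not the authors' code; the toy intersection graph and function name are invented for illustration) of how a Shimbel index, the sum of shortest-path distances from one intersection to all others, can be computed for a small unweighted road network:

from collections import deque

def shimbel_index(adjacency, source):
    """Sum of shortest-path distances (in edge counts) from `source`
    to every other reachable node of an unweighted road-network graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbour in adjacency.get(node, []):
            if neighbour not in dist:
                dist[neighbour] = dist[node] + 1
                queue.append(neighbour)
    # Lower totals indicate better-connected (more accessible) intersections.
    return sum(d for n, d in dist.items() if n != source)

# Toy network of five traffic intersection points (TIPs), purely illustrative.
road_network = {
    "TIP1": ["TIP2", "TIP3"],
    "TIP2": ["TIP1", "TIP4"],
    "TIP3": ["TIP1", "TIP4"],
    "TIP4": ["TIP2", "TIP3", "TIP5"],
    "TIP5": ["TIP4"],
}

for tip in road_network:
    print(tip, shimbel_index(road_network, tip))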
Procedia PDF Downloads 273
116 Application of the Pattern Method to Form the Stable Neural Structures in the Learning Process as a Way of Solving Modern Problems in Education
Authors: Liudmyla Vesper
Abstract:
The problems of modern education are large-scale and diverse. The aspirations of parents, teachers, and experts converge: everyone is interested in raising a generation of well-rounded, well-educated persons. Both the family and society expect the future generation to be self-sufficient, desirable in the labor market, and capable of lifelong learning. Today's children have a powerful potential that is difficult to realize under traditional school approaches. In practice, a focus on STEM education often ends with the simple use of computers and gadgets during class. "Science", "technology", "engineering" and "mathematics" are difficult to combine within school and university curricula, which have not changed much during the last 10 years. Solving the problems of modern education largely depends on teachers: innovators and practitioners who develop and implement effective educational methods and programs, and who propose innovative pedagogical practices that allow students to master large-scale knowledge and apply it on the practical plane. Effective education considers the creation of stable neural structures during the learning process, which allows knowledge to be preserved and increased throughout life. The author proposes a method of integrated lesson-cases based on mathematical patterns for forming a holistic perception of the world. This method and program are scientifically substantiated and have more than 15 years of practical application experience in school and university classrooms. The first results of the practical application of the author's methodology and curriculum were announced at the International Conference "Teaching and Learning Strategies to Promote Elementary School Success", April 22-23, 2006, Yerevan, Armenia, within the IREX-administered 2004-2006 Multiple Component Education Project. The program is based on the concept of interdisciplinary connections and their implementation in the process of continuous learning. This allows students to preserve and increase knowledge throughout life according to a single pattern. The pattern principle stores information on different subjects according to one scheme (pattern), using long-term memory; this is how stable neural structures are created. The author also suggests that a similar method could be successfully applied to the training of artificial intelligence neural networks. However, this assumption requires further research and verification. The educational method and program proposed by the author meet the modern requirements for education, which involve mastering various areas of knowledge starting from an early age. This approach makes it possible to engage the child's cognitive potential as fully as possible and direct it to the preservation and development of individual talents. According to the methodology, at the early stages of learning, students come to understand the connections between school subjects (the so-called "sciences" and "humanities") and real life, and apply the knowledge gained in practice. This approach allows students to realize their natural creative abilities and talents, which makes it easier to navigate professional choices and find their place in life.
Keywords: science education, maths education, AI, neuroplasticity, innovative education problem, creativity development, modern education problem
Procedia PDF Downloads 63
115 Efficacy and Safety of Sublingual Sufentanil for the Management of Acute Pain
Authors: Neil Singla, Derek Muse, Karen DiDonato, Pamela Palmer
Abstract:
Introduction: Pain is the most common reason people visit emergency rooms. Studies indicate, however, that Emergency Department (ED) physicians often do not provide adequate analgesia to their patients as a result of gender and age bias, opiophobia and insufficient knowledge of, and formal training in, acute pain management. Novel classes of analgesics have recently been introduced, but many patients suffer from acute pain in settings where the availability of intravenous (IV) access may be limited, so there remains a clinical need for rapid-acting, potent analgesics that do not require an invasive route of delivery. A sublingual sufentanil tablet (SST), dispensed using a single-dose applicator, is in development for the treatment of moderate-to-severe acute pain in a medically supervised setting. Objective: The primary objective of this study was to demonstrate the repeat-dose efficacy, safety and tolerability of sufentanil 20 mcg and 30 mcg sublingual tablets compared to placebo for the management of acute pain, as determined by the time-weighted sum of pain intensity differences (SPID) to baseline over the 12-hour study period (SPID12). Key secondary efficacy variables included SPID over the first hour (SPID1), total pain relief over the 12-hour study period (TOTPAR12), time to perceived pain relief (PR) and time to meaningful PR. Safety variables consisted of adverse events (AE), vital signs, oxygen saturation and early termination. Methods: In this Phase 2, double-blind, dose-finding study, an equal number of male and female patients were randomly assigned in a 2:2:1 ratio to SST 20 mcg, SST 30 mcg or placebo, respectively, following bunionectomy. Study drug was dosed as needed, but not more frequently than hourly. Rescue medication was available as needed. The primary endpoint was the summed pain intensity difference to baseline over 12 hours (SPID12). Safety was assessed by continuous oxygen saturation monitoring and adverse event reporting. Results: 101 patients (51 male/50 female) were randomized, 100 received study treatment (intent-to-treat [ITT] population), and 91 completed the study. Reasons for early discontinuation were lack of efficacy (6), adverse events (2) and drug-dosing error (1). Mean age was 42.5 years. For the ITT population, SST 30 mcg was superior to placebo (p=0.003) for SPID12. SPID12 scores in the active groups were superior for both male (ANOVA overall p-value=0.038) and female (ANOVA overall p-value=0.005) patients. Statistically significant differences in favour of sublingual sufentanil were also observed between the SST 30 mcg and placebo groups for SPID1 (p<0.001), TOTPAR12 (p=0.002), time to perceived PR (p=0.023) and time to meaningful PR (p=0.010). Nausea, vomiting and somnolence were more frequent in the sufentanil groups, but there were no significant differences between treatment arms in the proportion of patients who terminated prematurely due to AEs or inadequate analgesia. Conclusions: Sufentanil tablets dispensed sublingually using a single-dose applicator are in development for the treatment of patients with moderate-to-severe acute pain in medically supervised settings where immediate IV access is limited. When administered sublingually, sufentanil's pharmacokinetic profile and non-invasive delivery make it a useful alternative to IM or IV dosing. Keywords: acute pain, pain management, sublingual, sufentanil
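For readers unfamiliar with the primary endpoint, the sketch below shows one common way a time-weighted SPID can be computed. The pain scale, assessment times and scores are hypothetical; this is not the study's actual statistical code.

```python
# Illustrative time-weighted SPID: each pain-intensity difference from baseline is
# weighted by the time elapsed since the previous assessment (hours).
def time_weighted_spid(times_h, scores, baseline):
    spid = 0.0
    prev_t = 0.0
    for t, score in zip(times_h, scores):
        spid += (baseline - score) * (t - prev_t)
        prev_t = t
    return spid

# Hypothetical patient: baseline pain 8 on a 0-10 scale, assessments over 12 h.
times = [0.5, 1, 2, 4, 8, 12]
scores = [7, 5, 4, 4, 5, 6]
print(time_weighted_spid(times, scores, baseline=8))  # SPID12 for this patient
```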
Procedia PDF Downloads 356
114 Human-Carnivore Interaction: Patterns, Causes and Perceptions of Local Herders of Hoper Valley in Central Karakoram National Park, Pakistan
Authors: Saeed Abbas, Rahilla Tabassum, Haider Abbas, Babar Khan, Shahid Hussain, Muhammad Zafar Khan, Fazal Karim, Yawar Abbas, Rizwan Karim
Abstract:
Human–carnivore conflict is considered to be a major conservation and rural livelihood concern because many carnivore species have been heavily victimized due to elevated conflict levels with communities. Like other snow leopard range countries, this situation prevails in Pakistan, where WWF is currently working under Asia High Mountain Project (AHMP) in Gilgit-Baltistan of Pakistan. To mitigate such conflicts requires a firm understanding of grazing and predation pattern including human-carnivore interaction. For this purpose we conducted a survey in Hoper valley (one of the AHMP project sites in Pakistan), during August, 2013 through a questionnaire based survey and unstructured interviews covering 647 households, permanently residing in the project area out of the total 900 households. The valley, spread over 409 km2 between 36°7'46" N and 74°49'2"E, at 2900m asl in Karakoram ranges is considered to be one of an important habitat of snow leopard and associated prey species such as Himalayan ibex. The valley is home of 8100 Brusho people (ancient tribe of Northern Pakistan) dependent on agro-pastoral livelihoods including farming and livestock rearing. The total number of livestock reported were (N=15,481) out of which 8346 (53.91%) were sheep, 3546 (22.91%) goats, 2193 (14.16%) cows, 903 (5.83%) yaks, 508 (3.28%) bulls, 28 (0.18%) donkeys, 27 (0.17%) zo/zomo (cross breed of yak and cow), and 4 (0.03%) horses. 83 percent respondent (n=542 households) confirmed loss of their livestock during the last one year July, 2012 to June, 2013 which account for 2246 (14.51%) animals. The major reason of livestock loss include predation by large carnivores such as snow leopards and wolf (1710, 76.14%) followed by diseases (536, 23.86%). Of the total predation cases snow leopard is suspected to kill 1478 animals (86.43%). Among livestock sheep were found to be the major prey of snow leopard (810, 55%) followed by goats (484, 32.7%) cows (151, 10.21%), yaks (15, 1.015%), zo/zomo (7, 0.5%) and donkey (1, 0.07%). The reason for the mass depredation of sheep and goats is that they tend to browse on twigs of bushes and graze on soft grass near cliffs. They are also considered to be very active as compared to other species in moving quickly and covering more grazing area. This makes them more vulnerable to snow leopard attack. The majority (1283, 75%) of livestock killed by predators occurred during the warm season (May-September) in alpine and sub-alpine pastures and remaining (427, 25%) occurred in the winter season near settlements in valley. It was evident from the recent study that Snow leopard kills outside the pen were (1351, 79.76%) as compared to inside pen (359, 20.24%). Assessing the economic loss of livestock predation we found that the total loss of livestock predation in the study area is equal to PKR 11,230,000 (USD 105,797), which is about PRK 17, 357 (USD 163.51) per household per year. Economic loss incurred by the locals due to predation is quite significant where the average cash income per household per year is PKR 85,000 (USD 800.75).Keywords: carnivores, conflict, predation, livelihood, conservation, rural, snow leopard, livestock
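The per-household loss figure reported above can be reproduced from the stated totals; the short sketch below shows the arithmetic, with the exchange rate implied by the reported PKR and USD amounts.

```python
# Reproducing the per-household predation loss from the abstract's reported totals.
total_loss_pkr = 11_230_000              # total value of predated livestock, PKR
households = 647                         # surveyed households
pkr_per_usd = total_loss_pkr / 105_797   # exchange rate implied by the reported USD total

per_household_pkr = total_loss_pkr / households
print(round(per_household_pkr))                    # ~17,357 PKR per household per year
print(round(per_household_pkr / pkr_per_usd, 2))   # ~163.5 USD per household per year
```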
Procedia PDF Downloads 347
113 Embodied Empowerment: A Design Framework for Augmenting Human Agency in Assistive Technologies
Authors: Melina Kopke, Jelle Van Dijk
Abstract:
Persons with cognitive disabilities, such as Autism Spectrum Disorder (ASD) are often dependent on some form of professional support. Recent transformations in Dutch healthcare have spurred institutions to apply new, empowering methods and tools to enable their clients to cope (more) independently in daily life. Assistive Technologies (ATs) seem promising as empowering tools. While ATs can, functionally speaking, help people to perform certain activities without human assistance, we hold that, from a design-theoretical perspective, such technologies often fail to empower in a deeper sense. Most technologies serve either to prescribe or to monitor users’ actions, which in some sense objectifies them, rather than strengthening their agency. This paper proposes that theories of embodied interaction could help formulating a design vision in which interactive assistive devices augment, rather than replace, human agency and thereby add to a persons’ empowerment in daily life settings. It aims to close the gap between empowerment theory and the opportunities provided by assistive technologies, by showing how embodiment and empowerment theory can be applied in practice in the design of new, interactive assistive devices. Taking a Research-through-Design approach, we conducted a case study of designing to support independently living people with ASD with structuring daily activities. In three iterations we interlaced design action, active involvement and prototype evaluations with future end-users and healthcare professionals, and theoretical reflection. Our co-design sessions revealed the issue of handling daily activities being multidimensional. Not having the ability to self-manage one’s daily life has immense consequences on one’s self-image, and also has major effects on the relationship with professional caregivers. Over the course of the project relevant theoretical principles of both embodiment and empowerment theory together with user-insights, informed our design decisions. This resulted in a system of wireless light units that users can program as a reminder for tasks, but also to record and reflect on their actions. The iterative process helped to gradually refine and reframe our growing understanding of what it concretely means for a technology to empower a person in daily life. Drawing on the case study insights we propose a set of concrete design principles that together form what we call the embodied empowerment design framework. The framework includes four main principles: Enabling ‘reflection-in-action’; making information ‘publicly available’ in order to enable co-reflection and social coupling; enabling the implementation of shared reflections into an ‘endurable-external feedback loop’ embedded in the persons familiar ’lifeworld’; and nudging situated actions with self-created action-affordances. In essence, the framework aims for the self-development of a suitable routine, or ‘situated practice’, by building on a growing shared insight of what works for the person. The framework, we propose, may serve as a starting point for AT designers to create truly empowering interactive products. In a set of follow-up projects involving the participation of persons with ASD, Intellectual Disabilities, Dementia and Acquired Brain Injury, the framework will be applied, evaluated and further refined.Keywords: assistive technology, design, embodiment, empowerment
Procedia PDF Downloads 279
112 Azadirachta indica Derived Protein Encapsulated Novel Guar Gum Nanocapsules against Colon Cancer
Authors: Suman Chaudhary, Rupinder K. Kanwar, Jagat R. Kanwar
Abstract:
Azadirachta indica, also known as Neem and belonging to the mahogany family, is actively gaining interest in modern-day medicine due to its extensive applications in traditional systems of medicine such as Ayurveda and Unani. More than 140 phytochemicals have been extracted from neem leaves, seeds, bark and flowers for agro-medicinal applications. Among the various components, neem leaf protein (NLP) is currently the most investigated active ingredient, due to its immunomodulatory activities against tumor growth. However, these therapeutic ingredients of neem are susceptible to degradation and cannot withstand the drastic pH changes of the physiological environment; therefore, there is an urgent need for an alternative strategy, such as a nano-delivery system, to exploit its medicinal benefits. This study hypothesizes that guar gum (GG)-derived, biodegradable nano-carrier-based encapsulation of NLP will improve its stability, specificity and sensitivity, thus facilitating targeted anti-cancer therapeutics. GG is a galactomannan derived from the endosperm of guar bean seeds. Synthesis of guar nanocapsules (NCs) was performed using the nanoprecipitation technique, in which NLP was encapsulated in GG. Preliminary experiments conducted to characterize the NCs confirmed spherical morphology with a narrow size distribution of 30-40 nm. Differential scanning calorimetry (DSC) validated the stability of these NCs even in the temperature range of 50-60°C, which is well within physiological and storage conditions. Thermogravimetric analysis (TGA) indicated a high decomposition temperature of these NCs, ranging up to 350°C. Additionally, Fourier transform infrared spectroscopy (FTIR) and SDS-PAGE data confirmed the successful encapsulation of NLP in the NCs. The anti-cancer therapeutic property of the NCs was tested on colon cancer cells (Caco-2), as colon cancer is one of the most prevalent forms of cancer. The NCs (both NLP-loaded and void) were also tested on human intestinal epithelial cells (FHs 74) to evaluate their effect on normal cells. Cytotoxicity evaluation of the NCs in these cell lines confirmed that the IC50 for NLP in FHs 74 cells was ~2-fold higher than in Caco-2 cells, indicating that this nanoformulation system possessed biocompatible anti-cancer properties. Immunoconfocal microscopy analysis confirmed the time-dependent internalization of the NCs within 6 h. Recent findings using Annexin V and PI staining indicated a significant increase (p ≤ 0.001) in the early and late apoptotic cell populations upon treatment with the NCs, signifying the role of NLP in inducing apoptosis in Caco-2 cells. This was further validated by Western blot, polymerase chain reaction (PCR) and fluorescence-activated cell sorting (FACS)-aided protein expression analyses, which showed downregulation of survivin, an anti-apoptotic marker, and upregulation of the Bax/Bcl-2 ratio (a pro-apoptotic indicator). Further, treatment with both the NLP NCs and unencapsulated NLP destabilized the mitochondrial membrane potential, subsequently facilitating the release of the pro-apoptotic caspase cascade initiator, cytochrome c.
Future studies will focus on conferring cancer-cell specificity on these NCs; together with a comprehensive analysis of the anti-cancer potential of this naturally occurring compound in different cancers and in vivo animal models, this will validate the clinical application of this unprecedented protein therapeutic. Keywords: anti-tumor, guar gum, nanocapsules, neem leaf protein
Procedia PDF Downloads 178
111 Widely Diversified Macroeconomies in the Super-Long Run Cast Doubt on the Path-Independent Equilibrium Growth Model
Authors: Ichiro Takahashi
Abstract:
One of the major assumptions of mainstream macroeconomics is the path independence of capital stock. This paper challenges this assumption by employing an agent-based approach. The simulation results showed the existence of multiple "quasi-steady state" equilibria of the capital stock, which may cast serious doubt on the validity of the assumption. The finding would give a better understanding of many phenomena that involve hysteresis, including the causes of poverty. The "market-clearing view" has been widely shared among major schools of macroeconomics. They understand that the capital stock, the labor force, and technology, determine the "full-employment" equilibrium growth path and demand/supply shocks can move the economy away from the path only temporarily: the dichotomy between the short-run business cycles and the long-run equilibrium path. The view then implicitly assumes the long-run capital stock to be independent of how the economy has evolved. In contrast, "Old Keynesians" have recognized fluctuations in output as arising largely from fluctuations in real aggregate demand. It will then be an interesting question to ask if an agent-based macroeconomic model, which is known to have path dependence, can generate multiple full-employment equilibrium trajectories of the capital stock in the super-long run. If the answer is yes, the equilibrium level of capital stock, an important supply-side factor, would no longer be independent of the business cycle phenomenon. This paper attempts to answer the above question by using the agent-based macroeconomic model developed by Takahashi and Okada (2010). The model would serve this purpose well because it has neither population growth nor technology progress. The objective of the paper is twofold: (1) to explore the causes of long-term business cycle, and (2) to examine the super-long behaviors of the capital stock of full-employment economies. (1) The simulated behaviors of the key macroeconomic variables such as output, employment, real wages showed widely diversified macro-economies. They were often remarkably stable but exhibited both short-term and long-term fluctuations. The long-term fluctuations occur through the following two adjustments: the quantity and relative cost adjustments of capital stock. The first one is obvious and assumed by many business cycle theorists. The reduced aggregate demand lowers prices, which raises real wages, thereby decreasing the relative cost of capital stock with respect to labor. (2) The long-term business cycles/fluctuations were synthesized with the hysteresis of real wages, interest rates, and investments. In particular, a sequence of the simulation runs with a super-long simulation period generated a wide range of perfectly stable paths, many of which achieved full employment: all the macroeconomic trajectories, including capital stock, output, and employment, were perfectly horizontal over 100,000 periods. Moreover, the full-employment level of capital stock was influenced by the history of unemployment, which was itself path-dependent. Thus, an experience of severe unemployment in the past kept the real wage low, which discouraged a relatively costly investment in capital stock. Meanwhile, a history of good performance sometimes brought about a low capital stock due to a high-interest rate that was consistent with a strong investment.Keywords: agent-based macroeconomic model, business cycle, hysteresis, stability
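The hysteresis mechanism described above can be illustrated with a deliberately simplified toy simulation; it is not the Takahashi and Okada (2010) model, and all parameters and functional forms are assumptions made for illustration. Two otherwise identical simulated economies, one of which experiences a temporary demand shock, settle at different long-run capital stocks because the real wage ratchets down during unemployment and discourages subsequent investment.

```python
# Toy path-dependence sketch (illustrative only; not the agent-based model above).
def simulate(early_slump: bool, periods: int = 5000):
    capital, wage = 100.0, 1.0
    for t in range(periods):
        demand = 0.6 if (early_slump and t < 200) else 1.0
        employment = demand                      # toy mapping: demand slump -> unemployment
        # wage hysteresis: the real wage ratchets down during unemployment
        # and does not recover afterwards
        wage -= 0.002 * (1.0 - employment)
        # a low real wage makes labour cheap relative to capital,
        # discouraging the relatively costly investment in capital stock
        investment = 5.0 * wage
        capital = 0.95 * capital + investment    # 5% depreciation per period
    return round(capital, 1), round(wage, 2)

print(simulate(early_slump=False))  # e.g. (100.0, 1.0): stable full-employment path
print(simulate(early_slump=True))   # lower quasi-steady-state capital stock, e.g. (84.0, 0.84)
```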
Procedia PDF Downloads 211
110 Geotechnical Challenges for the Use of Sand-Sludge Mixtures in Covers for the Rehabilitation of Acid-Generating Mine Sites
Authors: Mamert Mbonimpa, Ousseynou Kanteye, Élysée Tshibangu Ngabu, Rachid Amrou, Abdelkabir Maqsoud, Tikou Belem
Abstract:
The management of mine wastes (waste rocks and tailings) containing sulphide minerals such as pyrite and pyrrhotite represents the main environmental challenge for the mining industry. Indeed, acid mine drainage (AMD) can be generated when these wastes are exposed to water and air. AMD is characterized by low pH and high concentrations of heavy metals, which are toxic to plants, animals, and humans. It affects the quality of the ecosystem through water and soil pollution. Different techniques involving soil materials can be used to control AMD generation, including impermeable covers (compacted clays) and oxygen barriers. The latter group includes covers with capillary barrier effects (CCBE), a multilayered cover that include the moisture retention layer playing the role of an oxygen barrier. Once AMD is produced at a mine site, it must be treated so that the final effluent at the mine site complies with regulations and can be discharged into the environment. Active neutralization with lime is one of the treatment methods used. This treatment produces sludge that is usually stored in sedimentation ponds. Other sludge management alternatives have been examined in recent years, including sludge co-disposal with tailings or waste rocks, disposal in underground mine excavations, and storage in technical landfill sites. Considering the ability of AMD neutralization sludge to maintain an alkaline to neutral pH for decades or even centuries, due to the excess alkalinity induced by residual lime within the sludge, valorization of sludge in specific applications could be an interesting management option. If done efficiently, the reuse of sludge could free up storage ponds and thus reduce the environmental impact. It should be noted that mixtures of sludge and soils could potentially constitute usable materials in CCBE for the rehabilitation of acid-generating mine sites, while sludge alone is not suitable for this purpose. The high sludge water content (up to 300%), even after sedimentation, can, however, constitute a geotechnical challenge. Adding lime to the mixtures can reduce the water content and improve the geotechnical properties. The objective of this paper is to investigate the impact of the sludge content (30, 40 and 50%) in sand-sludge mixtures (SSM) on their hydrogeotechnical properties (compaction, shrinkage behaviour, saturated hydraulic conductivity, and water retention curve). The impact of lime addition (dosages from 2% to 6%) on the moisture content, dry density after compaction and saturated hydraulic conductivity of SSM was also investigated. Results showed that sludge adding to sand significantly improves the saturated hydraulic conductivity and water retention capacity, but the shrinkage increased with sludge content. The dry density after compaction of lime-treated SSM increases with the lime dosage but remains lower than the optimal dry density of the untreated mixtures. The saturated hydraulic conductivity of lime-treated SSM after 24 hours of cure decreases by 3 orders of magnitude. Considering the hydrogeotechnical properties obtained with these mixtures, it would be possible to design CCBE whose moisture retention layer is made of SSM. Physical laboratory models confirmed the performance of such CCBE.Keywords: mine waste, AMD neutralization sludge, sand-sludge mixture, hydrogeotechnical properties, mine site reclamation, CCBE
Procedia PDF Downloads 54
109 Miniaturizing the Volumetric Titration of Free Nitric Acid in U(VI) Solutions: On the Lookout for a More Sustainable Process in Radioanalytical Chemistry through Titration-on-a-Chip
Authors: Jose Neri, Fabrice Canto, Alastair Magnaldo, Laurent Guillerme, Vincent Dugas
Abstract:
A miniaturized and automated approach for the volumetric titration of free nitric acid in U(VI) solutions is presented. Free acidity measurement refers to the acidity quantification in solutions containing hydrolysable heavy metal ions such as U(VI), U(IV) or Pu(IV) without taking into account the acidity contribution from the hydrolysis of such metal ions. It is, in fact, an operation having an essential role for the control of the nuclear fuel recycling process. The main objective behind the technical optimization of the actual ‘beaker’ method was to reduce the amount of radioactive substance to be handled by the laboratory personnel, to ease the instrumentation adjustability within a glove-box environment and to allow a high-throughput analysis for conducting more cost-effective operations. The measurement technique is based on the concept of the Taylor-Aris dispersion in order to create inside of a 200 μm x 5cm circular cylindrical micro-channel a linear concentration gradient in less than a second. The proposed analytical methodology relies on the actinide complexation using pH 5.6 sodium oxalate solution and subsequent alkalimetric titration of nitric acid with sodium hydroxide. The titration process is followed with a CCD camera for fluorescence detection; the neutralization boundary can be visualized in a detection range of 500nm- 600nm thanks to the addition of a pH sensitive fluorophore. The operating principle of the developed device allows the active generation of linear concentration gradients using a single cylindrical micro channel. This feature simplifies the fabrication and ease of use of the micro device, as it does not need a complex micro channel network or passive mixers to generate the chemical gradient. Moreover, since the linear gradient is determined by the liquid reagents input pressure, its generation can be fully achieved in faster intervals than one second, being a more timely-efficient gradient generation process compared to other source-sink passive diffusion devices. The resulting linear gradient generator device was therefore adapted to perform for the first time, a volumetric titration on a chip where the amount of reagents used is fixed to the total volume of the micro channel, avoiding an important waste generation like in other flow-based titration techniques. The associated analytical method is automated and its linearity has been proven for the free acidity determination of U(VI) samples containing up to 0.5M of actinide ion and nitric acid in a concentration range of 0.5M to 3M. In addition to automation, the developed analytical methodology and technique greatly improves the standard off-line oxalate complexation and alkalimetric titration method by reducing a thousand fold the required sample volume, forty times the nuclear waste per analysis as well as the analysis time by eight-fold. The developed device represents, therefore, a great step towards an easy-to-handle nuclear-related application, which in the short term could be used to improve laboratory safety as much as to reduce the environmental impact of the radioanalytical chain.Keywords: free acidity, lab-on-a-chip, linear concentration gradient, Taylor-Aris dispersion, volumetric titration
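The dispersion behaviour exploited by the device is usually described by the classical Taylor-Aris result for a circular capillary. The sketch below evaluates the effective axial dispersion coefficient for illustrative values: only the 200 μm channel diameter is taken from the abstract, while the diffusivity and mean velocity are assumptions.

```python
# Taylor-Aris effective axial dispersion in a circular capillary:
#   D_eff = D_m + (r**2 * u**2) / (48 * D_m)
# Illustrative values; the formula additionally assumes sufficiently long residence times.
D_m = 1.5e-9     # molecular diffusivity of the analyte, m^2/s (assumed)
r = 100e-6       # channel radius, m (200 um diameter channel, as in the abstract)
u = 0.05         # mean flow velocity, m/s (assumed: ~5 cm channel swept in ~1 s)

D_eff = D_m + (r ** 2 * u ** 2) / (48 * D_m)
print(f"Effective axial dispersion coefficient: {D_eff:.2e} m^2/s")
```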
Procedia PDF Downloads 388
108 A Triple Win: Linking Students, Academics, and External Organisations to Provide Real-World Learning Experiences with Real-World Benefits
Authors: Anne E. Goodenough
Abstract:
Students often learn best ‘on the job’ through holistic real-world projects. They need real-world experiences to make classroom learning applicable and to increase their employability. Academics typically value working on projects where new knowledge is created and have a genuine desire to help students engage with learning and develop new skills. They might also have institutional pressure to enhance student engagement, retention, and satisfaction. External organizations - especially non-governmental bodies, charities, and small enterprises - often have fundamental and pressing questions, but lack the manpower and academic expertise to answer them effectively. They might also be on the lookout for talented potential employees. This study examines ways in which these diverse requirements can be met simultaneously by creating three-way projects that provide excellent academic and real-world outcomes for all involved. It studied a range of innovative projects across natural sciences (biology, ecology, physical geography and social sciences (human geography, sociology, criminology, and community engagement) to establish how to best harness the potential of this powerful approach. Focal collaborations included: (1) development of practitioner-linked modules; (2) frameworks where students collected/analyzed data for link organizations in research methods modules; (3) placement-based internships and dissertations; and (4) immersive fieldwork projects in novel locations to allow students engage first-hand with contemporary issues as diverse as rhino poaching in South Africa, segregation in Ireland, and gun crime in Florida. Although there was no ‘magic formula’ for success, the approach was found to work best when small projects were developed that were achievable in a short time-frame, both to tie into modular curricula and meet the immediacy expectations of many link organizations. Bigger projects were found to work well in some cases, especially when they were essentially a series of linked smaller projects, either running concurrently or successively with each building on previous work. Opportunities were maximized when there were tangible benefits to the link organization as this generally increased organization investment in the project and motivated students too. The importance of finding the right approach for a given project was found to be key: it was vital to ensure that something that could work effectively as an independent research project for one student, for example, was not shoehorned into being a project for multiple students within a taught module. In general, students were very positive about collaboration projects. They identified benefits to confidence, time-keeping and communication, as well as conveying their enthusiasm when their work was of benefit to the wider community. Several students have gone on to do further work with the link organization in a voluntary capacity or as paid staff, or used the experiences to help them break into the ever-more competitive job market in other ways. Although this approach involves a substantial time investment, especially from academics, the benefits can be profound. 
The approach has strong potential to engage students, help retention, improve student satisfaction, and teach new skills; keep the knowledge of academics fresh and current; and provide valuable tangible benefits for link organizations: a real triple win.Keywords: authentic learning, curriculum development, effective education, employability, higher education, innovative pedagogy, link organizations, student experience
Procedia PDF Downloads 219
107 Supporting 'Vulnerable' Students to Complete Their Studies During the Economic Crisis in Greece: The Umbrella Program of International Hellenic University
Authors: Rigas Kotsakis, Nikolaos Tsigilis, Vasilis Grammatikopoulos, Evridiki Zachopoulou
Abstract:
During the last decade, Greece has faced an unprecedented financial crisis, affecting various aspects and functionalities of Higher Education. Besides the restricted funding of academic institutions, the students and their families encountered economical difficulties that undoubtedly influenced the effective completion of their studies. In this context, a fairly large number of students in Alexander campus of International Hellenic University (IHU) delay, interrupt, or even abandon their studies, especially when they come from low-income families, belong to sensitive social or special needs groups, they have different cultural origins, etc. For this reason, a European project, named “Umbrella”, was initiated aiming at providing the necessary psychological support and counseling, especially to disadvantaged students, towards the completion of their studies. To this end, a network of various academic members (academic staff and students) from IHU, namely iMentor, were implicated in different roles. Specifically, experienced academic staff trained students to serve as intermediate links for the integration and educational support of students that fall into the aforementioned sensitive social groups and face problems for the completion of their studies. The main idea of the project is held upon its person-centered character, which facilitates direct student-to-student communication without the intervention of the teaching staff. The backbone of the iMentors network are senior students that face no problem in their academic life and volunteered for this project. It should be noted that there is a provision from the Umbrella structure for substantial and ethical rewards for their engagement. In this context, a well-defined, stringent methodology was implemented for the evaluation of the extent of the problem in IHU and the detection of the profile of the “candidate” disadvantaged students. The first phase included two steps, (a) data collection and (b) data cleansing/ preprocessing. The first step involved the data collection process from the Secretary Services of all Schools in IHU, from 1980 to 2019, which resulted in 96.418 records. The data set included the School name, the semester of studies, a student enrolling criteria, the nationality, the graduation year or the current, up-to-date academic state (still studying, delayed, dropped off, etc.). The second step of the employed methodology involved the data cleansing/preprocessing because of the existence of “noisy” data, missing and erroneous values, etc. Furthermore, several assumptions and grouping actions were imposed to achieve data homogeneity and an easy-to-interpret subsequent statistical analysis. Specifically, the duration of 40 years recording was limited to the last 15 years (2004-2019). In 2004 the Greek Technological Institutions were evolved into Higher Education Universities, leading into a stable and unified frame of graduate studies. In addition, the data concerning active students were excluded from the analysis since the initial processing effort was focused on the detection of factors/variables that differentiated graduate and deleted students. The final working dataset included 21.432 records with only two categories of students, those that have a degree and those who abandoned their studies. Findings of the first phase are presented across faculties and further discussed.Keywords: higher education, students support, economic crisis, mentoring
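A hedged sketch of the data cleansing steps described above; the file name and column names are assumptions made for illustration, not the registry's actual schema.

```python
# Illustrative pandas sketch of the preprocessing: restrict the 1980-2019 records to
# 2004-2019 and keep only students who either graduated or abandoned their studies.
import pandas as pd

records = pd.read_csv("secretary_records.csv")  # hypothetical export, 96,418 rows

# Keep the 2004-2019 window (post-reform of the Technological Institutions).
records = records[records["enrolment_year"].between(2004, 2019)]

# Exclude students who are still active; keep graduates and drop-outs only.
working = records[records["academic_state"].isin(["graduated", "abandoned"])].copy()

# Binary outcome used in the subsequent analysis.
working["completed"] = (working["academic_state"] == "graduated").astype(int)
print(working.groupby("school")["completed"].mean())  # completion rate per School
```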
Procedia PDF Downloads 115
106 Tailoring Piezoelectricity of PVDF Fibers with Voltage Polarity and Humidity in Electrospinning
Authors: Piotr K. Szewczyk, Arkadiusz Gradys, Sungkyun Kim, Luana Persano, Mateusz M. Marzec, Oleksander Kryshtal, Andrzej Bernasik, Sohini Kar-Narayan, Pawel Sajkiewicz, Urszula Stachewicz
Abstract:
Piezoelectric polymers have received great attention in smart textiles, wearables, and flexible electronics. Their potential applications range from devices that could operate without traditional power sources, through self-powering sensors, up to implantable biosensors. Semi-crystalline PVDF is often proposed as the main candidate for industrial-scale applications, as it exhibits exceptional energy harvesting efficiency compared to other polymers, combined with high mechanical strength and thermal stability. Many approaches have been proposed for obtaining PVDF rich in the desired β-phase, with electrical poling, thermal annealing, and mechanical stretching being the most prevalent. Electrospinning is a highly tunable technique that provides a one-step process for obtaining highly piezoelectric PVDF fibers without the need for post-treatment. In this study, the influence of voltage polarity and relative humidity on electrospun PVDF fibers was investigated, with the main focus on piezoelectric β-phase content and piezoelectric performance. The morphology and internal structure of the fibers were investigated using scanning (SEM) and transmission (TEM) electron microscopy. Fourier transform infrared spectroscopy (FTIR), wide-angle X-ray scattering (WAXS) and differential scanning calorimetry (DSC) were used to characterize the phase composition of the electrospun PVDF. Additionally, surface chemistry was verified with X-ray photoelectron spectroscopy (XPS). The piezoelectric performance of individual electrospun PVDF fibers was measured using piezoresponse force microscopy (PFM), and the power output from meshes was analyzed via custom-built equipment. To prepare the solution for electrospinning, PVDF pellets were dissolved in a 1:1 dimethylacetamide/acetone mixture to achieve a 24% solution. Fibers were electrospun with a constant voltage of +/-15 kV applied to a stainless steel nozzle with an inner diameter of 0.8 mm. The flow rate was kept constant at 6 ml h⁻¹. The electrospinning of PVDF was performed in an environmental chamber at T = 25°C and relative humidity of 30% and 60% for the PVDF30+/- and PVDF60+/- samples, respectively. SEM and TEM analysis of fibers produced at the lower relative humidity of 30% (PVDF30+/-) showed a smooth surface, in contrast to fibers obtained at 60% relative humidity (PVDF60+/-), which had a wrinkled surface and, additionally, internal voids. XPS results confirmed a lower fluorine content at the surface of PVDF- fibers obtained by electrospinning with negative voltage polarity compared to the PVDF+ fibers obtained with positive voltage polarity. The changes in surface composition measured with XPS were found to influence the piezoelectric performance of the obtained fibers, which was further confirmed by PFM as well as by a custom-built fiber-based piezoelectric generator. For the PVDF60+/- samples, humidity led to an increase in β-phase content in the PVDF fibers, as confirmed by FTIR, WAXS, and DSC measurements, which showed almost two times higher concentrations of β-phase. A combination of negative voltage polarity with high relative humidity led to fibers with the highest β-phase content and the best piezoelectric performance of all investigated samples. This study outlines the possibility of producing electrospun PVDF fibers with tunable piezoelectric performance in a one-step electrospinning process by controlling relative humidity and voltage polarity.
Acknowledgment: This research was conducted with funding from the Sonata Bis 5 project granted by the National Science Centre, No. 2015/18/E/ST5/00230, and supported by the infrastructure of the International Centre of Electron Microscopy for Materials Science (IC-EM) at AGH University of Science and Technology. The PFM measurements were supported by an STSM Grant from COST Action CA17107. Keywords: crystallinity, electrospinning, PVDF, voltage polarity
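As an illustration of how FTIR spectra are commonly converted into a relative β-phase content for PVDF (the Gregorio absorbance-ratio method), the sketch below uses hypothetical absorbances; the abstract does not state that this particular formula was used, so it is offered only as a plausible example.

```python
# Common FTIR estimate of the electroactive beta-phase fraction in PVDF, using the
# absorbances near 766 cm^-1 (alpha) and 840 cm^-1 (beta); input values are hypothetical.
K_ALPHA = 6.1e4   # absorption coefficient at 766 cm^-1, cm^2/mol (literature value)
K_BETA = 7.7e4    # absorption coefficient at 840 cm^-1, cm^2/mol (literature value)

def beta_phase_fraction(a_alpha: float, a_beta: float) -> float:
    """Relative beta-phase content F(beta) from FTIR absorbances."""
    return a_beta / ((K_BETA / K_ALPHA) * a_alpha + a_beta)

# Hypothetical absorbances for fibres spun at 30% and 60% relative humidity.
print(beta_phase_fraction(a_alpha=0.20, a_beta=0.35))   # lower-humidity sample
print(beta_phase_fraction(a_alpha=0.10, a_beta=0.42))   # higher-humidity sample
```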
Procedia PDF Downloads 134
105 Comparative Production of Secondary Metabolites by Prunus africana (Hook. F.) Kalkman Provenances in Cameroon and Some Associated Endophytic Fungi
Authors: Gloria M. Ntuba-Jua, Afui M. Mih, Eneke E. T. Bechem
Abstract:
Prunus africana (Hook. F.) Kalkman, commonly known as Pygeum or African cherry, belongs to the Rosaceae family. It is a medium to large, evergreen tree with a spreading crown of 10 to 20 m. It is used by traditional medical practitioners for the treatment of over 45 ailments in Cameroon and sub-Saharan Africa. In modern medicine, it is used in the treatment of benign prostatic hyperplasia (BPH), i.e., prostate gland hypertrophy (enlarged prostate gland). This is possible because of its ability to produce secondary metabolites which are believed to have bioactivity against these ailments. The ready international market for the sale of Prunus bark, uncontrolled exploitation, illegal harvesting using inappropriate techniques and poor timing of harvesting have contributed enormously to making the plant endangered. It is known to harbor a large number of endophytic fungi with the potential to produce secondary metabolites similar to those of the parent plant. Alternative sourcing of medicinal principles through endophytic fungi requires thorough knowledge of those fungi. This would serve as a conservation measure for Prunus africana by reducing dependence on Prunus bark for such metabolites. This work therefore sought to compare the production of some major secondary metabolites by P. africana and some of its associated endophytic fungi. The leaves and stem bark of the plant from different provenances were soaked in methanol for 72 h to yield the methanolic crude extracts. Phytochemical screening of the methanolic crude extracts using standard procedures revealed the presence of tannins, flavonoids, terpenoids, saponins, phenolics and steroids. Pure cultures of some predominantly isolated endophyte species from the different Prunus provenances, such as Curvularia sp. and Morphospecies P001, were also grown in Potato Dextrose Broth (PDB) for 21 days and later extracted with methylene dichloride (MDC) solvent after 24 h to produce crude culture extracts. Qualitative assessment of the crude culture extracts showed the presence of tannins, terpenoids, phenolics and steroids, particularly β-sitosterol (a major bioactive metabolite), as did the plant tissues. Qualitative analysis by thin layer chromatography (TLC) was done to confirm and compare the production of β-sitosterol (as a marker compound) in the crude extracts of the plant and the endophytes. Samples were loaded on TLC silica gel aluminium-backed plates (Kieselgel 60 F254, 0.2 mm, Merck) using an acetone/hexane (3.0:7.0) solvent system. They were visualized under an ultraviolet lamp (UV254 and UV360). TLC revealed that leaves had a higher concentration of β-sitosterol, in terms of band intensity, than stem bark from the different provenances. The intensity of the β-sitosterol bands in the culture extracts of the endophytes was comparable to that of the plant extracts, except for Curvularia sp., whose band was very faint. The ability of these fungi to make β-sitosterol was confirmed by TLC analysis, with the compound having chromatographic properties (retention factor) similar to those of the β-sitosterol standard. The ability of these major endophytes to produce secondary metabolites similar to those of the host has therefore been demonstrated. There is, therefore, the potential to develop an in vitro production system for Prunus secondary metabolites, thereby enhancing the conservation of the species. Keywords: Cameroon, endophytic fungi, Prunus africana, secondary metabolites
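The retention-factor comparison mentioned above amounts to a simple ratio; a tiny sketch with hypothetical migration distances is given below.

```python
# Retention factor: migration distance of the spot divided by that of the solvent front.
def retention_factor(spot_distance_cm: float, solvent_front_cm: float) -> float:
    return spot_distance_cm / solvent_front_cm

standard_rf = retention_factor(3.4, 8.0)   # beta-sitosterol standard (hypothetical values)
sample_rf = retention_factor(3.3, 8.0)     # leaf-extract spot (hypothetical values)
print(standard_rf, sample_rf, abs(standard_rf - sample_rf) < 0.05)  # match within tolerance
```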
Procedia PDF Downloads 234
104 The Role of Cholesterol Oxidase of Mycobacterium tuberculosis in the Down-Regulation of the TLR2 Signaling Pathway in Human Macrophages during the Infection Process
Authors: Michal Kielbik, Izabela Szulc-Kielbik, Anna Brzostek, Jaroslaw Dziadek, Magdalena Klink
Abstract:
The goal of many research groups around the world is to find new components that are important for the survival of mycobacteria in host cells. Mycobacterium tuberculosis (Mtb) possesses a number of cholesterol-degrading enzymes that are considered an important factor for its survival and persistence in host macrophages. One of them, cholesterol oxidase (ChoD), although not essential for cholesterol degradation, is discussed as a virulence compound; however, its involvement in the macrophage response to Mtb is still not sufficiently determined. The recognition of tubercle bacilli antigens by pathogen recognition receptors is crucial for the initiation of the host innate immune response. An important receptor that has been implicated in the recognition and/or uptake of Mtb is Toll-like receptor type 2 (TLR2). Engagement of TLR2 results in the activation and phosphorylation of intracellular signaling proteins, including IRAK-1, IRAK-4 and TRAF-6, which in turn leads to the activation of target kinases and transcription factors responsible for the bactericidal and pro-inflammatory response of macrophages. The aim of these studies was a detailed clarification of the role of Mtb cholesterol oxidase as a virulence factor affecting the TLR2 signaling pathway in human macrophages. Differentiated THP-1 cells were used as human macrophages. The virulent wild-type Mtb strain (H37Rv), its mutant lacking a functional copy of the gene encoding cholesterol oxidase (ΔchoD), and the complemented strain (ΔchoD–choD) were used. We tested the impact of the Mtb strains on the expression of TLR2-dependent signaling proteins (mRNA level, cytosolic level and phosphorylation status). The cytokine and bactericidal response of THP-1-derived macrophages infected with the Mtb strains, in relation to its dependence on the TLR2 signaling pathway, was also determined. We found that during the 24-hour infection process, the wild-type and complemented Mtb strains significantly reduced the cytosolic level and phosphorylation status of the IRAK-4 and TRAF-6 proteins in macrophages, which was not observed in the case of the ΔchoD mutant. The decrease in TLR2-dependent signaling proteins induced by the wild-type Mtb was not dependent on proteasome activity. Blocking TLR2 expression before infection effectively prevented the wild-type-strain-induced reduction in the cytosolic level and phosphorylation of IRAK-4. None of the strains affected the surface expression of TLR2. The mRNA levels of the IRAK-4 and TRAF-6 genes were significantly increased in macrophages 24 hours post-infection with any of the tested strains. However, the impact of the wild-type Mtb strain on both examined genes was significantly stronger than that of its ΔchoD mutant. We also found that the wild-type strain stimulated macrophages to release a high amount of immunosuppressive IL-10, accompanied by low amounts of pro-inflammatory IL-8 and bactericidal nitric oxide, in comparison to the mutant lacking cholesterol oxidase. The influence of the wild-type Mtb on this type of macrophage response was strongly dependent on fully active IRAK-1 and IRAK-4 signaling proteins. In conclusion, Mtb, using cholesterol oxidase, causes the over-activation of TLR2 signaling proteins, leading to a reduction in their cytosolic level and activity, resulting in a modulation of the macrophage response that allows its intracellular survival. Supported by grant 2014/15/B/NZ6/01565, National Science Center, Poland. Keywords: Mycobacterium tuberculosis, cholesterol oxidase, macrophages, TLR2-dependent signaling pathway
Procedia PDF Downloads 419
103 Analysis of Vibration and Shock Levels during Transport and Handling of Bananas within the Post-Harvest Supply Chain in Australia
Authors: Indika Fernando, Jiangang Fei, Roger Stanley, Hossein Enshaei
Abstract:
Delicate produce such as fresh fruits are increasingly susceptible to physiological damage during the essential post-harvest operations such as transport and handling. Vibration and shock during the distribution are identified factors for produce damage within post-harvest supply chains. Mechanical damages caused during transit may significantly diminish the quality of fresh produce which may also result in a substantial wastage. Bananas are one of the staple fruit crops and the most sold supermarket produce in Australia. It is also the largest horticultural industry in the state of Queensland where 95% of the total production of bananas are cultivated. This results in significantly lengthy interstate supply chains where fruits are exposed to prolonged vibration and shocks. This paper is focused on determining the shock and vibration levels experienced by packaged bananas during transit from the farm gate to the retail market. Tri-axis acceleration data were captured by custom made accelerometer based data loggers which were set to a predetermined sampling rate of 400 Hz. The devices recorded data continuously for 96 Hours in the interstate journey of nearly 3000 Km from the growing fields in far north Queensland to the central distribution centre in Melbourne in Victoria. After the bananas were ripened at the ripening facility in Melbourne, the data loggers were used to capture the transport and handling conditions from the central distribution centre to three retail outlets within the outskirts of Melbourne. The quality of bananas were assessed before and after transport at each location along the supply chain. Time series vibration and shock data were used to determine the frequency and the severity of the transient shocks experienced by the packages. Frequency spectrogram was generated to determine the dominant frequencies within each segment of the post-harvest supply chain. Root Mean Square (RMS) acceleration levels were calculated to characterise the vibration intensity during transport. Data were further analysed by Fast Fourier Transform (FFT) and the Power Spectral Density (PSD) profiles were generated to determine the critical frequency ranges. It revealed the frequency range in which the escalated energy levels were transferred to the packages. It was found that the vertical vibration was the highest and the acceleration levels mostly oscillated between ± 1g during transport. Several shock responses were recorded exceeding this range which were mostly attributed to package handling. These detrimental high impact shocks may eventually lead to mechanical damages in bananas such as impact bruising, compression bruising and neck injuries which affect their freshness and visual quality. It was revealed that the frequency range between 0-5 Hz and 15-20 Hz exert an escalated level of vibration energy to the packaged bananas which may result in abrasion damages such as scuffing, fruit rub and blackened rub. Further research is indicated specially in the identified critical frequency ranges to minimise exposure of fruits to the harmful effects of vibration. Improving the handling conditions and also further study on package failure mechanisms when exposed to transient shock excitation will be crucial to improve the visual quality of bananas within the post-harvest supply chain in Australia.Keywords: bananas, handling, post-harvest, supply chain, shocks, transport, vibration
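A minimal sketch of the two vibration analyses described above (RMS acceleration and a power spectral density estimate), run on a synthetic signal rather than the logged data; the 400 Hz sampling rate matches the study, while everything else is illustrative.

```python
# RMS acceleration and Welch PSD for a synthetic vertical-axis acceleration signal.
import numpy as np
from scipy.signal import welch

fs = 400                       # sampling rate, Hz (as in the study)
t = np.arange(0, 60, 1 / fs)   # one minute of synthetic vertical acceleration, in g
accel_z = (0.3 * np.sin(2 * np.pi * 3 * t)        # low-frequency truck-body motion
           + 0.1 * np.sin(2 * np.pi * 17 * t)     # higher-frequency structural band
           + 0.05 * np.random.randn(t.size))      # broadband road noise

rms_g = np.sqrt(np.mean(accel_z ** 2))            # vibration intensity
freqs, psd = welch(accel_z, fs=fs, nperseg=4096)  # power spectral density, g^2/Hz

print(f"RMS acceleration: {rms_g:.3f} g")
print(f"Dominant frequency: {freqs[np.argmax(psd)]:.1f} Hz")
```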
Procedia PDF Downloads 191
102 Understanding Natural Resources Governance in Canada: The Role of Institutions, Interests, and Ideas in Alberta's Oil Sands Policy
Authors: Justine Salam
Abstract:
As a federal state, Canada’s constitutional arrangements regarding the management of natural resources is unique because it gives complete ownership and control of natural resources to the provinces (subnational level). However, the province of Alberta—home to the third largest oil reserves in the world—lags behind comparable jurisdictions in levying royalties on oil corporations, especially oil sands royalties. While Albertans own the oil sands, scholars have argued that natural resource exploitation in Alberta benefits corporations and industry more than it does Albertans. This study provides a systematic understanding of the causal factors affecting royalties in Alberta to map dynamics of power and how they manifest themselves during policy-making. Mounting domestic and global public pressure led Alberta to review its oil sands royalties twice in less than a decade through public-commissioned Royalty Review Panels, first in 2007 and again in 2015. The Panels’ task was to research best practices and to provide policy recommendations to the Government through public consultations with Albertans, industry, non-governmental organizations, and First Nations peoples. Both times, the Panels recommended a relative increase to oil sands royalties. However, irrespective of the Reviews’ recommendations, neither the right-wing 2007 Progressive Conservative Party (PC) nor the left-wing 2015 New Democratic Party (NDP) government—both committed to increase oil sands royalties—increased royalty intake. Why did two consecutive political parties at opposite ends of the political spectrum fail to account for the recommendations put forward by the Panel? Through a qualitative case-study analysis, this study assesses domestic and global causal factors for Alberta’s inability to raise oil sands royalties significantly after the two Reviews through an institutions, interests, and ideas framework. Indeed, causal factors can be global (e.g. market and price fluctuation) or domestic (e.g. oil companies’ influence on the Alberta government). The institutions, interests, and ideas framework is at the intersection of public policy, comparative studies, and political economy literatures, and therefore draws multi-faceted insights into the analysis. To account for institutions, the study proposes to review international trade agreements documents such as the North American Free Trade Agreement (NAFTA) because they have embedded Alberta’s oil sands into American energy security policy and tied Canadian and Albertan oil policy in legal international nods. To account for interests, such as how the oil lobby or the environment lobby can penetrate governmental decision-making spheres, the study draws on the Oil Sands Oral History project, a database of interviews from government officials and oil industry leaders at a pivotal time in Alberta’s oil industry, 2011-2013. Finally, to account for ideas, such as how narratives of Canada as a global ‘energy superpower’ and the importance of ‘energy security’ have dominated and polarized public discourse, the study relies on content analysis of Alberta-based pro-industry newspapers to trace the prevalence of these narratives. By mapping systematically the nods and dynamics of power at play in Alberta, the study sheds light on the factors that influence royalty policy-making in one of the largest industries in Canada.Keywords: Alberta Canada, natural resources governance, oil sands, political economy
Procedia PDF Downloads 134