Search results for: digital media firms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5973

123 Biochemical Effects of Low Dose Dimethyl Sulfoxide on HepG2 Liver Cancer Cell Line

Authors: Esra Sengul, R. G. Aktas, M. E. Sitar, H. Isan

Abstract:

Hepatocellular carcinoma (HCC) is a primary liver tumor that commonly arises in the chronically diseased liver. HepG2 is the cell line most commonly used in HCC studies. The main proteins remaining in blood serum after removal of plasma fibrinogen are albumin and globulin. Albumin was chosen because it indicates hepatocellular damage and reflects the synthetic capacity of the liver. Alpha-fetoprotein (AFP) is an albumin-like embryonic globulin found in the embryonic cortex, cord blood, and fetal liver. It has been used as a marker for following tumor growth in various malignant tumors and for assessing the efficacy of surgical and medical treatments, so it is a useful protein to examine alongside albumin. Having previously observed the morphological changes induced by dimethyl sulfoxide (DMSO) in HepG2 cells, we decided to investigate its biochemical effects. We examined the effects of low doses of DMSO, which is routinely used in cell culture, on albumin, AFP, and total protein. Materials and Methods: Cell culture: the medium was prepared using Dulbecco's Modified Eagle Medium (DMEM), fetal bovine serum (FBS), phosphate-buffered saline (PBS), and trypsin stored at -20 °C. Fixation of cells: HepG2 cells that had developed appropriately by the end of the first week were fixed with acetone; the cells were stored in PBS at +4 °C until fixation was completed. Area calculation: cell areas were calculated in ImageJ (IJ). Microscopy: examination was performed with a Zeiss inverted microscope, and photographs were taken at 40x, 100x, 200x, and 400x magnification. Biochemical tests: total protein and albumin in the serum samples were analyzed by spectrophotometric methods on an autoanalyzer, and alpha-fetoprotein was analyzed by the ECLIA method. Results: When liver cancer cells were cultured in medium with 1% DMSO for 4 weeks, a significant difference was observed compared with the control group, suggesting that DMSO may be useful as an agent in the treatment of liver cancer. Cell areas were reduced in the DMSO group compared to the control group, while the confluency ratio increased. The ability to form spheroids was also significantly higher in the DMSO group. Alpha-fetoprotein was lower than the values of a typical liver cancer patient, and the total protein amount increased to the reference range of a normal individual. Because albumin was below the measurable range of the assay, numerical results for albumin could not be obtained in the biochemical examinations. We interpret these results as indicating that DMSO may act as a supportive agent. Since no single parameter was sufficient on its own, we used three parameters, and the results were positive when compared in parallel against the values of a normal healthy individual. We hope to extend the study by adding new parameters and genetic analyses, by increasing the number of samples, and by using DMSO as an adjunct agent in the treatment of liver cancer.
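
As an illustration of the area-calculation step described above, the following minimal Python sketch measures cell areas and confluency from a single micrograph. The file name, the threshold direction, and the pixel calibration are assumptions for demonstration only; this is not the study's actual ImageJ workflow.

```python
# Hypothetical sketch: estimating cell areas and confluency from a micrograph,
# mirroring the ImageJ area-calculation step described in the abstract.
# File name, threshold direction, and pixel calibration are assumed values.
import numpy as np
from skimage import io, filters, measure

img = io.imread("hepg2_field_40x.png", as_gray=True)  # assumed image file
thresh = filters.threshold_otsu(img)                  # global Otsu threshold
mask = img < thresh                                   # assume cells are darker than background
labels = measure.label(mask)                          # label connected cell regions
areas_px = [r.area for r in measure.regionprops(labels)]

pixel_area_um2 = 0.25                                 # assumed area per pixel in um^2
areas_um2 = np.array(areas_px) * pixel_area_um2
confluency = mask.mean() * 100                        # % of the field covered by cells

print(f"mean cell area: {areas_um2.mean():.1f} um^2")
print(f"confluency: {confluency:.1f} %")
```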

Keywords: hepatocellular carcinoma, HepG2, dimethyl sulfoxide, cell culture, ELISA

Procedia PDF Downloads 118
122 Production of Medicinal Bio-active Amino Acid Gamma-Aminobutyric Acid In Dairy Sludge Medium

Authors: Farideh Tabatabaee Yazdi, Fereshteh Falah, Alireza Vasiee

Abstract:

Introduction: Gamma-aminobutyric acid (GABA) is a non-protein amino acid that is widely present in organisms. GABA has pharmacological and biological activity, and its applications are wide and useful. Several important physiological functions of GABA have been characterized, such as neurotransmission, induction of hypotension, and a tranquilizing effect; GABA is also a strong secretagogue of insulin from the pancreas and effectively inhibits small airway-derived lung adenocarcinoma. Many microorganisms can produce GABA, and lactic acid bacteria (LAB) have been a focus of research in recent years because they possess special physiological activities and are generally regarded as safe. Among them, Lb. brevis produced the highest amount of GABA. The major factors affecting GABA production have been characterized, including carbon sources and glutamate concentration. The use of food industry waste to produce valuable products such as amino acids appears to be a good way to reduce production costs and prevent the waste of food resources. In a dairy factory, a high volume of sludge is produced from the separator; it contains useful compounds such as growth factors, carbon, nitrogen, and organic matter that can be used by microorganisms such as Lb. brevis as carbon and nitrogen sources. Therefore, it is a good substrate for GABA production. GABA is formed primarily by the irreversible α-decarboxylation of L-glutamic acid or its salts, catalysed by the GAD enzyme. In the present study, this aim was achieved through the fast growth of Lb. brevis and the production of GABA using dairy industry sludge as a suitable growth medium. Lactobacillus brevis strains obtained from the Microbial Type Culture Collection (MTCC) were used as model strains. To prepare dairy sludge as a medium, it was sterilized at 121 °C for 15 minutes. Lb. brevis was inoculated into the sludge medium at pH 6 and incubated for 120 hours at 30 °C. After fermentation, the broth was centrifuged, and the GABA produced in the supernatant was analyzed qualitatively by thin-layer chromatography (TLC) and quantitatively by high-performance liquid chromatography (HPLC). By increasing the percentage of dairy sludge in the culture medium, the amount of GABA increased. Evaluation of bacterial growth in this medium also showed a positive effect of dairy sludge on the growth of Lb. brevis, which resulted in the production of more GABA. GABA-producing LAB offer the opportunity of developing naturally fermented, health-oriented products. Although some GABA-producing LAB have been isolated to find strains suitable for different fermentations, further screening of GABA-producing strains from LAB, especially high-yielding strains, is necessary. The production of gamma-aminobutyric acid by lactic acid bacteria is safe and eco-friendly, and the use of dairy industry waste enhances environmental safety while providing the possibility of producing valuable compounds such as GABA. In general, dairy sludge is a suitable medium for the growth of lactic acid bacteria and the production of this amino acid, and by providing the carbon and nitrogen sources it can reduce the final production cost.
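
For illustration only, the sketch below shows the external-standard calibration arithmetic typically used to quantify an analyte such as GABA from HPLC peak areas; the standard concentrations, peak areas, and dilution factor are assumed values, not the study's data.

```python
# Illustrative sketch (not the authors' code): quantifying GABA from HPLC peak areas
# with an external-standard calibration curve. All numbers are assumed for demonstration.
import numpy as np

std_conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])           # g/L GABA standards (assumed)
std_area = np.array([12.0, 61.5, 118.9, 243.1, 601.0])   # corresponding peak areas (assumed)

slope, intercept = np.polyfit(std_conc, std_area, 1)     # linear calibration: area = m*c + b

sample_area = 175.4                                      # assumed peak area of the fermented sample
dilution = 10                                            # assumed dilution factor
gaba_g_per_l = (sample_area - intercept) / slope * dilution
print(f"GABA in broth: {gaba_g_per_l:.2f} g/L")
```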

Keywords: GABA, Lactobacillus, HPLC, dairy sludge

Procedia PDF Downloads 114
121 Masstige and the New Luxury: An Exploratory Study on Cosmetic Brands Among Black African Woman

Authors: Melanie Girdharilall, Anjli Himraj, Shivan Bhagwandin, Marike Venter De Villiers

Abstract:

The allure of luxury has long been attractive, fashionable, mystifying, and complex. As globalisation and the popularity of social media continue to evolve, consumers are increasingly seeking status products. However, in emerging economies like South Africa, where 60% of the population lives in poverty, this desire is often far-fetched and out of reach for most consumers. As a result, luxury brands are introducing masstige products: products that are associated with luxury and status but are within financial reach of the middle-class consumer. The biggest challenge this industry faces is the lack of knowledge and expertise regarding black females' hair composition and the offering of products that meet their intricate requirements. African consumers have unique hair types, and global brands often do not accommodate the complex nature of their hair and their product needs. By gaining insight into this phenomenon, global cosmetic brands can benefit from brand expansion, product extensions, increased brand awareness, brand knowledge, and brand equity. The purpose of this study is to determine how cosmetic brands can leverage the concept of masstige products to cater to the needs of middle-income black African women. This study explores the 18- to 35-year-old black female cohort, which comprises approximately 17% of the South African population. The black hair care industry in Africa is expected to grow at a rate of 6% over the next 5 years. The study is grounded in Paul's (2019) 3-phase model for masstige marketing. This model demonstrates that product, promotion, and place strategies play a significant role in masstige value creation and in the impact of these strategies on branding dimensions (brand trust, brand association, brand positioning, brand preference, etc.). More specifically, this theoretical framework encompasses nine stages, or dimensions, that are of critical importance to companies planning to enter the masstige market. In short, the most critical components to consider are, first, the positioning of the product and its competitive advantage relative to competitors; second, advertising appeals and the use of celebrities; and lastly, distribution channels such as online or in-store, while maintaining the exclusivity of the brand. By means of an exploratory study, a qualitative approach was undertaken, and focus groups were conducted among black African women. The focus groups were voice recorded, transcribed, and analysed using Atlas software. The main themes were identified and used to provide brands with insight and direction for developing a comprehensive marketing mix for effectively entering the masstige market. The findings of this study will provide marketing practitioners with in-depth insight into how to effectively position masstige brands in line with consumer needs. They will give direction to both existing and new brands aiming to enter this market by providing a comprehensive marketing mix for targeting the growing black hair care industry in Africa.

Keywords: africa, masstige, cosmetics, hair care, black females

Procedia PDF Downloads 69
120 Formulation of Lipid-Based Tableted Spray-Congealed Microparticles for Zero Order Release of Vildagliptin

Authors: Hend Ben Tkhayat, Khaled Al Zahabi, Husam Younes

Abstract:

Introduction: Vildagliptin (VG), a dipeptidyl peptidase-4 (DPP-4) inhibitor, has been proven to be an active agent for the treatment of type 2 diabetes. VG works by enhancing and prolonging the activity of incretins, which improves insulin secretion and decreases glucagon release, thereby lowering the blood glucose level. It is usually used together with various other classes, such as insulin sensitizers or metformin. VG is currently marketed only as an immediate-release tablet administered twice daily. In this project, we aim to formulate extended-release tableted lipid microparticles of VG with a zero-order release profile that could be administered once daily, ensuring the patient's convenience. Method: The spray-congealing technique was used to prepare VG microparticles. Compritol® was heated to 10 °C above its melting point, and VG was dispersed in the molten carrier using a homogenizer (IKA T25, USA) set at 13000 rpm. VG dispersed in the molten Compritol® was added dropwise to molten Gelucire® 50/13 and PEG® (400, 6000, and 35000) in different ratios under manual stirring. The molten mixture was homogenized, and the Carbomer® amount was added. The melt was pumped through the two-fluid nozzle of the Buchi® spray-congealer (Buchi B-290, Switzerland) using a pump drive (Masterflex, USA) connected to silicone tubing wrapped with silicone heating tape heated to the same temperature as the pumped mixture. The physicochemical properties of the produced VG-loaded microparticles were characterized using a Mastersizer, Scanning Electron Microscopy (SEM), Differential Scanning Calorimetry (DSC), and X-ray Diffraction (XRD). The VG microparticles were then pressed into tablets using a single-punch tablet machine (YDP-12, Minhua Pharmaceutical Co., China), and an in vitro dissolution study was performed using an Agilent dissolution tester (Agilent, USA). The dissolution test was carried out at 37±0.5 °C for 24 hours in three different dissolution media and time phases. Quantitative analysis of VG in the samples was performed using a validated high-pressure liquid chromatography (HPLC-UV) method. Results: The microparticles were spherical in shape with a narrow size distribution and a smooth surface. DSC and XRD analyses confirmed that the crystallinity of VG was lost after incorporation into the amorphous carriers. The total yields of the different formulas were between 70% and 80%. The VG content of the microparticles was between 99% and 106%. The in vitro dissolution study showed that VG was released from the tableted particles in a controlled fashion. Adjusting the hydrophilic/hydrophobic ratio of the excipients, their concentration, and the molecular weight of the carriers resulted in tablets with zero-order kinetics. Gelucire 50/13®, a hydrophilic carrier, was characterized by a time-dependent profile with a pronounced burst effect, which was decreased by adding Compritol® as a lipophilic carrier to retard the release of VG, which is highly soluble in water. PEG® (400, 6000, and 35000) were used for their gelling effect, which led to a constant delivery rate and a zero-order profile. Conclusion: Tableted spray-congealed lipid microparticles for extended release of VG were successfully prepared, and a zero-order profile was achieved.
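
To make the zero-order criterion concrete, the following minimal sketch fits the model Q(t) = k0·t to a cumulative-release profile and reports the goodness of fit; the time points and release percentages are assumed values, not the reported dissolution data.

```python
# Minimal sketch, not the authors' analysis: checking how closely a dissolution
# profile follows zero-order kinetics, Q(t) = k0 * t. Data points are illustrative.
import numpy as np

t = np.array([1, 2, 4, 6, 8, 12, 16, 20, 24], dtype=float)          # hours (assumed)
q = np.array([4.5, 8.8, 17.1, 25.9, 33.8, 50.2, 66.5, 83.0, 99.1])  # % VG released (assumed)

k0 = np.sum(t * q) / np.sum(t * t)       # least-squares slope through the origin
q_pred = k0 * t
ss_res = np.sum((q - q_pred) ** 2)
ss_tot = np.sum((q - q.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"k0 = {k0:.2f} %/h, R^2 = {r2:.4f}")  # R^2 near 1 indicates a zero-order profile
```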

Keywords: vildagliptin, spray congealing, microparticles, controlled release

Procedia PDF Downloads 106
119 Seawater Desalination for Production of Highly Pure Water Using a Hydrophobic PTFE Membrane and Direct Contact Membrane Distillation (DCMD)

Authors: Ahmad Kayvani Fard, Yehia Manawi

Abstract:

Qatar's primary source of fresh water is seawater desalination. Among the major processes that are commercially available, the most common large-scale techniques are Multi-Stage Flash distillation (MSF), Multi-Effect Distillation (MED), and Reverse Osmosis (RO). Although commonly used, these three processes are highly expensive owing to high energy input requirements and high operating costs associated with maintenance and the stress induced on the systems in harsh alkaline media. Beyond cost, the environmental footprint of these desalination techniques is significant: damage to marine ecosystems, large land use, and the discharge of large quantities of greenhouse gases with a correspondingly large carbon footprint. One less energy-consuming technique based on membrane separation, sought to reduce both the carbon footprint and operating costs, is membrane distillation (MD). Having emerged in the 1960s, MD is an alternative technology for water desalination that has attracted more attention since the 1980s. The MD process involves the evaporation of a hot feed, typically below the boiling point of brine at standard conditions, by creating a water vapor pressure difference across a porous, hydrophobic membrane. The main advantages of MD compared to other commercially available technologies (MSF and MED), and especially RO, are the reduction of membrane and module stress due to the absence of trans-membrane pressure, the lower impact of contaminant fouling on the distillate because only water vapor is transferred, the ability to utilize low-grade or waste heat from the oil and gas industries to heat the feed to the required temperature difference across the membrane, superior water quality, and relatively lower capital and operating costs. To achieve the objective of this study, a state-of-the-art flat-sheet cross-flow DCMD bench-scale unit was designed, commissioned, and tested. The objective of this study is to analyze the characteristics and morphology of a membrane suitable for DCMD through SEM imaging and contact angle measurement, and to study the water quality of the distillate produced by the DCMD bench-scale unit. Comparison with available literature data is undertaken where appropriate, and the laboratory data are used to compare DCMD distillate quality with that of other desalination techniques and standards. SEM analysis showed that the PTFE membrane used in the study has a contact angle of 127° and a highly porous surface supported by a less porous, larger-pore-size PP membrane. The study of the effect of feed salinity and temperature on distillate water quality, based on ICP and IC analysis, showed that at any salinity and feed temperature (up to 70 °C) the electrical conductivity of the distillate is less than 5 μS/cm, corresponding to 99.99% salt rejection. DCMD thus proved to be a feasible and effective process capable of consistently producing high-quality distillate from very high-salinity feed solutions (i.e., 100000 mg/L TDS), with a substantial quality advantage over other desalination methods such as RO and MSF.
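
The salt-rejection figure quoted above follows from simple arithmetic, sketched below; the distillate TDS value is an assumption consistent with the reported conductivity, not a measured result.

```python
# Simple sketch of the salt-rejection arithmetic referenced in the abstract.
# The distillate TDS is an assumed value, chosen to be consistent with <5 uS/cm.
feed_tds = 100000.0       # mg/L TDS in the feed (value cited in the abstract)
distillate_tds = 3.0      # mg/L TDS in the distillate (assumed)

rejection = (1.0 - distillate_tds / feed_tds) * 100.0
print(f"salt rejection: {rejection:.3f} %")   # close to the ~99.99 % reported
```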

Keywords: membrane distillation, waste heat, seawater desalination, membrane, freshwater, direct contact membrane distillation

Procedia PDF Downloads 208
118 The Gender Criteria of Film Criticism: Creating the ‘Big’, Avoiding the Important

Authors: Eleni Karasavvidou

Abstract:

Social and anthropological research, parallel to Gender Studies, has highlighted the relationship between social structures and symbolic forms as an important field for the interaction and recording of 'social trends,' since the study of representations can contribute to the understanding of the social functions and power relations they encompass. This 'mirage,' however, has to do not only with the representations themselves but also with the ways they are received and with the film or critical narratives that are established as dominant or alternative. Cinema and the criticism of its cultural products are no exception. Even in the rapidly changing media landscape of the 21st century, movies remain an integral and widespread part of popular culture, making films an extremely powerful means of 'legitimizing' or 'delegitimizing' visions of domination and commonsensical gender stereotypes throughout society. And yet it is film criticism, the 'language per se,' that legitimizes, reinforces, rewards, and reproduces (or at least ignores) the stereotypical depictions of female roles that remain common in the realm of film images. This creates the need for the issue to be raised in academic research as well, questioning the gender criteria of film reviews as part of the effort towards an inclusive art and society. Qualitative content analysis is used to examine female roles in selected Oscar-nominated films against their reviews from leading websites and newspapers. This method was chosen because of the complex nature of the depictions in the films and the narratives they evoke. The films were divided into basic scenes depicting social functions, such as love and work relationships and positions of power and their function, which were analyzed by content analysis, with borrowings from structuralism (Genette) and the local/universal images of intercultural philology (Wierlacher). In addition to measuring the overall 'representation time' by gender, other qualitative characteristics were analyzed, such as speaking time, sayings or key actions, and the overall quality of a character's action in relation to the development of the scenario and to social representations in general, as well as quantitative ones (the insufficient number of female lead roles, fewer key supporting roles, the relatively few female directors and people in the production chain, and how these might affect screen representations). The quantitative analysis in this study was used to complement the qualitative content analysis. The focus then shifted to the criteria of film criticism and to the rhetorical narratives that exclude or highlight in relation to gender identities and functions. In the criteria and language of film criticism, stereotypes are often reproduced or allegedly overturned within the framework of apolitical 'identity politics,' which mainly addresses the surface of a self-referential cultural-consumer product without connecting it more deeply with material and cultural life. One of the prime examples of this failure is the Bechdel Test, which tracks whether female characters speak in a film regardless of whether women's stories are actually represented in the films analyzed. If supposedly unbiased male filmmakers still fail to tell truly feminist stories, the same is the case with the criteria of criticism and the related interventions.
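
As one possible quantitative complement to the content analysis described above, the sketch below tallies speaking time by gender from a hand-coded scene log; the scene annotations are invented for demonstration and do not come from the study.

```python
# Illustrative sketch only (not the authors' instrument): share of speaking time by
# gender computed from hand-coded scene annotations. The data are invented.
from collections import defaultdict

# (scene, character gender, seconds of dialogue) -- assumed annotations
scene_log = [
    ("opening", "F", 35), ("opening", "M", 80),
    ("workplace", "M", 120), ("workplace", "F", 25),
    ("finale", "M", 60), ("finale", "F", 40),
]

totals = defaultdict(int)
for _, gender, seconds in scene_log:
    totals[gender] += seconds

grand_total = sum(totals.values())
for gender, seconds in totals.items():
    print(f"{gender}: {seconds} s ({100 * seconds / grand_total:.1f} % of speaking time)")
```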

Keywords: representations, content analysis, reviews, sexist stereotypes

Procedia PDF Downloads 57
117 Mixed Mode Fracture Analyses Using Finite Element Method of Edge Cracked Heavy Annulus Pulley

Authors: Bijit Kalita, K. V. N. Surendra

Abstract:

The pulley works under both compressive loading, due to the contacting belt in tension, and a central torque that causes rotation. In a power transmission system, the belt-pulley assembly presents a contact problem in the form of two mating cylindrical parts. In this work, we modeled a pulley as a heavy two-dimensional circular disk. Stress analysis due to contact loading in the pulley mechanism is performed. Finite element analysis (FEA) is conducted for a pulley to investigate the stresses experienced on its inner and outer periphery. In heavy-duty applications such as automotive engines and industrial machines, the belt drive is one of the most frequently used mechanisms to transmit power, and very heavy circular disks are usually used as pulleys. A pulley can be described as a drum and may have a groove between two flanges around its circumference. A rope, belt, cable, or chain can be the driving element of a pulley system that runs over the pulley inside the groove. A pulley experiences normal and shear tractions on its contact region in the process of motion transmission; the region may be the belt-pulley contact surface or the pulley-shaft contact surface. In 1882, Hertz solved the elastic contact problem for the point contact and line contact of ideally smooth bodies, and this theory has since been generally utilized for computing the actual contact zone. Detailed stress analysis in the contact region of such pulleys is necessary to prevent early failure. In this paper, the results of the finite element analyses carried out on the compressed disk of a belt-pulley arrangement using fracture mechanics concepts are presented. Based on the literature on contact stress problems arising in a wide field of applications, the stress distribution generated on the shaft-pulley and belt-pulley interfaces due to the application of high tension and torque was evaluated in this study using FEA concepts, and the results obtained from ANSYS (APDL) were compared with Hertzian contact theory. The study is mainly focused on the fatigue life estimation of a rotating part as a component of an engine assembly using the well-known Paris equation. Digital Image Correlation (DIC) analyses have been performed using open-source software. From the displacements computed using images acquired at the minimum and maximum force, the displacement field amplitude is computed; from these fields, the crack path is defined, and the stress intensity factors and crack tip position are extracted. A non-linear least-squares projection is used for the estimation of fatigue crack growth. Further work will extend the study to various applications of rotating machinery, such as rotating flywheel disks, jet engines, compressor disks, and roller disk cutters, where the Stress Intensity Factor (SIF) calculation plays a significant role in the accuracy and reliability of a safe design. Additionally, the study will be extended to predict crack propagation in the pulley using the maximum tangential stress (MTS) criterion for mixed-mode fracture.
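
For readers unfamiliar with the Paris equation mentioned above, the following sketch shows a generic fatigue-life integration for an edge crack; the material constants, geometry factor, stress range, and crack sizes are assumed values, not those of the analyzed pulley.

```python
# Hedged sketch of a Paris-law fatigue life estimate, da/dN = C * (dK)^m.
# All constants and crack sizes are assumed values for demonstration only.
import numpy as np

C, m = 1.0e-12, 3.0          # Paris constants (assumed, SI-consistent units)
Y = 1.12                     # geometry factor for an edge crack (assumed)
d_sigma = 150.0              # applied stress range in MPa (assumed)
a0, ac = 1.0e-3, 10.0e-3     # initial and critical crack lengths in m (assumed)

a = a0
N = 0.0
da = 1.0e-6                  # crack-length increment for numerical integration (m)
while a < ac:
    dK = Y * d_sigma * np.sqrt(np.pi * a)   # stress intensity factor range, MPa*sqrt(m)
    dadN = C * dK ** m                      # Paris law crack growth rate, m/cycle
    N += da / dadN                          # cycles spent growing the crack by da
    a += da

print(f"estimated fatigue life: {N:.3e} cycles")
```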

Keywords: crack-tip deformations, contact stress, stress concentration, stress intensity factor

Procedia PDF Downloads 104
116 Librarian Liaisons: Facilitating Multi-Disciplinary Research for Academic Advancement

Authors: Tracey Woods

Abstract:

In the ever-evolving landscape of academia, the traditional role of the librarian has undergone a remarkable transformation. Once considered custodians of books and gatekeepers of information, librarians now have the potential to take on the vital role of facilitators of cross- and inter-disciplinary projects. This shift is driven by the growing recognition of the value of interdisciplinary collaboration in addressing complex research questions in pursuit of novel solutions to real-world problems. This paper shall explore the potential of the academic librarian's role in facilitating innovative, multi-disciplinary projects, both recognising and validating the vital role that the librarian plays in a somewhat underplayed profession. Academic libraries support teaching, the strengthening of knowledge discourse, and, potentially, the development of innovative practices. As the role of the library gradually morphs from a quiet repository of books to a community-based information hub, a potential opportunity arises. The academic librarian's role is to build knowledge across a wide span of topics, from the advancement of AI to subject-specific information, and, whilst librarians are generally not offered the research opportunities and funding that the traditional academic disciplines enjoy, they are often invited to help build research in support of the academic. This suggests that one of the primary skills of any 21st-century librarian must be the ability to collaborate on and facilitate multi-disciplinary projects. In universities seeking to develop research diversity and academic performance, there is an increasing awareness of the need for collaboration between faculties to enable novel directions and advancements. This idea has been documented and discussed by several researchers; however, there is not a great deal of literature available from recent studies. Having a team based in the library that is adept at creating effective collaborative partnerships is valuable for any academic institution. This paper outlines the development of such a project, initiated within and around an identified library-specific need: the replication of fragile special collections for object-based learning. The research was developed as a multi-disciplinary project involving the faculties of engineering (digital twins lab), architecture, design, and education. Centred on methods for developing a fragile archive into a series of tactile objects, the project furthers knowledge and understanding of the role of the library as a facilitator of projects, chairing and supporting, alongside contributing to the research process and generating ideas through the bank of knowledge found amongst the staff and their liaising capabilities. This paper shall present the method of project development from the initiation of ideas to the development of prototypes and the dissemination of the objects to teaching departments for analysis. The exact replication of artefacts is also balanced with the adaptation and evolutionary speculations initiated by the design team when the method is adapted in a teaching studio. The dynamic response required from the library to generate and facilitate these multi-disciplinary projects highlights the information expertise and liaison skills that the librarian possesses. As academia embraces this evolution, the potential for groundbreaking discoveries and innovative solutions across disciplines becomes increasingly attainable.

Keywords: Liaison librarian, multi-disciplinary collaborations, library innovations, librarian stakeholders

Procedia PDF Downloads 41
115 Menstruating Bodies and Social Control – Insights From Dignity Without Danger: Collaboratively Analysing Menstrual Stigma and Taboos in Nepal

Authors: Sara Parker, Kay Standing

Abstract:

This paper will share insights into how menstruators' bodies are viewed and controlled in Nepal due to deeply held stigmas and taboos that frame menstrual blood as impure and polluting. It draws on a British Academy Global Challenges Research Fund (BA/GCRF) funded project, 'Dignity Without Danger,' that ran from December 2019 to 2022. In Nepal, beliefs and myths around menstrual practices prevail and vary according to time, generation, caste, and class. Physical seclusion and/or restrictions include the consumption of certain foods, the ability to touch certain people and objects, and restricted access to water sources. These restrictions not only put women at risk of poor health outcomes, but they also promote discrimination and challenge fundamental human rights. Despite the pandemic, a wealth of field research and creative outputs have been generated to help break the silence that surrounds menstruation and to highlight the complexity of addressing the harms associated with the exclusion from sacred and profane spaces that menstruators face. Working with locally recruited female research assistants and NGOs, and bringing together academics from the UK and Nepal, we explore the intersecting factors that shape menstrual experiences and how they vary throughout Nepal. We concur with Tamang that there is no such thing as a 'Nepali woman,' and there is no one narrative that captures the experiences of menstruators in Nepal. These deeply held beliefs and practices mean that menstruators are denied their right to a dignified menstruation. By being excluded from public and private spaces, such as temples and religious sites as well as kitchens and their own bedrooms in their own homes, individuals are affected in complex and interesting ways. Existing research in Nepal by academics and activists demonstrates that current programmes and initiatives do not fully address the misconceptions that underpin the exclusionary practices affecting sexual and reproductive health and a sense of well-being, and highlights that more work is needed in this area. Research has been conducted in all 7 provinces, and by exploring and connecting disparate stories, artefacts, and narratives, we will deepen understanding of the complexity of menstrual practices, enabling local stakeholders to challenge exclusionary practices. We use creative methods to engage with stakeholders and share our research findings, as well as highlighting the wealth of activism in Nepal, and we emphasize the importance of working with local communities and leaders and cutting across disciplines and agencies to promote menstrual justice and dignity. Our research findings and the creative outputs that we share on social media channels, such as the Dignity Without Danger Facebook, Instagram, and YouTube accounts, stress the value of employing a collaborative action research approach to generate material that helps local people take control of their own narrative and change the social relations that lead to harmful practices.

Keywords: menstruation, Nepal, stigma, social norms

Procedia PDF Downloads 46
114 Assessment of Potential Chemical Exposure to Betamethasone Valerate and Clobetasol Propionate in Pharmaceutical Manufacturing Laboratories

Authors: Nadeen Felemban, Hamsa Banjer, Rabaah Jaafari

Abstract:

One of the most common hazards in the pharmaceutical industry is the chemical hazard, which can cause harm or lead to occupational diseases and illnesses due to chronic exposure to hazardous substances. Therefore, a chemical agent management system is required, including hazard identification, risk assessment, controls for specific hazards, and inspections, to keep the workplace healthy and safe. However, routine monitoring is also required to verify the effectiveness of the control measures. Betamethasone Valerate and Clobetasol Propionate are APIs (Active Pharmaceutical Ingredients) with a highly hazardous classification, Occupational Hazard Category (OHC) 4, which requires full containment (ECA-D) during handling to avoid chemical exposure. According to the Safety Data Sheets, these chemicals are reproductive toxicants (reprotoxicant, H360D), which may affect female workers' health, damage the unborn child, or impair fertility. In this study, a qualitative chemical risk assessment (qCRA) was conducted to assess chemical exposure during the handling of Betamethasone Valerate and Clobetasol Propionate in pharmaceutical laboratories. The outcomes of the qCRA identified a risk of potential chemical exposure (risk rating 8, amber risk). Therefore, immediate actions were taken to ensure interim controls (according to the hierarchy of controls) were in place and in use to minimize the risk of chemical exposure. No open handling should be done outside the steroid glove box isolator (SGB), and the required personal protective equipment (PPE) must be worn. The PPE includes a coverall, nitrile gloves, safety shoes, and a powered air-purifying respirator (PAPR). Furthermore, a quantitative assessment (personal air sampling) was conducted to verify the effectiveness of the engineering controls (SGB isolator) and to confirm whether there is chemical exposure, as indicated earlier by the qCRA. Three personal air samples were collected using an air sampling pump and filters (IOM2 filters, 25 mm glass fiber media). The collected samples were analyzed by HPLC in the BV lab, and the measured concentrations were reported in µg/m3 with reference to the 8-hr Occupational Exposure Limits (8-hr TWA OELs) for each analyte. The analytical results, expressed as 8-hr TWA (time-weighted average) values, were analyzed using Bayesian statistics (IHDataAnalyst). The results of the Bayesian likelihood graph indicate category 0, which means exposures are de minimis, trivial, or non-existent, and employees have little to no exposure. These results also indicate that the three samples are representative, with very low variation (SD=0.0014). In conclusion, the engineering controls were effective in protecting the operators from such exposure. However, routine chemical monitoring is required every 3 years unless there is a change in the process or type of chemicals. Also, frequent management monitoring (daily, weekly, and monthly) is required to ensure the control measures are in place and in use. Furthermore, a Similar Exposure Group (SEG) was identified for this activity and included in the annual health surveillance for health monitoring.
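
The 8-hr TWA comparison referred to above reduces to straightforward arithmetic, sketched below with assumed sample concentrations, durations, and OEL; the numbers are illustrative only and are not the study's measurements.

```python
# Minimal sketch of the 8-hr TWA arithmetic used to compare personal air samples
# with an OEL. Concentrations, durations, and the OEL are assumed numbers.
samples = [
    # (measured concentration in ug/m3, sampled duration in hours) -- assumed
    (0.012, 3.5),
    (0.008, 2.5),
]

# Dividing by 8 h assumes zero exposure for the unsampled remainder of the shift.
twa_8hr = sum(c * d for c, d in samples) / 8.0
oel = 0.1                                            # assumed 8-hr OEL in ug/m3
print(f"8-hr TWA = {twa_8hr:.4f} ug/m3 ({100 * twa_8hr / oel:.1f} % of the OEL)")
```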

Keywords: occupational health and safety, risk assessment, chemical exposure, hierarchy of control, reproductive

Procedia PDF Downloads 154
113 Development of a Conceptual Framework for Supply Chain Management Strategies Maximizing Resilience in Volatile Business Environments: A Case of Ventilator Challenge UK

Authors: Elena Selezneva

Abstract:

Over the last two decades, unprecedented growth in uncertainty and volatility in all aspects of the business environment has caused major global supply chain disruptions and malfunctions. The effects of one failed company in a supply chain can ripple up and down the chain, causing a number of entities, or an entire supply chain, to collapse. The complicating factor is that an increasingly unstable and unpredictable business environment fuels the growing complexity of global supply chain networks. That makes supply chain operations extremely unpredictable and hard to manage with established methods and strategies, and it has caused the premature demise of many companies around the globe that could not withstand or adapt to the storm of change. Solutions to this problem are not easy to come by. There is a lack of new empirically tested theories and practically viable supply chain resilience strategies. The mainstream organizational approach to managing supply chain resilience is rooted in well-established theories developed in the 1960s-1980s. However, their effectiveness is questionable in today's extremely volatile business environments. The systems thinking approach offers an alternative view of supply chain resilience, but it is still very much in the development stage. The aim of this explorative research is to investigate supply chain management strategies that are successful in taming complexity in volatile business environments and creating resilience in supply chains. The design of this research methodology was guided by an interpretivist paradigm, and a literature review informed the selection of the systems thinking approach to supply chain resilience. An explorative single case study of Ventilator Challenge UK was selected for the extremely resilient performance of its supply chain during a period of national crisis. Ventilator Challenge UK was an intensive care ventilator supply project for the NHS; it ran for 3.5 months and finished in 2020. The participants have since moved on with their lives, and most of them are no longer employed by the same organizations. Therefore, the study data include documents, historical interviews, live interviews with participants, and social media postings. The data analysis was accomplished in two stages. First, the data were thematically analyzed. In the second stage, pattern matching and pattern identification were used to identify the themes that form the findings of the research. The findings show that the supply chain management practices of the Ventilator Challenge UK case demonstrated all the features of an adaptive dynamic system. They cover all the elements of the supply chain and employ an entire arsenal of adaptive dynamic system strategies enabling supply chain resilience. Moreover, the system is not a simple sum of parts and strategies: the bonding elements and connections between the components of the supply chain and its environment enabled the amplification of resilience in the form of systemic emergence. The enablers are categorized into three subsystems: supply chain central strategy, supply chain operations, and supply chain communications. Together, these subsystems and their interconnections form the resilient supply chain system framework conceptualized by the author.

Keywords: enablers of supply chain resilience, supply chain resilience strategies, systemic approach in supply chain management, resilient supply chain system framework, ventilator challenge UK

Procedia PDF Downloads 61
112 Innovation in PhD Training in the Interdisciplinary Research Institute

Authors: B. Shaw, K. Doherty

Abstract:

The Cultural Communication and Computing Research Institute (C3RI) is a diverse multidisciplinary research institute spanning art, design, media production, communication studies, computing, and engineering. Across these disciplines there can seem to be enormous differences of research practice and convention, including differing positions on objectivity and subjectivity, certainty and evidence, and different political and ethical parameters. These differences sit within often unacknowledged histories, codes, and communication styles of specific disciplines, and it is all these aspects that can make understanding research practice across disciplines difficult. To explore this, a one-day event was orchestrated, testing how a PhD community might communicate and share research in progress in a multi-disciplinary context. Instead of presenting results at a conference, research students were tasked with articulating their method of inquiry. A working party of students from across the disciplines had to design a conference call, a visual identity, and an event framework that would work for students across all disciplines. The process of establishing the shape and identity of the conference was revealing. Even finding a linguistic frame that would meet the expectations of the different disciplines for the conference call was challenging. The first abstracts submitted either resorted to reporting findings or only described method briefly. It took several weeks of supported intervention for research students to get 'inside' their method and to understand their research practice as a process rich with philosophical and practical decisions and implications. In response to the abstracts, the conference committee generated key methodological categories for the conference sessions, including sampling, capturing 'experience', 'making models', researcher identities, and 'constructing data'. Each session involved presentations by visual artists, communications students, and computing researchers, with inter-disciplinary dialogue facilitated by alumni chairs. The apparently simple focus on method illuminated the research process as a site of creativity, innovation, and discovery, and also built epistemological awareness, drawing attention to what is being researched and how it can be known. It was surprisingly difficult to limit students to discussing method, and it was apparent that the vocabulary available for method is sometimes limited. However, by focusing on method rather than results, the genuine process of research, rather than one constructed for approval, could be captured. In unlocking the twists and turns of planning and implementing research, and the impact of circumstance and contingency, students had to reflect frankly on successes and failures. This level of self- and public critique emphasised the degree of critical thinking and rigour required in executing research and demonstrated that honest reportage of research, faults and all, is good, valid research. The process also revealed the degree to which disciplines can learn from each other: the computing students gained insights from the sensitive social contextualizing generated by the communications and art and design students, and the art and design students gained understanding from the greater 'distance' and emphasis on application that the computing students applied to their subjects.
Finding the means to develop dialogue across disciplines makes researchers better equipped to devise and tackle research problems across disciplines, potentially laying the ground for more effective collaboration.

Keywords: interdisciplinary, method, research student, training

Procedia PDF Downloads 185
111 Mycophenolate Mofetil Increases Mucin Expression in Primary Cultures of Oral Mucosal Epithelial Cells for Application in Limbal Stem Cell Deficiency

Authors: Sandeep Kumar Agrawal, Aditi Bhattacharya, Janvie Manhas, Krushna Bhatt, Yatin Kholakiya, Nupur Khera, Ajoy Roychoudhury, Sudip Sen

Abstract:

Autologous cultured explants of human oral mucosal epithelial cells (OMEC) are a potential therapeutic modality for limbal stem cell deficiency (LSCD). Injury or inflammation of the ocular surface, in the form of burns, chemical exposure, Stevens-Johnson syndrome, ocular cicatricial pemphigoid, etc., can lead to destruction and deficiency of limbal stem cells. LSCD manifests as severe ocular surface disease (OSD) characterized by persistent and recurrent epithelial defects, conjunctivalisation and neovascularisation of the corneal surface, scarring, and ultimately opacity and blindness. Most cases of OSD are associated with severe dry eye owing to diminished mucin and aqueous secretion. Mycophenolate mofetil (MMF) has been shown to upregulate mucin expression in conjunctival goblet cells in vitro. The aim of this study was to evaluate the effects of MMF on mucin expression in primary cultures of oral mucosal epithelial cells. With institutional ethics committee approval and written informed consent, thirty oral mucosal epithelial tissue samples were obtained from patients undergoing oral surgery for non-malignant conditions. OMEC were grown for 2 weeks on a human amniotic membrane (HAM, obtained from expectant mothers undergoing elective caesarean section) scaffold in growth media containing DMEM and Ham's F12 (1:1) with 10% FBS and growth factors. The in vitro dosage of MMF was standardised by MTT assay. Analysis of stem cell markers was performed using RT-PCR, while mucin mRNA expression was quantified using RT-PCR and qPCR before and after treating cultured OMEC with graded concentrations of MMF for 24 hours. Protein expression was validated using immunocytochemistry. Morphological studies revealed a confluent sheet of proliferating, stratified oral mucosal epithelial cells growing over the surface of the HAM scaffold. The presence of progenitor stem cell markers (p63, p75, β1-integrin, and ABCG2) and cell-surface-associated mucins (MUC1, MUC15, and MUC16) was elucidated by RT-PCR. Mucin mRNA expression was found to be upregulated in MMF-treated primary cultures of OMEC compared to untreated controls, as quantified by qPCR with β-actin as the internal reference gene. Increased MUC1 protein expression was validated by immunocytochemistry on representative samples. Our findings indicate that OMEC have the ability to form a multi-layered confluent sheet on the surface of HAM, similar to a cornea, which is important for the reconstruction of a damaged ocular surface. Cultured OMEC have stem cell properties, as demonstrated by the stem cell markers, and MMF can be a novel enhancer of mucin production in OMEC. It has the potential to improve dry eye in patients undergoing OMEC transplantation for bilateral OSD. Further clinical trials are required to establish the role of MMF in patients undergoing OMEC transplantation.
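
As an illustration of the qPCR normalization described above, the following sketch computes a relative MUC1 fold change by the standard 2^-ΔΔCt method with β-actin as the reference gene; the Ct values are assumed for demonstration and are not the study's data.

```python
# Illustrative sketch, not the study's analysis: relative MUC1 expression by the
# 2^-ddCt method with beta-actin (ACTB) as the internal reference. Ct values are assumed.
ct = {
    "control": {"MUC1": 26.8, "ACTB": 17.2},   # mean Ct values (assumed)
    "MMF":     {"MUC1": 24.9, "ACTB": 17.1},
}

d_ct_control = ct["control"]["MUC1"] - ct["control"]["ACTB"]   # normalize to beta-actin
d_ct_mmf = ct["MMF"]["MUC1"] - ct["MMF"]["ACTB"]
dd_ct = d_ct_mmf - d_ct_control
fold_change = 2 ** (-dd_ct)

print(f"MUC1 fold change (MMF vs control): {fold_change:.2f}")  # >1 indicates upregulation
```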

Keywords: limbal stem cell deficiency, mycophenolate mofetil, mucin, ocular surface disease

Procedia PDF Downloads 304
110 Overlaps and Intersections: An Alternative Look at Choreography

Authors: Ashlie Latiolais

Abstract:

Architecture, as a discipline, is on a trajectory of extension beyond the boundaries of buildings and, increasingly, is coupled with research that connects to alternative and typically disjointed disciplines. A "both/and" approach and (expanded) definition of architecture, as depicted here, expand the margins that contain the profession. Figuratively, architecture is a series of edges, events, and occurrences that establishes a choreography, or stage, by which humanity exists. The way in which architecture controls and suggests movement through these spaces, whether within a landscape, a city, or a building, can be viewed as a datum against which the "dance" of everyday life occurs. This submission views the realm of architecture through the lens of movement and dance as a cross-fertilizer of collaboration and of tectonic and spatial geometry investigations. "Designing on digital programs puts architects at a distance from the spaces they imagine. While this has obvious advantages, it also means that they lose the lived, embodied experience of feeling what is needed in space—meaning that some design ideas that work in theory ultimately fail in practice." By studying the body in motion through real-time performance, a more holistic understanding of architectural space surfaces, and new prospects for theoretical teaching pedagogies emerge. This atypical intersection rethinks how architecture is considered, created, and tested, similar to how "dance artists often do this by thinking through the body, opening pathways and possibilities that might not otherwise be accessible"; this is the essence of this poster submission, as explained through unFOLDED, a creative performance work. A new language is materialized through unFOLDED, a dynamic occupiable installation through which architecture is investigated via dance, movement, and body analysis. The entry unfolds a collaboration of an architect, a dance choreographer, musicians, a video artist, and lighting designers to re-create one of the first documented avant-garde performing arts collaborations (Matisse, Satie, Massine, Picasso) from the Ballets Russes in 1917, entitled Parade. Architecturally, this interdisciplinary project orients and suggests motion through structure, tectonics, lightness, darkness, and shadow as it questions the navigation of the dark space (stage) surrounding the installation. Artificial light, via theatrical lighting and video graphics, brought the blank canvas to life, where the sensitive mix of musicality coordinated with the structure's movement sequencing was certainly a challenge. The upstage light from the video projections created both flickering contextual imagery and shadowed figures. When the dancers were either upstage or downstage of the structure, both silhouetted figures and revealed bodies were experienced, as dancer-controlled manipulations of the installation occurred throughout the performance. The experimental performance, through structure, prompted moving (dancing) bodies in space, where the architecture served as a key component of the choreography itself. The tectonic of the delicate steel structure allowed the dancers to interact with the installation, which created a variety of spatial conditions, from the contained box of three-dimensional space, to a wall, to various abstracted geometries in between. The development of this research unveils the new role of the architect as a choreographer of the built environment.

Keywords: dance, architecture, choreography, installation, architect, choreographer, space

Procedia PDF Downloads 72
109 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment

Authors: Hatem Abou-Senna

Abstract:

Interstate 4 (I-4) is a primary east-west transportation corridor between Tampa and Daytona, serving commuter, commercial, and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles in the evening peak period in the central corridor area, as I-4 is the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive toll facilities in Central Florida. So, in search of a plausible mitigation of the congestion on the I-4 corridor, this research evaluates the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, the freeway (I-4) and the toll road SR 417, in addition to two east-west parallel toll roads, SR 408 and SR 528, intersecting the above-mentioned highways at both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road that is 15 miles longer than I-4, with $5 in tolls compared to no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that the percentage of route diversion varies from one route to another and depends primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when tolls were eliminated and/or real-time information was provided. However, diversion from I-4 to SR 417 for these O-D pairs occurred only in the cases of an incident or lane closure on I-4, due to the increase in delay and travel costs, and when information was provided to travelers. Furthermore, drivers that diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings, which was attributed to the limited extra capacity of the alternative routes in the peak period and the longer travel distance. When the remaining origin-destination pairs were analyzed, average travel time savings on I-4 ranged between 10 and 16%, amounting to 10 minutes at most, with a 10% increase in the network average speed. The propensity for diversion on the network increased significantly when tolls were eliminated on SR 417 and SR 528 while the tolls on SR 408 were doubled, along with the incident and lane closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the user perception of the toll cost, which was reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on the surrounding toll roads would have only a minimal impact on reducing I-4 congestion during the peak period.
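
As a simplified illustration of how toll and travel time trade off in route diversion, the sketch below applies a multinomial-logit split over generalized cost; it is not the PARAMICS dynamic feedback assignment used in the study, and all parameter values are assumed.

```python
# Hedged illustration only: a multinomial-logit route split over generalized cost
# (travel time plus toll converted to minutes via an assumed value of time).
import math

value_of_time = 0.30          # $/min (assumed)
theta = 0.10                  # logit dispersion parameter (assumed)

routes = {
    "I-4":    {"time_min": 55.0, "toll": 0.0},   # assumed peak travel times and tolls
    "SR 417": {"time_min": 48.0, "toll": 5.0},
}

cost = {name: r["time_min"] + r["toll"] / value_of_time for name, r in routes.items()}
denom = sum(math.exp(-theta * c) for c in cost.values())
for name, c in cost.items():
    share = math.exp(-theta * c) / denom
    print(f"{name}: generalized cost {c:.1f} min, predicted share {100 * share:.1f} %")
```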

Keywords: congestion pricing, dynamic feedback assignment, microsimulation, paramics, route diversion

Procedia PDF Downloads 152
108 Ectopic Osteoinduction of Porous Composite Scaffolds Reinforced with Graphene Oxide and Hydroxyapatite Gradient Density

Authors: G. M. Vlasceanu, H. Iovu, E. Vasile, M. Ionita

Abstract:

Herein, the synthesis and characterization of a highly porous chitosan-gelatin scaffold reinforced with graphene oxide (GO) and hydroxyapatite (HAp) and crosslinked with genipin were targeted. In tissue engineering, chitosan and gelatin are two of the most robust biopolymers, with wide applicability due to their intrinsic biocompatibility, biodegradability, low antigenicity, affordability, and ease of processing. HAp, owing to its exceptional activity in tuning cell-matrix interactions, is acknowledged for its capability of sustaining cellular proliferation by providing a bone-like native micro-environment for cell adjustment. Genipin is regarded as a top-class cross-linker, while graphene oxide is viewed as one of the most performant and versatile fillers. The composites, with the HAp/biopolymer ratio of natural bone, were obtained by cascaded sonochemical treatments, followed by straightforward casting methods and freeze-drying. Their structure was characterized by Fourier Transform Infrared Spectroscopy (FTIR) and X-ray Diffraction (XRD), while the overall morphology was investigated by Scanning Electron Microscopy (SEM) and micro-Computed Tomography (µ-CT). Following that, in vitro enzymatic degradation was performed to identify the most promising compositions for the development of in vivo assays. Suitable GO dispersion within the biopolymer mix was ascertained, as nanolayer-specific signals were absent in both the FTIR and XRD spectra, and the specific spectral features of the polymers persisted as the GO load was increased. Overall, the GO-induced material structuration, crystallinity variations, and chemical interactions of the compounds can be correlated with the physical features and bioactivity of each composite formulation. Moreover, the HAp distribution follows a favourable density gradient tuned for hybrid osseous/cartilage architectures, which was mirrored in the mouse model tests. Hence, this synthesis route for a natural polymer blend/hydroxyapatite-graphene oxide composite material is anticipated to emerge as an influential formulation in bone tissue engineering. Acknowledgement: This work was supported by the project 'Work-based learning systems using entrepreneurship grants for doctoral and post-doctoral students' (Sisteme de invatare bazate pe munca prin burse antreprenor pentru doctoranzi si postdoctoranzi) - SIMBA, SMIS code 124705, and by a grant of the National Authority for Scientific Research and Innovation, Operational Program Competitiveness Axis 1 - Section E, Program co-financed from the European Regional Development Fund 'Investments for your future' under project number 154/25.11.2016, P_37_221/2015. The nano-CT experiments were possible due to the European Regional Development Fund through the Competitiveness Operational Program 2014-2020, Priority axis 1, ID P_36_611, MySMIS code 107066, INOVABIOMED.

Keywords: biopolymer blend, ectopic osteoinduction, graphene oxide composite, hydroxyapatite

Procedia PDF Downloads 86
107 Rethinking the Languages for Specific Purposes Syllabus in the 21st Century: Topic-Centered or Skills-Centered

Authors: A. Knezović

Abstract:

The 21st century has transformed the labor market landscape by posing new and different demands on university graduates as well as university lecturers, which means that the knowledge and academic skills students acquire in the course of their studies should be applicable and transferable from the higher education context to their future professional careers. In the context of the Languages for Specific Purposes (LSP) classroom, the teacher's objective is not only to teach the language itself, but also to prepare students to use that language as a medium to develop generic skills and competences. These include media and information literacy, critical and creative thinking, problem-solving and analytical skills, effective written and oral communication, as well as collaborative work and social skills, all of which are necessary to make university graduates more competitive in everyday professional environments. On the other hand, due to limitations of time and large numbers of students in classes, the frequently topic-centered syllabus of LSP courses places considerable focus on acquiring the subject matter and specialist vocabulary instead of sufficiently developing the skills and competences required by students' prospective employers. This paper intends to explore some of those issues as viewed both by LSP lecturers and by business professionals in their respective surveys. The surveys were conducted among more than 50 LSP lecturers at higher education institutions in Croatia, more than 40 HR professionals, and more than 60 university graduates with degrees in economics and/or business working in management positions in mainly large and medium-sized companies in Croatia. Various elements of LSP course content were taken into consideration in this research, including reading and listening comprehension of specialist texts, acquisition of specialist vocabulary and grammatical structures, and presentation and negotiation skills. The ability to hold meetings, conduct business correspondence, write reports, academic texts, and case studies, and take part in debates was also taken into consideration, as were informal business communication, business etiquette, and core courses delivered in a foreign language. The results of the surveys conducted among LSP lecturers will be analyzed with reference to the extent to which those elements are included in their courses and how consistently and thoroughly they are evaluated according to the course requirements. Their opinions will be compared to the results of the surveys conducted among professionals from a range of industries in Croatia, so as to examine how useful and important they perceive the same elements of LSP course content to be in their working environments. Such comparative analysis will thus show to what extent the syllabi of LSP courses meet the demands of the employment market when it comes to students' language skills and competences, as well as transferable skills. Finally, the findings will also be compared to observations based on practical teaching experience and the relevant sources used in this research. In conclusion, the ideas and observations in this paper are merely open-ended questions that do not have conclusive answers, but they might prompt LSP lecturers to re-evaluate the content and objectives of their course syllabi.

Keywords: languages for specific purposes (LSP), language skills, topic-centred syllabus, transferable skills

Procedia PDF Downloads 287
106 Impact of COVID-19 on Study Migration

Authors: Manana Lobzhanidze

Abstract:

The COVID-19 pandemic has brought significant changes to migration processes, notably to study migration. The constraints caused by the pandemic led to changes in the studying process, which negatively affected its efficiency: the educational process shifted partially or completely to distance learning, while both labor and study migration increased significantly worldwide. The employment and education markets have become global, and consequently a number of challenges have arisen for employers, researchers, and businesses. The paper substantiates the role of preparing qualified personnel in achieving high productivity, assesses the benefits for employers and employees on the one hand, and examines the role of study migration in the country’s development on the other. Research methods. The research is based on methods of analysis and synthesis, quantitative and qualitative methods, groupings, relative and mean quantities, graphical representation, and comparison. In-depth interviews were conducted with experts to determine quantitative and qualitative indicators. Research findings. The paper analyses the factors affecting study migration and explores the environment that stimulates migration. One of the driving forces of migration is considered to be the desire for higher pay. Levels and indicators of study migration are studied by country. Comparative analysis has found that study migration rates are high in countries where the price of skilled labor is high. The productivity of individuals with low skills is low, which negatively affects the economic development of countries. It has been revealed that students leave the country to improve their skills during study migration. This process is evaluated as a positive development for a developing country, as individuals are given the opportunity to share the technology of developed countries, gain knowledge, and then introduce it in their own country. The downside of study migration is that only a small proportion of graduates return from developed economies to their home countries. The article concludes that countries with emerging economies devote fewer resources to research and development, while this is a priority in developed countries, allowing highly skilled individuals to use their skills efficiently. The paper also studies the national education system and examines the level of competition in the education and labor markets as well as the indicators of study migration. During the pandemic period, there was great demand for digital technologies. Open access to a variety of comprehensive platforms will significantly reduce study migration to other countries. As a forecast, it can be said that the intensity of the use of e-learning platforms will increase significantly in the post-pandemic period.
The paper analyzes the positive and negative effects of study migration on economic development, examines the challenges of study migration in light of the COVID-19 pandemic, suggests ways to avoid negative consequences, and develops recommendations for improving the study migration process in the post-pandemic period.

Keywords: study migration, COVID-19 pandemic, factors affecting migration, economic development, post-pandemic migration

Procedia PDF Downloads 110
105 LncRNA-miRNA-mRNA Networks Associated with BCR-ABL T315I Mutation in Chronic Myeloid Leukemia

Authors: Adenike Adesanya, Nonthaphat Wong, Xiang-Yun Lan, Shea Ping Yip, Chien-Ling Huang

Abstract:

Background: The most challenging mutation of the BCR-ABL oncokinase is T315I, commonly known as the “gatekeeper” mutation, which is notorious for its strong resistance to almost all tyrosine kinase inhibitors (TKIs), especially imatinib. Therefore, this study aims to identify T315I-dependent downstream microRNA (miRNA) pathways associated with drug resistance in chronic myeloid leukemia (CML) for prognostic and therapeutic purposes. Methods: T315I-carrying K562 cell clones (K562-T315I) were generated by the CRISPR-Cas9 system. Imatinib-treated K562-T315I cells were subjected to small RNA library preparation and next-generation sequencing. Putative lncRNA-miRNA-mRNA networks were analyzed with (i) DESeq2 to extract differentially expressed miRNAs, using a Padj value of 0.05 as cut-off, (ii) STarMir to obtain potential miRNA response element (MRE) binding sites of selected miRNAs on lncRNA H19, (iii) miRDB, miRTarbase, and TargetScan to predict mRNA targets of selected miRNAs, (iv) IntaRNA to obtain putative interactions between H19 and the predicted mRNAs, (v) Cytoscape to visualize putative networks, and (vi) several pathway analysis platforms – Enrichr, PANTHER and ShinyGO – for pathway enrichment analysis. Moreover, mitochondria isolation and transcript quantification were adopted to determine the new mechanism involved in T315I-mediated resistance to CML treatment. Results: Verification of the CRISPR-mediated mutagenesis by droplet digital PCR detected a mutation abundance of ≥80%. Further validation showed a viability of ≥90% by cell viability assay, and an intense phosphorylated CRKL protein band was detected by Western blot, with no observable change in BCR-ABL and c-ABL protein expression. Consistent with several investigations into hematological malignancies, we determined a 7-fold increase in H19 expression in K562-T315I cells. After imatinib treatment, a 9-fold increment was observed. DESeq2 revealed that 171 miRNAs were differentially expressed in K562-T315I cells; 112 of these miRNAs were identified to have MRE binding regions on H19, and 26 of the 112 miRNAs were significantly downregulated. Seed-sequence analysis of these identified miRNAs yielded 167 mRNAs. Six hub miRNAs (including hsa-let-7b-5p, hsa-let-7e-5p, hsa-miR-125a-5p, hsa-miR-129-5p, and hsa-miR-372-3p) and 25 predicted genes were identified after constructing the hub miRNA–target gene network. These targets demonstrated putative interactions with the H19 lncRNA and were mostly enriched in pathways related to cell proliferation, senescence, gene silencing, and pluripotency of stem cells. Further experimental findings have also shown the up-regulation of mitochondrial transcripts and the lncRNA MALAT1 contributing to the lncRNA-miRNA-mRNA networks induced by the BCR-ABL T315I mutation. Conclusions: Our results indicate that lncRNA-miRNA regulators play a crucial role not only in leukemogenesis but also in drug resistance, considering the significant dysregulation and interactions in the K562-T315I cell model generated by CRISPR-Cas9. In silico analysis has further shown that the lncRNAs H19 and MALAT1 bear several complementary miRNA sites. This implies that they could serve as sponges, sequestering the activity of the target miRNAs.
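For readers who want a concrete picture of the candidate-miRNA filtering steps summarized above (Padj < 0.05 differential expression, restriction to miRNAs with predicted MRE sites on H19, and selection of the significantly downregulated subset), a minimal sketch is given below. It is not the authors' pipeline: the file names and column names ("miRNA", "log2FoldChange", "padj") are hypothetical placeholders for an exported DESeq2 results table and a STarMir-derived list.

```python
# Minimal sketch (not the study's actual code) of the miRNA filtering logic described
# in the abstract, assuming hypothetical CSV exports of the DESeq2 and STarMir outputs.
import pandas as pd

def filter_candidate_mirnas(deseq_csv: str, h19_mre_csv: str,
                            padj_cutoff: float = 0.05) -> pd.DataFrame:
    """Return miRNAs that are (i) differentially expressed, (ii) predicted to bind H19,
    and (iii) downregulated in the T315I clone."""
    de = pd.read_csv(deseq_csv)                            # DESeq2 results (K562-T315I vs control)
    mre = pd.read_csv(h19_mre_csv)                         # miRNAs with predicted MRE sites on H19

    de_sig = de[de["padj"] < padj_cutoff]                  # e.g. 171 differentially expressed miRNAs
    on_h19 = de_sig[de_sig["miRNA"].isin(mre["miRNA"])]    # e.g. 112 with H19 binding regions
    downregulated = on_h19[on_h19["log2FoldChange"] < 0]   # e.g. 26 significantly downregulated
    return downregulated.sort_values("padj")

# Hypothetical usage:
# candidates = filter_candidate_mirnas("k562_t315i_deseq2.csv", "h19_starmir_mres.csv")
# print(candidates.head())
```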

Keywords: chronic myeloid leukemia, imatinib resistance, lncRNA-miRNA-mRNA, T315I mutation

Procedia PDF Downloads 133
104 Single Pass Design of Genetic Circuits Using Absolute Binding Free Energy Measurements and Dimensionless Analysis

Authors: Iman Farasat, Howard M. Salis

Abstract:

Engineered genetic circuits reprogram cellular behavior to act as living computers with applications in detecting cancer, creating self-controlling artificial tissues, and dynamically regulating metabolic pathways. Phenemenological models are often used to simulate and design genetic circuit behavior towards a desired behavior. While such models assume that each circuit component’s function is modular and independent, even small changes in a circuit (e.g. a new promoter, a change in transcription factor expression level, or even a new media) can have significant effects on the circuit’s function. Here, we use statistical thermodynamics to account for the several factors that control transcriptional regulation in bacteria, and experimentally demonstrate the model’s accuracy across 825 measurements in several genetic contexts and hosts. We then employ our first principles model to design, experimentally construct, and characterize a family of signal amplifying genetic circuits (genetic OpAmps) that expand the dynamic range of cell sensors. To develop these models, we needed a new approach to measuring the in vivo binding free energies of transcription factors (TFs), a key ingredient of statistical thermodynamic models of gene regulation. We developed a new high-throughput assay to measure RNA polymerase and TF binding free energies, requiring the construction and characterization of only a few constructs and data analysis (Figure 1A). We experimentally verified the assay on 6 TetR-homolog repressors and a CRISPR/dCas9 guide RNA. We found that our binding free energy measurements quantitatively explains why changing TF expression levels alters circuit function. Altogether, by combining these measurements with our biophysical model of translation (the RBS Calculator) as well as other measurements (Figure 1B), our model can account for changes in TF binding sites, TF expression levels, circuit copy number, host genome size, and host growth rate (Figure 1C). Model predictions correctly accounted for how these 8 factors control a promoter’s transcription rate (Figure 1D). Using the model, we developed a design framework for engineering multi-promoter genetic circuits that greatly reduces the number of degrees of freedom (8 factors per promoter) to a single dimensionless unit. We propose the Ptashne (Pt) number to encapsulate the 8 co-dependent factors that control transcriptional regulation into a single number. Therefore, a single number controls a promoter’s output rather than these 8 co-dependent factors, and designing a genetic circuit with N promoters requires specification of only N Pt numbers. We demonstrate how to design genetic circuits in Pt number space by constructing and characterizing 15 2-repressor OpAmp circuits that act as signal amplifiers when within an optimal Pt region. We experimentally show that OpAmp circuits using different TFs and TF expression levels will only amplify the dynamic range of input signals when their corresponding Pt numbers are within the optimal region. Thus, the use of the Pt number greatly simplifies the genetic circuit design, particularly important as circuits employ more TFs to perform increasingly complex functions.
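To make the general idea of a statistical-thermodynamic promoter model and of dimensionless collapse concrete, a generic two-state (Shea–Ackers style) occupancy sketch is given below. It is explicitly not the authors' model: the abstract does not give the functional form of the Pt number, so the grouping of parameters into a single dimensionless quantity here is only an illustration of the concept, and all parameter values are hypothetical.

```python
# Illustrative sketch of a generic statistical-thermodynamic model of a repressed
# promoter (mutually exclusive RNAP/TF binding). NOT the authors' model; the Pt number
# definition is not given in the abstract, so `repression_number` is only a stand-in
# for the idea of collapsing several co-dependent factors into one dimensionless value.
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)
T = 310.0     # temperature, K

def boltzmann_weight(copies: int, n_sites: float, dG_binding: float) -> float:
    """Relative occupancy weight of a protein present at `copies` per cell, competing
    over `n_sites` nonspecific genomic sites, with binding free energy dG (kcal/mol)."""
    return (copies / n_sites) * math.exp(-dG_binding / (R * T))

def transcription_rate(rnap_copies, tf_copies, genome_sites, dG_rnap, dG_tf, max_rate=1.0):
    """Steady-state transcription rate when the repressor operator overlaps the RNAP site."""
    w_p = boltzmann_weight(rnap_copies, genome_sites, dG_rnap)
    w_tf = boltzmann_weight(tf_copies, genome_sites, dG_tf)
    p_bound = w_p / (1.0 + w_p + w_tf)     # probability that RNAP occupies the promoter
    return max_rate * p_bound

def repression_number(rnap_copies, tf_copies, genome_sites, dG_rnap, dG_tf):
    """One dimensionless ratio controlling the promoter's output in this toy model."""
    w_p = boltzmann_weight(rnap_copies, genome_sites, dG_rnap)
    w_tf = boltzmann_weight(tf_copies, genome_sites, dG_tf)
    return w_tf / (1.0 + w_p)

# Hypothetical numbers: 3000 RNAP and 200 repressor copies, ~4.6e6 nonspecific sites.
# print(transcription_rate(3000, 200, 4.6e6, dG_rnap=-6.0, dG_tf=-12.0))
```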

Keywords: transcription factor, synthetic biology, genetic circuit, biophysical model, binding energy measurement

Procedia PDF Downloads 449
103 Reducing Road Traffic Accident: Rapid Evidence Synthesis for Low and Middle Income Countries

Authors: Tesfaye Dagne, Dagmawit Solomon, Firmaye Bogale, Yosef Gebreyohannes, Samson Mideksa, Mamuye Hadis, Desalegn Ararso, Ermias Woldie, Tsegaye Getachew, Sabit Ababor, Zelalem Kebede

Abstract:

Globally, road traffic accidents (RTAs) cause millions of deaths and injuries every year. They are one of the leading causes of death among people of all age groups, and the problem is worst among the young reproductive age group. Moreover, the problem is growing as the number of vehicles increases. The majority of the burden falls on low- and middle-income countries (LMICs), even though the number of vehicles in these countries is low compared to their population. The objective of this paper is therefore to summarize the best available evidence on interventions that can reduce road traffic accidents in low- and middle-income countries. Method: A rapid evidence synthesis approach adapted from the SURE Rapid Response Service was applied to search, appraise and summarize the best available evidence on effective interventions for reducing road traffic injury. To answer the question under review, we searched for relevant studies in databases including PubMed, the Cochrane Library, TRANSPORT, Health Systems Evidence, Epistemonikos, and SUPPORT Summaries. The following key terms were used for searching: Road traffic accident, RTA, Injury, Reduc*, Prevent*, Minimiz*, “Low and middle-income country”, LMIC. We found 18 articles through the search of the databases mentioned above. After screening the titles and abstracts of the articles, four that satisfied the inclusion criteria were included in the final review. We then appraised and graded the methodological quality of the systematic reviews deemed highly relevant using AMSTAR. Findings: The identified interventions to reduce road traffic accidents were legislation and enforcement, public awareness/education, speed control/rumble strips, road improvement, mandatory motorcycle helmets, graduated driver licensing, and street lighting. Legislation and enforcement: legislation focusing on mandatory motorcycle helmet usage, banning cellular phone usage when driving, seat belt laws, and decreasing the legal blood alcohol content (BAC) level from 0.06 g/L to 0.02 g/L brings the best results where enforcement is in place. Public awareness/education: campaigns focusing on seat belt use, child restraint use, educational training in health centers and schools/universities, and public awareness through media via the distribution of videos, posters/souvenirs, and pamphlets are effective in the short run. Speed control: traffic-calming bumps (speed bumps) and rumble strips are effective in reducing accidents and fatalities. Mandatory motorcycle helmets: helmet use is associated with a reduction in mortality. Graduated driver licensing (GDL): reduces road traffic injury by 19%. Street lighting: a low-cost intervention which may reduce road traffic accidents.

Keywords: evidence synthesis, injury, rapid review, reducing, road traffic accident

Procedia PDF Downloads 138
102 Mapping of Urban Micro-Climate in Lyon (France) by Integrating Complementary Predictors at Different Scales into Multiple Linear Regression Models

Authors: Lucille Alonso, Florent Renard

Abstract:

The characterization of urban heat islands (UHI) and their interactions with climate change and urban climates is a major research and public health issue, due to the increasing urbanization of the population. Addressing it requires a better knowledge of the UHI and the micro-climate in urban areas, by combining measurements and modelling. This study contributes to this topic by evaluating microclimatic conditions in dense urban areas in the Lyon Metropolitan Area (France) using a combination of data traditionally used, such as topography, but also LiDAR (Light Detection And Ranging) data, Landsat 8 and Sentinel satellite observations, and ground measurements by bike. These bicycle-based weather data collections are used to build the database of the variable to be modelled, the air temperature, over Lyon’s hyper-center. This study aims to model the air temperature, measured during 6 mobile campaigns in Lyon in clear weather, using multiple linear regressions based on 33 explanatory variables. They belong to various categories such as meteorological parameters from remote sensing, topographic variables, vegetation indices, the presence of water, humidity, bare soil, buildings, radiation, urban morphology, or proximity and density to various land uses (water surfaces, vegetation, bare soil, etc.). The acquisition sources are multiple and include the Landsat 8 and Sentinel satellites, LiDAR points, and cartographic products downloaded from an open data platform in Greater Lyon. Regarding the presence of low, medium, and high vegetation, buildings, and bare ground, several buffers around the measurement points were tested (5, 10, 20, 25, 50, 100, 200 and 500 m). The buffers with the best linear correlations with air temperature are 5 m for ground as well as for low and medium vegetation, 50 m for buildings, and 100 m for high vegetation. The explanatory model of the dependent variable is obtained by multiple linear regression of the remaining explanatory variables (Pearson correlation |r| < 0.7 and VIF < 5) using a stepwise selection algorithm. Moreover, holdout cross-validation (80% training, 20% testing) is performed, due to its ability to detect over-fitting of multiple regression, even though multiple regression provides internal validation and randomization. Multiple linear regression explained, on average, 72% of the variance for the study days, with an average RMSE of only 0.20°C. Surface temperature is the most important variable in the model for estimating air temperature. Other recurrent variables include distance to subway stations, distance to water areas, NDVI, the digital elevation model, the sky view factor, average vegetation density, and building density. Changing urban morphology influences the city’s thermal patterns. The thermal atmosphere in dense urban areas can only be analysed at the microscale, so as to consider the local impact of trees, streets, and buildings. There is currently no network of fixed weather stations sufficiently deployed in central Lyon or in most major urban areas. Therefore, it is necessary to use mobile measurements, followed by modelling, to characterize the city’s multiple thermal environments.
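The modelling workflow described above (correlation filtering at |r| < 0.7, VIF screening below 5, stepwise variable selection, and an 80/20 holdout evaluated with RMSE) can be sketched as follows. This is an assumed data layout, not the authors' code; the input file name and the "air_temp" column are hypothetical placeholders.

```python
# Minimal sketch (assumed data layout, not the study's actual pipeline) of the
# regression workflow described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.model_selection import train_test_split
from statsmodels.stats.outliers_influence import variance_inflation_factor

def filter_collinear(X: pd.DataFrame, r_max=0.7, vif_max=5.0) -> pd.DataFrame:
    # Drop one predictor of every pair whose Pearson |r| exceeds the threshold
    corr = X.corr().abs()
    drop = set()
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            if corr.loc[a, b] >= r_max and a not in drop:
                drop.add(b)
    X = X.drop(columns=sorted(drop))
    # Then iteratively drop the predictor with the highest VIF until all VIFs < 5
    while X.shape[1] > 1:
        vifs = pd.Series([variance_inflation_factor(X.values, i)
                          for i in range(X.shape[1])], index=X.columns)
        if vifs.max() < vif_max:
            break
        X = X.drop(columns=[vifs.idxmax()])
    return X

def forward_stepwise_aic(X: pd.DataFrame, y: pd.Series) -> list:
    # Greedy forward selection: add the variable that most improves the AIC
    selected, remaining = [], list(X.columns)
    best_aic = sm.OLS(y, np.ones(len(y))).fit().aic        # intercept-only baseline
    while remaining:
        aics = {v: sm.OLS(y, sm.add_constant(X[selected + [v]])).fit().aic
                for v in remaining}
        var, aic = min(aics.items(), key=lambda kv: kv[1])
        if aic >= best_aic:
            break
        selected.append(var); remaining.remove(var); best_aic = aic
    return selected

# Hypothetical usage with an 80/20 holdout:
# df = pd.read_csv("lyon_mobile_campaigns.csv")
# y, X = df["air_temp"], filter_collinear(df.drop(columns=["air_temp"]))
# X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
# vars_ = forward_stepwise_aic(X_tr, y_tr)
# model = sm.OLS(y_tr, sm.add_constant(X_tr[vars_])).fit()
# pred = model.predict(sm.add_constant(X_te[vars_]))
# print("R2:", model.rsquared, "RMSE:", np.sqrt(np.mean((y_te - pred) ** 2)))
```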

Keywords: air temperature, LIDAR, multiple linear regression, surface temperature, urban heat island

Procedia PDF Downloads 110
101 Assessing Image Quality in Mobile Radiography: A Phantom-Based Evaluation of a New Lightweight Mobile X-Ray Equipment

Authors: May Bazzi, Shafik Tokmaj, Younes Saberi, Mats Geijer, Tony Jurkiewicz, Patrik Sund, Anna Bjällmark

Abstract:

Mobile radiography, employing portable X-ray equipment, has become a routine procedure within hospital settings, with chest X-rays in intensive care units standing out as the most prevalent mobile X-ray examinations. This approach is not limited to hospitals alone, as it extends its benefits to imaging patients in various settings, particularly those too frail to be transported, such as elderly care residents in nursing homes. Moreover, the utility of mobile X-ray is not confined solely to traditional healthcare recipients; it has proven to be a valuable resource for vulnerable populations, including the homeless, drug users, asylum seekers, and patients with multiple co-morbidities. Mobile X-rays reduce patient stress, minimize costly hospitalizations, and offer cost-effective imaging. While studies confirm its reliability, further research is needed, especially regarding image quality. Recent advancements in lightweight equipment with enhanced battery and detector technology provide the potential for nearly handheld radiography. The main aim of this study was to evaluate a new lightweight mobile X-ray system with two different detectors and compare the image quality with a modern stationary system. Methods: A total of 74 images of the chest (chest anterior-posterior (AP) views and chest lateral views) and pelvic/hip region (AP pelvis views, hip AP views, and hip cross-table lateral views) were acquired on a whole-body phantom (Kyotokagaku, Japan), utilizing varying image parameters. These images were obtained using a stationary system - 18 images (Mediel, Sweden), a mobile X-ray system with a second-generation detector - 28 images (FDR D-EVO II; Fujifilm, Japan) and a mobile X-ray system with a third-generation detector - 28 images (FDR D-EVO III; Fujifilm, Japan). Image quality was assessed by visual grading analysis (VGA), which is a method to measure image quality by assessing the visibility and accurate reproduction of anatomical structures within the images. A total of 33 image criteria were used in the analysis. A panel of two experienced radiologists, two experienced radiographers, and two final-term radiographer students evaluated the image quality on a 5-grade ordinal scale using the software Viewdex 3.0 (Viewer for Digital Evaluation of X-ray images, Sweden). Data were analyzed using visual grading characteristics analysis. The dose was measured by the dose-area product (DAP) reported by the respective systems. Results: The mobile X-ray equipment (both detectors) showed significantly better image quality than the stationary equipment for the pelvis, hip AP and hip cross-table lateral images, with AUC_VGA values ranging from 0.64 to 0.92, while chest images showed mixed results. The number of images rated as having sufficient quality for diagnostic use was significantly higher for mobile X-ray generations 2 and 3 compared with the stationary X-ray system. The DAP values were higher for the stationary system compared to the mobile system. Conclusions: The new lightweight radiographic equipment had an image quality at least as good as that of a fixed system, at a lower radiation dose. Future studies should focus on clinical images and consider radiographers' viewpoints for a comprehensive assessment.
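For orientation, one common nonparametric way an area-under-the-curve value can be obtained from ordinal visual-grading scores is via the Mann–Whitney U statistic, as sketched below. This is a generic illustration of visual grading characteristics analysis, not necessarily the exact software or procedure used in the study, and the example ratings are hypothetical.

```python
# Illustrative sketch: estimating a visual grading characteristics AUC from ordinal
# 1-5 ratings of two systems (generic Mann-Whitney formulation, hypothetical data).
from scipy.stats import mannwhitneyu

def auc_vgc(scores_test, scores_reference):
    """AUC > 0.5 means the test system was, on average, rated higher than the reference."""
    u, _ = mannwhitneyu(scores_test, scores_reference, alternative="two-sided")
    return u / (len(scores_test) * len(scores_reference))

# Hypothetical ratings (1 = very poor ... 5 = excellent) pooled over observers:
mobile_gen3 = [4, 5, 4, 4, 3, 5, 4, 4]
stationary  = [3, 4, 3, 4, 3, 3, 4, 3]
print(f"AUC = {auc_vgc(mobile_gen3, stationary):.2f}")
```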

Keywords: mobile x-ray, visual grading analysis, radiographer, radiation dose

Procedia PDF Downloads 42
100 Sustainability in the Purchase of Airline Tickets: Analysis of Digital Communication from the Perspective of Neuroscience

Authors: Rodríguez Sánchez Carla, Sancho-Esper Franco, Guillen-Davo Marina

Abstract:

Tourism is one of the most important sectors worldwide, since it is a major economic engine for today's society. It is also one of the sectors that most negatively affect the environment in terms of CO₂ emissions, due to this expansion. In light of this, airlines are developing Voluntary Carbon Offset (VCO) programs. There is substantial evidence focused on analyzing the features of these VCO programs and their efficacy in reducing CO₂ emissions, and findings are mixed, without a clear consensus. Different research approaches have centered on analyzing the factors and consequences of VCO programs, such as economic modelling based on panel data, survey research based on traveler responses, or experimental research analyzing customer decisions in a simulated context. This study belongs to the latter group, because it tries to understand how different characteristics of an online ticket purchase website affect the willingness of a traveler to choose a more sustainable flight. The proposed behavioral model is based on several theories, such as nudge theory, the dual-processing Elaboration Likelihood Model (ELM), and cognitive dissonance theory. This randomized experiment aims at overcoming previous studies based on self-reported measures that mainly study sustainable behavioral intention rather than actual decision-making. It also complements traditional self-reported independent variables by gathering objective information from an eye-tracking device. The experiment analyzes the influence of two characteristics of the online purchase website: i) the type of information regarding flight CO₂ emissions (quantitative vs. qualitative) and ii) the comparison framework related to the sustainable purchase decision (negative: an alternative with more emissions than the average flight of the route vs. positive: an alternative with fewer emissions than the average flight of the route); it is therefore a 2x2 experiment with four alternative scenarios. A pretest was run before the actual experiment to refine the experiment features and to check the manipulations. Afterwards, a different sample of students answered the pre-test questionnaire aimed at recruiting the cases and measuring several pre-stimulus variables. One week later, the students came to the university's neurolab to take part in the experiment, made their online purchase decisions, and answered the post-test survey. A final sample of 21 students was gathered. The institution's ethics committee approved the experiment. The results show that qualitative information generates more sustainable decisions (the less polluting alternative) than quantitative information. Moreover, the evidence shows that subjects are more willing to choose the sustainable option in order to be more ecological (comparison of the average with the less polluting alternative) rather than in order to be less polluting (comparison of the average with the more polluting alternative). There are also interesting differences in the information-processing variables from the eye tracker. Both the total time to make the choice and the specific times by area of interest (AOI) differ depending on the assigned scenario. These results allow for a better understanding of the factors that condition a traveler's decision to be part of a VCO program and provide useful information for airline managers to promote these programs and reduce environmental impact.
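As a rough illustration of how a 2x2 between-subjects design of this kind (information format x comparison frame) might be analysed, a minimal sketch is given below. The column names and the tiny dataset are hypothetical placeholders; the study's actual analysis may well differ (for the binary choice outcome, a logistic model would typically be used instead).

```python
# Sketch of a two-way ANOVA on an eye-tracker-derived outcome for a 2x2 design.
# Data and variable names are hypothetical, not the study's results.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "info_format":   ["quantitative", "qualitative"] * 6,          # factor 1
    "frame":         ["negative"] * 6 + ["positive"] * 6,          # factor 2
    "decision_time": [42, 35, 51, 33, 38, 30, 47, 36, 40, 31, 34, 29],  # seconds
})

model = smf.ols("decision_time ~ C(info_format) * C(frame)", data=df).fit()
print(anova_lm(model, typ=2))   # main effects and interaction
```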

Keywords: voluntary carbon offset, airline, online purchase, carbon emission, sustainability, randomized experiment

Procedia PDF Downloads 44
99 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine

Authors: Adriana Haulica

Abstract:

Powered by Machine Learning, Precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms are heuristic, their outputs have only contextual validity. This is not very restrictive in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatment, by processing as much of the available information as possible. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from these classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept for information processing in Precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine, in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known “needle in a haystack” approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even though the input is drawn from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of input data up to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the “common denominator” rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical “proofs”. The major impact of this architecture is expressed by the high accuracy of the diagnosis. The diagnosis is delivered as a multiple-conditions diagnostic, constituted by main diseases along with unhealthy biological states, a format highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to the better design of clinical trials and speed them up.

Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics

Procedia PDF Downloads 49
98 Dysphagia Tele Assessment Challenges Faced by Speech and Swallow Pathologists in India: Questionnaire Study

Authors: B. S. Premalatha, Mereen Rose Babu, Vaishali Prabhu

Abstract:

Background: Dysphagia must be assessed, either subjectively or objectively, in order to properly address the swallowing difficulty. Providing therapeutic care to patients with dysphagia via tele mode was one approach to providing clinical services during the COVID-19 pandemic. As a result, the teleassessment of dysphagia has increased in India. Aim: This study aimed to identify the challenges faced by Indian SLPs while providing teleassessment to individuals with dysphagia during the outbreak of COVID-19 from 2020 to 2021. Method: The current study was carried out after receiving approval from the institute's institutional review board and ethics committee. The study was cross-sectional in nature and lasted from 2020 to 2021. It enrolled participants who met the inclusion and exclusion criteria. Based on the sample size calculations, it was decided to recruit roughly 246 people. The research was done in three stages: questionnaire development, content validation, and questionnaire administration. Five speech and hearing professionals content-validated the questionnaire for faults and clarity. The questionnaire was written in Microsoft Word, converted to Google Forms, and distributed to participants via platforms such as e-mail and WhatsApp. SPSS software was used to examine the data. Results: The study's findings were examined in light of the obstacles that Indian SLPs encounter. Only 135 people responded. During the COVID-19 lockdowns, 38% of participants said they did not deal with dysphagia patients. After the lockdown, 70.4% of SLPs kept working with dysphagia patients, while 29.6% did not. The main problems in completing the tele-evaluation of dysphagia were highlighted from the oromotor examination onwards. Around 37.5% of SLPs said they do not undertake the OPME online because of difficulties doing the evaluation, such as the need for repeated instructions to patients and family members and trouble visualizing structures in various positions. The majority of SLPs reported that online assessments were inefficient and time-consuming. A larger percentage of SLPs stated that they would not recommend tele-evaluation of dysphagia to their colleagues. SLPs' use of dysphagia assessment has decreased as a result of the pandemic. When it came to the amount of food used, the majority proposed a small amount. Apart from positioning the patient for assessment and limited cooperation from the family, most SLPs found that Internet speed was a source of concern and a barrier. Hearing impairment and the presence of a tracheostomy in patients with dysphagia proved to be the most difficult conditions to treat online. For patients on NPO (nil per os), the majority of SLPs did not advise tele-evaluation. Food residue in the oral cavity was more visible in the anterior region. The majority of SLPs reported more anterior than posterior leakage. Even though the majority of SLPs could detect aspiration by coughing, many found it difficult to discern a gurgly voice quality after swallowing. Conclusion: The current study sheds light on the difficulties that Indian SLPs experience when assessing dysphagia via tele mode, indicating that tele-assessment of dysphagia has yet to gain importance in India.

Keywords: dysphagia, teleassessment, challenges, Indian SLP

Procedia PDF Downloads 108
97 Guard@Lis: Birdwatching Augmented Reality Mobile Application

Authors: Jose A. C. Venancio, Alexandrino J. M. Goncalves, Anabela Marto, Nuno C. S. Rodrigues, Rita M. T. Ascenso

Abstract:

Nowadays, it is common to find people who want to get away from the everyday routine and who look for well-being and pleasant emotions. Trying to disconnect themselves from their usual places of work and residence, they pursue different places, such as tourist destinations, aiming to have unexpected experiences. In order to make this exploration process easier, cities and tourism agencies seek new opportunities and solutions, creating routes with diverse cultural landmarks, including natural landscapes and historic buildings. These offers frequently also aspire to the preservation of the local patrimony. Among nature and wildlife activities, birdwatching has been increasing, both in cities and in the countryside. This activity seeks to find, observe and identify the diversity of birds that live permanently or temporarily in these places, and it is usually supported by birdwatching guides. Leiria (Portugal) is a well-known city with several historical and natural landmarks, like the Lis river and the castle where King D. Dinis lived in the 13th century. Along the Lis River, a conservation process was carried out and a pedestrian route was created (Polis project). This is considered an excellent spot for birdwatching, especially for the gray heron (Ardea cinerea) and the kingfisher (Alcedo atthis). There is also a route through the city, from the riverside to the castle, which harbors a characteristic variety of species, such as the barn swallow (Hirundo rustica), known for passing through during different seasons of the year. Birdwatching is sometimes a difficult task, since it is not always possible to see all the bird species that inhabit a given place. For this reason, a need was identified for a technological solution to ease this activity. This project aims to encourage people to learn about the various species of birds that live along the Lis River and to promote the preservation of nature in a conscious way. This work is being conducted in collaboration with the Leiria Municipal Council and with the Environmental Interpretation Centre. It intends to show the majesty of the Lis River, a place visited daily by many people, such as children and families, who use it for didactic and recreational activities. We are developing a mobile multi-platform application (Guard@Lis) that allows bird species to be observed along a given route, using representative digital 3D models through the integration of augmented reality technologies. Guard@Lis displays a route with points of interest for birdwatching and a list of species for each point of interest, along with scientific information, images and sounds for every species. For some birds, to ensure their observation, the user can watch them in loco, in their real and natural environment, with their mobile device by means of augmented reality, giving the sensation of the presence of these birds even if they cannot be seen in that place at that moment. The augmented reality feature is being developed with the Vuforia SDK, using a hybrid approach to the recognition and tracking processes that combines markers and geolocation techniques. The application proposes routes and notifies users with alerts about the possibility of viewing augmented reality bird models. The final Guard@Lis prototype will be tested by volunteers in situ.
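To give a flavour of the geolocation half of the hybrid recognition/tracking approach mentioned above, the sketch below (in Python, for illustration only) shows how a proximity check could decide when the app should offer the AR overlay for a species. Point names, coordinates, radii and model file names are hypothetical; the actual Guard@Lis prototype is a mobile application built with the Vuforia SDK.

```python
# Illustrative sketch of geolocation-triggered AR content (not the Guard@Lis code).
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

POINTS_OF_INTEREST = [  # (name, lat, lon, trigger radius in metres, species 3D model) - hypothetical
    ("Lis riverside hide", 39.7460, -8.8070, 30.0, "ardea_cinerea.glb"),
    ("Castle viewpoint",   39.7470, -8.8040, 40.0, "alcedo_atthis.glb"),
]

def models_to_display(user_lat, user_lon):
    """Return the bird models whose trigger zone contains the user's current position."""
    return [model for name, lat, lon, radius, model in POINTS_OF_INTEREST
            if haversine_m(user_lat, user_lon, lat, lon) <= radius]

# print(models_to_display(39.7461, -8.8071))
```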

Keywords: augmented reality, birdwatching route, mobile application, nature tourism, watch birds using augmented reality

Procedia PDF Downloads 146
96 The Role of Virtual Reality in Mediating the Vulnerability of Distant Suffering: Distance, Agency, and the Hierarchies of Human Life

Authors: Z. Xu

Abstract:

Immersive virtual reality (VR) has gained momentum in humanitarian communication due to its utopian promises of co-presence, immediacy, and transcendence. These potential benefits have led the United Nations (UN) to tirelessly produce and distribute VR series to evoke global empathy and encourage policymakers, philanthropic business tycoons and citizens around the world to actually do something (i.e. give a donation). However, it is unclear whether or not VR can cultivate cosmopolitans with a sense of social responsibility towards the geographically, socially/culturally and morally mediated misfortune of faraway others. Drawing upon existing works on the mediation of distant suffering, this article constructs an analytical framework to articulate the issue. Applying this framework on a case study of five of the UN’s VR pieces, the article identifies three paradoxes that exist between cyber-utopian and cyber-dystopian narratives. In the “paradox of distance”, VR relies on the notions of “presence” and “storyliving” to implicitly link audiences spatially and temporally to distant suffering, creating global connectivity and reducing perceived distances between audiences and others; yet it also enables audiences to fully occupy the point of view of distant sufferers (creating too close/absolute proximity), which may cause them to feel naive self-righteousness or narcissism with their pleasures and desire, thereby destroying the “proper distance”. In the “paradox of agency”, VR simulates a superficially “real” encounter for visual intimacy, thereby establishing an “audiences–beneficiary” relationship in humanitarian communication; yet in this case the mediated hyperreality is not an authentic reality, and its simulation does not fill the gap between reality and the virtual world. In the “paradox of the hierarchies of human life”, VR enables an audience to experience virtually fundamental “freedom”, epitomizing an attitude of cultural relativism that informs a great deal of contemporary multiculturalism, providing vast possibilities for a more egalitarian representation of distant sufferers; yet it also takes the spectator’s personally empathic feelings as the focus of intervention, rather than structural inequality and political exclusion (an economic and political power relations of viewing). Thus, the audience can potentially remain trapped within the minefield of hegemonic humanitarianism. This study is significant in two respects. First, it advances the turn of digitalization in studies of media and morality in the polymedia milieu; it is motivated by the necessary call for a move beyond traditional technological environments to arrive at a more novel understanding of the asymmetry of power between the safety of spectators and the vulnerability of mediated sufferers. Second, it not only reminds humanitarian journalists and NGOs that they should not rely entirely on the richer news experience or powerful response-ability enabled by VR to gain a “moral bond” with distant sufferers, but also argues that when fully-fledged VR technology is developed, it can serve as a kind of alchemy and should not be underestimated merely as a “bugaboo” of an alarmist philosophical and fictional dystopia.

Keywords: audience, cosmopolitan, distant suffering, virtual reality, humanitarian communication

Procedia PDF Downloads 115
95 Impact of Primary Care Telemedicine Consultations On Health Care Resource Utilisation: A Systematic Review

Authors: Anastasia Constantinou, Stephen Morris

Abstract:

Background: The adoption of synchronous and asynchronous telemedicine modalities for primary care consultations has increased exponentially since the COVID-19 pandemic. However, there is limited understanding of how virtual consultations influence healthcare resource utilization and other quality measures, including safety, timeliness, efficiency, patient and provider satisfaction, cost-effectiveness and environmental impact. Aim: To quantify the rates of follow-up visits, emergency department visits, hospitalizations, and requests for investigations and prescriptions, and to comment on the effect on different quality measures associated with different telemedicine modalities used for primary care services and primary care referrals to secondary care. Design and setting: Systematic review in primary care. Methods: A systematic search was carried out across three databases (Medline, PubMed and Scopus) between August and November 2023, using terms related to telemedicine, general practice, electronic referrals, follow-up, use and efficiency, supported by citation searching. This was followed by screening according to pre-defined criteria, data extraction and critical appraisal. Narrative synthesis and meta-analysis of quantitative data were used to summarize findings. Results: The search identified 2230 studies; 50 studies are included in this review. Asynchronous modalities predominated in both primary care services (68%) and referrals from primary care to secondary care (83%), and most of the study participants were female (63.3%), with a mean age of 48.2 years. The average follow-up rate for virtual consultations in primary care was 28.4% (eVisits: 36.8%, secure messages: 18.7%, videoconference: 23.5%), with no significant difference between them or compared with face-to-face consultations. There was an average annual reduction in primary care visits of 0.09/patient, an increase in telephone visits of 0.20/patient, an increase in ED encounters of 0.011/patient, an increase in hospitalizations of 0.02/patient, and an increase in out-of-hours visits of 0.019/patient. Laboratory testing was requested on average for 10.9% of telemedicine patients, imaging or procedures for 5.6%, and prescriptions for 58.7% of patients. When looking at referrals to secondary care, on average 36.7% of virtual referrals required a follow-up visit, with the average rate of follow-up for electronic referrals being higher than for videoconferencing (39.2% vs 23%, p=0.167). Technical failures were reported on average for 1.4% of virtual consultations in primary care. When using carbon footprint estimates, we calculate that the use of telemedicine in primary care services can potentially provide a net decrease in carbon footprint of 0.592 kgCO2/patient/year. When follow-up rates are taken into account, we estimate that virtual consultations reduce the carbon footprint of primary care services by a factor of 2.3, and of secondary care referrals by a factor of 2.2. No major concerns regarding quality of care or patient satisfaction were identified. 5/7 studies that addressed cost-effectiveness reported increased savings. Conclusions: Telemedicine provides quality, cost-effective, and environmentally sustainable care for patients in primary care, with inconclusive evidence regarding the rates of subsequent healthcare utilization. The evidence is limited by heterogeneous, small-scale studies and a lack of prospective comparative studies.
Further research to identify the most appropriate telemedicine modality for different patient populations, clinical presentations, service provision (e.g. used to follow-up patients instead of initial diagnosis) as well as further education for patients and providers alike on how to make best use of this service is expected to improve outcomes and influence practice.
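The carbon-accounting logic summarized above (avoided travel emissions net of the telemedicine footprint, discounted by the share of virtual consultations that still lead to a face-to-face follow-up) can be illustrated with the back-of-the-envelope sketch below. All parameter values in it are hypothetical placeholders, not the figures used in the review.

```python
# Back-of-the-envelope sketch of the kind of per-patient carbon saving described above.
# Parameter values are hypothetical, NOT the review's inputs or results.
def net_saving_kgco2(trip_emission_kg: float,
                     telemed_emission_kg: float,
                     follow_up_rate: float,
                     consultations_per_year: float = 1.0) -> float:
    """Net kg CO2 saved per patient per year: avoided trips minus telemedicine footprint."""
    avoided_trips = consultations_per_year * (1.0 - follow_up_rate)
    return avoided_trips * trip_emission_kg - consultations_per_year * telemed_emission_kg

# Hypothetical illustration: a ~5 km round trip by car (~1.0 kg CO2), ~0.05 kg CO2 per
# video consultation for devices and data transfer, and a 28.4% follow-up rate.
print(net_saving_kgco2(trip_emission_kg=1.0,
                       telemed_emission_kg=0.05,
                       follow_up_rate=0.284))
```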

Keywords: telemedicine, healthcare utilisation, digital interventions, environmental impact, sustainable healthcare

Procedia PDF Downloads 36
94 Michel Foucault’s Docile Bodies and The Matrix Trilogy: A Close Reading Applied to the Human Pods and Growing Fields in the Films

Authors: Julian Iliev

Abstract:

The recent release of The Matrix Resurrections persuaded many film scholars that The Matrix trilogy had lost its appeal and its concepts were largely outdated. This study examines the human pods and growing fields in the trilogy. Their functionality is compared to Michel Foucault’s concept of docile bodies: linking fictional and contemporary worlds. This paradigm is scrutinized through surveillance literature. The analogy brings to light common elements of hidden surveillance practices in technologies. The comparison illustrates the effects of body manipulation portrayed in the movies and their relevance with contemporary surveillance practices. Many scholars have utilized a close reading methodology in film studies (J.Bizzocchi, J.Tanenbaum, P.Larsen, S. Herbrechter, and Deacon et al.). The use of a particular lens through which media text is examined is an indispensable factor that needs to be incorporated into the methodology. The study spotlights both scenes from the trilogy depicting the human pods and growing fields. The functionality of the pods and the fields compare directly with Foucault’s concept of docile bodies. By utilizing Foucault’s study as a lens, the research will unearth hidden components and insights into the films. Foucault recognizes three disciplines that produce docile bodies: 1) manipulation and the interchangeability of individual bodies, 2) elimination of unnecessary movements and management of time, and 3) command system guaranteeing constant supervision and continuity protection. These disciplines can be found in the pods and growing fields. Each body occupies a single pod aiding easier manipulation and fast interchangeability. The movement of the bodies in the pods is reduced to the absolute minimum. Thus, the body is transformed into the ultimate object of control – minimum movement correlates to maximum energy generation. Supervision is exercised by wiring the body with numerous types of cables. This ultimate supervision of body activity reduces the body’s purpose to mere functioning. If a body does not function as an energy source, then it’s unplugged, ejected, and liquefied. The command system secures the constant supervision and continuity of the process. To Foucault, the disciplines are distinctly different from slavery because they stop short of a total takeover of the bodies. This is a clear difference from the slave system implemented in the films. Even though their system might lack sophistication, it makes up for it in the elevation of functionality. Further, surveillance literature illustrates the connection between the generation of body energy in The Matrix trilogy to the generation of individual data in contemporary society. This study found that the three disciplines producing docile bodies were present in the portrayal of the pods and fields in The Matrix trilogy. The above comparison combined with surveillance literature yields insights into analogous processes and contemporary surveillance practices. Thus, the constant generation of energy in The Matrix trilogy can be equated to the consistent data generation in contemporary society. This essay shows the relevance of the body manipulation concept in the Matrix films with contemporary surveillance practices.

Keywords: docile bodies, film trilogies, matrix movies, michel foucault, privacy loss, surveillance

Procedia PDF Downloads 65