Search results for: group work
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20546

4886 Artificial Neural Network Approach for Modeling Very Short-Term Wind Speed Prediction

Authors: Joselito Medina-Marin, Maria G. Serna-Diaz, Juan C. Seck-Tuoh-Mora, Norberto Hernandez-Romero, Irving Barragán-Vite

Abstract:

Wind speed forecasting is an important issue for planning wind power generation facilities. Accurate wind speed prediction allows good performance of wind turbines for electricity generation. A model based on artificial neural networks is presented in this work. A dataset with atmospheric information about air temperature, atmospheric pressure, wind direction, and wind speed in Pachuca, Hidalgo, México, was used to train the artificial neural network. The data were downloaded from the web page of the National Meteorological Service of the Mexican government. The records were gathered over three months, at ten-minute intervals. An iterative algorithm was developed on this dataset to create 1,110 ANNs with different configurations, ranging from one to three hidden layers and from 1 to 10 neurons per hidden layer. Each ANN was trained with the Levenberg-Marquardt backpropagation algorithm, which is used to learn the relationship between input and output values. The model with the best performance contains three hidden layers with 9, 6, and 5 neurons, respectively; the coefficient of determination obtained was r²=0.9414, and the root mean squared error is 1.0559. In summary, the ANN approach is suitable for predicting the wind speed in Pachuca City because the r² value denotes a good fit to the gathered records, and the obtained ANN model can be used in the planning of wind power generation grids.
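A minimal Python sketch of the configuration sweep described above is given below; note that scikit-learn's MLPRegressor does not offer Levenberg-Marquardt training, so its default solver is used as a stand-in, and the file name and column names are assumptions.

```python
# Enumerate 1-3 hidden layers with 1-10 neurons each: 10 + 100 + 1000 = 1,110 models.
import itertools
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

data = pd.read_csv("pachuca_10min.csv")          # hypothetical 10-minute records
X = data[["air_temperature", "pressure", "wind_direction"]].values
y = data["wind_speed"].values
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

results = []
for n_layers in (1, 2, 3):
    for sizes in itertools.product(range(1, 11), repeat=n_layers):
        ann = MLPRegressor(hidden_layer_sizes=sizes, max_iter=2000,
                           random_state=0).fit(X_tr, y_tr)
        pred = ann.predict(X_te)
        results.append((sizes, r2_score(y_te, pred),
                        np.sqrt(mean_squared_error(y_te, pred))))

best = max(results, key=lambda r: r[1])          # highest r² on the test split
print("best configuration:", best)
```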

Keywords: wind power generation, artificial neural networks, wind speed, coefficient of determination

Procedia PDF Downloads 120
4885 Leadership in the Emergence Paradigm: A Literature Review on the Medusa Principles

Authors: Everard van Kemenade

Abstract:

Many quality improvement activities are planned. Leaders are strongly involved in missions, visions and strategic planning. They use, consciously or unconsciously, the PDCA-cycle, also known as the Deming cycle. After the planning, the plans are carried out and the results or effects are measured. If the results show that the goals in the plan have not been achieved, adjustments are made in the next plan or in the execution of the processes. Then, the cycle is run through again. Traditionally, the PDCA-cycle is advocated as a means to an end. However, PDCA is especially fit for planned, ordered, certain contexts. It fits with the empirical and referential quality paradigm. For uncertain, unordered, unplanned processes, something else might be needed instead of Plan-Do-Check-Act. Due to the complexity of our society, the influence of context, and the uncertainty in our world today, not every activity can be planned anymore. At the same time, organisations need to be more innovative than ever. That provides leaders with ‘wicked tendencies’ and raises the question of how one can innovate without being able to plan. Complexity science studies the interactions of a diverse group of agents that bring about change in times of uncertainty, e.g. when radical innovation is co-created. This process is called emergence. This research study explores the role of leadership in the emergence paradigm. The aim of the article is to study the way that leadership can support the emergence of innovation in a complex context. First, clarity is given on the concepts used in the research question: complexity, emergence, innovation and leadership. Thereafter, a literature search is conducted to answer the research question. The topics ‘emergent leadership’ and ‘complexity leadership’ were chosen for an exploratory search in Google and Google Scholar using the berry-picking method. The exclusion criterion was emergence in disciplines other than organizational development or in the meaning of ‘arising’. The literature search gave 45 hits. Twenty-seven articles were excluded after reading the title and abstract because they did not research the topic of emergent leadership and complexity. After reading the remaining articles in full, one more was excluded because it used emergent in the limited meaning of ‘arising’ and eight more were excluded because the topic did not match the research question of this article. That brings the total of the search to 17 articles. The useful conclusions from the articles were merged and grouped together under overarching topics, using thematic analysis. The findings are that five topics prevail when looking at possibilities for leadership to facilitate innovation: enabling, sharing values, dreaming, interacting, and context sensitivity and adaptivity. Together, in Dutch, they form the acronym Medusa.

Keywords: complexity science, emergence, leadership in the emergence paradigm, innovation, the Medusa principles

Procedia PDF Downloads 24
4884 Fake News and Conspiracy Narratives in the Covid-19 Crisis: An International Comparison

Authors: Caja Thimm

Abstract:

Already well before the Corona pandemic hit the world, ‘fake news’ was no longer regarded as a harmless twisting of the truth but as intentionally composed disinformation, often with the goal of manipulative populist propaganda. During the Corona crisis, conspiracy narratives in particular have become a worldwide phenomenon with dangerous consequences (anti-vaccination myths). The success of this manipulated news needs to be counteracted by trustworthy news, which in Europe particularly includes public broadcasting media and their social media channels. To understand better how the main public broadcasters in Germany, the UK, and France used Instagram strategically, a comparative study was carried out. In this empirical study, we compared the activities of selected formats during the Corona crisis in order to see how the public broadcasters reached their audiences and how this might, in the longer run, affect journalistic strategies on social media platforms. A first analysis showed that the increase in the use of social media overall was striking. Almost one in two adult online users (48%) obtained information about the virus in social media, and in total, 38% of the younger age group (18-24) looked for Covid-19 information on Instagram, so the platform can be regarded as one of the central digital spaces for Corona-related information searches. Quantitative measures showed that 47% of recent posts by the broadcasters were related to Corona, and 7% treated conspiracy myths. For the more detailed content analysis, the following categories were applied: digital storytelling and Instagram stories, textuality and semantic keys, links to information, stickers, video chat, fact checking, news ticker, service, and infographics and animated tables. In addition to these basic features, we particularly looked for new formats created during the crisis. Journalistic use of social media platforms opens up immediate and creative ways of applying the media logics of the respective platforms, and particularly the BBC and ARD formats proved to be interactive, responsive, and entertaining. Among them were new formats such as a space for user questions and personal uploads, interviews, music, comedy, etc. The fact-checking channel in particular received a lot of attention, as many user questions were focused on the conspiracy theories which dominated public discourse during many weeks of 2020. In the presentation, we will introduce eight particular strategies that show how public broadcasting journalism can adopt digital platforms, use them creatively and, hence, help to counteract conspiracy narratives and fake news.

Keywords: fake news, social media, digital journalism, digital methods

Procedia PDF Downloads 154
4883 Envisioning The Future of Language Learning: Virtual Reality, Mobile Learning and Computer-Assisted Language Learning

Authors: Jasmin Cowin, Amany Alkhayat

Abstract:

This paper will concentrate on a comparative analysis of both the advantages and limitations of using digital learning resources (DLRs). The DLRs covered are Virtual Reality (VR), Mobile Learning (M-learning) and Computer-Assisted Language Learning (CALL), together with its subset, Mobile-Assisted Language Learning (MALL), in language education. In addition, best practices for language teaching and the application of established language teaching methodologies such as Communicative Language Teaching (CLT), the audio-lingual method, or community language learning will be explored. Education has changed dramatically since the eruption of the pandemic. Traditional face-to-face education was disrupted on a global scale. The rise of distance learning brought new digital tools to the forefront, especially web conferencing tools, digital storytelling apps, test authoring tools, and VR platforms. Language educators raced to vet, learn, and implement multiple technology resources suited for language acquisition. Yet, questions remain on how to harness new technologies, digital tools, and their ubiquitous availability while using established methods and methodologies in language learning paired with best teaching practices. In M-learning, language learners employ portable computing devices such as smartphones or tablets. CALL is a language teaching approach that uses computers and other technologies to present, reinforce, and assess language materials to be learned, or to create environments where teachers and learners can meaningfully interact. In VR, a computer-generated simulation enables learner interaction with a 3D environment via screen, smartphone, or a head-mounted display. Research supports that VR for language learning is effective in terms of exploration, communication, engagement, and motivation. Students are able to relate through role-play activities and to interact with 3D objects and activities such as field trips. VR lends itself to group language exercises in the classroom with target language practice in an immersive, virtual environment. Students, teachers, schools, language institutes, and institutions benefit from specialized support to help them acquire second language proficiency and content knowledge that builds on their cultural and linguistic assets. Through the purposeful application of different language methodologies and teaching approaches, language learners can not only make cultural and linguistic connections in DLRs but also practice grammar drills, play memory games or flourish in authentic settings.

Keywords: language teaching methodologies, computer-assisted language learning, mobile learning, virtual reality

Procedia PDF Downloads 231
4882 Use of Magnetically Separable Molecularly Imprinted Polymers for Determination of Pesticides in Food Samples

Authors: Sabir Khan, Sajjad Hussain, Ademar Wong, Maria Del Pilar Taboada Sotomayor

Abstract:

The present work aims to develop magnetic molecularly imprinted polymers (MMIPs) for the determination of a selected pesticide (ametryne) using high-performance liquid chromatography (HPLC). Computational simulation can assist the choice of the most suitable monomer for the synthesis of polymers. The MMIPs were polymerized at the surface of Fe3O4@SiO2 magnetic nanoparticles (MNPs) using 2-vinylpyridine as the functional monomer, ethylene glycol dimethacrylate (EGDMA) as the cross-linking agent, and 2,2′-azobisisobutyronitrile (AIBN) as the radical initiator. A magnetic non-imprinted polymer (MNIP) was also prepared under the same conditions without the analyte. The MMIPs were characterized by scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis and Fourier transform infrared spectroscopy (FTIR). Pseudo-first-order and pseudo-second-order models were applied to study the adsorption kinetics, and the adsorption process was found to follow the pseudo-first-order kinetic model. The adsorption equilibrium data were fitted to the Freundlich and Langmuir isotherms, and the sorption equilibrium process was well described by the Langmuir isotherm model. The selectivity coefficients (α) of the MMIPs for ametryne with respect to atrazine, ciprofloxacin and folic acid were 4.28, 12.32 and 14.53, respectively. Spiked recoveries ranging between 91.33% and 106.80% were obtained. The results showed high affinity and selectivity of the MMIPs for the pesticide ametryne in the food samples.
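For reference, the kinetic and isotherm models named above are usually written in the following standard textbook forms (these are not the authors' fitted equations):

\ln(q_e - q_t) = \ln q_e - k_1 t \qquad \text{(pseudo-first-order)}

\frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e} \qquad \text{(pseudo-second-order)}

q_e = \frac{q_m K_L C_e}{1 + K_L C_e} \qquad \text{(Langmuir)} \qquad q_e = K_F C_e^{1/n} \qquad \text{(Freundlich)}

where q_t and q_e are the amounts adsorbed at time t and at equilibrium, C_e is the equilibrium concentration, and k_1, k_2, q_m, K_L, K_F and n are fitted constants.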

Keywords: molecularly imprinted polymer, pesticides, magnetic nanoparticles, adsorption

Procedia PDF Downloads 465
4881 The Damage and Durability of a Sport Synthetic Resin Floor: A Case Study

Authors: C. Paglia, C. Mosca

Abstract:

Synthetic resin floors are often used in sport infrastructure. These organic materials are often in contact with a bituminous substrate, which in turn is placed on the ground. In this work, the damage to a resin basketball court surface was characterized by means of visual inspection, optical microscopy, resin thickness measurements, adhesion strength, water vapor transmission capacity, capillary water absorption, granulometry of the bituminous conglomerate, the surface properties, and the water infiltration speed of the ground. The infiltration speed indicates the water permeability of the ground, which was due to its composition: clean sand mixed with gravel. Relatively good adhesion was present between the synthetic resin and the bituminous layer, although the adhesion resistance of the bituminous layer itself was relatively low. Compared with the requirements for bituminous asphalt-concrete mixes AC 11 S, the placed material was more porous. Insufficient compaction was present. The void content values were above the standard limits, while the apparent densities were lower compared to the conventional AC 11 mixtures. The microstructure outlines the high permeability and porosity of the bituminous layer. The synthetic resin was vapour-proof and did not exhibit capillary absorption. It exhibited a lower thickness than required, and no multiple placing steps were observed. Multiple cavities were detected along the interface between the bituminous layer and the resin coating, with no intermediate layers. The layer for pore filling in the bituminous surface was not properly applied. The swelling bubbles on the synthetic pavement were caused by the humidity in the bituminous layer. Water or humidity was present prior to the application of the resin, and the effect was worsened by the upward movement of water from the ground.

Keywords: resin, floor, damage, durability

Procedia PDF Downloads 157
4880 Semi-Empirical Modeling of Heat Inactivation of Enterococci and Clostridia During Hygienisation in the Anaerobic Digestion Process

Authors: Jihane Saad, Thomas Lendormi, Caroline Le Marechal, Anne-Marie Pourcher, Céline Druilhe, Jean-Louis Lanoiselle

Abstract:

Agricultural anaerobic digestion consists of the conversion of animal slurry and manure into biogas and digestate. They need, however, to be treated at 70 ºC for 60 min before anaerobic digestion according to the European regulation (EC n°1069/2009 & EU n°142/2011). The impact of such heat treatment on the fate of bacteria has been poorly studied up to now. Moreover, a recent study¹ has shown that enterococci and clostridia are still detected despite the application of such thermal treatment, questioning the relevance of this approach for the hygienisation of digestate. The aim of this study is to establish the heat inactivation kinetics of two species of enterococci (Enterococcus faecalis and Enterococcus faecium) and two species of clostridia (Clostridioides difficile and Clostridium novyi as a non-toxic model for Clostridium botulinum of group III). A pure culture of each strain was prepared in a specific sterile medium at a concentration of 10⁴-10⁷ MPN/mL (most probable number), depending on the bacterial species. Bacterial suspensions were then filled into sterilized capillary tubes and placed in a water or oil bath at the desired temperature for a specific period of time. Each bacterial suspension was enumerated using an MPN approach, and tests were repeated three times for each temperature/time couple. The inactivation kinetics of the four indicator bacteria are described using the Weibull model and the classical Bigelow model of first-order kinetics. The Weibull model takes biological variation with respect to thermal inactivation into account and is basically a statistical model of the distribution of inactivation times; the classical first-order approach is a special case of the Weibull model. The heat treatment at 70 ºC / 60 min contributes to a reduction greater than 5 log10 for E. faecium and E. faecalis. However, it results only in a reduction of about 0.7 log10 for C. difficile and an increase of 0.5 log10 for C. novyi. Application of treatments at higher temperatures is required to reach a reduction greater than or equal to 3 log10 for C. novyi (such as 30 min / 100 ºC, 13 min / 105 ºC, 3 min / 110 ºC, and 1 min / 115 ºC), raising the question of the relevance of the application of heat treatment at 70 ºC / 60 min for these spore-forming bacteria. To conclude, the heat treatment (70 ºC / 60 min) defined by the European regulation is sufficient to inactivate non-sporulating bacteria. Higher temperatures (> 100 ºC) are required for spore-forming bacteria to reach a 3 log10 reduction (sporicidal activity).
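For reference, the two survival models named above are commonly written as follows (standard forms; the parameters fitted in the study are not reproduced here):

\log_{10}\frac{N(t)}{N_0} = -\frac{t}{D_T} \qquad \text{(Bigelow, first-order)}

\log_{10}\frac{N(t)}{N_0} = -\left(\frac{t}{\delta}\right)^{p} \qquad \text{(Weibull)}

where N(t) is the surviving population after heating time t, N_0 the initial population, D_T the decimal reduction time at temperature T, δ the time to the first decimal reduction, and p the shape parameter (p = 1 recovers the first-order model).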

Keywords: heat treatment, enterococci, clostridia, inactivation kinetics

Procedia PDF Downloads 106
4879 Pozzolanic Properties of Synthetic Zeolites as Materials Used for the Production of Building Materials

Authors: Joanna Styczen, Wojciech Franus

Abstract:

Currently, cement production reaches 3-6 Gt per year. The production of one ton of cement is associated with the emission of 0.5 to 1 ton of carbon dioxide into the atmosphere, which means that this process is responsible for 5% of global CO2 emissions. Simply improving the cement manufacturing process is not enough. An effective solution is the use of pozzolanic materials, which can partly replace clinker and thus reduce energy consumption and pollutant emissions, and give mortars the desired characteristics by shaping their microstructure. Pozzolanic additives modify the phase composition of cement, reducing the amount of portlandite and changing the CaO/SiO2 ratio in the C-S-H phase. Zeolites are a pozzolanic additive that is not commonly used. Three types of zeolites were synthesized in this work: Na-A, sodalite and ZSM-5 (these zeolites come from three different structural groups). The zeolites were obtained by hydrothermal synthesis of fly ash in an aqueous NaOH solution. Then, the pozzolanicity of the obtained materials was assessed. The pozzolanic activity of the synthesized zeolites was tested by chemical methods in accordance with the ASTM C 379-65 standard. The method consists of determining the percentage content of active components (soluble silicon and aluminium oxides) in alkaline solutions, i.e., those that are potentially reactive towards calcium hydroxide. The highest amount of active silica was found in zeolite ZSM-5 (88.15%), while its amount of active Al2O3 was small (1%). The smallest pozzolanic activity was found in the Na-A zeolite (active SiO2 of 4.4% and active Al2O3 of 2.52%). Tests carried out using XRD, SEM, XRF and textural analysis showed that the obtained zeolites are characterized by high porosity, which makes them a valuable addition to mortars.

Keywords: pozzolanic properties, hydration, zeolite, alite

Procedia PDF Downloads 73
4878 The World in the 21st Century and Beyond: Convergence or Invariance

Authors: Saleh Maina

Abstract:

There is an ongoing debate among intellectuals and scholars of international relations and world politics over the direction in which the world is heading, particularly in the current era of globalization. On the one hand are adherents of the convergence thesis, which is premised on the assumption that the global social order is tending toward universalism, which could translate into the possible end of the classical state system and the unification of world societies under a single and common ideological dispensation. The convergence thesis is hinged on the globalization process, which is gradually reducing world societies into a 'global village'. On the other hand are intellectuals who hold the view that, despite advances made in communication technology which appear to threaten the survival of the classical state system, invariance, as expressed in the survival of the existing state system and the diverse social traditions of world societies, remains a realistic possibility, contrary to the conclusions of the convergence thesis. The invariance thesis has been advanced by scholars like Samuel P. Huntington, whose work on the clash of civilizations suggests that world peace can only be sustained through the cohabitation of diverse civilizations across the world. The purpose of this paper is to examine both sides of the debate with the aim of making a realistic assessment of where world societies are headed, between convergence and invariance. Using the realist theory of international relations as our theoretical premise, the paper argues that while there is sufficient ground to predict the future direction of world societies as headed towards some form of convergence, invariance, as expressed in the co-existence of diverse civilizations, will for a long time remain a major feature of the international system.

Keywords: convergence, invariance, clash of civilization, classical state system, universalism

Procedia PDF Downloads 306
4877 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on the process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
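The DAQ Debugger itself is integrated into the experiment's software, but the core signal-handling idea can be illustrated with a minimal Python sketch (Unix-only; the report file name is an assumption):

```python
# On a fatal signal (SIGSEGV, SIGFPE, SIGABRT, ...) or an on-demand SIGUSR1,
# dump the stack traces of all threads to a report file instead of relying on
# an attached conventional debugger.
import faulthandler
import signal

report = open("daq_debug_report.txt", "a")
faulthandler.enable(file=report)                                       # fatal signals
faulthandler.register(signal.SIGUSR1, file=report, all_threads=True)   # on demand

def main():
    print("running; send SIGUSR1 to this process to dump all thread stacks")
    while True:
        signal.pause()    # stand-in for the DAQ event loop

if __name__ == "__main__":
    main()
```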

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 281
4876 Optical Variability of Faint Quasars

Authors: Kassa Endalamaw Rewnu

Abstract:

The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria and, thus, to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
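A sketch of the kind of two-epoch variability test described above, flagging objects whose magnitude change exceeds what the photometric errors alone would explain, is shown below (the arrays and the 3-sigma threshold are illustrative assumptions):

```python
import numpy as np

# magnitudes and photometric errors for a handful of objects at two epochs
m1, e1 = np.array([21.3, 21.9, 20.8]), np.array([0.05, 0.08, 0.04])
m2, e2 = np.array([21.6, 21.9, 21.3]), np.array([0.06, 0.07, 0.05])

# significance of the magnitude change relative to the combined error
significance = np.abs(m1 - m2) / np.sqrt(e1**2 + e2**2)
candidates = significance > 3.0          # flag objects varying at > 3 sigma
print(candidates)
```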

Keywords: nuclear activity, galaxies, active quasars, variability

Procedia PDF Downloads 77
4875 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics

Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane

Abstract:

Agriculture is fundamental and remains an important sector of the Algerian economy; based on traditional techniques and structures, it is generally oriented towards consumption. The collection of agricultural statistics in Algeria is done using traditional methods, which consist of investigating land use through surveys and field visits. These statistics suffer from problems such as poor data quality, the long delay between collection and final availability, and high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of extracting information. This methodology allowed us to combine remote sensing data and field data to collect statistics on the areas of different land-use classes. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, has been studied in the wilaya of Sidi Bel Abbes. It is in this context that we applied a method for extracting information from satellite images. This method, called non-negative matrix factorization (NMF), does not consider the pixel as a single entity but looks for the components within the pixel itself. The results obtained by the application of NMF were compared with field data and with the results obtained by the maximum likelihood method. We observed close agreement between the most important NMF results and the field data. We believe that this method of extracting information from satellite data leads to interesting results on different types of land use.
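A minimal sketch of this kind of spectral unmixing with non-negative matrix factorization is shown below: each pixel spectrum is approximated as a non-negative mixture of a few endmember spectra (the file name and the number of endmembers are assumptions):

```python
import numpy as np
from sklearn.decomposition import NMF

cube = np.load("hyperspectral_cube.npy")         # hypothetical non-negative (rows, cols, bands) array
rows, cols, bands = cube.shape
X = cube.reshape(rows * cols, bands)             # pixels x bands matrix

model = NMF(n_components=5, init="nndsvd", max_iter=500, random_state=0)
abundances = model.fit_transform(X)              # pixels x endmembers
endmembers = model.components_                   # endmembers x bands (spectra)

# dominant land-cover component per pixel, reshaped back to the image grid
class_map = abundances.argmax(axis=1).reshape(rows, cols)
```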

Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing

Procedia PDF Downloads 419
4874 Modeling of Strong Motion Generation Areas of the 2011 Tohoku, Japan Earthquake Using Modified Semi-Empirical Technique Incorporating Frequency Dependent Radiation Pattern Model

Authors: Sandeep, A. Joshi, Kamal, Piu Dhibar, Parveen Kumar

Abstract:

In the present work, strong ground motion has been simulated using a modified semi-empirical technique (MSET) with a frequency-dependent radiation pattern model. Joshi et al. (2014) modified the semi-empirical technique to incorporate the modeling of strong motion generation areas (SMGAs). A frequency-dependent radiation pattern model is applied to simulate high-frequency ground motion more precisely. Identified SMGAs (Kurahashi and Irikura 2012) of the 2011 Tohoku earthquake (Mw 9.0) were modeled using this modified technique. Records are simulated for both frequency-dependent and constant radiation pattern functions. Simulated records for both cases are compared with observed records in terms of peak ground acceleration and pseudo-acceleration response spectra at different stations. Comparison of simulated and observed records in terms of root mean square error suggests that the method is capable of simulating records that match over a wide frequency range for this earthquake and bear a realistic appearance in terms of shape and strong motion parameters. The results confirm the efficacy and suitability of the rupture model defined by five SMGAs for the developed modified technique.

Keywords: strong ground motion, semi-empirical, strong motion generation area, frequency dependent radiation pattern, 2011 Tohoku Earthquake

Procedia PDF Downloads 532
4873 The Construction of the Bridge between Mrs Dalloway and To the Lighthouse: The Combination of Codes and Metaphors in the Structuring of the Plot in the Work of Virginia Woolf

Authors: María Rosa Mucci

Abstract:

Tzvetan Todorov (1971) designs a model of narrative transformation where the plot is constituted by difference and resemblance. This binary opposition is a synthesis of a central figure within narrative discourse: metaphor. Narrative operates as a metaphor since it combines different actions through similarities within a common plot. However, it sounds paradoxical that metonymy, and not metaphor, should be the key figure within the narrative. It is metonymy that keeps the movement of actions within the story through syntagmatic relations. By the same token, this articulation of verbs makes it possible for the reader to engage in a dynamic interaction with the text, responding to the plot and mediating meanings with the contradictory external world. As Roland Barthes (1957) points out, there are two codes that are irreversible within the process: the codes of actions and the codes of enigmas. Virginia Woolf constructs her plots through a process of symbolism; a scene is always enduring, not only because it stands for something else but also because it connotes it. The reader is forced to elaborate the meaning at a mythological level beyond the lines. In this research, we follow a qualitative content analysis to code language through the proairetic (actions) and hermeneutic (enigmas) codes, in Barthes' terms. There are two novels in particular that engage the reader in this process of construction: Mrs Dalloway (1925) and To the Lighthouse (1927). The bridge from the first to the second brings memories of childhood, allowing for the discovery of the enigmas hidden between the lines. What survives? Who survives? It is the reader's task to unravel these codes and rethink this dialogue between plot and reader, contributing to the predominance of texts and the textuality of narratives.

Keywords: metonymy, code, metaphor, myth, textuality

Procedia PDF Downloads 55
4872 Winning the “Culture War”: Greater Hungary and the American Confederacy as Sites of Nostalgia, Mythology, and Problem-Making for the Far Right in the US and Hungary

Authors: Grace Rademacher

Abstract:

This article examines how the far right in the United States and Hungary mobilizes nostalgia around the “Trianon Trauma” of the Kingdom of Hungary and the “Lost Cause” of the American Confederacy. Applying Nicole Maurantonio’s articulation of “confederate exceptionalism” and Svetlana Boym’s definition of “restorative nostalgia”, this article argues that, via memorialization and public discourse, both far-right bodies flood their constituencies with narratives of nostalgia and martyrdom to sow existential anxieties about past and prophetic victimhood, all under the guise of protecting or restoring heritage. Linking this practice to gamification and conspiracy theorizing and following the work of Patrick Jagoda, this article identifies such industries of nostalgia as means by which the far right in both nations can partake in the “immanent and improvisational process of problem making.” Reified through monuments and references to the Trianon Trauma and the American Confederacy, political actors “problem make” by alleging that they are victims of the West or the Left, subject to the cruel whims of liberalism and denial of historical legitimacy. In both nations, relying on their victimhood, pundits and politicians can appeal to white supremacists and distract citizens from legitimate active conflicts, such as wars or democratic rollbacks, redirecting them to fictional, mythical attacks on Hungarian or American society and civilization. This article will examine memorials and monuments as “lieux de mémoire” and identify the purposeful similarities between the discourse of public figures and politicians such as María Schmidt, János Lázár, and Viktor Orbán, and that of Donald Trump and pundits such as Tucker Carlson.

Keywords: nationalism, political memory, white supremacy, trianon

Procedia PDF Downloads 73
4871 Effect of Forging Pressure on Mechanical Properties and Microstructure of Similar and Dissimilar Friction Welded Joints (Aluminium, Copper, Steel)

Authors: Sagar Pandit

Abstract:

The present work focuses on the effect of various process parameters on the mechanical properties and microstructure of joints produced by continuous drive friction welding and linear friction welding. An attempt is made to investigate the feasibility of obtaining an acceptable weld joint between similar as well as dissimilar components, and the microstructural changes of the good weld joints were assessed using optical microscopy and scanning electron microscopy techniques. The impact of forging pressure on the microstructure of the weld joint has been studied, and the variation in joint strength with varying forge pressure is analyzed. The weld joints were obtained for two pairs of dissimilar materials and one pair of similar materials, listed respectively as: Al-AA5083 & Cu-C101 (dissimilar), aluminium alloy 3000 series & mild steel (dissimilar), and a high nitrogen austenitic stainless steel pair (similar). Intermetallic phase formation was observed at the Al-Cu weld joint, which consequently harmed the properties of the joint (lower tensile strength). It was also concluded that an increase in forging pressure led to either an increase or a decrease in the tensile strength of the joint, depending on the similarity or dissimilarity of the components. The hardness was also observed to reach maximum as well as minimum values at the weld joint, depending on the similarity or dissimilarity of the workpieces. It was also suggested that a higher forging pressure is needed to obtain complete joining in the formation of the weld joint.

Keywords: forging pressure, friction welding, mechanical properties, microstructure

Procedia PDF Downloads 116
4870 Comparing Double-Stranded RNA Uptake Mechanisms in Dipteran and Lepidopteran Cell Lines

Authors: Nazanin Amanat, Alison Tayler, Steve Whyard

Abstract:

While chemical insecticides effectively control many insect pests, they also harm many non-target species. Double-stranded RNA (dsRNA) pesticides, in contrast, can be designed to target unique gene sequences and thus act in a species-specific manner. DsRNA insecticides do not, however, work equally well for all insects, and for some species that are considered refractory to dsRNA, a primary factor affecting efficacy is the relative ease with which dsRNA can enter a target cell’s cytoplasm. In this study, we are examining how differently structured dsRNAs (linear, hairpin, and paperclip) can enter mosquito and lepidopteran cells, as they represent dsRNA-sensitive and refractory species, respectively. To determine how the dsRNAs enter the cells, we are using chemical inhibitors and RNA interference (RNAi)-mediated knockdown of key proteins associated with different endocytosis processes. Understanding how different dsRNAs enter cells will ultimately help in the design of molecules that overcome refractoriness to RNAi or resistance to dsRNA-based insecticides. To date, we have conducted chemical inhibitor experiments on both cell lines and have evidence that linear dsRNAs enter the cells using clathrin-mediated endocytosis, while the paperclip dsRNAs (pcRNAs) can enter both species’ cells in a clathrin-independent manner to induce RNAi. An alternative uptake mechanism for the pcRNAs has been tentatively identified, and the outcomes of our RNAi-mediated knockdown experiments, which should provide corroborative evidence for our initial findings, will be discussed.

Keywords: dsRNA, RNAi, uptake, insecticides, dipteran, lepidopteran

Procedia PDF Downloads 70
4869 Comparative Study of the Effects of Process Parameters on the Yield of Oil from Melon Seed (Colocynthis citrullus) and Coconut Fruit (Cocos nucifera)

Authors: Ndidi F. Amulu, Patrick E. Amulu, Gordian O. Mbah, Callistus N. Ude

Abstract:

A comparative analysis of the properties of melon seed, coconut fruit and their oil yields was carried out in this work using standard AOAC analytical techniques. The results of the analysis revealed that the moisture contents of the samples studied are 11.15% (melon) and 7.59% (coconut), and the crude lipid contents are 46.10% (melon) and 55.15% (coconut). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed a significant difference (p < 0.05) in yield between the samples, with melon seed flour having a higher percentage oil yield range (41.30-52.90%) than coconut (36.25-49.83%). The physical characterization of the extracted oils was also carried out. The values obtained for refractive index are 1.487 (melon seed oil) and 1.361 (coconut oil), and for viscosity 0.008 (melon seed oil) and 0.002 (coconut oil). The chemical analysis of the extracted oils shows acid values of 1.00 mg NaOH/g oil (melon oil) and 10.050 mg NaOH/g oil (coconut oil), and saponification values of 187.00 mg KOH/g (melon oil) and 183.26 mg KOH/g (coconut oil). The iodine value was 75.00 mg I2/g for melon oil and 81.00 mg I2/g for coconut oil. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA). The same statistical software was also used to optimize the leaching process. Both samples gave their highest oil yield at the same optimal conditions. The optimal conditions to obtain the highest oil yield, ≥ 52% (melon seed) and ≥ 48% (coconut seed), are a solute:solvent ratio of 40 g/mL, a leaching time of 2 hours and a leaching temperature of 50 °C. The two samples studied have the potential to yield oil, with melon seed giving the higher yield.
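The optimization itself was done in Minitab; a rough Python sketch of the same idea, fitting a quadratic response model to the three leaching factors and scanning the factor ranges for the predicted optimum, is shown below (the data values are placeholders, not the study's measurements):

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# factors: solute:solvent ratio (g/mL), leaching time (h), leaching temperature (°C)
X = np.array([[20, 1, 30], [20, 1, 50], [20, 2, 30], [20, 2, 50],
              [30, 1, 30], [30, 1, 50], [30, 2, 30], [30, 2, 50],
              [40, 1, 30], [40, 1, 50], [40, 2, 30], [40, 2, 50]])
oil_yield = np.array([41.5, 42.8, 43.2, 45.0, 44.0, 45.6,
                      46.5, 47.9, 47.8, 49.0, 50.1, 52.3])   # % (placeholders)

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), oil_yield)

# scan the experimental ranges for the predicted maximum yield
grid = np.array(list(product([20, 30, 40], [1, 2], [30, 40, 50])))
pred = model.predict(poly.transform(grid))
print("predicted optimum factors:", grid[pred.argmax()], f"yield {pred.max():.1f}%")
```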

Keywords: coconut, melon, optimization, processing

Procedia PDF Downloads 439
4868 The Requirements of Developing a Framework for Successful Adoption of Quality Management Systems in the Construction Industry

Authors: Mohammed Ali Ahmed, Vaughan Coffey, Bo Xia

Abstract:

Quality management systems (QMSs) in the construction industry are often implemented to ensure that sufficient effort is made by companies to achieve the required levels of quality for clients. Attainment of these quality levels can result in greater customer satisfaction, which is fundamental to ensure long-term competitiveness for construction companies. However, the construction sector is still lagging behind other industries in terms of its successful adoption of QMSs, due to the relative lack of acceptance of the benefits of these systems among industry stakeholders, as well as other barriers related to implementing them. Thus, there is a critical need to undertake a detailed and comprehensive exploration of the adoption of QMSs in the construction sector. This paper comprehensively investigates, in the construction sector setting, the impacts of all the salient factors surrounding successful implementation of QMSs in building organizations, especially those of external factors. This study is part of an ongoing PhD project, which aims to develop a new framework that integrates both internal and external factors affecting QMS implementation. To achieve the paper's aim and objectives, interviews will be conducted to define the external factors influencing the adoption of QMSs and to obtain holistic critical success factors (CSFs) for implementing these systems. In the next stage of data collection, a questionnaire survey will be developed to investigate the prime barriers facing the adoption of QMSs, the CSFs for their implementation, and the external factors affecting the adoption of these systems. Following the survey, case studies will be undertaken to validate and explain in greater detail the real effects of these factors on QMS adoption. Specifically, this paper evaluates the effects of the external factors in terms of their impact on implementation success within the selected case studies. Using findings drawn from analyzing the data obtained from these various approaches, specific recommendations for the successful implementation of QMSs will be presented, and an operational framework will be developed. Finally, through a focus group, the findings of the study and the newly developed framework will be validated. Ultimately, this framework will be made available to the construction industry to facilitate the greater adoption and implementation of QMSs. In addition, deployment of the applicable recommendations suggested by the study will be shared with the construction industry to more effectively help construction companies to implement QMSs and overcome the barriers experienced by businesses, thus promoting the achievement of higher levels of quality and customer satisfaction.

Keywords: barriers, critical success factors, external factors, internal factors, quality management systems

Procedia PDF Downloads 183
4867 Color Image Compression/Encryption/Contour Extraction using 3L-DWT and SSPCE Method

Authors: Ali A. Ukasha, Majdi F. Elbireki, Mohammad F. Abdullah

Abstract:

Data security is needed in data transmission, storage, and communication. This paper is divided into two parts. The work deals with color images, which are decomposed into red, green and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling process. Then all channels are encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. XOR and modulo operations are then performed between the encrypted channel images in order to change the pixel values. Contours can be extracted from the recovered color images with an acceptable level of distortion using the single-step parallel contour extraction (SSPCE) method. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D color images and completely reconstruct them without any distortion. It is also shown that the algorithm provides a very high level of security against attacks such as salt-and-pepper noise and JPEG compression. This proves that color images can be protected with a higher security level. The presented method allows easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
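A compact sketch of the main building blocks described above is given below, using PyWavelets for the 3-level DWT, an Arnold cat map for scrambling, and a pixel-wise XOR with a key image (a toy random image stands in for a real photo, and the SSPCE contour-extraction step is not shown):

```python
import numpy as np
import pywt

def arnold_scramble(channel, iterations=5):
    """Arnold cat map scrambling for a square N x N channel."""
    n = channel.shape[0]
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def compress_channel(channel, levels=3):
    """3-level 2D DWT; keeping only the coarse approximation is a simple lossy stand-in."""
    coeffs = pywt.wavedec2(channel.astype(float), "haar", level=levels)
    return coeffs[0]

def xor_encrypt(channel, key_image):
    """Pixel-wise XOR with a key image of the same shape."""
    return np.bitwise_xor(channel.astype(np.uint8), key_image.astype(np.uint8))

# toy 64x64 RGB image and key image
rgb = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
key = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

red_encrypted = xor_encrypt(arnold_scramble(rgb[:, :, 0]), key)
green_approx = compress_channel(rgb[:, :, 1])
blue_approx = compress_channel(rgb[:, :, 2])
```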

Keywords: SSPCE method, image compression, salt-and-pepper attacks, bit-plane decomposition, Arnold transform, color image, wavelet transform, lossless image encryption

Procedia PDF Downloads 515
4866 Personality Traits and Physical Activity among Staff Personnel of University of Southern Mindanao

Authors: Cheeze Janito, Crisly Dawang

Abstract:

It is important to determine the personality traits that exist in the workplace and their contribution to staff's daily work routines, since a sedentary lifestyle is harmful to one's health. This study reports the personality traits of the non-teaching staff of the University of Southern Mindanao, Kabacan, Philippines, the staff's involvement in physical activity, and the big five personality traits that shape how the staff engage in physical activities. A quantitative method approach, which comprised a three-part questionnaire, was used to collect the data. Fifty non-teaching staff completed the survey. The results revealed that among the big five personality traits, the university non-teaching staff scored highest in agreeableness, reflected in a shared consideration for co-workers' feelings, the avoidance of rudeness, and a visible display of respect for co-workers and the workplace, and scored lowest in neuroticism. The study also found that the staff's main physical activity was house chores, with a reported physical activity frequency of once to twice a week; thus, the respondents are relatively disengaged from physical activity. Further, the test of the relationship between personality traits and physical activity yielded a p-value of .596, indicating no significant relationship between the two variables. This study recommends strongly promoting staff engagement in at least one hundred fifty minutes of moderate-intensity physical activity each week, as well as the use of different platforms containing physical exercise literacy and the benefits of physical exercise for the holistic development of the university community.

Keywords: university staff, physical fitness, personality traits, physical activity

Procedia PDF Downloads 191
4865 The Impact of Professional Development in the Area of Technology Enhanced Learning on Higher Education Teaching Practices Across Atlantic Technological University - Research Methodology and Preliminary Findings

Authors: Annette Cosgrove, Carina Ginty, Tony Hall, Cornelia Connolly

Abstract:

The objective of this research study is to examine the impact of professional development in Technology Enhanced Learning (TEL) and the digitization of learning in teaching communities across multiple higher education sites in the Atlantic Technological University (ATU) (2020-2025), including the proposal of an evidence-based digital teaching model for use in a future pandemic. The research strategy undertaken for this study is a multi-site study using mixed methods; qualitative and quantitative methods are being used to collect data. A pilot study was carried out initially, feedback was collected, and the research instrument was edited to reflect this feedback before being administered. The purpose of the staff questionnaire is to evaluate the impact of professional development in the area of TEL and to capture practitioners' views on the perceived impact on their teaching practice in the higher education sector across ATU (five higher education locations in the West of Ireland). The phenomenon being explored is the impact of professional development in the area of technology-enhanced learning on teaching practice in a higher education institution. The research methodology chosen for this study is an action-based research study. The researcher has chosen this approach as it is a prime strategy for developing educational theory and enhancing educational practice. This study includes quantitative and qualitative methods to elicit data that will quantify the impact that continuous professional development in the area of digital teaching practice and technologies has on the practitioner's teaching practice in higher education. The research instruments/data collection tools for this study include a lecturer survey with a targeted TEL practice group (pre- and post-Covid experience) and semi-structured interviews with lecturers. This research is currently being conducted across the ATU multi-site campus and targets higher education lecturers who have completed formal CPD in the area of digital teaching. ATU, a West of Ireland university, is the focus of the study. The research questionnaire has been deployed, with 75 respondents to date across the ATU; the primary questionnaire and semi-structured interviews are currently ongoing, the purpose being to evaluate the impact of formal professional development in the area of TEL and its perceived impact on practitioners' teaching practice in the area of digital teaching and learning. This paper will present initial findings, reflections and data from this ongoing research study.

Keywords: TEL, technology, digital, education

Procedia PDF Downloads 76
4864 Numerical Simulation of the Dynamic Behavior of a LaNi5 Water Pumping System

Authors: Miled Amel, Ben Maad Hatem, Askri Faouzi, Ben Nasrallah Sassi

Abstract:

A metal hydride water pumping system uses hydrogen as the working fluid to pump water at low head and high discharge. The principal operation of this pump is based on the desorption of hydrogen at high pressure and its absorption at low pressure by a metal hydride. This work is devoted to studying the dynamic behavior of a metal hydride pump using an unsteady model and LaNi5 as the hydriding alloy. This study shows that with the MHP, it is possible to pump 340 l/kg-cycle of water in 15,000 s using 1 kg of LaNi5 at a desorption temperature of 360 K, a pumping head equal to 5 m and a desorption gear ratio equal to 33. This study also reveals that the error given by the steady model, using LaNi5, is about 2%. A dimensional mathematical model and the governing equations of the pump were presented to predict the coupled heat and mass transfer within the MHP. Then, a numerical simulation is carried out to present the time evolution of the specific water discharge and to test the effect of different parameters (desorption temperature, absorption temperature, desorption gear ratio) on the performance of the water pumping system (specific water discharge, pumping efficiency and pumping time). In addition, a comparison between results obtained with the steady and unsteady models is performed for different hydride masses. Finally, a geometric configuration of the reactor is simulated to optimize the pumping time.

Keywords: dynamic behavior, LaNi5, performance of water pumping system, unsteady model

Procedia PDF Downloads 199
4863 Simulation of Lean Principles Impact in a Multi-Product Supply Chain

Authors: Matteo Rossini, Alberto Portioli Staudacher

Abstract:

Market competition is moving from the single firm to the whole supply chain because of increasing competition and a growing need for operational efficiency and customer orientation. Supply chain management allows companies to look beyond their organizational boundaries to develop and leverage the resources and capabilities of their supply chain partners. This creates competitive advantages in the marketplace, and because of this, SCM has acquired strategic importance. The lean approach is a management strategy that focuses on reducing every type of waste present in an organization. This approach is becoming more and more popular among supply chain managers, but its application to supply chains is not widely diffused. The impacts of lean principles in a supply chain context are not well studied; in the literature, there are only a few studies simulating lean performance in single-product supply chains. This research work studies the impacts of implementing lean principles along a supply chain. To achieve this, a simulation model of a three-echelon multi-product supply chain has been built. A Kanban system (with several priority policies) and various degrees of setup time reduction are implemented in the lean-configured supply chain to apply the pull and lot-size reduction principles, respectively. To evaluate the benefits of the lean approach, the lean supply chain is compared with an EOQ-configured supply chain. The simulation results show that the Kanban system and setup-time reduction improve inventory stock levels. They also show that the logistics effort is affected by the degree of lean implementation. The paper concludes by describing the performance of the lean supply chain in different contexts.
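A minimal sketch of a single-stage Kanban pull loop of the kind used inside such a simulation model is shown below (a toy discrete-time illustration with assumed parameters, not the authors' three-echelon model):

```python
import random

def simulate_kanban(num_cards=8, lead_time=3, demand_rate=4, days=365, seed=1):
    """Single-stage Kanban loop: each shipped container frees a card, which
    triggers a replenishment order arriving after `lead_time` days."""
    random.seed(seed)
    on_hand = num_cards            # start with all containers full
    pipeline = []                  # arrival days of open replenishment orders
    served = demanded = stock_sum = 0
    for day in range(days):
        on_hand += sum(1 for d in pipeline if d <= day)   # receive arrivals
        pipeline = [d for d in pipeline if d > day]
        demand = random.randint(0, 2 * demand_rate)       # toy daily demand
        shipped = min(on_hand, demand)
        demanded += demand
        served += shipped
        on_hand -= shipped
        pipeline += [day + lead_time] * shipped           # freed kanbans reorder
        stock_sum += on_hand
    return stock_sum / days, served / demanded            # avg stock, fill rate

avg_stock, fill_rate = simulate_kanban()
print(f"average on-hand stock: {avg_stock:.1f}, fill rate: {fill_rate:.1%}")
```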

Keywords: inventory policy, Kanban, lean supply chain, simulation study, supply chain management, planning

Procedia PDF Downloads 353
4862 Non-Linear Assessment of Chromatographic Lipophilicity of Selected Steroid Derivatives

Authors: Milica Karadžić, Lidija Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Kovačević, Anamarija Mandić, Aleksandar Oklješa, Andrea Nikolić, Marija Sakač, Katarina Penov Gaši

Abstract:

Using a chemometric approach, the relationships between chromatographic lipophilicity and in silico molecular descriptors for twenty-nine selected steroid derivatives were studied. The chromatographic lipophilicity was predicted using the artificial neural networks (ANNs) method. The most important in silico molecular descriptors were selected by applying stepwise selection (SS) paired with the partial least squares (PLS) method. Molecular descriptors with satisfactory variable importance in projection (VIP) values were selected for ANN modeling. The usefulness of the generated models was confirmed by detailed statistical validation. High agreement between experimental and predicted values indicated that the obtained models have good quality and high predictive ability. Global sensitivity analysis (GSA) confirmed the importance of each molecular descriptor used as an input variable. High-quality networks indicate a strong non-linear relationship between chromatographic lipophilicity and the in silico molecular descriptors used. Applying the selected molecular descriptors and the generated ANNs, a good prediction of the chromatographic lipophilicity of the studied steroid derivatives can be obtained. This article is based upon work from COST Actions (CM1306 and CA15222), supported by COST (European Cooperation in Science and Technology).
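A sketch of this descriptor-selection-plus-ANN pipeline is shown below, using a PLS model to compute VIP scores and a small feed-forward network trained on the retained descriptors (the toy data and the VIP > 1 cut-off are assumptions, not the study's descriptors or settings):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor

def vip_scores(pls, X):
    """Variable Importance in Projection for a fitted PLSRegression model."""
    t = pls.transform(X)                 # X scores, shape (n, a)
    w = pls.x_weights_                   # X weights, shape (p, a)
    q = pls.y_loadings_                  # Y loadings, shape (1, a)
    p, a = w.shape
    ss = np.array([(q[0, i] ** 2) * (t[:, i] @ t[:, i]) for i in range(a)])
    wnorm = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(p * (wnorm @ ss) / ss.sum())

rng = np.random.default_rng(0)
X = rng.normal(size=(29, 50))            # 29 compounds x 50 descriptors (placeholders)
y = rng.normal(size=29)                  # chromatographic lipophilicity (placeholders)

pls = PLSRegression(n_components=3).fit(X, y)
keep = vip_scores(pls, X) > 1.0          # common VIP cut-off
ann = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0)
ann.fit(X[:, keep], y)
print("descriptors retained:", int(keep.sum()), " R^2:", ann.score(X[:, keep], y))
```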

Keywords: artificial neural networks, chemometrics, global sensitivity analysis, liquid chromatography, steroids

Procedia PDF Downloads 341
4861 English as a Foreign Language Teachers' Perspectives on the Workable Approaches and the Challenges They Encountered When Teaching Reading Using E-Learning

Authors: Sarah Alshehri, Messedah Alqahtani

Abstract:

Reading instruction in EFL classes is still challenging for teachers, and many students are still behind their expected level. Due to the Covid-19 pandemic, there was a shift in teaching English from face-to-face to online classes. This paper will examine how the digital shift during and after the pandemic has influenced English literacy instruction and which methods seem to be effective or challenging. Specifically, this paper will examine English language teachers' perspectives on the workable approaches and the challenges they encountered when teaching reading using an e-learning platform in Saudi Arabian secondary and intermediate schools. The study explores public secondary school EFL teachers' instructional practices and the challenges encountered when teaching reading online. Quantitative data will be collected through a 28-item Likert-type survey that will be administered to Saudi English teachers who work in public secondary and intermediate schools. The quantitative data will be analyzed using SPSS by conducting frequency distributions, descriptive statistics, reliability tests, and one-way ANOVA tests. The potential outcomes of this study will contribute to a better understanding of digital literacy and technology integration in language teaching. Findings of this study can provide directions for professionals and policy makers to improve the quality of English teaching and learning. Limitations and results will be discussed, and suggestions for future directions will be offered.
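As a small illustration of the planned analysis step, a one-way ANOVA of the kind mentioned above can be run on the Likert-scale scores of teacher groups; the grouping variable (years of online teaching experience) and the scores below are placeholders, and the study itself uses SPSS rather than Python:

```python
import numpy as np
from scipy import stats

# mean agreement scores per teacher, grouped by online teaching experience
group_low = np.array([3.1, 3.4, 2.9, 3.6, 3.2])     # < 2 years
group_mid = np.array([3.8, 4.0, 3.7, 3.9, 4.1])     # 2-5 years
group_high = np.array([4.2, 4.4, 4.0, 4.3, 4.5])    # > 5 years

f_stat, p_value = stats.f_oneway(group_low, group_mid, group_high)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")       # reject H0 if p < 0.05
```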

Keywords: EFL reading, E-learning- EFL literacy, EFL workable approaches, EFL reading instruction

Procedia PDF Downloads 93
4860 Synthesis, Crystal Structure Characterization, Hirshfeld Surface Analysis and Biological Activities of Two Schiff Base Polymorphs Derived From 2-Aminobenzonitrile

Authors: Nesrine Benarous, Hassiba Bougueria, Nabila Moussa Slimane, Aouatef Cherouana

Abstract:

Crystal polymorphism is important for the synthesis of more potent and bioactive pharmaceutical compounds, since polymorphs differ in properties such as packing arrangement and conformation. In fact, polymorphism plays a vital role in drug development. Different parameters affect crystallization and add degrees of freedom to the process. Several properties are affected by polymorphism, such as kinetics, thermodynamics, spectroscopic behavior, and mechanical properties. Various techniques are used to characterize polymorphs, probing crystallography, morphology, phase transitions, molecular motion, and chemical environment. In this work, two polymorphs (I and II) of the Schiff base (SB) title compound were prepared by a condensation reaction, and their crystal structures were determined by single-crystal X-ray analysis. The two polymorphs crystallize in two different space groups: P21/c for I and Pbca for II. The dihedral angles between the two phenyl rings are 4.81° for I and 82.27° for II. Both crystal structures are built on the basis of moderate and weak hydrogen bonds, π-stacking, and halogen⋯halogen interactions. On the other hand, Hirshfeld surface (HS) analysis indicates that the most important contributions to the crystal packing of the two polymorphs are from Cl⋯H/H⋯Cl, H⋯H, and N⋯H/H⋯N contacts. These are followed by C⋯H/H⋯C contacts for compound I and by C⋯C and C⋯H/H⋯C contacts for compound II. Afterwards, the in vitro antibacterial assay revealed that the SB is effective against the Gram-negative bacterium Klebsiella pneumoniae and the Gram-positive bacterium Staphylococcus aureus, with a MIC value of 14.37 μg/mL. Moreover, the SB exhibited moderate toxicity against brine shrimp, with an LC50 value of 44.19 μg/mL.
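
As an aside on the geometry reported above, the sketch below shows one common way to compute the dihedral angle between two phenyl-ring mean planes from atomic coordinates: fit a least-squares plane to each ring and take the angle between the plane normals. The coordinates used here are random placeholders, not the published structures, so the printed value only demonstrates the computation.

```python
# Sketch only: angle between the mean planes of two rings from atomic coordinates.
import numpy as np

def plane_normal(coords: np.ndarray) -> np.ndarray:
    """Unit normal of the least-squares plane through a set of atoms."""
    centered = coords - coords.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]

def ring_dihedral(ring_a: np.ndarray, ring_b: np.ndarray) -> float:
    """Angle (degrees) between the mean planes of two rings."""
    n1, n2 = plane_normal(ring_a), plane_normal(ring_b)
    cos_angle = abs(np.dot(n1, n2))          # report the acute angle
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Placeholder 6x3 Cartesian coordinates for the two phenyl rings.
ring1 = np.random.rand(6, 3)
ring2 = np.random.rand(6, 3)
print(f"Dihedral angle: {ring_dihedral(ring1, ring2):.2f} deg")
```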

Keywords: polymorph, crystal structure, hirshfeld surface analysis, in vitro antibacterial activity, toxicity

Procedia PDF Downloads 106
4859 Travel Delay and Modal Split Analysis: A Case Study

Authors: H. S. Sathish, H. S. Jagadeesh, Skanda Kumar

Abstract:

A journey time and delay study is used to evaluate the quality of service; such a study can also be used to evaluate the quality of traffic movement along the route and to determine the locations, types, and extent of traffic delays. Components of delay are boarding and alighting, issue of tickets, other causes, and the distance between stops. This study investigates the total journey time required to travel along the stretch and the influence of these delays. The route runs from Kempegowda Bus Station to Yelahanka Satellite Station in Bangalore City; the length of the stretch is 16.5 km. A modal split analysis has been done for this stretch. The stretch has an elevated highway connecting to Bangalore International Airport and the extension of the metro transit line. Regression analysis of total journey time shows that it is moderately affected by delay due to boarding and alighting, while delay due to the issue of tickets affects journey time to a greater extent. Some delay factors significantly affect journey time, as is evident from the F-test at the 10 percent level of significance. Along this stretch, work trips are more prevalent, as indicated by the O-D study. The modal shift analysis indicates that about 70 percent of commuters are ready to shift from the current system to the Metro Rail System. The Metro Rail System carries the maximum number of trips compared to private modes. Hence, Metro is a highly viable choice of mode for the Bangalore Metropolitan City.
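
To make the regression step concrete, the following is a minimal sketch in Python (with simulated observations rather than the study's field data) of regressing total journey time on delay components and flagging coefficients that are significant at the 10 percent level. Variable names and the simulated effect sizes are assumptions for illustration only.

```python
# Sketch only: journey-time regression on delay components, with a 10% significance screen.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-run observations along the 16.5 km stretch (minutes).
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "boarding_alighting_delay": rng.uniform(1, 5, n),
    "ticket_issue_delay": rng.uniform(0.5, 4, n),
    "other_delay": rng.uniform(0, 3, n),
})
df["journey_time"] = (35
                      + 0.8 * df["boarding_alighting_delay"]   # moderate effect (assumed)
                      + 2.5 * df["ticket_issue_delay"]         # larger effect (assumed)
                      + rng.normal(0, 2, n))

X = sm.add_constant(df[["boarding_alighting_delay",
                        "ticket_issue_delay",
                        "other_delay"]])
ols = sm.OLS(df["journey_time"], X).fit()

print(ols.summary())                          # overall F-test and coefficient estimates
print(ols.pvalues[ols.pvalues < 0.10])        # factors significant at the 10 percent level
```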

Keywords: delay, journey time, modal choice, regression analysis

Procedia PDF Downloads 490
4858 Aorta Adhesion Molecules in Cholesterol-Fed Rats Supplemented with Extra Virgin Olive Oil or Sunflower Oil, in Either Commercial or Modified Forms

Authors: Ageliki I. Katsarou, Andriana C. Kaliora, Antonia Chiou, Apostolos Papalois, Nick Kalogeropoulos, Nikolaos K. Andrikopoulos

Abstract:

Chronic inflammation plays a pivotal role in CVD development, while phytochemicals have been shown to reduce CVD risk. Several studies have correlated olive oil consumption with CVD prevention and CVD risk reduction. However, the effects of individual olive oil macro- or micro-constituents and possible synergisms among them need to be further elucidated. Herein, the lipidic and polar phenolic fractions of extra virgin olive oil (EVOO) were evaluated for their effect on inflammatory markers in cholesterol-fed rats. Oils differing in polar phenolic content and lipid profile were used. Male Wistar rats were fed for 9 weeks on either a high-cholesterol diet (HCD) or an HCD supplemented with oils that were either commercially available, i.e., EVOO and sunflower oil (SO), or modified in their polar phenol content, i.e., phenolics-deprived EVOO (EVOOd) and SO enriched with the EVOO phenolics (SOe). Post-intervention, aorta and blood samples were collected. The HCD induced dyslipidemia, manifested by elevation of serum total cholesterol and low-density lipoprotein cholesterol. Additionally, the HCD resulted in higher adhesion molecule levels in rat aorta. In the case of E-selectin, this increase was attenuated by HCD supplementation with EVOO and EVOOd, while no alterations were observed in the SO and SOe groups. No differences were observed between pairs of commercial and modified oils, indicating that oleates may be the components responsible for lowering aortic E-selectin levels. The same was true for vascular cell adhesion molecule-1 (VCAM-1); the augmentation in cholesterol-fed animals was attenuated by the EVOO and EVOOd diets, highlighting the effect of oleates. In addition, VCAM-1 levels were higher in the SO group than in the respective SOe group, indicating that, in the presence of phenolic compounds, linoleic acid becomes less prone to oxidation. Intercellular adhesion molecule-1 (ICAM-1) levels were higher in cholesterol-fed rats but were not affected by any of the oils supplemented during the intervention. Overall, EVOO was found superior to SO in regulating adhesion molecule levels in rat aorta. EVOO and EVOOd exhibited analogous effects on all adhesion molecules assessed, indicating that EVOO major constituents (oleates) improve E-selectin and VCAM-1 levels in rat aorta, independently of the presence of phenolics. Further research is needed to elucidate the effect of phenolics and oleates in other tissues.

Keywords: extra virgin olive oil, cholesterol-fed rats, polar phenolics, adhesion molecules

Procedia PDF Downloads 265
4857 Near-Infrared Optogenetic Manipulation of a Channelrhodopsin via Upconverting Nanoparticles

Authors: Kanchan Yadav, Ai-Chuan Chou, Rajesh Kumar Ulaganathan, Hua-De Gao, Hsien-Ming Lee, Chien-Yuan Pan, Yit-Tsong Chen

Abstract:

Optogenetics is an innovative technology now widely adopted by researchers in different fields of the biological sciences. However, due to the weak tissue penetration of the short wavelengths used to activate light-sensitive proteins, an invasive light guide has been used in animal studies for photoexcitation of target tissues. Upconverting nanoparticles (UCNPs), which transform near-infrared (NIR) light into short-wavelength emissions, can help address this issue. To improve optogenetic performance, we enhance the target selectivity of optogenetic control by conjugating the UCNPs to light-sensitive proteins at the molecular level, which shortens the donor-acceptor distance and enhances the efficiency of energy transfer. We tagged the extracellular N-terminus of channelrhodopsin-2 with V5 and Lumio epitopes, with mCherry conjugated at the intracellular C-terminus (VL-ChR2m), and then bound NeutrAvidin-functionalized UCNPs (NAv-UCNPs) to the VL-ChR2m via a biotinylated antibody against V5 (bV5-Ab). We observed an apparent energy transfer from the excited UCNP (donor) to the bound VL-ChR2m (acceptor) by measuring emission-intensity changes at the donor-acceptor complex. A successful patch-clamp electrophysiological test and the intracellular Ca2+ elevation observed in the designed UCNP-ChR2 system under optogenetic manipulation confirmed the practical functionality of UCNP-assisted NIR optogenetics. This work represents a significant step toward improving therapeutic optogenetics.
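
For context, donor emission-intensity changes of the kind described above are commonly converted into an energy-transfer efficiency using the textbook donor-quenching relation below; this is a general formula, not a value reported in this abstract.

```latex
E = 1 - \frac{I_{DA}}{I_{D}}
```

Here, I_DA is the donor (UCNP) emission intensity measured with the bound acceptor (VL-ChR2m) present, and I_D is the donor emission intensity in its absence.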

Keywords: channelrhodopsin-2, near-infrared, optogenetics, upconverting nanoparticles

Procedia PDF Downloads 272