Search results for: David Read
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1156

286 Effects of Foreign-language Learning on Bilinguals' Production in Both Their Languages

Authors: Natalia Kartushina

Abstract:

Foreign (second) language (L2) learning is highly promoted in modern society. Students are encouraged to study abroad (SA) to achieve the most effective learning outcomes. However, L2 learning has side effects for native language (L1) production, as L1 sounds might drift from the L1 norms towards those of the L2, even after a short period of L2 learning. L1 assimilatory drift has been attributed to a strong perceptual association between similar L1 and L2 sounds in the mind of L2 learners; thus, a change in the production of an L2 target leads to a change in the production of the related L1 sound. However, nowadays it is quite common for speakers to acquire two languages from birth, as is the case, for example, in many bilingual communities (e.g., Basque and Spanish in the Basque Country). Yet, it remains to be established how foreign-language (FL) learning affects native production in individuals who have two native languages, i.e., in simultaneous or very early bilinguals. Does FL learning (here a third language, L3) affect both of bilinguals’ languages or only one? What factors determine which of the bilinguals’ languages is more susceptible to change? The current study examines the effects of L3 (English) learning on the production of vowels in the two native languages of simultaneous Spanish-Basque bilingual adolescents enrolled in the Erasmus SA English program. Ten bilingual speakers read five Spanish and Basque consonant-vowel-consonant-vowel words two months before their SA and the day after their return to Spain. Each word contained the target vowel in the stressed syllable and was repeated five times. Acoustic analyses measuring vowel openness (F1) and backness (F2) were performed. Two possible outcomes were considered. First, we predicted that L3 learning would affect the production of only one language, namely the language used the most in contact with English during the SA period. This prediction stems from the results of recent studies showing that early bilinguals have separate phonological systems for each of their languages, and that late FL learners (as is the case for our participants), who tend to use their L1 in language-mixing contexts, have more L2-accented L1 speech. The second possibility was that L3 learning would affect both of the bilinguals’ languages, in line with studies showing that bilinguals’ L1 and L2 phonologies interact and constantly co-influence each other. The results revealed that speakers who used both languages equally often (balanced users) showed an F1 drift in both languages toward the F1 of the English vowel space. Unbalanced speakers, however, showed a drift only in the less used language. The results are discussed in light of recent studies suggesting that the amount of language use is a strong predictor of authenticity in speech production, with less language use leading to more foreign-accented speech and, eventually, to language attrition.
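As an illustration of the drift measure implied by the acoustic analysis above, the short sketch below computes the Euclidean shift of a vowel's mean (F1, F2) between the pre-departure and post-return recordings; the formant values are hypothetical, and this is not the authors' analysis pipeline.

```python
# A minimal sketch of the drift measure implied above: the Euclidean shift of a vowel's
# mean (F1, F2) between pre-departure and post-return recordings. The formant values
# below are hypothetical, not the study's measurements.
import numpy as np

pre  = np.array([[310, 2250], [305, 2230], [318, 2270]])   # (F1, F2) in Hz before SA
post = np.array([[335, 2180], [328, 2165], [340, 2200]])   # (F1, F2) in Hz after SA
drift = np.linalg.norm(post.mean(axis=0) - pre.mean(axis=0))
print(f"vowel drift: {drift:.1f} Hz in F1-F2 space")
```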

Keywords: language-contact, multilingualism, phonetic drift, bilinguals' production

Procedia PDF Downloads 91
285 Nanoparticles Made of Amino Acid Derived Biodegradable Polymers as Promising Drug Delivery Containers

Authors: Sophio Kobauri, Tengiz Kantaria, Temur Kantaria, David Tugushi, Nina Kulikova, Ramaz Katsarava

Abstract:

Polymeric disperse systems such as nanoparticles (NPs) are of high interest for numerous applications in contemporary medicine and nanobiotechnology, owing to their considerable potential for the treatment of many human diseases. The important technological advantages of using NPs as drug carriers (nanocontainers) are their high stability, high carrier capacity, the feasibility of encapsulating both hydrophilic and hydrophobic substances, as well as the high variety of possible administration routes, including oral application and inhalation. NPs can also be designed to allow controlled (sustained) drug release from the matrix. These properties of NPs enable improvement of drug bioavailability and might allow a decrease in drug dosage. The targeted and controlled administration of drugs using NPs might also help to overcome drug resistance, which is one of the major obstacles in the control of epidemics. Various degradable and non-degradable polymers of both natural and synthetic origin have been used for NP construction. Among the most promising for the design of NPs are amino acid-based biodegradable polymers (AABBPs), which can be cleared from the body after fulfilling their function. AABBPs are composed of naturally occurring and non-toxic building blocks such as α-amino acids, fatty diols and dicarboxylic acids. The particles designed from these polymers are expected to have improved bioavailability along with high biocompatibility. The present work deals with a systematic study of the preparation of NPs from AABBPs by the cost-effective polymer deposition/solvent displacement method. The influence of the nature and concentration of surfactants, the concentration of the organic phase (polymer solution), the organic phase/aqueous (water) phase ratio, as well as some other factors on the size of the fabricated NPs has been studied. It was established that, depending on the conditions used, the NP size could be tuned within 40-330 nm. In the next step of this research, the biocompatibility and bioavailability of the synthesized NPs were evaluated using a stable human cell line, A549. It was established that the obtained NPs are not only biocompatible but also stimulate cell growth.

Keywords: amino acids, biodegradable polymers, bioavailability, nanoparticles

Procedia PDF Downloads 273
284 CFD Simulation of the Urban Environment for Evaluating the Wind Energy Potential of a Building or a New Urban Plan

Authors: David Serero, Loic Couton, Jean-Denis Parisse, Robert Leroy

Abstract:

This paper presents a method for analyzing airflow at the periphery of several typologies of architectural volumes. To understand the influence of the complexity of the urban environment on airflows in the city, we compared three sites at different architectural scales. The research establishes a method to identify the optimal location for the installation of wind turbines on the edges of a building and to improve the energy extracted through precise localization of an accelerating wing called an “aerofoil”. The objective is to define principles for the installation of wind turbines and for the natural ventilation design of buildings. Instead of a theoretical wind analysis, we combined numerical aeraulic simulations using STAR-CCM+ software with wind data collected over long periods of time (greater than 1 year). While computational fluid dynamics (CFD) simulations of airflow around buildings are now common, we calibrated a virtual wind tunnel with wind data from in situ anemometers (to establish a localized cartography of urban winds). We can then develop a complete volumetric model of the behavior of the wind over a roof area or an entire urban block. With this method, we can characterize the different types of wind in urban areas and identify the minimum and maximum wind spectrum, select the type of harvesting device and its fixing to the roof of a building, determine the altimetry of the device in relation to roof levels, and assess the potential nuisances in the surroundings. This study is based on the recovery of a geolocated data stream and the connection of this information with the technical specifications of wind turbines, their energy performance and their speed of engagement. Thanks to this method, we can define the characteristics of wind turbines that maximize their performance in urban sites and in a turbulent airflow regime. We also study the installation of a wind accelerator associated with buildings. The integrated aerofoils improve control of the air speed, orient the flow onto the wind turbine, accelerate it and, thanks to their profile, conceal the device on the roof of the building.

Keywords: wind energy harvesting, wind turbine selection, urban wind potential analysis, CFD simulation for architectural design

Procedia PDF Downloads 120
283 Exploring an Exome Target Capture Method for Cross-Species Population Genetic Studies

Authors: Benjamin A. Ha, Marco Morselli, Xinhui Paige Zhang, Elizabeth A. C. Heath-Heckman, Jonathan B. Puritz, David K. Jacobs

Abstract:

Next-generation sequencing has enhanced the ability to acquire massive amounts of sequence data to address classic population genetic questions for non-model organisms. Targeted approaches allow for cost-effective or more precise analyses of relevant sequences; however, many such techniques require a known genome, and it can be costly to purchase probes from a company. This is challenging for non-model organisms with no published genome and can be expensive for large population genetic studies. Expressed exome capture sequencing (EecSeq) synthesizes probes in the lab from expressed mRNA, which is used to capture and sequence the coding regions of genomic DNA from a pooled suite of samples. A normalization step produces probes that recover transcripts from a wide range of expression levels. This approach offers low-cost recovery of a broad range of genes in the genome. This research project expands on EecSeq to investigate whether mRNA from one taxon may be used to capture relevant sequences from a series of increasingly less closely related taxa. For this purpose, we propose to use the endangered Northern Tidewater goby, Eucyclogobius newberryi, a non-model organism that inhabits California coastal lagoons. mRNA will be extracted from E. newberryi to create probes and capture exomes from eight other taxa, including the more at-risk Southern Tidewater goby, E. kristinae, and more divergent species. Captured exomes will be sequenced, analyzed bioinformatically and phylogenetically, and then compared to previously generated phylogenies across this group of gobies. This will provide an assessment of the utility of the technique in cross-species studies and for analyzing low genetic variation within species, as is the case for E. kristinae. This method has potential applications to provide economical ways to expand population genetic and evolutionary biology studies for non-model organisms.

Keywords: coastal lagoons, endangered species, non-model organism, target capture method

Procedia PDF Downloads 166
282 Sediment Transport Monitoring in the Port of Veracruz Expansion Project

Authors: Francisco Liaño-Carrera, José Isaac Ramírez-Macías, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga, Marcos Rangel-Avalos, Adriana Andrea Roldán-Ubando

Abstract:

The construction of most coastal infrastructure developments around the world is usually planned considering wave height, current velocities and river discharges; however, little effort has been devoted to surveying sediment transport during dredging or the modification of currents outside ports or marinas during and after construction. This study shows a complete survey during the construction of one of the largest ports of the Gulf of Mexico. An anchored Acoustic Doppler Current Profiler (ADCP), a towed ADCP and a combination of model outputs were used at the Veracruz port construction site in order to describe the hourly sediment transport and current modifications in and out of the new port. Owing to the stability of the system, the new port was constructed inside Vergara Bay, a low-wave-energy system with a tidal range of up to 0.40 m. The results show a two-gyre current pattern within the bay. The north side of the bay has an anticyclonic gyre, while the southern part of the bay shows a cyclonic gyre. Sediment transport trajectories were computed every hour using the anchored ADCP, a numerical model and the weekly data obtained from the towed ADCP within the entire bay. The sediment transport trajectories were carefully tracked since the bay is surrounded by coral reef structures, which are sensitive to sedimentation rate and water turbidity. The survey shows that during dredging and the rock input used to build the breakwater, sediments were added locally (< 2,500 m²) and local currents dispersed them in less than 4 h, while the river input located in the middle of the bay and the sewage plant may add more than 10 times this amount during a rainy day or during the tourist season. Finally, the coastline obtained seasonally with a drone suggests that the southern part of the bay has not been modified by the construction of the new port located in the northern part of the bay, owing to the division of the bay into two subsystems.

Keywords: Acoustic Doppler Current Profiler, construction around coral reefs, dredging, port construction, sediment transport monitoring

Procedia PDF Downloads 208
281 Towards Creative Movie Title Generation Using Deep Neural Models

Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie

Abstract:

Deep machine learning techniques, including deep neural networks (DNNs), have been used to model language and dialogue for conversational agents to perform tasks such as giving technical support, as well as for general chit-chat. They have been shown to be capable of generating long, diverse and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby the human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information, e.g., the genre of the movie. It then learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis ‘A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.’, the original, true title is ‘The Driver’ and the one generated by the model is ‘The Masquerade’. A human evaluation was conducted where the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: ‘creativity’, ‘naturalness’ and ‘suitability’. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means of m=3.11 and m=3.12, respectively. There is room for improvement in these models, as they were rated significantly less ‘natural’ and ‘suitable’ when compared to the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audience’s attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
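As a concrete illustration of the seq2seq setup described above, the sketch below (PyTorch) wires a GRU encoder over description tokens to a GRU decoder over title tokens; the vocabulary sizes, dimensions and toy batch are illustrative assumptions, not the authors' architecture or data.

```python
# A sketch (PyTorch) of the kind of GRU encoder-decoder described above: the encoder
# reads description tokens, the decoder emits title tokens. Vocabulary sizes, dimensions
# and the toy batch are illustrative assumptions, not the authors' architecture or data.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb=128, hid=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.src_emb(src_ids))            # context from the description
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), h)   # teacher-forced title decoding
        return self.out(dec_out)                              # logits over the title vocabulary

model = Seq2Seq(src_vocab=20_000, tgt_vocab=8_000)
src = torch.randint(0, 20_000, (4, 60))    # 4 toy descriptions, 60 tokens each
tgt = torch.randint(0, 8_000, (4, 8))      # 4 toy titles, 8 tokens each
logits = model(src, tgt)                   # shape (4, 8, 8000)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 8_000), tgt.reshape(-1))
```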

Keywords: creativity, deep machine learning, natural language generation, movies

Procedia PDF Downloads 305
280 Projected Impact of Population Aging on Noncommunicable Disease Burden and Costs in the Kingdom of Saudi Arabia, 2020–2030

Authors: David C. Boettiger, Tracy Kuo Lin, Maram Almansour, Mariam M. Hamza, Reem Alsukait, Christopher H. Herbst, Nada Altheyab, Ayman Afghani, Faisal Kattan

Abstract:

Background: The number of people aged over 65 years per 100 people aged 20–64 years is expected to almost double in the Kingdom of Saudi Arabia (KSA) between 2020 and 2030. We therefore aimed to quantify the growing non-communicable disease (NCD) burden in KSA between 2020 and 2030 and the impact this will have on the national health budget. Methods: Ten priority NCDs were selected: ischemic heart disease, stroke, type 2 diabetes, chronic obstructive pulmonary disease, chronic kidney disease, dementia, depression, osteoarthritis, colorectal cancer, and breast cancer. Age- and sex-specific prevalence was projected for each priority NCD between 2020 and 2030. Treatment coverage rates were applied to the projected prevalence estimates to calculate the number of patients incurring treatment costs for each condition. For each priority NCD, the average cost-of-illness was estimated based on published literature. The impact of changes to our base-case model in terms of assumed disease prevalence, treatment coverage, and costs of care, coming into effect from 2023 onwards, was explored. Results: The prevalences of colorectal cancer and stroke were projected to almost double between 2020 and 2030 (97% and 88% increases, respectively). The only priority NCD whose prevalence was projected to increase by less than 60% between 2020 and 2030 was depression (22% increase). It is estimated that the total cost of managing priority NCDs in KSA will increase from USD 19.8 billion in 2020 to USD 32.4 billion in 2030 (an increase of USD 12.6 billion, or 63%). The largest USD increases were projected for osteoarthritis (USD 4.3 billion), diabetes (USD 2.4 billion), and dementia (USD 1.9 billion). In scenario analyses, our 2030 projection for the total cost of managing priority NCDs varied between USD 29.2 billion and USD 35.7 billion. Conclusions: Managing the growing NCD burden in KSA’s aging population will require substantial increases in healthcare spending over the coming years.
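The core of the projection method described above is prevalence times treatment coverage times cost-of-illness, summed over conditions and years; the minimal sketch below shows that arithmetic with purely illustrative placeholder numbers, not the study's inputs.

```python
# The core of the projection described above is prevalence x treatment coverage x
# cost-of-illness per condition and year; the numbers below are illustrative
# placeholders, not the study's inputs.
def projected_cost(population, prevalence, coverage, unit_cost):
    """Annual treatment cost for one NCD: treated patients times cost per case (USD)."""
    return population * prevalence * coverage * unit_cost

cost_2020 = projected_cost(population=35_000_000, prevalence=0.040, coverage=0.60, unit_cost=2_500)
cost_2030 = projected_cost(population=42_000_000, prevalence=0.055, coverage=0.60, unit_cost=2_500)
print(f"2020: {cost_2020/1e9:.1f}B USD, 2030: {cost_2030/1e9:.1f}B USD, "
      f"increase: {100 * (cost_2030 / cost_2020 - 1):.0f}%")
```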

Keywords: aging, noncommunicable disease, costs, Saudi Arabia

Procedia PDF Downloads 24
279 Teaching the Meaning of the Holy Quran Using Modern Technology

Authors: Arjumand Warsy

Abstract:

Among Muslims, the Holy Quran is taught from early childhood, and generally by the age of 7-8 years the reading of the entire Quran has been completed by most children in Muslim families. During this period, excellent reciters are selected to teach, and emphasis is laid on correct reading, pronunciation and memorization. Following these years, the parents lay emphasis on the recitation of the Quran on a daily basis. During the month of Ramadan the entire Quran is read one or more times, and there is a considerable number of Muslims who complete the entire Quran once or more each calendar month. Many Muslims do not know Arabic, and for them the message of the Quran is what others tell them; often they have no idea about this Guidance sent to them. This deficiency is reflected in many ways among people living in both Muslim and non-Muslim countries. Due to the deficiency in knowledge about Islamic teachings, the foundations of Islam are being eroded by a variety of forces. In an attempt to guard against non-Islamic influences, every Muslim must have a clear understanding of the Islamic teachings and requirements. The best guidance can be provided by an understanding of the Holy Quran. However, we are faced with the problem that the Quran is often taught in a way that fails to develop an interest in and understanding of the message from Allah. Looking at the teaching of other subjects, both scientific and non-scientific, at school, college or university level, it is obvious that advances in teaching methodologies using electronic technology have had a major impact, with both the understanding and the interest of the students significantly elevated. We attempted to teach the meaning of the Holy Quran to children and adults using a scientific and modern approach based on slide presentations and animations. The results showed an almost 100% increase in the understanding of the Quran's message; all attendees claimed they developed an increased interest in the study of the Holy Quran and did not lose focus or become bored during the lectures. They learnt the information and remembered it more effectively. The love for Allah and Prophet Mohammad (PBUH) increased significantly. The fear of Allah and love of Heaven developed significantly. Historical facts and the stories of past nations became clearer, and the Greatness of the Creator was strongly felt. Several attendees wanted to become better Muslims and to spread the knowledge of Islam. In this presentation, the adopted teaching method will first be presented and demonstrated to the audience using a short Surah from the Quran, followed by a discussion of the results achieved during our study. We will endeavor to convey to the audience that there is a need to adopt a more scientific approach to teaching the Quran so that a greater benefit is achieved by all.

Keywords: The Holy Quran, Muslims, presentations, technology

Procedia PDF Downloads 404
278 Characterizing Nasal Microbiota in COVID-19 Patients: Insights from Nanopore Technology and Comparative Analysis

Authors: David Pinzauti, Simon De Jaegher, Maria D'Aguano, Manuele Biazzo

Abstract:

The COVID-19 pandemic has left an indelible mark on global health, leading to a pressing need to understand the intricate interactions between the virus and the human microbiome. This study focuses on characterizing the nasal microbiota of patients affected by COVID-19, with a specific emphasis on the comparison with unaffected individuals, to shed light on the crucial role of the microbiome in the development of this viral disease. To achieve this objective, Nanopore technology was employed to analyze the full-length bacterial 16S rRNA gene in nasal swabs collected in Malta between January 2021 and August 2022. A comprehensive dataset consisting of 268 samples (126 SARS-negative samples and 142 SARS-positive samples) was subjected to a comparative analysis using an in-house, custom pipeline. The findings from this study revealed that individuals affected by COVID-19 possess a nasal microbiota that is significantly less diverse, as evidenced by lower α diversity, and is characterized by distinct microbial communities compared to unaffected individuals. The beta diversity analyses were carried out at different taxonomic resolutions. At the phylum level, Bacteroidota was found to be more prevalent in SARS-negative samples, suggesting a potential decrease during the course of viral infection. At the species level, the identification of several specific biomarkers further underscores the critical role of the nasal microbiota in COVID-19 pathogenesis. Notably, species such as Finegoldia magna, Moraxella catarrhalis, and others exhibited higher relative abundance in SARS-positive samples, potentially serving as significant indicators of the disease. This study presents valuable insights into the relationship between COVID-19 and the nasal microbiota. The identification of distinct microbial communities and potential biomarkers associated with the disease offers promising avenues for further research and therapeutic interventions aimed at enhancing public health outcomes in the context of COVID-19.
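As an illustration of the alpha-diversity comparison reported above, the sketch below computes the Shannon index for two hypothetical taxon-count vectors; the counts are invented, and the study's 16S analysis pipeline is not reproduced.

```python
# A minimal sketch of the alpha-diversity comparison reported above: the Shannon index
# for two hypothetical taxon-count vectors (the counts are invented, and the study's
# 16S analysis pipeline is not reproduced).
import numpy as np

def shannon(counts):
    """Shannon diversity H' from a vector of taxon counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log(p)))

neg_sample = [120, 80, 60, 40, 30, 20, 10]   # hypothetical SARS-negative swab
pos_sample = [300, 40, 10, 5]                # hypothetical SARS-positive swab
print(shannon(neg_sample), shannon(pos_sample))   # lower value for the positive swab
```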

Keywords: COVID-19, nasal microbiota, nanopore technology, 16S rRNA gene, biomarkers

Procedia PDF Downloads 42
277 Global and Domestic Response to Boko Haram Terrorism in Cameroon, 2014-2018

Authors: David Nchinda Keming

Abstract:

The present study focuses on the national and international collective fight against Boko Haram terrorism in Cameroon and the role played by the Lake Chad Basin Countries (LCBCs) and the global community in stifling the sect’s activities in the region. Although the countries of the Lake Chad Basin include Cameroon, Chad, Nigeria and Niger, others like Benin also joined the cause. The justification for the internationalisation of the fight against Boko Haram lies in the ecological and international climatic importance of Lake Chad and the danger posed by the sect not only to the Lake Chad member countries but also to armed forces, civil servants and the international political economy. The study therefore begins with Cameroon’s reaction to Boko Haram’s terrorist attacks on its territory. It further expounds on Cameroon’s bilateral diplomatic requests to members of the UN Security Council for international collective support to curb the challenging sect. The study relies on the hypothesis that Boko Haram’s advanced terrorism in Cameroon was too challenging for domestic military intelligence, thus forcing the government to seek bilateral and multilateral international collective support to secure its territory from the powerful sect. This premise is tested internationally (multilateral cooperation, bilateral response, regional cooperation) and domestically (solidarity parades, religious discourse, political manifestations, war efforts, the vigilantes and the way forward). To accomplish our study, we made use of mixed research methodologies to interpret the primary, secondary and tertiary sources consulted. Our results reveal that the collective response was effectively positive, as shown by the drastic drop in the sect’s operations in Cameroon and the whole of the LCBCs. Although the sect was incapacitated, terrorism remains an international malaise, and Cameroon hosts fertile ground for terrorist activism. Boko Haram was merely weakened, not completely defeated, and could reappear someday, even under a different appellation. Therefore, to eradicate terrorism in general and Boko Haram in particular, the LCBCs must improve their military intelligence on terrorism and continue to collaborate with countries experienced in fighting terrorism.

Keywords: Boko Haram, terrorism, domestic, international, response

Procedia PDF Downloads 133
276 Prediction of Finned Projectile Aerodynamics Using a Lattice-Boltzmann Method CFD Solution

Authors: Zaki Abiza, Miguel Chavez, David M. Holman, Ruddy Brionnaud

Abstract:

In this paper, the prediction of the aerodynamic behavior of the flow around a finned projectile will be validated using a Computational Fluid Dynamics (CFD) solution, XFlow, based on the Lattice-Boltzmann Method (LBM). XFlow is an innovative CFD software developed by Next Limit Dynamics. It is based on a state-of-the-art Lattice-Boltzmann Method which uses a proprietary particle-based kinetic solver and an LES turbulence model coupled with the generalized law of the wall (WMLES). The Lattice-Boltzmann method discretizes the continuous Boltzmann equation, a transport equation for the particle probability distribution function. From the Boltzmann transport equation, and by means of the Chapman-Enskog expansion, the compressible Navier-Stokes equations can be recovered. However, to simulate compressible flows, this method has a Mach number limitation because of the lattice discretization. Thanks to this flexible particle-based approach, the traditional meshing process is avoided, the discretization stage is strongly accelerated, reducing engineering costs, and computations on complex geometries are affordable in a straightforward way. The projectile that will be used in this work is the Army-Navy Basic Finned Missile (ANF) with a caliber of 0.03 m. The analysis will consist of varying the Mach number from M=0.5, comparing the axial force coefficient, the normal force slope coefficient and the pitch moment slope coefficient of the finned projectile obtained by XFlow with the experimental data. The slope coefficients will be obtained using finite difference techniques in the linear range of the polar curve. The aim of such an analysis is to find the limiting Mach number value beyond which the effects of high fluid compressibility (related to the transonic flow regime) cause the XFlow simulations to differ from the experimental results. This will allow identifying the critical Mach number which limits the validity of the isothermal formulation of XFlow and beyond which a fully compressible solver implementing coupled momentum-energy equations would be required.
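For reference, the single-relaxation-time (BGK) form of the lattice Boltzmann update that this family of solvers discretizes can be written as below; the notation is the generic textbook one, not XFlow's internal formulation.

```latex
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t) - f_i(\mathbf{x}, t)
  = -\frac{\Delta t}{\tau}\Big[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \Big],
\qquad
\rho = \sum_i f_i, \qquad \rho\,\mathbf{u} = \sum_i f_i\,\mathbf{c}_i
```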

Keywords: CFD, computational fluid dynamics, drag, finned projectile, lattice-Boltzmann method, LBM, lift, Mach, pitch

Procedia PDF Downloads 392
275 Radical Scavenging Activity of Protein Extracts from Pulse and Oleaginous Seeds

Authors: Silvia Gastaldello, Maria Grillo, Luca Tassoni, Claudio Maran, Stefano Balbo

Abstract:

Antioxidants are nowadays attractive not only for their countless benefits to human and animal health, but also for the prospect of their use as food preservatives in place of synthetic chemical molecules. In this study, the radical scavenging activity of six protein extracts from pulse and oleaginous seeds was evaluated. The selected matrices are Pisum sativum (yellow pea from two different origins), Carthamus tinctorius (safflower), Helianthus annuus (sunflower), Lupinus luteus cv Mister (lupin) and Glycine max (soybean), since they are economically interesting for both human and animal nutrition. The seeds were ground, and proteins were extracted from 20 mg of powder with a specific vegetal-extraction kit. Proteins were quantified using the Bradford protocol, and scavenging activity was assessed using the DPPH assay, based on the decrease in absorbance of the DPPH radical (2,2-diphenyl-1-picrylhydrazyl) in the presence of antioxidant molecules. Different concentrations of the protein extract (1, 5, 10, 50, 100, 500 µg/ml) were mixed with DPPH solution (DPPH 0.004% in 70% v/v ethanol). Ascorbic acid was used as a scavenging activity standard reference at the same six concentrations as the protein extracts, while DPPH solution was used as control. Samples and standard were prepared in triplicate and incubated for 30 minutes in the dark at room temperature, and the absorbance was read at 517 nm (ABS30). The average and standard deviation of absorbance values were calculated for each concentration of samples and standard. Statistical analysis using Student's t-test and p-values was performed to assess the statistical significance of the difference in scavenging activity between the samples (or standard) and the control (ABSctrl). The percentage of antioxidant activity was calculated using the formula [(ABSctrl-ABS30)/ABSctrl]*100. The obtained results demonstrate that all matrices showed antioxidant activity. Ascorbic acid, used as standard, exhibits 96% scavenging activity at the concentration of 500 µg/ml. Under the same conditions, sunflower, safflower and yellow peas revealed the highest antioxidant performance among the matrices analyzed, with activities of 74%, 68% and 70%, respectively (p < 0.005). Although lupin and soybean exhibit lower antioxidant activity compared to the other matrices, they showed percentages of 46% and 36%, respectively. All these data suggest the possibility of using undervalued edible matrices as antioxidant sources. However, further studies are necessary to investigate a possible synergic effect of several matrices as well as the impact of industrial processes for a large-scale approach.
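The scavenging percentage reported above follows directly from the stated formula; the minimal sketch below simply applies it to hypothetical absorbance readings at 517 nm.

```python
# A minimal sketch applying the scavenging-activity formula stated above,
# [(ABSctrl - ABS30) / ABSctrl] * 100, to hypothetical absorbance readings at 517 nm.
def scavenging_activity(abs_ctrl, abs_30):
    """Percentage radical scavenging activity relative to the DPPH control."""
    return (abs_ctrl - abs_30) / abs_ctrl * 100.0

print(scavenging_activity(abs_ctrl=0.820, abs_30=0.246))   # ~70%, i.e. the yellow-pea range
```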

Keywords: antioxidants, DPPH assay, natural matrices, vegetal proteins

Procedia PDF Downloads 402
274 Substitutional Inference in Poetry: Word Choice Substitutions Craft Multiple Meanings by Inference

Authors: J. Marie Hicks

Abstract:

The art of the poetic conjoins meaning and symbolism with imagery and rhythm. Perhaps the reader might read this opening sentence as 'The art of the poetic combines meaning and symbolism with imagery and rhythm,' which holds a similar message but is not quite the same. The reader understands that these factors are combined in this literary form, but to gain a sense of the conjoining of these factors, the reader is forced to consider that these aspects of poetry are not simply combined, but actually adjoin, abut, skirt, or touch in the poetic form. This alternative word choice is an example of substitutional inference. Poetry is, ostensibly, a literary form where language is used precisely or creatively to evoke specific images or emotions for the reader. Often, the reader can predict a coming rhyme or descriptive word choice in a poem, based on a previous rhyming pattern or earlier imagery in the poem. However, there are instances when the poet uses an unexpected word choice to create multiple meanings and connections. In these cases, the reader is presented with an unusual phrase or image, requiring that they think about what that image is meant to suggest, while their mind also supplies the word they expected, creating a second, overlying image or meaning. This is what is meant by the term 'substitutional inference.' This is different from simply using a double entendre, a word or phrase that has two meanings, often one complementary and the other disparaging, or one that is innocuous and the other suggestive. In substitutional inference, the poet utilizes an unanticipated word that is either visually or phonetically similar to the expected word, provoking the reader to work to understand the poetic phrase as written, while unconsciously incorporating the meaning of the line as anticipated. In other words, by virtue of a word substitution, an inference of the logical word choice is imparted to the reader while they are seeking to rationalize the word that was actually used. There is a substitutional inference of meaning created by the alternate word choice. For example, Louise Bogan, 4th Poet Laureate of the United States, used substitutional inference in the form of homonyms, malapropisms, and other unusual word choices in a number of her poems, lending depth and greater complexity, while actively engaging her readers intellectually with her poetry. Substitutional inference not only adds complexity to the potential interpretations of Bogan’s poetry, as well as the poetry of others, but also provides a method for writers to infuse additional meanings into their work, thus expressing more information in a compact format. Additionally, this nuancing enriches the poetic experience for the reader, who can enjoy the poem superficially as written, or on a deeper level explore gradations of meaning.

Keywords: poetic inference, poetic word play, substitutional inference, word substitution

Procedia PDF Downloads 208
273 Blue Finance: A Systematic Review of the Academic Literature on Investment Streams for Marine Conservation

Authors: David Broussard

Abstract:

This review article delves into the realm of marine conservation finance, addressing the inadequacies in current financial streams from the private sector and the underutilization of existing financing mechanisms. The study emphasizes the emerging field of “blue finance”, which contributes to economic growth, improved livelihoods, and marine ecosystem health. The financial burden of marine conservation projects typically falls on philanthropists and governments, contrary to the polluter-pays principle. However, the private sector’s increasing commitment to NetZero and growing environmental and social responsibility goals prompt the need for alternative funding sources for marine conservation initiatives like marine protected areas. The article explores the potential of utilizing several financing mechanisms, such as carbon credits and other forms of payment for ecosystem services, in the marine context, providing a solution to the lack of private funding for marine conservation. The methodology employed involves a systematic and quantitative approach, combining traditional review methods and elements of meta-analysis. A comprehensive search of the years 2000-2023, using relevant keywords on the Scopus platform, resulted in a review of 252 articles. The temporal evolution of blue finance studies reveals a significant increase in annual articles from 2010 to 2022, with notable peaks in 2011 and 2022. Marine Policy, Ecosystem Services, and Frontiers in Marine Science are prominent journals in this field. While the majority of articles focus on payment for ecosystem services, there is a growing awareness of the need for holistic approaches in conservation finance. Utilizing bibliometric techniques, the article showcases the dominant share of payment for ecosystem services in the literature, with a focus on blue carbon. The classification of articles based on various criteria, including financing mechanisms and conservation types, aids in categorizing and understanding the diversity of research objectives and perspectives in this complex field of marine conservation finance.

Keywords: biodiversity offsets, carbon credits, ecosystem services, impact investment, payment for ecosystem services

Procedia PDF Downloads 54
272 Fast Estimation of Fractional Process Parameters in Rough Financial Models Using Artificial Intelligence

Authors: Dávid Kovács, Bálint Csanády, Dániel Boros, Iván Ivkovic, Lóránt Nagy, Dalma Tóth-Lakits, László Márkus, András Lukács

Abstract:

The practice of modeling financial instruments has seen significant change over the last decade due to the recognition of time-dependent and stochastically changing correlations among market prices, or between prices and market characteristics. To represent this phenomenon, the Stochastic Correlation Process (SCP) has come to the fore in the joint modeling of prices, offering a more nuanced description of their interdependence. This approach has allowed for the attainment of realistic tail dependencies, highlighting that prices tend to synchronize more during intense or volatile trading periods, resulting in stronger correlations. Evidence in the statistical literature suggests that, similarly to volatility, the SCP of certain stock prices follows rough paths, which can be described using fractional differential equations. However, estimating parameters for these equations often involves complex and computation-intensive algorithms, creating a need for alternative solutions. In this regard, the Fractional Ornstein-Uhlenbeck (fOU) process, from the family of fractional processes, offers a promising path. We can effectively describe the rough SCP by utilizing certain transformations of the fOU. We employed neural networks to understand the behavior of these processes. We had to develop a fast algorithm to generate a valid and suitably large sample from the appropriate process to train the network. With an extensive training set, the neural network can estimate the process parameters accurately and efficiently. Although the initial focus was the fOU, the resulting model displayed broader applicability, thus paving the way for further investigation of other processes in the realm of financial mathematics. The utility of the SCP extends beyond its immediate application. It also serves as a springboard for a deeper exploration of fractional processes and for extending existing models that use ordinary Wiener processes to fractional scenarios. In essence, deploying both the SCP and fractional processes in financial models provides new, more accurate ways to depict market dynamics.
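To make the training-data generation step above concrete, the sketch below simulates one rough fractional Ornstein-Uhlenbeck path: fractional Gaussian noise via a Cholesky factorisation of its covariance, then an Euler scheme. The parameter values are illustrative, and the authors' fast generator and neural estimator are not reproduced.

```python
# A minimal sketch (NumPy) of generating one rough fractional Ornstein-Uhlenbeck path
# of the kind used as training data above: fractional Gaussian noise via a Cholesky
# factorisation of its covariance, then an Euler scheme. Parameter values are
# illustrative; the authors' fast sample generator and neural estimator are not
# reproduced here.
import numpy as np

def fgn(n, hurst, dt, rng):
    """Fractional Gaussian noise increments over steps of size dt (Cholesky method)."""
    k = np.arange(n)
    gamma = 0.5 * dt ** (2 * hurst) * (np.abs(k + 1) ** (2 * hurst)
                                       - 2 * np.abs(k) ** (2 * hurst)
                                       + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]      # stationary covariance matrix
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def fou_path(n=500, hurst=0.1, kappa=2.0, theta=0.0, sigma=1.0, dt=0.01, x0=0.0):
    """Euler scheme for dX = kappa*(theta - X) dt + sigma dB_H; rough when H < 0.5."""
    rng = np.random.default_rng(0)
    db = fgn(n, hurst, dt, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + kappa * (theta - x[i]) * dt + sigma * db[i]
    return x

path = fou_path()    # one rough trajectory, e.g. one training sample for the network
```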

Keywords: fractional Ornstein-Uhlenbeck process, fractional stochastic processes, Heston model, neural networks, stochastic correlation, stochastic differential equations, stochastic volatility

Procedia PDF Downloads 87
271 Evaluation and Proposal for Improvement of the Flow Measurement Equipment in the Bellavista Drinking Water System of the City of Azogues

Authors: David Quevedo, Diana Coronel

Abstract:

The present article carries out an evaluation of the drinking water system in the Bellavista sector of the city of Azogues, with the purpose of determining the appropriate equipment to record the actual consumption flows of the inhabitants of said sector. Taking into account that the study area is located in a rural and economically disadvantaged area, there is an urgent need to establish a control system for the consumption of drinking water in order to conserve and manage this vital resource in the best possible way, considering that the water source supplying the sector is approximately 9 km away. The research began with the collection of cartographic, demographic, and statistical data on the sector, determining the coverage area, the population projection, and an endowment that guarantees the supply of drinking water to meet the water needs of the sector's inhabitants. Using hydraulic modeling in the United States Environmental Protection Agency's EPANET 2.0 software for modeling drinking water distribution systems, theoretical hydraulic data were obtained, which were used to design and justify the most suitable measuring equipment for the Bellavista drinking water system. Taking into account a minimum service life of the drinking water system of 30 years, future flow rates were calculated for the design of the macro-measuring device. After analyzing the network, it was evident that the Bellavista sector has an average consumption of 102.87 liters per person per day; but considering that Ecuadorian regulations recommend an endowment of 180 liters per person per day for the geographical conditions of the sector, this value was used for the analysis. With all the collected and calculated information, the conclusion was reached that the Bellavista drinking water system needs a 125 mm electromagnetic macro-measuring device for the first three quinquenniums of its service life and a 150 mm diameter device for the following three quinquenniums. Having equipment that provides real and reliable data will allow for the control of water consumption by the population of the sector, measured through micro-measuring devices installed at the entrance of each household, whose readings should match those of the macro-measuring device placed after the water storage tank outlet, in order to control losses that may occur due to leaks in the drinking water system or illegal connections.
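A back-of-the-envelope sketch of the sizing arithmetic described above is given below: a geometric population projection multiplied by the 180 L/person/day endowment yields the design flows used to select a meter diameter. The current population, growth rate and peaking factors are illustrative assumptions, not the study's figures.

```python
# A back-of-the-envelope sketch of the sizing arithmetic described above: a geometric
# population projection multiplied by the 180 L/person/day endowment gives design flows
# for meter selection. The current population, growth rate and peaking factors below
# are illustrative assumptions, not the study's figures.
def design_flows(pop_now, growth_rate, years, endowment_l_per_day=180.0):
    pop_future = pop_now * (1 + growth_rate) ** years    # projected population
    q_avg = pop_future * endowment_l_per_day / 86_400    # average flow, L/s
    q_max_day = 1.3 * q_avg                              # assumed daily peaking factor
    q_max_hour = 2.0 * q_avg                             # assumed hourly peaking factor
    return q_avg, q_max_day, q_max_hour

q_avg, q_md, q_mh = design_flows(pop_now=3_500, growth_rate=0.015, years=30)
print(f"average {q_avg:.2f} L/s, max daily {q_md:.2f} L/s, max hourly {q_mh:.2f} L/s")
```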

Keywords: macrometer, hydraulics, endowment, water

Procedia PDF Downloads 48
270 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing Algorithm: Consideration of Ecosystem Services Trade-Offs in the Objective Function

Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros

Abstract:

The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors in biodiversity loss and considers green infrastructure one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at guaranteeing the provision of a wide range of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet, few tools are available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs. This can lead to excluding areas with a high potential for providing ecosystem services that have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting green infrastructure multifunctional buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered into groups, so that ecosystem services that create trade-offs with one another are not placed in the same group. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for the groups are then combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually. The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas provide, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services provided by the delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
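A minimal simulated-annealing sketch in the spirit of the delineation step described above is given below: grid cells are toggled in or out of the buffer area, and the objective rewards the combined (trade-off-aware) provision-potential raster while penalising departure from a target area. The raster, weights and cooling schedule are illustrative assumptions, not the authors' algorithm.

```python
# A minimal simulated-annealing sketch: cells are toggled in or out of the buffer area,
# and the objective rewards the combined provision-potential raster while penalising
# departure from a target area. Raster, weights and cooling schedule are illustrative.
import numpy as np

rng = np.random.default_rng(1)
potential = rng.random((50, 50))      # stand-in for the combined ES potential raster
target_cells = 400                    # desired size of the buffer area (in cells)
penalty = 0.5                         # weight of the area constraint

def objective(mask):
    # total potential captured, minus a penalty for deviating from the target area
    return potential[mask].sum() - penalty * abs(int(mask.sum()) - target_cells)

mask = rng.random(potential.shape) < 0.16     # random initial solution
temp, cooling, steps = 1.0, 0.999, 20_000
current = objective(mask)
for _ in range(steps):
    i, j = rng.integers(50), rng.integers(50)
    mask[i, j] = ~mask[i, j]                  # propose flipping one cell
    candidate = objective(mask)
    if candidate >= current or rng.random() < np.exp((candidate - current) / temp):
        current = candidate                   # accept the move (Metropolis criterion)
    else:
        mask[i, j] = ~mask[i, j]              # reject: undo the flip
    temp *= cooling
```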

Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change

Procedia PDF Downloads 146
269 Controlled Digital Lending, Equitable Access to Knowledge and Future Library Services

Authors: Xuan Pang, Alvin L. Lee, Peggy Glatthaar

Abstract:

Libraries across the world have been an innovation engine of creativity and opportunity for many decades. The ongoing global epidemic and the experience of the health crisis illuminate potential reforms and rethinking beyond traditional library operations and services. Controlled Digital Lending (CDL) is one of the emerging technologies libraries have used to deliver information digitally in support of online learning and teaching and to make educational materials more affordable and more accessible. CDL became a popular term in the United States of America (USA) as a result of a white paper authored by Kyle K. Courtney (Harvard University) and David Hansen (Duke University). The paper gave the legal groundwork for exploring CDL: Fair Use, the First Sale Doctrine, and Supreme Court rulings. Library professionals implemented this new technology to fulfill their users’ needs. Three libraries in the state of Florida (University of Florida, Florida Gulf Coast University, and Florida A&M University) started a conversation about how to develop strategies to make CDL work at each institution. This paper shares the stories of piloting and initiating a CDL program to ensure students have reliable, affordable access to the course materials they need to be successful. Additionally, this paper offers an overview of the emerging trends in Controlled Digital Lending in the USA and demonstrates the development of the CDL platforms, policies, and implementation plans. The paper further discusses challenges and lessons learned and how each institution plans to sustain the program in future library services. The fundamental mission of the library is to provide users with unrestricted access to library resources regardless of their physical location, disability, health status, or other circumstances. The professional due diligence of librarians, as information professionals, is to make educational resources more affordable and accessible. CDL opens a new frontier of library services as a mechanism for enhancing users’ experience of library services. Libraries should consider exploring this tool to distribute library resources in an effective and equitable way. This new methodology has potential benefits for libraries and end users.

Keywords: controlled digital lending, emerging technologies, equitable access, collaborations

Procedia PDF Downloads 112
268 DNA Hypomethylating Agents Induced Histone Acetylation Changes in Leukemia

Authors: Sridhar A. Malkaram, Tamer E. Fandy

Abstract:

Purpose: 5-Azacytidine (5AC) and decitabine (DC) are DNA hypomethylating agents. We recently demonstrated that both drugs increase the enzymatic activity of the histone deacetylase enzyme SIRT6. Accordingly, we compared the genome-wide H3K9 acetylation changes induced by both drugs in leukemia cells. Description of Methods & Materials: Mononuclear cells from the bone marrow of six de-identified naive acute myeloid leukemia (AML) patients were cultured with 500 nM of either DC or 5AC for 72 h, followed by ChIP-Seq analysis using a ChIP-validated acetylated-H3K9 (H3K9ac) antibody. ChIP-Seq libraries were prepared from treated and untreated cells using the SMARTer ThruPLEX DNA-seq kit (Takara Bio, USA) according to the manufacturer’s instructions. Libraries were purified and size-selected with AMPure XP beads at a 1:1 (v/v) ratio. All libraries were pooled prior to sequencing on an Illumina HiSeq 1500. The dual-indexed single-read Rapid Run was performed with 1x120 cycles at a 5 pM final concentration of the library pool. Sequence reads with average Phred quality < 20 or length < 35 bp, PCR duplicates, and reads aligning to blacklisted regions of the genome were filtered out using Trim Galore v0.4.4 and cutadapt v1.18. Reads were aligned to the reference human genome (hg38) using Bowtie v2.3.4.1 in end-to-end alignment mode. H3K9ac-enriched (peak) regions were identified using diffReps v1.55.4, with input samples for background correction. The statistical significance of differential peak counts was assessed with a negative binomial test, using all individuals as replicates. Data & Results: The data from the six patients showed significant (Padj < 0.05) acetylation changes at 925 loci after 5AC treatment versus 182 loci after DC treatment. Both drugs induced H3K9 acetylation changes at different chromosomal regions, including promoters, coding exons, introns, and distal intergenic regions. Ten common genes showed H3K9 acetylation changes with both drugs. Approximately 84% of the genes showed an H3K9 acetylation decrease after 5AC versus only 54% after DC. Figures 1 and 2 show the heatmaps for the top 100 genes and the 99 genes showing an H3K9 acetylation decrease after 5AC treatment and DC treatment, respectively. Conclusion: Despite the similarity in hypomethylating activity and chemical structure, the effects of the two drugs on H3K9 acetylation were significantly different. More changes in H3K9 acetylation were observed after 5AC treatment than after DC. The impact of these changes on gene expression and the clinical efficacy of these drugs requires further investigation.
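As a schematic of the negative-binomial comparison mentioned above, the sketch below fits a negative-binomial GLM (statsmodels) to toy H3K9ac peak counts for one region across untreated and treated replicates; the counts, design and fixed dispersion are illustrative assumptions, not the diffReps settings used in the study.

```python
# A schematic negative-binomial comparison of peak counts between conditions.
# The counts, design and fixed dispersion (alpha) are illustrative assumptions,
# not the diffReps settings used in the study.
import numpy as np
import statsmodels.api as sm

counts = np.array([35, 42, 28, 61, 70, 55])     # one peak region: 3 untreated, 3 treated
group = np.array([0, 0, 0, 1, 1, 1])            # 0 = untreated, 1 = 5AC-treated
X = sm.add_constant(group)
fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.1)).fit()
print(fit.params[1], fit.pvalues[1])            # log fold-change and its p-value
```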

Keywords: DNA methylation, leukemia, decitabine, 5-Azacytidine, epigenetics

Procedia PDF Downloads 123
267 Effects of Global Validity of Predictive Cues upon L2 Discourse Comprehension: Evidence from Self-paced Reading

Authors: Binger Lu

Abstract:

It remains unclear whether second language (L2) speakers can use discourse context cues to predict upcoming information as native speakers do during online comprehension. Some researchers propose that L2 learners may have a reduced ability to generate predictions during discourse processing. At the same time, there is evidence that discourse-level cues are weighed more heavily in L2 processing than in L1. Previous studies showed that L1 prediction is sensitive to the global validity of predictive cues. The current study aims to explore whether and to what extent L2 learners can dynamically and strategically adjust their prediction in accord with the global validity of predictive cues in L2 discourse comprehension, as native speakers do. In a self-paced reading experiment, Chinese native speakers (N=128), C-E bilinguals (N=128), and English native speakers (N=128) read high-predictability (e.g., Jimmy felt thirsty after running. He wanted to get some water from the refrigerator.) and low-predictability (e.g., Jimmy felt sick this morning. He wanted to get some water from the refrigerator.) discourses in two-sentence frames. The global validity of predictive cues was manipulated by varying the ratio of predictable (e.g., Bill stood at the door. He opened it with the key.) to unpredictable fillers (e.g., Bill stood at the door. He opened it with the card.), such that across conditions, the predictability of the final word of the fillers ranged from 100% to 0%. The dependent variable was reading time on the critical region (the target word and the following word), analyzed with linear mixed-effects models in R. C-E bilinguals showed reliable prediction across all validity conditions (β = -35.6 ms, SE = 7.74, t = -4.601, p < .001), and Chinese native speakers showed a significant effect (β = -93.5 ms, SE = 7.82, t = -11.956, p < .001) in two of the four validity conditions (namely, the High-validity and MedLow conditions, where fillers ended with predictable words in 100% and 25% of cases, respectively), whereas English native speakers did not predict at all (β = -2.78 ms, SE = 7.60, t = -.365, p = .715). There was neither a main effect (χ²(3) = .256, p = .968) nor an interaction (Predictability × Background × Validity, χ²(3) = 1.229, p = .746; Predictability × Validity, χ²(3) = 2.520, p = .472; Background × Validity, χ²(3) = 1.281, p = .734) of Validity with speaker group. The results suggest that prediction occurs in L2 discourse processing but to a much lesser extent in L1, with a significant effect in some conditions for L1 Chinese and a null effect for L1 English processing, consistent with the view that L2 speakers are more sensitive to discourse cues than L1 speakers. Additionally, the pattern of L1 and L2 predictive processing was not affected by the global validity of predictive cues. C-E bilinguals' predictive processing could be partly transferred from their L1, as prior research has shown that discourse information plays a more significant role in L1 Chinese processing.
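To illustrate the analysis step described above, the sketch below fits a linear mixed-effects model to synthetic reading-time data with statsmodels (Python) rather than the R packages typically used; the data frame, factor levels and by-subject random intercept are simplified, illustrative assumptions.

```python
# A sketch of a mixed-effects analysis of reading times on the critical region.
# The synthetic data frame, factor levels and random-effects structure are
# simplified, illustrative assumptions, not the study's model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
data = pd.DataFrame({
    "rt": rng.normal(350, 40, n),                               # reading time in ms
    "predictability": rng.choice(["high", "low"], n),
    "validity": rng.choice(["high", "medhigh", "medlow", "low"], n),
    "subject": rng.integers(1, 33, n),
})
model = smf.mixedlm("rt ~ predictability * validity", data, groups=data["subject"])
print(model.fit().summary())
```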

Keywords: bilingualism, discourse processing, global validity, prediction, self-paced reading

Procedia PDF Downloads 113
266 Barriers and Opportunities in Apprenticeship Training: How to Complete a Vocational Upper Secondary Qualification with Intermediate Finnish Language Skills

Authors: Inkeri Jaaskelainen

Abstract:

The aim of this study is to shed light on what it is like to study in apprenticeship training using intermediate (or even lower-level) Finnish. The aim is to find out and describe these students' experiences and feelings while acquiring a profession in Finnish, as it is important to understand how adult learners with an immigrant background learn and how their needs could be better taken into account. Many students choose apprenticeships and start vocational training while their Finnish language skills are still very weak. At work, students should be able to learn Finnish and do vocational studies simultaneously in a noisy, demanding, and stressful environment. Learning and understanding new things is very challenging under these circumstances, and sometimes students become exhausted and experience a lot of stress, which makes learning even more difficult. Students differ from each other, and so do their ways of learning. Both duties at work and school assignments require reasonably good general language skills, and, especially at work, language skills are also a safety issue. The empirical target of this study is a group of students with an immigrant background who studied in various fields with intensive L2 support in 2016–2018 and who have by now completed a vocational upper secondary qualification. The interview material for this narrative study was collected from those who completed apprenticeship training in 2019–2020. The data collection methods used are a structured thematic interview, a questionnaire, and observational data. The interviewees have varied cultural and educational backgrounds: some have completed an academic degree in their country of origin, while others learned to read and write only in Finland. The analysis of the material utilizes thematic analysis, which is used to examine learning and related experiences. Learning a language at work is very different from traditional classroom teaching. With evolving language skills, at an intermediate level at best, rushing and stress make it even more difficult to understand and increase the fear of failure. Constant noise, rapidly changing situations, and uncertainty undermine the learning and well-being of apprentices. According to the preliminary results, apprenticeship training is well suited to the needs of an adult immigrant student. In apprenticeship training, students need a lot of support for learning and for understanding a new communication and working culture. Stress can result in, e.g., fatigue, frustration, and difficulties in remembering and understanding. Apprenticeship training can be seen as a good path to working life. However, L2 support is a very important part of apprenticeship training, and it indeed helps students to believe that one day they will graduate and even gain employment in their new country.

Keywords: apprenticeship training, vocational basic degree, Finnish learning, well-being

Procedia PDF Downloads 113
265 Microscale Observations of Gas Cell Wall Rupture in Bread Dough during Baking and Comparison with 2D/3D Finite Element Simulations of Stress Concentration

Authors: Kossigan Bernard Dedey, David Grenier, Tiphaine Lucas

Abstract:

Bread dough is often described as a dispersion of gas cells in a continuous gluten/starch matrix. The final bread crumb structure is strongly related to the rupture of gas cell walls (GCWs) during baking. At the end of proofing and during baking, part of the thinnest GCWs between expanding gas cells is reduced to a gluten film about the size of a starch granule. When this size is reached, gluten and starch granules must be considered as interacting phases in order to account for heterogeneities and appropriately describe GCW rupture. Among the experimental investigations carried out to assess GCW rupture, none has observed GCW rupture under baking conditions at the GCW scale. In addition, attempts to understand GCW rupture numerically are usually not performed at the GCW scale and often treat GCWs as continuous. The most relevant paper that accounted for heterogeneities dealt with gluten/starch interactions and their impact on the mechanical behavior of dough films; however, stress concentration in the GCW was not discussed. In this study, both experimental and numerical approaches were used to better understand GCW rupture in bread dough during baking. Experimentally, a macroscope placed in front of a two-chamber device was used to observe the rupture of a real GCW of 200 micrometers in thickness. Special attention was paid to mimicking baking conditions (temperature, gas pressure, and moisture) as closely as possible. Various pressure differences between the two sides of the GCW were applied, and different modes of fracture initiation and propagation in GCWs were observed. Numerically, the impact of gluten/starch interactions (cohesion or non-cohesion) and of the rheological modulus ratio on the mechanical behavior of a GCW under unidirectional extension was assessed in 2D and 3D. A non-linear viscoelastic and hyperelastic approach was used to capture the finite strains involved in GCWs during baking. Stress concentration within the GCW was identified. The simulated stress concentrations were discussed in light of the GCW failures observed in the device. The gluten/starch granule interactions and the rheological modulus ratio were found to have a strong effect on the stress levels that can be reached in the GCW.
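For illustration only, since the abstract does not name the exact constitutive law used, a compressible neo-Hookean strain-energy density is one standard choice for this kind of finite-strain hyperelastic modelling:

```latex
% Illustrative only: a compressible neo-Hookean strain-energy density, one common
% hyperelastic form for finite-strain modelling; the study's exact constitutive
% law is not specified in the abstract.
W(\mathbf{F}) = \frac{\mu}{2}\left(\bar{I}_1 - 3\right) + \frac{\kappa}{2}\left(J - 1\right)^2,
\qquad J = \det\mathbf{F}, \qquad
\bar{I}_1 = J^{-2/3}\,\operatorname{tr}\!\left(\mathbf{F}^{\mathsf{T}}\mathbf{F}\right)
```

Here μ and κ would stand for the shear and bulk moduli of a given phase (gluten or starch), and the ratio of moduli between the two phases would correspond to the rheological modulus ratio referred to above.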

Keywords: dough, experimental, numerical, rupture

Procedia PDF Downloads 103
264 Saving the Decolonized Subject from Neglected Tropical Diseases: Public Health Campaign and Household-Centred Sanitation in Colonial West Africa, 1900-1960

Authors: Adebisi David Alade

Abstract:

In pre-colonial West Africa, the deadliness of the climate vis-à-vis malaria and other tropical diseases to Europeans turned the region into the “white man’s grave.” Thus, immediately after the partition of Africa in 1885, the mission civilisatrice and mise en valeur not only became a pretext for the establishment of colonial rule; from a medical point of view, the control and possible eradication of disease on the continent also emerged as one of the first concerns of the European colonizers. Though geared toward making Africa exploitable, historical evidence suggests that some colonial Water, Sanitation and Hygiene (WASH) policies and projects reduced certain tropical diseases in some West African communities. Exploring some of these disease control interventions by way of historical revisionism, this paper challenges the orthodox interpretation of colonial sanitation and public health measures in West Africa. This paper critiques the deployment of race and class as analytical tools for the study of colonial WASH projects, an exercise which often reduces the complexity and ambiguity of colonialism to the binary of colonizer and colonized. Since West Africa presently ranks high among regions with Neglected Tropical Diseases (NTDs), it is imperative to decentre colonial racism and economic exploitation in African history in order to give room for Africans to see themselves in other ways. Far from resolving the problem of NTDs by fiat in the region, this study seeks to highlight important blind spots in African colonial history in an attempt to prevent post-colonial African leaders from throwing out the baby with the bathwater. As scholars researching colonial sanitation and public health on the continent rarely examine their complex meaning and content, this paper submits that the outright demonization of colonial rule across space and time continues to build an ideological wall between the present and the past, which not only inhibits fruitful borrowing from the colonial administration of West Africa but also prevents a wide understanding of the challenges of WASH policies and projects in most West African states.

Keywords: colonial rule, disease control, neglected tropical diseases, WASH

Procedia PDF Downloads 163
263 Reading Strategies of Generation X and Y: A Survey on Learners' Skills and Preferences

Authors: Kateriina Rannula, Elle Sõrmus, Siret Piirsalu

Abstract:

The mixed-generation classroom is a phenomenon that higher education establishments face daily while trying to meet the needs of a modern labor market with its emphasis on lifelong learning and retraining. Having representatives of mainly generations X and Y acquiring higher education in one classroom is a challenge for lecturers, considering all the characteristics that distinguish one generation from another. Outlining different strategies and considering students' needs matter because everyone has to acquire the maximum of the provided knowledge, understand each other in order to study together in one classroom, and cooperate successfully in future workplaces. In addition to different generations, there are also learners with different native languages, which has an impact on reading and understanding texts in a third language, including possible translation. The current research aims to investigate, describe and compare reading strategies among representatives of generations X and Y. The hypotheses were that representatives of generations X and Y use different reading strategies, and that these strategies also differ between first- and third-year students of the aforementioned generations. The current study is an empirical, qualitative study. To achieve the aim of the research, relevant literature was analyzed and a semi-structured questionnaire was conducted among the first- and third-year students of Tallinn Health Care College. The questionnaire consisted of 25 statements on text reading strategies, three multiple-choice questions on preferences regarding the design and medium of the text, and three open questions on the translation process when working with a text in the student’s third language. The results of the questionnaire were categorized, analyzed and compared. Both generation X and generation Y described their reading strategies as 'scanning' and 'surfing'. Compared to generation X, first-year generation Y learners valued interactivity and nonlinear texts. Students frequently used the strategies of skimming, scanning, translating and highlighting, together with relevant thinking and assistance-seeking. Meanwhile, third-year generation Y students no longer frequently used translating, resourcing and highlighting, while generation X learners still incorporated these strategies. Knowing about the different needs of the generations currently in classrooms and on the labor market provides us with tools for sustainable education and grants society a workforce that is more flexible and able to move between professions. Future research should investigate the amount of learning and strategy adoption between generations. As for reading, the main suggestions arising from the research are as follows: make a variety of materials available to students; allow them to select what they want to read; and try to make those materials visually attractive, relevant, and appropriately challenging for learners, considering the differences between generations.

Keywords: generation X, generation Y, learning strategies, reading strategies

Procedia PDF Downloads 161
262 Turkey at the End of the Second Decade of the 21st Century: A Secular or Religious Country?

Authors: Francesco Pisano

Abstract:

Islam has been an important element of Turkey’s institutional identity. Since the dawn of the Turkish Republic at the end of the First World War, the new Turkish leadership was compelled to deal with the religious heritage of the Sultanate. Mustafa Kemal Ataturk, Turkey’s first President, led the country through a process of internal change, substantially modifying not merely its democratic stance but also the way politics addressed the Muslim faith. Islam was banned from the public sector of society and drastically marginalized to the merely private sphere of citizens’ lives. Headscarves were banned from institutional buildings together with any other religious practice, while the country proceeded down a path of secularism and Westernization. This is illustrated by the fact that even a newly elected Prime Minister, Recep Tayyip Erdoğan, was initially barred from taking office because of allegations that he had read a religious text while campaigning. Over the years, thanks to this initial internal shift, Turkey has often been seen by Western partners as one of the few countries that managed to find a perfect balance between a democratic stance and an inherently Islamic nature. In the early 2000s, this led many academics to believe that Ankara could eventually become the next European capital. Since then, the internal and external landscape of Turkey has drastically changed. Today, religion has once again become an important point of reference for Turkish politics, considering also the failure of the European negotiations and the country’s increasingly unstable external environment. This paper addresses this issue by looking at the important role religion has played in Turkish society and the way it has been politicized since the early years of the Republic. It moves from a more theoretical debate on secularism and the path of political Westernization of Turkey under Ataturk’s rule to a more practical analysis of today’s situation, passing through the failure of Ankara’s accession to the EU and the current tense political relations with its traditional NATO allies. The final objective of this research, therefore, is not to offer a definitive opinion on Turkey’s current international stance; that is left entirely to the personal consideration of the reader. Rather, it supplements the existing literature with a comprehensive and more structured analysis of the role Islam has played in Turkish politics from the early 1920s up until the domestic political revolution of the early 2000s, after the first electoral win of the Justice and Development Party (AKP).

Keywords: democracy, Islam, Mustafa Kemal Atatürk, Recep Tayyip Erdoğan, Turkey

Procedia PDF Downloads 182
261 Re-Examining the Distinction between Odour Nuisance and Health Impact: A Community’s Campaign against Landfill Gas Exposure in Shongweni, South Africa

Authors: Colin David La Grange, Lisa Frost Ramsay

Abstract:

Hydrogen sulphide (H2S) is a minor component of landfill gas, but significant for its distinctive odour and its association with landfill-related community complaints. The World Health Organisation (WHO) provides two guidelines for H2S: a health guideline of 150 µg/m3 as a 24-hour average, and a nuisance guideline of 7 µg/m3 as a 30-minute average. Although this is a practical distinction for impact assessment, this paper highlights the danger of the apparent dualism between nuisance and health impact, particularly when it is used to dismiss community concerns about perceived health impacts at low concentrations of H2S, as in the case of a community battle against the impacts of a landfill in Shongweni, KwaZulu-Natal, South Africa. Here, community members reported, using a community-developed mobile phone application, a range of health symptoms that coincided with, or occurred subsequent to, odour events and localised H2S peaks. Local doctors also documented increased visits for symptoms of respiratory distress, eye and skin irritation, and stress after such odour events. Objectively measured H2S and other pollutant concentrations during these events, however, remained below the WHO health guidelines. This case study highlights the importance of the physiological link between the experience of environmental nuisance and overall health and wellbeing, showing these to be less distinct than the WHO guidelines would suggest. The potential mechanisms by which an odorous plume, with key constituents at concentrations below traditional health thresholds, can affect psychologically and/or physiologically sensitised individuals are described. In the case of psychological sensitisation, previously documented mechanisms such as aversive conditioning and odour-triggered panic are relevant. Physiological sensitisation to environmental pollutants, evident as a seemingly disproportionate physical (allergy-type) response to either low concentrations or a short-duration exposure to a toxin or toxins, has been extensively examined but is still not well understood. The links between a heightened sensitivity to toxic compounds, the accumulation of some compounds in the body, and a pre-existing or associated immunological stress disorder are presented as a possible explanation.
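As a small illustration of how the two guideline values cited above differ in practice, the sketch below (not the study's analysis pipeline) computes 30-minute and 24-hour rolling means from a synthetic H2S time series and flags exceedances; the sampling interval and values are hypothetical.

```python
# Illustrative sketch only: compare a (synthetic) H2S record against the two
# WHO guideline values quoted above - 150 ug/m3 as a 24-hour average and
# 7 ug/m3 as a 30-minute average. Measurement values are hypothetical.
import numpy as np
import pandas as pd

# Synthetic 1-minute H2S record (ug/m3) over two days, with a short odour peak.
idx = pd.date_range("2020-01-01", periods=2 * 24 * 60, freq="min")
h2s = pd.Series(np.abs(np.random.default_rng(0).normal(2.0, 1.0, len(idx))),
                index=idx)
h2s.iloc[600:640] += 25.0  # a 40-minute localised peak

mean_30min = h2s.rolling(pd.Timedelta(minutes=30)).mean()
mean_24h = h2s.rolling(pd.Timedelta(hours=24)).mean()

nuisance_exceeded = (mean_30min > 7).any()   # WHO nuisance guideline
health_exceeded = (mean_24h > 150).any()     # WHO health guideline
print(f"30-min nuisance guideline exceeded: {nuisance_exceeded}")
print(f"24-h health guideline exceeded: {health_exceeded}")
```

A short odour peak can exceed the 30-minute nuisance guideline while leaving the 24-hour health average untouched, which is exactly the gap between perceived impact and measured "health" exceedance discussed above.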

Keywords: immunological stress disorder, landfill odour, odour nuisance, odour sensitisation, toxin accumulation

Procedia PDF Downloads 102
260 The Routine Use of a Negative Pressure Incision Management System in Vascular Surgery: A Case Series

Authors: Hansraj Bookun, Angela Tan, Rachel Xuan, Linheng Zhao, Kejia Wang, Animesh Singla, David Kim, Christopher Loupos

Abstract:

Introduction: Incisional wound complications in vascular surgery patients represent a significant clinical and economic burden of morbidity and mortality. The objective of this study was to trial the feasibility of applying the Prevena negative pressure incision management system as a routine dressing in patients who had undergone arterial surgery. Conventionally, Prevena has been applied to groin incisions, but this study features applications to multiple wound sites such as the thigh or major amputation stumps. Method: This was a cross-sectional, observational, single-centre case series of 12 patients who had undergone major vascular surgery. Their wounds were managed with the Prevena system, applied either intra-operatively or on the first post-operative day. Demographic and operative details were collated, as were length of stay and complication rates. Results: There were 9 males (75%), with a mean age of 66 years, and the comorbid burden was as follows: ischaemic heart disease (92%), diabetes (42%), hypertension (100%), stage 4 or greater kidney impairment (17%), and current or ex-smoking (83%). The main indications were acute ischaemia (33%), claudication (25%), and gangrene (17%). There were single instances of an occluded popliteal artery aneurysm, diabetic foot infection, and rest pain. Half of the patients (50%) had hybrid operations with iliofemoral endarterectomies, patch arterioplasties, and further peripheral endovascular treatment. There were 4 complex arterial bypass operations and 2 major amputations. The mean length of stay was 17 ± 10 days, with a range of 4 to 35 days. A single complication, in the form of a lymphocoele, was encountered in the context of an iliofemoral endarterectomy and patch arterioplasty; this was managed conservatively. There were no deaths. Discussion: This case series suggests that, in conjunction with safe vascular surgery, the Prevena wound management system is associated with low absolute wound complication rates and remains a valuable adjunct in the treatment of vascular patients.

Keywords: wound care, negative pressure, vascular surgery, closed incision

Procedia PDF Downloads 111
259 Metagenomic Assessment of the Effects of Genetically Modified Crops on Microbial Ecology and Physicochemical Properties of Soil

Authors: Falana Yetunde Olaitan, Ijah U. J. J, Solebo Shakirat O.

Abstract:

Genetically modified (GM) crops are already phenomenally successful and are grown worldwide in more than eighteen countries on more than 67 million hectares. In October 2018, Nigeria approved Bacillus thuringiensis (Bt) cotton and maize, hence the need to carry out environmental risk assessment studies. A total of fifteen 4 L octagonal ceramic pots were filled with 4 kg of soil each and placed on the bench in the screen house in 2 rows of 10 pots each and a 3rd row of 5 pots: the 1st-row pots were used to plant GM cotton seeds, the 2nd-row pots were used for non-GM cotton seeds, and the 3rd row of 5 pots served as the control. Soil samples for metagenomic DNA extraction were collected at random and at monthly intervals after planting, at a distance of 2 mm from the plant's root and at a depth of 10 cm, using a sterile spatula. Soil samples for physicochemical analysis were collected before planting and after harvesting the GM and non-GM crops, as well as from the control soil. The DNA was extracted, quantified, and sequenced; sample 1A (DNA from GM cotton soil at the 1st interval) gave the lowest number of sequence reads (0.853 M), while sample 2B (DNA from GM cotton soil at the 2nd interval) gave the highest (5.785 M); the others gave between 1.8 M and 4.7 M. The sample treatments were grouped into four: Group 1 (GM cotton soil from intervals 1 to 3) had between 800,000 and 5,700,000 strains of microbes (SOM), Group 2 (non-GM cotton soil from intervals 1 to 3) had between 1,400,600 and 4,200,000 SOM, Group 3 (control soil) had between 900,000 and 3,600,000 SOM, and Group 4 (initial soil) had between 3,700,000 and 4,000,000 SOM. The microbes observed were predominantly bacteria and archaea, fungi, and microbial dark matter, alongside protists and phages. The predominant bacterial and archaeal groups were the Terrabacteria (Bacillus funiculus, Bacillus sp.), the Proteobacteria (Microvirga massiliensis, Sphingomonas sp.), and the Archaea (Nitrososphaera sp.), while the fungi were Aspergillus fischeri and Fusarium falciforme. The comparative analysis between groups was done using a Jaccard PERMANOVA beta diversity analysis at a P-value of not more than 0.76, and no significant pair was found. The pH values for the initial, GM cotton, non-GM cotton, and control soils were 6.28, 6.26, 7.25, and 8.26, the percentage moisture was 0.63, 0.78, 0.89, and 0.82, and the percentage nitrogen was 17.79, 1.14, 1.10, and 0.56, respectively. Other parameters included varying concentrations of potassium (0.46, 1,284.47, 1,785.48, and 1,252.83 mg/kg) and phosphorus (18.76, 17.76, 16.87, and 15.23 mg/kg) recorded for the four treatments, respectively. The soil consisted mainly of silt (32.09 to 34.66%) and clay (58.89 to 60.23%), reflecting a silty-clay soil texture. The results were also tested with ANOVA at a P-value threshold of 0.05, and again no pair was found to be significant. The results suggest that the GM crops have no significant effect on the microbial ecology or physicochemical properties of the soil and, in turn, no direct or indirect effects on human health.
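For illustration, the sketch below shows the general shape of a Jaccard-based PERMANOVA comparison between treatment groups like the one reported above; it is not the authors' pipeline, and the presence/absence profiles, group labels, and permutation count are synthetic placeholders.

```python
# Illustrative sketch (not the authors' pipeline) of a Jaccard-based PERMANOVA
# comparison between soil treatment groups. Taxon presence/absence profiles
# and group labels below are synthetic placeholders.
import numpy as np
from scipy.spatial.distance import pdist, squareform

def permanova(dist, groups, n_perm=999, seed=0):
    """One-way PERMANOVA (pseudo-F with label permutations) on a square distance matrix."""
    rng = np.random.default_rng(seed)
    groups = np.asarray(groups)
    n = len(groups)
    labels = np.unique(groups)
    a = len(labels)
    d2 = dist ** 2

    def pseudo_f(g):
        ss_total = d2[np.triu_indices(n, k=1)].sum() / n
        ss_within = 0.0
        for lab in labels:
            idx = np.where(g == lab)[0]
            sub = d2[np.ix_(idx, idx)]
            ss_within += sub[np.triu_indices(len(idx), k=1)].sum() / len(idx)
        ss_among = ss_total - ss_within
        return (ss_among / (a - 1)) / (ss_within / (n - a))

    f_obs = pseudo_f(groups)
    perm_f = np.array([pseudo_f(rng.permutation(groups)) for _ in range(n_perm)])
    p = (np.sum(perm_f >= f_obs) + 1) / (n_perm + 1)
    return f_obs, p

# Synthetic presence/absence table: rows = soil samples, columns = taxa.
rng = np.random.default_rng(1)
profiles = rng.integers(0, 2, size=(16, 200)).astype(bool)
groups = ["GM", "nonGM", "control", "initial"] * 4

dist = squareform(pdist(profiles, metric="jaccard"))  # Jaccard distances
f, p = permanova(dist, groups)
print(f"pseudo-F = {f:.3f}, p = {p:.3f}")
```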

Keywords: genetically modified crop, microbial ecology, physicochemical properties, metagenomics, DNA, soil

Procedia PDF Downloads 125
258 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional planar imagery. The data can be interpolated into a ground-plane projection, with or without terrain as a component, to better view SAR data in an image domain comparable to what a human would see and to ease interpretation. An alternative but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase-history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for an accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
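As an illustration of the per-point summation described above, the following is a schematic, CPU-only sketch of time-domain backprojection of range-compressed pulses onto arbitrary 3D reference points; it is not the authors' GPU implementation, and all array shapes, the wavelength, and the range sampling are hypothetical. On a GPU, the per-voxel accumulation maps naturally to one thread per 3D point.

```python
# Schematic, CPU-only sketch (not the authors' implementation) of time-domain
# backprojection of range-compressed SAR pulses onto arbitrary 3D reference
# points (e.g., DEM posts or a point cloud). All values are placeholders.
import numpy as np

def backproject(pulses, antenna_pos, r0, dr, points, wavelength):
    """
    pulses:      (n_pulses, n_bins) complex range-compressed data
    antenna_pos: (n_pulses, 3) antenna phase-centre position per pulse
    r0, dr:      range of the first bin and range-bin spacing (metres)
    points:      (n_points, 3) 3D reference points
    wavelength:  radar wavelength (metres)
    Returns a complex reflectivity estimate per 3D point.
    """
    n_pulses, n_bins = pulses.shape
    image = np.zeros(len(points), dtype=complex)
    for p in range(n_pulses):
        # Distance from this pulse's antenna position to every reference point.
        r = np.linalg.norm(points - antenna_pos[p], axis=1)
        # Fractional range-bin index and linear interpolation of the
        # range-compressed profile at that delay.
        idx = (r - r0) / dr
        i0 = np.clip(np.floor(idx).astype(int), 0, n_bins - 2)
        frac = np.clip(idx - i0, 0.0, 1.0)
        samples = (1 - frac) * pulses[p, i0] + frac * pulses[p, i0 + 1]
        # Remove the two-way propagation phase before coherent summation.
        image += samples * np.exp(1j * 4 * np.pi * r / wavelength)
    return image

# Tiny synthetic usage example.
rng = np.random.default_rng(0)
pulses = rng.standard_normal((64, 512)) + 1j * rng.standard_normal((64, 512))
antenna_pos = np.column_stack([np.linspace(-50, 50, 64),
                               np.full(64, -1000.0), np.full(64, 500.0)])
points = rng.uniform(-10, 10, size=(1000, 3))   # e.g., DEM posts near the origin
refl = backproject(pulses, antenna_pos, r0=1000.0, dr=0.5,
                   points=points, wavelength=0.03)
print(refl.shape)
```

Because each point's sum over pulses is independent of every other point, the outer accumulation can be distributed across GPU threads with no inter-thread communication, which is the "embarrassingly parallel" property noted above.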

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 49
257 Developing Social Responsibility Values in Nascent Entrepreneurs through Role-Play: An Explorative Study of University Students in the United Kingdom

Authors: David W. Taylor, Fernando Lourenço, Carolyn Branston, Paul Tucker

Abstract:

An increasing number of students at universities in the United Kingdom are engaging in entrepreneurship role-play to explore business start-up as a career alternative to employment. These role-play activities have been shown to have a positive influence on students' entrepreneurial intentions. Universities also play a role in developing graduates' awareness of social responsibility. However, social responsibility is often missing from these entrepreneurship role-plays. It is important that these role-play activities include the development of values that support social responsibility, in line with the values of those running hybrid, humane and sustainable enterprises, rather than focusing simply on profit. The Young Enterprise (YE) Start-Up programme is an example of a role-play activity that is gaining in popularity amongst United Kingdom universities seeking ways to give students insight into business start-up. A post-92 university in the North-West of England has adapted the traditional YE Directorship roles (e.g., Marketing Director, Sales Director) by including a Corporate Social Responsibility (CSR) Director in all of the team-based YE Start-Up businesses. The aim of introducing this Directorship was to observe whether such a role would help create a more socially responsible value system within each company and, in turn, shape business decisions. This paper investigates role-play as a tool to help enterprise educators develop socially responsible attitudes and values in nascent entrepreneurs. A mixed qualitative methodology has been used, which includes interviews, role-play, and reflection, to help students develop positive value characteristics through the exploration of unethical and selfish behaviors. The initial findings indicate that role-play helped CSR Directors learn and gain insights into the importance of corporate social responsibility, influenced the values and actions of their YE Start-Ups, and increased the likelihood that, if the participants were to launch a business post-graduation, the intent would be for the business to be socially responsible. These findings help inform educators on how to develop socially responsible nascent entrepreneurs within a traditionally profit-orientated business model.

Keywords: student entrepreneurship, young enterprise, social responsibility, role-play, values

Procedia PDF Downloads 127