Search results for: short columns
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3250

2350 On the Influence of the Covid-19 Pandemic on Tunisian Stock Market: By Sector Analysis

Authors: Nadia Sghaier

Abstract:

In this paper, we examine the influence of the COVID-19 pandemic on the performance of the Tunisian stock market and 12 sectors over a recent period from 23 March 2020 to 18 August 2021, covering several waves and the introduction of vaccination. The empirical study is conducted using cointegration techniques, which allow for both long- and short-run relationships. The results indicate that daily growth in both confirmed cases and deaths has a negative and significant effect on stock market returns. This effect differs across sectors and appears more pronounced in the financial, consumer goods, and industrials sectors. These findings have important implications for investors seeking to predict the behavior of stock market or sector returns and to implement hedging strategies during the COVID-19 pandemic.

Keywords: Tunisian stock market, sectors, COVID-19 pandemic, cointegration techniques

Procedia PDF Downloads 185
2349 A Sustainable and Low-Cost Filter to Treat Pesticides in Water

Authors: T. Abbas, J. McEvoy, E. Khan

Abstract:

Pesticide contamination of water supplies is a common environmental problem in rural agricultural communities. Advanced water treatment processes such as membrane filtration and adsorption on activated carbon only remove pesticides from water without degrading them into less toxic or more easily degradable compounds, leaving behind contaminated brine and spent activated carbon that need to be managed. Rural communities, which normally cannot afford expensive water treatment technologies, need an economical and sustainable filter that not only removes pesticides from water but also degrades them into benign products. In this study, iron turning waste was tested as a potential point-of-use filtration medium for the removal/degradation of a mixture of six chlorinated pesticides (lindane, heptachlor, endosulfan, dieldrin, endrin, and DDT) in water. As a common and traditional medium for water filtration, sand was also tested alongside the iron turning waste. The iron turning waste was characterized using scanning electron microscopy and an energy dispersive X-ray analyzer. Four glass columns with different filter media layer configurations were set up: (1) sand only, (2) iron turning only, (3) sand and iron turning (two separate layers), and (4) sand, iron turning, and sand (three separate layers). The initial pesticide concentration and flow rate were 2 μg/L and 10 mL/min. Results indicate that sand filtration was effective only for the removal of DDT (100%) and endosulfan (94-96%). The iron turning column effectively removed endosulfan, endrin, and dieldrin (85-95%), whereas lindane and DDT removal were 79-85% and 39-56%, respectively. The removal efficiencies for heptachlor, endosulfan, endrin, dieldrin, and DDT were 90-100% when sand and iron turning waste (two separate layers) were used. However, even better removal efficiencies (93-100%) for five out of six pesticides were achieved when sand, iron turning, and sand (three separate layers) were used as the filtration media.
Moreover, the effects of water pH, amounts of media, and minerals present in water, such as magnesium, sodium, calcium, and nitrate, on the removal of pesticides were examined. Results demonstrate that iron turning waste efficiently removed all the pesticides under the studied parameters. It also completely dechlorinated all the pesticides studied, and based on the detection of by-products, degradation mechanisms for all six pesticides were proposed.

Keywords: pesticide contamination, rural communities, iron turning waste, filtration

Procedia PDF Downloads 239
2348 Molecular Biomonitoring of Bacterial Pathogens in Wastewater

Authors: Desouky Abd El Haleem, Sahar Zaki

Abstract:

This work was conducted to develop a one-step multiplex PCR system for rapid, sensitive, and specific detection of three different bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, and Salmonella spp., directly in wastewater without prior isolation on selective media. As a molecular confirmatory test after isolation of the pathogens by classical microbiological methods, PCR-RFLP of their amplified 16S rDNA genes was performed. The developed protocols proved able to detect the three pathogens directly in water sensitively, rapidly, and specifically within a short time, representing a considerable advancement over more time-consuming and less sensitive methods for the identification and characterization of these kinds of pathogens.

Keywords: multiplex PCR, bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, Salmonella spp.

Procedia PDF Downloads 430
2347 Analysis of Teachers' Self Efficacy in Terms of Emotional Intelligence

Authors: Ercan Yilmaz, Ali Murat Sünbül

Abstract:

The aim of the study is to investigate teachers’ self-efficacy with regard to their emotional intelligence. The relational model was used in the study. The participants included 194 teachers from secondary schools in Konya, Turkey. To assess teachers’ emotional intelligence, the “Trait Emotional Intelligence Questionnaire-Short Form” was administered. For teachers’ self-efficacy, the “Teachers’ Sense of Self-Efficacy Scale” was used. As a result of the study, a significant relationship was found between teachers’ sense of self-efficacy and their emotional intelligence. Teachers’ emotional intelligence explains approximately eighteen percent of the variance in the dimension of self-efficacy for student involvement. About nineteen percent of the variance in the dimension of self-efficacy for teaching strategies is explained by emotional intelligence. Teachers’ emotional intelligence also explains about seventeen percent of the variance in self-efficacy for classroom management.

Keywords: teachers, self-efficacy, emotional intelligence, education

Procedia PDF Downloads 434
2346 Recycling Carbon Fibers/Epoxy Composites Wastes in Building Materials Based on Geopolymer Binders

Authors: A. Saccani, I. Lancellotti, E. Bursi

Abstract:

Scraps deriving from the production of epoxy-carbon fiber composites have been recycled as a reinforcement to produce building materials. Short chopped fibers (5-7 mm in length) have been added at low volume content (max 10%) to produce mortars. The microstructure, mechanical properties (mainly flexural strength), and dimensional stability of the derived materials have been investigated. Two different types of matrix have been used: one based on conventional Portland cement and the other containing geopolymers formed from activated metakaolin and fly ashes. In the second case, the material is almost completely made of recycled ingredients. This is an attempt to produce reliable materials while solving waste disposal problems. The first collected results are promising.

Keywords: building materials, carbon fibres, fly ashes, geopolymers

Procedia PDF Downloads 141
2345 Control the Flow of Big Data

Authors: Shizra Waris, Saleem Akhtar

Abstract:

Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have grown rapidly within a short period of time. Consequently, this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. This paper presents a complete discussion of state-of-the-art big data technologies based on group and stream data processing, and the strengths and weaknesses of these technologies are analyzed. The study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies with respect to important limitations are also investigated. Emerging technologies are suggested as a solution for big data problems.

Keywords: computer, it community, industry, big data

Procedia PDF Downloads 175
2344 Noninvasive Technique for Measurement of Heartbeat in Zebrafish Embryos Exposed to Electromagnetic Fields at 27 GHz

Authors: Sara Ignoto, Elena M. Scalisi, Carmen Sica, Martina Contino, Greta Ferruggia, Antonio Salvaggio, Santi C. Pavone, Gino Sorbello, Loreto Di Donato, Roberta Pecoraro, Maria V. Brundo

Abstract:

The new fifth-generation technology (5G), which should favor high data-rate connections (1 Gbps) and latency times lower than the current ones (<1 ms), has the characteristic of working on different frequency bands of the radio wave spectrum (700 MHz, 3.6-3.8 GHz, and 26.5-27.5 GHz), thus also exploiting higher frequencies than previous mobile radio generations (1G-4G). Higher-frequency waves, however, have a lower capacity to propagate in free space, and therefore, in order to guarantee capillary coverage of the territory for high-reliability applications, it will be necessary to install a large number of repeaters. Following the introduction of this new technology, there has been growing concern in recent years about possible harmful effects on human health, and several studies using various animal models have been published. This study aimed to observe the possible short-term effects induced by 5G millimeter waves on the heartbeat of early life stages of Danio rerio using DanioScope software (Noldus). DanioScope is a complete toolbox for measurements on zebrafish embryos and larvae. The effect of substances can be measured on the developing zebrafish embryo through a range of parameters: the earliest activity of the embryo’s tail, activity of the developing heart, speed of blood flowing through the vein, and lengths and diameters of body parts. Activity measurements, cardiovascular data, blood flow data, and morphometric parameters can be combined in one single tool. The data obtained are processed and provided by the software both numerically and graphically. The experiments were performed at 27 GHz using a non-commercial high-gain pyramidal horn antenna. According to OECD guidelines, exposure to 5G millimeter waves was tested by the fish embryo toxicity test within 96 hours post fertilization. Observations were recorded every 24 h until the end of the short-term test (96 h).
The results showed an increase in heartbeat rate in exposed embryos at 48 hpf compared to the control group, but this increase was not observed at 72-96 hpf. Nowadays, there is scant literature data on this topic, so these results could be useful for approaching new studies and also for evaluating the potential cardiotoxic effects of mobile radiofrequency.

Keywords: Danio rerio, DanioScope, cardiotoxicity, millimeter waves

Procedia PDF Downloads 144
2343 Rapid Detection System of Airborne Pathogens

Authors: Shigenori Togashi, Kei Takenaka

Abstract:

We developed new processes that can rapidly collect and detect airborne pathogens, such as the avian flu virus, for pandemic prevention. The fluorescence antibody technique is known as one of the most sensitive detection methods for viruses, but it needs up to a few hours to bind sufficient fluorescent dye to the viruses for detection. In this paper, we developed a mist-labeling process that can detect substitution viruses in a short time by improving the binding rate between fluorescent dyes and substitution viruses through a microreaction process. Moreover, we developed a rapid detection system based on this mist labeling. The detection system, equipped with a sampling bag for collecting the patient’s breath and a cartridge, can automatically detect pathogens within 10 minutes.

Keywords: viruses, sampler, mist, detection, fluorescent dyes, microreaction

Procedia PDF Downloads 454
2342 PsyVBot: Chatbot for Accurate Depression Diagnosis using Long Short-Term Memory and NLP

Authors: Thaveesha Dheerasekera, Dileeka Sandamali Alwis

Abstract:

The escalating prevalence of mental health issues, such as depression and suicidal ideation, is a matter of significant global concern. It is plausible that a variety of factors, such as life events, social isolation, and preexisting physiological or psychological health conditions, could instigate or exacerbate these conditions. Traditional approaches to diagnosing depression entail a considerable amount of time and necessitate the involvement of adept practitioners. This underscores the necessity for automated systems capable of promptly detecting and diagnosing symptoms of depression. The PsyVBot system employs sophisticated natural language processing and machine learning methodologies, including the use of the NLTK toolkit for dataset preprocessing and the utilization of a Long Short-Term Memory (LSTM) model. The PsyVBot exhibits a remarkable ability to diagnose depression with a 94% accuracy rate through the analysis of user input. Consequently, this resource proves to be efficacious for individuals, particularly those enrolled in academic institutions, who may encounter challenges pertaining to their psychological well-being. The PsyVBot employs a Long Short-Term Memory (LSTM) model that comprises a total of three layers, namely an embedding layer, an LSTM layer, and a dense layer. The stratification of these layers facilitates a precise examination of linguistic patterns that are associated with the condition of depression. The PsyVBot has the capability to accurately assess an individual's level of depression through the identification of linguistic and contextual cues. The task is achieved via a rigorous training regimen, which is executed by utilizing a dataset comprising information sourced from the subreddit r/SuicideWatch. The diverse data present in the dataset ensures precise and delicate identification of symptoms linked with depression, thereby guaranteeing accuracy. 
PsyVBot not only possesses diagnostic capabilities but also enhances the user experience through the utilization of audio outputs. This feature enables users to engage in more captivating and interactive interactions. The PsyVBot platform offers individuals the opportunity to conveniently diagnose mental health challenges through a confidential and user-friendly interface. Regarding the advancement of PsyVBot, maintaining user confidentiality and upholding ethical principles are of paramount significance. It is imperative to note that diligent efforts are undertaken to adhere to ethical standards, thereby safeguarding the confidentiality of user information and ensuring its security. Moreover, the chatbot fosters a conducive atmosphere that is supportive and compassionate, thereby promoting psychological welfare. In brief, PsyVBot is an automated conversational agent that utilizes an LSTM model to assess the level of depression in accordance with the input provided by the user. The demonstrated accuracy rate of 94% serves as a promising indication of the potential efficacy of employing natural language processing and machine learning techniques in tackling challenges associated with mental health. The reliability of PsyVBot is further improved by the fact that it makes use of the Reddit dataset and incorporates Natural Language Toolkit (NLTK) for preprocessing. PsyVBot represents a pioneering and user-centric solution that furnishes an easily accessible and confidential medium for seeking assistance. The present platform is offered as a modality to tackle the pervasive issue of depression and the contemplation of suicide.
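The three-layer model described above (embedding layer, LSTM layer, dense layer) can be sketched in Keras; the vocabulary size, sequence length, and unit counts below are illustrative assumptions, not the authors' actual settings:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Minimal sketch of an embedding -> LSTM -> dense binary classifier.
# VOCAB_SIZE and SEQ_LEN are hypothetical values for illustration.
VOCAB_SIZE, SEQ_LEN = 10_000, 100

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),      # token ids -> dense vectors
    layers.LSTM(64),                        # encodes the token sequence
    layers.Dense(1, activation="sigmoid"),  # probability of depression
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# A dummy batch of two tokenized messages, just to show the shapes
probs = model(tf.zeros((2, SEQ_LEN), dtype=tf.int32))
print(probs.shape)
```

Training against labeled text (e.g., preprocessed posts, as the abstract describes for the r/SuicideWatch data) would then proceed with `model.fit` on integer-encoded sequences.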

Keywords: chatbot, depression diagnosis, LSTM model, natural language processing

Procedia PDF Downloads 45
2341 Bringing the World to Net Zero Carbon Dioxide by Sequestering Biomass Carbon

Authors: Jeffrey A. Amelse

Abstract:

Many corporations aspire to become Net Zero Carbon Dioxide by 2035-2050. This paper examines what it will take to achieve those goals. Achieving Net Zero CO₂ requires an understanding of where energy is produced and consumed, the magnitude of CO₂ generation, and a proper understanding of the Carbon Cycle. The latter leads to the distinction between CO₂ and biomass carbon sequestration. Short reviews are provided for prior technologies proposed for reducing CO₂ emissions from fossil fuels or substituting renewable energy, to focus on their limitations and to show that none offers a complete solution. Of these, CO₂ sequestration is poised to have the largest impact. It will simply cost money, scale-up is a huge challenge, and it will not be a complete solution. CO₂ sequestration is still at the demonstration and semi-commercial scale. Transportation accounts for only about 30% of total U.S. energy demand, and renewables account for only a small fraction of that sector. Yet bioethanol production consumes 40% of the U.S. corn crop, and biodiesel consumes 30% of the U.S. soybean crop. It is unrealistic to believe that biofuels can completely displace fossil fuels in the transportation market. Bioethanol is traced through its Carbon Cycle and shown to be both an energy-inefficient and a carbon-inefficient use of biomass. Both biofuels and CO₂ sequestration reduce future CO₂ emissions from continued use of fossil fuels; they will not remove CO₂ already in the atmosphere. Planting more trees has been proposed as a way to reduce atmospheric CO₂, but trees are a temporary solution. When they complete their Carbon Cycle, they die and release their carbon as CO₂ to the atmosphere. Thus, planting more trees is just 'kicking the can down the road.' The only way to permanently remove CO₂ already in the atmosphere is to break the Carbon Cycle by growing biomass from atmospheric CO₂ and sequestering biomass carbon. Sequestering tree leaves is proposed as a solution.
Unlike wood, leaves have a short Carbon Cycle time constant: they renew and decompose every year. Allometric equations from the USDA indicate that, theoretically, sequestering only a fraction of the world’s tree leaves can get the world to Net Zero CO₂ without disturbing the underlying forests. How can tree leaves be permanently sequestered? It may be as simple as rethinking how landfills are designed, to discourage instead of encourage decomposition. In traditional landfills, municipal waste undergoes rapid initial aerobic decomposition to CO₂, followed by slow anaerobic decomposition to methane and CO₂. The latter can take hundreds to thousands of years. The first step in anaerobic decomposition is hydrolysis of cellulose to release sugars, which those who have worked on cellulosic ethanol know is challenging for a number of reasons. The key to permanent leaf sequestration may be keeping the landfills dry and exploiting known inhibitors for anaerobic bacteria.

Keywords: carbon dioxide, net zero, sequestration, biomass, leaves

Procedia PDF Downloads 107
2340 A Computerized Tool for Predicting Future Reading Abilities in Pre-Readers Children

Authors: Stephanie Ducrot, Marie Vernet, Eve Meiss, Yves Chaix

Abstract:

Learning to read is a key topic of debate today, both in terms of its implications for school failure and illiteracy and regarding which teaching methods are best to develop. It is estimated today that four to six percent of school-age children suffer from specific developmental disorders that impair learning. The findings from people with dyslexia and typically developing readers suggest that the problems children experience in learning to read are related to the preliteracy skills that they bring with them from kindergarten. Most tools available to professionals are designed for the evaluation of child language problems. In comparison, there are very few tools for assessing the relations between visual skills and the process of learning to read. Recent literature reports that visual-motor skills and visual-spatial attention in preschoolers are important predictors of reading development. The main goal of this study was therefore to improve screening for future reading difficulties in preschool children. We used a prospective, longitudinal approach in which oculomotor processes (assessed with the DiagLECT test) were measured in pre-readers, and the impact of these skills on future reading development was explored. The DiagLECT test specifically measures the online time taken to name numbers arranged irregularly in horizontal rows (horizontal time, HT) and the time taken to name numbers arranged in vertical columns (vertical time, VT). A total of 131 preschoolers took part in this study. At Time 0 (kindergarten), the mean VT, HT, and errors were recorded. One year later, at Time 1, the reading level of the same children was evaluated. This study first allowed us to provide normative data for a standardized evaluation of oculomotor skills in 5- and 6-year-old children. The data also revealed that 25% of our sample of preschoolers showed oculomotor impairments (without any clinical complaints).
Finally, the results of this study assessed the validity of the DiagLECT test for predicting reading outcomes; the better a child's oculomotor skills are, the better his/her reading abilities will be.

Keywords: vision, attention, oculomotor processes, reading, preschoolers

Procedia PDF Downloads 129
2339 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach

Authors: Kristina Pflug, Markus Busch

Abstract:

Being able to predict polymer properties and processing behavior based on the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with the numerical modelling of the complex reaction network of the LDPE polymerization, taking into consideration the actual reaction conditions. While this gives average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE, i.e., its melt flow behavior, is determined as a function of the previously determined polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscometry, and a multi-angle light scattering detector is applied. It serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be excellent, especially considering that the applied multi-scale modelling approach does not involve fitting parameters to the data. This validates the suggested approach and proves its universality at the same time.
In the next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to industrial scale. Moreover, sensitivity analyses for systematically varied process conditions are easily feasible. The developed multi-scale modelling approach finally gives the opportunity to predict and design LDPE processing behavior simply based on process conditions such as feed streams and inlet temperatures and pressures.
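The Monte Carlo idea behind the microstructure step can be illustrated with a toy simulation of random branching during chain growth; the propagation length and branching probability below are illustrative assumptions, not fitted LDPE kinetics:

```python
import numpy as np

# Toy Monte Carlo sketch: each monomer addition creates a branch point
# with a small fixed probability, so the branch count per chain is
# binomially distributed. P_BRANCH and CHAIN_LEN are hypothetical.
rng = np.random.default_rng(42)
P_BRANCH = 0.01
N_CHAINS, CHAIN_LEN = 10_000, 1_000

branches = rng.binomial(CHAIN_LEN, P_BRANCH, size=N_CHAINS)
print(f"mean branches per chain: {branches.mean():.1f}")
```

A full kinetic Monte Carlo model would additionally distinguish short- from long-chain branching events and couple the branching rates to the reaction conditions, but the sampled ensemble of branched architectures is what feeds the rheological step.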

Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology

Procedia PDF Downloads 111
2338 Chromatography Study of Fundamental Properties of Medical Radioisotope Astatine-211

Authors: Evgeny E. Tereshatov

Abstract:

Astatine-211 is considered one of the most promising radionuclides for Targeted Alpha Therapy. In order to develop reliable procedures to label biomolecules and exploit efficient delivery vehicle principles, one should understand the main chemical characteristics of astatine. The short half-life of 211At (~7.2 h) and the absence of any stable isotopes of this element are limiting factors in studying the behavior of astatine. Our team has developed a procedure for rapid and efficient isolation of astatine from irradiated bismuth material in nitric acid media, based on 3-octanone and 1-octanol extraction chromatography resins. This process has been automated and takes 20 min from the beginning of the target dissolution to the elution of the At-211 fraction. Our next step is to consider commercially available chromatography resins and their applicability to astatine purification in the same media. The results obtained, along with the corresponding sorption mechanisms, will be discussed.

Keywords: astatine-211, chromatography, automation, mechanism, radiopharmaceuticals

Procedia PDF Downloads 75
2337 Design of Permanent Sensor Fault Tolerance Algorithms by Sliding Mode Observer for Smart Hybrid Powerpack

Authors: Sungsik Jo, Hyeonwoo Kim, Iksu Choi, Hunmo Kim

Abstract:

In the SHP, an LVDT sensor detects the length changes of the EHA output, and the thrust of the EHA is controlled via a pressure sensor. Sensors can suffer hardware faults caused by internal problems or external disturbances, and the EHA of the SHP can become uncontrollable when controlled by feedback from uncertain information. In this paper, a sliding mode observer algorithm estimates the original sensor output under a permanent sensor fault. The proposed algorithm recovers from basic faults such as disconnection and short circuit, and it also detects various sensor fault modes.
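The observer principle can be sketched for a first-order actuator model: a copy of the plant dynamics plus a discontinuous correction on the output error drives the estimate onto the measurement. The model constants and gains below are illustrative assumptions, not the SHP's actual parameters:

```python
import numpy as np

# Minimal sliding mode observer sketch for x_dot = a*x + b*u, y = x.
# The sign() term is the discontinuous sliding-mode correction.
a, b, L = -2.0, 1.0, 5.0      # plant constants and observer gain (hypothetical)
dt, steps = 1e-3, 5000

x, x_hat = 1.0, 0.0           # true state and observer estimate
u = 0.5                       # constant control input
for _ in range(steps):
    y = x                                     # sensor measurement (healthy case)
    x_dot = a * x + b * u
    x_hat_dot = a * x_hat + b * u + L * np.sign(y - x_hat)
    x += x_dot * dt                           # forward-Euler integration
    x_hat += x_hat_dot * dt

print(abs(x - x_hat))         # estimation error driven near zero
```

In a fault-tolerant scheme, a persistent mismatch between `y` and the observer output serves as the fault residual, and the observer estimate replaces the faulty measurement in the feedback loop.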

Keywords: smart hybrid powerpack (SHP), electro hydraulic actuator (EHA), permanent sensor fault tolerance, sliding mode observer (SMO), graphic user interface (GUI)

Procedia PDF Downloads 535
2336 Exploring the Impact of Input Sequence Lengths on Long Short-Term Memory-Based Streamflow Prediction in Flashy Catchments

Authors: Farzad Hosseini Hossein Abadi, Cristina Prieto Sierra, Cesar Álvarez Díaz

Abstract:

Predicting streamflow accurately in flashy catchments prone to floods is a major research and operational challenge in hydrological modeling. Recent advancements in deep learning, particularly Long Short-Term Memory (LSTM) networks, have shown promise in achieving accurate hydrological predictions at daily and hourly time scales. In this work, a multi-timescale LSTM (MTS-LSTM) network was applied to regional hydrological prediction at an hourly time scale in flashy catchments. The case study includes 40 catchments located in the Basque Country, in the north of Spain. We explore the impact of hyperparameters on the performance of streamflow predictions given by regional deep learning models through systematic hyperparameter tuning, in which optimal regional values for different catchments are identified. The results show that predictions are highly accurate, with Nash-Sutcliffe (NSE) and Kling-Gupta (KGE) metric values as high as 0.98 and 0.97, respectively. A principal component analysis reveals that the hyperparameter governing the length of the input sequence contributes most significantly to the prediction performance. The findings suggest that input sequence length has a crucial impact on model prediction performance. Moreover, catchment-scale analysis reveals distinct optimal sequence lengths for individual basins, highlighting the necessity of customizing this hyperparameter based on each catchment’s characteristics. This aligns with the well-known “uniqueness of place” paradigm. In prior research, tuning the input sequence length of LSTMs has received limited attention in the field of streamflow prediction. Initially, it was set to 365 days to capture a full annual water cycle; later, limited systematic tuning using grid search suggested a modification to 270 days.
However, despite the significance of this hyperparameter in hydrological predictions, studies have usually overlooked its tuning and fixed it at 365 days. This study, employing a simultaneous systematic hyperparameter tuning approach, emphasizes the critical role of input sequence length as an influential hyperparameter in configuring LSTMs for regional streamflow prediction. Proper tuning of this hyperparameter is essential for achieving accurate hourly predictions with deep learning models.
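The role of the input sequence length is easiest to see in how the training data are windowed: each sample is the last `seq_len` steps of forcing/flow history, and the target is the next value. A minimal sketch with a hypothetical hourly record:

```python
import numpy as np

def make_sequences(series, seq_len):
    """Build (samples, seq_len) input windows and next-step targets.

    seq_len is the hyperparameter the study highlights: the number of
    past hourly steps the LSTM sees before predicting the next flow.
    """
    X = np.stack([series[i : i + seq_len] for i in range(len(series) - seq_len)])
    y = series[seq_len:]
    return X, y

# Hypothetical hourly streamflow record (synthetic, for illustration only)
flow = np.sin(np.linspace(0, 20, 1000)) + 1.5
X, y = make_sequences(flow, seq_len=270)
print(X.shape, y.shape)  # (730, 270) (730,)
```

Changing `seq_len` changes both the memory available to the model and the number of training samples, which is one reason this hyperparameter interacts so strongly with catchment characteristics.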

Keywords: LSTMs, streamflow, hyperparameters, hydrology

Procedia PDF Downloads 36
2335 Capacity Building in Dietary Monitoring and Public Health Nutrition in the Eastern Mediterranean Region

Authors: Marisol Warthon-Medina, Jenny Plumb, Ayoub Aljawaldeh, Mark Roe, Ailsa Welch, Maria Glibetic, Paul M. Finglas

Abstract:

Like Western countries, the Eastern Mediterranean Region (EMR) also faces major public health issues associated with the increased consumption of sugar, fat, and salt. Therefore, one of the policies of the World Health Organization (WHO) for the EMR is to reduce the intake of salt, sugar, and fat (saturated fatty acids, trans fatty acids) to address the risk of non-communicable diseases (i.e., diabetes, cardiovascular disease, cancer) and obesity. The project objective is to assess the current status and provide training and capacity development in the use of improved, standardized methodologies for updated food composition data, dietary intake methods, and suitable biomarkers of nutritional value, and to determine health outcomes in low- and middle-income countries (LMIC). Training exchanges have been developed with clusters of countries grouped according to regional needs, including Sudan, Egypt, and Jordan; Tunisia, Morocco, and Mauritania; and other Middle Eastern countries. This capacity building will lead to the development and sustainability of up-to-date national and regional food composition databases in LMIC for use in the dietary monitoring of food and nutrient intakes. Workshops were organized to provide training and capacity development in the use of improved, standardized methodologies for food composition and food intake. Training needs were identified and short-term scientific missions organized for LMIC researchers, including (1) training and knowledge exchange workshops, (2) short-term exchanges of researchers, (3) development and application of protocols, and (4) development of strategies to reduce sugar and fat intake. An initial training workshop in Morocco in 2018 was attended by 25 participants from 10 EMR countries to review the status of, and support the development of, regional food composition data. Four training exchanges are in progress.
The use of improved standardized methodologies for food composition and dietary intake will produce robust measurements that will reinforce dietary monitoring and policy in LMIC. The capacity building from this project will lead to the development and sustainability of up-to-date national and regional food composition databases in EMR countries. Supported by the UK Medical Research Council, Global Challenges Research Fund, (MR/R019576/1), and the World Health Organization’s Eastern Mediterranean Region.

Keywords: dietary intake, food composition, low and middle-income countries, status

Procedia PDF Downloads 142
2334 On Dialogue Systems Based on Deep Learning

Authors: Yifan Fan, Xudong Luo, Pingping Lin

Abstract:

Nowadays, dialogue systems are increasingly becoming the way for humans to access many computer systems, allowing humans to interact with computers in natural language. A dialogue system consists of three parts: understanding what humans say in natural language, managing the dialogue, and generating responses in natural language. In this paper, we survey deep-learning-based methods for dialogue management, response generation, and dialogue evaluation. Specifically, these methods are based on neural networks, long short-term memory networks, deep reinforcement learning, pre-training, and generative adversarial networks. We compare these methods and point out further research directions.

Keywords: dialogue management, response generation, deep learning, evaluation

Procedia PDF Downloads 148
2333 The Growth Role of Natural Gas Consumption for Developing Countries

Authors: Tae Young Jin, Jin Soo Kim

Abstract:

Carbon emissions have emerged as a global concern. The Intergovernmental Panel on Climate Change (IPCC) publishes regular reports on greenhouse gas (GHG) emissions, and the United Nations Framework Convention on Climate Change (UNFCCC) has held a conference yearly since 1995. In particular, COP21, held in December 2015, produced the Paris Agreement, which, unlike the outcomes of previous COPs, has strong binding force; it was ratified as of 4 November 2016 and is now legally binding. Participating countries set up their own Intended Nationally Determined Contributions (INDC) and will try to achieve them. Thus, carbon emissions must be reduced, and the energy sector, fossil fuels in particular, is among the most responsible. This paper therefore examines the relationship between natural gas consumption and economic growth. To achieve this, we adopted a Cobb-Douglas production function consisting of natural gas consumption, economic growth, capital, and labor, using a cross-sectionally dependent panel analysis. Data were preprocessed with Principal Component Analysis (PCA) to remove the cross-sectional dependency that can distort panel results. After confirming the existence of a time-trended component in each variable, we moved to a cointegration test accommodating cross-sectional dependency and structural breaks, to better describe the behavior of volatile international indicators. The cointegration test result indicates that there is a long-run equilibrium relationship between the selected variables. The long-run cointegrating vector and Granger causality test results show that while natural gas consumption can contribute to economic growth in the short run, it adversely affects growth in the long run. From these results, we draw the following policy implications. First, since natural gas has a positive economic effect only in the short run, policy makers in developing countries should consider a gradual switch of their major energy source from natural gas to sustainable energy sources.
Second, the technology transfer and financing mechanisms proposed by the COP must be accelerated. Acknowledgement: This work was supported by the Energy Efficiency & Resources Core Technology Program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP), granted financial resources from the Ministry of Trade, Industry & Energy, Republic of Korea (No. 20152510101880), and by the National Research Foundation of Korea Grant funded by the Korean Government (NRF-205S1A3A2046684).

Keywords: developing countries, economic growth, natural gas consumption, panel data analysis

Procedia PDF Downloads 213
2332 Hospital Evacuation: Best Practice Recommendations

Authors: Ronald Blough

Abstract:

Hospitals, clinics, and medical facilities are the core of the health services sector, providing 24/7 medical care to those in need. Any disruption of these important medical services highlights the vulnerabilities in the medical system. An internal or external event can create a catastrophic incident that paralyzes medical services, forcing the facility to shift into emergency operations with the possibility of evacuation. The hospital administrator and government officials must decide in a very short amount of time whether to shelter in place or evacuate. This presentation identifies best-practice recommendations regarding the hospital evacuation decision and response, analyzing previous hospital evacuations to encourage hospitals in the region to review or develop their own emergency evacuation plans.

Keywords: disaster preparedness, hospital evacuation, shelter-in-place, incident containment, health services vulnerability, hospital resources

Procedia PDF Downloads 351
2331 Study of the Optical Illusion Effects of Color Contrasts on Body Image Perception

Authors: A. Hadj Taieb, H. Ennouri

Abstract:

The current study aimed to investigate the effect that optical-illusion garments have on a woman's perception of her own body shape. First, we created different optical-illusion garments by using color contrasts. Second, a short survey based on visual perception was administered to women in order to compare the different optical-illusion garments and determine whether they met the established 'ideal' body shape. A visual analysis method, drawing on theories of optical illusion, was used to investigate the clothing models. In this way, the study sought to reveal the effects of color-contrast optical illusions on perceived body shape in the fashion sector.

Keywords: optical illusion, color contrasts, body image perception, self-esteem

Procedia PDF Downloads 257
2330 Caregivers Burden: Risk and Related Psychological Factors in Caregivers of Patients with Parkinson’s Disease

Authors: Pellecchia M. T., Savarese G., Carpinelli L., Calabrese M.

Abstract:

Introduction: Parkinson's disease (PD) is characterized by a progressive loss of autonomy, which has a significant impact on the quality of life of caregivers, and relatives are the main informal caregivers. Caring for a person with PD is associated with an increased risk of psychiatric morbidity and persistent anxious-depressive distress. The aim of the study is to investigate the burden on caregivers of patients with PD through the use of multidimensional scales, and to identify its personological and environmental determinants. Methods: The study was approved by the Ethics Committee of the University of Salerno, and informed consent for participation was obtained from patients and their caregivers. The study was conducted at the Neurology Department of the A.O.U. "San Giovanni di Dio e Ruggi d'Aragona" of Salerno between September 2020 and May 2021. Materials: The questionnaires used were: a) Caregiver Burden Inventory (CBI), a 24-item questionnaire identifying five sub-categories of burden (objective, psychological, physical, social, emotional); b) Depression Anxiety Stress Scales Short Version (DASS-21), a 21-item questionnaire valid for examining three distinct but interrelated areas (depression, anxiety, and stress); c) Family Strain Questionnaire Short Form (FSQ-SF), a 30-item questionnaire grouped into areas of increasing psychological risk (OK, R, SR, U); d) Zarit Caregiver Burden Inventory (ZBI), consisting of 22 items based on the analysis of two main factors: personal stress and pressure related to the caregiving role; e) Life Satisfaction, a single item evaluating overall life satisfaction on a 0-100 Likert scale. Findings: N = 29 caregivers (mean age = 55.14, SD = 9.859; 69% female) participated in the study. 20.6% of the sample had a severe burden (CBI score: M = 26.31, SD = 22.43), and 13.8% of participants had a moderate-to-severe burden (ZBI).
The FSQ-SF highlighted a minority of caregivers who need psychological support, in some cases urgently (Areas SR and U). The DASS-21 results show a prevalence of stress-related symptoms (M = 10.90, SD = 10.712) compared to anxiety (M = 7.52, SD = 10.752) and depression (M = 8, SD = 10.876). There are significant correlations between some specific variables and mean test scores: retired caregivers report higher ZBI scores (r = 0.423) and lower Life Satisfaction levels (r = -0.460) than working caregivers; years of schooling show a negative linear correlation with the ZBI score (r = -0.491). The t-test indicates that caregivers of patients with cognitive impairment are at greater risk than those of patients without cognitive impairment. Conclusions: Knowing the factors that most affect the burden would allow early recognition of risky situations and of caregivers who need adequate support.
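The abstract reports Pearson correlations and an independent-samples t-test; the raw data are not available, so the following is a minimal sketch on hypothetical data (sample size and the direction of the schooling-burden association taken from the abstract, everything else invented) showing how such analyses are typically run with `scipy.stats`.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 29  # sample size reported in the abstract

# Hypothetical data: years of schooling vs. ZBI burden score, built with the
# negative association the abstract reports; real values are not available.
schooling = rng.integers(5, 20, size=n)
zbi = 60 - 2.0 * schooling + rng.normal(scale=5.0, size=n)
r, p = stats.pearsonr(schooling, zbi)

# Hypothetical group comparison: burden for caregivers of patients with vs.
# without cognitive impairment, via an independent-samples t-test.
burden_impaired = rng.normal(45.0, 8.0, size=15)
burden_intact = rng.normal(35.0, 8.0, size=14)
t, p_t = stats.ttest_ind(burden_impaired, burden_intact)
```

With data simulated this way, `r` comes out clearly negative, matching the reported direction of the schooling-ZBI relationship.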

Keywords: anxious-depressive axis, caregivers’ burden, Parkinson’ disease, psychological risks

Procedia PDF Downloads 198
2329 A Report of 5-Months-Old Baby with Balanced Chromosomal Rearrangements along with Phenotypic Abnormalities

Authors: Mohit Kumar, Beklashwar Salona, Shiv Murti, Mukesh Singh

Abstract:

We report here the case of a five-month-old male baby, born as the second child of non-consanguineous parents with no notable history of genetic abnormality, who was referred to our cytogenetic laboratory for chromosomal analysis. Dysmorphic facial features, including a mongoloid face, cleft palate, and simian crease, as well as developmental delay, were observed. We present this case with a unique balanced autosomal translocation, t(3;10)(p21;p13). The risk of phenotypic abnormalities based on a de novo balanced translocation is estimated at 7%. The association of a balanced chromosomal rearrangement with Down syndrome features such as multiple congenital anomalies, facial dysmorphism, and congenital heart anomalies is very rare in a five-month-old child. Trisomy 21 is not uncommon among chromosomal abnormalities associated with birth defects, and balanced translocations are frequently observed in patients with secondary infertility or recurrent spontaneous abortion (RSA). For chromosomal analysis, two ml of heparinized peripheral blood cells were cultured in RPMI-1640 for 72 hours, supplemented with 20% fetal bovine serum, phytohemagglutinin (PHA), and antibiotics. A total of 30 metaphase images were captured using an Olympus BX51 microscope and analyzed using Bio-View karyotyping software through GTG-banding (G-bands by trypsin and Giemsa) according to the International System for Human Cytogenetic Nomenclature 2016. The results showed a balanced translocation between the short arm of chromosome 3 and the short arm of chromosome 10; the karyotype of the child was found to be 46,XY,t(3;10)(p21;p13). Chromosomal abnormalities are one of the major causes of birth defects in newborn babies. The index case presented with dysmorphic facial features and had a balanced translocation 46,XY,t(3;10)(p21;p13).
This translocation with breakpoints at (p21;p13) has not been reported in the literature in a child with facial dysmorphism. To the best of our knowledge, this is the first report of the novel balanced translocation t(3;10) with these breakpoints in a child with dysmorphic features. We found a balanced chromosomal translocation, rather than any trisomy or unbalanced aberration, along with phenotypic abnormalities. Therefore, we suggest that such novel balanced translocations with abnormal phenotypes should be reported, in order to give the pathologist, pediatrician, and gynecologist better insight into the intricacies of chromosomal abnormalities and their associated phenotypic features. We hypothesize that the dysmorphic features seen in this case may result from a change in the pattern of genes located at the breakpoint areas of the balanced translocation, or from deletion or mutation of genes located on the p-arms of chromosomes 3 and 10.

Keywords: balanced translocation, karyotyping, phenotypic abnormalities, facial dysmorphism

Procedia PDF Downloads 193
2328 In Vivo Evaluation of Exposure to Electromagnetic Fields at 27 GHz (5G) of Danio Rerio: A Preliminary Study

Authors: Elena Maria Scalisi, Roberta Pecoraro, Martina Contino, Sara Ignoto, Carmelo Iaria, Santi Concetto Pavone, Gino Sorbello, Loreto Di Donato, Maria Violetta Brundo

Abstract:

5G technology is evolving to satisfy a variety of service requirements that may allow high data-rate connections (1 Gbps) and lower latency times than at present (<1 ms). In order to support high data transmission speeds and high traffic for eMBB (enhanced mobile broadband) use cases, 5G systems use different frequency bands of the radio spectrum (700 MHz, 3.6-3.8 GHz, and 26.5-27.5 GHz), thus exploiting higher frequencies than previous mobile radio generations (1G-4G). However, waves at higher frequencies have a lower capacity to propagate in free space; therefore, in order to guarantee capillary coverage of the territory for high-reliability applications, it will be necessary to install a large number of repeaters. Following the introduction of this new technology, there has been growing concern in recent months about possible harmful effects on human health. The aim of this preliminary study is to evaluate possible short-term effects induced by 5G millimeter waves on the embryonic development and early life stages of Danio rerio using the Z-FET. We exposed developing zebrafish to a frequency of 27 GHz, with a standard pyramidal horn antenna placed 15 cm from the sample holder, ensuring an incident power density of 10 mW/cm². During the exposure cycle, from 6 h post-fertilization (hpf) to 96 hpf, we assessed different morphological endpoints every 24 hours. The zebrafish embryo toxicity test (Z-FET) is a short-term test carried out on fertilized zebrafish eggs and represents an effective alternative to the acute test with adult fish (OECD, 2013). We observed that 5G exposure had no significant impact on mortality or morphology: exposed larvae showed normal detachment of the tail, presence of a heartbeat, and well-organized somites, although the hatching rate was lower than in untreated larvae even at 48 h of exposure.
Moreover, immunohistochemical analysis performed on the larvae showed no HSP-70 expression, used here as a biomarker. This is a preliminary study on the evaluation of potential 5G-induced toxicity, and further studies seem appropriate to clarify the real risk of exposure to these electromagnetic fields.

Keywords: biomarker of exposure, embryonic development, 5G waves, zebrafish embryo toxicity test

Procedia PDF Downloads 107
2327 Macronutrients and the FTO Gene Expression in Hypothalamus: A Systematic Review of Experimental Studies

Authors: Saeid Doaei

Abstract:

Various studies have examined the relationship between FTO gene expression and macronutrient levels. In order to obtain a better view of these interactions, all existing studies were reviewed systematically. All published papers were obtained and reviewed using standard and sensitive keywords from databases such as CINAHL, Embase, PubMed, PsycInfo, and the Cochrane Library, from 1990 to 2016. The results indicated that all six studies that met the inclusion criteria (from a total of 428 published articles) found changes in FTO gene expression at short-term follow-up. Four of the six studies found increased FTO gene expression after calorie restriction, while two indicated decreased expression. The effects of protein, carbohydrate, and fat were assessed separately in all six studies. In conclusion, the level of FTO gene expression in the hypothalamus is related to macronutrient levels. Future research should evaluate the long-term impact of dietary interventions.

Keywords: obesity, gene expression, FTO, macronutrients

Procedia PDF Downloads 248
2326 Educational Diagnosis and Evaluation Processes of Disabled Preschoolers in Turkey: Family Opinions

Authors: Şule Yanık, Hasan Gürgür

Abstract:

Preschool education smooths disabled children's transition to formal education and establishes a precondition for their success, so the opportunity to benefit from it is considered important. Within this context, disabled children in Turkey must first be evaluated medically and then educationally in order to benefit from early inclusive education. Thus, disabled children are diagnosed both in hospitals and, educationally, at Guidance and Research Centers (GRC) attached to the Ministry of Education. In the educational diagnosis and evaluation (EDAE) process, standard evaluation tools are used and evaluations are carried out by special education teachers (SETs). The literature emphasizes the importance of informal as well as formal evaluation tools. Besides the students and the SETs, another party in the EDAE process is the family, because families are the primary caretakers of their children, and the most accurate and realistic information, beyond the results of the educational evaluation process (EEP) itself, can be obtained from them. Obtaining the opinions of families during the EEP is therefore important for documenting current EDAE activities in Turkey, surfacing existing problems, and increasing the quality of the process. Within this context, the purpose of this study is to present the experiences of 10 families of preschool children with hearing loss (CHL) regarding the EDAE process. The research is designed as a descriptive study based on qualitative research paradigms. Data were collected via semi-structured interview questions, and themes were derived. The results show that families, after they become aware of their children's hearing loss, have no information on the subject and consult an ear-nose-throat doctor or an audiologist for support.
Families go to hospitals for the medical evaluation that is a prerequisite for benefiting from early education opportunities. However, because some families have no prior experience of having a CHL, they are late for medical evaluation and hearing aids. Families stated that they were then directed to a GRC by audiologists for educational evaluation. After the medical diagnosis, their children were evaluated at the GRC regarding language, academic, and psychological development in proportion to their ages. However, families stated that the EEP carried out at the GRC was superficial, short, and lacking in detail. Many families were not included in the EEP process, whereas some families stated that they were asked the questions because their children were too small to answer. Regarding the benefits of the EEP for themselves and their children, families stated that the GRC had to issue a report for them to benefit from the free support of a Special Education and Rehabilitation Center, and that families had to be directed toward inclusive education. As a result, the opinions of families regarding the EDAE process at GRCs indicate that the process is inefficient, being short and superficial, even where it is to the point.

Keywords: children with hearing loss, educational diagnosis and evaluation, guidance and research center, inclusion

Procedia PDF Downloads 221
2325 Programmed Speech to Text Summarization Using Graph-Based Algorithm

Authors: Hamsini Pulugurtha, P. V. S. L. Jagadamba

Abstract:

Programmed speech-to-text and text summarization using graph-based algorithms can be used in meetings to obtain a short description of the meeting for future reference. The system provides signature verification using a Siamese neural network to confirm the identity of the user, and converts the user-provided audio recording, which is in English, into English text using the speech recognition package available in Python. At times only a summary of the meeting is required, and text summarization is the solution for this. Thus, the transcript is then summarized using natural language processing approaches, such as unsupervised extractive text summarization algorithms.
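The abstract does not specify its graph algorithm in detail; as a minimal sketch of the general TextRank-style approach it describes (an unsupervised extractive summarizer), the following ranks sentences with PageRank over a word-overlap similarity graph using `networkx`. The similarity measure and the example transcript are assumptions for illustration only.

```python
import re
import networkx as nx

def textrank_summary(text, n_sentences=2):
    """Rank sentences with PageRank over a word-overlap similarity graph
    (a TextRank-style unsupervised extractive summarizer) and return the
    top-scoring sentences in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    words = [set(re.findall(r"\w+", s.lower())) for s in sentences]
    g = nx.Graph()
    g.add_nodes_from(range(len(sentences)))
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            overlap = len(words[i] & words[j])  # shared-word edge weight
            if overlap:
                g.add_edge(i, j, weight=overlap)
    scores = nx.pagerank(g, weight="weight")
    top = sorted(sorted(scores, key=scores.get, reverse=True)[:n_sentences])
    return " ".join(sentences[i] for i in top)

# Toy transcript standing in for a speech-to-text output.
transcript = ("The meeting opened with a budget review. "
              "The budget review covered staffing and travel costs. "
              "Someone mentioned the weather. "
              "Staffing costs dominated the budget discussion.")
summary = textrank_summary(transcript, n_sentences=2)
```

Sentences that share vocabulary with many other sentences accumulate PageRank mass, so the off-topic remark is ranked lowest and excluded from the summary.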

Keywords: Siamese neural network, English speech, English text, natural language processing, unsupervised extractive text summarization

Procedia PDF Downloads 189
2324 Provisional Settlements and Urban Resilience: The Transformation of Refugee Camps into Cities

Authors: Hind Alshoubaki

Abstract:

The world is now confronting a widespread urban phenomenon: refugee camps, which have mostly been established in a 'rush mode' aimed at affording temporary settlements that provide refugees with minimum levels of safety, security, and protection from harsh weather conditions within a very short time period. In fact, these emergency settlements are transforming into permanent ones, since time is a decisive factor in terms of construction and a camp's age. These factors play an essential role in transforming the camps' temporary character into a permanent one that generates deep modifications to the city's territorial structure, shaping a new identity and creating a contentious change in the city's form and history. To achieve a better understanding of the transformation of refugee camps, this study is based on a mixed-methods approach: the qualitative approach explores different refugee camps and analyzes their transformation processes in terms of population density and changes to the city's territorial structure and urban features, while the quantitative approach employs a statistical regression analysis of refugees' satisfaction within the Zaatari camp in order to predict its future transformation. Refugees' perceptions of their current conditions affect their satisfaction, which plays an essential role in transforming emergency settlements into permanent cities over time. The analysis covers five main themes: the access and readiness of schools; the dispersion of clinics and shopping centers; the camp infrastructure; the construction materials; and the street networks. The statistical analysis showed that Syrian refugees were not satisfied with their current conditions inside the Zaatari refugee camp and that they had started implementing changes according to their needs, desires, and aspirations, because they are conscious that their stay in this settlement will be prolonged.
The case study analyses also showed that neglecting the fact that construction takes time leads to settlements being created below minimum standards, deteriorating into 'slums' that bring increased crime rates, suicide, drug use, and disease, and deeply affect cities' urban tissue. For this reason, recognizing the 'temporary-eternal' character of these settlements is fundamental: refugee camps should be considered from the beginning as definite, permanent cities. This is the key to minimizing the trauma of displacement for both refugees and the hosting countries, since providing emergency settlements within a short time period does not have to mean using temporary materials, having a provisional character, or creating 'makeshift cities.'

Keywords: refugee, refugee camp, temporary, Zaatari

Procedia PDF Downloads 114
2323 Indicator-Immobilized, Cellulose Based Optical Sensing Membrane for the Detection of Heavy Metal Ions

Authors: Nisha Dhariwal, Anupama Sharma

Abstract:

The synthesis of cellulose nanofibrils quaternized with 3‐chloro‐2‐hydroxypropyltrimethylammonium chloride (CHPTAC) in NaOH/urea aqueous solution has been reported. Xylenol Orange (XO) has been used as an indicator for selective detection of Sn (II) ions, by its immobilization on quaternized cellulose membrane. The effects of pH, reagent concentration and reaction time on the immobilization of XO have also been studied. The linear response, limit of detection, and interference of other metal ions have also been studied and no significant interference has been observed. The optical chemical sensor displayed good durability and short response time with negligible leaching of the reagent.

Keywords: cellulose, chemical sensor, heavy metal ions, indicator immobilization

Procedia PDF Downloads 284
2322 Lightweight Sheet Molding Compound Composites by Coating Glass Fiber with Cellulose Nanocrystals

Authors: Amir Asadi, Karim Habib, Robert J. Moon, Kyriaki Kalaitzidou

Abstract:

There has been considerable interest in cellulose nanomaterials (CN) as reinforcement for polymers and polymer composites due to their high specific modulus and strength, low density and toxicity, and accessible hydroxyl side groups that can be readily chemically modified. The focus of this study is making lightweight composites for better fuel efficiency and lower CO2 emissions in the auto industry, with no compromise in mechanical performance, using a scalable technique that can be easily integrated into sheet molding compound (SMC) manufacturing lines. Lightweighting will be achieved by replacing part of the heavier components, i.e., glass fibers (GF), with a small amount of cellulose nanocrystals (CNC) in short GF/epoxy composites made using SMC. CNC will be introduced as a coating on the GF rovings prior to their use in the SMC line. The coating method employed is similar to the fiber-sizing technique commonly used, and thus it can easily be scaled and integrated into industrial SMC lines. This is an alternative route to most techniques, which involve dispersing CN in the polymer matrix and in which agglomeration of the nanomaterials limits scale-up to industrial production. We have demonstrated that incorporating CNC as a coating on the GF surface by immersing the GF in CNC aqueous suspensions, a simple and scalable technique, increases the interfacial shear strength (IFSS) by ~69% compared to composites produced with uncoated GF, suggesting enhanced stress transfer across the GF/matrix interface. As a result of this IFSS enhancement, incorporation of 0.17 wt% CNC in the composite increases the elastic modulus and tensile strength by ~10% each, and the flexural modulus and strength by 40% and 43%, respectively. We have also determined that dispersing 1.4 and 2 wt% CNC in the epoxy matrix of short GF/epoxy SMC composites by sonication allows removing 10 wt% GF with no penalty on tensile and flexural properties, leading to 7.5% lighter composites.
Although sonication is a scalable technique, it is not as simple and inexpensive as coating the GF by passing it through an aqueous suspension of CNC. In this study, the above findings are integrated to 1) investigate the effect of CNC content on mechanical properties by passing the GF rovings through CNC aqueous suspensions of various concentrations (0-5%) and 2) determine the optimum ratio of added CNC to removed GF to achieve the maximum possible weight reduction at no cost to the mechanical performance of the SMC composites. The results of this study are of industrial relevance, providing a path toward producing high-volume, lightweight, mechanically enhanced SMC composites using cellulose nanomaterials.

Keywords: cellulose nanocrystals, light weight polymer-matrix composites, mechanical properties, sheet molding compound (SMC)

Procedia PDF Downloads 209
2321 Investigating the Influences of Long-Term, as Compared to Short-Term, Phonological Memory on the Word Recognition Abilities of Arabic Readers vs. Arabic Native Speakers: A Word-Recognition Study

Authors: Insiya Bhalloo

Abstract:

It is quite common in the Muslim faith for non-Arabic speakers to be able to convert written Arabic, especially Quranic Arabic, into a phonological code without significant semantic or syntactic knowledge. This is due to prior experience learning to read the Quran (a religious text written in Classical Arabic) from a very young age, for example via enrollment in Quranic Arabic classes. Compared to native speakers of Arabic, these Arabic readers do not have comprehensive morpho-syntactic knowledge of the Arabic language, nor can they understand or engage in Arabic conversation. The study seeks to investigate whether mere phonological experience (the Arabic readers' experience with Arabic phonology and its sound system) is sufficient to cause phonological interference during word recognition of previously heard words, despite the participants' non-native status. Both native speakers of Arabic and non-native Arabic readers, i.e., individuals who learned to read the Quran from a young age, will be recruited. Each experimental session will include two phases: an exposure phase and a test phase. During the exposure phase, participants will be presented with Arabic words (n = 40) on a computer screen. Half of these words will be common words found in the Quran, while the other half will be words common in Modern Standard Arabic (MSA) but either non-existent in the Quran or present there at a significantly lower frequency. During the test phase, participants will be presented with both familiar Arabic words (n = 20; i.e., words presented during the exposure phase) and novel Arabic words (n = 20; i.e., words not presented during the exposure phase). Half of the presented words will be common Quranic Arabic words and the other half common MSA words that are not Quranic words.
Moreover, half of the Quranic Arabic and MSA words presented will be nouns and half will be verbs, thereby eliminating word-processing effects of lexical category. Participants will then determine whether they saw each word during the exposure phase. This study seeks to investigate whether long-term phonological memory, such as childhood exposure to Quranic Arabic orthography, has a differential effect on the word-recognition capacities of native Arabic speakers and Arabic readers; we seek to compare the effects of long-term phonological memory with those of short-term phonological exposure (as indexed by the familiar words from the exposure phase). The researcher's hypothesis is that, despite their lack of lexical knowledge, early experience converting written Quranic Arabic text into a phonological code will help participants recall the familiar Quranic words that appeared during the exposure phase more accurately than those that were not presented. Moreover, it is anticipated that the non-native Arabic readers will also produce more false alarms to the unfamiliar Quranic words, due to early childhood phonological exposure to Quranic Arabic script, causing false phonological facilitation effects.
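An old/new recognition design like this is typically scored with hit and false-alarm rates (the false-alarm rate to unfamiliar Quranic words is exactly what the hypothesis predicts will be elevated). As a small sketch with invented responses, not data from the study:

```python
def recognition_rates(said_old, is_old):
    """Hit and false-alarm rates for an old/new recognition test.

    said_old: participant responses (True = judged 'seen during exposure')
    is_old:   ground truth (True = the word was in the exposure phase)
    """
    hits = sum(s and o for s, o in zip(said_old, is_old))
    false_alarms = sum(s and not o for s, o in zip(said_old, is_old))
    n_old = sum(is_old)
    n_new = len(is_old) - n_old
    return hits / n_old, false_alarms / n_new

# Hypothetical responses for six test words (first three old, last three new).
hit_rate, fa_rate = recognition_rates(
    said_old=[True, True, False, True, False, False],
    is_old=[True, True, True, False, False, False],
)
```

Comparing false-alarm rates on unfamiliar Quranic versus unfamiliar MSA words, separately for native speakers and Quran-trained readers, would directly test the predicted phonological facilitation effect.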

Keywords: Modern Standard Arabic, phonological facilitation, phonological memory, Quranic Arabic, word recognition

Procedia PDF Downloads 338