Search results for: autobiographical memory functions
782 Spatial Interactions Between Earthworm Abundance and Tree Growth Characteristics in Western Niger Delta
Authors: Olatunde Sunday Eludoyin, Charles Obiechina Olisa
Abstract:
The study examined the spatial interactions between earthworm abundance (EA) and tree growth characteristics in ecological belts of the Western Niger Delta, Nigeria. Eight 20 m x 20 m quadrats were delimited in the natural vegetation in each of the rainforest (RF), mangrove (M), fresh water swamp (FWS), and guinea savanna (GS) ecological belts to gather data about tree species (TS) characteristics, which included individual number of tree species (IN), diversity (Di), density (De) and richness (Ri). Three quadrats of 1 m x 1 m were delineated in each of the 20 m x 20 m quadrats to collect earthworm species from the topsoil (0-15 cm) and subsoil (15-30 cm); the samples were taken to the laboratory for further analysis. Descriptive statistics and inferential statistics were used for data analysis. Findings showed that a total of 19 earthworm species was found, with 58.5% of individual species recorded in the topsoil and 41.5% recorded in the subsoil. The total population of Eudrilus eugeniae was predominantly highest in both topsoil (38.4%) and subsoil (27.1%). The total population of individual species of earthworm was least in GS in the topsoil (11.9%) and subsoil (8.4%). A total of 40 different tree species was recorded, of which 55.5% were recorded in FWS, while RF was significantly highest in species diversity (0.5971). Regression analysis revealed that Ri, IN, DBH, Di, and De of trees explained 65.9% of the variability of EA in the topsoil, while 46.9% of the variability of earthworm abundance was explained by the floristic parameters in the subsoil. Similarly, correlation statistics revealed that in the topsoil, EA was positively and significantly correlated with Ri (r=0.35; p<0.05), IN (r=0.523; p<0.05) and De (r=0.469; p<0.05), while DBH was negatively and significantly correlated with earthworm abundance (r=-0.437; p<0.05). In the subsoil, only Ri and DBH correlated significantly with EA. The study concluded that EA in the study locations was highly influenced by tree growth characteristics, especially Ri, IN, DBH, Di, and De. The study recommended that TS abundance should be improved in the study locations to ensure the survival of earthworms for ecosystem functions.
Keywords: interactions, earthworm abundance, tree growth, ecological zones, Western Niger Delta
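As an illustration of the regression and correlation workflow summarised above, the following is a minimal sketch (the input file and column names are hypothetical, not taken from the study):

```python
# Sketch: multiple regression of earthworm abundance (EA) on tree-growth variables,
# plus pairwise Pearson correlations. The file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

df = pd.read_csv("topsoil_quadrats.csv")          # one row per quadrat
predictors = ["Ri", "IN", "DBH", "Di", "De"]       # tree-growth characteristics
X = sm.add_constant(df[predictors])                # add the intercept term
model = sm.OLS(df["EA"], X).fit()
print(f"R^2 = {model.rsquared:.3f}")               # share of EA variability explained

for var in predictors:
    r, p = pearsonr(df[var], df["EA"])             # pairwise correlation with EA
    print(f"{var}: r = {r:.3f}, p = {p:.4f}")
```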
Procedia PDF Downloads 100
781 Phytochemical Composition and Characterization of Bioactive Compounds of the Green Seaweed Ulva lactuca: A Phytotherapeutic Approach
Authors: Mariame Taibi, Marouane Aouiji, Rachid Bengueddour
Abstract:
The Moroccan coastline is particularly rich in algae and constitutes a reserve of species with considerable economic, social and ecological potential. This work focuses on the research and characterization of algal bioactive compounds that can be used in pharmacology or phytopathology. The biochemical composition of the green alga Ulva lactuca (Ulvophyceae) was studied by determining the content of moisture, ash, phenols, flavonoids, total tannins, and chlorophyll. Seven solvents - distilled water, methanol, ethyl acetate, chloroform, benzene, petroleum ether, and hexane - were tested for their effectiveness in recovering chemical compounds. The identification of functional groups, as well as the bioactive chemical compounds, was determined by FT-IR and GC-MS. The moisture content of the alga was 77%, while the ash content was 15%. Phenol content differed from one solvent studied to another, while chlorophyll a, b, and total chlorophyll were determined at 14%, 9.52%, and 25%, respectively. Carotenoid was present in a considerable amount (8.17%). The experimental results show that methanol is the most effective solvent for recovering bioactive compounds, followed by water. Moreover, the green alga Ulva lactuca is characterized by a high level of total polyphenols (45±3.24 mg GAE/g DM) and average levels of total tannins and flavonoids (22.52±8.23 mg CE/g DM and 15.49±0.064 mg QE/g DM, respectively). The results of Fourier transform infrared spectroscopy (FT-IR) confirmed the presence of alcohol/phenol and amide functions in Ulva lactuca. The GC-MS analysis gave precisely the compounds contained in the various extracts, such as phenolic compounds, fatty acids, terpenoids, alcohols, alkanes, hydrocarbons, and steroids. All these results represent only a first step in the search for biologically active natural substances from seaweed. Additional tests are envisaged to confirm the bioactivity of the seaweed.
Keywords: algae, Ulva lactuca, phenolic compounds, FTIR, GC-MS
Procedia PDF Downloads 108
780 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward back propagation networks. Results revealed that the GDM optimisation algorithm, with its adaptive learning capability, generally used a relatively shorter time in both training and validation phases than the Levenberg-Marquardt (LM) and Bayesian regularisation (Br) algorithms, although learning may not be consummated; this held in all instances considered, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for the training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANN for real-time forecasting should employ training algorithms that do not have the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted considering overall time expenditure and quality of the forecast, as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
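For reference, the coefficient of efficiency (CE) used above for model evaluation is commonly computed with the Nash-Sutcliffe formulation (a standard definition, assumed here rather than quoted from the paper):

```latex
\mathrm{CE} = 1 - \frac{\sum_{t=1}^{n}\left(Q_{o,t} - Q_{m,t}\right)^{2}}{\sum_{t=1}^{n}\left(Q_{o,t} - \bar{Q}_{o}\right)^{2}}
```

where Q_{o,t} is the observed streamflow at time t, Q_{m,t} the modelled value, and Q̄_o the mean observed flow; CE = 1 indicates a perfect fit.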
Procedia PDF Downloads 152
779 Water Management Scheme: Panacea to Development Using Nigeria’s University of Ibadan Water Supply Scheme as a Case Study
Authors: Sunday Olufemi Adesogan
Abstract:
The supply of potable water is, at the very least, a very important index of national development. Water tariffs depend on the treatment cost, which carries the highest percentage of the total operation cost in any water supply scheme. In order to keep water tariffs as low as possible, treatment costs have to be minimized. The University of Ibadan, Nigeria, water supply scheme consists of a treatment plant with three distribution stations (Amina Way, Kurumi and Lander) and two raw water supply sources (Awba dam and Eleyele dam). An operational study of the scheme was carried out to ascertain the efficiency of the supply of potable water on the campus and to justify the need for water supply schemes in tertiary institutions. The study involved regular collection, processing and analysis of periodic operational data. Data collected included supply readings (daily water production) and consumers' meter readings for a period of 22 months (October 2013 - July 2015), as well as the operating hours of both the plants and their operators. Applying the required mathematical equations, total loss was determined for the distribution system and translated into monetary terms. Adequacies of the operational functions were also determined. The study revealed that water supply schemes are justified in tertiary institutions. It was also found that approximately 10.7 million Nigerian naira (…)
Keywords: development, panacea, supply, water
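A hedged illustration of the water-balance arithmetic implied by determining the total loss of the distribution system and translating it into monetary terms (standard non-revenue-water accounting, not the paper's own equations):

```latex
L_{total} = V_{supplied} - V_{billed}, \qquad
\text{loss}\,(\%) = \frac{L_{total}}{V_{supplied}} \times 100, \qquad
C_{loss} = L_{total} \times c_{unit}
```

where V_{supplied} is the metered production, V_{billed} the consumers' metered consumption, and c_{unit} an assumed unit treatment-and-supply cost used to express the loss in monetary terms.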
Procedia PDF Downloads 209
778 Context and Culture in EFL Learners' and Native Speakers' Discourses
Authors: Emad A. S. Abu-Ayyash
Abstract:
Cohesive devices, the linguistic tools that are usually employed to hold the different parts of a text together, have been the focus of a significant number of discourse analysis studies. These linguistic tools have grabbed the attention of researchers since the inception of the first and most comprehensive model of cohesion in 1976. However, it was noticed that some cohesive devices (e.g., endophoric reference, conjunctions, ellipsis, substitution, and lexical ties) - being thought of as more popular than others (e.g., exophoric reference) - were over-researched. The present paper explores the usage of two cohesive devices that have been almost entirely absent from discourse analysis studies. These cohesive devices are exophoric and homophoric references, the linguistic items that can be interpreted in terms of the physical and cultural contexts of discourse. The significance of the current paper, therefore, stems from the fact that it attempts to fill a gap in the research conducted so far on cohesive devices. This study provides an explanation of the concepts of the cohesive devices that have been employed in a plethora of research on cohesion and elucidates the relevant context-related concepts. The paper also identifies the gap in cohesive devices research. Exophora and homophora, the least visited cohesive devices in previous studies, were qualitatively and quantitatively explored in six opinion articles: four produced by eight postgraduate English as a Foreign Language (EFL) students at a university in the United Arab Emirates and two by professional NS writers in the Independent and the Guardian. The six pieces were about the United Kingdom Independence Party (UKIP) leader’s call to ban the burqa in the UK and were analysed vis-a-vis the employment and function of homophora and exophora. The study found that both EFL students and native speakers employed exophora and homophora considerably in their writing to serve a variety of functions, including building assumptions, supporting main ideas, and involving the readers, among others.
Keywords: cohesive devices, context, culture, exophoric reference, homophoric reference
Procedia PDF Downloads 123
777 Oxidative and Hormonal Disruptions Underlie Bisphenol A: Induced Testicular Toxicity in Male Rabbits
Authors: Kadry M. Sadek, Tarek K. Abouzed, Mousa A. Ayoub
Abstract:
The presence of endocrine-disrupting compounds, such as bisphenol A (BPA), in the environment can cause serious health problems, although opinions on its effects remain controversial. This study investigated the reproductive, metabolic, oxidative and immunologic disrupting effects of bisphenol A in male rabbits. Rabbits were divided into five groups. The first four groups were administered oral BPA (1, 10, 50, or 100 mg/kg/day) for ten weeks. The fifth group was administered corn oil as the vehicle. BPA significantly decreased serum testosterone, estradiol and the free androgen index (FAI) and significantly increased sex hormone binding globulin (SHBG) compared with the placebo group. The higher doses of BPA showed a significant decrease in follicle-stimulating hormone (FSH) and luteinizing hormone (LH). A significant increase in blood glucose levels was identified in the BPA groups. The non-significant difference in insulin levels is a novel finding. The cumulative testicular toxicity of BPA was clearly demonstrated by the dose-dependent decrease in absolute testes weight, primary measures of semen quality and a significant increase in testicular malondialdehyde (MDA). Moreover, BPA significantly decreased total antioxidant capacity (TAC) and significantly increased immunoglobulin G (IgG) at the highest concentration. Our results suggest that BPA, especially at higher doses, is associated with many adverse effects on metabolism, oxidative stress, immunity, sperm quality and markers of androgenic action. These results may reflect the estrogenic effects of BPA, which we hypothesize could be related, in part, to an inhibitory effect on testicular steroidogenesis. The induction of oxidative stress by BPA may play an additional role in testicular toxicity. These results suggest that BPA poses a threat to endocrine and reproductive functions.
Keywords: bisphenol A, oxidative stress, rabbits, semen quality, steroidogenesis
Procedia PDF Downloads 294
776 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines
Authors: Eliza E. Camaso, Guiller B. Damian, Miguelito F. Isip, Ronaldo T. Alberto
Abstract:
Land-cover maps are important for many scientific, ecological and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed due to land use practices such as rice cultivation. High-precision land-cover maps are not yet available in the area, although they are important for economic management. To assure accurate mapping of land cover and automatic land use and cover detection, remote sensing is a very suitable tool to carry out this task. The study not only provided high-precision land-cover maps but also provided estimates of the rice production area that had undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection of features. After generation of the land-cover map of the area of high-intensity rice cultivation, a soil fertility degradation assessment of the rice production area was carried out to assess the impact of soils used in agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines - Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the area of rice crops where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results showed that 80.00% of the fallow area and 99.81% of the rice production area have high soil fertility decline.
Keywords: aerial image, landcover, LiDAR, soil fertility degradation
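A minimal open-source sketch of the overlay step described above (GeoPandas in place of ArcGIS; the file names and attribute fields are hypothetical):

```python
# Sketch: intersect a classified land-cover layer with a soil-fertility-decline layer
# and report the rice area falling in high-decline zones. Paths and fields are hypothetical.
import geopandas as gpd

landcover = gpd.read_file("quezon_landcover.shp")            # classified land cover
fertility = gpd.read_file("lada_bswm_fertility_decline.shp") # fertility decline map

rice = landcover[landcover["class"] == "rice"]               # keep rice polygons only
overlay = gpd.overlay(rice, fertility, how="intersection")   # spatial intersection

high = overlay[overlay["decline"] == "high"]
share = high.geometry.area.sum() / rice.geometry.area.sum() * 100
print(f"Rice area with high fertility decline: {share:.2f}%")
```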
Procedia PDF Downloads 252
775 The Negative Implications of Childhood Obesity and Malnutrition on Cognitive Development
Authors: Stephanie Remedios, Linda Veronica Rios
Abstract:
Background. Pediatric obesity is a serious health problem linked to multiple physical diseases and ailments, including diabetes, heart disease, and joint issues. While research has shown pediatric obesity can bring about an array of physical illnesses, it is less known how such a condition can affect children’s cognitive development. With childhood overweight and obesity prevalence rates on the rise, it is essential to understand the scope of their cognitive consequences. The present review of the literature tested the hypothesis that poor physical health, such as childhood obesity or malnutrition, negatively impacts a child’s cognitive development. Methodology. A systematic review was conducted to determine the relationship between poor physical health and lower cognitive functioning in children ages 4-16. Electronic databases were searched for studies from the past ten years. The following databases were used: Science Direct, FIU Libraries, and Google Scholar. Inclusion criteria consisted of peer-reviewed academic articles written in English from 2012 to 2022 that analyzed the relationship between childhood malnutrition and obesity and cognitive development. A total of 17,000 articles were obtained, of which 16,987 were excluded for not addressing the cognitive implications exclusively. Of the acquired articles, 13 were retained. Results. Research suggested a significant connection between diet and cognitive development. Both diet and physical activity are strongly correlated with higher cognitive functioning. Cognitive domains explored in this work included learning, memory, attention, inhibition, and impulsivity. IQ scores were also considered objective representations of overall cognitive performance. Studies showed physical activity benefits cognitive development, primarily executive functioning and language development. Additionally, children suffering from pediatric obesity or malnutrition were found to score 3-10 points lower in IQ when compared to healthy, same-aged children. Conclusion. This review provides evidence that the presence of physical activity and overall physical health, including appropriate diet and nutritional intake, has beneficial effects on cognitive outcomes. The primary conclusion from this research is that childhood obesity and malnutrition show detrimental effects on cognitive development in children, primarily on learning outcomes. Assuming childhood obesity and malnutrition rates continue their current trend, it is essential to understand the complete physical and psychological implications of obesity and malnutrition in pediatric populations. Given the limitations encountered through our research, further studies are needed to evaluate the areas of cognition affected during childhood.
Keywords: childhood malnutrition, childhood obesity, cognitive development, cognitive functioning
Procedia PDF Downloads 118
774 Cognitive Behaviour Drama: A Research-Based Intervention Model to Improve Social Thinking in High-Functioning Children with Autism
Authors: Haris Karnezi, Kevin Tierney
Abstract:
Cognitive Behaviour Drama is a research-based intervention model that brought together the science of psychology and the art form of drama to create an unobtrusive and exciting approach that provides children on the higher end of the autism spectrum with the motivation to explore the rules of social interaction and develop competencies associated with communicative success. The method involves engaging the participants in exciting fictional scenarios and encouraging them to seek various solutions to a number of problems, which will lead them to an understanding of causal relationships and of how a different course of action may lead to a different outcome. The sessions are structured to offer opportunities for the participants to practice target behaviours and understand the functions they serve. The study involved six separate interventions and employed both single-case and group designs. Overall, 8 children aged 6 to 13 years and diagnosed with ASD participated in the study. Outcomes were measured using theory of mind tests, executive functioning tests, behavioural observations, and pre- and post-intervention standardised social competence questionnaires for parents and teachers. Collectively, the results indicated positive changes in the self-esteem and behaviour of all eight participants. In particular, improvements in the ability to solve theory of mind tasks were noted in the younger group, and qualitative improvements in social communication, in terms of verbal (content) and non-verbal expression (body posture, vocal expression, fluency, eye contact, reduction of ritualistic mannerisms), were noted in the older group. The need for reliable impact measures to assess the effectiveness of the model in generating global changes in the participants’ behaviour outside the therapeutic context was identified.
Keywords: autism, drama, intervention, social skills
Procedia PDF Downloads 160
773 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance. In this way, statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems consider that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skew data is present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skew) have been considered to model a hypothetical slope stability problem. The data modeled is the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Therefore, based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables
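A minimal Monte Carlo sketch of treating the friction angle as a random variable and estimating a failure probability (a simplified cohesionless infinite-slope factor of safety and a normal friction-angle distribution are assumed here, not the paper's own slope model or best-fit distribution):

```python
# Sketch: failure probability of an infinite, cohesionless slope when the friction
# angle phi is random. FS = tan(phi) / tan(beta); failure occurs when FS < 1.
import numpy as np

rng = np.random.default_rng(0)
beta = np.radians(30.0)                                   # assumed slope angle
phi = np.radians(rng.normal(32.0, 3.0, 1_000_000))        # assumed friction-angle samples

fs = np.tan(phi) / np.tan(beta)                           # factor of safety per sample
p_failure = np.mean(fs < 1.0)                             # probability of FS < 1
print(f"Estimated failure probability: {p_failure:.4f}")
```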
Procedia PDF Downloads 338
772 Comparing Deep Architectures for Selecting Optimal Machine Translation
Authors: Despoina Mouratidis, Katia Lida Kermanidis
Abstract:
Machine translation (MT) is a very important task in Natural Language Processing (NLP). MT evaluation is crucial in MT development, as it constitutes the means to assess the success of an MT system, and also helps improve its performance. Several methods have been proposed for the evaluation of MT systems. Some of the most popular ones in automatic MT evaluation are score-based, such as the BLEU score, and others are based on lexical or syntactic similarity between the MT outputs and the reference, involving higher-level information like part-of-speech (POS) tagging. This paper presents a language-independent machine learning framework for classifying pairwise translations. This framework uses vector representations of two machine-produced translations, one from a statistical machine translation model (SMT) and one from a neural machine translation model (NMT). The vector representations consist of automatically extracted word embeddings and string-like language-independent features. These vector representations are used as input to a multi-layer neural network (NN) that models the similarity between each MT output and the reference, as well as between the two MT outputs. To evaluate the proposed approach, a professional translation and a "ground-truth" annotation are used. The parallel corpora used are English-Greek (EN-GR) and English-Italian (EN-IT), in the educational domain and of informal genres (video lecture subtitles, course forum text, etc.) that are difficult to translate reliably. Three basic deep learning (DL) architectures were tested in this schema: (i) fully-connected dense, (ii) Convolutional Neural Network (CNN), and (iii) Long Short-Term Memory (LSTM). Experiments show that all tested architectures achieved better results than some of the well-known basic approaches, such as Random Forest (RF) and Support Vector Machine (SVM). Better accuracy results are obtained when LSTM layers are used in our schema. In terms of balance between the classes, better results are obtained when dense layers are used; the reason for this is that the model correctly classifies more sentences of the minority class (SMT). For a more integrated analysis of the accuracy results, a qualitative linguistic analysis was carried out. In this context, problems have been identified with some figures of speech, such as metaphors, and with certain linguistic phenomena, such as paronyms. It is quite interesting to find out why all the classifiers led to worse accuracy results in Italian as compared to Greek, taking into account that the linguistic features employed are language independent.
Keywords: machine learning, machine translation evaluation, neural network architecture, pairwise classification
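A minimal sketch of the kind of pairwise classifier described above (the feature dimensions, layer sizes, and training settings are placeholders, not the authors' configuration):

```python
# Sketch: decide which of two MT outputs (SMT vs. NMT) is the better translation,
# given concatenated vector representations of [reference, output_1, output_2].
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

dim = 3 * 300                                    # three 300-d sentence vectors (hypothetical)
X = np.random.rand(1000, dim).astype("float32")  # placeholder features
y = np.random.randint(0, 2, size=1000)           # 0 = SMT preferred, 1 = NMT preferred

model = Sequential([
    Dense(256, activation="relu", input_shape=(dim,)),
    Dropout(0.3),
    Dense(64, activation="relu"),
    Dense(1, activation="sigmoid"),              # binary pairwise decision
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```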
Procedia PDF Downloads 132
771 Anaphora and Cataphora on the Selected State of the City Addresses of the Mayor of Dapitan
Authors: Mark Herman Sumagang Potoy
Abstract:
A State of the City Address (SOCA) is a speech, modelled after the State of the Nation Address, given not because it is mandated by law but usually as a matter of practice or tradition, and delivered before the chief executive’s constituents. Through this, the general public is made to know the performance of the local government unit and its agenda for the coming year. Therefore, it is imperative for SOCAs to clearly convey their message and carry out the myriad function of enlightening their readers, which could be achieved through the proper use of reference. Anaphora and cataphora are the two major types of reference; the former refers back to something that has already been mentioned, while the latter points forward to something which is yet to be said. This paper seeks to identify the types of reference employed in the SOCAs from 2014 to 2016 of Hon. Rosalina Garcia Jalosjos, Mayor of Dapitan City, and to look into how the references contribute to the clarity of the message of the text. The qualitative method of research is used in this study through an in-depth analysis of the corpus. As soon as the copies of the SOCAs were secured from the Office of the City Mayor, they were analyzed using the documentary technique: categorizing the types of reference as to anaphora and cataphora, counting each of these types, and describing the implications of the dominant types used in the addresses. After a thorough analysis, it was found that the two reference types, anaphora and cataphora, are both employed in the three SOCAs, the former being used more frequently than the latter, accounting for 80% and 20% of actual usage, respectively. Moreover, the use of anaphora and cataphora in the three addresses helps convey the message clearly because they primarily become aids to avoid repetition of the same element in the text, especially when there was no need to emphasize a point. Finally, it is recommended that writers of State of the City Addresses should have a vast knowledge of how reference should be used and the functions it takes in the text, since this is a vital tool to clearly transmit a message. Moreover, English teachers should explicitly teach the proper usage of anaphora and cataphora, as instruments to develop cohesion in written discourse, to enable students to write not only with sense but also with fluidity in tying utterances together.
Keywords: anaphora, cataphora, reference, State of the City Address
Procedia PDF Downloads 192
770 Transgenerational Impact of Intrauterine Hyperglycaemia to F2 Offspring without Pre-Diabetic Exposure on F1 Male Offspring
Authors: Jun Ren, Zhen-Hua Ming, He-Feng Huang, Jian-Zhong Sheng
Abstract:
Adverse intrauterine stimuli during critical or sensitive periods in early life may lead to health risks not only in the later life span but also in further generations. Intrauterine hyperglycaemia, as a major feature of gestational diabetes mellitus (GDM), is a typical adverse environment for the development of both the F1 fetus and F1 gamete cells. However, there is scarce information on phenotypic differences in metabolic memory between somatic cells and germ cells exposed to intrauterine hyperglycaemia. The direct transmission effect of intrauterine hyperglycaemia per se has not been assessed either. In this study, we built a GDM mouse model and selected male GDM offspring without a pre-diabetic phenotype as our founders, to exclude postnatal diabetic influence on gametes, and thereby investigated the direct transmission effect of intrauterine hyperglycaemia exposure on F2 offspring; we further compared the metabolic differences between affected F1-GDM male offspring and F2 offspring. A GDM mouse model of intrauterine hyperglycemia was established by intraperitoneal injection of streptozotocin after pregnancy. Pups of GDM mothers were fostered by normal control mothers. All the mice were fed standard food. Male GDM offspring without a metabolic dysfunction phenotype were crossed with normal female mice to obtain F2 offspring. Body weight, glucose tolerance test, insulin tolerance test and the homeostasis model of insulin resistance (HOMA-IR) index were measured in both generations at 8 weeks of age. Some of the F1-GDM male mice showed impaired glucose tolerance (p < 0.001); none of the F1-GDM male mice showed impaired insulin sensitivity. Body weight of F1-GDM mice showed no significant difference from control mice. Some of the F2-GDM offspring exhibited impaired glucose tolerance (p < 0.001), and all the F2-GDM offspring exhibited a higher HOMA-IR index (p < 0.01 for normal-glucose-tolerance individuals vs. control; p < 0.05 for glucose-intolerant individuals vs. control). All the F2-GDM offspring exhibited a higher ITT curve than control (p < 0.001 for normal-glucose-tolerance individuals and p < 0.05 for glucose-intolerant individuals, vs. control). F2-GDM offspring had higher body weight than control mice (p < 0.001 for normal-glucose-tolerance individuals and p < 0.001 for glucose-intolerant individuals, vs. control). While glucose intolerance is the only phenotype that F1-GDM male mice may exhibit, the F2 male generation of healthy F1-GDM fathers showed insulin resistance, increased body weight and/or impaired glucose tolerance. These findings imply that intrauterine hyperglycaemia exposure affects germ cells and somatic cells differently; thus, the F1 and F2 offspring demonstrated distinct metabolic dysfunction phenotypes. Intrauterine hyperglycaemia exposure per se has a strong influence on the F2 generation, independent of postnatal metabolic dysfunction exposure.
Keywords: inheritance, insulin resistance, intrauterine hyperglycaemia, offspring
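For reference, the HOMA-IR index reported above is conventionally calculated as follows (a standard formulation, assumed here rather than quoted from the paper):

```latex
\mathrm{HOMA\text{-}IR} = \frac{\text{fasting insulin}\;(\mu\mathrm{U/mL}) \times \text{fasting glucose}\;(\mathrm{mmol/L})}{22.5}
```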
Procedia PDF Downloads 238
769 Evaluation of Pragmatic Information in an English Textbook: Focus on Requests
Authors: Israa A. Qari
Abstract:
Learning to request in a foreign language is a key ability within pragmatics language teaching. This paper examines how requests are taught in English Unlimited Book 3 (Cambridge University Press), an EFL textbook series employed by King Abdulaziz University in Jeddah, Saudi Arabia to teach English to advanced foundation-year students. The focus of analysis is the evaluation of the request linguistic strategies present in the textbook, the frequency of use of these strategies, and the contextual information provided on the use of these linguistic forms. The researcher collected all the linguistic forms realising the request speech act and divided them into levels employing the CCSARP request coding manual. Findings demonstrated that simple and commonly employed request strategies are introduced. Looking closely at the exercises throughout the chapters, it was noticeable that the book exclusively employed the most direct form of requesting (the imperative) when giving learners instructions: e.g., listen, write, ask, answer, read, look, complete, choose, talk, think, etc. The book also made use of some other request strategies, such as ‘hedged performatives’ and ‘query preparatory’. However, it was also found that many strategies were not dealt with in the book, specifically strategies with combined functions (e.g., possibility, ability). On a sociopragmatic level, a strong focus was found to exist on standard situations in which relations between the requester and requestee are clear. In general, contextual information was communicated only implicitly. The textbook did not seem to differentiate between formal and informal request contexts (register), which might consequently impel students to overgeneralize. The paper closes with some recommendations for textbook and curriculum designers. Findings are also contrasted with previous results from a similar body of research on EFL requests.
Keywords: EFL, requests, Saudi, speech acts, textbook evaluation
Procedia PDF Downloads 134
768 Adjustments of Mechanical and Hydraulic Properties of Wood Formed under Environmental Stresses
Authors: B. Niez, B. Moulia, J. Dlouha, E. Badel
Abstract:
Trees adjust their development to the environmental conditions they experience. Storm events of the last decades showed that acclimation of trees to mechanical stresses due to wind is a very important process that allows trees to survive for many years. In the future, trees will experience new wind patterns, namely strong winds more often and fewer daily moderate winds. Moreover, these patterns will go along with drought periods that may interact with the capacity of trees to adjust their growth to mechanical stresses due to wind. It is necessary to understand the mechanisms of wood functional acclimation to environmental conditions in order to predict tree behaviour and to give foresters and breeders the relevant tools to adapt their forest management. This work aims to study how trees adjust the mechanical and hydraulic functions of their wood to environmental stresses and how this acclimation may be beneficial for the tree in resisting future stresses. In this work, young poplars were grown under controlled climatic conditions that included permanent environmental stress (daily mechanical stress of the stem by bending and/or hydric stress). Then, the properties of wood formed under these stressed conditions were characterized. First, hydraulic conductivity and sensitivity to cavitation were measured at the tissue level in order to evaluate the changes in water transport capacity. Secondly, bending tests and Charpy impact tests were carried out at the millimetric scale to locally measure mechanical parameters such as elastic modulus, elastic limit or rupture energy. These experimental data allow evaluating the impacts of mechanical and water stress on the wood material. At the stem level, they will be merged into an integrative model in order to evaluate the beneficial aspect of wood acclimation for trees.
Keywords: acclimation, environmental stresses, hydraulics, mechanics, wood
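As an illustration of how an elastic modulus is typically extracted from such small-scale bending tests, the standard three-point bending relation can be used (given here as background; the abstract does not specify the authors' own test protocol):

```latex
E = \frac{F\,L^{3}}{4\,b\,h^{3}\,\delta}
```

where F is the applied load, L the span between supports, b and h the specimen width and thickness, and δ the mid-span deflection.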
Procedia PDF Downloads 204
767 Stoa: Urban Community-Building Social Experiment through Mixed Reality Game Environment
Authors: Radek Richtr, Petr Pauš
Abstract:
Social media nowadays connects people more tightly and intensively than ever, but simultaneously some sort of social distance, incomprehension, and loss of social integrity appears. People can be strongly connected to a person on the other side of the world but unaware of neighbours in the same district or street. Stoa is an application from the "serious games" genre: a research augmented reality experiment masked as a gaming environment. In the Stoa environment, the player can plant and grow a virtual (organic) structure, a Pillar, that represents the whole suburb. Everybody has their own idea of what is an acceptable, admirable or harmful visual intervention in the area they live in; the purpose of this research experiment is to find and/or define the residents' shared subconscious spirit, the genius loci of the Pillar's vicinity, where the residents live. The appearance and evolution of Stoa's Pillars reflect the real world as perceived not only by the creator but also by other residents/players, who, with their actions, refine the environment. Squares, parks, patios and streets get their living avatar depictions; investors and urban planners obtain information on the occurrence and level of motivation for reshaping the public space. As the project is in the product conceptual design phase, function is one of its most important factors. Function-based modelling makes the design problem modular and structured and thus decomposes it into sub-functions or function cells. The paper discusses the current conceptual model for the Stoa project, the use of different organic structure textures and models, the user interface design, the UX study, and the project's development to its final state.
Keywords: augmented reality, urban computing, interaction design, mixed reality, social engineering
Procedia PDF Downloads 228
766 Simulation of GAG-Analogue Biomimetics for Intervertebral Disc Repair
Authors: Dafna Knani, Sarit S. Sivan
Abstract:
Aggrecan, one of the main components of the intervertebral disc (IVD), belongs to the family of proteoglycans (PGs) that are composed of glycosaminoglycan (GAG) chains covalently attached to a core protein. Its primary function is to maintain tissue hydration, and hence disc height, under the high loads imposed by muscle activity and body weight. Significant PG loss is one of the first indications of disc degeneration. A possible solution to recover disc function is to inject a synthetic hydrogel into the joint cavity, hence mimicking the role of PGs. One class of hydrogels proposed is GAG analogues, based on sulfate-containing polymers, which are responsible for hydration in disc tissue. In the present work, we used molecular dynamics (MD) to study the effect of hydrogel crosslinking (type and degree) on the swelling behavior of the suggested GAG-analogue biomimetics by calculating the cohesive energy density (CED), solubility parameter, enthalpy of mixing (ΔEmix), and the interactions between the molecules in pure form and as a mixture with water. The simulation results showed that hydrophobicity plays an important role in the swelling of the hydrogel, as indicated by the linear correlation observed between solubility parameter values of the copolymers and crosslinker weight ratio (w/w); this correlation was found useful in predicting the amount of PEGDA needed for the desirable hydration behavior of (CS)₄-peptide. Enthalpy of mixing calculations showed that all the GAG analogues, (CS)₄ and (CS)₄-peptide, are water-soluble; radial distribution function analysis revealed that they form interactions with water molecules, which is important for the hydration process. To conclude, our simulation results, beyond supporting the experimental data, can be used as a useful predictive tool in the future development of biomaterials, such as disc replacements.
Keywords: molecular dynamics, proteoglycans, enthalpy of mixing, swelling
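For context, the quantities computed above are typically related through the standard Hildebrand-Scatchard expressions (given here as background, not quoted from the paper):

```latex
\mathrm{CED} = \frac{E_{coh}}{V}, \qquad
\delta = \sqrt{\mathrm{CED}}, \qquad
\frac{\Delta E_{mix}}{V} = \varphi_{1}\,\varphi_{2}\,(\delta_{1}-\delta_{2})^{2}
```

where E_{coh} is the cohesive energy, V the volume, δ the solubility parameter, and φ₁, φ₂ the volume fractions of the mixed components.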
Procedia PDF Downloads 75
765 Beta-Carotene Attenuates Cognitive and Hepatic Impairment in Thioacetamide-Induced Rat Model of Hepatic Encephalopathy via Mitigation of MAPK/NF-κB Signaling Pathway
Authors: Marawan Abd Elbaset Mohamed, Hanan A. Ogaly, Rehab F. Abdel-Rahman, Ahmed-Farid O.A., Marwa S. Khattab, Reham M. Abd-Elsalam
Abstract:
Liver fibrosis is a severe worldwide health concern due to various chronic liver disorders. Hepatic encephalopathy (HE) is one of its most common complications, affecting the liver and the brain's cognitive function. Beta-carotene (B-Car) is an organic, strongly colored red-orange pigment abundant in fungi, plants, and fruits. The study attempted to determine the neuroprotective potential of B-Car against thioacetamide (TAA)-induced neurotoxicity and cognitive decline in HE in rats. Hepatic encephalopathy was induced by TAA (100 mg/kg, i.p.) three times per week for two weeks. B-Car was given orally (10 or 20 mg/kg) daily for two weeks after the TAA injections. Organ-to-body weight ratio, serum transaminase activities, the liver's antioxidant parameters, ammonia, and liver histopathology were assessed. Also measured were the brain's mitogen-activated protein kinase (MAPK), nuclear factor kappa B (NF-κB), antioxidant parameters, adenosine triphosphate (ATP), adenosine monophosphate (AMP), norepinephrine (NE), dopamine (DA), serotonin (5-HT), 5-hydroxyindoleacetic acid (5-HIAA), cAMP response element-binding protein (CREB) expression, and B-cell lymphoma 2 (Bcl-2) expression. The brain's cognitive functions (spontaneous locomotor activity, rotarod performance test, object recognition test) were assessed. B-Car prevented alteration of the brain's cognitive function in a dose-dependent manner. The histopathological outcomes supported this biochemical evidence. Based on these results, it could be established that B-Car could be used to treat the brain's neurotoxic consequences of HE via downregulation of the MAPK/NF-κB signaling pathways.
Keywords: beta-carotene, liver injury, MAPK, NF-κB, rat, thioacetamide
Procedia PDF Downloads 154
764 Transcriptome and Metabolome Analysis of a Tomato Solanum Lycopersicum STAYGREEN1 Null Line Generated Using Clustered Regularly Interspaced Short Palindromic Repeats/Cas9 Technology
Authors: Jin Young Kim, Kwon Kyoo Kang
Abstract:
The SGR1 (STAYGREEN1) protein is a critical regulator of chlorophyll degradation and senescence in plant leaves. The functions and mechanisms of tomato SGR1 action are poorly understood and worthy of further investigation. To investigate the function of the SGR1 gene, we generated an SGR1-knockout (KO) null line via clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9-mediated gene editing and conducted RNA sequencing and gas chromatography tandem mass spectrometry (GC-MS/MS) analysis to identify the differentially expressed genes. The SlSGR1 (Solanum lycopersicum SGR1) knockout null line clearly showed a turbid brown color with significantly higher chlorophyll and carotenoid content compared to wild-type (WT) fruit. Differential gene expression analysis revealed 728 differentially expressed genes (DEGs) between the WT and the sgr1 #1-6 line, including 263 downregulated and 465 upregulated genes, for which the fold change was >2 and the adjusted p-value was <0.05. Most of the DEGs were related to photosynthesis and chloroplast function. In addition to the pigment and carotenoid changes, the sgr1 #1-6 line accumulated key primary metabolites such as sucrose and its derivatives (fructose, galactinol, raffinose), glycolytic intermediates (glucose, G6P, Fru6P) and tricarboxylic acid cycle (TCA) intermediates (malate and fumarate). Taken together, the transcriptome and metabolite profiles of the SGR1-KO lines presented here provide evidence for the mechanisms underlying the effects of SGR1 and the molecular pathways involved in chlorophyll degradation and carotenoid biosynthesis.
Keywords: tomato, CRISPR/Cas9, null line, RNA-sequencing, metabolite profiling
Procedia PDF Downloads 121
763 Breeding Cotton for Annual Growth Habit: Remobilizing End-of-season Perennial Reserves for Increased Yield
Authors: Salman Naveed, Nitant Gandhi, Grant Billings, Zachary Jones, B. Todd Campbell, Michael Jones, Sachin Rustgi
Abstract:
Cotton (Gossypium spp.) is the primary source of natural fiber in the U.S. and a major crop in the Southeastern U.S. Despite constant efforts to increase cotton fiber yield, the yield gain has stagnated. Therefore, we undertook a novel approach to improve cotton fiber yield by altering its growth habit from perennial to annual. In this effort, we identified genotypes with high-expression alleles of five floral induction and meristem identity genes (FT, SOC1, FUL, LFY, and AP1) from an upland cotton mini-core collection and crossed them in various combinations to develop cotton lines with an annual growth habit, optimal flowering time, and enhanced productivity. To facilitate the characterization of genotypes with the desired combinations of stacked alleles, we identified markers associated with the gene expression traits via genome-wide association analysis using a 63K SNP array (Hulse-Kemp et al. 2015 G3 5:1187). Over 14,500 SNPs showed polymorphism and were used for association analysis. A total of 396 markers showed association with expression traits. Of these 396 markers, 159 mapped to genes, 50 to untranslated regions, and 187 to random genomic regions. A biased genomic distribution of the associated markers was observed, with more trait-associated markers mapping to the cotton D sub-genome. Many quantitative trait loci coincided at specific genomic regions. This observation has implications, as these traits could be bred together. The analysis also allowed the identification of candidate regulators of the expression patterns of these floral induction and meristem identity genes, whose functions will be validated via virus-induced gene silencing.
Keywords: cotton, GWAS, QTL, expression traits
Procedia PDF Downloads 151
762 Graphical Theoretical Construction of Discrete time Share Price Paths from Matroid
Authors: Min Wang, Sergey Utev
Abstract:
The lessons from the 2007-09 global financial crisis have driven scientific research, which considers the design of new methodologies and financial models in the global market. A quantum mechanics approach has been introduced for modeling the unpredictable stock market. One famous quantum tool is the Feynman path integral method, which was used to model insurance risk by Tamturk and Utev and was adapted by Hao and Utev to formalize path-dependent option pricing. The research is based on the path-dependent calculation method, which is motivated by the Feynman path integral method. The path calculation can be studied in two ways: one is labeling, and the other is computational. Labeling is a part of the representation of objects, and generating functions can provide many different ways of representing share price paths. In this paper, recent work on the graph-theoretical construction of individual share price paths via matroids is presented. Firstly, the background on matroids is reviewed, the relationship between lattice path matroids and Tutte polynomials is studied, and ways to connect points in a lattice path matroid with Tutte polynomials are suggested. Secondly, it is found that a general binary tree can be validly constructed from a connected lattice path matroid rather than from a general lattice path matroid. Lastly, it is suggested that there is a way to represent share price paths via a general binary tree, and an algorithm is developed to construct share price paths from general binary trees. A relationship is also provided between lattice integer points and Tutte polynomials of a transversal matroid. Using this connection together with the algorithm, a share price path can be constructed from a given connected lattice path matroid.
Keywords: combinatorial construction, graphical representation, matroid, path calculation, share price, Tutte polynomial
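A minimal sketch of the final step described above, generating a discrete-time share price path from a binary tree of up/down moves (a plain binomial-tree walk with assumed factors is used here; the matroid-based construction itself is not reproduced):

```python
# Sketch: follow one root-to-leaf walk of a binary tree, where a left branch
# multiplies the price by an up factor and a right branch by a down factor.
def price_path(s0, moves, up=1.02, down=0.98):
    """moves is a sequence of 0 (up) / 1 (down) choices defining one walk."""
    path = [s0]
    for m in moves:
        path.append(path[-1] * (up if m == 0 else down))
    return path

# Example: a 5-step path for an initial price of 100.
print(price_path(100.0, [0, 1, 0, 0, 1]))
```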
Procedia PDF Downloads 138
761 Parameters Influencing Human Machine Interaction in Hospitals
Authors: Hind Bouami
Abstract:
Handling the complexity of life-critical systems requires appropriate technology and the right human-agent capabilities, such as knowledge, experience, and competence in problem prevention and solving. Human agents are involved in the management and control of human-machine systems' performance. Documenting human agents' situation awareness is crucial to support human-machine designers' decision-making. Knowledge about risks, critical parameters and factors that can impact and threaten automation systems' performance should be collected using preventive and retrospective approaches. This paper aims to document operators' situation awareness through the analysis of automated organizations' feedback. The analysis of automated hospital pharmacies' feedback helps to identify and control critical parameters influencing human-machine interaction in order to enhance system performance and security. Our human-machine system evaluation approach has been deployed in Macon hospital center's pharmacy, which has been equipped with automated drug dispensing systems since 2015. The automation's specifications are related to technical aspects, human-machine interaction, and human aspects. The evaluation of drug delivery automation performance in Macon hospital center has shown that the performance of the automated activity depends on the performance of the chosen automated solution, and also on the control of systemic factors. In fact, 80.95% of the automation specifications related to the chosen Sinteco automated solution are met. The performance of the chosen automated solution is involved in 28.38% of the automation specifications' performance in Macon hospital center; the remaining systemic parameters involved in the automation specifications' performance need to be controlled.
Keywords: life-critical systems, situation awareness, human-machine interaction, decision-making
Procedia PDF Downloads 181
760 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data
Authors: S. Jurado, E. Pazmino
Abstract:
Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, oil extraction, etc. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community, comprising researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions to determine porosity. Subsequently, the algorithm identifies the layer of void voxels next to the solid boundaries. An iterative process removes or 'burns' void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize the execution time and use of computer memory, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis determination was conducted by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and utilized to determine the pore-throat size distribution. Graphical user interface software was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software allows input of HRXMT data to calculate porosity, medial axis, and pore-throat size distribution and provides output in tabular and graphical formats. Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution and porosity determination of 100³, 320³ and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for the academic community in postprocessing HRXMT data.
Keywords: medial axis, pore-throat distribution, porosity, porous media
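A minimal sketch of the layer-by-layer "burn" idea described above (binary erosion of the void space with SciPy; an illustrative reduction, not the authors' optimized implementation):

```python
# Sketch: assign each void voxel a "burn number" equal to the iteration at which it
# is removed when the void space is peeled layer by layer from the solid boundaries.
import numpy as np
from scipy.ndimage import binary_erosion

def burn_numbers(void):
    """void: 3D boolean array, True where the voxel is pore space."""
    burn = np.zeros(void.shape, dtype=np.int32)
    current = void.copy()
    layer = 0
    while current.any():
        layer += 1
        eroded = binary_erosion(current)    # peel one voxel layer off the void space
        burn[current & ~eroded] = layer     # voxels removed at this iteration
        current = eroded
    return burn                             # ridges of high burn number approximate the medial axis

def porosity(void):
    return void.mean()                      # fraction of void voxels in the domain
```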
Procedia PDF Downloads 115
759 Integration of Polarization States and Color Multiplexing through a Singular Metasurface
Authors: Tarik Sipahi
Abstract:
Photonics research continues to push the boundaries of optical science, and the development of metasurface technology has emerged as a transformative force in this domain. The work presents the intricacies of a unified metasurface design tailored for efficient polarization and color control in optical systems. The proposed unified metasurface serves as a singular, nanoengineered optical element capable of simultaneous polarization modulation and color encoding. Leveraging principles from metamaterials and nanophotonics, this design allows for unprecedented control over the behavior of light at the subwavelength scale. The metasurface's spatially varying architecture enables seamless manipulation of both polarization states and color wavelengths, paving the way for a paradigm shift in optical system design. The advantages of this unified metasurface are diverse and impactful. By consolidating functions that traditionally require multiple optical components, the design streamlines optical systems, reducing complexity and enhancing overall efficiency. This approach is particularly promising for applications where compactness, weight considerations, and multifunctionality are crucial. Furthermore, the proposed unified metasurface design not only enhances multifunctionality but also addresses key challenges in optical system design, offering a versatile solution for applications demanding compactness and lightweight structures. The metasurface's capability to simultaneously manipulate polarization and color opens new possibilities in diverse technological fields. The research contributes to the evolution of optical science by showcasing the transformative potential of metasurface technology, emphasizing its role in reshaping the landscape of optical system architectures. This work represents a significant step forward in the ongoing pursuit of pushing the boundaries of photonics, providing a foundation for future innovations in compact and efficient optical devices.
Keywords: metasurface, nanophotonics, optical system design, polarization control
Procedia PDF Downloads 53
758 Integrated Risk Management in The Supply Chain of Essential Medicines in Zambia
Authors: Mario M. J. Musonda
Abstract:
Access to health care is a human right, which includes having timely access to affordable and quality essential medicines at the right place and in sufficient quantity. However, inefficient public sector supply chain management contributes to constant shortages of essential medicines at health facilities. The literature review involved a desktop study of published research studies and reports on risk management, supply chain management of essential medicines, and their integration to increase the efficiency of the latter. The research was conducted on a sample population of offices under the Ministry of Health Headquarters, Lusaka Provincial and District Offices, selected health facilities in Lusaka, Medical Stores Limited, the Zambia Medicines Regulatory Authority, and Cooperating Partners. Individuals involved in the study were selected judgmentally according to their functions in the selection and quantification, regulation, procurement, storage, distribution, quality assurance, and dispensing of essential medicines. Structured interviews and discussions were held with selected experts, and self-administered questionnaires were distributed; 35 of the 50 distributed questionnaires were returned and usable, and their data were collected and analysed. The highest prioritised risks were inadequate and inconsistent fund disbursements, weak information management systems, weak quality management systems, and insufficient resources (HR and infrastructure), among others. The results of this research can be used to increase the efficiency of the public sector supply chain of essential medicines and other pharmaceuticals. The study showed that there is a need for participating institutions and organisations to implement effective risk management systems to increase the efficiency of the entire supply chain in order to avoid and/or reduce shortages of essential medicines at health facilities.
Keywords: essential medicine, risk assessment, risk management, supply chain, supply chain risk management
Procedia PDF Downloads 443
757 Literature Review on the Controversies and Changes in the Insanity Defense since the Wild Beast Standard in 1723 until the Federal Insanity Defense Reform Act of 1984
Authors: Jane E. Hill
Abstract:
Many variables led to changes in the insanity defense from the Wild Beast Standard of 1723 to the Federal Insanity Defense Reform Act of 1984. The insanity defense is used in criminal trials to argue that the defendant is ‘not guilty by reason of insanity’ because the individual was unable to distinguish right from wrong at the time they broke the law. The issue of whether or not to use the insanity defense in the criminal court depends on the mental state of the defendant at the time the criminal act was committed. This leads to the question: did the defendant know right from wrong when they broke the law? In 1723, the Wild Beast Test stated that to be exempted from punishment, the individual must be totally deprived of their understanding and memory and not know what they are doing. The Wild Beast Test became the standard in England for over seventy-five years. In 1800, James Hadfield attempted to assassinate King George III. He made the attempt only because he was suffering from delusional beliefs. The jury and the judge gave a verdict of not guilty. However, to legally confine him, the Criminal Lunatics Act was enacted. Individuals who were deemed ‘criminal lunatics’ and were given a verdict of not guilty would be taken into custody and not be freed into society. In 1843, the M'Naghten test required that the individual did not know the quality or the wrongfulness of the offense at the time they committed the criminal act(s). Daniel M'Naghten was acquitted on grounds of insanity. The M'Naghten Test is still a modern concept of the insanity defense used in many courts today. The Irresistible Impulse Test was enacted in the United States in 1887. The Irresistible Impulse Test suggested that offenders who could not control their behavior while they were committing a criminal act were not deterrable by the criminal sanctions in place; therefore, no purpose would be served by convicting the offender. Due to the criticisms of the latter two contentions, the federal District of Columbia Court of Appeals ruled in 1954 to adopt the ‘product test’ by Sir Isaac Ray for insanity. The Durham Rule, also known as the ‘product test’, stated that an individual is not criminally responsible if the unlawful act was the product of mental disease or defect. Therefore, the two questions that need to be asked and answered are (1) did the individual have a mental disease or defect at the time they broke the law? and (2) was the criminal act the product of their disease or defect? The Durham courts failed to clearly define ‘mental disease’ or ‘product.’ Therefore, trial courts had difficulty defining the meaning of the terms, and the controversy continued until 1972, when the Durham rule was overturned in most places. Therefore, the American Law Institute combined the M'Naghten test with the irresistible impulse test, and the United States Congress adopted an insanity test for the federal courts in 1984.
Keywords: insanity defense, psychology law, The Federal Insanity Defense Reform Act of 1984, The Wild Beast Standard in 1723
Procedia PDF Downloads 143756 Data Refinement Enhances The Accuracy of Short-Term Traffic Latency Prediction
Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong
Abstract:
Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost, which yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than the one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications for predicting the full set of latencies among the various locations in the freeway system. Keywords: data refinement, machine learning, mutual information, short-term latency prediction
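The following is a minimal sketch, not the authors' code, of the workflow described in conclusions (1)–(3): training an XGBoost regressor on lagged median latencies and segment conditions, comparing it against the "median latency 15 minutes ago" baseline, and ranking inputs by mutual information. The feature names and the synthetic data are illustrative assumptions only.

```python
# Sketch: median-latency prediction vs. a lagged-median baseline, plus
# mutual-information ranking of the inputs (synthetic data, illustrative only).
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_regression
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
n = 2000  # number of 5-minute intervals

# Synthetic stand-ins for the processed inputs described in the abstract.
X = pd.DataFrame({
    "median_latency_lag_5min": rng.gamma(9.0, 60.0, n),
    "median_latency_lag_15min": rng.gamma(9.0, 60.0, n),
    "total_accumulation": rng.poisson(800, n).astype(float),
    "entrance_rate": rng.poisson(40, n).astype(float),
    "exit_rate": rng.poisson(40, n).astype(float),
})
# Target: median latency of the next 5-minute interval (synthetic relation).
y = 0.7 * X["median_latency_lag_5min"] + 0.2 * X["total_accumulation"] \
    + rng.normal(0.0, 30.0, n)

split = int(0.8 * n)
X_train, X_test = X.iloc[:split], X.iloc[split:]
y_train, y_test = y.iloc[:split], y.iloc[split:]

# Baseline: simply reuse the median latency observed 15 minutes ago.
baseline_mse = mean_squared_error(y_test, X_test["median_latency_lag_15min"])

model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_train, y_train)
model_mse = mean_squared_error(y_test, model.predict(X_test))
print(f"baseline MSE: {baseline_mse:.1f}  XGBoost MSE: {model_mse:.1f}")

# Rank inputs by mutual information with the target, as in conclusion (3).
mi = mutual_info_regression(X_train, y_train, random_state=0)
print(pd.Series(mi, index=X.columns).sort_values(ascending=False))
```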
Procedia PDF Downloads 169755 Enabling Translanguaging in the EFL Classroom, Affordances of Learning and Reflections
Authors: Nada Alghali
Abstract:
Translanguaging pedagogy suggests a new perspective in language education relating to multilingualism: multilingual learners have one linguistic repertoire rather than two or more separate language systems (García and Wei, 2014). When learners translanguage, they are able to draw on all their language features in a flexible and integrated way (Otheguy, García, & Reid, 2015). In the foreign language classroom, however, using the target language only is still advocated as a pedagogy. This study attempts to enable learners in the English as a foreign language classroom to draw on their full linguistic repertoire through collaborative reading lessons. In observations prior to this study, in a classroom where an English-only policy prevails, learners still used their first language in group discussions yet were at times constrained by the teacher’s language policies. By strategically enabling translanguaging in reading lessons (Celic and Seltzer, 2011), this study has revealed that learners showed creative ways of using language for learning and reflected positively on this experience. This case study involved two groups in two different proficiency-level classrooms learning English as a foreign language in their first year at university in Saudi Arabia. Learners in the two groups were observed over six weeks and were asked to reflect on their learning every week. The same learners were also interviewed at the end of the translanguaging weeks after completing a modified model of the learning reflection (Ash and Clayton, 2009). This study positions translanguaging as collaborative and agentive within a sociocultural framework of learning, treating translanguaging as a resource for learning as well as a process of learning. Translanguaging learning episodes are elicited from classroom observations, artefacts, interviews, reflections, and focus groups, and are analysed qualitatively following sociocultural discourse analysis (Fairclough & Wodak, 1997; Mercer, 2004). Initial outcomes suggest functions of translanguaging in collaborative reading tasks and recommendations for a collaborative translanguaging pedagogy approach in the EFL classroom. Keywords: translanguaging, EFL, sociocultural theory, discourse analysis
Procedia PDF Downloads 180754 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years provided timely diagnosis, detection, and prediction, which reduces the need for risky invasive surgery among the treatment options and increases the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) for image enhancement gives the best results. Lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid, and eccentricity) are computed for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used for determining the patient condition as normal or abnormal, while an Artificial Neural Network (ANN) is used for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology shows encouraging results for real-time information and online detection in future research. Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
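A minimal sketch of the described pipeline is given below, assuming synthetic stand-in images rather than the authors' CT dataset: Gaussian and median filtering, histogram equalization, GLCM texture measures, single-level DWT features, and a KNN normal/abnormal classifier (the ANN staging step is omitted). The helper names and parameter choices are illustrative, not the authors' implementation.

```python
# Sketch of the preprocessing + feature extraction + KNN stage, using
# synthetic placeholder images instead of segmented CT slices.
import numpy as np
import pywt
from skimage.filters import gaussian, median
from skimage.exposure import equalize_hist
from skimage.util import img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

def preprocess(img):
    """Smooth, denoise, and enhance a grayscale slice (values in [0, 1])."""
    img = gaussian(img, sigma=1.0)      # Gaussian smoothing
    img = median(img_as_ubyte(img))     # median filter for noise removal
    return equalize_hist(img)           # histogram equalization

def extract_features(img):
    """GLCM texture properties plus mean absolute DWT sub-band coefficients."""
    u8 = img_as_ubyte(img)
    glcm = graycomatrix(u8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = [graycoprops(glcm, p)[0, 0]
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    cA, (cH, cV, cD) = pywt.dwt2(img, "haar")   # single-level 2D DWT
    wavelet = [float(np.mean(np.abs(band))) for band in (cA, cH, cV, cD)]
    return np.array(texture + wavelet)

# Synthetic stand-in data: 40 random 64x64 "slices", half labelled abnormal.
rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))
labels = np.array([0] * 20 + [1] * 20)   # 0 = normal, 1 = abnormal

X = np.stack([extract_features(preprocess(im)) for im in images])
knn = KNeighborsClassifier(n_neighbors=3).fit(X[:30], labels[:30])
print("predicted:", knn.predict(X[30:]))
```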
Procedia PDF Downloads 153753 Volcanoscape Space Configuration Zoning Based on Disaster Mitigation by Utilizing GIS Platform in Mt. Krakatau Indonesia
Authors: Vega Erdiana Dwi Fransiska, Abyan Rai Fauzan Machmudin
Abstract:
Space configuration zoning is the very first stage of complete space configuration and regional planning. Zoning aims to define discrete knowledge based on local wisdom: ancient predecessors studied the signs of natural disasters, and this knowledge is operationalised here scientifically through an ethnographic approach. There are three main functions of space zoning: the control function, the guidance function, and the additional function. The control function refers to an instrument for development control and is one of the essentials in controlling land use. The guidance function serves as guidance for proposing operational planning and technical development or land usage. The additional function is useful as a supplement to regional or provincial planning details. This phase likewise serves to define boundaries in open space based on geographical appearance. The informants, categorized as elders, live in an earthquake-prone area, specifically the area surrounding Mount Krakatau. The collected data are analysed with a thematic model and later verified. In space zoning, long-range distance sensing is applied to visualize the area to be zoned before the survey step that validates the data. The data obtained from long-range distance sensing and the site survey are overlaid using a GIS platform. When compared with the local wisdom well known by the elders in the area, some of this knowledge is relevant to the research, while some is not. The site survey, the interpretation of long-range distance sensing, and the determination of space zoning by considering various aspects resulted in a space zoning pattern map. This map can be integrated with disaster mitigation for areas affected by volcanic eruption. Keywords: elderly, GIS platform, local wisdom, space zoning
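The overlay step could look like the following minimal sketch on a GIS platform using GeoPandas; the file names, layer contents, and attribute handling are hypothetical and only illustrate how remotely sensed zones and survey-derived zones might be intersected to produce the zoning pattern map.

```python
# Hypothetical sketch: overlaying remotely sensed zones with survey-derived
# zones to produce a space zoning pattern map (file names are placeholders).
import geopandas as gpd

# Hazard-prone zones digitised from the remote-sensing interpretation.
sensed_zones = gpd.read_file("krakatau_sensed_zones.shp")
# Zones mapped during the site survey and interviews with elders.
survey_zones = gpd.read_file("krakatau_survey_zones.shp")

# Reproject to the CRS of the sensed layer before overlaying.
survey_zones = survey_zones.to_crs(sensed_zones.crs)

# Intersection keeps areas covered by both sources; attributes from both
# layers are retained so each polygon can be assigned a control, guidance,
# or additional function in the zoning plan.
zoning = gpd.overlay(sensed_zones, survey_zones, how="intersection")
zoning.to_file("space_zoning_pattern_map.shp")
print(zoning.head())
```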
Procedia PDF Downloads 255