Search results for: spatial poetic text
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3727

1627 Syntax-Related Problems of Translation

Authors: Anna Kesoyan

Abstract:

The present paper deals with the syntax-related problems of translation from English into Armenian. Although syntax is a part of grammar, syntax-related problems are studied separately in the process of translation. Translation from one language to another is widely accepted as a challenging problem, and it becomes even more challenging when the source and target languages differ widely in structure and style, as is the case with English and Armenian. Syntax-related problems of translation from English into Armenian are mainly connected with the syntactical structures of these languages and, particularly, with the word order of the sentence. The word order of Armenian, a synthetic language, is usually characterized as “rather free”, while the word order of English, an analytical language, is characterized as “fixed”. The present research examines the main means of translation, particularly the syntactical transformations the translator has to perform in order to solve certain syntax-related problems. Most of these means are based on the transformation of grammatical components of the sentence without changing the main information of the text. Several transformations occur during translation, such as changes in the word order of the sentence and transformations of certain grammatical constructions, including the infinitive participial construction, the nominative with the infinitive, and elliptical constructions, all of which are covered in this research.

Keywords: elliptical constructions, nominative with the infinitive constructions, fixed and free word order, syntactic structures

Procedia PDF Downloads 447
1626 Short Answer Grading Using Multi-Context Features

Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan

Abstract:

Automatic short answer grading is one of the prime applications of artificial intelligence in education. Several approaches have been explored over the years, involving selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc. However, keeping in mind the real-world application of the task, these solutions carry a computational and resource overhead in achieving high performance. In this work, a simple and effective solution is proposed, making use of elemental features based on statistical and linguistic properties and word-based similarity measures, in conjunction with tree-based classifiers and regressors. The results for classification tasks show improvements ranging from 1% to 30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work reaffirms that classical natural language processing techniques and simple machine learning models can achieve high results for short answer grading.
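The word-based similarity measures mentioned in the abstract can be sketched in a few lines. The feature set below (Jaccard overlap, containment, length ratio) is an illustrative assumption, not the authors' exact feature list:

```python
import re

def tokenize(text):
    """Lowercase a text and split it into word tokens."""
    return re.findall(r"[a-z0-9']+", text.lower())

def similarity_features(reference, answer):
    """Elemental word-overlap features for grading a student answer
    against a reference answer (an illustrative feature set only)."""
    ref, ans = set(tokenize(reference)), set(tokenize(answer))
    union, overlap = ref | ans, ref & ans
    return {
        "jaccard": len(overlap) / len(union) if union else 0.0,
        "containment": len(overlap) / len(ref) if ref else 0.0,
        "length_ratio": len(tokenize(answer)) / max(len(tokenize(reference)), 1),
    }

feats = similarity_features(
    "Photosynthesis converts light energy into chemical energy",
    "Plants use photosynthesis to turn light energy into chemical energy")
```

Feature dictionaries of this kind would then be fed, alongside statistical and linguistic properties, to tree-based classifiers or regressors.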

Keywords: artificial intelligence, intelligent systems, natural language processing, text mining

Procedia PDF Downloads 128
1625 A Proposal to Integrate Spatially Explicit Ecosystem Services with Urban Metabolic Modelling

Authors: Thomas Elliot, Javier Babi Almenar, Benedetto Rugani

Abstract:

The integration of urban metabolism (UM) with spatially explicit ecosystem service (ES) stocks has the potential to advance sustainable urban development. It will correct the lack of spatial specificity of current urban metabolism models. Furthermore, it will include in UM not only the physical properties of material and energy stocks and flows, but also the implications for the natural capital that provides and maintains human well-being. This paper presents the first stages of a modelling framework by which urban planners can spatially assess the trade-offs of ES flows resulting from urban interventions of different character and scale. The framework allows for a multi-region assessment that takes into account sustainability burdens arising when an urban planning event occurs elsewhere in the environment. The urban boundary is defined using the Functional Urban Area (FUA) method to account for trans-administrative ES flows. ES are mapped using CORINE land cover within the FUA. These stocks and flows are incorporated into a UM assessment method to demonstrate the transfer and flux of ES arising from different urban planning implementations.

Keywords: ecological economics, ecosystem services, spatial planning, urban metabolism

Procedia PDF Downloads 325
1624 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share each pixel’s spectrum in different proportions, called “abundances”. Unmixing the data cube is an important task for identifying the endmembers present in the cube and for the analysis of these images. Unsupervised unmixing is carried out with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as the dictionary. The optimization problem is solved using a proximal method, iterative thresholding. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
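The iterative-thresholding proximal method named in the abstract can be sketched as follows; the tiny dictionary, observation, and parameters are illustrative assumptions, not data from the paper:

```python
def soft_threshold(v, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero."""
    return [max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]

def basis_pursuit_ista(A, b, lam, step, iters=500):
    """Iterative soft-thresholding for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = Ax - b, then gradient of the smooth term, A^T r.
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        grad = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = soft_threshold([x[j] - step * grad[j] for j in range(n)], step * lam)
    return x

# Toy dictionary with a sparse ground truth [1, 0]: the l1 penalty drives
# the spurious second coefficient (abundance) to zero.
x_hat = basis_pursuit_ista(A=[[1.0, 0.5], [0.0, 1.0]], b=[1.0, 0.0],
                           lam=0.05, step=0.5)
```

The step size must stay below the reciprocal of the largest eigenvalue of AᵀA for the iteration to converge.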

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 192
1623 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of spatiotemporal models based on multivariate time series analysis, using the open-source software R. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) for estimation and the Mean Absolute Percentage Error (MAPE) to assess how well the model fits real spatiotemporal phenomena. As case studies, we use oil production data from a volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly; it makes calculation easier, and data processing is accurate and fast. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. Therefore, the R program under Windows can be developed further, both for theoretical studies and for applications.
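The two building blocks of the estimation procedure, an OLS fit and the MAPE criterion, can be sketched language-neutrally (the authors' package is in R; the data below are made up for illustration):

```python
def mape(actual, predicted):
    """Mean absolute percentage error (MAPE), in percent."""
    return 100.0 / len(actual) * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted))

def ols_fit(x, y):
    """Ordinary least squares fit of y = b0 + b1*x via normal equations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    return my - b1 * mx, b1

# Hypothetical series: fit a linear trend, then score predictions with MAPE.
b0, b1 = ols_fit([1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1])
err = mape([100.0, 200.0], [110.0, 190.0])
```

In the GSTAR setting the same OLS machinery is applied to a design matrix of lagged and spatially weighted observations rather than a single regressor.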

Keywords: GSTAR model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 237
1622 Researching and Interpreting Art: Analyzing Whose Voice Matters

Authors: Donna L. Roberts

Abstract:

Beyond the fundamental question of what is (and what isn’t) art, one then moves to the question of what about art, or a specific artwork, matters. If there is agreement that something is art, the next step is to answer the obvious: ‘So what? What does it mean?’ In answering these questions, one must decide how to focus the proverbial microscope, i.e., what level of perspective is relevant as a point of view for the analysis: the artwork itself, the artist’s intention, the viewer’s interpretation, the artwork’s reflection of a larger artistic movement, or the social, political, and historical context of the art? One must determine what product and what contexts are meaningful when experiencing and interpreting art. Is beauty really in the eye of the beholder? Or is what the creator was trying to say more important than what the critic or observer heard? The fact that so many artists, from Rembrandt to Van Gogh to Picasso, include among their works at least one self-portrait seems to scream their point: I matter. But is a piece more impactful because of the persona behind it? Or does that persona impose limits and close one’s mind to the possibilities of interpretation? In the popular art text Visual Culture, Richard Howells argues against a biographical focus on the artist in the analysis of art. Similarly, abstract expressionist Mark Rothko, along with several of his contemporaries of the genre, often did not title his paintings for the express purpose of not imposing a specific meaning or interpretation on the piece. And yet, he once said, ‘The people who weep before my pictures are having the same religious experience I had when I painted them,’ thus alluding to a desire for a shared connection and revelation. This research analyzes the arguments for differing levels of interpretation and points of view when considering a work of art and/or the artist who created it.

Keywords: art analysis, art interpretation, art theory, artistic perspective

Procedia PDF Downloads 144
1621 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the Internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have made it easy to access and share geospatial data over the Internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, multi-user access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for fast sharing and distribution of the data over the Internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this Web-based geodatabase has been validated with two desktop GIS software packages and a web map application, and it is shown that the contribution has all the desired modules to expedite further research in the area as per the requirements.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 333
1620 Regional Hydrological Extremes Frequency Analysis Based on Statistical and Hydrological Models

Authors: Hadush Kidane Meresa

Abstract:

Hydrological extremes frequency analysis is the foundation for hydraulic engineering design, flood protection, drought management, and water resources management and planning, enabling the available water resources to be used to meet the objectives of different organizations and sectors in a country. The spatial variation of the statistical characteristics of extreme flood and drought events is a key input for regional flood and drought analysis and mitigation management. For regions with differing hydro-climates, where the data sets are short, scarce, of poor quality, or insufficient, regionalization methods are applied to transfer at-site data to a region. This study aims at regional high- and low-flow frequency analysis for river basins in Poland. The frequent occurrence of hydrological extremes in the region and rapid water resources development in these basins have caused serious concern over the flood and drought magnitudes and frequencies of the rivers in Poland. The magnitudes and frequencies of high and low flows in the basins are needed for flood and drought planning, management, and protection, both now and in the future. Hydrologically homogeneous high- and low-flow regions are formed by cluster analysis of site characteristics, using hierarchical and C-means clustering and PCA. Statistical tests for regional homogeneity are applied, using discordancy and heterogeneity measures. In accordance with the test results, the river basins have been divided into ten homogeneous regions. In this study, frequency analysis of high and low flows is conducted using six statistical distributions, applied to the annual maximum (AM) series for high flows and the 7-day minimum series for low flows.
The L-moment and LL-moment methods showed a homogeneous region over the entire province, with the Generalized Logistic (GLOG), Generalized Extreme Value (GEV), Pearson Type III (P-III), Generalized Pareto (GPAR), Weibull (WEI), and Power (PR) distributions as the regional drought and flood frequency distributions. The 95th percentile and the flow duration curves of 1, 7, 10, and 30 days have been plotted for 10 stations. However, the cluster analysis produced two regions, in the west and east of the province, for which the L-moment and LL-moment methods demonstrated homogeneity, with the GLOG and P-III distributions as the regional frequency distributions for each region, respectively. The spatial variation and regional frequency distributions of flood and drought characteristics were derived for the 10 best catchments selected from the whole region; besides the main variable (streamflow: high and low), we used variables more closely related to physiographic and drainage characteristics to identify and delineate homogeneous pools and to derive the best regression models for ungauged sites. These are mean annual rainfall, seasonal flow, average slope, NDVI, aspect, flow length, flow direction, maximum soil moisture, elevation, and drainage order. The regional high-flow or low-flow relationship between one streamflow characteristic (AM or 7-day mean annual low flows) and selected basin characteristics is developed using the Generalized Linear Mixed Model (GLMM) and Generalized Least Squares (GLS) regression, providing a simple and effective method for estimating floods and droughts of desired return periods for ungauged catchments.
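The sample L-moments underlying the homogeneity and distribution-fitting steps can be computed from probability-weighted moments; a minimal sketch, with hypothetical annual-maximum flows standing in for real station data:

```python
def sample_l_moments(flows):
    """First two sample L-moments (l1, l2) and the L-skewness ratio t3,
    computed from probability-weighted moments b0, b1, b2 of the
    ordered sample."""
    x = sorted(flows)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i - 1) / (n - 1) * x[i - 1] for i in range(1, n + 1)) / n
    b2 = sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x[i - 1]
             for i in range(1, n + 1)) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

# Hypothetical annual-maximum flows (m^3/s) for one gauging station.
l1, l2, t3 = sample_l_moments([12.0, 15.0, 11.0, 20.0, 18.0, 14.0])
```

Regional analysis then pools the at-site ratios (t3 and higher) across a candidate region to test homogeneity and select a regional distribution.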

Keywords: flood, drought, frequency, magnitude, regionalization, stochastic, ungauged, Poland

Procedia PDF Downloads 597
1619 Rethinking Peace Journalism in Pakistan: A Critical Analysis of News Discourse on the Afghan Refugee Repatriation Conflict

Authors: Ayesha Hasan

Abstract:

This study offers unique perspectives on and analyses of peace and conflict journalism through interpretative repertoire, media frame, and critical discourse analyses. Two major English publications in Pakistan, representing both long- and short-form journalism, are investigated to uncover how the Afghan refugee repatriation from Pakistan in 2016-17 was framed in the Pakistani English media. Peace journalism focuses on concepts such as peace initiatives and peace building, finding common ground, and preventing further conflict. This study applies Jake Lynch’s coding criteria to guide the critical discourse analysis and Lee and Maslog’s Peace Journalism Quotient to examine the extent of peace journalism in each text. The study finds that peace journalism is largely missing from the Pakistani English press but is represented, to an extent, in long-form print and online coverage. Two new alternative frames are also proposed. This study gives an in-depth understanding of whether and how journalists in Pakistan are covering conflicts and framing stories that can be identified as peace journalism. It makes significant contributions to the remarkably limited scholarship on peace and conflict journalism in Pakistan and extends Shabbir Hussain’s work on critical pragmatic perspectives on peace journalism in Pakistan.

Keywords: Afghan refugee repatriation, critical discourse analysis, media framing, peace and conflict journalism

Procedia PDF Downloads 199
1618 The Use of X-Ray Computed Microtomography in Petroleum Geology: A Case Study of Unconventional Reservoir Rocks in Poland

Authors: Tomasz Wejrzanowski, Łukasz Kaczmarek, Michał Maksimczuk

Abstract:

High-resolution X-ray computed microtomography (µCT) is a non-destructive technique commonly used to determine the internal structure of reservoir rock samples. This study concerns µCT analysis of Silurian and Ordovician shales and mudstones from a borehole in the Baltic Basin, in northern Poland. The spatial resolution of the µCT images obtained was 27 µm, which enabled the authors to create accurate 3-D visualizations and to calculate the ratio of pore and fracture volume to total sample volume. A total of 1024 µCT slices were used to create a 3-D volume of the sample structure geometry. These µCT slices were processed to obtain a clearly visible image and the volume ratio. A copper X-ray source filter was used to reduce image artifacts. Owing to accurate technical settings of the µCT, it was possible to obtain high-resolution 3-D µCT images of samples with low X-ray transparency. The presented results confirm the utility of µCT implementations in geoscience and show that µCT holds promising applications for reservoir exploration and characterization.
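The pore-to-total volume ratio described above reduces to counting voxels below a grey-value threshold in the reconstructed slice stack; a minimal sketch on a tiny synthetic volume (threshold and intensities are illustrative, not the study's calibration):

```python
def porosity_ratio(volume, threshold):
    """Fraction of voxels classified as pore/fracture space: voxels
    whose grey value falls below `threshold` count as pores."""
    pores = total = 0
    for ct_slice in volume:        # stack of 2-D uCT slices
        for row in ct_slice:
            for grey in row:
                total += 1
                pores += grey < threshold
    return pores / total

# Tiny synthetic 2x2x2 volume; low grey values represent pores.
vol = [[[10, 200], [250, 30]], [[220, 240], [15, 210]]]
ratio = porosity_ratio(vol, threshold=100)
```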

Keywords: fractures, material density, pores, structure

Procedia PDF Downloads 252
1617 Images Selection and Best Descriptor Combination for Multi-Shot Person Re-Identification

Authors: Yousra Hadj Hassen, Walid Ayedi, Tarek Ouni, Mohamed Jallouli

Abstract:

To re-identify a person is to check whether he or she has already been seen over a camera network. Recently, re-identifying people over large public camera networks has become a crucial task of great importance for ensuring public security. The vision community has investigated this area of research in depth. Most existing research relies only on spatial appearance information from either one or multiple person images. In practice, the real person re-identification setting is a multi-shot scenario. However, efficiently modeling a person’s appearance and choosing the best samples remain challenging problems. In this work, an extensive comparison of state-of-the-art descriptors combined with the proposed frame selection method is presented. Specifically, we evaluate the sample selection approach using multiple proposed descriptors. We show the effectiveness and advantages of the proposed method through extensive comparisons with related state-of-the-art approaches on two standard datasets, PRID2011 and iLIDS-VID.
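One common way to score a multi-shot match, consistent with the descriptor-combination idea above though not necessarily the authors' exact scheme, is to take a per-descriptor minimum frame-pair distance and combine descriptors with a weighted sum:

```python
def multi_shot_distance(probe_frames, gallery_frames, weights):
    """Tracklet-to-tracklet match cost: for each descriptor take the
    minimum frame-pair Euclidean distance, then combine descriptors
    with a weighted sum (an illustrative combination scheme)."""
    def euclid(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sum(w * min(euclid(p[name], g[name])
                       for p in probe_frames for g in gallery_frames)
               for name, w in weights.items())

# Two-frame probe tracklet with hypothetical "color"/"texture" descriptors.
probe = [{"color": [1.0, 0.0], "texture": [0.0, 1.0]},
         {"color": [0.9, 0.1], "texture": [0.0, 0.8]}]
gallery = [{"color": [1.0, 0.0], "texture": [0.0, 0.0]}]
cost = multi_shot_distance(probe, gallery, {"color": 0.5, "texture": 0.5})
```

Ranking gallery identities by this cost yields the re-identification result; frame selection amounts to choosing which frames enter the tracklets before scoring.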

Keywords: camera network, descriptor, model, multi-shot, person re-identification, selection

Procedia PDF Downloads 272
1616 The Beat of 'Desolation Row', 50 Years on

Authors: May Ziade

Abstract:

Postgraduate studies in English language and literature at the University of Sydney provided the opportunity for research into one of the most significant singer-songwriters of our time, Bob Dylan, and his masterpiece from the mid-1960s, ‘Desolation Row’. With a title alluding to Jack Kerouac’s Desolation Angels as well as John Steinbeck’s Cannery Row, ‘Desolation Row’ is Bob Dylan’s magnum opus. Recorded on August 4, 1965, it takes pride of place as the last track on the revolutionary 1965 album of rock poetics, Highway 61 Revisited. From its inception, its epic proportions (ten long verses) and rich, baffling imagery got our attention: it amused, fascinated, and beguiled. The song’s surreal, dreamlike landscape and its cast of characters, drawn from history, fiction, mythology, theology, and popular culture, lured us in and begged interpretation. What were they doing there? Where is Desolation Row? Do they want to escape from or go to ‘Desolation Row’? What was Dylan writing about, and what were his influences? Through literary analysis and historical research, this paper examines the song’s lyrics, the mid-60s context, and Dylan’s vast influences to make sense of the song, offer explanations, and draw connections. In particular, the research findings place the Beat poets and their oeuvre as a significant literary influence, but it is a rich, multilayered text that straddles traditions and emerges as a paradox, one that has endured and endeared itself to many. As the song turns 50 this year, what better way to acknowledge this momentous occasion than at an international English language conference?

Keywords: analysis, Bob Dylan, Beat context, Desolation Row

Procedia PDF Downloads 495
1615 Integrating GIS–SCADA Power Systems with an Enclosure Air Dispersion Model

Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey

Abstract:

This paper explores an integration model between a GIS–SCADA system and an enclosure quantification model to assess the impact of fail-safe events. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time to environmental incident variations. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC/JDBC during the main stages of the GIS–SCADA connection. The function of the geographic information system is to manage power distribution in response to developing issues. In other words, GIS–SCADA systems integration will require numerical process objects to enable calibration and estimation of the system model, determination of past events for analysis, and prediction of emergency situations for response training.

Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system

Procedia PDF Downloads 360
1614 Twitter Sentiment Analysis during the Lockdown in New Zealand

Authors: Smah Almotiri

Abstract:

Sentiment analysis is one of the most common fields of natural language processing (NLP). The feeling conveyed in a text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analysis studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. Processing such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to look at how sentiment differed in a single geographic region during the lockdown at two different times. 1,162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19): the first sample of tweets was from March 23, 2020, until April 23, 2020, and the second sample, for the following year, was from March 1, 2021, until April 4, 2021. Natural language processing (NLP), a form of artificial intelligence, was used in this research to calculate the sentiment value of all the tweets using the AFINN lexicon sentiment analysis method. The findings revealed that sentiment at both times during the region’s lockdown was positive in the samples of this study, which are specific to the geographical area of New Zealand. This research suggests applying machine learning sentiment methods such as CrystalFeel and extending the sample size by collecting tweets over a longer period of time.
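AFINN-style scoring reduces to summing word valences over a tweet's tokens; a minimal sketch with a tiny hand-picked subset of lexicon-style entries (the real AFINN lexicon assigns integer valences from -5 to +5 to thousands of English words):

```python
import re

# A few illustrative AFINN-style valences (not the official entries).
AFINN_SAMPLE = {"good": 3, "great": 3, "safe": 1,
                "bad": -3, "sad": -2, "scared": -2}

def sentiment_score(tweet):
    """Sum of lexicon valences over the tweet's tokens; a positive
    total is read as positive sentiment."""
    tokens = re.findall(r"[a-z]+", tweet.lower())
    return sum(AFINN_SAMPLE.get(t, 0) for t in tokens)

score = sentiment_score("Lockdown is hard but we feel safe and good #covid19")
```

Averaging such scores over each sample of tweets gives the per-period sentiment comparison the study describes.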

Keywords: sentiment analysis, Twitter analysis, lockdown, COVID-19, AFINN, Node.js

Procedia PDF Downloads 185
1613 Critical Thinking and Academic Writing: A Case Study

Authors: Mubina Rauf

Abstract:

Critical thinking is a highly valued outcome of university education. There is agreement in the literature that it is demonstrated through the abilities to highlight issues and assumptions, find links between ideas and concepts, make correct inferences, evaluate evidence or authority, and deduce conclusions (Tsui, 2002). Although critical thinking plays a significant role in developing all academic skills, its role in developing writing skills is particularly important (Kurfiss, 1988). Student academic writing (SAW) is an observable output of critical thinking (Wilson, 2016). When students apply critical thinking to their writing, they present clear, accurate, significant, and logical arguments, constructing their own voice in the form of an essay or dissertation (Matsuda, 2001). This presentation will show how a rubric can be used to find evidence of critical thinking in SAW. Participants will experience how evidence-based written arguments, supported by background knowledge and authorial voice, can develop students into efficient critical thinkers. Participants will have an opportunity to use the rubric to find evidence of critical thinking in SAW samples. This presentation is intended for classroom teachers with or without basic knowledge of implementing critical thinking in academic settings. Participants will also learn tips on how various features of critical thinking can be developed among students. After the session, participants will be able to use or adapt the rubric according to their needs to find evidence of critical thinking in SAW within their own contexts.

Keywords: critical thinking, rubric, student academic writing, argumentation, text analysis

Procedia PDF Downloads 67
1612 Modeling Thermal Changes of Urban Blocks in Relation to the Landscape Structure and Configuration in Guilan Province

Authors: Roshanak Afrakhteh, Abdolrasoul Salman Mahini, Mahdi Motagh, Hamidreza Kamyab

Abstract:

Urban Heat Islands (UHIs) are distinctive urban areas characterized by densely populated central cores surrounded by less densely populated peripheral lands. These areas experience elevated temperatures, primarily due to impermeable surfaces and specific land use patterns. The consequences of these temperature variations are far-reaching, impacting the environment and society negatively and leading to increased energy consumption, air pollution, and public health concerns. This paper emphasizes the need for simplified approaches to comprehend UHI temperature dynamics and explains how urban development patterns contribute to land surface temperature variation. To illustrate this relationship, the study focuses on the Guilan Plain, utilizing techniques such as principal component analysis and generalized additive models. The research centered on mapping land use and land surface temperature in the low-lying area of Guilan province. Satellite data from Landsat sensors for three different time periods (2002, 2012, and 2021) were employed. Using eCognition software, a spatial unit known as a "city block" was delineated through object-based analysis. The study also applied the normalized difference vegetation index (NDVI) method in estimating land surface temperature. Predictive variables for urban land surface temperature within residential city blocks were identified and categorized as intrinsic (related to the block's own structure) and neighboring (related to adjacent blocks) variables. Principal component analysis (PCA) was used to select significant variables, and a generalized additive model (GAM), implemented using R's mgcv package, modeled the relationship between urban land surface temperature and the predictor variables. Notable findings included variations in urban temperature across the different years, attributed to environmental and climatic factors.
Block size, shared boundary, mother polygon area, and perimeter-to-area ratio were identified as the main variables for the generalized additive regression model. The model showed non-linear relationships: block size, shared boundary, and mother polygon area were positively correlated with temperature, while the perimeter-to-area ratio displayed a negative trend. The discussion highlights the challenges of predicting urban surface temperature and the significance of block size in determining urban temperature patterns. It also underscores the importance of spatial configuration and unit structure in shaping urban temperature patterns. In conclusion, this study contributes to the growing body of research on the connection between land use patterns and urban surface temperature. Block size, along with block dispersion and aggregation, emerged as key factors influencing urban surface temperature in residential areas. The proposed methodology enhances our understanding of parameter significance in shaping urban temperature patterns across various regions, particularly in Iran.
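The NDVI computation mentioned in the methodology is a simple per-pixel band ratio; a minimal sketch with hypothetical Landsat reflectance values (the actual per-scene calibration is not reproduced here):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for a single pixel:
    (NIR - Red) / (NIR + Red), ranging from -1 to 1."""
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances: dense vegetation reflects strongly
# in the near-infrared and absorbs red light, so its NDVI is higher.
veg = ndvi(nir=0.50, red=0.10)    # vegetated pixel
bare = ndvi(nir=0.25, red=0.20)   # sparsely vegetated pixel
```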

Keywords: urban heat island, land surface temperature, LST modeling, GAM, Guilan province

Procedia PDF Downloads 69
1611 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks

Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox

Abstract:

miRNAs have emerged as key post-transcriptional regulators of gene expression; however, identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson correlation and distance correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to the gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.
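The Pearson correlation step used for expression-based target filtering can be sketched directly (the tool itself is in R; the expression profiles below are made-up values illustrating the anti-correlation expected between a repressive miRNA and its target):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# A miRNA that represses its target shows anti-correlated expression
# across conditions (hypothetical microarray intensities).
r = pearson([1.0, 2.0, 3.0, 4.0], [8.0, 6.0, 4.0, 2.0])
```

Candidate pairs with strongly negative correlation (here r is exactly -1 for the toy profiles) would be retained for downstream validation such as qRT-PCR.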

Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network

Procedia PDF Downloads 501
1610 A Framework for Automated Nuclear Waste Classification

Authors: Seonaid Hume, Gordon Dobie, Graeme West

Abstract:

Detecting and localizing radioactive sources is a necessity for safe and secure decommissioning of nuclear facilities. An important aspect for the management of the sort-and-segregation process is establishing the spatial distributions and quantities of the waste radionuclides, their type, corresponding activity, and ultimately classification for disposal. The data received from surveys directly informs decommissioning plans, on-site incident management strategies, the approach needed for a new cell, as well as protecting the workforce and the public. Manual classification of nuclear waste from a nuclear cell is time-consuming, expensive, and requires significant expertise to make the classification judgment call. Also, in-cell decommissioning is still in its relative infancy, and few techniques are well-developed. As with any repetitive and routine tasks, there is the opportunity to improve the task of classifying nuclear waste using autonomous systems. Hence, this paper proposes a new framework for the automatic classification of nuclear waste. This framework consists of five main stages; 3D spatial mapping and object detection, object classification, radiological mapping, source localisation based on gathered evidence and finally, waste classification. The first stage of the framework, 3D visual mapping, involves object detection from point cloud data. A review of related applications in other industries is provided, and recommendations for approaches for waste classification are made. Object detection focusses initially on cylindrical objects since pipework is significant in nuclear cells and indeed any industrial site. The approach can be extended to other commonly occurring primitives such as spheres and cubes. This is in preparation of stage two, characterizing the point cloud data and estimating the dimensions, material, degradation, and mass of the objects detected in order to feature match them to an inventory of possible items found in that nuclear cell. 
Many items in nuclear cells are one-offs, have limited or poor drawings available, or have been modified since installation, and cells have complex interiors, which often and inadvertently pose difficulties when accessing certain zones and identifying waste remotely. Hence, feature-matching objects may require expert input. The third stage, radiological mapping, characterizes the nuclear cell in terms of radiation fields, including the type of radiation, the activity, and the location within the cell. The fourth stage of the framework takes the visual map from stage 1, the object characterization from stage 2, and the radiation map from stage 3 and fuses them together, providing a more detailed scene of the nuclear cell by identifying the location of radioactive materials in three dimensions. The last stage combines the evidence from the fused data sets to produce the classification of the waste in Bq/kg, thus enabling better decision-making and monitoring for in-cell decommissioning. The presentation of the framework is supported by representative case-study data drawn from a decommissioning application at a UK nuclear facility. This framework utilises recent advances in the detection and mapping of complex radiation fields in three dimensions to make the process of classifying nuclear waste faster, more reliable, more cost-effective, and safer.
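The five-stage framework above can be sketched as a simple pipeline. This is a hedged illustration only: the function names, the stubbed detection and inventory logic, and the 4e6 Bq/kg threshold are our own assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

# Minimal sketch of the five-stage waste-classification pipeline described
# above. All names and the toy classification rule are illustrative
# assumptions, not the paper's actual system.

@dataclass
class DetectedObject:
    shape: str          # e.g. "cylinder" (pipework), "sphere", "cube"
    mass_kg: float      # estimated during characterization (stage 2)
    activity_bq: float  # filled in after radiological mapping (stages 3-4)

def detect_objects(point_cloud):
    """Stage 1: object detection from point cloud data (stubbed)."""
    return [DetectedObject("cylinder", mass_kg=12.0, activity_bq=0.0)]

def characterise(obj, inventory):
    """Stage 2: feature-match the object against a cell inventory (stubbed)."""
    return inventory.get(obj.shape, "unknown item")

def fuse_radiation_map(obj, measured_activity_bq):
    """Stages 3-4: attach localised activity evidence to the object."""
    obj.activity_bq = measured_activity_bq
    return obj

def classify_waste(obj):
    """Stage 5: specific activity in Bq/kg drives the waste category.
    The 4e6 Bq/kg threshold here is purely illustrative."""
    specific_activity = obj.activity_bq / obj.mass_kg
    if specific_activity > 4e6:
        return "higher-activity", specific_activity
    return "lower-activity", specific_activity

objs = detect_objects(point_cloud=None)
item = characterise(objs[0], {"cylinder": "stainless pipework"})
fused = fuse_radiation_map(objs[0], measured_activity_bq=6.0e7)
category, bq_per_kg = classify_waste(fused)
print(item, category, bq_per_kg)
```

The staged structure mirrors the framework's data flow: evidence accumulates on each object until the final Bq/kg classification can be made.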

Keywords: nuclear decommissioning, radiation detection, object detection, waste classification

Procedia PDF Downloads 197
1609 Psychodidactic Strategies to Facilitate Flow of Logical Thinking in Preparation of Academic Documents

Authors: Deni Stincer Gomez, Zuraya Monroy Nasr, Luis Pérez Alvarez

Abstract:

The preparation of academic documents such as theses, articles, and research projects is one of the requirements of higher education. These documents demand logical argumentative thinking, which students experience and execute with difficulty. To mitigate these difficulties, this study designed a thesis seminar, with which the authors have seven years of experience. It is taught in a graduate program in Psychology at the National Autonomous University of Mexico. In this study, the authors use the Toulmin model as a mental heuristic and apply a set of psychodidactic strategies that facilitate the development and completion of the thesis. The rate of obtaining the degree in the groups exposed to the seminar has risen to 94%, compared with the 10% in the cohorts that were not exposed to it. In this article, the authors emphasize the psychodidactic strategies used. The Toulmin model alone does not guarantee the success achieved; a set of psychological (almost psychotherapeutic) and didactic actions by the teacher also seems to contribute. These actions derive from an understanding of the psychological, epistemological, and ontogenetic obstacles, and of the most frequent errors into which thought tends to fall when a logical course is demanded of it. The authors have grouped the strategies into three groups: 1) strategies to facilitate logical thinking, 2) strategies to strengthen the scientific self, and 3) strategies to facilitate the act of writing the text. In this work, the authors delve into each of them.
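The Toulmin model used as a heuristic here has a well-known component structure (claim, grounds, warrant, backing, qualifier, rebuttal). The sketch below encodes that standard layout as a data structure; the encoding and the example argument are our own illustration, not material from the seminar.

```python
from dataclasses import dataclass
from typing import List

# A sketch of Toulmin's standard argument layout. The field names follow
# Toulmin's components; this encoding is illustrative, not the authors'.

@dataclass
class ToulminArgument:
    claim: str             # the thesis statement being defended
    grounds: List[str]     # data/evidence supporting the claim
    warrant: str           # why the grounds support the claim
    backing: str = ""      # support for the warrant itself
    qualifier: str = ""    # strength of the claim ("probably", ...)
    rebuttal: str = ""     # conditions under which the claim fails

    def is_complete(self) -> bool:
        # The minimal core a thesis argument must make explicit.
        return bool(self.claim and self.grounds and self.warrant)

arg = ToulminArgument(
    claim="Teaching the Toulmin layout improves thesis completion",
    grounds=["94% graduation rate in seminar cohorts vs. 10% before"],
    warrant="Explicit argument structure reduces logical errors in writing",
    qualifier="probably",
)
print(arg.is_complete())
```

Making the components explicit in this way is precisely what lets a seminar check, field by field, whether a student's argument is missing its grounds or its warrant.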

Keywords: psychodidactic strategies, logical thinking, academic documents, Toulmin model

Procedia PDF Downloads 175
1608 Dynamics of a Reaction-Diffusion Problem Modeling Two Predators Competing for a Prey

Authors: Owolabi Kolade Matthew

Abstract:

In this work, we investigate both analytical and numerical aspects of a dynamical model comprising a three-species system. We analyze the linear stability of stationary solutions in the one-dimensional multi-species system modeling the interactions of two predator species and one prey species. The stability analysis has many implications for understanding the various spatiotemporal and chaotic behaviors of the species in the spatial domain. The analysis establishes the possibility of the three interacting species coexisting harmoniously; this is achieved by combining local and global analyses to determine the global dynamics of the system. In the presence of diffusion, a viable exponential time differencing method is applied to the multi-species nonlinear time-dependent partial differential equation to address the points and queries that may naturally arise. The scheme is described in detail and justified by a number of computational experiments.
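The idea behind exponential time differencing schemes of the kind applied here is to integrate the linear (stiff) part of the equation exactly and approximate only the nonlinear part. The sketch below demonstrates first-order ETD (ETD1) on the scalar logistic equation as a stand-in for the full predator-prey PDE; the test problem and parameters are our own illustration.

```python
import math

# First-order exponential time differencing (ETD1) on the logistic equation
#   u' = u - u**2   (linear part L*u with L = 1, nonlinear part N(u) = -u**2)
# The linear part is propagated exactly via exp(L*h); the nonlinear part is
# frozen over each step, giving the ETD1 update
#   u_{n+1} = e^{Lh} u_n + (e^{Lh} - 1)/L * N(u_n).

def etd1(u0, L, N, h, steps):
    u = u0
    eLh = math.exp(L * h)
    for _ in range(steps):
        u = eLh * u + (eLh - 1.0) / L * N(u)
    return u

u0, h, T = 0.1, 0.01, 5.0
approx = etd1(u0, L=1.0, N=lambda u: -u * u, h=h, steps=int(T / h))
# exact logistic solution for comparison
exact = u0 * math.exp(T) / (1.0 + u0 * (math.exp(T) - 1.0))
print(approx, exact)
```

In the paper's setting, the same construction is applied componentwise after the diffusion operator is diagonalized, which is what makes the scheme viable for stiff reaction-diffusion systems.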

Keywords: asymptotically stable, coexistence, exponential time differencing method, global and local stability, predator-prey model, nonlinear, reaction-diffusion system

Procedia PDF Downloads 408
1607 Hand Gesture Interpretation Using Sensing Glove Integrated with Machine Learning Algorithms

Authors: Aqsa Ali, Aleem Mushtaq, Attaullah Memon, Monna

Abstract:

In this paper, we present a low-cost design for a smart glove that can perform sign language recognition to assist speech-impaired people. Specifically, we have designed and developed an Assistive Hand Gesture Interpreter that recognizes hand movements corresponding to the American Sign Language (ASL) alphabet and translates them into text for display on a Thin-Film-Transistor Liquid Crystal Display (TFT LCD) screen, as well as into synthetic speech. Linear Bayes classifiers and multilayer neural networks have been used to classify the 11 features obtained from the sensors on the glove into one of 27 classes: the ASL alphabet letters and a predefined gesture for space. Three types of features are used: bending, via six bend sensors; orientation in three dimensions, via accelerometers; and contact at vital points, via contact sensors. To gauge the performance of the presented design, the training database was prepared using five volunteers. The accuracy of the current version on the prepared dataset was found to be up to 99.3% for the target user. The solution combines electronics, e-textile technology, sensor technology, embedded systems, and machine learning techniques to build a low-cost wearable glove that is accurate, elegant, and portable.
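A generative Bayes classifier of the kind described can be sketched in a few lines: fit per-class Gaussian densities to the 11-dimensional sensor features, then pick the class with the highest posterior. This is a minimal sketch under our own assumptions (Gaussian class-conditionals with diagonal covariance, two synthetic "gestures"); it is not the paper's trained model or dataset.

```python
import math
import random

# Gaussian naive-Bayes-style classifier over 11-dimensional feature vectors,
# as a stand-in for the glove's linear Bayes classifier. Synthetic data.

def fit(X, y):
    model = {}
    for c in set(y):
        rows = [x for x, lbl in zip(X, y) if lbl == c]
        n, d = len(rows), len(rows[0])
        mean = [sum(r[j] for r in rows) / n for j in range(d)]
        var = [sum((r[j] - mean[j]) ** 2 for r in rows) / n + 1e-6
               for j in range(d)]
        model[c] = (mean, var, n / len(y))
    return model

def predict(model, x):
    def log_posterior(c):
        mean, var, prior = model[c]
        loglik = sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                     for xi, m, v in zip(x, mean, var))
        return loglik + math.log(prior)
    return max(model, key=log_posterior)

random.seed(0)
# two synthetic "gestures", 11 bend/orientation/contact features each
X = [[random.gauss(0.0, 0.3) for _ in range(11)] for _ in range(50)] + \
    [[random.gauss(2.0, 0.3) for _ in range(11)] for _ in range(50)]
y = ["A"] * 50 + ["B"] * 50
model = fit(X, y)
print(predict(model, [0.1] * 11), predict(model, [1.9] * 11))
```

The real system classifies into 27 classes rather than two, but the decision rule (argmax of per-class log-posterior) scales to any number of gestures unchanged.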

Keywords: American sign language, assistive hand gesture interpreter, human-machine interface, machine learning, sensing glove

Procedia PDF Downloads 292
1606 Comparison of Authentication Methods in Internet of Things Technology

Authors: Hafizah Che Hasan, Fateen Nazwa Yusof, Maslina Daud

Abstract:

The Internet of Things (IoT) is a powerful industrial paradigm in which end-devices are interconnected and automated, allowing them to analyze data and execute actions based on the analysis. IoT leverages technologies such as Radio-Frequency Identification (RFID) and Wireless Sensor Networks (WSN), including mobile and sensor devices, which have contributed to its evolution. However, as more devices are connected to each other over the Internet and data from various sources are exchanged between things, the confidentiality of the data becomes a major concern. This paper focuses on one of the major challenges in IoT, authentication, which is needed to ensure data integrity and confidentiality. A few solutions are reviewed based on papers from the last few years. One proposed solution secures the communication between IoT devices and cloud servers with an Elliptic Curve Cryptography (ECC) based mutual authentication protocol; it uses Hyper Text Transfer Protocol (HTTP) cookies as a security parameter. The next proposed solution uses a keyed-hash scheme protocol to enable IoT devices to authenticate each other without the presence of a central control server. Another proposed solution uses a Physical Unclonable Function (PUF) based mutual authentication protocol; it emphasizes tamper-resistant and resource-efficient technology and amounts to a three-way-handshake security protocol.
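The keyed-hash idea in the second reviewed solution can be illustrated with a standard challenge-response exchange: two devices sharing a symmetric key authenticate each other with HMACs over fresh nonces, with no central server involved. The message format and class layout below are our own sketch, not the protocol from the cited paper.

```python
import hashlib
import hmac
import os

# Mutual challenge-response authentication between two IoT devices that
# share a symmetric key. Illustrative sketch only.

def mac(key: bytes, *parts: bytes) -> bytes:
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

class Device:
    def __init__(self, name: str, shared_key: bytes):
        self.name, self.key = name.encode(), shared_key

    def challenge(self) -> bytes:
        self.nonce = os.urandom(16)   # fresh nonce defeats replay attacks
        return self.nonce

    def respond(self, peer_nonce: bytes) -> bytes:
        # prove knowledge of the key by MACing our identity and their nonce
        return mac(self.key, self.name, peer_nonce)

    def verify(self, peer_name: bytes, response: bytes) -> bool:
        expected = mac(self.key, peer_name, self.nonce)
        return hmac.compare_digest(expected, response)   # constant-time

key = os.urandom(32)
a, b = Device("sensor-A", key), Device("gateway-B", key)
# mutual authentication: each side challenges and verifies the other
ok_ab = a.verify(b"gateway-B", b.respond(a.challenge()))
ok_ba = b.verify(b"sensor-A", a.respond(b.challenge()))
print(ok_ab and ok_ba)
```

Because only a hash computation is required per exchange, this style of scheme suits constrained IoT hardware better than public-key operations, which is the trade-off the review weighs against the ECC- and PUF-based alternatives.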

Keywords: Internet of Things (IoT), authentication, PUF, ECC, keyed-hash scheme protocol

Procedia PDF Downloads 254
1605 Self-Supervised Pretraining on Sequences of Functional Magnetic Resonance Imaging Data for Transfer Learning to Brain Decoding Tasks

Authors: Sean Paulsen, Michael Casey

Abstract:

In this work we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.

Keywords: transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training

Procedia PDF Downloads 85
1604 An Experimental Study on the Variability of Nonnative and Native Inference of Word Meanings in Timed and Untimed Conditions

Authors: Swathi M. Vanniarajan

Abstract:

Reading research suggests that online contextual vocabulary comprehension while reading is an interactive and integrative process. Success depends on a variety of factors, including the amount and nature of the available linguistic and nonlinguistic cues, the reader's analytical and integrative skills, schema memory (content familiarity), and processing speed, characterized along the continuum from controlled to automatic processing. The experiment reported here was conducted with 30 native speakers as one group and 30 nonnative speakers as another (all graduate students). It hypothesized that, on 24 tasks requiring them to comprehend an unfamiliar word in real time without backtracking, the nonnative subjects, owing to the differences in the nature of their reading processes, would be less able than the native subjects to construct the meanings of the unknown words by integrating the multiple but sufficient contextual cues provided in the text. The results indicated significant inter-group as well as intra-group differences in the quality of the definitions given. However, when given additional time, the nonnative speakers could significantly improve the quality of their definitions, whereas the native speakers in general did not. This suggests that, all things being equal, time is a significant factor for success in nonnative vocabulary and reading comprehension processes, and that accuracy precedes automaticity in the development of nonnative reading processes as well.

Keywords: reading, second language processing, vocabulary comprehension

Procedia PDF Downloads 160
1603 Non-Local Simultaneous Sparse Unmixing for Hyperspectral Data

Authors: Fanqiang Kong, Chending Bian

Abstract:

Sparse unmixing is a promising semisupervised approach that assumes the observed pixels of a hyperspectral image can be expressed as linear combinations of only a few pure spectral signatures (endmembers) from an available spectral library. However, sparse unmixing still faces a great challenge: finding the optimal subset of endmembers for the observed data from a large standard spectral library, without considering spatial information. Under such circumstances, a sparse unmixing algorithm termed non-local simultaneous sparse unmixing (NLSSU) is presented. In NLSSU, a non-local simultaneous sparse representation method for endmember selection is used to find the optimal subset of endmembers for each set of similar image patches in the hyperspectral image. Then, the non-local means method is used as a regularizer for abundance estimation, exploiting the non-local self-similarity of the abundance image. Experimental results on both simulated and real data demonstrate that NLSSU outperforms the other algorithms, with better spectral unmixing accuracy.
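The sparse-regression core of unmixing can be sketched without any of NLSSU's non-local machinery: a pixel y is modeled as A x for a library A of endmember spectra, with an l1 penalty encouraging only a few active endmembers. The toy below uses plain iterative soft-thresholding (ISTA) on invented spectra; NLSSU additionally couples similar pixels and adds the non-local means regularizer, which this sketch omits.

```python
# Sparse unmixing of one pixel by ISTA: minimize
#   0.5 * ||A x - y||^2 + lam * ||x||_1
# A: bands x library_size matrix of endmember spectra (illustrative values).

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def ista(A, y, lam=0.005, step=0.3, iters=3000):
    cols = list(zip(*A))
    x = [0.0] * len(cols)
    for _ in range(iters):
        r = [yi - ri for yi, ri in zip(y, matvec(A, x))]   # residual y - Ax
        grad = [-sum(c * ri for c, ri in zip(col, r)) for col in cols]
        # gradient step, then soft-thresholding (the l1 proximal operator)
        x = [max(abs(v) - step * lam, 0.0) * (1 if v >= 0 else -1)
             for v in (xi - step * g for xi, g in zip(x, grad))]
    return x

# 3 spectral bands, library of 5 endmember signatures (columns)
A = [[0.9, 0.1, 0.2, 0.5, 0.3],
     [0.1, 0.8, 0.1, 0.5, 0.3],
     [0.1, 0.2, 0.9, 0.1, 0.4]]
y = [0.62, 0.10, 0.42]   # mixed pixel: 0.6 * endmember 1 + 0.4 * endmember 3
x = ista(A, y)
print([round(v, 3) for v in x], [round(v, 3) for v in matvec(A, x)])
```

The step size is kept below the reciprocal of the library's squared spectral norm so the iteration converges; in a full unmixer, the same solver runs jointly over many pixels, which is where the "simultaneous" structure enters.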

Keywords: hyperspectral unmixing, simultaneous sparse representation, sparse regression, non-local means

Procedia PDF Downloads 242
1602 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures

Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester

Abstract:

This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A one-dimensional longitudinal Cartesian domain is chosen as the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions is treated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model of electronic and molecular collisions in the framework of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electronic Boltzmann equation. An operator-splitting technique is used to separate the microscopic and macroscopic models. The simulation gas is a mixture of neutral, excited, and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature is presented and discussed.
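The operator-splitting technique mentioned above advances each part of a coupled model with its own solver over a small substep and composes the results. The toy below demonstrates first-order (Lie) splitting on a scalar ODE standing in for the coupled macroscopic/microscopic plasma model; the equation and parameters are our own illustration.

```python
import math

# Lie operator splitting on  du/dt = -a*u + b:
#   substep 1 solves du/dt = -a*u exactly (relaxation operator),
#   substep 2 solves du/dt = b exactly (source operator),
# composed each step. This mimics alternating a fluid solver with a
# collision/chemistry solver, as in the macroscopic/microscopic split.

def lie_split(u0, a, b, h, steps):
    u = u0
    decay = math.exp(-a * h)
    for _ in range(steps):
        u = u * decay      # substep 1: relaxation operator, exact
        u = u + b * h      # substep 2: source operator, exact
    return u

a, b, u0, T, h = 2.0, 1.0, 0.0, 1.0, 1e-3
approx = lie_split(u0, a, b, h, int(T / h))
# closed-form solution of the unsplit equation, for comparison
exact = b / a + (u0 - b / a) * math.exp(-a * T)
print(approx, exact)
```

The splitting error is first order in the step size here; the benefit, as in the paper, is that each operator can be handled by the numerical tool best suited to it (fluid PDE solver versus ZDPlasKin/BOLSIG+).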

Keywords: CFD, electronic discharge, ignition, spark plug

Procedia PDF Downloads 156
1601 Music Therapy Intervention as a Means of Stimulating Communicative Abilities of Seniors with Neurocognitive Disorders – Theory versus Practice

Authors: Pavel Svoboda, Oldřich Müller

Abstract:

The paper surveys the opinions of helping professionals working in a home for seniors with neurocognitive disorders and compares them with the opinions of a younger generation of students who are still preparing for this work. The authors carried out a comparative questionnaire survey with both target groups, focusing on the analysis and comparison of possible differences in their knowledge of care for elderly people with neurocognitive disorders. Specifically, they focused on knowledge of and experience with the approaches, methods, and tools applicable within music therapy interventions, as these are understood in practice, in comparison with the theoretical knowledge of secondary school students focused on social work. The questionnaire was mainly aimed at assessing knowledge of the possibilities of effectively stimulating the memory and communication skills of the elderly using the means of music. The investigation was based on research into so-called non-pharmacological approaches to this clientele; for professional caregivers, it followed the music therapy lessons that the authors have regularly implemented since the beginning of 2022. Its results will, among other things, serve as the basis for an upcoming study with a scoping review design.

Keywords: neurocognitive disorders, seniors, music therapy intervention, melody, rhythm, text, memory stimulation, communication skills

Procedia PDF Downloads 66
1600 Post-Occupancy Evaluation of Greenway Based on Multi-Source Data: A Case Study of Jincheng Greenway in Chengdu

Authors: Qin Zhu

Abstract:

Under the Park City development concept, the Tianfu Greenway system, as a basic, pre-configured element of Chengdu Global Park construction, connects urban open space with linear and circular structures and carries out the ecological, cultural, and recreational functions of the park system. Chengdu's greenway construction is in full swing. In the process of greenway planning and construction, the landscape effect of greenways on urban quality tends to be valued most, while the long-term impact of the user experience on the sustainable development of greenways is often ignored. It is therefore very important to test the effectiveness of greenway construction from the perspective of users. Taking Jincheng Greenway in Chengdu as an example, this paper introduces multi-source data to construct a post-occupancy evaluation model for greenways and adopts behavior mapping, a questionnaire survey, web text analysis, and IPA analysis to comprehensively evaluate users' behavior characteristics and satisfaction. From the evaluation results, we can grasp users' actual patterns of behavior and their overall needs, so that the experience of building greenways can be fed back in time, providing guidance for the optimization and improvement of built greenways and for the planning and construction of future greenways.
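The IPA (importance-performance analysis) step mentioned above places each evaluated attribute in one of four quadrants by comparing its importance and performance scores with the respective means. The sketch below shows that classification; the attribute names and survey scores are invented for illustration and are not the study's data.

```python
# Importance-performance analysis (IPA): classify each attribute by whether
# its importance and performance scores sit above or below the means.

def ipa_quadrants(scores):
    imp_mean = sum(i for i, _ in scores.values()) / len(scores)
    perf_mean = sum(p for _, p in scores.values()) / len(scores)
    quadrant = {
        (True, True): "keep up the good work",
        (True, False): "concentrate here",
        (False, True): "possible overkill",
        (False, False): "low priority",
    }
    return {name: quadrant[(i >= imp_mean, p >= perf_mean)]
            for name, (i, p) in scores.items()}

scores = {  # attribute: (importance, performance), e.g. 5-point survey means
    "greenway connectivity": (4.6, 4.4),
    "rest facilities":       (4.5, 3.1),
    "signage":               (3.2, 4.2),
    "night lighting":        (3.0, 2.8),
}
print(ipa_quadrants(scores))
```

Attributes landing in the high-importance/low-performance quadrant ("concentrate here") are exactly the ones a post-occupancy evaluation would flag for priority improvement.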

Keywords: multi-source data, greenway, IPA analysis, post-occupancy evaluation (POE)

Procedia PDF Downloads 58
1599 An Agent-Based Modelling Simulation Approach to Calculate Processing Delay of GEO Satellite Payload

Authors: V. Vicente E. Mujica, Gustavo Gonzalez

Abstract:

The global coverage of broadband multimedia and Internet-based services in terrestrial-satellite networks creates particular interest for satellite providers in enhancing services with low latency and high signal quality for diverse users. In particular, the delay of on-board processing is an inherent source of latency in satellite communication that is sometimes omitted from the end-to-end delay of the satellite link. The framework of this paper includes modelling an on-orbit satellite payload using an agent model that can reproduce the properties of processing delays. In essence, different spatial interpolation methods are compared in order to evaluate physical data obtained by a GEO satellite and to define a discretization function for determining that delay. Furthermore, the performance of the proposed agent and the developed delay discretization function are validated together by simulating a hybrid satellite and terrestrial network. Simulation results show high accuracy with respect to the characteristics of the initial processing-delay data points for the Ku band.
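The spatial-interpolation comparison mentioned above can be illustrated with two of the simplest candidate methods: nearest-neighbour and inverse distance weighting (IDW), estimating the processing delay at a new operating point from measured samples. The coordinate meanings and sample values below are invented for illustration; the paper does not specify which interpolators were compared.

```python
import math

# Estimate processing delay at a query point from scattered samples using
# nearest-neighbour and inverse distance weighting (IDW). Illustrative data.

def nearest(samples, q):
    return min(samples, key=lambda s: math.dist(s[0], q))[1]

def idw(samples, q, power=2):
    weight_sum, total = 0.0, 0.0
    for pt, value in samples:
        d = math.dist(pt, q)
        if d == 0:
            return value          # query coincides with a sample
        w = 1.0 / d ** power
        weight_sum += w
        total += w * value
    return total / weight_sum

# (load_fraction, signal_quality) -> measured processing delay in ms
samples = [((0.1, 0.9), 2.0), ((0.5, 0.5), 5.0),
           ((0.9, 0.2), 11.0), ((0.4, 0.8), 4.0)]
q = (0.45, 0.55)
print(nearest(samples, q), idw(samples, q))
```

Nearest-neighbour yields a piecewise-constant (discretized) delay function, while IDW smooths between samples; comparing such behaviours against the measured data is how a delay discretization function can be selected.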

Keywords: terrestrial-satellite networks, latency, on-orbit satellite payload, simulation

Procedia PDF Downloads 265
1598 Geo-Additive Modeling of Family Size in Nigeria

Authors: Oluwayemisi O. Alaba, John O. Olaomi

Abstract:

Data from the 2013 Nigerian Demographic and Health Survey (NDHS) were used to investigate the determinants of family size in Nigeria using a geo-additive model. The fixed effects of categorical covariates were modelled using diffuse priors, with a second-order random-walk P-spline for the nonlinear effect of the continuous variable; spatial effects followed Markov random field priors, while exchangeable normal priors were used for the random effects of community and household. A Negative Binomial distribution was used to handle overdispersion of the dependent variable. Inference was fully Bayesian. Results showed a declining effect on family size of secondary and higher education of the mother, Yoruba ethnicity, Christianity, family planning, giving birth by caesarean section, and having a partner with secondary education. Large family size is positively associated with age at first birth, the number of daughters in a household, being gainfully employed, being married and living with a partner, and community and household effects.
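The reason a Negative Binomial likelihood is chosen over a Poisson is overdispersion: count data whose variance exceeds the mean. The standard gamma-Poisson construction of the Negative Binomial makes this concrete; the parameter values below are illustrative only and unrelated to the NDHS data.

```python
import math
import random
import statistics

# Negative Binomial via the gamma-Poisson mixture: draw a gamma-distributed
# rate, then a Poisson count. With mean m and dispersion k, the variance is
# m + m**2 / k, i.e. overdispersed relative to Poisson for finite k.

def neg_binomial(mean, dispersion_k, rng):
    rate = rng.gammavariate(dispersion_k, mean / dispersion_k)
    # simple Poisson sampler (Knuth); fine for the small rates used here
    limit = math.exp(-rate)
    count, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return count
        count += 1

rng = random.Random(42)
draws = [neg_binomial(mean=5.0, dispersion_k=2.0, rng=rng)
         for _ in range(5000)]
m, v = statistics.mean(draws), statistics.pvariance(draws)
print(m, v)   # variance well above the mean: overdispersion
```

A Poisson model forced onto such data would understate its variance (for mean 5 and dispersion 2, the theoretical variance is 5 + 25/2 = 17.5, not 5), which is exactly the misfit the paper's Negative Binomial specification avoids.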

Keywords: Bayesian analysis, family size, geo-additive model, negative binomial

Procedia PDF Downloads 533