Search results for: language variety
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5993

2063 Ways for Improving Citation of the Cyrillic Publications

Authors: Victoria Y. Garnova, Vladimir G. Merzlikin, Denis G. Yakovlev, Andrei А. Amelenkov, Sergey V. Khudyakov

Abstract:

The novelty of studies submitted to Russian publications is assessed by citation analysis in order to identify scientific research with a high degree of innovation. This assessment can form the basis of recommendations for new joint projects between the Russian Federation and the EU. Despite the modest rating of Russian publications (or even its absence), current information technologies ensure open access to the websites of these journals, which makes it possible for interested foreign investors to carry out their own rapid, selective expert assessment of advanced developments in Russia. Foreign literature cited in Russian journals can itself become a subject of study to determine the innovative attractiveness of scientific research against the background of specific promising fields abroad. The authors introduce: (1) a linguistic impact factor Li-f of journals, describing the share of publications in the majority language; (2) a linguistic citation index Lact, characterizing the significance of scientific research, and a linguistic top index Ltop, for evaluating the spectral width of citation of foreign journals.
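As a rough illustration of how such indices could be computed, the sketch below assumes simple definitions for Li-f and Lact; the abstract gives no formal formulas, so the field names, the "ru" majority-language assumption and the normalisation are illustrative assumptions only.

```python
# Minimal sketch of the linguistic indices described above (illustrative only:
# the exact definitions of Li-f and Lact are not given in the abstract, so the
# formulas below are assumptions).

from collections import Counter

def linguistic_impact_factor(publication_languages):
    """Share of publications written in the majority language of a journal."""
    counts = Counter(publication_languages)
    majority_language, majority_count = counts.most_common(1)[0]
    return majority_count / len(publication_languages)

def linguistic_citation_index(citations_by_language):
    """Assumed form: fraction of citations that point to foreign-language sources."""
    total = sum(citations_by_language.values())
    foreign = total - citations_by_language.get("ru", 0)
    return foreign / total if total else 0.0

if __name__ == "__main__":
    langs = ["ru", "ru", "ru", "en", "ru", "en"]
    print("Li-f =", round(linguistic_impact_factor(langs), 2))                        # 0.67
    print("Lact =", round(linguistic_citation_index({"ru": 40, "en": 25, "de": 5}), 2))  # 0.43
```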

Keywords: citation analysis, linguistic citation indexes, linguistic impact factor, innovative projects

Procedia PDF Downloads 313
2062 An Analysis of Machine Translation: Instagram Translation vs Human Translation from the Perspective of Translation Quality

Authors: Aulia Fitri

Abstract:

This study aims to identify which linguistic areas account for the most common mistakes in Instagram machine translation and in human translation. Instagram is a social media platform widely used by people around the world. Anyone with an Instagram account can consume the captions and pictures shared by friends, celebrities, and public figures across countries. Instagram provides machine translation beneath its caption space to help users understand captions written in languages other than their own. The researcher takes samples from the account of an Indonesian public figure with many followers. The public figure tries to help her followers from other countries understand her posts by adding an English version after the Indonesian version. However, research on Instagram translation has not yet been conducted, even though the platform is used by society worldwide. Twenty samples are analysed from the perspective of translation quality using linguistic tools. As machine translation, Instagram tends to give a literal translation without regard to the intended topic. The human translation, on the other hand, tends to exaggerate, which leads to a different meaning in English. The study offers an interesting discussion of how human nature and a robotic system each influence the translation result.

Keywords: human translation, machine translation (MT), translation quality, linguistic tool

Procedia PDF Downloads 311
2061 The Role of Financial Literacy in Driving Consumer Well-Being

Authors: Amin Nazifi, Amir Raki, Doga Istanbulluoglu

Abstract:

The incorporation of technological advancements into financial services, commonly referred to as Fintech, is primarily aimed at promoting services that are accessible, convenient, and inclusive, thereby benefiting both consumers and businesses. Fintech services employ a variety of technologies, including Artificial Intelligence (AI), blockchain, and big data, to enhance the efficiency and productivity of traditional services. Cryptocurrency, a component of Fintech, is projected to be a trillion-dollar industry, with over 320 million consumers globally investing in various forms of cryptocurrencies. However, these potentially transformative services can also lead to adverse outcomes. For instance, recent Fintech innovations have been increasingly linked to misconduct and disservice, resulting in serious implications for consumer well-being. This could be attributed to the ease of access to Fintech, which enables adults to trade cryptocurrencies, shares, and stocks via mobile applications. However, there is little known about the darker aspects of technological advancements, such as Fintech. Hence, this study aims to generate scholarly insights into the design of robust and resilient Fintech services that can add value to businesses and enhance consumer well-being. Using a mixed-method approach, the study will investigate the personal and contextual factors influencing consumers’ adoption and usage of technology innovations and their impacts on consumer well-being. First, semi-structured interviews will be conducted with a sample of Fintech users until theoretical saturation is achieved. Subsequently, based on the findings of the first study, a quantitative study will be conducted to develop and empirically test the impacts of these factors on consumers’ well-being using an online survey with a sample of 300 participants experienced in using Fintech services. This study will contribute to the growing Transformative Service Research (TSR) literature by addressing the latest priorities in service research and shedding light on the impact of fintech services on consumer well-being.

Keywords: consumer well-being, financial literacy, Fintech, service innovation

Procedia PDF Downloads 61
2060 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting useful knowledge from a variety of databases; they provide supervised learning in the form of classification, designing models that describe vital data classes, with the structure of the classifier based on the class attribute. Classification efficiency and accuracy are often strongly influenced by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis that localizes noisy and irrelevant features. Machine learning relies heavily on feature selection as a pre-processing step, which selects a small subset of features from the full set, reducing the feature space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of a given data sample by searching for a small set of important features that can yield good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is employed, with an external classifier used for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea. The proposed method improves the average accuracy across different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
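A minimal sketch of the general technique named above (wrapper-based feature selection driven by a genetic algorithm and scored by a KNN classifier) is shown below; the dataset, population size and mutation rate are illustrative assumptions, not the authors' exact heuristic.

```python
# Minimal sketch of a GA-driven wrapper feature selector scored by a KNN
# classifier (an illustration of the general technique; parameter values and
# the dataset are assumptions).

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Wrapper score: cross-validated KNN accuracy on the selected feature subset."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Initial random population of feature masks (chromosomes).
population = rng.random((20, n_features)) < 0.5

for generation in range(10):
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[::-1][:10]]     # selection
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, n_features)                    # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < 0.02                 # mutation
        children.append(np.logical_xor(child, flip))
    population = np.vstack([parents, children])

# Rank features by how often they occur in the surviving chromosomes,
# mirroring the occurrence-count rule mentioned in the abstract.
occurrence = population.sum(axis=0)
selected = np.argsort(occurrence)[::-1][:10]
print("Top features by occurrence:", selected)
print("Accuracy with selected subset:",
      round(fitness(np.isin(np.arange(n_features), selected)), 3))
```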

Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection

Procedia PDF Downloads 310
2059 The Loss of Oral Performative Semantic Influence of the Qur'an in Its Translations

Authors: Alalddin Al-Tarawneh

Abstract:

In its literal translation, the Qur’an is frequently subject to misinterpretation as a result of failures to deliver its meaning into any language. This paper relies on the genuine aspect that the Qur’an is, in its nature, an oral performance, and that the objective of any Qur’an translation is to deliver its meaning in English. Therefore, it approaches the translation of the Qur’an beyond the usual formal linguistic approach in order to include an extra-textual factor. This factor is the recitation or oral performance of the Qur’an, that is, tajweed as it is termed in Arabic. The translations used in this paper to apply the suggested approach were carefully chosen to be representative of the problems that exist in many Qur’an translations. These translations are The Meaning of the Holy Quran: Translation and Commentary by Ali (1989), The Meaning of the Glorious Koran by Pickthall (1997/1930), and The Quran: Arabic Text with Corresponding English Meanings by Sahih (2010). Through the examples cited in this paper, it is suggested that the agents involved in producing a ‘translation’ of the Holy Qur’an have to take into account its oral aspect, which yields additional senses and meanings that are not captured by adhering to the words of the ‘written’ discourse. This paper attempts to account for that oral aspect in translation into English.

Keywords: oral performance, tajweed, Qur'an translation, recitation

Procedia PDF Downloads 139
2058 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores

Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan

Abstract:

Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a feature vector of specific dimensions from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, an algorithm generally reduces image detail by pooling; this operation overlooks details of great concern to forensic experts. In our experiment, we adopted a variety of deep-learning-based face recognition algorithms and compared a large number of naturally collected face images with known frontal ID photos of the same persons. Downscaling and manual handling were performed on the testing images. The results supported that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled in distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of the objective method of facial comparison and provides a novel method for human-machine collaboration in this field.
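The score-level fusion mentioned above can be illustrated with a minimal sketch; the weights, the simple weighted-sum rule and the score values below are assumptions for illustration, not the fusion actually calibrated in the study.

```python
# Minimal sketch of score-level fusion of an automated system's score and a
# forensic examiner's judgement, both expressed as log-likelihood ratios
# (weights and example scores are illustrative assumptions).

import numpy as np

def fuse_llr(system_llr, examiner_llr, w_system=0.6, w_examiner=0.4):
    """Weighted sum of log-likelihood ratios from machine and human."""
    return w_system * system_llr + w_examiner * examiner_llr

def decision(fused_llr, threshold=0.0):
    """Accept the same-source hypothesis when the fused LLR exceeds the threshold."""
    return fused_llr > threshold

genuine = fuse_llr(np.array([2.1, 1.5, 0.4]), np.array([1.8, 0.9, 1.2]))
impostor = fuse_llr(np.array([-1.7, -0.2, -2.5]), np.array([-0.9, -1.4, -0.3]))
print("Genuine accepted: ", decision(genuine))    # all True
print("Impostor accepted:", decision(impostor))   # all False
```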

Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics

Procedia PDF Downloads 122
2057 Designing and Evaluating Pedagogic Conversational Agents to Teach Children

Authors: Silvia Tamayo-Moreno, Diana Pérez-Marín

Abstract:

In this paper, the possibility of children studying by using an interactive learning technology called a Pedagogic Conversational Agent is presented. The main benefit is that the agent is able to adapt the dialogue to each student and to provide automatic feedback. Moreover, according to Math teachers, in many cases students are unable to solve the problems even when they know the procedure to solve them, because they do not understand what they have to do. The hypothesis is that if students are helped to understand what they have to solve, they will be able to do it. Taking that into account, we have started the development of Dr. Roland, an agent to help students understand Math problems, following a User-Centered Design methodology. The use of this methodology is proposed, for the first time, to design pedagogic agents to teach any subject from Secondary down to Pre-Primary education. The reason behind proposing a methodology is that, while working on this project, we noticed the lack of literature on designing and evaluating such agents. To cover this gap, we describe how User-Centered Design can be applied and which usability techniques can be used to evaluate the agent.

Keywords: pedagogic conversational agent, human-computer interaction, user-centered design, natural language interface

Procedia PDF Downloads 314
2056 Sensing of Cancer DNA Using Resonance Frequency

Authors: Sungsoo Na, Chanho Park

Abstract:

Lung cancer is one of the most common fatal diseases. It can be divided into small-cell lung cancer (SCLC) and non-small-cell lung cancer (NSCLC), and about 80% of lung cancers are NSCLC. Several studies have investigated the correlation between the epidermal growth factor receptor (EGFR) and NSCLCs. Therefore, EGFR inhibitor drugs such as gefitinib and erlotinib have been used as lung cancer treatments. However, these treatments showed a low response rate (10-20%) in clinical trials due to EGFR mutations that cause drug resistance. Patients with resistance to EGFR inhibitor drugs are usually positive for a KRAS mutation. Therefore, assessment of EGFR and KRAS mutations is essential for targeted therapy of NSCLC patients. In order to overcome the limitations of conventional therapies, all EGFR and KRAS mutations have to be monitored; in this work, only the detection of EGFR is presented. A variety of techniques has been presented for the detection of EGFR mutations. The standard method for detecting EGFR mutations in ctDNA relies on real-time polymerase chain reaction (PCR), which provides highly sensitive detection. However, as the amplification steps increase, cost and complexity increase as well. Other technologies such as BEAMing, next-generation sequencing (NGS), electrochemical sensors and silicon nanowire field-effect transistors have been presented; however, those technologies suffer from low sensitivity, high cost and complex data analysis. In this report, we propose a label-free and highly sensitive detection method for lung cancer using a quartz crystal microbalance based platform. The proposed platform is able to sense lung cancer mutant DNA with a limit of detection of 1 nM.
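The relation a quartz crystal microbalance sensor relies on, between adsorbed mass and resonance-frequency shift, can be sketched with the Sauerbrey equation; the crystal constants below are standard textbook values, while the adsorbed-mass figure is an illustrative assumption rather than a result from this work.

```python
# Minimal sketch of the mass-to-frequency relation a QCM-based DNA sensor
# relies on (the Sauerbrey equation); the adsorbed-mass figure is assumed.

import math

RHO_Q = 2.648          # quartz density, g/cm^3
MU_Q = 2.947e11        # quartz shear modulus, g/(cm*s^2)

def sauerbrey_shift(f0_hz, delta_mass_g, area_cm2):
    """Frequency shift (Hz) caused by a rigid mass adsorbed on the electrode."""
    return -2.0 * f0_hz ** 2 * delta_mass_g / (area_cm2 * math.sqrt(RHO_Q * MU_Q))

if __name__ == "__main__":
    f0 = 5e6                      # 5 MHz crystal
    hybridised_dna = 10e-9        # 10 ng of captured target DNA (assumed)
    print(f"Expected shift: {sauerbrey_shift(f0, hybridised_dna, area_cm2=0.2):.2f} Hz")  # about -2.8 Hz
```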

Keywords: cancer DNA, resonance frequency, quartz crystal microbalance, lung cancer

Procedia PDF Downloads 227
2055 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems of the design stage in many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is reflected in particular in highway and railroad tunnel projects, in which there are a number of tunnels and different professional teams involved. In this regard, comprehensive software needs to be designed, using accepted methods, to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. Regarding this necessity, applied software has been designed using macro capabilities and Visual Basic for Applications (VBA) in Microsoft Excel. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, the penetration rate, and so forth, can be calculated and reported in a standard format.
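As a minimal sketch, in Python rather than the VBA described above, of how one such report field might be derived, the snippet below sums parameter ratings into a basic Rock Mass Rating (RMR); the rating values are placeholders, not the full RMR lookup tables the software would implement.

```python
# Minimal sketch of a rock mass classification output: basic RMR as the sum of
# five parameter ratings (placeholder ratings, not the full lookup tables).

def basic_rmr(ucs_rating, rqd_rating, spacing_rating, condition_rating, groundwater_rating):
    """Basic RMR is the sum of the five parameter ratings (0-100 scale)."""
    return ucs_rating + rqd_rating + spacing_rating + condition_rating + groundwater_rating

def rmr_class(rmr):
    """Map a basic RMR value to its descriptive rock-mass class."""
    if rmr > 80:
        return "I - very good rock"
    if rmr > 60:
        return "II - good rock"
    if rmr > 40:
        return "III - fair rock"
    if rmr > 20:
        return "IV - poor rock"
    return "V - very poor rock"

if __name__ == "__main__":
    rmr = basic_rmr(ucs_rating=7, rqd_rating=13, spacing_rating=10,
                    condition_rating=20, groundwater_rating=10)
    print(rmr, rmr_class(rmr))   # 60, "III - fair rock"
```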

Keywords: engineering geology, rock mass classification, rock mechanic, tunnel

Procedia PDF Downloads 70
2054 Liquid-Liquid Plug Flow Characteristics in Microchannel with T-Junction

Authors: Anna Yagodnitsyna, Alexander Kovalev, Artur Bilsky

Abstract:

The efficiency of certain technological processes in two-phase microfluidics, such as emulsion production, nanomaterial synthesis, nitration, extraction processes, etc., depends on the two-phase flow regimes in microchannels. For practical applications in chemistry and biochemistry, it is very important to predict the expected flow pattern for a large variety of fluids and channel geometries. In the case of immiscible liquids, plug flow is a typical and optimal regime for chemical reactions and needs to be predicted by empirical data or correlations. In this work, flow patterns of immiscible liquid-liquid flow in a rectangular microchannel with a T-junction are investigated. Three liquid-liquid flow systems are considered, viz. kerosene – water, paraffin oil – water and castor oil – paraffin oil. Different flow patterns such as parallel flow, slug flow, plug flow, dispersed (droplet) flow, and rivulet flow are observed for different velocity ratios. A new flow pattern, parallel flow with a steady wavy interface (serpentine flow), has been found. It is shown that flow pattern maps based on Weber numbers for different liquid-liquid systems do not match well. The Weber number multiplied by the Ohnesorge number is proposed as a parameter to generalize the flow maps. Flow maps based on this parameter superpose well for all liquid-liquid systems of this work and of other experiments. Plug length and velocity are measured for the plug flow regime. When the dispersed liquid wets the channel walls, plug length cannot be predicted by known empirical correlations. By means of the particle tracking velocimetry technique, instantaneous velocity fields in the plug flow regime were measured. Flow circulation inside the plugs was calculated using the velocity data, which can be useful for mass flux prediction in chemical reactions.
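The generalizing parameter proposed above, the Weber number multiplied by the Ohnesorge number, can be computed directly from fluid properties; the property values in the sketch below are illustrative for a water-in-kerosene-like pair, not measured data from this study.

```python
# Minimal sketch of the flow-map parameter proposed above: the Weber number
# multiplied by the Ohnesorge number (fluid properties below are illustrative).

import math

def weber(rho, velocity, length, sigma):
    """We = rho * u^2 * d / sigma (inertia vs. interfacial tension)."""
    return rho * velocity ** 2 * length / sigma

def ohnesorge(mu, rho, sigma, length):
    """Oh = mu / sqrt(rho * sigma * d) (viscous vs. inertial and capillary forces)."""
    return mu / math.sqrt(rho * sigma * length)

# Assumed properties: dispersed water phase in a 200 micrometre channel.
rho, mu, sigma = 1000.0, 1.0e-3, 0.04      # kg/m^3, Pa*s, N/m
d, u = 200e-6, 0.1                          # m, m/s

we = weber(rho, u, d, sigma)
oh = ohnesorge(mu, rho, sigma, d)
print(f"We = {we:.3f}, Oh = {oh:.4f}, We*Oh = {we * oh:.5f}")
```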

Keywords: flow patterns, hydrodynamics, liquid-liquid flow, microchannel

Procedia PDF Downloads 387
2053 GNSS-Aided Photogrammetry for Digital Mapping

Authors: Muhammad Usman Akram

Abstract:

This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D) or for further examination, exploration, research and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; such methods are also time-consuming, labour-intensive and less precise, with limited data. In comparison, the advanced technique saves manpower and provides more precise output with a wide variety of data sets. In this experiment, the aerial photogrammetry technique is used: a UAV flies over an area, captures geocoded images and a three-dimensional model (3-D model) is built. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain a dense point cloud, a digital elevation model (DEM) and an orthophoto as output. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the area surveyed. Finally, we compared the processed data with exact measurements taken on site; the error is accepted if it does not exceed the survey accuracy limits set by the concerned institutions.
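The ground sampling distance mentioned among the flight parameters follows directly from the camera geometry and the flight altitude; a minimal sketch is given below, with camera parameters that are illustrative assumptions rather than the equipment actually flown.

```python
# Minimal sketch of the ground sampling distance (GSD) calculation used when
# planning a UAV photogrammetry flight (camera parameters are assumed values).

def ground_sampling_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """GSD in cm/pixel: ground footprint of one pixel at the given flight altitude."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

if __name__ == "__main__":
    # Assumed: 120 m flight altitude, 8.8 mm focal length, 13.2 mm sensor, 5472 px wide.
    gsd = ground_sampling_distance(120, 8.8, 13.2, 5472)
    print(f"GSD = {gsd:.2f} cm/pixel")   # about 3.29 cm/pixel
```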

Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry

Procedia PDF Downloads 13
2052 Effect of a Chemical Mutagen on Seed Germination of Lima Bean

Authors: G. Ultanbekova, Zh. Suleimenova, Zh. Rakhmetova, G. Mombekova, S. Mantieva

Abstract:

Plant Growth Promoting Rhizobacteria (PGPR) are a group of free-living bacteria that colonize the rhizosphere, enhance the growth of many cereals and other important agricultural crops, and protect plants from disease and abiotic stresses through a wide variety of mechanisms. The use of PGPR has been proven to be an environmentally sound way of increasing crop yields by facilitating plant growth. In the present study, strain improvement of PGPR isolates was carried out by chemical mutagenesis to improve the growth and yield of lima bean. Induced mutagenesis is widely used for the selection of microorganisms producing biologically active substances and for further improving their activities. Strain improvement is usually done by classical mutagenesis, which involves exposing the microbes to chemical or physical mutagens. The strains Pseudomonas putida 4/1, Azotobacter chroococcum Р-29 and Bacillus subtilis were subjected to mutation by treatment with a chemical agent (sodium nitrite) and were observed for its consequent effect on seed germination and plant growth of lima bean (Phaseolus lunatus). The bacterial mutant strains Pseudomonas putida M-1, Azotobacter chroococcum M-1 and Bacillus subtilis M-1, treated with sodium nitrite at a concentration of 5 mg/ml for 120 min, were found to be more effective than the parent strains in enhancing the germination of lima bean seeds. Moreover, treatment of the lima bean seeds with the mutant strain Bacillus subtilis M-1 had a significant stimulating effect on plant growth: the lengths of the stems and roots of lima bean treated with Bacillus subtilis M-1 increased significantly, by factors of 1.6 and 1.3, respectively, in comparison with the parent strain.

Keywords: chemical mutagenesis, germination, lima bean, plant growth promoting rhizobacteria (PGPR)

Procedia PDF Downloads 192
2051 The Analysis of Changes in Urban Hierarchy of Isfahan Province in the Fifty-Year Period (1956-2006)

Authors: Hamidreza Joudaki, Yousefali Ziari

Abstract:

The emergence of the city and of urbanism is one of the important processes that have affected social communities. Industrialization and urbanism have developed alongside each other throughout history; indeed, they have had a straightforward relationship for more than six thousand years, that is, since the appearance of the first cities. In the 18th century, with the emergence of industrial capitalism, urbanism developed rapidly across the world. In Iran, each regional city made its own decisions, and the regional capital was the only central place, controlling its realm without any hierarchy. Over the last three decades, however, this pattern of governance has changed because of political, social and economic changes that have altered the rural-urban relationship; this has also changed the variety of urban functions and the systematic urban network in Iran. Today, the urban system shows a very large imbalance in space and performance. In Isfahan, the trend of urbanism resembles that of the other parts of Iran, and the urban hierarchy is neither suitable nor normal. This article is quantitative and analytical. The statistical population is the cities of Isfahan Province, and the changes in the urban network and its hierarchy over a fifty-year period (1956-2006) have been surveyed. The data have been analysed using the rank-size model and the entropy index. The article introduces the cities of Iran, the entropy factor of the primate city and the urban hierarchy of Isfahan Province. The urban share of the province's population rose from 55 percent to 83 percent (2006). The analytical data reflect a mismatch and imbalance between the cities: the entropy index was 0.91 in 1956 and decreased to 0.63 in 2006. The city of Isfahan is the primate city throughout these periods. Moreover, the second and third cities show a population gap relative to the other cities and, finally, the system does not follow the rank-size rule.
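The two measures named above, the rank-size model and the entropy index, can be illustrated with a minimal sketch; the city populations used below are invented for illustration, not the census data of Isfahan Province.

```python
# Minimal sketch of the rank-size rule and a normalised entropy index of an
# urban hierarchy (illustrative populations, not census data).

import math

def rank_size_expected(populations):
    """Expected population of each city if the rank-size rule P_r = P_1 / r held exactly."""
    largest = max(populations)
    return [largest / rank for rank in range(1, len(populations) + 1)]

def entropy_index(populations):
    """Normalised Shannon entropy: 1 = perfectly balanced sizes, toward 0 = extreme primacy."""
    total = sum(populations)
    shares = [p / total for p in populations]
    h = -sum(s * math.log(s) for s in shares if s > 0)
    return h / math.log(len(populations))

if __name__ == "__main__":
    cities = [1_900_000, 350_000, 200_000, 150_000, 120_000]   # assumed populations
    print("Expected under rank-size:",
          [round(p) for p in rank_size_expected(sorted(cities, reverse=True))])
    print("Entropy index:", round(entropy_index(cities), 2))   # about 0.62
```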

Keywords: urban network, urban hierarchy, primate city, Isfahan province, urbanism, first cities

Procedia PDF Downloads 248
2050 A Corpus-Based Study on the Lexical, Syntactic and Sequential Features across Interpreting Types

Authors: Qianxi Lv, Junying Liang

Abstract:

Among the various modes of interpreting, simultaneous interpreting (SI) is regarded as a ‘complex’ and ‘extreme condition’ of cognitive tasks, while consecutive interpreters (CI) do not have to share processing capacity between tasks. Given that SI exerts great cognitive demand, it makes sense to posit that the output of SI may be more compromised than that of CI in its linguistic features. The bulk of the research has stressed the varying cognitive demand and processes involved in different modes of interpreting; however, related empirical research is sparse. In keeping with our interest in investigating the quantitative linguistic factors discriminating between SI and CI, the current study seeks to examine the potential lexical simplification, syntactic complexity and sequential organization mechanisms with a self-made inter-modal corpus of transcribed simultaneous and consecutive interpretation, translated speech and original speech texts, with a total of 321,960 running words. The lexical features are extracted in terms of lexical density, list head coverage, hapax legomena and type-token ratio, as well as core vocabulary percentage. Dependency distance, an index of syntactic complexity reflective of processing demand, is employed. The frequency motif, a non-grammatically-bound sequential unit, is also used to visualize the local function distribution of the interpreting output. While SI is generally regarded as multitasking with a high cognitive load, our findings show that CI may tax cognitive resources differently, and in some respects more heavily, and hence yields more lexically and syntactically simplified output. In addition, the sequential features manifest that SI and CI organize the sequences from the source text into the output in different ways, in order to minimize the cognitive load respectively. We interpret the results within the framework that cognitive demand is exerted on both the maintenance and the coordination components of working memory. On the one hand, the information maintained in CI is inherently larger in volume compared to SI. On the other hand, time constraints directly influence the sentence reformulation process. The temporal pressure from the input in SI makes the interpreters keep only a small chunk of information in the focus of attention. Thus, SI interpreters usually produce the output by largely retaining the source structure so as to release the information from working memory immediately after it is formulated in the target language. Conversely, CI interpreters receive at least a few sentences before reformulation, when they are more self-paced. CI interpreters may thus tend to retain and generate the information in a way that lessens the demand. In other words, interpreters cope with the high demand in the reformulation phase of CI by generating output with densely distributed function words, more content words of higher frequency values and fewer variations, simpler structures and more frequently used language sequences. We consequently propose a revised effort model based on these results for a better illustration of cognitive demand during both interpreting types.
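Two of the lexical measures listed above, the type-token ratio and lexical density, can be illustrated with a minimal sketch; the toy transcript and the stop-word test below are crude stand-ins for the corpus and for proper part-of-speech tagging.

```python
# Minimal sketch of two lexical measures used above: type-token ratio and
# lexical density (toy transcript; stop-word list stands in for POS tagging).

def type_token_ratio(tokens):
    """Distinct word forms divided by total running words."""
    return len(set(tokens)) / len(tokens)

def lexical_density(tokens, function_words):
    """Share of running words that are content words rather than function words."""
    content = [t for t in tokens if t not in function_words]
    return len(content) / len(tokens)

if __name__ == "__main__":
    transcript = ("the delegates welcomed the proposal and the delegates "
                  "agreed to review the proposal next session").split()
    stopwords = {"the", "and", "to", "a", "of", "in"}
    print("TTR:", round(type_token_ratio(transcript), 2))                  # 0.67
    print("Lexical density:", round(lexical_density(transcript, stopwords), 2))  # 0.60
```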

Keywords: cognitive demand, corpus-based, dependency distance, frequency motif, interpreting types, lexical simplification, sequential units distribution, syntactic complexity

Procedia PDF Downloads 168
2049 Comparative Proteomic Analysis of Rice bri1 Mutant Leaves at Jointing-Booting Stage

Authors: Jiang Xu, Daoping Wang, Yinghong Pan

Abstract:

The jointing-booting stage is a critical period of both vegetative growth and reproductive growth in rice. Therefore, proteomic analysis at the jointing-booting stage of the mutant Osbri1, whose corresponding gene OsBRI1 encodes the putative BR receptor OsBRI1, is very important for understanding the effects of BRs on vegetative and reproductive growth. In this study, the proteomes of leaves from an allelic mutant of the DWARF 61 (D61, OsBRI1) gene, Fn189 (dwarf54, d54), and its wild-type variety T65 (Taichung 65) at the jointing-booting stage were analysed using a Q Exactive Plus Orbitrap mass spectrometer, and more than 3,100 proteins were identified in each sample. Ontology analysis showed that these proteins are distributed in various compartments of the cells, such as the chloroplast, mitochondrion and nucleus; they function as structural components and/or catalytic enzymes and are involved in many physiological processes. Moreover, quantitative analysis showed that 266 proteins were differentially expressed in the two samples; among them, 77 proteins decreased and 189 increased more than two-fold in Fn189 compared with T65. The proteins whose content decreased in Fn189 include a b5-like Heme/Steroid binding domain containing protein, a putative retrotransposon protein and a putative glutaminyl-tRNA synthetase, while proteins with higher content include an mTERF, a putative oligopeptidase homologue, a zinc knuckle protein, and so on. A former study found that the transcription level of an mTERF was up-regulated in the leaves of maize seedlings after EBR treatment. In our experiments, it was interesting that one mTERF protein increased while another mTERF decreased in the leaves of Fn189 at the jointing-booting stage, which suggests that BRs may regulate the expression of different mTERF proteins differentially. The relationship between the other differential proteins and BRs is still unclear, and the effects of BRs on rice protein contents and the underlying regulation mechanisms still need further research.

Keywords: bri1 mutant, jointing-booting stage, proteomic analysis, rice

Procedia PDF Downloads 240
2048 A Way to Reduce Industrial Energy Intensity in a Chinese Province for Energy Conservation

Authors: John Doe

Abstract:

This paper presents the research project “Escape Through Culture”, which is co-funded by the European Union and national resources through the Operational Programme “Competitiveness, Entrepreneurship and Innovation” 2014-2020 and the Single RTDI State Aid Action "RESEARCH - CREATE - INNOVATE". The project implementation is assumed by three partners, (1) the Computer Technology Institute and Press "Diophantus" (CTI), experienced with the design and implementation of serious games, natural language processing and ICT in education, (2) the Laboratory of Environmental Communication and Audiovisual Documentation (LECAD), part of the University of Thessaly, Department of Architecture, which is experienced with the study of creative transformation and reframing of the urban and environmental multimodal experiences through the use of AR and VR technologies, and (3) “Apoplou”, an IT Company with experience in the implementation of interactive digital applications. The research project proposes the design of innovative infrastructure of digital educational escape games for mobile devices and computers, with the use of Virtual Reality and Augmented Reality for the promotion of Greek cultural heritage in Greece and abroad. In particular, the project advocates the combination of Greek cultural heritage and literature, digital technologies advancements and the implementation of innovative gamifying practices. The cultural experience of the players will take place in 3 layers: (1) In space: the digital games produced are going to utilize the dual character of the space as a cultural landscape (the real space - landscape but also the space - landscape as presented with the technologies of augmented reality and virtual reality). (2) In literary texts: the selected texts of Greek writers will support the sense of place and the multi-sensory involvement of the user, through the context of space-time, language and cultural characteristics. (3) In the philosophy of the "escape game" tool: whether played in a computer environment, indoors or outdoors, the spatial experience is one of the key components of escape games. The innovation of the project lies both in the junction of Augmented/Virtual Reality with the promotion of cultural points of interest, as well as in the interactive, gamified practices of literary texts. The digital escape game infrastructure will be highly interactive, integrating the projection of Greek landscape cultural elements and digital literary text analysis, supporting the creation of escape games, establishing and highlighting new playful ways of experiencing iconic cultural places, such as Elefsina, Skiathos etc. The literary texts’ content will relate to specific elements of the Greek cultural heritage depicted by prominent Greek writers and poets. The majority of the texts will originate from Greek educational content available in digital libraries and repositories developed and maintained by CTI. The escape games produced will be available for use during educational field trips, thematic tourism holidays, etc. In this paper, the methodology adopted for infrastructure development will be presented. The research is based on theories of place, gamification, gaming development, making use of corpus linguistics concepts and digital humanities practices for the compilation and the analysis of literary texts.

Keywords: escape games, cultural landscapes, gamification, digital humanities, literature

Procedia PDF Downloads 231
2047 Conversion of Sweet Sorghum Bagasse to Sugars for Succinic Acid Production

Authors: Enlin Lo, Ioannis Dogaris, George Philippidis

Abstract:

Succinic acid is a compound used for manufacturing lacquers, resins, and other coating chemicals. It is also used in the food and beverage industry as a flavor additive. It is predominantly manufactured from petrochemicals, but it can also be produced by fermentation of sugars from renewable feedstocks, such as plant biomass. Bio-based succinic acid has great potential to become a platform chemical (building block) for commodity and high-value chemicals. In this study, the production of bio-based succinic acid from sweet sorghum was investigated. Sweet sorghum has a high fermentable sugar content and can be cultivated in a variety of climates. In order to avoid competition with food feedstocks, its non-edible ‘bagasse’ (the fiber part left after extracting the juice) was targeted. Initially, various conditions for pretreating sweet sorghum bagasse (SSB) were studied in an effort to remove most of the non-fermentable components and expose the cellulosic fiber containing the fermentable sugars (glucose). Concentrated (83%) phosphoric acid was utilized at temperatures of 50-80 °C for 30-60 min at various SSB loadings (10-15%), coupled with enzymatic hydrolysis using a commercial cellulase (Ctec2, Novozymes) enzyme, to identify the conditions that lead to the highest glucose yields for subsequent fermentation to succinic acid. As the pretreatment temperature and duration increased, the bagasse color changed from light brown to dark brown-black, indicating decomposition, which ranged from 15% to 72%, while the theoretical glucose yield is 91%. With statistical analysis in the Minitab software, a model was built to identify the optimal pretreatment condition for maximum glucose release. The projected theoretical bio-based succinic acid production is 23 g per 100 g of SSB, which will be confirmed with fermentation experiments using the bacterium Actinobacillus succinogenes.
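The yield arithmetic behind a projection of this kind can be sketched as a chain of conversion factors; in the snippet below the cellulose fraction and the fermentation yield are assumed values chosen only to illustrate how a figure near 23 g per 100 g SSB can arise, while the 91% glucose recovery comes from the abstract and the 180/162 factor from stoichiometry.

```python
# Minimal sketch of the yield arithmetic behind a succinic acid projection
# (cellulose fraction and fermentation yield are assumptions for illustration).

def projected_succinic_acid(bagasse_g,
                            cellulose_fraction=0.35,          # assumed glucan content
                            hydration_factor=180.0 / 162.0,   # glucan -> glucose mass gain
                            glucose_recovery=0.91,            # theoretical yield quoted above
                            succinic_per_glucose=0.66):       # assumed fermentation yield, g/g
    glucose = bagasse_g * cellulose_fraction * hydration_factor * glucose_recovery
    return glucose * succinic_per_glucose

print(f"{projected_succinic_acid(100):.1f} g succinic acid per 100 g SSB")  # about 23.4 g
```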

Keywords: biomass, cellulose, enzymatic hydrolysis, fermentation, pretreatment, succinic acid

Procedia PDF Downloads 210
2046 How to Modernise the ECN

Authors: Dorota Galeza

Abstract:

This paper argues that networks, such as the ECN and the American network, are affected by certain small events which are inherent to path dependence and preclude the full evolution towards efficiency. It is advocated that the American network is superior to the ECN in many respects due to its greater flexibility and longer history. This stems in particular from the creation of the American network, which was based on a small number of cases. Such structure encourages further changes and modifications which are not necessarily radical. The ECN, by contrast, was established by legislative action, which explains its rigid structure and resistance to change. It might be the case that the ECN is subject not so much to path dependence but to past dependence. It might have to be replaced, as happened to its predecessor. This paper is an attempt to transpose the superiority of the American network on to the ECN. It looks at concepts such as judicial cooperation, harmonization of procedure, peer review and regulatory impact assessments (RIAs), and dispute resolution procedures. The aim is to adopt these concepts into the EU setting without recourse to legal transplantation. The major difficulty is that many of these concepts have been tested only in the US and it is difficult to tell whether they could be modified to meet EU standards. Concepts such as judicial cooperation might be difficult due to different language traditions in EU member states. It is hoped that greater flexibility, as in the American network, would boost legitimacy and transparency.

Keywords: ECN, networks, regulation, competition

Procedia PDF Downloads 421
2045 A Sustainable Approach for Waste Management: Automotive Waste Transformation into High Value Titanium Nitride Ceramic

Authors: Mohannad Mayyas, Farshid Pahlevani, Veena Sahajwalla

Abstract:

Automotive shredder residue (ASR) is an industrial waste, generated during the recycling process of End-of-life vehicles. The large increasing production volumes of ASR and its hazardous content have raised concerns worldwide, leading some countries to impose more restrictions on ASR waste disposal and encouraging researchers to find efficient solutions for ASR processing. Although a great deal of research work has been carried out, all proposed solutions, to our knowledge, remain commercially and technically unproven. While the volume of waste materials continues to increase, the production of materials from new sustainable sources has become of great importance. Advanced ceramic materials such as nitrides, carbides and borides are widely used in a variety of applications. Among these ceramics, a great deal of attention has been recently paid to Titanium nitride (TiN) owing to its unique characteristics. In our study, we propose a new sustainable approach for ASR management where TiN nanoparticles with ideal particle size ranging from 200 to 315 nm can be synthesized as a by-product. In this approach, TiN is thermally synthesized by nitriding pressed mixture of automotive shredder residue (ASR) incorporated with titanium oxide (TiO2). Results indicated that TiO2 influences and catalyses degradation reactions of ASR and helps to achieve fast and full decomposition. In addition, the process resulted in titanium nitride (TiN) ceramic with several unique structures (porous nanostructured, polycrystalline, micro-spherical and nano-sized structures) that were simply obtained by tuning the ratio of TiO2 to ASR, and a product with appreciable TiN content of around 85% was achieved after only one hour nitridation at 1550 °C.

Keywords: automotive shredder residue, nano-ceramics, waste treatment, titanium nitride, thermal conversion

Procedia PDF Downloads 290
2044 Fabrication of Superhydrophobic Galvanized Steel by Sintering Zinc Nanopowder

Authors: Francisco Javier Montes Ruiz-Cabello, Guillermo Guerrero-Vacas, Sara Bermudez-Romero, Miguel Cabrerizo Vilchez, Miguel Angel Rodriguez-Valverde

Abstract:

Galvanized steel is one of the most widespread metallic materials used in industry. It consists of an iron-based alloy (steel) coated with a zinc layer of variable thickness. The zinc layer is intended to protect the underlying steel from corrosion and staining. Its production is cheaper than that of stainless steel, which is why it is employed in the construction of large structures in aeronautics, urban and industrial building, or ski resorts. In all these applications, turning the natural hydrophilicity of the metal surface into superhydrophobicity is particularly interesting and would open up a wide variety of additional functionalities. However, producing a superhydrophobic surface on galvanized steel may be a very difficult task. Superhydrophobic surfaces are characterized by a specific surface texture, which is reached either by coating the surface with a material that incorporates such a texture or by applying one of several roughening methods. Since galvanized steel is already a coated material, the incorporation of a second coating may be undesirable. On the other hand, the methods that are commonly used to create the surface texture leading to superhydrophobicity in metals are aggressive and may damage the surface. In this work, we used a novel strategy whose goal is to produce superhydrophobic galvanized steel by a two-step, non-aggressive process. The first step aims to create a hierarchical structure by sintering zinc nanoparticles onto the surface at a temperature slightly below the melting point of zinc. The second is hydrophobization by deposition of a thick fluoropolymer layer. The wettability of the samples is characterized by tilting plate and bouncing drop experiments, while the roughness is analyzed by confocal microscopy. The durability of the produced surfaces was also explored.

Keywords: galvanized steel, superhydrophobic surfaces, sintering nanoparticles, zinc nanopowder

Procedia PDF Downloads 144
2043 Evaluation and Analysis of ZigBee-Based Wireless Sensor Network: Home Monitoring as Case Study

Authors: Omojokun G. Aju, Adedayo O. Sule

Abstract:

The ZigBee wireless sensor and control network is one of the most popularly deployed wireless technologies in recent years. This is because ZigBee is an open-standard, lightweight, low-cost, low-speed, low-power protocol that allows true interoperability between systems. It is built on the existing IEEE 802.15.4 protocol and therefore combines the IEEE 802.15.4 features with newly added features to meet the required functionality, thereby finding applications in a wide variety of wireless networked systems. ZigBee’s current focus is on embedded applications of general-purpose, inexpensive, self-organising networks which require low-to-medium data rates, high numbers of nodes and very low power consumption, such as home/industrial automation, embedded sensing, medical data collection, smart lighting, safety and security sensor networks, and monitoring systems. Although the ZigBee design specification includes security features to protect the confidentiality and integrity of data communication, security is normally traded off when simplicity and low cost are the goals. A great deal of research has been carried out on ZigBee technology, with emphasis mainly placed on ZigBee network performance characteristics such as energy efficiency, throughput, robustness, packet delay and delivery ratio in different scenarios and applications. This paper investigates and analyses the data accuracy, network implementation difficulties and security challenges of ZigBee network applications in star-based and mesh-based topologies, with emphasis on the home monitoring application, using ZigBee ProBee ZE-10 development boards for the network setup. The paper also exposes some factors that need to be considered when designing ZigBee network applications and suggests ways in which ZigBee networks can be designed to be more resilient to network attacks.

Keywords: home monitoring, IEEE 802.15.4, topology, wireless security, wireless sensor network (WSN), ZigBee

Procedia PDF Downloads 374
2042 Comparative Analysis of Automation Testing Tools

Authors: Amit Bhanushali

Abstract:

In the ever-changing landscape of software development, automated software testing has emerged as a critical component of the Software Development Life Cycle (SDLC). This research undertakes a comparative study of three major automated testing tools -UFT, Selenium, and RPA- evaluating them on usability, maintenance, and effectiveness. Leveraging existing JAVA-based applications as test cases, the study aims to guide testers in selecting the optimal tool for specific applications. By exploring key features such as source and licensing, testing expenses, object repositories, usability, and language support, the research provides practical insights into UFT, Selenium, and RPA. Acknowledging the pivotal role of these tools in streamlining testing processes amid time constraints and resource limitations, the study assists professionals in making informed choices aligned with their organizational needs.

Keywords: software testing tools, software development lifecycle (SDLC), test automation frameworks, automated software, JAVA-based, UFT, selenium and RPA (robotic process automation), source and licensing, object repository

Procedia PDF Downloads 87
2041 The Training Demands of Nursing Assistants on Urinary Incontinence in Nursing Homes: A Mixed Methods Study

Authors: Lulu Liao, Huijing Chen, Yinan Zhao, Hongting Ning, Hui Feng

Abstract:

The urinary tract infection rate is an important index of care quality in nursing homes. The aim of the study is to understand nursing assistants' current knowledge of and attitudes toward urinary incontinence and to explore related stakeholders' viewpoints on urinary incontinence training. This explanatory sequential study used the Knowledge, Attitude and Practice (KAP) model and adult learning theories as the conceptual framework. The researchers collected data from 509 nursing assistants in sixteen nursing homes in Hunan province, China. The questionnaire survey assessed the knowledge and attitudes of nursing assistants regarding urinary incontinence. On the basis of the quantitative results, combined with a focus group, training demands were identified that nurse managers should address to improve nursing assistants' professional practice in urinary incontinence care. Most nursing assistants held poor knowledge (14.0 ± 4.18) but had positive attitudes (35.5 ± 3.19) toward urinary incontinence. There was a significant positive correlation between urinary incontinence knowledge and nursing assistants' years of work and educational level, and between urinary incontinence attitude and educational level (p < 0.001). Despite a general awareness of the importance of preventing urinary tract infections, not all nurse managers fully valued training in urinary incontinence compared with daily care training. Moreover, the nursing assistants required simple educational resources to equip them with skills to address urinary incontinence problems. The variety of learning methods also highlighted the need for educational materials, and nursing assistants showed a strong interest in online learning. Related educational material should be developed to meet the learning needs of nursing assistants and to provide a suitable training method for planned quality improvement in urinary incontinence care.

Keywords: mixed methods, nursing assistants, nursing homes, urinary incontinence

Procedia PDF Downloads 131
2040 Social Networking Applications: What Is Their Quality and How Can They Be Adopted in Open Distance Learning Environments?

Authors: Asteria Nsamba

Abstract:

Social networking applications and tools have become compelling platforms for generating and sharing knowledge across the world. They refer to a variety of social media platforms, including Facebook, Twitter, WhatsApp, blogs and wikis. The most popular of these platforms is Facebook, with 2.41 billion monthly active users, followed by WhatsApp with 1.6 billion users and Twitter with 330 million users. These communication platforms have not only impacted social lives but have also impacted students' learning across different delivery modes in higher education: distance, conventional and blended learning. With this amount of interest in these platforms, knowledge sharing has gained importance within the context in which it is required. In open distance learning (ODL) contexts, social networking platforms can offer students and teachers a platform on which to create and share knowledge and to form learning collaborations. Thus, they can serve as support mechanisms that increase interaction and reduce the isolation and loneliness inherent in ODL. Despite this potential and opportunity, research indicates that many ODL teachers are not inclined to use social media tools in learning. Although it is unclear why these tools are uncommon in such environments, concerns raised in the literature indicate that many teachers have not mastered the art of teaching with technology. Using technological pedagogical content knowledge (TPACK), product quality theory and Bloom's Taxonomy as lenses, this paper aims, firstly, to assess the quality of three social media applications (Facebook, Twitter and WhatsApp) in order to determine the extent to which they are suitable platforms for teaching and learning, in terms of content generation, information sharing and learning collaboration. Secondly, the paper demonstrates the application of teaching, learning and assessment using Bloom's Taxonomy.

Keywords: distance education, quality, social networking tools, TPACK

Procedia PDF Downloads 119
2039 A Digital Environment for Developing Mathematical Abilities in Children with Autism Spectrum Disorder

Authors: M. Isabel Santos, Ana Breda, Ana Margarida Almeida

Abstract:

Research on the academic abilities of individuals with autism spectrum disorder (ASD) underlines the importance of mathematics interventions. Yet the development of digital applications for children and youth with ASD continues to attract little attention, namely regarding the development of mathematical reasoning, even though digital technologies are an area of great interest for individuals with this disorder and their use is certainly a facilitative strategy for developing their mathematical abilities. The use of digital technologies can be an effective way to create innovative learning opportunities for these students and to develop creative, personalized and constructive environments where they can develop differentiated abilities. Children with ASD often respond well to learning activities involving information presented visually. In this context, we present the digital Learning Environment on Mathematics for Autistic children (LEMA), a research project leading to a PhD in Multimedia in Education, developed by the Thematic Line Geometrix, located in the Department of Mathematics, in collaboration with the DigiMedia Research Center of the Department of Communication and Art (University of Aveiro, Portugal). LEMA is a digital mathematical learning environment whose activities are dynamically adapted to the user's profile, aimed at developing the mathematical abilities of children aged 6-12 years diagnosed with ASD. LEMA has already been evaluated with end-users (both students and expert teachers) and, based on the analysis of the collected data, readjustments were made, enabling the continuous improvement of the prototype, namely through the integration of universal design for learning (UDL) approaches, which are of utmost importance in ASD due to its heterogeneity. The learning strategies incorporated in LEMA are: (i) it provides options for customized choice of math activities according to the user's profile; (ii) it integrates simple interfaces with few elements, presenting only the features and content needed for the ongoing task; (iii) it uses simple visual and textual language; (iv) it uses different types of feedback (auditory, visual, positive/negative reinforcement, hints with helpful instructions including math concept definitions, solved math activities split into easier tasks and, finally, videos/animations that show a solution to the proposed activity); (v) it provides information in multiple representations, such as text, video, audio and image, for better content and vocabulary understanding, in order to stimulate, motivate and engage users in mathematical learning, also helping users focus on the content; (vi) it avoids elements that distract or interfere with focus and attention; (vii) it provides clear instructions and orientation about tasks to ease the user's understanding of the content and its language, in order to stimulate, motivate and engage the user; and (viii) it uses buttons, familiar icons and contrast between font and background. Since these children may have low sensory tolerance and impaired motor skills, besides being able to interact with LEMA through the mouse (point and click with a single button), the user can also interact with LEMA through a Kinect device (using simple gesture moves).

Keywords: autism spectrum disorder, digital technologies, inclusion, mathematical abilities, mathematical learning activities

Procedia PDF Downloads 109
2038 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or on Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing - total variation minimization) algorithms to extract more reliable quantitative information from the 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is also an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction/segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level or the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
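The total-variation idea behind the CS-TVM reconstructions can be illustrated on a much simpler problem, 2D TV denoising of a noisy phantom slice against a known reference; this sketch is a stand-in for, not an implementation of, full tomographic TVM reconstruction, and it assumes scikit-image is available.

```python
# Minimal sketch of total-variation regularisation, shown as 2D TV denoising of
# a noisy phantom slice with error measured against the known reference
# (a simplified stand-in for full CS-TVM tomographic reconstruction).

import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.restoration import denoise_tv_chambolle

phantom = shepp_logan_phantom()                       # known reference object
noisy = phantom + 0.15 * np.random.default_rng(0).standard_normal(phantom.shape)

denoised = denoise_tv_chambolle(noisy, weight=0.1)    # TV regularisation

# Quantify accuracy against the known phantom, in the spirit of the
# material-realistic 3D models proposed in the abstract.
rmse_noisy = np.sqrt(np.mean((noisy - phantom) ** 2))
rmse_tv = np.sqrt(np.mean((denoised - phantom) ** 2))
print(f"RMSE noisy: {rmse_noisy:.3f}, RMSE after TV: {rmse_tv:.3f}")
```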

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 75
2037 Evaluation of Elemental Impurities in Drugs According to Pharmacopoeia by Use of the FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety with respect to 24 elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with an energy-dispersive spectrometer can be applied as an alternative analysis method that gives quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis times, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time and minimal analysis error. Using simple dilution for powders or direct analysis for liquids, we analyzed the usefulness of the EDS method in testing with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy-dispersive spectrometer (XFlash6l10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µ of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage set according to the sample thickness; the result for each spot is given in atomic percentage and, by an Avogadro-based conversion factor, the final result is expressed in micrograms. Conclusion and recommendation: the conclusion of this study is that FESEM-EDS can be applied, in line with the US Pharmacopeia and the ICH Q3D guideline, as a high-precision, accurate method for elemental impurity analysis of drugs or bulk materials, determining the permitted daily exposure (PDE) in liquid or solid specimens and obtaining better results than other techniques, since it does not require complex methods or digestion chemicals, which interfere with the final results, and the sample can be kept for re-analysis at any time. The recommendation is to adopt this technique in the pharmacopoeia as a standard method alongside the inductively coupled plasma methods ICP-AES, ICP-OES and ICP-MS.
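The conversion from the atomic-percent output of an EDS spot analysis to a mass-based figure can be sketched as follows; the element list and atomic-percent values are illustrative, not a measured impurity profile, and the atomic masses are standard values.

```python
# Minimal sketch of converting the atomic-percent output of an EDS analysis to
# weight percent using standard atomic masses (illustrative element values).

ATOMIC_MASS = {"Pb": 207.2, "Cd": 112.41, "As": 74.92, "Hg": 200.59}

def atomic_to_weight_percent(atomic_percent):
    """Weight % of each element: at% * atomic mass, renormalised to 100."""
    weighted = {el: pct * ATOMIC_MASS[el] for el, pct in atomic_percent.items()}
    total = sum(weighted.values())
    return {el: 100.0 * w / total for el, w in weighted.items()}

if __name__ == "__main__":
    measured = {"Pb": 0.8, "Cd": 0.5, "As": 0.4, "Hg": 0.3}   # at%, assumed values
    for element, wt in atomic_to_weight_percent(measured).items():
        print(f"{element}: {wt:.1f} wt%")
```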

Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration

Procedia PDF Downloads 112
2036 Determination of Temperature Dependent Characteristic Material Properties of Commercial Thermoelectric Modules

Authors: Ahmet Koyuncu, Abdullah Berkan Erdogmus, Orkun Dogu, Sinan Uygur

Abstract:

Thermoelectric modules are integrated with electronic components to keep their temperature at specific values in electronic cooling applications. They can be used at different ambient temperatures. The cold-side temperatures of thermoelectric modules depend on their hot-side temperatures, operating currents, and heat loads. Performance curves of thermoelectric modules are given for at most two different hot-surface temperatures in product catalogs. Characteristic properties are required to select appropriate thermoelectric modules in the thermal design phase of projects. Generally, manufacturers do not provide the characteristic material property values of thermoelectric modules to customers, for reasons of confidentiality. Commonly applied commercial software packages such as ANSYS ICEPAK, FloEFD, etc., include thermoelectric modules in their libraries. Therefore, they can easily be used to predict the effect of thermoelectric usage in thermal design. Some software requires only the performance values at different temperatures. However, other packages, like ICEPAK, require three temperature-dependent equations for the material properties (Seebeck coefficient (α), electrical resistivity (β), and thermal conductivity (γ)). Since the number and variety of thermoelectric modules included in this software are limited, definitions of the characteristic material properties of other thermoelectric modules may be required. In this manuscript, a method for deriving characteristic material properties from the datasheet of thermoelectric modules is presented. Material characteristics were estimated from two different performance curves both experimentally and numerically in this study. Numerical calculations were accomplished in ICEPAK using a thermoelectric module that exists in the ICEPAK library. A new experimental setup was established to perform the experimental study. Because the numerical and experimental results are similar, the proposed equations can be considered validated. This approximation can be suggested for analyses that include different types or brands of TEC modules.
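A minimal sketch of turning a few datasheet points into the three temperature-dependent property equations that software such as ICEPAK expects is given below; the sample temperatures and property values are illustrative, not a real module's data.

```python
# Minimal sketch of fitting temperature-dependent property equations
# (Seebeck coefficient, resistance, conductance) to a few datasheet points
# using quadratic polynomials (sample values are illustrative assumptions).

import numpy as np

# Assumed hot-side temperatures (K) and derived module-level properties.
T = np.array([300.0, 325.0, 350.0])
seebeck = np.array([0.050, 0.052, 0.053])       # V/K   (alpha)
resistance = np.array([1.80, 1.95, 2.10])       # ohm   (beta, module level)
conductance = np.array([0.55, 0.57, 0.60])      # W/K   (gamma, module level)

def quadratic_fit(temps, values):
    """Return coefficients (a, b, c) of value(T) = a*T^2 + b*T + c."""
    return np.polyfit(temps, values, deg=2)

for name, data in [("Seebeck", seebeck), ("Resistance", resistance), ("Conductance", conductance)]:
    a, b, c = quadratic_fit(T, data)
    print(f"{name}(T) = {a:.3e}*T^2 + {b:.3e}*T + {c:.3e}")
```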

Keywords: electrical resistivity, material characteristics, thermal conductivity, thermoelectric coolers, Seebeck coefficient

Procedia PDF Downloads 172
2035 Simplifying the Migration of Architectures in Embedded Applications: Introducing a Pattern Language to Support the Workforce

Authors: Farha Lakhani, Michael J. Pont

Abstract:

There are two main architectures used to develop software for modern embedded systems: these can be labelled as “event-triggered” (ET) and “time-triggered” (TT). The research presented in this paper is concerned with the issues involved in migration between these two architectures. Although TT architectures are widely used in safety-critical applications, they are less familiar to developers of mainstream embedded systems. The research presented in this paper began from the premise that, for a broad class of systems that have been implemented using an ET architecture, migration to a TT architecture would improve reliability. It may be tempting to assume that conversion between ET and TT designs will simply involve converting all event-handling software routines into periodic activities. However, the required changes to the software architecture are, in many cases, rather more profound. The main contribution of the work presented in this paper is to identify ways in which the significant effort involved in migrating between existing ET architectures and “equivalent” (and effective) TT architectures could be reduced. The research described in this paper has taken an innovative step in this regard by introducing the use of ‘design patterns’ for this purpose for the first time.

Keywords: embedded applications, software architectures, reliability, pattern

Procedia PDF Downloads 323
2034 A Survey on Speech Emotion-Based Music Recommendation System

Authors: Chirag Kothawade, Gourie Jagtap, PreetKaur Relusinghani, Vedang Chavan, Smitha S. Bhosale

Abstract:

Psychological research has proven that music relieves stress, elevates mood, and is responsible for the release of “feel-good” chemicals like oxytocin, serotonin, and dopamine. It comes as no surprise that music has been a popular tool in rehabilitation centers and in therapy for various disorders. With the ever-rising number of people facing mental health issues across the globe, addressing mental health concerns is more crucial than ever. Despite the existing music recommendation systems, there is a dearth of holistically curated algorithms that take care of the needs of users. Given that an undeniable majority of people turn to music on a regular basis, and that music has been proven to improve cognition, memory, and sleep quality while reducing anxiety, pain, and blood pressure, there is a pressing need to fashion a product that extracts all the benefits of music in the most extensive and deployable way possible. Our project aims to ameliorate our users’ mental state by building a comprehensive mood-based music recommendation system called “Viby”.

Keywords: language, communication, speech recognition, interaction

Procedia PDF Downloads 55