Search results for: word processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4366


2836 A Pilot Study to Investigate the Use of Machine Translation Post-Editing Training for Foreign Language Learning

Authors: Hong Zhang

Abstract:

The main purpose of this study is to show that machine translation (MT) post-editing (PE) training can help Chinese students learn Spanish as a second language. Our hypothesis is that students can make better use of MT by learning PE skills tailored to foreign language learning. We developed PE training materials based on data collected in a previous study. The materials covered the characteristic error types of MT output, as well as the error types that our Chinese students of Spanish failed to detect in last year's experiment. This year we conducted a pilot study to evaluate the effectiveness of the PE training materials and the extent to which PE training helps Chinese students of Spanish. We used screen recording to capture the sessions and noted every action the students performed. Participants were Chinese speakers with intermediate knowledge of Spanish, divided into two groups: Group A received PE training and Group B did not. Both groups received the same Chinese text, which participants first translated themselves (human translation); they then translated the text with Google Translate and post-edited the raw MT output. Comparing the results of the PE test, Group A identified and corrected errors faster than Group B, and performed especially well on omission, word order, part of speech, terminology, mistranslation, official names, and formal register. These results indicate that PE training can help Chinese students learn Spanish as a second language. In future work, we will focus on the difficulties students face in their Spanish studies and complete the PE training materials for teaching Spanish to Chinese students with machine translation.
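As a rough illustration of how post-editing effort might be quantified (the study itself relied on screen recordings and manual error annotation, not on this metric), a character-level edit distance between the raw MT output and the post-edited text can be computed in a few lines of Python:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Hypothetical sample strings, invented for illustration only
raw_mt = "El estudiantes estudia espanol"
post_edited = "Los estudiantes estudian espanol"
effort = levenshtein(raw_mt, post_edited)
print(effort)
```

Higher distances suggest heavier editing; normalizing by the length of the reference string gives a score comparable across sentences.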

Keywords: machine translation, post-editing, post-editing training, Chinese, Spanish, foreign language learning

Procedia PDF Downloads 144
2835 Automatic Furrow Detection for Precision Agriculture

Authors: Manpreet Kaur, Cheol-Hong Min

Abstract:

Advances in robotics equipped with machine-vision sensors offer promising solutions to various problems on agricultural farms. An important task for such vision systems is crop row and weed detection. This paper proposes an automatic furrow detection system based on real-time processing for identifying crop rows in maize fields in the presence of weeds. The vision system is designed to be mounted on farming vehicles and is therefore subject to vibration and other undesired movements, and the images, captured in perspective, are affected by these disturbances. The goal is to identify crop rows for vehicle navigation tasks such as weed removal, where weeds are identified as plants outside the crop rows. Image quality is further affected by varying lighting conditions and by gaps along the crop rows caused by failed germination or faulty planting. The proposed image processing method consists of four stages. First, images are segmented with a decision tree over the HSV (hue, saturation, value) color space to discriminate crops, weeds, and soil; the region of interest is defined by thresholding each HSV channel between minimum and maximum values. Second, noise is removed with a hybrid median filter. Third, mathematical morphology is applied: erosion removes small objects, and dilation then gradually enlarges the boundaries of foreground regions, enhancing image contrast. Finally, to accurately locate the crop rows, a binary mask defines the region of interest, and edge detection followed by the Hough transform detects lines in polar coordinates, with furrow directions appearing as accumulations along the angle axis of the Hough space. Experimental results show that the method is effective.
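The HSV thresholding step can be sketched, for illustration only, with the standard library's `colorsys` module; the hue, saturation, and value thresholds below are hypothetical, not the paper's:

```python
import colorsys

def hsv_mask(pixels, h_range, s_min, v_min):
    """Binary mask: True where a pixel's hue falls in h_range and its
    saturation and value exceed the minima (crop = green vegetation)."""
    mask = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min)
    return mask

# Tiny synthetic image row: soil (brown), crop (green), crop (green), soil
row = [(120, 80, 40), (40, 160, 40), (60, 180, 60), (130, 90, 50)]
crop_mask = hsv_mask(row, h_range=(0.20, 0.45), s_min=0.3, v_min=0.2)
print(crop_mask)  # → [False, True, True, False]
```

In a full system, this per-pixel mask would feed the subsequent median-filtering, morphology, and Hough transform stages.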

Keywords: furrow detection, morphological, HSV, Hough transform

Procedia PDF Downloads 231
2834 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use, for several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. We could therefore considerably improve our response to these pathologies if we could use these data. Unfortunately, at present, the status of deceased persons' data is far from satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework, and each EU member state consequently offers very different solutions. For instance, Denmark considers the data personal data of the deceased person for a set period of time, while others, such as Spain, do not, but have introduced regulations specifically addressing this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least that of scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that health data are, in a sense, a rara avis among data in general because they refer not to one person but to several. Hence, all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives, and the general regime provided for in the GDPR may apply to them.
Since these are personal data, we could return to thinking in terms of a general prohibition on processing, with the exceptions provided for in Article 9.2 and the legal bases included in Article 6. This may be complicated in practice: because the data refer to several data subjects, it may be difficult to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may be raised against this hypothesis. This contribution shows, however, that none of these objections has sufficient substance to undermine the argument presented. The conclusion is therefore that we can indeed build a general framework for the processing of personal data of deceased persons within the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 154
2833 The First Japanese-Japanese Dictionary for Non-Japanese Using the Defining Vocabulary

Authors: Minoru Moriguchi

Abstract:

This research introduces the concept of a monolingual Japanese dictionary for non-native speakers of Japanese, tentatively titled the Dictionary of Contemporary Japanese for Advanced Learners (DCJAL). Because its market is very small compared with that of English, no monolingual Japanese dictionary for non-native speakers with a sufficient number of entries has yet been published. In this environment, Japanese-language learners rely on bilingual dictionaries or on monolingual Japanese dictionaries intended for native speakers. The research began in 2017 with a project team of four native and two non-native speakers of Japanese, all linguists of the Japanese language. The team has been developing the concept of a monolingual dictionary for non-native speakers of Japanese, together with its entry list, sample definitions, defining vocabulary, and writing manual. After seven years of research, DCJAL now comprises 28,060 headwords, 539 sample entries, a defining vocabulary of 4,598 words, and a writing manual. First, the number of entries was set at about 30,000, based on an experimental comparison of six existing dictionaries. To build an entry list of this size, suitable words were extracted from the Tsukuba corpus of the Japanese language, and the list was then adjusted in light of the team's experience as Japanese-language instructors. From the entry list, 539 headwords were selected and supplied with lexicographical information such as proficiency level, pronunciation, writing system (hiragana, katakana, kanji, or alphabet), definition, example sentences, idiomatic expressions, synonyms, antonyms, grammatical information, sociolinguistic information, and etymology. While these 539 definitions were being written, the defining vocabulary was compiled, based on the vocabulary most frequently used in an existing Japanese monolingual dictionary.
Although the concept of DCJAL is nearly complete, it may need further adjustment, and the research continues.
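The defining-vocabulary principle, that every word used in a definition must come from a small controlled list, can be illustrated with a minimal sketch (the word list and definition below are invented; DCJAL's actual defining vocabulary contains 4,598 words):

```python
def check_definition(definition, defining_vocabulary):
    """Return the words in a definition that fall outside the controlled
    defining vocabulary, a core constraint in learner's dictionaries."""
    tokens = definition.lower().replace(",", " ").replace(".", " ").split()
    return [w for w in tokens if w not in defining_vocabulary]

# Toy vocabulary and definition, for illustration only
defining_vocab = {"a", "the", "animal", "that", "lives", "in",
                  "water", "and", "has", "fins"}
violations = check_definition("A large animal that lives in water and has fins.",
                              defining_vocab)
print(violations)  # → ['large']
```

A lexicographer would either add "large" to the defining vocabulary or rephrase the definition using words already on the list.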

Keywords: monolingual dictionary, the Japanese language, non-native speaker of Japanese, defining vocabulary

Procedia PDF Downloads 41
2832 Enhancing Creative Writing Skill through the Implementation of Creative Thinking Process

Authors: Bussabamintra Chalauisaeng

Abstract:

The creative writing skill of fourth-year Thai university learners majoring in English at Khon Kaen University, Thailand was enhanced in an English creative writing course through the implementation of a creative thinking process. The writing assignments covered a variety of short poems, a short story, a biography, and short play scripts; this study, however, focuses mainly on the short poems and short stories. The course followed an action research design with ongoing needs analysis and feedback to meet the learners' needs over 45 hours. By the end of the course, the creative writing skill of the forty-two learners had improved significantly, as shown by the research instruments: tasks assigned in class and as self-study, class observation, semi-structured interviews, and teacher feedback given both in person and online, together with peer feedback. The findings show that the learners produced better short poems and short stories as assessed against a set of criteria, producing creative and innovative work with complete and engaging story elements such as plot, theme, setting, and symbolism. The learners also showed greater awareness of the pragmatic use of written English in terms of word choice, grammar, and writing style. These outcomes reflect the learners' improved creative writing skill as well as more positive attitudes toward, and motivation for, learning to write English for pleasure. More interestingly, many learners claimed that this innovative teaching method, integrating the creative thinking process with creative writing, helped stretch their imagination and inspired them to become writers in the future.

Keywords: creative thinking process, creative writing skill, enhancing, implementing

Procedia PDF Downloads 174
2831 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach

Authors: Aladdin Al-Tarawneh

Abstract:

The Quran, the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been translated into many languages; however, the foreignising, or literal, approach severely compromises the quality of the translations and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. This study therefore proposes a different approach that combines existing methods according to a hybrid model. Indeed, the study challenges the binary choice that prevails in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language, and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending methods such as domestication and foreignisation in a systematic way, avoiding the binary choice made by many translators. To this end, the paper first develops a conceptual part that elucidates the main methods employed in TS, then criticises and modifies them to propose the new hybrid approach (the hybrid model) for translating the Quran: the deductive method. To support and validate this outcome, a comparative model is employed to highlight the differences between the suggested translation and other widely used ones: the inductive method. Applying this methodology, the paper shows that the foreignising approach fails to communicate the original meaning of the Quran. In conclusion, the paper suggests that a translation of the Quran must adopt a range of techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers.

Keywords: Quran translation, hybrid approach, domestication, foreignization, hybrid model

Procedia PDF Downloads 163
2830 Learning-by-Heart vs. Learning-by-Thinking: Fostering Thinking in Foreign Language Learning, a Comparison of Two Approaches

Authors: Danijela Vranješ, Nataša Vukajlović

Abstract:

The turn to learner-centered teaching, away from the teacher-centered approach, brought a whole new perspective to the process of teaching and learning and set a new goal for improving the educational process itself. Recently, however, a tremendous decline in students' performance can be observed on various standardized tests, above all the PISA test. Learner-centeredness on its own is no longer enough: students' ability to think is deteriorating. In foreign language learning especially, one encounters a great deal of learning by heart: whether in grammar or vocabulary, teachers often seem to judge students' success merely on how well they can recall a specific word, phrase, or grammar rule, and rarely aim to foster their ability to think. Convinced that foreign language teaching can do both, this research aims to discover how two different approaches to foreign language teaching foster students' ability to think, and to what degree they help students reach the state-determined foreign-language level at the end of the semester, as defined in the Common European Framework. For this purpose, two curricula were developed: one is a traditional, learner-centered foreign language curriculum aimed at teaching the four competences defined in the Common European Framework, serving as a control condition, whereas the second has been enriched with various thinking routines and aims to teach the foreign language as a means of communicating ideas and thoughts rather than reducing it to the four competences. Moreover, two types of tests were created for each approach, each based on the content taught during the semester: one tests the students' competences as defined in the CEFR, and the other tests the students' ability to draw on the knowledge gained and reach their own conclusions based on the content taught. As this is an ongoing study, the results are yet to be interpreted.

Keywords: common european framework of reference, foreign language learning, foreign language teaching, testing and assignment

Procedia PDF Downloads 107
2829 GPU Based Real-Time Floating Object Detection System

Authors: Jie Yang, Jian-Min Meng

Abstract:

A GPU-based floating object detection scheme designed for floating mine detection tasks is presented in this paper. The system uses contrast and motion information to eliminate as many false positives as possible while avoiding false negatives. A GPU computation platform is deployed to allow real-time object detection. The experimental results show that, with a certain configuration, the GPU-based scheme can accelerate the computation by up to a factor of one thousand compared with the CPU-based scheme.
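The paper does not publish its GPU kernels, but the motion-information component it describes can be sketched on the CPU as simple frame differencing (the frames and threshold below are illustrative, not from the paper):

```python
def motion_mask(prev_frame, curr_frame, threshold):
    """Per-pixel motion mask by absolute frame differencing: pixels whose
    intensity changes by more than the threshold are flagged as moving."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, curr_frame)]

# Two tiny grayscale frames: a bright object enters the centre column
prev_f = [[10, 10, 10],
          [10, 10, 10]]
curr_f = [[10, 90, 10],
          [10, 95, 12]]
mask = motion_mask(prev_f, curr_f, threshold=20)
print(mask)  # → [[False, True, False], [False, True, False]]
```

Because every pixel is independent, exactly this kind of operation parallelizes well on a GPU, which is where the reported speedups come from.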

Keywords: object detection, GPU, motion estimation, parallel processing

Procedia PDF Downloads 474
2828 A Literature Review of Precision Agriculture: Applications of Diagnostic Diseases in Corn, Potato, and Rice Based on Artificial Intelligence

Authors: Carolina Zambrana, Grover Zurita

Abstract:

Food loss caused by inefficient agricultural production is one of the major problems worldwide, putting at risk both the population's food security and the efficiency of farming investments. Food security is expected to be achieved through each country's own efficient production, which affects the well-being of its population and, thus, its food sovereignty. Production losses in quantity and quality occur because diseases are not detected efficiently at an early stage. Improving agricultural efficiency with traditional methods is very difficult, since they are time-consuming and imprecise in detecting the main diseases, especially when the production areas are extensive. The main objective of this research is therefore to perform a systematic literature review, covering the last five years, of Precision Agriculture (PA), in order to understand the state of the art of the new technologies, procedures, and optimization processes involving Artificial Intelligence (AI). The study focuses on diagnosing diseases of corn, potato, and rice. The literature review draws on the Elsevier, Scopus, and IEEE databases. In addition, the research covers advanced digital image processing and the development of software and hardware for PA. Convolutional neural networks receive special attention because of their outstanding diagnostic results. Moreover, the data studied will be combined with artificial intelligence algorithms for the automatic diagnosis of crop quality. Finally, precision agriculture, applying technology to the agricultural sector, allows land to be exploited efficiently; such a system requires sensors, drones, data acquisition cards, and global positioning systems.
This research seeks to merge several areas of science, control engineering, electronics, digital image processing, and artificial intelligence, toward the development, in the near future, of a low-cost image measurement system that allows the optimization of crops with AI.
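As a minimal illustration of the core operation of the convolutional networks the review highlights, here is a "valid" 2D convolution in pure Python; the patch and kernel are toy values, not taken from any reviewed system:

```python
def conv2d(image, kernel):
    """'Valid' 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge kernel applied to a tiny "leaf patch" whose right half
# is brighter, e.g. at the boundary of a lesion
patch = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge_kernel = [[1, -1]]
response = conv2d(patch, edge_kernel)
print(response)  # → [[0, -9, 0], [0, -9, 0], [0, -9, 0]]
```

A CNN stacks many such filters, learned from labeled leaf images rather than hand-designed, which is what makes the diagnostic results the review reports possible.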

Keywords: precision agriculture, convolutional neural network, deep learning, artificial intelligence

Procedia PDF Downloads 79
2827 Didactics of Literature within the Brechtian Theatre in Edward Albee's Who's Afraid of Virginia Woolf? and Ernest Lehman's Screenplay Adaptation from an Audiovisual Perspective

Authors: Angel Mauricio Castillo

Abstract:

Theatrical performances and music dramas, as they were known in the mid-nineteenth century, immersed the audience completely in the feelings of the characters through poetry, music, and other artistic devices that create a false sense of reality. Some eighty years later, however, a novel, non-cathartic form of representation appeared on stage; it is significant because it constitutes the antithesis of the typical creations of the period and originates in the separation of the elements as a dominant principle. Central to the methodology is the sense of defamiliarization, a near translation of the German word Verfremdung, referred to throughout this work as the V-effect (also known as the 'alienation effect'); it embodies the performing techniques that enable the audience to watch a play while remaining fully aware of its nature. A play may constantly remind the audience that it is only a play; all its elements are therefore introduced to provoke divergent reactions and opinions. A major finding of the study is the strong connection between Hegel, Marx, and Brecht: it discloses how the didactics of literature have influenced not only Brecht's productions but every educational context in which these ideas are intertwined. The result is a new dialectical process, that is, a new thesis that creates independent thinking skills on the part of the audience. This model therefore opposes the Hegelian formula of thesis-antithesis-synthesis, in that the synthesis in the Brechtian theatre inevitably becomes a different thesis within an enlightening type of discourse. The confronting ideas of illusion versus reality create a new dialectical thesis rather than resulting in a synthesis.

Keywords: Brechtian theatre, didactics, literature, education

Procedia PDF Downloads 179
2826 The Power of Words: The Use of Language in Ethan Frome

Authors: Ritu Sharma

Abstract:

In order to be objective, critics must examine the dynamic relationships between the author, the reader, the text, and the outside world. However, it is also crucial to recognize that, because language was created by God, meaning is ingrained in it. Meaning is located in and discovered through literature rather than being limited to the author, the reader, the text, or the outside world. The link between author, reader, and text is crucial because literature unites an author and a reader through language. Literature is a potent form of communication, and Ethan Frome's audience is forever changed by the book's language and by the language its characters use. Ethan Frome presents the story of Ethan Frome and his wife Zeena. Ethan's story unfolds over the course of the book, revealed through the eyes of the narrator, an outsider passing through Starkfield, and through the insight the narrator gains from the townspeople and from his stay on the Frome farm. The story is set in the rural New England community of Starkfield, Massachusetts, where the weather provides the ideal setting for Ethan and the narrator to get to know one another as the narrator becomes preoccupied with unraveling the story behind Ethan's physical anomalies. Beyond telling a gripping tale and capturing human nature as it is, Ethan Frome uses its storyline to achieve something more significant. Edith Wharton's book affirms language: Zeena's deliberate and convincing language challenges relativity and meaninglessness; Ethan and Mattie's effort to use words effectively reflects the complexity of language; and their struggle illustrates the influence language may have if and when it is used. As a literary work, Ethan Frome defends the written word, the foundation upon which it is constructed.
Communication is based on language, and as the characters respond to and become involved in disputes throughout the book, Zeena, Ethan, and Mattie each reflect particular theories of communication that help define their uses of communication within the broader context of language.

Keywords: dynamic relationships, potent, communication, complexity

Procedia PDF Downloads 91
2825 Achieving Flow at Work: An Experience Sampling Study to Comprehend How Cognitive Task Characteristics and Work Environments Predict Flow Experiences

Authors: Jonas De Kerf, Rein De Cooman, Sara De Gieter

Abstract:

For many decades, scholars have aimed to understand how work can become more meaningful by maximizing workers' potential and enhancing their feelings of satisfaction. One of the largest contributions to this positive psychology was the introduction of the concept of 'flow,' a condition in which people feel intense engagement and effortless action. Since then, research on work-related flow has indicated that this state of mind is related to positive outcomes for both organizations (e.g., social, supportive climates) and workers (e.g., job satisfaction). Yet scholars still do not fully understand how such deep involvement at work arises, given that flow is considered a short-term, complex, and dynamic experience. Most research neglects the notion that people who experience flow must be optimally challenged so that intense concentration is required. Because attention is at the core of this enjoyable state of mind, this study aims to understand how elements that affect workers' cognitive functioning influence flow at work. Research on cognitive performance suggests that mentally demanding tasks (e.g., information-processing tasks) require workers to concentrate deeply and can, as a result, lead to flow experiences. Based on social facilitation theory, working on such tasks in an isolated environment eases concentration. Prior research has indicated that working at home (instead of at the office) or in a closed office (rather than an open-plan office) affects employees' overall functioning in terms of concentration and productivity. Consequently, we advance this knowledge and propose an interaction between cognitive task characteristics and work environments among part-time teleworkers.
Hence, we aim not only to shed light on the relation between cognitive tasks and flow but also to provide empirical evidence that workers performing such tasks achieve the highest states of flow while working either at home or in closed offices. In July 2022, an experience-sampling study will be conducted that uses a semi-random signal schedule to understand how task and environment predictors together influence part-time teleworkers' flow. More precisely, about 150 knowledge workers will fill in multiple surveys a day for two consecutive workweeks, reporting their flow experiences, cognitive tasks, and work environments. Preliminary results from a pilot study indicate that, at the between-person level, tasks high in information processing go along with high self-reported fluent productivity (i.e., making progress). As expected, fluency in productivity was higher for workers performing information-processing tasks at home or in a closed office than for those performing the same tasks at the office or in open-plan offices. This study expands current knowledge of work-related flow by examining task and environmental predictors that enable workers to reach such a peak state. In doing so, our findings suggest that practitioners should strive for an ideal alignment between tasks and work locations so that people can work with both deep involvement and gratification.

Keywords: cognitive work, office lay-out, work location, work-related flow

Procedia PDF Downloads 101
2824 Toward Subtle Change Detection and Quantification in Magnetic Resonance Neuroimaging

Authors: Mohammad Esmaeilpour

Abstract:

One important open problem in medical image processing is the detection and quantification of small changes. In this poster, we investigate how algebraic decomposition techniques can be used to semi-automatically detect and quantify subtle changes in Magnetic Resonance (MR) neuroimaging volumes. We focus mostly on the low-rank components of the matrices obtained by decomposing pairs of MR images acquired over a period of time. In addition, a skilled neuroradiologist helps the algorithm distinguish between noise and genuine small changes.
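The poster does not name a specific decomposition, but the underlying intuition, that a localized change between two scans yields a low-rank difference matrix, can be sketched as follows (toy 3x3 "scans"; real MR volumes would use SVD or robust PCA on far larger matrices):

```python
def matrix_rank(m, tol=1e-9):
    """Rank via Gaussian elimination (pure Python, small matrices only)."""
    a = [row[:] for row in m]
    rank, rows, cols = 0, len(a), len(a[0])
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if abs(a[r][col]) > tol), None)
        if pivot is None:
            continue
        a[rank], a[pivot] = a[pivot], a[rank]
        for r in range(rank + 1, rows):
            f = a[r][col] / a[rank][col]
            a[r] = [x - f * y for x, y in zip(a[r], a[rank])]
        rank += 1
    return rank

# Two "scans": a baseline and a follow-up differing in one small region
scan_t0 = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]
scan_t1 = [[5, 5, 5], [5, 7, 5], [5, 5, 5]]
diff = [[b - a for a, b in zip(r0, r1)] for r0, r1 in zip(scan_t0, scan_t1)]
print(matrix_rank(diff))  # → 1 (the localized change is rank-1)
```

Diffuse noise, by contrast, tends to spread across many singular directions, which is one way a decomposition can help separate subtle changes from noise.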

Keywords: magnetic resonance neuroimaging, subtle change detection and quantification, algebraic decomposition, basis functions

Procedia PDF Downloads 474
2823 Evaluation of Digital Marketing Strategies by Behavioral Economics

Authors: Sajjad Esmaeili Aghdam

Abstract:

Economics typically conceptualizes individual behavior as the consequence of external states, for example, budgets and prices (or beliefs about them) and choices. Our main goal is to examine the influence of a range of behavioral economics factors on digital marketing strategies, the evaluation of those strategies, and their transformation into highly promising marketing strategies. The different forms of behavioral analysis lead to two main results. First, the stability of economic dynamics in a currency union depends crucially on the level of economic integration: more economic integration leads to more stable economic dynamics. Electronic word-of-mouth (eWOM) is "all informal communications directed at consumers through Internet-based technology related to the usage or characteristics of particular goods and services or their sellers." eWOM can take many forms, the most significant being online reviews. For this paper, 72 articles were gathered, selected by title and aim, from research search engines such as Google Scholar, Web of Science, and PubMed. Recent research in strategic management and marketing proposes that markets should not be viewed as a given, deterministic setting exogenous to the firm; instead, firms are increasingly conceptualized as dynamic creators of market opportunities. The use of new technologies touches all spheres of modern life, and social and economic life becomes unmanageable without fast, relevant, high-quality, and fitting information. Psychology and economics (together known as behavioral economics) are two prominent disciplines underlying many theories in marketing. The wide marketing literature documents consumers' non-rational behavior, even though behavioral biases are not always consistently named or formally labeled.

Keywords: behavioral economics, digital marketing, marketing strategy, high impact strategies

Procedia PDF Downloads 183
2822 Morpho-Syntactic Pattern in Maithili Urdu

Authors: Mohammad Jahangeer Warsi

Abstract:

This is perhaps the first linguistic study of Maithili Urdu, a dialect of the Urdu language of the Indo-Aryan family spoken by around four million speakers in the Darbhanga, Samastipur, Begusarai, Madhubani, and Muzaffarpur districts of Bihar. It has subject-object-verb (SOV) word order and lacks a script and a written literature. This work is an attempt to document the dialect as a contribution to descriptive linguistics. It is also spoken by a majority of the Maithili diaspora community. Although Maithili Urdu has no script or written literature of its own, it has maintained an oral history over many centuries and has contributed profoundly to enriching the Maithili, Hindi, and Urdu languages and literatures. Dialects are the contact languages of particular regions and have a deep impact on their cultural heritage; slowly, with time, dialects begin to take the shape of languages, and the convergence of a dialect into a language is a symbol of pride for the people who speak it. Although confined to five districts of northern Bihar, the dialect is highly popular among the natives and is the primary mode of communication of the local Muslims. The paper focuses on the structure of expressions in Maithili Urdu, including the structure of words, phrases, clauses, and sentences. There are clear differences between the linguistic features of Maithili Urdu and those of Urdu, Maithili, and Hindi. Interestingly, though a dialect of Urdu, it has only one second-person pronoun, tu, and lacks the agentive marker -ne. Although spoken in the vicinity of Hindi, Urdu, and Maithili, it undoubtedly has linguistic features of its own, among which verb conjugation is remarkably distinctive. Because of the oral tradition of this link language, intonation has become significantly prominent. This paper discusses the morpho-syntactic patterns of Maithili Urdu and goes through a sample text to authenticate the findings.

Keywords: cultural heritage, morpho-syntactic pattern, Maithili Urdu, verb conjugation

Procedia PDF Downloads 214
2821 A Critical Discourse Analysis of the Construction of Artists' Reputation by Online Art Magazines

Authors: Thomas Soro, Tim Stott, Brendan O'Rourke

Abstract:

The construction of artistic reputation has been examined within sociology, philosophy, and economics, but, barring a few noteworthy exceptions, its discursive aspect has been largely ignored. This is particularly surprising given that contemporary artworks primarily rely on discourse to construct their ontological status. This paper contributes a discourse analytical perspective to the broad body of literature on artistic reputation by providing an understanding of how it is discursively constructed within the institutional context of online contemporary art magazines. The paper uses corpora compiled from the websites of e-flux and ARTnews, two leading online contemporary art magazines, to examine how these organisations discursively construct the reputation of artists. By constructing word sketches of the term 'artist', the paper identified the most significant modifiers attributed to artists and the most significant verbs that take 'artist' as an object or subject. The most significant results were analysed through concordances and demonstrated a somewhat surprising lack of evaluative representation. To examine this feature more closely, the paper then analysed three announcement texts from e-flux's site and three review texts from ARTnews' site, comparing the use of modifiers and verbs in the representation of artists, artworks, and institutions. The results of this analysis support the corpus findings, suggesting that artists are rarely represented in evaluative terms. Based on the relatively high frequency of evaluation in the representation of artworks and institutions, these results suggest that there may be discursive norms at work in the field of online contemporary art magazines which regulate the use of verbs and modifiers in the evaluation of artists.
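The modifier relation of a word sketch can be approximated by counting, over a part-of-speech-tagged corpus, the adjectives that immediately precede the node word. The sketch below is a minimal illustration of that counting step on an invented toy corpus, not the word-sketch tooling or the e-flux/ARTnews data used in the study.

```python
from collections import Counter

def modifier_counts(tagged_sentences, node="artist"):
    """Count adjectives immediately preceding the node word in a list
    of (token, POS) sequences -- a crude stand-in for the 'modifier'
    relation of a word sketch."""
    counts = Counter()
    for sent in tagged_sentences:
        for i in range(1, len(sent)):
            token, _ = sent[i]
            prev_token, prev_pos = sent[i - 1]
            if token.lower() == node and prev_pos == "ADJ":
                counts[prev_token.lower()] += 1
    return counts

# A tiny hand-tagged toy corpus (invented for illustration).
corpus = [
    [("The", "DET"), ("emerging", "ADJ"), ("artist", "NOUN"), ("exhibits", "VERB")],
    [("An", "DET"), ("established", "ADJ"), ("artist", "NOUN"), ("speaks", "VERB")],
    [("The", "DET"), ("emerging", "ADJ"), ("artist", "NOUN"), ("returns", "VERB")],
]

print(modifier_counts(corpus).most_common())
```

In a real word sketch these raw counts would be replaced by an association score (such as logDice) so that frequent but uninformative modifiers do not dominate.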

Keywords: contemporary art, corpus linguistics, critical discourse analysis, symbolic capital

Procedia PDF Downloads 165
2820 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition

Authors: Jacqueline Żammit

Abstract:

Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.

Keywords: visual culture, right-brain strategies, second language acquisition, maltese as a second language, visual aids, language-based activities

Procedia PDF Downloads 61
2819 Depolymerised Natural Polysaccharides Enhance the Production of Medicinal and Aromatic Plants and Their Active Constituents

Authors: M. Masroor Akhtar Khan, Moin Uddin, Lalit Varshney

Abstract:

Recently, there has been rapidly expanding interest in finding applications of natural polymers that add value to agriculture. It is now being realized that radiation processing of natural polysaccharides can be beneficially utilized either to improve the existing methodologies for processing natural polymers or to add value to agriculture by converting them into a more useful form. Gamma-ray irradiation is employed to degrade and lower the molecular weight of natural polysaccharides such as alginates, chitosan, and carrageenan into small-sized oligomers. When these oligomers are applied to plants as foliar sprays, they elicit various kinds of biological and physiological activities, including promotion of plant growth, seed germination, shoot elongation, root growth, and flower production, and suppression of heavy metal stress. Furthermore, application of these oligomers can shorten the harvesting period of various crops and help reduce the use of insecticides and chemical fertilizers. In recent years, oligomers of sodium alginate obtained by irradiating it with gamma rays at a dose of 520 kGy have been employed. It was noticed that the oligomers derived from the natural polysaccharides could induce growth, photosynthetic efficiency, enzyme activities, and, most importantly, the production of secondary metabolites in plants such as Artemisia annua, Beta vulgaris, Catharanthus roseus, Chrysopogon zizanioides, Cymbopogon flexuosus, Eucalyptus citriodora, Foeniculum vulgare, Geranium sp., Mentha arvensis, Mentha citrata, Mentha piperita, Mentha viridis, Papaver somniferum and Trigonella foenum-graecum. As a result of the application of these oligomers, the yield and/or contents of the active constituents of the aforesaid plants were significantly enhanced.
The productivity, as well as the quality, of medicinal and aromatic plants may be ameliorated by this novel technique in an economical way, as only a very small quantity of these irradiated (depolymerised) polysaccharides is needed. Further, this is a very safe technique, as the plants are not directly exposed to radiation; the radiation is used only to depolymerise the polysaccharides into oligomers.

Keywords: essential oil, medicinal and aromatic plants, plant production, radiation processed polysaccharides, active constituents

Procedia PDF Downloads 444
2818 Study of the Design and Simulation Work for an Artificial Heart

Authors: Mohammed Eltayeb Salih Elamin

Abstract:

This study discusses the concept of the artificial heart using engineering concepts from fluid mechanics and the characteristics of non-Newtonian fluids, with the purpose of serving heart patients and improving aspects of their lives. According to the World Health Organization (WHO), diseases of the heart and blood vessels are the leading cause of death in the world: statistics show that 30% of deaths worldwide are due to heart disease, so heart failure can simply be considered the number one cause of death. Since heart transplantation is very difficult and not always available, the idea of the artificial heart has become essential, and it is important to participate in developing this idea by searching for the weak points in earlier designs and improving them for the benefit of humanity. In this study, a pump was designed to pump blood through the human body, taking into account all the factors that would allow it to replace the human heart, so that it works with the same characteristics and efficiency as the human heart. The pump was designed on the principle of the diaphragm pump. Three models of blood were obtained from the real characteristics of blood, and all of these models were simulated in order to study the effect of the pumping work on the fluid. After that, we studied the properties of this pump by using ANSYS 15 software to simulate blood flow inside the pump and the amount of stress that it will go under. The 3D geometry modeling was done using SOLIDWORKS, and the geometries were then imported into the ANSYS Design Modeler, which is used during the pre-processing procedure. The solver used throughout the study is ANSYS FLUENT, a tool used to analyze fluid flow problems; the general well-known term for this branch of science is Computational Fluid Dynamics (CFD).
Basically, Design Modeler is used during the pre-processing procedure, which is a crucial step before the fluid flow problem is solved. Some of the key operations are geometry creation, which specifies the domain of the fluid flow problem; mesh generation, which means discretization of the domain so that the governing equations can be solved at each cell; and specification of the boundary zones where boundary conditions are applied. Finally, the pre-processed work is saved in the ANSYS Workbench for future continuation of the work.
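CFD studies of blood flow like this one must model blood's non-Newtonian, shear-thinning behaviour; one constitutive law commonly chosen in such simulations (and available in FLUENT) is the Carreau model. The sketch below evaluates that model in Python; the parameter values are commonly cited literature figures for blood, not parameters taken from this study.

```python
def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345,
                      lam=3.313, n=0.3568):
    """Carreau model for blood's shear-thinning viscosity (Pa*s).
    mu0/mu_inf: zero-/infinite-shear viscosities; lam: relaxation
    time (s); n: power-law index. Values are commonly cited
    literature figures for blood, assumed here for illustration."""
    return mu_inf + (mu0 - mu_inf) * (
        1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Viscosity falls monotonically from mu0 toward mu_inf as shear rises,
# which is why a constant-viscosity (Newtonian) model over-predicts
# resistance at high shear rates inside a pump.
for gamma in (0.1, 1.0, 10.0, 100.0, 1000.0):
    print(f"shear {gamma:7.1f} 1/s -> {carreau_viscosity(gamma):.5f} Pa*s")
```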

Keywords: artificial heart, computational fluid dynamics, heart chamber, design, pump

Procedia PDF Downloads 459
2817 Problems of Boolean Reasoning Based Biclustering Parallelization

Authors: Marcin Michalak

Abstract:

Biclustering is a form of two-dimensional data analysis. For several years it has been possible to express this task in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of this approach (the proven ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria) is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented, and in this context the problems of parallelizing the computation are raised.
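The objects this method induces, inclusion-maximal all-ones biclusters of a binary matrix, can be illustrated with a brute-force closure test: a column set is kept when the rows supporting it share exactly those columns. This toy sketch shows the objects only; the paper's Boolean-reasoning algorithm works via prime implicants, and its complexity is exactly why brute force like this does not scale.

```python
from itertools import combinations

def maximal_biclusters(matrix):
    """Enumerate the inclusion-maximal all-ones biclusters (R, C) of a
    binary matrix with a brute-force closure test: a column set C is
    closed when the rows supporting C share exactly the columns C."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    found = set()
    for size in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), size):
            rows = frozenset(r for r in range(n_rows)
                             if all(matrix[r][c] for c in cols))
            if not rows:
                continue
            closure = frozenset(c for c in range(n_cols)
                                if all(matrix[r][c] for r in rows))
            if closure == frozenset(cols):
                found.add((rows, closure))
    return found

M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
for rows, cols in sorted(maximal_biclusters(M),
                         key=lambda rc: (sorted(rc[0]), sorted(rc[1]))):
    print("rows", sorted(rows), "cols", sorted(cols))
```

Since the candidate column sets are independent, the outer loop is a natural unit for parallelization, which is precisely where load-balancing problems of the kind the paper raises appear.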

Keywords: Boolean reasoning, biclustering, parallelization, prime implicant

Procedia PDF Downloads 125
2816 Epistemology in African Philosophy: A Critique of African Concept of Knowledge

Authors: Ovett Nwosimiri

Abstract:

African tradition and what it entails form the content of African concepts of knowledge. The study of African concepts of knowledge is also known as African epistemology; in other words, African epistemology is the branch of African philosophy that deals with knowledge. This branch engages with the nature and concept of knowledge, the ways in which knowledge can be gained, the ways in which one can justify an epistemic claim or validate a knowledge claim, the limits of human knowledge, and so on. The protagonists of African epistemology base their argument for a distinctive or unique African epistemology on the premise "that each race is endowed with a distinctive nature and embodies in its civilization a particular spirit". Yet all human beings share certain basic values and perceptions irrespective of where they come from, and this fosters interaction between people of different nationalities. Africans, like other people, share certain values, perceptions, and interactions with the rest of the world. These shared values, perceptions, and interactions have prompted African people to attempt to "modernize" their societies, or to develop forms of their tradition in harmony with the ethos of the contemporary world. Given these ideas, it is worth investigating whether such an (African) epistemology is still unique. The advocates of African epistemology focus on the externalist notion of justification and neglect the idea that both the internalist and the externalist notions of justification are needed in order to arrive at a coherent and well-founded account of epistemic justification.
Thus, this paper will critically examine the claims that there is a unique African epistemology (a mode of knowing peculiar to Africans, in which knowledge is understood as social, monist, and situated), and the grounds for justifying beliefs and epistemic claims.

Keywords: internalist, externalist, knowledge, justification

Procedia PDF Downloads 264
2815 Ischemic Stroke Detection in Computed Tomography Examinations

Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina

Abstract:

Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used, due to its wide availability and rapid diagnosis. Detection depends on the size and severity of lesions and on the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques, such as morphological filters, the discrete wavelet transform, and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region, and these results were compared with the objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters do improve the visibility of ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm.
These results show the importance of computer-aided diagnosis software to assist neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
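The clustering step named above, Fuzzy C-means, assigns each voxel a graded membership in every cluster rather than a hard label, which suits the diffuse borders of ischemic regions. A minimal one-dimensional sketch of the algorithm on intensity values is given below; it illustrates the clustering step only, not the authors' full wavelet-plus-morphology pipeline, and the data are invented.

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Plain fuzzy C-means on 1-D intensity values (for CT these would
    be Hounsfield units). Returns the cluster centres and the
    membership matrix u[i][k] of point k in cluster i."""
    lo, hi = min(data), max(data)
    centres = [lo + i * (hi - lo) / (c - 1) for i in range(c)]  # even init
    u = [[0.0] * len(data) for _ in range(c)]
    for _ in range(iters):
        # Update memberships from distances to the current centres.
        for k, x in enumerate(data):
            dists = [abs(x - ci) for ci in centres]
            if any(d == 0.0 for d in dists):  # point sits on a centre
                for i in range(c):
                    u[i][k] = 1.0 if dists[i] == 0.0 else 0.0
            else:
                for i in range(c):
                    u[i][k] = 1.0 / sum((dists[i] / dj) ** (2.0 / (m - 1.0))
                                        for dj in dists)
        # Update centres as membership-weighted means.
        centres = [sum((u[i][k] ** m) * x for k, x in enumerate(data)) /
                   sum(u[i][k] ** m for k in range(len(data)))
                   for i in range(c)]
    return centres, u

# Two obvious intensity groups, around 1 and around 5.
centres, u = fuzzy_c_means([0.9, 1.0, 1.1, 4.9, 5.0, 5.1])
print(sorted(round(ci, 3) for ci in centres))
```

On images, the same update equations run over all voxel intensities, and a voxel's ischemic membership can then be thresholded to produce the contour compared against the neuroradiologist's.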

Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means

Procedia PDF Downloads 366
2814 Processing Design of Miniature Casting Incorporating Stereolithography Technologies

Authors: Pei-Hsing Huang, Wei-Ju Huang

Abstract:

Investment casting is commonly used in the production of metallic components with complex shapes, due to its high dimensional precision, good surface finish, and low cost. However, the process is cumbersome, and the period between trial casting and final production can be very long, thereby limiting business opportunities and competitiveness. In this study, we replaced conventional wax injection with stereolithography (SLA) 3D printing to speed up the trial process and reduce costs. We also used silicone molds to avoid the high cost imposed by photosensitive resin and reduce expenses further.

Keywords: investment casting, stereolithography, wax molding, 3D printing

Procedia PDF Downloads 404
2813 Raising the Property Provisions of the Topographic Located near the Locality of Gircov, Romania

Authors: Carmen Georgeta Dumitrache

Abstract:

The science of terrestrial measurement studies the totality of operations and computations carried out in order to represent the land surface on a plan or map, in a specific cartographic projection and at a topographic scale. With the development of society, measurements have evolved, being tied both to a utilitarian goal bound to economic activity and to the scientific purpose of determining the form and dimensions of the Earth. Field measurements, data processing, and the proper representation of the planimetry and landforms on drawings and maps require topographic and geodetic instruments, calculation, and graphical reporting, which in turn require theoretical and practical concepts from different areas of science and technology. The proper practical use of topographic and geodetic instruments designed to measure angles and distances precisely requires knowledge of geometric optics, precision mechanics, the strength of materials, and more. Processing the results of field measurements requires calculation methods based on notions of geometry, trigonometry, algebra, mathematical analysis, and computer science. To illustrate these topographic measurements, a survey was carried out of a property located near the locality of Gircov, Romania. We determined the total surface of the plan (T30) and of the parcels/plots, and also traced the coordinates of a parcel in the field.
The purposes of the planimetric survey were: the exact determination of the bounding surface; the analytical calculation of the surface; the comparison of the determined surface with the one registered in the property documents; the drawing up of a location and delineation plan showing the contour, adjacencies, and distances, and highlighting the parcels comprising the property; the drawing up of a similar location and delineation plan for a single parcel; and the tracing in the field of the outline of the plot from the previous step. The ultimate goal of this work was to determine and represent the surface, and also to detach a plot from the total surface while respecting the surface condition imposed by the beneficiary's deed of property.
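The analytical calculation of a surface from surveyed boundary coordinates is classically done with the Gauss (shoelace) formula. The sketch below shows that calculation on hypothetical parcel corners, not the actual Gircov survey data.

```python
def shoelace_area(points):
    """Analytical (Gauss/shoelace) area of a closed polygon given its
    boundary vertices in order -- the standard way to compute a
    parcel's surface from surveyed coordinates and check it against
    the area registered in the property documents."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical rectangular parcel, corner coordinates in metres.
parcel = [(0.0, 0.0), (40.0, 0.0), (40.0, 25.0), (0.0, 25.0)]
print(shoelace_area(parcel), "m^2")
```

Detaching a plot of a prescribed area from the total surface then amounts to choosing a dividing line whose resulting polygon satisfies this same area formula.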

Keywords: topography, surface, coordinate, modeling

Procedia PDF Downloads 257
2812 The Redundant Kana: A Pragmatic Reading

Authors: Manal Mohammed Hisham Said Najjar

Abstract:

The Arab grammarians shed light on the redundant kana ('was') and gave it considerable attention. However, their considerations and interpretations of the uses of this verb varied: is it used to determine tense, for further emphasis, or for another function? Does it have a syntactic function? Morphologically, can it be used in forms other than the past? In addition, the Arab grammarians discussed the possibility of kana locating itself between the syntactic constructs of a sentence, a phrase, or a collocation, and others questioned whether its position is initial or final. This study found that the redundant kana is cited in the Quran and was used by the Arabs in their speech and poetry. This redundant kana, whether used in initial position, in final position, or between the constructs of a sentence, a phrase, or a collocation, carries pragmatic meanings intended by the speaker or the poet to serve different functions, such as indicating the past tense, providing emphasis, and referring to the continuity of the effect and meaning of a verb or adjective. The study concludes that the verb kana can be utilized in different contexts to achieve a specific effect, as the old Arabs did when they used it to add specific shades of meaning. Kana as a redundant word can be added to further highlight the meaning aimed at in a specific utterance. In addition, this verb can be used in both past and present morphological forms, and its presence in an utterance may or may not be functional. In other words, the study found that the redundant kana can be used in various positions in an utterance, initial, final, or within a syntactic structure, provided that this use is pragmatically functional.
In conclusion, this paper seeks to invite scholars of the Arabic language to coin a new term, the "pragmatic kana", to replace the term "kana al-za'ida (redundant kana)", which might suggest that its use is superfluous and void of significance, a conclusion that is illogical given its recurrent use in the Holy Quran.

Keywords: redundant, kana, grammarians, Quran

Procedia PDF Downloads 130
2811 EEG and DC-Potential Level Changes in the Elderly

Authors: Irina Deputat, Anatoly Gribanov, Yuliya Dzhos, Alexandra Nekhoroshkova, Tatyana Yemelianova, Irina Bolshevidtseva, Irina Deryabina, Yana Kereush, Larisa Startseva, Tatyana Bagretsova, Irina Ikonnikova

Abstract:

In the modern world, the number of elderly people is increasing, and preserving the functionality of the organism in the elderly has become very important. During aging, higher cortical functions such as sensation, perception, attention, memory, and ideation gradually decline. This is expressed in a reduced rate of information processing, a loss of working memory capacity, and a decreased ability to learn and store new information. Promising directions in studying the neurophysiological parameters of aging are brain imaging methods: computer electroencephalography and neuroenergy mapping of the brain, as well as methods for studying neurodynamic brain processes. The research aim was to study features of brain aging in elderly people by electroencephalogram (EEG) and the DC-potential level. We examined 130 people aged 55-74 years who did not have psychiatric disorders or chronic conditions in a stage of decompensation. EEG was recorded with a 128-channel GES-300 system (USA). EEG recordings were collected while the participant sat at rest with their eyes closed for 3 minutes. For a quantitative assessment of the EEG we used spectral analysis. The spectrum was analyzed in the delta (0.5-3.5 Hz), theta (3.5-7.0 Hz), alpha-1 (7.0-11.0 Hz), alpha-2 (11.0-13.0 Hz), beta-1 (13.0-16.5 Hz), and beta-2 (16.5-20.0 Hz) ranges, and spectral power was estimated in each frequency range. The 12-channel hardware-software diagnostic complex 'Neuroenergometr-KM' was used for registration, processing, and analysis of the brain's constant (DC) potential level, which was registered in monopolar leads. It was revealed that the EEGs of elderly people show higher spectral power in the delta (p < 0.01) and theta (p < 0.05) ranges with aging, especially in frontal areas.
The comparative analysis showed that elderly people aged 60-64 have higher spectral power in the alpha-2 range in the left frontal and central areas (p < 0.05), as well as higher values in the beta-1 range in frontal and parieto-occipital areas (p < 0.05). Study of the distribution of the brain's DC potential level revealed an increase in total energy consumption over the main areas of the brain. In frontal leads we registered the lowest values of the DC potential level; perhaps this indicates a decrease in energy metabolism in this area and difficulties with executive functions. The comparative analysis of the potential difference over the main leads testifies to an uneven lateralization of brain functions in elderly people, and the potential difference between the right and left hemispheres testifies to a prevalence of left-hemisphere activity. Thus, higher functional activity of the cerebral cortex is characteristic of people of early advanced age (60-64 years), which points to higher reserve capacities of the central nervous system. By age 70 there are age-related changes in cerebral energy exchange and in the level of brain electrogenesis, which reflect a deterioration of the homeostatic mechanisms of self-regulation and of the processing of the current flow of perceptual data.
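The band-wise spectral power estimates described above can be illustrated with a direct DFT over one band of a synthetic signal. This is a minimal stand-in for the study's spectral analysis (real EEG pipelines use windowed methods such as Welch's, and the signal here is an invented 10 Hz sine, not recorded EEG).

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Spectral power of `signal` in the band [f_lo, f_hi) Hz via a
    direct DFT over the bins that fall inside the band."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        f = k * fs / n                      # frequency of bin k
        if f_lo <= f < f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2 / n
    return power

# Synthetic one-second "EEG" trace: a 10 Hz sine, i.e. alpha-1 activity.
fs = 128
x = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha1 = band_power(x, fs, 7.0, 11.0)
delta = band_power(x, fs, 0.5, 3.5)
print(f"alpha-1 power {alpha1:.2f}, delta power {delta:.2e}")
```

Comparing such per-band powers across leads and age groups is what yields statements like "higher delta power in frontal areas".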

Keywords: brain, DC-potential level, EEG, elderly people

Procedia PDF Downloads 484
2810 Development of Internet of Things (IoT) with Mobile Voice Picking and Cargo Tracing Systems in Warehouse Operations of Third-Party Logistics

Authors: Eugene Y. C. Wong

Abstract:

Increased market competition, customer expectations, and warehouse operating costs in third-party logistics have motivated continuous exploration of ways to improve operational efficiency in warehouse logistics. Cargo tracing in the order picking process consumes excessive time for warehouse operators when handling the enormous quantities of goods flowing through the warehouse each day. An Internet of Things (IoT) solution with mobile cargo tracing apps and database management systems was developed in this research to facilitate and reduce the cargo tracing time in the order picking process of a third-party logistics firm. An operations review was carried out in the firm, and opportunities for improvement were identified, including inaccurate inventory records in the warehouse management system, excessive tracing time for stored products, and product misdelivery. The facility layout was improved by modifying the designated locations of various types of products. The relationships among pick-and-pack processing time, cargo tracing time, delivery accuracy, inventory turnover, and inventory count operation time in the warehouse were evaluated, and the correlation of the factors affecting the overall cycle time was analysed. A mobile app was developed with MIT App Inventor and an Access management database to facilitate cargo tracking anytime, anywhere. An information flow framework was developed from the warehouse database system to cloud document-sharing, and further to the mobile app device. The improved cargo tracing performance in the order processing cycle of warehouse operators was measured and evaluated. The developed mobile voice picking and tracking systems bring significant benefits to the third-party logistics firm, including eliminating unnecessary cargo tracing time in the order picking process and reducing warehouse operators' overtime costs.
As future development, the voice picking system in the mobile apps is planned to further improve the picking time and cycle counting of warehouse operators.

Keywords: warehouse, order picking process, cargo tracing, mobile app, third-party logistics

Procedia PDF Downloads 374
2809 Survival of Micro-Encapsulated Probiotic Lactic Acid Bacteria in Mutton Nuggets and Their Assessments in Simulated Gastro-Intestinal Conditions

Authors: Rehana Akhter, Sajad A. Rather, F. A. Masoodi, Adil Gani, S. M. Wani

Abstract:

During recent years, probiotic food products have received market interest as health-promoting, functional foods, which are believed to contribute health benefits. In order to deliver these health benefits, it has been recommended that probiotic bacteria be present at a minimum level of 10⁶ CFU/g to 10⁷ CFU/g at the point of delivery, or be eaten in sufficient amounts to yield a daily intake of 10⁸ CFU. However, a major challenge in the application of probiotic cultures in a food matrix is the maintenance of viability during processing, which can lead to important losses in viability, as probiotic cultures are very often thermally labile and sensitive to acidity, oxygen, or other food constituents, for example salts. In this study, Lactobacillus plantarum and Lactobacillus casei were encapsulated in calcium alginate beads with the objective of enhancing their survivability and preventing exposure to the adverse conditions of the gastrointestinal tract, and were then inoculated into mutton nuggets. Micro-encapsulated Lactobacillus plantarum and Lactobacillus casei were resistant to simulated gastric conditions (pH 2, 2 h) and bile solution (3%, 2 h), resulting in significantly (p ≤ 0.05) improved survivability compared with their free-cell counterparts. A high encapsulation yield was obtained with the encapsulation procedure. After incubation at low pH values, micro-encapsulation yielded higher survival rates compared to non-encapsulated probiotic cells. The viable cell numbers of encapsulated Lactobacillus plantarum and Lactobacillus casei were 10⁷-10⁸ CFU/g higher compared to free cells after 90 min of incubation at pH 2.5. The viable encapsulated cells were inoculated into mutton nuggets at a rate of 10⁸ to 10¹⁰ CFU/g. The micro-encapsulated Lactobacillus plantarum and Lactobacillus casei achieved higher survival counts (10⁵-10⁷ CFU/g) than the free-cell counterparts (10²-10⁴ CFU/g).
Thus, micro-encapsulation offers an effective means of delivering viable probiotic bacterial cells to the colon and of maintaining their survival during simulated gastric and intestinal conditions and during processing in nugget preparation.

Keywords: survival, Lactobacillus plantarum, Lactobacillus casei, micro-encapsulation, nugget

Procedia PDF Downloads 279
2808 Recognition of Tifinagh Characters with Missing Parts Using Neural Network

Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui

Abstract:

In this paper, we present an algorithm for the reconstruction of Tifinagh characters from incomplete 2D scans. The algorithm is based on the correlation between the lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
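The core idea of recognising a character despite missing parts can be illustrated with a much simpler stand-in for the paper's correlation-and-neural-network pipeline: match the damaged scan against a template database while simply ignoring the lost pixels. The patterns below are invented 3x3 strokes, not real Tifinagh glyphs.

```python
def score(image, template):
    """Fraction of agreeing pixels, ignoring the missing ones (None)."""
    known = [(p, t) for p, t in zip(image, template) if p is not None]
    return sum(p == t for p, t in known) / len(known)

def recognise(image, database):
    """Label of the template that best matches the damaged image."""
    return max(database, key=lambda label: score(image, database[label]))

# Toy 3x3 'characters', flattened row by row (invented patterns).
templates = {
    "horizontal": [1, 1, 1, 0, 0, 0, 0, 0, 0],
    "vertical":   [1, 0, 0, 1, 0, 0, 1, 0, 0],
}
# A scan of the horizontal stroke with two pixels lost (None).
damaged = [1, 1, None, 0, None, 0, 0, 0, 0]
print(recognise(damaged, templates))   # horizontal
```

The paper goes further by first reconstructing the lost block from correlated neighbouring blocks and then classifying with a trained neural network rather than raw template agreement.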

Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN

Procedia PDF Downloads 334
2807 Area-Efficient FPGA Implementation of an FFT Processor by Reusing Butterfly Units

Authors: Atin Mukherjee, Amitabha Sinha, Debesh Choudhury

Abstract:

The fast Fourier transform (FFT) of a large number of samples requires substantial hardware resources on field programmable gate arrays, in terms of both area and power. In this paper, an area-efficient architecture of an FFT processor is proposed that reuses the butterfly units more than once. The FFT processor was emulated and the results were validated on a Virtex-6 FPGA. The proposed architecture outperforms the conventional architecture of an N-point FFT processor in terms of area, which is reduced by a factor of log₂(N), with a negligible increase in processing time.
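The reuse idea can be sketched in software: an iterative radix-2 FFT calls one and the same butterfly routine (N/2)·log₂(N) times across all stages, just as the proposed architecture time-multiplexes a single physical butterfly unit instead of instantiating one per stage. A minimal sketch, assuming a power-of-two length:

```python
import cmath
import math

def butterfly(a, b, w):
    """The single radix-2 butterfly: the one computational unit the
    architecture instantiates once and reuses across all stages."""
    return a + w * b, a - w * b

def fft(x):
    """Iterative radix-2 decimation-in-time FFT of length 2^k, calling
    the same butterfly for every stage."""
    n = len(x)
    x = list(x)
    # Bit-reversal permutation of the input order.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            x[i], x[j] = x[j], x[i]
    # log2(n) stages, each of n/2 butterfly calls.
    size = 2
    while size <= n:
        w_step = cmath.exp(-2j * math.pi / size)
        for start in range(0, n, size):
            w = 1.0 + 0j
            for k in range(size // 2):
                x[start + k], x[start + k + size // 2] = butterfly(
                    x[start + k], x[start + k + size // 2], w)
                w *= w_step
        size *= 2
    return x

print([round(abs(v), 4) for v in fft([1, 1, 1, 1, 0, 0, 0, 0])])
```

In hardware the trade-off is exactly the one the abstract states: one shared butterfly shrinks area by roughly the number of stages, log₂(N), at the cost of sequential reuse and thus slightly longer processing time.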

Keywords: FFT, FPGA, resource optimization, butterfly units

Procedia PDF Downloads 523