Search results for: computational lexicography
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2037

2037 Perspectives of Computational Modeling in Sanskrit Lexicons

Authors: Baldev Ram Khandoliyan, Ram Kishor

Abstract:

India has a classical tradition of Sanskrit lexicons, and considerable research has been done on Indian lexicography. India has also seen remarkable strides in Information and Communication Technology (ICT) applications for Indian languages in general and for Sanskrit in particular. Since machine translation from Sanskrit to other Indian languages is often the desired goal, traditional Sanskrit lexicography has attracted a great deal of attention from the ICT and computational linguistics community. From Nighaŋţu and Nirukta to Amarakośa and Medinīkośa, Sanskrit has a rich history of lexicography. As these kośas do not follow the same typology or standard in the selection and arrangement of words and the information related to them, several kośa styles have emerged in this tradition. The model of grammar given by the Aṣṭādhyāyī is well appreciated by Indian and Western linguists and grammarians, but the different models provided by the lexicographic tradition are also important. Some scholars have discussed the general usefulness of traditional Sanskrit kośas, largely on the basis of the material made available in the texts; others have discussed the arrangement of the lexica. This paper discusses further uses of the different models of Sanskrit lexicography, focusing especially on their computational modeling and their use in different computational operations.

Keywords: computational lexicography, Sanskrit Lexicons, nighanṭu, kośa, Amarkosa

Procedia PDF Downloads 162
2036 Contrastive Linguistics as a Way to Improve Translation Equivalence in Interlingual Lexicography: The Case of Verbs

Authors: R. A. S. Zacarias

Abstract:

Interlingual lexicography is one of the most complex and engaging of the several perspectives in lexicography, because it involves contacts and contrasts between two or more languages. Considering that translation equivalence goes beyond a mere fixed relation of correspondence, understanding the differences and similarities between linguistic categories in pairs of languages is the basis for effective translation. One of the theoretical approaches that has proved useful in finding improved translation equivalents for bilingual dictionaries is contrastive linguistics. This paper presents applied qualitative research based on exploratory and descriptive approaches. This is achieved through an analysis of students' errors as well as through a contrastive analysis of the Portuguese and English verb systems.

Keywords: bilingual lexicography, contrastive linguistics, translation equivalent, Portuguese-English

Procedia PDF Downloads 475
2035 Particular Features of the First Romanian Multilingual Dictionaries

Authors: Mihaela Mocanu

Abstract:

The Romanian multilingual dictionaries – also named polyglot, plurilingual or polylingual dictionaries – have known a slow yet constant development from the end of the 17th century, when the first such work is attested, to the present time, when we witness a considerable increase in the number of polyglot dictionaries, especially terminological ones. This paper aims at analyzing the context in which the first Romanian multilingual dictionaries were issued, as well as the particularities of organization and structure of the first lexicographic works of this type. The irretrievable loss of some of these works, as well as the partial conservation of others, renders the attempt to retrace the beginnings of Romanian lexicography extremely difficult. The research methodology follows a descriptive and analytical approach based on two types of sources, subjected to contrastive analysis: the notes made by the initiators of the lexicographic projects and the testimonies of their contemporaries, along with specialized studies on the history of old Romanian lexicography. The analysis of the contents indicates that these dictionaries lacked a scientific apparatus in the true sense of the phrase and failed to obey unitary organizational criteria, being limited, most of the time, to mere inventories of words in which the Romanian term was assigned its correspondents in other languages. Motivated by practical reasons, the first multilingual dictionaries were aimed at the clerics, their purpose being to ensure the translators' fidelity to the original religious texts, regarded as sacred.

Keywords: Romanian lexicography, multilingual dictionary, terminology, language

Procedia PDF Downloads 291
2034 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed around the four basic components of computational thinking: abstract thinking, logical thinking, modeling thinking and constructive thinking. In this course, students engage in hands-on problem-solving activities using a new problem-solving model proposed in the paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking

Procedia PDF Downloads 455
2033 An Empirical Study of the Effect of Robot Programming Education on the Computational Thinking of Young Children: The Role of Flowcharts

Authors: Wei Sun, Yan Dong

Abstract:

There is increasing interest in introducing computational thinking at an early age. Computational thinking, like mathematical thinking, engineering thinking, and scientific thinking, is a kind of analytical thinking. Learning computational thinking skills not only improves technological literacy but also equips learners with practical skills such as problem solving. As people realize the importance of computational thinking, the field of educational technology faces a problem: how to choose appropriate tools and activities to help students develop computational thinking skills. Robots are gradually becoming a popular teaching tool, as they provide a tangible way for young children to access technology, and controlling a robot through programming offers them opportunities to engage in developing computational thinking. This study explores whether introducing flowcharts into robotics programming courses can help children convert natural language into a programming language more easily and thus better cultivate their computational thinking skills. An experimental design was adopted with a sample of children aged six to seven (N = 16), and a one-meter-tall humanoid robot was used as the teaching tool. Results show that children can master basic programming concepts through robotics courses, and their computational thinking improved significantly. Results also suggest that flowcharts do have an impact on young children's computational thinking development, but the effect is significant only for the "sequencing" and "correspondence" skills. Overall, the study demonstrates that the humanoid robot and flowcharts have qualities that help young children learn programming and develop computational thinking skills.

Keywords: robotics, computational thinking, programming, young children, flow chart

Procedia PDF Downloads 146
2032 A Lexicographic Approach to Obstacles Identified in the Ontological Representation of the Tree of Life

Authors: Sandra Young

Abstract:

The biodiversity literature is vast and heterogeneous. In today's data age, a number of data integration and standardisation initiatives aim to facilitate simultaneous access to the literature across biodiversity domains for research and forecasting purposes. Ontologies are being used increasingly to organise this information, but the rationalisation intrinsic to ontologies can hit obstacles when faced with the inherent fluidity and inconsistency found in the domains comprising biodiversity. Essentially the problem is a conceptual one: biological taxonomies are formed on the basis of specific physical specimens, yet nomenclatural rules are used to provide labels to describe these physical objects, and these labels are ambiguous representations of the physical specimens. An example is the genus name Melpomene, which is the scientific nomenclatural label both for a genus of ferns and for a genus of spiders. The physical specimens for each of these are vastly different, but they have been assigned the same nomenclatural reference. While there is much research into the conceptual stability of the taxonomic concept versus the nomenclature used, to the best of our knowledge no research has yet looked empirically at the literature to see the conceptual plurality or singularity of the use of these species' names, the linguistic representation of a physical entity. Language itself uses words as symbols to represent real-world concepts, whether physical entities or otherwise, and as such lexicography has a well-founded history in the conceptual mapping of words in context for dictionary making. This makes it an ideal candidate to explore this problem. The lexicographic approach uses corpus-based analysis to look at word use in context, with a specific focus on collocated word frequencies (the frequencies of words used in specific grammatical and collocational contexts). It allows for inconsistencies and contradictions in the source data and in fact includes these in the word characterisation, so that 100% of the available evidence is counted. Corpus analysis has indeed been suggested as one way to identify concepts for ontology building, because of its ability to look empirically at data and show patterns in language usage, which can indicate conceptual ideas that go beyond words themselves. In this sense it could potentially be used to identify whether the hierarchical structures present within the empirical body of literature match those identified in ontologies created to represent them. The first stages of this research have revealed a hierarchical structure that becomes apparent in the biodiversity literature when annotating scientific species' names, common names and more general names as classes, which will be the focus of this paper. The next step in the research focuses on a larger corpus in which specific words can be analysed and then compared with existing ontological structures covering the same material, to evaluate the methods from an alternative perspective. This research aims to provide evidence as to the validity of current methods of knowledge representation for biological entities, and also to shed light on the way scientific nomenclature is used within the literature.
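
A minimal Python sketch of the collocation step described above, using an invented two-sentence corpus and a hypothetical collocates helper (neither comes from the paper): counting the words that co-occur with a species name within a small window makes it possible to compare the contexts of an ambiguous label such as Melpomene.

```python
from collections import Counter

# Toy "corpus": an invented placeholder, not text from the biodiversity literature.
corpus = ("the fern genus melpomene grows on humid rocks . "
          "the jumping spider melpomene builds no web .").split()

def collocates(tokens, node, window=3):
    """Count words occurring within `window` tokens of each instance of `node`."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            left = max(0, i - window)
            counts.update(t for t in tokens[left:i] + tokens[i + 1:i + 1 + window]
                          if t.isalpha())
    return counts

print(collocates(corpus, "melpomene").most_common(5))
# Different collocate profiles ('fern', 'grows' vs. 'spider', 'web') signal
# distinct underlying concepts behind the same nomenclatural label.
```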

Keywords: ontology, biodiversity, lexicography, knowledge representation, corpus linguistics

Procedia PDF Downloads 137
2031 A Computational Study of the Electron Transport in HgCdTe Bulk Semiconductor

Authors: N. Dahbi, M. Daoudi

Abstract:

This paper deals with the use of a computational method based on Monte Carlo simulation to investigate electron transport phenomena in the narrow-band-gap semiconductor HgCdTe. Via this method we can evaluate the time dependence of the transport parameters: the velocity, energy and mobility of electrons through the material (HgCdTe).
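
As an illustration only, here is a toy single-valley ensemble Monte Carlo sketch in Python; the field strength, effective mass and scattering rate are assumed round numbers, not the HgCdTe parameters of the paper, and the scattering model is deliberately crude.

```python
import numpy as np

# Toy ensemble Monte Carlo for drift-direction electron transport.
# All parameters below are assumptions for illustration, not HgCdTe values.
Q = 1.602e-19                    # electron charge (C)
M_EFF = 0.01 * 9.109e-31         # assumed effective mass (kg)
E_FIELD = 1.0e5                  # assumed applied field (V/m)
GAMMA = 1.0e13                   # assumed total scattering rate (1/s)
DT, N_STEPS, N_ELECTRONS = 1e-15, 2000, 5000

rng = np.random.default_rng(0)
v = np.zeros(N_ELECTRONS)        # drift-direction velocity of each electron

history = []
for _ in range(N_STEPS):
    v += Q * E_FIELD / M_EFF * DT            # free flight: acceleration by the field
    scattered = rng.random(N_ELECTRONS) < GAMMA * DT
    v[scattered] = 0.0                       # crude isotropic scattering: reset velocity
    history.append(v.mean())                 # ensemble drift velocity vs. time

drift_v = history[-1]
mobility = drift_v / E_FIELD                 # steady-state mobility estimate (m^2/V s)
energy_eV = 0.5 * M_EFF * np.mean(v**2) / Q  # mean drift kinetic energy (eV)
print(f"v_drift ~ {drift_v:.2e} m/s, mobility ~ {mobility:.2e} m^2/Vs, energy ~ {energy_eV:.3f} eV")
```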

Keywords: Monte Carlo, transport parameters, HgCdTe, computational mechanics

Procedia PDF Downloads 474
2030 Research Activity in Computational Science Using High Performance Computing: Co-Authorship Network Analysis

Authors: Sul-Ah Ahn, Youngim Jung

Abstract:

The research activities of the computational scientists using high-performance computing are analyzed using bibliometric approaches. This study aims at providing computational scientists using high-performance computing and relevant policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of computational scientists using high-performance computing as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period of 2006-2015. We extracted the author rank in the computational science field using high-performance computing by the number of papers published during ten years from 2006. Finally, we drew the co-authorship network for 50 top-authors and their coauthors and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
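
A minimal sketch of this kind of analysis with networkx; the article records below are hypothetical placeholders for Scopus exports, and the ranking and subgraph steps only mirror the procedure described in the abstract.

```python
import itertools
from collections import Counter
import networkx as nx

# Hypothetical article records standing in for Scopus exports (2006-2015).
articles = [
    {"authors": ["Kim J.", "Lee S.", "Park H."]},
    {"authors": ["Lee S.", "Park H."]},
    {"authors": ["Smith A.", "Kim J."]},
]

# Rank authors by number of papers published.
paper_counts = Counter(a for rec in articles for a in rec["authors"])
top_authors = [a for a, _ in paper_counts.most_common(50)]

# Build a weighted co-authorship graph: one edge per author pair on a paper.
G = nx.Graph()
for rec in articles:
    for u, v in itertools.combinations(sorted(set(rec["authors"])), 2):
        w = G[u][v]["weight"] + 1 if G.has_edge(u, v) else 1
        G.add_edge(u, v, weight=w)

# Keep the subgraph around the top authors and their direct co-authors.
neighborhood = set(top_authors) | {n for a in top_authors if a in G for n in G[a]}
sub = G.subgraph(neighborhood)
print(nx.density(sub), sorted(sub.degree, key=lambda x: -x[1])[:5])
```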

Keywords: co-authorship network analysis, computational science, high performance computing, research activity

Procedia PDF Downloads 321
2029 Atwood's Canadianisms and Neologisms: A Cognitive Approach to Literature

Authors: Eleonora Sasso

Abstract:

This paper takes as its starting point the notions of cognitive linguistics and lexical blending, and uses both theoretical concepts to advance a new reading of Margaret Atwood's latest writings, one which sees them as paramount literary examples of norm and usage in bilingual Canadian lexicography. Atwood's prose seems to be imbued with Canadianisms and neologisms, lexical blends of zoomorphic forms, a kind of meeting point between two conceptual structures that follow the principles of lexical economy and asyntactic relation. Atwood's neologisms also attest to the undeniable impact on language exerted by Canada's aboriginal peoples. This paper aims to track these references and, with the aid of the Eskimo-English dictionary, to examine the linguistic issues they raise: attitudes to contamination and hybridisation, questions of lexical blending in literary examples, and so on. Atwood's fiction, whose cognitive linguistic strategy employs 'the virtues of scissors and matches', always strives to achieve isomorphism between word form and concept.

Keywords: Atwood, Canadianisms, cognitive science, Eskimo/English dictionary

Procedia PDF Downloads 264
2028 Alternative Computational Arrangements on g-Group (g > 2) Profile Analysis

Authors: Emmanuel U. Ohaegbulem, Felix N. Nwobi

Abstract:

Alternative, simple computational arrangements for carrying out multivariate profile analysis when more than two groups (populations) are involved are presented. These arrangements have been demonstrated not only to yield equivalent results for the test statistics (the Wilks lambdas), but also to require less computational effort than other arrangements so far presented in the literature, in addition to being quite simple and easy to apply.
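
For reference, a minimal Python sketch of the Wilks' lambda statistic for a g-group one-way MANOVA, the basic quantity behind profile analysis; the data are synthetic and the arrangement shown is the textbook one, not the alternative arrangements proposed in the paper. For the parallel-profiles hypothesis, the same statistic would be applied to successive differences of the repeated measures.

```python
import numpy as np

# Hypothetical repeated-measures profiles: rows = subjects, columns = occasions.
rng = np.random.default_rng(1)
groups = [rng.normal(loc=m, size=(12, 4)) for m in (0.0, 0.3, 0.6)]  # g = 3 groups

X = np.vstack(groups)
grand_mean = X.mean(axis=0)

# Within-groups (W) and total (T) sums-of-squares-and-cross-products matrices.
W = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
T = (X - grand_mean).T @ (X - grand_mean)

wilks_lambda = np.linalg.det(W) / np.linalg.det(T)
print(f"Wilks' lambda = {wilks_lambda:.4f}")   # values near 0 indicate group differences
```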

Keywords: coincident profiles, g-group profile analysis, level profiles, parallel profiles, repeated measures MANOVA

Procedia PDF Downloads 448
2027 Integrating and Evaluating Computational Thinking in an Undergraduate Marine Science Course

Authors: Dana Christensen

Abstract:

Undergraduate students, particularly in the environmental sciences, have difficulty displaying quantitative skills in their laboratory courses. Students spend time sampling in the field, often using new methods, and are expected to make sense of the data they collect. Computational thinking may be used to navigate these new experiences. We developed a curriculum for the marine science department at a small liberal arts college in the northeastern United States based on previous computational thinking frameworks. This curriculum incorporates marine science data sets with specific objectives and topics selected by the faculty at the college. The curriculum was distributed to all students enrolled in introductory marine science classes as a mandatory module. Two pre-tests and post-tests will be used to quantitatively assess student progress on both content-based and computational principles. Student artifacts are being collected with each lesson and will be coded for content-specific and computational-specific items in a qualitative assessment. There is an overall gap in marine science education research, especially regarding curricula that focus on computational thinking and associated quantitative assessment. The curriculum itself, the assessments, and our results may be modified and applied to other environmental science courses, given the nature of the inquiry-based laboratory components that use quantitative skills to understand nature.

Keywords: marine science, computational thinking, curriculum assessment, quantitative skills

Procedia PDF Downloads 59
2026 Lexical Bundles in the Alexiad of Anna Comnena: Computational and Discourse Analysis Approach

Authors: Georgios Alexandropoulos

Abstract:

The purpose of this study is to examine the historical text of the Alexiad by Anna Comnena using computational tools for the extraction of lexical bundles containing the name of her father, Alexius Comnenus. To this end, we apply corpus linguistics techniques for the automatic extraction of lexical bundles and, through them, draw conclusions about how these lexical bundles serve the support she provides to her father.
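
A minimal sketch of the extraction step in Python, using an invented English stand-in for the lemmatised Greek text (the real corpus and tokenisation are not reproduced here): lexical bundles are simply frequent n-grams containing the target name.

```python
from collections import Counter

# Invented toy text standing in for a lemmatised Alexiad.
corpus = ("the emperor alexius led the army and the emperor alexius gave orders "
          "while alexius comnenus secured the city").split()

def lexical_bundles(tokens, target, n=3, min_freq=2):
    """Return n-grams containing `target` that occur at least `min_freq` times."""
    ngrams = zip(*(tokens[i:] for i in range(n)))
    counts = Counter(g for g in ngrams if target in g)
    return [(g, c) for g, c in counts.items() if c >= min_freq]

print(lexical_bundles(corpus, "alexius"))
# e.g. [(('the', 'emperor', 'alexius'), 2)] in this toy text
```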

Keywords: lexical bundles, computational literature, critical discourse analysis, Alexiad

Procedia PDF Downloads 623
2025 Metaphor Institutionalization as Phase Transition: Case Studies of Chinese Metaphors

Authors: Xuri Tang, Ting Pan

Abstract:

Metaphor institutionalization refers to the propagation of a metaphor that leads to its acceptance in a speech community as a norm of the language. Such knowledge is important both to theoretical studies of metaphor and to practical disciplines such as lexicography and language generation. This paper reports an empirical study of the institutionalization of 14 Chinese metaphors. It first explores the pattern of metaphor institutionalization by fitting the logistic function (or S-shaped curve) to time series data on the conventionality of the metaphors, obtained automatically from a large-scale diachronic Chinese corpus. It then reports a questionnaire-based survey on the propagation scale of each metaphor, measured as the average number of subjects who can easily understand the metaphorical expressions. The study provides two pieces of evidence supporting the hypothesis that metaphor institutionalization is a phase transition: (1) the pattern of metaphor institutionalization is an S-shaped curve, and (2) institutionalized metaphors generally do not propagate to the whole community but remain in an equilibrium state. This conclusion helps distinguish metaphor institutionalization from topicalization and other types of semantic change.
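
A minimal sketch of the curve-fitting step with scipy; the conventionality series below is synthetic, so the fitted parameters stand in for, rather than reproduce, the corpus results.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: plateau K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic conventionality time series standing in for diachronic corpus counts.
years = np.arange(1990, 2016)
rng = np.random.default_rng(0)
conventionality = logistic(years, 0.8, 0.5, 2002) + rng.normal(0.0, 0.02, years.size)

params, _ = curve_fit(logistic, years, conventionality, p0=[1.0, 0.1, 2000.0])
K, r, t0 = params
print(f"plateau K={K:.2f}, rate r={r:.2f}, midpoint t0={t0:.1f}")
# A plateau K well below full conventionality corresponds to the 'equilibrium
# state' the paper associates with a phase transition rather than full spread.
```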

Keywords: metaphor institutionalization, phase transition, propagation scale, s-shaped curve

Procedia PDF Downloads 169
2024 Binarized-Weight Bilateral Filter for Low Computational Cost Image Smoothing

Authors: Yu Zhang, Kohei Inoue, Kiichi Urahama

Abstract:

We propose a simplified bilateral filter with binarized coefficients to accelerate it. Its computational cost is further decreased by sampling pixels. This computationally low-cost filter is useful for smoothing or denoising images on mobile devices with limited computational power.
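
A minimal NumPy sketch of the idea, under two stated assumptions: the range weight is binarized by a simple intensity threshold and the neighbour offsets are randomly subsampled; the authors' exact weight construction and sampling scheme are not given in the abstract.

```python
import numpy as np

# Sketch of a binarized-weight bilateral filter with pixel sampling (assumed
# formulation): the Gaussian range weight is replaced by a 0/1 weight that keeps
# only neighbours within `sigma_r` of the centre intensity, and only a random
# subset of the neighbour offsets is visited.
def binarized_bilateral(img, radius=3, sigma_r=0.1, sample_ratio=0.5, seed=0):
    rng = np.random.default_rng(seed)
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
                        for dx in range(-radius, radius + 1) if (dy, dx) != (0, 0)]
    offsets = [(0, 0)] + [o for o in offsets if rng.random() < sample_ratio]

    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    acc = np.zeros((h, w))
    norm = np.zeros((h, w))
    for dy, dx in offsets:
        shifted = padded[radius + dy: radius + dy + h, radius + dx: radius + dx + w]
        weight = (np.abs(shifted - img) <= sigma_r).astype(float)  # binarized weight
        acc += weight * shifted
        norm += weight
    return acc / norm    # centre pixel always included, so norm >= 1

noisy = np.clip(0.5 + np.random.default_rng(1).normal(0.0, 0.05, (64, 64)), 0.0, 1.0)
smoothed = binarized_bilateral(noisy)
print(noisy.std(), smoothed.std())   # the filtered image should be smoother
```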

Keywords: bilateral filter, binarized-weight bilateral filter, image smoothing, image denoising, pixel sampling

Procedia PDF Downloads 468
2023 Simulation of Photocatalytic Degradation of Rhodamine B in Annular Photocatalytic Reactor

Authors: Jatinder Kumar, Ajay Bansal

Abstract:

Simulation of a photocatalytic reactor helps in understanding the complex behavior of photocatalytic degradation. Simulation also aids the design and optimization of the photocatalytic reactor. The lack of simulation strategies is a major hindrance to the commercialization of photocatalytic technology. With the increased performance of computational resources and the development of simulation software, computational fluid dynamics (CFD) is becoming an affordable engineering tool to simulate and optimize reactor designs. In the present paper, a CFD model for simulating the performance of an immobilized-titanium-dioxide-based annular photocatalytic reactor was developed. The computational model integrates hydrodynamics, species mass transport, and chemical reaction kinetics using the commercial CFD code Fluent 6.3.26. The CFD model was based on the intrinsic kinetic parameters determined experimentally in a perfectly mixed batch reactor. Rhodamine B, a complex organic compound, was selected as the test pollutant for photocatalytic degradation. It was observed that CFD could become a valuable tool for understanding and improving photocatalytic systems.

Keywords: simulation, computational fluid dynamics (CFD), annular photocatalytic reactor, titanium dioxide

Procedia PDF Downloads 584
2022 Continuum-Based Modelling Approaches for Cell Mechanics

Authors: Yogesh D. Bansod, Jiri Bursa

Abstract:

The quantitative study of cell mechanics is of paramount interest, since cell mechanics regulates the behavior of living cells in response to the myriad of extracellular and intracellular mechanical stimuli. Novel experimental techniques, together with robust computational approaches, have given rise to new theories and models which describe cell mechanics as a combination of biomechanical and biochemical processes. This review paper encapsulates the existing continuum-based computational approaches that have been developed for interpreting the mechanical responses of living cells under different loading and boundary conditions. The salient features and drawbacks of each model are discussed from both structural and biological points of view. This discussion can contribute to the development of even more precise and realistic computational models of cell mechanics based on continuum approaches or on their combination with microstructural approaches, which in turn may provide a better understanding of mechanotransduction in living cells.

Keywords: cell mechanics, computational models, continuum approach, mechanical models

Procedia PDF Downloads 363
2021 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model

Authors: S. A. Sadegh Zadeh, C. Kambhampati

Abstract:

Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across a wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two forms of modelling can play a significant role in helping to guide the direction the field takes. This paper combines mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience: it analyses the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing the Hodgkin-Huxley model computationally and applying the concept of the all-or-none principle, an investigation of this mathematical model was performed. The results clearly showed that the Hodgkin-Huxley mathematical model does not observe this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
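
A minimal forward-Euler sketch of the classic Hodgkin-Huxley equations (standard 1952 squid-axon parameters in the -65 mV resting convention), included only to show the kind of stimulus sweep an all-or-none test relies on; it is not the authors' implementation.

```python
import numpy as np

# Standard Hodgkin-Huxley parameters (squid axon, -65 mV resting convention).
C_M, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3          # uF/cm^2 and mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.387                # mV

def rates(V):
    an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(V + 65.0) / 80.0)
    am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

def peak_voltage(i_stim, t_max=50.0, dt=0.01):
    """Peak membrane voltage (mV) for a constant current step i_stim (uA/cm^2)."""
    V, n, m, h = -65.0, 0.317, 0.053, 0.596           # approximate resting state
    peak = V
    for _ in range(int(t_max / dt)):
        an, bn, am, bm, ah, bh = rates(V)
        n += dt * (an * (1.0 - n) - bn * n)
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        i_ion = (G_NA * m**3 * h * (V - E_NA) + G_K * n**4 * (V - E_K)
                 + G_L * (V - E_L))
        V += dt * (i_stim - i_ion) / C_M
        peak = max(peak, V)
    return peak

# Sweep stimulus amplitude: a strict all-or-none response would show an abrupt
# jump in peak depolarisation, not a graded transition.
for i_stim in (1.0, 2.0, 4.0, 6.0, 10.0):
    print(i_stim, round(peak_voltage(i_stim), 1))
```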

Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential

Procedia PDF Downloads 616
2020 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System

Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung- Hsien Lin

Abstract:

The RSA system is a major contribution to encryption and decryption. It is based on modular exponentiation and therefore on arithmetic with very large numbers, which places a heavy burden on the CPU. To increase computational speed, in addition to improving the algorithms themselves (the binary method, the sliding window method, the addition chain method, and so on), a cluster computer can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures of modular exponentiation are processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiation for operands of more than 512 bits and even more than 1024 bits.
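
For context, a minimal Python sketch of sequential sliding-window modular exponentiation, one of the building blocks the paper combines and parallelises; the addition-chain refinement and the MPICH2 distribution across cluster nodes are not shown.

```python
# Left-to-right sliding-window modular exponentiation with window width w.
def mod_exp_sliding_window(base, exponent, modulus, w=4):
    if exponent == 0:
        return 1 % modulus
    # Precompute odd powers base^1, base^3, ..., base^(2^w - 1) mod modulus.
    base %= modulus
    base_sq = base * base % modulus
    odd_powers = {1: base}
    for k in range(3, 1 << w, 2):
        odd_powers[k] = odd_powers[k - 2] * base_sq % modulus

    bits = bin(exponent)[2:]
    result, i = 1, 0
    while i < len(bits):
        if bits[i] == "0":
            result = result * result % modulus      # square for a zero bit
            i += 1
        else:
            # Take the longest window ending in a 1-bit, at most w bits wide.
            j = min(i + w, len(bits))
            while bits[j - 1] == "0":
                j -= 1
            window = int(bits[i:j], 2)               # odd value by construction
            for _ in range(j - i):
                result = result * result % modulus   # square once per window bit
            result = result * odd_powers[window] % modulus
            i = j
    return result

assert mod_exp_sliding_window(7, 560, 561) == pow(7, 560, 561)
```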

Keywords: cluster system, modular exponentiation, sliding window, addition chain

Procedia PDF Downloads 520
2019 A Fast, Portable Computational Framework for Aerodynamic Simulations

Authors: Mehdi Ghommem, Daniel Garcia, Nathan Collier, Victor Calo

Abstract:

We develop a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM). The computational framework uses the Python programming language, which integrates easily with routines written in Fortran for the computationally expensive operations. The mixed-language approach enables high performance in terms of solution time and high flexibility in terms of ease of adapting the code to different system configurations and applications. This computational tool is intended to predict the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges...) subject to an incoming airflow. We simulate different aerodynamic problems to validate and illustrate the usefulness and effectiveness of the developed computational tool.
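
A minimal Python sketch of the elemental Biot-Savart kernel that a vortex lattice method assembles over all bound and wake segments; the formula is the standard finite straight-filament result, not code from the paper, and the collocation point and segment endpoints are invented for the example.

```python
import numpy as np

# Velocity induced at point p by a straight vortex segment from r1 to r2 with
# circulation gamma (standard finite-filament Biot-Savart law).
def segment_induced_velocity(p, r1, r2, gamma, core=1e-9):
    a, b = p - r1, p - r2
    cross = np.cross(a, b)
    cross_sq = np.dot(cross, cross)
    if cross_sq < core:                 # point (nearly) on the segment axis
        return np.zeros(3)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    k = gamma / (4.0 * np.pi * cross_sq) * np.dot(r2 - r1, a / na - b / nb)
    return k * cross

# Velocity induced at a collocation point by one unit-strength segment.
p = np.array([0.5, 0.5, 0.0])
v = segment_induced_velocity(p, np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), 1.0)
print(v)   # approximately [0, 0, 0.225] for this geometry
```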

Keywords: unsteady aerodynamics, numerical simulations, mixed-language approach, potential flow

Procedia PDF Downloads 290
2018 Robot Spatial Reasoning via 3D Models

Authors: John Allard, Alex Rich, Iris Aguilar, Zachary Dodds

Abstract:

With this paper we present several experiences deploying novel, low-cost resources for computing with 3D spatial models. Certainly, computing with 3D models undergirds some of our field’s most important contributions to the human experience. Most often, those are contrived artifacts. This work extends that tradition by focusing on novel resources that deliver uncontrived models of a system’s current surroundings. Atop this new capability, we present several projects investigating the student-accessibility of the computational tools for reasoning about the 3D space around us. We conclude that, with current scaffolding, real-world 3D models are now an accessible and viable foundation for creative computational work.

Keywords: 3D vision, matterport model, real-world 3D models, mathematical and computational methods

Procedia PDF Downloads 535
2017 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption

Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses

Abstract:

This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby meets the aspiration for a computational encryption model that could enhance the security of big data with respect to the privacy, confidentiality, and availability of users' data. The cryptographic model applied to the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, with detailed theoretical mathematical concepts underlying the fully homomorphic encryption models. This contribution supports the full implementation of a cryptographic security algorithm for big data analytics.
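
As a toy illustration of the underlying idea rather than of FHE itself, textbook (unpadded) RSA is multiplicatively homomorphic, so a product can be computed on ciphertexts alone; a fully homomorphic scheme extends this to arbitrary combinations of additions and multiplications with bootstrapping, which this sketch does not attempt.

```python
# Toy, insecure demonstration of computing on ciphertexts without decrypting:
# with unpadded RSA, the product of two ciphertexts decrypts to the product of
# the plaintexts. FHE generalises this property to arbitrary circuits.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

c1, c2 = enc(7), enc(12)
c_product = (c1 * c2) % n             # computation performed on ciphertexts only
assert dec(c_product) == (7 * 12) % n
print(dec(c_product))                 # 84
```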

Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme

Procedia PDF Downloads 378
2016 The Analysis of Indian Culture through the Lexicographical Discourse of Hindi-French Dictionary

Authors: Tanzil Ansari

Abstract:

A dictionary is often considered a list of words, arranged in alphabetical order, providing information on a language or languages and informing us about the spelling, pronunciation, origin, gender and grammatical functions of new and unknown words. In other words, it is first and foremost a linguistic tool. But research across the world in the fields of linguistics and lexicography has shown that a dictionary is not only a linguistic tool but also a cultural product through which a lexicographer transmits the culture of a country or a linguistic community through his or her ideology. This means that a dictionary presents not only a language and its metalinguistic functions but also its culture. Every language contains words and expressions that depict its culture, and in this way it is impossible to dissociate a language from its culture. There is always an ideology that plays an important role in the depiction of any culture. Using Edward Said's theory of Orientalism on the representation of the East, the objective of the present research is to study the representation of Indian culture through the lexicographical discourse of the Hindi-French Dictionary of Federica Boschetti, a French lexicographer. The results show that the representation of Indian culture is stereotypical and monolithic; India is also shown as a male-oriented country where women are exploited by a male-dominated society. The study focuses on a Hindi-French dictionary, but its line of argument can be extended to dictionaries produced in other languages.

Keywords: culture, dictionary, lexicographical discourse, stereotype image

Procedia PDF Downloads 300
2015 The Presence of Anglicisms in Italian Fashion Magazines and Fashion Blogs

Authors: Vivian Orsi

Abstract:

The present research investigates the lexicon of a fashion magazine, a universe very receptive to lexical loans, especially those from English, called Anglicisms. Specifically, we discuss the presence of English items and expressions in the Vogue Italia fashion magazine. We also study the Anglicisms used in an Italian fashion blog called The Blonde Salad. Within the discussion of fashion blogs and their contributions to scientific studies, we adopt the theories of lexicology and lexicography to define Anglicism (BIDERMAN, 2001) and to observe its prestige in the Italian language (ROGATO, 2008; BISETTO, 2003). On this theoretical basis, we present a brief analysis of the Anglicisms collected from posts from the blog's first year of existence, emphasizing also the keywords whose role is to encapsulate the content of the text, allowing the reader to retrieve information from the blog posts. Regarding the use of English in Italian magazines and blogs, it seems to represent sophistication, assuming the value of a prerequisite for participating in the fashion centers of the world. Moreover, we believe, as Barthes says (1990, p. 215), that “Fashion does not evolve, it changes: its lexicon is new each year, like that of a language which always keeps the same system but suddenly and regularly ‘changes’ the currency of its words”. Fashion is a mode of communication: it is present in man's interaction with the world, which means that this lexical universe is represented according to the particularities of each culture.

Keywords: anglicism, lexicology, magazines, blogs, fashion

Procedia PDF Downloads 332
2014 Computational Fluid Dynamics Analysis for Radon Dispersion Study and Mitigation

Authors: A. K. Visnuprasad, P. J. Jojo, Reshma Bhaskaran

Abstract:

Computational fluid dynamics (CFD) is used to simulate the distribution of indoor radon concentration in a living room with elevated radon levels, which vary from 22 Bq m⁻³ to 1533 Bq m⁻³ over 24 hours. The finite volume method (FVM) was used for the simulation. The simulation results were experimentally validated at 16 points in two horizontal planes (y = 1.4 m and y = 2.0 m) using pin-hole dosimeters and at 3 points using a scintillation radon monitor (SRM). Passive measurements using pin-hole dosimeters were performed in all seasons. A further simulation was carried out to find a suitable position for a passive ventilation system for the effective mitigation of radon.

Keywords: indoor radon, computational fluid dynamics, radon flux, ventilation rate, pin-hole dosimeter

Procedia PDF Downloads 412
2013 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments

Authors: Sarantos Psycharis

Abstract:

Physical Computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach “transversal/cross-cutting ideas” in a STEM content approach. LabVIEW and Arduino were used to connect the physical world with real data in the framework of the so-called Computational Experiment. Prospective tertiary engineering educators were engaged during their course, and Computational Thinking (CT) concepts were recorded before and after the intervention across the didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the Computational Pedagogy model. Results show a significant change in students’ responses regarding self-efficacy for CT before and after the instruction. Results also indicate a significant relation between the responses for the different CT concepts/practices. According to the findings, STEM content epistemology combined with Physical Computing should be a good candidate as a learning and teaching approach in university settings that enhances students’ engagement in CT concepts/practices.

Keywords: Arduino, computational thinking, computer programming, LabVIEW, self-efficacy, STEM

Procedia PDF Downloads 113
2012 Unconventional Calculus Spreadsheet Functions

Authors: Chahid K. Ghaddar

Abstract:

The spreadsheet engine is exploited via a non-conventional mechanism to enable novel worksheet solver functions for computational calculus. The solver functions bypass inherent restrictions on built-in math and user-defined functions by taking variable formulas as a new type of argument while retaining purity and recursion properties. The enabling mechanism permits the integration of numerical algorithms into worksheet functions for solving virtually any computational problem that can be modelled by formulas and variables. Several examples are presented for computing integrals, derivatives, and systems of differential-algebraic equations. Incorporating the worksheet solver functions into the ubiquitous spreadsheet extends the utility of the latter as a powerful tool for computational mathematics.
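
A rough Python analogue of the idea of a solver function that accepts a formula and a variable name as arguments; the function name and calling convention below are hypothetical, since the paper's functions live inside the spreadsheet engine itself.

```python
import math
from scipy.integrate import quad

# Hypothetical analogue of a worksheet call like =INTEGRALF("exp(-x^2)", "x", 0, 1):
# the formula is passed as text and evaluated for each value of the variable.
def integralf(formula, variable, lower, upper):
    """Numerically integrate `formula` (a string in `variable`) over [lower, upper]."""
    allowed = {name: getattr(math, name)
               for name in ("sin", "cos", "tan", "exp", "log", "sqrt", "pi")}
    integrand = lambda x: eval(formula, {"__builtins__": {}}, {**allowed, variable: x})
    value, _ = quad(integrand, lower, upper)
    return value

print(integralf("exp(-x**2)", "x", 0.0, 1.0))   # ~0.746824
```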

Keywords: calculus, differential algebraic equations, solvers, spreadsheet

Procedia PDF Downloads 360
2011 Simulations of NACA 65-415 and NACA 64-206 Airfoils Using Computational Fluid Dynamics

Authors: David Nagy

Abstract:

This paper exemplifies the influence of an aircraft's purpose on the aerodynamic properties of its airfoil. In particular, the research considers two types of aircraft, namely cargo aircraft and military high-speed aircraft, and compares their airfoil characteristics using their NACA airfoils as well as computational fluid dynamics. The results show that airfoils of aircraft designed for cargo focus more heavily on maintaining a large lift force, whereas airfoils of speed-oriented airplanes focus on minimizing the drag force.

Keywords: aerodynamic simulation, aircraft, airfoil, computational fluid dynamics, lift to drag ratio, NACA 64-206, NACA 65-415

Procedia PDF Downloads 386
2010 Computational Approach to the Interaction of Neurotoxins and Kv1.3 Channel

Authors: Janneth González, George Barreto, Ludis Morales, Angélica Sabogal

Abstract:

Sea anemone neurotoxins are peptides that interact with Na+ and K+ channels, resulting in specific alterations of their functions. Some of these neurotoxins (1ROO, 1BGK, 2K9E, 1BEI) are important for the treatment of nearly eighty autoimmune disorders due to their specificity for the Kv1.3 channel. The aim of this study was to identify the common residues among these neurotoxins by computational methods and to establish whether there is a pattern useful for the future development of treatments for autoimmune diseases. Our results showed eight new key residues common to the studied neurotoxins, interacting with a histidine ring and the selectivity filter of the receptor, thus revealing a possible pattern of interaction. This knowledge may serve as an input for the design of more promising drugs for autoimmune treatments.
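
A minimal sketch of the "common residues" step on hypothetical aligned fragments; the real analysis would start from a structural or multiple-sequence alignment of 1ROO, 1BGK, 2K9E and 1BEI, which is not reproduced here.

```python
# Hypothetical aligned toxin fragments (placeholders, not the actual sequences).
aligned = {
    "toxA": "GCKDNFSANTCKH",
    "toxB": "GCKDGFAANTCRH",
    "toxC": "GCKDNFSVNTCKH",
}

# A residue is "common" if the same amino acid appears at that alignment
# position in every sequence.
common = [(i + 1, cols[0]) for i, cols in enumerate(zip(*aligned.values()))
          if len(set(cols)) == 1]

print(common)   # e.g. [(1, 'G'), (2, 'C'), (3, 'K'), (4, 'D'), ...]
```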

Keywords: neurotoxins, potassium channel, Kv1.3, computational methods, autoimmune diseases

Procedia PDF Downloads 372
2009 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby meets the aspiration for a computational encryption model that could enhance the security of big data with respect to the privacy or confidentiality, availability and integrity of the data and of users' security. The cryptographic model applied to the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory derivable from abstract algebra, that can easily be integrated and leveraged in the cloud computing interface, with detailed theoretical mathematical concepts underlying the fully homomorphic encryption models. This contribution enhances the full implementation of big data analytics based on a cryptographic security algorithm.

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 479
2008 Computational Neurosciences: An Inspiration from Biological Neurosciences

Authors: Harsh Sadawarti, Kamal Malik

Abstract:

Humans are the most unique and powerful creatures on this planet because of the high level of intelligence gifted to them by nature. Computational intelligence is highly influenced by natural intelligence, the neurosciences and mathematics. To study computational intelligence in depth and to utilize it in real-life applications, it is quite important to understand how it can be simulated with reference to the human brain. In this paper, three important parts of the human brain, the frontal lobe, the occipital lobe and the parietal lobe, are compared with the artificial neural network (ANN), the convolutional neural network (CNN) and the recurrent neural network (RNN), respectively. Intelligent computational systems are created by combining deductive reasoning, logical concepts and high-level algorithms with the simulation and study of the human brain. The human brain combines physiology, psychology, emotions, calculations and many other parameters, all of utmost importance, which together determine overall intelligence. To create intelligent algorithms and smart machines and to simulate the human brain in an effective manner, it is quite important to have an insight into the human brain and the basic concepts of the biological neurosciences.

Keywords: computational intelligence, neurosciences, convolutional neural network, recurrent neural network, artificial neural network, frontal lobe, occipital lobe, parietal lobe

Procedia PDF Downloads 108