Search results for: writing DNA code from scratch
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2178

1878 An Investigation on Overstrength Factor (Ω) of Reinforced Concrete Buildings in Turkish Earthquake Draft Code (TEC-2016)

Authors: M. Hakan Arslan, I. Hakkı Erkan

Abstract:

The overstrength factor (Ω) is an important component of the load reduction factor. In this research, the overstrength factor (Ω) of reinforced concrete (RC) buildings and the parameters affecting Ω in the TEC-2016 draft version have been explored. For this aim, 48 RC buildings were modelled according to the current seismic code TEC-2007 and the Turkish Building Code 500-2000 criteria. After the modelling step, nonlinear static pushover analyses were applied to these buildings using TEC-2007 Section 7. From the nonlinear pushover analyses, capacity curves (lateral load versus lateral top displacement) were plotted for the 48 RC buildings. Using the capacity curves, overstrength factors (Ω) were derived for each building. The obtained Ω values were compared with the TEC-2016 values for the related building types, and the results were interpreted. According to the values obtained in the study, the overstrength factor (Ω) given in the TEC-2016 draft code was found to be quite suitable.
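The quantity the abstract derives from each capacity curve can be sketched as follows: the overstrength factor is the ratio of the maximum base shear actually developed in the pushover analysis to the design base shear. The curve points and design shear below are illustrative sample values, not data from the study:

```python
# Hedged sketch: overstrength factor from a pushover capacity curve.
# The curve points and the design base shear are made-up sample values.

def overstrength_factor(capacity_curve, design_base_shear):
    """capacity_curve: list of (top_displacement_m, base_shear_kN) points."""
    max_base_shear = max(shear for _, shear in capacity_curve)
    return max_base_shear / design_base_shear

curve = [(0.00, 0.0), (0.05, 600.0), (0.10, 1100.0), (0.15, 1200.0), (0.20, 1150.0)]
omega = overstrength_factor(curve, design_base_shear=400.0)
print(round(omega, 2))  # 3.0
```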

Keywords: reinforced concrete buildings, overstrength factor, earthquake, static pushover analysis

Procedia PDF Downloads 338
1877 Code Switching and Code Mixing among Adolescents in Kashmir

Authors: Sarwat un Nisa

Abstract:

One of the remarkable gifts that a human being is blessed with is the ability to speak using a combination of sounds. Different combinations of sounds combine to form a word, which in turn makes a sentence and therefore gives birth to a language. A person can either be monolingual, i.e., able to speak one language, or bilingual, i.e., able to speak more than one language. Whether a person speaks one language or multiple languages, and in whatever language a person speaks, the main aim is to communicate and to express ideas, feelings or thoughts. Sometimes the choice of a language is deliberate, and sometimes it is a habitual act. The language which is used to put our ideas across says much about our cultural, linguistic and ethnic identities. While it can never be claimed that bilinguals are better than monolinguals in terms of linguistic skills, bilinguals or multilinguals do have more than one language at their disposal. Therefore, how effectively two languages are used by the same person has always intrigued linguists. The most prominent and common features found in the speech of bilingual speakers are code switching and code mixing. The aim of the present paper is to explore these features among adolescent speakers of Kashmir. The reason for studying the linguistic behaviour of adolescents is that adolescence is the age when a person is neither an adult nor a child; adolescents want to drift away from the norms and make new norms for themselves. Therefore, how their linguistic skills are influenced by their age is of great interest, because it can set the trend for the future generation. Kashmir is a multilingual society where three languages, i.e., Kashmiri, Urdu, and English, are regularly used by speakers, especially educated ones. Kashmiri is widely used at home and mostly among adults. Urdu is the official language, and English is used in schools and for most written official correspondence.
Thus, it is not uncommon to find these three languages coming into contact with each other quite frequently. This language contact results in code switching and code mixing. In this paper, different aspects of code switching and code mixing are discussed. Research Method: The data were collected from different districts of Kashmir. The informants did not have prior knowledge of the survey; the situation was spontaneous and natural. Topics were introduced by the interviewer to groups of informants, each comprising three participants. They were asked to discuss the topic, most of the time without any intervention from the interviewer. Along with the conversations, the informants also filled in written questionnaires comprising sociolinguistic questions. The questionnaires were analysed to get an idea of the sociolinguistic attitudes of the informants. Percentage, frequency, and average were used as statistical tools to analyse the data. Conclusions were drawn taking into consideration the interpretations of both the speech samples and the questionnaires.

Keywords: code mixing, code switching, Kashmir, bilingualism

Procedia PDF Downloads 121
1876 Developing Writing Skills of Learners with Persistent Literacy Difficulties through the Explicit Teaching of Grammar in Context: Action Research in a Welsh Secondary School

Authors: Jean Ware, Susan W. Jones

Abstract:

Background: The benefits of grammar instruction in the teaching of writing are contested in most English-speaking countries. A majority of Anglophone countries abandoned the teaching of grammar in the 1950s, based on the conclusion that it had no positive impact on learners’ development of reading, writing, and language. Although the decontextualised teaching of grammar is not helpful in improving writing, a curriculum that addresses grammar in an embedded and meaningful way can help learners develop their understanding of the mechanisms of language. Although British learners are generally not taught grammar rules explicitly, learners in schools in France, the Netherlands, and Germany are taught explicitly about the structure of their own language. Exposing learners to grammatical analysis can help them develop their understanding of language. Indeed, if learners are taught that each part of speech has an identified role in the sentence, then rather than having to memorise lists of words or spelling patterns, they can focus on determining each word’s or phrase’s task in the sentence. These processes of categorisation and deduction are higher-order thinking skills. Given the definitions of dyslexia available in Great Britain, the explicit teaching of grammar in context could help learners with persistent literacy difficulties: learners with dyslexia often develop strengths in problem solving, and the teaching of grammar could, therefore, help them develop their understanding of language by using analytical and logical thinking. Aims: This study aims to gain a further understanding of how the explicit teaching of grammar in context can benefit learners with persistent literacy difficulties. The project is designed to identify ways of adapting existing grammar-focused teaching materials so that learners with specific learning difficulties such as dyslexia can use them to further develop their writing skills.
It intends to improve educational practice through action, analysis and reflection. Research Design/Methods: The project therefore uses an action research design and multiple sources of evidence. The data collection tools were standardised test data, teacher assessment data, semi-structured interviews, learners’ attempts at a writing task at the beginning and end of the cycle, documentary data, and lesson observations carried out by a specialist teacher. Existing teaching materials were adapted for use with five Year 9 learners who had experienced persistent literacy difficulties from primary school onwards. The initial adaptations included reducing the amount of content taught in each lesson and pre-teaching some of the metalanguage needed. Findings: Learners’ before and after attempts at the writing task were scored by a colleague who did not know the order of the attempts. All five learners’ scores were higher on the second writing task. Learners reported that they had enjoyed the teaching approach. They also made suggestions to be included in the second cycle, as did the colleague who carried out the observations. Conclusions: Although this is a very small exploratory study, the results suggest that adapting grammar-focused teaching materials shows promise for helping learners with persistent literacy difficulties develop their writing skills.

Keywords: explicit teaching of grammar in context, literacy acquisition, persistent literacy difficulties, writing skills

Procedia PDF Downloads 138
1875 The Analysis of TRACE/FRAPTRAN in the Fuel Rods of Maanshan PWR for LBLOCA

Authors: J. R. Wang, W. Y. Li, H. T. Lin, J. H. Yang, C. Shih, S. W. Chen

Abstract:

The Fuel Rod Analysis Program Transient (FRAPTRAN) code was used to study fuel rod performance during a postulated large-break loss-of-coolant accident (LBLOCA) in the Maanshan nuclear power plant (NPP). Previous transient results from the thermal-hydraulic code TRACE, for the same LBLOCA scenario, were used as input boundary conditions for FRAPTRAN. The simulation results showed that the peak cladding temperatures and the fuel centerline temperatures were all below the 10 CFR 50.46 LOCA criteria. In addition, the maximum hoop stress was 18 MPa and the oxide thickness was 0.003 mm for the present simulation cases, which are all within safe operating ranges. The present study confirms that this analysis method, the FRAPTRAN code combined with TRACE, is an appropriate approach for predicting fuel integrity under LBLOCA with operational ECCS.
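For context on the hoop-stress figure above, the textbook thin-wall pressure-vessel approximation relates cladding stress to the pressure difference across the wall, σ = p·r/t. The sketch below uses that elementary formula with made-up numbers; it is not the cladding mechanics model actually implemented in FRAPTRAN:

```python
def hoop_stress(pressure_pa, mean_radius_m, wall_thickness_m):
    """Thin-wall approximation for the hoop (circumferential) stress in tubing."""
    return pressure_pa * mean_radius_m / wall_thickness_m

# Made-up values roughly in the range of thin-walled fuel-rod cladding geometry.
sigma = hoop_stress(pressure_pa=10e6, mean_radius_m=0.005, wall_thickness_m=0.0005)
print(sigma / 1e6)  # hoop stress in MPa
```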

Keywords: FRAPTRAN, TRACE, LOCA, PWR

Procedia PDF Downloads 492
1874 A Comparative Genre-Based Study of Research Articles' Method and Results Sections Authored by Iranian and English Native Speakers

Authors: Mohammad Amin Mozaheb, Mahnaz Saeidi, Saeideh Ahangari

Abstract:

The present genre-driven study aims at comparing the moves and sub-moves deployed by Iranian and English medical writers while writing their research articles in English. To attain the goals of the study, the researchers randomly selected a number of medical articles and compared them using Nwogu’s (1997) model. The relevant statistical tests, Chi-square tests for goodness of fit, used to compare the two groups of articles, dubbed IrISI (Iranian ISI articles) and EISI (English ISI articles), have shown that no significant difference exists between the two groups in terms of the moves and sub-moves used in their method and results sections. The findings can be beneficial for people interested in English for Specific Purposes (ESP) and for medical experts. The findings can also increase language awareness and genre awareness among researchers who are interested in publishing their research outcomes in ISI-indexed journals in the Islamic Republic of Iran and other countries.

Keywords: writing, ESP, research articles, medical sciences, language, scientific writing

Procedia PDF Downloads 350
1873 Design and Analysis of Piping System with Supports Using CAESAR-II

Authors: M. Jamuna Rani, K. Ramanathan

Abstract:

A steam power plant houses various types of equipment, such as boilers, turbines, and heat exchangers. These items of equipment are mainly connected by piping systems. Such a piping layout design depends mainly on stress analysis and flexibility, and it varies with the pipe’s geometrical properties, pressure, temperature, and supports. The present paper analyzes the presence and effect of hangers and expansion joints in the piping layout/routing using the CAESAR-II software. The main aim of piping stress analysis is to provide adequate flexibility for absorbing thermal expansion, along with code compliance for the stresses and displacements incurred in the piping system. The design is said to be safe if all of these are within the allowable range given by the code. In this study, a sample problem is analyzed per the ASME B31.1 power piping code, and the results thus obtained are compared.
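The flexibility requirement mentioned above rests on two elementary relations: the free thermal growth of a pipe run, ΔL = α·L·ΔT, and the axial stress in a fully restrained straight pipe, σ = E·α·ΔT. The sketch below uses illustrative carbon-steel material values, not figures from the paper’s CAESAR-II model:

```python
def thermal_expansion(length_m, alpha_per_k, delta_t_k):
    """Free thermal growth of a pipe run: dL = alpha * L * dT."""
    return alpha_per_k * length_m * delta_t_k

def restrained_thermal_stress(youngs_modulus_pa, alpha_per_k, delta_t_k):
    """Axial stress in a fully restrained straight pipe: sigma = E * alpha * dT."""
    return youngs_modulus_pa * alpha_per_k * delta_t_k

# Illustrative carbon-steel values (assumed, not from the paper).
dL = thermal_expansion(length_m=10.0, alpha_per_k=1.2e-5, delta_t_k=100.0)
sigma = restrained_thermal_stress(youngs_modulus_pa=200e9, alpha_per_k=1.2e-5,
                                  delta_t_k=100.0)
print(dL)           # free thermal growth in metres
print(sigma / 1e6)  # axial stress in MPa if the run is fully restrained
```

The gap between the restrained-case stress and the code-allowable stress is precisely what hangers and expansion joints are there to absorb.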

Keywords: ASME B31.1, hanger, expansion joint, CAESAR-II

Procedia PDF Downloads 342
1872 A Simple Recursive Framework to Generate Gray Codes for Weak Orders in Constant Amortized Time

Authors: Marsden Jacques, Dennis Wong

Abstract:

A weak order is a way to rank n objects where ties are allowed. In this talk, we present a recursive framework to generate Gray codes for weak orders. We then describe a simple algorithm based on the framework that generates 2-Gray codes for weak orders in constant amortized time per string. The framework can easily be modified to generate other Gray codes for weak orders. We provide an example of using the framework to generate the first shift Gray code for weak orders, also in constant amortized time, where consecutive strings differ by a shift or a symbol change.
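To make the objects concrete: weak orders on n items can be encoded as Cayley permutations, length-n strings over {1,…,n} whose set of values is exactly {1,…,m} for some m. The brute-force enumeration below only illustrates what is being listed; it is not the authors’ constant-amortized-time Gray code algorithm:

```python
from itertools import product

def weak_orders(n):
    """Enumerate weak orders on n items as Cayley permutations:
    strings over {1..n} that use every value from 1 up to their maximum."""
    result = []
    for s in product(range(1, n + 1), repeat=n):
        if set(s) == set(range(1, max(s) + 1)):
            result.append(s)
    return result

# The counts are the ordered Bell (Fubini) numbers.
print([len(weak_orders(n)) for n in range(1, 5)])  # [1, 3, 13, 75]
```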

Keywords: weak order, Cayley permutation, Gray code, shift Gray code

Procedia PDF Downloads 152
1871 Recent Legal Changes in Turkish Commercial Law to Be a Part of International Markets and Their Results

Authors: Ibrahim Arslan

Abstract:

Since 1984, Turkey has experienced a significant transformation in legal and economic matters. The most consequential examples of this transformation in recent years are the renewal of the Commercial Code and of the Check Act. Nowadays, commercial activity is not limited to within the boundaries of the country; on the contrary, as required by the global economy, it has an international dimension. For this reason, unlike some other legal principles, the rules regulating commercial life should be compatible with international standards as much as possible; otherwise, the possibility of development in global markets will be limited. The Check Act was adopted in 2009 and the Commercial Code in 2011; the Commercial Code entered into force on 1 July 2012. The international dimension of the check is indisputable, for it is based on the Geneva Convention. However, Turkish business life has created a unique application of this legal tool, called the “post-dated” check. Indeed, the majority of the checks used in the market are post-dated checks: the holders of these checks wait until the date written on the check for presentation and collection. Thus, a de facto situation has arisen. This de facto situation was legitimized via Check Act No. 5941, and post-dated checks gained legal status. In the preparation of the new Turkish Commercial Code, one of the stated goals was “to ensure that Turkish commercial law becomes a part of the international market”. To achieve this goal, significant changes were made, especially concerning the independent external audit of corporations, board structure, and public disclosure regulations. These changes aim to facilitate the internationalization of Turkish corporations as well as the intensification of foreign direct investment through foreign capital.
Although the target was determined in this way, after the adoption, but five days before the entry into force, of Turkish Commercial Code No. 6102, a law made regressive alterations concerning independent external audit and public disclosure regulations. The Turkish Commercial Code is currently in force in this altered form. As such, neither the regulations in the Check Act nor the changes to the Commercial Code are compatible with the goals expressed in the rationale “to ensure Turkish commercial law to be a part of the international market”.

Keywords: Turkish Commercial Code No. 6102, Turkish Check Act, “post-date” checks, legal changes

Procedia PDF Downloads 271
1870 The Application of Lesson Study Model in Writing Review Text in Junior High School

Authors: Sulastriningsih Djumingin

Abstract:

This study has three objectives. First, it aims to describe the ability of second-grade students at SMPN 18 Makassar to write review text without applying the Lesson Study model. Second, it seeks to describe their ability to write review text when the Lesson Study model is applied. Third, it aims to test the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research used a true experimental, posttest-only group design involving two groups: one control class and one experimental class. The research population was all 250 second-grade students at SMPN 18 Makassar, spread across 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, also consisting of 30 students. The research instruments were observations and tests. The collected data were analyzed using descriptive statistics and inferential statistics (t-tests) processed with SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) obtained a score above 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of at least 7.5, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison between the control class and the experimental class indicates that the computed t-value is greater than the critical t-value (2.411 > 1.667), meaning that the alternative hypothesis (H1) proposed by the researcher is accepted.
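The comparison above rests on a pooled two-sample t statistic. As a hedged illustration of that computation (with toy data, not the study’s scores or the SPSS output):

```python
import math
import statistics

def pooled_t(sample_a, sample_b):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    pooled_var = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (statistics.mean(sample_a) - statistics.mean(sample_b)) / se

t = pooled_t([1, 2, 3], [2, 3, 4])
print(round(t, 4))  # -1.2247
```

The hypothesis test then compares |t| against the critical value for the relevant degrees of freedom, exactly as the abstract does with 2.411 > 1.667.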

Keywords: application, lesson study, review text, writing

Procedia PDF Downloads 183
1869 Automatic Teller Machine System Security by Using Mobile SMS Code

Authors: Husnain Mushtaq, Mary Anjum, Muhammad Aleem

Abstract:

The main objective of this paper is to develop high security for Automatic Teller Machines (ATMs). In this system, banks collect mobile numbers from customers and then send a code to those numbers. In most countries, existing ATMs use magnetic card readers: the customer is identified by inserting an ATM card whose magnetic stripe holds unique information, such as the card number, together with some security limitations. By entering a personal identification number (PIN), the customer is first authenticated and can then access the bank account in order to withdraw cash or use other services provided by the bank. Card fraud is a further problem: once a customer’s card is lost and the password is stolen, or a criminal simply steals the card and PIN, the criminal can withdraw all the cash in a very short time, causing great financial losses to the customer; this type of fraud has increased worldwide. To resolve this problem, we provide a solution that combines a mobile SMS code with the ATM PIN code in order to improve security verification for customers using ATM systems and to strengthen confidence in the banking area.
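The two-factor flow proposed here (PIN plus a one-time SMS code) can be sketched as follows. The code length, function names, and verification step are illustrative assumptions, not the authors’ implementation; the random generation and constant-time comparison are simply generic good practice for one-time codes:

```python
import hmac
import secrets

def generate_sms_code(digits=6):
    """Generate a random numeric one-time code, as might be sent by SMS."""
    return f"{secrets.randbelow(10 ** digits):0{digits}d}"

def verify(entered, expected):
    """Constant-time comparison to avoid timing side channels."""
    return hmac.compare_digest(entered, expected)

code = generate_sms_code()
print(len(code))           # 6
print(verify(code, code))  # True: the correct code is accepted
print(verify("x" * 6, code))  # False: a wrong code is rejected
```

In the proposed scheme this check would run only after the PIN has already been validated, so an attacker needs both the card/PIN and the customer’s phone.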

Keywords: PIN, inquiry, biometric, magnetic strip, iris recognition, face recognition

Procedia PDF Downloads 342
1868 Skew Cyclic Codes over Fq+uFq+…+u^(k-1)Fq

Authors: Jing Li, Xiuli Li

Abstract:

This paper studies a special class of linear codes, called skew cyclic codes, over the ring R = Fq + uFq + … + u^(k-1)Fq, where q is a prime power. A Gray map ɸ from R to Fq^k and a Gray map ɸ' from R^n to Fq^(kn) are defined, as well as an automorphism Θ over R. It is proved that the images of skew cyclic codes over R under ɸ' and Θ are cyclic codes over Fq, and that they still preserve the dual relation.
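For the simplest case k = 2 (R = Fq + uFq with u² = 0), one Gray map used in the literature on such rings sends a + ub to (b, a + b), extended componentwise to codewords. The sketch below applies a map of this shape over Fq; it illustrates the kind of map meant and is not necessarily the exact ɸ defined in this paper:

```python
def gray_map(codeword, q=2):
    """Map a codeword over R = Fq + u*Fq (elements given as (a, b) pairs
    representing a + u*b) to a length-2n vector over Fq, via
    a + u*b -> (b, a + b), applied componentwise."""
    image = []
    for a, b in codeword:
        image.extend([b % q, (a + b) % q])
    return image

# Over F2: the element 1 + u maps to (1, 0); the element u maps to (1, 1).
print(gray_map([(1, 1), (0, 1)]))  # [1, 0, 1, 1]
```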

Keywords: skew cyclic code, gray map, automorphism, cyclic code

Procedia PDF Downloads 275
1867 Genre Analysis and Interview: Body Paragraphs of Student English Academic Essays

Authors: Chek Kim Loi

Abstract:

This study examines the body paragraphs of English academic essays written by ESL (English as a Second Language) undergraduate students. These students took an English for Academic Purposes (EAP) course for one semester at a public university in Malaysia. In addition to analyzing the communicative purposes employed in the sample, and for triangulation of the data, student participants were interviewed about their academic writing experience in the EAP classroom. The present study has pedagogical implications for the EAP classroom.

Keywords: academic writing, body paragraphs, communicative purposes, pedagogical implications

Procedia PDF Downloads 232
1866 Adult Learners’ Code-Switching in the EFL Classroom: An Analysis of Frequency and Type of Code-Switching

Authors: Elizabeth Patricia Beck

Abstract:

Stepping into various English-as-a-foreign-language classrooms, one will see some fundamental similarities. There will likely be groups of students working collaboratively, possibly sitting at tables together. They will be using a set coursebook or photocopies of materials developed by publishers or the teacher. The teacher will be carefully monitoring students’ behaviour and progress. The teacher will also likely be insisting that the students speak only English together, possibly having implemented complex penalty and reward systems to encourage this. This is communicative language teaching, and it is commonly how foreign languages are taught around the world. Recently, there has been much interest in the code-switching behaviour of learners in foreign or second language classrooms. It is a significant topic as it relates to second language acquisition theory, language teaching training and policy, and student expectations and classroom practice. Generally, in an English as a foreign language context, an ‘English only’ policy is the norm. This is based on historical factors, socio-political influences and theories of language learning. The trend, however, is shifting and, based on these same factors, a re-examination of language use in the foreign language classroom is taking place. This paper reports the findings of an examination of the code-switching behaviour of learners with a shared native language in an English classroom. Specifically, it addresses the question of classroom code-switching by adult learners in the EFL classroom during student-to-student spoken interaction. Three generic categories of code-switching are proposed based on published research and classroom practice. Italian adult learners at three levels were observed, and patterns of language use were identified, recorded and analysed using the proposed categories.
After the observations were completed, a questionnaire was distributed to the students focussing on attitudes and opinions around language choice in the EFL classroom, specifically the usefulness of the L1 for specific functions in the classroom. The paper then investigates the relationship between learners’ foreign language proficiency and the frequency and type of code-switching that they engaged in, and the relationship between learners’ attitudes to classroom code-switching and their behaviour. Results show that code-switching patterns changed as the students’ English language proficiency improved, and that students’ attitudes towards code-switching generally correlated with their behaviour, with some exceptions. Finally, the discussion focusses on the details of the language produced in observation, possible factors that may affect the frequency and type of code-switching that took place, and additional factors that may affect students’ attitudes towards code-switching in the foreign language classroom. An evaluation of the limitations of this study is offered, and some suggestions are made for future research in this field.

Keywords: code-switching, EFL, second language acquisition, adult learners

Procedia PDF Downloads 253
1865 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining in importance as a tool for simulating turbulent combustion processes, although this approach has a high computational cost due to the complexity of turbulence modelling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a big domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes impossible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
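The interface-reconstruction idea can be pictured with a much simpler stand-in: plain Lagrange interpolation of a coarse-grid stencil evaluated at a refined-grid interface. The paper’s actual scheme is a least-squares quasi-ENO reconstruction, which additionally selects and weights stencils to avoid oscillations; the sketch below only shows the basic reconstruction step:

```python
def lagrange_eval(xs, fs, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, fs) at x."""
    total = 0.0
    for i, (xi, fi) in enumerate(zip(xs, fs)):
        basis = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                basis *= (x - xj) / (xi - xj)
        total += fi * basis
    return total

# Reconstruct a value at a cell interface x = 0.5 from three coarse nodes.
# A 3-point stencil reproduces any quadratic exactly (here f = x**2).
xs, fs = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
print(lagrange_eval(xs, fs, 0.5))  # 0.25
```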

Keywords: LES, multi-resolution, ENO, Fortran

Procedia PDF Downloads 343
1864 Interlingual Interference in Students’ Writing

Authors: Zakaria Khatraoui

Abstract:

Interlanguage has come to play a central role across a considerable research landscape. Whether academically driven or pedagogically oriented, interlanguage is now more important than ever: it probes theoretical and linguistic issues in the field and moves from idea to reality, vindicating a bridging philosophy between theory and educational practice. Characteristically, the present research offers a developed theoretical framework that is sustained by empirical teaching practices. The focus of this interlingual study is placed squarely on syntactic errors in students’ writing as performance. To attain this end, the paper adopts a qualitative methodology with a set of focal methodological choices supported by a solid design. The central finding is the creative nature of syntactic errors, marked by a tangible dominance of cognitively intralingual errors over linguistically interlingual ones. Subsequently, the paper attempts to highlight transferable implications of both theoretical and pedagogically professional value. In particular, the results are relevant to the scholarly community in a multidimensional sense and recommend actions of educational value.

Keywords: interlanguage, interference, error, writing

Procedia PDF Downloads 44
1863 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics

Authors: Hongliang Zhang

Abstract:

The use of chance procedures and randomizers in poetry writing can be traced back to surrealist works which, by appealing to Sigmund Freud’s theories, were still logocentric. In the 1960s, random permutation and combination were extensively used by the Oulipo, John Cage and Jackson Mac Low, which further deconstructed the metaphysical presence of writing. Today, randomly-generated digital poetry has emerged as a genre of cybertext which must be co-authored by readers. At the same time, the classical theories have been updated by cybernetics and media theories: N. Katherine Hayles reworked Jacques Lacan’s concept of ‘floating signifiers’ into ‘flickering signifiers’, arguing that technology per se has become a part of textual production. This paper makes a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly-generated digital poetry, which hands over the dual tasks of interpretation and writing to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but also raised the issue of the human-machine relationship. It is also a significant feature of cybertext that the productive process of the text is full of randomness.
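A minimal template-based generator shows the mechanism under discussion: the program, not the author, fixes the final arrangement of signs. The word lists below are arbitrary placeholders, far simpler than Oulipo-style procedures or actual poetry generators:

```python
import random

# Placeholder word lists; a real generator would draw on a much larger lexicon.
ADJECTIVES = ["flickering", "silent", "random", "electric"]
NOUNS = ["signifier", "machine", "reader", "text"]
VERBS = ["drifts", "recombines", "dissolves", "speaks"]

def generate_poem(lines=3, seed=None):
    """Produce a short poem by random choice from fixed word slots."""
    rng = random.Random(seed)
    return "\n".join(
        f"the {rng.choice(ADJECTIVES)} {rng.choice(NOUNS)} {rng.choice(VERBS)}"
        for _ in range(lines)
    )

poem = generate_poem(lines=3, seed=42)
print(poem)  # three lines; a different seed yields a different poem
```

Every run with a new seed produces a new text, which is exactly why such works hand part of the authorship to the reader (or to the machine).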

Keywords: cybertext, digital poetry, poetry generator, semiotics

Procedia PDF Downloads 157
1862 Hydrodynamic and Sediment Transport Analysis of Computational Fluid Dynamics Designed Flow Regulating Liner (Smart Ditch)

Authors: Saman Mostafazadeh-Fard, Zohrab Samani, Kenneth Suazo

Abstract:

Agricultural ditch liners are used to prevent soil erosion and reduce seepage losses. This paper introduces an approach to validate the computational fluid dynamics (CFD) platform FLOW-3D and its use to design a flow-regulating corrugated agricultural ditch liner system (Smart Ditch, SM). Hydrodynamic and sediment transport analyses were performed on the proposed liner flow using FLOW-3D. The code’s hydrodynamic and scour/sediment transport models were calibrated and validated against lab data with accuracies of 94% and 95%, respectively. The code was then used to measure the hydrodynamic parameters of sublayer turbulent intensity, kinetic energy, dissipation, and packed sediment mass normalized with respect to sublayer flow velocity. Sublayer turbulent intensity, kinetic energy, and dissipation in the SM flow were significantly higher than in the CR flow. An alternative corrugated liner was also designed, and its sediment transport was measured and compared to the SM and CR flows. Normalized packed sediment mass with respect to average sublayer flow velocity was 27.8% lower in the alternative flow than in the SM flow. The CFD platform FLOW-3D can effectively be used to design corrugated ditch liner systems and to perform hydrodynamic and sediment transport analysis under various corrugation designs.

Keywords: CFD, hydrodynamic, sediment transport, ditch, liner design

Procedia PDF Downloads 103
1861 Mapping Feature Models to Code Using a Reference Architecture: A Case Study

Authors: Karam Ignaim, Joao M. Fernandes, Andre L. Ferreira

Abstract:

Mapping the artifacts of a family of similar products developed in an ad hoc manner onto the resulting software product line (SPL) plays a key role in maintaining consistency between requirements and code. This paper presents a feature mapping approach that focuses on tracing the artifact coming from the migration process, the current feature model (FM), to the other artifacts of the resulting SPL: the reference architecture and the code. Thus, our approach relates each feature of the current FM to its locations in the implementation code, using the reference architecture as an intermediate artifact (a central point) to preserve consistency among them during SPL evolution. The approach uses a particular artifact, a traceability tree, as a solution for managing the mapping process. Tool support is provided by friendlyMapper. We have evaluated the feature mapping approach and tool support by putting the approach into practice in a case study from the automotive domain, the Classical Sensor Variants Family at Bosch Car Multimedia S.A. The evaluation reveals that the mapping approach presented in this paper fits the automotive domain.
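The traceability-tree artifact can be pictured as a nested mapping from features, through reference-architecture components, down to code locations. The structure and every name below are hypothetical illustrations, not Bosch’s actual model or the friendlyMapper data format:

```python
# Hypothetical traceability tree: feature -> architecture component -> code files.
traceability_tree = {
    "SpeedSensing": {
        "SignalAcquisition": ["src/adc_driver.c", "src/filter.c"],
        "Diagnostics": ["src/selftest.c"],
    },
    "TemperatureSensing": {
        "SignalAcquisition": ["src/adc_driver.c"],
    },
}

def code_locations(feature):
    """All code files reachable from a feature via the reference architecture."""
    files = set()
    for component_files in traceability_tree.get(feature, {}).values():
        files.update(component_files)
    return sorted(files)

print(code_locations("SpeedSensing"))
# ['src/adc_driver.c', 'src/filter.c', 'src/selftest.c']
```

Routing every feature-to-code link through the architecture layer is what lets the mapping survive evolution: when a file moves, only the component-to-file edges change.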

Keywords: feature location, feature models, mapping, software product lines, traceability

Procedia PDF Downloads 98
1860 Computational Study of Flow and Heat Transfer Characteristics of an Incompressible Fluid in a Channel Using Lattice Boltzmann Method

Authors: Imdat Taymaz, Erman Aslan, Kemal Cakir

Abstract:

The Lattice Boltzmann Method (LBM) is used to computationally investigate the laminar flow and heat transfer of an incompressible fluid with constant material properties in a 2D channel with a built-in triangular prism. Both momentum and energy transport are modelled by the LBM. A uniform lattice structure with a single-time-relaxation rule is used. Interpolation methods are applied to obtain higher flexibility on the computational grid, where the information is transferred from the lattice structure to the computational grid by Lagrange interpolation. The flow is studied for different Reynolds numbers, while the Prandtl number is kept constant at 0.7. The results show how the presence of a triangular prism affects the flow and heat transfer patterns for the steady-state and unsteady-periodic flow regimes. As an evaluation of the accuracy of the developed LBM code, the results are compared with those obtained by a commercial CFD code. It is observed that the present LBM code produces results of similar accuracy to the well-established CFD code; additionally, the LBM needs much less CPU time for the prediction of the unsteady phenomena.
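The core ingredient of such a solver is the discrete equilibrium distribution. As a hedged sketch, the standard second-order D2Q9 equilibrium (not the authors’ full interpolated code) can be written down and checked to reproduce the density and momentum moments exactly:

```python
# Standard D2Q9 lattice: weights and discrete velocities.
W = [4 / 9] + [1 / 9] * 4 + [1 / 36] * 4
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]

def equilibrium(rho, ux, uy):
    """Second-order BGK equilibrium distributions f_i^eq for D2Q9."""
    usq = ux * ux + uy * uy
    feq = []
    for w, (cx, cy) in zip(W, C):
        cu = cx * ux + cy * uy
        feq.append(w * rho * (1 + 3 * cu + 4.5 * cu * cu - 1.5 * usq))
    return feq

feq = equilibrium(rho=1.0, ux=0.05, uy=0.02)
density = sum(feq)
momentum_x = sum(f * cx for f, (cx, _) in zip(feq, C))
print(round(density, 12), round(momentum_x, 12))  # 1.0 0.05
```

By construction, summing f_i^eq recovers ρ and summing f_i^eq·c_i recovers ρu, which is what makes the collision step conserve mass and momentum.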

Keywords: laminar forced convection, LBM, triangular prism

Procedia PDF Downloads 354
1859 The First Japanese-Japanese Dictionary for Non-Japanese Using the Defining Vocabulary

Authors: Minoru Moriguchi

Abstract:

This research introduces the concept of a monolingual Japanese dictionary for non-native speakers of Japanese, whose tentative title is Dictionary of Contemporary Japanese for Advanced Learners (DCJAL). As the language market is very small compared with English, a monolingual Japanese dictionary for non-native speakers containing sufficient entries has not yet been published. In such a dictionary environment, Japanese-language learners are using bilingual dictionaries or monolingual Japanese dictionaries intended for Japanese people. This research started in 2017 with a project team consisting of four Japanese and two non-native speakers, all of whom are linguists of the Japanese language. The team has been working to propose the concept of a monolingual dictionary for non-native speakers of Japanese and to provide the entry list, definition samples, the list of defining vocabulary, and the writing manual. As the result of seven years of research, DCJAL now has 28,060 headwords, 539 example entries, a 4,598-word defining vocabulary, and a writing manual. First, the number of entries was set at about 30,000, based on an experimental method using six existing dictionaries. To build an entry list of this size, words suitable for DCJAL were extracted from the Tsukuba corpus of the Japanese language, and the list was later adjusted based on the authors' experience as Japanese instructors. Among the headwords of the entry list, 539 words were selected and annotated with lexicographical information such as proficiency level, pronunciation, writing system (hiragana, katakana, kanji, or alphabet), definition, example sentences, idiomatic expressions, synonyms, antonyms, grammatical information, sociolinguistic information, and etymology. While writing the definitions of these 539 words, the defining vocabulary list was constructed, based on frequent vocabulary used in Japanese monolingual dictionaries.
Although the concept of DCJAL has been almost perfected, it may need some further adjustment, and the research continues.

Keywords: monolingual dictionary, the Japanese language, non-native speaker of Japanese, defining vocabulary

Procedia PDF Downloads 23
1858 Ambient Vibration Testing of Existing Buildings in Madinah

Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail

Abstract:

The elastic period has a primary role in the seismic assessment of buildings. Reliable calculations and/or estimates of the fundamental frequency of a building and its site are essential during the analysis and design process. Various code formulas based on empirical data are generally used to estimate the fundamental frequency of a structure. For existing structures, in addition to code formulas and available analytical tools such as modal analysis, various testing methods, including ambient and forced vibration testing procedures, may be used to determine dynamic characteristics. In this study, the dynamic properties of 32 buildings located in Madinah, Saudi Arabia were identified using ambient motions recorded at several spatially distributed locations within each building. The ambient vibration measurements have been analyzed, and the fundamental longitudinal and transverse periods for all tested buildings are presented. The fundamental mode of vibration has been compared in plots with code formulae (Saudi Building Code, EC8, and UBC 1997). The results indicate that the measured periods of existing buildings are shorter than those given by most empirical code formulas. Recommendations are given based on common design and construction practice in Madinah city.
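The empirical code formulas mentioned above typically take the form T = Ct * H^(3/4), with H the building height in metres. A hedged sketch of the comparison follows; the Ct values below are the commonly cited SI coefficients for RC moment frames and should be verified against the exact code editions used in the study.

```python
# Hedged sketch: empirical code formulas of the form T = Ct * H**0.75
# for the fundamental period of RC moment-frame buildings (H in metres).
# Coefficients are the commonly cited SI values and are assumptions here;
# check them against the specific code editions before relying on them.

def fundamental_period(height_m, ct):
    return ct * height_m ** 0.75

CT = {"EC8": 0.075, "UBC97": 0.0731}   # RC moment frames, SI units

for code, ct in CT.items():
    print(code, round(fundamental_period(30.0, ct), 3), "s")
```

Plotting measured ambient-vibration periods against such curves is how the mismatch reported in the abstract (measured periods shorter than empirical ones is the usual claim reversed; here, the opposite) becomes visible.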

Keywords: ambient vibration, fundamental period, RC buildings, infill walls

Procedia PDF Downloads 243
1857 Teaching Academic Writing for Publication: A Liminal Threshold Experience Towards Development of Scholarly Identity

Authors: Belinda du Plooy, Ruth Albertyn, Christel Troskie-De Bruin, Ella Belcher

Abstract:

In the academy, scholarliness or intellectual craftsmanship is considered the highest level of achievement, culminating in consistent, successful publication in impactful, peer-reviewed journals and books. Scholarliness implies rigorous methods, systematic exposition, in-depth analysis and evaluation, and the highest level of critical engagement and reflexivity. However, being a scholar does not happen automatically when one becomes an academic or completes graduate studies. A graduate qualification indicates one's level of research competence but does not necessarily prepare one for the type of scholarly writing for publication required after a postgraduate qualification has been conferred. Scholarly writing for publication requires a high-level skillset and a specific mindset, which must be intentionally developed. The rite of passage to becoming a scholar is an iterative process with liminal spaces, thresholds, transitions, and transformations. The journey from researcher to published author is often fraught with rejection, insecurity, and disappointment and requires resilience and tenacity from those who eventually triumph. It cannot be achieved without support, guidance, and mentorship. In this article, the authors use collective auto-ethnography (CAE) to describe the phases and types of liminality encountered during the liminal journey toward scholarship. The authors speak as long-time facilitators of Writing for Academic Publication (WfAP) capacity development events (training workshops and writing retreats) presented at South African universities. Their WfAP facilitation practice is structured around experiential learning principles that allow them to act as critical reading partners and reflective witnesses for the writer-participants of their WfAP events.
They identify three essential facilitation features for effectively holding a generative, liminal, and transformational writing space for novice academic writers, enabling their safe passage through the various liminal spaces they encounter during their scholarly development journey. These features are that facilitators should be agents of disruption and liminality while also guiding writers through these liminal spaces; that there should be a sense of mutual trust and respect, and shared responsibility and accountability, for writers to produce publication-worthy scholarly work; and that this can only be accomplished with the continued application of high levels of sensitivity and discernment by WfAP facilitators. These are key features of successful WfAP scholarship training events, where focused, individual input triggers personal and professional transformational experiences, which in turn translate into high-quality scholarly outputs.

Keywords: academic writing, liminality, scholarship, scholarliness, threshold experience, writing for publication

Procedia PDF Downloads 32
1856 Web Quest as the Tool for Business Writing Skills Enhancement at Technical University EFL Classes

Authors: Nadezda Kobzeva

Abstract:

Under the current trends of globalization and economic and technological dynamics, information and the means by which it is delivered and renewed become outdated rapidly. Thus, educational systems, including higher education, are being seriously tested. Developing new strategies supported by Information and Communication Technology (ICT) is urgently required. Educators' essential mission is to meet the demands of the future by preparing our young learners with the knowledge, skills, and innovation capabilities necessary to advance our competitiveness globally. In response to modern society and future demands, Tomsk Polytechnic University, the oldest in Siberia, has wisely proposed several initiatives to promote the integration of ICT in education and to increase the competitiveness of graduates by emphasizing inquiry-based learning, higher-order thinking, and problem solving. This paper gives a brief overview of how the Web Quest, as an ICT tool, is used for language teaching and describes the advantages of its use for teaching English as a Foreign Language (EFL), in particular business writing skills. This study proposes using Web Quests to promote higher-order thinking and ICT integration in the training of engineers at Tomsk Polytechnic University, Russia.

Keywords: web quest, web quest in pedagogy, resume (CVs) and cover letter writing skills, ICT integration

Procedia PDF Downloads 355
1855 Creativity in the Use of Sinhala and English in Advertisements in Sri Lanka: A Morphological Analysis

Authors: Chamindi Dilkushi Senaratne

Abstract:

Sri Lanka has lived with the English language for more than 200 years. Although officially considered a link language, the phenomenal usage of English by Sinhala-English bilinguals has given rise to a mixed code with identifiable structural characteristics. The extensive use of this mixed language by the average Sri Lankan bilingual has resulted in its being used as a medium of communication by the creative writers of bilingual advertisements in Sri Lanka. This study analyses the way in which English is used in bilingual advertisements in both print and electronic media in Sri Lanka. The theoretical framework for the study is based on Kachru's analysis of the use of English by the bilingual, Muysken's typology of code mixing in colonial settings, and Myers-Scotton's Matrix Language Frame model. The study looks at a selection of Sinhala-English advertisements published in newspapers from 2015 to 2016; only advertisements using both Sinhala and English are used for the analysis. To substantiate the data collected from the newspapers, the study also selects bilingual advertisements from television. The objective of the study is to analyze the mixed patterns used for creative purposes by advertisers. The results reveal the creativity of the Sinhala-English bilingual and the morphological processes used by the creators of Sinhala-English bilingual advertisements to attract the masses.

Keywords: bilingual, code mixing, morphological processes, mixed code

Procedia PDF Downloads 260
1854 The Human Rights Code: Fundamental Rights as the Basis of Human-Robot Coexistence

Authors: Gergely G. Karacsony

Abstract:

Fundamental rights are the result of thousands of years of progress in legislation, adjudication, and legal practice. They serve as the framework for the peaceful cohabitation of people, protecting the individual from abuse by the government or violation by other people. Artificial intelligence, however, is a development of the very recent past and one of the most important prospects for the future. Artificial intelligence is now capable of communicating and performing actions the same way as humans; such acts are sometimes impossible to tell from actions performed by flesh-and-blood people. In a world where human-robot interactions are more and more common, a new framework of peaceful cohabitation must be found. Artificial intelligence, able to take part in almost any kind of interaction where personal presence is not necessary without being recognized as a non-human actor, is now able to break the law, violate people's rights, and disturb social peace in many other ways. Therefore, a code of peaceful coexistence must be found or created. We should consider whether human rights can serve as the code of ethical and rightful conduct in the new era of artificial intelligence and human coexistence. In this paper, we examine the applicability of fundamental rights to human-robot interactions as well as to actions performed by artificial intelligence without any human interaction. Robot ethics was a topic of discussion and debate in philosophy, ethics, computing, legal sciences, and science fiction writing long before the first functional artificial intelligence was introduced. Legal science and legislation have approached artificial intelligence from different angles, regulating different areas (e.g., data protection, telecommunications, copyright issues), but they are only chipping away at the mountain of legal issues concerning robotics.
For a widely acceptable and permanent solution, a more general set of rules would be preferable to the detailed regulation of specific issues. We argue that human rights as recognized worldwide can be adapted to serve as a guideline and a common basis for the coexistence of robots and humans. This solution has many virtues: people do not need to adjust to a completely unknown set of standards, the system has proved itself able to withstand the trials of time, legislation is easier, and the actions of non-human entities are more easily adjudicated within their own framework. In this paper, we examine the system of fundamental rights (as defined in the most widely accepted source, the 1966 UN International Covenants on Human Rights) and try to adapt each individual right to the actions of artificial intelligence actors; in each case, we examine the possible effects of such an approach on the legal system and on society, and finally we also examine its effect on the IT industry.

Keywords: human rights, robot ethics, artificial intelligence and law, human-robot interaction

Procedia PDF Downloads 225
1853 The Impact of Blended Learning on Developing the students' Writing Skills and the Perception of Instructors and Students: Hawassa University in Focus

Authors: Mulu G. Gencha, Gebremedhin Simon, Menna Olango

Abstract:

This study was conducted at Hawassa University (HwU) in the Southern Nations, Nationalities and Peoples Regional State (SNNPRS) of Ethiopia. The prime concern of this study was to examine the writing performance of the experimental and control group students and the perceptions of the experimental group students and the subject instructors. The course was delivered through blended learning (BL), a hybrid of classroom and online learning. The participants were eighty students from the School of Computer Science. Forty students attended the BL delivery, which combined face-to-face (FTF) and campus-based online instruction. All fifty instructors of the School of Language and Communication Studies, along with 10 focus group discussion (FGD) members, participated in the study. The experimental group went to the computer lab twice a week for four months (March to June 2012), using the local area network (LAN) and the Moodle software writing program. The control group of forty students took the FTF writing course five times a week over the same four-month academic calendar. Three instruments (an attitude questionnaire, tests, and FGDs) were designed to capture the views of students, instructors, and FGD participants on BL. At the end of the study, students' final course scores were evaluated. Data were analyzed using independent samples t-tests. A statistically significant difference was found between the FTF and BL groups (p < 0.05). The analysis showed that the BL group was more successful than the conventional group. In addition, both instructors and students had positive attitudes towards BL. The final section of the thesis discusses the potential benefits and challenges, considers the pedagogical implications of BL, and recommends possible avenues for further work.
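The comparison described above can be sketched with a pooled-variance independent samples t-test. The scores below are invented for illustration only; the study's actual data are not reproduced here.

```python
import statistics as st

# Hedged sketch of the analysis reported in the abstract: an independent
# samples t-test on final writing scores of the blended-learning (BL)
# and face-to-face (FTF) groups. All score values are hypothetical.

def pooled_t(a, b):
    """Student's t statistic for two independent samples (pooled variance)."""
    na, nb = len(a), len(b)
    va, vb = st.variance(a), st.variance(b)         # sample variances
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (st.mean(a) - st.mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

bl  = [78, 82, 85, 74, 80, 88, 79, 83]   # hypothetical BL scores
ftf = [70, 75, 72, 68, 74, 71, 69, 73]   # hypothetical FTF scores

t = pooled_t(bl, ftf)
print(round(t, 2))  # compare against the t-critical value at df = 14
```

A value of |t| above the critical value at the chosen alpha (0.05 in the study) corresponds to the statistically significant difference the abstract reports.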

Keywords: blended learning, computer attitudes, computer usefulness, computer liking, computer confidence, computer phobia

Procedia PDF Downloads 394
1852 Developing Laser Spot Position Determination and PRF Code Detection with Quadrant Detector

Authors: Mohamed Fathy Heweage, Xiao Wen, Ayman Mokhtar, Ahmed Eldamarawy

Abstract:

In this paper, we are interested in the modeling, simulation, and measurement of the laser spot position with a quadrant detector. We enhance the detection and tracking of a semi-active laser weapon decoding system based on a microcontroller. The system receives the reflected pulse through the quadrant detector and processes the laser pulses through a processing circuit, with a microcontroller decoding the laser pulses reflected by the target. The decoding system enhances the seeker accuracy and reduces the laser detection time, which is based on the number of received pulses; a gate is used to limit the laser pulse width. The model is implemented using the Pulse Repetition Frequency (PRF) technique with two microcontroller units (MCU). MCU1 generates laser pulses with different codes; MCU2 decodes the laser code and locks the system onto the specific code. The codes are selected using the two selector switches. The system is implemented and tested in Proteus ISIS software. The full position determination circuit with the detector was implemented. A general system for spot position determination was realized with the laser PRF for the incident radiation and a mechanical system for adjusting the setup at different angles. The system test results show that the system can detect the laser code with only three received pulses, based on the narrow gate signal, and good agreement between simulated and measured system performance is obtained.
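One plausible reading of the three-pulse lock described above can be sketched as follows: the decoder locks onto a code once consecutive pulse intervals match the expected repetition period within a narrow gate. This is an illustrative model only, not the authors' firmware; all timing values are invented.

```python
# Hedged sketch of PRF (pulse repetition frequency) code detection:
# declare lock once three pulses arrive at the coded repetition period,
# each interval falling inside a narrow gate. Timings are illustrative
# and do not reproduce the microcontroller implementation in the paper.

def prf_lock(timestamps_us, period_us, gate_us, pulses_needed=3):
    """Return True once `pulses_needed` pulses arrive at the coded PRF."""
    matched = 1                       # the first pulse opens the gate
    for prev, cur in zip(timestamps_us, timestamps_us[1:]):
        if abs((cur - prev) - period_us) <= gate_us:
            matched += 1
            if matched >= pulses_needed:
                return True
        else:
            matched = 1               # interval outside the gate: restart
    return False

pulses = [0, 50_000, 100_020, 150_010]   # ~20 Hz code with small jitter (us)
print(prf_lock(pulses, period_us=50_000, gate_us=100))
```

Narrowing the gate rejects pulses from other emitters at the cost of tolerance to timing jitter, which is the trade-off behind the "narrow gate signal" mentioned in the abstract.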

Keywords: four quadrant detector, pulse code detection, laser guided weapons, pulse repetition frequency (PRF), Atmega 32 microcontrollers

Procedia PDF Downloads 361
1851 A Study on the Coefficient of Transforming Relative Lateral Displacement under Linear Analysis of Structure to Its Real Relative Lateral Displacement

Authors: Abtin Farokhipanah

Abstract:

In recent years, the analysis of structures for earthquake effects has been based on ductility design, in contrast to strength design. The ASCE 7-10 code prescribes amplifying the relative drifts calculated from a linear analysis by Cd, the Deflection Amplification Factor, to obtain the real relative drifts, which could otherwise be calculated using nonlinear analysis. This lateral drift should be limited to the code boundaries. The purposes of this research are to calculate this amplification factor for different structures, compare it with the ASCE 7-10 values, and propose the best coefficient. To this end, short and tall steel building structures with various earthquake-resistant systems are surveyed in linear and nonlinear analyses to answer the following questions: 1. Does the Response Modification Coefficient (R) have a meaningful relation to the Deflection Amplification Factor? 2. Do structure height, seismic zone, response spectrum, and similar parameters affect the coefficient converting the linear-analysis drift to the real drift of the structure? The procedure used to conduct this research includes: (a) studying earthquake-resistant systems, (b) selecting and modeling the systems, (c) analyzing the modeled systems using linear and nonlinear methods, (d) calculating the conversion coefficient for each system, and (e) comparing the conversion coefficients with those offered by the code and drawing conclusions.
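The amplification the abstract discusses has a simple form in ASCE 7-10: the design storey drift is the elastic drift from linear analysis scaled by Cd and divided by the importance factor Ie. A hedged sketch follows; the example values (Cd = 5.5, commonly tabulated for special RC moment frames, Ie = 1.0) are assumptions to be verified against the code tables.

```python
# Hedged sketch of the ASCE 7-10 drift amplification examined in the
# study: design drift = Cd * elastic drift / Ie. The Cd and Ie values
# below are illustrative assumptions; consult the code tables for the
# system actually being designed.

def design_drift(elastic_drift, cd, ie=1.0):
    """Amplify an elastic (linear-analysis) drift to the design drift."""
    return cd * elastic_drift / ie

elastic = 0.004          # inter-storey drift ratio from linear analysis
print(round(design_drift(elastic, cd=5.5), 4))  # amplified drift ratio
```

Comparing Cd values back-calculated from pushover analyses against the tabulated ones, as the study does, amounts to asking whether this scaling recovers the drift a nonlinear analysis would predict.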

Keywords: ASCE07-10 code, deflection amplification factor, earthquake engineering, lateral displacement of structures, response modification coefficient

Procedia PDF Downloads 337
1850 Performance Comparison of Deep Convolutional Neural Networks for Binary Classification of Fine-Grained Leaf Images

Authors: Kamal KC, Zhendong Yin, Dasen Li, Zhilu Wu

Abstract:

Intra-plant disease classification based on leaf images is a challenging computer vision task due to similarities in the texture, color, and shape of leaves with only slight variation in leaf spots, and due to external environmental changes such as lighting and background noise. Deep convolutional neural networks (DCNNs) have proven to be effective tools for binary classification. In this paper, two methods for binary classification of diseased plant leaves using DCNNs are presented: models created from scratch and transfer learning. Our main contribution is a thorough evaluation of 4 networks created from scratch and transfer learning of 5 pre-trained models. Training and testing of these models were performed on a plant leaf image dataset belonging to 16 distinct classes, containing a total of 22,265 images from 8 different plants, each consisting of a pair of healthy and diseased leaves. We introduce a deep CNN model, Optimized MobileNet. This model, with depthwise separable convolution as a building block, attained an average test accuracy of 99.77%. We also present a fine-tuning method by introducing the concept of a convolutional block, which is a collection of different deep neural layers. Fine-tuned models proved to be efficient in terms of accuracy and computational cost. Fine-tuned MobileNet achieved an average test accuracy of 99.89% on 8 pairs of [healthy, diseased] leaf image sets.
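The efficiency of the depthwise separable convolution used as MobileNet's building block can be seen from a parameter count: a standard k x k convolution is replaced by a per-channel spatial filter plus a 1x1 pointwise mix. The sketch below illustrates this with assumed channel sizes, not the paper's actual layer configuration.

```python
# Hedged sketch of why depthwise separable convolution (the MobileNet
# building block mentioned above) is cheap. Channel sizes are assumed
# for illustration and do not come from the paper.

def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel + 1x1 pointwise mix."""
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 128
std = standard_conv_params(k, c_in, c_out)
sep = depthwise_separable_params(k, c_in, c_out)
print(std, sep, round(std / sep, 1))  # roughly an 8x parameter saving here
```

The same factorization also cuts multiply-accumulate operations by about the same ratio, which is why such models are attractive when accuracy and computational cost must both be reported.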

Keywords: deep convolution neural network, depthwise separable convolution, fine-grained classification, MobileNet, plant disease, transfer learning

Procedia PDF Downloads 164
1849 A User-Directed Approach to Optimization via Metaprogramming

Authors: Eashan Hatti

Abstract:

In software development, programmers often must choose between high-level programming and high-performance programs. High-level programming encourages the use of complex, pervasive abstractions. However, the use of these abstractions degrades performance: high performance demands that programs be low-level. In a compiler, the optimizer attempts to let the user have both. The optimizer takes high-level, abstract code as input and produces low-level, performant code as output. However, there is a problem with making the optimizer a built-in part of the compiler. Domain-specific abstractions implemented as libraries are common in high-level languages. As a language's library ecosystem grows, so does the number of abstractions that programmers will use. If these abstractions are to be performant, the optimizer must be extended with new optimizations to target them, or these abstractions must rely on existing general-purpose optimizations. The latter is often not as effective as needed. The former demands too great an effort from the compiler developers, as they are the only ones who can extend the language with new optimizations. Thus, the language becomes more high-level, yet the optimizer, and in turn program performance, falls behind. Programmers are again confronted with a choice between high-level programming and high-performance programs. To investigate a potential solution to this problem, we developed Peridot, a prototype programming language. Peridot's main contribution is that it enables library developers to easily extend the language with new optimizations themselves. This takes the optimization workload off the compiler developers' hands and gives it to a much larger set of people who can specialize in each problem domain. Because of this, optimizations can be much more effective while also being much more numerous. To enable this, Peridot supports metaprogramming designed for implementing program transformations.
The language is split into two fragments or "levels": one for metaprogramming, the other for high-level general-purpose programming. The metaprogramming level supports logic programming. Peridot's key idea is that optimizations are simply implemented as metaprograms. The meta level supports several specific features which make it particularly suited to implementing optimizers. For instance, metaprograms can automatically deduce equalities between the programs they are optimizing via unification, deal with variable binding declaratively via higher-order abstract syntax, and avoid the phase-ordering problem via non-determinism. We have found that this design centered around logic programming makes optimizers concise and easy to write compared to their equivalents in functional or imperative languages. Overall, implementing Peridot has shown that its design is a viable solution to the problem of writing code that is both high-level and performant.
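The idea of optimizations as metaprograms can be illustrated in miniature (this is not Peridot itself): a rewrite rule is a pattern with metavariables, matching deduces bindings (a tiny stand-in for the unification described above), and the rule fires wherever it applies.

```python
# Hedged illustration, not Peridot: an optimization written as a
# metaprogram over syntax trees. Patterns contain metavariables ("?x");
# matching binds them, and a bound rule rewrites the matched subterm.
# Terms are plain tuples like ("+", "a", 0).

def match(pattern, term, env):
    """Return bindings making `pattern` equal `term`, or None."""
    if isinstance(pattern, str) and pattern.startswith("?"):
        env = dict(env)
        env[pattern] = term            # bind the metavariable
        return env
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term):
        for p, t in zip(pattern, term):
            env = match(p, t, env)
            if env is None:
                return None
        return env
    return env if pattern == term else None

def rewrite(term, rules):
    """Bottom-up single-pass rewriting: children first, then this node."""
    if isinstance(term, tuple):
        term = tuple(rewrite(t, rules) for t in term)
    for pat, build in rules:
        env = match(pat, term, {})
        if env is not None:
            return build(env)
    return term

rules = [
    (("*", "?x", 1), lambda e: e["?x"]),   # x * 1  ->  x
    (("+", "?x", 0), lambda e: e["?x"]),   # x + 0  ->  x
]

print(rewrite(("+", ("*", "a", 1), 0), rules))
```

A logic-programming host makes such rules declarative rather than hand-scheduled, which is the conciseness advantage the abstract claims over functional or imperative encodings.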

Keywords: optimization, metaprogramming, logic programming, abstraction

Procedia PDF Downloads 66