Search results for: deceptive features
2903 Biomedical Definition Extraction Using Machine Learning with Synonymous Feature
Authors: Jian Qu, Akira Shimazu
Abstract:
OOV (Out Of Vocabulary) terms are terms that cannot be found in many dictionaries. Although it is possible to translate such OOV terms, the translations do not provide any real information for a user. We present an OOV term definition extraction method that uses information available from the Internet. We use features such as the occurrence of synonyms and location distances. We apply a machine learning method to find the correct definitions for OOV terms. We tested our method on both biomedical-type and name-type OOV terms; our method outperforms existing work with an accuracy of 86.5%.
Keywords: information retrieval, definition retrieval, OOV (out of vocabulary), biomedical information retrieval
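The feature-based ranking this abstract describes can be sketched minimally as follows; everything here (feature names, weights, candidate sentences) is an invented illustration of the general idea, not the paper's actual features or trained model:

```python
# Hypothetical sketch of feature-based definition ranking for an OOV term.
# A hand-set linear scorer stands in for the trained ML model in the paper.

def extract_features(candidate, synonyms, term_position, candidate_position):
    """Two example features: synonym occurrences and location distance."""
    words = [w.strip(".,") for w in candidate.lower().split()]
    synonym_hits = sum(words.count(s.lower()) for s in synonyms)
    distance = abs(candidate_position - term_position)
    return {"synonym_hits": synonym_hits, "distance": distance}

def score(features, weights):
    """Linear score: more synonym hits is better, larger distance is worse."""
    return (weights["synonym_hits"] * features["synonym_hits"]
            + weights["distance"] * features["distance"])

def best_definition(candidates, synonyms, term_position, weights):
    """Return the candidate sentence with the highest score."""
    scored = [
        (score(extract_features(c, synonyms, term_position, i), weights), c)
        for i, c in enumerate(candidates)
    ]
    return max(scored)[1]

candidates = [
    "TNF is mentioned in passing here.",
    "TNF, or tumor necrosis factor, is a signaling cytokine.",
    "Unrelated sentence far from the term.",
]
weights = {"synonym_hits": 2.0, "distance": -0.5}
picked = best_definition(candidates, ["cytokine", "factor"], 0, weights)
print(picked)
```

In the paper itself, the weights would be learned by the classifier rather than set by hand.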
Procedia PDF Downloads 494
2902 Implementing Lesson Study in Qatari Mathematics Classroom: A Case Study of a New Experience for Teachers through IMPULS-QU Lesson Study Program
Authors: Areej Isam Barham
Abstract:
The implementation of the Japanese lesson study approach in the mathematics classroom has grown worldwide as a model of professional development for teachers. In Qatar, the IMPULS-QU lesson study program aimed to establish a robust organizational improvement model of professional development for mathematics teachers in Qatar schools. This study describes the implementation of a lesson study model at Al-Markhyia Independent Primary School through its different stages, and discusses how the planning process, the research lesson, and the post-lesson discussion contribute to providing teachers and researchers with a successful research lesson for teacher professional development. The research followed a case study approach in one mathematics classroom. Two teachers and one professional development specialist participated in the planning process. One teacher conducted the research lesson by introducing a problem-solving task related to the concept of the 'mean' in a mathematics class; 21 students in grade 6 participated in solving the mathematics problem, while 11 teachers, 4 professional development specialists, and 4 mathematics professors observed the research lesson. All the previous participants except the students took part in pre- and post-lesson discussions within this research. The study followed a qualitative research approach, analyzing the data collected through the different stages of the research lesson. Observation, field notes, and semi-structured interviews were conducted to collect data to achieve the research aims. One feature of this research is that it describes the implementation of a lesson study as a new experience for one mathematics teacher and 21 students after 3 years of conducting the IMPULS-QU project in the Al-Markhyia school. The research describes the various stages of the implementation of this lesson study model, starting with the planning process and ending with the post-lesson discussion.
Findings of the study also address the impact of the lesson study approach on the teaching of mathematics and on the development of teachers from their points of view. Results of the study show the benefits of using lesson study from the points of view of the participating teachers, their perceptions of the essential features of lesson study, and their needs for future development. The discussion addresses different features and issues related to the implementation of the IMPULS-QU lesson study model in the mathematics classroom. In light of the findings, the study presents recommendations and suggestions for future professional development.
Keywords: lesson study, mathematics education, mathematics teaching experience, teacher professional development
Procedia PDF Downloads 185
2901 Formal Development of Electronic Identity Card System Using Event-B
Authors: Tomokazu Nagata, Jawid Ahmad Baktash
Abstract:
The goal of this paper is to explore the use of formal methods for an Electronic Identity Card System. Nowadays, one of the core research directions in a constantly growing distributed environment is the improvement of the communication process. The responsibility for proper verification becomes crucial. Formal methods can play an essential role in the development and testing of systems. The thesis presents two different methodologies for assessing correctness. Our first approach employs abstract interpretation techniques for creating a trace-based model of the Electronic Identity Card System. The model was used for building a semi-decidable procedure for verifying the system model. We also developed the code for the eID System, covering three parts: login to the system and sending of an acknowledgment from the user side, receiving of all information from the server side, and logout from the system. The new concepts of impasse and spawned sessions that we introduced led our research to original statements about the intruder's knowledge and eID system coding with respect to secrecy. Furthermore, we demonstrated that there is a bound on the number of sessions needed for the analysis of the system. Electronic identity (eID) cards promise to supply a universal, nation-wide mechanism for user authentication. Most European countries have started to deploy eID for government and private sector applications. Are government-issued electronic ID cards the proper way to authenticate users of online services? We use the eID project as a showcase to discuss eID from an application perspective. The new eID card has interesting design features: it is contactless, it aims to protect people's privacy to the extent possible, and it supports cryptographically strong mutual authentication between users and services. Privacy features include support for pseudonymous authentication and per-service controlled access to individual data items.
The article discusses key concepts, the eID infrastructure, observed and expected problems, and open questions. The core technology seems ready for prime time, and government projects are deploying it to the masses. But application issues may hamper eID adoption for online applications.
Keywords: eID, Event-B, ProB, formal method, message passing
Procedia PDF Downloads 235
2900 Recursion, Merge and Event Sequence: A Bio-Mathematical Perspective
Authors: Noury Bakrim
Abstract:
Formalization is indeed foundational to mathematical linguistics, as demonstrated by the pioneering works. While dialoguing with this frame, we nonetheless propose, in our approach of language as a real object, a mathematical linguistics/biosemiotics defined as a dialectical synthesis between induction and computational deduction. Therefore, relying on the parametric interaction of cycles, rules, and features giving way to a sub-hypothetic biological point of view, we first hypothesize a factorial equation as an explanatory principle within the Category Mathematics of the Ergobrain: our computation proposal of Universal Grammar rules per cycle, or a scalar determination (multiplying right/left columns of the determinant matrix and right/left columns of the logarithmic matrix) of the transformable matrix for rule addition/deletion and cycles within representational mapping/cycle heredity, based on the factorial example, being the logarithmic exponent or power of rule deletion/addition. This enables us to propose an extension of the minimalist merge/label notions to a Language Merge (as a computing principle) within cycle recursion, relying on combinatorial mapping of rule hierarchies onto the external Entax of the Event Sequence.
Therefore, to define combinatorial maps as the language merge of features and combinatorial hierarchical restrictions (governing, commanding, and other rules), we secondly hypothesize from our results feature/hierarchy exponentiation on a graph representation deriving from Gromov's Symbolic Dynamics, where combinatorial vertices from Fe are set to combinatorial vertices of Hie, and edges run from Fe to Hie, such that for every combinatorial group there are restriction maps representing different derivational levels that are subgraphs: the intersection on I defines pullbacks and deletion rules (under restriction maps); then, under disjunction edges H, for the combinatorial map P belonging to the Hie exponentiation by intersection, there are pullbacks and projections that are equal to the restriction maps RM₁ and RM₂. The model will draw on experimental biomathematics as well as structural frames, with a focus on Amazigh and English (cases from phonology/micro-semantics and syntax) and the shift from structure to event (especially the Amazigh formant principle resolving its morphological heterogeneity).
Keywords: rule/cycle addition/deletion, bio-mathematical methodology, general merge calculation, feature exponentiation, combinatorial maps, event sequence
Procedia PDF Downloads 127
2899 Development of 90Y-Chitosan Complex for Radiosynovectomy
Authors: A. Mirzaei, S. Zolghadri, M. Athari-Allaf, H. Yousefnia, A. R. Jalilian
Abstract:
Rheumatoid arthritis is the most common autoimmune disease, leading to the destruction of the joints. The aim of this study was the preparation of a 90Y-chitosan complex as a novel agent for radiosynovectomy. The complex was prepared in diluted acetic acid solution. Under the optimized conditions, a radiochemical purity of higher than 99% was obtained by the ITLC method on Whatman No. 1 paper, using a mixture of methanol/water/acetic acid (4:4:2) as the mobile phase. The complex was stable in acidic media (pH = 3), and its radiochemical purity was above 98% even after 48 hours. The biodistribution data in rats showed that there was no significant leakage of the injected activity even after 48 h. Considering all of these excellent features, the 90Y-chitosan complex can be used to treat synovial inflammation effectively.
Keywords: chitosan, Y-90, radiosynovectomy, biodistribution
Procedia PDF Downloads 483
2898 2L1, a Bridge between L1 and L2
Authors: Elena Ginghina
Abstract:
There are two major categories of language acquisition: first and second language acquisition, which differ in their learning process and in their ultimate attainment. However, in the case of a bilingual child, one of the languages he grows up with gradually takes on the features of a second language. This phenomenon characterizes successive first language acquisition, when the initial state of the child is already marked by another language. Nevertheless, the dominance of the languages can change throughout life if the exposure to language and the quality of the input are better in the 2L1. With regard to the exposure to language and the quality of the input, even in simultaneous bilingualism there are cases where the two languages, although both learned from birth, differ from one another at some point. This paper aims to examine what makes a 2L1 become a second language and under what circumstances an L2 learner can reach a native or near-native speaker level.
Keywords: bilingualism, first language acquisition, native speakers of German, second language acquisition
Procedia PDF Downloads 574
2897 From Shallow Semantic Representation to Deeper One: Verb Decomposition Approach
Authors: Aliaksandr Huminski
Abstract:
Semantic Role Labeling (SRL), a shallow semantic parsing approach, includes recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as 'Who', 'When', 'What', 'Where' in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. The SRL for (1) and (2) will be significantly different. For the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant's feature of being liquid is shared with the verb. This capture amounts to a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL is really too shallow to represent semantic structure. If the key point of semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it assumes by default that two different but semantically identical sentences must have the same semantic structure. Otherwise, we will draw different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested.
Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; Vpr is a primitive verb, i.e., a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then the primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of a Vpr is the verb go, which represents pure movement. In this case, the verb drink can be represented as a man-made movement of liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. This leads to the unification of semantic representation, and the critical flaw of SRL described above is resolved.
Keywords: decomposition, labeling, primitive verbs, semantic roles
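The decomposition above (Vcp = Vpr + P; Vcf = Vpr + F) can be illustrated with a small sketch in which semantically identical sentences normalize to one canonical structure; the lexicon entries are invented for illustration, not taken from the paper:

```python
# Hypothetical sketch of the verb decomposition idea: map surface verbs to a
# primitive verb plus any captured participant/feature, so that semantically
# identical sentences share one canonical role structure.

LEXICON = {
    # Vcp: "butter" captures the Patient participant (the substance butter).
    "butter": {"primitive": "put", "captured": {"Patient": "butter"}},
    # Vcf: "drink" captures a feature of the Patient (being liquid).
    "drink":  {"primitive": "move", "captured": {"PatientFeature": "liquid"}},
    # Vpr: "put" is already primitive and captures nothing.
    "put":    {"primitive": "put", "captured": {}},
}

def canonicalize(verb, roles):
    """Unfold captured participants so the structure uses the primitive verb."""
    entry = LEXICON[verb]
    canonical_roles = dict(roles)
    canonical_roles.update(entry["captured"])
    return {"verb": entry["primitive"], "roles": canonical_roles}

# (1) "John put butter on the bread."  (2) "John buttered the bread."
s1 = canonicalize("put", {"Agent": "John", "Patient": "butter", "Goal": "bread"})
s2 = canonicalize("butter", {"Agent": "John", "Goal": "bread"})
print(s1 == s2)  # prints: True -- the two sentences share one structure
```
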
Procedia PDF Downloads 366
2896 Narratives in Science as Covert Prestige Indicators
Authors: Zinaida Shelkovnikova
Abstract:
The language of science is changing to meet the demands of society. We argue that in the varied modern world there are important reasons for the integration of narratives into scientific discourse. Scientists today face extremely rapid scientific development and progress, and the modern scientific community lives under conditions of tough competition. The integration of narratives into scientific discourse is thus a good way to convey scientific experience to different audiences and to express the covert prestige of the discourse. Narratives also form the identity of the persuasive narrator. Using the narrative approach to scientific discourse analysis, we reveal the sociocultural diversity of scientists. If you want to attract an audience's attention to your scientific research, narratives should be integrated into your scientific discourse. Those who understand this consistent pattern are considered leading scientists. Given that it is prestigious to be renowned and celebrated in science, writing narratives in science carries covert prestige. We define a science narrative as an intentional, consequent, coherent event discourse, or a discourse fragment, which contains the author's creativity, in some cases intrigue, and gives mostly qualitative information (compared with quantitative data) in order to provide maximum understanding of the research. Science narratives also allow for effective argumentation and consequently construct the identity of the persuasive narrator. However, the skill of creating appropriate scientific discourse reflects the level of prestige. In order to teach postgraduate students to be successful in English scientific writing and to be prestigious in the scientific society, we have defined the science narrative and outlined its main features and characteristics. Narratives contribute to the audience's involvement with the narrator and his or her narration.
In general, the way in which a narrative is performed may result in limited or greater contact with the audience. To achieve this aim, authors use emotional fictional elements; descriptive elements such as adjectives, adverbs, and comparisons; and the author's evaluative elements. Thus, the features of science narrativity are the following: descriptive tools; the author's evaluation; qualitative information exceeding quantitative data; facts taking on event status; understandability; accessibility; creativity; logic; intrigue; aesthetic nature; fiction. To conclude, narratives function as covert prestige indicators in scientific discourse and shape the identity of the persuasive scientist.
Keywords: covert prestige, narrativity, scientific discourse, scientific narrative
Procedia PDF Downloads 399
2895 Ionic Liquids as Substrates for Metal-Organic Framework Synthesis
Authors: Julian Mehler, Marcus Fischer, Martin Hartmann, Peter S. Schulz
Abstract:
During the last two decades, the synthesis of metal-organic frameworks (MOFs) has gained ever-increasing attention. Based on their pore size and shape as well as host-guest interactions, they are of interest for numerous fields related to porous materials, like catalysis and gas separation. Usually, MOF synthesis takes place in an organic solvent between room temperature and approximately 220 °C, with mixtures of polyfunctional organic linker molecules and metal precursors as substrates. Reactions above the boiling point of the solvent, i.e. solvothermal reactions, are run in autoclaves or sealed glass vessels under autogenous pressure. A relatively new approach to the synthesis of MOFs is the so-called ionothermal synthesis route. It applies an ionic liquid as a solvent, which can serve as a structure-directing template and/or a charge-compensating agent in the final coordination polymer structure. Furthermore, this method often allows for less harsh reaction conditions than the solvothermal route. Here, a variation of the ionothermal approach is reported, in which the ionic liquid also serves as the organic linker source. By using 1-ethyl-3-methylimidazolium terephthalates ([EMIM][Hbdc] and [EMIM]₂[bdc]), the one-step synthesis of MIL-53(Al)/boehmite composites with interesting features is possible. The resulting material already forms at moderate temperatures (90-130 °C) and is stabilized in the usually unfavored ht-phase. Additionally, in contrast to already published procedures for MIL-53(Al) synthesis, no further activation at high temperatures is mandatory. A full characterization of this novel composite material is provided, including XRD, solid-state NMR, elemental analysis, and SEM, as well as sorption measurements, and its interesting features are compared to MIL-53(Al) samples produced by the classical solvothermal route. Furthermore, the syntheses of the applied ionic liquids and salts are discussed.
The influence of the degree of ionicity of the linker source [EMIM]x[H(2-x)bdc] on the crystal structure and the achievable synthesis temperature is investigated, giving insight into the role of the ionic liquid during synthesis. Aside from the synthesis of MIL-53 from EMIM terephthalates, the use of the phosphonium cation in this approach is discussed as well. Additionally, the employment of ionic liquids in the preparation of other MOFs is presented briefly. This includes the ZIF-4 framework from the respective imidazolate ionic liquids and chiral camphorate-based frameworks from their imidazolium precursors.
Keywords: ionic liquids, ionothermal synthesis, material synthesis, MIL-53, MOFs
Procedia PDF Downloads 208
2894 A Computer-Aided System for Tooth Shade Matching
Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan
Abstract:
Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was carried out by dentists' visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices. Nowadays, colorimeters, spectrophotometers, spectroradiometers, and digital image analysis systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively, and precisely. However, these devices have noticeable drawbacks. For example, the translucent structure and irregular surfaces of teeth lead to measurement errors with these devices. Also, results acquired by devices with different measurement principles may be inconsistent. So, it is necessary to search for new methods for the dental shade matching process. One computer-aided device, the digital camera, has developed rapidly in recent years. Advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging. This procedure is much cheaper than the use of traditional contact-type color measurement devices. Digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages. Images taken of teeth show the morphology and color texture of the teeth. In recent decades, a method was proposed to compare the color of shade tabs captured by a digital camera using color features. This work showed that visual and computer-aided shade matching systems should be used in combination. However, recent feature extraction techniques are based on shape description and do not use color information.
However, color is mostly experienced as an essential property in depicting and extracting features from objects in the world around us. When local feature descriptors are extended by concatenating a color descriptor with the shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. Because the color descriptor is used in combination with a shape descriptor, it does not need to contain any spatial information, which leads us to use local histograms. This local color histogram method remains reliable under photometric changes, geometrical changes, and variations in image quality. So, color-based local feature extraction methods are used to extract features, and the Scale Invariant Feature Transform (SIFT) descriptor is used for shape description in the proposed method. After the combination of these descriptors, the state-of-the-art descriptor known as Color-SIFT is used in this study. Finally, the image feature vectors obtained from the quantization algorithm are fed to classifiers such as k-Nearest Neighbor (KNN), Naive Bayes, or Support Vector Machines (SVM) to determine the label(s) of the visual object category or the matching. In this study, SVMs are used as classifiers for color determination and shade matching. Finally, the experimental results of this method are compared with other recent studies. It is concluded that the proposed method is a remarkable development in computer-aided tooth shade determination.
Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction
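As a rough illustration of the local color histogram idea, the following sketch matches a tooth region to the closest shade tab by histogram distance; it stands in for, and greatly simplifies, the Color-SIFT + SVM pipeline described above (the pixel values and tab labels are invented):

```python
# Minimal sketch: quantized color histograms plus a nearest-neighbor rule,
# as a stand-in for the Color-SIFT + SVM pipeline in the abstract.
# The "images" are small lists of (R, G, B) pixels invented for illustration.

def color_histogram(pixels, bins=4):
    """Quantize each channel into `bins` levels and count joint occurrences."""
    step = 256 // bins
    hist = {}
    for r, g, b in pixels:
        key = (r // step, g // step, b // step)
        hist[key] = hist.get(key, 0) + 1
    total = float(len(pixels))
    return {k: v / total for k, v in hist.items()}  # normalized histogram

def histogram_distance(h1, h2):
    """L1 distance over the union of histogram bins."""
    keys = set(h1) | set(h2)
    return sum(abs(h1.get(k, 0.0) - h2.get(k, 0.0)) for k in keys)

def match_shade(tooth_pixels, shade_tabs):
    """Return the shade-tab label whose histogram is closest to the tooth."""
    target = color_histogram(tooth_pixels)
    def dist(label):
        return histogram_distance(target, color_histogram(shade_tabs[label]))
    return min(shade_tabs, key=dist)

shade_tabs = {
    "A1": [(230, 225, 210)] * 16,   # lighter tab
    "A3": [(200, 180, 150)] * 16,   # darker tab
}
tooth = [(228, 222, 208)] * 15 + [(201, 182, 152)]
print(match_shade(tooth, shade_tabs))  # prints: A1
```

A real system would replace the hand-set histograms with Color-SIFT descriptors and the nearest-neighbor rule with a trained SVM.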
Procedia PDF Downloads 444
2893 Cracking Mode and Path in Duplex Stainless Steels Failure
Authors: Faraj A. E. Alhegagi, Bassam F. A. Alhajaji
Abstract:
Ductile and brittle fracture are the two main modes of failure of engineering components. Fractures are classified with respect to several characteristics, such as strain to fracture, crystallographic mode (ductile or brittle), mechanism (shear or cleavage), and the appearance of the fracture (granular or transgranular). Cleavage is a brittle fracture mode that involves transcrystalline fracture along specific crystallographic planes and in certain directions. Fracture of duplex stainless steels takes place transgranularly by cleavage of the ferrite phase. On the other hand, ductile fracture occurs after considerable plastic deformation prior to failure and takes place by void nucleation, growth, and coalescence, which provide an easy fracture path. Twinning causes depassivation more readily than slip and appears at stresses lower than the theoretical yield stress. Consequently, damage due to twinning can occur well before that due to slip. Stainless steels are clean materials in which second-phase particles have little influence on the fracture mechanism. Ferrite cleavage and austenite tear-off are the main modes by which duplex stainless steels fail. In this study, the cracking mode and path of duplex stainless steel specimens were investigated. Zeron 100 specimens were heat treated for different times, cooled down, and pulled to failure. The fracture surface was investigated by scanning electron microscopy (SEM), concentrating on the cracking mechanism, path, and origin. Cracking mechanisms were studied for grains identified as either ferrite or austenite according to fracture surface features. Cracks propagating through the two phases, ferrite and austenite, were investigated. Cracks arrested at grain boundaries were studied as well. For specimens aged for 100 h, the ferrite phase was noted to crack by cleavage along well-defined planes, while austenite ridges were clearly observed within the ferrite grains.
Some grains were observed to fail with topographic features that were not clearly identifiable as ferrite cleavage or austenite tearing. Transgranular cracking was observed taking place in the ferrite phase on well-defined planes. No intergranular cracks were observed for the tested material. The austenite phase was observed to serve as a crack bridge and crack arrester.
Keywords: austenite ductile tear-off, cracking mode, ferrite cleavage, stainless steels failure
Procedia PDF Downloads 143
2892 Locus of Control and Sense of Happiness: A Mediating Role of Self-Esteem
Authors: Ivanna Shubina
Abstract:
Background/Objectives and Goals: Recent interest in positive psychology is reflected in the many studies conducted on its basic constructs (e.g., self-esteem and happiness) in interrelation with personality features, social rules, and business and technology development. The purpose of this study is to investigate the mediating role of self-esteem by exploring the relationships between self-esteem and happiness, and between self-esteem and locus of control (LOC). It hypothesizes that self-esteem may be interpreted as a predictor of happiness and a mediator in the establishment of locus of control. The results of many empirical studies have been analyzed in order to collect data for this theoretical study, and some of the analyzed results can be considered arguable or incoherent. However, the majority of results indicate a strong relationship between the three concepts considered: self-esteem, happiness, and locus of control. Methods: In particular, this study addresses the following broad research questions: i) Is self-esteem just an index of global happiness? ii) Is happiness possible or realizable without healthy self-confidence and self-acceptance? iii) To what extent does self-esteem influence the level of happiness? iv) Is high self-esteem a sufficient condition for happiness? v) Is self-esteem a strong predictor of the maintenance of an internal locus of control? vi) Is high self-esteem related to an internal LOC, and low self-esteem to an external LOC? In order to answer these questions, 60 reliable sources have been analyzed, the results of which are discussed in more detail below.
Expected Results/Conclusion/Contribution: It is recognized that the relationship between self-esteem, happiness, and locus of control is complex: an internal LOC contributes to happiness but is not directly related to it; self-esteem is a powerful and important psychological factor in mental health and well-being; the feelings of being worthy and empowered are associated with significant achievements and high self-esteem; strong and appropriate self-esteem (when the discrepancy between the 'ideal' and 'real' self is balanced) is correlated with a more internal LOC (when the individual tends to believe that personal achievements depend on possessed features, vigor, and persistence). Despite the special attention paid to happiness, locus of control, and self-esteem independently, theoretical and empirical equivocations within each literature foreclose many obvious predictions about the nature of their empirical distinction. In terms of theoretical framework, no model has achieved consensus as the ultimate theoretical background for any of the mentioned constructs. To clarify the relationship between self-esteem, happiness, and locus of control, more interdisciplinary studies are needed in order to obtain data on heterogeneous samples from various countries, cultures, and social groups.
Keywords: happiness, locus of control, self-esteem, mediation
Procedia PDF Downloads 245
2891 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer
Authors: Ravinder Bahl, Jamini Sharma
Abstract:
The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves a technique for classification and prediction that recognizes the typical and diagnostically most important test features relating to cervical cancer. The main contributions of the research include predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach.
Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning
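A minimal sketch of a probabilistic classifier of the kind described, here a naive-Bayes-style estimate of recurrence probability; the feature names and toy records below are invented for illustration and are not the paper's clinical features:

```python
# Hedged sketch of a probabilistic (naive Bayes style) recurrence predictor.
# Feature names ("stage", "margin") and training records are hypothetical.

from collections import defaultdict

def train(records):
    """records: list of (features_dict, label). Returns priors and counts."""
    counts = defaultdict(int)
    feature_counts = defaultdict(int)
    for features, label in records:
        counts[label] += 1
        for name, value in features.items():
            feature_counts[(label, name, value)] += 1
    return counts, feature_counts

def prob_recurrence(features, counts, feature_counts):
    """P(recurrence | features) with add-one smoothing, via Bayes' rule."""
    total = sum(counts.values())
    scores = {}
    for label in counts:
        score = counts[label] / total  # class prior
        for name, value in features.items():
            score *= (feature_counts[(label, name, value)] + 1) / (counts[label] + 2)
        scores[label] = score
    return scores["recurrence"] / (scores["recurrence"] + scores["no_recurrence"])

records = [
    ({"stage": "II", "margin": "positive"}, "recurrence"),
    ({"stage": "II", "margin": "positive"}, "recurrence"),
    ({"stage": "I", "margin": "negative"}, "no_recurrence"),
    ({"stage": "I", "margin": "negative"}, "no_recurrence"),
]
counts, fc = train(records)
p = prob_recurrence({"stage": "II", "margin": "positive"}, counts, fc)
print(round(p, 3))  # prints: 0.9
```
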
Procedia PDF Downloads 360
2890 Conversational Assistive Technology of Visually Impaired Person for Social Interaction
Authors: Komal Ghafoor, Tauqir Ahmad, Murtaza Hanif, Hira Zaheer
Abstract:
Assistive technology has been developed to support visually impaired people in their social interactions. Conversation assistive technology is designed to enhance communication skills, facilitate social interaction, and improve the quality of life of visually impaired individuals. This technology includes speech recognition, text-to-speech features, and other communication devices that enable users to communicate with others in real time. The technology uses natural language processing and machine learning algorithms to analyze spoken language and provide appropriate responses. It also includes features such as voice commands and audio feedback to provide users with a more immersive experience. These technologies have been shown to increase the confidence and independence of visually impaired individuals in social situations and have the potential to improve their social skills and relationships with others. Overall, conversation-assistive technology is a promising tool for empowering visually impaired people and improving their social interactions. One of the key benefits of conversation-assistive technology is that it allows visually impaired individuals to overcome communication barriers that they may face in social situations. It can help them to communicate more effectively with friends, family, and colleagues, as well as strangers in public spaces. By providing a more seamless and natural way to communicate, this technology can help to reduce feelings of isolation and improve overall quality of life. The main objective of this research is to give blind users the capability to move around in unfamiliar environments through a user-friendly device with a face, object, and activity recognition system. This model evaluates the accuracy of activity recognition. The device captures the front view of the blind user, detects objects, recognizes activities, and answers the user's queries. It is implemented using a front-facing camera.
A local dataset was collected that includes different first-person human activities. The results obtained are the identification of the activities that the VGG-16 model was trained on, including Hugging, Shaking Hands, Talking, Walking, and Waving.
Keywords: dataset, visually impaired person, natural language processing, human activity recognition
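The abstract does not detail how per-frame VGG-16 predictions are turned into a video-level activity label; a minimal sketch of one common aggregation step, majority voting over frame predictions, is shown below (the activity names and the `video_activity` helper are illustrative assumptions, not from the paper):

```python
from collections import Counter

def video_activity(frame_predictions):
    """Aggregate per-frame class predictions into one video-level label
    by majority vote (ties break toward the earliest-seen label)."""
    counts = Counter(frame_predictions)
    return counts.most_common(1)[0][0]

# Hypothetical per-frame outputs from a frame classifier such as VGG-16
frames = ["Walking", "Walking", "Waving", "Walking", "Talking"]
print(video_activity(frames))  # Walking
```

The same vote can be taken over any window of frames to produce a running activity label for the device's audio feedback.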
Procedia PDF Downloads 58
2889 The Use of Corpora in Improving Modal Verb Treatment in English as Foreign Language Textbooks
Authors: Lexi Li, Vanessa H. K. Pang
Abstract:
This study aims to demonstrate how native and learner corpora can be used to enhance modal verb treatment in EFL textbooks in mainland China. It contributes to a corpus-informed and learner-centered design of grammar presentation in EFL textbooks that enhances the authenticity and appropriateness of textbook language for target learners. The linguistic focus is will, would, can, could, may, might, shall, should, must. The native corpus is the spoken component of BNC2014 (hereafter BNCS2014). The spoken part is chosen because the pedagogical purpose of the textbooks is communication-oriented. Using the standard query option of CQPweb, 5% of each of the nine modals was sampled from BNCS2014. The learner corpus is the POS-tagged Ten-thousand English Compositions of Chinese Learners (TECCL). All the essays under the 'secondary school' section were selected. A series of five secondary coursebooks comprises the textbook corpus. All the data in both the learner and the textbook corpora are retrieved through the concordance functions of WordSmith Tools (version 5.0). Data analysis was divided into two parts. The first part compared the patterns of modal verbs in the textbook corpus and BNCS2014 with respect to distributional features, semantic functions, and co-occurring constructions to examine whether the textbooks reflect the authentic use of English. Secondly, the learner corpus was analyzed in terms of the use (distributional features, semantic functions, and co-occurring constructions) and the misuse (syntactic errors, e.g., she can sings*.) of the nine modal verbs to uncover potential difficulties that confront learners. The analysis of distribution indicates several discrepancies between the textbook corpus and BNCS2014. The first four most frequent modal verbs in BNCS2014 are can, would, will, could, while can, will, should, could are the top four in the textbooks. Most strikingly, there is an unusually high proportion of can (41.1%) in the textbooks.
The results on different meanings show that will, would and must are the most problematic. For example, for will, the textbooks contain 20% more occurrences of 'volition' and 20% fewer of 'prediction' than those in BNCS2014. Regarding co-occurring structures, the textbooks over-represented the structure 'modal + do' across the nine modal verbs. Another major finding is that the structure 'modal + have done', which frequently co-occurs with could, would, should, and must, is underused in textbooks. Besides, these four modal verbs are the most difficult for learners, as the error analysis shows. This study demonstrates how the synergy of native and learner corpora can be harnessed to improve EFL textbook presentation of modal verbs in a way that textbooks can provide not only authentic language used in natural discourse but also appropriate design tailored to the needs of target learners.
Keywords: English as Foreign Language, EFL textbooks, learner corpus, modal verbs, native corpus
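The distributional comparison described above amounts to a frequency profile over the nine modal verbs in each corpus; the tokenizer, sample text, and `modal_profile` helper below are illustrative stand-ins for the CQPweb/WordSmith workflow used in the study:

```python
from collections import Counter
import re

MODALS = ["will", "would", "can", "could", "may", "might", "shall", "should", "must"]

def modal_profile(text):
    """Return each modal's share of all modal-verb occurrences in `text`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t in MODALS)
    total = sum(counts.values()) or 1
    return {m: counts[m] / total for m in MODALS}

# Toy textbook excerpt (invented), not TECCL or BNCS2014 data
textbook = "You can do it. You can try. We will go. She should rest."
profile = modal_profile(textbook)
print(profile["can"])  # 0.5
```

Comparing two such profiles (textbook vs. reference corpus) surfaces over-representation of the kind the study reports for can.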
Procedia PDF Downloads 142
2888 Feature Extraction and Classification Based on the Bayes Test for Minimum Error
Authors: Nasar Aldian Ambark Shashoa
Abstract:
Classification with a dimension reduction based on a Bayesian approach is proposed in this paper. The first step is to generate a sample (parameter) of the fault-free mode class and the faulty mode class. Second, in order to obtain good classification performance, a selection of important features is done with the discrete Karhunen-Loève expansion. Next, the Bayes test for minimum error is used to classify the classes. Finally, the results for simulated data demonstrate the capabilities of the proposed procedure.
Keywords: analytical redundancy, fault detection, feature extraction, Bayesian approach
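A minimal numpy sketch of the described pipeline: synthetic fault-free and faulty samples, a discrete Karhunen-Loève (PCA-style) projection onto the leading eigenvectors, and a Bayes minimum-error decision under equal priors. All data, dimensions, and class statistics here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic samples: fault-free class around 0, faulty class around 3
X0 = rng.normal(0.0, 1.0, size=(200, 5))
X1 = rng.normal(3.0, 1.0, size=(200, 5))
X = np.vstack([X0, X1])
mu = X.mean(axis=0)

# Discrete Karhunen-Loeve expansion: keep the two leading eigenvectors
# of the pooled covariance matrix (equivalent to PCA here)
cov = np.cov(X - mu, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
W = eigvecs[:, ::-1][:, :2]
Z0, Z1 = (X0 - mu) @ W, (X1 - mu) @ W

def log_gaussian(z, mean, c):
    """Log-likelihood up to a shared constant (dimensions match, so it cancels)."""
    d = z - mean
    return -0.5 * (d @ np.linalg.inv(c) @ d + np.log(np.linalg.det(c)))

m0, c0 = Z0.mean(axis=0), np.cov(Z0, rowvar=False)
m1, c1 = Z1.mean(axis=0), np.cov(Z1, rowvar=False)

def classify(x):
    z = (x - mu) @ W
    # Bayes test for minimum error, equal priors: pick the larger likelihood
    return 0 if log_gaussian(z, m0, c0) > log_gaussian(z, m1, c1) else 1

print(classify(np.zeros(5)), classify(np.full(5, 3.0)))  # 0 1
```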
Procedia PDF Downloads 527
2887 Evaluating Global ‘Thing’ Security of Consumer Products
Authors: Achutha Raman
Abstract:
Today's brave new world features a bonanza of digitally interconnected products, or ‘things,’ that improve convenience, possibilities, and in some cases efficiency for consumers. Nonetheless, even as the market accelerates, this Internet of ‘things’ is subject to substantial leakage of consumer personal data. First defining the fluid concept of ‘things,’ this paper subsequently uses case studies taken from the EU, Asia, and the US, to highlight large gaps and comprehensively evaluate the state of security for consumer ‘things.’ Ultimately, this paper offers several ways of improving the present status quo, and especially focuses on an evaluative approach that augments the standard mechanism of Firmware Over the Air Updates, and ought to be easily implementable.
Keywords: cybersecurity, FOTA, Internet of Things, transnational privacy
Procedia PDF Downloads 218
2886 Efficient Residual Road Condition Segmentation Network Based on Reconstructed Images
Authors: Xiang Shijie, Zhou Dong, Tian Dan
Abstract:
This paper focuses on the application of real-time semantic segmentation technology in complex road condition recognition, aiming to address the critical issue of how to improve segmentation accuracy while ensuring real-time performance. Semantic segmentation technology has broad application prospects in fields such as autonomous vehicle navigation and remote sensing image recognition. However, current real-time semantic segmentation networks face significant technical challenges and optimization gaps in balancing speed and accuracy. To tackle this problem, this paper conducts an in-depth study and proposes an innovative Guided Image Reconstruction Module. By resampling high-resolution images into a set of low-resolution images, this module effectively reduces computational complexity, allowing the network to more efficiently extract features within limited resources, thereby improving the performance of real-time segmentation tasks. In addition, a dual-branch network structure is designed in this paper to fully leverage the advantages of different feature layers. A novel Hybrid Attention Mechanism is also introduced, which can dynamically capture multi-scale contextual information and effectively enhance the focus on important features, thus improving the segmentation accuracy of the network in complex road conditions. Compared with traditional methods, the proposed model achieves a better balance between accuracy and real-time performance and demonstrates competitive results in road condition segmentation tasks, showcasing its superiority. Experimental results show that this method not only significantly improves segmentation accuracy while maintaining real-time performance, but also remains stable across diverse and complex road conditions, making it highly applicable in practical scenarios.
By incorporating the Guided Image Reconstruction Module, dual-branch structure, and Hybrid Attention Mechanism, this paper presents a novel approach to real-time semantic segmentation tasks, which is expected to further advance the development of this field.
Keywords: hybrid attention mechanism, image reconstruction, real-time, road status recognition
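The paper does not publish its resampling code; one common way to rearrange a high-resolution image into a set of low-resolution images, as the Guided Image Reconstruction Module is described as doing, is a space-to-depth rearrangement, sketched here in numpy as an assumed stand-in:

```python
import numpy as np

def space_to_depth(img, r):
    """Resample one high-resolution image (H, W) into r*r low-resolution
    images of shape (H/r, W/r); each keeps one pixel from every r x r block."""
    h, w = img.shape
    return (img.reshape(h // r, r, w // r, r)
               .transpose(1, 3, 0, 2)
               .reshape(r * r, h // r, w // r))

img = np.arange(16).reshape(4, 4)
low = space_to_depth(img, 2)
print(low.shape)  # (4, 2, 2)
print(low[0])     # [[ 0  2] [ 8 10]]
```

No pixel is discarded, so the set of low-resolution images preserves all information while each branch processes a quarter of the spatial resolution.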
Procedia PDF Downloads 23
2885 Identifying Temporary Housing Main Vertexes through Assessing Post-Disaster Recovery Programs
Authors: S. M. Amin Hosseini, Oriol Pons, Carmen Mendoza Arroyo, Albert de la Fuente
Abstract:
In the aftermath of a natural disaster, the major challenge most cities and societies face, regardless of their diverse level of prosperity, is to provide temporary housing (TH) for the displaced population (DP). However, the features of TH, which have been applied in previous recovery programs, greatly varied from case to case. This situation demonstrates that providing temporary accommodation for DP in a short period of time and usually in great numbers is complicated in terms of satisfying all the beneficiaries' needs, regardless of the societies' welfare levels. Furthermore, when previously used strategies are applied to different areas, the chosen strategies are most likely destined to fail, unless the strategies are context and culturally based. Therefore, as the populations of disaster-prone cities are increasing, decision-makers need a platform to help determine all the factors which caused the outcomes of the prior programs. To this end, this paper aims to assess the problems, requirements, limitations, potential responses, chosen strategies, and their outcomes, in order to determine the main elements that have influenced the TH process. In this regard, and in order to determine a customizable strategy, this study analyses the TH programs of five different cases: the Marmara earthquake, 1999; the Bam earthquake, 2003; the Aceh earthquake and tsunami, 2004; Hurricane Katrina, 2005; and the L'Aquila earthquake, 2009. The research results demonstrate that the main vertexes of TH are: (1) local characteristics, including local potential and affected population features, (2) TH properties, which need to be considered in four phases: planning, provision/construction, operation, and second life, and (3) natural hazard impacts, which embrace intensity and type. Accordingly, this study offers decision-makers the opportunity to discover the main vertexes, their subsets, interactions, and the relation between strategies and outcomes based on the local conditions of each case.
Consequently, authorities may acquire the capability to design a customizable method in the face of complicated post-disaster housing in the wake of future natural disasters.
Keywords: post-disaster temporary accommodation, urban resilience, natural disaster, local characteristic
Procedia PDF Downloads 243
2884 Assessment the Quality of Telecommunication Services by Fuzzy Inferences System
Authors: Oktay Nusratov, Ramin Rzaev, Aydin Goyushov
Abstract:
A fuzzy-inference-based approach to forming a modular intelligent system for assessing the quality of communication services is proposed. The basic fuzzy estimation model developed under this approach takes into account the recommendations of the International Telecommunication Union in respect of the operation of packet-switching networks based on the IP protocol. To implement the main features and functions of the fuzzy control system for the quality of telecommunication services, a multilayer feedforward neural network is used.
Keywords: quality of communication, IP-telephony, fuzzy set, fuzzy implication, neural network
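As a hedged illustration of the fuzzy-inference idea, the sketch below fuzzifies two IP-network indicators and fires a single Mamdani-style rule with min-implication. The membership break-points and the `voip_quality` rule are invented for illustration, not the ITU-based model of the paper:

```python
def trimf(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def voip_quality(delay_ms, loss_pct):
    # Membership degrees for "good" network conditions
    # (illustrative break-points, not the ITU-T figures)
    good_delay = trimf(delay_ms, -1, 0, 150)
    good_loss = trimf(loss_pct, -1, 0, 5)
    # Mamdani min-implication for the rule:
    # IF delay is good AND loss is good THEN quality is good
    return min(good_delay, good_loss)

print(round(voip_quality(75, 1), 2))  # 0.5
```

A full system would aggregate many such rules and defuzzify the result; the paper additionally trains a feedforward network on top.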
Procedia PDF Downloads 468
2883 An Investigation of Vegetable Oils as Potential Insulating Liquid
Authors: Celal Kocatepe, Eyup Taslak, Celal Fadil Kumru, Oktay Arikan
Abstract:
When choosing an insulating oil, characteristic features such as thermal cooling, endurance, efficiency, and environmental friendliness should be considered. Mineral oils are referred to as petroleum-based oils. In this study, vegetable oils are investigated as an alternative insulating liquid to mineral oil. The dissipation factor, breakdown voltage, relative dielectric constant, and resistivity of mineral, rapeseed, and nut oils were measured as they change with frequency and voltage. Experimental studies were performed according to the ASTM D924 and IEC 60156 standards.
Keywords: breakdown voltage, dielectric dissipation factor, mineral oil, vegetable oils
Procedia PDF Downloads 693
2882 On Performance of Cache Replacement Schemes in NDN-IoT
Authors: Rasool Sadeghi, Sayed Mahdi Faghih Imani, Negar Najafi
Abstract:
The inherent features of Named Data Networking (NDN) provide a robust solution for the Internet of Things (IoT). Therefore, NDN-IoT has emerged as a combined architecture which exploits the benefits of NDN for interconnecting the heterogeneous objects in IoT. In NDN-IoT, caching schemes play a key role in improving the network performance. In this paper, we consider the effectiveness of cache replacement schemes in NDN-IoT scenarios. We investigate the impact of replacement schemes on average delay, average hop count, and average interest retransmission when the replacement schemes are Least Frequently Used (LFU), Least Recently Used (LRU), First-In-First-Out (FIFO) and Random. The simulation results demonstrate that LFU and LRU present a stable performance when the cache size changes. Moreover, the network performance improves when the number of consumers increases.
Keywords: NDN-IoT, cache replacement, performance, ndnSIM
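Of the four schemes compared, LRU is easy to sketch; the toy content-store model below (hypothetical content names, not the ndnSIM implementation used in the paper) shows why recently requested Data survives eviction:

```python
from collections import OrderedDict

class LRUCache:
    """Least Recently Used replacement for an NDN content store (sketch)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, name):
        if name not in self.store:
            return None               # cache miss: forward the Interest upstream
        self.store.move_to_end(name)  # refresh recency on a hit
        return self.store[name]

    def put(self, name, data):
        if name in self.store:
            self.store.move_to_end(name)
        self.store[name] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("/iot/temp", 21)
cache.put("/iot/humidity", 40)
cache.get("/iot/temp")            # /iot/temp becomes most recent
cache.put("/iot/light", 300)      # evicts /iot/humidity, not /iot/temp
print(cache.get("/iot/humidity"), cache.get("/iot/temp"))  # None 21
```

LFU would replace the recency bookkeeping with hit counters; FIFO would drop `move_to_end` entirely.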
Procedia PDF Downloads 365
2881 Towards the Design of Gripper Independent of Substrate Surface Structures
Authors: Annika Schmidt, Ausama Hadi Ahmed, Carlo Menon
Abstract:
End effectors for robotic systems are becoming more and more advanced, resulting in a growing variety of gripping tasks. However, most grippers are application-specific. This paper presents a gripper that interacts with an object's surface rather than being dependent on a defined shape or size. For this purpose, ingressive and astrictive features are combined to achieve the desired gripping capabilities. The developed prototype is tested on a variety of surfaces with different hardness and roughness properties. The results show that the gripping mechanism works on all of the tested surfaces. The influence of the material properties on the amount of the supported load is also studied and the efficiency is discussed.
Keywords: claw, dry adhesion, insects, material properties
Procedia PDF Downloads 359
2880 Clustering the Wheat Seeds Using SOM Artificial Neural Networks
Authors: Salah Ghamari
Abstract:
In this study, the ability of self-organizing map (SOM) artificial neural networks to cluster wheat seed varieties according to their morphological properties was considered. The SOM is one type of unsupervised competitive learning. Experimentally, five morphological features of 300 seeds (including three varieties: gaskozhen, Md and sardari) were obtained using an image processing technique. The results show that the artificial neural network has a good performance (90.33% accuracy) in classification of the wheat varieties despite their high similarity. The highest classification accuracy (100%) was achieved for sardari.
Keywords: artificial neural networks, clustering, self organizing map, wheat variety
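A minimal 1-D SOM training loop in numpy, with toy feature vectors standing in for the five measured seed features; the grid size, learning-rate schedule, and data are illustrative assumptions, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, grid=3, epochs=200, lr0=0.5, sigma0=1.0):
    """Train a 1-D self-organizing map of `grid` units (sketch)."""
    weights = rng.random((grid, data.shape[1]))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 1e-3  # shrinking neighborhood
        for x in data:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            dist = np.abs(np.arange(grid) - bmu)  # grid distance to the winner
            h = np.exp(-dist**2 / (2 * sigma**2))
            weights += lr * h[:, None] * (x - weights)
    return weights

def cluster(weights, x):
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

# Toy "morphological features" for three well-separated seed groups
data = np.vstack([rng.normal(m, 0.05, size=(20, 5)) for m in (0.2, 0.5, 0.8)])
w = train_som(data)
print(cluster(w, np.full(5, 0.2)) != cluster(w, np.full(5, 0.8)))  # True
```

After training, each SOM unit acts as a cluster prototype; a variety label can be assigned to each unit by majority vote of the samples it wins.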
Procedia PDF Downloads 656
2879 Study on Clarification of the Core Technology in a Monozukuri Company
Authors: Nishiyama Toshiaki, Tadayuki Kyountani, Nguyen Huu Phuc, Shigeyuki Haruyama, Oke Oktavianty
Abstract:
It is important to clarify a company's core technology in the product development process to strengthen its power in providing technology that meets customer requirements. The QFD method is adopted to clarify the core technology by identifying the high element technologies that are related to the voice of the customer and offer the most delightful features for the customer. AHP is used to determine the importance of the evaluating factors. A case study was conducted using this approach in a Japanese monozukuri company (a so-called manufacturing company) to clarify their core technology based on customer requirements.
Keywords: core technology, QFD, voices of customer, analysis procedure
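The AHP step can be sketched with the normalized-column-mean approximation of the priority vector; the pairwise judgment matrix below is hypothetical, not the one elicited in the case study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by the normalized-column mean
    (the principal-eigenvector method can be substituted)."""
    A = np.asarray(pairwise, dtype=float)
    normalized = A / A.sum(axis=0)   # each column now sums to 1
    return normalized.mean(axis=1)   # average the rows across columns

# Hypothetical pairwise judgments for three evaluating factors,
# on Saaty's 1-9 scale with reciprocal entries
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_weights(A)
print(w.round(2))  # weights sum to 1; the first factor dominates
```

The resulting weights then scale the QFD relationship scores when ranking the element technologies.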
Procedia PDF Downloads 384
2878 Two Taxa of Paradiacheopsis Genera Recordings of the Myxomycetes from Turkey
Authors: Dursun Yağız, Ahmet Afyon
Abstract:
The study materials were collected from Isparta province in 2008. These materials were moved to the laboratory, where the moist chamber technique was applied to them. Materials were examined with a stereo microscope. As a result of investigations carried out on the samples of sporophores which developed in the laboratory, the species Paradiacheopsis erythropodia (Ing) Nann.-Bremek. and Paradiacheopsis longipes Hooff & Nann.-Bremek. were identified. As a result of the literature research, it was determined that these taxa are new records for Turkey. The identified taxa have been added to Turkey's myxomycota. The microscopic features, photos, localities and substrate information of these two taxa are given.
Keywords: myxomycete, paradiacheopsis, Turkey, slime mould
Procedia PDF Downloads 282
2877 Morphological and Chemical Characterization of the Surface of Orthopedic Implant Materials
Authors: Bertalan Jillek, Péter Szabó, Judit Kopniczky, István Szabó, Balázs Patczai, Kinga Turzó
Abstract:
Hip and knee prostheses are among the most frequently used medical implants and can significantly improve patients' quality of life. Long-term success and biointegration of these prostheses depend on several factors, like the bulk and surface characteristics, construction, and biocompatibility of the material. The applied surgical technique and the general health condition and life quality of the patient are also determinant factors. Medical devices used in orthopedic surgeries have different surfaces depending on their function inside the human body. The surface roughness of these implants determines the interaction with the surrounding tissues. Numerous modifications have been applied in recent decades to improve specific properties of an implant. Our goal was to compare the surface characteristics of typical implant materials used in orthopedic surgery and traumatology. The morphological and chemical structure of Vortex plate anodized titanium, cemented THR (total hip replacement) stem high-nitrogen REX steel (SS), uncemented THR stem and cup titanium (Ti) alloy with titanium plasma spray coating (TPS), cemented cup and uncemented acetabular liner HXL and UHMWPE, and TKR (total knee replacement) femoral component CoCrMo alloy (Sanatmetal Ltd, Hungary) discs were examined. Visualization and elemental analysis were made by scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS). Surface roughness was determined by atomic force microscopy (AFM) and profilometry. SEM and AFM revealed the morphological and roughness features of the examined materials. TPS Ti presented the highest Ra value (25 ± 2 μm), followed by CoCrMo alloy (535 ± 19 nm), Ti (227 ± 15 nm) and stainless steel (170 ± 11 nm). The roughness of the HXL and UHMWPE surfaces was in the same range, 147 ± 13 nm and 144 ± 15 nm, respectively.
EDS confirmed the typical elements on the investigated prosthesis materials: Vortex plate Ti (Ti, O, P); TPS Ti (Ti, O, Al); SS (Fe, Cr, Ni, C); CoCrMo (Co, Cr, Mo); HXL (C, Al, Ni); and UHMWPE (C, Al). The results indicate that the surfaces of prosthesis materials have significantly different features and that the applied investigation methods are suitable for their characterization. Contact angle measurements and in vitro cell culture testing are further planned to test their surface energy characteristics and biocompatibility.
Keywords: morphology, PE, roughness, titanium
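The Ra values reported above are arithmetic average roughness; as a reminder of the quantity being compared, here is a sketch of its computation from a single profilometry scan line (the profile data are invented):

```python
import numpy as np

def average_roughness(profile):
    """Arithmetic average roughness Ra: the mean absolute deviation of the
    surface profile from its mean line."""
    z = np.asarray(profile, dtype=float)
    return np.mean(np.abs(z - z.mean()))

profile = [0.0, 2.0, 0.0, -2.0]  # heights in nm along one scan line
print(average_roughness(profile))  # 1.0
```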
Procedia PDF Downloads 126
2876 Sustainable Development Approach for Coastal Erosion Problem in Thailand: Using Bamboo Sticks to Rehabilitate Coastal Erosion
Authors: Sutida Maneeanakekul, Dusit Wechakit, Somsak Piriyayota
Abstract:
Coastal erosion is a major problem in Thailand, in both the Gulf of Thailand and the Andaman Sea coasts. According to the Department of Marine and Coastal Resources, land erosion occurred along the 200 km coastline with an average rate of 5 meters/year. Coastal erosion affects public and government properties, as well as the socio-economy of the country, including emigration in coastal communities, loss of habitats, and decline in fishery production. To combat the problem of coastal erosion, projects utilizing bamboo sticks for coastal defense against erosion were carried out by the Marine and Coastal Resources Department in five areas beginning in November 2010: Pak Klong Munharn, Samut Songkhram Province; Ban Khun Samutmaneerat, Pak Klong Pramong and Chao Matchu Shrine, Samut Sakhon Province; and Pak Klong Hongthong, Chachoengsao Province. In 2012, an evaluation of the effectiveness of solving the problem of coastal erosion by using bamboo sticks was carried out, with a focus on three aspects. Firstly, the change in physical and biological features after using the bamboo stick technique was assessed. Secondly, the participation of people in the community in managing the problem of coastal erosion was evaluated. The last aspect evaluated was the satisfaction of the community toward this technique. The results of the evaluation showed that the amount of sediment has changed dramatically behind the bamboo stick lines. The increase in sediment was found to be about 23.50-56.20 centimeters (during 2012-2013). In terms of the biological aspect, there has been an increase in mangrove forest areas, especially at Bang Ya Prak, Samut Sakhon Province. Average tree density was found to be about 4,167 trees per square meter. Additionally, an increase in fishery production was observed.
Presently, the evaluated physical features tend to improve in every aspect, as does the satisfaction of people in the community toward the process of solving the erosion problem. People in the community are involved in the preparation, operation, monitoring and evaluation processes to resolve the problem at a moderate level.
Keywords: bamboo sticks, coastal erosion, rehabilitate, Thailand, sustainable development approach
Procedia PDF Downloads 247
2875 Comparative Performance Analysis of Nonlinearity Cancellation Techniques for MOS-C Realization in Integrator Circuits
Authors: Hasan Çiçekli, Ahmet Gökçen, Uğur Çam
Abstract:
In this paper, a comparative performance analysis of the four most commonly used nonlinearity cancellation techniques for realizing the passive resistor with MOS transistors is presented. The comparison is done using an integrator circuit which sequentially employs an op-amp, an OTRA, and an ICCII as the active element. All of the circuits are implemented with MOS-C realization and simulated with the PSPICE program using 0.35 µm TSMC MOSIS process model parameters. With MOS-C realization, the circuits become electronically tunable and fully integrable, which is very important in IC design. The output waveforms, frequency responses, THD analysis results, and features of the nonlinearity cancellation techniques are also given.
Keywords: integrator circuits, MOS-C realization, nonlinearity cancellation, tuneable resistors
Procedia PDF Downloads 533
2874 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification
Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos
Abstract:
Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancer in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, and so histopathological examination of biopsy samples is currently considered to be the gold standard for obtaining definite diagnoses. Machine learning is defined as the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Given the above, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are or could potentially be useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow.
Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used, for every single patient, were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used that was manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed following a number of steps, including noise reduction, bias-field correction, thresholding, coregistration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of features for investigation were chosen, which included age, tumor shape characteristics, image intensity characteristics and texture features. After selecting the features for achieving the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA platform and the MATLAB platform, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in process.
Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology
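Of the four classifiers listed, k-Nearest Neighbour is the simplest to sketch on extracted feature vectors; the toy features and labels below are illustrative assumptions, not the study's data or its WEKA configuration:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """k-Nearest Neighbour: label `x` by majority vote among the k
    training feature vectors closest in Euclidean distance."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

# Toy 2-D feature vectors standing in for intensity/texture features
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = ["astrocytoma", "astrocytoma", "medulloblastoma", "medulloblastoma"]
print(knn_predict(X, y, np.array([0.15, 0.15])))  # astrocytoma
```

In the study's setting, each row of `X_train` would hold the selected age, shape, intensity, and texture features for one traced slice.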
Procedia PDF Downloads 149