Search results for: Case Based Reasoning technique (CBR).
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15108

15048 A General Regression Test Selection Technique

Authors: Walid S. Abd El-hamid, Sherif S. El-etriby, Mohiy M. Hadhoud

Abstract:

This paper presents a new methodology for selecting test cases from regression test suites. The selection strategy is based on analyzing the dynamic behavior of applications written in any programming language. Methods based on dynamic analysis are safer and more efficient. We design a technique that combines the code-based technique and the model-based technique to allow comparison of object-oriented applications written in any programming language. We have developed a prototype tool that detects changes and selects test cases from the test suite.

Keywords: Regression testing, Model based testing, Dynamic behavior.

15047 Testing Visual Abilities of Machines - Visual Intelligence Tests

Authors: Zbigniew Les, Magdalena Les

Abstract:

Intelligence tests are series of tasks designed to measure the capacity to make abstractions, to learn, and to deal with novel situations. Testing of the visual abilities of the shape understanding system (SUS) is performed based on visual intelligence tests. In this paper the progressive matrices tests are formulated as tasks given to SUS. These tests require good visual problem-solving abilities of the human subject. SUS solves these tests by performing complex visual reasoning that transforms the visual forms (tests) into string forms. The experiment proved that the proposed method, which is part of the SUS visual understanding abilities, can solve tests that are very difficult for human subjects.

Keywords: Shape understanding, intelligence test, visual concept, visual reasoning.

15046 Trust Management for Pervasive Computing Environments

Authors: Denis Trcek

Abstract:

Trust is essential for further and wider acceptance of contemporary e-services. It was first addressed almost thirty years ago in the Trusted Computer System Evaluation Criteria standard by the US DoD. But this and other approaches proposed in that period were actually addressing security. Roughly ten years ago, methodologies followed that addressed the trust phenomenon at its core; they were based on Bayesian statistics and its derivatives, while some approaches were based on game theory. However, trust is a manifestation of judgment and reasoning processes. It has to be dealt with in accordance with this fact and adequately supported in cyber environments. On the basis of results in the field of psychology and our own findings, a methodology called qualitative algebra has been developed, which deals with so far overlooked elements of the trust phenomenon. It complements existing methodologies and provides a basis for a practical technical solution that supports the management of trust in contemporary computing environments. Such a solution is also presented at the end of this paper.

Keywords: internet security, trust management, multi-agent systems, reasoning and judgment, modeling and simulation, qualitative algebra

15045 Edge Detection in Digital Images Using Fuzzy Logic Technique

Authors: Abdallah A. Alshennawy, Ayman A. Aly

Abstract:

The fuzzy technique is an operator introduced in order to simulate, at a mathematical level, the compensatory behavior in processes of decision making or subjective evaluation. This paper introduces such operators for a computer vision application. A novel method based on a fuzzy logic reasoning strategy is proposed for edge detection in digital images without determining a threshold value. The proposed approach begins by segmenting the images into regions using a floating 3x3 binary matrix. The edge pixels are mapped to a range of values distinct from each other. The robustness of the proposed method is demonstrated by comparing its results for different captured images with those obtained with the linear Sobel operator. The method gives a consistent improvement in the smoothness and straightness of straight lines and good roundness of curved lines. At the same time, corners become sharper and can be defined easily.
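
As a rough illustration of how a fuzzy edge decision over a 3x3 window might be coded, the following Python sketch maps neighbourhood intensity differences to "high difference" memberships and aggregates them into an edge membership per pixel. The membership breakpoints and the aggregation rule are illustrative assumptions, not the authors' exact operator.

```python
import numpy as np

def fuzzy_edge_map(img):
    """Toy fuzzy edge detector over 3x3 windows (illustrative only).

    For each pixel, the absolute differences to its neighbours are converted
    into 'high difference' memberships, and the pixel's edge membership is the
    mean of those memberships (a simple aggregation standing in for a rule base).
    """
    img = img.astype(float)
    h, w = img.shape
    edge = np.zeros_like(img)
    lo, hi = 10.0, 60.0          # assumed membership breakpoints
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = img[i - 1:i + 2, j - 1:j + 2]
            diffs = np.abs(window - img[i, j]).ravel()
            mu_high = np.clip((diffs - lo) / (hi - lo), 0.0, 1.0)
            edge[i, j] = mu_high.mean()
    return (edge * 255).astype(np.uint8)

if __name__ == "__main__":
    test = np.zeros((16, 16)); test[:, 8:] = 200   # vertical step edge
    print(fuzzy_edge_map(test)[8, 6:10])            # memberships rise at the edge
```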

Keywords: Fuzzy logic, Edge detection, Image processing, computer vision, Mechanical parts, Measurement.

15044 Modeling User Behaviour by Planning

Authors: Alfredo Milani, Silvia Suriani

Abstract:

A model of user behaviour based on automated planning is introduced in this work. The behaviour of users of interactive web systems can be described in terms of a planning domain encapsulating the timed action patterns that represent the intended user profile. User behaviour recognition is then posed as a planning problem where the goal is to parse a given sequence of user logs of the observed activities while reaching a final state. A general technique for transforming a timed finite state automaton description of the behaviour into a numerical-parameter planning model is introduced. Experimental results show that the performance of a planning-based behaviour model is effective and scalable for real-world applications. A major advantage of the planning-based approach is that it represents problems of plan recognition, plan synthesis and plan optimisation in a single automated reasoning framework.

Keywords: User behaviour, Timed Transition Automata, Automated Planning.

15043 Mining and Visual Management of XML-Based Image Collections

Authors: Khalil Shihab, Nida Al-Chalabi

Abstract:

This article describes Uruk, the virtual museum of Iraq that we developed for visual exploration and retrieval of image collections. The system largely exploits the loosely structured hierarchy of XML documents, which provides a useful representation for storing semi-structured or unstructured data that does not easily fit into existing databases. The system offers users the capability to mine and manage the XML-based image collections through a web-based graphical user interface (GUI). Typically, in an interactive session with the system, the user can browse a visual structural summary of the XML database in order to select interesting elements. Using this intermediate result, queries combining structure and textual references can be composed and presented to the system. After query evaluation, the full set of answers is presented in a visual and structured way.

Keywords: Data-centric XML, graphical user interfaces, information retrieval, case-based reasoning, fuzzy sets

15042 Analysis of Users’ Behavior on Book Loan Log Based On Association Rule Mining

Authors: Kanyarat Bussaban, Kunyanuth Kularbphettong

Abstract:

This research aims to create a model for analyzing student behaviour in using library resources based on a data mining technique, taking Suan Sunandha Rajabhat University as a case study. The model was built with association rules using the Apriori algorithm. Fourteen rules were found; when the rules were tested against a testing data set, the classification accuracy was 79.24 percent and the MSE was 22.91. The results showed that a user behaviour model based on the association rule technique can be used to manage library resources.
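
For readers unfamiliar with the technique, here is a minimal Python sketch of the Apriori idea of mining association rules from transaction-style loan logs. The transactions, support threshold and confidence threshold are invented for illustration and do not reproduce the study's data or its 14 rules.

```python
from itertools import combinations

def apriori_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Tiny Apriori-style miner: frequent 1- and 2-itemsets, then confidence-filtered rules."""
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    items = sorted({i for t in transactions for i in t})
    frequent_items = [i for i in items if support({i}) >= min_support]
    frequent_pairs = [frozenset(p) for p in combinations(frequent_items, 2)
                      if support(frozenset(p)) >= min_support]

    rules = []
    for pair in frequent_pairs:
        for antecedent in pair:
            a = frozenset([antecedent])
            conf = support(pair) / support(a)
            if conf >= min_confidence:
                rules.append((set(a), set(pair - a), round(support(pair), 2), round(conf, 2)))
    return rules

# Illustrative loan-log "transactions" (invented): subject areas borrowed in one visit.
logs = [{"math", "stats"}, {"math", "cs"}, {"math", "stats", "cs"}, {"stats"}, {"math", "stats"}]
for antecedent, consequent, sup, conf in apriori_rules(logs):
    print(antecedent, "->", consequent, " support:", sup, " confidence:", conf)
```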

Keywords: Behavior, data mining technique, Apriori algorithm.

15041 Continuity of Defuzzification and Its Application to Fuzzy Control

Authors: Takashi Mitsuishi, Kiyoshi Sawada, Yasunari Shidama

Abstract:

The mathematical framework for the study of fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods (area defuzzification and height defuzzification) are described, in addition to the center of gravity method, which is the best known defuzzification method. The continuity of these defuzzification methods and its application to fuzzy feedback control are discussed.
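
As a concrete illustration of the methods named above, the following Python sketch computes center-of-gravity, area (read here as bisector of area) and height defuzzification over a sampled output membership function. The membership values are invented, and the paper's formal definitions and continuity analysis are not reproduced.

```python
import numpy as np

def centroid_defuzz(x, mu):
    """Center of gravity: membership-weighted mean of the support."""
    return float(np.sum(x * mu) / np.sum(mu))

def area_defuzz(x, mu):
    """Bisector of area: point splitting the area under mu into two equal halves
    (one common reading of the 'area method')."""
    csum = np.cumsum(mu)
    return float(x[np.searchsorted(csum, csum[-1] / 2.0)])

def height_defuzz(peaks, heights):
    """Height method: weight each rule's output peak by its firing strength."""
    peaks, heights = np.asarray(peaks, float), np.asarray(heights, float)
    return float(np.sum(peaks * heights) / np.sum(heights))

# Illustrative aggregated output membership on a sampled universe of discourse.
x = np.linspace(0.0, 10.0, 201)
mu = np.maximum(np.clip(1 - np.abs(x - 3) / 2, 0, 1) * 0.8,
                np.clip(1 - np.abs(x - 7) / 2, 0, 1) * 0.4)

print("centroid:", round(centroid_defuzz(x, mu), 3))
print("area    :", round(area_defuzz(x, mu), 3))
print("height  :", round(height_defuzz([3.0, 7.0], [0.8, 0.4]), 3))
```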

Keywords: Fuzzy approximate reasoning, defuzzification, area method, height method.

15040 An Efficient Energy Adaptive Hybrid Error Correction Technique for Underwater Wireless Sensor Networks

Authors: Ammar Elyas Babiker, M. Nordin B. Zakaria, Hassan Yosif, Samir B. Ibrahim

Abstract:

Variable channel conditions in underwater networks, and variable distances between sensors due to water currents, lead to a variable bit error rate (BER). This variability in BER has a great effect on the energy efficiency of the error correction techniques used. In this paper an efficient energy-adaptive hybrid error correction technique (AHECT) is proposed. AHECT adaptively changes the error correction technique from pure retransmission (ARQ) in low-BER cases to a hybrid technique with variable encoding rates (ARQ & FEC) in high-BER cases. An adaptation algorithm is proposed that depends on a precalculated packet acceptance rate (PAR) look-up table, the current BER, the packet size and the error correction technique in use. Based on this adaptation algorithm, a periodic 3-bit feedback is added to the acknowledgment packet to state which error correction technique is suitable for the current channel conditions and distance. Comparative studies were done between this technique and other techniques, and the results show that AHECT is more energy efficient and has a higher probability of success than all of those techniques.
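
A hedged sketch of the kind of adaptation decision described above: choose plain ARQ when the BER is low and a hybrid ARQ/FEC coding rate otherwise, using a precomputed PAR look-up table. The table contents, energy costs, thresholds and 3-bit feedback codes (PAR_TABLE, FEC_R34, FEC_R12, and so on) are illustrative placeholders, not the authors' measured values.

```python
# Hypothetical PAR look-up table: (technique, packet_size) -> expected packet
# acceptance rate at a few representative BER levels (all values are placeholders).
PAR_TABLE = {
    ("ARQ",     256): {1e-5: 0.99, 1e-4: 0.97, 1e-3: 0.77},
    ("FEC_R34", 256): {1e-5: 0.99, 1e-4: 0.99, 1e-3: 0.93},
    ("FEC_R12", 256): {1e-5: 0.99, 1e-4: 0.99, 1e-3: 0.98},
}
ENERGY_COST = {"ARQ": 1.0, "FEC_R34": 1.15, "FEC_R12": 1.3}   # relative per-packet energy
FEEDBACK_CODE = {"ARQ": 0b000, "FEC_R34": 0b001, "FEC_R12": 0b010}  # 3-bit feedback values

def select_technique(current_ber, packet_size=256):
    """Pick the technique with the best acceptance-rate-per-energy at the nearest tabulated BER."""
    best, best_score = None, -1.0
    for (tech, size), par_by_ber in PAR_TABLE.items():
        if size != packet_size:
            continue
        ber_key = min(par_by_ber, key=lambda b: abs(b - current_ber))  # nearest tabulated BER
        score = par_by_ber[ber_key] / ENERGY_COST[tech]
        if score > best_score:
            best, best_score = tech, score
    return best, FEEDBACK_CODE[best]

print(select_technique(2e-5))   # low BER  -> plain ARQ wins on energy
print(select_technique(1e-3))   # high BER -> a hybrid ARQ/FEC rate wins
```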

Keywords: Underwater communication, wireless sensor networks, error correction technique, energy efficiency

15039 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments. Therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims at developing a multi-agent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
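
The abstract classifies investigations with the case-based reasoning (CBR) technique. Below is a minimal, hedged sketch of the retrieve-and-reuse step of CBR over invented case features; the framework's actual agents, feature set and rule engine are not reproduced.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Case:
    features: dict      # e.g. artefact flags observed in a disk image (invented)
    label: str          # classification of the past investigation

def similarity(a: dict, b: dict) -> float:
    """Simple overlap similarity over the union of feature keys (illustrative)."""
    keys = set(a) | set(b)
    return sum(1.0 for k in keys if a.get(k) == b.get(k)) / len(keys)

def cbr_classify(new_features: dict, case_base: List[Case]) -> str:
    """Retrieve the most similar past case and reuse its label."""
    best = max(case_base, key=lambda c: similarity(new_features, c.features))
    return best.label

# Invented example case base for illustration only.
cases = [
    Case({"encrypted_volumes": True, "p2p_traffic": False, "deleted_executables": True}, "malware incident"),
    Case({"encrypted_volumes": True, "p2p_traffic": True, "deleted_executables": False}, "data exfiltration"),
]
print(cbr_classify({"encrypted_volumes": True, "p2p_traffic": True, "deleted_executables": True}, cases))
```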

Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.

15038 A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems

Authors: Ong Sing Goh, C. Ardil, Wilson Wong, Chun Che Fung

Abstract:

The evaluation of conversational agents or chatterbot question answering systems is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and at the same time achieve high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of different natures, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.

Keywords: Evaluation, conversational agents, Response Quality, chatterbots

15037 Response Quality Evaluation in Heterogeneous Question Answering System: A Black-box Approach

Authors: Goh Ong Sing, C. Ardil, Wilson Wong, Shahrin Sahib

Abstract:

The evaluation of question answering systems is a major research area that needs much attention. Before the rise of domain-oriented question answering systems based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when question answering systems began to become more domain specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and at the same time achieve higher-quality responses. The research in this paper discusses the inappropriateness of the existing measures for response quality evaluation and, in a later part, brings forward the call for new standard measures and the related considerations. As a short-term solution for evaluating the response quality of heterogeneous systems, and to demonstrate the challenges in evaluating systems of different natures, this research presents a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems (i.e. AnswerBus, START and NaLURI).

Keywords: Evaluation, question answering, response quality.

15036 A Study on the Iterative Scheme for Stratified Shields Gamma Ray Buildup Factors Using Layer-Splitting Technique in Double-Layer Shields

Authors: Sari F. Alkhatib, Chang Je Park, Gyuhong Roh

Abstract:

The iterative scheme used to treat buildup factors for stratified shields is investigated here using the layer-splitting technique. A simple formalism for the scheme, based on Kalos' formula, is introduced, and the testing technique is implemented on that basis.

The second layer in a double-layer shield was split into two equivalent layers, and the scheme (with the suggested formalism) was implemented on the new "three-layer" shield configuration. The results of this manipulation for water-lead and water-iron shield combinations are presented here for 1 MeV photons.

It was found that splitting the second layer introduces some deviation in the overall buildup factor value. This expected deviation appeared to be higher in the case of a low-Z layer followed by a high-Z layer. However, the overall performance of the iterative scheme showed great consistency and strong coherence even with the introduced changes. The introduced layer-splitting testing technique shows the capability to be used to test the iterative scheme with a wide range of formalisms.

Keywords: Buildup Factor, Iterative Scheme, Stratified Shields

15035 Enhancement of Low Contrast Satellite Images using Discrete Cosine Transform and Singular Value Decomposition

Authors: A. K. Bhandari, A. Kumar, P. K. Padhy

Abstract:

In this paper, a novel contrast enhancement technique for low-contrast satellite images is proposed based on singular value decomposition (SVD) and the discrete cosine transform (DCT). The singular value matrix represents the intensity information of the given image, and any change in the singular values changes the intensity of the input image. The proposed technique converts the image into the SVD-DCT domain and, after normalizing the singular value matrix, reconstructs the enhanced image by using the inverse DCT. The visual and quantitative results suggest that the proposed SVD-DCT method clearly shows increased efficiency and flexibility over existing methods such as the linear contrast stretching technique, the GHE technique, the DWT-SVD technique, the DWT technique, the decorrelation stretching technique, and gamma-correction-based techniques.
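
A hedged sketch of one way the SVD-DCT step described above might be organized: take the DCT of the image, scale its singular values, and invert. The scaling rule and target level here are assumptions standing in for the paper's normalization, and SciPy's dctn/idctn are assumed available.

```python
import numpy as np
from scipy.fft import dctn, idctn

def svd_dct_enhance(img, target_mean=128.0):
    """Illustrative SVD-DCT contrast enhancement (one plausible reading of the method)."""
    img = img.astype(float)
    D = dctn(img, norm="ortho")                         # image in the DCT domain
    U, S, Vt = np.linalg.svd(D, full_matrices=False)
    # Normalize the singular values so the reconstructed intensity moves toward target_mean.
    xi = target_mean / max(img.mean(), 1e-6)            # assumed scaling factor
    D_eq = U @ np.diag(xi * S) @ Vt
    out = idctn(D_eq, norm="ortho")
    return np.clip(out, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    low_contrast = np.random.rand(64, 64) * 40 + 60     # dark, low-contrast test image
    enhanced = svd_dct_enhance(low_contrast)
    print(low_contrast.mean().round(1), "->", enhanced.mean().round(1))
```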

Keywords: Singular Value Decomposition (SVD), discrete cosine transform (DCT), image equalization and satellite image contrast enhancement.

15034 Semantic Web Agent Communication Capable of Reasoning with Ontology and Agent Locations

Authors: Visit Hirankitti, Vuong Tran Xuan

Abstract:

Multi-agent communication of Semantic Web information cannot be realized without the ability to reason with ontology and agent locations. This is because, for an agent to be able to reason with an external Semantic Web ontology, it must know where and how to access that ontology. Similarly, for an agent to be able to communicate with another agent, it must know where and how to send a message to that agent. In this paper we propose a framework for an agent which can reason with ontology and agent locations in order to perform reasoning with multiple distributed ontologies and communicate with other agents on the Semantic Web. The agent framework and its communication mechanism are formulated entirely in meta-logic.

Keywords: Semantic Web, agent communication, ontologies.

15033 Academic Program Administration via Semantic Web – A Case Study

Authors: Qurban A Memon, Shakeel A. Khoja

Abstract:

Generally, administrative systems in an academic environment are disjoint and support independent queries. The objective in this work is to semantically connect these independent systems to provide support to queries run on the integrated platform. The proposed framework, by enriching educational material in the legacy systems, provides a value-added semantics layer where activities such as annotation, query and reasoning can be carried out to support management requirements. We discuss the development of this ontology framework with a case study of UAE University program administration to show how semantic web technologies can be used by administration to develop student profiles for better academic program management.

Keywords: Academic Program Administration, Semantic Web, Web Technology

15032 Effectiveness of Dominant Color Descriptor Technique in Medical Image Retrieval Application

Authors: Mohd Kamir Yusof

Abstract:

This paper presents a dominant color descriptor technique for medical image retrieval. Medical images are collected and stored in a medical database. The purpose of the dominant color descriptor (DCD) technique is to retrieve medical images and to display images similar to the queried image. First, the technique searches for and retrieves medical images based on a keyword entered by the user. After an image is found, the system assigns it as the queried image. The DCD technique then calculates the dominant color value of the image, and the system searches for and retrieves medical images again based on the dominant color value of the query image. Finally, the system displays the images similar to the queried image to the user. A simple application has been developed and tested using the dominant color descriptor. Experimental results indicate that this technique is effective and can be used for medical image retrieval.
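
The abstract does not specify how the dominant color value is computed; a common way to build a dominant-color descriptor is color quantization. The sketch below uses k-means (scikit-learn assumed available) on an invented synthetic image and is only an illustration of that general idea, not the paper's exact DCD computation or matching rule.

```python
import numpy as np
from sklearn.cluster import KMeans

def dominant_colors(img_rgb, k=3):
    """Return the k cluster-centre colours and their pixel fractions, sorted by fraction."""
    pixels = img_rgb.reshape(-1, 3).astype(float)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    fractions = np.bincount(km.labels_, minlength=k) / len(pixels)
    order = np.argsort(fractions)[::-1]
    return km.cluster_centers_[order], fractions[order]

if __name__ == "__main__":
    # Synthetic "image": mostly one grey tone with a brighter region (invented values).
    img = np.zeros((32, 32, 3), np.uint8)
    img[:] = (120, 120, 120)
    img[8:16, 8:16] = (220, 210, 200)
    colours, fracs = dominant_colors(img)
    print(np.round(colours).astype(int), np.round(fracs, 2))
```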

Keywords: Medical Image Retrieval, Dominant Color Descriptor.

15031 Piecewise Interpolation Filter for Effective Processing of Large Signal Sets

Authors: Anatoli Torokhti, Stanley Miklavcic

Abstract:

Suppose KY and KX are large sets of observed and reference signals, respectively, each containing N signals. Is it possible to construct a filter F : KY → KX that requires a priori information on only a few signals, p ≪ N, from KX but performs better than the known filters based on a priori information on every reference signal from KX? It is shown that the positive answer is achievable under quite unrestrictive assumptions. The device behind the proposed method is a special extension of the piecewise linear interpolation technique to the case of random signal sets. The proposed technique provides a single filter to process any signal from the arbitrarily large signal set. The filter is determined in terms of pseudo-inverse matrices, so that it always exists.
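
A minimal numpy sketch of the pseudo-inverse building block the abstract alludes to: fitting a single linear filter from only p reference/observation pairs and applying it to a much larger set. This is not the authors' piecewise scheme itself, only an illustration of a filter defined via the Moore-Penrose pseudo-inverse, with invented data.

```python
import numpy as np

def fit_linear_filter(Y_train, X_train):
    """Least-squares filter F with X ≈ F Y, built from p training pairs via the pseudo-inverse.

    Y_train, X_train: arrays of shape (n, p) whose columns are the p observed and
    reference training signals (p can be much smaller than the full set size N).
    """
    return X_train @ np.linalg.pinv(Y_train)   # F = X Y^+ always exists

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, N = 32, 4, 200
    X_all = rng.standard_normal((n, N))                        # reference signals
    Y_all = 0.8 * X_all + 0.1 * rng.standard_normal((n, N))    # noisy observations
    F = fit_linear_filter(Y_all[:, :p], X_all[:, :p])          # a priori info on only p signals
    err = np.linalg.norm(F @ Y_all - X_all) / np.linalg.norm(X_all)
    print(round(err, 3))                                       # relative error over the whole set
```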

Keywords: Wiener filter, filtering of stochastic signals.

15030 Characterization of Adhesive Layers in Sandwich Composites by Nondestructive Technique

Authors: E. Barkanov, E. Skukis, M. Wesolowski, A. Chate

Abstract:

A new nondestructive technique, namely an inverse technique based on vibration tests, is developed to characterize the nonlinear mechanical properties of adhesive layers in sandwich composites. An adhesive layer is described as a viscoelastic isotropic material with storage and loss moduli, both of which are frequency-dependent values over a wide frequency range. An optimization based on the planning of experiments and the response surface technique to minimize the error functional is applied to decrease the computational expense considerably. The developed identification technique has been tested on aluminum panels and successfully applied to characterize the viscoelastic material properties of the 3M damping polymer ISD-112 used as a core material in sandwich panels.

Keywords: Adhesive layer, finite element method, inverse technique, sandwich panel, vibration test, viscoelastic material properties.

15029 A Low Cost and High Quality Duty-Cycle Modulation Scheme and Applications

Authors: B. Lonla Moffo, J. Mbihi, L. Nneme Nneme

Abstract:

In this paper, a low-cost duty-cycle modulation scheme is studied in depth and compared to the standard pulse width modulation technique. Using a mix of analytical reasoning and electronics simulation tools, it is shown that, under the same operating conditions, most characteristics of the proposed duty-cycle modulation scheme are better than those provided by a standard pulse width modulation technique. The simulation results obtained when testing both modulation control policies on prototyping systems indicate that the proposed duty-cycle modulation approach appears to be a high-quality control policy in a wide variety of application areas, including A/D and D/A conversion, signal transmission, and switching control in power electronics.

Keywords: Duty-cycle Modulation, Operational amplifiers, Pulse width modulation, Power electronics, Signal processing.

15028 A Step-wise Zoom Technique for Exploring Image-based Virtual Reality Applications

Authors: D. R. Awang Rambli, S. Sulaiman, M.Y. Nayan, A.R. Asoruddin

Abstract:

Existing image-based virtual reality applications allow users to view an image-based 3D virtual environment in a more interactive manner. The user can "walk through"; look left, right, up and down; and even zoom into objects in these virtual worlds of images. However, what the user sees during a "zoom in" is just a close-up view of the same image, which was taken from a distance. Thus, this does not give the user an accurate view of the object from the actual distance. In this paper, a simple technique for zooming in on an object in a virtual scene is presented. The technique is based on the 'hotspot' concept in existing applications. Instead of navigating between two different locations, the hotspots are used to focus on an object in the scene. For each object, several hotspots are created, and a different picture is taken for each hotspot. Each consecutive hotspot takes the user closer to the object. This provides the user with a correct view of the object based on his proximity to it. Implementation issues and the relevance of this technique to potential application areas are highlighted.

Keywords: Hotspots, image-based VR, camera zooms, virtual reality.

15027 A Robust Data Hiding Technique based on LSB Matching

Authors: Emad T. Khalaf, Norrozila Sulaiman

Abstract:

Many researchers are working on information hiding techniques, using different ideas and areas to hide their secret data. This paper introduces a robust technique for hiding secret data in an image based on LSB insertion and the RSA encryption technique. The key idea of the proposed technique is to encrypt the secret data. The encrypted data are then converted into a bit stream and divided into a number of segments. The cover image is also divided into the same number of segments. Each segment of data is compared with each segment of the image to find the best-matching segment, in order to create a new random sequence of segments to be inserted into the cover image. Experimental results show that the proposed technique has a high security level and produces better stego-image quality.
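
A hedged sketch of the LSB-insertion building block only: writing an already-encrypted bit stream into the least significant bits of the cover pixels and reading it back. The RSA encryption and the segment-matching step described above are omitted, and the payload and cover values are invented.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, bits: str) -> np.ndarray:
    """Write the bit string into the least significant bits of the flattened cover pixels."""
    stego = cover.flatten().copy()
    if len(bits) > stego.size:
        raise ValueError("payload too large for cover")
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | int(b)   # clear the LSB, then set it to the payload bit
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> str:
    return "".join(str(p & 1) for p in stego.flatten()[:n_bits])

if __name__ == "__main__":
    cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
    payload = "0100100001101001"                 # stands in for RSA-encrypted data bits
    stego = embed_lsb(cover, payload)
    assert extract_lsb(stego, len(payload)) == payload
    print("max pixel change:", int(np.abs(stego.astype(int) - cover.astype(int)).max()))
```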

Keywords: steganography; LSB Matching; RSA Encryption; data segments

15026 Portfolio Management for Construction Company during Covid-19 Using AHP Technique

Authors: Sareh Rajabi, Salwa Bheiry

Abstract:

In general, Covid-19 has caused much financial and non-financial damage to the economy and the community. The level and severity of Covid-19 as a pandemic vary across regions and across different types of projects. The Covid-19 virus has recently emerged as one of the most important risk management factors worldwide. Therefore, as part of portfolio management assessment, it is essential to evaluate the severity of such a risk on projects and programs at the portfolio management level in order to avoid a risky portfolio. Covid-19 hit South America, parts of Europe and the Middle East particularly hard. The pandemic affected the whole world through lockdowns, interruptions in supply chain management, health and safety requirements, transportation and commercial impacts. Therefore, this research proposes the Analytical Hierarchy Process (AHP) to analyze and assess a pandemic case like Covid-19 and its impacts on construction projects. The AHP technique uses four sub-criteria: health and safety, commercial risk, completion risk and contractual risk, to evaluate the projects and programs. The result provides decision makers with information on which projects carry higher or lower risk under Covid-19 and pandemic scenarios. The decision makers can then reach the most feasible solution, based on effectively weighted criteria, for project selection within their portfolio to match the organization's strategies.
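
A hedged sketch of the standard AHP priority calculation for the four sub-criteria named above. The pairwise judgments in the matrix are invented for illustration; priorities are taken from the principal eigenvector with the usual consistency check (random index 0.90 for a 4x4 matrix), which may differ in detail from the authors' weighting.

```python
import numpy as np

CRITERIA = ["health & safety", "commercial", "completion", "contractual"]

# Illustrative pairwise comparison matrix (Saaty's 1-9 scale); a_ij = importance of i over j.
A = np.array([
    [1,   3,   2,   4],
    [1/3, 1,   1/2, 2],
    [1/2, 2,   1,   3],
    [1/4, 1/2, 1/3, 1],
], dtype=float)

# Priorities = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.90 for n = 4).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90

for name, weight in zip(CRITERIA, w):
    print(f"{name:16s} {weight:.3f}")
print("consistency ratio:", round(cr, 3))   # < 0.10 is conventionally acceptable
```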

Keywords: Portfolio management, risk management, COVID-19, analytical hierarchy process technique.

15025 Reasoning With Non-Binary Logics

Authors: Sylvia Encheva

Abstract:

Students in higher education are presented with new terms and concepts in nearly every lecture they attend. Many of them prefer web-based self-tests for evaluating their understanding of concepts, since they can use those tests independently of tutors' working hours and thus avoid the necessity of being in a particular place at a particular time. There is a large number of multiple-choice tests in almost every subject designed to contribute to higher-level learning or to discover misconceptions. Every single test provides immediate feedback to a student about the outcome of that test. In some cases a supporting system displays an overall score when a test is taken several times by a student. What we still find missing is how to secure the delivery of personalized feedback to a user while taking the user's progress into consideration. The present work is motivated to throw some light on that question.

Keywords: Clustering, rough sets, many valued logic, predictions

15024 Generating Class-Based Test Cases for Interface Classes of Object-Oriented Gray-Box Frameworks

Authors: Jehad Al Dallal, Paul Sorenson

Abstract:

An application framework provides a reusable design and implementation for a family of software systems. Application developers extend the framework to build their particular applications using hooks. Hooks are the places identified to show how to use and customize the framework. Hooks define Framework Interface Classes (FICs) and their possible specifications, which helps in building reusable test cases for the implementations of these classes. In applications developed using gray-box frameworks, FICs inherit framework classes or use them without inheritance. In this paper, a test-case generation technique is extended to build test cases for FICs built for gray-box frameworks. A tool is developed to automate the introduced technique.

Keywords: Class testing, object-oriented framework, reusable test case.

15023 The Effect of Information vs. Reasoning Gap Tasks on the Frequency of Conversational Strategies and Accuracy in Speaking among Iranian Intermediate EFL Learners

Authors: Hooriya Sadr Dadras, Shiva Seyed Erfani

Abstract:

Speaking skills merit meticulous attention on the side of both learners and teachers. In particular, accuracy is a critical component in guaranteeing that messages are conveyed through conversation, because an erroneous change may adversely alter the content and purpose of the talk. Different types of tasks have served teachers in meeting numerous educational objectives. Besides, negotiation of meaning and the use of different strategies have been areas of concern in socio-cultural theories of SLA. Negotiation of meaning is among the conversational processes which have a crucial role in facilitating the understanding and expression of meaning in a given second language. Conversational strategies are used during interaction when there is a breakdown in communication that leads to the interlocutor attempting to remedy the gap through talk. Therefore, this study was an attempt to investigate whether there was any significant difference between the effect of reasoning gap tasks and information gap tasks on the frequency of conversational strategies used in negotiation of meaning in classrooms, on the one hand, and on the speaking accuracy of Iranian intermediate EFL learners, on the other. After a pilot study to check the practicality of the treatments, at the outset of the main study the Preliminary English Test was administered to ensure the homogeneity of 87 out of 107 participants who attended the intact classes of a 15-session term in one control and two experimental groups. The speaking sections of the PET were also used as pretest and posttest to examine their speaking accuracy. The tests were recorded and transcribed, and speaking accuracy was measured as the percentage of clauses with no grammatical errors among the total clauses produced. In all groups, the grammatical points related to accuracy were taught and the use of conversational strategies was practiced. Then, different kinds of reasoning gap tasks (matchmaking, deciding on a course of action, and working out a timetable) and information gap tasks (restoring an incomplete chart, spotting the differences, arranging sentences into stories, and a guessing game) were manipulated in the experimental groups during the treatment sessions, and the students were required to practice conversational strategies when doing the speaking tasks. The conversations throughout the term were recorded and transcribed to count the frequency of the conversational strategies used in all groups. The results of the statistical analysis demonstrated that applying both the reasoning gap tasks and the information gap tasks significantly affected the frequency of conversational strategies used through negotiation. Of the two, the reasoning gap tasks had a greater impact on encouraging the negotiation of meaning and increasing the frequency of conversational strategies in every session. The findings also indicated that both task types could help learners significantly improve their speaking accuracy, with the reasoning gap tasks again more effective than the information gap tasks in improving the level of learners' speaking accuracy.

Keywords: Accuracy in speaking, conversational strategies, information gap tasks, reasoning gap tasks.

15022 The Potential Benefits of Multimedia Information Representation in Enhancing Students’ Critical Thinking and History Reasoning

Authors: Ang Ling Weay, Mona Masood

Abstract:

This paper discusses the potential benefits of interactive multimedia information representation in enhancing students' critical thinking, aligned with history reasoning, when learning history among secondary school students in Malaysia. Two modes of multimedia information representation were implemented: chronological and thematic information representations. A qualitative study based on unstructured interviews was conducted with two history teachers, one history education lecturer, two i-think experts, and five Form Four secondary school students. The interviews elicited their opinions on the implementation of thinking maps and interactive multimedia information representation in history learning. The key elements of the interactive multimedia (e.g. multiple media, user control, interactivity and the use of timelines and concept maps) were then considered to improve the learning process. Findings of the preliminary investigation reveal that interactive multimedia information representations have potential benefits as an instructional resource for enhancing students' higher order thinking skills (HOTS). This paper concludes by giving suggestions for future work.

Keywords: Multimedia Information Representation, Critical Thinking, History Reasoning, Chronological and Thematic Information Representation.

15021 An Implementation of Fuzzy Logic Technique for Prediction of the Power Transformer Faults

Authors: Omar M. Elmabrouk, Roaa Y. Taha, Najat M. Ebrahim, Sabbreen A. Mohammed

Abstract:

Power transformers are the most crucial part of the electrical power system, distribution and transmission grid. This part is maintained using a predictive or condition-based maintenance approach. The diagnosis of power transformer condition is performed based on Dissolved Gas Analysis (DGA). There are five main methods utilized for analyzing these gases: the International Electrotechnical Commission (IEC) gas ratio, Key Gas, Rogers gas ratio, Doernenburg, and Duval Triangle methods. Moreover, due to the importance of transformers, there is a need for an accurate technique to diagnose and hence predict the transformer condition. The main objective of such a technique is to avoid transformer faults and hence to maintain the electrical power system, distribution and transmission grid. In this paper, DGA was utilized based on data collected from the transformer records available at the General Electricity Company of Libya (GECOL), located in Benghazi, Libya. The Fuzzy Logic (FL) technique was implemented as a diagnostic approach based on the IEC gas ratio method. The FL technique gave better results and proved suitable for use as an accurate prediction technique for power transformer faults. This technique should also be of interest to readers and researchers concerned with FL mathematics and power transformers.
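
The IEC gas ratio method works with ratios of dissolved gases such as C2H2/C2H4, CH4/H2 and C2H4/C2H6. Below is a minimal, hedged sketch of fuzzifying one such ratio with triangular membership functions; the membership breakpoints and the gas concentrations are illustrative placeholders, not IEC 60599 code boundaries or the authors' rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def fuzzify_ratio(r):
    """Fuzzify a gas ratio (e.g. CH4/H2) into low / medium / high memberships.
    Breakpoints are illustrative placeholders, not the IEC code boundaries."""
    return {
        "low":    tri(r, -0.1, 0.0, 0.3),
        "medium": tri(r, 0.1, 0.7, 1.5),
        "high":   tri(r, 1.0, 3.0, 6.0),
    }

# Example: dissolved-gas concentrations in ppm (invented values).
ch4, h2 = 120.0, 80.0
memberships = fuzzify_ratio(ch4 / h2)
print({k: round(float(v), 2) for k, v in memberships.items()})
# A full diagnosis would combine the memberships of all three IEC ratios through fuzzy rules.
```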

Keywords: Fuzzy logic, dissolved gas-in-oil analysis, DGA, prediction, power transformer.

15020 Impact of Hard Limited Clipping Crest Factor Reduction Technique on Bit Error Rate in OFDM Based Systems

Authors: Theodore Grosch, Felipe Koji Godinho Hoshino

Abstract:

In wireless communications, 3GPP LTE is one of the solutions for meeting the demand for greater data transmission rates. One issue inherent to this technology is the high Peak-to-Average Power Ratio (PAPR) of Orthogonal Frequency Division Multiplexing (OFDM) modulation. This high PAPR affects the efficiency of power amplifiers. One approach to mitigate this effect is the Crest Factor Reduction (CFR) technique. In this work, we simulate the impact of the Hard Limited Clipping Crest Factor Reduction technique on the Bit Error Rate (BER) in OFDM-based systems. In general, the results showed that CFR has more effect on higher-order digital modulation schemes, as expected. More importantly, we show the worst-case degradation due to CFR on QPSK, 16-QAM, and 64-QAM signals in a linear system. For example, hard clipping at 9 dB results in a 2 dB increase in the signal-to-noise energy required for a 1% BER with 64-QAM modulation.
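
A hedged sketch of hard-limited clipping applied to an OFDM time-domain signal at a level set relative to the RMS power, showing the PAPR reduction. The subcarrier count, clipping level and QPSK mapping are illustrative, and the BER simulation chain itself is not reproduced.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a (possibly multi-symbol) signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def hard_clip(x, clip_db):
    """Limit |x| to clip_db above the RMS level while keeping the phase (hard-limited clipping)."""
    a_max = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (clip_db / 20)
    mag = np.abs(x)
    scale = np.minimum(1.0, a_max / np.maximum(mag, 1e-12))
    return x * scale

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_sc, n_sym = 1024, 100
    qpsk = (rng.choice([-1, 1], (n_sym, n_sc)) + 1j * rng.choice([-1, 1], (n_sym, n_sc))) / np.sqrt(2)
    ofdm = np.fft.ifft(qpsk, axis=1)          # OFDM time-domain symbols
    clipped = hard_clip(ofdm, 6.0)            # e.g. a 6 dB clipping level
    print(round(papr_db(ofdm), 2), "->", round(papr_db(clipped), 2), "dB PAPR")
```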

Keywords: Bit error rate, crest factor reduction, OFDM, physical layer simulation.

15019 Practical Problems as Tools for the Development of Secondary School Students’ Motivation to Learn Mathematics

Authors: M. Rodionov, Z. Dedovets

Abstract:

This article discusses the use of plausible reasoning for the solution of practical problems. Such reasoning is a major driver of motivation and of the implementation of mathematical, scientific and educational research activity. A general practical problem-solving algorithm is presented, which includes an analysis of the specific problem content in order to build, solve and interpret the underlying mathematical model. The author explores the role of practical problems, such as stimulating students' interest and developing their world outlook and their orientation in the modern world, at the different stages of learning mathematics in secondary school. Particular attention is paid to the characteristics of those problems, which were systematized and presented in the conclusions.

Keywords: Mathematics, motivation, secondary school, student, practical problem.
