Search results for: logical method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8108

8108 Data Acquisition from Cell Phone using Logical Approach

Authors: Keonwoo Kim, Dowon Hong, Kyoil Chung, Jae-Cheol Ryou

Abstract:

Cell phone forensics, the acquisition and analysis of data held in a cellular phone, is now used by national investigation organizations and private companies. There are two methods for collecting cell phone flash memory data. The first is a logical method, which acquires files and directories from the file system of the cell phone flash memory. The second obtains all data through a bit-by-bit copy of the entire physical memory using a low-level access method. In this paper, we describe a forensic tool that acquires cell phone flash memory data using a logical-level approach. With our tool, we can retrieve the EFS file system and peek at memory data in an arbitrary region of a Korean CDMA cell phone.
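
A minimal sketch of the logical-level idea (not the authors' tool, whose phone-side protocol is proprietary): walk the file system exposed by the device, then copy and hash each file to preserve evidential integrity. The paths in the usage comment are hypothetical.

```python
# A minimal sketch of logical acquisition: copy every file reachable through
# the mounted file system and record a SHA-256 digest per file.
import hashlib, os, shutil

def logical_acquire(mount_point, evidence_dir):
    manifest = []
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, mount_point)
            dst = os.path.join(evidence_dir, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)                 # preserve timestamps
            with open(src, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            manifest.append((rel, digest))
    return manifest  # (path, hash) pairs documenting what was acquired

# e.g. logical_acquire("/mnt/phone_efs", "./evidence")  # hypothetical paths
```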

Keywords: Forensics, logical method, acquisition, cell phone, flash memory.

8107 Heuristic Set-Covering-Based Postprocessing for Improving the Quine-McCluskey Method

Authors: Miloš Šeda

Abstract:

Finding minimal forms of logical functions has important applications in the design of logical circuits. This task is solved by many different methods, but these are frequently not suitable for computer implementation. We briefly summarise the well-known Quine-McCluskey method, which gives a unique computational procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem, which, unfortunately, is an NP-hard combinatorial problem. Therefore, it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
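
As a concrete illustration of the covering step, the following sketch (an assumption for illustration, not the paper's genetic algorithm) applies a greedy heuristic to a toy prime-implicant chart:

```python
# Greedy set covering over a Quine-McCluskey prime-implicant chart: each
# prime implicant is the set of minterms it covers; repeatedly pick the
# implicant covering the most still-uncovered minterms.

def greedy_cover(chart):
    """chart: dict mapping prime implicant name -> set of covered minterms."""
    uncovered = set().union(*chart.values())
    chosen = []
    while uncovered:
        best = max(chart, key=lambda p: len(chart[p] & uncovered))
        if not chart[best] & uncovered:
            break  # remaining minterms are uncoverable
        chosen.append(best)
        uncovered -= chart[best]
    return chosen

# Example chart for a 3-variable function (implicant -> covered minterms):
chart = {"a'b'": {0, 1}, "b'c": {1, 5}, "ab": {6, 7}, "ac": {5, 7}}
print(greedy_cover(chart))  # e.g. ["a'b'", "ab", "b'c"]
```

The greedy heuristic gives a feasible (not necessarily minimal) cover; the paper's genetic algorithm searches the same solution space more globally.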

Keywords: Boolean algebra, Karnaugh map, Quine-McCluskey method, set covering problem, genetic algorithm.

8106 Calculation of Inflation from Salaries Instead of Consumer Products: A Logical Exercise

Authors: E. Dahlen

Abstract:

Inflation can be calculated from either the prices of consumer products or from salaries. This paper presents a logical exercise showing that it is easier to calculate inflation from salaries than from consumer products. While the prices of consumer products may change due to technological advancement, such as automation, which must be corrected for, salaries do not. If technological advancements are not accounted for within calculations based on consumer product prices, inflation can be confused with real wage changes, since both inflation and real wage changes affect the prices of consumer products. The method employed in this paper is a logical exercise. Logical arguments are presented that suggest the existence of many different feasible ways by which inflation can be determined. A short mathematical exercise then shows that one of these methods, using salaries, contains the fewest unknown parameters and hence is the preferred method, since the risk of mistakes is lower. From the results, it can be concluded that salaries, rather than consumer products, should be used to calculate inflation.
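
One possible formalization of the parameter-count argument, under assumed notation that is illustrative and not taken from the paper: let $w_t$ be the nominal salary, $r_t$ the real wage index, $p_t$ the price level, $c_t$ a consumer product price and $a_t$ its technology/automation adjustment.

```latex
% Salary route: with w_t = r_t p_t, only the real-wage ratio is unknown.
\[
\pi_t \;=\; \frac{p_t}{p_{t-1}}
      \;=\; \frac{w_t / w_{t-1}}{r_t / r_{t-1}}
\qquad \text{(one unknown ratio, } r_t/r_{t-1}\text{)}
\]
% Product route: the observed price change mixes inflation with both the
% technology adjustment and real wage changes, as the abstract notes.
\[
\frac{c_t}{c_{t-1}} \;=\; \pi_t \cdot \frac{a_t}{a_{t-1}} \cdot \frac{r_t}{r_{t-1}}
\qquad \text{(two unknown ratios)}
\]
```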

Keywords: Inflation, logic, math, real wages.

8105 A New Approach for Flexible Document Categorization

Authors: Jebari Chaker, Ounelli Habib

Abstract:

In this paper we propose a new approach for flexible document categorization according to document type or genre rather than topic. Our approach combines two homogeneous classifiers: a contextual classifier and a logical classifier. The contextual classifier is based on the document URL, whereas the logical classifier uses the logical structure of the document to perform the categorization. The final categorization is obtained by combining the contextual and logical categorizations. In our approach, each document is assigned to all predefined categories with different membership degrees. Our experiments demonstrate that our approach outperforms other genre categorization approaches.
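
A minimal sketch of the combination step, assuming a simple weighted average of membership degrees (the abstract does not specify the combination operator):

```python
# Each classifier returns a membership degree in [0, 1] per category; the
# final categorization keeps a graded degree for every predefined category.

def combine(contextual, logical, alpha=0.5):
    """contextual, logical: dict category -> membership degree in [0, 1]."""
    cats = contextual.keys() | logical.keys()
    return {c: alpha * contextual.get(c, 0.0) + (1 - alpha) * logical.get(c, 0.0)
            for c in cats}

ctx = {"thesis": 0.7, "paper": 0.2}            # e.g. inferred from the URL
log = {"thesis": 0.9, "call_for_papers": 0.1}  # from the logical structure
print(combine(ctx, log))  # every category keeps a membership degree
```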

Keywords: Categorization, combination, flexible, logical structure, genre, category, URL.

8104 The Usefulness of Logical Structure in Flexible Document Categorization

Authors: Jebari Chaker, Ounalli Habib

Abstract:

This paper presents a new approach for automatic document categorization. Exploiting the logical structure of a document, our approach assigns an HTML document to one or more categories (thesis, paper, call for papers, email, ...). Using a set of training documents, our approach generates a set of rules used to categorize new documents. The flexibility of the approach is achieved by associating a weight with each rule, representing its importance in discriminating between possible categories. This weight is dynamically modified at each new document categorization. Experiments with the proposed approach give satisfactory results.
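
A minimal sketch of dynamic rule weighting, assuming a simple multiplicative reward/penalty update (the actual update scheme is not given in the abstract):

```python
class Rule:
    def __init__(self, predicate, category, weight=1.0):
        self.predicate = predicate   # function: document -> bool
        self.category = category
        self.weight = weight

def categorize(doc, rules):
    # sum the weights of the rules that fire for each category
    scores = {}
    for r in rules:
        if r.predicate(doc):
            scores[r.category] = scores.get(r.category, 0.0) + r.weight
    return max(scores, key=scores.get) if scores else None

def update_weights(doc, rules, true_category, lr=0.1):
    # strengthen rules that fired for the right category, weaken the others
    for r in rules:
        if r.predicate(doc):
            r.weight *= (1 + lr) if r.category == true_category else (1 - lr)
```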

Keywords: categorization rule, document categorization, flexible categorization, logical structure.

8103 Influence of Paralleled Capacitance Effect in Well-defined Multiple Value Logical Level System with Active Load

Authors: Chih Chin Yang, Yen Chun Lin, Hsiao Hsuan Cheng

Abstract:

Three similar negative differential resistance (NDR) profiles, each with a high peak-to-valley current density ratio (PVCDR) and a high peak current density (PCD), are developed in a single resonant tunneling electronic circuit (RTEC) element in this paper. The PCD and valley current density (VCD) values of the three NDR curves are all about 3.5 A and 0.8 A, respectively. The peak voltages of the NDR curves are 0.40 V, 0.82 V, and 1.35 V, and the valley voltages are 0.61 V, 1.07 V, and 1.69 V, respectively. All three NDR curves reach a PVCDR of about 4.4. The PCD of 3.5 A in the triple-PVCDR RTEC element is better than in other resonant tunneling device (RTD) elements. The high PVCDR follows from the low VCD of about 0.8 A, which is achieved by a suitable selection of resistors in the triple-PVCDR RTEC element. The low peak voltage, below 1.35 V, gives the triple-PVCDR RTEC element low power dissipation. The multiple value logical level (MVLL) system designed with the triple-PVCDR RTEC element provides equidistant logical levels. The logical levels of the MVLL system are about 0.2 V, 0.8 V, 1.5 V, and 2.2 V from low to high voltage, and then 2.2 V, 1.3 V, 0.8 V, and 0.2 V from high back to low voltage, within a half cycle of a sinusoid wave. The output levels of the four-level MVLL system are 0.3 V, 1.1 V, 1.7 V, and 2.6 V, which satisfies the NMP condition of a traditional two-bit system. The improved MVLL system with a paralleled capacitor exhibits four significant stable logical levels of about 220 mV, 223 mV, 228 mV, and 230 mV, with outstanding stability and articulation of the logical levels. The average holding time of the improved MVLL system is approximately 0.14 μs, fourfold that of the basic MVLL system. The function of the additional capacitor in the improved MVLL system is thus successfully demonstrated.

Keywords: Capacitance, Logical level, Constant current source

8102 Psychodidactic Strategies to Facilitate the Flow of Logical Thinking in the Preparation of Academic Documents

Authors: Deni Stincer Gomez, Zuraya Monroy Nasr, Luis Pérez Alvarez

Abstract:

The preparation of academic documents, such as theses, articles and research projects, is one of the requirements of higher education. These documents demand logical argumentative thinking, which students experience and execute with difficulty. To mitigate these difficulties we designed a thesis seminar, with which we have seven years of experience. It is taught in a graduate program in Psychology at the National Autonomous University of Mexico. In this seminar we use the Toulmin model as a mental heuristic, together with a set of psychodidactic strategies that facilitate the elaboration of the plot and the culmination of the thesis. The rate of obtaining the degree in the groups exposed to the seminar has risen to 94%, compared with the 10% observed in the generations that were not exposed to the seminar. In this article we emphasize the psychodidactic strategies used. The Toulmin model alone does not guarantee the success achieved; a set of actions of a psychological (almost psychotherapeutic) and didactic nature on the part of the teacher also seems to contribute. These actions derive from an understanding of the psychological, epistemological and ontogenetic obstacles, and of the most frequent errors into which thought tends to fall when a logical course is demanded of it. We have grouped the strategies into three groups: 1) strategies to facilitate logical thinking, 2) strategies to strengthen the scientific self, and 3) strategies to facilitate the act of writing the text. In this work we delve into each of them.

Keywords: psychodidactic strategies, logical thinking, academic documents, Toulmin model

8101 HaskellFL: A Tool for Detecting Logical Errors in Haskell

Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha

Abstract:

Understanding and using the functional paradigm is a challenge for many programmers. Looking for logical errors in code may take a lot of a developer's time when a program grows in size. In order to facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate a logical error in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying Functional Programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against Functional Programming assignments submitted by students enrolled in the Functional Programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. This work also evaluated the effectiveness of two fault localization techniques, Tarantula and Ochiai, in the Haskell context. Furthermore, the EXAM score was chosen to evaluate the tool's effectiveness, and results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.
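
For reference, a minimal sketch of the two suspiciousness metrics the paper evaluates, using their standard textbook formulas (Python is used here for brevity; HaskellFL itself targets Haskell code):

```python
# For each code line: ef/ep = failing/passing tests that execute it;
# nf/np = failing/passing tests that do not execute it.
from math import sqrt

def tarantula(ef, ep, nf, np):
    fail_rate = ef / (ef + nf) if ef + nf else 0.0
    pass_rate = ep / (ep + np) if ep + np else 0.0
    denom = fail_rate + pass_rate
    return fail_rate / denom if denom else 0.0

def ochiai(ef, ep, nf, np):
    denom = sqrt((ef + nf) * (ef + ep))
    return ef / denom if denom else 0.0

# A line run by every failing test and few passing tests scores near 1:
print(tarantula(ef=3, ep=1, nf=0, np=6))  # 0.875
print(ochiai(ef=3, ep=1, nf=0, np=6))     # ~0.866
```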

Keywords: Debug, fault localization, functional programming, Haskell.

8100 Generating Speq Rules based on Automatic Proof of Logical Equivalence

Authors: Katsunori Miura, Kiyoshi Akama, Hiroshi Mabuchi

Abstract:

In the Equivalent Transformation (ET) computation model, a program is constructed by the successive accumulation of ET rules. A meta-computation method by which correct ET rules are generated has been proposed. Although the method covers a broad range of ET rule generation, not all important ET rules are necessarily generated. More ET rules can be generated by supplementing it with generation methods specialized for important ET rules. A Specialization-by-Equation (Speq) rule is one such important rule. A Speq rule describes a procedure in which two variables included in an atom conjunction are equalized due to predicate constraints. In this paper, we propose an algorithm that systematically and recursively generates Speq rules, and we discuss its effectiveness in the synthesis of ET programs. A Speq rule is generated based on a proof of a logical formula consisting of a given atom set and a disequality; the proof is carried out by utilizing some ET rules, and the rules ultimately obtained are used in generating Speq rules.

Keywords: Equivalent transformation, ET rule, Equation of two variables, Rule generation, Specialization-by-Equation rule

8099 Content-based Retrieval of Medical Images

Authors: Lilac A. E. Al-Safadi

Abstract:

With the advance of multimedia and diagnostic imaging technologies, the number of radiographic images is constantly increasing. The medical field demands sophisticated systems for the search and retrieval of the produced multimedia documents. This paper presents ongoing research that focuses on the semantic content of radiographic image documents to facilitate semantic-based radiographic image indexing and retrieval. The proposed model divides a radiographic image document, based on its semantic content, into a logical structure and a semantic structure. The logical structure represents the overall organization of information. The semantic structure, which is bound to the logical structure, is composed of semantic objects with interrelationships in the various spaces of the radiographic image.

Keywords: Semantic Indexing, Content-Based Retrieval, Radiographic Images, Data Model

8098 Reasoning with Dynamic Domains and Computer Security

Authors: Yun Bai

Abstract:

Representing objects in a dynamic domain is essential in commonsense reasoning under some circumstances. Classical logics and their nonmonotonic consequences, however, are usually not able to deal with reasoning over dynamic domains, because every constant in the logical language denotes some existing object in a static domain. In this paper, we explore a logical formalization which allows us to represent nonexisting objects in commonsense reasoning. A formal system named N-theory is proposed for this purpose, and its possible application in computer security is briefly discussed.

Keywords: knowledge representation and reasoning, commonsense reasoning, computer security

8097 The Impact of Social Stratification on the Phenomenon of “Terrorism”

Authors: Rustamov Nasim, Roostamov Yunusbek

Abstract:

In this work, social stratification is considered as one of the significant factors that generate the phenomenon of “terrorism”, and the accent is placed on the correlational connection between them, with the objective of creating an info-logical model of the generation of the phenomenon of “terrorism” based on the stratification process.

Keywords: Social stratification, stratification process, generation of phenomena “terrorism”, conceptions – “terror”, “terrorize” and “terrorism”, info-logical model of phenomena of “terrorism”.

8096 Enhancing Security in Resource Sharing Using Key Holding Mechanism

Authors: M. Victor Jose, V. Seenivasagam

Abstract:

This paper describes a logical method to enhance security in grid computing and restrict the misuse of grid resources. The method is an economical and efficient one that avoids the use of special devices. The security issues, techniques and solutions needed to provide a secure grid computing environment are described. A well-defined process for security management of resource accesses and a key holding algorithm are also proposed. In this method, identity management, access control, and authorization and authentication are effectively handled.

Keywords: Grid security, Irregular binary series, Key holding mechanism, Resource identity, Secure resource access.

8095 Extended Deductive Databases with Uncertain Information

Authors: Daniel Stamate

Abstract:

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones, true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and has polynomial time data complexity.
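
A minimal sketch of degree propagation, assuming min/max connectives over truth degrees in [0, 1] (one common choice; the paper's actual algebra of logical values may differ):

```python
# A rule head is derived with degree min(body degrees); alternative
# derivations combine with max. Iterating to a fixpoint propagates
# degrees through the database rules.

def fixpoint(facts, rules):
    """facts: dict atom -> degree; rules: list of (head, [body atoms])."""
    degrees = dict(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            d = min((degrees.get(b, 0.0) for b in body), default=0.0)
            if d > degrees.get(head, 0.0):
                degrees[head] = d  # max over alternative derivations
                changed = True
    return degrees

facts = {"reliable(s1)": 0.8, "recent(s1)": 0.6}
rules = [("useful(s1)", ["reliable(s1)", "recent(s1)"])]
print(fixpoint(facts, rules))  # useful(s1) gets degree min(0.8, 0.6) = 0.6
```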

Keywords: Reasoning under uncertainty, multivalued logics, deductive databases, logic programs, multivalued semantics.

8094 Integrated Subset Split for Balancing Network Utilization and Quality of Routing

Authors: S. V. Kasmir Raja, P. Herbert Raj

Abstract:

The overlay approach has been widely used by many service providers for Traffic Engineering (TE) in large Internet backbones. In the overlay approach, logical connections are set up between edge nodes to form a full-mesh virtual network on top of the physical topology. IP routing is then run over the virtual network. Traffic engineering objectives are achieved by carefully routing the logical connections over the physical links. Although the overlay approach has been implemented in many operational networks, it has a number of well-known scaling issues. This paper proposes a new approach that achieves traffic engineering without full-mesh overlaying, with the help of an integrated approach and an equal subset split method. Traffic engineering needs to determine the optimal routing of traffic over the existing network infrastructure by efficiently allocating resources in order to optimize traffic performance on an IP network. Even though constraint-based routing [1] in Multi-Protocol Label Switching (MPLS) was developed to address this need, it is not widely tested or debugged, so Internet Service Providers (ISPs) resort to TE methods under Open Shortest Path First (OSPF), the most commonly used intra-domain routing protocol. Determining OSPF link weights for optimal network performance is an NP-hard problem. As this problem cannot be solved exactly, we present a subset split method that improves efficiency and performance by minimizing the maximum link utilization in the network via a small number of link weight modifications. The results of this method are compared against the results of the MPLS architecture [9] and of other heuristic methods.
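
A minimal sketch of the underlying idea of weight tuning, not the paper's subset split algorithm: route demands on OSPF shortest paths, then raise the weight of the most utilized link and re-route, repeating a few times.

```python
import networkx as nx

def max_utilization(G, demands):
    # accumulate the load placed on each link by shortest-path routing
    load = {tuple(sorted(e)): 0.0 for e in G.edges}
    for s, t, vol in demands:
        path = nx.shortest_path(G, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            load[tuple(sorted((u, v)))] += vol
    return max(load.items(), key=lambda kv: kv[1] / G.edges[kv[0]]["cap"])

def tune(G, demands, steps=10):
    for _ in range(steps):
        (u, v), _vol = max_utilization(G, demands)
        G.edges[u, v]["weight"] += 1  # push traffic off the hottest link

G = nx.Graph()
for u, v, cap in [("a", "b", 10), ("b", "c", 10), ("a", "c", 10)]:
    G.add_edge(u, v, weight=1, cap=cap)
tune(G, demands=[("a", "c", 8), ("a", "c", 4)])
```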

Keywords: Constraint based routing, Link Utilization, Subset split method, Traffic Engineering.

8093 Dynamic Attribute Dependencies in Relational Attribute Grammars

Authors: K. Barbar, M. Dehayni, A. Awada, M. Smaili

Abstract:

Considering the theory of attribute grammars, we use logical formulas instead of traditional functional semantic rules. During the decoration of a derivation tree, a suitable algorithm should maintain the consistency of the formulas together with the evaluation of the attributes. This may be a Prolog-like resolution, but this paper examines a somewhat different strategy, based on production specialization, local consistency and propagation: given a derivation tree, it is decorated interactively, i.e. incrementally checked and evaluated. The non-directed dependencies are directed dynamically during attribute evaluation.

Keywords: Input/Output attribute grammars, local consistency, logical programming, propagation, relational attribute grammars.

8092 Towards Security in Virtualization of SDN

Authors: Wanqing You, Kai Qian, Xi He, Ying Qian

Abstract:

In this paper, the potential security issues brought about by the virtualization of Software Defined Networks (SDN) are analyzed. The virtualization of SDN is achieved by FlowVisor (FV). With FV, a physical network is divided into multiple isolated logical networks while the underlying resources are still shared by the different slices (isolated logical networks). However, along with the benefits brought by network virtualization, it also presents some security issues. By examining the security issues existing in an OpenFlow network that uses FlowVisor to slice it into multiple virtual networks, we hope to obtain some significant results and to stimulate further discussion of the security of SDN virtualization.

Keywords: FlowVisor, Network virtualization, Potential threats, Possible solutions.

8091 Fuzzy Metric Approach for Fuzzy Time Series Forecasting based on Frequency Density Based Partitioning

Authors: Tahseen Ahmed Jilani, Syed Muhammad Aqil Burney, C. Ardil

Abstract:

In the last 15 years, a number of methods have been proposed for forecasting based on fuzzy time series. Most fuzzy time series methods are presented for forecasting enrollments at the University of Alabama. However, the forecasting accuracy rates of the existing methods are not good enough. In this paper, we compare our proposed new method of fuzzy time series forecasting with the existing methods. Our method is based on frequency-density-based partitioning of the historical enrollment data. The proposed method belongs to the kth-order and time-variant methods. The proposed method achieves a better forecasting accuracy rate for enrollments than the existing methods.
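
A minimal sketch of frequency-density-based partitioning, assuming it amounts to quantile-style intervals: each interval of the universe of discourse holds roughly the same number of historical observations, so dense regions get narrower intervals.

```python
def frequency_density_partition(data, n_intervals):
    # cut the sorted data at (approximately) equal-count positions
    xs = sorted(data)
    per = len(xs) / n_intervals
    cuts = ([xs[0]]
            + [xs[int(round(per * k)) - 1] for k in range(1, n_intervals)]
            + [xs[-1]])
    return list(zip(cuts, cuts[1:]))  # [(lo, hi), ...] fuzzy-set supports

# e.g. the classical University of Alabama enrollment series (1971-1980):
enrollments = [13055, 13563, 13867, 14696, 15460, 15311,
               15603, 15861, 16807, 16919]
print(frequency_density_partition(enrollments, 4))
```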

Keywords: Fuzzy logical groups, fuzzified enrollments, fuzzy sets, fuzzy time series.

8090 A Fuzzy Time Series Forecasting Model for Multi-Variate Forecasting Analysis with Fuzzy C-Means Clustering

Authors: Emrah Bulut, Okan Duru, Shigeru Yoshida

Abstract:

In this study, the fuzzy integrated logical forecasting method (FILF) is extended to multi-variate systems by using a vector autoregressive model. The fuzzy time series forecasting (FTSF) method was introduced by Song and Chissom [1]-[2] and later improved by Chen. Unlike the existing literature, the proposed model is compared not only with the previous FTS models, but also with conventional time series methods such as the classical vector autoregressive model. The cluster optimization is based on the C-means clustering method. An empirical study is performed for the prediction of the chartering rates of a group of dry bulk cargo ships. The root mean squared error (RMSE) metric is used to compare the results of the methods, and the proposed method outperforms both the traditional FTS methods and the classical time series methods.

Keywords: C-means clustering, Fuzzy time series, Multi-variate design

8089 Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression

Authors: Amarunnishad T.M., Govindan V.K., Abraham T. Mathew

Abstract:

An image compression method has been developed using a fuzzy edge image with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image has been validated against classical edge detectors on the basis of the results of the well-known Canny edge detector before being applied in the proposed method. The bit plane generated by the conventional BTC method is replaced with the fuzzy bit plane generated by a logical OR operation between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the standard deviation and the fuzzy bit plane. The proposed method has been tested on test images of 8 bits/pixel and size 512×512 and found to be superior, with a better Peak Signal to Noise Ratio (PSNR), compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. Raggedness, jagged appearance and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
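
A minimal numpy sketch of BTC with an OR-modified bit plane. The edge mask below is just a placeholder boolean array; the paper derives it with a fuzzy edge detector validated against Canny.

```python
import numpy as np

def btc_block(block, edge_bits):
    mean, std = block.mean(), block.std()
    bits = (block >= mean) | edge_bits      # fuzzy bit plane via logical OR
    q = int(bits.sum()); m = bits.size
    if q in (0, m):
        return np.full_like(block, mean)
    lo = mean - std * np.sqrt(q / (m - q))  # standard BTC reconstruction
    hi = mean + std * np.sqrt((m - q) / q)  #   levels preserving mean & std
    return np.where(bits, hi, lo)

block = np.array([[10., 12., 90., 95.],
                  [11., 13., 92., 96.],
                  [10., 14., 91., 94.],
                  [12., 11., 93., 97.]])
edges = np.zeros_like(block, dtype=bool)    # placeholder edge mask
print(btc_block(block, edges).round(1))
```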

Keywords: Image compression, Edge detection, Ground truth image, Peak signal to noise ratio

8088 A New History Based Method to Handle the Recurring Concept Shifts in Data Streams

Authors: Hossein Morshedlou, Ahmad Abdollahzade Barforoush

Abstract:

Recent developments in storage technology and networking architectures have made it possible for broad areas of applications to rely on data streams for quick response and accurate decision making. Data streams are generated from real-world events, so it is logical that associations among the occurrences of these events carry over to associations among the concepts of the data streams. Extracting these hidden associations can be useful for predicting subsequent concepts in concept-shifting data streams. In this paper, we present a new method for learning associations among the concepts of a data stream and for predicting which concept will occur next. Knowing the next concept, an informed update of the data model becomes possible. The results of the conducted experiments show that the proposed method is suitable for the classification of concept-shifting data streams.
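
A minimal sketch, assuming a first-order transition model over the history of observed concepts (the paper's association-learning scheme may be richer):

```python
from collections import Counter, defaultdict

class ConceptHistory:
    def __init__(self):
        self.transitions = defaultdict(Counter)  # concept -> Counter(next)
        self.current = None

    def observe(self, concept):
        # record which concept followed the current one
        if self.current is not None:
            self.transitions[self.current][concept] += 1
        self.current = concept

    def predict_next(self):
        nxt = self.transitions.get(self.current)
        return nxt.most_common(1)[0][0] if nxt else None

h = ConceptHistory()
for c in ["sunny", "rainy", "sunny", "rainy", "sunny"]:
    h.observe(c)
print(h.predict_next())  # 'rainy': the concept that usually follows 'sunny'
```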

Keywords: Data Stream, Classification, Concept Shift, History.

8087 Compromise Ratio Method for Decision Making under Fuzzy Environment using Fuzzy Distance Measure

Authors: Debashree Guha, Debjani Chakraborty

Abstract:

The aim of this paper is to adopt a compromise ratio (CR) methodology for the fuzzy multi-attribute single-expert decision making problem. In this paper, the rating of each alternative is described by linguistic terms, which can be expressed as triangular fuzzy numbers. The compromise ratio method for fuzzy multi-attribute single-expert decision making is considered here, with a ranking index based on the concept that the chosen alternative should be as close as possible to the ideal solution and, simultaneously, as far away as possible from the negative-ideal solution. From a logical point of view, the distance between two triangular fuzzy numbers is also a fuzzy number, not a crisp value. Therefore a fuzzy distance measure, which is itself a fuzzy number, is used here to calculate the difference between two triangular fuzzy numbers. In this paper, with the help of this fuzzy distance measure, it is shown that the compromise ratio is a fuzzy number, which eases the decision maker's task of reaching a decision. The computation principle and procedure of the compromise ratio method are described in detail. A comparative analysis of the previously proposed compromise ratio method [1] and the newly adopted method is illustrated with two numerical examples.
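
A minimal sketch of a fuzzy distance between triangular fuzzy numbers A = (a1, a2, a3) and B = (b1, b2, b3): the distance is itself a TFN built from the spread of pointwise differences. This simple construction is an assumption for illustration; the paper's measure may be defined differently.

```python
def tfn_distance(a, b):
    # all pairwise gaps between the defining points of the two TFNs
    diffs = [abs(a[i] - b[j]) for i in range(3) for j in range(3)]
    return (min(diffs), abs(a[1] - b[1]), max(diffs))  # (lo, mode, hi)

poor = (0.0, 1.0, 3.0)           # linguistic term "poor" as a TFN
good = (5.0, 7.0, 9.0)           # linguistic term "good" as a TFN
print(tfn_distance(poor, good))  # (2.0, 6.0, 9.0): a fuzzy, not crisp, distance
```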

Keywords: Compromise ratio method, Fuzzy multi-attribute single-expert decision making, Fuzzy number, Linguistic variable

8086 On Algebraic Structure of Improved Gauss-Seidel Iteration

Authors: O. M. Bamigbola, A. A. Ibrahim

Abstract:

Analysis of real-life problems often results in linear systems of equations for which solutions are sought. The method to employ depends, to some extent, on the properties of the coefficient matrix. It is not always feasible to solve linear systems of equations by direct methods; in such cases the need to use an iterative method becomes imperative. Before an iterative method can be employed to solve a linear system of equations, there must be a guaranty that the process of solution will converge. This guaranty, which must be established a priori, involves the use of some criterion expressible in terms of the entries of the coefficient matrix. It is, therefore, logical that the convergence criterion should depend implicitly on the algebraic structure of the method. In contrast to this view, however, is the practice of conducting convergence analysis for the Gauss-Seidel iteration with a criterion formulated from the algebraic structure of the Jacobi iteration. To remedy this anomaly, the Gauss-Seidel iteration was studied for its algebraic structure and, contrary to the usual assumption, it was discovered that the iteration matrix of the Gauss-Seidel method is diagonally dominant only in its first row, while the other rows do not satisfy diagonal dominance. With the aid of this structure we herein fashion an improved version of the Gauss-Seidel iteration, with the prospect of enhancing the convergence and robustness of the method. A numerical section is included to demonstrate the validity of the theoretical results obtained for the improved Gauss-Seidel method.
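
For reference, a minimal sketch of the classical Gauss-Seidel iteration that the paper improves upon (the improved splitting itself is not reproduced here):

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # newest values x[:i] are used immediately (unlike Jacobi)
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x_old[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4., 1., 0.], [1., 4., 1.], [0., 1., 4.]])  # diagonally dominant
b = np.array([1., 2., 3.])
print(gauss_seidel(A, b))  # converges to the solution of Ax = b
```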

Keywords: Linear system of equations, Gauss-Seidel iteration, algebraic structure, convergence.

8085 Using Dempster-Shafer Theory in XML Information Retrieval

Authors: F. Raja, M. Rahgozar, F. Oroumchian

Abstract:

XML is a markup language that is becoming the standard format for information representation and data exchange. A major purpose of XML is the explicit representation of the logical structure of a document. Much research has been performed to exploit the logical structure of documents in information retrieval, in order to precisely extract the user's information need from large collections of XML documents. In this paper, we describe an XML information retrieval weighting scheme that tries to find the most relevant elements in XML documents in response to a user query. We present this weighting model for information retrieval systems that utilize plausible inferences to infer the relevance of elements in XML documents. We also add to this model the Dempster-Shafer theory of evidence, to express the uncertainty in plausible inferences, and the Dempster-Shafer rule of combination, to combine evidence derived from different inferences.
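
A minimal sketch of Dempster's rule of combination for two mass functions over a frame of discernment, the standard operation used to merge evidence from different inferences (the frame and masses below are illustrative assumptions):

```python
def combine(m1, m2):
    # focal elements are frozensets; empty intersections are conflict mass
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}  # normalize away conflict

rel, irr = frozenset({"rel"}), frozenset({"irr"})
theta = rel | irr                          # total ignorance
m1 = {rel: 0.6, theta: 0.4}                # evidence from inference 1
m2 = {rel: 0.5, irr: 0.2, theta: 0.3}      # evidence from inference 2
print(combine(m1, m2))                     # pooled belief masses
```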

Keywords: Dempster-Shafer theory, plausible inferences, XML information retrieval.

8084 Quantum Dot Cellular Automata Based Effective Design of Combinational and Sequential Logical Structures

Authors: Hema Sandhya Jagarlamudi, Mousumi Saha, Pavan Kumar Jagarlamudi

Abstract:

The use of quantum dots is a promising emerging technology for implementing digital systems at the nano level, offering attractive features such as higher speed, smaller size and lower power consumption than transistor technology. In this paper, various combinational and sequential logical structures (half adder, SR latch, flip-flop, D flip-flop, and NAND, NOR, XOR and XNOR gates) are discussed based on QCA design, with comparatively fewer cells and less area. By applying these layouts, the hardware requirements of a QCA design can be reduced. These structures are designed and simulated using the QCA Designer tool. By taking full advantage of the unique features of this technology, we are able to create complete circuits on a single layer of QCA. Such devices are expected to function with ultra-low power consumption and very high speed.
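
A minimal logic-level sketch of why the majority gate (see the keywords) is the QCA workhorse: fixing one input gives AND or OR, and with an inverter the set is functionally complete, so structures like the half adder reduce to majority gates.

```python
def maj(a, b, c):
    return int(a + b + c >= 2)   # 3-input majority vote

def and_(a, b): return maj(a, b, 0)   # MAJ with a 0-polarized input
def or_(a, b):  return maj(a, b, 1)   # MAJ with a 1-polarized input
def not_(a):    return 1 - a          # QCA inverter cell
def xor(a, b):  return or_(and_(a, not_(b)), and_(not_(a), b))

def half_adder(a, b):
    return xor(a, b), and_(a, b)      # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```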

Keywords: QCA, QCA Designer, Clock, Majority Gate

8083 A Preliminary Study for Design of Automatic Block Reallocation Algorithm with Genetic Algorithm Method in the Land Consolidation Projects

Authors: Tayfun Çay, Yaşar İnceyol, Abdurrahman Özbeyaz

Abstract:

Land reallocation is one of the most important steps in land consolidation projects. Many different models have been proposed for land reallocation in the literature, such as fuzzy logic, block-priority-based land reallocation and spatial decision support systems. A model comprising four parts is considered for automatic block reallocation with the genetic algorithm method in land consolidation projects. These stages are, respectively: preparing the data tables for a project area; determining the conditions and constraints of land reallocation; designing the command steps and logical flow chart of the reallocation algorithm; and finally writing the program code of the genetic algorithm. In this study, we designed the first three of the model's four steps.

Keywords: Genetic algorithm, land consolidation, landholding, land reallocation.

8082 A New Fuzzy Decision Support Method for Analysis of Economic Factors of Turkey's Construction Industry

Authors: R. Tur, A. Yardımcı

Abstract:

Imperfect knowledge cannot always be avoided. Imperfections may take several forms: uncertainty, imprecision and incompleteness. Among the methods for the management of imperfect knowledge are fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic, and depends on the nature of the problem to be solved, for example decision support, which is considered in our study. Fuzzy logic is used for its ability to manage imprecise knowledge, but it can also take advantage of the ability of neural networks to learn coefficients or functions. Such an association of methods is typical of so-called soft computing. In this study, a new method was used for the management of imprecision in the collected knowledge related to the economic analysis of the construction industry in Turkey. Sudden changes in economic factors decrease the competitive strength of construction companies. A better evaluation of these changes in economic factors, from the viewpoint of the construction industry, will positively influence the decisions of companies engaged in construction.

Keywords: Fuzzy logic, decision support systems, construction industry.

8081 RF Power Consumption Emulation Optimized with Interval Valued Homotopies

Authors: Deogratius Musiige, François Anton, Vital Yatskevich, Laulagnet Vincent, Darka Mioc, Nguyen Pierre

Abstract:

This paper presents a methodology for emulating the electrical power consumption of the RF device during cellular phone/handset transmission using the LTE technology. The emulation methodology takes the physical environmental variables and the logical interface between the baseband and the RF system as inputs to compute the emulated power dissipation of the RF device. The emulated power between the measured points, corresponding to the discrete values of the logical interface parameters, is computed by polynomial interpolation using polynomial basis functions. The evaluation of polynomial and spline curve-fitting models showed a respective divergence (test error) of 8% and 0.02% from the physically measured power consumption. The precisions of the instruments used for the physical measurements have been modeled as intervals. We have been able to model the power consumption of the RF device operating at 5 MHz using a homotopy between the two continuous power consumption curves of the RF device operating at the bandwidths 3 MHz and 10 MHz.
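
A minimal sketch of the homotopy idea: blend two fitted power curves, P3 at 3 MHz and P10 at 10 MHz, into an estimate at 5 MHz. The linear blend and the toy polynomial coefficients below are assumptions for illustration, not measured data.

```python
import numpy as np

p3 = np.poly1d([0.02, 0.5, 10.0])    # assumed fitted curve at 3 MHz
p10 = np.poly1d([0.03, 0.8, 12.0])   # assumed fitted curve at 10 MHz

def homotopy(x, bw, lo=3.0, hi=10.0):
    t = (bw - lo) / (hi - lo)        # t in [0, 1]: path between the curves
    return (1 - t) * p3(x) + t * p10(x)

tx_power_dbm = 15.0
print(homotopy(tx_power_dbm, bw=5.0))  # emulated consumption at 5 MHz
```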

Keywords: Radio frequency, high power amplifier, baseband, LTE, power, emulation, homotopy, interval analysis, Tx power, register-transfer level.

8080 Unified Method to Block Pornographic Images in Websites

Authors: Sakthi Priya Balaji R., Vijayendar G.

Abstract:

This paper proposes a technique to block adult images displayed on websites. The filter is designed to perform even in exceptional cases, such as where face detection is not possible or face visibility is poor. This is achieved by using an alternative phase that extracts the MFC (Most Frequent Color) from the human body regions estimated using a biometric of anthropometric distances between fixed, rigidly connected body locations. The logical results generated can be protected from overriding by a firewall or intrusion by encrypting the result in an SSH data packet.
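
A minimal sketch of extracting the most frequent color from an estimated body region, assuming an RGB image quantized to coarse bins (the paper's exact quantization is not specified):

```python
import numpy as np

def most_frequent_color(region, bins=8):
    """region: H x W x 3 uint8 array; returns the dominant quantized RGB."""
    q = (region // (256 // bins)).reshape(-1, 3).astype(int)  # quantize
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]  # pack to 1 code
    mfc = np.bincount(codes).argmax()
    r, g, b = mfc // (bins * bins), (mfc // bins) % bins, mfc % bins
    scale = 256 // bins
    return (r * scale, g * scale, b * scale)

region = np.full((10, 10, 3), (210, 160, 140), dtype=np.uint8)  # skin-ish patch
print(most_frequent_color(region))  # dominant color bin of the region
```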

Keywords: Face detection, characteristics extraction and classification, Component based shape analysis and classification, open source SSH V2 protocol

8079 A Computer Aided Model for Supporting Design Education

Authors: Leyla Y. Tokman, Rusen Yamaçlı

Abstract:

Educating effective architect-designers is an important goal of architectural education. But what contributes to students' performance, and to critical and creative thinking, in architectural design education? Besides teaching architecture students how to understand logical arguments, eliminate inadequate solutions and focus on the correct ones, it is also crucial to teach students how to explore ideas and alternative solutions, seeking other right answers rather than just one. This paper focuses on enhancing architectural design education and may provide implications for enhancing the teaching of design.

Keywords: Architectural education, design studio, teaching method, GUI-Graphical User Interface.
