Search results for: Chronological and Thematic Information Representation.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4411

4171 Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part I: Modeling

Authors: Wayan F. Mahmudy, Romeo M. Marian, Lee H. S. Luong

Abstract:

This paper and its companion (Part 2) deal with the modeling and optimization of two NP-hard problems in the production planning of flexible manufacturing systems (FMS): the part type selection problem and the loading problem. The two problems are strongly related and heavily influence the system's efficiency and productivity. The problems become even more complex when operational flexibilities, such as the possibility of processing an operation on alternative machines with alternative tools, are considered. The problems are modeled and solved simultaneously using real-coded genetic algorithms (RCGA), which use an array of real numbers as the chromosome representation. These real numbers are converted into a part type sequence and the machines used to process the part types. This first part focuses on modeling the problems and discusses how the novel chromosome representation can be applied to solve them. The second part will discuss the effectiveness of the RCGA in solving various test bed problems.
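
As an illustration of how such a chromosome might work, the sketch below decodes an array of real numbers into a part type sequence and machine assignments in the random-key style; the part/machine data and the decoding rule are hypothetical stand-ins, not the authors' exact scheme.

```python
import numpy as np

# Hypothetical problem data: 4 part types, each processable on a set of machines.
alternative_machines = {0: [0, 1], 1: [1, 2], 2: [0, 2], 3: [1]}

def decode(chromosome, alternative_machines):
    """Decode a real-valued chromosome (random-key style) into a part type
    sequence and a machine choice for each part type."""
    n = len(alternative_machines)
    seq_keys = chromosome[:n]           # first half orders the part types
    mach_keys = chromosome[n:]          # second half selects machines
    sequence = list(np.argsort(seq_keys))   # smaller key = processed earlier
    assignment = {}
    for part, key in enumerate(mach_keys):
        machines = alternative_machines[part]
        # map a key in [0, 1) onto an index into the alternative machine list
        assignment[part] = machines[int(key * len(machines)) % len(machines)]
    return sequence, assignment

rng = np.random.default_rng(0)
chromosome = rng.random(2 * len(alternative_machines))
print(decode(chromosome, alternative_machines))
```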

Keywords: Flexible manufacturing system, production planning, part type selection problem, loading problem, real-coded genetic algorithm

4170 Deep Learning Based, End-to-End Metaphor Detection in Greek with Recurrent and Convolutional Neural Networks

Authors: Konstantinos Perifanos, Eirini Florou, Dionysis Goutsos

Abstract:

This paper presents and benchmarks a number of end-to-end Deep Learning based models for metaphor detection in Greek. We bring Convolutional Neural Networks and Recurrent Neural Networks, combined with representation learning, to bear on the metaphor detection problem for the Greek language. The models presented achieve exceptional accuracy scores, significantly improving on the previous state-of-the-art results, which had already reached an accuracy of 0.82. Furthermore, no special preprocessing, feature engineering or linguistic knowledge is used in this work. The methods presented achieve an accuracy of 0.92 and an F-score of 0.92 with Convolutional Neural Networks (CNNs) and bidirectional Long Short Term Memory networks (LSTMs). Comparable results of 0.91 accuracy and 0.91 F-score are also achieved with bidirectional Gated Recurrent Units (GRUs) and Convolutional Recurrent Neural Nets (CRNNs). The models are trained and evaluated only on the basis of training tuples, the related sentences and their labels. The outcome is a state-of-the-art collection of metaphor detection models, trained on limited labelled resources, which can be extended to other languages and similar tasks.
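
A minimal PyTorch sketch of a bidirectional LSTM sentence classifier of the kind benchmarked here; the vocabulary size, dimensions and dummy batch are placeholder assumptions, and the Greek tokenization pipeline is omitted.

```python
import torch
import torch.nn as nn

class BiLSTMMetaphorClassifier(nn.Module):
    """Bidirectional LSTM over token embeddings, binary metaphor/literal output."""
    def __init__(self, vocab_size=20000, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)     # (batch, seq, embed_dim)
        _, (h_n, _) = self.lstm(embedded)        # h_n: (2, batch, hidden_dim)
        sentence_repr = torch.cat([h_n[0], h_n[1]], dim=1)
        return self.classifier(sentence_repr)

model = BiLSTMMetaphorClassifier()
dummy_batch = torch.randint(1, 20000, (8, 25))   # 8 sentences, 25 tokens each
logits = model(dummy_batch)
print(logits.shape)                              # torch.Size([8, 2])
```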

Keywords: Metaphor detection, deep learning, representation learning, embeddings.

4169 Information Resource Management Maturity Model

Authors: Afshari H., Khosravi Sh.

Abstract:

Nowadays there are more than thirty maturity models in different knowledge areas. A maturity model helps organizations find out where they stand in a specific knowledge area and how to improve. As Information Resource Management (IRM) is the concept that information is a major corporate resource and must be managed using the same basic principles used to manage other assets, assessing the current IRM status and revealing points for improvement can play a critical role in developing an appropriate information structure in organizations. In this paper we propose a framework for an information resource management maturity model (IRM3) that includes ten best practices for assessing the maturity of an organization's IRM.

Keywords: Information resource management (IRM), information resource management maturity model (IRM3), maturity model, best practice.

4168 Positive Analysis on Vulnerability, Information Security Incidents, and the Countermeasures of Japanese Internet Service Providers

Authors: Toshihiko Takemura, Makoto Osajima, Masatoshi Kawano

Abstract:

This paper presents a positive analysis that quantitatively captures the relationships among vulnerability, information security incidents, and countermeasures, using data from a 2007 questionnaire survey of Japanese ISPs (Internet Service Providers). Logistic regression analysis is used to examine these relationships. The results clarify that relationships exist between information security incidents and the countermeasures. Concretely, there is a positive relationship between information security incidents and the number of information security systems introduced, as well as a negative relationship between information security incidents and information security education. It is also pointed out that ISPs (especially local ones) do not implement efficient information security countermeasures or investment concerned with systems, and it is suggested that they should actively pursue information security education. In addition, to further raise the information security level of the Japanese telecommunication infrastructure, we argue that government policy supporting the countermeasures of ISPs is both necessary and important.
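
A hedged sketch of the kind of logistic regression used in such an analysis, run on synthetic survey-style data; the variable names and coefficients are illustrative assumptions, not the survey's actual data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
# Hypothetical survey variables per ISP.
num_security_systems = rng.integers(0, 10, n)
security_education = rng.integers(0, 2, n)    # 1 = education programme in place
# Synthetic incident indicator with the signs reported in the paper:
# more systems -> incidents more likely observed, education -> fewer incidents.
logit = -0.5 + 0.3 * num_security_systems - 1.0 * security_education
incident = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([num_security_systems, security_education])
model = LogisticRegression().fit(X, incident)
print(dict(zip(["systems", "education"], model.coef_[0])))
```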

Keywords: Information security countermeasures, information security incidents, internet service providers, positive analysis

4167 A High Bitrate Information Hiding Algorithm for Video in Video

Authors: Wang Shou-Dao, Xiao Chuang-Bai, Lin Yu

Abstract:

In high bitrate information hiding techniques, one bit is embedded within each 4 x 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can then be effectively extracted at the terminal end. In this paper high bitrate information hiding algorithms are summarized, and a video-in-video scheme is implemented. Experimental results show that the host video, in which a large amount of auxiliary information is embedded, suffers little visual quality decline: the luminance Peak Signal to Noise Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average. Meanwhile, the hidden information has a high survival rate and remains robust under H.264/AVC compression, with an average Bit Error Rate (BER) of 0.015%.
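
For illustration, the sketch below embeds one bit in a 4 x 4 block by quantizing a single DCT coefficient (QIM-style) and reports the PSNR of the modified block; it is a simplified stand-in for the vector-quantization scheme described, with assumed coefficient position and step size.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, coeff=(2, 2), step=8.0):
    """Embed one bit in a 4x4 pixel block by quantising one DCT coefficient
    to an even (bit 0) or odd (bit 1) multiple of `step` (QIM-style)."""
    C = dctn(np.asarray(block, float), norm='ortho')
    q = int(np.round(C[coeff] / step))
    if q % 2 != bit:
        q += 1
    C[coeff] = q * step
    return idctn(C, norm='ortho')

def extract_bit(block, coeff=(2, 2), step=8.0):
    C = dctn(np.asarray(block, float), norm='ortho')
    return int(np.round(C[coeff] / step)) % 2

def psnr(original, modified):
    mse = np.mean((np.asarray(original, float) - modified) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(0)
host_block = rng.integers(0, 256, (4, 4))
marked = embed_bit(host_block, 1)
print(extract_bit(marked), psnr(host_block, marked))
```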

Keywords: Information Hiding, Embed, Quantization, Extract

4166 The Quality of Accounting Information of Private Companies in the Czech Republic

Authors: Kateřina Struhařová

Abstract:

The paper gives evidence on the quality of accounting information of Czech private companies. In general, private companies in the Czech Republic do not see the benefits of providing high-quality accounting information. Research on the financial statements of entrepreneurs and companies in the Zlin region confirmed that the quality of accounting information differs among private entities, and that the main drivers of accounting information quality are whether the financial statements are audited and the size of the entity. Foreign shareholders and lenders also have some impact on accounting information quality.

Keywords: Accounting information quality, Financial Statements, Czech Republic.

4165 ISCS (Information Security Check Service) for the Safety and Reliability of Communications

Authors: Jong-Whoi Shin, Jin-Tae Lee, Sang-Soo Jang, Jae-Il Lee

Abstract:

The recent widespread use of information and communication technology has greatly changed the information security risks that businesses and institutions encounter. In this situation, to ensure security and build confidence in electronic trading, it has become important for organizations to take competent information security measures that provide international confidence that sensitive information is secure. Against this backdrop, the approach to information security checking has become an important issue, which is believed to be common to all countries. The purpose of this paper is to introduce the new information security checking program in Korea and to propose synthetic information security countermeasures under domestic circumstances in order to protect physical equipment, security management and technology, and the operation of security checks for securing services on ISPs (Internet Service Providers), IDCs (Internet Data Centers), and e-commerce sites (shopping malls, etc.).

Keywords: Information Security Check Service, safety criteria, object enterpriser.

4164 Evaluation Method for Information Security Levels of CIIP (Critical Information Infrastructure Protection)

Authors: Soon-Tai Park, Jong-Whoi Shin, Bog-Ki Min, Ik-Sub Lee, Gang-Shin Lee, Jae-Il Lee

Abstract:

As the information age matures, major social infrastructures such as communication, finance, military and energy have become ever more dependent on information communication systems. Since these infrastructures are connected to the Internet, electronic intrusions such as hacking and viruses have become a new security threat. In particular, the disturbance or neutralization of a major social infrastructure can result in extensive material damage and social disorder. To address this issue, many nations around the world are researching and developing various techniques and information security policies as a government-wide effort to protect their infrastructures from newly emerging threats. This paper proposes an evaluation method for the information security level of CIIP (Critical Information Infrastructure Protection), which can enhance the security level of critical information infrastructure by checking the current security status and establishing security measures accordingly to protect infrastructures effectively.

Keywords: Information Security Evaluation Methodology, Critical Information Infrastructure Protection.

4163 Treatment of Chrome Tannery Wastewater by Biological Process - A Mini Review

Authors: Supriyo Goswami, Debabrata Mazumder

Abstract:

Chrome tannery wastewater poses a serious environmental hazard due to its high pollution potential. As a result, rigorous treatment is necessary to abate pollution from this type of wastewater. Many research studies have addressed chrome tannery wastewater treatment by physical, chemical, and biological methods. In general, biological treatment is found ineffective for direct application because of the adverse effects of toxic chromium, sulphide, chloride, etc. However, biological methods have been employed mainly for a few sub-processes that generate significant amounts of organic matter and are free of chromium, chlorides, etc. In this context the present paper reviews the characteristic features and pollution potential of wastewater generated from chrome tannery units and the treatment of the same. The different biological processes used earlier and their chronological development for the treatment of chrome tannery wastewater are thoroughly reviewed. In this regard, the scope of the hybrid bioreactor, an advanced technology option, has also been explored, as this kind of treatment is well suited for wastewater containing inhibitory substances.

Keywords: Composite tannery wastewater, biological treatment, Hybrid bioreactor, Organic removal

4162 The Role of Online Videos in Undergraduate Casual-Leisure Information Behaviors

Authors: Nei-Ching Yeh

Abstract:

This study describes undergraduate casual-leisure information behaviors relevant to online videos. Diaries and in-depth interviews were used to collect data. Twenty-four undergraduates participated in this study (9 men, 15 women; all were aged 18–22 years). This study presents a model of casual-leisure information behaviors and contributes new insights into user experience in casual-leisure settings, such as online video programs, with implications for other information domains.

Keywords: Casual-leisure information behaviors, information behavior, online videos, role.

4161 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm

Authors: Ghada Badr, Arwa Alturki

Abstract:

The biological function of an RNA molecule depends on its structure. The objective of alignment is finding the homology between two or more RNA secondary structures. Knowing the common functionalities between two RNA structures allows a better understanding of them and the discovery of other relationships between them. Besides, identifying non-coding RNAs, which are not translated into proteins, is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed, but most of them perform partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignment. Less attention has been given in the literature to efficient RNA structure representations, and structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N²) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and N is the maximum number of components in the two structures. The proposed algorithm compares two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments on different real and simulated datasets illustrate the efficiency of the CompPSA algorithm compared to other approaches. The CompPSA algorithm shows an accurate similarity measure between components. The algorithm gives the user the flexibility to align the two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm proves scalable and efficient in time and memory performance.
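
A toy sketch of an O(N²) comparison of two structures given as weighted component features (position, full length, stem length); the feature values and similarity definition are assumptions for illustration, not the CompPSA algorithm itself.

```python
import numpy as np

def component_similarity(comp_a, comp_b, weights):
    """Weighted similarity between two components described by
    (position, full_length, stem_length) feature vectors."""
    diff = np.abs(np.asarray(comp_a, float) - np.asarray(comp_b, float))
    return 1.0 / (1.0 + np.dot(weights, diff))

def pairwise_structure_similarity(struct_a, struct_b, weights=(1.0, 1.0, 1.0)):
    """O(N^2) comparison of two structures given as lists of component features:
    each component of A is matched to its most similar component of B."""
    weights = np.asarray(weights, float)
    scores = [max(component_similarity(comp_a, comp_b, weights)
                  for comp_b in struct_b)
              for comp_a in struct_a]
    return float(np.mean(scores))

# Hypothetical structures: (position, full length, stem length) per component.
rna1 = [(3, 12, 4), (20, 8, 3), (35, 15, 6)]
rna2 = [(4, 12, 4), (22, 7, 3)]
print(pairwise_structure_similarity(rna1, rna2))
```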

Keywords: Alignment, RNA secondary structure, pairwise, component-based, data mining.

4160 Topographic Arrangement of 3D Design Components on 2D Maps by Unsupervised Feature Extraction

Authors: Stefan Menzel

Abstract:

As a result of the daily workflow in the design development departments of companies, databases containing huge numbers of 3D geometric models are generated. According to the given problem, engineers create CAD drawings based on their design ideas and evaluate the performance of the resulting design, e.g. by computational simulations. Usually, new geometries are built either by utilizing and modifying sets of existing components or by adding single newly designed parts to a more complex design. The present paper addresses the two facets of automatically acquiring components from large design databases and providing a reasonable overview of the parts to the engineer. A unified framework based on topographic non-negative matrix factorization (TNMF) is proposed which solves both aspects simultaneously. First, meaningful components are extracted from a given database into a parts-based representation in an unsupervised manner. Second, the extracted components are organized and visualized on square-lattice 2D maps. It is shown on the example of turbine-like geometries that these maps efficiently provide a well-structured overview of the database content and, at the same time, define a measure of spatial similarity allowing easy access and reuse of components in the process of design development.
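
A minimal sketch using scikit-learn's standard NMF as a stand-in for the topographic variant (TNMF is not available in scikit-learn); the design matrix is synthetic and the 2D map arrangement step is only indicated in a comment.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical database: 200 designs, each described by a 1024-dimensional
# non-negative shape descriptor (e.g. a flattened voxel occupancy vector).
rng = np.random.default_rng(0)
designs = rng.random((200, 1024))

# Standard NMF as a stand-in for TNMF: the design matrix is factorised into
# non-negative activations W and parts-based components H.
model = NMF(n_components=16, init='nndsvda', max_iter=500, random_state=0)
W = model.fit_transform(designs)   # (200, 16) activation of each part per design
H = model.components_              # (16, 1024) extracted parts

# The topographic step would additionally arrange the 16 parts on a 4x4 map
# so that neighbouring cells hold spatially similar components.
print(W.shape, H.shape)
```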

Keywords: Design decomposition, topographic non-negative matrix factorization, parts-based representation, self-organization, unsupervised feature extraction.

4159 A Novel Machining Signal Filtering Technique: Z-notch Filter

Authors: Nuawi M. Z., Lamin F., Ismail A. R., Abdullah S., Wahid Z.

Abstract:

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise nuisance from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured signal, which are less interfered with by the noise, can be extracted. Thus, the filtered signal is more reliable to analyse in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicate that the signal is highly corrupted by noise. This method can be utilised as a proactive tool for evaluating the noise content in a signal. The evaluation of noise content, as well as its elimination, is very important, especially for machining operation fault diagnosis. The Z-notch filtering technique was reliable in extracting noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. Therefore, the interruption of noise that could change the original signal features and consequently deteriorate the useful sensory information can be eliminated.
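
The Z-notch filter itself is the authors' technique; as a generic illustration of removing an identified noise frequency from a machining signal, the sketch below applies a standard IIR notch filter, with assumed sampling rate and frequencies.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10000.0                                      # sampling frequency, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
cutting_signal = np.sin(2 * np.pi * 300 * t)      # useful component (assumed)
machine_noise = 0.8 * np.sin(2 * np.pi * 50 * t)  # e.g. motor/hydraulic hum (assumed)
measured = cutting_signal + machine_noise

# Remove the identified noise frequency with a standard IIR notch filter.
b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
filtered = filtfilt(b, a, measured)

# Residual error with respect to the clean cutting signal, before and after filtering.
print(np.std(measured - cutting_signal), np.std(filtered - cutting_signal))
```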

Keywords: Digital signal filtering, I-kaz method, Machining monitoring, Noise cancelling, Sound

4158 Numerical Investigation of Improved Aerodynamic Performance of a NACA 0015 Airfoil Using Synthetic Jet

Authors: K. Boualem, T. Yahiaoui, A. Azzi

Abstract:

Numerical investigations are performed to analyze the flow behavior over a NACA0015 airfoil and to evaluate the efficiency of a synthetic jet as an active control device. The second objective of this work is to investigate the influence of the momentum coefficient of the synthetic jet on the flow behaviour. The unsteady Reynolds-averaged Navier-Stokes equations of the turbulent flow are solved using the k-ω SST model provided by the ANSYS CFX CFD code. The model presented in this paper is a comprehensive representation of the information found in the literature. Comparison of the obtained numerical flow parameters with the experimental ones shows that the adopted computational procedure closely reflects the real flow nature. The numerical results also show that the use of synthetic jet devices has positive effects on flow separation and thus improves the aerodynamic performance of the NACA0015 airfoil. It is also observed that the use of the synthetic jet increases the lift coefficient by about 13.3% and reduces the drag coefficient by about 52.7%.

Keywords: Active control, CFD, NACA airfoil, synthetic jet.

4157 Improve of Evaluation Method for Information Security Levels of CIIP (Critical Information Infrastructure Protection)

Authors: Dong-Young Yoo, Jong-Whoi Shin, Gang Shin Lee, Jae-Il Lee

Abstract:

As the dysfunctions of the information society and social development progress, intrusion problems such as malicious replies, spam mail, private information leakage, phishing, and pharming, and side effects such as the spread of unwholesome information and privacy invasion, are becoming serious social problems. Illegal access to information is also becoming a problem as the exchange and sharing of information increase with the extension of the communication network. On the other hand, as the communication network has been constructed as an international, global system, the legal response against invasion and cyber-attack from abroad is reaching its limits. In addition, in an environment where important infrastructures are managed and controlled on the basis of the information communication network, such problems pose a threat to national security. Countermeasures to such threats are developed and implemented on a yearly basis to protect the major information communication infrastructures. As part of such measures, we have developed a methodology for assessing the information protection level, which can be used to establish the quantitative objective-setting method required for improving the information protection level.

Keywords: Information Security Evaluation Methodology, Critical Information Infrastructure Protection.

4156 Information Technologies in Automotive Assembly Industry in Thailand

Authors: Jirarat Teeravaraprug, Usawadee Inklay

Abstract:

This paper attempts to prioritize the information technologies on which organizations should concentrate. The case study covers organizations in the automotive assembly industry in Thailand. Data were first collected to gather all information technologies known and used in the automotive assembly industry in Thailand. Five experts from the industry were then surveyed based on the concept of fuzzy DEMATEL. The information technologies were categorized into six groups: communication, transaction, planning, organization management, warehouse management, and transportation. The cause groups of information technologies for each group were analyzed and presented. Moreover, the relationship between the used and the significant information technologies is given, and discussions based on the used information technologies and the research results are provided.
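
A small sketch of the crisp DEMATEL computation that underlies the fuzzy variant (the fuzzy step would first defuzzify the expert ratings); the direct-influence matrix below is a hypothetical example, not the experts' data.

```python
import numpy as np

# Hypothetical direct-influence matrix among four IT groups
# (e.g. communication, transaction, planning, warehouse management),
# aggregated from expert ratings on a 0-4 scale.
D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

# Classical DEMATEL steps.
N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())   # normalised direct matrix
T = N @ np.linalg.inv(np.eye(len(D)) - N)                # total-relation matrix

r = T.sum(axis=1)        # influence given by each group
c = T.sum(axis=0)        # influence received by each group
prominence = r + c       # how important a group is overall
relation = r - c         # > 0: cause group, < 0: effect group
print(np.round(prominence, 2), np.round(relation, 2))
```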

Keywords: Information technology, automotive assembly industry, fuzzy DEMATEL.

4155 Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction

Authors: Jérôme Azé, Mathieu Roche, Yves Kodratoff, Michèle Sebag

Abstract:

Term Extraction, a key data preparation step in Text Mining, extracts terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept “Machine Learning”). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting/not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (Area Under the ROC curve). The approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures attached to each collocation. As this representation is general (across corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in Biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to the state-of-the-art Support Vector Machine (SVM).
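
As an illustration, the sketch below scores collocations with a linear ranking over interestingness-measure features and evaluates it by ROC AUC, the fitness a genetic search such as ROGER's would maximize; the features, labels and weights are synthetic assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_collocations = 300
# Hypothetical features: standard interestingness measures per collocation
# (e.g. mutual information, Dice coefficient, log-likelihood ratio).
features = rng.random((n_collocations, 3))
# Synthetic hand-labelling: interesting (1) / not interesting (0).
hidden_quality = features @ np.array([2.0, 1.0, 0.5]) + rng.normal(0, 0.3, n_collocations)
labels = (hidden_quality > np.median(hidden_quality)).astype(int)

def fitness(weights):
    """AUC of the linear ranking induced by `weights` -- the quantity an
    evolutionary search would maximize."""
    return roc_auc_score(labels, features @ weights)

# A GA would evolve `weights`; here we only compare two candidate rankings.
print(fitness(np.array([1.0, 1.0, 1.0])), fitness(np.array([2.0, 1.0, 0.5])))
```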

Keywords: Text-mining, Terminology Extraction, Evolutionary algorithm, ROC Curve.

4154 Mining and Visual Management of XML-Based Image Collections

Authors: Khalil Shihab, Nida Al-Chalabi

Abstract:

This article describes Uruk, the virtual museum of Iraq that we developed for the visual exploration and retrieval of image collections. The system largely exploits the loosely structured hierarchy of XML documents, which provides a useful representation method for storing semi-structured or unstructured data that does not easily fit into existing databases. The system offers users the capability to mine and manage the XML-based image collections through a web-based Graphical User Interface (GUI). Typically, in an interactive session with the system, the user can browse a visual structural summary of the XML database in order to select interesting elements. Using this intermediate result, queries combining structure and textual references can be composed and presented to the system. After query evaluation, the full set of answers is presented in a visual and structured way.
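
A minimal sketch of a query combining structure and textual references over an XML image collection, using Python's ElementTree; the XML fragment and element names are hypothetical, not the actual Uruk schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of an XML-based image collection.
xml_doc = """
<museum>
  <artifact period="Uruk">
    <title>Clay tablet</title>
    <image file="tablet_001.jpg"/>
    <description>Proto-cuneiform accounting record</description>
  </artifact>
  <artifact period="Akkadian">
    <title>Cylinder seal</title>
    <image file="seal_042.jpg"/>
    <description>Presentation scene</description>
  </artifact>
</museum>
"""

root = ET.fromstring(xml_doc)
# A query combining structure (artifact/period) and a textual reference (description).
hits = [a.findtext("title")
        for a in root.findall(".//artifact[@period='Uruk']")
        if "accounting" in (a.findtext("description") or "")]
print(hits)
```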

Keywords: Data-centric XML, graphical user interfaces, information retrieval, case-based reasoning, fuzzy sets

4153 Logic Programming and Artificial Neural Networks in Pharmacological Screening of Schinus Essential Oils

Authors: José Neves, M. Rosário Martins, Fátima Candeias, Diana Ferreira, Sílvia Arantes, Júlio Cruz-Morais, Guida Gomes, Joaquim Macedo, António Abelha, Henrique Vicente

Abstract:

Some plants of the genus Schinus have been used in folk medicine as topical antiseptics, digestives, purgatives, diuretics, analgesics or antidepressants, and also for respiratory and urinary infections. The chemical composition of the essential oils of S. molle and S. terebinthifolius has been evaluated and shows high variability according to the part of the plant studied and the geographic and climatic region. The pharmacological properties, namely antimicrobial, anti-tumoural and anti-inflammatory activities, are conditioned by the chemical composition of the essential oils. Taking into account the difficulty of inferring the pharmacological properties of Schinus essential oils without a hard experimental approach, this work focuses on the development of a decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centred on Artificial Neural Networks and the respective Degree-of-Confidence that one has in such an occurrence.

Keywords: Artificial neural networks, essential oils, knowledge representation and reasoning, logic programming, Schinus molle L., Schinus terebinthifolius Raddi.

4152 Managing the Information System Life Cycle in Construction and Manufacturing

Authors: Carlos J. Costa, Manuela Aparício

Abstract:

In this paper we present the information system life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not merely an extension of the traditional information system development life cycle; it is based on the generic life cycle. We propose a model of the information system life cycle, supported by the assumption that a system has a limited life, which may nevertheless be extended. The model is applied in several cases; two examples of the framework's application are reported here, one in a construction enterprise and one in a manufacturing enterprise.

Keywords: Information systems/technology, information systems life cycle, organization engineering, information economics.

4151 Developing a Viral Artifact to Improve Employees’ Security Behavior

Authors: Stefan Bauer, Josef Frysak

Abstract:

According to the scientific information management literature, the improper use of information technology (e.g. personal computers) by employees is one main cause of operational and information security loss events. Therefore, organizations implement information security awareness programs to increase employees' awareness and thereby further prevent loss events. However, in many cases these information security awareness programs rely on conventional delivery methods like posters, leaflets, or internal messages to make employees aware of information security policies. We assume that a viral information security awareness video might be a more effective medium than the conventional methods commonly used by organizations. The purpose of this research is to develop a viral video artifact to improve employee security behavior concerning information technology.

Keywords: Information Security Awareness, Delivery Methods, Viral Videos, Employee Security Behavior.

4150 Double Reduction of Ada-ECATNet Representation using Rewriting Logic

Authors: Noura Boudiaf, Allaoua Chaoui

Abstract:

One major difficulty that faces developers of concurrent and distributed software is the analysis of concurrency-related faults like deadlocks. Petri nets are used extensively in the verification of correctness of concurrent programs. ECATNets [2] are a category of algebraic Petri nets based on a sound combination of algebraic abstract data types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration in rewriting logic [12] and its programming language Maude [13]. Rewriting logic is considered one of the most powerful logics in terms of description, verification and programming of concurrent systems. We proposed in [4] a method for translating Ada-95 tasking programs into the ECATNets formalism (Ada-ECATNet). In this paper, we show that the ECATNets formalism provides a more compact translation of Ada programs compared to other approaches based on simple Petri nets or Colored Petri nets (CPNs). Such a translation not only reduces the size of the program, but also the number of program states. We also show how this compact Ada-ECATNet may be reduced further by applying reduction rules to it. This double reduction of the Ada-ECATNet permits a considerable reduction of the memory space and run time of the corresponding Maude program.

Keywords: Ada tasking, ECATNets, Algebraic Petri Nets, Compact Representation, Analysis, Rewriting Logic, Maude.

4149 Efficient Program Slicing Algorithms for Measuring Functional Cohesion and Parallelism

Authors: Jehad Al Dallal

Abstract:

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We have conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches in computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The comparison study results indicate that using the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
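
For illustration, a backward slice can be computed as reverse reachability over the PDG; the sketch below does this for one slicing criterion on a hypothetical four-statement dependence graph (the paper's contribution, computing all slices in a single traversal, is not reproduced here).

```python
from collections import deque

# Hypothetical Program Dependence Graph: an edge u -> v means statement v
# is control- or data-dependent on statement u.
pdg = {
    1: [2, 3],      # 1: x = input()
    2: [4],         # 2: y = x * 2
    3: [4],         # 3: z = x + 1
    4: [],          # 4: print(y + z)
}

def backward_slice(pdg, criterion):
    """All statements that can influence `criterion`: reverse reachability."""
    reverse = {n: [] for n in pdg}
    for u, succs in pdg.items():
        for v in succs:
            reverse[v].append(u)
    visited, queue = {criterion}, deque([criterion])
    while queue:
        node = queue.popleft()
        for pred in reverse[node]:
            if pred not in visited:
                visited.add(pred)
                queue.append(pred)
    return sorted(visited)

print(backward_slice(pdg, 4))     # -> [1, 2, 3, 4]
```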

Keywords: Backward slicing, cohesion measure, forward slicing, parallelism measure, program dependence graph, program slicing, static slicing.

4148 A Survey of Various Algorithms for VLSI Physical Design

Authors: Rajine Swetha R, B. Shekar Babu, Sumithra Devi K.A

Abstract:

Electronic systems are at the core of everyday life. They form an integral part of financial networks, mass transit, telephone systems, power plants and personal computers. Electronic systems are increasingly based on complex VLSI (Very Large Scale Integration) integrated circuits, and electronic design automation is concerned with the design and production of VLSI systems. An important step in creating a VLSI circuit is physical design. The input to physical design is a logical representation of the system under design; the output of this step is the layout of a physical package that optimally or near-optimally realizes that logical representation. Physical design problems are combinatorial in nature and of large problem size. Darwin observed that, as variations are introduced into a population with each new generation, the less-fit individuals tend to die out in the competition for basic necessities. This survival-of-the-fittest principle leads to evolution in species. The objective of Genetic Algorithms (GAs) is to find an optimal solution to a problem. Since GAs are heuristic procedures that can function as optimizers, they are not guaranteed to find the optimum, but are able to find acceptable solutions for a wide range of problems. This survey paper presents a study of efficient algorithms for VLSI physical design and observes the common traits of the superior contributions.

Keywords: Genetic Algorithms, Physical Design, VLSI.

4147 A Study on a Discrete Event Simulation Model for Availability Analysis of Weapon Systems

Authors: Hye Lyeong Kim, Sang Yeong Choi

Abstract:

This paper discusses a discrete event simulation model for the availability analysis of weapon systems. The model incorporates missions, operational tasks and system reliability structures to analyze the availability of a weapon system. The proposed simulation model consists of five modules: the Simulation Engine, Maintenance Organizations, the System, its Mission Profile, and the RBD, which are based on missions and operational tasks. The Simulation Engine executes three kinds of discrete events in chronological order: mission events generated by the Mission Profile, failure events generated by the System, and maintenance events executed by the Maintenance Organization. Finally, this paper presents a case study of a system's availability analysis and mission reliability using the simulation model.
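
A minimal sketch of the failure/repair part of such a simulation: exponential up and down periods are alternated and the simulated uptime fraction is compared with the analytical inherent availability MTBF / (MTBF + MTTR); the parameter values are assumptions.

```python
import random

def simulate_availability(mtbf, mttr, horizon=1_000_000.0, seed=0):
    """Alternate exponential up/down periods and return the uptime fraction."""
    rng = random.Random(seed)
    t, uptime = 0.0, 0.0
    while t < horizon:
        up = rng.expovariate(1.0 / mtbf)      # time to next failure
        uptime += min(up, horizon - t)
        t += up
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr)      # repair time
    return uptime / horizon

mtbf, mttr = 400.0, 25.0                      # assumed values, arbitrary time units
print(simulate_availability(mtbf, mttr))      # simulated availability
print(mtbf / (mtbf + mttr))                   # analytical inherent availability
```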

Keywords: MTBF (Mean Time Between Failure), MTTR (Mean Time To Repair), Availability, Reliability, RBD (Reliability Block Diagram)

4146 SVM-based Multiview Face Recognition by Generalization of Discriminant Analysis

Authors: Dakshina Ranjan Kisku, Hunny Mehrotra, Jamuna Kanta Sing, Phalguni Gupta

Abstract:

Identity verification of authentic persons by their multiview faces is a real-valued problem in machine vision. Multiview face recognition is difficult due to the non-linear representation of faces in the feature space. This paper illustrates the usability of the generalization of LDA, in the form of the canonical covariate, for recognizing multiview faces. In the proposed work, a Gabor filter bank is used to extract facial features characterized by spatial frequency, spatial locality and orientation. The Gabor face representation captures a substantial amount of the variation in face instances that often occurs due to illumination, pose and facial expression changes. Convolution of the Gabor filter bank with face images of rotated profile views produces Gabor faces with high-dimensional feature vectors. The canonical covariate is then applied to the Gabor faces to reduce the high-dimensional feature spaces to low-dimensional subspaces. Finally, support vector machines are trained on the canonical subspaces, which contain a reduced set of features, and perform the recognition task. The proposed system is evaluated on the UMIST face database. The experimental results demonstrate the efficiency and robustness of the proposed system, with high recognition rates.
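
A hedged sketch of the reduction-plus-classification stage: scikit-learn's LDA stands in for the canonical covariate step and is followed by an SVM; the Gabor-filtered features are replaced by synthetic placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder for Gabor-filtered face features: 200 images of 20 subjects,
# each assumed already convolved with a Gabor filter bank and flattened to 640 dims.
rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(20), 10)
gabor_faces = rng.normal(size=(200, 640)) + subjects[:, None] * 0.05

X_tr, X_te, y_tr, y_te = train_test_split(gabor_faces, subjects, test_size=0.25,
                                          stratify=subjects, random_state=0)

# LDA plays the role of the canonical covariate step (dimensionality reduction),
# followed by an SVM classifier on the reduced subspace.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=19), SVC(kernel='rbf'))
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```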

Keywords: Biometrics, Multiview face recognition, Gabor wavelets, LDA, SVM.

4145 Research and Development of Net-Centric Information Sharing Platform

Authors: Xiaoqing Wang, Fang Youyuan, Zheng Yanxing, Gu Tianyang, Zong Jianjian, Tong Jinrong

Abstract:

Compared with a traditional distributed environment, the net-centric environment brings more demanding challenges for information sharing, with its characteristics of ultra-large scale, strong distribution, dynamism, autonomy, heterogeneity, and redundancy. This paper realizes an information sharing model and a series of core services, which together provide an open, flexible and scalable information sharing platform.

Keywords: Net-centric environment, Information sharing, Metadata registry and catalog, Cross-domain data access control.

4144 Evolutionary Approach for Automated Discovery of Censored Production Rules

Authors: Kamal K. Bharadwaj, Basheer M. Al-Maqaleh

Abstract:

In the recent past there has been increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), proposed by Michalski and Winston as an extension of PRs, exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the Censor) is an exception to the rule. Such rules are employed in situations where the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception conditions when the resources needed to establish their presence are tight or there is simply no information available as to whether they hold or not. Thus, the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. This paper presents a classification algorithm, based on an evolutionary approach, that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach has a flexible chromosome encoding, where each chromosome corresponds to a CPR. Appropriate genetic operators are suggested and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.
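
A small sketch of how a candidate CPR 'If P Then D Unless C' might be scored against a dataset inside such a fitness function; the records and the example rule are illustrative assumptions.

```python
# A Censored Production Rule "If P Then D Unless C" evaluated on a toy dataset;
# the attributes and the example rule are hypothetical.
records = [
    {"bird": True,  "penguin": False, "flies": True},
    {"bird": True,  "penguin": False, "flies": True},
    {"bird": True,  "penguin": True,  "flies": False},
    {"bird": False, "penguin": False, "flies": False},
]

def cpr_accuracy(records, premise, decision, censor):
    """Fitness-style score: how often 'If P Then D Unless C' predicts correctly
    on the records where the premise holds."""
    covered = [r for r in records if premise(r)]
    if not covered:
        return 0.0
    # The censor flips D to ~D, so the prediction is correct when D differs from C.
    correct = sum(r[decision] != censor(r) for r in covered)
    return correct / len(covered)

# Example rule: If bird Then flies Unless penguin.
score = cpr_accuracy(records,
                     premise=lambda r: r["bird"],
                     decision="flies",
                     censor=lambda r: r["penguin"])
print(score)   # 1.0: the rule with its censor explains every covered record
```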

Keywords: Censored Production Rule, Data Mining, Machine Learning, Evolutionary Algorithms.

4143 A Web Text Mining Flexible Architecture

Authors: M. Castellano, G. Mastronardi, A. Aprile, G. Tarricone

Abstract:

Text Mining is an important step of the Knowledge Discovery process. It is used to extract hidden information from non-structured or semi-structured data. This aspect is fundamental because much of the Web's information is semi-structured due to the nested structure of HTML code, and much of it is linked and redundant. Web Text Mining supports the whole knowledge mining process through the mining, extraction and integration of useful data, information and knowledge from Web page contents. In this paper, we present a Web Text Mining process able to discover knowledge in a distributed and heterogeneous multi-organization environment. The process is based on a flexible architecture and is implemented in four steps able to examine web content and to extract useful hidden information through mining techniques. Our Web Text Mining prototype starts from the retrieval of Web job offers, from which, through a Text Mining process, useful information for their fast classification is extracted; this information is, essentially, the job offer location and the required skills.

Keywords: Web text mining, flexible architecture, knowledge discovery.

4142 Information Systems Outsourcing Reasons and Risks: An Empirical Study

Authors: Reyes Gonzalez, Jose Gasco, Juan Llopis

Abstract:

Outsourcing, a management practice strongly consolidated within the area of Information Systems, is currently going through a stage of unstoppable growth. This paper makes a proposal about the main reasons which may lead firms to adopt Information Systems outsourcing. It equally analyses the potential risks that IS clients are likely to face. An additional objective is to assess these reasons and risks in the case of large Spanish firms, while simultaneously examining their evolution over time.

Keywords: Information Systems, Information Technologies, Outsourcing, Reasons, Risks, Survey.
