Search results for: Information Processing.
4542 Designing a Tool for Software Maintenance
Authors: Amir Ngah, Masita Abdul Jalil, Zailani Abdullah
Abstract:
The aim of software maintenance is to maintain the software system in accordance with advancements in software and hardware technology. One of the early tasks in software maintenance is to extract information at a higher level of abstraction. In this paper, we present the process of designing an information extraction tool for software maintenance. The tool can extract basic information from old programs, such as variables, base classes, derived classes, objects of classes, and functions. The tool has two main parts: the lexical analyzer module, which reads the input file character by character, and the searching module, through which users can retrieve basic information from existing programs. We implemented the tool for a patterned subset of the C++ language as the input.
Keywords: Extraction tool, software maintenance, reverse engineering, C++.
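A minimal sketch of the kind of information such a tool extracts, using regular expressions instead of the paper's character-by-character lexical analyzer; the patterns, function names and sample input are illustrative only, not the authors' implementation:

```python
import re

# Hypothetical patterns for a simplified sub-C++ input: class headers
# (with an optional base class) and free-function definitions.
CLASS_RE = re.compile(r'class\s+(\w+)\s*(?::\s*(?:public|protected|private)?\s*(\w+))?')
FUNC_RE = re.compile(r'^\s*(?:\w+[\s\*&]+)(\w+)\s*\([^;]*\)\s*\{', re.MULTILINE)

def extract_basic_info(source: str) -> dict:
    """Collect class names, their base classes, and function names."""
    classes = {}
    for name, base in CLASS_RE.findall(source):
        classes[name] = base or None
    functions = FUNC_RE.findall(source)
    return {"classes": classes, "functions": functions}

if __name__ == "__main__":
    code = """
    class Shape { };
    class Circle : public Shape { };
    double area(double r) { return 3.14159 * r * r; }
    """
    print(extract_basic_info(code))
    # {'classes': {'Shape': None, 'Circle': 'Shape'}, 'functions': ['area']}
```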
4541 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images
Authors: A. Nachour, L. Ouzizi, Y. Aoura
Abstract:
Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents toward the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated using several examples of different synthetic and medical images. The obtained experimental results confirm the efficiency and accuracy of the detected edges.
Keywords: Edge detection, medical MR images, multi-agent systems, vector field convolution.
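A minimal sketch of the Vector Field Convolution environment that attracts agents toward edges; the kernel radius, decay exponent gamma and the Sobel edge map are assumed parameter choices, not the authors' settings:

```python
import numpy as np
from scipy.signal import convolve2d
from scipy.ndimage import sobel

def vfc_field(image, radius=32, gamma=1.7, eps=1e-8):
    """Vector Field Convolution: convolve an edge map with a vector kernel
    whose vectors point toward the kernel centre, so the resulting field
    attracts free agents toward nearby edges."""
    # Edge map from gradient magnitude (any edge detector could be used here).
    img = image.astype(float)
    gx, gy = sobel(img, axis=1), sobel(img, axis=0)
    edge_map = np.hypot(gx, gy)

    # Vector field kernel: unit vectors toward the origin, weighted by 1/r^gamma.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r = np.hypot(x, y) + eps
    magnitude = r ** (-gamma)
    kx, ky = -x / r * magnitude, -y / r * magnitude
    kx[radius, radius] = ky[radius, radius] = 0.0   # no self-attraction at the centre

    fx = convolve2d(edge_map, kx, mode="same", boundary="symm")
    fy = convolve2d(edge_map, ky, mode="same", boundary="symm")
    return fx, fy   # each agent can follow (fx, fy) at its current pixel
```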
4540 Using Weblog to Promote Critical Thinking – An Exploratory Study
Authors: Huay Lit Woo, Qiyun Wang
Abstract:
Weblog is an Internet tool that is believed to possess great potential to facilitate learning in education. This study investigates whether weblogs can be used to promote students' critical thinking. A group of Secondary Two students from a Singapore school wrote weblogs as a substitute for their traditional handwritten assignments. The topics for the weblogging were taken from the History syllabus but modified to suit the purpose of this study. Weblogs from the students were collected and analysed using a known coding system for measuring critical thinking. Results show that the topic for blogging is crucial in determining the types of critical thinking employed by the students. Students were seen to display critical thinking traits in the areas of information sourcing, linking information to arguments, and justification of viewpoints. Students' criticalness was more profound when the information for writing on a topic was readily available. Otherwise, they tended to be less critical and more subjective. The study also found that students lack the ability to source external information, suggesting that students may need to be taught information literacy in order to widen their use of critical thinking skills.
Keywords: Affordance, blog, critical thinking, perception, weblog.
4539 Automated Fact-Checking By Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study presents a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive and authoritative data; 2) developing a search function to automatically select relevant, new and credible references; 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that: 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, improves fact-checking performance; 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring the Wikidata knowledge graph to dynamically augment the representations of claims and references without introducing too much noise; II) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: Fact checking, claim verification, Deep Learning, Natural Language Processing.
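One way to picture the multi-head attention step is a cross-attention in which claim tokens attend over reference tokens; the sketch below uses torch.nn.MultiheadAttention with illustrative dimensions and is not the SAC implementation:

```python
import torch
import torch.nn as nn

embed_dim, num_heads = 256, 4                 # illustrative sizes
attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

claim = torch.randn(1, 32, embed_dim)         # 32 claim-token representations
reference = torch.randn(1, 128, embed_dim)    # 128 tokens from an auto-selected reference

# Claim tokens query the reference: the output emphasizes the parts of the
# reference most relevant to verifying the claim.
fused, weights = attn(query=claim, key=reference, value=reference)

# Pool the fused representation and classify it (e.g. supported / refuted).
classifier = nn.Linear(embed_dim, 2)
logits = classifier(fused.mean(dim=1))
print(logits.shape)   # torch.Size([1, 2])
```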
4538 Local Linear Model Tree (LOLIMOT) Reconfigurable Parallel Hardware
Authors: A. Pedram, M. R. Jamali, T. Pedram, S. M. Fakhraie, C. Lucas
Abstract:
Local Linear Neuro-Fuzzy Models (LLNFM), like other neuro-fuzzy systems, are adaptive networks that provide robust learning capabilities and are widely utilized in various applications such as pattern recognition, system identification, image processing and prediction. The local linear model tree (LOLIMOT) is a type of Takagi-Sugeno-Kang neuro-fuzzy algorithm which has proven its efficiency, compared with other neuro-fuzzy networks, in learning nonlinear systems and in pattern recognition. In this paper, dedicated reconfigurable and parallel processing hardware for the LOLIMOT algorithm and its applications is presented. This hardware realizes on-chip learning, which gives it the capability to work as a standalone device in a system. The synthesis results on FPGA platforms show its potential to run at least 250 times faster than software implementations of the algorithm.
Keywords: LOLIMOT, hardware, neurofuzzy systems, reconfigurable, parallel.
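A minimal numerical sketch of the model such hardware evaluates — the LOLIMOT output equation with normalized Gaussian validity functions. The partition centres, widths and weights below are illustrative, and the tree-construction and on-chip learning steps are not shown:

```python
import numpy as np

def lolimot_output(x, centers, sigmas, weights):
    """Output of a local linear neuro-fuzzy model:
    y(x) = sum_i (w_i0 + w_i . x) * phi_i(x),
    with normalized Gaussian validity functions phi_i.
    centers, sigmas: (M, n); weights: (M, n+1) with the bias first."""
    mu = np.exp(-0.5 * np.sum(((x - centers) / sigmas) ** 2, axis=1))
    phi = mu / np.sum(mu)                       # normalized validity functions
    local = weights[:, 0] + weights[:, 1:] @ x  # one linear model per partition
    return float(np.dot(phi, local))

# Toy example with two local models partitioning the input axis around x = 0.5.
centers = np.array([[0.25], [0.75]])
sigmas  = np.array([[0.2], [0.2]])
weights = np.array([[0.0, 1.0],    # y = x on the left partition
                    [1.0, -1.0]])  # y = 1 - x on the right partition
print(lolimot_output(np.array([0.3]), centers, sigmas, weights))   # ≈ 0.33
```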
4537 Analyzing Disclosure Practice of Religious Nonprofit Organizations using Partial Disclosure Index
Authors: Ruhaya Atan, Saunah Zainon, Roland Yeow Theng Nam, Sharifah Aliman
Abstract:
This study examines the relevance of disclosure practices in improving the accountability and transparency of religious nonprofit organizations (RNPOs). The assessment of disclosure is based on the annual returns of RNPOs for the financial year 2010. In order to quantify the information disclosed in the annual returns, partial disclosure indexes have been built for basic information (BI), financial information (FI) and governance information (GI), which take into account the content of the information items in the annual returns. The empirical evidence obtained revealed low disclosure practices among the RNPOs in the sample. The multiple regression results showed that the organizational attribute of board size appeared to be the most significant predictor of both the extent of the BI disclosure index and the FI disclosure index. On the other hand, the extent of financial information disclosure is related to the amount of donations received by RNPOs. For the GI disclosure index, the existence of an external audit appeared to be a significant variable. This study has contributed to the academic literature by providing empirical evidence of the disclosure practices among RNPOs.
Keywords: Disclosure, index, partial, NPOs, religious.
4536 Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing
Authors: Vahid Khorramshahi, Alireza Behrad, Neeraj K. Kanhere
Abstract:
In this paper we present a new method for over-height vehicle detection in low headroom streets and highways using digital video processing. Its accuracy, its lower price compared with existing detectors such as laser radars, and its capability of providing extra information such as speed and height measurements make this method more reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied, using background estimation and subtraction. Then the world coordinates of the features that lie inside the blobs are estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria and grouped using an undirected graph. They are then tracked through sequential frames. The obtained groups correspond to over-height vehicles in the scene.
Keywords: Feature extraction, over-height vehicle detection, traffic monitoring, vehicle tracking.
4535 Efficient Hardware Realization of Truncated Multipliers using FPGA
Authors: Muhammad H. Rais
Abstract:
Truncated multipliers are good candidates for digital signal processing (DSP) applications, including the finite impulse response (FIR) filter and the discrete cosine transform (DCT). Through truncated multipliers, a significant reduction in Field Programmable Gate Array (FPGA) resources can be achieved. This paper presents, for the first time, a comparison of the resource utilization of Spartan-3AN and Virtex-5 implementations of standard and truncated multipliers using the Very High Speed Integrated Circuit Hardware Description Language (VHDL). The Virtex-5 FPGA shows significant improvement compared with the Spartan-3AN FPGA device: the percentage ratio of the number of occupied slices for standard to truncated multipliers increased from 40% to 73.86% on the Virtex-5, whereas on the Spartan-3AN it decreased from 68.75% to 58.78%. Results also show that the anomaly in the Spartan-3AN device's average connection and maximum pin delay has been efficiently reduced in the Virtex-5 device.
Keywords: Digital Signal Processing (DSP), Field Programmable Gate Array (FPGA), Spartan-3AN, Truncated Multiplier, Virtex-5, VHDL.
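A behavioral sketch of what a truncated multiplier computes, written in Python rather than VHDL, so it is not the synthesized design; the operand width n, the number of truncated columns k and the correction constant are assumptions:

```python
def truncated_multiply(a: int, b: int, n: int = 8, k: int = 6, correction: int = 0) -> int:
    """Behavioral model of an n x n truncated multiplier: partial products in
    the k least-significant columns are discarded, and an optional correction
    constant compensates the truncation error. Returns the upper n bits of the
    (approximate) 2n-bit product, as a truncated array multiplier would."""
    acc = 0
    for i in range(n):             # bit i of operand a
        if (a >> i) & 1:
            for j in range(n):     # bit j of operand b
                if (b >> j) & 1 and (i + j) >= k:   # keep columns >= k only
                    acc += 1 << (i + j)
    acc += correction << k
    return acc >> n

exact = (200 * 90) >> 8
approx = truncated_multiply(200, 90, n=8, k=6)
print(exact, approx)               # the two differ by at most a few LSBs
```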
4534 Investigation of the Unbiased Characteristic of Doppler Frequency to Different Antenna Array Geometries
Authors: Somayeh Komeylian
Abstract:
Array signal processing techniques have recently been developed for a variety of applications that enhance receiver performance by restraining the power of jamming and interference signals. In this scenario, biases induced in the antenna array receiver significantly degrade the accurate estimation of the carrier phase. Since the carrier phase is obtained by integrating the frequency, we derive an unbiased Doppler frequency for high-precision estimation of the carrier phase. The fact that the Doppler frequency is unbiased with respect to jamming power and other interference signals allows a highly accurate estimation of the carrier phase. In this study, we rigorously investigate whether the Doppler frequency remains unbiased under variations of the antenna array geometry. The simulation results verify that the Doppler frequency remains unbiased and accurate under such variations.
Keywords: Array signal processing, unbiased Doppler frequency, GNSS, carrier phase, slowly fluctuating point target.
4533 A Query Optimization Strategy for Autonomous Distributed Database Systems
Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam
Abstract:
A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing uses a communication network for transmitting data between sites and is one of the challenges in the database world. The development of sophisticated query optimization technology is a reason for the commercial success of database systems, whose complexity and cost increase with the number of relations in a query. Mariposa, query trading, and query trading with processing-task trading are strategies developed for autonomous distributed database systems, but they incur a high optimization cost because all nodes are involved in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT that gives the seller nodes with the lowest cost gradually higher priorities, in order to reduce the optimization time. We implemented the proposed strategy and present the results and an analysis based on them.
Keywords: Autonomous strategies, distributed database systems, high priority, query optimization.
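A minimal sketch of the priority rule described above, under the assumption that each seller node advertises a bid cost for a subquery; the node names and the simple linear priority scale are illustrative, not the K-QTPT protocol itself:

```python
def assign_priorities(bids: dict) -> dict:
    """Give seller nodes with lower bid costs gradually higher priorities,
    so cheap sellers are consulted first and optimization time shrinks."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1])          # cheapest first
    return {node: len(ranked) - rank for rank, (node, _) in enumerate(ranked)}

bids = {"site_A": 120.0, "site_B": 45.0, "site_C": 80.0}         # hypothetical cost estimates
print(assign_priorities(bids))   # {'site_B': 3, 'site_C': 2, 'site_A': 1}
```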
4532 The Optimal Equilibrium Capacity of Information Hiding Based on Game Theory
Authors: Ziquan Hu, Kun She, Shahzad Ali, Kai Yan
Abstract:
Game theory can be used to analyze conflicting issues in the field of information hiding. In this paper, a 2-phase game is used to build an embedder-attacker system to analyze the limits of the hiding capacity of embedding algorithms: the embedder minimizes the expected damage and the attacker maximizes it. In the system, the embedder first consumes its resource to build embedded units (EU) and insert the secret information into the EU. Then the attacker distributes its resource evenly over the attacked EU. The expected equilibrium damage, which is the maximum damage from the point of view of the attacker and the minimum damage for the embedder defending against the attacker, is evaluated for the case in which the attacker attacks a subset of all the EU. Furthermore, the optimal equilibrium capacity of hiding information is calculated through the optimal number of EU carrying the embedded secret information. Finally, illustrative examples of the optimal equilibrium capacity are presented.
Keywords: 2-Phase Game, Expected Equilibrium Damage, Information Hiding, Optimal Equilibrium Capacity.
4531 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment
Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which information security risks should be treated, at what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Therefore, this paper provides a model which supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without omitting any measures.
Keywords: Information security risk treatment, Selection of risk measures, Risk acceptance and Multi-objective optimization.
4530 A Study on the Average Information Ratio of Perfect Secret-Sharing Schemes for Access Structures Based On Bipartite Graphs
Authors: Hui-Chuan Lu
Abstract:
A perfect secret-sharing scheme is a method to distribute a secret among a set of participants in such a way that only qualified subsets of participants can recover the secret and the joint share of the participants in any unqualified subset is statistically independent of the secret. The collection of all qualified subsets is called the access structure of the perfect secret-sharing scheme. In a graph-based access structure, each vertex of a graph G represents a participant and each edge of G represents a minimal qualified subset. The average information ratio of a perfect secret-sharing scheme realizing the access structure based on G is defined as AR = (Σ_{v∈V(G)} H(σ_v)) / (|V(G)| · H(s)), where s is the secret and σ_v is the share of vertex v, both random variables, and H is the Shannon entropy. The infimum of the average information ratio over all possible perfect secret-sharing schemes realizing a given access structure is called the optimal average information ratio of that access structure. Most known results about the optimal average information ratio give upper or lower bounds on it. In this paper, we consider access structures based on bipartite graphs and determine the exact values of the optimal average information ratio for some infinite classes of them.
Keywords: secret-sharing scheme, average information ratio, star covering, core sequence.
4529 Visualizing Transit Through a Web Based Geographic Information System
Authors: Ricardo Hoar
Abstract:
Currently, in many major cities, public transit schedules are disseminated through lists of routes, grids of stop times and static maps. This paper describes a web-based geographic information system which disseminates the same schedule information through intuitive GIS techniques. Using data from Calgary, Canada, a map-based interface has been created to allow users to see routes, stops and moving buses all at once. Zoom and pan controls as well as satellite imagery allow users to apply their personal knowledge of the local geography to achieve faster and more pertinent transit results. Using asynchronous requests to web services, users are immersed in an application where buses and stops can be added and removed interactively, without the need to wait for responses to HTTP requests.
Keywords: Geographic Information Systems, Public Transit, Web Services, AJAX, Human-Computer Interface.
4528 An Empirical Mode Decomposition Based Method for Action Potential Detection in Neural Raw Data
Authors: Sajjad Farashi, Mohammadjavad Abolhassani, Mostafa Taghavi Kani
Abstract:
Information in the nervous system is coded as firing patterns of electrical signals called action potentials or spikes, so an essential step in the analysis of neural mechanisms is the detection of action potentials embedded in the neural data. Several methods have been proposed in the literature for this purpose. In this paper, a novel method based on empirical mode decomposition (EMD) has been developed. EMD is a decomposition method that extracts oscillations in different frequency ranges from a waveform. The method is adaptive, and no a priori knowledge about the data or parameter tuning is needed. The results for simulated data indicate that the proposed method is comparable with wavelet-based methods for spike detection. For neural signals with a signal-to-noise ratio near 3, the proposed method can accurately detect more than 95% of the action potentials.
Keywords: EMD, neural data processing, spike detection, wavelet decomposition.
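A compact sketch of the detection idea, assuming the third-party PyEMD package for the decomposition, an assumed band of mid-frequency IMFs, and a median-based noise threshold; the paper does not specify these choices:

```python
import numpy as np
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package is installed

def detect_spikes(signal, fs, imf_range=(1, 4), k=4.0):
    """Decompose the recording with EMD, keep a band of IMFs where spike
    energy is expected, and threshold at k times a robust noise estimate
    (median absolute deviation / 0.6745). Returns sample indices of spikes."""
    imfs = EMD().emd(signal)
    band = imfs[imf_range[0]:imf_range[1]].sum(axis=0)
    sigma = np.median(np.abs(band)) / 0.6745
    crossings = np.flatnonzero(np.abs(band) > k * sigma)
    # Merge crossings closer than 1 ms into a single spike event.
    gap = int(1e-3 * fs)
    spikes = [crossings[0]] if crossings.size else []
    for idx in crossings[1:]:
        if idx - spikes[-1] > gap:
            spikes.append(idx)
    return np.array(spikes)
```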
4527 Research on Urban Point of Interest Generalization Method Based on Mapping Presentation
Authors: Chengming Li, Yong Yin, Peipei Guo, Xiaoli Liu
Abstract:
Existing point generalization algorithms focus merely on the overall information of point groups, without taking into account the attribute richness of POI (point of interest) data or their spatial distribution as constrained by roads. This paper proposes a POI generalization method that considers both attribute information and spatial distribution. The hierarchical characteristics of urban POI information expression are first analyzed to identify the measurement features of the corresponding hierarchy. On this basis, an urban POI generalization strategy is put forward: POIs in the urban road network are divided into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each distribution pattern. Experimental results show that the method, by taking into account both the attribute information and the spatial distribution characteristics of POIs, can better implement urban POI generalization in the mapping presentation.
Keywords: POI, Road network, spatial information expression, selection method, distribution pattern.
4526 Specification of Agent Explicit Knowledge in Cryptographic Protocols
Authors: Khair Eddin Sabri, Ridha Khedri, Jason Jaskolka
Abstract:
Cryptographic protocols are widely used in various applications to provide secure communications. They are usually represented as communicating agents that send and receive messages. These agents use their knowledge to exchange information and communicate with other agents involved in the protocol. An agent's knowledge can be partitioned into explicit knowledge and procedural knowledge. The explicit knowledge refers to the set of information which is either proper to the agent or directly obtained from other agents through communication. The procedural knowledge relates to the set of mechanisms used to derive new information from what is already available to the agent. In this paper, we propose a mathematical framework which specifies the explicit knowledge of an agent involved in a cryptographic protocol. Modelling this knowledge is crucial for the specification, analysis, and implementation of cryptographic protocols. We also report on a prototype tool that allows the representation and manipulation of the explicit knowledge.
Keywords: Information Algebra, Agent Knowledge, Cryptographic Protocols.
4525 Hippocratic Database: A Privacy-Aware Database
Authors: Norjihan Abdul Ghani, Zailani Mohd Sidek
Abstract:
Nowadays, organizations and businesses have several motivating factors to protect an individual's privacy. Confidentiality refers to how information is shared with third parties. It always concerns private information, especially personal information that usually needs to be kept private. Because of the importance of privacy concerns today, we need to design database systems that account for privacy. Agrawal et al. introduced the Hippocratic Database, which we refer to here as a privacy-aware database. This paper explains how the Hippocratic Database can become a future trend for web-based applications to enhance their level of privacy and trustworthiness among Internet users.
Keywords: Hippocratic database, privacy, privacy-aware.
4524 A Frame Work for Query Results Refinement in Multimedia Databases
Authors: Humaira Liaquat, Nadeem Iftikhar, Shaukat Ali, Zohaib Zafar Iqbal
Abstract:
In the current age, retrieval of relevant information from massive amounts of data is a challenging job. Over the years, precise and relevant retrieval of information has attained high significance. There is a growing need in the market to build systems which can retrieve multimedia information that precisely meets the user's current needs. In this paper, we introduce a framework for refining query results before showing them to the user, using ambient intelligence, the user profile, the group profile, the user's location, time, day, user device type and extracted features. A prototype tool was also developed to demonstrate the efficiency of the proposed approach.
Keywords: Context-aware retrieval, Information retrieval, Ambient Intelligence, Multimedia databases, User and group profile.
4523 Spatial Behavioral Model-Based Dynamic Data-Driven Diagram Information Model
Authors: Chiung-Hui Chen
Abstract:
Diagrams and drawings are important ways to communicate and reproduce architectural design. Due to the development of information and communication technology, professional thinking in architecture and interior design is also changing rapidly. In the design development process, diagrams always play a very important role. Based on diagram theories, this study observes and records the interactions between people and objects, objects and space, and space and time in a modern nuclear family. It constructs a method for diagrams to systematically and visually describe the space plan of a modern nuclear family toward intelligent design, assisting designers in retrieving information and reviewing event patterns of the past and present.
Keywords: Digital diagram, information model, context aware, data analysis.
4522 X-Corner Detection for Camera Calibration Using Saddle Points
Authors: Abdulrahman S. Alturki, John S. Loomis
Abstract:
This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations.
Keywords: Camera Calibration, Corner Detector, Saddle Points, X-Corners.
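The saddle-point idea can be sketched as follows: fit a quadratic surface to a small patch, accept it as an X-corner only if the quadratic is a saddle, and read the subpixel location from its stationary point. The window size and the synthetic test patch below are illustrative, not the paper's data:

```python
import numpy as np

def refine_x_corner(patch):
    """Fit z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f to a grayscale patch by
    least squares. An X-corner is a saddle point, so the Hessian must be
    indefinite (4ac - b^2 < 0); the stationary point of the quadratic gives
    the subpixel corner location relative to the patch centre."""
    h, w = patch.shape
    y, x = np.mgrid[:h, :w].astype(float)
    x -= (w - 1) / 2.0
    y -= (h - 1) / 2.0
    A = np.column_stack([x.ravel()**2, (x * y).ravel(), y.ravel()**2,
                         x.ravel(), y.ravel(), np.ones(x.size)])
    a, b, c, d, e, f = np.linalg.lstsq(A, patch.ravel().astype(float), rcond=None)[0]
    if 4 * a * c - b * b >= 0:
        return None                      # not a saddle -> not an X-corner
    # Stationary point: the gradient of the fitted quadratic is zero.
    xc, yc = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
    return xc, yc

# Synthetic X-corner: intensity x*y is a pure saddle centred in the patch.
yy, xx = np.mgrid[:11, :11].astype(float) - 5.0
print(refine_x_corner(xx * yy))          # approximately (0.0, 0.0)
```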
4521 Towards an Understanding of how Information Technology Enables Innovation – The Innovators' Perceptions
Authors: R. Nootjarat, W. Chantatub, P. Chongstitvatana
Abstract:
This research attempts to explore gaps in the Information Systems (IS) and innovation literatures by developing a model of Information Technology (IT) capability in enabling innovation. The research was conducted using semi-structured interviews with six innovators in business consulting, financial, healthcare and academic organizations. The interview results suggest four elements of IT-enabled innovation capability — information (the ability to capture ideas and knowledge), connectivity (the ability to bridge geographical boundaries and mobilize human resources), communication (the ability to attain and engage relationships between human resources) and transformation (the ability to change functions and process integrations) — in defining an IT-enabled innovation platform. The results also suggest innovators' roles and IT capabilities.
Keywords: Innovation Platform, IT Capability, Innovators, Innovation Delivery.
4520 Spatial Data Mining by Decision Trees
Authors: S. Oujdi, H. Belbachir
Abstract:
Existing data mining methods cannot be applied to spatial data directly because spatial specificities, such as spatial relationships, must be considered. This paper focuses on classification with decision trees, which are one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar works have been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, favors memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships between the objects in the target table and those in the neighbor table. The proposed algorithms are applied to spatial data in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees is also detailed.
Keywords: C4.5 Algorithm, Decision trees, S-CART, Spatial data mining.
4519 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies extensively utilized in various applications such as the detection/extraction of man-made structures, the monitoring of sensitive areas, and the creation of graphic maps. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building and non-building regions (roads, vegetation, etc.) are investigated, with the focus mainly on building extraction. Once all the regions are collected, a trimming process is applied to eliminate regions that arise from non-building objects. Finally, a labeling method is used to extract the building regions; the labeling method may be altered for more efficient building extraction. The images used for the analysis are extracted from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results: the additional overhead of intermediate processing is eliminated, without compromising the quality of the output, to ease the processing steps required and the time consumed.
Keywords: Building detection, shadow detection, landscape generation, label, partitioning, very high resolution satellite imagery.
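A minimal sketch of the final labeling step, using connected-component labeling from scipy as a stand-in for the paper's label method; the binary mask, its construction and the area threshold are assumptions:

```python
import numpy as np
from scipy import ndimage

def extract_buildings(building_mask, min_area=50):
    """Label connected regions in a binary building mask and trim away
    components too small to be buildings (noise, road fragments, etc.)."""
    labels, n = ndimage.label(building_mask)
    areas = ndimage.sum(building_mask, labels, index=np.arange(1, n + 1))
    keep = np.flatnonzero(areas >= min_area) + 1          # label ids to keep
    return np.isin(labels, keep), len(keep)

# Toy mask: one 10x10 "building" and one 3x3 speck that gets trimmed away.
mask = np.zeros((64, 64), dtype=bool)
mask[5:15, 5:15] = True
mask[40:43, 40:43] = True
cleaned, count = extract_buildings(mask, min_area=50)
print(count)   # 1
```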
4518 The Study on Evaluation System and Method of Legacy System
Authors: Chao Qi, Fuyang Peng, Bo Deng, Xiaoyan Su
Abstract:
In the upgrade process of enterprise information systems, how legacy systems are handled and utilized affects the efficiency of the construction and development of the new system. We propose an evaluation system which comprehensively describes the capacity of legacy information systems in five aspects. We then propose a practical evaluation method for legacy systems. Based on the evaluation result, the current state of the evaluated legacy system can be determined.
Keywords: Legacy Information Systems, Evaluation Index System, Evaluation Method, Evaluation Level.
4517 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircularity assumption on the signals.
Keywords: Complex-valued signal processing, synthetic aperture radar (SAR), 2-D radar imaging, compressive sensing, sparse Bayesian learning.
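The degree of impropriety referred to above is commonly measured by the circularity coefficient, the ratio of the magnitude of the pseudo-variance E[z²] to the power E[|z|²]; a quick sketch of that measure (not the authors' algorithm) is:

```python
import numpy as np

def circularity_coefficient(z):
    """|E[z^2]| / E[|z|^2]: near zero for proper (circular) complex data,
    close to one when the data is strongly improper / noncircular."""
    z = z - z.mean()
    return np.abs(np.mean(z**2)) / np.mean(np.abs(z)**2)

rng = np.random.default_rng(0)
proper = rng.standard_normal(10000) + 1j * rng.standard_normal(10000)
improper = rng.standard_normal(10000) * (1 + 0.2j)   # imaginary part fully determined by the real part
print(round(circularity_coefficient(proper), 3),      # near 0
      round(circularity_coefficient(improper), 3))    # near 1
```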
4516 Mining Image Features in an Automatic Two-Dimensional Shape Recognition System
Authors: R. A. Salam, M.A. Rodrigues
Abstract:
The number of features required to represent an image can be huge, and using all available features to recognize objects can suffer from the curse of dimensionality. Feature selection and extraction is the pre-processing step of image mining. A main issue in analyzing images is the effective identification of features; another is their extraction. The mining problem addressed here is the grouping of features for different shapes. Experiments have been conducted using the shape outline as the feature. Shape outline readings are put through a normalization and dimensionality reduction process using an eigenvector-based method to produce a new set of readings. After this pre-processing step, the data are grouped by shape. Through statistical analysis of these readings together with peak measures, a robust classification and recognition process is achieved. Tests show that the suggested methods are able to automatically recognize objects through their shapes. Finally, the experiments also demonstrate the system's invariance to rotation, translation, scale, reflection and a small degree of distortion.
Keywords: Image mining, feature selection, shape recognition, peak measures.
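One way to read the eigenvector-based reduction described above is ordinary PCA over the outline vectors; a minimal sketch under that assumption, with an illustrative outline length and number of retained components:

```python
import numpy as np

def reduce_outlines(outlines, n_components=8):
    """PCA on shape-outline readings: centre the data, take the eigenvectors
    of the covariance matrix, and project onto the leading components to get
    a compact, comparable set of shape features."""
    X = np.array(outlines, dtype=float)          # (n_shapes, outline_length)
    X -= X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]     # leading eigenvectors
    return X @ top                               # reduced feature vectors

# Toy data: 20 noisy outlines of length 64 sampled around two base shapes.
rng = np.random.default_rng(1)
base = np.vstack([np.sin(np.linspace(0, 2 * np.pi, 64)),
                  np.abs(np.linspace(-1, 1, 64))])
outlines = base[rng.integers(0, 2, 20)] + 0.05 * rng.standard_normal((20, 64))
print(reduce_outlines(outlines, n_components=2).shape)   # (20, 2)
```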
4515 Performance Improvements of DSP Applications on a Generic Reconfigurable Platform
Authors: Michalis D. Galanis, Gregory Dimitroulakos, Costas E. Goutis
Abstract:
Speedups from mapping four real-life DSP applications onto an embedded system-on-chip that couples coarse-grained reconfigurable logic with an instruction-set processor are presented. The reconfigurable logic is realized by a 2-dimensional array of processing elements. A design flow for improving application performance is proposed. Critical software parts, called kernels, are accelerated on the coarse-grained reconfigurable array. The kernels are detected by profiling the source code. For mapping the detected kernels onto the reconfigurable logic, a priority-based mapping algorithm has been developed. Two 4x4 array architectures, which differ in their interconnection structure among the processing elements, are considered. The experiments for eight different instances of a generic system show that important overall application speedups have been obtained for the four applications. The performance improvements range from 1.86 to 3.67, with an average value of 2.53, compared with an all-software execution. These speedups are quite close to the maximum theoretical speedups imposed by Amdahl's law.
Keywords: Reconfigurable computing, Coarse-grained reconfigurable array, Embedded systems, DSP, Performance.
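The Amdahl bound mentioned above follows from S = 1 / ((1 - f) + f/s), where f is the fraction of execution time spent in the accelerated kernels and s their local speedup; the numbers below are illustrative, not the paper's measurements:

```python
def amdahl_speedup(kernel_fraction: float, kernel_speedup: float) -> float:
    """Overall speedup when a fraction of the runtime is accelerated:
    S = 1 / ((1 - f) + f / s)."""
    return 1.0 / ((1.0 - kernel_fraction) + kernel_fraction / kernel_speedup)

# E.g. if 70% of the runtime moves to the reconfigurable array, even an
# infinitely fast array cannot beat 1 / (1 - 0.7) ≈ 3.33x overall.
print(round(amdahl_speedup(0.7, 10.0), 2))    # 2.70
print(round(amdahl_speedup(0.7, 1e9), 2))     # 3.33
```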
4514 Learning Object Interface Adapted to the Learner's Learning Style
Authors: Zenaide Carvalho da Silva, Leandro Rodrigues Ferreira, Andrey Ricardo Pimentel
Abstract:
Learning styles (LS) refer to the ways and forms in which a student prefers to learn in the teaching and learning process. Each student has their own way of receiving and processing information throughout the learning process. Therefore, knowing their LS is important to better understand their individual learning preferences, and also to understand why some teaching methods and techniques give better results with some students while others do not. We believe that knowledge of these styles makes it possible to formulate propositions for teaching, thus reorganizing teaching methods and techniques in order to allow learning that is adapted to the individual needs of the student. Adapting learning would be possible through the creation of online educational resources adapted to the style of the student. In this context, this article presents the structure of a learning object interface adaptation based on LS. The structure created should enable the creation of learning objects adapted to the student's LS and contribute to increasing the student's motivation in using a learning object as an educational resource.
Keywords: Adaptation, interface, learning object, learning style.
4513 Realization of Autonomous Guidance Service by Integrating Information from NFC and MEMS
Authors: Dawei Cai
Abstract:
In this paper, we present an autonomous guidance service obtained by combining the position information from NFC with the orientation information from a 6-axis acceleration and terrestrial magnetism sensor. We developed an algorithm to calculate the device orientation based on the data from the acceleration and terrestrial magnetism sensor. With this function, an autonomous guidance service can be provided according to the visitor's position and orientation. This service may be convenient for elderly people, people with disabilities, or children.
Keywords: NFC, Ubiquitous Computing, Guide System, MEMS.
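A common way to fuse the two sensors is a tilt-compensated compass: roll and pitch from the accelerometer, then a heading from the rotated magnetometer reading. The sketch below follows one standard aerospace-convention formulation and is not the paper's algorithm; axis conventions and signs depend on the device:

```python
import math

def orientation(ax, ay, az, mx, my, mz):
    """Roll/pitch from gravity (accelerometer), heading from the
    tilt-compensated terrestrial-magnetism vector. Angles in degrees;
    sign conventions vary between devices."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetometer reading back to the horizontal plane.
    mxh = (mx * math.cos(pitch) + my * math.sin(pitch) * math.sin(roll)
           + mz * math.sin(pitch) * math.cos(roll))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    heading = math.atan2(-myh, mxh)
    return tuple(math.degrees(v) for v in (roll, pitch, heading))

# Device lying flat (gravity on +z), magnetic field along +x (north):
print(orientation(0.0, 0.0, 9.81, 30.0, 0.0, -40.0))   # ≈ (0.0, 0.0, 0.0)
```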