Search results for: query processing
1446 Retina Based Mouse Control (RBMC)
Authors: Arslan Qamar Malik, Jehanzeb Ahmad
Abstract:
The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. It describes how the product works and how it helps people with disabilities share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control and rely on image processing, in which light is the primary source. Electro-oculography (EOG) is a newer technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured by the sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.
Keywords: Human Computer Interaction, Real-Time System, Electro-oculography, Signal Processing.
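As an illustration of the signal chain described above (amplify, remove noise, digitize), here is a minimal sketch in Python; the sampling rate, filter band, gain, and ADC resolution are assumed for illustration and are not taken from the paper.
```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # assumed sampling rate in Hz

def condition_eog(raw, gain=1000.0):
    """Amplify, band-limit, and quantize a raw EOG trace (illustrative values)."""
    amplified = raw * gain                                # analog amplification stage
    b, a = butter(4, [0.5, 30.0], btype="band", fs=FS)    # remove drift and HF noise
    filtered = filtfilt(b, a, amplified)                  # zero-phase noise removal
    scaled = filtered / np.ptp(filtered)                  # normalize before quantization
    return np.round(scaled * 2047).astype(np.int16)       # roughly a 12-bit ADC

t = np.arange(0, 2.0, 1.0 / FS)
raw = 1e-3 * np.sin(2 * np.pi * 0.8 * t) + 1e-4 * np.random.randn(t.size)
samples = condition_eog(raw)   # digitized samples ready to send to the PC
```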
1445 Image Retrieval Using Fused Features
Authors: K. Sakthivel, R. Nallusamy, C. Kavitha
Abstract:
The system is designed to show images related to the query image. Extracting color, texture, and shape features from an image plays a vital role in content-based image retrieval (CBIR). Initially, the RGB image is converted into the HSV color space due to its perceptual uniformity. From the HSV image, color features are extracted using a block color histogram, texture features using the Haar transform, and shape features using the Fuzzy C-means algorithm. Then, the characteristics of the global and local color histograms, the texture features obtained through the co-occurrence matrix and the Haar wavelet transform, and the shape features are compared and analyzed for CBIR. Finally, the best method for each feature is fused during similarity measurement to improve image retrieval effectiveness and accuracy.
Keywords: Color Histogram, Haar Wavelet Transform, Fuzzy C-means, Co-occurrence Matrix, Similarity Measure.
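To illustrate the block color histogram step described above, here is a minimal sketch assuming OpenCV; the tile count, bin count, and the histogram-intersection similarity are illustrative choices, not the paper's exact configuration.
```python
import cv2
import numpy as np

def block_color_histogram(bgr_image, blocks=4, bins=8):
    """Split the image into blocks x blocks tiles and concatenate per-tile HSV hue/saturation histograms."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    feats = []
    for by in range(blocks):
        for bx in range(blocks):
            tile = hsv[by * h // blocks:(by + 1) * h // blocks,
                       bx * w // blocks:(bx + 1) * w // blocks]
            hist = cv2.calcHist([tile], [0, 1], None, [bins, bins], [0, 180, 0, 256])
            feats.append(cv2.normalize(hist, None).flatten())
    return np.concatenate(feats)

def similarity(f1, f2):
    """Histogram intersection between two feature vectors (one possible similarity measure)."""
    return np.minimum(f1, f2).sum()

query = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)   # stand-in for a query image
print(block_color_histogram(query).shape)
```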
1444 Adjusting the Furnace and Converter Temperature of the Sulfur Recovery Units
Authors: Hamid Reza Mahdipoor, Hamid Ganji, Hamed Naderi, Hajar Yousefian, Hooman Javaherizadeh
Abstract:
The modified Claus process is commonly used in oil refining and gas processing to recover sulfur and destroy contaminants formed in upstream processing. A Claus furnace feed containing a relatively low concentration of H2S may be incapable of producing a stable flame. Also, incomplete combustion of hydrocarbons in the feed can lead to deterioration of the catalyst in the reactors due to soot or carbon deposition. Therefore, special consideration is necessary to achieve the appropriate overall sulfur recovery. In this paper, some configurations available to treat lean acid gas streams are described and the most appropriate ones are studied to overcome low H2S concentration problems. As a result, overall sulfur recovery is investigated for feed preheating and hot gas configurations.
Keywords: Sulfur recovery unit, Low H2S content
1443 Part of Speech Tagging Using Statistical Approach for Nepali Text
Authors: Archit Yajnik
Abstract:
Part of speech tagging has always been a challenging task in natural language processing. This article presents POS tagging for Nepali text using a Hidden Markov Model (HMM) and the Viterbi algorithm. From an annotated Nepali corpus, training and testing data sets are randomly separated, and both methods are applied to them. The Viterbi algorithm is found to be computationally faster and more accurate than the plain HMM approach, achieving an accuracy of 95.43%. An error analysis of where the mismatches took place is discussed in detail.
Keywords: Hidden Markov model, Viterbi algorithm, POS tagging, natural language processing.
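A minimal log-space Viterbi sketch over a toy tag set; the Nepali corpus and the probabilities estimated in the paper are not reproduced, so the numbers below are placeholders.
```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely tag sequence for an observed word sequence (log-space Viterbi)."""
    V = np.full((len(obs), len(states)), -np.inf)      # best log-probability per (position, tag)
    back = np.zeros((len(obs), len(states)), dtype=int)
    for s in range(len(states)):
        V[0, s] = np.log(start_p[s]) + np.log(emit_p[s].get(obs[0], 1e-6))
    for t in range(1, len(obs)):
        for s in range(len(states)):
            scores = V[t - 1] + np.log(trans_p[:, s]) + np.log(emit_p[s].get(obs[t], 1e-6))
            back[t, s] = np.argmax(scores)
            V[t, s] = scores[back[t, s]]
    path = [int(np.argmax(V[-1]))]                     # backtrack from the best final tag
    for t in range(len(obs) - 1, 0, -1):
        path.insert(0, back[t, path[0]])
    return [states[i] for i in path]

states = ["NN", "VB"]                                    # toy tag set
start_p = np.array([0.6, 0.4])
trans_p = np.array([[0.7, 0.3], [0.4, 0.6]])             # P(tag_t | tag_{t-1})
emit_p = [{"ram": 0.5, "khana": 0.3}, {"khancha": 0.6}]  # toy P(word | tag)
print(viterbi(["ram", "khana", "khancha"], states, start_p, trans_p, emit_p))
```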
1442 A Novel Method for Behavior Modeling in Uncertain Information Systems
Authors: Ali Haroonabadi, Mohammad Teshnehlab
Abstract:
None of the process models used in software development addresses the performance evaluation and modeling of software systems. Moreover, uncertainty exists in information systems because of the inherent nature of requirements, and this creates further challenges in the software development process. By defining an extended version of UML (Fuzzy-UML), functional requirements of the software that are specified with uncertainty can be supported. In this study, the behavior of uncertain information systems is described with the aid of fuzzy state diagrams, and the role of behavioral diagrams in F-UML is investigated in the software performance modeling process. To this end, a fuzzy sub-profile is used.
Keywords: Fuzzy System, Software Development Model, Software Performance Evaluation, UML
1441 A New Approach to Design an Efficient CIC Decimator Using Signed Digit Arithmetic
Authors: Vishal Awasthi, Krishna Raj
Abstract:
Digital processing performed on a signal with a larger Nyquist interval requires more computation than the same processing performed on a smaller Nyquist interval. Sampling rate alteration generates unwanted effects in the system, such as spectral aliasing and spectral imaging, during signal processing. A multirate, multistage implementation of a digital filter can yield significant computational savings compared with a single-rate filter designed for sample rate conversion. In this paper, we present an efficient cascaded integrator-comb (CIC) decimation filter that performs fast downsampling using a signed digit adder algorithm, with compensation for the frequency droop that arises from the aliasing effect during decimation. The proposed compensated CIC decimation filter structure with a hybrid signed digit (HSD) fast adder improves downsampling speed by 65.15% compared with a ripple carry adder (RCA), and reduces area and power by 57.5% and 0.01%, respectively, compared with signed digit (SD) adder algorithms.
Keywords: Sampling rate conversion, Multirate Filtering, Compensation Theory, Decimation filter, CIC filter, Redundant signed digit arithmetic, Fast adders.
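A behavioral sketch of a Hogenauer CIC decimator (integrators, decimation by R, combs); the signed-digit adder hardware and the droop compensation described in the paper are not modeled here, and R, N, M are example values.
```python
import numpy as np

def cic_decimate(x, R=8, N=3, M=1):
    """CIC decimator: N integrators at the input rate, decimate by R,
    N combs with differential delay M at the output rate."""
    y = x.astype(np.float64)
    for _ in range(N):                                # integrator cascade
        y = np.cumsum(y)
    y = y[::R]                                        # rate reduction by R
    for _ in range(N):                                # comb cascade
        y = y - np.concatenate((np.zeros(M), y[:-M]))
    return y / float((R * M) ** N)                    # remove the DC gain (R*M)^N

fs = 8000.0
t = np.arange(0, 0.1, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.01 * np.random.randn(t.size)
decimated = cic_decimate(x, R=8, N=3)                 # output sample rate is fs/8
```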
1440 FPGA Hardware Implementation and Evaluation of a Micro-Network Architecture for Multi-Core Systems
Authors: Yahia Salah, Med Lassaad Kaddachi, Rached Tourki
Abstract:
This paper presents the design, implementation, and evaluation of a micro-network, or Network-on-Chip (NoC), based on a generic pipelined router architecture. The router is designed to efficiently support traffic generated by multimedia applications on embedded multi-core systems. It employs a simple routing mechanism and implements a round-robin scheduling strategy to resolve output port contention and minimize latency. Virtual channel flow control is applied to avoid the head-of-line blocking problem and enhance performance in the NoC. The hardware design of the router architecture has been implemented at the register transfer level; its functionality is evaluated for the two-dimensional Mesh/Torus topology, and performance results are derived from the ModelSim simulator and the Xilinx ISE 9.2i synthesis tool. An example of a multi-core image processing system utilizing the NoC structure has been implemented and validated to demonstrate the capability of the proposed micro-network architecture. To reduce the complexity of the image compression and decompression architecture, the system uses an image processing algorithm based on the classical discrete cosine transform with an efficient zonal processing approach. The experimental results confirm that both the proposed image compression scheme and the NoC architecture achieve reasonable image quality with lower processing time.
Keywords: Generic Pipeline Network-on-Chip Router Architecture, JPEG Image Compression, FPGA Hardware Implementation, Performance Evaluation.
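A behavioral sketch of the round-robin output-port arbitration mentioned above; the class and signal names are illustrative, not the paper's RTL.
```python
class RoundRobinArbiter:
    """Grants one of N requesting input ports per cycle, rotating priority after each grant."""
    def __init__(self, n_ports):
        self.n = n_ports
        self.last = n_ports - 1          # port granted most recently

    def grant(self, requests):
        # requests: list of booleans, one per input port contending for this output port
        for offset in range(1, self.n + 1):
            port = (self.last + offset) % self.n
            if requests[port]:
                self.last = port
                return port
        return None                      # no requester this cycle

arb = RoundRobinArbiter(4)
print(arb.grant([True, False, True, False]))   # -> 0
print(arb.grant([True, False, True, False]))   # -> 2 (priority rotated past port 0)
```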
1439 Morpho-Phonological Modelling in Natural Language Processing
Authors: Eleni Galiotou, Angela Ralli
Abstract:
In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language such as Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to finite-state approaches, which cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar, as the nucleus of our strategy, which takes into consideration word representations based on affixation and [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the unification device, and phonological rules modeling the correspondence between lexical and surface forms apply at morpheme boundaries. Examples from Modern Greek illustrate our approach: morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
Keywords: Morpho-Phonology, Natural Language Processing.
1438 Algorithm for Information Retrieval Optimization
Authors: Kehinde K. Agbele, Kehinde Daniel Aruleba, Eniafe F. Ayetiran
Abstract:
When using Information Retrieval Systems (IRS), users often present search queries made of ad-hoc keywords. It is then up to the IRS to obtain a precise representation of the user's information need and the context of the information. This paper investigates the optimization of IRS to individual information needs in order of relevance. The study addressed the development of algorithms that optimize the ranking of documents retrieved from an IRS. It discusses and describes a Document Ranking Optimization (DROPT) algorithm for information retrieval (IR) in an Internet-based or designated-database environment. As the volume of information available online and in designated databases grows continuously, ranking algorithms can play a major role in the context of search results. In this paper, a DROPT technique for documents retrieved from a corpus is developed with respect to document index keywords and the query vectors. This is based on calculating the weight
Keywords: Internet ranking,
1437 Evolving Knowledge Extraction from Online Resources
Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao
Abstract:
In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user query and automatically harvests knowledge from unstructured online resources in an unsupervised way. The output of the one-time learning is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event. In addition, the evolving learning module summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With evolving learning, we are able to visualize the key information of an event, discover trends, and track the event's development.
Keywords: Evolving learning, knowledge extraction, knowledge graph, text mining.
1436 Advanced Image Analysis Tools Development for the Early Stage Bronchial Cancer Detection
Authors: P. Bountris, E. Farantatos, N. Apostolou
Abstract:
Autofluorescence (AF) bronchoscopy is an established method to detect dysplasia and carcinoma in situ (CIS). For this reason the "Sotiria" Hospital uses the Karl Storz D-light system. However, in early tumor stages the visualization is not that obvious. With the help of a PC, we analyzed the captured color images by developing certain tools in Matlab®. We used statistical methods based on texture analysis, signal processing methods based on Gabor models, and conversion algorithms between device-dependent color spaces. Our belief is that we reduced the error made by the naked eye. The tools we implemented improve the quality of patients' lives.
Keywords: Bronchoscopy, digital image processing, lung cancer, texture analysis.
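To illustrate the Gabor-based texture analysis mentioned above, here is a minimal sketch assuming OpenCV; the kernel parameters and the synthetic input frame are placeholders, not the clinical settings.
```python
import cv2
import numpy as np

def gabor_texture_features(gray, thetas=(0, 45, 90, 135), ksize=21, sigma=4.0, lambd=10.0):
    """Mean and variance of the responses to a small bank of Gabor filters."""
    feats = []
    for theta in thetas:
        kernel = cv2.getGaborKernel((ksize, ksize), sigma, np.deg2rad(theta),
                                    lambd, gamma=0.5, psi=0)
        response = cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kernel)
        feats.extend([response.mean(), response.var()])
    return np.array(feats)

gray = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in for a bronchoscopy frame
print(gabor_texture_features(gray))
```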
1435 Performance Evaluation of Compression Algorithms for Developing and Testing Industrial Imaging Systems
Authors: Daniel F. Garcia, Julio Molleda, Francisco Gonzalez, Ruben Usamentiaga
Abstract:
The development of many measurement and inspection systems for products based on real-time image processing cannot be carried out entirely in a laboratory, due to the size or temperature of the manufactured products. Such systems must be developed in successive phases. First, the system is installed in the production line with only an operational service to acquire images of the products and other complementary signals. Next, a recording service for the images and signals must be developed and integrated into the system. Only after a large set of product images is available can the real-time image processing algorithms for measurement or inspection be developed under realistic conditions. Finally, the recording service is turned off or removed, and the system operates only with the real-time services for image acquisition and processing. This article presents a systematic performance evaluation of the image compression algorithms currently available to implement a real-time recording service. The results allow establishing a trade-off between the reduction or compression of the image size and the CPU time required to reach that compression level.
Keywords: Lossless image compression, codec performance evaluation, grayscale codec comparison, real-time image recording.
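A generic sketch of the kind of compression benchmark described, using Python's standard-library codecs on a synthetic grayscale frame; the codecs actually evaluated in the paper are not listed in the abstract, so these are stand-ins.
```python
import bz2
import lzma
import time
import zlib

import numpy as np

# synthetic 1024x1024 grayscale frame standing in for a recorded production image
frame = np.tile(np.arange(256, dtype=np.uint8), (1024, 4)).tobytes()

for name, compress in (("zlib", lambda d: zlib.compress(d, 6)),
                       ("bz2",  lambda d: bz2.compress(d, 9)),
                       ("lzma", lambda d: lzma.compress(d, preset=1))):
    start = time.perf_counter()
    packed = compress(frame)
    elapsed = time.perf_counter() - start
    ratio = len(frame) / len(packed)        # compression ratio vs CPU time trade-off
    print(f"{name:5s} ratio={ratio:6.2f} cpu={elapsed * 1000:7.1f} ms")
```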
1434 Enhancing the Performance of Wireless Sensor Networks Using Low Power Design
Authors: N. Mahendran, R. Madhuranthi
Abstract:
Wireless sensor networks (WSNs) are constantly in demand to process information more rapidly with less energy and area cost. At present, processor-based solutions struggle to achieve high processing speed with low power consumption. This paper presents a simple and accurate data processing scheme for a low-power wireless sensor node, based on a reduced number of processing elements (PEs). The presented model provides a simple recursive structure (SRS) to process the sampled data in the wireless sensor environment and to reduce the power consumption of the wireless sensor node. Based on this model, the incoming samples are processed to produce a smaller amount of data that is sufficient to reconstruct the original signal. The ModelSim simulator is used to simulate the SRS structure, and functional simulation is carried out to validate the presented architecture. The Xilinx Power Estimator (XPE) tool is used to measure power consumption. The experimental results show an average power consumption of 91 mW, a 42% improvement compared to the folded tree architecture.
Keywords: Power consumption, energy efficiency, low power WSN node, recursive structure, sleep/wake scheduling.
1433 Fuzzy Mathematical Morphology approach in Image Processing
Authors: Yee Yee Htun, Dr. Khaing Khaing Aye
Abstract:
Morphological operators transform the original image into another image through interaction with another image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals or images and has been applied widely to many applications such as edge detection, object segmentation, and noise suppression. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations such as fuzzy erosion, dilation, opening, and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation. This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology based on fuzzy logic and fuzzy set theory, and fuzzy morphological operations and their properties are studied in detail. In the second part, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
Keywords: Binary morphology, Fuzzy sets, Grayscale morphology, Image processing, Mathematical morphology.
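A minimal sketch of fuzzy erosion and dilation on membership values in [0, 1], using the max/min replacement described above; the structuring element and the Kleene-Dienes-style implication used for erosion are illustrative choices.
```python
import numpy as np

def fuzzy_dilation(img, se):
    """Fuzzy dilation: maximum over the structuring element of min(image, SE membership)."""
    out = np.zeros_like(img)
    r = se.shape[0] // 2
    padded = np.pad(img, r, mode="constant")                # pad with 0 (neutral for max)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + se.shape[0], j:j + se.shape[1]]
            out[i, j] = np.max(np.minimum(window, se))      # union -> max, intersection -> min
    return out

def fuzzy_erosion(img, se):
    """Fuzzy erosion: minimum over the structuring element of max(image, 1 - SE membership)."""
    out = np.zeros_like(img)
    r = se.shape[0] // 2
    padded = np.pad(img, r, mode="constant", constant_values=1.0)   # pad with 1 (neutral for min)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + se.shape[0], j:j + se.shape[1]]
            out[i, j] = np.min(np.maximum(window, 1.0 - se))
    return out

img = np.random.rand(64, 64)          # grey levels treated as membership degrees in [0, 1]
se = np.ones((3, 3))                  # flat fuzzy structuring element
opened = fuzzy_dilation(fuzzy_erosion(img, se), se)   # fuzzy opening = erosion then dilation
```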
1432 Bioceramic Scaffolds Fabrication by Rapid Prototyping Technology
Authors: F.H. Liu, S.H. Chen, R.T. Lee, W.S. Lin, Y.S. Liao
Abstract:
This paper describes a rapid prototyping (RP) technology for forming a hydroxyapatite (HA) bone scaffold model. HA powder and a silica sol are mixed into a bioceramic slurry of suitable viscosity. The HA particles become embedded in the solidified silica matrix to form green parts over a wide range of process parameters after processing by selective laser sintering (SLS). The results indicate that the proposed process can fabricate multilayer and hollow shell structures that are brittle but have sufficient integrity for handling prior to post-processing. The fabricated bone scaffold models had a surface finish of 25
Keywords: bioceramic, bone scaffold, rapid prototyping, selective laser sintering
1431 Performance Analysis of the First-Order Characteristics of Polling Systems Based on Parallel Limited (k = 1) Services Mode
Authors: Liu Yi, Bao Liyong
Abstract:
Aiming at the low efficiency of pipelined scheduling in periodic limited-service polling, this paper proposes a system service resource scheduling strategy with parallel, optimized limited-service polling control. The paper constructs the polling queueing system and its mathematical model. First, the first-order and second-order characteristic parameter equations are obtained by partial differentiation of the probability generating function of the system state variables, and complete analytical expressions for each system parameter are deduced after a joint solution. The simulation results are consistent with the theoretical values. The system performance analysis shows that the mean queue length and mean cycle time of the system are greatly improved, so the scheme can better meet the service demands of delay-sensitive data in dense data environments.
Keywords: Polling, parallel scheduling, mean queue length, average cycle time.
1430 SMaTTS: Standard Malay Text to Speech System
Authors: Othman O. Khalifa, Zakiah Hanim Ahmad, Teddy Surya Gunawan
Abstract:
This paper presents a rule-based text-to-speech (TTS) synthesis system for Standard Malay, namely SMaTTS. The proposed system uses a sinusoidal method and some pre-recorded wave files to generate speech. The use of a phone database significantly decreases the amount of computer memory used, making the system very light and embeddable. The overall system comprises two phases. The first is Natural Language Processing (NLP), which consists of the high-level processing of text analysis, phonetic analysis, text normalization, and a morphophonemic module; this module was designed specifically for Standard Malay to overcome a few problems in defining rules for the SM orthography system before text can be passed to the DSP module. The second phase is Digital Signal Processing (DSP), which operates on the low-level process of speech waveform generation. An intelligible and adequately natural-sounding formant-based speech synthesis system with a light and user-friendly Graphical User Interface (GUI) is introduced. A Standard Malay (SM) phoneme set and an inclusive phone database have been constructed carefully for this phone-based speech synthesizer. By applying generative phonology, comprehensive letter-to-sound (LTS) rules and a pronunciation lexicon have been devised for SMaTTS. For the evaluation tests, a Diagnostic Rhyme Test (DRT) word list was compiled, and several experiments were performed to evaluate the quality of the synthesized speech by analyzing the Mean Opinion Score (MOS) obtained. The overall performance of the system, as well as the room for improvement, is thoroughly discussed.
Keywords: Natural Language Processing, Text-To-Speech (TTS), Diphone, source filter, low-/high-level synthesis.
1429 A Model-Driven Approach of User Interface for MVP Rich Internet Application
Authors: Sarra Roubi, Mohammed Erramdani, Samir Mbarki
Abstract:
This paper presents an approach for the model-driven generation of Rich Internet Applications (RIAs), focusing on the graphical aspect. We used well-known Model-Driven Engineering (MDE) frameworks and technologies, such as the Eclipse Modeling Framework (EMF), the Graphical Modeling Framework (GMF), Query View Transformation (QVTo), and Acceleo, to enable the design and automatic code generation of the RIA. During the development of the approach, we focused on the graphical aspect of the application in terms of interfaces, opting for the Model View Presenter pattern, which is designed for graphical interfaces. The paper describes the process followed to define the approach and the supporting tool, and presents the results of a case study.
Keywords: Code generation, Design Pattern, metamodel, Model Driven Engineering, MVP, Rich Internet Application, transformation, User Interface.
1428 A Study of Gaps in CBMIR Using Different Methods and Prospective
Authors: Pradeep Singh, Sukhwinder Singh, Gurjinder Kaur
Abstract:
In recent years, rapid advances in software and hardware in the field of information technology, along with a digital imaging revolution in the medical domain, have facilitated the generation and storage of large collections of images by hospitals and clinics. Searching these large image collections effectively and efficiently poses significant technical challenges and raises the necessity of constructing intelligent retrieval systems. Content-based image retrieval (CBIR) consists of retrieving the most visually similar images to a given query image from a database of images [5]. Medical CBIR applications pose unique challenges but at the same time offer many new opportunities. While one can easily understand news or sports videos, a medical image is often completely incomprehensible to untrained eyes.
Keywords: Classification, clustering, content-based image retrieval (CBIR), relevance feedback (RF), statistical similarity matching, support vector machine (SVM).
1427 Determination and Comparison of Fabric Pills Distribution Using Image Processing and Spatial Data Analysis Tools
Authors: Lenka Techniková, Maroš Tunák, Jiří Janáček
Abstract:
This work deals with the determination and comparison of pill patterns in two sets of fabric samples that differ in the way the pills were created. The first set contains fabric samples with pills created by simulation on a Martindale abrasion machine, while the pills in the second set originated during normal wear and maintenance. The goal of the study is to determine whether the pattern of fabric pills created by simulation is the same as the pattern of naturally occurring pills. The system for determining and comparing the pills is based on image processing and spatial data analysis tools. First, a 3D reconstruction of the fabric surfaces with the pills is obtained using a gradient fields method, which creates a 3D fabric surface from a set of four images. Thereafter, the pills are detected in the 3D fabric surfaces using image processing tools in the MATLAB software. The determination and comparison of the pill patterns of the two sets of fabric samples is based on spatial data analysis using tools in the R software.
Keywords: 3D reconstruction of the surface, image analysis tools, distribution of the pills, spatial data analysis tools.
1426 Narrative and Expository Text Reading Comprehension by Fourth Grade Spanish-Speaking Children
Authors: Mariela V. De Mier, Veronica S. Sanchez Abchi, Ana M. Borzone
Abstract:
This work aims to explore the factors that influence the reading comprehension process for different types of texts. In a recent study with 2nd, 3rd, and 4th grade children, it was observed that reading comprehension of narrative texts was better than comprehension of expository texts. Nevertheless, it seems that not only the type of text but also other textual factors account for comprehension, depending on the cognitive processing demands posed by the text. In order to explore this assumption, three narrative and three expository texts were elaborated with different degrees of complexity. A group of 40 fourth grade Spanish-speaking children took part in the study. The children were asked to read the texts and answer orally three literal and three inferential questions for each text. Quantitative and qualitative analyses of the children's responses showed that they had difficulties with both narrative and expository texts. The problem was answering those questions that involved establishing complex relationships among information units that were present in the text or that had to be activated from the children's previous knowledge to make an inference. Considering the data analysis, it can be concluded that there is some interaction between the type of text and the cognitive processing load of a specific text.
Keywords: comprehension, textual factors, type of text, processing demands.
1425 An Efficient Cache Replacement Strategy for the Hybrid Cache Consistency Approach
Authors: Aline Zeitunlian, Ramzi A. Haraty
Abstract:
Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed, some relying on the server to broadcast reports periodically informing clients of updated data, while others allow the clients to request the data whenever needed. Recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed, which combines the benefits of the two approaches and has proved to be more efficient and scalable. Nevertheless, caching has its limitations, due to the limited cache size and the limited bandwidth, which makes the cache replacement strategy an important aspect of improving cache consistency algorithms. In this work, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy that SACCS was based on. This paper studies the advantages and drawbacks of the new proposed strategy, comparing it with different categories of cache replacement strategies.
Keywords: Cache consistency, hybrid algorithm, mobile environments
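The abstract does not give the formula behind the Least Unified Value; the sketch below shows only the LRU baseline that LUV is meant to replace, with the eviction hook marked as the place a unified-value policy would plug in.
```python
from collections import OrderedDict

class LRUCache:
    """Baseline LRU cache (the policy SACCS originally used); a LUV policy would replace evict()."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()           # key -> value, ordered by recency of use

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)          # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.evict()

    def evict(self):
        # LRU drops the least recently used entry; a LUV policy would instead rank
        # entries by a combined (unified) value -- not specified in the abstract.
        self.items.popitem(last=False)

cache = LRUCache(capacity=2)
cache.put("a", 1); cache.put("b", 2); cache.put("c", 3)   # "a" is evicted
print(list(cache.items))                                   # ['b', 'c']
```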
1424 The Design of Picture Books for Children from Tales of Amphawa Fireflies
Authors: Marut Pichetvit
Abstract:
The research objective is to gather information about the storytelling and fables associated with fireflies in the Amphawa community, in order to design and create a storybook that is appropriate for the interests of children in early childhood. This book should help develop learning about the natural environment, imagination, and creativity among children, which in turn promotes the development, conservation, and dissemination of the cultural values and uniqueness of the Amphawa community. The population used in this study was 30 early-childhood students aged 6-8 years, in grades 1-3 at the Demonstration School of Suan Sunandha Rajabhat University. The method used was purposive sampling, and the research was conducted by querying and analyzing data from both documents and narrative field tales and fables associated with the fireflies of the Amphawa community. The results were then synthesized into a conceptual design in the form of 8 visual images, which were later applied to one illustrated children's book and presented to experts to evaluate and test this medium.
Keywords: Children’s illustrated book, Fireflies, Amphawa.
1423 Wavelet Transform and Support Vector Machine Approach for Fault Location in Power Transmission Line
Authors: V. Malathi, N.S.Marimuthu
Abstract:
This paper presents a wavelet transform and Support Vector Machine (SVM) based algorithm for estimating fault location on transmission lines. The discrete wavelet transform (DWT) is used for data pre-processing, and these data are used for training and testing the SVM. Five types of mother wavelet are used for signal processing to identify a wavelet family that is most appropriate for estimating fault location. The results demonstrate the ability of the SVM to generalize from the provided patterns and to accurately estimate the location of faults with varying fault resistance.
Keywords: Fault location, support vector machine, support vector regression, transmission lines, wavelet transform.
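A pipeline sketch of DWT feature extraction followed by support vector regression, assuming the PyWavelets and scikit-learn libraries; the wavelet family, decomposition level, and synthetic training data are placeholders rather than the paper's setup.
```python
import numpy as np
import pywt
from sklearn.svm import SVR

def dwt_features(signal, wavelet="db4", level=3):
    """Energy of each DWT sub-band, used as the feature vector for one fault record."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# synthetic training set: each row is a fault transient, the target is fault distance (km)
rng = np.random.default_rng(0)
signals = rng.standard_normal((200, 512))
distances = rng.uniform(0, 100, size=200)

X = np.array([dwt_features(s) for s in signals])
model = SVR(kernel="rbf", C=10.0).fit(X, distances)   # support vector regression on sub-band energies
print(model.predict(X[:3]))
```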
1422 Event Template Generation for News Articles
Authors: A. Kowcika, E. Umamaheswari, T.V. Geetha
Abstract:
In this paper we focus on event extraction from Tamil news articles. The system utilizes a scoring scheme for extracting and grouping event-specific sentences, and event-specific clustering is performed across multiple documents using this scheme. Events are extracted from each document using a scoring scheme based on a feature score and a condition score, and event-specific sentences are similarly clustered from multiple documents. The proposed system builds an event template based on a user-specified query. The templates are filled with event-specific details such as person, location, and timeline extracted from the formed clusters. The proposed system applies these methodologies to Tamil news articles that have been enconverted into UNL graphs using a Tamil-to-UNL enconverter. The main intention of this work is to generate an event-based template.
Keywords: Event Extraction, Score based Clustering, Segmentation, Template Generation.
1421 Early Diagnosis of Alzheimer's Disease Using a Combination of Images Processing and Brain Signals
Authors: E. Irankhah, M. Zarif, E. Mazrooei Rad, K. Ghandehari
Abstract:
The prevalence of Alzheimer's disease is on the rise, and the disease brings problems such as discontinuation of treatment, high treatment cost, and the lack of early detection methods. The pathology of this disease causes the formation of protein deposits in the brain of patients, called amyloid plaques. Generally, the diagnosis of this disease is made by performing tests such as cerebrospinal fluid and spinal cord fluid testing, CT scans, MRI, mental tests, and eye-tracking tests. In this paper, we use the Medial Temporal Atrophy (MTA) method and the Leave One Out (LOO) cycle to extract statistical properties of the Fz, Pz, and Cz channels of ERP signals for early diagnosis of this disease. For the CT scan images, the accuracy of the results is 81% for healthy subjects and 88% for severe patients. For the ERP signal processing, the accuracy for healthy subjects is 81% in the delta band on the Cz channel and 90% in the alpha band on the Pz channel; for severe patients, the results are 89% in the delta band on the Cz channel and 92% in the alpha band on the Pz channel.
Keywords: Alzheimer's disease, image and signal processing, medial temporal atrophy, LOO Cycle.
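A sketch of leave-one-out evaluation with scikit-learn; the features (stand-ins for Fz/Pz/Cz band powers), labels, and classifier are synthetic placeholders used only to show the LOO cycle.
```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# placeholder features: one row per subject, e.g. delta/alpha band power at Fz, Pz, Cz
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 6))
y = rng.integers(0, 2, size=40)        # 0 = healthy, 1 = patient (synthetic labels)

# each LOO fold trains on 39 subjects and tests on the held-out one
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(f"LOO accuracy: {scores.mean():.2f}")
```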
1420 Cognitive STAP for Airborne Radar Based on Slow-Time Coding
Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu
Abstract:
Space-time adaptive processing (STAP) techniques have been motivated as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm, based on cyclic and power-like algorithms, is presented. Numerical examples show the effectiveness of the proposed method.
Keywords: Space-time adaptive processing (STAP), signal-to-clutter ratio, slow-time coding.
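For orientation, here is a textbook space-time adaptive weight computation (w proportional to R^{-1}s) and the resulting output SCNR; this generic sketch does not reproduce the paper's slow-time code optimization via cyclic and power-like algorithms.
```python
import numpy as np

def stap_weights(R, s):
    """Optimum space-time weights w = R^{-1} s / (s^H R^{-1} s) for steering vector s."""
    Rinv_s = np.linalg.solve(R, s)
    return Rinv_s / (s.conj() @ Rinv_s)

def output_scnr(w, R, s, sigma_s=1.0):
    """Signal-to-clutter-plus-noise ratio at the filter output."""
    return sigma_s * np.abs(w.conj() @ s) ** 2 / np.real(w.conj() @ R @ w)

# toy example: 4 antennas x 8 pulses, a stand-in clutter-plus-noise covariance and steering vector
NM = 4 * 8
rng = np.random.default_rng(2)
A = rng.standard_normal((NM, NM)) + 1j * rng.standard_normal((NM, NM))
R = A @ A.conj().T + np.eye(NM)                      # Hermitian positive-definite stand-in
s = np.exp(1j * 2 * np.pi * rng.random(NM))          # stand-in space-time steering vector
w = stap_weights(R, s)
print(output_scnr(w, R, s))
```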
1419 Non-destructive Watermelon Ripeness Determination Using Image Processing and Artificial Neural Network (ANN)
Authors: Shah Rizam M. S. B., Farah Yasmin A.R., Ahmad Ihsan M. Y., Shazana K.
Abstract:
Agricultural products are in increasing demand in today's market. Automation can be very helpful for increasing the productivity of these products. The purpose of this work is to measure and determine the ripeness and quality of watermelons. The textures on the watermelon skin are captured using a digital camera, and the images are filtered using image processing techniques. The information gathered is used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced a percentage accuracy of 86.51%, measured at 32 hidden units with a balanced percentage rate of training data.
Keywords: Artificial Neural Network (ANN), Digital Image Processing, YCbCr Colour Space, Watermelon Ripeness.
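A sketch of the feature-plus-ANN pipeline, assuming scikit-learn; the YCbCr conversion uses the standard BT.601 formulas and the 32 hidden units from the abstract, while the image data and labels are synthetic placeholders.
```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def ycbcr_features(rgb):
    """Mean Y, Cb, Cr of an RGB skin patch (ITU-R BT.601 full-range conversion)."""
    r, g, b = rgb[..., 0].astype(float), rgb[..., 1].astype(float), rgb[..., 2].astype(float)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.array([y.mean(), cb.mean(), cr.mean()])

# synthetic stand-ins for watermelon-skin patches labelled ripe (1) / unripe (0)
rng = np.random.default_rng(3)
patches = rng.integers(0, 256, size=(100, 32, 32, 3), dtype=np.uint8)
labels = rng.integers(0, 2, size=100)

X = np.array([ycbcr_features(p) for p in patches])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, labels)   # 32 hidden units
print(clf.score(X, labels))
```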
1418 Comparative Analysis of Different Page Ranking Algorithms
Authors: S. Prabha, K. Duraiswamy, J. Indhumathi
Abstract:
Search engines play an important role on the Internet, retrieving relevant documents from among a huge number of web pages. However, a search typically returns a large number of documents related to the search topic. To retrieve the most meaningful documents, a ranking algorithm is used in information retrieval. Ranking the retrieved documents is one of the practical problems in information retrieval and data mining. This paper surveys various page ranking and page segmentation algorithms and compares those used for information retrieval. Diverse PageRank-based algorithms such as Page Rank (PR), Weighted Page Rank (WPR), Weighted Page Content Rank (WPCR), Hyperlink Induced Topic Selection (HITS), Distance Rank, EigenRumor, Time Rank, Tag Rank, Relational Based Page Rank, and Query Dependent Ranking are discussed and compared.
Keywords: Information Retrieval, Web Page Ranking, search engine, web mining, page segmentations.
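As a reference point for the comparison above, here is a minimal power-iteration implementation of the classical PageRank algorithm on a toy link graph.
```python
import numpy as np

def pagerank(links, d=0.85, tol=1e-8):
    """Power-iteration PageRank. links[i] is the list of pages that page i links to."""
    n = len(links)
    rank = np.full(n, 1.0 / n)
    while True:
        new_rank = np.full(n, (1.0 - d) / n)          # teleportation term
        for i, outs in enumerate(links):
            if outs:
                new_rank[outs] += d * rank[i] / len(outs)
            else:                                     # dangling page: spread its rank evenly
                new_rank += d * rank[i] / n
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank

# toy web graph: 0 -> 1,2 ; 1 -> 2 ; 2 -> 0 ; 3 -> 2
print(pagerank([[1, 2], [2], [0], [2]]))
```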
1417 SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids
Authors: Jean-Alain Grunchec, Jules Hernández-Sánchez, Sara Knott
Abstract:
Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts, due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing the job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
Keywords: Grid computing, multiple simultaneous requests, fault tolerance, GridQTL.