Search results for: embedded databases

397 Real-Time Image Analysis of Capsule Endoscopy for Bleeding Discrimination in Embedded System Platform

Authors: Yong-Gyu Lee, Gilwon Yoon

Abstract:

Image processing for capsule endoscopy requires large memory, and diagnosis takes hours since the capsule typically operates for more than 8 hours. A real-time analysis algorithm for capsule images can therefore be clinically very useful. It can differentiate abnormal tissue from healthy structures and provide correlation information among the images. Bleeding is our interest in this regard, and we propose a method of detecting frames with potential bleeding in real time. Our detection algorithm is based on statistical analysis and the shapes of bleeding spots. We tested our algorithm on 30 cases of capsule endoscopy of the digestive tract. Results were excellent: a sensitivity of 99% and a specificity of 97% were achieved in detecting the image frames with bleeding spots.
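The abstract does not spell out the statistical criteria or the shape analysis used, so the following is only a minimal illustrative sketch in Python, assuming a simple per-frame test on red-channel dominance with a pixel-count threshold; the function name and all threshold values are hypothetical.

```python
import numpy as np

def is_bleeding_frame(rgb_frame, ratio_thresh=1.6, min_pixels=200):
    """Flag a frame as containing potential bleeding.

    rgb_frame: HxWx3 uint8 array. A pixel is treated as a bleeding
    candidate when its red channel strongly dominates green and blue;
    the frame is flagged when enough candidate pixels are present.
    Thresholds are illustrative, not the paper's values.
    """
    f = rgb_frame.astype(np.float32) + 1.0          # avoid division by zero
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    candidate = (r / g > ratio_thresh) & (r / b > ratio_thresh)
    return int(candidate.sum()) >= min_pixels

# Example: a synthetic frame with a small bright-red patch
frame = np.full((256, 256, 3), 120, dtype=np.uint8)
frame[100:120, 100:120] = (200, 40, 40)
print(is_bleeding_frame(frame))   # True for this synthetic patch
```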

Keywords: bleeding, capsule endoscopy, image processing, real time analysis

396 Binary Phase-Only Filter Watermarking with Quantized Embedding

Authors: Hu Haibo, Liu Yi, He Ming

Abstract:

Binary phase-only filter digital watermarking embeds the phase information of the discrete Fourier transform of the image into the corresponding magnitudes for better image authentication. This paper proposes an approach to implementing watermark embedding by quantizing the magnitudes, discusses how to regulate the quantization steps based on the frequencies of the magnitude coefficients carrying the embedded watermark, and shows how to embed the watermark with low-frequency quantization. Theoretical analysis and simulation results show that the flexibility, security, watermark imperceptibility and detection performance of binary phase-only filter digital watermarking can be effectively improved with quantization-based watermark embedding, and that robustness against JPEG compression is also increased to some extent.
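The exact rule for regulating the quantization steps is specific to the paper and not reproduced here; the sketch below only illustrates the general mechanism it builds on, namely embedding a binary phase-derived mark into DFT magnitudes by quantization index parity (odd/even cells), with an assumed fixed step size and an arbitrary low-frequency embedding region.

```python
import numpy as np

def embed_bit(magnitude, bit, step):
    """Quantize a DFT magnitude so its quantization index parity encodes one bit."""
    q = int(np.floor(magnitude / step))
    if q % 2 != bit:            # force index parity to match the bit
        q += 1
    return (q + 0.5) * step     # centre of the selected quantization cell

def extract_bit(magnitude, step):
    return int(np.floor(magnitude / step)) % 2

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(float)

F = np.fft.fft2(image)
mag, phase = np.abs(F), np.angle(F)

# Binary phase-only mark: 1 where the phase is non-negative (illustrative choice)
mark = (phase[1:9, 1:9] >= 0).astype(int)

step = 2.0                       # assumed quantization step
for (i, j), bit in np.ndenumerate(mark):
    u, v = 1 + i, 1 + j
    new_mag = embed_bit(mag[u, v], bit, step)
    mag[u, v] = new_mag
    mag[-u, -v] = new_mag        # keep Hermitian symmetry so the image stays real

watermarked = np.real(np.fft.ifft2(mag * np.exp(1j * phase)))

# Blind extraction from the watermarked image
F2 = np.fft.fft2(watermarked)
recovered = np.array([[extract_bit(abs(F2[1 + i, 1 + j]), step)
                       for j in range(8)] for i in range(8)])
print((recovered == mark).mean())   # 1.0 for this distortion-free round trip
```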

Keywords: binary phase-only filter, discrete Fourier transform, digital watermarking, image authentication, quantization.

395 Improving University Operations with Data Mining: Predicting Student Performance

Authors: Mladen Dragičević, Mirjana Pejić Bach, Vanja Šimičević

Abstract:

The purpose of this paper is to develop models for predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey was carried out among final-year undergraduate students using random sampling. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have proven to be a good method for classifying student success, and they could be improved further by increasing the survey sample and developing specialized decision trees for each type of college. These methods have great potential for use in decision support systems.
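The survey questions, features and tree settings used in the study are not listed in the abstract, so the short sketch below only illustrates fitting and inspecting a decision tree classifier of the kind described, using scikit-learn; the feature names, the synthetic data and the depth limit are hypothetical.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)

# Hypothetical survey features: entrance score, weekly study hours, attendance rate
n = 500
X = np.column_stack([
    rng.normal(70, 10, n),      # entrance exam score
    rng.normal(15, 5, n),       # weekly study hours
    rng.uniform(0.5, 1.0, n),   # attendance rate
])
# Hypothetical target: 1 = finished on time / with GPA above a threshold
y = ((0.5 * X[:, 0] + 2.0 * X[:, 1] + 30 * X[:, 2]
      + rng.normal(0, 5, n)) > 90).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("test accuracy:", tree.score(X_te, y_te))
print(export_text(tree, feature_names=["entrance", "study_hours", "attendance"]))
```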

Keywords: Data mining, knowledge discovery in databases, prediction models, student success.

394 Automatic Clustering of Gene Ontology by Genetic Algorithm

Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias, Zalmiyah Zakaria, Saberi M. Mohamad

Abstract:

Nowadays, the Gene Ontology is used widely by many researchers for biological data mining and information retrieval, integration of biological databases, finding genes, and incorporating knowledge from the Gene Ontology into gene clustering. However, the increase in the size of the Gene Ontology has caused problems in maintaining and processing it. One way to keep it accessible is to cluster it into fragmented groups. Clustering the Gene Ontology is a difficult combinatorial problem and can be modeled as a graph partitioning problem. Additionally, deciding the number k of clusters to use is not easily perceived and is a hard algorithmic problem. Therefore, an approach for automatic clustering of the Gene Ontology is proposed that incorporates a cohesion-and-coupling metric into a hybrid algorithm consisting of a genetic algorithm and a split-and-merge algorithm. Experimental results and an example of the modularized Gene Ontology in RDF/XML format are given to illustrate the effectiveness of the algorithm.
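The cohesion-and-coupling metric is named but not defined in the abstract; the sketch below shows one common formulation (intra-cluster edges reward cohesion, inter-cluster edges count as coupling) that a genetic algorithm could use as its fitness function when partitioning the ontology graph. The toy edge list and the exact scoring formula are assumptions.

```python
def cohesion_coupling_fitness(edges, assignment):
    """Score a partition of a graph: reward intra-cluster edges (cohesion),
    penalise inter-cluster edges (coupling). Higher is better."""
    cohesion = coupling = 0
    for u, v in edges:
        if assignment[u] == assignment[v]:
            cohesion += 1
        else:
            coupling += 1
    total = cohesion + coupling
    return (cohesion - coupling) / total if total else 0.0

# Toy ontology graph: two natural groups joined by a single edge
edges = [("a", "b"), ("b", "c"), ("a", "c"),      # group 1
         ("d", "e"), ("e", "f"), ("d", "f"),      # group 2
         ("c", "d")]                              # coupling edge
good = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}
bad  = {"a": 0, "b": 1, "c": 0, "d": 1, "e": 0, "f": 1}
print(cohesion_coupling_fitness(edges, good))   # ~0.71, close to 1.0
print(cohesion_coupling_fitness(edges, bad))    # ~-0.43, much lower
```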

Keywords: Automatic clustering, cohesion-and-coupling metric, gene ontology, genetic algorithm, split-and-merge algorithm.

393 A Self Adaptive Genetic Based Algorithm for the Identification and Elimination of Bad Data

Authors: A. A. Hossam-Eldin, E. N. Abdallah, M. S. El-Nozahy

Abstract:

The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data corrupt the results of state estimation under the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple errors of the interactive conforming type. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm utilizes the results of the classical linearized normal residuals approach to tune the genetic operators; thus, instead of making a randomized search throughout the whole search space, the search is directed and the optimum solution is obtained at very early stages (a maximum of 5 generations). The algorithm also utilizes accumulating databases of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems, and the results are very promising.

Keywords: Bad Data, Genetic Algorithms, Linearized Normal residuals, Observability, Power System State Estimation.

392 A Pipelined FSBM Hardware Architecture for HTDV-H.26x

Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the MPEG and H.26x standards, motion estimation is used to eliminate temporal redundancy. Given that the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a reconfigurable circuit is crucial for the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the Full Search Block Matching (FSBM) algorithm. This architecture offers minimum latency, maximum throughput and full utilization of hardware resources such as embedded memory blocks, and combines both pipelining and parallel processing techniques. Our design is described in VHDL, verified by simulation and implemented on a Stratix II EP2S130F1020C4 FPGA. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160 Mpixels/s.
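As a software reference for the computation that the hardware pipeline accelerates, here is a minimal full-search block matching routine using the sum of absolute differences (SAD); the block size and search range are assumptions, not the paper's parameters.

```python
import numpy as np

def full_search_block_matching(ref, cur, block=8, search=4):
    """Exhaustive SAD block matching: for every block of the current frame,
    find the displacement (dy, dx) into the reference frame that minimises
    the sum of absolute differences within +/- `search` pixels."""
    h, w = cur.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = cur[by:by + block, bx:bx + block].astype(np.int32)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue                      # candidate outside the frame
                    ref_blk = ref[y:y + block, x:x + block].astype(np.int32)
                    sad = np.abs(cur_blk - ref_blk).sum()
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[by // block, bx // block] = best
    return vectors

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (32, 32)).astype(np.uint8)
cur = np.roll(ref, shift=(2, -1), axis=(0, 1))     # global motion
print(full_search_block_matching(ref, cur)[1, 1])  # expect [-2  1] for interior blocks
```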

Keywords: SAD, FSBM, Hardware Implementation, FPGA.

391 A Tree Based Association Rule Approach for XML Data with Semantic Integration

Authors: D. Sasikala, K. Premalatha

Abstract:

The use of eXtensible Markup Language (XML) in web, business and scientific databases has led to the development of methods, techniques and systems to manage and analyze XML data. Semi-structured documents suffer from heterogeneity and high dimensionality. XML structure and content mining represent a convergence of research in semi-structured data and text mining. As the information available on the internet grows drastically, extracting knowledge from XML documents becomes a harder task. In particular, documents are often so large that the data set returned as the answer to a query may be too big to convey the required information. To improve query answering, a Semantic Tree Based Association Rule (STAR) mining method is proposed. This method provides intensional information by considering the structure, the content and the semantics of the content. The method is applied to the Reuters dataset, and the results show that the proposed method performs well.

Keywords: Semi-structured Document, Tree-based Association Rule (TAR), Semantic Association Rule Mining.

390 High Capacity Reversible Watermarking through Interpolated Error Shifting

Authors: Hae-Yeoun Lee

Abstract:

Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error pre-compensation. The intensity of a pixel is interpolated from the intensities of neighboring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error pre-compensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
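The complete scheme modifies the difference histogram and pre-compensates errors to avoid overflow; the sketch below is only a simplified round-trip illustration of the underlying reversible principle, embedding bits in the interpolation (prediction) error of even samples of a 1-D signal. The prediction-error expansion rule used here is a stand-in for the paper's histogram shifting, and overflow handling is omitted.

```python
import numpy as np

def embed(signal, bits):
    """Embed one bit into each even-index sample via prediction-error expansion.
    Odd-index samples stay untouched and serve as the predictor context."""
    s = signal.astype(np.int64).copy()
    k = 0
    for i in range(2, len(s) - 1, 2):
        if k >= len(bits):
            break
        pred = (s[i - 1] + s[i + 1]) // 2        # interpolation from odd neighbours
        err = s[i] - pred
        s[i] = pred + 2 * err + bits[k]          # expanded error carries the bit
        k += 1
    return s

def extract(marked, n_bits):
    m = marked.astype(np.int64).copy()
    bits = []
    for i in range(2, len(m) - 1, 2):
        if len(bits) >= n_bits:
            break
        pred = (m[i - 1] + m[i + 1]) // 2
        err = m[i] - pred
        bits.append(int(err & 1))                # embedded bit is the error's LSB
        m[i] = pred + (err >> 1)                 # restore the original sample
    return np.array(bits), m

rng = np.random.default_rng(7)
original = rng.integers(100, 156, size=20)
payload = rng.integers(0, 2, size=8)

marked = embed(original, payload)
recovered_bits, restored = extract(marked, len(payload))
print(np.array_equal(recovered_bits, payload),
      np.array_equal(restored, original.astype(np.int64)))   # True True
```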

Keywords: Reversible watermarking, High capacity, High quality, Interpolated error shifting, Error pre-compensation.

389 A Lossless Watermarking Based Authentication System For Medical Images

Authors: Samia Boucherkha, Mohamed Benmohamed

Abstract:

In this paper we investigate watermarking-based authentication applied to the medical imagery field. We first give an overview of watermarking technology, paying attention to fragile watermarking since it is the usual scheme for authentication. We then analyze the requirements for image authentication and integrity in medical imagery, and we show that invertible schemes are the best suited for this particular field. A well-known authentication method is studied and adapted here for interleaving patient information and a message authentication code with medical images in a reversible manner, that is, using lossless compression. The resulting scheme enables, on one side, the exact recovery of the original image, which can be unambiguously authenticated, and on the other side, the patient information to be saved or transmitted in a confidential way. To ensure greater security, the patient information is encrypted before being embedded into the images.

Keywords: Medical Imaging, Invertible Watermarking, Authentication, Integrity.

388 Additive Manufacturing with Ceramic Filler Concerning Filament Creation and Strength

Authors: Wolfram Irsa, Lorenz Boruch

Abstract:

Innovative solutions in additive manufacturing applying material extrusion for functional parts necessitate innovative filaments of consistent quality. Uniform homogeneity and consistent dispersion of particles embedded in filaments generally require multiple cycles of extrusion or well-prepared primal matter made by injection molding, kneader machines, or mixing equipment. These technologies require dedicated equipment that is rarely available in production laboratories unfamiliar with research in polymer materials. This stands in contrast to laboratories that investigate complex material topics and technology science to leverage the potential of 3-D printing. Consequently, scientific studies in labs are often constrained to the compositions and concentrations of fillers offered on the market. Therefore, we present a prototypal laboratory methodology scalable to tailored primal matter for extruding ceramic composite filaments with fused filament fabrication (FFF) technology. A desktop single-screw extruder serves as the core device for the experiments. A custom-made filament encapsulates the ceramic fillers, uses polylactide (PLA), a thermoplastic polyester, as the primal matter, and is processed in the melting area of the extruder while preserving the defined concentration of the fillers. Validated results demonstrate that this approach yields continuously produced and uniform composite filaments with consistent homogeneity that are 3-D printable with controllable dimensions, a prerequisite for any scalable application. Additionally, digital microscopy confirms a steady dispersion of the ceramic particles in the composite filament, permitting a 2D reconstruction of the planar distribution of the embedded ceramic particles in the PLA matrix. The innovation of the introduced method lies in the smart simplicity of preparing the composite primal matter: it circumvents the inconvenience of numerous extrusion operations and expensive laboratory equipment, yet delivers consistent filaments of controlled, predictable, and reproducible filler concentration, which is the prerequisite for any industrial application. The introduced prototypal laboratory methodology seems applicable to other polymer matrices and suitable for further particle types beyond ceramic fillers. This opens a roadmap for further laboratory development of particular composite filaments, providing value for industries and societies. This low-threshold entry to the sophisticated preparation of composite filaments, enabling businesses to create their own dedicated filaments, will support the mutual efforts to establish 3D printing for new functional devices.

Keywords: Additive manufacturing, ceramic composites, complex filament, industrial application.

387 Testing Database of Information System using Conceptual Modeling

Authors: Bogdan Walek, Cyril Klimes

Abstract:

This paper focuses on testing the database of an existing information system. First, we describe the basic problems of implemented databases, such as data redundancy, poor design of the database's logical structure or inappropriate data types in the columns of database tables. These problems are often the result of an incorrect understanding of the primary requirements for the database of an information system. We then propose an algorithm to compare the conceptual model created from vague requirements for a database with a conceptual model reconstructed from the implemented database. The algorithm also suggests steps leading to optimization of the implemented database, and it is verified by an implemented prototype. The paper also describes a fuzzy system that works with the vague requirements for a database of an information system, a procedure for creating a conceptual model from vague requirements, and an algorithm for reconstructing a conceptual model from an implemented database.

Keywords: testing, database, relational database, information system, conceptual model, fuzzy, uncertain information, database testing, reconstruction, requirements, optimization

386 A Normalization-based Robust Watermarking Scheme Using Zernike Moments

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form. The Zernike transform is then applied to the normalized image to calculate Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream. The watermark extraction method is blind. Security analysis and false alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Experimental results show that the proposed scheme has very high robustness against various image processing operations and geometric attacks.

Keywords: Image watermarking, Image normalization, Zernike moments, Robustness.

385 Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering

Authors: Elizabeth B. Varghese, M. Wilscy

Abstract:

A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition. Vector Quantization (VQ) based face recognition is a novel approach, and here a new codebook generation method for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network that incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T database, the Yale database, the Indian Face database and a small face database, the DCSKU database, created in our lab. In all the databases the proposed approach achieved a higher recognition rate than most of the existing methods. In terms of Equal Error Rate (EER), the proposed codebook is also better than the existing methods.
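Training the IAFC codebook itself is beyond a short sketch; the code below only illustrates the VQ recognition pipeline such a codebook plugs into, with scikit-learn's k-means standing in for the IAFC codebook generator. Block size, codebook size and the toy data are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def to_blocks(img, block):
    """Split a grayscale image into non-overlapping block vectors."""
    h, w = img.shape
    return np.array([img[y:y + block, x:x + block].ravel()
                     for y in range(0, h - block + 1, block)
                     for x in range(0, w - block + 1, block)], dtype=float)

def train_codebooks(faces_by_person, codebook_size=4, block=4):
    """One VQ codebook per person, built from training-image blocks.
    (k-means stands in for the IAFC codebook generator of the paper.)"""
    codebooks = {}
    for person, images in faces_by_person.items():
        blocks = np.vstack([to_blocks(img, block) for img in images])
        codebooks[person] = KMeans(n_clusters=codebook_size, n_init=10,
                                   random_state=0).fit(blocks).cluster_centers_
    return codebooks

def classify(img, codebooks, block=4):
    """Assign the identity whose codebook quantises the image with least distortion."""
    blocks = to_blocks(img, block)
    def distortion(cb):
        d = np.linalg.norm(blocks[:, None, :] - cb[None, :, :], axis=2)
        return d.min(axis=1).mean()
    return min(codebooks, key=lambda p: distortion(codebooks[p]))

# Toy example: two "identities" with different intensity statistics
rng = np.random.default_rng(3)
data = {"A": [rng.normal(80, 5, (16, 16)) for _ in range(3)],
        "B": [rng.normal(170, 5, (16, 16)) for _ in range(3)]}
books = train_codebooks(data)
print(classify(rng.normal(80, 5, (16, 16)), books))   # expected: A
```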

Keywords: Face Recognition, Vector Quantization, Integrated Adaptive Fuzzy Clustering, Self Organization Map.

384 Face Texture Reconstruction for Illumination Variant Face Recognition

Authors: Pengfei Xiong, Lei Huang, Changping Liu

Abstract:

In illumination-variant face recognition, existing methods that extract the face albedo as a light-normalized image may lose extensive facial details because the lighting template is discarded. To improve on this, a novel approach for realistic facial texture reconstruction that combines the original image and the albedo image is proposed. First, light subspaces of different identities are established from the given reference face images; then, by projecting the original and albedo images into each light subspace, texture reference images with the corresponding lighting are reconstructed and two texture subspaces are formed. From the projections in the texture subspaces, facial texture under normal lighting can be synthesized. Because the original image is included in the combination, facial details can be preserved along with the face albedo. In addition, image partitioning is applied to improve the synthesis performance. Experiments on the Yale B and CMU PIE databases demonstrate that this algorithm outperforms the others both in image representation and in face recognition.

Keywords: texture reconstruction, illumination, face recognition, subspaces

383 High Performance Electrocardiogram Steganography Based on Fast Discrete Cosine Transform

Authors: Liang-Ta Cheng, Ching-Yu Yang

Abstract:

Based on the fast discrete cosine transform (FDCT), the authors present a high-capacity, high-perceived-quality data hiding method for the electrocardiogram (ECG) signal. By applying a simple adjusting policy to the 1-dimensional (1-D) DCT coefficients, a large volume of secret message can be effectively embedded in an ECG host signal and successfully extracted at the intended receiver. Simulations confirmed that the resulting perceived quality is good, while the hiding capacity of the proposed method significantly outperforms that of existing techniques. In addition, the proposed method has a certain degree of robustness. Since the computational complexity is low, the method is feasible for real-time applications.
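The paper's specific adjusting policy for the 1-D DCT coefficients is not given in the abstract; as an illustration of the same family of operations, the sketch below hides bits in mid-band DCT coefficients of a short ECG-like segment by quantization and recovers them from the marked signal. The segment length, coefficient band and step size are assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

def embed_segment(segment, bits, coeff_start=8, step=0.05):
    """Hide `bits` in one segment by quantising mid-band DCT coefficients."""
    c = dct(segment, norm='ortho')
    for k, bit in enumerate(bits):
        idx = coeff_start + k
        q = np.floor(c[idx] / step)
        if int(q) % 2 != bit:               # parity of the quantization index = bit
            q += 1
        c[idx] = (q + 0.5) * step
    return idct(c, norm='ortho')

def extract_segment(marked, n_bits, coeff_start=8, step=0.05):
    c = dct(marked, norm='ortho')
    return [int(np.floor(c[coeff_start + k] / step)) % 2 for k in range(n_bits)]

# Toy ECG-like segment: a noisy sine burst
t = np.linspace(0, 1, 64)
ecg = np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.default_rng(2).normal(size=64)
payload = [1, 0, 1, 1, 0, 0, 1, 0]

marked = embed_segment(ecg, payload)
print(extract_segment(marked, len(payload)))   # should match payload
print(np.max(np.abs(marked - ecg)))            # small time-domain distortion
```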

Keywords: Data hiding, ECG steganography, fast discrete cosine transform, 1-D DCT bundle, real-time applications.

382 Key Based Text Watermarking of E-Text Documents in an Object Based Environment Using Z-Axis for Watermark Embedding

Authors: Mussarat Abdullah, Fazal Wahab

Abstract:

Hiding data in text documents involves considerable complexity due to the nature of text documents. A robust text watermarking scheme targeting an object-based environment is presented in this research. The heart of the proposed solution is the concept of watermarking an object-based text document in which each and every text string is treated as a separate object having its own set of properties. Taking advantage of the z-ordering of objects, the watermark is applied along the z-axis, causing zero fidelity disturbance to the text. The watermark bit sequence generated from the user key is hashed with selected properties of the given document to determine the bit sequence to embed. Bits are embedded along the z-axis, and the document has no fidelity issues when printed, scanned or photocopied.

Keywords: Digital Watermarking, Object Based Environment, Watermark, z-ordering.

381 Terrain Evaluation Method for Hexapod Robot

Authors: Tomas Luneckas, Dainius Udris

Abstract:

In this paper a simple terrain evaluation method for a hexapod robot is introduced. The method is based on evaluating the feet coordinates when all feet are on the ground; from the differences among the feet coordinates, local terrain evaluation is possible. Terrain evaluation is necessary for proper gait selection and/or body position correction. For terrain roughness evaluation, three planes are constructed: two of them use opposite feet coordinates as definition points, and the third coincides with the robot body plane. The lean angle of the body plane is evaluated by measuring the gravity vector with a three-axis accelerometer. The terrain roughness evaluation is based on estimating the angles between the normal vectors of these planes. The aim of this work is to present a simple method for an embedded robot controller that allows finding the best settings for further movement.
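A minimal numeric sketch of the evaluation described: construct planes through opposite feet, take the body-plane normal from the accelerometer gravity reading, and score roughness from the angles between the normals. The foot coordinates and gravity vector below are made-up example values, and the choice of the maximum angle as the roughness score is an assumption.

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3-D points."""
    n = np.cross(np.subtract(p2, p1), np.subtract(p3, p1))
    return n / np.linalg.norm(n)

def angle_between(n1, n2):
    """Angle (degrees) between two plane normals, ignoring normal sign."""
    return np.degrees(np.arccos(np.clip(abs(np.dot(n1, n2)), -1.0, 1.0)))

# Example foot contact points (x, y, z) for a hexapod, in metres
left_feet  = [(0.20, 0.15, 0.00), (0.00, 0.18, 0.02), (-0.20, 0.15, 0.01)]
right_feet = [(0.20, -0.15, 0.03), (0.00, -0.18, 0.00), (-0.20, -0.15, 0.04)]

n_left  = plane_normal(*left_feet)
n_right = plane_normal(*right_feet)

# Body plane normal from a three-axis accelerometer reading (gravity vector)
gravity = np.array([0.4, 0.0, 9.8])
n_body = gravity / np.linalg.norm(gravity)

roughness = max(angle_between(n_left, n_right),
                angle_between(n_left, n_body),
                angle_between(n_right, n_body))
print(f"terrain roughness estimate: {roughness:.1f} degrees")
```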

Keywords: Hexapod robot, pose estimation, terrain evaluation, terrain roughness.

380 A Materialized Approach to the Integration of XML Documents: the OSIX System

Authors: H. Ahmad, S. Kermanshahani, A. Simonet, M. Simonet

Abstract:

The data exchanged on the Web are of a different nature from those treated by classical database management systems; these data are called semi-structured data since they do not have the regular and static structure of data found in a relational database: their schema is dynamic and may contain missing data or types. Therefore, the need to develop further techniques and algorithms to exploit and integrate such data, and to extract relevant information for the user, has arisen. In this paper we present the OSIX system (Osiris-based System for Integration of XML Sources). This system has a data warehouse model designed for the integration of semi-structured data and more precisely for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by Osiris.

Keywords: Data integration, semi-structured data, views, XML.

379 Proximate and Mineral Composition of Chicken Giblets from Vojvodina (Northern Serbia)

Authors: M. R. Jokanović, V. M. Tomović, M. T. Jović, S. B. Škaljac, B. V. Šojić, P. M. Ikonić, T. A. Tasić

Abstract:

The proximate (moisture, protein, total fat, total ash) and mineral (K, P, Na, Mg, Ca, Zn, Fe, Cu and Mn) composition of chicken giblets (heart, liver and gizzard) was investigated. Phosphorus content, as well as proximate composition, was determined according to the recommended ISO methods. The content of all elements except phosphorus in the giblet tissues was determined using inductively coupled plasma-optical emission spectrometry (ICP-OES) after dry-ashing mineralization. Regarding proximate composition, the heart had the highest total fat content and the lowest protein content. The liver had the highest protein and total ash content, while the gizzard had the highest moisture and the lowest total fat content. Regarding mineral composition, the liver was highest in K, P, Ca, Mg, Fe, Zn, Cu and Mn, while the heart was highest in Na. The contents of almost all investigated minerals in the analysed giblet tissues of chickens from Vojvodina were similar to the values reported in the literature, i.e. in the national food composition databases of other countries.

Keywords: Chicken giblets, proximate composition, mineral composition.

378 Nafion Nanofiber Composite Membrane Fabrication for Fuel Cell Applications

Authors: C. N. Okafor, M. Maaza, T. A. E. Mokrani

Abstract:

A proton exchange membrane has been developed for the direct methanol fuel cell (DMFC). The nanofiber network composite membranes were prepared from an interconnected network of Nafion (perfluorosulfonic acid) nanofibers embedded in an uncharged and inert polymer matrix by electrospinning. The spinning solution consisted of Nafion with a low concentration (1 wt% relative to Nafion) of high-molecular-weight poly(ethylene oxide) as a carrier polymer. The interconnected network of Nafion nanofibers, with average fiber diameters in the range of 160-700 nm, was used to make the membranes, with the nanofibers occupying up to 85% of the membrane volume. The matrix polymer was crosslinked with Norland Optical Adhesive 63 under UV light. The resulting membranes showed a proton conductivity of 0.10 S/cm at 25°C and 80% RH, and a methanol permeability of 3.6 × 10⁻⁶ cm²/s.

Keywords: Composite membrane, electrospinning, fuel cell, nanofibers.

377 Integrating Security Indifference Curve to Formal Decision Evaluation

Authors: Anon Yantarasri, Yachai Limpiyakorn

Abstract:

Decisions are regularly made during a project or in daily life. Some decisions are critical and have a direct impact on project or human success, so formal evaluation is required, especially for crucial decisions, to arrive at the optimal solution among the alternatives. According to microeconomic theory, all people's decisions can be modeled as indifference curves. The proposed approach supports formal analysis and decision making by constructing an indifference curve model from previous experts' decision criteria. This knowledge, embedded in the system, can be reused to help naïve users select an alternative solution to a similar problem. Moreover, the method is flexible enough to cope with an unlimited number of factors influencing the decision. The preliminary experimental results of alternative selection accurately match the experts' decisions.

Keywords: Decision Analysis and Resolution, Indifference Curve, Multi-criteria Decision Making.

376 An Enhanced Particle Swarm Optimization Algorithm for Multiobjective Problems

Authors: Houda Abadlia, Nadia Smairi, Khaled Ghedira

Abstract:

Multiobjective Particle Swarm Optimization (MOPSO) has shown effective performance in solving test functions and real-world optimization problems. However, this method suffers from premature convergence, which may lead to a lack of diversity. In order to improve its performance, this paper presents a hybrid approach that embeds MOPSO into the island model and integrates a local search technique, Variable Neighborhood Search, to enhance diversity in the swarm. Experiments on two series of test functions show the effectiveness of the proposed approach. A comparison with other evolutionary algorithms shows that the proposed approach performs well in solving multiobjective optimization problems.

Keywords: Particle swarm optimization, migration, variable neighborhood search, multiobjective optimization.

375 Modeling and Simulations of Complex Low-Dimensional Systems: Testing the Efficiency of Parallelization

Authors: Ryszard Matysiak, Grzegorz Kamieniarz

Abstract:

The deterministic quantum transfer-matrix (QTM) technique and its mathematical background are presented. This important tool in computational physics can be applied to a class of real physical low-dimensional magnetic systems described by the Heisenberg Hamiltonian, which includes macroscopic molecular-based spin chains, small magnetic clusters embedded in some supramolecules, and other interesting compounds. Using QTM, the spin degrees of freedom are accurately taken into account, yielding the thermodynamic functions at finite temperatures. To test the susceptibility calculations in a parallel environment, the speed-up and efficiency of parallelization are analyzed on our platform, an SGI Origin 3800 with p = 128 processor units. Using Message Passing Interface (MPI) system libraries, we find a code efficiency of 94% for p = 128, which makes our application highly scalable.
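For reference, the quoted parallel efficiency follows from the standard definitions of speedup S(p) = T1/Tp and efficiency E(p) = S(p)/p; the timings in the sketch below are placeholders chosen only to reproduce the 94% figure at p = 128.

```python
def speedup(t_serial, t_parallel):
    """Speedup S(p) = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    """Parallel efficiency E(p) = S(p) / p."""
    return speedup(t_serial, t_parallel) / p

# Placeholder timings: an efficiency of 0.94 at p = 128 corresponds to a
# speedup of about 120, i.e. T_128 ~= T_1 / 120.
t1, p = 3600.0, 128
t128 = t1 / (0.94 * p)
print(speedup(t1, t128), efficiency(t1, t128, p))   # ~120.3, 0.94
```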

Keywords: Deterministic simulations, low-dimensional magnets, modeling of complex systems, parallelization.

374 Numerical Simulation of Fiber Bragg Grating Spectrum for Mode-І Delamination Detection

Authors: O. Hassoon, M. Tarfoui, A. El Malk

Abstract:

A fiber Bragg grating (FBG) optical sensor is embedded in composite material to detect and monitor the damage that occurs in composite structures. In this paper, we deal with mode-I delamination to determine the material's resistance to crack propagation, using coupled-mode theory and the T-matrix method to simulate the FBG spectrum for both uniform and non-uniform strain distributions. The double cantilever beam (DCB) test is modeled in FEM to determine the longitudinal strain. Two models are implemented: the first is the global half model, and the second is a sub-model that represents the FBG region with a more refined mesh. This method can simulate damage in composite structures and convert strain into a wavelength shift in the FBG spectrum.
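As a quick numeric check of the strain-to-wavelength conversion such a simulation relies on, the sketch below uses the standard uniform-strain relation Δλ_B = λ_B (1 − p_e) ε with a typical effective photo-elastic coefficient for silica fibre; the grating wavelength and strain value are illustrative, not taken from the paper.

```python
def bragg_shift(lambda_bragg_nm, strain, p_e=0.22):
    """Bragg wavelength shift (nm) for a uniform axial strain.
    p_e is the effective photo-elastic coefficient (~0.22 for silica)."""
    return lambda_bragg_nm * (1.0 - p_e) * strain

# A 1550 nm grating under 1000 microstrain: roughly a 1.2 nm shift
print(bragg_shift(1550.0, 1000e-6))   # ~1.21 nm
```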

Keywords: Fiber Bragg grating, Delamination detection, DCB, FBG spectrum, Structure health monitoring.

373 A Novel Implementation of Application Specific Instruction-set Processor (ASIP) using Verilog

Authors: Kamaraju.M, Lal Kishore.K, Tilak.A.V.N

Abstract:

The general-purpose processors used in embedded systems must satisfy constraints such as execution time, power consumption and code size. An Application Specific Instruction-set Processor (ASIP), on the other hand, has advantages in terms of power consumption, performance and flexibility. In this paper, a 16-bit Application Specific Instruction-set Processor for sensor data transfer is proposed. The designed processor architecture consists of on-chip transmitter and receiver modules along with the processing and controlling units, enabling data transmission and reception on a single die. The data transfer is accomplished with fewer instructions than on a general-purpose processor. The ASIP core operates at a maximum clock frequency of 1.132 GHz with a delay of 0.883 ns and consumes 569.63 mW at an operating voltage of 1.2 V. The ASIP is implemented in Verilog HDL using the Xilinx platform on a Virtex-4 device.

Keywords: ASIP, Data transfer, Instruction set, Processor

372 Design of a Novel Inclination Sensor Utilizing Grayscale Image

Authors: Tuhin Subhra Sarkar, Subir Das

Abstract:

Several research works in recent times have utilized grayscale images for the measurement of many physical phenomena. In this paper, we have designed an embedded inclination sensor utilizing a grayscale image, with a resolution of 0.3°. The sensor module consists of a circular metal disc laminated with a grayscale image and an optical transceiver. The sensing principle is based on temporal changes in light intensity caused by the movement of the grayscale image as the target surface inclines; the variation in light intensity is converted to a voltage by the signal processing circuit (SPC). The output of the SPC is fed to a microcontroller program to display the inclination angle digitally. The experimental results show a satisfactory performance of the sensor over a small inclination measurement range of -40° to +40° with a sensitivity of 62 mV/°.
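Using the reported sensitivity of 62 mV per degree, the microcontroller-side conversion from the processed voltage to an inclination angle reduces to a linear mapping; the zero-angle offset voltage below is an assumed calibration value, not a figure from the paper.

```python
SENSITIVITY_V_PER_DEG = 0.062      # 62 mV/degree, as reported
V_ZERO = 2.5                       # assumed SPC output voltage at 0 degrees

def inclination_deg(v_out):
    """Convert the signal-processing-circuit output voltage to an angle."""
    return (v_out - V_ZERO) / SENSITIVITY_V_PER_DEG

for v in (2.5, 3.12, 1.88):
    print(f"{v:.2f} V -> {inclination_deg(v):+.1f} deg")
# 2.50 V -> +0.0, 3.12 V -> +10.0, 1.88 V -> -10.0 (within the +/-40 deg range)
```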

Keywords: Grayscale image, Inclination Sensor, Microcontroller Program, Signal Processing Circuit.

371 Bottom Up Text Mining through Hierarchical Document Representation

Authors: Y. Djouadi., F. Souam.

Abstract:

Most existing text mining approaches are proposed with the transaction database model in mind. The mined dataset is thus structured using just one concept, the "transaction", whereas the whole dataset is modeled using the "set" abstract type. In such cases, the structure of the whole dataset and the relationships among the transactions themselves are not modeled and, consequently, not considered in the mining process. We believe that taking the structural properties of hierarchically structured information (e.g. textual documents) into account in the mining process can lead to better results. For this purpose, a hierarchical association rule mining approach for textual documents is proposed in this paper, and the classical set-oriented mining approach is reconsidered in favor of a Directed Acyclic Graph (DAG) oriented approach. Natural language processing techniques are used to obtain the DAG structure. Based on this graph model, a hierarchical bottom-up algorithm is proposed whose main idea is that each node is mined together with its parent node.

Keywords: Graph based association rules mining, Hierarchical document structure, Text mining.

370 Professionals’ Collaboration on Strengthening the Teaching of History

Authors: L. B. Ni, N. S. Bt Rohadi, H. Bt Alfana, A. S. Bin Ali Hassan, J. Bin Karim, C. Bt Rasin

Abstract:

This paper discusses the shared effort of teaching history in K-12 schools, community colleges, four-year colleges and universities to develop students' understanding of history and of historical habits of thought. The study presents and discusses problems shared by K-12 schools, colleges and universities, together with secondary school principals. It also shows that the changing nature of practice can define new trends and affect the history professional in the classroom. There are many problems that college historians and high school history teachers share, and history teachers can and should do better at engaging students in the classroom. History provides valuable insights and embedded analysis models for a world that is in conflict and changing quickly, which makes it exceptionally valuable. The survey results reflect history teaching in Malaysia.

Keywords: History issue, history teaching, school-university collaboration, history profession.

369 Localizing Acoustic Touch Impacts using Zip-stuffing in Complex k-space Domain

Authors: R. Bremananth, Andy W. H. Khong, A. Chitra

Abstract:

Visualizing sound and noise often helps us determine appropriate control of source localization. Near-field acoustic holography (NAH) is a powerful tool for this ill-posed problem. However, in practice, due to the small finite aperture size, discrete Fourier transform (FFT) based NAH cannot predict the active region of interest (AROI) near the edges of the plane. Theoretically, a few approaches have been proposed for solving the finite aperture problem, but most of these methods are not well suited to practical implementation, especially near the edges of the source. In this paper, a zip-stuffing extrapolation approach with a 2D Kaiser window is suggested. It operates in the complex wavenumber (k-space) domain to localize the predicted sources. We numerically form a test environment with touch impact databases to evaluate the localization of the sound source. It is observed that zip-stuffing aperture extrapolation and the 2D window with evanescent components provide more accuracy, especially for small apertures and their derivatives.
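The zip-stuffing interpolation itself is specific to the paper; the sketch below only shows the generic supporting step it builds on, namely forming a 2-D Kaiser window (outer product of two 1-D Kaiser windows), tapering the measured aperture with it and zero-padding before transforming to the wavenumber domain. The aperture size, padded size and beta parameter are assumptions.

```python
import numpy as np

def kaiser_2d(rows, cols, beta=6.0):
    """Separable 2-D Kaiser window as the outer product of two 1-D windows."""
    return np.outer(np.kaiser(rows, beta), np.kaiser(cols, beta))

def windowed_kspace(pressure_aperture, pad_to=128, beta=6.0):
    """Taper the measured aperture with a Kaiser window, zero-pad it to reduce
    truncation (finite-aperture) artefacts, and return the k-space spectrum."""
    h, w = pressure_aperture.shape
    tapered = pressure_aperture * kaiser_2d(h, w, beta)
    padded = np.zeros((pad_to, pad_to), dtype=complex)
    padded[:h, :w] = tapered
    return np.fft.fftshift(np.fft.fft2(padded))

rng = np.random.default_rng(5)
aperture = rng.normal(size=(32, 32))          # stand-in for a measured hologram plane
K = windowed_kspace(aperture)
print(K.shape)                                # (128, 128) wavenumber spectrum
```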

Keywords: Acoustic source localization, Near-field acoustic holography (NAH), FFT, Extrapolation, k-space wavenumber errors.

368 Image Retrieval Based on Multi-Feature Fusion for Heterogeneous Image Databases

Authors: N. W. U. D. Chathurani, Shlomo Geva, Vinod Chandran, Proboda Rajapaksha

Abstract:

Selecting an appropriate image representation is the most important factor in implementing an effective Content-Based Image Retrieval (CBIR) system. This paper presents a multi-feature fusion approach for efficient CBIR based on the distance distribution of features and relative feature weights computed at query time. It is a simple yet effective approach that is free from the effects of feature dimensions, ranges, internal feature normalization and the choice of distance measure, and it can easily be adopted in any feature combination to improve retrieval quality. The proposed approach is empirically evaluated using two benchmark datasets for image classification (a subset of the Corel dataset and Oliva and Torralba) and compared with existing approaches. The results confirm significantly improved performance in comparison with the independently evaluated baselines of previously proposed feature fusion approaches.
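The paper's query-time weighting rule is derived from the distance distributions themselves and is not reproduced here; the sketch below illustrates only the scale-free part of the idea, converting each feature's query-to-database distances to fractional ranks before a weighted combination, with uniform weights as a placeholder.

```python
import numpy as np

def fuse_distances(per_feature_distances, weights=None):
    """Combine per-feature query-to-database distances into one ranking.

    Each row of `per_feature_distances` holds one feature's distances to all
    database images. Converting distances to fractional ranks makes the fusion
    independent of each feature's dimension, range and distance measure; the
    weights (uniform by default) stand in for the paper's query-time weights.
    """
    D = np.asarray(per_feature_distances, dtype=float)
    n_features, n_images = D.shape
    if weights is None:
        weights = np.ones(n_features) / n_features
    ranks = np.argsort(np.argsort(D, axis=1), axis=1) / (n_images - 1)
    fused = (np.asarray(weights)[:, None] * ranks).sum(axis=0)
    return np.argsort(fused)        # database indices, best match first

# Toy example: colour and texture distances for 5 database images
colour  = [0.10, 0.80, 0.30, 0.90, 0.20]     # small = similar
texture = [120,  400,  150,  500,  300]      # different scale and range
print(fuse_distances([colour, texture]))      # [0 2 4 1 3]
```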

Keywords: Feature fusion, image retrieval, membership function, normalization.
