Search results for: General-Purpose computation on Graphics Processing Units
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2429

809 Feature's Extraction of Human Body Composition in Images by Segmentation Method

Authors: Mousa Mojarrad, Mashallah Abbasi Dezfouli, Amir Masoud Rahmani

Abstract:

Detection and recognition of human body composition and extraction of its measures (width and length of the human body) in images are a major issue in object detection and an important field of image, signal and vision computing in recent years. Finding people in images and extracting their features is a particularly difficult problem in object recognition, because people can vary greatly in appearance. This variability may be due to the configuration of a person (e.g., standing vs. sitting vs. jogging), the pose (e.g., frontal vs. lateral view), clothing, and variations in illumination. In this study, the human body is first recognized in the image, and then its measures are extracted from the image.

Keywords: Analysis of image processing, Canny edge detection, classification, feature extraction, human body recognition, segmentation.

808 Vision-based Network System for Industrial Applications

Authors: Taweepol Suesut, Arjin Numsomran, Vittaya Tipsuwanporn

Abstract:

This paper presents a communication network that allows a machine vision system to be integrated with control systems and logistics applications in an industrial environment. Real-time distribution over the network is very important for communication among the vision nodes, image processing and control, as well as the distributed I/O nodes. A robust implementation, both with respect to camera packaging and data transmission, has been taken into account. The network consists of a gigabit Ethernet network and a switch with an integrated firewall that distributes the data and provides connection to the imaging control station and IEC 61131-conformant signal integration based on the Modbus TCP protocol. The real-time and delay-time properties of each part of the network were considered and worked out in this paper.

Keywords: Distributed Real-Time Automation, Machine Vision and Ethernet.

807 A Feature-based Invariant Watermarking Scheme Using Zernike Moments

Authors: Say Wei Foo, Qi Dong

Abstract:

In this paper, a novel feature-based image watermarking scheme is proposed. Zernike moments, which have invariance properties, are adopted in the scheme. In the proposed scheme, feature points are first extracted from the host image and several circular patches centered on these points are generated. The patches are used as carriers of the watermark information because they can be regenerated to locate watermark embedding positions even when watermarked images are severely distorted. The Zernike transform is then applied to the patches to calculate local Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments, followed by a false-alarm analysis. Experimental results show that the quality degradation of the watermarked image is visually transparent. The proposed scheme is very robust against image processing operations and geometric attacks.

Keywords: Image watermarking, Zernike moments, Feature point, Invariance, Robustness.

806 New VLSI Architecture for Motion Estimation Algorithm

Authors: V. S. K. Reddy, S. Sengupta, Y. M. Latha

Abstract:

This paper presents an efficient VLSI architecture design to achieve real-time video processing using the Full-Search Block Matching (FSBM) algorithm. The design employs a parallel bank architecture with minimum latency, maximum throughput, and full hardware utilization. We use nine parallel processors in our architecture, each controlled by a state machine. The state machine control implementation makes the design very simple and cost effective. The design is implemented in VHDL, and the programming techniques we incorporated make the design completely programmable in the sense that the search ranges and the block sizes can be varied to suit any given requirements. The design can operate at frequencies up to 36 MHz, and it can process QCIF and CIF video resolutions at 1.46 MHz and 5.86 MHz, respectively.
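
As a point of reference for the block-matching computation such an architecture parallelizes, here is a minimal software sketch of full-search block matching with a sum-of-absolute-differences (SAD) criterion; the block size and search range are illustrative parameters, not values taken from the paper.

```python
import numpy as np

def full_search_bm(ref, cur, block=16, search=7):
    """Exhaustive block matching: for every block of the current frame, find
    the motion vector within +/- search pixels in the reference frame that
    minimises the sum of absolute differences (SAD)."""
    h, w = cur.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = cur[by:by + block, bx:bx + block].astype(np.int32)
            best_sad, best_mv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    ref_blk = ref[y:y + block, x:x + block].astype(np.int32)
                    sad = int(np.abs(cur_blk - ref_blk).sum())
                    if sad < best_sad:
                        best_sad, best_mv = sad, (dy, dx)
            vectors[by // block, bx // block] = best_mv
    return vectors
```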

Keywords: Video Coding, Motion Estimation, Full-Search, Block-Matching, VLSI Architecture.

805 Member Investment Willingness in Agricultural Cooperatives in Shaanxi (China)

Authors: Lijia Wang, Xuexi Huo

Abstract:

This study analyzes the characteristics that determine members' willingness to invest in cooperatives using an ordered logit model. The data were collected in a field survey among 122 cooperative members in north-central China. The descriptive analysis of the survey evidence suggests that cooperatives in China generally have a poor ability to deliver processing services related to product packaging, grading, and storage, perform worse in profitability, and are unable to provide returns to capital or to obtain agricultural loans. The regression results demonstrate that members' farm size, their satisfaction with the cooperative's price preferential services, and their attitudes toward the cooperative's operational scale and development potential have a statistically significant impact on willingness to invest.

Keywords: Cooperatives, investment willingness, member, ordered logit.

804 Digital Cinema Watermarking State of Art and Comparison

Authors: H. Kelkoul, Y. Zaz

Abstract:

Nowadays, the vigorous popularity of video processing techniques has resulted in an explosive growth of illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore some watermarking techniques in order to observe their specificities and select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that includes the date, time and place where the piracy occurred. We have studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain, and the digital cinema watermarking transform domain. In this paper, a detailed technique is presented in which embedding is performed using the direct sequence spread spectrum technique in the DWT transform domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
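
The abstract names direct-sequence spread spectrum embedding in the DWT domain; as a rough, hedged illustration of that family of techniques (not the authors' exact algorithm), the sketch below adds pseudo-random patterns to the diagonal detail band and recovers bits by correlation. The wavelet, the gain alpha and the function names are assumptions.

```python
import numpy as np
import pywt

def embed_ss_dwt(image, bits, alpha=2.0, seed=0):
    """Spread-spectrum embedding in the DWT domain: each bit adds or subtracts
    a pseudo-random pattern in the level-1 diagonal detail band."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    rng = np.random.default_rng(seed)
    patterns = rng.choice([-1.0, 1.0], size=(len(bits),) + HH.shape)
    for b, p in zip(bits, patterns):
        HH = HH + alpha * (1.0 if b else -1.0) * p
    return pywt.idwt2((LL, (LH, HL, HH)), "haar")

def detect_ss_dwt(marked, n_bits, seed=0):
    """Recover bits from the sign of the correlation between the detail band
    and each pattern (host interference is ignored in this toy detector)."""
    _, (_, _, HH) = pywt.dwt2(marked.astype(float), "haar")
    rng = np.random.default_rng(seed)
    patterns = rng.choice([-1.0, 1.0], size=(n_bits,) + HH.shape)
    return [1 if float((HH * p).sum()) > 0 else 0 for p in patterns]
```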

Keywords: Digital cinema, watermarking, wavelet, spread spectrum, JPEG2000.

803 Parametric Investigation of Diode and CO2 Laser in Direct Metal Deposition of H13 Tool Steel on Copper Substrate

Authors: M. Khalid Imran, Syed Masood, Milan Brandt, Sudip Bhattacharya, Jyotirmoy Mazumder

Abstract:

In the present investigation, H13 tool steel has been deposited on a copper alloy substrate using both CO2 and diode lasers. A detailed parametric analysis has been carried out in order to find the optimum processing zone for coating defect-free H13 tool steel on the copper alloy substrate. Following the parametric optimization, the microstructure and microhardness of the deposited clads have been evaluated. SEM micrographs revealed a dendritic microstructure in both clads. However, the microhardness of the CO2 laser deposited clad was much higher than that of the diode laser deposited clad.

Keywords: CO2 laser, Diode laser, Direct Metal Deposition, Microstructure, Microhardness, Porosity.

802 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: G. Candel, D. Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure-preservation property and its answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to the number of items it contains, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is computed but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it becomes infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and such a model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing us to observe the birth, evolution and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
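
The paper's own method couples an embedding-shape cost with a support-matching cost; the sketch below is not that optimization, only a crude illustration of the underlying idea of chaining embeddings over data chunks so successive runs stay visually coherent, using scikit-learn's t-SNE with an explicit initialization. The function name, the nearest-neighbour seeding and all parameters are assumptions.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

def chunked_tsne(chunks, perplexity=30, random_state=0):
    """Embed a sequence of data chunks, seeding each run from the previous
    embedding so that cluster positions drift as little as possible."""
    embeddings, prev_X, prev_Y = [], None, None
    for X in chunks:
        if prev_X is None:
            init = "pca"
        else:
            # place each new point at the embedded position of its nearest
            # neighbour in the previous chunk (a crude "support" match)
            nn = NearestNeighbors(n_neighbors=1).fit(prev_X)
            idx = nn.kneighbors(X, return_distance=False).ravel()
            jitter = 1e-4 * np.random.default_rng(random_state).standard_normal((len(X), 2))
            init = prev_Y[idx] + jitter
        Y = TSNE(n_components=2, perplexity=perplexity, init=init,
                 random_state=random_state).fit_transform(X)
        embeddings.append(Y)
        prev_X, prev_Y = X, Y
    return embeddings
```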

Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.

801 Solving Process Planning and Scheduling with Number of Operation Plus Processing Time Due-Date Assignment Concurrently Using a Genetic Search

Authors: Halil Ibrahim Demir, Alper Goksu, Onur Canpolat, Caner Erden, Melek Nur

Abstract:

Traditionally, process planning, scheduling and due-date assignment are performed sequentially and separately. The high interrelation between these functions makes their integration very useful. Although there are numerous works on integrated process planning and scheduling and many works on scheduling with due-date assignment, there are only a few works on the integration of all three functions. Here we tested different integration levels of these three functions and found the fully integrated version to be the best. We applied genetic search and random search, and genetic search performed better than random search. We penalized all earliness, tardiness and due-date related costs; since all three terms are undesired, it is better to penalize all of them.
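
As a hedged toy illustration of a genetic search over schedules with earliness, tardiness and due-date penalties (the paper's encoding, operators and its number-of-operations-plus-processing-time due-date rule are not reproduced here), consider the single-machine sketch below; every name and weight is an assumption.

```python
import random

def schedule_cost(seq, proc, k=1.5, w_e=2.0, w_t=8.0, w_d=1.0):
    """Toy single-machine objective: each job's due date is a crude function
    of its processing time, and earliness, tardiness and due-date looseness
    are all penalised, mirroring the integrated objective."""
    t, cost = 0.0, 0.0
    for j in seq:
        t += proc[j]
        due = k * proc[j]
        cost += w_e * max(due - t, 0) + w_t * max(t - due, 0) + w_d * due
    return cost

def genetic_search(proc, pop_size=40, gens=200, pm=0.3, seed=0):
    """Minimal genetic search: elitist truncation plus swap mutation
    (crossover omitted for brevity)."""
    random.seed(seed)
    n = len(proc)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: schedule_cost(s, proc))
        nxt = [s[:] for s in pop[: pop_size // 2]]        # keep the best half
        while len(nxt) < pop_size:
            child = random.choice(nxt)[:]
            if random.random() < pm:
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]   # swap mutation
            nxt.append(child)
        pop = nxt
    best = min(pop, key=lambda s: schedule_cost(s, proc))
    return best, schedule_cost(best, proc)

print(genetic_search([4, 2, 7, 3, 5, 1]))
```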

Keywords: Process planning, scheduling, due-date assignment, genetic algorithm, random search.

800 A Pipelined FSBM Hardware Architecture for HTDV-H.26x

Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the MPEG and H.26x standards, motion estimation is used to eliminate temporal redundancy. Given that the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a reconfigurable circuit is crucial for the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the "Full Search Block Matching" (FSBM) algorithm. This architecture offers minimum latency, maximum throughput and full utilization of hardware resources such as embedded memory blocks, combining both pipelining and parallel processing techniques. Our design is described in VHDL, verified by simulation and implemented on a Stratix II EP2S130F1020C4 FPGA circuit. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160 Mpixels/s.

Keywords: SAD, FSBM, Hardware Implementation, FPGA.

799 A New Floating Point Implementation of Base 2 Logarithm

Authors: Ahmed M. Mansour, Ali M. El-Sawy, Ahmed T Sayed

Abstract:

Logarithms reduce products to sums and powers to products; they play an important role in signal processing, communication and information theory. They are primarily used in hardware calculations, handling multiplications, divisions, powers and roots effectively. There are three commonly used bases for logarithms: the logarithm with base 10 is called the common logarithm, the natural logarithm has base e, and the binary logarithm has base 2. This paper demonstrates different methods of calculating log2, showing the complexity of each, identifies the most accurate and efficient one, and gives insights into their hardware design. We present a new method, called Floor Shift, for fast calculation of log2, and then combine this algorithm with a Taylor series to improve the accuracy of the output, which we illustrate with two examples. We finally compare the algorithms and conclude with our remarks.
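
The abstract does not spell out the formulas, so the sketch below is only one plausible reading of a floor-shift-plus-Taylor scheme: repeated shifts recover floor(log2 x), and a truncated Taylor series of ln(1 + f) refines the fractional part. The number of series terms is an assumption.

```python
import math

LN2 = 0.6931471805599453

def log2_floor_shift(x, terms=8):
    """Approximate log2(x) for x > 0: the integer part comes from repeated
    halving/doubling (the 'floor shift'), the fractional part from a
    truncated Taylor series of ln(1 + f) with 0 <= f < 1."""
    if x <= 0:
        raise ValueError("log2 is undefined for non-positive values")
    e = 0
    while x >= 2.0:          # in hardware this is a shift of the exponent
        x /= 2.0
        e += 1
    while x < 1.0:
        x *= 2.0
        e -= 1
    f = x - 1.0
    # ln(1 + f) = f - f^2/2 + f^3/3 - ...  (accuracy drops as f approaches 1)
    ln1pf = sum((-1) ** (n + 1) * f ** n / n for n in range(1, terms + 1))
    return e + ln1pf / LN2

print(log2_floor_shift(10.0), math.log2(10.0))  # ~3.3219 vs 3.32192...
```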

Keywords: Logarithms, log2, floor, iterative, CORDIC, Taylor series.

798 The Necessity of Optimized Management on Surface Water Sources of Zayanderood Basin

Authors: A. Gandomkar, K. Fouladi

Abstract:

One of the factors crucial to the comprehensive development of an area is the provision of water sources, together with their appropriate management. Population growth and food security for such a population require sustainable development as well as a reform of traditional management in order to increase the benefit drawn from these sources; in this way, the sustainable exploitation of the sources for the next generations is taken into account. Achieving this development without considering and enabling water development would be very difficult. The Zayanderood basin, with an area of 41,500 square kilometers, contains 7 sub-basins and 20 hydrologic units. In this basin, only a small part of the total precipitation enters the river currents, and the rest is lost to efficient use in various ways. The most important surface current of this basin is the Zayanderood River, 403 kilometers long, which originates on the eastern slopes of the Zagros mountains and, after draining the basin, flows into the Gaavkhooni pond. The existence of various water sources and consumptions in the Zayanderood basin, the transfer of water from other basins into it, the resulting conflicts between upstream and downstream beneficiaries, the presence of valuable natural ecosystems such as the Gaavkhooni swamp, and finally the drought conditions and water scarcity of the area all call for comprehensive management of the water sources of this central Iranian basin: a form of management that balances the development and management of water sources so as to increase economic and social benefits. In this study, the network of surface water sources of the basin is surveyed in its upper and lower sections; then, given the difficulties and deficiencies of efficient water-source management in this basin, together with the problems of drainage and the destructive phenomenon of flooding, guidelines appropriate to the conditions of the region are presented in order to prevent the diversion of water in the upper sections and to support the development of regions in the lower sections of the Zayanderood dam.

Keywords: Zayanderood Basin, Efficient Management, Hydrology Climate.

797 Consistent Modeling of Functional Dependencies along with World Knowledge

Authors: Sven Rebhan, Nils Einecke, Julian Eggert

Abstract:

In this paper we propose a method for vision systems to consistently represent functional dependencies between different visual routines along with relational short- and long-term knowledge about the world. Here, the visual routines are bound to visual properties of objects stored in the memory of the system. Furthermore, the functional dependencies between the visual routines are represented as a graph that also belongs to the object's structure. This graph is parsed in the course of acquiring a visual property of an object in order to automatically resolve the dependencies of the bound visual routines. Using this representation, the system is able to dynamically rearrange the processing order while keeping its functionality. Additionally, the system is able to estimate the overall computational cost of a certain action. We also show that the system can efficiently use this structure to incorporate already acquired knowledge and thus reduce the computational demand.
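
Parsing such a dependency graph to find a valid processing order is essentially a topological sort; the minimal sketch below illustrates that step with Python's standard library. The routine names and the graph itself are purely illustrative, not taken from the paper.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical graph: each visual routine maps to the routines whose results
# it needs before it can run.
dependencies = {
    "colour_histogram": {"segmentation"},
    "shape_descriptor": {"segmentation", "edge_map"},
    "segmentation": {"image_capture"},
    "edge_map": {"image_capture"},
    "image_capture": set(),
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)  # a processing order that respects all dependencies

# Summing per-routine costs along this order would give a rough estimate of
# the overall computational cost of acquiring a property.
```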

Keywords: Adaptive systems, Knowledge representation, Machine vision, Systems engineering.

796 Analysis and Measuring Surface Roughness of Nonwovens Using Machine Vision Method

Authors: Dariush Semnani, Javad Yekrang, Hossein Ghayoor

Abstract:

Since the measurement of friction properties of textiles and fabrics using the Kawabata Evaluation System (KES) yields only the surface friction factor of the fabric and no other data, this research was conducted to gain information about surface roughness from the surface friction factor. To assess the roughness properties of light nonwovens, a 3-dimensional model of a surface was simulated with regular sinusoidal waves through it as an ideal surface. A new factor, named the Surface Roughness Factor, was defined by comparing the roughness properties of the simulated surface and real specimens. The relation between the proposed factor and the friction factor of the specimens was analyzed by regression, and the results showed a meaningful correlation between them. It can be inferred that the new factor can be used as an acceptable criterion for evaluating the roughness properties of light nonwoven fabrics.

Keywords: Surface roughness, Nonwoven, Machine vision, Image processing.

795 On a Pitch Duration Technique for Prosody Control

Authors: JongKuk Kim, HernSoo Hahn, Uei-Joong Yoo, MyungJin Bae

Abstract:

In this paper, we propose a method of altering duration in the frequency domain that controls prosody in real time after pitch alteration. A method for freely altering duration, among the prosody information, could be used in several fields such as proof-reading the pronunciation of persons with speech impediments or language study. The pitch alteration used to control prosody is performed with the PSOLA synthesis method, which is a time-domain processing method; the duration of the pitch-altered speech, however, is changed in the frequency domain. In this paper, we altered the duration using a duration-alteration method based on the Fast Fourier Transform in the frequency domain. Consequently, when both pitch and duration are controlled, the intelligibility decreases slightly compared with the case when only the pitch is changed, but the proposed algorithm obtained a higher MOS score for naturalness.

Keywords: PSOLA, Pitch Alteration, Duration Control.

794 Review of Surface Electromyogram Signals: Its Analysis and Applications

Authors: Anjana Goen, D. C. Tiwari

Abstract:

Electromyography (EMG) is the study of muscle function through the analysis of the electrical activity produced by muscles. This electrical activity, which is displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common techniques of EMG signal recording use surface and needle/wire electrodes, with the latter usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts and signal instability. Thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various interests such as neuromuscular disease, enhancement of muscular function and human-computer interfaces.

Keywords: Evolvable hardware (EHW), Functional Electrical Stimulation (FES), Hidden Markov Model (HMM), Hjorth Time Domain (HTD).

793 Particle Image Velocimetry for Measuring Water Flow Velocity

Authors: King Kuok Kuok, Po Chan Chiu

Abstract:

Floods are natural phenomena which may turn into disasters, causing widespread damage, health problems and even deaths. Nowadays, floods have become more serious and more frequent due to climatic changes. During flooding, discharge measurements can still be taken by standing on a bridge across the river using a portable measurement instrument. However, it is too dangerous to get near the river, especially during a high flood. Therefore, this study employs Particle Image Velocimetry (PIV) as a tool to measure the surface flow velocity. PIV is an image processing technique that tracks the movement of water from one point to another. The PIV code was developed using Matlab. In this study, 18 ping-pong balls were scattered over the surface of the drain, and images were taken with a digital SLR camera. The images obtained were analyzed using the PIV code. Results show that PIV is able to produce the flow velocity by analyzing the series of images captured.
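
The study's processing was done in Matlab; as a language-neutral illustration of the core PIV step, the NumPy sketch below estimates the displacement of an interrogation window between two frames with FFT-based cross-correlation. The window size and the toy shift are assumptions.

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the (dy, dx) displacement of an interrogation window between
    two frames from the peak of their circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    shape = np.array(corr.shape)
    # fold peaks in the upper half of the window into negative displacements
    disp = (peak + shape // 2) % shape - shape // 2
    return tuple(int(d) for d in disp)

# toy example: frame B is frame A shifted 3 px down and 2 px right
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (3, 2), axis=(0, 1))
print(piv_displacement(a, b))  # expected (3, 2); velocity = shift / frame interval
```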

Keywords: Particle Image Velocimetry, flow velocity, surface flow.

792 Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach

Authors: Cláudio Santos, Madalena Araújo, Nuno Correia

Abstract:

The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. Therefore, the management of technological innovation is continually faced with uncertainty about the future. These issues lead to the need for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method to analyze the complex interactions between events in several areas, departing from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. The methodology is applied to a sheet metal processing equipment manufacturer as a case study.

Keywords: Competencies, Delphi Method, Quality Function Deployment, Technology Foresight.

791 A Normalization-based Robust Watermarking Scheme Using Zernike Moments

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form. The Zernike transform is then applied to the normalized image to calculate the Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream. The watermark extraction method is a blind method. Security analysis and false-alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Experimental results show that the proposed scheme has very high robustness against various image processing operations and geometric attacks.
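
Dither modulation is a quantization-index-modulation (QIM) scheme; the short sketch below shows the basic binary version applied to an array of moment magnitudes. The step size delta and the variable names are illustrative, not values from the paper.

```python
import numpy as np

def qim_embed(magnitudes, bits, delta=0.5):
    """Embed one bit per coefficient: bit 0 quantises to multiples of delta,
    bit 1 to the lattice shifted by delta / 2 (binary dither modulation)."""
    dither = np.where(np.asarray(bits) == 0, 0.0, delta / 2)
    return np.round((magnitudes - dither) / delta) * delta + dither

def qim_extract(received, delta=0.5):
    """Recover each bit by picking the lattice whose nearest point is closer
    to the received value."""
    d0 = np.abs(received - np.round(received / delta) * delta)
    shifted = np.round((received - delta / 2) / delta) * delta + delta / 2
    d1 = np.abs(received - shifted)
    return (d1 < d0).astype(int)

mags = np.random.default_rng(0).uniform(1.0, 10.0, size=8)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
print(qim_extract(qim_embed(mags, bits)))  # reproduces `bits` when undistorted
```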

Keywords: Image watermarking, Image normalization, Zernike moments, Robustness.

790 Road Vehicle Recognition Using Magnetic Sensing Feature Extraction and Classification

Authors: Xiao Chen, Xiaoying Kong, Min Xu

Abstract:

This paper presents a road vehicle detection approach for intelligent transportation systems. The approach mainly uses a low-cost magnetic sensor and an associated data collection system to collect magnetic signals. The system can measure changes in the magnetic field, and it can also detect and count vehicles. We extend Mel Frequency Cepstral Coefficients to analyze vehicle magnetic signals. Vehicle type features are extracted using a representation based on the cepstrum, frame energy, and gap cepstrum of the magnetic signals. We design a 2-dimensional map algorithm using Vector Quantization to classify vehicle magnetic features into four typical types of vehicles in Australian suburbs: sedan, van, truck, and bus. Experimental results show that our approach achieves a high level of accuracy for vehicle detection and classification.
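
As a hedged sketch of the kind of cepstral front end described (not the authors' exact feature set), the function below computes per-frame log-energy and real-cepstrum coefficients from a 1-D signal with NumPy; the frame length, hop and number of coefficients are assumptions.

```python
import numpy as np

def cepstral_features(signal, frame_len=256, hop=128, n_ceps=12):
    """Per-frame log-energy plus the first real-cepstrum coefficients of a
    1-D (e.g. magnetic) signal; rows are frames, columns are features."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * np.hamming(frame_len)
        energy = np.log(np.sum(frame ** 2) + 1e-12)
        spectrum = np.abs(np.fft.rfft(frame)) + 1e-12
        cepstrum = np.fft.irfft(np.log(spectrum))[:n_ceps]
        feats.append(np.concatenate(([energy], cepstrum)))
    return np.array(feats)

# Such feature vectors could then be clustered (e.g. with k-means as a simple
# stand-in for vector quantization) into per-class codebooks.
```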

Keywords: Vehicle classification, signal processing, road traffic model, magnetic sensing.

789 Food Safety Aspects of Pesticide Residues in Spice Paprika

Authors: Sz. Klátyik, B. Darvas, M. Mörtl, M. Ottucsák, E. Takács, H. Bánáti, L. Simon, G. Gyurcsó, A. Székács

Abstract:

The environmental and health safety of condiments used for spicing food products in food processing or by culinary means receives relatively little attention, even though possible contamination of spices may affect food quality and safety. Contamination surveys mostly focus on microbial contaminants or their secondary metabolites, mycotoxins. Chemical contaminants, particularly pesticide residues, are nonetheless clearly substantial factors for certain condiments in the Capsicum family, including spice paprika and chilli. To assess food safety and support the quality of the Hungaricum product spice paprika, the pesticide residue status of spice paprika and chilli is assessed on the basis of pesticide contamination cases and non-compliances reported in the Rapid Alert System for Food and Feed of the European Union since 1998.

Keywords: Spice paprika, Capsicum, pesticide residues, RASFF.

788 Efficient CT Image Volume Rendering for Diagnosis

Authors: HaeNa Lee, Sun K. Yoo

Abstract:

Volume rendering is widely used in medical CT image visualization. Applying 3D image visualization to diagnostic applications can require accurate volume rendering with high resolution. Interpolation is important in medical image processing applications such as image compression and volume resampling. However, it can distort the original image data through edge blurring or blocking effects when image enhancement procedures are applied. In this paper, we propose an adaptive tension control method exploiting gradient information to achieve high-resolution medical image enhancement in volume visualization, where the restored images are as similar to the original images as possible. The experimental results show that the proposed method can improve image quality thanks to the efficacy of the adaptive tension control.
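
One common way to realize tension-controlled interpolation is a cardinal (Hermite) spline whose tangents shrink as tension grows; the sketch below pairs that with a toy rule that raises tension where the local gradient is large. This illustrates the general idea only, not the paper's adaptive control law, and all constants are assumptions.

```python
import numpy as np

def cardinal_interp(p0, p1, p2, p3, u, tension):
    """Interpolate between p1 and p2 at parameter u in [0, 1]; tension in
    [0, 1] scales the tangents (higher tension = stiffer, less overshoot)."""
    m1 = (1.0 - tension) * (p2 - p0) / 2.0
    m2 = (1.0 - tension) * (p3 - p1) / 2.0
    h00 = 2 * u**3 - 3 * u**2 + 1
    h10 = u**3 - 2 * u**2 + u
    h01 = -2 * u**3 + 3 * u**2
    h11 = u**3 - u**2
    return h00 * p1 + h10 * m1 + h01 * p2 + h11 * m2

def adaptive_tension(p1, p2, base=0.2, gain=0.05, max_tension=1.0):
    """Toy rule: increase tension with the local gradient so sharp edges are
    interpolated more stiffly and blur less."""
    return float(np.clip(base + gain * abs(p2 - p1), 0.0, max_tension))

samples = np.array([10.0, 12.0, 60.0, 62.0])   # an edge between 12 and 60
t = adaptive_tension(samples[1], samples[2])
print(cardinal_interp(*samples, u=0.5, tension=t))
```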

Keywords: Tension control, Interpolation, Ray-casting, Medical imaging analysis.

787 Automatic Classification of Initial Categories of Alzheimer's Disease from Structural MRI Phase Images: A Comparison of PSVM, KNN and ANN Methods

Authors: Ahsan Bin Tufail, Ali Abidi, Adil Masood Siddiqui, Muhammad Shahzad Younis

Abstract:

An early and accurate detection of Alzheimer's disease (AD) is an important stage in the treatment of individuals suffering from AD. We present an approach based on the use of structural magnetic resonance imaging (sMRI) phase images to distinguish between normal controls (NC), mild cognitive impairment (MCI) and AD patients with a clinical dementia rating (CDR) of 1. The independent component analysis (ICA) technique is used for extracting useful features, which form the inputs to support vector machine (SVM), k-nearest neighbour (kNN) and multilayer artificial neural network (ANN) classifiers to discriminate between the three classes. The obtained results are encouraging in terms of classification accuracy and effectively ascertain the usefulness of phase images for the classification of different stages of Alzheimer's disease.
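
A minimal sketch of the ICA-plus-classifier pipeline (only the SVM branch, via scikit-learn, on synthetic stand-in data) is shown below; the component count, kernel and random data are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic stand-in for flattened sMRI phase-image features (90 subjects)
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 500))
y = np.repeat([0, 1, 2], 30)          # 0 = NC, 1 = MCI, 2 = AD (CDR 1)

clf = make_pipeline(StandardScaler(),
                    FastICA(n_components=20, random_state=0, max_iter=1000),
                    SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance level on random data
```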

Keywords: Biomedical image processing, classification algorithms, feature extraction, statistical learning.

786 Adhesion Problematic for Novel Non-Crimp Fabric and Surface Modification of Carbon-Fibres Using Oxy-Fluorination

Authors: Iris Käppler, Paul Matthäi, Chokri Cherif

Abstract:

In the field of technical textiles, Non-Crimp Fabrics (NCF) are increasingly used. In general, NCF exhibit excellent load-bearing properties, but the manufacturing process leaves some disadvantages which have to be reduced. With regard to this, a novel technique for processing NCF was developed that substitutes the binding thread with an adhesive. This stitch-free method requires a new manufacturing concept as well as new basic methods to prove the adhesion of the glue to fibres and textiles. To improve the adhesion properties and the wettability of carbon fibres by the adhesive, oxy-fluorination was used. The modification of carbon fibres by oxy-fluorination was investigated via scanning electron microscopy, X-ray photoelectron spectroscopy and single-fibre tensiometry. Special tensile tests were developed to determine the maximum force required for detachment.

Keywords: Non-Crimp Fabric, adhesive, stitch-free, high-performance fibre.

785 Ramp Rate and Constriction Factor Based Dual Objective Economic Load Dispatch Using Particle Swarm Optimization

Authors: Himanshu Shekhar Maharana, S. K .Dash

Abstract:

Economic Load Dispatch (ELD) is a vital optimization process in electric power systems for allocating generation among various units to compute the cost of generation and the cost of emission of global warming gases such as sulphur dioxide, nitrous oxide and carbon monoxide. In this work, we emphasize ramp rate and constriction factor based particle swarm optimization (RRCPSO) for analyzing various performance objectives, namely the cost of generation, the cost of emission, and a dual objective function involving both of these objectives, through simulated experimental results. A 6-unit, 30-bus IEEE test case system has been utilized for simulating the results, involving improved weight factors and advanced ramp rate limit constraints for optimizing the total cost of generation and emission. The method increases the tendency of particles to venture into the solution space and thereby improves their convergence rates. Earlier works using dispersed PSO (DPSO) and constriction factor based PSO (CPSO) give rise to comparatively higher computational time and less optimal solutions than the present work. This paper deals with a well-defined ramp rate and constriction factor based PSO to compute the various objectives, namely cost, emission and the total objective, and compares the results with the DPSO and weight improved PSO (WIPSO) techniques, illustrating lower computational time and better optimal solutions.
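
For orientation, a minimal constriction-factor PSO on a toy three-generator dispatch problem is sketched below; the Clerc-Kennedy constants are standard, but the cost curves, demand and bounds (standing in for generator and ramp-rate limits) are made-up illustrations, not the paper's 6-unit, 30-bus data.

```python
import numpy as np

def constriction_pso(cost, lb, ub, n_particles=30, iters=200, seed=0):
    """Minimal constriction-factor PSO (chi ~= 0.729 with c1 = c2 = 2.05);
    the box bounds stand in for generator operating/ramp-rate limits."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    chi, c1, c2 = 0.7298, 2.05, 2.05
    x = rng.uniform(lb, ub, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = chi * (v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x))
        x = np.clip(x + v, lb, ub)          # enforce operating limits
        f = np.array([cost(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, cost(g)

# toy quadratic fuel-cost curves for three generators, demand = 300 MW
a, b, c = np.array([0.008, 0.009, 0.007]), np.array([7.0, 6.3, 6.8]), np.array([200.0, 180.0, 140.0])
cost = lambda p: np.sum(a * p**2 + b * p + c) + 1e3 * abs(np.sum(p) - 300)
print(constriction_pso(cost, lb=np.array([50.0, 50, 50]), ub=np.array([200.0, 200, 200])))
```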

Keywords: Economic load dispatch, constriction factor based particle swarm optimization, dispersed particle swarm optimization, weight improved particle swarm optimization, ramp rate and constriction factor based particle swarm optimization.

784 Semi-Automatic Artifact Rejection Procedure Based on Kurtosis, Renyi's Entropy and Independent Component Scalp Maps

Authors: Antonino Greco, Nadia Mammone, Francesco Carlo Morabito, Mario Versaci

Abstract:

Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove the artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on some higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to the performance of the method in the literature, and the former proved to outperform the latter.
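
A minimal sketch of the flagging step is given below: unmix the EEG with FastICA, compute the kurtosis and an order-2 Renyi entropy of each component, and flag components whose statistics deviate strongly from the rest. The thresholds, the histogram-based entropy estimator and the component count are assumptions, not the authors' exact markers.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def renyi_entropy(x, alpha=2, bins=64):
    """Renyi entropy of order alpha from a histogram estimate of x's density."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def flag_artifact_components(eeg, n_components=10, z_thresh=2.0):
    """Unmix EEG (channels x samples, channels >= n_components) with ICA and
    flag components whose kurtosis or Renyi entropy is an outlier."""
    sources = FastICA(n_components=n_components, random_state=0).fit_transform(eeg.T).T
    k = np.array([kurtosis(s) for s in sources])
    h = np.array([renyi_entropy(s) for s in sources])
    z = lambda v: (v - v.mean()) / (v.std() + 1e-12)
    flagged = np.where((np.abs(z(k)) > z_thresh) | (np.abs(z(h)) > z_thresh))[0]
    return sources, flagged
```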

Keywords: Artifact, EEG, Renyi's entropy, kurtosis, independent component analysis.

783 Endometrial Cancer Recognition via EEG Dependent upon 14-3-3 Protein Leading to an Ontological Diagnosis

Authors: Marios Poulos, Eirini Maliagani, Minas Paschopoulos, George Bokos

Abstract:

The purpose of my research proposal is to demonstrate that there is a relationship between the EEG and endometrial cancer. The above relationship is based on an Aristotelian syllogism: since it is known that the 14-3-3 protein is related to the electrical activity of the brain via control of the flow of Na+ and K+ ions, and since it is also known that many types of cancer are associated with the 14-3-3 protein, it is possible that there is a relationship between the EEG and cancer. This research will be carried out using well-defined diagnostic indicators, obtained via the EEG, using signal processing procedures and pattern recognition tools such as neural networks in order to recognize the endometrial cancer type. The current research shall compare the findings from EEG and hysteroscopy performed on women of a wide age range. Moreover, this practice could be expanded to other types of cancer. The implementation of this methodology will be completed with the creation of an ontology. This ontology shall define the concepts existing in the research domain and the relationships between them, and it will represent the types of relationships between hysteroscopy and EEG findings.

Keywords: Bioinformatics, Protein 14-3-3, EEG, Endometrial cancer, Ontology.

782 MJPEG Real-Time Transmission in Industrial Environments Using a CBR Channel

Authors: J. Silvestre, L. Almeida, R. Marau, P. Pedreiras

Abstract:

Currently, there are many local area industrial networks that can provide guaranteed bandwidth to synchronous traffic, in particular CBR (Constant Bit Rate) channels, which allow improved bandwidth management. Some of these networks operate over Ethernet, delivering channels with enough capacity, especially when compressors are used, to integrate multimedia traffic into industrial monitoring and image processing applications with many sources. In these industrial environments, where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper, an R(q) model is used which allows on-line calculation of the JPEG quantification factor. We obtained increased quality and a lower requirement for the CBR channel, with a reduced number of discarded frames and better use of the channel bandwidth.
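
The paper's R(q) model predicts the quantification factor directly; as a simpler, hedged stand-in that illustrates the same rate-fitting goal, the sketch below binary-searches the JPEG quality factor until a frame fits the per-frame budget of a CBR channel (Pillow is used for encoding; all numbers are illustrative).

```python
import io
from PIL import Image

def fit_quality(frame, target_bytes, q_min=10, q_max=95):
    """Binary-search the JPEG quality factor so the compressed frame stays
    within the per-frame byte budget of a CBR channel."""
    best_q, best_buf = q_min, None
    while q_min <= q_max:
        q = (q_min + q_max) // 2
        buf = io.BytesIO()
        frame.save(buf, format="JPEG", quality=q)
        if buf.tell() <= target_bytes:
            best_q, best_buf = q, buf
            q_min = q + 1      # try a higher quality that still fits
        else:
            q_max = q - 1
    return best_q, best_buf

frame = Image.new("RGB", (320, 240), "gray")   # stand-in for a camera frame
q, buf = fit_quality(frame, target_bytes=4000)
print(q, buf.tell() if buf else None)
```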

Keywords: Industrial Networks, Multimedia.

781 An Improved Scheduling Strategy in Cloud Using Trust Based Mechanism

Authors: D. Sumathi, P. Poongodi

Abstract:

Cloud computing refers to applications delivered as services over the Internet, together with the datacenters that provide those services with hardware and systems software; the services themselves were earlier referred to as Software as a Service (SaaS). Scheduling is complicated by the job components (called tasks) and by a lack of information. In fact, for a large fraction of jobs from the machine learning, bio-computing and image processing domains, it is possible to estimate the maximum time required for a task in the job. This study focuses on trust-based scheduling to improve cloud security by modifying the Heterogeneous Earliest Finish Time (HEFT) algorithm. It proposes TR-HEFT (Trust Reputation HEFT), which is then compared to Dynamic Load Scheduling.
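
HEFT first ranks tasks by their "upward rank" and then schedules them in decreasing rank order; the sketch below shows only that ranking step on a tiny made-up task graph (the trust/reputation weighting of TR-HEFT is not reproduced). The computation costs w and communication costs c are illustrative averages.

```python
def upward_rank(tasks, succ, w, c):
    """HEFT upward rank: rank(t) = w[t] + max over successors s of
    (c[(t, s)] + rank(s)); tasks are then scheduled in decreasing rank."""
    rank = {}
    def r(t):
        if t not in rank:
            rank[t] = w[t] + max((c[(t, s)] + r(s) for s in succ.get(t, [])), default=0)
        return rank[t]
    for t in tasks:
        r(t)
    return sorted(tasks, key=lambda t: -rank[t])

succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
w = {"A": 10, "B": 6, "C": 8, "D": 4}          # average execution times
c = {("A", "B"): 2, ("A", "C"): 3, ("B", "D"): 1, ("C", "D"): 2}
print(upward_rank(list(w), succ, w, c))        # expected order: A, C, B, D
```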

Keywords: Software as a Service (SaaS), Trust, Heterogeneous Earliest Finish Time (HEFT) algorithm, Dynamic Load Scheduling.

780 An Enhanced Slicing Algorithm Using Nearest Distance Analysis for Layer Manufacturing

Authors: M. Vatani, A. R. Rahimi, F. Brazandeh, A. Sanati nezhad

Abstract:

Although the STL (stereolithography) file format is widely used as a de facto industry standard in the rapid prototyping industry, due to its simplicity and its ability to tessellate almost all surfaces, there are always some defects and shortcomings in its usage, many of which are difficult to correct manually. When processing complex models, the size of the file and its defects grow enormously, and correcting STL files becomes difficult. In this paper, by optimizing the existing algorithms, the size of the files and the memory usage of the computers that process them are reduced. Regardless of the type and extent of the errors in the STL files, a tail-to-head searching method and an analysis of the nearest distance between tails and heads are used. As a result, STL models are sliced rapidly, and fully closed contours are produced effectively and without errors.
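
A stripped-down version of the tail-to-head, nearest-distance idea for chaining unordered slice segments into closed contours is sketched below; the tolerance and data layout are assumptions, and real STL repair needs more care with branching and degenerate segments.

```python
import numpy as np

def link_segments(segments, tol=1e-6):
    """Chain unordered 2-D slice segments into contours by repeatedly
    appending the leftover segment whose nearer endpoint is closest to the
    current tail (tail-to-head, nearest-distance linking)."""
    remaining = [(np.asarray(a, float), np.asarray(b, float)) for a, b in segments]
    contours = []
    while remaining:
        a, b = remaining.pop(0)
        contour = [a, b]
        while remaining:
            tail = contour[-1]
            dists = [min(np.linalg.norm(tail - s), np.linalg.norm(tail - e))
                     for s, e in remaining]
            i = int(np.argmin(dists))
            s, e = remaining.pop(i)
            # append the far endpoint; a small gap at the near endpoint is
            # implicitly closed by snapping it onto the tail
            nxt = e if np.linalg.norm(tail - s) <= np.linalg.norm(tail - e) else s
            contour.append(nxt)
            if np.linalg.norm(contour[-1] - contour[0]) < tol:
                break          # the contour has closed onto its first point
        contours.append(contour)
    return contours

square = [((0, 0), (1, 0)), ((1, 1), (0, 1)), ((1, 0), (1, 1)), ((0, 1), (0, 0))]
print([len(c) for c in link_segments(square)])  # one closed contour of 5 points
```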

Keywords: Layer manufacturing, STL files, slicing algorithm, nearest distance analysis.
