Search results for: images processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2510

770 Evaluating Spectral Relationships between Signals by Removing the Contribution of a Common, Periodic Source: A Partial Coherence-Based Approach

Authors: Antonio Mauricio F. L. Miranda de Sá

Abstract:

Partial coherence between two signals, removing the contribution of a periodic, deterministic signal, is proposed for evaluating the interrelationship in multivariate systems. The estimator expression was derived and shown to be independent of such a periodic signal. Simulations were used to obtain its critical values, which were found to be the same as those for Gaussian signals, as well as to evaluate the technique. An illustration with electroencephalographic (EEG) signals during photic stimulation is also provided. The application of the proposed technique to both simulated and real EEG data indicates that it appears to be very specific in removing the contribution of periodic sources. The estimator's independence of the periodic signal may widen the application of partial coherence to signal analysis, since it could be used together with simple coherence to test for contamination of signals by a common, periodic noise source.
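
As an illustrative analogue only (the paper derives an estimator that does not require measuring the periodic source), the classical partial-coherence estimator conditioned on a measured reference z can be sketched with standard SciPy spectral estimates:

```python
# Minimal sketch (not the authors' estimator): classical partial coherence between x and y
# conditioned on a measured periodic reference z, built from Welch/CSD spectral estimates.
import numpy as np
from scipy.signal import csd, welch

def partial_coherence(x, y, z, fs, nperseg=256):
    f, Sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, Sxz = csd(x, z, fs=fs, nperseg=nperseg)
    _, Szy = csd(z, y, fs=fs, nperseg=nperseg)
    _, Sxx = welch(x, fs=fs, nperseg=nperseg)
    _, Syy = welch(y, fs=fs, nperseg=nperseg)
    _, Szz = welch(z, fs=fs, nperseg=nperseg)
    num = np.abs(Sxy - Sxz * Szy / Szz) ** 2                      # |S_xy.z|^2
    den = (Sxx - np.abs(Sxz) ** 2 / Szz) * (Syy - np.abs(Szy) ** 2 / Szz)
    return f, num / den                                           # partial coherence spectrum
```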

Keywords: Partial coherence, periodic input, spectral analysis, statistical signal processing.

769 Variance Based Component Analysis for Texture Segmentation

Authors: Zeinab Ghasemi, S. Amirhassan Monadjemi, Abbas Vafaei

Abstract:

This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation towards defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of eigenpictures, and classifies the pixels as defective or normal. While classic PCA uses a clusterer such as K-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that the proposed VBCA algorithm is 12.46% more accurate and 78.85% faster than classic PCA.
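
A rough VBCA-style sketch follows, with assumed details: the variance criterion is approximated by keeping components up to a cumulative explained-variance level and thresholding a patch reconstruction error, which is not the authors' exact eigenpicture-variance rule or post-processing:

```python
# Rough VBCA-style sketch with assumed details: PCA on image patches, components kept by
# explained variance, and a threshold on reconstruction error standing in for the authors'
# eigenpicture-variance rule and post-processing.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_extraction.image import extract_patches_2d

def defect_patches(image, patch_size=(8, 8), var_keep=0.95, k=2.0):
    patches = extract_patches_2d(image, patch_size)
    X = patches.reshape(len(patches), -1).astype(float)
    pca = PCA(n_components=var_keep)               # keep components by cumulative variance
    recon = pca.inverse_transform(pca.fit_transform(X))
    err = np.linalg.norm(X - recon, axis=1)        # per-patch reconstruction error
    return err > err.mean() + k * err.std()        # True = patch flagged as defective

flags = defect_patches(np.random.randint(0, 256, (64, 64)))   # toy steel-plate image
```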

Keywords: Principal Component Analysis; Variance Based Component Analysis; Defect Detection; Texture Segmentation.

768 Generic Filtering of Infinite Sets of Stochastic Signals

Authors: Anatoli Torokhti, Phil Howlett

Abstract:

A theory for optimal filtering of infinite sets of random signals is presented. There are several new distinctive features of the proposed approach. First, a single optimal filter for processing any signal from a given infinite signal set is provided. Second, the filter is presented in the special form of a sum with p terms where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.
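
As background only (not the paper's new rank-constrained problem), the classical unconstrained counterpart used in compression settings is the best rank-r approximation given by the truncated SVD:

```python
# Background sketch only: the classical best rank-r approximation (Eckart-Young, truncated
# SVD); the paper solves a new *constrained* variant, which this does not reproduce.
import numpy as np

def best_rank_r(A, r):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]    # optimal rank-r approximation in Frobenius norm

A = np.random.randn(20, 10)
print(np.linalg.matrix_rank(best_rank_r(A, 3)))     # 3
```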

Keywords: Optimal filtering, data compression, stochastic signals.

767 Performance Analysis of MT Evaluation Measures and Test Suites

Authors: Yao Jian-Min, Lv Qiang, Zhang Jing

Abstract:

Many measures have been proposed for machine translation evaluation (MTE), while little research has been done on the performance of the MTE methods themselves. This paper is an effort toward MTE performance analysis. A general framework is proposed for describing the MTE measure and the test suite, covering whether the automatic measure is consistent with human evaluation, whether results from different measures or test suites are consistent, whether the content of the test suite is suitable for performance evaluation, the degree of difficulty of the test suite and its influence on the MTE, the relationship between the significance of MTE results and the size of the test suite, etc. To clarify the framework, several experimental results are analyzed relating human evaluation, BLEU evaluation, and typological MTE. A visualization method is introduced for better presentation of the results. The study aims to aid test suite construction and method selection in MTE practice.
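
As a small illustration of one item in the framework, consistency between an automatic measure and human evaluation can be checked by correlating segment-level scores (all numbers below are hypothetical):

```python
# Illustration of one check in the framework: correlate an automatic metric with human
# judgements at the segment level (all scores below are hypothetical).
import numpy as np
from scipy.stats import pearsonr, spearmanr

metric_scores = np.array([0.31, 0.45, 0.28, 0.52, 0.40])   # e.g. BLEU per segment
human_scores  = np.array([3.0, 4.0, 2.5, 4.5, 3.5])        # e.g. adequacy ratings

print(pearsonr(metric_scores, human_scores))    # linear consistency
print(spearmanr(metric_scores, human_scores))   # rank consistency
```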

Keywords: Machine translation, natural language processing, visualization.

766 Rule-Based Expert System for Headache Diagnosis and Medication Recommendation

Authors: Noura Al-Ajmi, Mohammed A. Almulla

Abstract:

With the increased utilization of technology devices around the world, healthcare and medical diagnosis are critical issues that people worry about. Doctors do their best to avoid medical errors when diagnosing diseases and prescribing medication. Consequently, artificial intelligence applications that can be installed on mobile devices, such as rule-based expert systems, facilitate the task of assisting doctors in several ways. Because of their many advantages, the use of expert systems in the health sciences has increased recently. This work presents a backward-chaining rule-based expert system for headache diagnosis and medication recommendation. The structure of the system consists of three main modules, namely the input unit, the processing unit, and the output unit.
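
A minimal backward-chaining sketch follows, with hypothetical rules rather than the authors' knowledge base: a goal holds if it is a known fact, or if some rule concludes it and all of that rule's premises hold.

```python
# Minimal backward-chaining sketch with hypothetical rules (not the authors' knowledge base):
# a goal holds if it is a known fact, or some rule concludes it and all its premises hold.
RULES = [
    ({"unilateral_pain", "nausea", "light_sensitivity"}, "migraine"),
    ({"band_like_pressure", "stress"}, "tension_headache"),
    ({"migraine"}, "recommend_triptan"),
    ({"tension_headache"}, "recommend_paracetamol"),
]

def prove(goal, facts):
    if goal in facts:
        return True
    return any(conclusion == goal and all(prove(p, facts) for p in premises)
               for premises, conclusion in RULES)

print(prove("recommend_triptan", {"unilateral_pain", "nausea", "light_sensitivity"}))  # True
```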

Keywords: Headache diagnosis system, treatment recommender system, rule-based expert system.

765 Brain Image Segmentation Using Conditional Random Field Based On Modified Artificial Bee Colony Optimization Algorithm

Authors: B. Thiagarajan, R. Bremananth

Abstract:

A tumor is an uncontrolled growth of tissue in any part of the body. Tumors are of different types, with different characteristics and treatments. A brain tumor is inherently serious and life-threatening because it grows within the limited space of the intracranial cavity (the space formed inside the skull). Locating the tumor within an MR (magnetic resonance) image of the brain is an integral part of brain tumor treatment. This segmentation task requires classifying each voxel as either tumor or non-tumor, based on the description of the voxel under consideration. Many studies in the medical field use Markov Random Fields (MRF) for the segmentation of MR images. Although the resulting segmentation is good, computing the probabilities and estimating the parameters is difficult. To overcome these issues, a Conditional Random Field (CRF) is used in this paper for segmentation, together with a modified artificial bee colony optimization and a modified fuzzy possibilistic c-means (MFPCM) algorithm. This work mainly focuses on reducing the computational complexity found in existing methods while achieving higher accuracy. The efficiency of the approach is evaluated using parameters such as region non-uniformity, correlation and computation time. The experimental results are compared with existing methods such as MRF with an improved Genetic Algorithm (GA) and the MRF-Artificial Bee Colony (MRF-ABC) algorithm.

Keywords: Conditional random field, Magnetic resonance, Markov random field, Modified artificial bee colony.

764 Effect of Scanning Speed on Material Efficiency of Laser Metal Deposited Ti6Al4V

Authors: Esther T. Akinlabi, Rasheedat M. Mahamood, Mukul Shukla, Sisa Pityana

Abstract:

Studying the effect of laser scanning speed on material efficiency in Ti6Al4V deposition is important because unspent powder is not reusable, owing to high-temperature oxygen pick-up and contamination. This work carried out an extensive study of the effect of scanning speed on material efficiency by varying the speed between 0.01 and 0.1 m/s. The samples were wire-brushed and cleaned with acetone after each deposition to remove un-melted particles from the surface of the deposit. The substrate was weighed before and after deposition. A formula was developed to calculate the material efficiency, and the scanning speed was compared with the powder efficiency obtained. The results are presented and discussed. The study revealed that an optimum scanning speed exists at 0.01 m/s, above and below which the powder efficiency drops.
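
One common way such a material (powder) efficiency formula is set up is mass deposited over mass of powder delivered; the sketch below uses that assumption with hypothetical numbers, and the authors' exact formula may differ.

```python
# Hedged sketch of one common way to define powder (material) efficiency in laser metal
# deposition: mass actually deposited over mass of powder delivered. The authors' exact
# formula is not reproduced here.
def material_efficiency(mass_before_g, mass_after_g, feed_rate_g_per_min, time_min):
    deposited = mass_after_g - mass_before_g        # mass fused onto the substrate
    delivered = feed_rate_g_per_min * time_min      # mass of powder fed during deposition
    return 100.0 * deposited / delivered            # efficiency in percent

print(material_efficiency(250.0, 262.5, 5.0, 5.0))  # hypothetical numbers -> 50.0 %
```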

Keywords: Additive Manufacturing, Laser Metal Deposition Process, Material efficiency, Processing Parameter, Titanium alloy.

763 Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

Authors: S. Bourennane, A. Bendjama

Abstract:

This paper deals with the localization of wideband sources. We develop a new approach for estimating wideband source parameters. The method is based on the higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; the sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm based on the cumulant matrix of the received data instead of the cross-spectral matrix, the wideband correlated sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.

Keywords: Higher-order statistics, high resolution array processing techniques, localization of acoustic sources, wideband sources.

762 Microwave Pretreatment of Seeds to Extract High Quality Vegetable Oil

Authors: S. Azadmard-Damirchi, K. Alirezalu, B. Fathi Achachlouei

Abstract:

Microwave energy is a superior alternative to several other thermal treatments. Extraction techniques are widely employed for the isolation of bioactive compounds and vegetable oils from oil seeds. Among the different techniques now available, microwave pretreatment of seeds is a simple and desirable method for the production of high quality vegetable oils. Microwave pretreatment for oil extraction has many advantages, as follows: improved oil extraction yield and quality, direct extraction capability, lower energy consumption, faster processing time and reduced solvent levels compared with conventional methods. It also allows better retention and availability of desirable nutraceuticals, such as phytosterols, tocopherols, canolol and phenolic compounds, in extracted oils such as rapeseed oil. This can be a new step toward producing nutritional vegetable oils with improved shelf life because of their high antioxidant content.

Keywords: Microwave pretreatment, vegetable oil extraction, nutraceuticals, oil quality

761 1G2A IMU/GPS Integration Algorithm for Land Vehicle Navigation

Authors: O. Maklouf, Ahmed Abdulla

Abstract:

A general decline in the cost, size, and power requirements of electronics is accelerating the adoption of integrated GPS/INS technologies in consumer applications such as land vehicle navigation. Researchers have been looking for ways to eliminate additional components from product designs. One possibility is to drop one or more of the relatively expensive gyroscopes from microelectromechanical system (MEMS) versions of inertial measurement units (IMUs). For land vehicle use, the most important sensors are the vertical gyro, which senses the heading of the vehicle, and two horizontal accelerometers, which determine its velocity. This paper presents a simplified integration algorithm for a strapdown (ParIMU)/GPS combination, with data post-processing for the determination of the 2-D components of position (trajectory), velocity and heading. In the present approach we have neglected earth rotation and gravity variations, because of the poor gyroscope sensitivities of the low-cost IMU and because of the relatively small area of the trajectory.
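
A minimal 2-D dead-reckoning sketch in the spirit of the one-gyro, two-accelerometer configuration is given below; the integration details are assumptions, and in the full system these states would be corrected by GPS updates through the Kalman filter, as the abstract describes.

```python
# Minimal 2-D dead-reckoning sketch in the spirit of the one-gyro, two-accelerometer setup;
# integration details are assumed, and in the full system these states are corrected by GPS
# updates through the Kalman filter.
import numpy as np

def dead_reckon(yaw_rate, accel_forward, dt, heading=0.0, speed=0.0):
    x = y = 0.0
    track = []
    for wz, ax in zip(yaw_rate, accel_forward):     # rad/s from the gyro, m/s^2 from the accelerometers
        heading += wz * dt                          # integrate yaw rate -> heading
        speed   += ax * dt                          # integrate forward acceleration -> speed
        x += speed * np.cos(heading) * dt           # project speed into the navigation frame
        y += speed * np.sin(heading) * dt
        track.append((x, y, heading, speed))
    return track
```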

Keywords: GPS, ParIMU, INS, Kalman Filter.

760 Parallel Text Processing: Alignment of Indonesian to Javanese Language

Authors: Aji P. Wibawa, Andrew Nafalski, Neil Murray, Wayan F. Mahmudy

Abstract:

Parallel text alignment is proposed as a way of aligning words in Bahasa Indonesia to words in Javanese. Since a one-to-one word translator cannot handle the pragmatic aspects of Javanese, the parallel text alignment model described here uses phrase pair combination. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though phrase pair combination outperforms the previous algorithm, it is still inefficient: recording all possible combinations consumes more database space and is time consuming. The original algorithm is therefore modified by applying an edit distance coefficient to improve data-storage efficiency. As a result, data-storage consumption is reduced by 90%, as is the learning period (42 s).
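
For reference, the plain Levenshtein edit distance is sketched below; the paper's edit distance coefficient is presumably a normalised variant used to prune redundant phrase-pair entries (an assumption).

```python
# Plain Levenshtein edit distance for reference; the paper's edit distance coefficient is
# presumably a normalised variant used to prune redundant phrase-pair entries (assumption).
def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution (free if characters match)
        prev = curr
    return prev[-1]

print(edit_distance("aku", "kula"))   # hypothetical Indonesian/Javanese word pair -> 3
```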

Keywords: Parallel text alignment, phrase pair combination, edit distance coefficient, Javanese-Indonesian language.

759 Experimental Study on Dehumidification Performance of Supersonic Nozzle

Authors: Esam Jassim

Abstract:

Supersonic nozzles are commonly used to purify natural gas in gas processing technology. As an innovative technology, they are employed to overcome the deficits of traditional methods, drawing on gas dynamics, thermodynamics and fluid dynamics theory. An indoor test rig was built to study the dehumidification of a moist fluid; humid air was chosen for the study. The working fluid circulated in an open loop with provision for filtering, metering, and humidifying. A stainless steel supersonic separator was constructed together with the converging-diverging (C-D) nozzle system. The results show that dehumidification improves as the nozzle pressure ratio (NPR) increases. This is due to the intense turbulence caused by shock formation in the divergent section; such disturbance strengthens the centrifugal force, pushing more particles toward the near-wall region. In return, the pressure recovery factor, defined as the ratio of the outlet static pressure of the fluid to its inlet value, decreases with NPR.

Keywords: Supersonic nozzle, dehumidification, particle separation, geometry.

758 Studies on Distortion of Dissimilar Thin Sheet Weld Joints Using Laser Beam Welding

Authors: K. Kalaiselvan, A. Elango

Abstract:

Laser beam welding is attempted in order to achieve reliable welds with minimum distortion for the fabrication of components in the aerospace industry. Laser welding can provide a significant benefit for the welding of titanium and aluminium thin sheet alloys because of its precision and rapid processing capability. For laser welding, pulse shape, energy, duration, repetition rate and peak power are the most important parameters that directly influence weld quality. In this experimental work, 1 mm thick Ti6Al4V and AA2024 alloy sheets were joined using a JK600 Nd:YAG pulsed laser unit. The distortion of the titanium and aluminium thin sheet alloys at different welding powers and speeds is investigated. Test results reveal that increasing the welding speed increases distortion in the weldment.

Keywords: Laser Beam Welding, Titanium, Aluminium alloy sheets and distortion.

757 Causal Relation Identification Using Convolutional Neural Networks and Knowledge Based Features

Authors: Tharini N. de Silva, Xiao Zhibo, Zhao Rui, Mao Kezhi

Abstract:

Causal relation identification is a crucial task in information extraction and knowledge discovery. In this work, we present two approaches to causal relation identification. The first is a classification model trained on a set of knowledge-based features. The second is a deep learning approach that trains convolutional neural network (CNN) models to classify causal relations. We experiment with several different CNN architectures based on previous work on relation extraction as well as our own research. Our models are able to identify both explicit and implicit causal relations as well as the direction of the causal relation. The results of our experiments show a higher accuracy than previously achieved for causal relation identification tasks.
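
A hedged sketch of a generic CNN sentence classifier in PyTorch follows; the vocabulary size, kernel widths and relation classes are illustrative assumptions, not the authors' exact model.

```python
# Hedged sketch of a generic CNN sentence classifier in PyTorch; vocabulary size, kernel
# widths and relation classes are illustrative assumptions, not the authors' exact model.
import torch
import torch.nn as nn

class CausalRelationCNN(nn.Module):
    def __init__(self, vocab=10000, embed_dim=100, classes=3, kernels=(3, 4, 5), channels=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed_dim)
        self.convs = nn.ModuleList(nn.Conv1d(embed_dim, channels, k, padding=k // 2) for k in kernels)
        self.fc = nn.Linear(channels * len(kernels), classes)   # e.g. cause->effect, effect->cause, none

    def forward(self, token_ids):                       # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)       # (batch, embed_dim, seq_len)
        pooled = [torch.relu(c(x)).max(dim=2).values for c in self.convs]
        return self.fc(torch.cat(pooled, dim=1))        # logits over relation classes

logits = CausalRelationCNN()(torch.randint(0, 10000, (2, 20)))   # two toy token sequences
```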

Keywords: Causal relation identification, convolutional neural networks, natural language processing, machine learning.

756 Video-Based Tracking of Laparoscopic Instruments Using an Orthogonal Webcams System

Authors: Fernando Pérez, Humberto Sossa, Rigoberto Martínez, Daniel Lorias, Arturo Minor

Abstract:

This paper presents a system for tracking the movement of laparoscopic instruments based on an orthogonal system of webcams and video image processing. The movements are captured with two webcams placed orthogonally inside the physical trainer. In the images, the instruments are detected using color markers placed on the distal tip of each instrument. The 3D position of the instrument tip within the workspace is obtained by the linear triangulation method. Preliminary results showed linearity and repeatability in the motion tracking, with a resolution of 0.616 mm in each axis; the accuracy of the system showed a 3D instrument positioning error of 1.009 ± 0.101 mm. This tool is a portable and low-cost alternative to traditional tracking devices and a trustworthy method for the objective evaluation of a surgeon's surgical skills.
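
As a rough illustration under a simplified scaled-orthographic model (the paper's camera calibration and marker detection are not reproduced), each orthogonal view fixes two coordinates and the two views share the vertical axis:

```python
# Rough illustration under a simplified scaled-orthographic model (the paper's calibration
# and marker detection are not reproduced): each orthogonal view fixes two coordinates and
# the two views share the vertical axis.
import numpy as np

def triangulate_orthogonal(uv_front, uv_side, mm_per_px_front, mm_per_px_side):
    x, z1 = uv_front[0] * mm_per_px_front, uv_front[1] * mm_per_px_front   # front cam: X and Z
    y, z2 = uv_side[0] * mm_per_px_side, uv_side[1] * mm_per_px_side       # side cam: Y and Z
    return np.array([x, y, (z1 + z2) / 2.0])            # average the redundant Z estimate

print(triangulate_orthogonal((320, 240), (310, 238), 0.25, 0.25))   # toy marker centroids
```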

Keywords: Laparoscopic Surgery, Orthogonal Vision, Tracking Instruments, Triangulation.

755 Analytics Model in a Telehealth Center Based on Cloud Computing and Local Storage

Authors: L. Ramirez, E. Guillén, J. Sánchez

Abstract:

Some of the main goals of telecare, such as monitoring, treatment and telediagnosis, are achieved by integrating applications with specific appliances. In order to achieve a coherent model that integrates software, hardware, and healthcare systems, different telehealth models based on the Internet of Things (IoT), cloud computing, artificial intelligence, etc. have been implemented, and their advantages are still under analysis. In this paper, we propose an integrated model based on an IoT architecture and a cloud computing telehealth center. An analytics module is presented as a solution for supporting reliable diagnosis of some diseases. Specific features are then compared with recently deployed conventional models in telemedicine. The main advantages of this model are control over the security and privacy of patient information and the optimization of the acquisition and processing of clinical parameters according to technical characteristics.

Keywords: Analytics, telemedicine, internet of things, cloud computing.

754 A Robust Method for Encrypted Data Hiding Technique Based on Neighborhood Pixels Information

Authors: Ali Shariq Imran, M. Younus Javed, Naveed Sarfraz Khattak

Abstract:

This paper presents a novel method for data hiding based on neighborhood pixel information, used to calculate the number of bits that can be substituted, together with a modified Least Significant Bit (LSB) technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results along with unnoticeable image degradation. To find the number of bits that can be used for data hiding, the technique uses the green component of the image, as the human eye is less sensitive to it, making it practically impossible for the human eye to tell whether the image carries hidden data or not. The application further encrypts the data using a custom-designed algorithm before embedding the bits into the image, for additional security. The overall process consists of three main modules, namely embedding, encryption and extraction.
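
A minimal sketch of the idea follows; the capacity rule below is an assumed stand-in for the authors' neighborhood formula, and the encryption stage is omitted.

```python
# Minimal sketch of the idea; the capacity rule below is an assumed stand-in for the authors'
# neighborhood formula, and the encryption stage is omitted.
import numpy as np

def capacity_bits(green, y, x):
    patch = green[y - 1:y + 2, x - 1:x + 2].astype(float)
    return int(np.clip(1 + patch.std() // 16, 1, 4))     # busier 3x3 neighborhood -> more LSBs

def embed(green, bits):
    g, it = green.copy(), iter(bits)
    for y in range(1, g.shape[0] - 1):                    # skip the image boundary
        for x in range(1, g.shape[1] - 1):
            chunk = [b for _, b in zip(range(capacity_bits(g, y, x)), it)]
            if not chunk:
                return g                                  # message exhausted
            k, val = len(chunk), int("".join(map(str, chunk)), 2)
            g[y, x] = (int(g[y, x]) >> k << k) | val      # replace the k least significant bits
    return g

stego = embed(np.random.randint(0, 256, (64, 64), dtype=np.uint8), [1, 0, 1, 1, 0, 1, 0, 0])
```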

Keywords: Data hiding, image processing, information security, steganography.

753 A Proposed Hybrid Color Image Compression Based on Fractal Coding with Quadtree and Discrete Cosine Transform

Authors: Shimal Das, Dibyendu Ghoshal

Abstract:

Fractal-based digital image compression is a specific technique in the field of color image compression. The method is best suited for images with irregular shapes, such as snow blobs, clouds, flames and tree leaves, and relies on the fact that parts of an image often resemble other parts of the same image. The technique has drawn much attention in recent years because of the very high compression ratio that can be achieved. Hybrid schemes incorporating fractal compression and speedup techniques have achieved higher compression ratios than pure fractal compression. Fractal image compression is a lossy compression method in which the self-similarity of an image is exploited. The technique provides a high compression ratio, short encoding time and a fast decoding process. In this paper, fractal compression with quadtree partitioning and DCT is proposed to compress color images. The proposed hybrid scheme requires four phases to compress a color image. First, the image is segmented and the Discrete Cosine Transform is applied to each block of the segmented image. Second, the block values are scanned in a zigzag manner so that the zero coefficients are grouped together. Third, the resulting image is partitioned into fractals by the quadtree approach. Fourth, the image is compressed using the run-length encoding technique.
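
An illustrative sketch of the first two phases only (8x8 block DCT followed by zigzag ordering) is given below; quadtree partitioning and run-length coding of the fractal codes are omitted.

```python
# Illustrative sketch of the first two phases only (8x8 block DCT, then zigzag ordering);
# quadtree partitioning and run-length coding of the fractal codes are omitted.
import numpy as np
from scipy.fftpack import dct

def block_dct(block):
    return dct(dct(block.astype(float), axis=0, norm='ortho'), axis=1, norm='ortho')

def zigzag(block):
    h, w = block.shape
    order = sorted(((i, j) for i in range(h) for j in range(w)),
                   key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else p[1]))
    return np.array([block[i, j] for i, j in order])      # low frequencies first, zeros grouped

coeffs = zigzag(block_dct(np.random.randint(0, 256, (8, 8))))   # 64 ordered coefficients
```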

Keywords: Fractal coding, Discrete Cosine Transform, Iterated Function System (IFS), Affine Transformation, Run length encoding.

752 Optimized Facial Features-based Age Classification

Authors: Md. Zahangir Alom, Mei-Lan Piao, Md. Shariful Islam, Nam Kim, Jae-Hyeung Park

Abstract:

The evaluation and measurement of human body dimensions are achieved by physical anthropometry. This research was conducted in view of the importance of anthropometric indices of the face in forensic medicine, surgery, and medical imaging. The main goal of this research is to optimize facial feature points by establishing a mathematical relationship among facial features, and to use the optimized feature points for age classification. Since the selected facial feature points are located in the mouth, nose, eye and eyebrow regions of the facial images, all desired facial feature points are extracted accurately. In the proposed method, sixteen Euclidean distances are calculated from the eighteen selected facial feature points, both vertically and horizontally. The mathematical relationships among the horizontal and vertical distances are established. Moreover, it is found that the facial feature distances follow a constant ratio during age progression. The distances between the specified feature points increase with age from childhood onward, but the ratio of the distances does not change (d = 1.618). Finally, according to the proposed mathematical relationship, four independent feature distances related to eight feature points are selected from the sixteen distances and eighteen feature points, respectively. These four feature distances are used for age classification with the Support Vector Machine (SVM)-Sequential Minimal Optimization (SMO) algorithm, achieving around 96% accuracy. Experimental results show that the proposed system is effective and accurate for age classification.
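
A hedged sketch of the classification stage follows; the landmark indices, data and labels are hypothetical, and scikit-learn's SVC (libsvm, an SMO-style solver) stands in for SVM-SMO.

```python
# Hedged sketch of the classification stage; landmark indices, data and labels are
# hypothetical, and scikit-learn's SVC (libsvm, an SMO-style solver) stands in for SVM-SMO.
import numpy as np
from sklearn.svm import SVC

def feature_distances(landmarks, pairs):
    """landmarks: (18, 2) array of (x, y) points; pairs: index pairs to measure."""
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j]) for i, j in pairs])

pairs = [(0, 1), (2, 3), (4, 5), (6, 7)]                   # 4 distances over 8 assumed points
landmark_sets = np.random.rand(50, 18, 2)                  # toy landmark data
ages = np.random.randint(0, 3, 50)                         # toy age-group labels
X = np.array([feature_distances(lm, pairs) for lm in landmark_sets])
clf = SVC(kernel='rbf').fit(X, ages)
```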

Keywords: 3D Face Model, Face Anthropometrics, Facial Features Extraction, Feature distances, SVM-SMO

751 Quantum Enhanced Correlation Matrix Memories via States Orthogonalisation

Authors: Mario Mastriani, Marcelo Naiouf

Abstract:

This paper introduces a Quantum Correlation Matrix Memory (QCMM) and an Enhanced QCMM (EQCMM), which are useful for working with quantum memories. A version of the classical Gram-Schmidt orthogonalisation process in Dirac notation (called the Quantum Orthogonalisation Process, QOP) is presented to convert a non-orthonormal quantum basis, i.e., a set of non-orthonormal quantum vectors (qudits), into an orthonormal quantum basis, i.e., a set of orthonormal qudits. This work shows that it is possible to improve the performance of the QCMM thanks to the QOP algorithm. Besides, the EQCMM algorithm has many additional fields of application, e.g., steganography, as a replacement for Hopfield networks, bilevel image processing, etc. Finally, it is important to mention that the EQCMM is extremely easy to implement in any firmware.
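
For orientation only, a classical Gram-Schmidt orthonormalisation over complex state vectors is sketched below; the paper's QOP is the Dirac-notation counterpart of this procedure, so this is just an analogue.

```python
# Classical Gram-Schmidt orthonormalisation over complex state vectors, for orientation only;
# the paper's QOP is the Dirac-notation counterpart of this procedure.
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for b in basis:
            w = w - np.vdot(b, w) * b        # subtract the projection <b|w> |b>
        norm = np.linalg.norm(w)
        if norm > 1e-12:                     # drop (numerically) dependent vectors
            basis.append(w / norm)
    return np.array(basis)

Q = gram_schmidt(np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=complex))
print(np.round(Q @ Q.conj().T, 6))           # ~ identity: the qudits are now orthonormal
```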

Keywords: Quantum Algebra, correlation matrix memory, Dirac notation, orthogonalisation.

750 Analysis of EEG Signals Using Wavelet Entropy and Approximate Entropy: A Case Study on Depression Patients

Authors: Subha D. Puthankattil, Paul K. Joseph

Abstract:

Analyzing the brain signals of patients suffering from depression may lead to interesting observations, with signal parameters that are quite different from those of normal controls. The present study adopts two different methods, a time-frequency domain method and a nonlinear method, for the analysis of EEG signals acquired from depression patients and age- and sex-matched normal controls. The time-frequency domain analysis is realized using wavelet entropy, and approximate entropy is employed for the nonlinear analysis. Wavelet entropy and approximate entropy thus reveal the ability of the signal processing technique and the nonlinear method to differentiate the physiological aspects of the brain state.
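
A hedged sketch of the two measures is given below: wavelet entropy from a PyWavelets multiresolution decomposition and a basic approximate-entropy routine; the wavelet family, decomposition level and ApEn parameters are illustrative choices, not the authors' settings.

```python
# Hedged sketch of the two measures: wavelet entropy from a PyWavelets multiresolution
# decomposition and a basic approximate-entropy routine; wavelet family, level and ApEn
# parameters are illustrative choices, not the authors' settings.
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet='db4', level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energy = np.array([np.sum(c ** 2) for c in coeffs])
    p = energy / energy.sum()                        # relative wavelet energy per sub-band
    return -np.sum(p * np.log(p))                    # Shannon entropy of the energy distribution

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, float)
    r = 0.2 * x.std() if r is None else r
    def phi(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        dist = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)   # Chebyshev distances
        return np.mean(np.log(np.mean(dist <= r, axis=1)))
    return phi(m) - phi(m + 1)

eeg = np.random.randn(1024)                          # toy signal in place of an EEG epoch
print(wavelet_entropy(eeg), approximate_entropy(eeg[:200]))
```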

Keywords: EEG, Depression, Wavelet entropy, Approximate entropy, Relative Wavelet energy, Multiresolution decomposition.

749 Relevance Feedback within CBIR Systems

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

We present here the results of a comparative study of some techniques, available in the literature, related to the relevance feedback mechanism in the case of short-term learning. Only one of the methods considered here, the K-nearest neighbors algorithm (KNN), belongs to the data mining field; the rest are purely information retrieval methods and fall under three major axes: query shifting, feature weighting and optimization of the parameters of the similarity metric. As a contribution, and in addition to the comparative purpose, we propose a new version of the KNN algorithm, referred to as incremental KNN, which is distinct from the original version in that, besides the influence of the seeds, the rating of the current target image is also influenced by the images already rated. The results presented here were obtained from experiments conducted on the Wang database for one iteration, using color moments on the RGB space. This compact descriptor, color moments, is adequate for the efficiency needed in interactive systems. The results obtained allow us to claim that the proposed algorithm gives good results; it even outperforms a wide range of techniques available in the literature.
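
A minimal sketch of the color-moments descriptor assumed here follows: mean, standard deviation and skewness per RGB channel, giving a compact 9-dimensional feature vector.

```python
# Minimal sketch of the color-moments descriptor assumed here: mean, standard deviation and
# skewness per RGB channel, giving a compact 9-dimensional feature vector.
import numpy as np

def color_moments(rgb):
    feats = []
    for c in range(3):                                   # R, G, B
        ch = rgb[..., c].astype(float).ravel()
        mu, sigma = ch.mean(), ch.std()
        skew = np.cbrt(np.mean((ch - mu) ** 3))          # cube root of the third central moment
        feats.extend([mu, sigma, skew])
    return np.array(feats)

print(color_moments(np.random.randint(0, 256, (64, 64, 3))).shape)   # (9,)
```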

Keywords: CBIR, Category Search, Relevance Feedback (RFB), Query Point Movement, Standard Rocchio’s Formula, Adaptive Shifting Query, Feature Weighting, Optimization of the Parameters of Similarity Metric, Original KNN, Incremental KNN.

748 A Novel Implementation of Application Specific Instruction-set Processor (ASIP) using Verilog

Authors: Kamaraju.M, Lal Kishore.K, Tilak.A.V.N

Abstract:

The general purpose processors used in embedded systems must satisfy constraints such as execution time, power consumption, code size and so on. An Application Specific Instruction-set Processor (ASIP), on the other hand, has advantages in terms of power consumption, performance and flexibility. In this paper, a 16-bit Application Specific Instruction-set Processor for sensor data transfer is proposed. The designed processor architecture consists of on-chip transmitter and receiver modules along with the processing and controlling units, enabling data transmission and reception on a single die. The data transfer is accomplished with fewer instructions than on a general purpose processor. The ASIP core operates at a maximum clock frequency of 1.132 GHz with a delay of 0.883 ns and consumes 569.63 mW at an operating voltage of 1.2 V. The ASIP is implemented in Verilog HDL using the Xilinx platform on a Virtex-4 device.

Keywords: ASIP, Data transfer, Instruction set, Processor

747 Audio Watermarking Using Spectral Modifications

Authors: Jyotsna Singh, Parul Garg, Alok Nath De

Abstract:

In this paper, we present a non-blind technique for adding a watermark to the Fourier spectral components of an audio signal in such a way that the modified amplitude does not exceed the maximum amplitude spread (MAS). This MAS is computed for the individual Discrete Fourier Transform (DFT) coefficients of each frame and is derived from the energy spreading function given by Schroeder. Using this technique one can store double the information within a given frame length, i.e., overlay a watermark of equal length on the host with minimal perceptual distortion. The watermark floats uniformly on the DFT components of the original signal. This helps in detecting any intentional manipulation of the watermarked audio. The scheme is also found to be robust against various signal processing attacks such as the presence of multiple watermarks, additive white Gaussian noise (AWGN) and MP3 compression.

Keywords: Discrete Fourier Transform, Spreading Function, Watermark, Pseudo Noise Sequence, Spectral Masking Effect

746 Application of Finite Dynamic Programming to Decision Making in the Use of Industrial Residual Water Treatment Plants

Authors: Oscar Vega Camacho, Andrea Vargas Guevara, Ellery Rowina Ariza

Abstract:

This paper presents the application of finite dynamic programming, specifically the "Markov Chain" model, as part of the decision making process of a company in the cosmetics sector located in the vicinity of Bogota DC. The objective of this process was to decide whether the company should completely reconstruct its wastewater treatment plant or instead optimize the plant through the addition of equipment. The goal of both of these options was to make the required improvements in order to comply with parameters established by national legislation regarding the treatment of waste before it is released into the environment. This technique will allow the company to select the best option and implement a solution for the processing of waste to minimize environmental damage and the acquisition and implementation costs.
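
As a loose illustration of the modelling style (the states, transition probabilities and costs below are hypothetical, not the company's data), each alternative can be scored by the expected discounted operating cost of a small Markov chain over compliance states plus its capital cost:

```python
# Loose illustration of the modelling style; the states, transition matrices and costs are
# hypothetical, not the company's data. Each alternative is scored by the expected discounted
# operating cost of a two-state Markov chain (non-compliant / compliant) plus its capital cost.
import numpy as np

def expected_discounted_cost(P, cost_per_state, horizon=20, discount=0.95):
    dist = np.array([1.0, 0.0])                     # start in the non-compliant state
    total = 0.0
    for t in range(horizon):
        total += (discount ** t) * (dist @ cost_per_state)
        dist = dist @ P                             # propagate the chain one period
    return total

cost = np.array([50.0, 5.0])                        # per-period cost in each state
P_rebuild  = np.array([[0.10, 0.90], [0.05, 0.95]])
P_optimize = np.array([[0.40, 0.60], [0.20, 0.80]])
print(expected_discounted_cost(P_rebuild,  cost) + 300.0)   # rebuild: operating + capital
print(expected_discounted_cost(P_optimize, cost) + 80.0)    # optimize: operating + capital
```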

Keywords: Decision making, Markov chain, optimization, wastewater.

745 Awareness of Value Addition of Sweet Potato (Ipomoea batatas (L.) Lam) In Osun State, Nigeria

Authors: A. M. Omoare, E. O. Fakoya, O. E. Fapojuwo, W. O. Oyediran

Abstract:

Awareness of value addition of sweet potato has received comparatively little attention in Nigeria, despite the potential of value addition to reduce perishability and enhance utilization of the crop in diverse product forms. This study assessed the awareness of value addition of sweet potato in Osun State, Nigeria. A multi-stage random sampling technique was used to select 120 respondents for the study. The data obtained were analyzed using descriptive statistics and multiple regression analysis. Findings showed that most (75.00%) of the respondents were male, with a mean age of 42.10 years, and 96.70% of the respondents had formal education. The mean farm size was 2.30 hectares. The majority (75.00%) of the respondents had more than 10 years of farming experience. Awareness of value addition of sweet potato was very low among the respondents. It is recommended that sweet potato farmers be empowered through effective and efficient extension training on the use of modern processing techniques in order to enhance value addition of sweet potato.

Keywords: Awareness, value addition, sweet potato, perishability.

744 Development of EPID-based Real time Dose Verification for Dynamic IMRT

Authors: Todsaporn Fuangrod, Daryl J. O'Connor, Boyd MC McCurdy, Peter B. Greer

Abstract:

An electronic portal imaging device (EPID) has become a tool for patient-specific IMRT dose verification in radiotherapy. Research studies have focused on pre- and post-treatment verification; however, there are currently no interventional procedures using EPID dosimetry that measure the dose in real time, as a mechanism to ensure that overdoses do not occur and that underdoses are detected as soon as is practically possible. As a result, an EPID-based real-time dose verification system for dynamic IMRT was developed and implemented with MATLAB/Simulink. The EPID image acquisition was set to continuous acquisition mode at 1.4 images per second. The system defined a time constraint, or execution gap, equal to the image acquisition time, so that every calculation must be completed before the next image capture is completed. In addition, the gamma-evaluation method was used for dose comparison, with two types of comparison monitored: individual-image and cumulative-dose comparison. The outputs of the system are the gamma map, the percentage of points with gamma < 1, and the mean gamma versus time, all in real time. Two strategies were used to test the system: an error detection test and a clinical data test. The system can monitor the actual dose delivery compared with the treatment plan data or a previous treatment's dose delivery, which means a radiation therapist is able to switch off the machine when an error is detected.

Keywords: Real-time dose verification, EPID dosimetry, simulation, dynamic IMRT.

743 Linguistic Devices Reflecting Violence in Border–Provinces of Southern Thailand on the Front Page of Local and National Newspapers

Authors: Chanokporn Angsuviriya

Abstract:

The objective of the study is to analyze the linguistic devices reflecting the violence in the southern border provinces, namely Pattani, Yala, Narathiwat and Songkla, on 1,344 front pages of three local newspapers (ChaoTai, Focus PhakTai and Samila Time) and two national newspapers (ThaiRath and Matichon), between 2004 and 2005 and between 2011 and 2012. The study shows that there are two important linguistic devices: 1) lexical choices, consisting of the use of verbs describing violence, the use of quantitative words and the use of words naming those who committed violent acts; and 2) metaphors, consisting of "A VIOLENT PROBLEM IS HEAT", "A VICTIM IS A LEAF", and "A TERRORIST IS A DOG". Comparing the linguistic devices between the two types of newspapers, national newspapers use more violent wording than local newspapers do. Moreover, they create more negative images of the south of Thailand by using stative verbs. In addition, in terms of metaphors, "A TERRORIST IS A FOX" is found only in national newspapers. As regards naming the terrorists, the noun phrase "southern insurgents", used collectively by national newspapers, has a strongly negative meaning. Moreover, "southern insurgents" has been taken up by Thais across the whole country, while the unmodified "insurgents" has been used only by local newspapers.

Keywords: Linguistic Devices, Local Newspapers, National Newspapers, Violence.

742 Sensory Evaluation of Cooked Sausages with Legumes Additive

Authors: Ilze Gramatina, Jelena Zagorska, Evita Straumite, Svetlana Sarvi

Abstract:

In the meat processing industry, the substitution of meat with non-meat ingredients is considered an important strategy for reducing overall production costs. The main purpose of the current research was to evaluate differences in the physical-chemical composition of cooked sausages with different legume additions. Peas (Pisum sativum), beans (Phaseolus vulgaris) and lentils (Lens culinaris) were used in the preparation of the sausages. The legumes were added to the sausages at a proportion of 20% of the total meat weight. All the ingredients were mixed, filled into casings, compressed, cooked and cooled. After storage, the samples were evaluated by sensory analysis. The sensory evaluation was carried out using the nine-point hedonic scale and a line scale. Sausages without legume flour were used as the control sample. The main conclusion of the current research is that legume flour can be successfully used in the production of cooked sausages.

Keywords: Legumes, cooked sausages.

741 Hiding Data in Images Using PCP

Authors: Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In recent years, everything has been trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over the network. Attacks, misuse or unauthorized access to information is of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver while no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out on steganography by different researchers. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to find out whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.

Keywords: Cover Image, LSB, Pixel Coordinate Position (PCP), Stego Image.
