Search results for: solvent extraction
702 Synthesis, Characterization and Performance Study of Newly Developed Amine Polymeric Membrane (APM) for Carbon Dioxide (CO2) Removal
Authors: Rizwan Nasir, Hilmi Mukhtar, Zakaria Man, Dzeti Farhah Mohshim
Abstract:
Carbon dioxide has been well associated with the greenhouse effect, and due to its corrosive nature it is an undesirable compound. A variety of physical and chemical processes are available for its removal. Previous work in this field has established that the alkanolamine group has the capability to remove carbon dioxide. This study therefore combined a polymeric membrane with alkanolamine solutions to fabricate an amine polymeric membrane (APM) for CO2 removal. It examines the effect of three amines: monoethanolamine (MEA), diethanolamine (DEA), and methyldiethanolamine (MDEA). The effect of each alkanolamine on the morphology and performance of polyethersulfone (PES) membranes was studied. Flat-sheet membranes were fabricated by the solvent evaporation method, adding the polymer and the different alkanolamine solutions to N-methyl-2-pyrrolidone (NMP) solvent. The final membranes were characterized by Field Emission Scanning Electron Microscopy (FESEM), Fourier Transform Infrared spectroscopy (FTIR) and Thermogravimetric Analysis (TGA), and their separation performance was studied. The PES-DEA and PES-MDEA membranes showed good ability to remove carbon dioxide.
Keywords: Amine Polymeric membrane, Alkanolamine solution, CO2 Removal, Characterization.
701 Arabic Character Recognition using Artificial Neural Networks and Statistical Analysis
Authors: Ahmad M. Sarhan, Omar I. Al Helalat
Abstract:
In this paper, an Arabic letter recognition system based on Artificial Neural Networks (ANNs) and statistical feature extraction is presented. The ANN is trained using the Least Mean Squares (LMS) algorithm. In the proposed system, each typed Arabic letter is represented by a matrix of binary numbers that is used as input to a simple feature extraction system whose output, together with the input matrix, is fed to an ANN. Simulation results are provided and show that the proposed system consistently produces a lower Mean Squared Error (MSE) and higher success rates than current ANN solutions.
Keywords: ANN, Backpropagation, Gaussian, LMS, MSE, Neuron, standard deviation, Widrow-Hoff rule.
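The LMS (Widrow-Hoff) training rule named in the abstract can be sketched as follows; the tiny binary "glyphs", learning rate and epoch count below are illustrative stand-ins, not the paper's actual letter matrices or network:

```python
import numpy as np

def lms_train(X, d, lr=0.05, epochs=50):
    """Widrow-Hoff (LMS) rule for one linear neuron: w <- w + lr*(d - y)*x."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = w @ x                      # neuron output for this pattern
            w += lr * (target - y) * x     # correct in proportion to the error
    return w

# Two toy 2x2 binary "glyphs", flattened: stand-ins for typed-letter matrices
X = np.array([[1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
d = np.array([1.0, -1.0])                  # desired output per class
w = lms_train(X, d)
mse = np.mean((X @ w - d) ** 2)            # mean squared error after training
```

A real letter recognizer would use one output neuron per letter class and much larger input matrices, but the per-weight update is the same.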
700 Development of Molecular Imprinted Polymers (MIPs) for the Selective Removal of Carbamazepine from Aqueous Solution
Authors: Bianca Schweiger, Lucile Bahnweg, Barbara Palm, Ute Steinfeld
Abstract:
The occurrence and removal of trace organic contaminants in the aquatic environment has become a focus of environmental concern. For the selective removal of carbamazepine from loaded waters, molecularly imprinted polymers (MIPs) were synthesized with carbamazepine as the template. The parameters varied were the type of monomer, crosslinker and porogen, the ratio of starting materials, and the synthesis temperature. The best results were obtained with a template-to-crosslinker ratio of 1:20, toluene as porogen, and methacrylic acid (MAA) as monomer. The MIPs were then capable of recovering 93% of the carbamazepine from a 10^-5 M landfill leachate solution that also contained caffeine and salicylic acid. By comparison, carbamazepine recoveries of 75% were achieved using a non-imprinted polymer (NIP) synthesized under the same conditions but without the template. In solutions containing landfill leachate, 93-96% of the carbamazepine was adsorbed, compared with an uptake of 73% by activated carbon. The best solvent for desorption was acetonitrile, for which the necessary solvent volume and the dilution with water were tested. Selected MIPs were tested for their reusability and showed good results for at least five cycles. Adsorption isotherms were prepared with carbamazepine solutions in the concentration range of 0.01 M to 5×10^-6 M. The heterogeneity index indicated a rather homogeneous binding site distribution.
Keywords: Carbamazepine, landfill leachate, removal, reuse.
699 Extraction of Bran Protein Using Enzymes and Polysaccharide Precipitation
Authors: Sudarat Jiamyangyuen, Tipawan Thongsook, Riantong Singanusong, Chanida Saengtubtim
Abstract:
Rice bran is normally used as a raw material for rice bran oil production or sold as animal feed at a low price. Conventionally, the protein in defatted rice bran is extracted by alkaline extraction and acid precipitation, which involves chemical usage and degrades some nutritious components. This study was conducted to extract rice bran protein concentrate (RBPC) from defatted rice bran using enzymes and employing polysaccharides in the precipitation step. The properties of the RBPC obtained were compared to those of a control sample extracted by the conventional method. The results showed that enzymatic extraction of protein from rice bran gave a higher protein recovery than alkaline extraction. Extraction with 2% (v/w) alcalase at 50 °C and pH 9.5 gave the highest protein content (2.44%) and yield (32.09%) in the extracted solution compared to the other enzymes. Rice bran protein concentrate powder prepared by a precipitation step using alginate (protein in solution to alginate ratio of 1:0.016) exhibited the highest protein content (27.55%) and yield (6.84%); precipitation using alginate was better than acid precipitation. RBPC extracted with alkaline (ALK) or the enzyme alcalase (ALC) and then precipitated with alginate (AL) (samples RBP-ALK-AL and RBP-ALC-AL) gave precipitation rates of 75% and 91.30%, respectively. Therefore, protein precipitation using alginate was selected. The amino acid profiles of the control sample and the sample precipitated with alginate, compared to casein and soy protein isolate, showed that the control sample had the highest content among all samples. A study of the functional properties of the RBPC showed that the highest nitrogen solubility occurred at pH 8-10. There was no statistically significant difference in emulsion capacity and emulsion stability between the control and the sample precipitated with alginate; however, the control sample showed higher foaming capacity and foaming stability than the sample precipitated with alginate.
The approach was successful in minimizing the chemicals used in the extraction and precipitation steps of rice bran protein concentrate preparation. This research concerns the production of a value-added product in which double the amount of protein (28%) compared to the original amount (14%) contained in rice bran could be beneficial when added to food products, e.g. a healthy drink high in protein and fiber. In addition, basic knowledge of the functional properties of rice bran protein concentrate was obtained, which can be used to select appropriate applications for this value-added product from rice bran.
Keywords: Alginate, carrageenan, rice bran, rice bran protein.
698 Functionality and Application of Rice Bran Protein Hydrolysates in Oil in Water Emulsions: Their Stabilities to Environmental Stresses
Authors: R. Charoen, S. Tipkanon, W. Savedboworn, N. Phonsatta, A. Panya
Abstract:
Rice bran protein hydrolysates (RBPH) were prepared from the defatted rice bran of two Thai rice cultivars (Plai-Ngahm-Prachinburi, PNP, and Khao Dok Mali 105, KDM105) using an enzymatic method. This research aimed to optimize enzyme-assisted protein extraction. In addition, the functional properties of the RBPH and their stabilities to environmental stresses, including pH (3 to 8), ionic strength (0 mM to 500 mM) and thermal treatment (30 °C to 90 °C), were investigated. Results showed that the optimal enzymatic process for protein extraction from defatted rice bran was as follows: enzyme concentration 0.075 g per 5 g of protein, extraction temperature 50 °C and extraction time 4 h. The protein hydrolysate powders obtained had degrees of hydrolysis of 21.05% (PNP) and 19.92% (KDM105). The solubility of the protein hydrolysates at pH 4-6 ranged from 27.28-38.57% for PNP and 27.60-43.00% for KDM105. In general, the antioxidant activities of KDM105, indicated by total phenolic content, FRAP, ferrous ion chelating (FIC) and 2,2'-azino-bis-3-ethylbenzthiazoline-6-sulphonic acid (ABTS) assays, were higher than those of PNP. In terms of functional properties, the emulsifying activity index (EAI) was 8.78 m²/g protein for KDM105, whereas that of PNP was 5.05 m²/g protein. The foaming capacity at 5 minutes was 47.33% for PNP and 52.98% for KDM105. Glutamine, alanine, valine and leucine were the major amino acids in the protein hydrolysates, and the total amino acid content of KDM105 was higher than that of PNP. Furthermore, we investigated the effect of environmental stresses on the stability of a 5% oil-in-water emulsion (5% oil, 10 mM citrate buffer) stabilized by RBPH (3.5%). The droplet diameter of the emulsion stabilized by KDM105 was smaller (d < 250 nm) than that produced by PNP. Under environmental stresses, RBPH-stabilized emulsions were stable at pH around 3 and 5-6, at high salt (< 400 mM, pH 7) and at temperatures between 30-50 °C.
Keywords: Functional properties, oil in water emulsion, protein hydrolysates, rice bran protein.
697 The Extraction and Stripping of Hg (II) from Produced Water via Hollow Fiber Contactor
Authors: Dolapop Sribudda, Ura Pancharoen
Abstract:
The separation of Hg(II) from produced water by hollow fiber contactors (HFC) was investigated. The system consisted of two hollow fiber modules connected in series: the first module was used for the extraction reaction and the second for the stripping reaction. The Aliquat 336 extractant was fed from the organic reservoir into the shell side of the first hollow fiber module and then continued to the shell side of the second module, with the organic liquid continuously recirculated back to the reservoir. The feed solution was pumped into the lumen (tube side) of the first hollow fiber module; simultaneously, the stripping solution was pumped in the same way through the tube side of the second module, so that the feed and stripping solutions flowed countercurrently. Samples were collected at the feed and stripping solution outlets at 1 h, and the Hg(II) concentration was determined by Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES). The feed solution was produced water from the Gulf of Thailand. The extractant was Aliquat 336 dissolved in kerosene diluent, and the stripping solutions used were nitric acid (HNO3) and thiourea (NH2CSNH2). The effects of carrier concentration and type of stripping solution were investigated. Results showed that the best conditions were 10% (v/v) Aliquat 336 and 1.0 M NH2CSNH2; at this optimum, the extraction and stripping of Hg(II) were 98% and 44.2%, respectively.
Keywords: Hg (II), hollow fiber contactor, produced water, wastewater treatment.
696 Extraction of Craniofacial Landmarks for Preoperative to Intraoperative Registration
Authors: M. Gooroochurn, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs
Abstract:
This paper presents the automated methods employed for extracting craniofacial landmarks in white light images as part of a registration framework designed to support three neurosurgical procedures. The intraoperative space is characterised by white light stereo imaging, while the preoperative plan is performed on CT scans. The registration aims at aligning these two modalities to provide a calibrated environment for image-guided solutions. The neurosurgical procedures can then be carried out by mapping the entry and target points from CT space onto the patient's space. The registration basis adopted consists of natural landmarks (eye corner and ear tragus). An accuracy of 5 mm is deemed sufficient for these three procedures, and the validity of the selected registration basis in achieving this accuracy has been assessed by simulation studies. The registration protocol is briefly described, followed by a presentation of the automated techniques developed for the extraction of the craniofacial features and the results obtained from tests on the AR and FERET databases. Since the three targeted neurosurgical procedures are routinely used for head injury management, the effect of bruised or swollen faces on the automated algorithms is assessed, and a user-interactive method is proposed to deal with such unpredictable circumstances.
Keywords: Face Processing, Craniofacial Feature Extraction, Preoperative to Intraoperative Registration, Registration Basis.
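Mapping points from CT space onto the patient's space, given matched natural landmarks, amounts to a rigid least-squares fit; a sketch using the standard Kabsch/SVD solution on synthetic landmark coordinates (not the paper's data or protocol):

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (Kabsch/SVD): find R, t with Q ~ P @ R.T + t."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                     # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic "CT-space" landmarks (e.g. eye corners, ear tragi) in mm, and their
# intraoperative counterparts after a known rotation and translation
rng = np.random.default_rng(1)
P = rng.uniform(size=(4, 3)) * 100.0
a = np.deg2rad(20.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
Q = P @ R_true.T + np.array([5.0, -3.0, 12.0])
R, t = rigid_register(P, Q)
tre = np.linalg.norm(P @ R.T + t - Q, axis=1).max()   # registration error (mm)
```

With noise-free, correctly matched landmarks the recovered transform is exact; in practice accuracy is limited by landmark localization error, which is what the 5 mm budget covers.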
695 Parameters Extraction for Pseudomorphic HEMTs Using Genetic Algorithms
Authors: Mazhar B. Tayel, Amr H. Yassin
Abstract:
A small-signal model and its parameter extraction for a pseudomorphic high electron mobility transistor (PHEMT) are presented. Both extrinsic and intrinsic circuit elements of the small-signal model are determined using a genetic algorithm (GA) as a stochastic global search and optimization tool. The parameter extraction is performed on a 200 μm gate-width AlGaAs/InGaAs PHEMT. The equivalent circuit elements of the proposed 18-element model are determined directly from the measured S-parameters, and the GA is used to extract the parameters of the model from 0.5 GHz up to 18 GHz.
Keywords: PHEMT, Genetic Algorithms, small signal modeling, optimization.
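A GA-based extraction loop of this kind can be sketched on a toy two-parameter model; the model function, parameter bounds and GA settings below are illustrative assumptions, not the paper's 18-element PHEMT circuit:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "measured" frequency response standing in for S-parameter data;
# true_p plays the role of two unknown circuit elements (values invented)
freqs = np.linspace(0.5e9, 18e9, 30)
true_p = np.array([2.0e-12, 5.0])
def model(p):
    return 1.0 / np.sqrt(1.0 + (2 * np.pi * freqs * p[0] * p[1] * 1e2) ** 2)
measured = model(true_p)

def fitness(p):                            # GA maximizes fitness = -MSE
    return -np.mean((model(p) - measured) ** 2)

# Minimal real-coded GA: tournament selection, blend crossover, mutation, elitism
pop = rng.uniform([1e-13, 1.0], [1e-11, 10.0], size=(40, 2))
for gen in range(60):
    fit = np.array([fitness(p) for p in pop])
    pair = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[pair[:, 0]] > fit[pair[:, 1]], pair[:, 0], pair[:, 1])]
    alpha = rng.uniform(size=(len(pop), 1))
    children = alpha * parents + (1 - alpha) * parents[::-1]          # blend crossover
    children += rng.normal(scale=[1e-13, 0.05], size=children.shape)  # mutation
    children[np.argmin(fit)] = pop[np.argmax(fit)]                    # keep the elite
    pop = children
best = pop[np.argmax([fitness(p) for p in pop])]
err = -fitness(best)                       # final model-vs-measurement MSE
```

The real problem optimizes 18 element values against complex S-parameters over the same 0.5-18 GHz band, but the selection/crossover/mutation skeleton is unchanged.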
694 Lead in the Soil-Plant System Following Aged Contamination from Ceramic Wastes
Authors: F. Pedron, M. Grifoni, G. Petruzzelli, M. Barbafieri, I. Rosellini, B. Pezzarossa
Abstract:
Lead contamination of agricultural land mainly vegetated with perennial ryegrass (Lolium perenne) has been investigated. The metal derived from the past discharge of sludge from a ceramic industry that had used lead paints. The results showed very high lead concentrations in many soil samples. In order to assess the lead contamination, a sequential extraction with H2O, KNO3 and EDTA was performed, and the chemical forms of lead in the soil were evaluated: more than 70% of the lead was in a potentially bioavailable form. Analysis of Lolium perenne showed elevated lead concentrations. A Freundlich-like model was used to describe the transfer of the metal from the soil to the plant.
Keywords: Bioavailability, Freundlich-like equation, sequential extraction, soil lead contamination.
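A Freundlich-like transfer model, C_plant = K · C_soil^(1/n), is conventionally fitted by linearizing in log-log space; a sketch with synthetic concentrations (the K, n and data values are hypothetical, not the study's measurements):

```python
import numpy as np

# Freundlich-like soil-to-plant transfer: C_plant = K * C_soil**(1/n).
# Synthetic soil/plant lead concentrations (mg/kg) standing in for field data.
c_soil = np.array([50.0, 120.0, 300.0, 700.0, 1500.0])
K_true, n_true = 0.8, 2.0
c_plant = K_true * c_soil ** (1.0 / n_true)

# Linearize: log C_plant = log K + (1/n) log C_soil, then ordinary least squares
slope, intercept = np.polyfit(np.log(c_soil), np.log(c_plant), 1)
K_fit, n_fit = np.exp(intercept), 1.0 / slope
```

With real data the scatter around the log-log line indicates how well the Freundlich form describes the transfer.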
693 SFE as a Superior Technique for Extraction of Eugenol-Rich Fraction from Cinnamomum tamala Nees (Bay Leaf) - Process Analysis and Phytochemical Characterization
Authors: Sudip Ghosh, Dipanwita Roy, Dipan Chatterjee, Paramita Bhattacharjee, Satadal Das
Abstract:
The highest yield of eugenol-rich fractions from Cinnamomum tamala (bay leaf) leaves was obtained by supercritical carbon dioxide (SC-CO2) extraction, compared to hydro-distillation, organic solvent, liquid CO2 and subcritical CO2 extractions. The SC-CO2 extraction parameters were optimized to obtain an extract with maximum eugenol content. This was achieved using a sample size of 10 g at 55 °C and 512 bar after 60 min, at a flow rate of 25.0 cm³/s of gaseous CO2. This extract had the best combination of phytochemical properties, such as phenolic content (1.77 mg gallic acid/g dry bay leaf), reducing power (0.80 mg BHT/g dry bay leaf), antioxidant activity (IC50 of 0.20 mg/ml) and anti-inflammatory potency (IC50 of 1.89 mg/ml). The compounds in this extract were identified by GC-MS analysis, and its antimicrobial potency was also evaluated: the MIC values against E. coli, P. aeruginosa and S. aureus were 0.5, 0.25 and 0.5 mg/ml, respectively.
Keywords: Antimicrobial potency, Cinnamomum tamala, eugenol, supercritical carbon dioxide extraction.
692 Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction
Authors: Jérôme Azé, Mathieu Roche, Yves Kodratoff, Michèle Sebag
Abstract:
Term extraction, a key data preparation step in text mining, extracts the terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept "Machine Learning"). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting or not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (area under the ROC curve). The approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures for that collocation. As this representation is general (over corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to the state-of-the-art Support Vector Machine (SVM).
Keywords: Text mining, Terminology Extraction, Evolutionary algorithm, ROC Curve.
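The criterion being maximized, the trade-off between true and false positive rates, is the area under the ROC curve; a minimal sketch of scoring a ranking over labelled collocations (the scores and labels here are made up):

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve: probability that a random positive item
    is ranked above a random negative one (ties count one half)."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical scores from a learned ranking function over candidate
# collocations; 1 = labelled interesting, 0 = not interesting
scores = [0.9, 0.8, 0.75, 0.4, 0.3, 0.2]
labels = [1,   1,   0,    1,   0,   0]
auc = roc_auc(scores, labels)
```

An evolutionary search like ROGER's treats this AUC as the fitness of a candidate ranking function.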
691 Implementing a Database from a Requirement Specification
Abstract:
Creating a database schema is essentially a manual process. From a requirement specification, the information contained within it has to be analyzed and reduced into a set of tables, attributes and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a relational database from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that a first draft of a relational database schema can be extracted from a requirement specification using NLP tools and techniques with minimum user intervention. This method is therefore a step towards a solution that requires little or no user intervention.
Keywords: Information Extraction, Natural Language Processing, Relation Extraction.
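As a rough illustration of the idea, not of the CoreNLP-based tool itself, even a toy heuristic can turn stylized requirement sentences into candidate tables; the sentence pattern and sample text are invented:

```python
import re

# Stylized requirement sentences of the form "A <entity> has <attributes>."
# (a toy heuristic standing in for a real NLP pipeline such as CoreNLP)
spec = ("A customer has a name, an address and a phone. "
        "An order has a date and a total.")

tables = {}
for entity, attrs in re.findall(r"An?\s+(\w+)\s+has\s+([^.]+)\.", spec):
    words = re.findall(r"\w+", attrs)
    tables[entity] = [w for w in words if w not in ("a", "an", "and")]
```

A real pipeline would use part-of-speech tags and dependency parses instead of regexes, and would also infer relationships and keys between the extracted tables.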
690 Extractable Heavy Metal Concentrations in Bottom Ash from Incineration of Wood-Based Residues in a BFB Boiler Using Artificial Sweat and Gastric Fluids
Authors: Risto Pöykiö, Olli Dahl, Hannu Nurmesniemi
Abstract:
The highest extractable concentration in the artificial sweat fluid was observed for Ba (120 mg/kg d.w.), and the highest in the artificial gastric fluid for Al (9030 mg/kg d.w.). Furthermore, the extractable concentrations of Ba (550 mg/kg d.w.) and Zn (400 mg/kg d.w.) in the bottom ash using artificial gastric fluid were elevated. The extractable concentrations of all heavy metals in the artificial gastric fluid were higher than those in the artificial sweat fluid. These results are reasonable in light of the fact that the artificial gastric fluid was extremely acidic both before (pH 1.54) and after (pH 1.94) extraction, whereas the artificial sweat fluid was slightly alkaline before (pH 6.50) and after (pH 8.51) extraction.
Keywords: Ash, artificial fluid, heavy metals, in vitro, waste.
689 Hand Gesture Recognition Based on Combined Features Extraction
Authors: Mahmoud Elmezain, Ayoub Al-Hamadi, Bernd Michaelis
Abstract:
Hand gesture recognition is an active area of research in the vision community, mainly for the purposes of sign language recognition and human-computer interaction. In this paper, we propose a system to recognize alphabet characters (A-Z) and numbers (0-9) in real time from stereo color image sequences using Hidden Markov Models (HMMs). Our system is based on three main stages: automatic segmentation and preprocessing of the hand regions, feature extraction, and classification. In the segmentation and preprocessing stage, color and 3D depth maps are used to detect the hands, and the hand trajectory is tracked using the mean-shift algorithm and a Kalman filter. In the feature extraction stage, 3D combined features of location, orientation and velocity with respect to a Cartesian coordinate system are used, and k-means clustering is then employed to build the HMM codewords. In the final, classification stage, the Baum-Welch algorithm is used to fully train the HMM parameters, and the gestures for alphabets and numbers are recognized using a Left-Right Banded model in conjunction with the Viterbi algorithm. Experimental results demonstrate that our system can successfully recognize hand gestures with a 98.33% recognition rate.
Keywords: Gesture Recognition, Computer Vision & Image Processing, Pattern Recognition.
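Recognition with a Left-Right Banded model relies on Viterbi decoding; a minimal discrete-HMM sketch over toy k-means codewords (the transition and emission matrices below are illustrative, not trained values):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely state path for a discrete-observation HMM (log domain)."""
    T, N = len(obs), len(pi)
    logA, logB = np.log(A + 1e-300), np.log(B + 1e-300)
    delta = np.log(pi + 1e-300) + logB[:, obs[0]]
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        cand = delta[:, None] + logA          # cand[i, j]: arrive in j from i
        psi[t] = cand.argmax(axis=0)          # best predecessor of each state
        delta = cand.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Toy 3-state Left-Right Banded model over k-means codewords {0, 1, 2}
pi = np.array([1.0, 0.0, 0.0])                # must start in the first state
A = np.array([[0.6, 0.4, 0.0],                # banded: stay, or step right
              [0.0, 0.6, 0.4],
              [0.0, 0.0, 1.0]])
B = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],                # each state favours one codeword
              [0.1, 0.1, 0.8]])
path = viterbi([0, 0, 1, 1, 2, 2], pi, A, B)
```

In a full recognizer, one such model per gesture class is trained with Baum-Welch, and the class whose model gives the highest Viterbi likelihood wins.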
688 Comparative Study of Decision Trees and Rough Sets Theory as Knowledge Extraction Tools for Design and Control of Industrial Processes
Authors: Marcin Perzyk, Artur Soroczynski
Abstract:
General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough sets theory (RST) in extracting rules from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important aspects, particularly when not only a characterization of a problem is required but also detailed and precise rules, tailored to the actual, specific problems to be solved.
Keywords: Knowledge extraction, decision trees, rough sets theory, industrial processes.
687 Chilean Wines Classification based only on Aroma Information
Authors: Nicolás H. Beltrán, Manuel A. Duarte-Mermoud, Víctor A. Soto, Sebastián A. Salah, and Matías A. Bustos
Abstract:
Results of Chilean wine classification based on the information provided by an electronic nose are reported in this paper. The classification scheme consists of two parts: in the first stage, Principal Component Analysis is used as the feature extraction method to reduce the dimensionality of the original information; then, Radial Basis Function Neural Networks are used as the pattern recognition technique to perform the classification. The objective of this study is to classify different Cabernet Sauvignon, Merlot and Carménère wine samples from different years, valleys and vineyards of Chile.
Keywords: Feature extraction techniques, Pattern recognition techniques, Principal component analysis, Radial basis function neural networks, Wine classification.
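The two-stage scheme, PCA feature extraction followed by an RBF network, can be sketched in NumPy on synthetic "electronic nose" readings; the sensor count, class layout, component count and RBF width are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "electronic nose" readings: 3 wine classes x 30 samples, 12 sensors
centers = rng.normal(size=(3, 12)) * 3.0
X = np.vstack([c + rng.normal(scale=0.4, size=(30, 12)) for c in centers])
y = np.repeat([0, 1, 2], 30)

# Stage 1 - PCA feature extraction: project onto the top principal components
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:4].T                        # keep 4 components

# Stage 2 - RBF network: Gaussian units at the class means, linear output layer
proto = np.array([Z[y == k].mean(axis=0) for k in range(3)])
H = np.exp(-((Z[:, None, :] - proto[None, :, :]) ** 2).sum(-1) / (2 * 1.0 ** 2))
T = np.eye(3)[y]                         # one-hot targets
W, *_ = np.linalg.lstsq(H, T, rcond=None)
pred = (H @ W).argmax(axis=1)
accuracy = (pred == y).mean()
```

For brevity this fits and scores on the same samples; a real evaluation would hold out test wines and tune the number of components and RBF centers.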
686 Designing a Tool for Software Maintenance
Authors: Amir Ngah, Masita Abdul Jalil, Zailani Abdullah
Abstract:
The aim of software maintenance is to keep the software system in accordance with advances in software and hardware technology. One of the early tasks in software maintenance is to extract information at a higher level of abstraction. In this paper, we present the process of designing an information extraction tool for software maintenance. The tool can extract basic information from old programs, such as variables, base classes, derived classes, objects of classes, and functions. The tool has two main parts: a lexical analyzer module that reads the input file character by character, and a searching module through which users can obtain the basic information from existing programs. We implemented this tool for a patterned sub-C++ language as the input.
Keywords: Extraction tool, software maintenance, reverse engineering, C++.
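The two-part design, a lexical pass plus a searching pass, can be illustrated with a toy regex-based extractor for sub-C++ input; the patterns and sample source are illustrative, not the paper's tool:

```python
import re

# Toy patterns for a patterned sub-C++ input: class declarations (with an
# optional base class) and function definitions
CLASS_RE = re.compile(r"class\s+(\w+)(?:\s*:\s*(?:public|protected|private)\s+(\w+))?")
FUNC_RE = re.compile(r"\w+\s+(\w+)\s*\([^)]*\)\s*\{")

def extract_info(source):
    """Return (class, base-or-None) pairs and function names found in source."""
    classes = [(m.group(1), m.group(2)) for m in CLASS_RE.finditer(source)]
    functions = [m.group(1) for m in FUNC_RE.finditer(source)]
    return classes, functions

code = """
class Shape { };
class Circle : public Shape {
    double area() { return 3.14; }
};
int main() { return 0; }
"""
classes, functions = extract_info(code)
```

A production tool would use a real lexer and parser rather than regexes, since C++ syntax (templates, nested braces, comments) quickly defeats pattern matching.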
685 Jamun Juice Extraction Using Commercial Enzymes and Optimization of the Treatment with the Help of Physicochemical, Nutritional and Sensory Properties
Authors: Payel Ghosh, Rama Chandra Pradhan, Sabyasachi Mishra
Abstract:
Jamun (Syzygium cuminii L.) is an important indigenous minor fruit with high medicinal value. Jamun cultivation is unorganized, and there is a huge loss of this fruit every year; the perishable nature of the fruit makes its postharvest management further difficult. Due to the strong cell wall structure of pectin-protein bonds and the hard seeds, extraction of the juice is difficult. Enzymatic treatment has been used commercially to improve juice quality and yield. The objective of the study was to optimize the best treatment method for juice extraction. Enzymes (pectinase and tannase) from different strains were used, and for each enzyme the best result was obtained using response surface methodology. Optimization was based on physicochemical properties, nutritional properties, sensory quality and cost estimation. According to the quality aspects, cost analysis and sensory evaluation, the optimal enzymatic treatment used pectinase from an Aspergillus aculeatus strain. The optimum conditions for the treatment were 44 °C for 80 minutes at a concentration of 0.05% (w/w). Under these conditions, a yield of 75% was obtained, with turbidity of 32.21 NTU, clarity of 74.39%T, polyphenol content of 115.31 mg GAE/g and protein content of 102.43 mg/g, with a significant difference in overall acceptability.
Keywords: Jamun, enzymatic treatment, physicochemical property, sensory analysis, optimization.
684 Image Thresholding for Weld Defect Extraction in Industrial Radiographic Testing
Authors: Nafaâ Nacereddine, Latifa Hamami, Djemel Ziou
Abstract:
In non-destructive testing by radiography, accurate knowledge of the weld defect shape is an essential step in assessing the quality of the weld and deciding on its acceptance or rejection. Because of the complex nature of the images considered, and so that the detected defect region represents the real defect as accurately as possible, the choice of thresholding method must be made judiciously. In this paper, performance criteria are used to conduct a comparative study of thresholding methods based on the gray level histogram, the 2-D histogram and a locally adaptive approach for weld defect extraction in radiographic images.
Keywords: 1D and 2D histogram, locally adaptive approach, performance criteria, radiographic image, thresholding, weld defect.
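Among gray-level-histogram methods, Otsu's between-class-variance criterion is a standard baseline; a sketch on a synthetic bimodal weld-image histogram (the two modes below are invented, not radiographic data):

```python
import numpy as np

def otsu_threshold(hist):
    """Otsu's method: pick the gray level maximizing between-class variance."""
    p = np.asarray(hist, float)
    p = p / p.sum()
    levels = np.arange(len(p))
    w0 = np.cumsum(p)                       # probability of the "below" class
    mu = np.cumsum(p * levels)              # cumulative mean gray level
    mu_T = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_T * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return int(np.nanargmax(sigma_b))

# Synthetic bimodal histogram: dark defect mode near 40, background near 180
levels = np.arange(256)
hist = (1000 * np.exp(-(levels - 40.0) ** 2 / (2 * 15.0 ** 2))
        + 3000 * np.exp(-(levels - 180.0) ** 2 / (2 * 20.0 ** 2)))
t = otsu_threshold(hist)
```

Locally adaptive approaches instead recompute such a threshold per window, which is what helps when illumination or film density varies across the radiograph.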
683 Evaluation of Solid Phase Micro-extraction with Standard Testing Method for Formaldehyde Determination
Authors: Y. L. Yung, Kong Mun Lo
Abstract:
In this study, solid phase micro-extraction (SPME) was optimized to improve the sensitivity and accuracy of formaldehyde determination for plywood panels. Further work was carried out to compare the newly developed technique with the existing method, which reacts formaldehyde collected in desiccators with acetyl acetone reagent (DC-AA). In SPME, formaldehyde was first derivatized with O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride (PFBHA), and analysis was then performed by gas chromatography in combination with mass spectrometry (GC-MS). SPME data for various wood species gave satisfactory results, with relative standard deviations (RSDs) in the range of 3.1-10.3%. They also correlated well with DC values, giving a correlation coefficient, RSQ, of 0.959. Quantitative analysis of formaldehyde by SPME is an alternative of great potential in the wood industry.
Keywords: Formaldehyde, GCMS, Plywood, SPME.
682 Qualitative and Quantitative Characterization of Generated Waste in Nouri Petrochemical Complex, Assaluyeh, Iran
Authors: L. Heidari, M. Jalili Ghazizade
Abstract:
In recent years, different petrochemical complexes have been established to produce aromatic compounds. Among them, the Nouri Petrochemical Complex (NPC), located in southern Iran, is the largest producer of aromatic raw materials in the world. Environmental concerns have been raised in this region due to the generation of different types of solid waste in the aromatics production process, and consequently industrial waste characterization has been thoroughly considered. The aim of this study is the qualitative and quantitative characterization of the industrial waste generated in the aromatics production process and the determination of the best method for its management. For this purpose, all industrial waste generated during the production process was recorded using a checklist. Four main industrial wastes were identified: spent industrial soil, spent catalyst, spent molecular sieves and spent N-formyl morpholine (NFM) solvent. The amounts of heavy metals and organic compounds in these wastes were then measured in order to identify the nature and toxicity of these dangerous compounds, and the wastes were classified based on the laboratory results as well as different international hazardous waste lists such as those of the EPA, UNEP and the Basel Convention. Finally, the best disposal method was selected based on environmental, economic and technical aspects.
Keywords: Spent industrial soil, spent molecular sieve, spent N-formyl morpholine solvent.
681 Fuzzy Wavelet Packet based Feature Extraction Method for Multifunction Myoelectric Control
Authors: Rami N. Khushaba, Adel Al-Jumaily
Abstract:
The myoelectric signal (MES) is one of the biosignals used to help humans control equipment. Recent approaches to MES classification for controlling prosthetic devices with pattern recognition techniques have revealed two problems: first, the classification performance of the system starts to degrade as the number of motion classes to be classified increases; second, the additional, complicated methods used to solve the first problem increase the computational cost of a multifunction myoelectric control system. In an effort to solve these problems and to achieve a feasible design for real-time implementation with high overall accuracy, this paper presents a new method for feature extraction in MES recognition systems. The method extracts features by applying the Wavelet Packet Transform (WPT) to the MES from multiple channels, and then employs the fuzzy c-means (FCM) algorithm to generate a measure that judges the suitability of features for classification. Finally, Principal Component Analysis (PCA) is used to reduce the size of the data before computing the classification accuracy with a multilayer perceptron neural network. The proposed system produces powerful classification results (99% accuracy) using only a small portion of the original feature set.
Keywords: Biomedical Signal Processing, Data Mining and Information Extraction, Machine Learning, Rehabilitation.
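The fuzzy c-means step can be sketched in NumPy; the two 2-D blobs below stand in for wavelet-packet feature vectors, and the initialization, fuzzifier m and blob layout are assumptions for illustration:

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=50):
    """Fuzzy c-means: returns soft memberships U (n x c) and centers V (c x d)."""
    V = X[np.linspace(0, len(X) - 1, c).astype(int)].copy()  # spread-out init
    for _ in range(iters):
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) + 1e-12
        inv = 1.0 / d2 ** (1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)    # membership update
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]    # fuzzy-weighted centers
    return U, V

# Two well-separated blobs standing in for wavelet-packet feature vectors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
               rng.normal(4.0, 0.3, size=(50, 2))])
U, V = fuzzy_cmeans(X, c=2)
hard = U.argmax(axis=1)                             # crisp labels from memberships
```

In the paper's pipeline, how cleanly such memberships separate per feature is turned into a suitability score used to keep or discard that feature before PCA.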
680 Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines
Authors: K. Kavitha, S. Arivazhagan
Abstract:
A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. Classification accuracy can be improved only if both the feature extraction and the classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, textural classification is adopted. Run-length feature extraction is employed along with Principal Components and Independent Components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for the first forty of these bands, and from the GLRLMs the run-length features of individual pixels are computed. Principal Components are calculated for another forty bands, and Independent Components for the remaining forty. As Principal and Independent Components can represent the textural content of pixels, they are treated as features. The combination of the run-length features, Principal Components, and Independent Components forms the combined feature set used for classification. An SVM with a Binary Hierarchical Tree is used to classify the hyperspectral image. Results are validated against ground truth, and accuracies are calculated.
Keywords: Multi-class, Run Length features, PCA, ICA, classification and Support Vector Machines.
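The run-length texture features above are derived from a Gray Level Run Length Matrix. As a minimal sketch, the following numpy code builds a horizontal-run GLRLM for a toy quantized band (the band values, gray-level count, and the Short Run Emphasis statistic shown are illustrative, not the paper's configuration).

```python
import numpy as np

def glrlm_horizontal(img, levels):
    """Gray Level Run Length Matrix for horizontal runs.
    Rows index the gray level, columns index the run length (1..width)."""
    img = np.asarray(img)
    glrlm = np.zeros((levels, img.shape[1]), dtype=int)
    for row in img:
        run_val, run_len = row[0], 1
        for v in row[1:]:
            if v == run_val:
                run_len += 1
            else:
                glrlm[run_val, run_len - 1] += 1
                run_val, run_len = v, 1
        glrlm[run_val, run_len - 1] += 1   # close the last run of the row
    return glrlm

def short_run_emphasis(glrlm):
    """SRE: weights short runs heavily, so fine textures score higher."""
    j = np.arange(1, glrlm.shape[1] + 1)
    return (glrlm / j ** 2).sum() / glrlm.sum()

# A toy 3-level quantized band.
band = np.array([[0, 0, 1, 1],
                 [2, 2, 2, 2],
                 [0, 1, 0, 1]])
m = glrlm_horizontal(band, levels=3)
print(m.sum(), short_run_emphasis(m))  # 7 runs in total
```

Other standard run-length statistics (Long Run Emphasis, Run Percentage, etc.) follow the same pattern of weighted sums over the matrix.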
679 Prediction of Cardiovascular Disease by Applying Feature Extraction
Authors: Nebi Gedik
Abstract:
Heart disease threatens the lives of a great number of people around the world every year, and heart conditions account for a large share of all deaths; early diagnosis and treatment are therefore critical. Diagnosing heart disease is complicated by the several factors that affect health, such as high blood pressure, raised cholesterol, and an irregular pulse rhythm. Artificial intelligence has the potential to assist in the early detection and treatment of diseases, and improving heart failure prediction is one of the primary goals of research on heart disease risk assessment. This study aims to determine the features that yield the most successful classification in detecting cardiovascular disease. The performance of each feature is compared using the K-Nearest Neighbor machine learning method, and the feature giving the most successful performance is identified.
Keywords: Cardiovascular disease, feature extraction, supervised learning, k-NN.
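The per-feature comparison described above can be sketched as follows: train a k-NN classifier on one feature at a time and compare test accuracies. This is a minimal numpy illustration on synthetic data (the class-shifted "informative" feature and the noise feature are stand-ins, not the study's clinical variables).

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Plain k-NN: majority vote among the k nearest (Euclidean) neighbours."""
    preds = []
    for x in test_X:
        dist = np.linalg.norm(train_X - x, axis=1)
        votes = train_y[np.argsort(dist)[:k]]
        preds.append(np.bincount(votes).argmax())
    return np.array(preds)

rng = np.random.default_rng(42)
n, split = 200, 150
y = rng.integers(0, 2, n)
# Feature 0 is informative (its mean shifts with the class); feature 1 is pure noise.
X = np.column_stack([rng.normal(y * 2.0, 1.0), rng.normal(0.0, 1.0, n)])

accs = []
for f in range(X.shape[1]):
    Xf = X[:, [f]]
    pred = knn_predict(Xf[:split], y[:split], Xf[split:])
    accs.append((pred == y[split:]).mean())
    print(f"feature {f}: test accuracy {accs[-1]:.2f}")
```

The informative feature should clearly outperform the noise feature, which is the comparison the study performs across its candidate cardiovascular features.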
678 Detection of Power Quality Disturbances using Wavelet Transform
Authors: Sudipta Nath, Arindam Dey, Abhijit Chakrabarti
Abstract:
This paper presents features that characterize power quality disturbances in recorded voltage waveforms using the wavelet transform. The discrete wavelet transform has been used to detect and analyze power quality disturbances; the disturbances of interest include sag, swell, outage, and transient. A power system network has been simulated with the Electromagnetic Transients Program, and voltage waveforms including different power quality disturbances have been obtained at strategic points for analysis. The wavelet transform is then applied for feature extraction; its outputs are the wavelet coefficients representing the power quality disturbance signal. Wavelet coefficients at different levels reveal time-localized information about the variation of the signal.
Keywords: Power quality, detection of disturbance, wavelet transform, multiresolution signal decomposition.
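The time-localizing property described above is easy to demonstrate: detail coefficients of a one-level discrete wavelet transform spike at abrupt waveform changes. The sketch below uses a Haar DWT and a synthetic 50 Hz waveform with a 50% sag (the sampling rate, sag depth, and sample indices are illustrative assumptions, not values from the paper).

```python
import numpy as np

fs = 3200                                # samples per second (assumed)
t = np.arange(0, 0.2, 1 / fs)            # 640 samples
v = np.sin(2 * np.pi * 50 * t)           # ideal 50 Hz supply voltage
v[265:441] *= 0.5                        # 50 % sag between samples 265 and 440

# One-level Haar DWT: each detail coefficient covers one pair of samples.
pairs = v.reshape(-1, 2)
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)

# The two largest |detail| coefficients localize the sag boundaries.
spikes = np.sort(np.argsort(np.abs(detail))[-2:]) * 2
print(spikes)  # sample indices 264 and 440
```

During the steady sinusoid the detail coefficients stay small, so simple thresholding of |detail| both detects the disturbance and marks when it starts and ends.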
677 Application of Liquid Emulsion Membrane Technique for the Removal of Cadmium(II) from Aqueous Solutions Using Aliquat 336 as a Carrier
Authors: B. Medjahed, M. A. Didi, B. Guezzen
Abstract:
In the present work, the emulsion liquid membrane (ELM) technique was applied to the extraction of cadmium(II) present in aqueous samples, with Aliquat 336 (tri-n-octylmethylammonium chloride) used as the carrier. The main objectives were to investigate the influence of the various parameters affecting ELM formation and stability, and to test the performance of the prepared ELM for cadmium removal using synthetic solutions of different concentrations. Experiments conducted to optimize the pH of the feed solution showed that cadmium(II) can be extracted at pH 6.5. The influence of the carrier concentration and of the treat ratio on the extraction process was also investigated; the optimal values were found to be 3% Aliquat 336 and a feed-to-emulsion ratio of 1:1.
Keywords: Cadmium, carrier, emulsion liquid membrane, surfactant.
676 Objects Extraction by Cooperating Optical Flow, Edge Detection and Region Growing Procedures
Abstract:
The image segmentation method described in this paper has been developed as a pre-processing stage for methodologies and tools for content-based video/image indexing and retrieval. It solves the problem of extracting whole objects from the background, producing images of single complete objects from videos or photos; the extracted images are used to compute the object visual features needed by both the indexing and retrieval processes. The segmentation algorithm is based on the cooperation among an optical flow evaluation method, edge detection, and region growing procedures. The optical flow estimator belongs to the class of differential methods. It can detect motions ranging from a fraction of a pixel to a few pixels per frame, achieves good results in the presence of noise without requiring a filtering pre-processing stage, and includes a specialized model for moving object detection. The first task of the presented method exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved. The method has been applied to a variety of indoor and outdoor scenes in which objects of different types and shapes appear against variously textured backgrounds.
Keywords: Image Segmentation, Motion Detection, Object Extraction, Optical Flow.
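Of the cooperating procedures above, the seeded region growing step is the simplest to sketch in isolation. The following numpy code grows an object mask from a seed pixel by absorbing 4-connected neighbours of similar intensity; the toy image, seed location, and tolerance are illustrative assumptions, not the paper's parameters, and the optical flow and edge detection stages are omitted.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10):
    """Grow a region from `seed`, adding 4-neighbours whose intensity
    stays within `tol` of the seed pixel's value."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = int(img[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and abs(int(img[ny, nx]) - seed_val) <= tol:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# A bright 3x3 "object" on a dark background.
img = np.zeros((6, 6), dtype=np.uint8)
img[1:4, 1:4] = 200
obj = region_grow(img, seed=(2, 2), tol=30)
print(obj.sum())  # 9 pixels recovered
```

In the full method, seeds would come from the motion-analysis stage and the grown regions would be refined against the detected edges.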
675 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment
Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto
Abstract:
Forest inventories are essential for assessing the composition, structure, and distribution of forest vegetation, which serve as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming, and sometimes even dangerous; the use of Light Detection and Ranging (LiDAR) can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI, and .bat scripts on the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM), and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used to extract forest cover classes with ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above-Ground Biomass (AGB), and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. In the extracted canopy heights, 80% of the trees range from 12 m to 17 m in height. The CS of the three forest covers based on AGB were 20819.59 kg per 20 x 20 m plot for closed broadleaf, 8609.82 kg for broadleaf plantation, and 15545.57 kg for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percentage of forest cover and high CS.
Keywords: Carbon stock, forest inventory, LiDAR, tree count.
674 Wavelet-Based ECG Signal Analysis and Classification
Authors: Madina Hamiane, May Hashim Ali
Abstract:
This paper presents the processing and analysis of ECG signals. The study is based on the wavelet transform and uses the MATLAB environment exclusively. It includes the removal of baseline wander and further de-noising through the wavelet transform, with metrics such as the signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and mean squared error (MSE) used to assess the efficiency of the de-noising techniques. Feature extraction is subsequently performed, whereby signal features such as heart rate and rise and fall levels are extracted, and the QRS complex is detected, which helps in classifying the ECG signal. Classification is the last step in the analysis of the ECG signals, and it is shown that they are successfully classified as normal rhythm or abnormal rhythm. The final results demonstrate the adequacy of the wavelet transform for the analysis of ECG signals.
Keywords: ECG Signal, QRS detection, thresholding, wavelet decomposition, feature extraction.
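Although the study itself uses MATLAB, the three de-noising metrics it relies on (MSE, SNR, PSNR) can be stated compactly in numpy. The sine wave below is only a stand-in for a clean ECG trace, and the noise level is an arbitrary assumption for the demonstration.

```python
import numpy as np

def mse(clean, denoised):
    """Mean squared error between the reference and the processed signal."""
    return np.mean((clean - denoised) ** 2)

def snr_db(clean, denoised):
    """Signal-to-noise ratio of the residual error, in dB."""
    return 10 * np.log10(np.sum(clean ** 2) / np.sum((clean - denoised) ** 2))

def psnr_db(clean, denoised):
    """Peak signal-to-noise ratio relative to the clean signal's peak, in dB."""
    return 10 * np.log10(np.max(clean) ** 2 / mse(clean, denoised))

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
ecg_like = np.sin(2 * np.pi * 5 * t)               # stand-in for a clean trace
noisy = ecg_like + rng.normal(0, 0.1, t.size)      # additive Gaussian noise
print(f"MSE={mse(ecg_like, noisy):.4f}  SNR={snr_db(ecg_like, noisy):.1f} dB")
```

A successful wavelet de-noising step should raise SNR and PSNR (and lower MSE) when the de-noised signal replaces `noisy` in these formulas.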
673 High Resolution Images: Segmenting, Extracting Information and GIS Integration
Authors: Erick López-Ornelas
Abstract:
As the world changes ever more rapidly, the demand for up-to-date information for resource management, environment monitoring, and planning is increasing exponentially. Integrating remote sensing with GIS technology will significantly improve the ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, where a pre-processing step (the watershed operation) is required. A morphological process is then applied using the opening and closing operations. After this segmentation, we can extract significant cartographic elements such as urban areas, streets, or green areas. The result of this segmentation and extraction is then used to update GIS applications. Some examples are shown using aerial photography.
Keywords: GIS, Remote Sensing, image segmentation, feature extraction.
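The morphological opening and closing operations mentioned above build on binary dilation and erosion. The following numpy sketch implements them for a 3x3 square structuring element on a toy binary mask (the image, element size, and border handling are illustrative assumptions, not the paper's configuration).

```python
import numpy as np

def dilate(img):
    """Binary dilation with a 3x3 square structuring element (OR of neighbours)."""
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy : 1 + dy + img.shape[0],
                     1 + dx : 1 + dx + img.shape[1]]
    return out

def erode(img):
    """Erosion by duality: complement, dilate, complement again."""
    return 1 - dilate(1 - img)

def opening(img):
    """Opening = erosion then dilation: removes specks smaller than the element."""
    return dilate(erode(img))

img = np.zeros((7, 7), dtype=np.uint8)
img[1:5, 1:5] = 1      # a solid 4x4 block survives opening
img[6, 6] = 1          # a lone speck is removed
print(opening(img).sum())  # 16
```

Closing is the dual (dilation then erosion) and fills small holes instead; applied after watershed segmentation, the pair cleans region boundaries before cartographic elements are extracted.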