Search results for: POI extraction method
19942 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
In order to solve memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex task. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
Keywords: mutex task generation, data augmentation, meta-learning, text classification
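The core idea of the mutex-task construction can be sketched in a few lines: keep the inputs of a task but randomize its labels, so the feature-to-label mapping contradicts the original dataset. This is a minimal illustration; the function name and the parity-label toy data are ours, not the paper's:

```python
import random

def make_mutex_task(examples, num_labels, seed=0):
    """Hypothetical sketch: build a 'mutually exclusive' task by assigning
    each example a random label, so the feature-label mapping deliberately
    contradicts the original dataset's distribution."""
    rng = random.Random(seed)
    return [(x, rng.randrange(num_labels)) for x, _ in examples]

# Original (consistent) task: label equals the feature's parity.
data = [(x, x % 2) for x in range(8)]
mutex = make_mutex_task(data, num_labels=2)
# The mutex task keeps the inputs but breaks the deterministic mapping,
# which discourages the meta-learner from memorizing label assignments.
```

Because the inputs are unchanged, the mutex task is cheap to generate, which is why restricting it to "key data" matters only for the overall volume of tasks.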
Procedia PDF Downloads 144
19941 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
Procedia PDF Downloads 114
19940 Contribution to the Study of Some Phytochemicals and Biological Aspects of Artemisia absinthium L
Authors: Sihem Benmimoune, Abdelbaki Lemgharbi, Ahmed Ait Yahia, Abdelkrim Kameli
Abstract:
Our study is based on the chemical and phytochemical characterization of Artemisia absinthium L. and on in vitro tests to demonstrate the biological activities of its essential oil and natural extract. A qualitative and quantitative comparison of the essential oil obtained by two extraction procedures was performed by GC/MS analysis and yield calculation. Hydrodistillation gave a better oil yield and chemical composition than steam distillation. These oils are composed mainly of thujone, followed by chamazulene and ρ-cymene. The antimicrobial activity of wormwood oil was tested in vitro by two methods (agar diffusion and microdilution) on four plant-pathogenic fungi (Aspergillus sp., Botrytis cinerea, Fusarium culmorum and Helminthosporium sp.). The study of the antifungal effect showed that this oil has an inhibitory effect against the microorganisms tested, in particular the strain Botrytis cinerea; this activity depends on the nature of the oil and on the germ itself. The in vitro antioxidant activity was studied with the DPPH method. The activity test shows that the oil and extract of Artemisia absinthium have a very low antioxidant capacity compared to the antioxidants used as references. The extract has a potentially higher antiradical power than its oil. The quantitative determination of phenolic compounds by the Folin-Ciocalteu method revealed that absinthe is low in total polyphenols and tannins.
Keywords: artemisia absinthium, biological activities, essential oil, extraction processes
Procedia PDF Downloads 342
19939 An Investigation of the Use of Visible Spectrophotometric Analysis of Lead in an Herbal Tea Supplement
Authors: Salve Alessandria Alcantara, John Armand E. Aquino, Ma. Veronica Aranda, Nikki Francine Balde, Angeli Therese F. Cruz, Elise Danielle Garcia, Antonie Kyna Lim, Divina Gracia Lucero, Nikolai Thadeus Mappatao, Maylan N. Ocat, Jamille Dyanne L. Pajarillo, Jane Mierial A. Pesigan, Grace Kristin Viva, Jasmine Arielle C. Yap, Kathleen Michelle T. Yu, Joanna J. Orejola, Joanna V. Toralba
Abstract:
Lead is a neurotoxic metallic element that slowly accumulates in bones and tissues, especially if present in products taken on a regular basis such as herbal tea supplements. Although sensitive analytical instruments are already available, the USP limit test for lead is still widely used. However, because of its serious shortcomings, Lang Lang and his colleagues developed a spectrophotometric method for the determination of lead in all types of samples, and this method was the one adapted in this study. The procedure performed was divided into three parts: digestion, extraction and analysis. For digestion, HNO3 and CH3COOH were used. Afterwards, masking agents and 0.003% and 0.001% dithizone in CHCl3 were added and used for the extraction. For the analysis, the standard addition method and colorimetry were performed. This was done in triplicate under two conditions. The first condition, using 25 µg/mL of standard, resulted in very low absorbances with an r2 of 0.551. This led to the use of a higher concentration, 1 mg/mL, for condition 2. Precipitation of lead cyanide was observed, and the absorbance readings were relatively higher but between 0.15-0.25, resulting in a very low r2 of 0.429. LOQ and LOD were not computed due to the limitations of the Milton-Roy spectrophotometer. The method performed has a shorter digestion time and uses fewer but more accessible reagents. However, the optimum ratio of the dithizone-lead complex must be observed in order to obtain reliable results while exploring other concentrations of standards.
Keywords: herbal tea supplement, lead-dithizone complex, standard addition, visible spectroscopy
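The standard addition method used here extrapolates the unknown concentration from a least-squares line fitted to absorbance versus added standard: with A = m·c_added + b, the unknown is b/m. A minimal sketch with illustrative numbers, not the study's measurements:

```python
def standard_addition(added, absorbance):
    """Fit A = m*c + b by least squares; the unknown concentration is b/m
    (the magnitude of the x-intercept). Values here are illustrative only."""
    n = len(added)
    mean_x = sum(added) / n
    mean_y = sum(absorbance) / n
    sxx = sum((x - mean_x) ** 2 for x in added)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(added, absorbance))
    m = sxy / sxx          # slope (sensitivity)
    b = mean_y - m * mean_x  # intercept (signal from the sample alone)
    return b / m

# Spikes of 0-3 concentration units versus measured absorbance:
conc = standard_addition([0.0, 1.0, 2.0, 3.0], [0.10, 0.20, 0.30, 0.40])
```

A low r2 such as the 0.429 reported above means this extrapolation is unreliable, which is why the authors stress the dithizone-lead ratio.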
Procedia PDF Downloads 388
19938 Dissolved Gas Analysis Based Regression Rules from Trained ANN for Transformer Fault Diagnosis
Authors: Deepika Bhalla, Raj Kumar Bansal, Hari Om Gupta
Abstract:
Dissolved Gas Analysis (DGA) has been widely used for fault diagnosis in transformers. Artificial neural networks (ANN) have high accuracy but are regarded as black boxes that are difficult to interpret. For many problems it is desirable to extract knowledge from a trained neural network (NN) so that the user can gain a better understanding of the solution arrived at by the NN. This paper applies a pedagogical approach for rule extraction from function approximating neural networks (REFANN), with application to incipient fault diagnosis using the concentrations of the gases dissolved in the transformer oil as the input to the NN. The input space is split into subregions, and for each subregion there is a linear equation that is used to predict the type of fault developing within the transformer. The experiments on real data indicate that the approach can extract simple and useful rules and give fault predictions that match the actual fault and are at times better than those predicted by the IEC method.
Keywords: artificial neural networks, dissolved gas analysis, rules extraction, transformer
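The REFANN-style output described here, a predicate per input subregion plus a linear equation, can be sketched as follows. The gas thresholds, coefficients, and fault labels are purely illustrative, not the rules extracted in the paper:

```python
# Hypothetical rule set: each subregion of the gas-concentration space
# carries its own linear equation (all numbers are invented for illustration).
RULES = [
    # (predicate on gas concentrations, (weights, bias), fault label)
    (lambda g: g["C2H2"] > 50,  ([0.8, 0.1], 2.0), "arcing"),
    (lambda g: g["C2H2"] <= 50, ([0.2, 0.5], 1.0), "overheating"),
]

def predict(gases):
    """Find the subregion the sample falls into, then apply that
    subregion's linear equation to score the predicted fault."""
    for cond, (w, b), label in RULES:
        if cond(gases):
            score = w[0] * gases["C2H2"] + w[1] * gases["CH4"] + b
            return label, score
    return None

fault, score = predict({"C2H2": 80.0, "CH4": 10.0})
```

Each rule is human-readable on its own, which is the point of extracting them from the opaque trained network.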
Procedia PDF Downloads 537
19937 Magnetic Nano-Composite of Self-Doped Polyaniline Nanofibers for Magnetic Dispersive Micro Solid Phase Extraction Applications
Authors: Hatem I. Mokhtar, Randa A. Abd-El-Salam, Ghada M. Hadad
Abstract:
An improved nano-composite of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles was prepared and evaluated for suitability for magnetic dispersive micro solid-phase extraction. The work focused on optimizing the composite's capacity to extract four fluoroquinolone (FQ) antibiotics, ciprofloxacin, enrofloxacin, danofloxacin, and difloxacin, from water, and on improving the composite's stability against acid and atmospheric degradation. Self-doped polyaniline nanofibers were prepared by oxidative co-polymerization of aniline with anthranilic acid. Magnetite nanoparticles were prepared by alkaline co-precipitation and coated with silica by silicate hydrolysis on the magnetite nanoparticle surface at pH 6.5. The composite was formed by self-assembly, by mixing dispersions of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles in ethanol. The composite structure was confirmed by transmission electron microscopy (TEM). The chemical structures of the self-doped polyaniline nanofibers and the magnetite were confirmed by FT-IR, while the silica coating of the magnetite was confirmed by Energy Dispersive X-ray Spectroscopy (EDS). The improved stability of the composite's magnetic component was evidenced by its resistance to degradation in 2N HCl solution. The adsorption capacity of the self-doped polyaniline nanofiber based composite was higher than that of a previously reported composite prepared from plain polyaniline nanofibers. The adsorption-pH profile of the studied FQs on the prepared composite revealed that the best pH for adsorption was in the range of 6.5 to 7. The best extraction recovery values were obtained at pH 7 using phosphate buffer. The best solvent for FQ desorption was found to be 0.1N HCl in a methanol:water (8:2, v/v) mixture. 20 mL of water sample spiked with the studied FQs was preconcentrated using 4.8 mg of composite, and the resulting extracts were analysed by an HPLC-UV method.
The prepared composite thus represents a suitable adsorbent phase for magnetic dispersive micro solid-phase extraction applications.
Keywords: fluoroquinolones, magnetic dispersive micro extraction, nano-composite, self-doped polyaniline nanofibers
Procedia PDF Downloads 122
19936 Comparison Physicochemical Properties of Hexane Extracted Aniseed Oil from Cold Press Extraction Residue and Cold Press Aniseed Oil
Authors: Derya Ören, Şeyma Akalın
Abstract:
Cold pressing is a traditional method of obtaining oil. The cold-pressing procedure involves neither heat nor chemical treatments, so it has a low oil yield, and the cold-pressed herbal material residue still contains some oil. In this study, the oil remaining in the cold-pressed aniseed was extracted with hexane and analysed to determine its physicochemical properties and quality parameters. It was found that the aniseed after the cold press process contains 10% oil. For the other analysis parameters, the free fatty acid (FFA) value is 2.1 mg KOH/g and the peroxide value is 7.6 meq O2/kg. For cold-pressed aniseed oil, the corresponding values are an FFA value of 2.1 mg KOH/g and a peroxide value of 4.5 meq O2/kg. The fatty acid composition was also analysed, and it was found that both oils have the same fatty acid composition. The main fatty acids are oleic, linoleic, and palmitic acids.
Keywords: aniseed oil, cold press, extraction, residue
Procedia PDF Downloads 406
19935 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment
Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto
Abstract:
Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory would improve and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data for extracting high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAS tools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used in the extraction of forest classification covers with ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and Tree Count Extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. On the extracted canopy height, 80% of the trees' heights range from 12 m to 17 m. The CS of the three forest covers, based on the AGB, were: 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percent forest cover and high CS.
Keywords: carbon stock, forest inventory, LiDAR, tree count
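The CHM derivative mentioned above is, conceptually, the cell-wise difference between a surface model and the terrain model. A minimal sketch on toy 2x2 grids; the study itself runs .bat scripts over LAS-derived rasters, not this code:

```python
def canopy_height_model(dsm, dtm):
    """Canopy height = surface height minus terrain height per raster cell;
    negative differences (noise below ground level) are clamped to zero."""
    return [[max(s - t, 0.0) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# Toy elevations in meters (invented for illustration):
chm = canopy_height_model(dsm=[[215.0, 217.5], [214.0, 213.0]],
                          dtm=[[200.0, 201.0], [202.0, 213.5]])
```

Height statistics such as the 12-17 m range quoted above are then read directly off the CHM cells.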
Procedia PDF Downloads 391
19934 Selective Separation of Amino Acids by Reactive Extraction with Di-(2-Ethylhexyl) Phosphoric Acid
Authors: Alexandra C. Blaga, Dan Caşcaval, Alexandra Tucaliuc, Madalina Poştaru, Anca I. Galaction
Abstract:
Amino acids are valuable chemical products used in human foods, in animal feed additives and in the pharmaceutical field. Recently, there has been a noticeable rise in amino acid utilization throughout the world, including their use as raw materials in the production of various industrial chemicals: oil-gelating agents (amino acid-based surfactants) to recover effluent oil in seas and rivers, and poly(amino acids), which are attracting attention for biodegradable plastics manufacture. Amino acids can be obtained by biosynthesis or from protein hydrolysis, but their separation from the obtained mixtures can be challenging. In the last decades there has been continuous interest in developing processes that improve the selectivity and yield of the downstream processing steps. The liquid-liquid extraction of amino acids (dissociated at any pH value of the aqueous solution) is possible only by using the reactive extraction technique, mainly with extractants that are organophosphoric acid derivatives, high-molecular-weight amines or crown ethers. The purpose of this study was to analyse the separation of nine amino acids of acidic character (l-aspartic acid, l-glutamic acid), basic character (l-histidine, l-lysine, l-arginine) and neutral character (l-glycine, l-tryptophan, l-cysteine, l-alanine) by reactive extraction with di-(2-ethylhexyl)phosphoric acid (D2EHPA) dissolved in butyl acetate. The results showed that the separation yield is controlled by the pH value of the aqueous phase: reactive extraction of amino acids with D2EHPA is possible only if the amino acids exist in aqueous solution in their cationic forms (pH of the aqueous phase below the isoelectric point). The studies on individual amino acids indicated the possibility of selectively separating different groups of amino acids with similar acidic properties as a function of the aqueous solution pH value: the maximum yields are reached over a pH domain of 2-3, then strongly decrease as the pH increases.
Thus, for acidic and neutral amino acids, extraction becomes impossible at the isoelectric point (pHi), and for basic amino acids at a pH value lower than pHi, as a result of the dissociation of the carboxylic group. From the results obtained for separation from the mixture of the nine amino acids at different pH values, it can be observed that all amino acids are extracted, with different yields, over a pH domain of 1.5-3. Above this interval, the extract contains only the amino acids with neutral and basic character. At pH 5-6, only the neutral amino acids are extracted, and at pH > 6 extraction becomes impossible. Using this technique, the total separation of the following amino acid groups has been performed: neutral amino acids at pH 5-5.5, basic amino acids and l-cysteine at pH 4-4.5, l-histidine at pH 3-3.5 and acidic amino acids at pH 2-2.5.
Keywords: amino acids, di-(2-ethylhexyl) phosphoric acid, reactive extraction, selective extraction
Procedia PDF Downloads 431
19933 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana
Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta
Abstract:
Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigating environmental pressure and connecting waste streams (including CO2 and heat emissions), a microalgae bioeconomy can supply the food, feed, pharmaceutical and power industries with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents state-of-the-art technology for the microalgae bioeconomy, from the cultivation process to the production of valuable components and by-streams. The microalgae Chlorella sorokiniana were cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, such as a raceway pond (5000 L) and flat panel reactors (8 x 180 L). In order to achieve optimum growth conditions along with a cellular composition suitable for the further extraction of the value-added components, process parameters such as light intensity, temperature and pH are continuously monitored. Metabolic needs for nutrients were met by the addition of micro- and macro-nutrients into the medium to ensure autotrophic growth conditions for the microalgae. Cultivation was followed by downstream processing and the extraction of lipids, proteins and saccharides. Lipid extraction is conducted in a repeated-batch semi-automatic mode using the hot extraction method according to Randall, with hexane and ethanol as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variations between 8.1% and 13.9%.
The highest percentage of extracted biomass was reached with a sample pretreated by microwave digestion using 90% hexane and 10% ethanol as solvents. The protein content of the microalgae was determined by two different methods: Total Kjeldahl Nitrogen (TKN), which was then converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The obtained results showed good correlation between the two methods, with the protein content in the range of 39.8-47.1%. Characterization of neutral and acid saccharides from the microalgae was conducted by the phenol-sulfuric acid method at two wavelengths, 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, the biomass residues are used as substrate for anaerobic digestion at laboratory scale. The methane concentration, measured on a daily basis, showed some variation among samples after the extraction steps but was in the range between 48% and 55%. The CO2 formed during the fermentation process and after combustion in the Combined Heat and Power unit can potentially be used within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.
Keywords: bioeconomy, lipids, microalgae, proteins, saccharides
Procedia PDF Downloads 246
19932 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin
Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna
Abstract:
Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer whose structure contains a large number of functional groups, including aliphatic and aromatic hydroxyl groups, carboxylic groups and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure, which can provide a large surface area and pore volume. It can also exhibit better dispersion, diffusion and mass-transfer behavior in the removal of, e.g., heavy-metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St) and lignin. Microspheres without the addition of lignin were also prepared for comparison. Before the copolymerization, lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were studied, e.g., pore structure (adsorption-desorption measurements), thermal properties (DSC), tendency to swell and the actual shapes. Due to their well-developed porous structure and the presence of functional groups, our materials may have great potential in sorption processes. To estimate the sorption capabilities of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method is going to be applied. This method has various advantages, including low cost and ease of use, and it enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.
Keywords: microspheres, lignin, sorption, solid-phase extraction
Procedia PDF Downloads 183
19931 Gas Chromatography Coupled to Tandem Mass Spectrometry and Liquid Chromatography Coupled to Tandem Mass Spectrometry Qualitative Determination of Pesticides Found in Tea Infusions
Authors: Mihai-Alexandru Florea, Veronica Drumea, Roxana Nita, Cerasela Gird, Laura Olariu
Abstract:
The aim of this study was to investigate the pesticide residues found in tea water infusions. A multi-residue method to determine 147 pesticides was developed using the QuEChERS (Quick, Easy, Cheap, Effective, Rugged, Safe) procedure and dispersive solid-phase extraction (d-SPE) for the cleanup of the pesticides from complex matrices such as plants and tea. Sample preparation was carefully optimized for the efficient removal of coextracted matrix components by testing several solvent systems. Determination of the pesticides was performed using GC-MS/MS (100 pesticides) and LC-MS/MS (47 pesticides). The selected reaction monitoring (SRM) mode was chosen to achieve low detection limits and high compound selectivity and sensitivity. Overall performance was evaluated and validated according to DG-SANTE guidelines. To assess, qualitatively, the pesticide residue transfer rate from dried tea into infusions, the tea samples were spiked with a mixture of pesticides at the maximum residue level accepted for teas and herbal infusions. In order to investigate the release of the pesticides into the tea preparations, the medicinal plants were prepared in four ways by varying the water temperature and the infusion time. The pesticides were extracted from the infusions using two methods: QuEChERS versus solid-phase extraction (SPE). More than 90% of the pesticides studied were identified in the infusion.
Keywords: tea, solid-phase extraction (SPE), selected reaction monitoring (SRM), QuEChERS
Procedia PDF Downloads 214
19930 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer
Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu
Abstract:
Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical application scenarios, the size of the image to be localized is not fixed, and it is impractical to train different networks for all possible sizes. When an image's size does not match the input size of the descriptor extraction model, existing image geolocalization methods usually directly scale or crop the image in some common way. This results in the loss of information important to the geolocalization task, thus affecting the performance of the method. For example, excessive down-sampling can blur building contours, and inappropriate cropping can remove key semantic elements, leading to incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The designed learnable image resizer employs the self-attention mechanism to enhance the geometric features of the resized image. First, it applies bilinear interpolation to the input image and its feature maps to obtain the initial resized image and the resized feature maps. Then, SKNet (selective kernel net) is used to approximate the best receptive field, keeping the geometric shapes as in the original image, and SENet (squeeze-and-excitation net) is used to automatically select the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed image geolocalization method embeds the above image resizer as a front layer of the descriptor extraction network. It not only enables the network to be compatible with arbitrary-sized input images but also enhances the geometric features that are crucial to the image geolocalization task.
Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to optimize the utilization of the geometric elements extracted by that layer. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo24/7, and Places365. The results show that the proposed method has excellent size compatibility and compares favorably to recent mainstream geolocalization methods.
Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature
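The first step of the resizer, plain bilinear interpolation before the attention-based enhancement, can be sketched as follows on a grayscale toy input. This is a sketch of the standard operation, not the paper's implementation:

```python
def bilinear_resize(img, out_h, out_w):
    """Resize a grayscale image (list of lists of floats) to out_h x out_w
    by bilinear interpolation between the four nearest source pixels."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        y = i * (in_h - 1) / max(out_h - 1, 1)
        y0 = int(y); y1 = min(y0 + 1, in_h - 1); fy = y - y0
        row = []
        for j in range(out_w):
            x = j * (in_w - 1) / max(out_w - 1, 1)
            x0 = int(x); x1 = min(x0 + 1, in_w - 1); fx = x - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# Upsample a 2x2 checkerboard to 3x3; corners are preserved, the center blends.
small = bilinear_resize([[0.0, 1.0], [1.0, 0.0]], 3, 3)
```

The paper's contribution is what happens after this step: the SKNet/SENet enhancement that restores the geometric detail plain interpolation smooths away.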
Procedia PDF Downloads 216
19929 On-Line Super Critical Fluid Extraction, Supercritical Fluid Chromatography, Mass Spectrometry, a Technique in Pharmaceutical Analysis
Authors: Narayana Murthy Akurathi, Vijaya Lakshmi Marella
Abstract:
The literature is reviewed with regard to on-line supercritical fluid extraction (SFE) coupled directly to supercritical fluid chromatography (SFC) and mass spectrometry, which is typically more sensitive than conventional LC-MS/MS and GC-MS/MS. It is becoming increasingly interesting to use on-line techniques that combine sample preparation, separation and detection in one analytical setup. This requires less human intervention, uses small amounts of sample and organic solvent, and yields enhanced analyte enrichment in a shorter time. The sample extraction is performed under light shielding and anaerobic conditions, preventing the degradation of thermolabile analytes. The technique may be able to analyze compounds over a wide polarity range, as SFC generally uses carbon dioxide, which is collected as a by-product of other chemical reactions or from the atmosphere and so contributes no new chemicals to the environment. The diffusion of solutes in supercritical fluids is about ten times greater than in liquids and about three times less than in gases, which decreases the resistance to mass transfer in the column and allows fast, high-resolution separations. The drawback of SFC when using carbon dioxide as the mobile phase is that the direct introduction of water samples poses a series of problems; water must therefore be eliminated before it reaches the analytical column. Hundreds of compounds can be analysed simultaneously by simply enclosing the sample in an extraction vessel. This is mainly applicable in the pharmaceutical industry, where the technique can analyse fatty acids and phospholipids that have many analogues with very similar UV spectra, trace additives in polymers, and hundreds of pesticides with good resolution; cleaning validation can also be conducted by putting a swab sample in an extraction vessel.
Keywords: super critical fluid extraction (SFE), super critical fluid chromatography (SFC), LC-MS/MS, GC-MS/MS
Procedia PDF Downloads 391
19928 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles
Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang
Abstract:
With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many respects. Automatic lane line extraction and modeling are the most essential steps in the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, a multi-region Otsu thresholding method is applied, which calculates the intensity value of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results on the datasets show that our method achieves excellent extraction and clustering results, and the fitted lines reach a high positional accuracy with an error of less than 10 cm.
Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering
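The Otsu step applied per region chooses the intensity cut that maximizes the between-class variance. A single-region sketch on synthetic intensities (real input would be rasterized laser reflectance values, and the method runs this per region):

```python
def otsu_threshold(values, bins=256):
    """Return the threshold maximizing between-class variance for integer
    intensities in 0..bins-1: dark background vs bright road markings."""
    hist = [0] * bins
    for v in values:
        hist[v] += 1
    total = len(values)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(bins):
        w_b += hist[t]                 # background weight (values <= t)
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b              # background mean
        m_f = (total_sum - sum_b) / w_f  # foreground mean
        var = w_b * w_f * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Dark asphalt (~10-12) vs bright markings (~200-210), invented intensities:
t = otsu_threshold([10] * 50 + [12] * 50 + [200] * 10 + [210] * 10)
```

Splitting the scene into regions before thresholding compensates for reflectance varying with range and incidence angle, which a single global threshold cannot handle.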
Procedia PDF Downloads 128
19927 Literature Review on Text Comparison Techniques: Analysis of Text Extraction, Main Comparison and Visual Representation Tools
Authors: Andriana Mkrtchyan, Vahe Khlghatyan
Abstract:
The choice of a profession is one of the most important decisions people make in their lives. With the development of modern science and technology across all spheres of the modern world, more and more professions are arising, which further complicates the process of choosing. Hence, there is a need for a guiding platform that helps people choose a profession and the right career path based on their interests, skills, and personality. This review analyzes existing methods of comparing PDF-format documents and suggests a 3-stage approach for the comparison: 1. text extraction from PDF-format documents, 2. comparison of the extracted text via NLP algorithms, 3. representation of the comparison using a special shape and color psychology methodology.
Keywords: color psychology, data acquisition/extraction, data augmentation, disambiguation, natural language processing, outlier detection, semantic similarity, text-mining, user evaluation, visual search
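Stage 2 of the suggested pipeline can be sketched with the standard library: given two already-extracted text strings (stage 1, PDF text extraction, is assumed done by an external tool), compute a similarity ratio to feed the visualization stage. A minimal stand-in for the NLP algorithms the review surveys:

```python
import difflib

def compare_texts(a, b):
    """Token-level similarity in [0, 1] between two extracted texts,
    using difflib's ratio (2 * matched_tokens / total_tokens)."""
    tokens_a = a.lower().split()
    tokens_b = b.lower().split()
    return difflib.SequenceMatcher(None, tokens_a, tokens_b).ratio()

score = compare_texts("data scientist analyzes large datasets",
                      "a data scientist analyzes datasets daily")
```

Real profession-matching would replace this surface comparison with semantic similarity (e.g. embeddings), but the three-stage structure is the same.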
Procedia PDF Downloads 79
19926 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data
Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad
Abstract:
Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars, and sensors contain important geographical information that can be used in remote sensing applications such as region planning and disaster management. Spatial data classification and object recognition are important tasks for many applications, but classifying and identifying objects manually from images is difficult. Object recognition is often treated as a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms available, classification is performed here with supervised classifiers such as Support Vector Machines (SVM), since the area of interest is known. We propose a classification method that considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset was created for training and testing purposes; we generated the attributes from pixel intensity values and mean reflectance values. We demonstrate the benefits of knowledge-discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high-spatial-resolution remote sensing imagery.Keywords: remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction
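The feature-extraction idea, pairing each pixel's intensity with statistics of its neighborhood before feeding a supervised classifier such as an SVM, can be sketched as follows (illustrative; the window size and feature choice are assumptions, not the paper's exact design):

```python
import numpy as np

def neighborhood_features(image, size=3):
    """For each pixel, build [intensity, mean of its size x size neighborhood]."""
    pad = size // 2
    padded = np.pad(image.astype(float), pad, mode="edge")  # replicate borders
    h, w = image.shape
    feats = np.empty((h * w, 2))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + size, j:j + size]
            feats[i * w + j] = (image[i, j], window.mean())
    return feats

# Tiny synthetic scene: dark land (10) next to a bright band (200)
img = np.array([[10, 10, 200],
                [10, 10, 200],
                [10, 10, 200]])
X = neighborhood_features(img)
```

The resulting feature matrix `X` (one row per pixel) is what would be handed to the supervised classifier; the neighborhood mean lets boundary pixels carry context from adjacent classes.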
Procedia PDF Downloads 340
19925 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection
Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar
Abstract:
The present manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of Glimepiride (GLM) and Ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode of methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 with formic acid; the total run time was 10 min, detection was at a common wavelength of 225 nm, and the flow rate was 1 mL/min throughout. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma. Metformin (MET) was used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate, and precise. The limits of detection were 1.54 and 4.08, and the limits of quantification 5.15 and 13.62, for GLM and ILA, respectively; the method demonstrated excellent linearity with correlation coefficients (r2) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE
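Detection and quantification limits of the kind reported here are commonly derived from the calibration line as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A hedged sketch with made-up calibration data (not the paper's measurements):

```python
import numpy as np

# Hypothetical calibration points over the validated range (ng/mL vs. peak area)
conc = np.array([10, 50, 100, 200, 400, 600], dtype=float)
area = np.array([52, 251, 498, 1003, 1999, 3004], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual SD, 2 fitted parameters
lod = 3.3 * sigma / slope            # limit of detection
loq = 10.0 * sigma / slope           # limit of quantification
r2 = np.corrcoef(conc, area)[0, 1] ** 2
```

By construction LOQ/LOD = 10/3.3 ≈ 3.03, which is roughly the ratio visible in the reported GLM and ILA limits.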
Procedia PDF Downloads 369
19924 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge
Authors: K. Eryilmaz, G. Mercanoglu
Abstract:
Aim: Purification of the final product, the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. The two most commonly used are C18 solid-phase extraction (SPE) and elution from a weak cation exchanger cartridge. The C18 SPE method yields a high-purity final product, while elution from the weak cation exchanger cartridge is pH-dependent and ineffective at removing colloidal impurities. The aim of this work is to develop an additional purification method for lanthanide-labeled peptide compounds for cases in which the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of a pH problem or colloidal impurity. Material and Methods: For colloidal impurity formation, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using an EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped on the cartridge was eluted with 2 mL of 50% ethanol and collected in the final-product vial through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analyzed by HPLC (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: UV and radioactivity detector results of the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method.
Conclusion: The improved purification method can be used as an additional step to remove impurities that may result from lanthanide-peptide syntheses in which weak cation exchange purification is used as the last step. Purification of the final product and GMP compliance (final aseptic filtration, sterile disposable system components) are its two major advantages.Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis
Procedia PDF Downloads 164
19923 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field
Authors: Zerroug Abdelhamid, Danielle Chassoux
Abstract:
Various models are presented to simulate homogeneous or heterogeneous cancerous tumors and to extract the boundary in each case. The fractal dimension is then estimated by the least-squares method and compared with some previous methods.Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering
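Estimating a fractal dimension by least squares typically means counting the boxes N(s) occupied by the extracted boundary at several scales s and fitting log N(s) against log(1/s). A minimal box-counting sketch (illustrative only, not the authors' code):

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Fractal dimension estimate: least-squares slope of log N(s) vs log(1/s)."""
    counts = []
    for s in sizes:
        # Each point falls into one grid box of side s; count distinct boxes
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    logs = np.log(1.0 / np.asarray(sizes))
    logn = np.log(counts)
    slope, _ = np.polyfit(logs, logn, 1)
    return slope

# Sanity check: a smooth (non-fractal) boundary should give dimension near 1
t = np.linspace(0, 1, 20000)
line = np.column_stack([t, 0.5 * t])
d = box_counting_dimension(line, sizes=[0.1, 0.05, 0.025, 0.0125])
```

For a real tumor boundary the same fit would be applied to the extracted contour points, with a rougher contour yielding a slope between 1 and 2.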
Procedia PDF Downloads 365
19922 Optimization of Titanium Leaching Process Using Experimental Design
Authors: Arash Rafiei, Carroll Moore
Abstract:
Leaching, the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics, and fluid dynamics. Optimizing a leaching system by purely first-principles methods therefore requires considerable time and expense. In this work, a mixture of two titanium ores and one titanium slag is used to extract titanium in the leaching stage of the TiO2 pigment production process. Optimum titanium extraction can be pursued with the following strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process-optimization framework. In the first strategy, the most important stage of the production process is treated as the main stage, and the remaining stages are adapted to it. The second strategy optimizes the performance of more than one stage at once. The second strategy is technically more complex than the first, but it brings more economic and technical advantages to the leaching system. Obviously, each strategy has its own optimum operating zone, which differs from the other's, and the best zone is chosen according to the complexity and the economic and practical aspects of the leaching system. Experimental design was carried out using the Taguchi method. The most important advantages of this methodology are that it involves the different technical aspects of the leaching process; minimizes the number of experiments needed, as well as time and expense; and accounts for parameter interactions following the principles of multifactor-at-a-time optimization. Leaching tests were performed at laboratory batch scale with appropriate temperature control. The leaching tank geometry was treated as an important factor in providing comparable agitation conditions.
Data analysis was performed using reactor-design and mass-balance principles. Finally, the optimum zones of the operating parameters are determined for each leaching strategy and discussed with respect to their economic and practical aspects.Keywords: titanium leaching, optimization, experimental design, performance analysis
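In Taguchi designs, each orthogonal-array run is scored by a signal-to-noise ratio; for an extraction yield the "larger is better" form applies: S/N = −10·log10(mean(1/y²)). A hedged sketch with hypothetical yields (not the paper's data):

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio for a 'larger is better' response, e.g. Ti extraction %."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicated yields (%) from four runs of a small orthogonal array
runs = {
    "run1": [78.0, 80.0],
    "run2": [85.0, 86.0],
    "run3": [70.0, 69.0],
    "run4": [90.0, 91.0],
}
sn = {name: sn_larger_is_better(y) for name, y in runs.items()}
best = max(sn, key=sn.get)
```

In a full analysis the per-run S/N values would be averaged per factor level to pick the optimum setting of each parameter.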
Procedia PDF Downloads 375
19921 Extraction and Analysis of Anthocyanins Contents from Different Stage Flowers of the Orchids Dendrobium Hybrid cv. Ear-Sakul
Authors: Orose Rugchati, Khumthong Mahawongwiriya
Abstract:
Dendrobium hybrid cv. Ear-Sakul has become one of the important commercial commodities of Thailand's agricultural industry worldwide, either as potted plants or as cut flowers, owing to the attractive color of its flower petals. Anthocyanins are the main flower pigments and are responsible for the naturally attractive petal colors. These pigments play an important role in attracting animal pollinators and in the classification and grading of these orchids. Flowers of Dendrobium hybrid cv. Ear-Sakul at different stages (F1, F2-F5, and F6) were collected from a local farm. Anthocyanin pigments were extracted from the fresh flowers by solvent extraction (MeOH–TFA, 99.5:0.5 v/v, at 4°C) and purified with ethyl acetate. The main anthocyanin components are cyanidin, pelargonidin, and delphinidin. Pure anthocyanin contents were analyzed by UV-Visible spectroscopy at λmax 535, 520, and 546 nm, respectively, and expressed as monomeric anthocyanin pigment (mg/L); the contents of all samples were compared with the standard pigments cyanidin, pelargonidin, and delphinidin. The results of this simple extraction and analysis show that the monomeric anthocyanin pigment contents at the different flower stages (F1, F2-F5, and F6) were: cyanidin-3-glucoside (mg/L) 0.85±0.08, 24.22±0.12, and 62.12±0.6; pelargonidin-3,5-diglucoside (mg/L) 10.37±0.12, 31.06±0.8, and 81.58±0.5; delphinidin (mg/L) 6.34±0.17, 18.98±0.56, and 49.87±0.7; with the appearance of the extracted pure anthocyanins in L(a, b) of 2.71(1.38, -0.48), 1.06(0.39, -0.66), and 2.64(2.71, -3.61), respectively. Dendrobium hybrid cv. Ear-Sakul could thus serve as a source of anthocyanins via simple solvent extraction, with flower stage as a guideline for predicting the amounts of the main anthocyanin components (cyanidin, pelargonidin, and delphinidin), which could be developed in quantity and quality for the food, pharmaceutical, and cosmetic industries.Keywords: analysis, anthocyanins contents, different stage flowers, Dendrobium Hybrid cv. Ear-Sakul
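Monomeric anthocyanin pigment in mg/L is conventionally computed from absorbance as A·MW·DF·1000/(ε·l), expressed as cyanidin-3-glucoside equivalents. A sketch with the standard constants for cyanidin-3-glucoside; the absorbance and dilution factor below are hypothetical values for illustration, not measurements from the paper:

```python
def monomeric_anthocyanin(absorbance, dilution_factor,
                          mw=449.2,      # cyanidin-3-glucoside, g/mol
                          eps=26900.0,   # molar absorptivity, L/(mol*cm)
                          path_cm=1.0):  # cuvette path length
    """Monomeric anthocyanin pigment (mg/L), as cyanidin-3-glucoside equivalents."""
    return absorbance * mw * dilution_factor * 1000.0 / (eps * path_cm)

# Hypothetical reading: a diluted F6-stage extract
map_f6 = monomeric_anthocyanin(absorbance=0.372, dilution_factor=10)
```

Swapping in the molecular weight and absorptivity of pelargonidin or delphinidin derivatives expresses the result in those equivalents instead.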
Procedia PDF Downloads 151
19920 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm
Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang
Abstract:
To address the difficulty of detecting target signals in a strong ground-clutter environment, this paper proposes a clutter suppression algorithm that combines singular value decomposition (SVD) with the Mallat fast wavelet algorithm. The method first performs singular value decomposition on the radar echo data matrix and achieves an initial separation of target and clutter by thresholding the singular values. It then applies wavelet decomposition to the echo data to locate the target and uses a discard strategy to select an appropriate decomposition level for reconstructing the target signal, which suppresses the clutter while minimizing the loss of target information. Verification on measured data shows that the method extracts targets effectively at low SCR, that the target can be reconstructed without prior knowledge of its position, and that the output SCR is improved over the traditional single-wavelet processing method.Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR
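The SVD stage can be illustrated on synthetic data: strong, spatially correlated ground clutter concentrates in the largest singular values, so zeroing them leaves the weak target. A minimal sketch (illustrative matrix sizes and amplitudes, not the authors' radar data):

```python
import numpy as np

def svd_clutter_suppress(echo, n_clutter=1):
    """Remove the n_clutter dominant singular components (strong ground clutter)."""
    U, s, Vt = np.linalg.svd(echo, full_matrices=False)
    s_kept = s.copy()
    s_kept[:n_clutter] = 0.0           # threshold out the largest singular values
    return U @ np.diag(s_kept) @ Vt

rng = np.random.default_rng(1)
n = 64
# Rank-1 "ground clutter" pattern, far stronger than the point target
clutter = 50.0 * np.outer(np.ones(n), np.sin(np.linspace(0, 4 * np.pi, n)))
target = np.zeros((n, n)); target[30, 40] = 5.0
echo = clutter + target + 0.01 * rng.standard_normal((n, n))
clean = svd_clutter_suppress(echo)
```

In the paper's scheme the wavelet stage then operates on this residual to localize and reconstruct the target.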
Procedia PDF Downloads 120
19919 Incremental Learning of Independent Topic Analysis
Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda
Abstract:
In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of documents. The volume of document data has been increasing since the spread of the Internet, and ITA was proposed as one method for analyzing it. ITA extracts independent topics from document data using Independent Component Analysis (ICA), a technique from signal processing. However, ITA is difficult to apply to a growing document collection because it must process all of the documents at once, so its time and space costs are very high. We therefore present Incremental ITA, which extracts independent topics from a growing collection of documents by updating the topics whenever new documents are added, starting from the topics extracted from the immediately preceding data. We also show the results of applying Incremental ITA to benchmark datasets.Keywords: text mining, topic extraction, independent, incremental, independent component analysis
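ITA's core step, recovering statistically independent components with ICA, can be sketched with a minimal deflation-based FastICA in NumPy (a generic illustration of ICA on a toy mixture, not the authors' incremental algorithm):

```python
import numpy as np

def fastica(X, n_components, n_iter=200, seed=0):
    """Minimal deflation FastICA (tanh nonlinearity). X: features x samples."""
    X = X - X.mean(axis=1, keepdims=True)           # center
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ X   # whiten
    rng = np.random.default_rng(seed)
    W = np.zeros((n_components, Z.shape[0]))
    for i in range(n_components):
        w = rng.standard_normal(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z.T @ w)
            w_new = (Z * g).mean(axis=1) - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)      # deflate against found rows
            w = w_new / np.linalg.norm(w_new)
        W[i] = w
    return W @ Z                                    # estimated independent sources

# Two independent "topics" mixed into two observed signals
t = np.linspace(0, 8 * np.pi, 2000)
S = np.vstack([np.sign(np.sin(t)), np.sin(2.7 * t)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
est = fastica(A @ S, 2)
```

In ITA the mixture matrix is a term-document representation rather than signals, but the separation machinery is the same; the incremental variant warm-starts this iteration from the previous topics instead of random vectors.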
Procedia PDF Downloads 309
19918 Deasphalting of Crude Oil by Extraction Method
Authors: A. N. Kurbanova, G. K. Sugurbekova, N. K. Akhmetov
Abstract:
Asphaltenes are the heaviest fraction of crude oil. In the oilfield, asphaltenes are known for their ability to plug wells, surface equipment, and the pores of geologic formations. The present research is devoted to the deasphalting of crude oil as an initial stage of refining. Solvent deasphalting was conducted by extraction with organic solvents (cyclohexane, carbon tetrachloride, chloroform). Metal content was analyzed by ICP-MS, and the spectral features of deasphalting were characterized by FTIR. A high content of asphaltenes in crude oil reduces the efficiency of refining processes. Moreover, the high content of heteroatoms (e.g., S, N) in asphaltenes causes further problems: environmental pollution, corrosion, and poisoning of the catalyst. The main objective of this work is to study the effect of the deasphalting process on crude oil, improving its properties and the efficiency of downstream refining processes. Solvent-extraction experiments were carried out with organic solvents on crude oil from JSC “Pavlodar Oil Chemistry Refinery”. Experimental results show that the deasphalting process also decreases the Ni and V content of the oil. One approach to removing metals, hydrogen sulfide, and mercaptans from oils is absorption with chemical reagents directly in the oil residue, since asphaltic and resinous substances degrade the operational properties of oils and reduce the effectiveness of selective refining. Deasphalting of crude oil is necessary to separate the light fraction from the heavy, metal-bearing asphaltene part of the crude. For this reason, the oil is pretreated by deasphalting, because asphaltenes tend to form coke or consume large quantities of hydrogen. Removing asphaltenes leads to partial demetallization, i.e., removal of V/Ni and of organic compounds with heteroatoms. Intramolecular complexes are relatively well studied using the example of the porphyrin complexes of vanadium (VO2+) and nickel (Ni).
ICP-MS studies of V/Ni determined the effect of the different deasphalting solvents on metal extraction at the deasphalting stage and identified the best organic solvent. Cyclohexane (C6H12) proved the best, removing 51.2% of V and 66.4% of Ni according to ICP-MS. This paper also presents a study of the physical and chemical properties and FTIR spectral characteristics of the oil, with a view to establishing its hydrocarbon composition. The information obtained by IR spectroscopy on the whole oil gives provisional physical and chemical characteristics, which can be useful in considering questions of origin and the geochemical conditions of oil accumulation, as well as some technological challenges. The systematic analysis carried out in this study improves our understanding of the asphaltene stability mechanism. The role of deasphalted crude-oil fractions in asphaltene stability is described.Keywords: asphaltenes, deasphalting, extraction, vanadium, nickel, metalloporphyrins, ICP-MS, IR spectroscopy
Procedia PDF Downloads 242
19917 Optimizing Sustainable Graphene Production: Extraction of Graphite from Spent Primary and Secondary Batteries for Advanced Material Synthesis
Authors: Pratima Kumari, Sukha Ranjan Samadder
Abstract:
This research aims to contribute to the sustainable production of graphene by exploring the extraction of graphite from spent primary and secondary batteries. The increasing demand for graphene, a versatile and high-performance material, necessitates environmentally friendly methods for its synthesis. The process follows a well-planned methodology, beginning with the gathering and categorization of batteries, followed by disassembly and careful removal of graphite from the anode structures. The use of environmentally friendly solvents and mechanical techniques ensures efficient, eco-friendly extraction of the graphite. Advanced approaches such as the modified Hummers' method and a chemical reduction process are utilized for the synthesis of graphene materials, with a focus on optimizing parameters. Various analytical techniques, such as Fourier-transform infrared spectroscopy, X-ray diffraction, scanning electron microscopy, thermogravimetric analysis, and Raman spectroscopy, were employed to validate the quality and structure of the produced graphene. The major findings reveal the successful implementation of the methodology, leading to high-quality graphene suitable for advanced material applications. Thorough characterization with these techniques validates the structural integrity and purity of the graphene. The economic viability of the process is demonstrated through a comprehensive economic analysis, highlighting its potential for large-scale production. This research contributes to the field of sustainable graphene production by offering a systematic methodology that efficiently transforms spent batteries into valuable graphene resources. 
Furthermore, the findings not only showcase the potential for upcycling electronic waste but also address the pressing need for environmentally conscious processes in advanced material synthesis.Keywords: spent primary batteries, spent secondary batteries, graphite extraction, advanced material synthesis, circular economy approach
Procedia PDF Downloads 54
19916 SFE as a Superior Technique for Extraction of Eugenol-Rich Fraction from Cinnamomum tamala Nees (Bay Leaf) - Process Analysis and Phytochemical Characterization
Authors: Sudip Ghosh, Dipanwita Roy, Dipan Chatterjee, Paramita Bhattacharjee, Satadal Das
Abstract:
The highest yield of eugenol-rich fractions from Cinnamomum tamala (bay leaf) leaves was obtained by supercritical carbon dioxide (SC-CO2) extraction, compared with hydro-distillation, organic-solvent, liquid-CO2, and subcritical-CO2 extractions. The SC-CO2 extraction parameters were optimized to obtain an extract with maximum eugenol content. This was achieved using a sample size of 10 g at 55°C and 512 bar after 60 min, at a flow rate of 25.0 cm3/s of gaseous CO2. This extract has the best combination of phytochemical properties, such as phenolic content (1.77 mg gallic acid/g dry bay leaf), reducing power (0.80 mg BHT/g dry bay leaf), antioxidant activity (IC50 of 0.20 mg/ml), and anti-inflammatory potency (IC50 of 1.89 mg/ml). The compounds in this extract were identified by GC-MS analysis, and its antimicrobial potency was also evaluated. The MIC values against E. coli, P. aeruginosa and S. aureus were 0.5, 0.25 and 0.5 mg/ml, respectively.Keywords: antimicrobial potency, Cinnamomum tamala, eugenol, supercritical carbon dioxide extraction
Procedia PDF Downloads 342
19915 Secure Image Retrieval Based on Orthogonal Decomposition under Cloud Environment
Authors: Y. Xu, L. Xiong, Z. Xu
Abstract:
To protect data privacy, images with sensitive or private information need to be encrypted before being outsourced to the cloud. However, this causes difficulties in image retrieval and data management. A secure image retrieval method based on orthogonal decomposition is proposed in this paper. The image is divided into two different components, on which encryption and feature extraction are executed separately. As a result, the cloud server can extract features from an encrypted image directly and compare them with the features of the queried images, so that the user can retrieve the image. Unlike other methods, the proposed method places no special requirements on the encryption algorithm. Experimental results show that the proposed method achieves better security and better retrieval precision.Keywords: secure image retrieval, secure search, orthogonal decomposition, secure cloud computing
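The underlying linear-algebra idea, decomposing data into two orthogonal components so that one can be encrypted while the other still supports feature comparison, can be sketched as follows (a generic projection example, not the paper's actual scheme):

```python
import numpy as np

def orthogonal_split(x, B):
    """Split vector x into its component in span(B) and the orthogonal remainder."""
    Q, _ = np.linalg.qr(B)       # orthonormal basis for the chosen subspace
    part = Q @ (Q.T @ x)         # projection onto the subspace
    return part, x - part        # the two components are mutually orthogonal

rng = np.random.default_rng(2)
x = rng.standard_normal(8)       # stand-in for flattened image data
B = rng.standard_normal((8, 3))  # stand-in for the feature subspace
p, r = orthogonal_split(x, B)
```

Because the two parts are orthogonal, operations on one component (e.g. encryption of `r`) do not disturb comparisons carried out on the other (`p`), which is the property the retrieval scheme relies on.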
Procedia PDF Downloads 487
19914 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, a process that is usually very long and error-prone. The aim of this paper is to describe an automated method for detecting seizures in EEG signals, using the knowledge discovery in databases process and data-mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm and a software application called Training Builder, developed for the massive extraction of features from EEG signals. This tool covers all the data-preparation steps, from signal processing to data-analysis techniques, including the sliding-window paradigm, dimensionality-reduction algorithms, information theory, and feature-selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
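The sliding-window paradigm mentioned above can be sketched by extracting a few classic per-window EEG features (mean, variance, line length) on synthetic data; the window length, step, and feature set here are illustrative assumptions, not the Training Builder configuration:

```python
import numpy as np

def sliding_window_features(signal, win, step):
    """Per-window features for an EEG channel: mean, variance, line length."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.var(), np.abs(np.diff(w)).sum()])
    return np.array(feats)

rng = np.random.default_rng(3)
calm = rng.standard_normal(512)                          # background EEG (synthetic)
seizure = 5.0 * np.sin(np.linspace(0, 60 * np.pi, 512))  # high-amplitude rhythmic burst
X = sliding_window_features(np.concatenate([calm, seizure]), win=128, step=64)
```

Each row of `X` becomes one training example for the classifier; the variance feature alone already separates the rhythmic high-amplitude windows from background here.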
Procedia PDF Downloads 189
19913 Investigation of Deep Eutectic Solvents for Microwave Assisted Extraction and Headspace Gas Chromatographic Determination of Hexanal in Fat-Rich Food
Authors: Birute Bugelyte, Ingrida Jurkute, Vida Vickackaite
Abstract:
The most complicated step in the determination of volatile compounds in complex matrices is separating the analytes from the matrix. Traditional separation methods (liquid extraction, Soxhlet extraction) require a lot of time and labour; moreover, there is a risk of losing the volatile analytes. In recent years, headspace gas chromatography has been used to determine volatile compounds, to date with traditional extraction solvents. As a rule, such solvents are rather volatile, so a large amount of solvent vapour enters the headspace together with the analyte. Because of that, the sensitivity of the determination is reduced, and a large solvent peak in the chromatogram can overlap with the analyte peaks. Sensitivity is also limited by the fact that the sample cannot be heated above the boiling point of the solvent. In 2018 it was suggested to replace traditional headspace gas chromatographic solvents with non-volatile, eco-friendly, biodegradable, inexpensive, and easy-to-prepare deep eutectic solvents (DESs). Generally, deep eutectic solvents have low vapour pressure, a relatively wide liquid range, and a much lower melting point than any of their individual components. Those features make DESs very attractive as matrix media for headspace gas chromatography. Also, DESs are polar compounds, so they can be applied in microwave assisted extraction. The aim of this work was to investigate the possibility of applying deep eutectic solvents to the microwave assisted extraction and headspace gas chromatographic determination of hexanal in fat-rich food. Hexanal is considered one of the most suitable indicators of the degree of lipid oxidation, as it is the main secondary oxidation product of linoleic acid, one of the principal fatty acids of many edible oils.
Eight hydrophilic and hydrophobic deep eutectic solvents were synthesized, and the influence of temperature and microwaves on their headspace gas chromatographic behaviour was investigated. Using the most suitable DES, the microwave assisted extraction conditions and headspace gas chromatographic conditions were optimized for the determination of hexanal in potato chips. Under the optimized conditions, the quality parameters of the developed technique were determined. The suggested technique was applied to the determination of hexanal in potato chips and other fat-rich foods.Keywords: deep eutectic solvents, headspace gas chromatography, hexanal, microwave assisted extraction
Procedia PDF Downloads 195