Search results for: soxhlet extraction method
19862 Segmentation of Arabic Handwritten Numeral Strings Based on Watershed Approach
Authors: Nidal F. Shilbayeh, Remah W. Al-Khatib, Sameer A. Nooh
Abstract:
Offline Arabic handwriting recognition is considered one of the most challenging topics. Arabic handwritten numeral strings are used to automate systems that deal with numbers, such as postal codes, bank account numbers and numbers on car plates. Segmentation of connected numerals is the main bottleneck in a handwritten numeral recognition system; improving it can in turn increase the speed and efficiency of the recognition system. In this paper, we propose algorithms for automatic segmentation and feature extraction of Arabic handwritten numeral strings based on the watershed approach. The algorithms have been designed and implemented to achieve the main goal of segmenting and extracting strings of numeral digits written by hand, especially in the courtesy amount of bank checks. The segmentation algorithm partitions the string into multiple regions that can be associated with the properties of one or more criteria. The numeral extraction algorithm separates the numeral string into individual digits. Both the segmentation and feature extraction algorithms have been tested successfully and efficiently for all types of numerals.
Keywords: handwritten numerals, segmentation, courtesy amount, feature extraction, numeral recognition
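For readers who want to experiment with the core idea, the sketch below shows a marker-based watershed segmentation of a binarized numeral-string image using scikit-image; the distance-transform markers and the minimum peak distance are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_numeral_string(gray):
    """Split a grayscale numeral-string image into labeled digit regions via watershed."""
    binary = gray < threshold_otsu(gray)           # dark ink on a light background
    distance = ndi.distance_transform_edt(binary)  # distance of each ink pixel to the background
    coords = peak_local_max(distance, labels=binary.astype(int), min_distance=5)  # one marker per digit core
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=binary)  # labeled regions, one per candidate digit
```

Each labeled region can then be cropped and passed to a digit classifier.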
Procedia PDF Downloads 382
19861 Contribution to the Study of Some Phytochemicals and Biological Aspects of Artemisia absinthium L
Authors: Sihem Benmimoune, Abdelbaki Lemgharbi, Ahmed Ait Yahia, Abdelkrim Kameli
Abstract:
Our study is based on the chemical and phytochemical characterization of Artemisia absinthium L. and on in vitro tests to demonstrate the biological activities of its essential oil and natural extract. A qualitative and quantitative comparison of the essential oil obtained by two extraction procedures was performed by GC/MS analysis and yield calculation. Hydrodistillation gave a better oil yield and chemical composition than steam distillation. These oils are composed mainly of thujone, followed by chamazulene and p-cymene. The antimicrobial activity of wormwood oil was tested in vitro by two methods (agar diffusion and microdilution) on four plant-pathogenic fungi (Aspergillus sp., Botrytis cinerea, Fusarium culmorum and Helminthosporium sp.). The study of the antifungal effect showed that this oil has an inhibitory effect against the microorganisms tested, in particular the strain Botrytis cinerea. Moreover, this activity depends on the nature of the oil and on the microorganism itself. The antioxidant activity in vitro was studied with the DPPH method. The activity test shows that the oil and the extract of Artemisia absinthium have a very low antioxidant capacity compared to the antioxidants used as references. The extract has a potentially high antiradical power, unlike its oil. Quantitative determination of phenolic compounds by the Folin-Ciocalteu method revealed that wormwood is low in total polyphenols and tannins.
Keywords: artemisia absinthium, biological activities, essential oil, extraction processes
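The DPPH result reported above is conventionally expressed as a radical-scavenging percentage from absorbance readings (typically around 515-517 nm); the small helper below illustrates that calculation with hypothetical absorbance values, since the abstract does not give the raw data.

```python
def dpph_scavenging_percent(a_control, a_sample):
    """Radical scavenging activity (%) = (A_control - A_sample) / A_control * 100."""
    return (a_control - a_sample) / a_control * 100.0

# hypothetical readings: control absorbance 0.95, sample absorbance 0.80 -> ~15.8 % inhibition
print(dpph_scavenging_percent(0.95, 0.80))
```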
Procedia PDF Downloads 341
19860 Evaluation of Features Extraction Algorithms for a Real-Time Isolated Word Recognition System
Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas
Abstract:
This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on FPGA. The Mel-frequency cepstral, linear frequency cepstral, linear predictive and linear predictive cepstral coefficients were implemented in a hardware/software design. The proposed system was investigated in speaker-dependent mode for 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing speech records at different signal-to-noise ratios. The experiments on clean records show the highest accuracy for Mel-frequency cepstral and linear frequency cepstral coefficients. For records with a 15 dB signal-to-noise ratio, the linear predictive cepstral coefficients give the best result. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For the classification purpose, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies the real-time requirements and is suitable for applications in embedded systems.
Keywords: isolated word recognition, features extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW
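As a software point of reference for the first of these feature sets (not the FPGA implementation described in the paper), MFCCs can be computed in a few lines with librosa; the frame length, hop size and number of coefficients below are illustrative choices.

```python
import librosa

def mfcc_features(wav_path, n_mfcc=13):
    """Frame-level MFCC matrix (n_mfcc x n_frames) for one isolated-word recording."""
    y, sr = librosa.load(wav_path, sr=16000)                 # resample to 16 kHz
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc,
                                n_fft=400, hop_length=160)   # 25 ms windows, 10 ms hop
```

Such feature matrices can then be compared against stored templates with dynamic time warping, as the paper does in hardware.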
Procedia PDF Downloads 496
19859 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
Procedia PDF Downloads 113
19858 Dissolved Gas Analysis Based Regression Rules from Trained ANN for Transformer Fault Diagnosis
Authors: Deepika Bhalla, Raj Kumar Bansal, Hari Om Gupta
Abstract:
Dissolved Gas Analysis (DGA) has been widely used for fault diagnosis in transformers. Artificial neural networks (ANN) have high accuracy but are regarded as black boxes that are difficult to interpret. For many problems, it is desirable to extract knowledge from trained neural networks (NN) so that the user can gain a better understanding of the solution arrived at by the NN. This paper applies a pedagogical approach for rule extraction from function approximating neural networks (REFANN) with application to incipient fault diagnosis, using the concentrations of the dissolved gases within the transformer oil as the input to the NN. The input space is split into subregions, and for each subregion there is a linear equation that is used to predict the type of fault developing within a transformer. The experiments on real data indicate that the approach used can extract simple and useful rules and give fault predictions that match the actual fault and are at times also better than those predicted by the IEC method.
Keywords: artificial neural networks, dissolved gas analysis, rules extraction, transformer
Procedia PDF Downloads 536
19857 Compost Bioremediation of Oil Refinery Sludge by Using Different Manures in a Laboratory Condition
Authors: O. Ubani, H. I. Atagana, M. S. Thantsha
Abstract:
This study was conducted to measure the reduction in polycyclic aromatic hydrocarbon (PAH) content in oil sludge by co-composting the sludge with pig, cow, horse and poultry manures under laboratory conditions. Four kilograms of soil spiked with 800 g of oil sludge was co-composted separately with each manure in a ratio of 2:1 (w/w) spiked soil:manure and with wood-chips in a ratio of 2:1 (w/v) spiked soil:wood-chips. A control was set up in the same way but without manure. Mixtures were incubated for 10 months at room temperature. Compost piles were turned weekly, and the moisture level was maintained between 50% and 70%. Moisture level, pH, temperature, CO2 evolution and oxygen consumption were measured monthly, and the ash content at the end of the experiment. Bacteria capable of utilizing PAHs were isolated, purified and characterized by molecular techniques using polymerase chain reaction-denaturing gradient gel electrophoresis (PCR-DGGE), amplification of the 16S rDNA gene using specific primers (16S-P1 PCR and 16S-P2 PCR), and sequencing of the amplicons. The extent of PAH reduction was measured using an automated Soxhlet extractor with dichloromethane as the extraction solvent, coupled with gas chromatography/mass spectrometry (GC/MS). Temperature did not exceed 27.5 °C in any compost heap, pH ranged from 5.5 to 7.8, and CO2 evolution was highest in poultry manure at 18.78 µg/dwt/day. Microbial growth and activities were enhanced. The bacteria identified were Bacillus, Arthrobacter and Staphylococcus species. PAH measurements showed reductions of between 77% and 99%. The reduction observed in the control experiment may be attributable to its invasion by fungi. Co-composting of spiked soils with animal manures enhanced the reduction in PAHs. Interestingly, all bacteria isolated and identified in this study were present in all treatments, including the control.
Keywords: bioremediation, co-composting, oil refinery sludge, PAHs, bacteria spp, animal manures, molecular techniques
Procedia PDF Downloads 475
19856 Magnetic Nano-Composite of Self-Doped Polyaniline Nanofibers for Magnetic Dispersive Micro Solid Phase Extraction Applications
Authors: Hatem I. Mokhtar, Randa A. Abd-El-Salam, Ghada M. Hadad
Abstract:
An improved nano-composite of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles was prepared and evaluated for suitability for magnetic dispersive micro solid-phase extraction. The work focused on optimizing the capacity of the composite to extract four fluoroquinolone (FQ) antibiotics, ciprofloxacin, enrofloxacin, danofloxacin and difloxacin, from water, and on improving the stability of the composite towards acid and atmospheric degradation. Self-doped polyaniline nanofibers were prepared by oxidative co-polymerization of aniline with anthranilic acid. Magnetite nanoparticles were prepared by alkaline co-precipitation and coated with silica by silicate hydrolysis on the magnetite nanoparticle surface at pH 6.5. The composite was formed by self-assembly, by mixing dispersions of self-doped polyaniline nanofibers and silica-coated magnetite nanoparticles in ethanol. The composite structure was confirmed by transmission electron microscopy (TEM). The chemical structures of the self-doped polyaniline nanofibers and the magnetite were confirmed by FT-IR, while the silica coating of the magnetite was confirmed by Energy Dispersive X-ray Spectroscopy (EDS). The improved stability of the magnetic component of the composite was evidenced by its resistance to degradation in 2 N HCl solution. The adsorption capacity of the self-doped polyaniline nanofiber-based composite was higher than that of a previously reported corresponding composite prepared from polyaniline nanofibers instead of self-doped polyaniline nanofibers. The adsorption-pH profile for the studied FQs on the prepared composite revealed that the best pH for adsorption was in the range of 6.5 to 7. The best extraction recovery values were obtained at pH 7 using phosphate buffer. The best solvent for FQ desorption was found to be 0.1 N HCl in methanol:water (8:2, v/v). 20 mL water samples spiked with the studied FQs were preconcentrated using 4.8 mg of composite, and the resulting extracts were analysed by an HPLC-UV method. The prepared composite represents a suitable adsorbent phase for magnetic dispersive micro solid-phase extraction applications.
Keywords: fluoroquinolones, magnetic dispersive micro extraction, nano-composite, self-doped polyaniline nanofibers
Procedia PDF Downloads 122
19855 Comparison Physicochemical Properties of Hexane Extracted Aniseed Oil from Cold Press Extraction Residue and Cold Press Aniseed Oil
Authors: Derya Ören, Şeyma Akalın
Abstract:
The cold press technique is a traditional method of obtaining oil. The cold-pressing procedure involves neither heat nor chemical treatment, so the technique has a low oil yield and the cold-pressed herbal material residue still contains some oil. In this study, the oil remaining in the cold-pressed aniseed was extracted with hexane and analysed to determine its physicochemical properties and quality parameters. It was found that the aniseed after the cold press process contains 10% oil. For the other analysis parameters, the free fatty acid (FFA) value is 2.1 mg KOH/g and the peroxide value is 7.6 meqO2/kg. For cold-pressed aniseed oil, the FFA value is 2.1 mg KOH/g and the peroxide value is 4.5 meqO2/kg, respectively. The fatty acid compositions were also analysed; both oils were found to have the same fatty acid composition. The main fatty acids are oleic, linoleic and palmitic acids.
Keywords: aniseed oil, cold press, extraction, residue
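The FFA values above are acid values from a standard KOH titration; a minimal sketch of that conversion is given below, with the titration volumes chosen purely for illustration (the abstract does not report them).

```python
def acid_value_mg_koh_per_g(v_koh_ml, n_koh, sample_mass_g):
    """Acid value (mg KOH per g of oil) = 56.1 * titrant volume (mL) * normality / sample mass (g)."""
    return 56.1 * v_koh_ml * n_koh / sample_mass_g

# hypothetical titration: 1.5 mL of 0.1 N KOH for a 4.0 g oil sample -> ~2.1 mg KOH/g
print(acid_value_mg_koh_per_g(1.5, 0.1, 4.0))
```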
Procedia PDF Downloads 406
19854 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment
Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto
Abstract:
Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAS tools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands to be used in the extraction of forest classification covers using ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. Of the extracted canopy heights, 80% of the trees range from 12 m to 17 m in height. The CS of the three forest covers based on the AGB was: 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percentage of forest cover and a high CS.
Keywords: carbon stock, forest inventory, LiDAR, tree count
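The CHM named above is conventionally computed as the difference between a digital surface model and the digital terrain model; the snippet below sketches that step on aligned raster arrays (the array names and the use of NumPy are assumptions, since the study itself works with LAS tools and .bat scripts).

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """Canopy Height Model = Digital Surface Model - Digital Terrain Model, clipped at zero."""
    chm = dsm - dtm
    return np.clip(chm, 0.0, None)   # small negative differences are interpolation noise
```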
Procedia PDF Downloads 389
19853 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana
Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta
Abstract:
Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigating environmental pressure and connecting waste streams (including CO2 and heat emissions), the microalgae bioeconomy can supply food, feed, as well as the pharmaceutical and power industries, with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents the state-of-the-art technology for the microalgae bioeconomy, from the cultivation process to the production of valuable components and by-streams. The microalgae Chlorella sorokiniana were cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, such as a raceway pond (5000 L) and flat panel reactors (8 x 180 L). In order to achieve optimum growth conditions along with a cellular composition suitable for the subsequent extraction of the value-added components, process parameters such as light intensity, temperature and pH are continuously monitored. Metabolic nutrient needs were met by the addition of micro- and macro-nutrients to the medium to ensure autotrophic growth conditions for the microalgae. Cultivation was followed by downstream processing and extraction of lipids, proteins and saccharides. Lipid extraction was conducted in repeated-batch semi-automatic mode using the hot extraction method according to Randall, with hexane and ethanol as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variation between 8.1% and 13.9%. The highest percentage of extracted biomass was reached with a sample pretreated by microwave digestion using 90% hexane and 10% ethanol as solvents. The protein content of the microalgae was determined by two different methods, namely Total Kjeldahl Nitrogen (TKN), which was then converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The obtained results showed a good correlation between both methods, with the protein content being in the range of 39.8–47.1%. Characterization of neutral and acid saccharides from the microalgae was conducted by the phenol-sulfuric acid method at two wavelengths, 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, the biomass residues were used as substrate for anaerobic digestion at laboratory scale. The methane concentration, measured on a daily basis, showed some variation between samples after the extraction steps but was in the range of 48% to 55%. The CO2 formed during the fermentation process and after combustion in the Combined Heat and Power unit can potentially be used within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.
Keywords: bioeconomy, lipids, microalgae, proteins, saccharides
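The TKN-to-protein conversion mentioned above is usually done with a nitrogen-to-protein factor; the helper below illustrates it with the conventional factor of 6.25, which is an assumption here since the abstract does not state the factor actually used.

```python
def crude_protein_percent(tkn_percent, n_to_protein=6.25):
    """Crude protein (%) estimated from Total Kjeldahl Nitrogen (%) with a nitrogen-to-protein factor."""
    return tkn_percent * n_to_protein

# hypothetical value: 7.0 % N (dry-weight basis) -> 43.75 % crude protein
print(crude_protein_percent(7.0))
```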
Procedia PDF Downloads 245
19852 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin
Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna
Abstract:
Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer that contains a large number of functional groups, including aliphatic and aromatic hydroxyl groups, carboxylic groups and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure that can provide a large surface area and pore volume. It can also show better dispersion, diffusion and mass transfer behavior in the removal of, e.g., heavy metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St) and lignin. Microspheres without the addition of lignin were also prepared for comparison. Before the copolymerization, lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were also studied, e.g., pore structures (adsorption-desorption measurements), thermal properties (DSC), tendencies to swell and the actual shapes. Due to their well-developed porous structure and the presence of functional groups, our materials may have great potential in sorption processes. To estimate the sorption capabilities of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method is going to be applied. This method has various advantages, including low cost and ease of use, and enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.
Keywords: microspheres, lignin, sorption, solid-phase extraction
Procedia PDF Downloads 183
19851 Isolation, Characterization and Quantitation of Anticancer Constituent from Chloroform Extract of N. arbortristis L. Leaves
Authors: Parul Grover, K. A. Suri, Raj Kumar, Gulshan Bansal
Abstract:
Background: Nyctanthes arbortristis Linn. is traditionally used as an anticancer herb in the Indian system of medicine, but its introduction into the modern system of medicine is still awaited due to a lack of systematic scientific studies. Objective: The objective of the present study was to isolate and characterize anticancer phytoconstituents from N. arbortristis L. leaves based on bioactivity-guided fractionation. Method: Different extracts of the leaves of the plant were prepared using a Soxhlet extractor. Each extract was evaluated for anticancer activity against HL-60 cell lines. The chloroform and HA extracts showed potent anticancer activity and hence were selected for fractionation. Fraction C1 from the chloroform extract was found to be the most potent of all when tested against three cell lines (HL-60, A-549 and HCT-116) and was thus selected for further fractionation, and a pure compound, CP-01, was isolated. An RP-HPLC method was developed for quantification of the isolated compound using a Kinetex C-18 column with gradient elution at 0.7 mL/min and a mobile phase containing potassium dihydrogen phosphate (0.01 M, pH 3.0) with acetonitrile. The wavelength of maximum absorption (λₘₐₓ) selected was 210 nm. Results: The structure of the potent anticancer compound CP-01 was determined on the basis of spectroscopic methods such as IR, ¹H-NMR, ¹³C-NMR and mass spectrometry, and it was characterized as 1,1,2-tris(2’,4’-di-tert-butylbenzene)-4,4-dimethyl-pent-1-ene. The content of CP-01 was found to be 0.88 %w/w of the chloroform extract and 0.08 %w/w of N. arbortristis leaves. Conclusion: The study supports the traditional use of N. arbortristis as an anticancer herb, and the identified compound CP-01 can serve as an excellent lead to develop potent and safe anticancer drugs.
Keywords: anticancer, HL-60 cell lines, Nyctanthes arbor-tristis, RP-HPLC
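Quantification of CP-01 against an external calibration curve, as in the RP-HPLC method above, reduces to a linear fit of peak area versus standard concentration; the sketch below uses made-up standards and peak areas, since the abstract does not report the calibration data.

```python
import numpy as np

# hypothetical calibration standards (ug/mL) and their HPLC peak areas at 210 nm
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
area = np.array([1.2e4, 2.4e4, 6.1e4, 12.0e4, 24.3e4])

slope, intercept = np.polyfit(conc, area, 1)   # linear calibration: area = slope*conc + intercept
sample_conc = (8.5e4 - intercept) / slope      # back-calculate an unknown from its peak area
print(round(sample_conc, 1), "ug/mL")
```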
Procedia PDF Downloads 147
19850 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer
Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu
Abstract:
Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical application scenarios, the size of the image to be located is not fixed, and it is impractical to train different networks for all possible sizes. When the image size does not match the input size of the descriptor extraction model, existing image geolocalization methods usually directly scale or crop the image in some common way. This results in the loss of information important to the geolocalization task, thus affecting the performance of the image geolocalization method. For example, excessive down-sampling can lead to blurred building contours, and inappropriate cropping can lead to the loss of key semantic elements, resulting in incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The designed learnable image resizer employs the self-attention mechanism to enhance the geometric features of the resized image. Firstly, it applies bilinear interpolation to the input image and its feature maps to obtain the initial resized image and the resized feature maps. Then, SKNet (selective kernel net) is used to approximate the best receptive field, thus keeping the geometric shapes consistent with the original image, and SENet (squeeze-and-excitation net) is used to automatically select the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed image geolocalization method embeds the above image resizer as a front layer of the descriptor extraction network. It not only enables the network to be compatible with arbitrary-sized input images but also enhances the geometric features that are crucial to the image geolocalization task. Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to optimize the utilization of geometric elements extracted by the first convolutional layer. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for image geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo24/7 and Places365. The results show that the proposed method has excellent size compatibility and compares favorably with recent mainstream geolocalization methods.
Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature
Procedia PDF Downloads 214
19849 Selective Separation of Amino Acids by Reactive Extraction with Di-(2-Ethylhexyl) Phosphoric Acid
Authors: Alexandra C. Blaga, Dan Caşcaval, Alexandra Tucaliuc, Madalina Poştaru, Anca I. Galaction
Abstract:
Amino acids are valuable chemical products used in human foods, in animal feed additives and in the pharmaceutical field. Recently, there has been a noticeable rise in amino acid utilization throughout the world, including their use as raw materials in the production of various industrial chemicals: oil gelling agents (amino acid-based surfactants) to recover effluent oil in seas and rivers, and poly(amino acids), which are attracting attention for biodegradable plastics manufacture. The amino acids can be obtained by biosynthesis or from protein hydrolysis, but their separation from the obtained mixtures can be challenging. In the last decades, there has been continuous interest in developing processes that will improve the selectivity and yield of downstream processing steps. The liquid-liquid extraction of amino acids (dissociated at any pH value of the aqueous solution) is possible only by using the reactive extraction technique, mainly with extractants such as organophosphoric acid derivatives, high molecular weight amines and crown ethers. The purpose of this study was to analyse the separation of nine amino acids of acidic character (l-aspartic acid, l-glutamic acid), basic character (l-histidine, l-lysine, l-arginine) and neutral character (l-glycine, l-tryptophan, l-cysteine, l-alanine) by reactive extraction with di-(2-ethylhexyl)phosphoric acid (D2EHPA) dissolved in butyl acetate. The results showed that the separation yield is controlled by the pH value of the aqueous phase: the reactive extraction of amino acids with D2EHPA is possible only if the amino acids exist in aqueous solution in their cationic forms (pH of the aqueous phase below the isoelectric point). The studies on individual amino acids indicated the possibility of selectively separating different groups of amino acids with similar acidic properties as a function of the aqueous solution pH value: the maximum yields are reached in a pH domain of 2–3, then decrease strongly as the pH increases. Thus, for acidic and neutral amino acids, the extraction becomes impossible at the isoelectric point (pHi), and for basic amino acids at a pH value lower than pHi, as a result of the dissociation of the carboxylic group. From the results obtained for the separation of the mixture of nine amino acids at different pH values, it can be observed that all amino acids are extracted, with different yields, over a pH domain of 1.5–3. Above this interval, the extract contains only the amino acids of neutral and basic character. For pH 5–6, only the neutral amino acids are extracted, and for pH > 6 the extraction becomes impossible. Using this technique, the total separation of the following amino acid groups has been performed: neutral amino acids at pH 5–5.5, basic amino acids and l-cysteine at pH 4–4.5, l-histidine at pH 3–3.5 and acidic amino acids at pH 2–2.5.
Keywords: amino acids, di-(2-ethylhexyl) phosphoric acid, reactive extraction, selective extraction
Procedia PDF Downloads 431
19848 Gas Chromatography Coupled to Tandem Mass Spectrometry and Liquid Chromatography Coupled to Tandem Mass Spectrometry Qualitative Determination of Pesticides Found in Tea Infusions
Authors: Mihai-Alexandru Florea, Veronica Drumea, Roxana Nita, Cerasela Gird, Laura Olariu
Abstract:
The aim of this study was to investigate pesticide residues found in tea infusions. A multi-residue method to determine 147 pesticides was developed using the QuEChERS (Quick, Easy, Cheap, Effective, Rugged, Safe) procedure and dispersive solid-phase extraction (d-SPE) for the cleanup of pesticides from complex matrices such as plants and tea. Sample preparation was carefully optimized for the efficient removal of co-extracted matrix components by testing several solvent systems. Determination of the pesticides was performed using GC-MS/MS (100 pesticides) and LC-MS/MS (47 pesticides). The selected reaction monitoring (SRM) mode was chosen to achieve low detection limits and high compound selectivity and sensitivity. Overall performance was evaluated and validated according to the DG-SANTE Guidelines. To assess the pesticide residue transfer rate (qualitatively) from dried tea to infusions, the tea samples were spiked with a mixture of pesticides at the maximum residue level accepted for teas and herbal infusions. In order to investigate the release of the pesticides into the tea preparations, the medicinal plants were prepared in four ways by varying the water temperature and the infusion time. The pesticides in the infusions were extracted using two methods: QuEChERS versus solid-phase extraction (SPE). More than 90% of the pesticides studied were identified in the infusions.
Keywords: tea, solid-phase extraction (SPE), selected reaction monitoring (SRM), QuEChERS
Procedia PDF Downloads 213
19847 Laser Data Based Automatic Generation of Lane-Level Road Map for Intelligent Vehicles
Authors: Zehai Yu, Hui Zhu, Linglong Lin, Huawei Liang, Biao Yu, Weixin Huang
Abstract:
With the development of intelligent vehicle systems, a high-precision road map is increasingly needed in many respects. Automatic lane line extraction and modeling are the most essential steps in the generation of a precise lane-level road map. In this paper, an automatic lane-level road map generation system is proposed. To extract the road markings on the ground, the multi-region Otsu thresholding method is applied, which calculates the intensity value of the laser data that maximizes the variance between background and road markings. The extracted road marking points are then projected onto a raster image and clustered using a two-stage clustering algorithm. Lane lines are subsequently recognized from these clusters by the shape features of their minimum bounding rectangles. To ensure the storage efficiency of the map, the lane lines are approximated by cubic polynomial curves using a Bayesian estimation approach. The proposed lane-level road map generation system has been tested under urban and expressway conditions in Hefei, China. The experimental results on these datasets show that our method achieves excellent extraction and clustering performance, and the fitted lines reach a high positional accuracy with an error of less than 10 cm.
Keywords: curve fitting, lane-level road map, line recognition, multi-thresholding, two-stage clustering
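Two of the steps above, thresholding marking points by intensity and approximating a clustered lane line with a cubic polynomial, can be sketched as follows; the per-tile Otsu threshold and the plain least-squares cubic fit are simplifications of the paper's multi-region thresholding and Bayesian estimation.

```python
import numpy as np
from skimage.filters import threshold_otsu

def marking_mask(intensity_tile):
    """Keep high-intensity LiDAR returns (road markings) within one raster tile."""
    return intensity_tile > threshold_otsu(intensity_tile)

def fit_lane_curve(x, y):
    """Approximate one clustered lane line by a cubic polynomial y = f(x); returns 4 coefficients."""
    return np.polyfit(np.asarray(x), np.asarray(y), 3)
```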
Procedia PDF Downloads 128
19846 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection
Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar
Abstract:
The present manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of glimepiride (GLM) and ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode comprising methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 using formic acid; the total run time was 10 min, detection was at a common wavelength of 225 nm, and the flow rate throughout was 1 mL/min. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma. Metformin (MET) was used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate and precise. The limit of detection was 1.54 and 4.08, and the limit of quantification was 5.15 and 13.62, for GLM and ILA, respectively; the method demonstrated excellent linearity with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE
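Detection and quantification limits of the kind reported above are commonly derived from calibration data via the ICH relations LOD = 3.3 σ/S and LOQ = 10 σ/S; the helper below illustrates this, though whether the authors used exactly this approach is an assumption, and the calibration statistics shown are hypothetical.

```python
def detection_limits(sd_response, slope):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S (sigma = SD of response, S = calibration slope)."""
    return 3.3 * sd_response / slope, 10.0 * sd_response / slope

lod, loq = detection_limits(sd_response=12.5, slope=26.8)   # hypothetical calibration statistics
print(round(lod, 2), round(loq, 2))
```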
Procedia PDF Downloads 369
19845 Phytochemical Screening and Identification of Anti-Biological Activity Properties of Pelargonium graveolens
Authors: Anupalli Roja Rani, Saraswathi Jaggali
Abstract:
Rose-scented geranium (Pelargonium graveolens L’Hér.) is an erect, much-branched shrub. It is indigenous to various parts of southern Africa and is often called geranium. Pelargonium species are widely used by traditional healers of the Sotho, Xhosa, Khoi-San and Zulu peoples in southern Africa for their curative and palliative effects in the treatment of diarrhea, dysentery, fever, respiratory tract infections, liver complaints, wounds, gastroenteritis, haemorrhage, and kidney and bladder disorders. Active compounds were extracted from the plant material in a Soxhlet apparatus using analytical-grade solvents: methanol, ethyl acetate, chloroform and water. Phytochemical screening revealed that the extracts of Pelargonium graveolens contain alkaloids, glycosides, steroids, tannins, saponins and phenols in the ethyl acetate solvent. The antioxidant activity was determined using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) bleaching method, and the total phenolic content of the extracts was determined by the Folin–Ciocalteu method. Owing to the presence of different phytochemical compounds in Pelargonium, the antimicrobial activity was tested against different microorganisms such as E. coli, Streptococcus, Klebsiella and Bacillus. Fractionation of the plant extract was performed by column chromatography and confirmed by HPLC analysis, with NMR and FTIR spectroscopy used for compound identification in the different organic solvent extracts.
Keywords: Pelargonium graveolens L’Hér, DPPH, micro-organisms, HPLC analysis, NMR, FTIR spectroscopy
Procedia PDF Downloads 500
19844 On-Line Super Critical Fluid Extraction, Supercritical Fluid Chromatography, Mass Spectrometry, a Technique in Pharmaceutical Analysis
Authors: Narayana Murthy Akurathi, Vijaya Lakshmi Marella
Abstract:
The literature is reviewed with regard to on-line supercritical fluid extraction (SFE) coupled directly with supercritical fluid chromatography (SFC)-mass spectrometry, which is typically more sensitive than conventional LC-MS/MS and GC-MS/MS. It is becoming increasingly interesting to use on-line techniques that combine sample preparation, separation and detection in one analytical setup. This requires less human intervention, uses small amounts of sample and organic solvent, and yields enhanced analyte enrichment in a shorter time. The sample extraction is performed under light shielding and anaerobic conditions, preventing the degradation of thermolabile analytes. The technique may be able to analyze compounds over a wide polarity range, and since SFC generally uses carbon dioxide collected as a by-product of other chemical reactions or from the atmosphere, it contributes no new chemicals to the environment. The diffusion of solutes in supercritical fluids is about ten times greater than in liquids and about three times less than in gases, which decreases the resistance to mass transfer in the column and allows fast, high-resolution separations. The drawback of SFC when using carbon dioxide as the mobile phase is that the direct introduction of water samples poses a series of problems; water must therefore be eliminated before it reaches the analytical column. Hundreds of compounds can be analysed simultaneously by simply enclosing the sample in an extraction vessel. This is mainly applicable to the pharmaceutical industry, where it can analyse fatty acids and phospholipids, which have many analogues with very similar UV spectra, trace additives in polymers, and hundreds of pesticides with good resolution; cleaning validation can also be conducted by placing a swab sample in an extraction vessel.
Keywords: super critical fluid extraction (SFE), super critical fluid chromatography (SFC), LCMS/MS, GCMS/MS
Procedia PDF Downloads 391
19843 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data
Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad
Abstract:
Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars and sensors consist of important geographical information that can be used for remote sensing applications such as regional planning and disaster management. Spatial data classification and object recognition are important tasks for many applications. However, classifying objects and identifying them manually from images is a difficult task. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms, the classification is done using supervised classifiers such as Support Vector Machines (SVM), as the area of interest is known. We propose a classification method that considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset was created for training and testing purposes; we generated the attributes by considering pixel intensity values and mean values of reflectance. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high spatial resolution remote sensing imagery.
Keywords: remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction
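As a minimal sketch of the supervised step described above, the snippet below trains an RBF-kernel SVM on per-pixel feature vectors with scikit-learn; the feature files, kernel parameters and train/test split are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# hypothetical arrays: rows are pixels/regions, columns are intensity and neighbourhood-mean features
X, y = np.load("features.npy"), np.load("labels.npy")
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)   # supervised classifier
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```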
Procedia PDF Downloads 340
19842 Literature Review on Text Comparison Techniques: Analysis of Text Extraction, Main Comparison and Visual Representation Tools
Authors: Andriana Mkrtchyan, Vahe Khlghatyan
Abstract:
The choice of a profession is one of the most important decisions people make in their lives. With the development of modern science, technology and all the spheres of the modern world, more and more professions are arising, which further complicates the process of choosing. Hence, there is a need for a guiding platform to help people choose a profession and the right career path based on their interests, skills and personality. This review aims at analyzing existing methods of comparing PDF-format documents and suggests a 3-stage approach for the comparison, namely: 1. text extraction from PDF-format documents, 2. comparison of the extracted texts via NLP algorithms, 3. representation of the comparison using a special shape and color psychology methodology.
Keywords: color psychology, data acquisition/extraction, data augmentation, disambiguation, natural language processing, outlier detection, semantic similarity, text-mining, user evaluation, visual search
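For stage 2, a common baseline for the main comparison is cosine similarity over TF-IDF vectors of the extracted texts; the sketch below uses scikit-learn and is one possible realization, not necessarily the method the review settles on.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def text_similarity(text_a, text_b):
    """Cosine similarity (0..1) between TF-IDF vectors of two extracted texts."""
    tfidf = TfidfVectorizer().fit_transform([text_a, text_b])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

print(text_similarity("enjoys solving math problems", "likes mathematical puzzles and solving problems"))
```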
Procedia PDF Downloads 76
19841 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge
Authors: K. Eryilmaz, G. Mercanoglu
Abstract:
Aim: Purification of the final product, which is the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. The two most commonly used are C18 solid-phase extraction (SPE) and elution from a weak cation exchanger cartridge. The C18 SPE method yields a high-purity final product, while elution from the weak cation exchanger cartridge is pH dependent and ineffective in removing colloidal impurities. The aim of this work was to develop an additional purification method for the lanthanide-labeled peptide compound in cases where the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of pH problems or colloidal impurities. Material and Methods: For colloidal impurity formation, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using an EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 mL of 50% ethanol and collected in the final product vial by passing it through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analysed by an HPLC method (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: The UV and radioactivity detector results of the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method. Conclusion: The improved purification method can be used as an additional method to remove impurities that may result from lanthanide-peptide synthesis in which the weak cation exchange purification technique is used as the last step. The purification of the final product and GMP compliance (the final aseptic filtration and the sterile disposable system components) are two major advantages.
Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis
Procedia PDF Downloads 163
19840 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field
Authors: Zerroug Abdelhamid, Danielle Chassoux
Abstract:
Various models are given to simulate homogeneous or heterogeneous cancerous tumors and to extract the boundary in each case. The fractal dimension is then estimated by the least squares method and compared with some previous methods.
Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering
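A common way to obtain such a least-squares estimate from an extracted boundary is box counting: count occupied boxes N(s) over several box sizes s and fit the slope of log N(s) against log(1/s). The sketch below assumes the boundary is available as a binary NumPy mask; it is an illustrative implementation, not necessarily the authors' estimator.

```python
import numpy as np

def box_counting_dimension(boundary_mask, sizes=(2, 4, 8, 16, 32)):
    """Fractal dimension as the least-squares slope of log N(s) versus log(1/s)."""
    counts = []
    for s in sizes:
        h = boundary_mask.shape[0] // s * s
        w = boundary_mask.shape[1] // s * s
        blocks = boundary_mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))   # boxes containing boundary pixels
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```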
Procedia PDF Downloads 365
19839 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm
Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang
Abstract:
Aiming at the problem that the target signal is difficult to detect in a strong ground clutter environment, this paper proposes a clutter suppression algorithm based on the combination of singular value decomposition and the Mallat fast wavelet algorithm. The method first carries out singular value decomposition on the radar echo data matrix and realizes an initial separation of target and clutter through threshold processing of the singular values. It then carries out wavelet decomposition on the echo data to find the target location and adopts a discard method to select the appropriate decomposition layer for reconstructing the target signal, which ensures minimal loss of target information while suppressing the clutter. Verification on measured data shows that the method has a significant effect on target extraction under low SCR, and that target reconstruction can be realized without prior position information of the target; the method also provides a certain enhancement of the output SCR compared with the traditional single wavelet processing method.
Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR
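The SVD stage can be illustrated compactly: decompose the echo matrix, zero the leading singular values that carry the strong clutter, and rebuild the data. The snippet below is a minimal sketch of that idea; the choice of how many components to discard (here a fixed count) stands in for the threshold processing described in the paper.

```python
import numpy as np

def suppress_clutter(echo_matrix, n_clutter=1):
    """Remove the dominant clutter subspace by zeroing the largest singular values."""
    u, s, vh = np.linalg.svd(echo_matrix, full_matrices=False)
    s[:n_clutter] = 0.0                 # strong ground clutter dominates the leading singular values
    return (u * s) @ vh                 # reconstruct the echo with the clutter subspace removed
```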
Procedia PDF Downloads 118
19838 Optimization of Titanium Leaching Process Using Experimental Design
Authors: Arash Rafiei, Carroll Moore
Abstract:
The leaching process, as the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics and fluid dynamics. Therefore, optimizing a leaching system by purely scientific methods requires a great deal of time and expense. In this work, a mixture of two titanium ores and one titanium slag is used for extracting titanium in the leaching stage of the TiO2 pigment production procedure. Optimum titanium extraction can be obtained with the following strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process optimization framework. In the first strategy, the most important stage of the production process is treated as the main stage, and the rest of the stages are adapted to it. The second strategy optimizes the performance of more than one stage at once. The second strategy has more technical complexity than the first, but it brings more economic and technical advantages for the leaching system. Obviously, each strategy has its own optimum operational zone, which is not the same as the other's, and the best operational zone is chosen according to the complexity and the economic and practical aspects of the leaching system. The experimental design was carried out using the Taguchi method. The most important advantages of this methodology are that it involves the different technical aspects of the leaching process, minimizes the number of needed experiments as well as time and expense, and accounts for parameter interactions through the principles of multifactor-at-a-time optimization. Leaching tests were carried out at batch scale in the laboratory with appropriate temperature control. The leaching tank geometry was considered an important factor in providing comparable agitation conditions. Data analysis was done using reactor design and mass balancing principles. Finally, the optimum zone for the operational parameters is determined for each leaching strategy and discussed with respect to its economic and practical aspects.
Keywords: titanium leaching, optimization, experimental design, performance analysis
Procedia PDF Downloads 373
19837 Incremental Learning of Independent Topic Analysis
Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda
Abstract:
In this paper, we present a method for applying Independent Topic Analysis (ITA) to an increasing amount of document data. The amount of document data has been increasing since the spread of the Internet, and ITA was presented as one method to analyze such data. ITA is a method for extracting independent topics from document data by using Independent Component Analysis (ICA). ICA is a technique from signal processing; however, it is difficult to apply ITA to an increasing amount of document data, because ITA must use all the document data, so the temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts the independent topics from an increasing amount of document data. Incremental ITA is a method of updating the independent topics when new document data are added, starting from the independent topics already extracted from the immediately preceding data. We show the results of applying Incremental ITA to benchmark datasets.
Keywords: text mining, topic extraction, independent, incremental, independent component analysis
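For orientation, a plain batch version of ITA can be sketched by running ICA on a TF-IDF document-term matrix and reading off the top-weighted terms of each independent component; this is only the non-incremental baseline, and the vectorizer settings and term counts below are illustrative assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

def independent_topics(documents, n_topics=5, n_terms=10):
    """Run ICA on a TF-IDF matrix; return the top-weighted terms of each independent component."""
    vec = TfidfVectorizer(max_features=5000, stop_words="english")
    x = vec.fit_transform(documents).toarray()
    ica = FastICA(n_components=n_topics, random_state=0).fit(x)
    terms = vec.get_feature_names_out()
    return [[terms[i] for i in comp.argsort()[::-1][:n_terms]] for comp in ica.components_]
```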
Procedia PDF Downloads 309
19836 Extraction and Analysis of Anthocyanins Contents from Different Stage Flowers of the Orchids Dendrobium Hybrid cv. Ear-Sakul
Authors: Orose Rugchati, Khumthong Mahawongwiriya
Abstract:
Dendrobium hybrid cv. Ear-Sakul has become one of the important commercial commodities of the Thai agricultural industry worldwide, either as potted plants or as cut flowers, due to the attractive color produced in the flower petals. Anthocyanins are the main flower pigments and are responsible for the natural, attractive display of petal colors. These pigments play an important functional role, such as attracting animal pollinators, and are useful for the classification and grading of these orchids. Flowers of Dendrobium hybrid cv. Ear-Sakul at different stages (F1, F2-F5, and F6) were collected from a local farm. Anthocyanin pigments were extracted from the fresh flowers by solvent extraction (MeOH–TFA 99.5:0.5 v/v at 4 °C) and purified with ethyl acetate. The main anthocyanin components are cyanidin, pelargonidin and delphinidin. Pure anthocyanin contents were analysed by UV-Visible spectroscopy at λmax 535, 520 and 546 nm, respectively. The anthocyanin contents were expressed in terms of monomeric anthocyanin pigment (mg/L), and the anthocyanin contents of all samples were compared with the standard pigments cyanidin, pelargonidin and delphinidin. This simple extraction and analysis of anthocyanin content at different flower stages showed that the monomeric anthocyanin pigment contents of the different stage flowers (F1, F2-F5 and F6) were: cyanidin-3-glucoside (mg/L) 0.85±0.08, 24.22±0.12 and 62.12±0.6; pelargonidin 3,5-di-glucoside (mg/L) 10.37±0.12, 31.06±0.8 and 81.58±0.5; delphinidin (mg/L) 6.34±0.17, 18.98±0.56 and 49.87±0.7; and the appearance of the extracted pure anthocyanins in L(a, b) was 2.71(1.38, -0.48), 1.06(0.39, -0.66) and 2.64(2.71, -3.61), respectively. Dendrobium hybrid cv. Ear-Sakul could thus be used as a source of anthocyanins via simple solvent extraction, and the flower stage could serve as a guideline for predicting the amounts of the main anthocyanin components (cyanidin, pelargonidin and delphinidin), which could be applied and developed, in terms of both quantity and quality, to the advantage of the food, pharmaceutical and cosmetic industries.
Keywords: analysis, anthocyanins contents, different stage flowers, Dendrobium Hybrid cv. Ear-Sakul
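Converting an absorbance reading into a monomeric anthocyanin content in mg/L is usually done with a Beer-Lambert style expression; the helper below uses the conventional cyanidin-3-glucoside constants (MW 449.2 g/mol, molar absorptivity 26900 L/(mol·cm)), which are assumptions here since the abstract does not state the exact constants used.

```python
def monomeric_anthocyanin_mg_per_l(absorbance, dilution_factor=1.0,
                                   mw=449.2, molar_abs=26900.0, path_cm=1.0):
    """Monomeric anthocyanin (mg/L), expressed as cyanidin-3-glucoside equivalents."""
    return absorbance * mw * dilution_factor * 1000.0 / (molar_abs * path_cm)

# hypothetical reading: absorbance 0.75 at lambda-max, 5-fold dilution
print(round(monomeric_anthocyanin_mg_per_l(0.75, dilution_factor=5.0), 2), "mg/L")
```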
Procedia PDF Downloads 150
19835 Synthesis of New Bio-Based Solid Polymer Electrolyte Polyurethane-LiClO4 via Prepolymerization Method: Effect of NCO/OH Ratio on Their Chemical, Thermal Properties and Ionic Conductivity
Authors: C. S. Wong, K. H. Badri, N. Ataollahi, K. P. Law, M. S. Su’ait, N. I. Hassan
Abstract:
A novel bio-based polymer electrolyte was synthesized with LiClO4 as the main source of charge carriers. Initially, polyurethane-LiClO4 polymer electrolytes were synthesized via the prepolymerization method with different NCO/OH ratios and labelled PU1, PU2, PU3 and PU4. Subsequently, the chemical and thermal properties and the ionic conductivity of the films produced were determined. Fourier transform infrared (FTIR) analysis indicates coordination between the Li+ ion and the polyurethane in PU1, due to the greatest amount of polyurethane hard segment being present in PU1, as proven by Soxhlet analysis. The structures of the polyurethanes were confirmed by carbon-13 nuclear magnetic resonance spectroscopy (¹³C NMR) and FTIR spectroscopy. Differential scanning calorimetry (DSC) analysis indicates that PU1 has the highest glass transition temperature (Tg), corresponding to the most abundant urethane groups, which form the hard segment in PU1. Scanning electron microscopy (SEM) of the PU-LiClO4 shows good miscibility between the lithium salt and the polymer. The study found that PU1 possessed the greatest ionic conductivity (1.19 × 10⁻⁷ S.cm⁻¹ at 298 K and 5.01 × 10⁻⁵ S.cm⁻¹ at 373 K) and the lowest activation energy, Ea (0.32 eV), because the greatest amount of hard segment formed in PU1 induces coordination between the lithium ion and the oxygen atom of the carbonyl group in the polyurethane. All the polyurethanes exhibited linear Arrhenius behavior, indicating ion transport via simple lithium ion hopping in the polyurethane. This research proves that the NCO content in polyurethane plays an important role in determining the ionic conductivity of this polymer electrolyte.
Keywords: ionic conductivity, palm kernel oil-based monoester-OH, polyurethane, solid polymer electrolyte
Procedia PDF Downloads 426
19834 Deasphalting of Crude Oil by Extraction Method
Authors: A. N. Kurbanova, G. K. Sugurbekova, N. K. Akhmetov
Abstract:
Asphaltenes are the heavy fraction of crude oil. In the oilfield, asphaltenes are known for their ability to plug wells, surface equipment and the pores of geologic formations. The present research is devoted to the deasphalting of crude oil as the initial stage of oil refining. Solvent deasphalting was conducted by extraction with organic solvents (cyclohexane, carbon tetrachloride, chloroform). Analysis of the metal content was conducted by ICP-MS, and the spectral features during deasphalting were characterized by FTIR. High contents of asphaltenes in crude oil reduce the efficiency of refining processes. Moreover, the high content of heteroatoms (e.g., S, N) in asphaltenes causes problems such as environmental pollution, corrosion and poisoning of the catalyst. The main objective of this work is to study the effect of the deasphalting process on crude oil in order to improve its properties and the efficiency of recycling processes. Solvent extraction experiments using organic solvents were carried out on crude oil from JSC “Pavlodar Oil Chemistry Refinery”. The experimental results show that the deasphalting process also leads to a decrease in the Ni and V content of the oil. One solution to the problem of cleaning oils of metals, hydrogen sulfide and mercaptans is absorption with chemical reagents directly in the oil residue during production, since asphaltic and resinous substances degrade the operational properties of oils and reduce the effectiveness of selective oil refining. Deasphalting of crude oil is necessary to separate the light fraction from the heavy, metal-bearing asphaltene part of the crude oil. For this purpose, the oil is pretreated by deasphalting, because asphaltenes tend to form coke or consume large quantities of hydrogen. Removing asphaltenes leads to partial demetallization, i.e. removal of V/Ni and of organic compounds with heteroatoms together with the asphaltenes. Intramolecular complexes are relatively well researched using the example of the porphyrin complexes of vanadium (VO2) and nickel (Ni). As a result of studies of V/Ni by the ICP-MS method, the effect of the different deasphalting solvents on metal extraction at the deasphalting stage was determined and the best organic solvent was selected. Thus, cyclohexane (C6H12) proved to be the best deasphalting solvent, removing, according to ICP-MS, 51.2% of V and 66.4% of Ni. This paper also presents the results of a study of the physical and chemical properties and the FTIR spectral characteristics of the oil with a view to establishing its hydrocarbon composition. The information about the whole oil obtained by IR spectroscopy gives provisional physical and chemical characteristics. These can be useful in considering questions of the origin and geochemical conditions of oil accumulation, as well as some technological challenges. The systematic analysis carried out in this study improves our understanding of the stability mechanism of asphaltenes. The role of deasphalted crude oil fractions in asphaltene stability is described.
Keywords: asphaltenes, deasphalting, extraction, vanadium, nickel, metalloporphyrins, ICP-MS, IR spectroscopy
Procedia PDF Downloads 242
19833 Optimizing Sustainable Graphene Production: Extraction of Graphite from Spent Primary and Secondary Batteries for Advanced Material Synthesis
Authors: Pratima Kumari, Sukha Ranjan Samadder
Abstract:
This research aims to contribute to the sustainable production of graphene materials by exploring the extraction of graphite from spent primary and secondary batteries. The increasing demand for graphene, a versatile and high-performance material, necessitates environmentally friendly methods for its synthesis. The process follows a well-planned methodology, beginning with the gathering and categorization of batteries, followed by disassembly and careful removal of graphite from the anode structures. The use of environmentally friendly solvents and mechanical techniques ensures an efficient and eco-friendly extraction of graphite. Advanced approaches such as the modified Hummers' method and a chemical reduction process are utilized for the synthesis of the graphene materials, with a focus on optimizing parameters. Various analytical techniques, such as Fourier-transform infrared spectroscopy, X-ray diffraction, scanning electron microscopy, thermogravimetric analysis and Raman spectroscopy, were employed to validate the quality and structure of the produced graphene materials. The major findings of this study reveal the successful implementation of the methodology, leading to the production of high-quality graphene materials suitable for advanced material applications. Thorough characterization using various advanced techniques validates the structural integrity and purity of the produced graphene. The economic viability of the process is demonstrated through a comprehensive economic analysis, highlighting the potential for large-scale production. This research contributes to the field of sustainable graphene production by offering a systematic methodology that efficiently transforms spent batteries into valuable graphene resources. Furthermore, the findings not only showcase the potential for upcycling electronic waste but also address the pressing need for environmentally conscious processes in advanced material synthesis.
Keywords: spent primary batteries, spent secondary batteries, graphite extraction, advanced material synthesis, circular economy approach
Procedia PDF Downloads 54