Search results for: POI extraction method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20243

19823 The Effect of Extrusion Processing on Solubility and Molecular Weight of Water-Soluble Arabinoxylan

Authors: Abdulmannan Fadel

Abstract:

Arabinoxylan is a non-starch polysaccharide (NSP) and one of the most important polysaccharides contained within cereal grains. Wheat endosperm pentosan and rice bran contain significant amounts of arabinoxylan (7% in rice bran and 10-12% in wheat endosperm pentosan). Several methods have been used for arabinoxylan extraction with varying degrees of success, e.g., enzymatic and alkaline treatment. Yet the use of extrusion alone as a pre-treatment to increase the yield and reduce the molecular weight in wheat endosperm pentosan and rice bran has not been investigated. The samples (wheat pentosan and rice bran) were extruded using a twin-screw extruder at two screw speeds (80 and 160 rpm) and a range of barrel temperatures (80 to 140°C), with a throughput of 30 kg hr⁻¹ and a moisture content of 25%. Arabinoxylans were extracted with water, and the extraction yield and molecular weight were determined using a size exclusion high-pressure liquid chromatography system. It was found that increasing the screw speed from 80 rpm to 160 rpm did not significantly affect the extraction yield (p > 0.05) of arabinoxylan from either the wheat endosperm pentosan or the rice bran. However, the molecular weight of the arabinoxylans extracted from wheat endosperm pentosan was found to decrease with increasing screw speed. Such low molecular weight arabinoxylans have been suggested as immunomodulators.

Keywords: arabinoxylans, extrusion, wheat endosperm pentosan, rice bran

Procedia PDF Downloads 417
19822 Medium-Scale Multi-Juice Extractor for Food Processing

Authors: Flordeliza L. Mercado, Teresito G. Aguinaldo, Helen F. Gavino, Victorino T. Taylan

Abstract:

Most fruits and vegetables are available in large quantities during peak season, when they are oftentimes marketed at low prices and left to rot or fed to farm animals. The lack of efficient storage facilities, together with the additional cost and unavailability of small machinery for food processing, results in low prices and wastage. Meanwhile, processed fruits and vegetables are gaining importance nowadays, and health-conscious people are also into 'juicing'. One way to reduce wastage and ensure all-season availability of crop juices at reasonable cost is to develop equipment for effective extraction of juice. The study was conducted to design, fabricate, and evaluate a multi-juice extractor using locally available materials, making it relatively cheaper and affordable for medium-scale enterprises. The study also formulated juice blends using the extracted juices and calamansi juice at different blending percentages and evaluated their chemical properties and sensory attributes. Furthermore, the chemical properties of the extracted meals were evaluated for future applications. The multi-juice extractor has an overall dimension of 963mm x 300mm x 995mm, a gross weight of 82kg, and five major components, namely: feeding hopper, extracting chamber, juice and meal outlets, transmission assembly, and frame. The machine performance was evaluated based on juice recovery, extraction efficiency, extraction rate, extraction recovery, and extraction loss, considering apple and carrot as test crops with three replications each, and was analyzed using a t-test. The formulated juice blends were subjected to sensory evaluation, and the data gathered were analyzed using Analysis of Variance appropriate for a Complete Randomized Design. Results showed that the machine's juice recovery (73.39%), extraction rate (16.40 li/hr), and extraction efficiency (88.11%) for apple were significantly higher than for carrot, while extraction recovery (99.88%) was higher for apple than for carrot. Extraction loss (0.12%) was lower for apple than for carrot but was not significantly affected by crop. Based on adding a percentage mark-up to the extraction cost (Php 2.75/kg), the breakeven weight and payback period for a 35% mark-up are 4,710.69kg and 1.22 years, respectively; for a 50% mark-up, the breakeven weight is 3,492.41kg and the payback period is 0.86 year (10.32 months). Results of the sensory evaluation of juice blends showed that the type of juice significantly influenced all the sensory parameters, while the blending percentage, including their interactions, had no significant effect on any sensory parameter, making the apple-calamansi juice blend more preferred than the carrot-calamansi juice blend in terms of all sensory parameters. The machine's performance is higher for apple than for carrot, and the cost analysis on the use of the machine revealed that it is financially viable, with a payback period of 1.22 years (35% mark-up) or 0.86 year (50% mark-up) for the machine cost, generating an income of Php 23,961.60 and Php 34,444.80 per year using 35% and 50% mark-up, respectively. The juice blends were of good quality based on the values obtained in the chemical analysis, and the extracted meal could also be used to produce another product based on the values obtained from the proximate analysis.
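
As a rough sketch of the break-even arithmetic implied above, the functions below relate machine cost, extraction cost, mark-up, and annual income. The machine cost used here is a made-up placeholder, and the paper's own analysis evidently includes further cost terms, so this is a simplification rather than the authors' costing model.

```python
def breakeven_weight(machine_cost, extraction_cost_per_kg, markup):
    """kg to process before mark-up earnings recover the machine cost."""
    earnings_per_kg = extraction_cost_per_kg * markup   # e.g. 2.75 * 0.35
    return machine_cost / earnings_per_kg

def payback_period(machine_cost, annual_income):
    """Years of income needed to repay the machine cost."""
    return machine_cost / annual_income

machine_cost = 29000.0   # hypothetical machine cost in Php (not stated in the paper)
print(breakeven_weight(machine_cost, 2.75, 0.35))   # kg at a 35% mark-up
print(payback_period(machine_cost, 23961.60))       # years at the 35% mark-up income
```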

Keywords: food processing, fruits and vegetables, juice extraction, multi-juice extractor

Procedia PDF Downloads 325
19821 A Method for Quantifying Arsenolipids in Sea Water by HPLC-High Resolution Mass Spectrometry

Authors: Muslim Khan, Kenneth B. Jensen, Kevin A. Francesconi

Abstract:

Trace amounts (ca. 1 µg/L, 13 nM) of arsenic are present in sea water, mostly as the oxyanion arsenate. In contrast, arsenic is present in marine biota (animals and algae) at very high levels (up to 100,000 µg/kg), a significant portion of which occurs as lipid-soluble compounds collectively termed arsenolipids. The complex nature of sea water presents an analytical challenge to detect trace compounds and monitor their environmental path. We developed a simple method using liquid-liquid extraction combined with HPLC-high resolution mass spectrometry capable of detecting traces of arsenolipids; the method removed > 99% of the sample matrix while recovering > 80% of the six target arsenolipids, with a limit of detection of 0.003 µg/L.

Keywords: arsenolipids, sea water, HPLC-high resolution mass spectrometry

Procedia PDF Downloads 372
19820 Exploring Selected Nigerian Fictional Work and Films as Sources of Peace Building and Conflict Resolution in the Natural Resource Extraction Regions of Nigeria: A Social Conflict Theoretical Perspective and Analysis

Authors: Joyce Onoromhenre Agofure

Abstract:

Research has shown how fictional works and films reflect the destruction of the environment due to the exploitation of oil, gas, gold, and forest products by multinational companies for profit, but it overlooks discussions of conflict resolution and peacebuilding. This paper examines the manner in which art forms project peace and conflict resolution, thereby contributing to mediation and stability geared towards changing appalling situations in the resource extraction regions of Nigeria. The paper draws from selected Nigerian films, Blood and Oil (2019), directed by Curtis Graham, and Black November (2012), directed by Jeta Amata, and a novel, Death of Eternity (2007), by Adamu Kyuka Usman. The study seeks to show that the disruptions caused in the natural resource regions of Nigeria have not only left adverse effects on the social well-being of the people but also require resolution through peacebuilding. By adopting the theoretical insights of Social Conflict theory, this paper focuses on artistic processes that enhance peacebuilding and conflict resolution in non-violent ways by using scenes, visual effects, themes, and images that can educate by shaping opinions, influencing attitudes, and changing ideas and behavioral patterns of individuals and communities. Taken together, the research opens up critical perceptions brought about by the artists under study to shed light on the dire need to sustain peace and actively participate in conflict resolution in natural resource extraction spaces.

Keywords: natural resource, extraction, conflict resolution, peace building

Procedia PDF Downloads 80
19819 Data Mining Spatial: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies. This information is often presented by geographic information systems (GIS) and stored in spatial databases (BDS). Classical data mining shows a weakness in knowledge extraction from such enormous amounts of data due to the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process, we distinguish monothematic and multithematic ones. Geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geo-spatial entities into the same class and assigns more dissimilar ones to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity while taking account of the particularity of geo-spatial data. Two approaches to geo-clustering exist: the dynamic processing of data, which applies algorithms designed for the direct treatment of spatial data, and the approach based on spatial data pre-processing, which applies classic clustering algorithms to pre-processed data (obtained by integration of spatial relationships). The latter approach is quite complex in various cases, so the search for approximate solutions involves the use of approximation algorithms, including dedicated approaches (partitioning-based and density-based clustering methods) and the bees approach (a biomimetic approach). Our study proposes a design for this problem, using different algorithms for automatically detecting the geo-spatial neighborhood in order to implement geo-clustering by pre-treatment, and applying the bees algorithm to this problem for the first time in the geo-spatial field.
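
As a minimal sketch of the density-based family of geo-clustering algorithms mentioned above (not the bees algorithm itself), DBSCAN can be run directly on geographic coordinates with a haversine distance, which respects the spatial neighborhood of entities. The coordinates and parameters below are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical (latitude, longitude) pairs in degrees.
points_deg = np.array([[48.85, 2.35], [48.86, 2.36], [43.60, 1.44]])

EARTH_RADIUS_KM = 6371.0
coords = np.radians(points_deg)  # the haversine metric expects radians

# eps is expressed as an angle: a 2 km neighborhood radius on the sphere.
db = DBSCAN(eps=2.0 / EARTH_RADIUS_KM, min_samples=2,
            metric="haversine").fit(coords)
print(db.labels_)  # -1 marks spatial noise, other integers are cluster ids
```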

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 375
19818 Enhancement of Light Extraction of Luminescent Coating by Nanostructuring

Authors: Aubry Martin, Nehed Amara, Jeff Nyalosaso, Audrey Potdevin, François Réveret, Michel Langlet, Geneviève Chadeyron

Abstract:

Energy-saving lighting devices based on Light-Emitting Diodes (LEDs) combine a semiconductor chip emitting in the ultraviolet or blue wavelength region with one or more phosphors deposited in the form of coatings. The most common ones combine a blue LED with the yellow phosphor Y₃Al₅O₁₂:Ce³⁺ (YAG:Ce) and a red phosphor. Even if these devices are characterized by satisfactory photometric parameters (Color Rendering Index, Color Temperature) and good luminous efficiencies, further improvements can be made to enhance light extraction efficiency (increase in phosphor forward emission). One possible strategy is to pattern the phosphor coatings. Here, we have worked on different ways to nanostructure the coating surface. On the one hand, we used colloidal lithography combined with the Langmuir-Blodgett technique to directly pattern the surface of YAG:Tb³⁺ sol-gel derived coatings, YAG:Tb³⁺ being used as a model phosphor. On the other hand, we achieved composite architectures combining YAG:Ce coatings and ZnO nanowires. Structural, morphological, and optical properties of both systems have been studied and compared to flat YAG coatings. In both cases, nanostructuring brought a significant enhancement of photoluminescence properties under UV or blue radiation. In particular, angle-resolved photoluminescence measurements have shown that nanostructuring modifies the photon path within the coatings, with better extraction of the guided modes. These two strategies have the advantage of being versatile and applicable to any phosphor synthesizable by the sol-gel technique. They thus appear as promising ways to enhance the luminescence efficiencies of both phosphor coatings and the optical devices into which they are incorporated, such as LED-based lighting or safety devices.

Keywords: phosphor coatings, nanostructuring, light extraction, ZnO nanowires, colloidal lithography, LED devices

Procedia PDF Downloads 176
19817 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images

Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that this method is capable of detecting hard exudates and the most probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
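
A minimal sketch of the preprocessing chain named above, CLAHE followed by thresholding on the green channel, with the Circular Hough Transform available for optic disc localization. The file name, threshold value, and Hough parameters are assumptions, not the authors' tuned values.

```python
import cv2

img = cv2.imread("fundus.png")            # hypothetical input image
green = img[:, :, 1]                      # green channel shows lesions best

# Contrast Limited Adaptive Histogram Equalization (CLAHE)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)

# Bright lesions (exudate candidates) by global thresholding
_, exudates = cv2.threshold(enhanced, 220, 255, cv2.THRESH_BINARY)

# Circular Hough Transform to localize the optic disc (same color as exudates)
blur = cv2.medianBlur(enhanced, 5)
circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                           param1=100, param2=30, minRadius=40, maxRadius=90)
```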

Keywords: diabetic retinopathy, fundus, CHT, exudates, hemorrhages

Procedia PDF Downloads 273
19816 An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing Electrocardiogram Based on ResNet and Bi-Long Short-Term Memory

Authors: Yang Zhang, Jian He

Abstract:

Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. The electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method of CHD prediction by analyzing the ECG requires extensive professional knowledge from doctors. This paper introduces a sliding window and the continuous wavelet transform (CWT) to transform ECG signals into images, and then ResNet and Bi-LSTM are introduced to build the ECG feature extraction network (namely ECGNet). Finally, an auxiliary system for coronary heart disease prediction was developed based on a modified ResNet18 and Bi-LSTM, and a public ECG dataset of CHD from MIMIC-III was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with available methods for CHD prediction based on the ECG, such as kNN, decision trees, VGGNet, etc., this method not only improves the prediction accuracy but also avoids the degradation phenomenon of deep learning networks.
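
A minimal sketch of the signal-to-image step described above: a sliding window over the ECG is turned into a scalogram by the continuous wavelet transform, here with PyWavelets and a Morlet wavelet. The window length, scale range, and sampling rate are assumptions.

```python
import numpy as np
import pywt

fs = 360.0                      # assumed ECG sampling rate (Hz)
win = 5 * int(fs)               # 5-second sliding window

def ecg_to_scalogram(ecg, start):
    """CWT of one window; the 2D magnitude array is the network's input image."""
    segment = ecg[start:start + win]
    scales = np.arange(1, 65)
    coeffs, _ = pywt.cwt(segment, scales, "morl", sampling_period=1.0 / fs)
    return np.abs(coeffs)       # shape: (64, win)

ecg = np.random.randn(10 * int(fs))      # stand-in for a real ECG record
image = ecg_to_scalogram(ecg, start=0)
```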

Keywords: Bi-LSTM, CHD, ECG, ResNet, sliding window

Procedia PDF Downloads 91
19815 Multi-Criteria Optimal Management Strategy for in-situ Bioremediation of LNAPL Contaminated Aquifer Using Particle Swarm Optimization

Authors: Deepak Kumar, Jahangeer, Brijesh Kumar Yadav, Shashi Mathur

Abstract:

In-situ remediation is a technique which can remediate either surface water or groundwater at the site of contamination. In the present study, a simulation-optimization approach has been used to develop a management strategy for remediating LNAPL (Light Non-Aqueous Phase Liquid) contaminated aquifers. Benzene, toluene, ethylbenzene, and xylene are the main components of the LNAPL contaminant; collectively, these contaminants are known as BTEX. In the in-situ bioremediation process, a set of injection and extraction wells is installed. Injection wells supply oxygen and other nutrients, which convert BTEX into carbon dioxide and water with the help of indigenous soil bacteria. On the other hand, extraction wells check the movement of the plume downstream. In this study, the optimal design of the system has been carried out using the PSO (Particle Swarm Optimization) algorithm. A comprehensive management strategy for the pumping of injection and extraction wells has been developed to attain maximum allowable concentrations of 5 ppm and 4.5 ppm. The management strategy comprises the determination of pumping rates, the total pumping volume, and the total running cost incurred for each potential injection and extraction well. The results indicate a high pumping rate for injection wells during the initial management period, since it facilitates the availability of oxygen and other nutrients necessary for biodegradation; however, it is low during the third year on account of sufficient oxygen availability. This is because the contaminant is assumed to have biodegraded by the end of the third year, when the concentration drops to a permissible level.
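
A minimal numpy sketch of a particle swarm optimizer searching over well pumping rates. The cost function, bounds, and PSO hyper-parameters are placeholders, since the study couples PSO to a contaminant transport simulation that is not reproduced here.

```python
import numpy as np

def pso(cost, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize cost(x) over box bounds with a basic particle swarm."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))    # positions (pumping rates)
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_val)]                # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([cost(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)]
    return g, pbest_val.min()

# Toy stand-in for the total pumping cost of 4 wells (rates in m^3/day);
# the real objective would come from the transport simulation.
rates, total = pso(lambda q: np.sum(q ** 2), (np.zeros(4), np.full(4, 50.0)))
```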

Keywords: groundwater, in-situ bioremediation, light non-aqueous phase liquid, BTEX, particle swarm optimization

Procedia PDF Downloads 445
19814 The Extraction and Stripping of Hg(II) from Produced Water via Hollow Fiber Contactor

Authors: Dolapop Sribudda, Ura Pancharoen

Abstract:

The separation of Hg(II) from produced water by hollow fiber contactors (HFC) was investigated. The system consisted of two hollow fiber modules connected in series: the first module was used for the extraction reaction and the second module for the stripping reaction. The Aliquat 336 extractant was fed from the organic reservoir into the shell side of the first hollow fiber module and continued to the shell side of the second module; the organic liquid was continuously recirculated back to the reservoir. The feed solution was pumped into the lumen (tube side) of the first hollow fiber module. Simultaneously, the stripping solution was pumped in the same way into the tube side of the second module. The feed and stripping solutions flowed counter-currently. Samples were collected at the feed and stripping outlets for 1 hour, and the concentration of Hg(II) was characterized by Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES). The feed solution was produced water from the Gulf of Thailand. The extractant was Aliquat 336 dissolved in a kerosene diluent. The stripping solutions used were nitric acid (HNO3) and thiourea (NH2CSNH2). The effects of carrier concentration and type of stripping solution were investigated. Results showed that the best conditions were 10% (v/v) Aliquat 336 and 1.0 M NH2CSNH2. At the optimum condition, the extraction and stripping of Hg(II) were 98% and 44.2%, respectively.

Keywords: Hg(II), hollow fiber contactor, produced water, wastewater treatment

Procedia PDF Downloads 404
19813 Bitplanes Gray-Level Image Encryption Approach Using Arnold Transform

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single step parallel contour extraction (SSPCE) method is used to create an edge map, serving as a key image, from a gray-level/binary image. An X-OR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D gray-level images and completely reconstruct them without any distortion. The analysis also shows that the algorithm offers extremely strong security against attacks such as salt-and-pepper noise and JPEG compression. This proves that gray-level images can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
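
A minimal numpy sketch of the two core operations named above, assuming a square gray-level image: the bit-plane XOR with the key image (the SSPCE edge-map construction itself is not reproduced) and the Arnold cat map for pixel scrambling. Both steps are exactly invertible, consistent with the lossless reconstruction claim.

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat map scrambling: (x, y) -> (x + y, x + 2y) mod n."""
    n = img.shape[0]                       # requires a square image
    ys, xs = np.indices((n, n))
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(xs + 2 * ys) % n, (xs + ys) % n] = out[ys, xs]
        out = scrambled
    return out

def encrypt(img, key_image, iterations=5):
    """XOR every bit plane with the key image, then scramble pixel positions."""
    # XOR of whole bytes equals XOR of each bit plane separately.
    mixed = np.bitwise_xor(img, key_image)
    return arnold(mixed, iterations)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
key = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in key map
cipher = encrypt(img, key)
```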

Keywords: SSPCE method, salt-and-pepper and JPEG compression attacks, bitplane decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 440
19812 The Modification of Convolutional Neural Network in Fin Whale Identification

Authors: Jiahao Cui

Abstract:

Over the past centuries, due to climate change and intense whaling, the global whale population has dramatically declined. Among the various whale species, the fin whale experienced the most drastic drop in numbers due to its popularity in whaling. Against this background, identifying fin whale calls could be immensely beneficial to the preservation of the species. This paper uses feature extraction to process the input audio signal; then a network based on AlexNet and three networks based on the ResNet model were constructed to classify fin whale calls. A mixture of the DOSITS database and the Watkins database was used during training. The results demonstrate that a modified ResNet network has the best performance considering precision and network complexity.
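
A minimal sketch of an audio feature extraction step feeding such a CNN. A log-mel spectrogram via librosa is one common choice; the paper does not specify its exact features, so the file name and parameters here are assumptions.

```python
import librosa
import numpy as np

y, sr = librosa.load("fin_whale_call.wav", sr=None)   # hypothetical clip
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=1024,
                                     hop_length=256, n_mels=64)
log_mel = librosa.power_to_db(mel, ref=np.max)        # 2D input "image" for the CNN
```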

Keywords: convolutional neural network, ResNet, AlexNet, fin whale preservation, feature extraction

Procedia PDF Downloads 126
19811 Ontology Mapping with R-GNN for IT Infrastructure: Enhancing Ontology Construction and Knowledge Graph Expansion

Authors: Andrey Khalov

Abstract:

The rapid growth of unstructured data necessitates advanced methods for transforming raw information into structured knowledge, particularly in domain-specific contexts such as IT service management and outsourcing. This paper presents a methodology for automatically constructing domain ontologies using the DOLCE framework as the base ontology. The research focuses on expanding ITIL-based ontologies by integrating concepts from ITSMO, followed by the extraction of entities and relationships from domain-specific texts through transformers and statistical methods like formal concept analysis (FCA). In particular, this work introduces an R-GNN-based approach for ontology mapping, enabling more efficient entity extraction and ontology alignment with existing knowledge bases. Additionally, the research explores transfer learning techniques using pre-trained transformer models (e.g., DeBERTa-v3-large) fine-tuned on synthetic datasets generated via large language models such as LLaMA. The resulting ontology, termed IT Ontology (ITO), is evaluated against existing methodologies, highlighting significant improvements in precision and recall. This study advances the field of ontology engineering by automating the extraction, expansion, and refinement of ontologies tailored to the IT domain, thus bridging the gap between unstructured data and actionable knowledge.
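
A minimal sketch of the entity extraction step using the HuggingFace pipeline API; the checkpoint name is a placeholder standing in for the fine-tuned DeBERTa model described above.

```python
from transformers import pipeline

# "my-org/deberta-v3-itsm-ner" is a placeholder for the fine-tuned checkpoint.
ner = pipeline("token-classification",
               model="my-org/deberta-v3-itsm-ner",
               aggregation_strategy="simple")

text = "The incident management process escalates P1 tickets to the ITIL service desk."
for entity in ner(text):
    # Each hit carries a grouped entity label, surface form, and confidence.
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```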

Keywords: ontology mapping, knowledge graphs, R-GNN, ITIL, NER

Procedia PDF Downloads 20
19810 Automatic Segmentation of the Clean Speech Signal

Authors: M. A. Ben Messaoud, A. Bouzid, N. Ellouze

Abstract:

Speech segmentation is the detection of change points for partitioning an input speech signal into regions, each of which corresponds to only one speaker. In this paper, we apply two features based on the multi-scale product (MP) of the clean speech, namely the spectral centroid of the MP and the zero-crossing rate of the MP. We focus on multi-scale product analysis as an important tool for segmentation extraction. The multi-scale product is formed by taking the product of the speech wavelet transform coefficients at three successive dyadic scales. We have evaluated our method on the Keele database. Experimental results show the effectiveness of our method, which presents good performance: the two simple features can find word boundaries and extract the segments of the clean speech.
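
A minimal sketch of the multi-scale product with PyWavelets: the stationary (undecimated) wavelet transform keeps every scale at the signal length, so detail coefficients at three successive dyadic scales can be multiplied point-wise. The wavelet choice and signal length are assumptions.

```python
import numpy as np
import pywt

x = np.random.randn(4096)          # stand-in speech frame; length % 2**3 == 0

# Undecimated transform: coefficient pairs come back deepest level first.
(_, d3), (_, d2), (_, d1) = pywt.swt(x, "db2", level=3)

mp = d1 * d2 * d3                  # multi-scale product over 3 dyadic scales

# Example segmentation features computed on the MP signal.
spectrum = np.abs(np.fft.rfft(mp))
centroid = np.sum(np.arange(spectrum.size) * spectrum) / np.sum(spectrum)
zcr = np.mean(np.abs(np.diff(np.sign(mp))) > 0)   # zero-crossing rate
```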

Keywords: multiscale product, spectral centroid, speech segmentation, zero crossings rate

Procedia PDF Downloads 501
19809 A Fast and Robust Protocol for Reconstruction and Re-Enactment of Historical Sites

Authors: Sanaa I. Abu Alasal, Madleen M. Esbeih, Eman R. Fayyad, Rami S. Gharaibeh, Mostafa Z. Ali, Ahmed A. Freewan, Monther M. Jamhawi

Abstract:

This research proposes a novel reconstruction protocol for restoring missing surfaces and low-quality edges and shapes in photos of artifacts at historical sites. The protocol starts with the extraction of a cloud of points. This extraction process is based on four subordinate algorithms, which differ in robustness and in the number of resultant points. Moreover, they apply different, but complementary, levels of accuracy to related features and to the way they build a quality mesh. The performance of our proposed protocol is compared with other state-of-the-art algorithms and toolkits. The statistical analysis shows that our algorithm significantly outperforms its rivals in the resultant quality of the object files used to reconstruct the desired model.
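
A minimal sketch of one point-cloud-to-mesh stage of such a protocol using Open3D's Poisson surface reconstruction. The file names and depth parameter are assumptions, and the paper's four extraction algorithms are not reproduced.

```python
import open3d as o3d

pcd = o3d.io.read_point_cloud("artifact_scan.ply")    # hypothetical cloud
pcd.estimate_normals()                                # Poisson needs normals

# Poisson reconstruction: a higher depth gives a finer (and heavier) mesh.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)
o3d.io.write_triangle_mesh("artifact_mesh.obj", mesh)
```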

Keywords: meshes, point clouds, surface reconstruction protocols, 3D reconstruction

Procedia PDF Downloads 457
19808 Analytical Tools for Multi-Residue Analysis of Some Oxygenated Metabolites of PAHs (Hydroxylated, Quinones) in Sediments

Authors: I. Berger, N. Machour, F. Portet-Koltalo

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are toxic and carcinogenic pollutants produced mostly by incomplete combustion processes in industrialized and urbanized areas. After being emitted into the atmosphere, these persistent contaminants are deposited in soils or sediments. Even if persistent, some can be partially degraded (photodegradation, biodegradation, chemical oxidation), leading to oxygenated metabolites (oxy-PAHs) which can be more toxic than their parent PAHs. Oxy-PAHs are measured less often than PAHs in sediments, and this study aims to compare different analytical tools for extracting and quantifying a mixture of four hydroxylated PAHs (OH-PAHs) and four carbonyl PAHs (quinones) in sediments. Methodologies: Two analytical systems – HPLC with on-line UV and fluorescence detectors (HPLC-UV-FLD) and GC coupled to a mass spectrometer (GC-MS) – were compared to separate and quantify the oxy-PAHs. Microwave assisted extraction (MAE) was optimized to extract oxy-PAHs from sediments. Results: First, OH-PAHs and quinones were analyzed by HPLC with on-line UV and fluorimetric detectors. OH-PAHs were detected with the sensitive FLD, whereas the non-fluorescent quinones were detected with UV. The limits of detection (LODs) obtained were in the range (2-3)×10⁻⁴ mg/L for OH-PAHs and (2-3)×10⁻³ mg/L for quinones. Second, even if GC-MS is not well adapted to the analysis of the thermodegradable OH-PAHs and quinones without a derivatization step, it was used because of the advantages of the detector in terms of identification and of GC in terms of efficiency. Without derivatization, only two of the four quinones were detected in the range 1-10 mg/L (LODs = 0.3-1.2 mg/L), and the LODs were not very satisfactory for the four OH-PAHs either (0.18-0.6 mg/L). Therefore, two derivatization processes were optimized against the literature: one for silylation of OH-PAHs and one for acetylation of quinones. Silylation using BSTFA/TMCS 99/1 was enhanced by using a mixture of catalyst solvents (pyridine/ethyl acetate) and finding the appropriate reaction duration (5-60 minutes). Acetylation was optimized at different steps of the process, including the initial volume of compounds to derivatize, the added amounts of Zn (0.1-0.25 g), the nature of the derivatization product (acetic anhydride, heptafluorobutyric acid, etc.), and the liquid/liquid extraction at the end of the process. After derivatization, LODs were decreased by a factor of 3 for OH-PAHs and by a factor of 4 for quinones, all the quinones now being detected. Thereafter, quinones and OH-PAHs were extracted from spiked sediments using MAE followed by GC-MS analysis. Several mixtures of solvents of different volumes (10-25 mL) and different extraction temperatures (80-120°C) were tested to obtain the best recovery yields. Satisfactory recoveries could be obtained for quinones (70-96%) and for OH-PAHs (70-104%). Temperature was a critical factor which had to be controlled to avoid oxy-PAH degradation during the MAE extraction process. Conclusion: Even if MAE-GC-MS was satisfactory for analyzing these oxy-PAHs, the MAE optimization has to be carried on to obtain a more appropriate extraction solvent mixture allowing direct injection into the HPLC-UV-FLD system, which is more sensitive than GC-MS and does not require a long prior derivatization step.

Keywords: derivatizations for GC-MS, microwave assisted extraction, on-line HPLC-UV-FLD, oxygenated PAHs, polluted sediments

Procedia PDF Downloads 287
19807 Lead in The Soil-Plant System Following Aged Contamination from Ceramic Wastes

Authors: F. Pedron, M. Grifoni, G. Petruzzelli, M. Barbafieri, I. Rosellini, B. Pezzarossa

Abstract:

Lead contamination of agricultural land mainly vegetated with perennial ryegrass (Lolium perenne) has been investigated. The metal derived from the discharge of sludge from a ceramic industry that in the past had used lead paints. The results showed very high lead concentrations in many soil samples. In order to assess the lead contamination of the soil, a sequential extraction with H2O, KNO3, and EDTA was performed, and the chemical forms of lead in the soil were evaluated. More than 70% of the lead was in a potentially bioavailable form. Analysis of Lolium perenne showed elevated lead concentrations. A Freundlich-like model was used to describe the transferability of the metal from the soil to the plant.
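
A minimal sketch of fitting a Freundlich-like transfer model, plant concentration = a · (soil concentration)^b, with scipy. The data arrays are placeholders, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c_soil, a, b):
    """Freundlich-like soil-to-plant transfer model."""
    return a * np.power(c_soil, b)

# Placeholder soil / ryegrass lead concentrations (mg/kg).
c_soil = np.array([120.0, 350.0, 800.0, 1500.0, 3000.0])
c_plant = np.array([8.0, 15.0, 24.0, 35.0, 52.0])

(a, b), _ = curve_fit(freundlich, c_soil, c_plant, p0=(1.0, 0.5))
print(f"a = {a:.3f}, b = {b:.3f}")
```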

Keywords: bioavailability, Freundlich-like equation, sequential extraction, soil lead contamination

Procedia PDF Downloads 310
19806 Age Estimation from Teeth among North Indian Population: Comparison and Reliability of Qualitative and Quantitative Methods

Authors: Jasbir Arora, Indu Talwar, Daisy Sahni, Vidya Rattan

Abstract:

Introduction: Age estimation is a crucial step in establishing the identity of a person, both deceased and living. In adults, age can be estimated on the basis of six regressive changes in teeth (attrition, secondary dentine, dentine transparency, root resorption, cementum apposition, and periodontal disease), qualitatively using a scoring system and quantitatively by a micrometric method. The present research was designed to establish the reliability of the qualitative (method 1) and quantitative (method 2) methods of age estimation among North Indians and to compare the efficacy of these two methods. Method: 250 single-rooted extracted teeth (18-75 yrs.) were collected from the Department of Oral Health Sciences, PGIMER, Chandigarh. Before extraction, the periodontal score of each tooth was noted. Labiolingual sections were prepared and examined under a light microscope for regressive changes. Each parameter was scored using Gustafson's 0-3 point scoring system (qualitative), and the total score was calculated. For the quantitative method, each regressive change was measured in the form of 18 micrometric parameters under the microscope with the help of a measuring eyepiece. Age was estimated using linear and multiple regression analysis in Gustafson's method and Kedici's method, respectively. The estimated age was compared with the actual age on the basis of the absolute mean error. Results: In the pooled data, a significant correlation (r = 0.8) was observed between the total score and actual age by Gustafson's method. The total score generated an absolute mean error of ±7.8 years. For Kedici's method, a correlation coefficient of r = 0.5 (p < 0.01) was observed between the eighteen micrometric parameters and known age. Using a multiple regression equation, age was estimated, and the absolute mean error was found to be ±12.18 years. Conclusion: Gustafson's (qualitative) method was found to be the better predictor for age estimation among North Indians.
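
A minimal sketch of the qualitative route: regress age on Gustafson's total score and report the absolute mean error used for the comparison. The arrays are placeholders standing in for the study's 250 teeth.

```python
import numpy as np

# Placeholder data: Gustafson total scores and known chronological ages.
total_score = np.array([3, 5, 6, 8, 9, 11, 12, 14])
age = np.array([22, 28, 34, 41, 46, 55, 61, 70])

slope, intercept = np.polyfit(total_score, age, 1)   # linear regression
predicted = slope * total_score + intercept
abs_mean_error = np.mean(np.abs(predicted - age))
print(f"age ~ {slope:.2f} * score + {intercept:.2f}, AME = +/-{abs_mean_error:.2f} yrs")
```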

Keywords: forensic odontology, age estimation, North India, teeth

Procedia PDF Downloads 242
19805 Ultrasonic Extraction of Phenolics from Leaves of Shallots and Peels of Potatoes for Biofortification of Cheese

Authors: Lila Boulekbache-Makhlouf, Brahmi Fatiha

Abstract:

This study was carried out with the aim of enriching fresh cheese with food by-products, namely the leaves of shallots and the peels of potatoes. First, the conditions for extracting the total polyphenols (TPP) using ultrasound were optimized. Then, the contents of TPP and flavonoids and the antioxidant activity were evaluated for the extracts obtained under the optimal parameters. In addition, we carried out physico-chemical, microbiological, and sensory analyses of the cheese produced. The maximum TPP value of 70.44 mg GAE/g DM for shallot leaves was reached with 40% (v/v) ethanol, an extraction time of 90 min, and a temperature of 10°C. Meanwhile, the maximum TPP content of potato peels, 45.03 ± 4.16 mg GAE/g DM, was obtained using an ethanol/water mixture (40%, v/v), a time of 30 min, and a temperature of 60°C; the flavonoid contents were 13.99 and 7.52 mg QE/g DM, respectively. From the antioxidant tests, we deduced that the potato peels present a higher antioxidant power, with IC50s of 125.42 ± 2.78 μg/mL for DPPH, 87.21 ± 7.72 μg/mL for phosphomolybdate, and 200.77 ± 13.38 μg/mL for iron chelation, compared with the results obtained for shallot leaves, which were 204.29 ± 0.09, 45.85 ± 3.46, and 1004.10 ± 145.73 μg/mL, respectively. The results of the physico-chemical analyses showed that the formulated cheese complied with standards. Microbiological analyses show that the hygienic quality of the cheese produced was satisfactory. According to the sensory analyses, the experts liked the cheese enriched with the powder and pieces of shallot leaves.

Keywords: shallots leaves, potato peels, ultrasound extraction, phenolic, cheese

Procedia PDF Downloads 187
19804 A Computer-Aided System for Tooth Shade Matching

Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan

Abstract:

Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was implemented through dentists' visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices. Nowadays, colorimeters, spectrophotometers, spectroradiometers, and digital image analysis systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively, and precisely. However, these devices have noticeable drawbacks. For example, the translucent structure and irregular surfaces of teeth lead to measurement defects with these devices. Inconsistencies may also arise between the results acquired by devices with different measurement principles. It is therefore necessary to search for new methods for the dental shade matching process. One computer-aided device, the digital camera, has developed rapidly up to today, and advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging. This procedure is much cheaper than the use of traditional contact-type color measurement devices. Digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages. Images taken of teeth show the morphology and color texture of the teeth. In the last decades, a new method was recommended for comparing the color of shade tabs captured by a digital camera using color features. This method showed that visual and computer-aided shade matching systems should be used in combination. Recent feature extraction techniques are based on shape description and do not use color information. However, color is mostly experienced as an essential property in depicting and extracting features from objects in the world around us. When local feature descriptors are extended by concatenating a color descriptor with the shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. Since the color descriptor is used in combination with a shape descriptor, it does not need to contain any spatial information, which leads us to use local histograms. This local color histogram method remains reliable under photometric changes, geometrical changes, and variations in image quality. Therefore, color-based local feature extraction methods are used to extract features, and the Scale Invariant Feature Transform (SIFT) descriptor is used for shape description in the proposed method. After the combination of these descriptors, the state-of-the-art descriptor known as Color-SIFT is used in this study. Finally, the image feature vectors obtained from the quantization algorithm are fed to classifiers such as k-Nearest Neighbors (KNN), Naive Bayes, or Support Vector Machines (SVM) to determine the label(s) of the visual object category or matching; in this study, SVMs are used as classifiers for color determination and shade matching. The experimental results of this method will then be compared with other recent studies. It is concluded that the proposed method is a remarkable development in computer-aided tooth shade determination systems.
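
A minimal sketch of the descriptor-plus-classifier pipeline described above: SIFT keypoints for shape, a local color histogram per keypoint concatenated to the descriptor (a Color-SIFT-style vector), and an SVM on the resulting features. The patch size and SVM settings are assumptions.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def color_sift_features(bgr_img, patch=16):
    """Concatenate each SIFT descriptor with a local color histogram."""
    gray = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints, desc = sift.detectAndCompute(gray, None)
    features = []
    for kp, d in zip(keypoints, desc):
        x, y = int(kp.pt[0]), int(kp.pt[1])
        region = bgr_img[max(0, y - patch):y + patch,
                         max(0, x - patch):x + patch]
        hist = cv2.calcHist([region], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256]).flatten()
        hist /= hist.sum() + 1e-9                    # local color histogram
        features.append(np.concatenate([d, hist]))   # Color-SIFT-style vector
    return np.array(features)

# X: pooled feature vectors per tooth image, y: shade-tab labels (placeholders).
# clf = SVC(kernel="rbf", C=10.0).fit(X, y)
```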

Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction

Procedia PDF Downloads 448
19803 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator software (LPG), which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical, and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
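
A minimal numpy sketch of the Dynamic Time Warping distance used in the identification stage to match an extracted appliance signature against a tuned general model. The signatures below are placeholders.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW between two 1-D power signatures."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

signature = np.array([0, 120, 118, 121, 0], dtype=float)   # watts, 1/60 Hz samples
model = np.array([0, 119, 120, 0], dtype=float)            # general appliance model
print(dtw_distance(signature, model))
```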

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 78
19802 Extraction and Characterization of Kernel Oil of Acrocomia Totai

Authors: Gredson Keif Souza, Nehemias Curvelo Pereira

Abstract:

Kernel oil from macaúba is an important source of essential fatty acids. Thus, new knowledge of the oil of this species could be used in new applications, such as pharmaceutical drugs, the manufacture of cosmetics, and various industrial processes. The aim of this study was to characterize the kernel oil of macaúba (Acrocomia totai) at different stages of maturation. The physico-chemical characteristics were determined in accordance with the official analytical methods for oils and fats. The water and lipid contents of the kernel, the saponification value, acid value, water content in the oil, viscosity, density, fatty acid composition (by gas chromatography), and molar mass were determined. The results were submitted to Tukey's test at the 5% significance level. The unripe fruits showed higher values of unsaturated fatty acids.

Keywords: extraction, characterization, kernel oil, acrocomia totai

Procedia PDF Downloads 358
19801 Electromagnetically-Vibrated Solid-Phase Microextraction for Organic Compounds

Authors: Soo Hyung Park, Seong Beom Kim, Wontae Lee, Jin Chul Joo, Jungmin Lee, Jongsoo Choi

Abstract:

A newly developed electromagnetically vibrated solid-phase microextraction (SPME) device for extracting nonpolar organic compounds from aqueous matrices was evaluated in terms of sorption equilibrium time, precision, and detection level relative to three other more conventional extraction techniques involving SPME, viz., static, magnetic stirring, and fiber insertion/retraction. Electromagnetic vibration at 300~420 cycles/s was found to be the most efficient extraction technique in terms of reducing the sorption equilibrium time and enhancing both precision and linearity. The increased efficiency of electromagnetic vibration was attributed to a greater reduction in the thickness of the stagnant-water layer, which facilitated more rapid mass transport from the aqueous matrix to the SPME fiber. Electromagnetic vibration below 500 cycles/s also did not detrimentally impact the sustainability of the extraction performance of the SPME fiber. Therefore, electromagnetically vibrated SPME may be a more powerful tool for rapid sampling and solvent-free sample preparation relative to the other more conventional extraction techniques used with SPME.

Keywords: electromagnetic vibration, organic compounds, precision, solid-phase microextraction (SPME), sorption equilibrium time

Procedia PDF Downloads 255
19800 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers, and evaluating a student's output requires knowledge across a wide array of domains. Scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have received negative responses from teachers and students alike due to unreliable scoring, lack of feedback, and other issues. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual's answers and matches them against a corpus of words (as defined by the instructor), which is the basis for evaluating the individual's answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the time teachers spend checking and evaluating student output is reduced, making their work more productive and easier.
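
A minimal sketch of the weighted keyword-matching core: the instructor defines keywords or phrases with weights, and the answer text is scored against them. The rubric, weights, and example answer are illustrative, not from the paper.

```python
def score_answer(answer: str, keywords: dict) -> float:
    """Fraction of instructor-assigned weight found in the student answer."""
    text = answer.lower()
    total = sum(keywords.values())
    earned = sum(w for phrase, w in keywords.items() if phrase.lower() in text)
    return earned / total if total else 0.0

rubric = {"photosynthesis": 2.0, "chlorophyll": 1.0, "sunlight": 1.0}
answer = "Plants use sunlight and chlorophyll to perform photosynthesis."
print(score_answer(answer, rubric))   # 1.0 -> full marks on keyword coverage
```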

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 428
19799 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals

Authors: Naser Safdarian, Nader Jafarnia Dabanloo

Abstract:

In this paper, we used four features, i.e., the Q-wave integral, QRS complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focused on the detection and localization of MI in the standard ECG. We used the Q-wave and T-wave integrals because these features are important indicators in the detection of MI. We used pattern recognition methods such as Artificial Neural Networks (ANNs) to detect and localize MI, because these methods have good accuracy for the classification of normal and abnormal signals. We used one type of Radial Basis Function (RBF) network, called the Probabilistic Neural Network (PNN), because of its nonlinearity, as well as other classifiers such as k-Nearest Neighbors (KNN), Multilayer Perceptron (MLP), and Naive Bayes. We used the PhysioNet database for our training and test data. We reached over 80% accuracy on test data for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be further improved by adding more features. A simple method based on only four features extracted from the standard ECG is presented, with good accuracy in MI localization.
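
A minimal sketch of one of the four features: the integral of a delineated wave segment computed by the trapezoidal rule. The fiducial sample indices and sampling rate are assumptions, since wave delineation itself is not shown.

```python
import numpy as np

def wave_integral(ecg, onset, offset, fs=360.0):
    """Area under one wave (e.g. Q or T) between two fiducial samples."""
    segment = ecg[onset:offset]
    t = np.arange(segment.size) / fs
    return np.trapz(segment, t)

ecg = np.random.randn(2000)                   # stand-in lead signal
q_integral = wave_integral(ecg, 400, 430)     # hypothetical Q-wave bounds
t_integral = wave_integral(ecg, 520, 640)     # hypothetical T-wave bounds
```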

Keywords: ECG signal processing, myocardial infarction, features extraction, pattern recognition

Procedia PDF Downloads 456
19798 Automatic Detection of Suicidal Behaviors Using an RGB-D Camera: Azure Kinect

Authors: Maha Jazouli

Abstract:

Suicide is one of the most important causes of death in the prison environment, both in Canada and internationally. Rates of suicide attempts and self-harm have been on the rise in recent years, with hanging being the most frequent method resorted to. The objective of this article is to propose a method to automatically detect suicidal behaviors in real time. We present a gesture recognition system that consists of three modules: model-based movement tracking, feature extraction, and gesture recognition using machine learning algorithms (MLAs). Our proposed system gives satisfactory results. This smart video surveillance system can assist the staff responsible for the safety and health of inmates by alerting them when suicidal behavior is detected, which helps reduce mortality rates and save lives.
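
A minimal sketch of the final recognition module: an SVM trained on feature vectors derived from tracked skeleton joints. The feature construction and labels are placeholders, as the Azure Kinect tracking itself is not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder features: e.g. joint angles / velocities per short clip.
X_train = np.random.rand(200, 30)
y_train = np.random.randint(0, 2, 200)   # 1 = suicidal behavior, 0 = normal

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)

alert = clf.predict(np.random.rand(1, 30))[0]   # 1 would trigger a staff alert
```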

Keywords: suicide detection, Kinect azure, RGB-D camera, SVM, machine learning, gesture recognition

Procedia PDF Downloads 190
19797 An Experiential Learning of Ontology-Based Multi-document Summarization by Removal Summarization Techniques

Authors: Pranjali Avinash Yadav-Deshmukh

Abstract:

The remarkable development of the Internet, along with new technological innovations such as high-speed systems and affordable large storage space, has led to a tremendous increase in the amount of, and accessibility to, digital records. For any person, studying all these data is tremendously time-intensive, so there is a great need for effective multi-document summarization (MDS) systems, which can successfully reduce the details found in several records into a short, understandable summary or conclusion. Our system provides a significant structure, as a theoretical design, for the semantic representation of textual details in the ontology area. The suitability of using an ontology for solving multi-document summarization problems in the disaster management sector motivates the proposed design. A saliency rank is allocated to each sentence, sentences are rated according to the ranking, and the top-rated sentences are chosen as the summary. With regard to summary quality, wide-ranging tests were carried out on a selection of media announcements about the 2014 Jammu and Kashmir floods. Ontology-centered multi-document summarization methods using NLP-based extraction outperform other baselines. Our contribution in the proposed component is to implement NLP-based extraction methods to enhance the results.
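
A minimal sketch of the saliency-ranking step: sentences scored by their summed TF-IDF weight, with the top-rated ones returned in document order. The ontology layer described above is not reproduced.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def extract_summary(sentences, k=3):
    """Rank sentences by total TF-IDF weight and keep the top k in order."""
    X = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    scores = np.asarray(X.sum(axis=1)).ravel()      # saliency per sentence
    top = sorted(np.argsort(scores)[-k:])           # keep document order
    return [sentences[i] for i in top]
```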

Keywords: disaster management, extraction technique, k-means, multi-document summarization, NLP, ontology, sentence extraction

Procedia PDF Downloads 388
19796 Investigation of the Physicochemistry in Leaching of Blackmass for the Recovery of Metals from Spent Lithium-Ion Battery

Authors: Alexandre Chagnes

Abstract:

The lithium-ion battery is the technology of choice in the development of electric vehicles. This technology is now mature, although there are still many challenges in increasing its energy density while ensuring irreproachable safety of use. For this goal, it is necessary to develop new cathodic materials that can be cycled at higher voltages, and electrolytes compatible with these materials. But the challenge does not only concern the production of efficient batteries for the electrochemical storage of energy, since lithium-ion battery technology relies on the use of resources of critical and/or strategic value. It is, therefore, crucial to include lithium-ion battery development in a circular economy approach very early. In particular, optimized recycling and reuse of battery components must both minimize their impact on the environment and limit geopolitical issues related to tensions over the mineral resources necessary for lithium-ion battery production. Although recycling will never replace mining, it reduces resource dependence by ensuring the presence of exploitable resources in the territory, which is particularly important for countries like France, where exploited or exploitable resources are limited. This contribution addresses the development of a new hydrometallurgical process combining leaching of the cathodic material from spent lithium-ion batteries in acidic chloride media with a solvent extraction process. Most recycling processes reported in the literature rely on the sulphate route, and few studies investigate the potential of the chloride route, despite its many advantages and the possibility of developing new chemistry that could make metal separation easier. The leaching mechanisms and the solvent extraction equilibria will be presented. Based on an understanding of the physicochemistry of leaching and solvent extraction, the present study introduces a new hydrometallurgical process for the production of cobalt, nickel, manganese, and lithium from spent cathodic materials.

Keywords: lithium-ion battery, recycling, hydrometallurgy, leaching, solvent extraction

Procedia PDF Downloads 80
19795 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator software (LPG), which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical, and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 82
19794 Tumor Boundary Extraction Using Intensity and Texture-Based on Gradient Vector

Authors: Namita Mittal, Himakshi Shekhawat, Ankit Vidyarthi

Abstract:

In medical research, doctors and radiologists face a great deal of complexity in analysing brain tumors in Magnetic Resonance (MR) images. Brain tumor detection is difficult due to the amorphous tumor shape and the overlapping of similar tissues in nearby regions. Radiologists therefore require a clinically viable solution that helps in the automatic segmentation of tumors inside brain MR images. Early segmentation methods detected tumors by dividing the image into segments, but this causes loss of information. In this paper, a hybrid method is proposed that detects the Region of Interest (ROI) on the basis of differences in the intensity and texture values between the tumor region and nearby tissues, using the Gradient Vector Flow (GVF) technique in the identification of the ROI. The proposed approach uses both intensity and texture values for the identification of the abnormal section of brain MR images. Experimental results show that the proposed method outperforms the plain GVF method without any loss of information.
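
A minimal sketch of the Gradient Vector Flow field computation (Xu and Prince's iterative scheme) from an edge map; the regularization weight and iteration count are assumptions.

```python
import numpy as np
from scipy.ndimage import laplace

def gradient_vector_flow(edge_map, mu=0.2, iters=80):
    """Iteratively diffuse the edge-map gradient into a smooth vector field."""
    fy, fx = np.gradient(edge_map.astype(float))
    u, v = fx.copy(), fy.copy()
    mag2 = fx ** 2 + fy ** 2              # squared gradient magnitude
    for _ in range(iters):
        u += mu * laplace(u) - (u - fx) * mag2
        v += mu * laplace(v) - (v - fy) * mag2
    return u, v                           # field guiding the boundary contour
```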

Keywords: brain tumor, GVF, intensity, MR images, segmentation, texture

Procedia PDF Downloads 433