Search results for: sequential extraction process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16959

16419 Ontology Mapping with R-GNN for IT Infrastructure: Enhancing Ontology Construction and Knowledge Graph Expansion

Authors: Andrey Khalov

Abstract:

The rapid growth of unstructured data necessitates advanced methods for transforming raw information into structured knowledge, particularly in domain-specific contexts such as IT service management and outsourcing. This paper presents a methodology for automatically constructing domain ontologies using the DOLCE framework as the base ontology. The research focuses on expanding ITIL-based ontologies by integrating concepts from ITSMO, followed by the extraction of entities and relationships from domain-specific texts through transformers and statistical methods like formal concept analysis (FCA). In particular, this work introduces an R-GNN-based approach for ontology mapping, enabling more efficient entity extraction and ontology alignment with existing knowledge bases. Additionally, the research explores transfer learning techniques using pre-trained transformer models (e.g., DeBERTa-v3-large) fine-tuned on synthetic datasets generated via large language models such as LLaMA. The resulting ontology, termed IT Ontology (ITO), is evaluated against existing methodologies, highlighting significant improvements in precision and recall. This study advances the field of ontology engineering by automating the extraction, expansion, and refinement of ontologies tailored to the IT domain, thus bridging the gap between unstructured data and actionable knowledge.
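
As an illustration of the transformer-based entity extraction step above, the sketch below runs a DeBERTa-v3-large token-classification head over one sentence. The BIO label set and the example sentence are illustrative assumptions, not the paper's; the classification head is randomly initialised until it is fine-tuned on labelled (e.g., LLaMA-generated synthetic) data.

```python
# Hedged sketch: token-level entity tagging with DeBERTa-v3-large, the base
# model named in the abstract. The label set below is a hypothetical BIO
# scheme; the classification head is untrained until fine-tuned.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-IT_CONCEPT", "I-IT_CONCEPT"]   # hypothetical tag set
name = "microsoft/deberta-v3-large"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(
    name, num_labels=len(labels))

text = "The incident management process escalates tickets to the service desk."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits               # (1, seq_len, n_labels)
preds = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"].squeeze().tolist())
for tok, p in zip(tokens, preds):
    print(tok, labels[p])                         # one BIO tag per subword
```

Entity spans tagged this way would then feed the FCA and R-GNN alignment stages described in the abstract.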

Keywords: ontology mapping, knowledge graphs, R-GNN, ITIL, NER

Procedia PDF Downloads 15
16418 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Considerable effort has been devoted to measuring the semantic similarity between text corpora, and a range of techniques has evolved for comparing two documents. One state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector (word2vec) model, which converts words into word embeddings and measures the similarity between the resulting vectors. We found this approach well suited to the task of resume ranking. This paper therefore presents an implementation of the word2vec model, combined with other NLP techniques, that ranks resumes against a particular job description in order to automate the hiring process. The paper describes the proposed system and the findings made while building it.
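
A minimal sketch of the core ranking idea, under stated assumptions: train a custom word2vec model, average word vectors into document vectors, and rank resumes by cosine similarity to the job description. The toy corpus and preprocessing are illustrative, not the paper's data or pipeline.

```python
# Hedged sketch: custom word2vec + cosine similarity for resume ranking.
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

job = "python developer with machine learning and nlp experience"
resumes = [
    "experienced python engineer, worked on nlp pipelines",
    "accountant with ten years of bookkeeping experience",
]
corpus = [simple_preprocess(d) for d in [job] + resumes]
w2v = Word2Vec(corpus, vector_size=50, min_count=1, epochs=200, seed=1)

def doc_vec(tokens):
    """Average the word vectors of a document into one document vector."""
    return np.mean([w2v.wv[t] for t in tokens if t in w2v.wv], axis=0)

jv = doc_vec(corpus[0])
for text, toks in zip(resumes, corpus[1:]):
    rv = doc_vec(toks)
    cos = float(np.dot(jv, rv) / (np.linalg.norm(jv) * np.linalg.norm(rv)))
    print(f"{cos:.3f}  {text}")        # higher score = better match
```

With a real corpus the model would be trained on a large collection of resumes and job descriptions; TF-IDF-weighted averaging and phrase chunking (as in the keywords) are common refinements.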

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 158
16417 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes: it identifies individual appliances by analyzing whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for accurate identification of household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps, using unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the operating range of each residential appliance from its power demand and then detecting the times at which each selected appliance changes state. To match the capabilities of existing smart meters, we work with low-sampling-rate data at a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG), software that had not previously been considered for NILM purposes in the literature; LPG simulates the behaviour of the occupants of a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. The identification process likewise includes unsupervised techniques such as DTW; to the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector delimiting its state transitions. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with detection techniques previously reported in the literature that are based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
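
A minimal sketch of the DTW-based signature matching step described above: compare a measured power segment against stored appliance signatures and pick the closest. The signatures and segment are toy data; the paper's full feature pipeline is not reproduced.

```python
# Hedged sketch: classic dynamic time warping for appliance identification.
import numpy as np

def dtw_distance(a, b):
    """Quadratic-cost dynamic time warping distance between two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

signatures = {                     # hypothetical per-appliance power profiles (W)
    "fridge": np.array([0, 120, 120, 115, 0]),
    "kettle": np.array([0, 2000, 2100, 0, 0]),
}
segment = np.array([0, 118, 122, 117, 0])     # measured 1/60 Hz samples
best = min(signatures, key=lambda k: dtw_distance(segment, signatures[k]))
print("closest appliance:", best)             # -> fridge
```

The distance here is the classic O(nm) dynamic program; banded or pruned variants keep it tractable on longer 1/60 Hz traces.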

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 78
16416 Ultrasonic Extraction of Phenolics from Leaves of Shallots and Peels of Potatoes for Biofortification of Cheese

Authors: Lila Boulekbache-Makhlouf, Brahmi Fatiha

Abstract:

This study was carried out with the aim of enriching fresh cheese with food by-products: the leaves of shallots and the peels of potatoes. Firstly, the conditions for extracting total polyphenols (TPP) using ultrasound were optimized. Then, the TPP and flavonoid contents and the antioxidant activity were evaluated for the extracts obtained under the optimal parameters. In addition, physico-chemical, microbiological, and sensory analyses of the cheese produced were carried out. The maximum TPP value of 70.44 mg GAE/g DM for shallot leaves was reached with 40% (v/v) ethanol, an extraction time of 90 min, and a temperature of 10°C, while the maximum TPP content of potato peels, 45.03 ± 4.16 mg GAE/g DM, was obtained using an ethanol/water mixture (40%, v/v), a time of 30 min, and a temperature of 60°C; the flavonoid contents were 13.99 and 7.52 mg QE/g DM, respectively. From the antioxidant tests, we deduced that the potato peels present a higher antioxidant power, with IC50 values of 125.42 ± 2.78 μg/mL for DPPH, 87.21 ± 7.72 μg/mL for phosphomolybdate, and 200.77 ± 13.38 μg/mL for iron chelation, compared with 204.29 ± 0.09, 45.85 ± 3.46, and 1004.10 ± 145.73 μg/mL, respectively, for shallot leaves. The results of the physico-chemical analyses showed that the formulated cheese complied with standards, and the microbiological analyses showed that the hygienic quality of the cheese produced was satisfactory. According to the sensory analyses, the panel preferred the cheese enriched with the powder and pieces of shallot leaves.

Keywords: shallot leaves, potato peels, ultrasound extraction, phenolics, cheese

Procedia PDF Downloads 184
16415 Flashsonar or Echolocation Education: Expanding the Function of Hearing and Changing the Meaning of Blindness

Authors: Thomas Tajo, Daniel Kish

Abstract:

Sight is primarily associated with the function of gathering and processing near and extended spatial information, which is largely used to support self-determined interaction with the environment through self-directed movement and navigation. By contrast, hearing is primarily associated with the function of gathering and processing sequential information, which may typically be used to support self-determined communication through the self-directed use of music and language. Blindness, or the lack of vision, is traditionally characterized by a lack of capacity to access spatial information which, in turn, is presumed to result in a lack of capacity for self-determined interaction with the environment due to limitations in self-directed movement and navigation. However, through a specific protocol of FlashSonar education developed by World Access for the Blind, the function of hearing can be expanded in blind people to carry out some of the functions normally associated with sight, that is, to access and process near and extended spatial information and to construct three-dimensional acoustic images of the environment. This perceptual education protocol results in a significant restoration in blind people of the self-determined environmental interaction, movement, and navigational capacities normally attributed to vision - a new way to see. Thus, by expanding the function of hearing to process spatial information and restore self-determined movement, we are not only changing the meaning of blindness and what it means to be blind, but also recasting the meaning of vision and what it is to see.

Keywords: echolocation, changing, sensory, function

Procedia PDF Downloads 154
16414 Extraction and Characterization of Kernel Oil of Acrocomia Totai

Authors: Gredson Keif Souza, Nehemias Curvelo Pereira

Abstract:

Kernel oil from macaúba is an important source of essential fatty acids, so new knowledge of the oil of this species could support new applications, such as pharmaceutical products, the manufacture of cosmetics, and various industrial processes. The aim of this study was to characterize the kernel oil of macaúba (Acrocomia totai) at different stages of fruit maturation. The physico-chemical characteristics were determined in accordance with official analytical methods for oils and fats: the water and lipid contents of the kernel, saponification value, acid value, water content of the oil, viscosity, density, fatty acid composition (by gas chromatography), and molar mass. The results were subjected to Tukey's test at the 5% significance level. Higher values of unsaturated fatty acids were found for the unripe fruits.

Keywords: extraction, characterization, kernel oil, acrocomia totai

Procedia PDF Downloads 356
16413 Electromagnetically-Vibrated Solid-Phase Microextraction for Organic Compounds

Authors: Soo Hyung Park, Seong Beom Kim, Wontae Lee, Jin Chul Joo, Jungmin Lee, Jongsoo Choi

Abstract:

A newly-developed electromagnetically vibrated solid-phase microextraction (SPME) device for extracting nonpolar organic compounds from aqueous matrices was evaluated in terms of sorption equilibrium time, precision, and detection level relative to three other more conventional extraction techniques involving SPME, viz., static, magnetic stirring, and fiber insertion/retraction. Electromagnetic vibration at 300~420 cycles/s was found to be the most efficient extraction technique in terms of reducing sorption equilibrium time and enhancing both precision and linearity. The increased efficiency for electromagnetic vibration was attributed to a greater reduction in the thickness of the stagnant-water layer that facilitated more rapid mass transport from the aqueous matrix to the SPME fiber. Electromagnetic vibration less than 500 cycles/s also did not detrimentally impact the sustainability of the extracting performance of the SPME fiber. Therefore, electromagnetically vibrated SPME may be a more powerful tool for rapid sampling and solvent-free sample preparation relative to other more conventional extraction techniques used with SPME.

Keywords: electromagnetic vibration, organic compounds, precision, solid-phase microextraction (SPME), sorption equilibrium time

Procedia PDF Downloads 254
16412 Numerical Investigation of Phase Change Materials (PCM) Solidification in a Finned Rectangular Heat Exchanger

Authors: Mounir Baccar, Imen Jmal

Abstract:

Because of the rise in energy costs, thermal storage systems designed for the heating and cooling of buildings are becoming increasingly important. Energy storage can not only reduce the time or rate mismatch between energy supply and demand but also plays an important role in energy conservation. One of the most preferable storage techniques is Latent Heat Thermal Energy Storage (LHTES) in Phase Change Materials (PCM), due to its high energy storage density and isothermal storage process. This paper presents a numerical study of the solidification of a PCM (paraffin RT27) in a rectangular thermal storage exchanger for air conditioning systems, taking into account the presence of natural convection. The continuity, momentum, and thermal energy equations are solved by the finite volume method. The main objective of this numerical approach is to study the effect of natural convection on the PCM solidification time and the impact of the number of fins on heat transfer enhancement. It also investigates the temporal evolution of PCM solidification, as well as the longitudinal temperature profiles of the heat transfer fluid (HTF) circulating in the duct. The present research considers two cases: the first treats the solidification of PCM in a PCM-air heat exchanger without fins, while the second focuses on the solidification of PCM in a heat exchanger of the same type with the addition of fins (3 fins, 5 fins, and 9 fins). Without fins, stratification of the PCM from colder to hotter layers was observed during the heat transfer process; this behavior prevents the formation of thermo-convective cells in the PCM region and makes heat transfer almost purely conductive. In the presence of fins, energy extraction from the PCM to the airflow occurs at a faster rate, which contributes to reducing the discharging time and increasing the outlet air (HTF) temperature. However, for a large number of fins (9 fins), the enhancement of the solidification process is not significant, because the liquid PCM spaces become too confined for thermo-convective flow to develop; hence the effect of natural convection is not very significant for a high number of fins. In the optimum case, using 3 fins, the temperature rise of the HTF exceeds approximately 10°C during the first 30 minutes. As solidification progresses from the surfaces of the PCM container and propagates toward the central liquid phase, an insulating layer is created in the vicinity of the container surfaces and the fins, causing a low heat exchange rate between PCM and air. As the solid PCM layer gets thicker, the velocity field in the liquid phase progressively weakens, inhibiting the heat extraction process. After about 2 hours, 68% of the PCM had solidified, and heat transfer was dominated by conduction.
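
The conduction-dominated limit noted above can be illustrated in one dimension with an explicit finite-volume enthalpy method. The sketch below is heavily simplified, assuming rough RT27-like property values; fins and natural convection are not modelled.

```python
# Hedged sketch: 1D explicit enthalpy-method solidification of a paraffin
# slab cooled from one wall. Property values are illustrative assumptions.
import numpy as np

k, rho, c, L = 0.2, 800.0, 2000.0, 179e3    # W/mK, kg/m3, J/kgK, J/kg
Tm, Twall, T0 = 28.0, 10.0, 35.0            # melting, wall, initial temp (C)
nx = 50
dx, dt, steps = 0.05 / nx, 0.1, 100_000     # 5 cm slab, ~2.8 h of cooling

def temperature(H):
    """Invert the enthalpy-temperature relation of an isothermal phase change."""
    T = np.where(H < 0, Tm + H / (rho * c), Tm)            # fully solid
    return np.where(H > rho * L, Tm + (H - rho * L) / (rho * c), T)

H = np.full(nx, rho * L + rho * c * (T0 - Tm))             # start fully liquid
for _ in range(steps):
    T = temperature(H)
    T = np.concatenate(([Twall], T, [T[-1]]))              # cold / adiabatic ends
    H += dt * k * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2 # explicit FV update

solid_fraction = np.clip(1.0 - H / (rho * L), 0.0, 1.0).mean()
print(f"solid fraction after {steps * dt / 3600:.1f} h: {solid_fraction:.2f}")
```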

Keywords: heat transfer enhancement, solidification front, PCM, natural convection

Procedia PDF Downloads 187
16411 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes: it identifies individual appliances by analyzing whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for accurate identification of household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps, using unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the operating range of each residential appliance from its power demand and then detecting the times at which each selected appliance changes state. To match the capabilities of existing smart meters, we work with low-sampling-rate data at a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG), software that had not previously been considered for NILM purposes in the literature; LPG simulates the behaviour of the occupants of a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. The identification process likewise includes unsupervised techniques such as DTW; to the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector delimiting its state transitions. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with detection techniques previously reported in the literature that are based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 82
16410 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student's output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have been received negatively by teachers and students alike due to unreliable scoring, lack of feedback, and other shortcomings. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual's answers and matches them against a corpus of words defined by the instructor, which forms the basis for evaluating the answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the time teachers spend checking and evaluating student output is reduced, making their work more productive and less laborious.
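
A minimal sketch of the weighted-keyword scoring described above: the instructor defines keywords and phrases with weights, and a student answer is scored by which of them it contains. The rubric and answer are illustrative assumptions; a real system would add synonym matching and other NLP preprocessing.

```python
# Hedged sketch: instructor-defined keyword weights scoring a short answer.
import re

rubric = {"photosynthesis": 0.4, "sunlight": 0.2,
          "carbon dioxide": 0.2, "glucose": 0.2}   # weights sum to 1.0

def score(answer, rubric):
    """Return total weighted score plus per-keyword hits for feedback."""
    text = answer.lower()
    hits = {kw: bool(re.search(r"\b" + re.escape(kw) + r"\b", text))
            for kw in rubric}
    return sum(w for kw, w in rubric.items() if hits[kw]), hits

total, hits = score("Plants use sunlight and carbon dioxide to make glucose.",
                    rubric)
print(total, hits)   # 0.6, with per-keyword feedback for the teacher
```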

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 428
16409 Characterization of the Internal Corrosion Behavior of Low Carbon Steel Pipelines in Crude Oil by Using a Natural Inhibitor

Authors: Iman Adnan Annon, Kadhim F. Alsultan

Abstract:

This study investigates the internal corrosion of low carbon steel pipelines in crude oil and the preparation and use of a natural, locally available plant extract as a corrosion inhibitor. The extraction was carried out with two types of solvent, distilled water and diethyl ether, in order to assess the effect of the solvent on the inhibition process. FT-IR spectra and chemical reagent tests confirmed the presence of many active groups and of tannins, phenols, and alkaloids in the natural extract. Experiments were carried out to evaluate the performance of the new inhibitor, including corrosion measurements by simple immersion in crude oil with and without the inhibitor, added in amounts of 30, 40, 50, and 60 ppm at two temperatures, 300 and 323 K. The best inhibition efficiencies were obtained when the inhibitor was added at or near the critical amount: for the aqueous extract (EB-A), the inhibition efficiency reached 94.4% and 86.71% at 300 and 323 K, respectively, and for the diethyl ether extract (EB-D), 82.87% and 84.6% at 300 and 323 K, respectively. Optical microscopy examinations were conducted to evaluate the nature of the corrosion and showed a clear difference in the surface topography of the immersed samples after addition of the inhibitor at the two temperatures. The results show that the new corrosion inhibitor is not only equivalent to a chemical inhibitor but offers greatly improved properties: high efficiency, low cost, non-toxicity, ease of production, and absence of pollution compared with chemical inhibitors.

Keywords: corrosion in pipeline, inhibitors, crude oil, carbon steel, types of solvent

Procedia PDF Downloads 140
16408 Chemical Modification of Biosorbent for Preconcentration of Cadmium in Water Samples

Authors: Homayon Ahmad Panahi, Niusha Mohseni Darabi, Elham Moniri

Abstract:

A new biosorbent was prepared by coupling Cibacron Blue to yeast cells. The Cibacron Blue-modified yeast cells were characterized by Fourier transform infrared spectroscopy (FT-IR) and elemental analysis and applied to the preconcentration and solid phase extraction of trace cadmium ions from water samples. The optimum pH for sorption of cadmium ions by the yeast cell-Cibacron Blue sorbent was 5.5, and the sorption capacity of the modified biosorbent was 45 mg g−1. A recovery of 98.2% was obtained for Cd(II) when eluted with 0.5 M nitric acid. The method was applied to Cd(II) preconcentration and determination in a seawater sample.
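
The keywords mention an isotherm study; a typical analysis fits the Langmuir model to equilibrium sorption data. The sketch below is illustrative only: the data points are assumptions, guided by nothing more than the reported ~45 mg/g capacity.

```python
# Hedged sketch: Langmuir isotherm fit, q = q_max * K * C / (1 + K * C),
# for Cd(II) sorption on a modified biosorbent. Data points are assumed.
import numpy as np
from scipy.optimize import curve_fit

C = np.array([2, 5, 10, 20, 50, 100.0])   # equilibrium concentration, mg/L
q = np.array([10, 20, 29, 36, 42, 44.0])  # sorbed amount, mg/g

def langmuir(C, qmax, K):
    return qmax * K * C / (1 + K * C)

(qmax, K), _ = curve_fit(langmuir, C, q, p0=(45.0, 0.1))
print(f"q_max = {qmax:.1f} mg/g, K = {K:.3f} L/mg")
```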

Keywords: solid phase extraction, yeast cells, nickel, isotherm study

Procedia PDF Downloads 264
16407 An Experiential Learning of Ontology-Based Multi-document Summarization by Removal Summarization Techniques

Authors: Pranjali Avinash Yadav-Deshmukh

Abstract:

The remarkable development of the Internet, along with technological innovations such as high-speed systems and affordable large-scale storage, has led to a tremendous increase in the amount and accessibility of digital records. For any person, studying all of these data is tremendously time-intensive, so there is a great need for effective multi-document summarization (MDS) systems, which can reduce the details found in several records into a short, understandable summary. Our system provides a significant structure, as a theoretical design, for the semantic representation of textual details in the ontology domain. The suitability of ontologies for solving multi-document summarization problems is demonstrated in the disaster management domain. A saliency score is assigned to each sentence, sentences are ranked according to this score, and the top-ranked sentences are selected as the summary. With regard to summary quality, extensive tests were conducted on a selection of news reports on the 2014 Jammu and Kashmir floods. Ontology-based multi-document summarization methods using NLP-based extraction outperform other baselines. Our contribution to the proposed component is the application of NLP-based extraction techniques to enhance the results.
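
A minimal sketch of the extractive saliency-ranking step described above: score each sentence against the document-set centroid with TF-IDF cosine similarity and keep the top-ranked sentences. The toy sentences are illustrative; the paper's ontology-based semantic weighting is not reproduced here.

```python
# Hedged sketch: centroid-based sentence saliency for extractive summarization.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "Flooding displaced thousands of residents across the valley.",
    "Rescue teams used boats to evacuate stranded families.",
    "A local festival was postponed for unrelated reasons.",
]
tfidf = TfidfVectorizer().fit_transform(sentences)     # sparse term matrix
centroid = np.asarray(tfidf.mean(axis=0))              # document-set centroid
saliency = cosine_similarity(tfidf, centroid).ravel()  # one score per sentence
top = sorted(range(len(sentences)), key=lambda i: -saliency[i])[:2]
print(" ".join(sentences[i] for i in sorted(top)))     # 2-sentence summary
```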

Keywords: disaster management, extraction technique, k-means, multi-document summarization, NLP, ontology, sentence extraction

Procedia PDF Downloads 386
16406 Creativity in Industrial Design as an Instrument for the Achievement of the Proper and Necessary Balance between Intuition and Reason, Design and Science

Authors: Juan Carlos Quiñones

Abstract:

Time has passed since industrial design, in Victor Papanek's famous charge, "put murder on a mass-production basis." Industrial design now applies methods from different disciplines with a strategic approach, to place humans at the center of the design process and to deliver solutions that are meaningful and desirable for users and for the market. This analysis summarizes some of the discussions that occurred at the 6th International Forum of Design as a Process (Valencia, June 2016), whose aim was to find new linkages between systems and design interactions in order to define their social consequences. Through knowledge management, we are able to transform the intangible, using design as a transforming function capable of converting intangible knowledge into tangible solutions (i.e., the products and services demanded by society). Industrial designers use knowledge consciously as a starting point for the ideation of the product. The handling of the intangible becomes more and more relevant over time, as different methods emerge for knowledge extraction and subsequent organization. The different methodologies applied to the industrial design discipline, and the evolution of the discipline's own methods, underpin cultural and scientific background knowledge as a starting point of thought in response to needs, the whole coming through the instrument of creativity to achieve the proper and necessary balance between intuition and reason, design and science.

Keywords: creative process, creativity, industrial design, intangible

Procedia PDF Downloads 287
16405 Economic Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emmanuele, Agbadede Roupa, Allison Isaiah

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric carbon dioxide has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero emission power plant. The advanced zero emission power plant makes use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process, first introduced in 1899 when Walther Hermann Nernst investigated electric current between metals and solutions: he found that when a dense ceramic is heated, a current of oxygen ions moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP has drawn a lot of attention because of its ability to capture ~100% of CO2; it also promises a 30-50% cost reduction compared to other carbon abatement technologies, its efficiency penalty is smaller than that of its counterparts, and it achieves almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero emission power plant differs from a conventional gas turbine in that its combustor is substituted with the MCM reactor. The MCM reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by raising the temperature of 90% of the air using the LTHX; the higher temperature also facilitates oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to its inlet, where its temperature is increased to about 1150 K, depending on the design-point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air passing through it. The AZEP cycle model was developed in Fortran, and the economic analysis was conducted in Excel and MATLAB, followed by an optimization case study. This paper discusses the economic analysis of four possible layouts of the AZEP cycle: the simple bleed gas heat exchange layout (100% CO2 capture); the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture); the pre-expansion reheating (sequential burning) layout, AZEP 85% (85% CO2 capture); and the pre-expansion reheating (sequential burning) layout with flue gas turbine, AZEP 85% (85% CO2 capture).

Keywords: gas turbine, global warming, greenhouse gases, fossil fuel power plants

Procedia PDF Downloads 397
16404 Jamun Juice Extraction Using Commercial Enzymes and Optimization of the Treatment with the Help of Physicochemical, Nutritional and Sensory Properties

Authors: Payel Ghosh, Rama Chandra Pradhan, Sabyasachi Mishra

Abstract:

Jamun (Syzygium cuminii L.) is an important indigenous minor fruit with high medicinal value. Jamun cultivation is unorganized, and a huge quantity of the fruit is lost every year; the perishable nature of the fruit makes its postharvest management even more difficult. Due to the strong pectin-protein bonds of the cell wall structure and the hard seeds, extraction of the juice is difficult. Enzymatic treatment has been used commercially to improve juice quality and yield. The objective of this study was to optimize the treatment method for juice extraction. Enzymes (pectinase and tannase) from different strains were used, and for each enzyme the best result was obtained using response surface methodology. Optimization was carried out on the basis of physicochemical properties, nutritional properties, sensory quality, and cost estimation. According to the quality aspects, cost analysis, and sensory evaluation, the optimal enzymatic treatment was pectinase from an Aspergillus aculeatus strain. The optimum conditions for the treatment were 44°C for 80 min at an enzyme concentration of 0.05% (w/w). Under these conditions, a yield of 75% was obtained, with a turbidity of 32.21 NTU, clarity of 74.39 %T, polyphenol content of 115.31 mg GAE/g, and protein content of 102.43 mg/g, with a significant difference in overall acceptability.
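
A minimal sketch of the response surface methodology (RSM) step described above: fit a second-order model of juice yield in temperature and time, then locate the fitted optimum on a grid. The design points and yields below are illustrative assumptions, not the study's data.

```python
# Hedged sketch: quadratic response surface fit and optimum lookup.
import numpy as np

T = np.array([30, 30, 50, 50, 40, 40, 40, 26, 54], float)    # temperature, C
t = np.array([40, 120, 40, 120, 80, 80, 80, 80, 80], float)  # time, min
y = np.array([60, 66, 65, 70, 74, 75, 73, 63, 68], float)    # yield, %

X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)                 # quadratic fit

Tg, tg = np.meshgrid(np.linspace(26, 54, 100), np.linspace(40, 120, 100))
Yg = (beta[0] + beta[1] * Tg + beta[2] * tg
      + beta[3] * Tg**2 + beta[4] * tg**2 + beta[5] * Tg * tg)
i = np.unravel_index(np.argmax(Yg), Yg.shape)
print(f"fitted optimum: {Tg[i]:.0f} C, {tg[i]:.0f} min, yield {Yg[i]:.1f}%")
```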

Keywords: enzymatic treatment, Jamun, optimization, physicochemical property, sensory analysis

Procedia PDF Downloads 296
16403 Sequential Release of Dual Drugs Using Thermo-Sensitive Hydrogel for Tumor Vascular Inhibition and to Enhance the Efficacy of Chemotherapy

Authors: Haile F. Darge, Hsieh C. Tsai

Abstract:

The tumor microenvironment affects the therapeutic outcomes of cancer disease. In a malignant tumor, overexpression of vascular endothelial growth factor (VEGF) provokes the production of pathologic vascular networks. This results in a hostile tumor environment that hinders anti-cancer drug activity and profoundly fuels tumor progression. In this study, we develop a strategy for the sequential sustained release of the anti-angiogenic drug bevacizumab (BVZ) and the anti-cancer drug doxorubicin (DOX), which have a synergistic effect in cancer treatment. A poly(D,L-lactide)-poly(ethylene glycol)-poly(D,L-lactide) (PDLLA-PEG-PDLLA) thermo-sensitive hydrogel was used as the vehicle for local delivery of both drugs in a single platform. The in vitro release profiles of the drugs were investigated and confirmed a relatively rapid release of BVZ (73.56 ± 1.39%) followed by DOX (61.21 ± 0.62%) over a prolonged period. The cytotoxicity test revealed that the copolymer exhibited negligible cytotoxicity on HaCaT and HeLa cells up to a concentration of 2.5 mg ml-1. The in vivo study on HeLa xenograft nude mice verified that the hydrogel co-loaded with BVZ and DOX displayed the highest tumor suppression efficacy for up to 36 days, with a pronounced anti-angiogenic effect of BVZ and no noticeable damage to vital organs. Therefore, localized co-delivery of an anti-angiogenic drug and an anti-cancer drug by the hydrogel system may be a promising approach for enhanced chemotherapeutic efficacy in cancer treatment.

Keywords: anti-angiogenesis, chemotherapy, controlled release, thermo-sensitive hydrogel

Procedia PDF Downloads 134
16402 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance data quality by alleviating missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature and are therefore handled effectively with a K-means clustering-based SMOTE model. From the class-balanced data, features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal ones are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The detection stage uses deep learning models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the selected optimal features, and their outputs enter the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
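
A minimal sketch of the imbalance-handling step described above: K-means clustering-based SMOTE oversampling of the minority (fraud) class before model training. The synthetic dataset stands in for real transactions, and cluster_balance_threshold may need tuning on other data.

```python
# Hedged sketch: K-means SMOTE rebalancing of an imbalanced dataset.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import KMeansSMOTE

X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.95, 0.05], random_state=0)
print("before:", Counter(y))                  # ~19:1 imbalance
sampler = KMeansSMOTE(random_state=0, cluster_balance_threshold=0.05)
X_bal, y_bal = sampler.fit_resample(X, y)     # synthesize minority samples
print("after: ", Counter(y_bal))              # classes roughly balanced
```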

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 131
16401 Modeling and Simulation of Fluid Catalytic Cracking Process

Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee

Abstract:

The fluid catalytic cracking (FCC) process is one of the most important processes in the modern refinery industry, and it is the focus of this paper. Because the FCC process is difficult to model well, due to its nonlinearities and the various interactions between its process variables, rigorous process modeling of the whole FCC plant is required for control and plant-wide optimization. In this study, a process design for an FCC plant comprising the riser reactor, main fractionator, and gas processing unit was developed. The reactor model is based on a four-lump kinetic scheme. The main fractionator, gas processing unit, and other process units were designed to reproduce real plant data using the process flowsheet simulator Aspen PLUS. The custom reactor model was integrated with the flowsheet simulator to develop an integrated process model.
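
A minimal sketch of a four-lump riser kinetic scheme (gas oil, gasoline, light gas, coke) integrated over the riser residence time. The first-order rate constants are illustrative assumptions; the paper's kinetics and the Aspen PLUS integration are not reproduced here.

```python
# Hedged sketch: four-lump FCC kinetics as a small ODE system.
from scipy.integrate import solve_ivp

k1, k2, k3 = 0.30, 0.06, 0.03      # gas oil -> gasoline / gas / coke (1/s)
k4, k5 = 0.02, 0.01                # gasoline -> gas / coke (1/s)

def rhs(t, y):
    go, gl, gas, coke = y          # mass fractions of the four lumps
    return [-(k1 + k2 + k3) * go,
            k1 * go - (k4 + k5) * gl,
            k2 * go + k4 * gl,
            k3 * go + k5 * gl]     # mass is conserved across the lumps

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0])   # 10 s residence
go, gl, gas, coke = sol.y[:, -1]
print(f"gas oil {go:.2f}, gasoline {gl:.2f}, gas {gas:.2f}, coke {coke:.2f}")
```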

Keywords: fluid catalytic cracking, simulation, plant data, process design

Procedia PDF Downloads 529
16400 Speech Detection Model Based on a Deep Neural Network Classifier for Speech Emotion Recognition

Authors: A. Shoiynbek, K. Kozhakhmet, P. Menezes, D. Kuanyshbay, D. Bayazitov

Abstract:

Speech emotion recognition (SER) has received increasing research interest in recent years. Most research work has used emotional speech collected under controlled conditions, recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues with that approach: (1) the emotions are not natural, which means that machines learn to recognize fake emotions; (2) the emotions are limited in quantity and poor in variety of speaking; (3) SER is language-dependent; and (4) consequently, whenever researchers want to start working on SER, they need to find a good emotional database in their language. In this paper, we propose an approach to creating an automatic tool for speech emotion extraction based on facial emotion recognition, and we describe the sequence of actions of the proposed approach. One of the first steps in this sequence is speech detection. The paper gives a detailed description of the speech detection model, based on a fully connected deep neural network, for the Kazakh and Russian languages. Although the reported speech detection results are for Kazakh and Russian, the described process is suitable for any language. To illustrate the capabilities of the developed model, we analyzed speech detection and extraction on real-world tasks.
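
A minimal sketch of frame-level speech detection: 13 MFCCs per frame fed to a small fully connected network. Synthetic "speech-like" and "noise" signals stand in for a labelled Kazakh/Russian corpus, and sklearn's MLPClassifier stands in for the paper's deep neural network.

```python
# Hedged sketch: MFCC features + fully connected classifier for
# speech / non-speech frame detection on toy signals.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
speech_like = np.sin(2 * np.pi * 220 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
noise = np.random.default_rng(0).normal(0.0, 0.3, sr)

def mfcc_frames(sig):
    return librosa.feature.mfcc(y=sig, sr=sr, n_mfcc=13).T   # (frames, 13)

Xs, Xn = mfcc_frames(speech_like), mfcc_frames(noise)
X = np.vstack([Xs, Xn])
y = np.array([1] * len(Xs) + [0] * len(Xn))                  # 1 = speech frame
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                    random_state=0).fit(X, y)
print("frame accuracy (training data):", clf.score(X, y))
```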

Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset

Procedia PDF Downloads 101
16399 Potential of Henna Leaves as a Dye and Its Fastness Properties on Fabric

Authors: Nkem Angela Udeani

Abstract:

Despite the widespread use of synthetic dyes, natural dyes are still exploited and used for their inherent aesthetic qualities as a major material for the beautification of the body. For centuries before the discovery of synthetic dyes, natural dyes were the only dyes available to mankind, extracted from plants (leaves, roots, and bark), insect secretions, and minerals. Research findings have made it clear that, of all these, plant leaves, roots, bark, and flowers are the most explored and exploited. Henna (Lawsonia inermis) is one of those plants. Experiment has also shown that henna is used in body painting in conjunction with an alkali fixing agent (ammonium sulphate). This gives a clue that, if the colour derived from henna is properly investigated, it may not only be used for body decoration but may also have affinity for fibre substrates. This paper investigates the dyeing potential - the dyeing ability and fastness qualities - of henna dye extract on cotton and linen fibres, using mordants such as ammonium sulphate and other alkalis (hydrosulphate, caustic soda, potash, common salt, and alum). Hot water, cold water, and ethanol were used to extract the dye, in order to identify the most effective method of extraction and to assess the dyeing ability and fastness qualities of the extracts at room temperature. The results show that cotton has a higher rate of dye uptake than linen fibre. The colours obtained depend mostly on the solvent and/or the mordant used. In conclusion, hot water extraction appears more effective. The colours obtained from ethanol and from both cold and hot water extraction range from light to dark yellow and light green to army green, with some shades of brown.

Keywords: dye, fabrics, henna leaves, potential

Procedia PDF Downloads 472
16398 Comparison of the Polyphenolic Profile of a Berry from Two Different Sources, Using an Optimized Extraction Method

Authors: G. Torabian, A. Fathi, P. Valtchev, F. Dehghani

Abstract:

The superior polyphenol content of Sambucus nigra berries gives them high potential for the production of nutraceutical products. Numerous factors influence the polyphenol content of the final products, including the berries' source and the subsequent processing steps. The aim of this study is to compare the polyphenol content of berries from two different sources and to optimise the polyphenol extraction process from elderberries. Berries from source B showed more acceptable physical properties than those from source A: a single source B berry was double the size and weight (both wet and dry) of a source A berry. Despite these favourable physical characteristics, the polyphenolic profile of source B berries was inferior: source A berries had a 2.3-fold higher total anthocyanin content and nearly twice the total phenolic content and total flavonoid content of source B berries. Moreover, the results of this study showed that almost 50 percent of the phenolic content of the berries is entrapped within the skin and pulp and potentially cannot be extracted by press juicing. To address this challenge and increase the total polyphenol yield of the extract, we used a cold-shock blade grinding method to break the cell walls. The results showed that using cultivars with higher phenolic content, as well as using the whole fruit including juice, skin, and pulp, can increase the polyphenol yield significantly and may thus boost the potential of elderberries in therapeutic products.

Keywords: different sources, elderberry, grinding, juicing, polyphenols

Procedia PDF Downloads 294
16397 Moderate Electric Field and Ultrasound as Alternative Technologies to Raspberry Juice Pasteurization Process

Authors: Cibele F. Oliveira, Debora P. Jaeschke, Rodrigo R. Laurino, Amanda R. Andrade, Ligia D. F. Marczak

Abstract:

Raspberry is well known as a good source of phenolic compounds, mainly anthocyanins. Several studies have pointed out the importance of consuming these bioactive compounds, which is related to a decreased risk of cancer and cardiovascular disease. The most consumed raspberry products are juices, yogurts, ice creams, and jellies and, to ensure the safety of these products, raspberry is commonly pasteurized for enzyme and microorganism inactivation. Despite being efficient, the pasteurization process can lead to degradation reactions of the bioactive compounds, decreasing the products' health benefits. Therefore, the aim of the present work was to evaluate the application of moderate electric field (MEF) and ultrasound (US) technologies to the pasteurization of raspberry juice and to compare the results with the conventional pasteurization process. For this, the phenolic compound and anthocyanin contents and the physical-chemical parameters (pH, color changes, titratable acidity) of the juice were evaluated before and after the treatments. Moreover, microbiological analyses of aerobic mesophilic microorganisms, molds, and yeasts were performed on the samples before and after the treatments to verify the potential of these technologies to inactivate microorganisms. All the pasteurization processes were performed in triplicate for 10 min, using a cylindrical Pyrex® vessel with a water jacket. The conventional pasteurization was performed at 90°C using a hot water bath connected to the extraction cell. The US-assisted pasteurization was performed at 423 and 508 W cm-2 (75 and 90% of ultrasound intensity); during US application the temperature was kept below 35°C by connecting the water jacket of the extraction cell to a bath of cold water. The MEF-assisted pasteurization experiments were performed similarly to the US experiments, using 25 and 50 V. Control experiments were performed at the maximum temperature of the US and MEF experiments (35°C) to evaluate only the effect of the aforementioned technologies on the pasteurization. The results showed that the phenolic compound concentration in the juice was not affected by US or MEF application. However, US-assisted pasteurization at the highest intensity decreased the anthocyanin content by 33% (compared to the in natura juice). This was possibly due to the cavitation phenomenon, which can lead to the formation and accumulation of free radicals in the medium; these radicals can react with anthocyanins, decreasing the content of these antioxidant compounds in the juice. The physical-chemical parameters did not present statistical differences between samples before and after the treatments. The microbiological analyses showed that all the pasteurization treatments decreased the microorganism content by two logarithmic cycles; however, as the values were lower than 1000 CFU mL-1, it was not possible to verify the efficacy of each treatment. Thus, MEF and US are considered potential alternative pasteurization technologies, since under the right conditions their application decreased the microorganism content of the juice without affecting the phenolic and anthocyanin contents or the physical-chemical parameters. However, more studies are needed regarding the influence of MEF and US processes on microorganism inactivation.

Keywords: MEF, microorganism inactivation, anthocyanin, phenolic compounds

Procedia PDF Downloads 242
16396 Estimation of the Exergy-Aggregated Value Generated by a Manufacturing Process Using the Theory of the Exergetic Cost

Authors: German Osma, Gabriel Ordonez

Abstract:

The production of metal-rubber spares for vehicles is a sequential process consisting of the transformation of raw materials through cutting operations and chemical and thermal treatments, which demand electricity and fossil fuels. Energy efficiency analysis in such cases mostly focuses on each machine or production step; it is less common to study the quality the production process achieves from an aggregated-value viewpoint, which can serve as a quality measure for determining environmental impact. In this paper, the theory of exergetic cost is used to determine the exergy aggregated to three metal-rubber spares, based on an exergy analysis and a thermoeconomic analysis. The manufacturing of these spares follows a batch production technique; therefore, the theory is applied to discontinuous flows using single workstation models, and subsequently the complete exergy model of each product is built using flowcharts. These models represent the exergy flows between components within the machines through electrical, mechanical, and/or thermal expressions; they determine the exergy demanded to produce the effective transformation of the raw materials (the aggregated exergy value) and the exergy losses caused by equipment and irreversibilities. The energy resources of the manufacturing process are electricity and natural gas. The workstations considered are lathes, punching presses, cutters, a zinc machine, chemical treatment tanks, hydraulic vulcanizing presses, and a rubber mixer. The thermoeconomic analysis was carried out by workstation and by spare: the first describes the operation of the components of each machine and locates the exergy losses, while the second estimates the exergy-aggregated value for the finished product and the wasted feedstock. Results indicate that the exergy efficiency of the mechanical workstations is between 10% and 60%, while that of the thermal workstations is less than 5%, and that each effective exergy-aggregated value is about one-thirtieth of the total exergy required to operate the manufacturing process, which amounts to approximately 2 MJ. These shortfalls are caused mainly by technical limitations of the machines, oversizing of the metal feedstock, which demands more mechanical transformation work, and low thermal insulation of the chemical treatment tanks and hydraulic vulcanizing presses. This case demonstrates the usefulness of the theory of exergetic cost for analyzing aggregated value in manufacturing processes.
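
A minimal sketch of the exergetic-cost bookkeeping described above: per workstation, the exergy embodied in the product plus the losses equals the exergy supplied, and the aggregated exergy value is the sum over the sequence. The figures are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch: workstation-level exergy balance and aggregated value.
workstations = [          # (name, exergy supplied MJ, exergy to product MJ)
    ("lathe",             0.50, 0.10),
    ("punching press",    0.30, 0.06),
    ("treatment tank",    0.90, 0.02),
    ("vulcanizing press", 0.30, 0.006),
]

total_in = sum(e_in for _, e_in, _ in workstations)
aggregated = sum(e_prod for _, _, e_prod in workstations)
for name, e_in, e_prod in workstations:
    print(f"{name:18s} efficiency {e_prod / e_in:6.1%}"
          f"   losses {e_in - e_prod:.3f} MJ")
print(f"aggregated exergy value: {aggregated:.3f} of {total_in:.2f} MJ supplied")
```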

Keywords: exergy-aggregated value, exergy efficiency, thermoeconomics, exergy modeling

Procedia PDF Downloads 170
16395 Study on Meristem Culture of Purwoceng (Pimpinella pruatjan Molk.) and Its Stigmasterol Detected by Thin Layer Chromatography

Authors: Totik Sri Mariani, Sukrasno Isna, Tet Fatt Chia

Abstract:

Purwoceng (Pimpinella pruatjan Molk) is a legendary plant used by the kings of Java, Indonesia, to increase stamina. The purpose of this study was to perform meristem culture and to detect its stigmasterol by thin layer chromatography (TLC). Our results show that the meristem cultures could be propagated and grown into plantlets. After hexane extraction of intact acclimatized plants derived from meristem culture, we were able to detect stigmasterol by TLC. We suggest that our extraction and TLC method could be used to detect stigmasterol in other plants.

Keywords: purwoceng (pimpinella pruatjan), meristem culture, extraction, thin layer chromatography

Procedia PDF Downloads 430
16394 Biorefinery as Extension to Sugar Mills: Sustainability and Social Upliftment in the Green Economy

Authors: Asfaw Gezae Daful, Mohsen Alimandagari, Kathleen Haigh, Somayeh Farzad, Eugene Van Rensburg, Johann F. Görgens

Abstract:

The sugar industry has to 're-invent' itself to ensure long-term economic survival, opportunities for job creation, and enhanced community-level impacts, given increasing pressure from fluctuating and low global sugar prices, increasing energy prices, and sustainability demands. We propose biorefineries annexed to existing sugar mills for the revitalisation of the sugar industry, using low-value lignocellulosic biomass (sugarcane bagasse, leaves, and tops) to produce a spectrum of high-value platform chemicals along with biofuel, bioenergy, and electricity. This presents an opportunity for greener products, to mitigate climate change and overcome economic challenges. Xylose from labile hemicellulose remains largely underutilized, and its conversion to value-added products is a major challenge. Insight is required into pretreatment and/or extraction to optimize the production of cellulosic ethanol together with lactic acid, furfural, or biopolymers from sugarcane bagasse, leaves, and tops. Experimental conditions for alkaline and pressurized hot water extraction, and for dilute acid and steam explosion pretreatment, of sugarcane bagasse and harvest residues were investigated to serve as a basis for developing various process scenarios under a sugarcane biorefinery scheme. Dilute acid and steam explosion pretreatment were optimized for maximum hemicellulose recovery, combined sugar yield, and solids digestibility. An optimal range of conditions for alkaline and liquid hot water extraction of hemicellulosic biopolymers, as well as conditions for acceptable enzymatic digestibility of the solid residue after such extraction, was established. Using data from the above, a series of energy-efficient biorefinery scenarios is under development, modeled using Aspen Plus® software to simulate potential factories, to better understand the biorefinery processes and to estimate the CAPEX, OPEX, environmental impacts, and overall viability. A rigorous and detailed sustainability assessment methodology was formulated to address all pillars of sustainability. This work is ongoing; to date, models have been developed for some of the processes, which can ultimately be combined into biorefinery scenarios. This will allow systematic comparison of a series of biorefinery scenarios, to assess the potential to reduce negative impacts and maximize benefits across social, economic, and environmental factors on a lifecycle basis.

Keywords: biomass, biorefinery, green economy, sustainability

Procedia PDF Downloads 514
16393 Optimal Management of Forest Stands under Wind Risk in the Czech Republic

Authors: Zohreh Mohammadi, Jan Kaspar, Peter Lohmander, Robert Marusak, Harald Vacik, Ljusk Ola Eriksson

Abstract:

Storms are important damaging agents in European forest ecosystems. In recent decades, storms have caused significant economic losses in European forestry. This study investigates the problem of optimal harvest planning when forest stands risk being felled by storms. One of the most applicable mathematical methods for optimizing forest management is stochastic dynamic programming (SDP), which belongs to the class of adaptive optimization methods. Sequential decisions, such as harvest decisions, can be optimized based on sequential information about events that cannot be perfectly predicted, such as future storms and the future states of wind protection from other forest stands. In this paper, stochastic dynamic programming is used to maximize the expected present value of the profits from an area consisting of several forest stands; the region of analysis is the Czech Republic. Harvest decisions in a particular time period should be taken simultaneously in all neighboring stands, because different stands protect each other from possible winds. The optimal harvest age of a particular stand is a function of wind speed and of the different wind protection effects. The optimal harvest age often decreases with wind speed, but it cannot be determined for one stand at a time: a particular stand protects other stands and is, in turn, protected by its neighbors. In some forest stands, it may even be rational to increase the harvest age under the influence of stronger winds, in order to protect more valuable stands in the neighborhood. It is important to integrate wind risk into forestry decision-making.
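
A minimal single-stand sketch of the stochastic dynamic programming idea described above: at each age the owner either harvests now or waits a period, and waiting risks a storm that fells the stand at salvage value. The growth curve, price, storm probability, and discounting are illustrative assumptions; the paper's neighbour-protection effects are not modelled.

```python
# Hedged sketch: backward induction over stand age with storm risk.
def volume(age):                       # toy growth curve, m3/ha
    return max(0.0, 8.0 * age - 0.03 * age ** 2)

price, salvage, p_storm = 60.0, 0.5, 0.08   # EUR/m3, salvage share, storm prob
disc = 0.97 ** 10                           # 10-year discount factor

ages = list(range(0, 121, 10))
V, policy = {120: price * volume(120)}, {120: "harvest"}
for a in reversed(ages[:-1]):
    harvest_now = price * volume(a)
    wait = disc * (p_storm * salvage * price * volume(a)   # storm fells stand
                   + (1 - p_storm) * V[a + 10])            # stand survives
    V[a], policy[a] = max((harvest_now, "harvest"), (wait, "wait"))
print({a: policy[a] for a in ages})    # optimal action by stand age
```

Running the sketch yields the expected pattern: wait while the stand is young and growing, harvest once storm risk and discounting outweigh further growth.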

Keywords: Czech Republic, forest stands, stochastic dynamic programming, wind risk

Procedia PDF Downloads 147
16392 Monte Carlo Risk Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric carbon dioxide has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero emission power plant. The advanced zero emission power plant makes use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process, first introduced in 1899 when Walther Hermann Nernst investigated electric current between metals and solutions: he found that when a dense ceramic is heated, a current of oxygen ions moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP has drawn a lot of attention because of its ability to capture ~100% of CO2; it also promises a 30-50% cost reduction compared to other carbon abatement technologies, its efficiency penalty is smaller than that of its counterparts, and it achieves almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero emission power plant differs from a conventional gas turbine in that its combustor is substituted with the MCM reactor. The MCM reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by raising the temperature of 90% of the air using the LTHX; the higher temperature also facilitates oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to its inlet, where its temperature is increased to about 1150 K, depending on the design-point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air passing through it. The AZEP cycle model was developed in Fortran, and the economic analysis was conducted in Excel and MATLAB, followed by an optimization case study. This paper discusses the Monte Carlo risk analysis of four possible layouts of the AZEP cycle: the simple bleed gas heat exchange layout (100% CO2 capture); the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture); the pre-expansion reheating (sequential burning) layout, AZEP 85% (85% CO2 capture); and the pre-expansion reheating (sequential burning) layout with flue gas turbine, AZEP 85% (85% CO2 capture).
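
A minimal sketch of the Monte Carlo risk analysis idea described above: sample uncertain inputs (capital cost, fuel price, electricity price), compute a net present value for each draw, and read off the risk profile. All distributions and plant figures are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: Monte Carlo NPV risk analysis with assumed distributions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
capex = rng.normal(6.0e8, 1.0e8, n)         # capital cost, USD
fuel = rng.normal(6.0, 1.0, n)              # fuel price, USD/GJ
elec = rng.normal(70.0, 10.0, n)            # electricity price, USD/MWh

mwh_yr, gj_yr, life, rate = 3.0e6, 2.4e7, 25, 0.08
annuity = (1 - (1 + rate) ** -life) / rate  # present value of 1 USD/yr
npv = -capex + annuity * (elec * mwh_yr - fuel * gj_yr)

print(f"mean NPV: {npv.mean():.3e} USD")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")
print("5th/95th percentiles:", np.percentile(npv, [5, 95]))
```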

Keywords: gas turbine, global warming, greenhouse gases, power plants

Procedia PDF Downloads 471
16391 MapReduce Algorithm for Geometric and Topological Information Extraction from 3D CAD Models

Authors: Ahmed Fradi

Abstract:

In a digital world in perpetual evolution and acceleration, where data grow ever more voluminous, rich, and varied, the new software solutions that emerged with the Big Data phenomenon offer companies new opportunities not only to optimize their business and evolve their production models, but also to reorganize themselves to increase competitiveness and identify new strategic axes. Industrial design and manufacturing companies, like others, face these challenges; data represent a major asset, provided the companies know how to capture, refine, combine, and analyze them. The objective of our paper is to propose a solution for extracting geometric and topological information from databases of 3D CAD models (specifically STEP files), using an algorithm based on the MapReduce programming paradigm. Our proposal is the first step of our future approach to 3D CAD object retrieval.
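
A minimal sketch of the MapReduce pattern described above applied to STEP data: the map phase emits (entity_type, 1) pairs from each line of a STEP file's DATA section, and the reduce phase sums counts per geometric or topological entity type. The inline STEP fragment is an illustrative assumption.

```python
# Hedged sketch: map/reduce counting of STEP entity types in pure Python.
import re
from collections import defaultdict
from itertools import chain

step_lines = [
    "#10=CARTESIAN_POINT('',(0.,0.,0.));",
    "#11=CARTESIAN_POINT('',(1.,0.,0.));",
    "#12=EDGE_CURVE('',#10,#11,#20,.T.);",
    "#20=LINE('',#10,#30);",
]

def mapper(line):                  # map: one STEP line -> [(key, 1)]
    m = re.match(r"#\d+\s*=\s*([A-Z_0-9]+)\(", line)
    return [(m.group(1), 1)] if m else []

def reducer(pairs):                # reduce: sum counts per entity type
    counts = defaultdict(int)
    for key, n in pairs:
        counts[key] += n
    return dict(counts)

print(reducer(chain.from_iterable(map(mapper, step_lines))))
# {'CARTESIAN_POINT': 2, 'EDGE_CURVE': 1, 'LINE': 1}
```

In a distributed setting the same mapper and reducer would run sharded across a cluster; the single-process version above only illustrates the data flow.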

Keywords: Big Data, MapReduce, 3D object retrieval, CAD, STEP format

Procedia PDF Downloads 540
16390 Quantification of Polychlorinated Biphenyls (PCBs) in Soil Samples of Electrical Power Substations from Different Cities in Nigeria

Authors: Omasan Urhie Urhie, Adenipekun C. O, Eke W., Ogwu K., Erinle K. O

Abstract:

Polychlorinated biphenyls (PCBs) are persistent organic pollutants (POPs) that are very toxic; they can accumulate in soil and in human tissues, resulting in health issues such as birth defects, reproductive disorders, and cancer. PCBs pollute the air through volatilization and dispersion; they also contaminate soil and sediments and are not easily degraded. Soil samples were collected from a depth of 0-15 cm at three substations (Warri, Ughelli, and Ibadan) of the Power Holding Company of Nigeria (PHCN) where old transformers had been dumped. Extraction and cleanup of the soil samples were conducted using accelerated solvent extraction (ASE) with pressurized liquid extraction (PLE). The concentration of PCBs was determined using gas chromatography/mass spectrometry (GC/MS). Mean total PCB concentrations in the soil samples increased in the order Ughelli ˂ Ibadan ˂ Warri: 2.457757 ppm at the Ughelli substation, 4.198926 ppm at the Ibadan substation, and 14.05065 ppm at the Warri substation. In the Warri samples, PCB-167 was the most abundant at about 30% (4.28086 ppm) of the total PCB concentration (14.05065 ppm), followed by PCB-157 at about 20% (2.77871 ppm). Of the total PCBs in the Ughelli and Ibadan samples, PCB-156 was the most abundant at about 44% and 40%, respectively. This study provides a baseline report on the presence of PCBs in the vicinity of abandoned electrical power facilities in different cities in Nigeria.

Keywords: polychlorinated biphenyls, persistent organic pollutants, soil, transformer

Procedia PDF Downloads 139