Search results for: liquid-liquid extraction
1384 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons
Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker
Abstract:
To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost, polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of wristband samplers. These protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated and open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
Keywords: bioinformatics, automation, opentrons, research
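The transfer-protocol idea can be illustrated with a minimal Opentrons Python API v2 sketch. The labware names, deck slots, sample count, and volumes below are illustrative assumptions, not the authors' published protocol (which is available on their GitHub).

```python
from opentrons import protocol_api

metadata = {"protocolName": "Wristband solvent transfer (sketch)", "apiLevel": "2.13"}

def run(protocol: protocol_api.ProtocolContext):
    # Assumed deck layout; the real protocol generates this from user-specified settings
    tips = protocol.load_labware("opentrons_96_tiprack_1000ul", "1")
    solvent = protocol.load_labware("nest_12_reservoir_15ml", "2")
    extraction_plate = protocol.load_labware("nest_96_wellplate_2ml_deep", "3")
    p1000 = protocol.load_instrument("p1000_single_gen2", "left", tip_racks=[tips])

    # Dispense extraction solvent onto each wristband-containing well (24 samples assumed)
    for well in extraction_plate.wells()[:24]:
        p1000.transfer(1000, solvent["A1"], well, new_tip="always")
```

A sequence-generation step of the kind described would simply compute this deck layout and well list from user settings before the transfer protocol runs.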
Procedia PDF Downloads 116
1383 Analysis of Real Time Seismic Signal Dataset Using Machine Learning
Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.
Abstract:
Because seismic signals and non-seismic signals closely resemble each other, it is difficult to detect earthquakes reliably using conventional methods. In order to distinguish between seismic events and non-seismic events depending on their amplitude, our study processes the data that come from seismic sensors. The authors suggest a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl short-term average/long-term average (STA/LTA) for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection. This serves as motivation for compiling a dataset of all features for the identification and forecasting of seismic signals. Owing to the time complexity involved, we place a focus on feature-vector dimension reduction techniques. The proposed notable features were experimentally tested using a machine learning model, and the results on unseen data are optimal. Finally, a presentation using a hybrid dataset (captured by different sensors) demonstrates how this model may also be employed in a real-time setting while lowering false alarm rates. The planned study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). Wideband seismic signals from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively, make up the experimental dataset.
Keywords: Carl STA/LTA, feature extraction, real time, dataset, machine learning, seismic detection
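As a rough illustration of the STA/LTA trigger ratio described above, the sketch below computes a classic (non-recursive) STA/LTA on a band-passed NumPy trace; the window lengths and trigger threshold are assumptions, not the values used in the study.

```python
import numpy as np

def sta_lta_ratio(trace, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA on the squared amplitude of a (band-passed) trace array."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = trace.astype(float) ** 2
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, np.finfo(float).eps)   # guard against division by zero

# Samples where the ratio exceeds an (assumed) threshold are flagged as candidate events:
# triggers = sta_lta_ratio(trace, fs=100.0) > 3.5
```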
Procedia PDF Downloads 126
1382 Recent Advancement in Fetal Electrocardiogram Extraction
Authors: Savita, Anurag Sharma, Harsukhpreet Singh
Abstract:
The fetal electrocardiogram (fECG) is a widely used technique to assess fetal well-being, identify any changes that might signal problems during pregnancy, and evaluate the health and condition of the fetus. Various techniques and methods have been employed to extract the fECG from the abdominal signal. This paper describes a facile approach for the estimation of the fECG known as the Adaptive Comb Filter (ACF). The ACF can adjust itself according to temporal variations in the fundamental frequency, which makes it suitable for estimating the quasi-periodic ECG signal.
Keywords: aECG, ACF, fECG, mECG
Procedia PDF Downloads 408
1381 Beneficiation of Low Grade Chromite Ore and Its Characterization for the Formation of Magnesia-Chromite Refractory by Economically Viable Process
Authors: Amit Kumar Bhandary, Prithviraj Gupta, Siddhartha Mukherjee, Mahua Ghosh Chaudhuri, Rajib Dey
Abstract:
Chromite ores are primarily used for the extraction of chromium, which is an expensive metal. For low grade chromite ores (containing less than 40% Cr2O3), chromium extraction is not usually economically viable. India possesses huge quantities of low grade chromite reserves. These deposits can be utilized after proper physical beneficiation. Magnetic separation techniques may be useful after reduction for the beneficiation of low grade chromite ore. The sample collected from the Sukinda mines was characterized by XRD, which shows predominant phases such as maghemite, chromite, silica, magnesia, and alumina. The raw ore was crushed and ground to below 75 micrometer size. The microstructure of the ore shows chromite grains surrounded by a silicate matrix, with porosity observed on the exposed side of the ore. However, this ore may be utilized in refractory applications. Chromite ores contain Cr2O3, FeO, Al2O3, and other oxides; systems such as Fe-Cr and Mg-Cr have a high tendency to form spinel compounds, which usually show high refractoriness. Initially, the low grade chromite ore (containing 34.8% Cr2O3) was reduced at 1200 °C for 80 minutes with 30% coke fines by weight, before being subjected to magnetic separation. The reduction by coke converts the iron oxides from higher to lower oxidation states. The pre-reduced samples were then characterized by XRD. The magnetically inert mass was then reacted with 20% MgO by weight at 1450 °C for 2 hours. The resultant product was then tested for various refractoriness parameters such as apparent porosity, slag resistance, etc. The results were satisfactory, indicating that the resultant spinel compounds are suitable for refractory applications in elevated temperature processes.
Keywords: apparent porosity, beneficiation, low-grade chromite, refractory, spinel compounds, slag resistance
Procedia PDF Downloads 387
1380 An Analysis of Eco-efficiency and GHG Emission of Olive Oil Production in Northeast of Portugal
Authors: M. Feliciano, F. Maia, A. Gonçalves
Abstract:
The olive oil production sector plays an important role in the Portuguese economy. It has grown substantially over the last decade, increasing its weight in overall national exports. International market penetration for Mediterranean traditional products is increasingly demanding, especially in the Northern European markets, where consumers are looking for more sustainable products. To support this growing demand, this study addresses olive oil production from the environmental and eco-efficiency perspectives. The analysis considers two consecutive product life cycle stages: olive tree farming and olive oil extraction in mills. Addressing olive farming, data collection covered two different organizations: a middle-size farm (~12 ha) (F1) and a large-size farm (~100 ha) (F2). Results from both farms show that olive collection activities are responsible for the largest amounts of greenhouse gas (GHG) emissions. In these activities, the estimated carbon footprint per kg of olives was higher in F2 (188 g CO2e/kg olive) than in F1 (148 g CO2e/kg olive). Considering olive oil extraction, two different mills were considered: one using a two-phase system (2P) and the other a three-phase system (3P). Results from the study of the two mills show a much higher use of water in 3P. Energy intensity (EI) is similar in both mills. When evaluating the GHG generated, two conditions are evaluated: a biomass-neutral condition, resulting in a carbon footprint higher in 3P (184 g CO2e/L olive oil) than in 2P (92 g CO2e/L olive oil); and a non-neutral biomass condition, in which 2P increases its carbon footprint to 273 g CO2e/L olive oil. When addressing the carbon footprint of possible combinations among the studied subsystems, results suggest that olive harvesting is the major source of GHG.
Keywords: carbon footprint, environmental indicators, farming subsystem, industrial subsystem, olive oil
Procedia PDF Downloads 287
1379 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances
Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim
Abstract:
This paper presents Power Quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with mathematical applications and gathered field data are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances so as to support preventative protection against system failures and the estimation of complex system problems. Signal filtering techniques are used to suppress noise in the field waveforms and to extract features. Using feature extraction and learning-based classification techniques, the efficiency of recognizing PQ disturbances was verified, with a focus on interactive modeling methods. The waveforms of eight selected disturbances are modeled with parameters randomized within IEEE 1159 PQ ranges. The ranges, parameters, and weights are updated according to the field waveforms obtained. Current waveforms undergo the same process as voltage waveforms to obtain features, apart from some differences in ratings and filters. Changing loads cause distortion in the voltage waveform because they draw different patterns of current variation. In conclusion, PQ disturbances in the voltage and current waveforms exhibit different patterns of variation and disturbance, and a modified technique based on symmetrical components in the time domain is proposed in this paper for PQ disturbance detection and subsequent classification. Our method is based on the fact that the waveforms obtained from the suggested trigger conditions contain potential information for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies.
Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering
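To illustrate how a disturbance waveform can be modeled with randomized parameters, the sketch below synthesizes a voltage sag in per-unit. The magnitude and duration ranges are indicative of IEEE 1159 sag definitions and, like the sampling settings, are assumptions for illustration only.

```python
import numpy as np

def random_sag(fs=6400, f0=60.0, cycles=12):
    """Synthesize one voltage-sag waveform (per-unit) with randomized parameters."""
    t = np.arange(int(cycles * fs / f0)) / fs
    v = np.sin(2 * np.pi * f0 * t)                  # nominal 1.0 pu waveform
    depth = np.random.uniform(0.1, 0.9)             # retained voltage during the sag (pu)
    dur = np.random.uniform(0.5, 6.0) / f0          # sag duration, 0.5-6 cycles (assumed)
    start = np.random.uniform(0, t[-1] - dur)
    sag_mask = (t >= start) & (t < start + dur)
    v[sag_mask] *= depth
    return t, v

# t, v = random_sag(); batches of such waveforms feed the recognition learning modules
```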
Procedia PDF Downloads 188
1378 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (BDS). Classical data mining has revealed a weakness in extracting knowledge from such enormous amounts of data, due to the particularity of these spatial entities, which are characterized by their interdependence (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data, which allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process, we distinguish the monothematic and the thematic. Geo-clustering is one of the main tasks of spatial data mining and belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar ones to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and an approach based on spatial data pre-processing, which applies classic clustering algorithms to pre-processed data (by integrating spatial relationships). This approach (based on pre-treatment) is quite complex in several cases, so the search for approximate solutions involves the use of approximation algorithms; among these, we are interested in dedicated approaches (partitioning and density-based clustering methods) and the bees algorithm (a biomimetic approach). Our study proposes a design that is highly relevant to this problem, using different algorithms to automatically detect geospatial neighborhoods in order to implement geo-clustering by pre-treatment, and applying the bees algorithm to this problem for the first time in the geospatial field.
Keywords: mining, GIS, geo-clustering, neighborhood
Procedia PDF Downloads 375
1377 Solid Phase Micro-Extraction/Gas Chromatography-Mass Spectrometry Study of Volatile Compounds from Strawberry Tree and Autumn Heather Honeys
Authors: Marinos Xagoraris, Elisavet Lazarou, Eleftherios Alissandrakis, Christos S. Pappas, Petros A. Tarantilis
Abstract:
Strawberry tree (Arbutus unedo L.) and autumn heather (Erica manipuliflora Salisb.) are important beekeeping plants of Greece. Six monofloral honeys (four strawberry tree, two autumn heather) were analyzed by means of Solid Phase Micro-Extraction (SPME, 60 min, 60 °C) followed by Gas Chromatography coupled to Mass Spectrometry (GC-MS) for the purpose of assessing their botanical origin. A Divinylbenzene/Carboxen/Polydimethylsiloxane (DVB/CAR/PDMS) fiber was employed, and benzophenone was used as the internal standard. The volatile compounds with the highest concentrations (μg/g of honey expressed as benzophenone) in the strawberry tree honey samples were α-isophorone (2.50-8.12); 3,4,5-trimethyl-phenol (0.20-4.62); 2-hydroxy-isophorone (0.06-0.53); 4-oxoisophorone (0.38-0.46); and β-isophorone (0.02-0.43). Regarding the heather honey samples, the most abundant compounds were 1-methoxy-4-propyl-benzene (1.22-1.40); p-anisaldehyde (0.97-1.28); p-anisic acid (0.35-0.58); 2-furaldehyde (0.52-0.57); and benzaldehyde (0.41-0.56). Norisoprenoids are potent floral markers for strawberry tree honey. β-isophorone is found exclusively in the volatile fraction of this type of honey, while α-isophorone, 4-oxoisophorone, and 2-hydroxy-isophorone could also be considered as additional marker compounds. The analysis of autumn heather honey revealed that phenolic compounds are the most abundant, and p-anisaldehyde, 1-methoxy-4-propyl-benzene, and p-anisic acid could serve as potent marker compounds. In conclusion, several norisoprenoids and phenolic components that were found exclusively in, or at higher concentrations in, these honeys compared with common Greek honey varieties could be identified as marker compounds for the determination of their botanical origin.
Keywords: SPME/GC-MS, volatile compounds, heather honey, strawberry tree honey
Procedia PDF Downloads 200
1376 A Q-Methodology Approach for the Evaluation of Land Administration Mergers
Authors: Tsitsi Nyukurayi Muparari, Walter Timo De Vries, Jaap Zevenbergen
Abstract:
The nature of land administration accommodates diversity in terms of both the spatial data handling activities and the expertise involved, which supposedly aims to satisfy the unpredictable demands for land data and the diverse demands of customers arising from the land. However, it is known that strategic decisions on restructuring are in most cases repelled in favour of complex structures that strive to accommodate professional diversity and diverse roles in the field of land administration. Yet despite this widely accepted knowledge, there is scant theoretical knowledge concerning the psychological methodologies that can extract the deeper perceptions of the diverse spatial expertise in order to explain the invisible control arm behind the polarised reception of ideas of change. This paper evaluates Q methodology in the context of a cadastre and land registry merger (under one agency), using the Swedish cadastral system as a case study. Precisely, the aim of this paper is to evaluate the effectiveness of Q methodology for modelling the diverse psychological perceptions of spatial professionals involved in the widely contested decision of merging the cadastre and land registry components of land administration, using the Swedish cadastral system as a case study. The empirical approach prescribed by Q methodology starts with concourse development, followed by the design of the statements and the Q-sort instrument, selection of the participants, the Q-sorting exercise, factor extraction by PQMethod, and finally narrative development by the logic of abduction. The paper uses 36 statements developed from the dominant competing values theory, which stands out for its reliability and validity; purposively selects 19 participants for the Q-sorting exercise; proceeds with factor extraction from the diversity using varimax rotation and the judgemental rotation provided by PQMethod; and effects the narrative construction using the logic of abduction. The findings from the diverse perceptions of cadastral professionals on the merger decision of the land registry and cadastre components in Sweden's mapping agency (Lantmäteriet) show that the focus is rather inclined towards perfecting the relationship between legal expertise and technical spatial expertise. There is much emphasis on tradition, loyalty, and communication attributes, which concern the organisation's internal environment, rather than on innovation and market attributes, which reveal customer behaviour and needs arising from changing humankind-land needs. It can be concluded that Q methodology offers effective tools that pursue a psychological approach for the evaluation and gradation of decisions on strategic change through extracting the local perceptions of spatial expertise.
Keywords: cadastre, factor extraction, land administration merger, land registry, q-methodology, rotation
Procedia PDF Downloads 195
1375 Assessing Acute Toxicity and Endocrine Disruption Potential of Selected Packages Internal Layers Extracts
Authors: N. Szczepanska, B. Kudlak, G. Yotova, S. Tsakovski, J. Namiesnik
Abstract:
In the scientific literature related to the widely understood issue of packaging materials designed to come into contact with food (food contact materials), there is much information on the raw materials used for their production, as well as their physiochemical properties, types, and parameters. However, not much attention is given to the issues concerning the migration of toxic substances from packaging and its actual influence on the health of the final consumer, even though health protection and food safety are priority tasks. The goal of this study was to estimate the impact of the particular foodstuff packaging type, food production, and storage conditions on the degree of leaching of potentially toxic compounds and endocrine disruptors into foodstuffs using the Microtox acute toxicity test and the XenoScreen YES YAS assay. The selected foodstuff packaging materials were metal cans used for fish storage and Tetra Pak cartons. Five simulants corresponding to specific kinds of food were chosen in order to assess global migration: distilled water for aqueous foods with a pH above 4.5; acetic acid at 3% in distilled water for acidic aqueous foods with a pH below 4.5; ethanol at 5% for any food that may contain alcohol; and dimethyl sulfoxide (DMSO) and artificial saliva, which were used in view of the possibility of employing them as simulation media. For each packaging material, a factorial design over the independent variables (simulant, temperature, and contact time) was performed. Xenobiotic migration from the epoxy resins was studied at three different temperatures (25°C, 65°C, and 121°C) and extraction times of 12 h, 48 h, and 2 weeks. This experimental design leads to 9 experiments for each food simulant, as the conditions for each experiment are obtained by combining the temperature and contact time levels. Each experiment was run in triplicate for acute toxicity and in duplicate for the determination of estrogen disruption potential. Multi-factor analysis of variance (MANOVA) was used to evaluate the effects of the three main factors, namely solvent, temperature (temperature regime for the cup), and contact time, and their interactions on the respective dependent variable (acute toxicity or estrogen disruption potential). Of all the simulants studied, the most toxic were the acetic acid extracts of the can and Tetra Pak linings, which is an indication of significant migration of toxic compounds. This migration increased with increasing contact time and temperature and justified the hypothesis that food products with low pH values cause significant damage to the internal resin lining. Can lining extracts in all simulation media, excluding distilled water and artificial saliva, proved to contain androgen agonists even at 25°C and an extraction time of 12 h. For Tetra Pak extracts, significant endocrine potential was detected for acetic acid, DMSO, and saliva.
Keywords: food packaging, extraction, migration, toxicity, biotest
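A per-endpoint factorial analysis of the solvent x temperature x contact-time design could be sketched with statsmodels as below. This runs a univariate factorial ANOVA for one response at a time rather than the full MANOVA reported by the authors, and the file and column names are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# df is assumed to hold one row per replicate with columns:
# solvent, temperature, contact_time, toxicity (hypothetical file name below)
df = pd.read_csv("migration_results.csv")

model = ols("toxicity ~ C(solvent) * C(temperature) * C(contact_time)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # main effects and all interactions
print(anova_table)
```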
Procedia PDF Downloads 181
1374 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection, and prediction reduce the reliance on treatment options that carry the risk of invasive surgery and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, while Histogram Equalization (HE) for image enhancement gives the best results without requiring further processing. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid, and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifiers. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used for determining the patient condition as normal or abnormal, while Artificial Neural Networks (ANN) are used for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
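The feature pipeline described above (DWT decomposition, GLCM texture descriptors, KNN for the first classification level) can be sketched as follows; the wavelet choice, GLCM settings, and classifier parameters are assumptions rather than the authors' configuration.

```python
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image
from sklearn.neighbors import KNeighborsClassifier

def slice_features(img_u8):
    """DWT approximation/detail statistics plus GLCM texture descriptors for one slice."""
    cA, (cH, cV, cD) = pywt.dwt2(img_u8.astype(float), "haar")
    glcm = graycomatrix(img_u8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = [graycoprops(glcm, p)[0, 0]
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array([cA.mean(), cA.std(), cH.std(), cV.std(), cD.std(), *texture])

# X = np.stack([slice_features(s) for s in segmented_slices]); y = labels (0 normal, 1 abnormal)
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
```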
Procedia PDF Downloads 155
1373 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of Big Data technology into the tourism industry will allow companies to determine where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. From a technical perspective, natural language is processed by analysing the sentiment features of online reviews from tourists, and we then supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction of travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The text of the sentences was first classified through the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we have studied feature extraction methods such as count vectorization and TF-IDF vectorization, and implemented a Convolutional Neural Network (CNN) classifier algorithm for sentiment analysis to decide whether the tourist's attitude towards the destinations is positive, negative, or simply neutral, based on the review text posted online. The results demonstrated that the CNN algorithm, after pre-processing and cleaning of the dataset, achieved an accuracy of 96.12% for positive and negative sentiment analysis.
Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis
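The two feature-extraction routes mentioned above (count vectorization and TF-IDF vectorization) can be sketched with scikit-learn as below. The classifier shown is a simple linear stand-in for the CNN used in the paper, and the toy reviews and parameter choices are assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["The hotel staff were wonderful", "Crowded, noisy and overpriced"]  # toy examples
labels = [1, 0]                                       # 1 = positive, 0 = negative

count_X = CountVectorizer(ngram_range=(1, 2)).fit_transform(reviews)   # raw counts

tfidf_clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                          LogisticRegression(max_iter=1000))
tfidf_clf.fit(reviews, labels)
print(tfidf_clf.predict(["friendly staff and a great view"]))
```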
Procedia PDF Downloads 88
1372 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific in nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse the information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
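The question-answering step can be illustrated with the Hugging Face pipeline API. The sketch below uses a plain-text extractive QA model as a stand-in for the LayoutLM-based model (which additionally consumes page layout), and the model name, question, and report snippet are assumptions.

```python
from transformers import pipeline

# Stand-in extractive QA model; the paper's framework uses LayoutLM with layout features.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

page_text = ("Well 15/9-F-12 was spudded on 2007-01-14. "
             "The wellbore reached a total depth of 3825 m MD.")  # hypothetical snippet

result = qa(question="What is the total depth of the wellbore?", context=page_text)
print(result["answer"], result["score"])   # extracted attribute value plus confidence
```

A dynamic query-generation module of the kind described would rephrase the question per wellbore (e.g. inserting the wellbore name) before calling the QA model.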
Procedia PDF Downloads 193
1371 Efficient Video Compression Technique Using Convolutional Neural Networks and Generative Adversarial Network
Authors: P. Karthick, K. Mahesh
Abstract:
Video has become an increasingly significant component of our everyday digital communication. With the advance towards richer content and higher display resolutions, its significant volume poses serious obstacles to receiving, distributing, compressing, and presenting video content of high quality. In this paper, we propose a first step towards a complete deep video compression model that jointly optimizes all video compression components. The video compression method involves splitting the video into frames, comparing the images using convolutional neural networks (CNN) to remove duplicates, repeating a single image instead of the duplicate images by recognizing and detecting minute changes using a generative adversarial network (GAN), and recording them with long short-term memory (LSTM). Instead of the complete image, only the small changes generated using the GAN are substituted, which helps in frame-level compression. Pixel-wise comparison is performed using K-nearest neighbours (KNN) over the frame, clustered with K-means, and singular value decomposition (SVD) is applied to each and every frame in the video for all three color channels [Red, Green, Blue] to decrease the dimension of the utility matrix [R, G, B] by extracting its latent factors. Video frames are packed with parameters with the aid of a codec and converted to video format, and the results are compared with the original video. Repeated experiments on several videos with different sizes, durations, frame rates (FPS), and quality demonstrate a significant resampling rate. On average, the result produced had approximately a 10% deviation in quality and more than a 50% reduction in size when compared with the original video.
Keywords: video compression, K-means clustering, convolutional neural network, generative adversarial network, singular value decomposition, pixel visualization, stochastic gradient descent, frame per second extraction, RGB channel extraction, self-detection and deciding system
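The per-channel SVD step can be illustrated with NumPy: each color channel of a frame is approximated with its top-k singular values, which is the latent-factor reduction the abstract refers to. The rank k below is an arbitrary assumption.

```python
import numpy as np

def lowrank_frame(frame, k=30):
    """Approximate each RGB channel of an HxWx3 uint8 frame with a rank-k SVD reconstruction."""
    out = np.empty_like(frame, dtype=float)
    for c in range(3):                                   # R, G, B channels
        U, s, Vt = np.linalg.svd(frame[:, :, c].astype(float), full_matrices=False)
        out[:, :, c] = (U[:, :k] * s[:k]) @ Vt[:k, :]    # keep only the top-k latent factors
    return np.clip(out, 0, 255).astype(np.uint8)

# compressed_frames = [lowrank_frame(f) for f in frames]
```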
Procedia PDF Downloads 188
1370 ARABEX: Automated Dotted Arabic Expiration Date Extraction using Optimized Convolutional Autoencoder and Custom Convolutional Recurrent Neural Network
Authors: Hozaifa Zaki, Ghada Soliman
Abstract:
In this paper, we introduce an approach for Automated Dotted Arabic Expiration Date Extraction using an Optimized Convolutional Autoencoder (ARABEX) with bidirectional LSTM. This approach is used for translating Arabic dot-matrix expiration dates into their corresponding filled-in dates. A custom lightweight Convolutional Recurrent Neural Network (CRNN) model is then employed to extract the expiration dates. Due to the lack of available dataset images for the Arabic dot-matrix expiration date, we generated synthetic images by creating an Arabic dot-matrix True Type Font (TTF) to address this limitation. Our model was trained on a realistic synthetic dataset of 3287 images, covering the period from 2019 to 2027, represented in the format yyyy/mm/dd. We then trained our custom CRNN model using the generated synthetic images to assess the performance of our model (ARABEX) by extracting expiration dates from the translated images. Our proposed approach achieved an accuracy of 99.4% on the test dataset of 658 images, while also achieving a Structural Similarity Index (SSIM) of 0.46 for image translation on our dataset. The ARABEX approach demonstrates its ability to be applied to various downstream learning tasks, including image translation and reconstruction. Moreover, this pipeline (ARABEX+CRNN) can be seamlessly integrated into automated sorting systems to extract expiry dates and sort products accordingly during the manufacturing stage. By eliminating the need for manual entry of expiration dates, which can be time-consuming and inefficient for merchants, our approach offers significant gains in efficiency and accuracy for Arabic dot-matrix expiration date recognition.
Keywords: computer vision, deep learning, image processing, character recognition
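The synthetic-data step can be sketched as below: random dates in the 2019-2027 range are formatted as yyyy/mm/dd label strings, which would then be rendered with the custom dot-matrix TTF. The rendering call and font file mentioned in the comments are assumptions.

```python
import random
from datetime import date, timedelta

def random_labels(n=3287, start=date(2019, 1, 1), end=date(2027, 12, 31)):
    """Generate n expiration-date strings in yyyy/mm/dd format."""
    span = (end - start).days
    return [(start + timedelta(days=random.randrange(span + 1))).strftime("%Y/%m/%d")
            for _ in range(n)]

labels = random_labels()
# Each label string would then be rendered to an image with the Arabic dot-matrix font,
# e.g. via PIL.ImageFont.truetype("arabic_dot_matrix.ttf", 32)  -- hypothetical font file
```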
Procedia PDF Downloads 82
1369 Improving Fingerprinting-Based Localization System Using Generative AI
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. It also employs a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
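The fingerprint feature-extraction step with t-SNE can be sketched with scikit-learn as below; the array shape, perplexity, and embedding dimension are assumptions, and the random values stand in for real received-signal-strength measurements.

```python
import numpy as np
from sklearn.manifold import TSNE

# rss: received-signal-strength fingerprints, one row per reference point,
# columns = hybrid WLAN + LTE measurements (placeholder values here)
rss = np.random.uniform(-100, -30, size=(500, 40))

embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(rss)
# 'embedding' is the low-dimensional fingerprint feature fed into radio-map construction
```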
Procedia PDF Downloads 60
1368 Quality Assessment of New Zealand Mānuka Honeys Using Hyperspectral Imaging Combined with Deep 1D-Convolutional Neural Networks
Authors: Hien Thi Dieu Truong, Mahmoud Al-Sarayreh, Pullanagari Reddy, Marlon M. Reis, Richard Archer
Abstract:
New Zealand mānuka honey is a honeybee product derived mainly from Leptospermum scoparium nectar. The potent antibacterial activity of mānuka honey derives principally from methylglyoxal (MGO), in addition to the hydrogen peroxide and other lesser activities present in all honey. MGO is formed from dihydroxyacetone (DHA), which is unique to L. scoparium nectar. Mānuka honey also has an idiosyncratic phenolic profile that is useful as a chemical marker. Authentic mānuka honey is highly valuable, but almost all honey is formed from natural mixtures of nectars harvested by a hive over a time period. Once diluted by other nectars, mānuka honey irrevocably loses value. We aimed to apply hyperspectral imaging to honey frames before bulk extraction to minimise the dilution of genuine mānuka by other honey and ensure authenticity at the source. This technology is non-destructive and suitable for an industrial setting. Chemometrics using linear Partial Least Squares (PLS) and Support Vector Machine (SVM) showed limited efficacy in interpreting the chemical footprints due to large non-linear relationships between predictor and predictand in a large sample set, likely due to honey quality variability across geographic regions. Therefore, an advanced modelling approach, one-dimensional convolutional neural networks (1D-CNN), was investigated for analysing the hyperspectral data and extracting biochemical information from honey. The 1D-CNN model showed superior prediction of honey quality (R² = 0.73, RMSE = 2.346, RPD = 2.56) compared to PLS (R² = 0.66, RMSE = 2.607, RPD = 1.91) and SVM (R² = 0.67, RMSE = 2.559, RPD = 1.98). Classification of monofloral mānuka honey versus multifloral and non-mānuka honey exceeded 90% accuracy for all models tried. Overall, this study reveals the potential of HSI and deep learning modelling for automating the evaluation of honey quality in frames.
Keywords: mānuka honey, quality, purity, potency, deep learning, 1D-CNN, chemometrics
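A minimal 1D-CNN regressor of the kind described, together with the RPD metric (standard deviation of the reference values divided by the RMSE), might look like the Keras sketch below; the layer sizes, number of spectral bands, and training settings are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_1dcnn(n_bands):
    """Small 1D-CNN mapping a reflectance spectrum to a honey quality value (e.g. MGO)."""
    return keras.Sequential([
        layers.Conv1D(32, 7, activation="relu", input_shape=(n_bands, 1)),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, 5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1),
    ])

def rpd(y_true, y_pred):
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return np.std(y_true) / rmse          # RPD = SD(reference values) / RMSE

model = build_1dcnn(n_bands=288)          # band count is an assumption
model.compile(optimizer="adam", loss="mse")
```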
Procedia PDF Downloads 140
1367 Purification of Zr from Zr-Hf Resources Using Crystallization in HF-HCl Solvent Mixture
Authors: Kenichi Hirota, Jifeng Wang, Sadao Araki, Koji Endo, Hideki Yamamoto
Abstract:
Zirconium (Zr) has been used for fuel cladding tubes in nuclear reactors because of its excellent corrosion resistance and low neutron absorption. Generally speaking, natural Zr resources often contain Hf, which has similar properties; the Hf content in Zr resources is about 2-4 wt%. For industrial use, the Hf content in Zr resources should be lower than 100 ppm. However, the separation of Zr and Hf is not easy because of their similar chemical and physical properties, such as melting point and boiling point. The solvent extraction method has been applied for the separation of Zr and Hf from natural Zr resources. This method can separate Hf with high efficiency (Hf < 100 ppm); however, it needs a large amount of organic solvent for the extraction, and the cost of its disposal treatment is high. Therefore, we turned our attention to fractional crystallization. This separation method depends on the difference in solubility of Zr and Hf in the solvent. In this work, potassium hexafluorozirconate (hafnate) (K2Zr(Hf)F6) was used as a model compound. The solubility of K2ZrF6 in water is lower than that of K2HfF6. By repeating this treatment, it is possible to purify Zr practically; in this case, 16-18 recrystallization stages were needed for high purification. An improvement of the crystallization process was carried out in this work. Water, hydrofluoric acid (HF), and a hydrofluoric acid (HF) + hydrochloric acid (HCl) mixture were chosen as solvents for the dissolution of Zr and Hf. In the experiment, 10 g of K2ZrF6 was added to 100 mL of each solvent. Each solution was heated for 1 hour at 353 K. After 1 h of this operation, the solutions were cooled down to 293 K and were held for 5 hours at 273 K. The concentrations of Zr and Hf were measured using ICP analysis. It was found that Hf was separated from the Zr-Hf mixed compound with high efficiency when the HF-HCl solution was used as the solvent for crystallization. From a comparison of the particle size of each crystal by SEM, it was confirmed that the particle diameter of the crystals decreased with decreasing Hf content. This paper is concerned with the purification of Zr from a Zr-Hf mixture using a crystallization method.
Keywords: crystallization, zirconium, hafnium, separation
Procedia PDF Downloads 438
1366 Post-Operative Pain Management in Ehlers-Danlos Hypermobile-Type Syndrome Following Wisdom Teeth Extraction: A Case Report and Literature Review
Authors: Aikaterini Amanatidou
Abstract:
We describe the case of a 20-year-old female patient diagnosed with Ehlers-Danlos Syndrome (EDS) who was scheduled to undergo wisdom teeth extraction in outpatient surgery. EDS is a hereditary connective tissue disorder characterized by joint hypermobility, skin hyper-extensibility, and vascular and soft tissue fragility. There are six subtypes of Ehlers-Danlos, and in our case, the patient had the EDS hypermobility-type (HT) disorder. One important clinical feature of this syndrome is chronic pain, which is often poorly understood and treated. Our patient had a long history of articular and lumbar pain when she was diagnosed. She was prescribed analgesic treatment for acute and neuropathic pain and had multiple sessions of psychotherapy and physiotherapy to ease the pain. Unfortunately, her extensive medical history was underrated by our anesthetic team, and no further measures were taken for the operation. Despite an uneventful intra-operative phase, the patient experienced several episodes of hyperalgesia during immediate post-operative care. Management of the pain was challenging for the anesthetic team: the initial opioid treatment had only a temporary effect and produced a paradoxical reaction after a while. Final pain relief was eventually obtained with psycho-physiologic treatment, high doses of ketamine, and a patient-controlled analgesia infusion of morphine-ketamine-dehydrobenzperidol. We suspected an episode of opioid-induced hyperalgesia. This case report supports the hypothesis that anti-hyperalgesics such as ketamine, as well as lidocaine and dexmedetomidine, should be considered intra-operatively to avoid opioid-induced hyperalgesia and may be an alternative solution for managing complex chronic pain, as in other neuropathic pain syndromes.
Keywords: Ehlers-Danlos, post-operative management, hyperalgesia, opioid-induced hyperalgesia, rare disease
Procedia PDF Downloads 95
1365 Optimal Design of Wind Turbine Blades Equipped with Flaps
Authors: I. Kade Wiratama
Abstract:
As a result of the significant growth of wind turbines in size, blade load control has become the main challenge for large wind turbines. Many advanced techniques have been investigated aiming at developing control devices to ease blade loading. Amongst them, trailing edge flaps have been proven to be effective devices for load alleviation. The present study aims at investigating the potential benefits of flaps in enhancing the energy capture capabilities rather than blade load alleviation. A software tool was especially developed for the aerodynamic simulation of wind turbines utilising blades equipped with flaps. As part of the aerodynamic simulation of these wind turbines, the control system must also be simulated. The simulation of the control system is carried out by solving an optimisation problem which gives the best value of the controlling parameter at each wind turbine run condition. By developing a genetic algorithm optimisation tool especially designed for wind turbine blades and integrating it with the aerodynamic performance evaluator, a design optimisation tool for blades equipped with flaps is constructed. The design optimisation tool is employed to carry out design case studies. The results of design case studies on the wind turbine AWT 27 reveal that, as expected, the location of the flap is a key parameter influencing the amount of improvement in power extraction. The best location for placing a flap is at about 70% of the blade span from the root of the blade. The size of the flap also has a significant effect on the amount of enhancement in the average power. This effect, however, reduces dramatically as the size increases. For constant-speed rotors, adding flaps without re-designing the topology of the blade can improve the power extraction capability by as much as about 5%. However, by re-designing the blade pretwist, the overall improvement can reach as high as 12%.
Keywords: flaps, design blade, optimisation, simulation, genetic algorithm, WTAero
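The genetic-algorithm layer can be illustrated with a compact NumPy sketch that searches over flap spanwise location and size. The fitness function below is only a placeholder for the aerodynamic performance evaluator developed in the study, and all GA settings and bounds are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop):
    """Placeholder for the aerodynamic evaluator: reward flaps near ~70% span."""
    loc, size = pop[:, 0], pop[:, 1]
    return -((loc - 0.7) ** 2) - 0.1 * (size - 0.15) ** 2

def run_ga(pop_size=40, generations=60, mut=0.02):
    # each individual = (flap location as fraction of span, flap size as fraction of span)
    pop = rng.uniform([0.2, 0.05], [0.95, 0.3], size=(pop_size, 2))
    for _ in range(generations):
        f = fitness(pop)
        parents = pop[np.argsort(f)[-pop_size // 2:]]              # truncation selection
        children = parents[rng.integers(len(parents), size=pop_size - len(parents))]
        children = children + rng.normal(0, mut, children.shape)   # Gaussian mutation
        pop = np.vstack([parents, np.clip(children, [0.2, 0.05], [0.95, 0.3])])
    return pop[np.argmax(fitness(pop))]

print(run_ga())   # best (location, size) found for the placeholder objective
```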
Procedia PDF Downloads 337
1364 Mentha piperita Formulations in Natural Deep Eutectic Solvents: Phenolic Profile and Biological Activity
Authors: Tatjana Jurić, Bojana Blagojević, Denis Uka, Ružica Ždero Pavlović, Boris M. Popović
Abstract:
Natural deep eutectic solvents (NADES) represent a class of modern systems that have been developed as a green alternative to the toxic organic solvents commonly used as extraction media. Hydrogen bonding is considered the main interaction leading to the formation of NADES. The aim of this study was the phytochemical characterization and determination of the antioxidant and antibacterial activity of Mentha piperita leaf extracts obtained with six choline chloride-based NADES. The NADES were prepared by mixing choline chloride with different hydrogen bond donors in a 1:1 molar ratio, followed by the addition of 30% (w/w) water. The mixtures were then heated (60 °C) and stirred (650 rpm) until clear homogeneous liquids were obtained. The Mentha piperita extracts were prepared by mixing 75 mg of peppermint leaves with 1 mL of NADES, followed by heating and stirring (60 °C, 650 rpm) for 30 min. The content of six phenolics in the extracts was determined using HPLC-PDA. The dominant compounds present in peppermint leaves, rosmarinic acid and luteolin 7-O-glucoside, were extracted by NADES at a level similar to that of 70% ethanol. The microdilution method was applied to test the antibacterial activity of the extracts. Compared with 70% ethanol, all NADES systems showed higher antibacterial activity towards Pseudomonas aeruginosa (Gram -), Staphylococcus aureus (Gram +), Escherichia coli (Gram -), and Salmonella enterica (Gram -), especially the NADES containing organic acids. The majority of NADES extracts showed a better ability to neutralize the DPPH radical than the conventional solvent and a similar ability to reduce Fe3+ to Fe2+ ions in the FRAP assay. The obtained results introduce NADES systems as novel, sustainable, and low-cost solvents with a variety of applications.
Keywords: antibacterial activity, antioxidant activity, green extraction, natural deep eutectic solvents, polyphenols
Procedia PDF Downloads 186
1363 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed SRCLoc method can significantly improve positioning performance and reduce radio map construction costs.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
Procedia PDF Downloads 71
1362 Linkage between a Plant-based Diet and Visual Impairment: A Systematic Review and Meta-Analysis
Authors: Cristina Cirone, Katrina Cirone, Monali S. Malvankar-Mehta
Abstract:
Purpose: An increased risk of visual impairment has been observed in individuals lacking a balanced diet. The purpose of this paper is to characterize the relationship between plant-based diets and specific ocular outcomes among adults. Design: Systematic review and meta-analysis. Methods: This systematic review and meta-analysis was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement guidelines. The databases MEDLINE, EMBASE, Cochrane, and PubMed were systematically searched up until May 27, 2021. Of the 503 articles independently screened by two reviewers, 21 were included in this review. Quality assessment and data extraction were performed by both reviewers. Meta-analysis was conducted using STATA 15.0. Fixed-effect and random-effect models were computed based on heterogeneity. Results: A total of 503 studies were identified, which then underwent duplicate removal and a title and abstract screen. The remaining 61 studies underwent a full-text screen, 21 progressed to data extraction, and fifteen were included in the quantitative analysis. Meta-analysis indicated that regular consumption of fish (OR = 0.70; CI: [0.62-0.79]) and of skim milk, poultry, and non-meat animal products (OR = 0.70; CI: [0.61-0.79]) is associated with a reduced risk of visual impairment (age-related macular degeneration, age-related maculopathy, cataract development, and central geographic atrophy) among adults. Consumption of red meat (OR = 1.41; CI: [1.07-1.86]) is associated with an increased risk of visual impairment. Conclusion: Overall, a pescatarian diet is associated with the most favorable visual outcomes among adults, while the consumption of red meat appears to negatively impact vision. The results suggest a need for more local and government-led interventions promoting a healthy and balanced diet.
Keywords: plant-based diet, pescatarian diet, visual impairment, systematic review, meta-analysis
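The pooling of odds ratios can be illustrated with a standard inverse-variance fixed-effect calculation, with standard errors recovered from the 95% confidence intervals. The two input studies below are made-up numbers for illustration, not data from the review.

```python
import numpy as np

def fixed_effect_or(ors, ci_low, ci_high):
    """Inverse-variance fixed-effect pooling of odds ratios on the log scale."""
    log_or = np.log(ors)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from the 95% CI width
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp([pooled, lo, hi])                         # pooled OR and its 95% CI

# illustrative inputs only
print(fixed_effect_or(np.array([0.75, 0.65]),
                      np.array([0.60, 0.52]),
                      np.array([0.94, 0.81])))
```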
Procedia PDF Downloads 185
1361 Phytotechnologies for Use and Reconstitution of Contaminated Sites
Authors: Olga Shuvaeva, Tamara Romanova, Sergey Volynkin, Valentina Podolinnaya
Abstract:
The green chemistry concept is focused on the prevention of environmental pollution caused by human activity. However, there are many contaminated areas in the world which pose a serious threat to ecosystems in terms of their conservation. Therefore, in accordance with the principles of green chemistry, the need to clean these areas should not be forgotten. Furthermore, the waste material often contains valuable components whose extraction by traditional wet chemical technologies is inefficient from both the economic and the environmental protection standpoint. Here, plants may be successfully used to 'scavenge' a range of metals from polluted land sites in an approach that allows both of these processes, phytoremediation and phytomining, to be carried out in conjunction. The goal of the present work was to study the bioaccumulation ability of floating macrophytes such as water hyacinth and pondweed toward Hg, Ba, Cd, Mo, and Pb as pollutants in an aquatic medium, and of terrestrial plants (birch, reed, and cane) towards gold and silver as valuable components. The peculiarity of the ongoing research is that the plants grew under extreme conditions (the pH of the drainage and pore waters was about 2.5). The study was conducted at the territory of the Ursk tailings (Southwestern Siberia, Russia), formed as a result of the cyanidation of primary polymetallic ores. The waste material consists mainly (~80%) of pyrite (FeS₂) and barite (BaSO₄); the raw minerals included FeAsS, HgS, PbS, and Ag₂S as minor phases. It has been shown that water hyacinth demonstrates a high ability to accumulate different metals and, what is especially important, to remove mercury from polluted waters with a BCF value of more than 1000. As for gold, its concentrations in reed and cane growing near the waste material were estimated at 500 and 900 μg∙kg⁻¹, respectively. It was also found that the plants can survive under extremely acidic conditions, and hence we can assume that there is, in principle, an opportunity to use them for the extraction of valuable substances from mining waste dump areas.
Keywords: bioaccumulation, gold, heavy metals, mine tailing
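The bioconcentration factor (BCF) quoted above is simply the ratio of the metal concentration in plant tissue to that in the surrounding water; a one-line check with assumed concentrations is shown below.

```python
def bcf(c_plant_mg_per_kg, c_water_mg_per_l):
    """Bioconcentration factor: tissue concentration over water concentration."""
    return c_plant_mg_per_kg / c_water_mg_per_l

# e.g. 2.1 mg/kg Hg in water hyacinth tissue vs 0.002 mg/L in water (assumed values)
print(bcf(2.1, 0.002))   # -> 1050.0, i.e. BCF > 1000
```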
Procedia PDF Downloads 173
1360 HCl-Based Hydrometallurgical Recycling Route for Metal Recovery from Li-Ion Battery Wastes
Authors: Claudia Schier, Arvid Biallas, Bernd Friedrich
Abstract:
The demand for Li-ion batteries is increasing compared to other battery systems owing to their benefits, such as fast charging, high energy density, low weight, a large temperature range, and long service life. These characteristics are essential not only for battery-operated portable devices but also in the growing field of electromobility, where high-performance energy storage systems in the form of batteries are in high demand. Due to sharply rising production, there is tremendous interest in recycling spent Li-ion batteries in a closed-loop manner, owing to their high content of valuable metals such as cobalt, manganese, and lithium, as well as the increasing demand for these scarce metals. Currently, there are just a few industrial processes using hydrometallurgical methods to recover valuable metals from Li-ion battery waste. In this study, the extraction of valuable metals from spent Li-ion batteries is investigated by pretreating and subsequently leaching battery wastes and comparing different precipitation methods. For the extraction of lithium, cobalt, and other valuable metals, pelletized battery wastes with an initial Li content of 2.24 wt.% and a cobalt content of 22 wt.% are used. Hydrochloric acid at 4 mol/L is applied at a 1:50 solid-to-liquid (s/l) ratio to generate a pregnant leach solution for the subsequent precipitation steps. In order to obtain pure precipitates, two different pathways (pathway 1 and pathway 2) are investigated, which differ from each other with regard to the precipitation steps carried out. While lithium carbonate recovery is the final process step in pathway 1, pathway 2 requires a preliminary removal of lithium from the process. The aim is to evaluate both processes in terms of the purity and yield of the products obtained. ICP-OES is used to determine the chemical content of the leach liquor as well as of the solid residue.
Keywords: hydrochloric acid, hydrometallurgy, Li-ion-batteries, metal recovery
Procedia PDF Downloads 172
1359 Processing and Economic Analysis of Rain Tree (Samanea saman) Pods for Village Level Hydrous Bioethanol Production
Authors: Dharell B. Siano, Wendy C. Mateo, Victorino T. Taylan, Francisco D. Cuaresma
Abstract:
Biofuel is one of the renewable energy sources adopted by the Philippine government in order to lessen the dependency on foreign fuel and to reduce carbon dioxide emissions. Rain tree pods were seen to be a promising source of bioethanol since they contain a significant amount of fermentable sugars. The study was conducted to establish the complete procedure for processing rain tree pods for village-level hydrous bioethanol production. The production process covered collection, drying, storage, shredding, dilution, extraction, fermentation, and distillation. The feedstock was sun-dried, and the moisture content was determined to be in the range of 20% to 26% prior to storage. The dilution ratio was 1:1.25 (1 kg of pods to 1.25 L of water), and the extraction process yielded a sugar concentration of 22 °Bx to 24 °Bx. The dilution period was three hours. After three hours of diluting the samples, the juice was extracted using an extractor with a capacity of 64.10 L/hour. A total of 150 L of rain tree pod juice was extracted and subjected to fermentation using a village-level anaerobic bioreactor. Fermentation with yeast (Saccharomyces cerevisiae) can speed up the process, producing more ethanol in a shorter period of time; however, fermentation without added yeast also produces ethanol, although at a lower volume and with a slower fermentation process. Distillation of 150 L of fermented broth was done for six hours at a feedstock temperature of 85 °C to 95 °C and a column head temperature (vapor state of ethanol) of 74 °C to 95 °C. The highest volume of ethanol recovered, 14.89 L, was obtained with yeast fermentation over a five-day duration, and the lowest actual ethanol content, 11.63 L, was found without yeast fermentation over a three-day duration. In general, the results suggested that rain tree pods have very good potential as a feedstock for bioethanol production, and fermentation of rain tree pod juice can be done with or without yeast.
Keywords: hydrous bioethanol, fermentation, rain tree pods, village level
Procedia PDF Downloads 295
1358 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities – Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and the contextual meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions, and operational experience during the production system life cycle. Context meaning representation is highly dependent upon word sense, lexical relativity, and the semantic features of the argument. This paper proposes a method for lexical semantic analysis and context meaning representation of maintenance activities in a mass production system. Our approach constructs a straightforward lexical semantic analysis of the semantic and syntactic features of the context structure of maintenance reports to facilitate the translation, interpretation, and conversion of human-readable text into a computer-readable representation with less heterogeneity and ambiguity. The methodology will enable users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage. It provides a contextualized structure to obtain a generic context model that can be utilized during the system life cycle. First, it employs a co-occurrence-based clustering framework to recognize a group of highly frequent contextual features that correspond to a maintenance report text. Then the keywords are identified for syntactic and semantic extraction analysis. The analysis exercises causality-driven logic over the keywords' senses to divulge the structural and meaning dependency relationships between the words in a context. The output is a contextualized word representation of maintenance activities accommodating computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
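The co-occurrence-based clustering step can be sketched as below: a term-document matrix over maintenance reports yields a term-term co-occurrence matrix whose rows can then be clustered. The example reports and cluster count are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

reports = ["replaced riser clamp after corrosion inspection",      # invented examples
           "corrosion found on riser joint during inspection",
           "pump seal leak repaired and seal replaced"]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(reports)                 # documents x terms
cooc = (X.T @ X).toarray()                     # term x term co-occurrence counts
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cooc)
print(dict(zip(vec.get_feature_names_out(), clusters)))
```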
Procedia PDF Downloads 105
1357 A Scientific Method of Drug Development Based on Ayurvedic Bhaishajya Knowledge
Authors: Rajesh S. Mony, Vaidyaratnam Oushadhasala
Abstract:
An attempt is made in this study to evolve a drug development modality based on the classical Ayurvedic knowledge base as well as on modern scientific methodology. The study involves (a) identification of a specific ailment condition, (b) selection of a polyherbal formulation, (c) deciding a suitable extraction procedure, (d) confirming the efficacy of the combination by in-vitro trials, and (e) fixing the recommended dose. The ailment segment selected is the arthritic condition. The selected herbal combination is Kunturushka, Vibhitaki, Guggulu, Haridra, Maricha, and Nirgundi. The herbs were selected as per classical Ayurvedic references and authenticated as per the Ayurvedic Pharmacopoeia of India. Each drug was extracted with hydroalcoholic menstrua of different ratios. After removal of residual solvent, each extract was assessed in vitro for anti-inflammatory and anti-arthritic activities and for COX enzyme inhibition (by UV-Vis spectrophotometry with a positive control). Extracts showing good in-vitro activity were selected, and QC testing of each selected extract, including HPTLC, was performed to establish the in-process QC specifications. The single dose of the mixture of selected extracts was decided according to the level of in-vitro activity and the available toxicology data. Major groups such as phenolics, flavonoids, alkaloids, and bitters were quantified by standard spectrophotometric and gravimetric methods. A marker assay method was developed and validated by HPTLC, and a well-resolved HPTLC fingerprint was developed for the single-dosage API (Active Pharmaceutical Ingredient, the mixture of extracts). Three batches were prepared to fix the in-process and API QC specifications. Keywords: drug development, anti-inflammatory, quality standardisation, planar chromatography
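To make the in-vitro screening arithmetic concrete, the sketch below shows the standard percent-inhibition calculation relative to an untreated control absorbance, as commonly applied to UV-Vis spectrophotometric anti-inflammatory assays; the absorbance readings and extract labels are invented for illustration and do not reproduce the authors' data.

```python
# Percent-inhibition calculation for a UV-Vis assay (illustrative values only).

def percent_inhibition(abs_control: float, abs_sample: float) -> float:
    """Inhibition (%) relative to the untreated control absorbance."""
    return (abs_control - abs_sample) / abs_control * 100.0

control_absorbance = 0.820          # assumed control reading
extract_readings = {
    "extract A": 0.412,             # assumed
    "extract B": 0.365,             # assumed
    "positive control": 0.290,      # assumed
}

for name, absorbance in extract_readings.items():
    print(f"{name}: {percent_inhibition(control_absorbance, absorbance):.1f}% inhibition")
```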
Procedia PDF Downloads 100
1356 Methodology for the Determination of Triterpenic Compounds in Apple Extracts
Authors: Mindaugas Liaudanskas, Darius Kviklys, Kristina Zymonė, Raimondas Raudonis, Jonas Viškelis, Norbertas Uselis, Pranas Viškelis, Valdimaras Janulis
Abstract:
Apples are among the most commonly consumed fruits in the world; based on data from 2014, approximately 84.63 million tons of apples are grown per annum. Apples are widely used in the food industry to produce various products and drinks (juice, wine, and cider) and are also consumed unprocessed. Apples in the human diet are an important source of different groups of biologically active compounds that can contribute positively to the prevention of various diseases. They are a source of various biologically active substances, especially vitamins, organic acids, micro- and macro-elements, pectins, and phenolic, triterpenic, and other compounds. Triterpenic compounds, which are characterized by versatile biological activity, are among the most promising and most significant for human health of the biologically active compounds found in apples. A specific analytical procedure, including sample preparation and High Performance Liquid Chromatography (HPLC) analysis, was developed, optimized, and validated for the detection of triterpenic compounds in samples of whole apples, their peels, and their flesh from the widespread apple cultivars 'Aldas', 'Auksis', 'Connel Red', 'Ligol', 'Lodel', and 'Rajka' grown under Lithuanian climatic conditions. The conditions for triterpenic compound extraction were optimized: the extraction solvent was 100% (v/v) acetone, and the extraction was performed in an ultrasound bath for 10 min. Isocratic elution (with an eluent ratio of 88% solvent A to 12% solvent B) was applied for rapid separation of the triterpenic compounds. The methodology was validated on the basis of the ICH recommendations, evaluating the following validation characteristics: selectivity (specificity), precision, the detection and quantitation limits of the analytes, and linearity. The obtained parameter values confirm the suitability of the methodology for the analysis of triterpenic compounds. Using the optimized and validated HPLC technique, four triterpenic compounds were separated and identified, and their specificity was confirmed: corosolic acid, betulinic acid, oleanolic acid, and ursolic acid. Ursolic acid was the dominant compound in all the tested apple samples, while the detected amount of betulinic acid was the lowest of all the identified triterpenic compounds. The greatest amounts of triterpenic compounds were detected in the whole-apple and apple-peel samples of the 'Lodel' cultivar; apples and apple extracts of this cultivar are therefore potentially valuable for use in medical practice, for the prevention of various diseases, for adjunct therapy, for the isolation of individual compounds with a specific biological effect, and for the development and production of dietary supplements and functional food enriched in biologically active compounds. Acknowledgements: This work was supported by a grant from the Research Council of Lithuania, project No. MIP-17-8. Keywords: apples, HPLC, triterpenic compounds, validation
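The detection and quantitation limits listed among the validation characteristics are conventionally derived from the calibration curve per ICH guidance (LOD = 3.3·σ/S, LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope). The sketch below illustrates that calculation on invented concentration/peak-area pairs; it is not the study's calibration data.

```python
# ICH-style LOD/LOQ estimation from a linear calibration curve (illustrative data).
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])    # ug/mL, assumed calibration levels
area = np.array([12.1, 24.6, 61.0, 121.8, 244.9])  # peak areas, assumed

slope, intercept = np.polyfit(conc, area, 1)        # least-squares fit
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                       # residual standard deviation (n - 2)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r = np.corrcoef(conc, area)[0, 1]                   # linearity check

print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.4f}")
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL")
```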
Procedia PDF Downloads 173
1355 Thermodynamic Evaluation of Coupling APR-1400 with a Thermal Desalination Plant
Authors: M. Gomaa Abdoelatef, Robert M. Field, Lee, Yong-Kwan
Abstract:
Growing human populations have placed increased demands on water supplies and heightened interest in desalination infrastructure. Key elements of the economics of desalination projects are their thermal and electrical inputs. With growing concerns over the use of fossil fuels to (indirectly) supply these inputs, coupling desalination with nuclear power production represents a significant opportunity. Individually, nuclear and desalination technologies have a long history and are relatively mature. For desalination, Reverse Osmosis (RO) has the lowest energy inputs. However, the economically driven output quality of the water produced using RO, which uses only electrical inputs, is lower than the output water quality from thermal desalination plants. Modern desalination projects therefore consider coupling RO with thermal desalination technologies (MSF, MED, or MED-TVC), with their attendant steam inputs, to permit blending that produces various qualities of water. A large nuclear facility is well positioned to dispatch large quantities of both electrical and thermal power. This paper considers the supply of thermal energy to a large desalination facility and examines the heat balance impact on the nuclear steam cycle. The APR-1400 nuclear plant is selected as prototypical from both a capacity and a turbine cycle heat balance perspective to examine steam supply and the impact on electrical output. Extraction points and quantities of steam are considered parametrically along with various types of thermal desalination technologies to form the basis for further evaluations of economically optimal approaches to interfacing nuclear power production with desalination projects. In our study, the thermodynamic evaluation will be carried out with DE-TOP, the IAEA desalination program, which is capable of analyzing power generation systems coupled to desalination systems at various steam extraction positions, taking into consideration the isolation loop between the APR-1400 and the thermal desalination plant for safety reasons. Keywords: APR-1400, desalination, DE-TOP, IAEA, MSF, MED, MED-TVC, RO
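As a very rough, hedged illustration of the heat-balance question posed above, the sketch below estimates the electrical output forgone when a given steam flow is diverted at an extraction point rather than expanded to the condenser; the enthalpies, mass flow, and efficiencies are assumed placeholder values, and the actual analysis relies on the APR-1400 heat balance and the IAEA DE-TOP tool.

```python
# Simplified estimate of generator output lost to a steam extraction
# (all numerical values below are illustrative assumptions).

def lost_electrical_power_mw(m_dot_kg_s: float,
                             h_extraction_kj_kg: float,
                             h_condenser_kj_kg: float,
                             turbine_eff: float = 0.85,
                             generator_eff: float = 0.98) -> float:
    """Approximate MW(e) forgone by diverting steam at an extraction point.

    The enthalpy drop from extraction to condenser conditions stands in for
    the isentropic drop, scaled by an assumed turbine efficiency.
    """
    specific_work_kj_kg = (h_extraction_kj_kg - h_condenser_kj_kg) * turbine_eff
    return m_dot_kg_s * specific_work_kj_kg * generator_eff / 1000.0

# Example: 100 kg/s of low-pressure extraction steam (assumed figures)
print(f"{lost_electrical_power_mw(100.0, 2700.0, 2300.0):.1f} MW(e) forgone")
```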
Procedia PDF Downloads 532