Search results for: keyframe extraction

1533 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction

Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini

Abstract:

Fetal cardiac monitoring by fetal electrocardiogram (fECG) can provide significant clinical information about the health condition of the fetus. Despite this potential, the use of fECG in clinical practice has so far been quite limited because of the difficulties in measuring it. The recovery of fECG from signals acquired non-invasively by electrodes placed on the maternal abdomen is a challenging task, because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings that exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and of finding the linear combinations of preprocessed abdominal signals that maximize this fQI (quality index optimization, QIO). It aims to improve the performance of the most commonly adopted methods for fECG extraction, which are usually based on estimating and canceling the maternal ECG (mECG). The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; mECG extraction and maternal QRS detection; mECG component approximation and canceling by weighted principal component analysis; fECG extraction by fQI maximization and fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing the mECG estimated by principal component analysis (PCA) from the abdominal signals and applying Independent Component Analysis (ICA) to the residual signals. Both methods were developed and tuned using 69 one-minute abdominal recordings with fetal QRS annotations from dataset A of the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and ICA-based methods were compared by analyzing two databases of abdominal maternal ECG available on the PhysioNet site. The first is the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains fetal QRS annotations and thus allows a quantitative performance comparison; the second is the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain fetal QRS annotations, so the comparison between the two methods can only be qualitative. On the annotated database ADdb, the QIO method provided the performance indexes Sens = 0.9988, PPA = 0.9991, F1 = 0.9989, outperforming the ICA-based one, which provided Sens = 0.9966, PPA = 0.9972, F1 = 0.9969. The comparison on NIdb was performed by defining an index of quality for the fetal RR series; this index was higher for the QIO-based method than for the ICA-based one in 35 out of 55 records of the NIdb. The QIO-based method gave very high performance with both databases. The results of this study point toward applying the algorithm in a fully unsupervised way for implementation in wearable devices for self-monitoring of fetal health.
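As a rough illustration of the baseline (ICA-based) scheme the abstract describes — approximating the dominant maternal component by PCA, removing it, and applying ICA to the residuals — the sketch below uses scikit-learn on a synthetic four-lead abdominal mixture. The synthetic signals, channel count, and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: PCA-based maternal-ECG cancellation followed by ICA on the
# residuals, the general scheme of the baseline procedure described above.
# Synthetic signals and all parameter choices are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
fs = 500
t = np.arange(0, 60, 1 / fs)                        # 1-minute recording at 500 Hz

# Toy "abdominal" channels: strong maternal component + weak fetal component + noise
maternal = np.sin(2 * np.pi * 1.2 * t) ** 15        # ~72 bpm spiky waveform
fetal = 0.1 * np.sin(2 * np.pi * 2.3 * t) ** 15     # ~138 bpm, much weaker
mixing = rng.uniform(0.5, 1.5, size=(4, 2))         # 4 abdominal leads
abdominal = mixing @ np.vstack([maternal, fetal]) + 0.01 * rng.standard_normal((4, t.size))

# 1) Approximate the dominant (maternal) component with PCA and subtract it
pca = PCA(n_components=1).fit(abdominal.T)
mecg_estimate = pca.inverse_transform(pca.transform(abdominal.T)).T
residual = abdominal - mecg_estimate

# 2) Apply ICA to the residuals; one source should be fetal-dominated
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(residual.T).T
print("residual sources shape:", sources.shape)
```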

Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable

Procedia PDF Downloads 257
1532 Electrochemical Recovery of Lithium from Geothermal Brines

Authors: Sanaz Mosadeghsedghi, Mathew Hudder, Mohammad Ali Baghbanzadeh, Charbel Atallah, Seyedeh Laleh Dashtban Kenari, Konstantin Volchek

Abstract:

Lithium has recently been extensively used in lithium-ion batteries (LIBs) for electric vehicles and portable electronic devices. The conventional evaporative approach to recovering and concentrating lithium is extremely slow and may take 10-24 months to concentrate lithium from dilute sources, such as geothermal brines. To respond to the increasing industrial lithium demand, alternative extraction and concentration technologies should be developed to recover lithium from brines with low concentrations. In this study, a combination of electrocoagulation (EC) and electrodialysis (ED) was evaluated for the recovery of lithium from geothermal brines. The brine samples in this study, collected in Western Canada, had lithium concentrations of 50-75 mg/L on a background of much higher (over 10,000 times) concentrations of sodium. This very high sodium-to-lithium ratio poses challenges to conventional direct lithium extraction processes, which employ lithium-selective adsorbents. EC was used to co-precipitate lithium using a sacrificial aluminium electrode. The precipitate was then dissolved, and the leachate was treated using ED to separate and concentrate lithium from other ions. The focus of this paper is on the study of ED, including a two-step ED process consisting of a mono-valent selective stage to separate lithium from multi-valent cations, followed by a bipolar ED stage to convert lithium chloride (LiCl) to a LiOH product. Finally, the ED cell was reconfigured by combining the mono-valent cation exchange membranes with the bipolar membranes, merging the two ED steps into one. Using this process at optimum conditions, over 95% of the co-existing cations were removed and the purity of lithium increased to over 90% in the final product.

Keywords: electrochemical separation, electrocoagulation, electrodialysis, lithium extraction

Procedia PDF Downloads 63
1531 Luminescent Properties of Plastic Scintillator with Large Area Photonic Crystal Prepared by a Combination of Nanoimprint Lithography and Atomic Layer Deposition

Authors: Jinlu Ruan, Liang Chen, Bo Liu, Xiaoping Ouyang, Zhichao Zhu, Zhongbing Zhang, Shiyi He, Mengxuan Xu

Abstract:

Plastic scintillators play an important role in the measurement of mixed neutron/gamma pulsed radiation, neutron radiography, and pulse shape discrimination technology. For these applications, it is desirable that as many as possible of the photons produced by the interaction between a plastic scintillator and radiation be detected by the photoelectric detectors, and that more photons be emitted from the scintillator along the specific direction where the detectors are located. Unfortunately, a majority of the photons produced are trapped in the plastic scintillator due to total internal reflection (TIR): there is a significant light-trapping effect whenever the incident angle of the internal scintillation light is larger than the critical angle. Some of the trapped photons are absorbed by the scintillator itself, and the others are emitted from the edges of the scintillator. This makes the light extraction of plastic scintillators very low. Moreover, only a small portion of the photons emitted from the scintillator can be detected effectively, because the distribution of their emission directions exhibits an approximately Lambertian angular profile following a cosine emission law. Therefore, enhancing the light extraction efficiency and adjusting the emission angular profile are the keys to increasing the number of photons detected by the detectors. In recent years, photonic crystal structures have been applied to inorganic scintillators to successfully enhance the light extraction efficiency and adjust the angular profile of scintillation light. However, because preparation methods for photonic crystals can deteriorate the performance of plastic scintillators or even destroy them, investigations of preparation methods of photonic crystals for plastic scintillators and of the luminescent properties of plastic scintillators with photonic crystal structures remain limited. Although we have successfully fabricated photonic crystal structures on the surface of plastic scintillators by a modified self-assembly technique, achieving a large enhancement of light extraction efficiency without evident angular dependence of the scintillation light profile, the preparation of photonic crystal structures with large area (diameter larger than 6 cm) and a perfect periodic structure is still difficult. In this paper, large-area photonic crystals were first prepared on the surface of scintillators by nanoimprint lithography, and a conformal layer of high-refractive-index material was then deposited on the photonic crystal by atomic layer deposition in order to enhance the stability of the photonic crystal structures and increase the number of leaky modes for improving the light extraction efficiency. The luminescent properties of the plastic scintillator with photonic crystals prepared by this method are compared with those of a plastic scintillator without photonic crystals. The results indicate that the number of photons detected by the detectors is increased by the enhanced light extraction efficiency and that the angular profile of the scintillation light exhibits evident angular dependence for the scintillator with photonic crystals. The described preparation of photonic crystals is beneficial to scintillation detection applications and lays an important technical foundation for plastic scintillators to meet special requirements in different applications.
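As a rough illustration of the light-trapping effect described above, the sketch below computes the critical angle and the fraction of isotropically emitted photons that fall inside a single escape cone; the refractive index n = 1.58 is an assumed typical value for a polyvinyltoluene-based plastic scintillator, not a value taken from this work.

```python
# Back-of-envelope sketch of the light-trapping effect described above.
# n = 1.58 is an assumed typical value for a polyvinyltoluene-based
# plastic scintillator, not a value from the paper.
import numpy as np

n_scint, n_air = 1.58, 1.0
theta_c = np.arcsin(n_air / n_scint)            # critical angle for TIR
escape_fraction = (1 - np.cos(theta_c)) / 2     # solid-angle fraction of one escape cone

print(f"critical angle: {np.degrees(theta_c):.1f} deg")
print(f"fraction of isotropic photons inside one escape cone: {escape_fraction:.2%}")
```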

Keywords: angular profile, atomic layer deposition, light extraction efficiency, plastic scintillator, photonic crystal

Procedia PDF Downloads 172
1530 12 Real Forensic Casework Cases Solved by DNA STR-Typing of Skeletal Remains Exposed to Extreme Environmental Conditions without the Conventional Bone Pulverization Step

Authors: Chiara Della Rocca, Gavino Piras, Andrea Berti, Alessandro Mameli

Abstract:

DNA identification of human skeletal remains plays a valuable role in the forensic field, especially in missing persons and mass disaster investigations. Hard tissues, such as bones and teeth, represent a very common kind of sample analyzed in forensic laboratories because they are often the only biological materials remaining. However, the major limitation of using these compact samples lies in the extremely time-consuming and labor-intensive treatment of grinding them into powder before proceeding with the conventional DNA purification and extraction step. In this context, a DNA extraction assay called the TBone Ex kit (DNA Chip Research Inc.) was developed to digest bone chips without powdering. Here, we simultaneously analyzed bone and tooth samples that arrived at our police laboratory and belonged to 15 different forensic casework cases that occurred in Sardinia (Italy). A total of 27 samples were recovered from different scenarios and had been exposed to extreme environmental factors, including sunlight, seawater, soil, fauna, vegetation, and high temperature and humidity. The TBone Ex kit was used prior to the EZ2 DNA extraction kit on the EZ2 Connect Fx instrument (Qiagen), and high-quality autosomal and Y-chromosome STR profiles were obtained for 80% of the cases in an extremely short time frame. This study provides additional support for the use of the TBone Ex kit for digesting bone fragments and whole teeth as an effective alternative to pulverization protocols. We empirically demonstrated the effectiveness of the kit in processing multiple bone samples simultaneously, largely simplifying the DNA extraction procedure, and the good yield of recovered DNA for downstream genetic typing in highly compromised real forensic specimens. In conclusion, this study turns out to be extremely useful for forensic laboratories, from which the various actors of the criminal justice system – such as potential jury members, judges, defense attorneys, and prosecutors – require immediate feedback.

Keywords: DNA, skeletal remains, bones, TBone Ex kit, extreme conditions

Procedia PDF Downloads 12
1529 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to N-gram feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach. The score selected to represent the occurrence of the tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams to extract features for classification. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. These results were confirmed using the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best result was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
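For readers unfamiliar with the algorithm, the sketch below is a minimal Smith-Waterman local-alignment scorer applied to the character sequences of short phrases, the kind of pairwise similarity signal used here as an alternative to N-gram features; the match/mismatch/gap scores and the example phrases are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of Smith-Waterman local alignment scoring between two strings,
# the kind of similarity signal the paper uses for feature extraction.
# Scoring parameters (match/mismatch/gap) are illustrative assumptions.
def smith_waterman_score(a: str, b: str, match=2, mismatch=-1, gap=-1) -> int:
    """Return the best local-alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Example: documents sharing a local word sequence score higher than unrelated ones.
print(smith_waterman_score("patient with morbid obesity", "history of morbid obesity"))
print(smith_waterman_score("patient with morbid obesity", "normal body weight"))
```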

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 272
1528 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context

Authors: Selin Guney, Andres Riquelme

Abstract:

The traditional bio-economic method for fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamics are transformed into a revenue function and then compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic population growth model is combined with a forecast of the abundance and location of the stock obtained by a generalized additive model approach. The paper focuses on the Chilean hake population. This method allows the incorporation of climatic variables and the interaction with other marine species, which in turn increases the reliability of the estimates and generates better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
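As a minimal sketch of the traditional bio-economic calculation referred to above — a logistic (Gordon-Schaefer type) surplus-production model turned into a revenue function and compared with extraction costs — the snippet below locates the efforts giving maximum sustainable and maximum economic yield; all parameter values are illustrative assumptions, not estimates for the Chilean hake stock.

```python
# Hedged sketch of the traditional bio-economic calculation: logistic surplus
# production turned into revenue and compared with extraction cost.
# Parameter values (r, K, price, cost, q) are purely illustrative assumptions.
import numpy as np

r, K = 0.4, 1000.0                               # intrinsic growth rate, carrying capacity
price, cost_per_effort, q = 5.0, 2.0, 0.001      # price per tonne, cost per unit effort, catchability

effort = np.linspace(0, r / q, 500)
biomass_eq = K * (1 - q * effort / r)            # equilibrium biomass under effort E
yield_eq = q * effort * biomass_eq               # sustainable yield (surplus production)
profit = price * yield_eq - cost_per_effort * effort

e_msy = effort[np.argmax(yield_eq)]              # maximum sustainable (biological) yield
e_mey = effort[np.argmax(profit)]                # maximum economic yield
print(f"E_MSY ~ {e_msy:.0f}, E_MEY ~ {e_mey:.0f} (E_MEY < E_MSY, as theory predicts)")
```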

Keywords: bio-economic, fisheries, GAM, production

Procedia PDF Downloads 229
1527 Best Timing for Capturing Satellite Thermal Images of Asphalt and Concrete Objects

Authors: Toufic Abd El-Latif Sadek

Abstract:

The asphalt object represents asphalted areas such as roads, and the concrete object represents concrete areas such as concrete buildings. Efficient extraction of the asphalt and concrete objects from a single satellite thermal image is possible at specific times, by avoiding the times at which asphalt, concrete, and other objects give close or identical brightness values; this enables efficient extraction and, in turn, better analysis. Seven sample objects were used in this study: asphalt, concrete, metal, rock, dry soil, vegetation, and water. It was found that the best timing for capturing satellite thermal images to extract the two objects, asphalt and concrete, from one satellite thermal image, saving time and money, occurs at a specific time in different months. A table is derived showing the optimal timing for capturing satellite thermal images to extract these two objects effectively.

Keywords: asphalt, concrete, satellite thermal images, timing

Procedia PDF Downloads 295
1526 COVID-19 Detection from Computed Tomography Images Using UNet Segmentation, Region Extraction, and Classification Pipeline

Authors: Kenan Morani, Esra Kaya Ayana

Abstract:

This study aimed to develop a novel pipeline for COVID-19 detection using a large and rigorously annotated database of computed tomography (CT) images. The pipeline consists of UNet-based segmentation, lung extraction, and a classification part, with the addition of optional slice removal techniques following the segmentation part. In this work, batch normalization was added to the original UNet model to produce a lighter model with better localization, which is then utilized to build a full pipeline for COVID-19 diagnosis. To evaluate the effectiveness of the proposed pipeline, various segmentation methods were compared in terms of their performance and complexity. The proposed segmentation method with batch normalization outperformed traditional methods and other alternatives, resulting in a higher Dice score on a publicly available dataset. Moreover, at the slice level, the proposed pipeline demonstrated high validation accuracy, indicating the efficiency of predicting 2D slices. At the patient level, the full approach exhibited higher validation accuracy and macro F1 score compared to other alternatives, surpassing the baseline. The classification component of the proposed pipeline utilizes a convolutional neural network (CNN) to make the final diagnosis decisions. The COV19-CT-DB dataset, which contains a large number of CT scans with various types of slices, rigorously annotated for COVID-19 detection, was utilized for classification. The proposed pipeline outperformed many other alternatives on this dataset.
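The sketch below shows a UNet-style double-convolution block with batch normalization inserted after each convolution, the kind of modification to the original UNet described above; the channel sizes, layer layout, and framework choice (PyTorch) are illustrative assumptions, not the authors' exact architecture.

```python
# Hedged sketch of a UNet-style double-convolution block with batch
# normalization added after each convolution. Channel sizes and layer layout
# are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class DoubleConvBN(nn.Module):
    """Two 3x3 convolutions, each followed by batch norm and ReLU."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)

# One encoder stage applied to a dummy CT slice (1 channel, 512 x 512).
x = torch.randn(1, 1, 512, 512)
print(DoubleConvBN(1, 32)(x).shape)   # -> torch.Size([1, 32, 512, 512])
```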

Keywords: classification, computed tomography, lung extraction, macro F1 score, UNet segmentation

Procedia PDF Downloads 106
1525 Statistical Modeling for Permeabilization of a Novel Yeast Isolate for β-Galactosidase Activity Using Organic Solvents

Authors: Shweta Kumari, Parmjit S. Panesar, Manab B. Bera

Abstract:

The hydrolysis of lactose using β-galactosidase is one of the most promising biotechnological applications, with a wide range of potential applications in the food processing industries. However, the intracellular location of the yeast enzyme and expensive extraction methods hamper the industrial application of enzymatic hydrolysis processes. The use of a permeabilization technique can help to overcome the problems associated with enzyme extraction and purification from yeast cells and to develop an economically viable process for the utilization of whole-cell biocatalysts in the food industries. In the present investigation, standardization of the permeabilization process of a novel yeast isolate was carried out using a statistical modeling approach known as Response Surface Methodology (RSM) to achieve maximal β-galactosidase activity. The optimum operating conditions for the permeabilization process obtained by RSM were a 1:1 ratio of toluene (25%, v/v) and ethanol (50%, v/v), a temperature of 25.0 °C, and a treatment time of 12 min, which gave an enzyme activity of 1.71 IU/mg DW.
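A minimal sketch of the response-surface step, assuming synthetic data: a second-order polynomial model of enzyme activity is fitted to two factors (temperature and treatment time) and the fitted surface is searched for its optimum. The factor ranges, data, and resulting optimum are illustrative only, not the study's design or measurements.

```python
# Hedged sketch of the response-surface step: fitting a second-order polynomial
# model of enzyme activity versus process factors and locating its optimum.
# The synthetic data below are illustrative, not the study's measurements.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Factors: temperature (deg C) and treatment time (min)
X = rng.uniform([20, 5], [30, 20], size=(30, 2))
# Toy response with a maximum near 25 deg C and 12 min, plus noise
y = 1.7 - 0.01 * (X[:, 0] - 25) ** 2 - 0.005 * (X[:, 1] - 12) ** 2 + 0.02 * rng.standard_normal(30)

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Evaluate the fitted surface on a grid and report the predicted optimum
grid = np.array([[t, m] for t in np.linspace(20, 30, 51) for m in np.linspace(5, 20, 61)])
pred = model.predict(poly.transform(grid))
print("predicted optimum (temperature, time):", grid[np.argmax(pred)])
```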

Keywords: β-galactosidase, optimization, permeabilization, response surface methodology, yeast

Procedia PDF Downloads 229
1524 Benefits of PRP in Third Molar Surgery - A Review of the Literature

Authors: Nitesh Kumar, Adel Elrasheed, Antonio Gagliardilugo

Abstract:

Introduction and aims: PRP has been increasing in popularity over the past decade. It is used in many facets of medicine and dentistry, such as osteoarthritis, hair loss, skin rejuvenation, and the healing of tendons after injury. Given the increasing popularity of PRP in third molar surgery in dentistry, this study aims to identify the role and function of platelet-rich plasma in third molar surgery. Methodology: Three databases were chosen to source the articles for review: PubMed, ScienceDirect, and Cochrane. The keywords “platelet rich plasma”, “third molar extraction”, “wisdom tooth extraction”, and “literature review” were used to search for relevant articles. Articles that were not in English were omitted, and only systematic reviews relevant to the study were collected. All abstracts of systematic reviews pertinent to the study were read by two reviewers to avoid bias. Results/statistics: 20 review articles were obtained, of which 13 fulfilled the criteria. The AMSTAR tool was used to assess the strength of these review articles. There is strong evidence in the literature that PRP in third molar surgery decreases post-operative pain, swelling, and recovery time. Conclusions/clinical relevance: Platelet-rich plasma plays a crucial role in patient recovery following the extraction of third molars and should be considered and offered as a routine part of third molar therapy.

Keywords: PRP, third molar, extractions, wisdom teeth

Procedia PDF Downloads 36
1523 A Facile Nanocomposite of Graphene Oxide Reinforced Chitosan/Poly-Nitroaniline Polymer as a Highly Efficient Adsorbent for Extracting Polycyclic Aromatic Hydrocarbons from Tea Samples

Authors: Adel M. Al-Shutairi, Ahmed H. Al-Zahrani

Abstract:

Tea is a popular beverage drunk by millions of people throughout the globe. Tea has considerable health advantages, including antioxidant, antibacterial, antiviral, chemopreventive, and anticarcinogenic properties. As a result of environmental pollution (atmospheric deposition) and the production process, tea leaves may also contain a variety of dangerous substances, such as polycyclic aromatic hydrocarbons (PAHs). In this study, a graphene oxide reinforced chitosan/poly-nitroaniline polymer was prepared to develop a sensitive and reliable solid-phase extraction (SPE) method for the extraction of PAH7 from tea samples, followed by high-performance liquid chromatography with fluorescence detection. The prepared adsorbent was validated in terms of linearity, limit of detection, limit of quantification, recovery (%), accuracy (%), and precision (%) for the determination of the PAH7 (benzo[a]pyrene, benzo[a]anthracene, benzo[b]fluoranthene, chrysene, benzo[b]fluoranthene, dibenzo[a,h]anthracene, and benzo[g,h,i]perylene) in tea samples. The concentrations were determined in two types of tea commercially available in Saudi Arabia, black tea and green tea. The maximum mean Σ7PAHs was 68.23 ± 0.02 µg kg⁻¹ in black tea samples and 26.68 ± 0.01 µg kg⁻¹ in green tea samples. The minimum mean Σ7PAHs was 37.93 ± 0.01 µg kg⁻¹ in black tea samples and 15.26 ± 0.01 µg kg⁻¹ in green tea samples. The mean value of benzo[a]pyrene in black tea samples ranged from 6.85 to 12.17 µg kg⁻¹, where two samples exceeded the standard level (10 µg kg⁻¹) established by the European Union (EU), while in green tea it ranged from 1.78 to 2.81 µg kg⁻¹. Lower levels of Σ7PAHs were detected in green tea samples compared with black tea samples.

Keywords: polycyclic aromatic hydrocarbons, CS, PNA and GO, black/green tea, solid phase extraction, Saudi Arabia

Procedia PDF Downloads 69
1522 Chemical Study of Volatile Organic Compounds (VOCs) from Xylopia aromatica (Lam.) Mart. (Annonaceae)

Authors: Vanessa G. P. Severino, João Gabriel M. Junqueira, Michelle N. G. do Nascimento, Francisco W. B. Aquino, João B. Fernandes, Ana P. Terezan

Abstract:

The analysis of VOCs represents a significant modern research field because of their importance in most branches of present-day life and industry. It is therefore extremely important to investigate, identify, and isolate volatile substances, since they can be used in different areas, such as food, medicine, cosmetics, perfumery, aromatherapy, pesticides, repellents, and other household products, through methods for extracting volatile constituents, such as solid phase microextraction (SPME), hydrodistillation (HD), solvent extraction (SE), Soxhlet extraction, supercritical fluid extraction (SFE), steam distillation (SD), and vacuum distillation (VD). Chemometrics is an area of chemistry that uses statistical and mathematical tools for the planning and optimization of experimental conditions and for extracting relevant chemical information from multivariate chemical data. In this context, the focus of this work was the study of the VOCs of the species X. aromatica by SPME, in search of constituents that can be used in the industrial sector, as well as in food, cosmetics, and perfumery, since these industrial areas play a considerable role. In addition, chemometric analysis was used to maximize the information obtained from this research, in order to detect the largest number of compounds. The investigation of flowers from X. aromatica in vitro and in vivo proved consistent, although certain factors are supposed to influence the composition of metabolites, and the chemometric analysis strengthened the findings. Thus, the study of the chemical composition of X. aromatica contributed to knowledge of the VOCs of the species and to a possible application.

Keywords: chemometrics, flowers, HS-SPME, Xylopia aromatica

Procedia PDF Downloads 333
1521 Evaluation of Arsenic Removal from Contaminated Soils by the Phytoremediation Technique

Authors: V. Ibujes, A. Guevara, P. Barreto

Abstract:

The concentration of arsenic represents a serious threat to human health. It is a bioaccumulative toxic element and is transferred through the food chain. In Ecuador, values of 0.0423 mg/kg As have been registered in potatoes from the slopes of the Tungurahua volcano. The increase of arsenic contamination in Ecuador is mainly due to mining activity, since the process of gold extraction generates toxic tailings with mercury. In the province of Azuay, due to mining activity, the soil reaches concentrations of 2,500 to 6,420 mg/kg As, whereas in the province of Tungurahua arsenic concentrations of 6.9 to 198.7 mg/kg are found due to volcanic eruptions. Given this arsenic contamination, the present investigation addresses the remediation of soils in the provinces of Azuay and Tungurahua by the phytoremediation technique and the definition of an extraction methodology based on the analysis of arsenic in the soil-plant system. The methodology consists of selecting two types of plants that have the best arsenic removal capacity in synthetic 60 μM As solutions, a low mortality percentage, and resistance to hydroponic conditions. The arsenic concentrations in each plant were obtained by taking 10 mL aliquots and subsequently analyzing them with ICP-OES (inductively coupled plasma optical emission spectrometry) equipment. Soils were contaminated with synthetic arsenic solutions by the capillarity method to achieve arsenic concentrations of 13 and 15 mg/kg. Subsequently, the two types of plants were evaluated for reducing the arsenic concentration in the soils over 7 weeks. The global variance for the soil types was obtained with the InfoStat program. To measure the changes in arsenic concentration in the soil-plant system, the Rhizo and Wenzel arsenic extraction methodologies were used, with subsequent analysis by ICP-OES (Optima 8000, PerkinElmer). As a result, the selected plants were bluegrass and llanten, due to their high arsenic removal percentages of 55% and 67% and low mortality rates of 9% and 8%, respectively. In conclusion, the Azuay soil with an initial concentration of 13 mg/kg As reached concentrations of 11.49 and 11.04 mg/kg As for bluegrass and llanten, respectively, and for the initial concentration of 15 mg/kg As it reached 11.79 and 11.10 mg/kg As for bluegrass and llanten after 7 weeks. For the Tungurahua soil with an initial concentration of 13 mg/kg As, concentrations of 11.56 and 12.16 mg/kg As were reached for bluegrass and llanten, respectively, and for the initial concentration of 15 mg/kg As, 11.97 and 12.27 mg/kg As for bluegrass and llanten after 7 weeks. The best arsenic extraction methodology for the soil-plant system was the Wenzel method.

Keywords: blue grass, llanten, phytoremediation, soil of Azuay, soil of Tungurahua, synthetic arsenic solution

Procedia PDF Downloads 82
1520 Optimal and Best Timing for Capturing Satellite Thermal Images of Concrete Object

Authors: Toufic Abd El-Latif Sadek

Abstract:

The concrete object represents concrete areas, such as buildings. The best, easiest, and most efficient extraction of the concrete object from satellite thermal images occurs at specific times during the days of the year, by avoiding the times that give close or identical brightness values for different objects. The aim of the study is thus to obtain the best original data, and hence better extraction of the concrete object and better analysis. The study was done using seven sample objects, asphalt, concrete, metal, rock, dry soil, vegetation, and water, located at one place and carefully arranged so that all the objects are acquired homogeneously at the same time and under the same weather conditions. The samples of the objects were placed on the roof of a building at a position determined by global positioning system (GPS), with geographical coordinates: latitude 33 degrees 37 minutes, longitude 35 degrees 28 minutes, height 600 m. It was found that the first choice and the best time in February is at 2:00 pm, in March at 4:00 pm, in April and May at 12:00 pm, in August at 5:00 pm, and in October at 11:00 am. The best time in June and November is at 2:00 pm.

Keywords: best timing, concrete areas, optimal, satellite thermal images

Procedia PDF Downloads 330
1519 Influence of the Low Frequency Ultrasound on the Cadmium (II) Biosorption by an Ecofriendly Biocomposite (Extraction Solid Waste of Ammi visnaga / Calcium Alginate): Kinetic Modeling

Authors: L. Nouri Taiba, Y. Bouhamidi, F. Kaouah, Z. Bendjama, M. Trari

Abstract:

In the present study, an ecofriendly biocomposite, namely calcium alginate immobilized Ammi visnaga (Khella) extraction waste (SWAV/CA), was prepared by the electrostatic extrusion method and used for cadmium biosorption from the aqueous phase, with and without the assistance of ultrasound, in batch conditions. The influence of low frequency ultrasound (37 and 80 kHz) on the cadmium biosorption kinetics was studied. The obtained results show that ultrasonic irradiation significantly enhances and improves the efficiency of cadmium removal. The pseudo-first-order, pseudo-second-order, intraparticle diffusion, and Elovich models were evaluated using the non-linear curve fitting analysis method. Modeling of the kinetic results shows that the biosorption process is best described by the pseudo-second-order and Elovich models, in both the absence and presence of ultrasound.
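As a minimal sketch of the non-linear curve-fitting step, the snippet below fits the pseudo-second-order model q(t) = k·qe²·t / (1 + k·qe·t) to an uptake curve with scipy; the data points and initial guesses are synthetic placeholders, not the study's measurements.

```python
# Hedged sketch of the non-linear curve fitting step: fitting the
# pseudo-second-order kinetic model to synthetic uptake data.
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k):
    return k * qe**2 * t / (1 + k * qe * t)

t = np.array([5, 10, 20, 30, 60, 90, 120], dtype=float)   # contact time (min), synthetic
q = np.array([4.1, 6.5, 9.0, 10.4, 12.1, 12.7, 13.0])      # uptake (mg/g), synthetic

(qe_fit, k_fit), _ = curve_fit(pseudo_second_order, t, q, p0=[q.max(), 0.01])
print(f"qe ~ {qe_fit:.2f} mg/g, k ~ {k_fit:.4f} g/(mg min)")
```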

Keywords: biocomposite, biosorption, cadmium, non-linear analysis, ultrasound

Procedia PDF Downloads 254
1518 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids

Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo

Abstract:

Ionic liquids offer excellent advantages over conventional solvents for the industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation to support the development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable, and less toxic than fluorinated hydrophobic ionic liquids. The process modelling work concerns the selection and/or development of suitable models for the physical properties, the distribution coefficients, the mass transfer phenomena, the extractor unit, and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and by COSMO-RS calculations. The parameters of the distribution coefficient models are estimated by fitting the model to published experimental extraction equilibrium results. The mass transfer model applies Newman’s hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the surface-to-mean diameter of the liquid droplets of the dispersed phase, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases. Empirical models for predicting the density and viscosity of the solutions under different metal loadings are also fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation is set up to replicate a published experiment and compare model predictions with the experimental results. This simulation model is implemented in gPROMS software for dynamic process simulation. The results of single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results. The estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published data for the diffusion coefficients of various metals in this ionic liquid. A sensitivity study with this simulation model demonstrates the usefulness of the models for process design. The simulation approach has the potential to be extended to account for other metals, acids, and solvents for the development, design, and optimisation of extraction processes applying ionic liquids to metal separations, although a lack of experimental data currently limits the accuracy of the models within the whole framework. Future work will focus on process development more generally and on the extractive separation of rare earths using ionic liquids.
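A minimal sketch of the extractor model described above — a single stage treated as a continuous stirred tank with interphase mass transfer toward a linear distribution equilibrium — is given below; the distribution coefficient, mass-transfer coefficient, residence time, and equal-phase-volume assumption are illustrative, not the fitted gPROMS model.

```python
# Hedged sketch of a single extraction stage modelled as a continuous stirred
# tank with interphase mass transfer. All parameter values and the linear
# distribution assumption are illustrative, not the authors' fitted model.
import numpy as np
from scipy.integrate import solve_ivp

D = 10.0        # distribution coefficient C_org*/C_aq at equilibrium (assumed)
kLa = 0.05      # overall mass-transfer coefficient times specific area, 1/s (assumed)
tau = 300.0     # residence time of both phases, s (assumed)
c_in = 1.0      # inlet aqueous metal concentration, mol/m^3 (assumed)

def stage(t, y):
    c_aq, c_org = y
    transfer = kLa * (c_aq - c_org / D)            # driving force toward equilibrium
    return [(c_in - c_aq) / tau - transfer,        # aqueous phase balance
            (0.0 - c_org) / tau + transfer]        # organic phase enters fresh

sol = solve_ivp(stage, (0, 3000), [0.0, 0.0])
c_aq_ss, c_org_ss = sol.y[:, -1]
print(f"steady state: aqueous {c_aq_ss:.3f}, organic {c_org_ss:.3f} mol/m^3")
```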

Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium

Procedia PDF Downloads 160
1517 Volatile Profile of Monofloral Honeys Produced by Stingless Bees from the Brazilian Semiarid Region

Authors: Ana Caroliny Vieira da Costa, Marta Suely Madruga

Abstract:

In Brazil, there is a diverse fauna of social bees, known as Meliponinae or native stingless bees. These bees are important because they provide a differentiated product, especially regarding its unique sweetness, flavor, and aroma. However, information about the volatile fraction of honey produced by native stingless bees is still lacking. The aim of this work was to characterize the volatile compound profile of monofloral honeys produced by jandaíra bees (Melipona subnitida Ducke), which used chanana (Turnera ulmifolia L.), malícia (Mimosa quadrivalvis), and algaroba (Prosopis juliflora (Sw.) DC) as their floral sources, and by uruçu bees (Melipona scutellaris Latrelle), which used chanana (Turnera ulmifolia L.), malícia (Mimosa quadrivalvis), and angico (Anadenanthera colubrina) as their floral sources. The volatiles were extracted using the HS-SPME-GC-MS technique. The conditions for the extraction were: an equilibration time of 15 minutes, an extraction time of 45 min, and an extraction temperature of 45°C. The results showed that the floral source had a strong influence on the aroma profile of the honeys under evaluation, since the chemical profiles were marked primarily by the classes of terpenes, norisoprenoids, and benzene derivatives. Furthermore, the results suggest the existence of differentiating compounds and potential markers for the botanical sources evaluated, such as linalool, D-sylvestrene, rose oxide, and benzenethanol. These results represent a valuable contribution to certifying the authenticity of these honeys and provide, for the first time, information intended for the construction of chemical knowledge of the aroma and flavor that characterize these honeys produced in Brazil.

Keywords: aroma, honey, semiarid, stingless, volatiles

Procedia PDF Downloads 228
1516 Modeling of a Pilot Installation for the Recovery of Residual Sludge from Olive Oil Extraction

Authors: Riad Benelmir, Muhammad Shoaib Ahmed Khan

Abstract:

The socio-economic importance of olive oil production is significant in the Mediterranean region, both in terms of wealth and tradition. However, the extraction of olive oil generates huge quantities of waste that may have a great impact on the land and water environment because of its high phytotoxicity. In particular, olive mill wastewater (OMWW) is one of the major environmental pollutants of the olive oil industry. This work aims to design smart and sustainable integrated thermochemical catalytic processes for residues from olive mills, based on hydrothermal carbonization (HTC) of olive mill wastewater (OMWW) and fast pyrolysis of olive mill wastewater sludge (OMWS). The byproducts resulting from the OMWW-HTC treatment are a solid phase enriched in carbon, called biochar, and a liquid phase (residual water with fewer dissolved organic and phenolic compounds). The HTC biochar will be tested as a fuel in combustion systems and will also be utilized in high-value applications, such as a soil bio-fertilizer and as a catalyst and/or catalyst support. The HTC residual water is characterized, treated, and used for soil irrigation, since the organic and toxic compounds will be reduced below the permitted limits. The project's concept also includes the conversion of OMWS to a green diesel through a catalytic pyrolysis process. The green diesel is then used as a biofuel in an internal combustion engine (IC engine) for automotive applications in clean transportation. In this work, a theoretical study is considered for the use of heat from the pyrolysis non-condensable gases in a sorption refrigeration machine for cooling the pyrolysis gases and condensing the bio-oil vapors.

Keywords: biomass, olive oil extraction, adsorption cooling, pyrolysis

Procedia PDF Downloads 57
1515 Comparative Life Cycle Assessment of High Barrier Polymer Packaging for Selecting Resource Efficient and Environmentally Low-Impact Materials

Authors: D. Kliaugaitė, J. K. Staniškis

Abstract:

In this study, three types of multilayer gas barrier plastic packaging films were compared using life cycle assessment as a tool for selecting resource-efficient and environmentally low-impact materials. The first type of multilayer packaging film (PET-AlOx/LDPE) consists of polyethylene terephthalate with an AlOx barrier layer (PET-AlOx) and low density polyethylene (LDPE). The second type of polymer film (PET/PE-EVOH-PE) is made of polyethylene terephthalate (PET) and the co-extruded film PE-EVOH-PE as the barrier layer. The third type of multilayer packaging film (PET-PVOH/LDPE) is formed from polyethylene terephthalate with a PVOH barrier layer (PET-PVOH) and low density polyethylene (LDPE). All of the analyzed packaging types have a significant impact on resource depletion because of raw material extraction, energy use, and the production of different kinds of plastics. Nevertheless, the impact generated during the life cycle of the functional unit of the type II packaging (PET/PE-EVOH-PE) was about 25% lower than the impacts generated by the type I (PET-AlOx/LDPE) and type III (PET-PVOH/LDPE) packaging. The results revealed that the contribution of the gas barrier type to the overall environmental impact of the packaging is not significant. The impacts are mostly generated by the energy and materials used during raw material extraction and the production of the plastic polymer materials, such as PE, LDPE, and PET, and not by the gas barrier materials, such as AlOx, PVOH, and EVOH. The LCA results could be useful in different decision-making processes for selecting resource-efficient and environmentally low-impact materials.

Keywords: life cycle assessment, polymer packaging, resource efficiency, materials extraction, polyethylene terephthalate

Procedia PDF Downloads 337
1514 Measuring Text-Based Semantic Relatedness Using WordNet

Authors: Madiha Khan, Sidrah Ramzan, Seemab Khan, Shahzad Hassan, Kamran Saeed

Abstract:

Measuring semantic similarity between texts means calculating the semantic relatedness between texts using various techniques. Our web application (Measuring Relatedness of Concepts, MRC) allows the user to input two text corpora and obtain a semantic similarity percentage between them using WordNet. Our application goes through five stages for the computation of semantic relatedness: Preprocessing (extracts keywords from the content), Feature Extraction (classification of words into parts of speech), Synonyms Extraction (retrieves synonyms for each keyword), Measuring Similarity (similarity is measured using the keywords and synonyms), and Visualization (graphical representation of the similarity measure). Hence the user can also measure similarity on the basis of features. The end result is a percentage score and the word(s) that form the basis of the similarity between the two texts, using different tools on the same platform. In future work, we aim for a 'Web as a live corpus' application that provides a simpler and more user-friendly tool to compare documents and extract useful information.
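A minimal sketch of WordNet-based relatedness in the spirit of the pipeline above, assuming NLTK's WordNet interface: keyword extraction is reduced to simple tokenisation, and the aggregation of word-level Wu-Palmer similarities into a text-level percentage is an illustrative choice, not the MRC application's exact scheme.

```python
# Hedged sketch of WordNet-based relatedness between two short texts.
# Keyword extraction is reduced to tokenisation; the aggregation scheme
# is an illustrative assumption.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)

def best_similarity(word_a: str, word_b: str) -> float:
    """Maximum Wu-Palmer similarity over all synset pairs of two words."""
    scores = [s1.wup_similarity(s2) or 0.0
              for s1 in wn.synsets(word_a) for s2 in wn.synsets(word_b)]
    return max(scores, default=0.0)

def text_relatedness(text_a: str, text_b: str) -> float:
    """Average, over words of text_a, of the best match found in text_b."""
    words_a, words_b = text_a.lower().split(), text_b.lower().split()
    if not words_a or not words_b:
        return 0.0
    return sum(max(best_similarity(a, b) for b in words_b) for a in words_a) / len(words_a)

print(f"{text_relatedness('car engine repair', 'automobile motor maintenance'):.2%}")
```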

Keywords: Graphviz representation, semantic relatedness, similarity measurement, WordNet similarity

Procedia PDF Downloads 211
1513 Validation Parameters of a Method for Determining Polycyclic Aromatic Hydrocarbons in Drinking Water by High Performance Liquid Chromatography

Authors: Jonida Canaj

Abstract:

A simple method of extraction and determination of fifteen priority polycyclic aromatic hydrocarbons (PAHs) in drinking water using high performance liquid chromatography (HPLC) has been validated with respect to limits of detection (LOD), limits of quantification (LOQ), method recovery and reproducibility, and other factors. HPLC parameters, such as mobile phase composition and flow rate, were standardized for the determination of PAHs using a fluorescence detector (FLD). PAH extraction was carried out by liquid-liquid extraction using dichloromethane. The linearity of the calibration curves was good for all PAHs (R², 0.9954-1.0000) in the concentration range 0.1-100 ppb. Analysis of standard spiked water samples resulted in recoveries between 78.5-150% (0.1 ppb) and 93.04-137.47% (10 ppb). The estimated LOD and LOQ ranged between 0.0018-0.98 ppb. The method described has been used for the determination of the fifteen PAHs in drinking water samples.
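A minimal sketch of how LOD and LOQ can be estimated from a calibration line, using the common relations LOD = 3.3·s/slope and LOQ = 10·s/slope with s the residual standard deviation of the regression; the calibration points below are synthetic, not the validated method's data.

```python
# Hedged sketch: LOD/LOQ from a calibration line via the common relations
# LOD = 3.3*s/slope and LOQ = 10*s/slope. Calibration points are synthetic.
import numpy as np

conc = np.array([0.1, 0.5, 1, 5, 10, 50, 100])                     # ppb standards
area = np.array([1.9, 10.3, 20.8, 101.5, 203.0, 1011.0, 2019.0])   # detector response (synthetic)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
s = residuals.std(ddof=2)                                          # residual standard deviation

lod, loq = 3.3 * s / slope, 10 * s / slope
print(f"R^2 = {np.corrcoef(conc, area)[0, 1] ** 2:.4f}, LOD ~ {lod:.3f} ppb, LOQ ~ {loq:.3f} ppb")
```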

Keywords: high performance liquid chromatography, HPLC, method validation, polycyclic aromatic hydrocarbons, PAHs, water

Procedia PDF Downloads 80
1512 A Combined Feature Extraction and Thresholding Technique for Silence Removal in Percussive Sounds

Authors: B. Kishore Kumar, Pogula Rakesh, T. Kishore Kumar

Abstract:

Music analysis is a part of audio content analysis in which music is analyzed using different features of the audio signal. In music analysis, the first step is to divide the music signal into different sections based on the feature profiles of the signal. In this paper, we present a music segmentation technique that effectively segments the signal, together with a thresholding technique to remove silence from the percussive sounds produced by percussive instruments, using two features of the music signal, namely signal energy and spectral centroid. The proposed method imposes thresholds on both features; these thresholds vary depending on the music signal. Based on the thresholds, the silent parts are removed and the segmentation is performed. The effectiveness of the proposed method is analyzed using MATLAB.
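A minimal sketch of the two features named above — frame-wise signal energy and spectral centroid — with simple thresholds for silence removal is given below (in Python rather than MATLAB); the frame length, thresholds, and synthetic signal are illustrative assumptions, not the paper's adaptive thresholds.

```python
# Hedged sketch of frame-wise signal energy and spectral centroid with simple
# fixed thresholds for silence removal. Frame length, thresholds, and the
# synthetic signal are illustrative assumptions.
import numpy as np

fs, frame = 16000, 512
t = np.arange(fs) / fs
signal = np.concatenate([np.zeros(fs // 2),                         # silence
                         np.sin(2 * np.pi * 200 * t[: fs // 2])])   # percussive-like tone

frames = signal[: len(signal) // frame * frame].reshape(-1, frame)
energy = (frames ** 2).mean(axis=1)

spectra = np.abs(np.fft.rfft(frames, axis=1))
freqs = np.fft.rfftfreq(frame, 1 / fs)
centroid = (spectra * freqs).sum(axis=1) / np.maximum(spectra.sum(axis=1), 1e-12)

keep = (energy > 0.1 * energy.max()) & (centroid > 0.05 * centroid.max())
voiced = frames[keep].ravel()                   # frames retained after silence removal
print(f"kept {keep.sum()} of {len(keep)} frames, {voiced.size} samples retained")
```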

Keywords: percussive sounds, spectral centroid, spectral energy, silence removal, feature extraction

Procedia PDF Downloads 564
1511 Phylogenetic Differential Separation of Environmental Samples

Authors: Amber C. W. Vandepoele, Michael A. Marciano

Abstract:

Biological analyses frequently focus on single organisms; however, the biological sample often consists of more than the target organism. For example, human microbiome research targets bacterial DNA, yet most samples consist largely of human DNA. Therefore, there would be an advantage to removing these contaminating organisms. Conversely, some analyses focus on a single organism but would greatly benefit from additional information regarding the other organismal components of the sample. Forensic analysis is one such example: in most forensic casework, human DNA is targeted; however, it typically exists in complex, non-pristine sample substrates such as soil or unclean surfaces. These complex samples commonly comprise not just human tissue but also microbial and plant life, and these organisms may help gain more forensically relevant information about a specific location or interaction. This project aims to optimize a 'phylogenetic' differential extraction method that separates mammalian, bacterial, and plant cells in a mixed sample. This is accomplished through the use of size exclusion separation, whereby the different cell types are separated through multiple filtrations using 5 μm filters. The components are then lysed via differential enzymatic sensitivities among the cells and extracted with minimal contribution from the preceding component. This extraction method will then allow complex DNA samples to be more easily interpreted through non-targeted sequencing, since the data will not be skewed toward the smaller and usually more numerous bacterial DNAs. This research project has demonstrated that the 'phylogenetic' differential extraction method successfully separates epithelial and bacterial cells from each other with minimal cell loss. We will take this one step further, showing that when plant cells are added to the mixture, they can be separated and extracted from the sample. Research is ongoing, and results are pending.

Keywords: DNA isolation, geolocation, non-human, phylogenetic separation

Procedia PDF Downloads 89
1510 Speciation Analysis by Solid-Phase Microextraction and Application to Atrazine

Authors: K. Benhabib, X. Pierens, V-D Nguyen, G. Mimanne

Abstract:

The main hypothesis of the dynamics of solid phase microextraction (SPME) is that steady-state mass transfer is respected throughout the SPME extraction process. It assumes that steady-state diffusion is established in the two phases and that exchange of the analyte at the solid phase film/water interface is fast. An improved model is proposed in this paper to handle the situation in which the analyte (atrazine) is in contact with colloid suspensions (carboxylate latex in aqueous solution). A mathematical solution is obtained by substituting the diffusion coefficient with the mean diffusion coefficient of the analyte and the carboxylate latex, and the layer thickness with the mean thickness in aqueous solution. This solution provides an equation relating the extracted amount of the analyte to the extraction time that is a little more complicated than in previous models. It also gives a better description of the experimental observations. Moreover, the rate constant of the analyte obtained is in satisfactory agreement with that obtained from the initial curve fitting.

Keywords: pesticide, solid-phase microextraction (SPME) methods, steady state, analytical model

Procedia PDF Downloads 464
1509 Recovery of Au and Other Metals from Old Electronic Components by Leaching and Liquid Extraction Process

Authors: Tomasz Smolinski, Irena Herdzik-Koniecko, Marta Pyszynska, M. Rogowski

Abstract:

Old electronic components can easily be found nowadays. Significant quantities of valuable metals such as gold, silver, or copper are used for the production of advanced electronic devices. Old, useless electronic devices have slowly become a new source of precious metals, very often more efficient than natural ores. For example, it is possible to recover more gold from one ton of personal computers than from seventeen tons of gold ore. This makes the urban mining industry very profitable and necessary for sustainable development. For the recovery of metals from waste electronic equipment, various treatment options based on conventional physical, hydrometallurgical, and pyrometallurgical processes are available. In this group, hydrometallurgical processes, with their relatively low capital cost, low environmental impact, potential for high metal recoveries, and suitability for small-scale applications, are very promising options. The Institute of Nuclear Chemistry and Technology has great experience with hydrometallurgical processes, especially focused on recovering metals from industrial and agricultural wastes. At the moment, an urban mining project is being carried out. A method for the effective recovery of valuable metals from central processing unit (CPU) components has been developed. The principal processes, such as acidic leaching and solvent extraction, were used for precious metal recovery from old processors and graphics cards. Electronic components were treated with acidic solutions under various conditions. The optimal acid concentration, process time, and temperature were selected. Precious metals were extracted into the aqueous phase. In the next step, the metals were selectively extracted by organic solvents such as oximes or tributyl phosphate (TBP). Multistage mixer-settler equipment was used, and the process was optimized.

Keywords: electronic waste, leaching, hydrometallurgy, metal recovery, solvent extraction

Procedia PDF Downloads 116
1508 Organic Matter Distribution in Bazhenov Source Rock: Insights from Sequential Extraction and Molecular Geochemistry

Authors: Margarita S. Tikhonova, Alireza Baniasad, Anton G. Kalmykov, Georgy A. Kalmykov, Ralf Littke

Abstract:

There is high complexity in the pore structure of organic-rich rocks, caused by the combination of inter-particle porosity from the inorganic mineral matter and ultrafine intra-particle porosity from both organic matter and clay minerals. Fluids are retained in that pore space, but there are major uncertainties in how and where the fluids are stored and to what extent they are accessible or trapped in 'closed' pores. A large degree of tortuosity may lead to fractionation of the organic matter, so that the lighter and more flexible compounds diffuse toward the reservoir whereas more complex compounds may be locked in place. Additionally, part of the hydrocarbons could be bound to the solid organic matter (kerogen) and the mineral matrix during expulsion and migration. Larger compounds can occupy thin channels, so that clogging or oil and gas entrapment will occur. Sequential extraction applying different solvents is a powerful tool to provide more information about the distribution of trapped organic matter. The Upper Jurassic – Lower Cretaceous Bazhenov shale is one of the most petroliferous source rocks, extending across West Siberia, Russia. Given its variable mineral composition, pore space distribution, and thermal maturation, there are high uncertainties in the distribution and composition of organic matter in this formation. In order to address this issue, the geological and geochemical properties of 30 samples were considered, including mineral composition (XRD and XRF), structure and texture (thin-section microscopy), organic matter content, type, and thermal maturity (Rock-Eval), as well as the molecular composition (GC-FID and GC-MS) of the different materials extracted during sequential extraction. Sequential extraction was performed with a Soxhlet apparatus using different solvents, i.e., n-hexane, chloroform, and ethanol-benzene (1:1 v:v), first on core plugs and later on pulverized materials. The results indicate that the studied samples are mainly composed of type II kerogen with TOC contents varying from 5 to 25%. The thermal maturity ranged from immature to the late oil window. Whereas the clay content decreased with increasing maturity, the amount of silica increased in the studied samples. According to the molecular geochemistry, hydrocarbons stored in open and closed pore space reveal different geochemical fingerprints. The results improve our understanding of hydrocarbon expulsion and migration in the organic-rich Bazhenov shale and therefore allow a better estimation of the hydrocarbon potential of this formation.

Keywords: Bazhenov formation, bitumen, molecular geochemistry, sequential extraction

Procedia PDF Downloads 147
1507 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to incapacitate a victim. Traditionally, biological samples have been gathered from victims and analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. For this reason, the present article describes a rapid, sustainable, highly efficient, and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam, chlordiazepoxide, and ketamine, in alcoholic beverages and complex food samples (cream of biscuit, flavored milk, juice, cake, tea, sweets, and chocolate). The methodology involves utilizing fabric phase sorptive extraction (FPSE) to extract diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET). Subsequently, the extracted samples are analyzed using gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction efficiency for the target analytes among all evaluated membranes. Under optimal conditions, the method displayed linearity within the range of 0.3–10 µg mL⁻¹ (or µg g⁻¹), exhibiting coefficients of determination (R²) ranging from 0.996–0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples range between 0.020-0.069 µg mL⁻¹ and 0.066-0.22 µg mL⁻¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056-0.090 µg g⁻¹, while the LOQs ranged from 0.18-0.29 µg g⁻¹. Notably, the method showed good precision, with repeatability and reproducibility below 5% and 10%, respectively. Furthermore, the FPSE-GC-MS method proved effective in determining diazepam (DZ) in forensic food samples connected to drug-facilitated crimes (DFCs). Additionally, the proposed method was evaluated for its whiteness using the RGB12 algorithm.

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 43
1506 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, people find it difficult to choose data access and extraction tools when buying and selling their products. In addition, they worry about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed Multidirectional Rank Prediction (MDRP) decision-making algorithm is designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
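As a minimal sketch of the PCC-based collaborative-filtering baseline the comparison refers to, the snippet below predicts a missing rating from Pearson-correlation-weighted neighbours on a toy user-item matrix; the matrix, the mean-centred weighting scheme, and the helper names are illustrative assumptions, not the MDRP algorithm itself.

```python
# Hedged sketch of a Pearson-correlation (PCC) based collaborative-filtering
# prediction, the family of baseline methods the abstract compares against.
# The small user-item rating matrix is illustrative, not the textile dataset.
import numpy as np

ratings = np.array([[5, 3, 0, 1],      # 0 marks "not rated"
                    [4, 0, 0, 1],
                    [1, 1, 0, 5],
                    [1, 0, 4, 4]], dtype=float)

def pcc(u, v):
    mask = (u > 0) & (v > 0)                       # co-rated items only
    if mask.sum() < 2:
        return 0.0
    return float(np.corrcoef(u[mask], v[mask])[0, 1])

def user_mean(u):
    vals = u[u > 0]
    return vals.mean() if len(vals) else 0.0

def predict(user: int, item: int) -> float:
    """Mean-centred weighted average of neighbours' ratings (PCC weights)."""
    sims = np.array([pcc(ratings[user], ratings[v]) for v in range(len(ratings))])
    num, den = 0.0, 0.0
    for v in range(len(ratings)):
        if v != user and ratings[v, item] > 0:
            num += sims[v] * (ratings[v, item] - user_mean(ratings[v]))
            den += abs(sims[v])
    return user_mean(ratings[user]) + (num / den if den else 0.0)

print(f"predicted rating of user 1 for item 1: {predict(1, 1):.2f}")
```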

Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson's Correlation Coefficient (PCC), Vector Space Similarity (VSS)

Procedia PDF Downloads 258
1505 Structuring of Multilayer Aluminum Nickel by Lift-off Process Using Cheap Negative Resist

Authors: Muhammad Talal Asghar

Abstract:

The lift-off technique of the photoresist for metal patterning in integrated circuit (IC) packaging has been widely utilized in the fields of microelectromechanical systems and semiconductor component manufacturing. The main advantages lie in cost savings, reduced complexity, and the maturity of the process. The selection of the photoresist depends upon many factors, such as cost, the thickness of the resist, and convenient extraction of useful process parameters. In the present study, an extremely cheap dry film photoresist, E8015, of 38-micrometer thickness is processed for edge profiling for the first time, to the author's best knowledge. A useful parameter range for resist processing is successfully extracted. An undercut angle of 66 to 73 degrees is realized by varying parameters such as exposure energy and development time. Finally, a 10-micrometer-thick aluminum-nickel metallic multilayer is lifted off on a plain silicon wafer. Possible applications lie in controlled self-propagating reactions within structured metallic multilayers that may be utilized for IC packaging in the future.

Keywords: lift-off, IC packaging, photoresist, multilayer

Procedia PDF Downloads 189
1504 Capturing the Stress States in Video Conferences by Photoplethysmographic Pulse Detection

Authors: Jarek Krajewski, David Daxberger

Abstract:

We propose a stress detection method based on an RGB camera using heart rate detection, also known as Photoplethysmography Imaging (PPGI). This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. A stationary lab setting with simulated video conferences is chosen, using constant light conditions and a sampling rate of 30 fps. The ground-truth measurement of heart rate is conducted with a common PPG system. The proposed approach for pulse peak detection is machine learning-based, applying brute-force feature extraction for the prediction of heart rate pulses. The statistical analysis showed good agreement (correlation r = .79, p < 0.05) between the reference heart rate system and the proposed method. Based on these findings, the proposed method could provide a reliable, low-cost, and contactless way of measuring HR parameters in daily-life environments.
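A minimal sketch of the basic PPGI signal path — spatially averaging the green channel of a skin region over frames, band-pass filtering around plausible heart rates, and counting pulse peaks — is given below; the synthetic frame stream, filter settings, and peak-detection parameters are illustrative assumptions, not the brute-force feature extraction used in the study.

```python
# Hedged sketch of the basic PPGI signal path: green-channel averaging over a
# skin region, band-pass filtering, and pulse peak counting. The synthetic
# frame stream and all filter settings are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
# Synthetic "video": skin ROI whose green channel pulses at ~72 bpm (1.2 Hz)
frames = 120 + 2 * np.sin(2 * np.pi * 1.2 * t)[:, None, None] + np.random.randn(len(t), 32, 32)

green_trace = frames.mean(axis=(1, 2))                 # spatially averaged green channel

b, a = butter(3, [0.7 / (fps / 2), 3.0 / (fps / 2)], btype="band")   # 42-180 bpm band
pulse = filtfilt(b, a, green_trace)

peaks, _ = find_peaks(pulse, distance=fps * 0.4)       # at most ~150 bpm
print(f"estimated heart rate: {len(peaks) / seconds * 60:.0f} bpm")
```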

Keywords: heart rate, PPGI, machine learning, brute force feature extraction

Procedia PDF Downloads 106