Search results for: comprehensive feature extraction
5741 Roughness Discrimination Using Bioinspired Tactile Sensors
Authors: Zhengkun Yi
Abstract:
Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robotic systems with a key missing ability. However, roughness, a major component of texture, has rarely been explored. This paper presents an approach for tactile surface roughness discrimination that comprises two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for surface roughness discrimination. The bioinspired fingertip is composed of two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) the elastic properties of the epidermis and dermis in human skin are replicated by the two PDMS layers of different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner’s corpuscles in both their location and their response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8)% is achieved using a single PVDF film sensor with a kNN (k = 9) classifier and the standard deviation feature.
Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination
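For illustration only (not the authors' implementation): a minimal sketch of the best-performing configuration described above, a single standard-deviation feature per recording classified with kNN (k = 9); the synthetic signals, class count, and array shapes are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Assumed input: one PVDF vibration recording per trial,
# shape (n_trials, n_samples), with a roughness label per trial.
rng = np.random.default_rng(0)
signals = rng.normal(scale=np.repeat([1, 2, 4, 8], 25)[:, None], size=(100, 2048))
labels = np.repeat([0, 1, 2, 3], 25)  # stand-ins for the Ra classes

# Single feature per trial: standard deviation of the raw signal.
features = signals.std(axis=1, keepdims=True)

clf = KNeighborsClassifier(n_neighbors=9)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"kNN (k=9), std feature: {scores.mean():.3f} +/- {scores.std():.3f}")
```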
Procedia PDF Downloads 311
5740 Detecting Potential Biomarkers for Ulcerative Colitis Using Hybrid Feature Selection
Authors: Mustafa Alshawaqfeh, Bilal Wajidy, Echin Serpedin, Jan Suchodolski
Abstract:
Inflammatory bowel disease (IBD) is a disease of the colon characterized by inflammation. Clinically, IBD is detected using laboratory tests (blood and stool), radiology tests (CT and MRI imaging), capsule endoscopy, and endoscopy. There are two variants of IBD, referred to as ulcerative colitis (UC) and Crohn’s disease. This study employs a hybrid feature selection method that combines a correlation-based variable ranking approach with exhaustive-search wrapper methods in order to find potential biomarkers for UC. The proposed biomarkers showed accurate discriminatory power, identifying them as possible ingredients for UC therapeutics.
Keywords: ulcerative colitis, biomarker detection, feature selection, inflammatory bowel disease (IBD)
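An illustrative sketch of the hybrid filter-plus-wrapper idea described above, assuming a generic abundance matrix; the correlation ranking, shortlist size, and wrapper classifier are placeholders, not the authors' exact choices.

```python
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in abundance table: rows = samples, columns = candidate biomarkers.
X, y = make_classification(n_samples=60, n_features=40, n_informative=5, random_state=0)

# Filter step: rank features by absolute Pearson correlation with the label.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
top = np.argsort(corr)[::-1][:10]          # keep the 10 best-ranked candidates

# Wrapper step: exhaustive search over small subsets of the shortlisted features.
best_subset, best_score = None, -np.inf
for k in (2, 3):
    for subset in combinations(top, k):
        score = cross_val_score(LogisticRegression(max_iter=1000), X[:, subset], y, cv=5).mean()
        if score > best_score:
            best_subset, best_score = subset, score

print("selected features:", best_subset, "CV accuracy:", round(best_score, 3))
```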
Procedia PDF Downloads 402
5739 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities
Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun
Abstract:
The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential electric vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and enhancing EV load profiling and event detection in smart cities. The approach utilizes IoT devices equipped with infrared cameras to collect thermal images together with household EV charging profiles from the Thai utility database, subsequently transmitting these data to a cloud database for comprehensive analysis. The methodology relies on advanced deep learning techniques, namely Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) algorithms, complemented by feature-based Gaussian mixture models for EV load profiling and event detection. The research findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior: it provides timely alarms to users regarding potential issues, categorizes the severity of detected problems based on a health index for each charging device, identifies unique power consumption patterns among EV owners, and outperforms existing models in event detection accuracy. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques in managing residential EV charging in smart cities. In summary, the research concludes that integrating IoT and deep learning techniques can effectively detect anomalies in residential EV charging and enhance EV load profiling and event detection accuracy, while the developed system ensures operational safety and efficiency and contributes to sustainable energy management in smart cities.
Keywords: cloud computing framework, recurrent neural networks, long short-term memory, IoT, EV charging, smart grids
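A minimal sketch of one ingredient of such a system, an LSTM that flags charging-load samples with large prediction error as anomalies; the synthetic load series, window length, and threshold are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

# Assumed input: a univariate household EV-charging load series at fixed intervals.
rng = np.random.default_rng(1)
t = np.arange(2000)
load = np.sin(2 * np.pi * t / 96) + 0.1 * rng.normal(size=t.size)  # daily charging pattern

def make_windows(series, width=48):
    X = np.stack([series[i:i + width] for i in range(len(series) - width)])
    y = series[width:]
    return X[..., None], y

X, y = make_windows(load)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(48, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)

# Flag time steps whose prediction error exceeds a percentile threshold.
errors = np.abs(model.predict(X, verbose=0).ravel() - y)
threshold = np.percentile(errors, 99)
anomalies = np.where(errors > threshold)[0]
print(f"{len(anomalies)} candidate anomalies above threshold {threshold:.3f}")
```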
Procedia PDF Downloads 64
5738 Phase Diagrams and Liquid-Liquid Extraction in Aqueous Biphasic Systems Formed by Polyethylene Glycol and Potassium Sodium Tartrate at 303.15 K
Authors: Amanda Cristina de Oliveira, Elias de Souza Monteiro Filho, Roberta Ceriani
Abstract:
Liquid-liquid extraction in aqueous two-phase systems (ATPSs) constitutes a powerful tool for purifying biomaterials such as cells, organelles, and proteins. In this work, the extraction of bovine serum albumin (BSA) has been studied in systems formed by polyethylene glycol (PEG) (1500, 4000, and 6000 g.mol⁻¹) + potassium sodium tartrate + water at 303.15 K. Phase diagrams were obtained by turbidimetry and Merchuk’s method (1998). The experimental tie-lines were described using the Othmer-Tobias and Bancroft correlations. ATPSs were correlated with the nonrandom two-liquid (NRTL) model. The results were considered excellent according to the global root-mean-square deviations, which were between 0.72% and 1.13%. The concentrations of the protein in each phase were determined by spectrophotometry at 280 nm, finding partition efficiencies greater than 71%.
Keywords: aqueous two-phase systems, bovine serum albumin, liquid-liquid extraction, polyethylene glycol
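For reference, the tie-line consistency correlations named above are commonly written as follows; this is one common form with an assumed subscript convention, not notation taken from the abstract itself.

```latex
% Tie-line consistency correlations often used for PEG-salt ATPS (one common form).
% Assumed convention: p = PEG, s = salt, w = water; superscripts t = top, b = bottom phase.
\begin{align}
  \ln\!\left(\frac{1 - w_{p}^{t}}{w_{p}^{t}}\right) &= a + b\,\ln\!\left(\frac{1 - w_{s}^{b}}{w_{s}^{b}}\right)
  && \text{(Othmer--Tobias)} \\
  \frac{w_{w}^{b}}{w_{s}^{b}} &= k\left(\frac{w_{w}^{t}}{w_{p}^{t}}\right)^{r}
  && \text{(Bancroft)}
\end{align}
```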
Procedia PDF Downloads 157
5737 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification
Authors: Rujia Chen, Ajit Narayanan
Abstract:
Convolutional neural networks (CNN), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct the filters and kernels used for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has well-known problems of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution. In particular, the use of crossover techniques when optimizing weights can help to overcome problems of local optima. However, the application of GAs for evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks, such as geometric shape recognition, the proposed algorithm can achieve 100% accuracy. The results for MNIST classification, while not as good as those obtained through standard filter learning with BP, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.
Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels
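A toy sketch of the general idea of evolving filter weights with a GA instead of backpropagation; the synthetic shape images, Fisher-style fitness, and GA settings below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

def make_image(kind):
    """Tiny synthetic shapes: a vertical or a horizontal bar plus noise."""
    img = rng.normal(scale=0.1, size=(8, 8))
    if kind == 0:
        img[:, 3:5] += 1.0   # vertical bar
    else:
        img[3:5, :] += 1.0   # horizontal bar
    return img

images = [make_image(k % 2) for k in range(60)]
labels = np.array([k % 2 for k in range(60)])

def fitness(filt):
    """Fisher-style separation of max filter responses between the two classes."""
    responses = np.array([convolve2d(im, filt.reshape(3, 3), mode="valid").max() for im in images])
    a, b = responses[labels == 0], responses[labels == 1]
    return abs(a.mean() - b.mean()) / (a.std() + b.std() + 1e-9)

pop = rng.normal(size=(30, 9))                       # population of flattened 3x3 filters
for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]          # truncation selection
    children = []
    while len(children) < len(pop):
        p1, p2 = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 9)
        child = np.concatenate([p1[:cut], p2[cut:]]) # one-point crossover
        child += rng.normal(scale=0.1, size=9)       # mutation
        children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness)
print("best filter:\n", best.reshape(3, 3).round(2), "\nfitness:", round(fitness(best), 3))
```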
Procedia PDF Downloads 186
5736 Bamboo Fibre Extraction and Its Reinforced Polymer Composite Material
Authors: P. Zakikhani, R. Zahari, M. T. H. Sultan, D. L. Majid
Abstract:
Natural plant fibre-reinforced polymer composite materials have been used in many fields to reduce environmental impact. In particular, bamboo fibres, owing to their environmental sustainability, mechanical properties, and recyclability, have been utilized as reinforcement in polymer matrix composites in the construction industry. This review summarizes bamboo structure and three different methods of extracting fibres from bamboo: mechanical, chemical, and a combination of the two. Each extraction method is chosen based on the intended application of the bamboo. In addition, bamboo fibre is compared with glass fibre from various aspects and in some respects shows advantages over glass fibre.
Keywords: bamboo fibres, natural fibres, bio composite, mechanical extraction, glass fibres
Procedia PDF Downloads 490
5735 Extractive Desulfurization of Atmospheric Gasoil with N,N-Dimethylformamide
Authors: Kahina Bedda, Boudjema Hamada
Abstract:
Environmental regulations have been introduced in many countries around the world to reduce the sulfur content of diesel fuel to ultra-low levels, with the intention of lowering diesel engines’ harmful exhaust emissions and improving air quality. Removal of sulfur-containing compounds from diesel feedstocks to produce ultra-low-sulfur diesel fuel by extraction with selective solvents has received increasing attention in recent years. This is because, compared to hydrotreating processes, sulfur extraction technologies could reduce the cost of desulfurization substantially, since they do not demand hydrogen and are carried out at atmospheric pressure. In this work, the desulfurization of distillate gasoil by liquid-liquid extraction with N,N-dimethylformamide was investigated. This fraction was recovered from a mixture of Hassi Messaoud crude oils and Hassi R'Mel gas condensate at the Algiers refinery. The sulfur content of this cut is 281 ppm. Experiments were performed in six stages with a solvent-to-feed ratio of 3:1. The effect of the extraction temperature was investigated in the interval 30-110 °C. At 110 °C the yield of refined gasoil was 82% and its sulfur content was 69 ppm.
Keywords: desulfurization, gasoil, N,N-dimethylformamide, sulfur content
Procedia PDF Downloads 386
5734 Liquid-Liquid Extraction of Uranium(VI) from Aqueous Solution Using 1-Hydroxyalkylidene-1,1-Diphosphonic Acids
Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi
Abstract:
The extraction of uranium(VI) from aqueous solutions has been investigated using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) and 1-hydroxydodecylidene-1,1-diphosphonic acid (HDDPA), which were synthesized and characterized by elemental analysis and by FT-IR, 1H NMR, and 31P NMR spectroscopy. In this paper, we propose a tentative assignment for the shifts of these two ligands and their specific complexes with uranium(VI). We carried out the extraction of uranium(VI) by HHDPA and HDDPA from [carbon tetrachloride + 2-octanol (v/v: 90%/10%)] solutions. Various factors such as contact time, pH, organic/aqueous phase ratio, and extractant concentration were considered. The optimum conditions obtained were: contact time = 20 min, organic/aqueous phase ratio = 1, pH value = 3.0, and extractant concentration = 0.3 M. The extraction yields are higher in the case of HHDPA, which has a longer hydrocarbon chain than HDDPA. Logarithmic plots of the uranium(VI) distribution ratio vs. pHeq and the extractant concentration showed that the ratio of extractant to extracted uranium(VI) (ligand/metal) is 2:1. The formula of the complex of uranium(VI) with HHDPA and HDDPA is UO2(H3L)2 (HHDPA and HDDPA are denoted as H4L). Spectroscopic analysis showed that coordination of uranium(VI) takes place via oxygen atoms.
Keywords: liquid-liquid extraction, uranium(VI), 1-hydroxyalkylidene-1,1-diphosphonic acids, HHDPA, HDDPA, aqueous solution
Procedia PDF Downloads 268
5733 Distribution of Phospholipids, Cholesterol and Carotenoids in Two-Solvent System during Egg Yolk Oil Solvent Extraction
Authors: Aleksandrs Kovalcuks, Mara Duma
Abstract:
Egg yolk oil is a concentrated source of egg bioactive compounds, such as fat-soluble vitamins, phospholipids, cholesterol, carotenoids, and others. To extract lipids and other fat-soluble nutrients from liquid egg yolk, a two-step extraction process involving polar (ethanol) and non-polar (hexane) solvents was used. This extraction technique was based on the polarities of the egg yolk bioactive compounds, where non-polar compounds were extracted into non-polar hexane and polar compounds into the polar alcohol/water phase. However, many egg yolk bioactive compounds are not strongly polar or non-polar. Egg yolk phospholipids, cholesterol, and pigments are amphipathic (they have both polar and non-polar regions), and their behavior in the ethanol/hexane solvent system is not clear. The aim of this study was to clarify the behavior of phospholipids, cholesterol, and carotenoids during extraction of egg yolk oil with ethanol and hexane and to determine the loss of these compounds in egg yolk oil. Egg yolks and egg yolk oil were analyzed for phospholipids (phosphatidylcholine (PC) and phosphatidylethanolamine (PE)), cholesterol, and carotenoids (lutein, zeaxanthin, canthaxanthin, and β-carotene) content using GC-FID and HPLC methods. PC and PE are polar lipids and were extracted into the polar ethanol phase. The concentration of PC in ethanol was 97.89% and that of PE 99.81% of the total egg yolk phospholipids. Due to cholesterol’s partial extraction into ethanol, the cholesterol content in egg yolk oil was reduced in comparison to its total content present in egg yolk lipids. The highest amount of lutein and zeaxanthin was concentrated in the ethanol extract. The opposite situation was observed with canthaxanthin and β-carotene, which became the main pigments of egg yolk oil.
Keywords: cholesterol, egg yolk oil, lutein, phospholipids, solvent extraction
Procedia PDF Downloads 509
5732 Simple Modified Method for DNA Isolation from Lyophilised Cassava Storage Roots (Manihot esculenta Crantz.)
Authors: P. K. Telengech, K. Monjero, J. Maling’a, A. Nyende, S. Gichuki
Abstract:
There is a need to identify an efficient protocol for the extraction of high-quality DNA for molecular work. Cassava roots are known for their high content of starch, polyphenols, and other secondary metabolites, which interfere with the quality of the DNA. These factors interfere negatively with the various methodologies for DNA extraction. There is a need to develop a simple, fast, and inexpensive protocol that yields high-quality DNA. In this improved Dellaporta method, the storage roots are lyophilized to reduce the water content, and the extraction buffer is modified to eliminate the high polyphenol, starch, and wax content. This simple protocol was compared to other protocols intended for plants with similar secondary metabolites. The method gave a high yield (300-950 ng) of pure DNA for use in PCR analysis. This improved Dellaporta protocol allows the isolation of pure DNA from starchy cassava storage roots.
Keywords: cassava storage roots, Dellaporta, DNA extraction, lyophilisation, polyphenols, secondary metabolites
Procedia PDF Downloads 363
5731 Performance Study of Neodymium Extraction by Carbon Nanotubes Assisted Emulsion Liquid Membrane Using Response Surface Methodology
Authors: Payman Davoodi-Nasab, Ahmad Rahbar-Kelishami, Jaber Safdari, Hossein Abolghasemi
Abstract:
High-purity rare earth elements (REEs) have been widely used in the fields of chemical engineering, metallurgy, nuclear energy, optical, magnetic, luminescence and laser materials, superconductors, ceramics, alloys, catalysts, etc. Neodymium is one of the most abundant rare earths. With the development of neodymium–iron–boron (Nd–Fe–B) permanent magnets, the importance of neodymium has increased dramatically. Solvent extraction processes have many operational limitations, such as a large inventory of extractants, loss of solvent due to organic solubility in aqueous solutions, volatilization of diluents, etc. One of the promising liquid membrane processes is the emulsion liquid membrane (ELM), which offers an alternative to solvent extraction processes. In this work, a study of Nd extraction through a multi-walled carbon nanotube (MWCNT)-assisted ELM using response surface methodology (RSM) has been performed. The ELM was composed of diisooctylphosphinic acid (CYANEX 272) as carrier, MWCNTs as nanoparticles, Span-85 (sorbitan trioleate) as surfactant, kerosene as organic diluent, and nitric acid as internal phase. The effects of the important operating variables, namely surfactant concentration, MWCNT concentration, and treatment ratio, were investigated. Results were optimized using a central composite design (CCD), and a regression model for the extraction percentage was developed. The 3D response surfaces of Nd(III) extraction efficiency were obtained, and the significance of the three variables and of their interactions on the Nd extraction efficiency was determined. Results indicated that introducing MWCNTs into the ELM process increased the Nd extraction due to higher membrane stability and enhanced mass transfer. An MWCNT concentration of 407 ppm, a Span-85 concentration of 2.1 (%v/v), and a treatment ratio of 10 were found to be the optimum conditions. Under these conditions, the extraction of Nd(III) reached a maximum of 99.03%.
Keywords: emulsion liquid membrane, extraction of neodymium, multi-walled carbon nanotubes, response surface method
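A hedged sketch of fitting a quadratic response-surface (CCD) model of the kind described above; the coded design points, responses, and assumed optimum coordinates below are placeholders, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical coded design matrix: columns stand for MWCNT concentration,
# Span-85 (%v/v), and treatment ratio (values and responses are placeholders).
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [0, 0, 0], [0, 0, 0], [1.68, 0, 0], [-1.68, 0, 0],
              [0, 1.68, 0], [0, -1.68, 0], [0, 0, 1.68], [0, 0, -1.68]])
y = np.array([70, 80, 75, 88, 72, 85, 78, 95, 90, 91, 93, 68, 83, 74, 86, 71])

# Full quadratic response-surface model: linear, interaction, and squared terms.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Predict extraction at an assumed optimum in coded coordinates.
optimum = poly.transform([[1.0, 0.5, -0.5]])
print("R^2:", round(model.score(poly.fit_transform(X), y), 3))
print("predicted extraction at assumed optimum:", round(model.predict(optimum)[0], 1), "%")
```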
Procedia PDF Downloads 255
5730 Simultaneous Extraction and Estimation of Steroidal Glycosides and Aglycone of Solanum
Authors: Karishma Chester, Sarvesh Paliwal, Sayeed Ahmad
Abstract:
Solanum nigrum L. (Family: Solanaceae) is an important Indian medicinal plant and has been used in various traditional formulations for hepatoprotection. It has been reported to contain significant amounts of steroidal glycosides, such as solamargine and solasonine, as well as their aglycone, solasodine. As important pharmacologically active metabolites of several members of the Solanaceae, these markers have been extracted and quantified several times, but separately for the glycoside and aglycone parts because of their opposite polarities. Here, we propose for the first time the simultaneous extraction and quantification of the aglycone (solasodine) and the glycosides (solamargine and solasonine) in leaves and berries of S. nigrum using solvent extraction followed by HPTLC analysis. Simultaneous extraction was carried out by sonication in a mixture of chloroform and methanol as solvent. The quantification was done using silica gel 60 F254 HPTLC plates as stationary phase and chloroform: methanol: acetone: 0.5% ammonia (7: 2.5: 1: 0.4 v/v/v/v) as mobile phase at 400 nm, after derivatization with anisaldehyde-sulfuric acid reagent. The method was validated as per ICH guidelines for calibration, linearity, precision, recovery, robustness, specificity, LOD, and LOQ. The statistical data obtained for validation showed that the method can be used routinely for quality control of the various solanaceous drugs reported to contain these markers, as well as of traditional formulations containing this plant as an ingredient.
Keywords: Solanum nigrum, solasodine, solamargine, solasonine, quantification
Procedia PDF Downloads 329
5729 The Mechanism Study of Degradative Solvent Extraction of Biomass by Liquid Membrane-Fourier Transform Infrared Spectroscopy
Authors: W. Ketren, J. Wannapeera, Z. Heishun, A. Ryuichi, K. Toshiteru, M. Kouichi, O. Hideaki
Abstract:
Degradative solvent extraction is a method developed for biomass upgrading by dewatering and fractionation of biomass under mild conditions. However, the conversion mechanism of the degradative solvent extraction method has not been fully understood so far. Rice straw was treated in 1-methylnaphthalene (1-MN) at solvent-treatment temperatures varied from 250 to 350 °C with a residence time of 60 min. The liquid membrane-Fourier transform infrared spectroscopy (FTIR) technique was applied to study the processing mechanism in depth without separation of the solvent. It was found that the strength of the oxygen-hydrogen stretching band (3600-3100 cm⁻¹) decreased slightly with increasing temperature in the range of 300-350 °C. The decrease of the hydroxyl groups in the solvent-soluble fraction suggested a dehydration reaction taking place between 300 and 350 °C. FTIR spectra in the carbonyl stretching region (1800-1600 cm⁻¹) revealed the presence of ester groups, carboxylic acid, and ketonic groups in the solvent-soluble fraction of the biomass. The carboxylic acid content increased in the range of 200 to 250 °C and then decreased. The prevalence of aromatic groups showed that aromatization took place during extraction above 250 °C. From 300 to 350 °C, the carbonyl functional groups in the solvent-soluble fraction noticeably decreased. The removal of carboxylic acid and the decrease of esters, released in the form of carbon dioxide, indicated that decarboxylation occurred during the extraction process.
Keywords: biomass waste, degradative solvent extraction, mechanism, upgrading
Procedia PDF Downloads 285
5728 Synthetic Cannabinoids: Extraction, Identification and Purification
Authors: Niki K. Burns, James R. Pearson, Paul G. Stevenson, Xavier A. Conlan
Abstract:
In the Australian state of Victoria, synthetic cannabinoids have recently been made illegal under an amendment to the Drugs, Poisons and Controlled Substances Act 1981. Identification of synthetic cannabinoids in popular brands of ‘incense’ and ‘potpourri’ has been a difficult and challenging task due to the sample complexity and the changes observed in the chemical composition of the cannabinoids of interest. This study has developed analytical methodology for the targeted extraction and determination of synthetic cannabinoids available pre-ban. A simple solvent extraction and solid phase extraction methodology was developed that selectively extracted the cannabinoids of interest. High performance liquid chromatography coupled with UV-visible and chemiluminescence detection (acidic potassium permanganate and tris(2,2-bipyridine)ruthenium(III)) was used to interrogate the synthetic cannabinoid products. Mass spectrometry and nuclear magnetic resonance spectroscopy were used for structural elucidation of the synthetic cannabinoids. The tris(2,2-bipyridine)ruthenium(III) detection was found to offer better sensitivity than the permanganate-based reagent. In twelve different brands of herbal incense, cannabinoids were extracted and identified, including UR-144, XLR 11, AM2201, 5-F-AKB48, and A796-260.
Keywords: electrospray mass spectrometry, high performance liquid chromatography, solid phase extraction, synthetic cannabinoids
Procedia PDF Downloads 467
5727 Graph-Based Semantical Extractive Text Analysis
Authors: Mina Samizadeh
Abstract:
In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous amount of data necessitates effective computational tools to explore it. This has led to intense and growing interest in the research community in developing computational methods for processing text data. One line of study focuses on condensing text so that a higher level of understanding can be reached in a shorter time. The two important tasks for doing this are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key important words in a text; this makes us familiar with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised learning method that is an extension of PageRank (the algorithm at the core of the Google search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and report them as a result. However, it neglects the semantic similarity between the different parts. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Aside from keyword extraction and text summarization, we develop a topic clustering algorithm based on our framework, which can be used individually or as part of generating the summary to overcome coverage problems.
Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis
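A compact sketch of TextRank with similarity-weighted edges; here TF-IDF cosine similarity stands in for the semantic similarity measure, which is an assumption rather than the authors' exact choice.

```python
import numpy as np
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "Keyword extraction finds the most important words in a document.",
    "Text summarization produces a short text with the key information.",
    "TextRank builds a graph of sentences and ranks them like PageRank ranks pages.",
    "Semantic similarity between sentences can replace plain word overlap as edge weight.",
    "Graph-based ranking then selects the top sentences as the summary.",
]

# Edge weights: cosine similarity between TF-IDF vectors, a stand-in for the
# semantic similarity described above.
tfidf = TfidfVectorizer().fit_transform(sentences)
sim = cosine_similarity(tfidf)
np.fill_diagonal(sim, 0.0)

graph = nx.from_numpy_array(sim)              # weighted, undirected sentence graph
scores = nx.pagerank(graph, weight="weight")  # TextRank = PageRank on this graph

ranked = sorted(scores, key=scores.get, reverse=True)
summary = [sentences[i] for i in sorted(ranked[:2])]  # keep original sentence order
print("\n".join(summary))
```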
Procedia PDF Downloads 70
5726 A Method for Solid-Liquid Separation of Cs+ from Radioactive Waste by Using Ionic Liquids and Extractants
Authors: J. W. Choi, S. Y. Cho, H. J. Lee, W. Z. Oh, S. J. Choi
Abstract:
Ionic liquids (ILs), which are alternatives to conventional organic solvents, were used for the extraction of Cs ions. ILs, as useful environmentally friendly green solvents, have recently been applied as replacements for traditional volatile organic compounds (VOCs) in the liquid/liquid extraction of heavy metal ions as well as of organic and inorganic species and pollutants. Thus, ionic liquids were used for the extraction of Cs ions from liquid radioactive waste. In most cases, Cs ions are present in radioactive wastes at very low concentrations, approximately less than 1 ppm. Therefore, unlike in established extraction systems, the required amount of IL as extractant is comparatively very small. This extraction method involves a cation exchange mechanism in which the Cs ion transfers to the organic phase and binds to one crown ether by chelation in exchange for a single IL cation, IL_cation+, transferring to the aqueous phase. This extraction system showed solid-liquid separation, in which the ionic liquid 1-ethyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide (C2mimTf2N) and the crown ether dicyclohexano-18-crown-6 (DCH18C6) were both used in very small amounts as solvent and extractant, respectively. A 30 mM CsNO3 solution was used as the simulated waste solution of cesium ions. Generally, in liquid-liquid extraction, the molar ratio of CE:Cs+:ILs is 1:5~10:>100, while our applied molar ratio of CE:Cs+:ILs was 1:2:1~10. The quantities of CE and Cs ions were fixed at 0.6 and 1.2 mmol, respectively. The precipitation phenomenon showed two kinds of separation: solid-liquid separation at the ratios of 1:2:1 and 1:2:2, and solid-liquid-liquid separation (three phases) at the ratios of 1:2:5 and 1:2:10. In the latter system, the three phases were precipitate, ionic liquid, and aqueous. The precipitate was verified to consist of Cs+, DCH18C6, and Tf2N-, based on the cation exchange mechanism. We analyzed the precipitate using scanning electron microscopy with X-ray microanalysis (SEM-EDS), an elemental analyser, Fourier transform infrared spectroscopy (FT-IR), and differential scanning calorimetry (DSC). The experimental results demonstrated an easy extraction method and confirmed the composition of the solid precipitate. We also found that the complex formation ratio of Cs+ to DCH18C6 is 0.88:1 regardless of the C2mimTf2N quantity.
Keywords: extraction, precipitation, solid-liquid separation, ionic liquid, precipitate
Procedia PDF Downloads 420
5725 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies that are extensively utilized in various applications, such as the detection/extraction of man-made structures, monitoring of sensitive areas, creation of graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the focus on building extraction. Once all landscape regions are collected, a trimming process is done so as to eliminate the regions that arise from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for efficient building extraction. The images used for the analysis are those acquired by sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results. The additional overhead of intermediate processing is eliminated without compromising the quality of the output, which reduces the processing steps required and the time consumed.
Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery
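A minimal sketch of the trimming and labeling steps on a binary building mask, using scikit-image; the mask and the size threshold are synthetic stand-ins, not the real VHR pipeline.

```python
import numpy as np
from skimage.measure import label, regionprops
from skimage.morphology import remove_small_objects

# Synthetic stand-in for a binary "candidate building" mask produced after
# shadow/vegetation screening of a VHR scene (the real pipeline is more involved).
mask = np.zeros((120, 120), dtype=bool)
mask[10:40, 15:60] = True      # large rectangular roof
mask[70:95, 70:110] = True     # second building
mask[50:53, 5:8] = True        # small non-building blob to be trimmed

trimmed = remove_small_objects(mask, min_size=50)   # "trimming" step: drop small regions
labelled = label(trimmed)                           # "label method": connected components

for region in regionprops(labelled):
    minr, minc, maxr, maxc = region.bbox
    print(f"building candidate: area={region.area}, bbox=({minr},{minc})-({maxr},{maxc})")
```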
Procedia PDF Downloads 314
5724 Comprehensive Risk Assessment Model in Agile Construction Environment
Authors: Jolanta Tamošaitienė
Abstract:
The article focuses on a comprehensive model developed for risk assessment and selection in an agile environment, based on multi-attribute methods. The model rests on a multi-attribute evaluation of risk in construction, and the optimality criterion values are calculated using complex Multiple Criteria Decision-Making (MCDM) methods. The model may be further applied to risk assessment in an agile construction environment. The attributes of risk in a construction project are selected by applying the risk assessment conditions of the construction sector, and the construction process efficiency in the construction industry accounts for the agile environment. The paper presents the comprehensive risk assessment model in an agile construction environment, providing background, a description of the proposed model, and an analysis of the model together with the criteria.
Keywords: assessment, environment, agile, model, risk
Procedia PDF Downloads 255
5723 Evaluating Models Through Feature Selection Methods Using Data Driven Approach
Authors: Shital Patil, Surendra Bhosale
Abstract:
Cardiac diseases are the leading causes of mortality and morbidity in the world; over the last few decades they have accounted for a large number of deaths and have emerged as the most life-threatening disorders globally. Machine learning and artificial intelligence have been playing a key role in predicting heart diseases. A relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance with both the raw (unbalanced) and the sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset has been used for this study. Four feature selection methods are used: data analysis, minimum Redundancy Maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and Chi-squared. These methods are tested with 8 different classification models to get the best accuracy possible. Using the balanced and unbalanced datasets, the study shows promising results in terms of various performance metrics in accurately predicting heart disease. Experimental results obtained by the proposed method with the raw data reach a maximum AUC of 100%, a maximum F1 score of 94%, a maximum recall of 98%, and a maximum precision of 93%, while with the balanced dataset the results are a maximum AUC of 100%, an F1 score of 95%, a maximum recall of 95%, and a maximum precision of 97%.
Keywords: cardiovascular diseases, machine learning, feature selection, SMOTE
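An illustrative sketch comparing two of the feature selection methods mentioned above (Chi-squared and RFE) on raw and SMOTE-balanced data; a public dataset stands in for Z-Alizadeh Sani, and the classifier and feature counts are assumptions.

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE, SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

# Public stand-in dataset; the study itself uses Z-Alizadeh Sani.
X, y = load_breast_cancer(return_X_y=True)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)  # balanced variant

selectors = {
    "chi2 (k=10)": SelectKBest(chi2, k=10),
    "RFE (10 features)": RFE(LogisticRegression(max_iter=5000), n_features_to_select=10),
}
for name, selector in selectors.items():
    for label, (Xd, yd) in {"raw": (X, y), "balanced": (X_bal, y_bal)}.items():
        # MinMaxScaler keeps features non-negative, as chi2 requires.
        pipe = make_pipeline(MinMaxScaler(), selector, LogisticRegression(max_iter=5000))
        acc = cross_val_score(pipe, Xd, yd, cv=5).mean()
        print(f"{name:<20} {label:<9} accuracy = {acc:.3f}")
```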
Procedia PDF Downloads 118
5722 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes
Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani
Abstract:
The development of methods to annotate unknown gene functions is an important task in bioinformatics. One approach to annotation is the identification of the metabolic pathway in which a gene is involved. Gene expression data have been utilized for this identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is considered to be caused by the difficulty of accurately measuring gene expression: even when measured under the same condition, gene expression values usually vary. In this study, we propose a feature extraction method that focuses on the variability of gene expression in order to estimate a gene's metabolic pathway accurately. First, we estimated the distribution of each gene's expression from replicate data. Next, we calculated the similarity between all gene pairs by KL divergence, a measure of the difference between distributions. Finally, we used the similarity vectors as feature vectors and trained a multiclass SVM to identify the genes' metabolic pathways. To evaluate the developed method, we applied it to budding yeast and trained the multiclass SVM to identify seven metabolic pathways. As a result, the accuracy obtained by our method was higher than that obtained from the raw gene expression data. Thus, our developed method combined with KL divergence is useful for identifying a gene's metabolic pathway.
Keywords: metabolic pathways, gene expression data, microarray, Kullback-Leibler divergence, KL divergence, support vector machines, SVM, machine learning
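A small sketch of the core idea: fit a distribution to each gene's replicates, use its KL divergence to every other gene as the feature vector, and train a multiclass SVM; the synthetic expression data and the univariate Gaussian assumption are illustrative choices, not the paper's exact setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genes, n_replicates = 120, 6
pathway = np.repeat(np.arange(4), n_genes // 4)          # stand-in pathway labels

# Stand-in replicate expression data: each gene has its own mean and variance.
means = rng.normal(size=n_genes) + pathway               # pathway shifts the mean a little
stds = 0.5 + rng.random(n_genes)
expr = means[:, None] + stds[:, None] * rng.normal(size=(n_genes, n_replicates))

# Fit a Gaussian to each gene's replicates, then compute pairwise KL divergences.
mu, sigma = expr.mean(axis=1), expr.std(axis=1) + 1e-6

def kl_gauss(m1, s1, m2, s2):
    """KL divergence between univariate Gaussians, KL(N(m1,s1^2) || N(m2,s2^2))."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

features = np.array([[kl_gauss(mu[i], sigma[i], mu[j], sigma[j]) for j in range(n_genes)]
                     for i in range(n_genes)])            # each gene's KL vector to all genes

acc = cross_val_score(SVC(kernel="rbf"), features, pathway, cv=5).mean()
print("multiclass SVM accuracy on KL-divergence features:", round(acc, 3))
```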
Procedia PDF Downloads 403
5721 Comparison of Microwave-Assisted and Conventional Leaching for Extraction of Copper from Chalcopyrite Concentrate
Authors: Ayfer Kilicarslan, Kubra Onol, Sercan Basit, Muhlis Nezihi Saridede
Abstract:
Chalcopyrite (CuFeS2) is the most common primary mineral used for the commercial production of copper. The low dissolution efficiency of chalcopyrite in sulfate media has prevented an efficient industrial leaching of this mineral in such media. Ferric ions, bacteria, oxygen, and other oxidants have been used as oxidizing agents in the leaching of chalcopyrite in sulfate and chloride media under atmospheric or pressure leaching conditions. Two leaching methods were studied to evaluate chalcopyrite (CuFeS2) dissolution in acid media. First, the conventional oxidative acid leaching method was carried out using sulfuric acid (H2SO4) and potassium dichromate (K2Cr2O7) as oxidant at atmospheric pressure. Second, microwave-assisted acid leaching was performed using the microwave accelerated reaction system (MARS) for the same reaction media. Parameters affecting the copper extraction, such as leaching time, leaching temperature, concentration of H2SO4, and concentration of K2Cr2O7, were investigated. The results of the conventional acid leaching experiments were compared to those of the microwave leaching method. It was found that the copper extraction obtained under high temperature and high oxidant concentrations with microwave leaching is higher than that obtained conventionally. 81% copper extraction was obtained by the conventional oxidative acid leaching method in 180 min, with a concentration of 0.3 mol/L K2Cr2O7 in 0.5 M H2SO4 at 50 °C, while 93.5% copper extraction was obtained in 60 min with the microwave leaching method under the same conditions.
Keywords: extraction, copper, microwave-assisted leaching, chalcopyrite, potassium dichromate
Procedia PDF Downloads 370
5720 Investigation of the EEG Signal Parameters during Epileptic Seizure Phases in Consequence to the Application of External Healing Therapy on Subjects
Authors: Karan Sharma, Ajay Kumar
Abstract:
An epileptic seizure is a condition in which electrical charge in the brain flows abruptly, resulting in abnormal activity by the subject. One percent of the world population experiences epileptic seizure attacks. Due to the abrupt flow of charge, EEG (electroencephalogram) waveforms change, and many spikes and sharp waves appear in the EEG signals. Detection of epileptic seizures using conventional methods is time-consuming, so many methods have evolved that detect them automatically. The initial part of this paper provides a review of techniques used to detect epileptic seizures automatically. The automatic detection is based on feature extraction and classification patterns. For better accuracy, decomposition of the signal is required before feature extraction. A number of parameters are calculated by researchers using different techniques, e.g. approximate entropy, sample entropy, fuzzy approximate entropy, intrinsic mode functions, cross-correlation, etc., to discriminate between a normal signal and an epileptic seizure signal. The main objective of this review paper is to present the variations in the EEG signals at both stages, (i) interictal (recording between the epileptic seizure attacks) and (ii) ictal (recording during the epileptic seizure), using the most appropriate methods of analysis to provide better healthcare diagnosis. This research paper then investigates the effects of a noninvasive healing therapy on the subjects by studying the EEG signals using the latest signal processing techniques. The study has been conducted with Reiki as the healing technique, beneficial for restoring balance in cases of body-mind alterations associated with an epileptic seizure. Reiki is practiced around the world and is recommended for different health services as a treatment approach. Reiki is an energy medicine, specifically a biofield therapy developed in Japan in the early 20th century. It is a system involving the laying on of hands to stimulate the body’s natural energetic system. Earlier studies have shown an apparent connection between Reiki and the autonomous nervous system. The Reiki sessions are applied by an experienced therapist. EEG signals are measured at baseline, during the session, and post-intervention to bring about effective epileptic seizure control or its elimination altogether.
Keywords: EEG signal, Reiki, time consuming, epileptic seizure
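As one example of the entropy features listed above, a simplified sample entropy computation is sketched below; the synthetic signals, parameter choices, and the slightly simplified template counting are assumptions for illustration, not a reference implementation.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Simplified sample entropy: -ln(A/B), where B counts template matches of
    length m and A counts matches of length m+1 (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r

    def count_matches(dim):
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (dist <= r).sum() - len(templates)   # drop self-matches on the diagonal

    B, A = count_matches(m), count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

rng = np.random.default_rng(0)
baseline = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.3 * rng.normal(size=1000)
spiky = baseline.copy()
spiky[::50] += 4.0                                   # crude stand-in for ictal spikes
print("baseline SampEn:", round(sample_entropy(baseline), 3))
print("spiky    SampEn:", round(sample_entropy(spiky), 3))
```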
Procedia PDF Downloads 406
5719 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection
Authors: Muhammad Ali
Abstract:
Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are emerging concerns in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For further IoT development, and mainly for smart and IoT applications, there is a necessity for intelligent processing and analysis of data, so our approach aims to secure such systems. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, considering IoT applications, with ANOVA-based feature selection yielding smaller prediction models for evaluating network traffic and helping to protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, DT, and RF, evaluated for the most satisfactory test accuracy with fast detection. The evaluation metrics include precision, recall, F1 score, FPR, NPV, GM, MCC, and AUC-ROC. The Random Forest algorithm achieved the best results with the least prediction time, with an accuracy of 99.98%.
Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection
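A minimal sketch of the kind of pipeline described above, ANOVA F-test feature selection followed by a Random Forest; the synthetic data is a stand-in for labelled IoT network-traffic records, and the feature count is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for labelled IoT traffic (normal vs. attack, imbalanced).
X, y = make_classification(n_samples=3000, n_features=30, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)

# ANOVA F-test keeps the most discriminative features before classification.
pipe = make_pipeline(SelectKBest(f_classif, k=10),
                     RandomForestClassifier(n_estimators=200, random_state=0))

scores = cross_validate(pipe, X, y, cv=5, scoring=["accuracy", "precision", "recall", "f1"])
for metric in ("accuracy", "precision", "recall", "f1"):
    print(f"{metric:>9}: {scores['test_' + metric].mean():.3f}")
```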
Procedia PDF Downloads 125
5718 Mood Recognition Using Indian Music
Authors: Vishwa Joshi
Abstract:
The study of mood recognition in the field of music has gained a lot of momentum in recent years, with machine learning and data mining techniques and many audio features contributing considerably to analyzing and identifying the relation between mood and music. In this paper, we take this idea forward and make an effort to build a system for the automatic recognition of the mood underlying audio song clips by mining their audio features; we evaluate several data classification algorithms in order to learn, train, and test the model describing the moods of these audio songs, and we develop an open-source framework. Before classification, preprocessing and feature extraction phases are necessary for removing noise and gathering features, respectively.
Keywords: music, mood, features, classification
Procedia PDF Downloads 495
5717 Extraction of the Volatile Oils of Dictyopteris Membranacea by Focused Microwave Assisted Hydrodistillation and Supercritical Carbon Dioxide: Chemical Composition and Kinetic Data
Authors: Mohamed El Hattab
Abstract:
Supercritical carbon dioxide extraction (SFE) and focused microwave-assisted hydrodistillation (FMAHD) were employed to isolate the volatile fraction of the brown alga Dictyopteris membranacea from the crude extract. The volatile fractions obtained were analyzed by GC/MS. The major compounds, namely dictyopterene A, 6-butylcyclohepta-1,4-diene, undec-1-en-3-one, undeca-1,4-dien-3-one, (3-oxoundec-4-enyl) sulphur, tetradecanoic acid, hexadecanoic acid, 3-hexyl-4,5-dithia-cycloheptanone, and albicanol (the latter present only in the FMAHD oil), were identified by comparing their mass spectra with those reported in the commercial MS database and in our previous work. A kinetic study carried out on both extraction processes, followed by external standard quantification, allowed the mass percent evolution of the major compounds in the two oils to be followed; an empirical mathematical model was used to describe their extraction kinetics.
Keywords: Dictyopteris membranacea, extraction techniques, mathematical modeling, volatile oils
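One empirical kinetic form often fitted to extraction-yield curves is sketched below as an assumption for illustration; the abstract does not state which model was actually used.

```latex
% A minimal empirical kinetic form (first-order approach to a plateau), assumed here:
\begin{equation}
  y(t) = y_{\infty}\left(1 - e^{-k t}\right),
\end{equation}
% where $y(t)$ is the mass percent of a compound extracted at time $t$,
% $y_{\infty}$ its asymptotic value, and $k$ an apparent first-order rate constant.
```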
Procedia PDF Downloads 428
5716 An Architectural Approach for the Dynamic Adaptation of Services-Based Software
Authors: Mohhamed Yassine Baroudi, Abdelkrim Benammar, Fethi Tarik Bendimerad
Abstract:
This paper proposes a software architecture for dynamic service adaptation. The services are constituted of reusable software components. The goal of the adaptation is to optimize the service as a function of its execution context. As a first step, the context takes into account only the user's needs, but other elements will be added. A particular feature of our proposal is the profiles, which are used to describe not only the context's elements but also the components themselves. An adapter analyzes the compatibility between all these profiles and detects the points where the profiles are not compatible. The same adapter searches for and applies the possible adaptation solutions: component customization, insertion, extraction, or replacement.
Keywords: adaptive service, software component, service, dynamic adaptation
Procedia PDF Downloads 298
5715 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strengths of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.
Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 127
5714 Extraction of Phycocyanin from Spirulina platensis by Isoelectric Point Precipitation and Salting Out for Scale Up Processes
Authors: Velasco-Rendón María Del Carmen, Cuéllar-Bermúdez Sara Paulina, Parra-Saldívar Roberto
Abstract:
Phycocyanin is a blue pigment protein with fluorescent activity produced by cyanobacteria. It has recently been studied to determine its anticancer, antioxidant, and anti-inflammatory potential. In 2014 it was approved as a Generally Recognized As Safe (GRAS) protein pigment for the food industry. Therefore, phycocyanin shows potential for the food, nutraceutical, pharmaceutical, and diagnostics industries. Conventional phycocyanin extraction includes buffer solutions and ammonium sulphate followed by chromatography or ATPS for protein separation. Therefore, further purification steps are time-consuming, energy-intensive, and not suitable for scale-up processing. This work presents an alternative to conventional methods that also allows large-scale application with commercially available equipment. The extraction was performed by exposing the dry biomass to mechanical cavitation and salting out with NaCl in order to use an edible reagent. In addition, isoelectric point precipitation was applied by addition of HCl and neutralization with NaOH. The results were measured and compared in terms of phycocyanin concentration, purity, and extraction yield. Results showed that the best extraction condition was salting out with 0.20 M NaCl after 30 minutes of cavitation, with a concentration in the supernatant of 2.22 mg/ml, a purity of 3.28, and a recovery from the crude extract of 81.27%. Mechanical cavitation presumably increased the solvent-biomass contact, making the crude extract visibly dark blue after centrifugation. Compared to other systems, our process has fewer purification steps, similar concentrations in the phycocyanin-rich fraction, and higher purity. The contaminants present in our process are edible NaCl or low pH, which can be neutralized. It can also be adapted to a semi-continuous process with commercially available equipment. These characteristics make this process an appealing alternative for phycocyanin extraction as a pigment for the food industry.
Keywords: extraction, phycocyanin, precipitation, scale-up
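For context, one commonly used spectrophotometric formulation for phycocyanin concentration and purity is given below; it is an assumption that these are the quantities behind the reported concentration and purity values, since the abstract does not state the equations used.

```latex
% A commonly used formulation (assumed here): C-phycocyanin concentration from the
% Bennett--Bogorad relation, and the usual purity index from absorbance ratios.
\begin{align}
  C_{\mathrm{PC}}\;[\mathrm{mg\,mL^{-1}}] &= \frac{A_{615} - 0.474\,A_{652}}{5.34} \\[4pt]
  \text{Purity} &= \frac{A_{620}}{A_{280}}
\end{align}
```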
Procedia PDF Downloads 438
5713 An EEG-Based Scale for Comatose Patients’ Vigilance State
Authors: Bechir Hbibi, Lamine Mili
Abstract:
Understanding the condition of comatose patients can be difficult, but it is crucial to their optimal treatment. Consequently, numerous scoring systems have been developed around the world to categorize patient states based on physiological assessments. Although validated and widely adopted by medical communities, these scores still present numerous limitations and obstacles. Even with the addition of further tests and extensions, these scoring systems have not been able to overcome certain limitations, and it appears unlikely that they will be able to do so in the future. On the other hand, physiological tests are not the only way to gain insight into the state of comatose patients. EEG signal analysis has contributed extensively to the understanding of the human brain and human consciousness and has been used by researchers in the classification of different levels of disease. The use of EEG in the ICU has become an urgent matter in several cases and has been recommended by medical organizations. In this field, the EEG is used to investigate epilepsy, dementia, brain injuries, and many other neurological disorders. It has recently also been used to detect pain activity in some regions of the brain, to detect stress levels, and to evaluate sleep quality. In our recent work, our aim was to use multifractal analysis, a very successful method for handling multifractal signals and feature extraction, to establish a state-of-awareness scale for comatose patients based on their electrical brain activity. The results show that this score could be instantaneous and could overcome many of the limitations from which physiological scales suffer. Multifractal analysis stands out as a highly effective tool for characterizing non-stationary and self-similar signals; it demonstrates strong performance in extracting the properties of fractal and multifractal data, including signals and images. As such, we leverage this method, along with other features derived from EEG signal recordings of comatose patients, to develop a scale. This scale aims to accurately depict the vigilance state of patients in intensive care units and to address many of the limitations inherent in physiological scales such as the Glasgow Coma Scale (GCS) and the FOUR score. The results of applying version V0 of this approach to 30 patients with known GCS showed that the EEG-based score describes the states of vigilance similarly but distinguishes between the states of 8 sedated patients to whom the GCS could not be applied. Therefore, our approach could show promising results for patients with disabilities, patients given painkillers, and other categories to whom physiological scores cannot be applied.
Keywords: coma, vigilance state, EEG, multifractal analysis, feature extraction
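For reference, the standard multifractal detrended fluctuation analysis (MFDFA) quantities that typically underlie multifractal EEG features are sketched below; this is a generic outline of one common formulation, not the authors' exact feature set.

```latex
% Common MFDFA relations: the q-th order fluctuation function and its scaling,
% followed by the multifractal spectrum via the Legendre transform.
\begin{align}
  F_q(s) &= \left\{ \frac{1}{N_s}\sum_{\nu=1}^{N_s}\left[F^2(\nu, s)\right]^{q/2} \right\}^{1/q}
           \;\propto\; s^{h(q)}, \\
  \tau(q) &= q\,h(q) - 1, \qquad
  \alpha = \frac{d\tau}{dq}, \qquad
  f(\alpha) = q\,\alpha - \tau(q),
\end{align}
% where F^2(\nu, s) is the detrended variance of segment \nu at scale s,
% N_s the number of segments, and h(q) the generalized Hurst exponent.
```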
Procedia PDF Downloads 67
5712 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for The Extraction and Recovery of Metal from Ores
Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter
Abstract:
Over the past few decades, the demand for metals has increased significantly. This has led to a decline of high-grade ores over time and an increase in mineral complexity and matrix heterogeneity. In addition, there are rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that are able to economically process and recover metallic values from low-grade ores, from materials with metal content locked up in industrially processed residues (tailings and slag), and from complex-matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced using an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, the extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all the promises of a “greener” process.
Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment
Procedia PDF Downloads 132