Search results for: time-frequency feature extraction
2595 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps to improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) has been implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and takes less learning time than using the full feature set of the dataset with the same algorithm.
Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
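The information gain ranking used for feature selection here can be sketched in pure Python; the toy traffic records below are hypothetical and only illustrate the computation, not the actual NSL-KDD pipeline:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(class; feature) = H(class) - H(class | feature)."""
    total = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [lb for fv, lb in zip(feature_values, labels) if fv == v]
        cond += (len(subset) / total) * entropy(subset)
    return entropy(labels) - cond

# Hypothetical traffic records: (protocol, flag) per connection.
records = [("tcp", "SF"), ("udp", "SF"), ("tcp", "REJ"), ("tcp", "REJ")]
labels = ["normal", "normal", "attack", "attack"]

ig_protocol = information_gain([r[0] for r in records], labels)
ig_flag = information_gain([r[1] for r in records], labels)
```

Features are then ranked by their IG score and the lowest-ranked ones dropped before clustering; here the flag feature separates the classes perfectly, so its IG equals the full class entropy.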
Procedia PDF Downloads 296
2594 Liquid-Liquid Equilibrium Study in Solvent Extraction of o-Cresol from Coal Tar
Authors: Dewi Selvia Fardhyanti, Astrilia Damayanti
Abstract:
Coal tar is a liquid by-product of coal gasification and carbonization processes, and also arises in industries such as steel, power, and cement production. This liquid oil mixture contains various useful compounds, such as aromatic and phenolic compounds, which are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates the thermodynamic modelling of liquid-liquid equilibria (LLE) in the solvent extraction of o-Cresol from coal tar. The equilibria of the ternary components are modeled with the Wohl, Van Laar, and Three-Suffix Margules models, and the values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model seems to be the best for predicting the LLE of o-Cresol in these systems.
Keywords: coal tar, o-Cresol, Wohl, Van Laar, Three-Suffix Margules
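For a binary pair, the three-suffix (two-parameter) Margules activity-coefficient equations can be evaluated directly. A minimal sketch, assuming the standard two-parameter form; the parameter values are illustrative, not the paper's fitted values:

```python
import math

def margules_gammas(x1, a12, a21):
    """Three-suffix (two-parameter) Margules activity coefficients
    for a binary mixture; a12 and a21 are dimensionless (A/RT)."""
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1 ** 2 * (a21 + 2.0 * (a12 - a21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

# At infinite dilution of component 1, ln(gamma1) reduces to a12,
# while component 2 is pure and its gamma is exactly 1:
g1_inf, g2_pure = margules_gammas(0.0, a12=0.5, a21=1.0)
```

Curve-fitting then amounts to minimizing the deviation between such model activity coefficients and the measured tie-line compositions.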
Procedia PDF Downloads 278
2593 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification
Authors: Bing Li, Zhi Li, Yilong Yang
Abstract:
Web service classification can promote the quality of service discovery and management in the service repository, and is widely used to locate the services developers desire. Although traditional classification methods based on supervised learning models can achieve classification tasks, developers need to manually mark web services, and the quality of these tags may not be good enough to establish an accurate classifier for service classification. As the number of web services keeps doubling, the manual tagging method has become unrealistic. In recent years, the attention mechanism has made remarkable progress in the field of deep learning, and its huge potential has been demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to neighborhood nodes without complicated matrix operations or reliance on understanding the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network, and can classify web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery
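The core of a single attention head over a node's neighborhood (score the neighbors, softmax the scores, take the weighted average of their features) can be sketched without any matrix library. The graph and features below are hypothetical; a multi-head layer would concatenate several such aggregates with learned projections:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_aggregate(node_feats, neighbors, node):
    """One attention head: weight each neighbor of `node` by the
    softmax of its dot-product similarity, then average features."""
    q = node_feats[node]
    nbrs = neighbors[node]
    scores = [sum(a * b for a, b in zip(q, node_feats[n])) for n in nbrs]
    weights = softmax(scores)
    dim = len(q)
    return [sum(w * node_feats[n][d] for w, n in zip(weights, nbrs))
            for d in range(dim)]

# Hypothetical 3-node service graph with 2-D one-hot features.
node_feats = {0: [1.0, 0.0], 1: [1.0, 0.0], 2: [0.0, 1.0]}
neighbors = {0: [1, 2]}
agg = attention_aggregate(node_feats, neighbors, 0)
```

Node 1, being more similar to node 0, receives the larger attention weight, so its feature dominates the aggregate.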
Procedia PDF Downloads 137
2592 Effect of Impurities in the Chlorination Process of TiO2
Authors: Seok Hong Min, Tae Kwon Ha
Abstract:
With the increasing interest in Ti alloys, the extraction of Ti from its typical ore, TiO2, has long been, and will remain, an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed from high-grade TiO2 feedstock. The purity of TiCl4 after chlorination depends on the quality of the titanium feedstock. Since impurities in the crude TiCl4 are carried over into the final products, a purification process is required; it includes fractional distillation and chemical treatment, depending on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermodynamic software.
Keywords: rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage
Procedia PDF Downloads 309
2591 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements to precisely and efficiently identify and categorize human activities. This paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is applied to abstract the main cues from the data, and the SelectKBest strategy is used to retain the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, have been investigated to accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
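The windowing stage described above is conventionally followed by per-window statistical features; a minimal sketch over one hypothetical IMU channel (window width, step, and signal values are invented for illustration):

```python
from statistics import mean, stdev

def window_features(signal, width, step):
    """Slide a fixed-width window over a 1-D sensor channel and
    extract simple statistical features from each window."""
    feats = []
    for start in range(0, len(signal) - width + 1, step):
        w = signal[start:start + width]
        feats.append({
            "mean": mean(w),
            "std": stdev(w),
            "min": min(w),
            "max": max(w),
        })
    return feats

# Hypothetical accelerometer magnitude trace (8 samples).
signal = [0.0, 0.1, 0.2, 1.5, 1.6, 1.4, 0.1, 0.0]
rows = window_features(signal, width=4, step=2)
```

Each feature dictionary becomes one row of the design matrix handed to SelectKBest and the downstream classifiers.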
Procedia PDF Downloads 13
2590 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V
Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev
Abstract:
Extraction of vanadium and nickel compounds is complex due to the high stability of their porphyrins; nickel is a catalytic poison that deactivates catalysts during the catalytic cracking of oil, while vanadium is abrasive yet valuable. Thus, the high concentration of Ni and V in crude oil makes their removal relevant. Two methods of crude oil demetallization were tested; the present research therefore offers a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and an electrochemical method. Ni extraction reached a maximum of approximately 55% using the electrochemical method in an electrolysis cell developed for this research. The cell consists of three sections: an oil and protonating agent (EtOH) solution between two conducting membranes, which separate it from two capsules of 10% sulfuric acid, with two graphite electrodes closing the electrical circuit over all three parts. Metal ions pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%, with the current in the range of 0.3 A to 0.4 A and the voltage changing from 12.8 V to 17.3 V. The maximum efficiency of deasphalting in a Soxhlet extractor, with cyclohexane as the solvent, was 66.4% for Ni and 51.2% for V. Applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry), and AAS (atomic absorption spectroscopy), these metal extraction methods were compared in this paper.
Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering
Procedia PDF Downloads 235
2589 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman
Abstract:
With the development of HyperSpectral Imagery (HSI) technology, the spectral resolution of HSI has become denser, resulting in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Semantic interpretation is therefore a challenging task for HSI analysis due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. In order to preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is first applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. From the TLPP transformation matrix, a weighted matrix is constructed to rank the different spectral bands based on their contribution scores, and the relevant bands are adaptively selected from the weighted matrix. The performance of the presented approach has been validated in several experiments, and the obtained results demonstrate its efficiency compared to various existing dimensionality reduction techniques. According to the experimental results, we can conclude that this approach adaptively selects the relevant spectral bands, improving the semantic interpretation of HSI.
Keywords: band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation
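The band-ranking step can be illustrated with a small sketch: score each band by the norm of its row in a weight matrix (a hypothetical stand-in for the TLPP-derived weighted matrix) and keep the top-k bands:

```python
import math

def rank_bands(weight_matrix, k):
    """Score each spectral band by the L2 norm of its row in the
    projection weight matrix, then keep the k highest-scoring bands
    (returned in band order)."""
    scores = [math.sqrt(sum(w * w for w in row)) for row in weight_matrix]
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(order[:k])

# Hypothetical 4-band weight matrix projecting onto 2 components.
W = [[0.1, 0.0],   # band 0: weak contribution
     [0.9, 0.4],   # band 1: strong
     [0.0, 0.05],  # band 2: negligible
     [0.5, 0.5]]   # band 3: moderate
selected = rank_bands(W, k=2)
```

The selected band indices then define the reduced cube handed to the interpretation stage.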
Procedia PDF Downloads 354
2588 Optimization of Titanium Leaching Process Using Experimental Design
Authors: Arash Rafiei, Carroll Moore
Abstract:
Leaching, as the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics, and fluid dynamics. Optimizing a leaching system by purely scientific methods therefore requires a great deal of time and expense. In this work, a mixture of two titanium ores and one titanium slag is used to extract titanium in the leaching stage of the TiO2 pigment production procedure. Optimum titanium extraction can be pursued with two strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process optimization framework. The first strategy treats the most important stage of the production process as the main stage and adapts the remaining stages to it. The second strategy optimizes the performance of more than one stage at once; it is more technically complex than the first but brings greater economic and technical advantages for the leaching system. Each strategy has its own optimum operational zone, which differs from the other's, and the best operational zone is chosen based on the complexity, economic, and practical aspects of the leaching system. The experimental design has been carried out using the Taguchi method. The most important advantages of this methodology are that it covers the different technical aspects of the leaching process, minimizes the number of needed experiments as well as time and expense, and accounts for parameter interactions following the principles of multi-factor-at-a-time optimization. Leaching tests were carried out at laboratory batch scale with appropriate temperature control, and the leaching tank geometry was treated as an important factor in providing comparable agitation conditions.
Data analysis has been done using reactor design and mass balancing principles. Finally, the optimum zones for the operational parameters are determined for each leaching strategy and discussed with respect to their economic and practical aspects.
Keywords: titanium leaching, optimization, experimental design, performance analysis
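A core Taguchi computation is the signal-to-noise ratio per run and the mean S/N per factor level; a minimal sketch with a hypothetical two-factor array (the factors, levels, and extraction percentages are invented for illustration, not the paper's data):

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-is-better signal-to-noise ratio (dB)."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

# Hypothetical L4-style array: two factors at two levels,
# y = titanium extraction (%) over replicate leach tests.
runs = [
    {"temp": 1, "acid": 1, "y": [62.0, 60.0]},
    {"temp": 1, "acid": 2, "y": [70.0, 72.0]},
    {"temp": 2, "acid": 1, "y": [55.0, 57.0]},
    {"temp": 2, "acid": 2, "y": [66.0, 64.0]},
]

def level_effect(factor, level):
    """Mean S/N ratio over all runs where `factor` is at `level`."""
    sns = [sn_larger_is_better(r["y"]) for r in runs if r[factor] == level]
    return sum(sns) / len(sns)

# The level with the highest mean S/N is kept for each factor.
best_acid = max((1, 2), key=lambda lv: level_effect("acid", lv))
```

Comparing level effects factor by factor is what lets the orthogonal array locate an optimum operational zone with only a handful of experiments.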
Procedia PDF Downloads 375
2587 A Single Feature Probability-Object Based Image Analysis for Assessing Urban Landcover Change: A Case Study of Muscat Governorate in Oman
Authors: Salim H. Al Salmani, Kevin Tansey, Mohammed S. Ozigis
Abstract:
The study of the growth of built-up areas and settlement expansion is a major exercise that city managers undertake to establish previous and current developmental trends, so as to ensure that settlement expansion needs are matched by appropriate levels of services and infrastructure. This research demonstrates the potential of satellite image processing, harnessing the single feature probability-object based image analysis (SFP-OBIA) technique to assess the urban growth dynamics of the Muscat Governorate in Oman for 1990, 2002, and 2013. This need is fueled by the continuous expansion of the Muscat Governorate beyond predicted levels of infrastructural provision. Landsat images of 1990, 2002, and 2013 were downloaded and preprocessed to ensure appropriate radiometric and geometric standards. A novel approach of probability filtering of the target feature segment was implemented to derive the spatial extent of the final built-up area of the Muscat Governorate for the three periods, and accuracy assessment results of 55%, 70%, and 71% were recorded for the urban landcover of 1990, 2002, and 2013, respectively. Furthermore, the Normalized Difference Built-Up Index (NDBI) of the various images was derived and used to consolidate the results of the SFP-OBIA through a linear regression model and visual comparison. The results showed various hotspots where urbanization has sporadically taken place. Specifically, settlements in the districts (Wilayat) of AL-Amarat, Muscat, and Qurayyat experienced tremendous change between 1990 and 2002, while the districts (Wilayat) of AL-Seeb, Bawshar, and Muttrah experienced more sporadic changes between 2002 and 2013.
Keywords: urban growth, single feature probability, object based image analysis, landcover change
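The NDBI used to consolidate the SFP-OBIA results is a simple band ratio between the shortwave-infrared and near-infrared reflectances; a sketch with hypothetical reflectance values (not measurements from the Muscat scenes):

```python
def ndbi(swir, nir):
    """Normalized Difference Built-up Index per pixel:
    positive values tend to indicate built-up surfaces,
    negative values vegetation."""
    return (swir - nir) / (swir + nir)

# Hypothetical surface reflectances (0-1 scale) for two pixels:
built_up = ndbi(swir=0.45, nir=0.30)   # expected positive
vegetated = ndbi(swir=0.20, nir=0.50)  # expected negative
```

Thresholding the index per pixel gives a quick built-up mask that can be regressed against the classified SFP-OBIA extents.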
Procedia PDF Downloads 275
2586 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. To assess its effectiveness, the proposed method was tested on the MNIST database, CENPARMI, CEDAR, and newly collected data. The method was applied to more than one lakh (100,000) digit images and obtained good comparative recognition results, achieving a recognition rate of about 97.32%.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
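One common family of distance measures on a thinned binary image is the border-to-stroke profile; the sketch below is a hypothetical simplification, not the paper's exact feature set, recording per row the distance from the left edge to the first stroke pixel:

```python
def left_profile(img):
    """For each row of a binary image, distance from the left edge
    to the first foreground pixel (-1 if the row is empty)."""
    profile = []
    for row in img:
        d = next((i for i, px in enumerate(row) if px), -1)
        profile.append(d)
    return profile

# Tiny 5x5 binary sketch of a thinned vertical stroke ("1"):
digit = [
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]
features = left_profile(digit)
```

Profiles from all four borders, concatenated, form a fixed-length feature vector that a nearest-neighbor or similar classifier can compare across digits.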
Procedia PDF Downloads 462
2585 On the Interactive Search with Web Documents
Authors: Mario Kubek, Herwig Unger
Abstract:
Due to the large amount of information in the World Wide Web (WWW, web) and the lengthy, usually linearly ordered result lists of web search engines, which do not indicate semantic relationships between their entries, the search for topically similar and related documents can become a tedious task. In particular, the process of formulating queries with proper terms representing specific information needs requires much effort from the user. This problem gets even bigger when the user's knowledge of a subject and its technical terms is not sufficient to do so. This article presents the new interactive search application DocAnalyser, which addresses this problem by enabling users to find similar and related web documents based on automatic query formulation and state-of-the-art search word extraction. Additionally, this tool can be used to track topics across semantically connected web documents.
Keywords: DocAnalyser, interactive web search, search word extraction, query formulation, source topic detection, topic tracking
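Automatic query formulation from a source document can, in its simplest form, be approximated by frequency-based search word extraction; this sketch is a hypothetical simplification, not DocAnalyser's actual algorithm:

```python
import re
from collections import Counter

# Minimal stopword list for the illustration.
STOPWORDS = {"the", "a", "of", "and", "to", "in", "for", "is", "on", "with"}

def extract_search_words(text, k=3):
    """Pick the k most frequent non-stopword terms as query candidates."""
    terms = [t for t in re.findall(r"[a-z]+", text.lower())
             if t not in STOPWORDS]
    return [t for t, _ in Counter(terms).most_common(k)]

doc = ("Search engines rank web documents. Related documents share "
       "terms, and similar documents share topics.")
query = extract_search_words(doc, k=2)
```

The extracted terms are then submitted as a query to retrieve topically similar and related documents.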
Procedia PDF Downloads 394
2584 The Use of a Rabbit Model to Evaluate the Influence of Age on Excision Wound Healing
Authors: S. Bilal, S. A. Bhat, I. Hussain, J. D. Parrah, S. P. Ahmad, M. R. Mir
Abstract:
Background: Wound healing involves a highly coordinated cascade of cellular and immunological responses over a period of time, including coagulation, inflammation, granulation tissue formation, epithelialization, collagen synthesis, and tissue remodeling. Wounds in aged subjects heal more slowly than those in younger ones, mainly because of the comorbidities that occur with age. The present study examines the influence of age on wound healing. Wounds of 1×1 cm² (100 mm²) were created on the backs of the animals, which were divided into two groups: one with animals aged 3-9 months and another with animals aged 15-21 months. Materials and Methods: 24 clinically healthy rabbits aged 3-21 months were used as experimental animals and divided into two groups, viz. A and B. All experimental procedures, i.e., the excision wound model, measurement of wound area, protein extraction and estimation, and DNA extraction and estimation, followed standard methods. Results: The parameters studied were wound contraction, hydroxyproline, glucosamine, protein, and DNA. A significant increase (p<0.005) in hydroxyproline, glucosamine, protein, and DNA and a significant decrease in wound area (p<0.005) were observed in the 3-9 month age group compared to the 15-21 month age group. Wound contraction together with the hydroxyproline, glucosamine, protein, and DNA estimations suggests that advanced age results in retarded wound healing. Conclusion: The decreased wound contraction and accumulation of hydroxyproline, glucosamine, protein, and DNA in group B animals may be associated with a reduction or delay in growth factors because of advancing age.
Keywords: age, wound healing, excision wound, hydroxyproline, glucosamine
Procedia PDF Downloads 660
2583 Automatic Landmark Selection Based on Feature Clustering for Visual Autonomous Unmanned Aerial Vehicle Navigation
Authors: Paulo Fernando Silva Filho, Elcio Hideiti Shiguemori
Abstract:
The selection of specific landmarks for an Unmanned Aerial Vehicle's visual navigation system based on Automatic Landmark Recognition has significant influence on the precision of the system's estimated position. At the same time, manual selection of the landmarks does not guarantee a high recognition rate, which would also result in poor precision. This work aims to develop an automatic landmark selection method that takes an image of the flight area and identifies the best landmarks to be recognized by the Visual Navigation Landmark Recognition System. The criterion for selecting a landmark is based on features detected by ORB or AKAZE and on edge information for each possible landmark. Results have shown that the disposition of suitable landmarks is quite different from human perception.
Keywords: clustering, edges, feature points, landmark selection, X-means
Procedia PDF Downloads 282
2582 Effect of Microwave Radiations on Natural Dyes’ Application on Cotton
Authors: Rafia Asghar, Abdul Hafeez
Abstract:
The current research concerned the extraction of natural dye from the powdered bark of Neem (Azadirachta indica) and the characterization of this dye under the influence of microwave radiation. Both the cotton fabric and the dyeing powder were exposed to microwave rays for different time intervals (2, 4, 6, 8, and 10 minutes) using a conventional oven. Aqueous, 60% methanol, and ethyl acetate extracts obtained from Neem bark were also exposed to the same intervals of microwave rays. Pre-, meta-, and post-mordanting with alum (2%, 4%, 6%, 8%, and 10%) was done to improve the color strength of the extracted dye. Exposure of the Neem bark extract and the cotton to microwave rays enhanced the extraction and dyeing processes by reducing the extraction time, dyeing time, and dyeing temperature. Microwave treatment had a very strong influence on the color fastness and color strength properties of cotton when the Neem bark was extracted for 30 minutes and the cotton was dyed with that extract for 75 minutes at 30°C. Among pre-, meta-, and post-mordanting, the results indicated that a 5% concentration of alum in meta-mordanting exhibited the maximum color strength.
Keywords: dyes, natural dyeing, ecofriendly dyes, microwave treatment
Procedia PDF Downloads 692
2581 Measuring How Brightness Mediates Auditory Salience
Authors: Baptiste Bouvier
Abstract:
While we are constantly flooded with stimuli in daily life, attention allows us to select the ones we specifically process and ignore the others. Some salient stimuli may sometimes pass this filter independently of our will, in a "bottom-up" way. The role of the acoustic properties of the timbre of a sound on its salience, i.e., its ability to capture the attention of a listener, is still not well understood. We implemented a paradigm called the "additional singleton paradigm", in which participants have to discriminate targets according to their duration. This task is perturbed (higher error rates and longer response times) by the presence of an irrelevant additional sound, of which we can manipulate a feature of our choice at equal loudness. This allows us to highlight the influence of the timbre features of a sound stimulus on its salience at equal loudness. We have shown that a stimulus that is brighter than the others but not louder leads to an attentional capture phenomenon in this framework. This work opens the door to the study of the influence of any timbre feature on salience.
Keywords: attention, audition, bottom-up attention, psychoacoustics, salience, timbre
Procedia PDF Downloads 171
2580 Optimizing Sustainable Graphene Production: Extraction of Graphite from Spent Primary and Secondary Batteries for Advanced Material Synthesis
Authors: Pratima Kumari, Sukha Ranjan Samadder
Abstract:
This research aims to contribute to the sustainable production of graphene by exploring the extraction of graphite from spent primary and secondary batteries. The increasing demand for graphene, a versatile and high-performance material, necessitates environmentally friendly methods for its synthesis. The process follows a well-planned methodology, beginning with the gathering and categorization of batteries, followed by disassembly and the careful removal of graphite from the anode structures. The use of environmentally friendly solvents and mechanical techniques ensures an efficient and eco-friendly extraction of graphite. Advanced approaches such as the modified Hummers' method and a chemical reduction process are utilized for the synthesis of graphene materials, with a focus on optimizing parameters. Various analytical techniques, such as Fourier-transform infrared spectroscopy, X-ray diffraction, scanning electron microscopy, thermogravimetric analysis, and Raman spectroscopy, were employed to validate the quality and structure of the produced graphene. The major findings reveal the successful implementation of the methodology, leading to the production of high-quality graphene suitable for advanced material applications, with thorough characterization validating its structural integrity and purity. The economic viability of the process is demonstrated through a comprehensive economic analysis, highlighting the potential for large-scale production. This research contributes to the field of sustainable graphene production by offering a systematic methodology that efficiently transforms spent batteries into valuable graphene resources.
Furthermore, the findings not only showcase the potential for upcycling electronic waste but also address the pressing need for environmentally conscious processes in advanced material synthesis.
Keywords: spent primary batteries, spent secondary batteries, graphite extraction, advanced material synthesis, circular economy approach
Procedia PDF Downloads 54
2579 Investigation and Optimization of DNA Isolation Efficiency Using Ferrite-Based Magnetic Nanoparticles
Authors: Tímea Gerzsenyi, Ágnes M. Ilosvai, László Vanyorek, Emma Szőri-Dorogházi
Abstract:
DNA isolation is a crucial step in many molecular biological applications for diagnostic and research purposes. However, traditional extraction requires toxic reagents, and commercially available kits are expensive, which has led to the recently widespread method of magnetic nanoparticle (MNP)-based DNA isolation. Different ferrite-containing MNPs were examined and compared in terms of their plasmid DNA isolation efficiency. Among the tested MNPs, one had never been used before for the extraction of plasmid molecules, marking a distinct application. The pDNA isolation process was optimized for each type of nanoparticle, and the best protocol was selected based on different criteria: DNA quantity, quality, and integrity. With the best-performing magnetic nanoparticle, which excelled in all aspects, further tests were performed to recover genomic DNA from bacterial cells, and a protocol was developed.
Keywords: DNA isolation, nanobiotechnology, magnetic nanoparticles, protocol optimization, pDNA, gDNA
Procedia PDF Downloads 16
2578 Fused Structure and Texture (FST) Features for Improved Pedestrian Detection
Authors: Hussin K. Ragb, Vijayan K. Asari
Abstract:
In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless quantity of phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The descriptor is formed by extracting the phase congruency and CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for the local regions of the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and low-resolution DaimlerChrysler datasets to evaluate the detection performance of a pedestrian detection system based on the FST descriptor, with a linear Support Vector Machine (SVM) used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
Keywords: pedestrian detection, phase congruency, local phase, LBP features, CSLBP features, FST descriptor
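The CSLBP operator compares the four center-symmetric neighbor pairs of each pixel instead of comparing all eight neighbors against the center, halving the code length to 4 bits. A minimal sketch for a single 3x3 patch; the pair ordering used below is one common convention and is an assumption here:

```python
def cslbp(patch, t=0.01):
    """Center-Symmetric LBP code for a 3x3 patch: bit i is set when
    the difference across center-symmetric pair i exceeds threshold
    t, yielding a 4-bit code in the range 0-15."""
    # Neighbors in circular order starting at the top-left corner.
    n = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
         patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for i in range(4):
        if n[i] - n[i + 4] > t:
            code |= 1 << i
    return code

# A patch with a bright top edge vs. a perfectly flat patch.
bright_patch = [[0.9, 0.9, 0.9],
                [0.1, 0.5, 0.1],
                [0.1, 0.1, 0.1]]
flat_patch = [[0.5, 0.5, 0.5] for _ in range(3)]
code_bright = cslbp(bright_patch)
code_flat = cslbp(flat_patch)
```

Histogramming such codes over local regions, alongside the oriented phase histograms, gives the concatenated FST descriptor described above.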
Procedia PDF Downloads 490
2577 A New Approach to Image Stitching of Radiographic Images
Authors: Somaya Adwan, Rasha Majed, Lamya'a Majed, Hamzah Arof
Abstract:
In order to produce images of whole body parts, X-rays of different portions of the body are assembled using image stitching methods. A new stitching method that jointly exploits a feature-based method and a direct method to identify and merge pairs of X-ray medical images is presented in this paper, and the performance of this hybrid approach is investigated. The ability of the proposed method to stitch and merge overlapping pairs of images is demonstrated. Our proposed method displays comparable, if not superior, performance to other feature-based methods reported in the literature on the standard databases. These results are promising and demonstrate the potential of the proposed method for further development to tackle more advanced stitching problems.
Keywords: image stitching, direct based method, panoramic image, X-ray
Procedia PDF Downloads 543
2576 Optoelectronic Hardware Architecture for Recurrent Learning Algorithm in Image Processing
Authors: Abdullah Bal, Sevdenur Bal
Abstract:
This paper proposes a new type of hardware application for training cellular neural networks (CNN) using an optical joint transform correlation (JTC) architecture for image feature extraction. CNNs require much more computation during the training stage than during testing. Since optoelectronic hardware offers parallel, high-speed processing capability for 2D data, the CNN training algorithm can be realized using Fourier optics techniques. JTC employs lenses and CCD cameras with a laser beam to realize 2D matrix multiplication and summation at the speed of light. Therefore, in each training iteration, JTC inherently carries the heavier computational burden, while the rest of the mathematical computation is realized digitally. Bipolar data are encoded by phase, and the summation of correlation operations is realized using multi-object input joint images. Overlapping properties of JTC are then utilized for the summation of two cross-correlations, which reduces the computation required for the training stage. Phase-only JTC does not require data rearrangement, electronic pre-calculation, or strict system alignment. The proposed system can be incorporated simultaneously with various optical image processing or optical pattern recognition techniques in the same optical system.
Keywords: CNN training, image processing, joint transform correlation, optoelectronic hardware
Procedia PDF Downloads 507
2575 Antibacterial and Antioxidant Properties of Total Phenolics from Waste Orange Peels
Authors: Kanika Kalra, Harmeet Kaur, Dinesh Goyal
Abstract:
Total phenolics were extracted from waste orange peels by solvent extraction and an alkali hydrolysis method. The most efficient solvents for extracting phenolic compounds from the waste biomass were, in order, methanol (60%) > dimethyl sulfoxide > ethanol (60%) > distilled water. The extraction yields were significantly impacted by the solvents (ethanol, methanol, and dimethyl sulfoxide) due to their varying polarities and concentrations. Extraction with 60% methanol yielded the highest phenolics, in terms of gallic acid equivalent (GAE) per gram of biomass, in orange peels. The alkali-hydrolyzed extract from orange peels contained 7.58±0.33 mg GAE g⁻¹. Using the solvent extraction technique, 60% methanol was comparatively the best-suited solvent for extracting polyphenolic compounds, giving a maximum yield of 4.68±0.47 mg GAE g⁻¹ in orange peel extracts. The DPPH radical scavenging activity and reducing power of the orange peel extracts were assessed: the 60% methanolic extract showed the highest antioxidant activity, 85.50±0.009% for DPPH, while the dimethyl sulfoxide (DMSO) extract gave the highest value, 1.75±0.01%, for the reducing power of the orange peel extract. Characterization of the polyphenolic compounds was done using Fourier transform infrared (FTIR) spectroscopy. The solvent and alkali-hydrolyzed extracts were evaluated for antibacterial activity using the agar well diffusion method against Gram-positive Bacillus subtilis MTCC441 and Gram-negative Escherichia coli MTCC729. The methanolic extract at 300 µl concentration showed an inhibition zone of around 16.33±0.47 mm against Bacillus subtilis, whereas for Escherichia coli it was comparatively smaller. A broth-based turbidimetric assay revealed the antibacterial effect of different volumes of orange peel extracts against both organisms.
Keywords: orange peels, total phenolic content, antioxidant, antibacterial
Procedia PDF Downloads 74
2574 Assessment of Forest Resource Exploitation in the Rural Communities of District Jhelum
Authors: Rubab Zafar Kahlon, Ibtisam Butt
Abstract:
Forest resources are deteriorating and declining around the globe due to unsustainable use and overexploitation. The present study was an attempt to determine the relationship between human activities, forest resource utilization, extraction methods, and practices of forest resource exploitation in the district Jhelum of Pakistan. For this purpose, primary data were collected from 8 villages through a structured questionnaire, tabulated in Microsoft Excel 365, and analyzed by multiple linear regression in SPSS 22. The results revealed that farming, wood cutting, animal husbandry, and agro-forestry were the major occupations in the study area. The most commonly used resources were timber (26%), fuelwood (25%), and fodder (19%). Methods used for resource extraction included gathering (49%), plucking (34%), trapping (11%), and cutting (6%). Population growth, increased demand for fuelwood, and land conversion were the main reasons behind forest degradation. The multiple linear regression revealed that forest-based activities, sources of energy production, methods used for wood harvesting and resource extraction, and use of fuelwood for energy production contributed significantly (p < 0.05) to extensive forest resource exploitation within the study area. The study suggests that the forest department should take effective measures to control the unsustainable use of forest resources through stringent management interventions and awareness campaigns in Jhelum district.
Keywords: forest resource, biodiversity, exploitation, human activities
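The study's regression was run in SPSS; as a stdlib-only sketch of the same ordinary-least-squares fit via the normal equations (the predictor/response values below are synthetic, not the survey data):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols_fit(X, y):
    """Ordinary least squares y = b0 + b1*x1 + b2*x2 via the normal equations X'X b = X'y."""
    Xd = [[1.0] + list(row) for row in X]  # prepend intercept column
    p = len(Xd[0])
    XtX = [[sum(r[i] * r[j] for r in Xd) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xd, y)) for i in range(p)]
    return solve(XtX, Xty)

# Synthetic data generated from y = 1 + 2*x1 + 3*x2 (illustrative only)
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]
y = [1, 3, 4, 6, 8, 9]
b0, b1, b2 = ols_fit(X, y)
```

Significance testing (the p-values SPSS reports) would additionally require the coefficient standard errors, omitted here for brevity.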
Procedia PDF Downloads 95
2573 Effect of Hemicellulase on Extraction of Essential Oil from Algerian Artemisia campestris
Authors: Khalida Boutemak, Nasssima Benali, Nadji Moulai-Mostefa
Abstract:
The effect of an enzyme on the yield and chemical composition of Artemisia campestris essential oil is reported in the present study. It was demonstrated that the enzyme facilitated the extraction of the essential oil, increasing the oil yield without causing any noticeable change in the flavour profile of the volatile oil. The essential oil was tested for antibacterial activity: Escherichia coli was extremely sensitive to the control, with the largest inhibition zone (29 mm), whereas Staphylococcus aureus was the most sensitive to the essential oil obtained with enzymatic pre-treatment, with the largest inhibition zone (25 mm). The antioxidant activity of the essential oil with hemicellulase pre-treatment (EO2) and of the control sample (EO1) was determined through reducing power. It was significantly lower than that of the standard drug (vitamin C), in this order: vitamin C > EO2 > EO1.
Keywords: Artemisia campestris, enzyme pre-treatment, hemicellulase, antibacterial activity, antioxidant activity
Procedia PDF Downloads 329
2572 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained on the normal samples of the relevant sensors, selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples, using feature extraction and a random forest classifier. Data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while data between June 2020 and May 2021 are used to assess it. Performance is assessed a posteriori through the F1-score, by comparing detected anomalies with the data center's history. With an F1-score of 83.60%, the proposed model outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences and flags an anomaly with a threshold on the reconstruction error, and which reaches only 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
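The residual-classification step described above can be sketched without any deep-learning dependency: given a sensor window and its autoencoder reconstruction, features of the difference signal feed a classifier. In this sketch a simple threshold stands in for the paper's random forest, and the reconstructions are hypothetical:

```python
from statistics import mean, stdev

def residual_features(window, reconstruction):
    """Features of the residual (input minus reconstruction) for a downstream classifier."""
    residual = [x - r for x, r in zip(window, reconstruction)]
    return {
        "mean": mean(residual),
        "std": stdev(residual),
        "max_abs": max(abs(e) for e in residual),
    }

def is_anomaly(features, max_abs_threshold=1.0):
    """Toy stand-in for the random forest: flag windows with large reconstruction errors."""
    return features["max_abs"] > max_abs_threshold

# Hypothetical temperature windows and their (imperfect) reconstructions:
# the autoencoder, trained on normal data, reconstructs normal windows well
# and abnormal ones poorly, so the residual separates the two cases.
normal = residual_features([21.0, 21.2, 21.1, 21.3], [21.1, 21.1, 21.2, 21.2])
faulty = residual_features([21.0, 24.5, 25.0, 24.8], [21.1, 21.2, 21.2, 21.3])
```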
Procedia PDF Downloads 194
2571 A Multi-Family Offline SPE LC-MS/MS Analytical Method for Anionic, Cationic and Non-ionic Surfactants in Surface Water
Authors: Laure Wiest, Barbara Giroud, Azziz Assoumani, Francois Lestremau, Emmanuelle Vulliet
Abstract:
Due to their production at high tonnages and their extensive use, surfactants are among the contaminants determined at the highest concentrations in wastewater. However, analytical methods and data regarding their occurrence in river water are scarce and concern only a few families, mainly anionic surfactants. The objective of this study was to develop an analytical method to extract and analyze a wide variety of surfactants in a minimum of steps, with a sensitivity compatible with the detection of ultra-traces in surface waters. 27 substances from 12 families of surfactants (anionic, cationic, and non-ionic) were selected for method optimization. Different retention mechanisms for solid phase extraction (SPE) were tested and compared in order to improve detection by liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The best results were finally obtained with a C18-grafted silica LC column and a polymer cartridge with hydrophilic-lipophilic balance (HLB), and the method developed allows the extraction of the three types of surfactants with satisfactory recoveries. The final analytical method comprises only one extraction and two LC injections. It was validated and applied to the quantification of surfactants in 36 river samples. The method's limits of quantification (LQ) and intra- and inter-day precision and accuracy were evaluated, and good performance was obtained for all 27 substances. As these compounds have many areas of application, contamination of instrument and method blanks was observed and taken into account in determining the LQs. Nevertheless, with LQs between 15 and 485 ng/L and accuracy over 80%, the method is suitable for monitoring surfactants in surface waters.
Application to French river samples revealed the presence of anionic, cationic and non-ionic surfactants, with median concentrations ranging from 24 ng/L for octylphenol ethoxylates (OPEO) to 4.6 µg/L for linear alkylbenzenesulfonates (LAS). The analytical method developed in this work will therefore be useful for future monitoring of surfactants in waters. Moreover, as this method shows good performance for anionic, non-ionic and cationic surfactants, it may easily be adapted to other surfactants.
Keywords: anionic surfactant, cationic surfactant, LC-MS/MS, non-ionic surfactant, SPE, surface water
Procedia PDF Downloads 146
2570 Thermodynamic Modelling of Liquid-Liquid Equilibria (LLE) in the Separation of p-Cresol from the Coal Tar by Solvent Extraction
Authors: D. S. Fardhyanti, Megawati, W. B. Sediawan
Abstract:
Coal tar is a liquid by-product of coal gasification and carbonation. This liquid oil mixture contains various useful compounds such as aromatic and phenolic compounds. These compounds are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates thermodynamic modelling of liquid-liquid equilibria (LLE) in the separation of phenol from coal tar by solvent extraction. The equilibria are modeled for the ternary components by the Wohl, Van Laar, and Three-Suffix Margules models. The values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model seems to be the best at predicting the LLE of p-Cresol mixtures for this system.
Keywords: coal tar, phenol, Wohl, Van Laar, Three-Suffix Margules
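For reference, in the binary (two-component) limit the Three-Suffix Margules model named above takes the standard textbook form below, where $A$ and $B$ are the parameters fitted to the LLE data (the ternary extension used for the coal-tar system adds cross terms):

```latex
\frac{G^{E}}{1} = x_{1}x_{2}\left[A + B\,(x_{1}-x_{2})\right]

RT\ln\gamma_{1} = (A + 3B)\,x_{2}^{2} - 4B\,x_{2}^{3}

RT\ln\gamma_{2} = (A - 3B)\,x_{1}^{2} + 4B\,x_{1}^{3}
```

The activity coefficients $\gamma_i$ from these expressions are what the curve-fitting matches against the experimental tie-line compositions; setting $B = 0$ recovers the one-parameter (two-suffix) Margules model.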
Procedia PDF Downloads 259
2569 Gas Chromatography Coupled to Tandem Mass Spectrometry and Liquid Chromatography Coupled to Tandem Mass Spectrometry Qualitative Determination of Pesticides Found in Tea Infusions
Authors: Mihai-Alexandru Florea, Veronica Drumea, Roxana Nita, Cerasela Gird, Laura Olariu
Abstract:
The aim of this study was to investigate pesticide residues found in tea water infusions. A multi-residue method for determining 147 pesticides was developed using the QuEChERS (Quick, Easy, Cheap, Effective, Rugged, Safe) procedure and dispersive solid phase extraction (d-SPE) for the cleanup of pesticides from complex matrices such as plants and tea. Sample preparation was carefully optimized for the efficient removal of co-extracted matrix components by testing several solvent systems. Determination of the pesticides was performed using GC-MS/MS (100 pesticides) and LC-MS/MS (47 pesticides). The selected reaction monitoring (SRM) mode was chosen to achieve low detection limits and high compound selectivity and sensitivity. Overall performance was evaluated and validated according to the DG-SANTE guidelines. To assess the (qualitative) transfer rate of pesticide residues from dried tea into infusions, the tea samples were spiked with a mixture of pesticides at the maximum residue levels accepted for teas and herbal infusions. To investigate the release of the pesticides into the tea preparations, the medicinal plants were prepared in four ways by varying the water temperature and the infusion time. The pesticides in the infusions were extracted using two methods: QuEChERS versus solid-phase extraction (SPE). More than 90% of the pesticides studied were identified in the infusions.
Keywords: tea, solid-phase extraction (SPE), selected reaction monitoring (SRM), QuEChERS
Procedia PDF Downloads 214
2568 Analysis Model for the Relationship of Users, Products, and Stores on Online Marketplace Based on Distributed Representation
Authors: Ke He, Wumaier Parezhati, Haruka Yamashita
Abstract:
Recently, online marketplaces in the e-commerce industry, such as Rakuten and Alibaba, have become some of the most popular shopping platforms in Asia. On these shopping websites, consumers can select and purchase products from a large number of stores. Additionally, consumers of an e-commerce site have to register their name, age, gender, and other information in advance to access their account. Therefore, a method for analyzing consumer preferences from both the store side and the product side is required. This study uses the Doc2Vec method, which has been studied in the field of natural language processing. Doc2Vec has been used in many document classification settings to extract semantic relationships between documents and words; in our setting, documents represent consumers and words represent products. This concept is applicable to representing the relationship between users and items; the problem, however, is that one more factor (i.e., shops) needs to be considered in Doc2Vec. More precisely, a method for analyzing the relationship between consumers, stores, and products is required. The purpose of our study is to combine the Doc2Vec analyses of users and shops, and of users and items, in the same feature space. This method enables the calculation of similar shops and items for each user. In this study, we analyze real data accumulated in an online marketplace and demonstrate the efficiency of the proposal.
Keywords: Doc2Vec, online marketplace, marketing, recommendation systems
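Once users, shops, and items are embedded in one shared feature space, the retrieval step described above reduces to nearest-neighbour search by cosine similarity. A stdlib-only sketch with made-up 3-dimensional vectors (a real system would learn these vectors with a Doc2Vec-style model, e.g. gensim's implementation):

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def most_similar(user_vec, candidates):
    """Rank shops or items (name -> vector) by similarity to a user vector."""
    return sorted(candidates, key=lambda name: cosine(user_vec, candidates[name]),
                  reverse=True)

# Made-up embeddings in a single shared space (illustrative only)
user = [0.9, 0.1, 0.0]
shops = {"shop_A": [0.8, 0.2, 0.1], "shop_B": [0.0, 0.9, 0.4]}
items = {"item_X": [0.7, 0.0, 0.1], "item_Y": [0.1, 0.8, 0.5]}
ranked_shops = most_similar(user, shops)
ranked_items = most_similar(user, items)
```

Because both shops and items live in the same space as users, the same function ranks either set for a given user, which is exactly the property the combined model is after.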
Procedia PDF Downloads 112
2567 Extraction of Scandium (Sc) from an Ore with Functionalized Nanoporous Silicon Adsorbent
Authors: Arezoo Rahmani, Rinez Thapa, Juha-Matti Aalto, Petri Turhanen, Jouko Vepsalainen, Vesa-Pekka Lehto, Joakim Riikonen
Abstract:
Production of scandium (Sc) is a complicated process because Sc is found only at low concentrations in ores, far below those of other metals. Typical extraction processes such as solvent extraction are therefore problematic for scandium. An adsorption/desorption method can be used instead, but it is challenging to prepare materials that combine good selectivity, high adsorption capacity, and high stability. Efficient and environmentally friendly methods for Sc extraction are therefore needed. In this study, a nanoporous composite material was developed for extracting Sc from an Sc ore. The nanoporous composite offers several advantageous properties, such as a large surface area, high chemical and mechanical stability, fast diffusion of metals in the material, and the possibility of constructing a filter out of the material with good flow-through properties. The nanoporous silicon material was produced by first stabilizing the surfaces with a silicon carbide layer and then functionalizing the surface with bisphosphonates that act as metal chelators. The surface area and porosity of the material were characterized by N₂ adsorption, and the morphology was studied by scanning electron microscopy (SEM). The bisphosphonate content of the material was determined by thermogravimetric analysis (TGA). The concentration of metal ions in the adsorption/desorption experiments was measured by inductively coupled plasma mass spectrometry (ICP-MS). The maximum capacity of the material, obtained from the adsorption isotherm, was 25 µmol/g Sc at pH 1 and 45 µmol/g Sc at pH 3. The selectivity of the material towards Sc in artificial solutions containing several metal ions was studied at pH 1 and pH 3. The results show good selectivity of the nanoporous composite towards the adsorption of Sc.
Scandium was adsorbed less efficiently from the solution leached from the Sc ore because of excessive amounts of iron (Fe), aluminum (Al), and titanium (Ti), which disturbed the adsorption process. For example, the concentration of Fe was more than 4500 ppm, while the concentration of Sc was only 3 ppm, approximately 1500 times lower. Precipitation methods were developed to lower the concentrations of the metals other than Sc. The optimal pH for precipitation was found to be 4: the concentrations of Fe, Al, and Ti were decreased by 99%, 70%, and 99.6%, respectively, while the concentration of Sc decreased by only 22%. Despite this large reduction in the concentrations of the other metals, more work is needed to further increase the relative concentration of Sc so that it can be efficiently extracted with the developed nanoporous composite material. Nevertheless, the developed material may provide an affordable, efficient and environmentally friendly method for extracting Sc on a large scale.
Keywords: adsorption, nanoporous silicon, ore solution, scandium
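Maximum capacities like the 25 and 45 µmol/g figures above are typically read off a fitted isotherm; a minimal sketch of the common Langmuir form q = q_max·K·C/(1 + K·C). The study does not report an affinity constant K, so the parameter values here are hypothetical, for illustration only:

```python
def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed amount q (µmol/g) at equilibrium concentration c."""
    return q_max * k * c / (1.0 + k * c)

# Hypothetical parameters (illustrative only): q_max = 45 µmol/g, K = 0.5 L/µmol
q_dilute = langmuir(0.1, q_max=45.0, k=0.5)      # dilute regime: q grows ~linearly with c
q_saturated = langmuir(100.0, q_max=45.0, k=0.5) # high c: q approaches the plateau q_max
```

Fitting q_max (and K) to measured (c, q) pairs, e.g. by least squares, is how a plateau capacity such as 45 µmol/g at pH 3 would be extracted from the isotherm data.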
Procedia PDF Downloads 146
2566 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
To address the low accuracy of current image generation algorithms in generating high-resolution archaeology drawings and their susceptibility to cultural relic diseases (surface deterioration), an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the backbone high-resolution image generation network, which enhances the line feature extraction capability and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, in which the semantic translation branch extracts semantic features from orthophotographs of cultural relics and the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed dataset of archaeology drawings of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
Keywords: archaeology drawings, digital heritage, image generation, deep learning
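Of the three evaluation metrics reported, peak signal-to-noise ratio is the simplest to compute; a stdlib-only sketch for 8-bit grayscale images given as flattened pixel lists (the pixel values are made up):

```python
from math import log10

def psnr(reference, generated, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-size 8-bit images."""
    mse = sum((a - b) ** 2 for a, b in zip(reference, generated)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * log10(max_val ** 2 / mse)

# Made-up 2x2 images (illustrative only); a higher PSNR means the generated
# drawing is closer, pixel for pixel, to the reference drawing.
ref = [52, 200, 131, 7]
gen = [50, 198, 133, 9]
score = psnr(ref, gen)
```

PA and SSIM would be computed separately; SSIM in particular compares local luminance, contrast, and structure rather than raw pixel error, which is why papers report it alongside PSNR.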
Procedia PDF Downloads 60