Search results for: mineral processing.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4448

3608 Epistemic Emotions during Cognitive Conflict: Associations with Metacognitive Feelings in High Conflict Scenarios

Authors: Katerina Nerantzaki, Panayiota Metallidou, Anastasia Efklides

Abstract:

The aim of the study was to investigate (a) changes in the intensity of various epistemic emotions during cognitive processing in a decision-making task and (b) their associations with metacognitive feelings of difficulty and confidence. One hundred and fifty-two undergraduate university students were individually asked to read, in the E-Prime environment, decision-making scenarios about moral dilemmas concerning self-driving cars, which differed in the level of conflict they produced, and then to make a choice between two options. Further, the participants were asked to rate on a four-point scale four epistemic emotions (surprise, curiosity, confusion, and wonder) and two metacognitive feelings (feeling of difficulty and feeling of confidence) after making their choice in each scenario. Changes in cognitive processing due to the level of conflict affected the intensity of the specific epistemic emotions differently. Further, epistemic emotions were interrelated with metacognitive feelings.

Keywords: confusion, curiosity, epistemic emotions, metacognitive experiences, surprise

Procedia PDF Downloads 79
3607 Standardized Testing of Filter Systems regarding Their Separation Efficiency in Terms of Allergenic Particles and Airborne Germs

Authors: Johannes Mertl

Abstract:

Our surrounding air contains various particles. Besides typical representatives of inorganic dust, such as soot and ash, particles originating from animals, microorganisms or plants also float through the air; these are the so-called bioaerosols. The group of bioaerosols consists of a broad spectrum of particles of different sizes, including fungi, bacteria, viruses, spores, and tree, flower and grass pollen, which are of high relevance for allergy sufferers. Depending on the environmental climate and the season, these allergenic particles can be found in enormous numbers in the air and are inhaled by humans via the respiratory tract, with a potential for inflammatory diseases of the airways, such as asthma or allergic rhinitis. As a consequence, air filter systems of ventilation and air conditioning devices are required to meet very high standards to prevent, or at least lower, the number of allergens and airborne germs entering the indoor air. Still, filter systems are merely classified by their separation rates using well-defined mineral test dust, while no appropriate, sufficiently standardized test methods for bioaerosols exist. However, separation rates determined for mineral test particles of a certain size cannot simply be transferred to bioaerosols, as the separation efficiency for particularly fine and respirable particles (< 10 microns) depends not only on their shape and particle diameter but is also defined by their density and physicochemical properties. For this reason, the OFI developed a test method which directly enables testing of filters and filter media for their separation rates of bioaerosols, as well as a classification of filters. Besides allergens from intact or fractured tree or grass pollen, allergenic proteins bound to particulates, as well as allergenic fungal spores (e.g. Cladosporium cladosporioides) or bacteria, can be used to classify filters regarding their separation rates. Allergens passing through the filter can then be detected by highly sensitive immunological assays (ELISA) or, in the case of fungal spores, by microbiological methods, which allow for the detection of even a single spore passing the filter. The test procedure, which is carried out at laboratory scale, was furthermore validated regarding its ability to cover real-life situations by upscaling to air conditioning devices, showing great conformity in terms of separation rates. Additionally, a clinical study with allergy sufferers was performed to verify the analytical results. Several different air conditioning filters from the car industry have been tested, showing significant differences in their separation rates.

Keywords: airborne germs, allergens, classification of filters, fine dust

Procedia PDF Downloads 252
3606 Research on Load Balancing Technology for Web Service Mobile Host

Authors: Yao Lu, Xiuguo Zhang, Zhiying Cao

Abstract:

In this paper, the load balancing idea is applied to the Web service mobile host. The main idea of load balancing is to establish a one-to-many mapping mechanism: an entrance maps a request to a plurality of processing nodes in order to divide and assign the processing. Because the mobile host is a resource-constrained environment, some Web services cannot be completed on the mobile host. When the mobile host's resources are not sufficient to complete a request, the load balancing scheduler divides the request into a plurality of sub-requests and transfers them to different auxiliary mobile hosts. The auxiliary mobile hosts execute the sub-requests, and the results are then returned to the mobile host. The service request integrator receives the results of the sub-requests from the auxiliary mobile hosts and integrates them. In the end, the complete response is returned to the client. Experimental results show that the technology adopted in this paper can complete such requests with higher efficiency.
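A minimal sketch of this request-splitting and result-integration flow is given below; the scheduler interface, the split granularity and the auxiliary-host call are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed interfaces): split a request that exceeds local capacity
# into sub-requests, dispatch them to auxiliary hosts in parallel, integrate results.
from concurrent.futures import ThreadPoolExecutor

def process_sub_request(host, payload_part):
    # Placeholder for the work an auxiliary mobile host would perform on its share.
    return f"{host}|{payload_part}"

def split(payload, n):
    # Divide the payload into at most n roughly equal sub-requests.
    size = -(-len(payload) // n)  # ceiling division
    return [payload[i:i + size] for i in range(0, len(payload), size)]

def integrate(parts):
    # Service request integrator: merge sub-results into the complete response.
    return "".join(p.split("|", 1)[1] for p in parts)

def handle_request(request, auxiliary_hosts, local_capacity):
    if request["cost"] <= local_capacity:          # enough resources: serve locally
        return process_sub_request("local", request["payload"])
    sub_requests = split(request["payload"], len(auxiliary_hosts))
    with ThreadPoolExecutor(max_workers=len(auxiliary_hosts)) as pool:
        results = list(pool.map(process_sub_request, auxiliary_hosts, sub_requests))
    return integrate(results)

print(handle_request({"cost": 10, "payload": "ABCDEFGH"}, ["hostA", "hostB"], local_capacity=4))
```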

Keywords: Dinic, load balancing, mobile host, web service

Procedia PDF Downloads 328
3605 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and the Gaussian Mixture Model (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Also, investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initialization of the GMM, for estimating the underlying parameters in the EM step, improved the convergence rate and system performance. The system also uses a relative index as a confidence measure in case the GMM and VQ classifiers contradict each other in the identification process. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
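A minimal sketch of the MFCC + GMM modeling stage described above is given below, using librosa and scikit-learn rather than the paper's MATLAB implementation; file names and model parameters are illustrative assumptions.

```python
# Sketch: MFCC feature extraction, GMM (EM) training per speaker, log-likelihood scoring.
import librosa
import numpy as np
from sklearn.mixture import GaussianMixture

def extract_mfcc(wav_path, n_mfcc=13):
    # Load the utterance and compute frame-wise MFCC features (frames x coefficients).
    signal, sr = librosa.load(wav_path, sr=None)
    return librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc).T

def train_speaker_model(wav_path, n_components=16):
    # Fit a diagonal-covariance GMM to the speaker's MFCC frames via EM.
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    return gmm.fit(extract_mfcc(wav_path))

def identify(wav_path, speaker_models):
    # Score the test utterance against every speaker model; pick the best log-likelihood.
    features = extract_mfcc(wav_path)
    return max(speaker_models, key=lambda name: speaker_models[name].score(features))

# Illustrative usage with hypothetical file names:
# models = {"alice": train_speaker_model("alice_train.wav"),
#           "bob": train_speaker_model("bob_train.wav")}
# print(identify("unknown.wav", models))
```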

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 309
3604 Recovery of Heavy Metals by Ion Exchange on the Zeolite Materials

Authors: K. Menad, A. Faddeg

Abstract:

Zeolites are a family of mineral compounds with special properties that have led to several important industrial applications. Ion exchange enabled their first industrial application, in the field of water treatment. Exchange via the aqueous pathway is the method most used for such microporous materials, and this technique is used in this work. The objective of this work is to find high-performance materials for the recovery of heavy metals such as cadmium. The study compares the ion exchange properties of the zeolites Na-X and Na-A, their physical mixture, and the composite A (LTA) / X (FAU). After the synthesis of the various X and A zeolites, a core-shell model was designed to form a composite of zeolite A on zeolite X. Finally, ion exchange studies were performed on these zeolite materials. Cation exchange was tested exclusively with cadmium, a toxic element that is harmful to health and the environment.

Keywords: zeolite A, zeolite X, ion exchange, water treatment

Procedia PDF Downloads 431
3603 Emotional Analysis for Text Search Queries on Internet

Authors: Gemma García López

Abstract:

The goal of this study is to analyze whether search queries carried out in search engines such as Google can offer emotional information about the user who performs them. Knowing the emotional state of the Internet user can be a key to achieving maximum personalization of content and the detection of worrying behaviors. For this, two studies were carried out using tools with advanced natural language processing techniques. The first study determines whether a query can be classified as positive, negative or neutral, while the second study extracts emotional content from words and applies the categorical and dimensional models for the representation of emotions. In addition, we use search queries in Spanish and English to establish similarities and differences between the two languages. The results revealed that text search queries performed by users on the Internet can be classified emotionally. This allows us to better understand the emotional state of the user at the time of the search, which could involve adapting the technology and personalizing the responses to different emotional states.
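As a minimal illustration of the first study's positive/negative/neutral classification (not the authors' tool chain), an off-the-shelf lexicon-based analyzer such as NLTK's VADER can score short English queries; the thresholds below are common defaults used here as assumptions.

```python
# Sketch: three-class sentiment labeling of a search query with NLTK's VADER analyzer.
from nltk.sentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()  # requires a one-time nltk.download("vader_lexicon")

def classify_query(query):
    # Compound score in [-1, 1]; cut-offs of +/-0.05 are the commonly used defaults.
    score = analyzer.polarity_scores(query)["compound"]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

print(classify_query("how to fix a broken phone screen"))
```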

Keywords: emotion classification, text search queries, emotional analysis, sentiment analysis in text, natural language processing

Procedia PDF Downloads 141
3602 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 1: Overview and Activities in Chemical Processing Facility

Authors: Kazunori Nomura, Hiromichi Ogi, Masaumi Nakahara, Sou Watanabe, Atsuhiro Shibata

Abstract:

The Chemical Processing Facility of the Japan Atomic Energy Agency is a basic research field for advanced back-end technology developments using actual high-level radioactive materials such as irradiated fuels from the fast reactor and high-level liquid waste from the reprocessing plant. In the nature of a research facility, various kinds of chemical reagents have been used for fundamental tests. Most of them were treated properly and stored in the liquid waste vessel installed in the facility, but some were not treated and remained in the experimental space as a kind of legacy waste. This waste must be treated safely. Meanwhile, we formulated the Medium- and Long-Term Management Plan of Japan Atomic Energy Agency Facilities. This comprehensive plan considers the Chemical Processing Facility as one of the facilities to be decommissioned. Even if the plan is executed, treatment of the legacy waste beforehand is a necessary step for the decommissioning operation. Under this circumstance, we launched a collaborative research project called the STRAD project, which stands for Systematic Treatment of Radioactive liquid waste for Decommissioning, in order to develop treatment processes for wastes of the nuclear research facility. In this project, decomposition methods for chemicals causing troublesome phenomena such as corrosion and explosion have been developed, and there is a prospect of their decomposition in the facility by simple methods. Solidification of aqueous or organic liquid wastes after decomposition has also been studied by adding cement or coagulants. Furthermore, we treated experimental tools of various materials, making an effort to stabilize and compact them before packaging into the waste container. This is expected to decrease the number of solid-waste transports and widen the operating space. Some achievements of these studies are presented in this paper. The project is expected to contribute beneficial waste management outcomes that can be shared worldwide.

Keywords: chemical processing facility, medium- and long-term management plan of JAEA facilities, STRAD project, treatment of radioactive waste

Procedia PDF Downloads 142
3601 Exploiting Fast Independent Component Analysis Based Algorithm for Equalization of Impaired Baseband Received Signal

Authors: Muhammad Umair, Syed Qasim Gilani

Abstract:

A technique using Independent Component Analysis (ICA) for blind receiver signal processing is investigated. The problem of receiver signal processing is viewed as one of signal equalization and compensation of implementation imperfections. Based on this, a model similar to a general ICA problem is developed for the received signal. Then, the use of the ICA technique for blind signal equalization in the time domain is presented. The equalization is regarded as a signal separation problem, since the desired signal is separated from interference terms. This problem is addressed in the paper by over-sampling of the received signal. By using ICA for equalization, besides channel equalization, other transmission imperfections such as direct current (DC) bias offset, carrier phase and in-phase/quadrature-phase imbalance are also corrected. Simulation results for a system using 16-Quadrature Amplitude Modulation (QAM) are presented to show the performance of the proposed scheme.
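A minimal sketch of the separation idea, assuming scikit-learn's FastICA and a toy two-observation mixing model; the over-sampling model, signal constellation and noise levels are illustrative, not the paper's exact setup.

```python
# Sketch: FastICA separates a desired symbol stream from an interferer and DC offset,
# given two "virtual" observations of the received signal.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples = 4000
desired = np.sign(rng.standard_normal(n_samples))          # BPSK-like symbol stream (toy)
interference = 0.7 * np.sin(0.05 * np.arange(n_samples))   # narrowband interferer
dc_offset = 0.2

# Two observations standing in for over-sampled / delayed samples of the same signal.
x1 = desired + interference + dc_offset + 0.05 * rng.standard_normal(n_samples)
x2 = 0.8 * desired - 0.5 * interference + dc_offset + 0.05 * rng.standard_normal(n_samples)
X = np.column_stack([x1, x2])

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)   # columns approximate the desired signal and the interference
```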

Keywords: blind equalization, blind signal separation, equalization, independent component analysis, transmission impairments, QAM receiver

Procedia PDF Downloads 214
3600 Linear Frequency Modulation-Frequency Shift Keying Radar with Compressive Sensing

Authors: Ho Jeong Jin, Chang Won Seo, Choon Sik Cho, Bong Yong Choi, Kwang Kyun Na, Sang Rok Lee

Abstract:

In this paper, a radar signal processing technique using LFM-FSK (Linear Frequency Modulation-Frequency Shift Keying) is proposed for reducing the false alarm rate based on compressive sensing. The LFM-FSK method combines an FMCW (Frequency Modulation Continuous Wave) signal with FSK (Frequency Shift Keying). This approach has the advantage of suppressing the ghost phenomenon without a complicated CFAR (Constant False Alarm Rate) algorithm. Moreover, a parametric sparse algorithm applying compressive sensing, which restores signals efficiently from incomplete data samples, is also integrated, reducing the ADC burden in the radar receiver. A 24 GHz FMCW signal with FSK-modulated data is applied and tested in a real environment to verify the proposed algorithm along with compressive sensing.
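The sparse-recovery step can be illustrated with a generic compressive-sensing example; the sketch below uses scikit-learn's orthogonal matching pursuit with a random measurement matrix and a sparsity level chosen purely for illustration, not the paper's parametric sparse algorithm.

```python
# Sketch: recover a sparse range profile from incomplete measurements with OMP.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n_features, n_measurements, sparsity = 256, 64, 3

# Sparse "scene": a few strong reflectors at random positions.
x_true = np.zeros(n_features)
x_true[rng.choice(n_features, sparsity, replace=False)] = rng.standard_normal(sparsity)

# Random measurement matrix standing in for the incomplete sampling of the receiver.
A = rng.standard_normal((n_measurements, n_features)) / np.sqrt(n_measurements)
y = A @ x_true + 0.01 * rng.standard_normal(n_measurements)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity)
omp.fit(A, y)
x_hat = omp.coef_   # sparse estimate of the target scene
```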

Keywords: compressive sensing, LFM-FSK radar, radar signal processing, sparse algorithm

Procedia PDF Downloads 481
3599 Strategic Metals and Rare Earth Elements Exploration of Lithium Cesium Tantalum Type Pegmatites: A Case Study from Northwest Himalayas

Authors: Auzair Mehmood, Mohammad Arif

Abstract:

The LCT (Li, Cs and Ta rich)-type pegmatites, genetically related to peraluminous S-type granites, are being mined for strategic metals (SMs) and rare earth elements (REEs) around the world. This study investigates the SM and REE potential of pegmatites that are spatially associated with an S-type granitic suite of the Himalayan sequence, specifically the Mansehra Granitic Complex (MGC), northwest Pakistan. Geochemical signatures of the pegmatites and some of their mineral extracts were analyzed using the Inductively Coupled Plasma Mass Spectrometry (ICP-MS) technique to explore and generate potential prospects (if any) for SMs and REEs. In general, the REE patterns of the studied whole-rock pegmatite samples show a tetrad effect and possess low total REE abundances, strong positive Europium (Eu) anomalies, weak negative Cesium (Cs) anomalies and relative enrichment in heavy REE. Similar features have been observed in the REE patterns of the feldspar extracts. However, the REE patterns of the muscovite extracts reflect preferential enrichment and possess negative Eu anomalies. The trace element evaluation further suggests that the MGC pegmatites have undergone low levels of fractionation. Various trace element concentrations (and their ratios), including Ta versus Cs, K/Rb (Potassium/Rubidium) versus Rb and Th/U (Thorium/Uranium) versus K/Cs, were used to analyze the economically viable mineral potential of the studied rocks. On most of the plots, concentrations fall below the dividing line and indicate either barren or low-level mineralization potential of the studied rocks for both SMs and REEs. The results demonstrate the paucity of the MGC pegmatites with respect to Ta-Nb (Tantalum-Niobium) mineralization, which is in sharp contrast to many Pan-African S-type granites around the world. The MGC pegmatites are classified as muscovite pegmatites based on their K/Rb versus Cs relationship. This classification is consistent with the occurrence of rare accessory minerals like garnet, biotite, tourmaline, and beryl. Furthermore, the classification corroborates an earlier sorting of the MGC pegmatites into muscovite-bearing, biotite-bearing, and subordinate muscovite-biotite types. These types of pegmatites lack any significant SM and REE mineralization potential. Field relations, such as close spatial association with the parent granitic rocks and the absence of internal zonation structure, also reflect the barren character and hence the lack of any potential prospects of the MGC pegmatites.

Keywords: exploration, fractionation, Himalayas, pegmatites, rare earth elements

Procedia PDF Downloads 203
3598 Mineralogical Characteristics of Phosphates from the Djebel Onk Deposits: Treatment and Valorization of Co-Products

Authors: Samira Tlili, Amina Grairia, Sihem Benayache, Saida Bouyegh, Sabrina Ladjama, Abdelmoumen Guedri

Abstract:

Phosphorites from the Djebel Onk deposit (Tebessa, Algeria) contain 50-52 wt.% CaO and a P₂O₅ level of ≥ 30.20 wt.%. The microstructure, revealed using a scanning electron microscope (SEM), consists of phosphate granules with an ovular form. In this investigation, we have identified phosphate with varying particle sizes using mineralogical methods. The phosphogypsum formed by the mineralization of natural phosphate has also been discovered. This co-product was formed during the sulfuric acid attack on natural phosphates. This study demonstrated the effectiveness of the thermoanalytical technique of differential scanning calorimetry (DSC), X-ray diffraction, and EDS/SEM analysis. FTIR analyses also validated the identification of mineral phases through the observation of bands from structural phosphate groups.

Keywords: phosphate, Djebel Onk deposit, mineralogy, valorization, phosphogypsum

Procedia PDF Downloads 22
3597 In Situ Volume Imaging of Cleared Mice Seminiferous Tubules Opens New Window to Study Spermatogenic Process in 3D

Authors: Lukas Ded

Abstract:

Studying tissue structure and histogenesis in the natural, 3D context is a challenging but highly beneficial process. Contrary to the classical approach of physical tissue sectioning and subsequent imaging, it enables studying the relationships of individual cellular and histological structures in their native context. Recent developments in tissue clearing approaches and microscopic volume imaging/data processing enable the application of these methods also in the areas of developmental and reproductive biology. Here, using the CLARITY tissue procedure and 3D confocal volume imaging, we optimized the protocol for clearing, staining and imaging of mouse seminiferous tubules isolated from the testes without a cardiac perfusion procedure. Our approach enables high-magnification and fine-resolution axial imaging of the whole diameter of the seminiferous tubules with potentially unlimited lateral imaging length. Hence, large continuous pieces of the seminiferous tubule can be scanned and digitally reconstructed for the study of single-tubule seminiferous stages using nuclear dyes. Furthermore, antibodies and various molecular dyes can be applied for molecular labeling of individual cellular and subcellular structures, and the resulting 3D images can greatly increase our understanding of the spatiotemporal aspects of seminiferous tubule development and sperm ultrastructure formation. Finally, our newly developed algorithms for 3D data processing enable massively parallel processing of a large number of individual cell and tissue fluorescent signatures and the building of robust spermatogenic models under physiological and pathological conditions.

Keywords: CLARITY, spermatogenesis, testis, tissue clearing, volume imaging

Procedia PDF Downloads 136
3596 Mobile Microscope for the Detection of Pathogenic Cells Using Image Processing

Authors: P. S. Surya Meghana, K. Lingeshwaran, C. Kannan, V. Raghavendran, C. Priya

Abstract:

One of the most basic and powerful tools in all of science and medicine is the light microscope, the fundamental device for laboratory as well as research purposes. With improving technology, portable, economical and user-friendly instruments are in high demand. The conventional microscope fails to live up to this emerging trend. Also, adequate access to healthcare is not widely available, especially in developing countries. The most basic step towards curing a malady is the diagnosis of the disease itself. The main aim of this paper is to diagnose malaria with the most common device, the cell phone, which proves to be the immediate solution for most modern-day needs given the development of wireless infrastructure that allows computing and communicating on the move. This opens up the opportunity to develop novel imaging, sensing, and diagnostics platforms using mobile phones as an underlying platform to address the global demand for accurate, sensitive, cost-effective, and field-portable measurement devices for use in remote and resource-limited settings around the world.

Keywords: cellular, hand-held, health care, image processing, malarial parasites, microscope

Procedia PDF Downloads 267
3595 Numerical Investigation of the Effects of Surfactant Concentrations on the Dynamics of Liquid-Liquid Interfaces

Authors: Bamikole J. Adeyemi, Prashant Jadhawar, Lateef Akanji

Abstract:

Theoretically, there exist two mathematical interfaces (fluid-solid and fluid-fluid) when a liquid film is present on solid surfaces. These interfaces overlap if the mineral surface is oil-wet or mixed wet, and therefore, the effects of disjoining pressure are significant on both boundaries. Hence, dewetting is a necessary process that could detach oil from the mineral surface. However, if the thickness of the thin water film directly in contact with the surface is large enough, disjoining pressure can be thought to be zero at the liquid-liquid interface. Recent studies show that the integration of fluid-fluid interactions with fluid-rock interactions is an important step towards a holistic approach to understanding smart water effects. Experiments have shown that the brine solution can alter the micro forces at oil-water interfaces, and these ion-specific interactions lead to oil emulsion formation. The natural emulsifiers present in crude oil behave as polyelectrolytes when the oil interfaces with low salinity water. Wettability alteration caused by low salinity waterflooding during Enhanced Oil Recovery (EOR) process results from the activities of divalent ions. However, polyelectrolytes are said to lose their viscoelastic property with increasing cation concentrations. In this work, the influence of cation concentrations on the dynamics of viscoelastic liquid-liquid interfaces is numerically investigated. The resultant ion concentrations at the crude oil/brine interfaces were estimated using a surface complexation model. Subsequently, the ion concentration parameter is integrated into a mathematical model to describe its effects on the dynamics of a viscoelastic interfacial thin film. The film growth, stability, and rupture were measured after different time steps for three types of fluids (Newtonian, purely elastic and viscoelastic fluids). The interfacial films respond to exposure time in a similar manner with an increasing growth rate, which resulted in the formation of more droplets with time. Increased surfactant accumulation at the interface results in a higher film growth rate which leads to instability and subsequent formation of more satellite droplets. Purely elastic and viscoelastic properties limit film growth rate and consequent film stability compared to the Newtonian fluid. Therefore, low salinity and reduced concentration of the potential determining ions in injection water will lead to improved interfacial viscoelasticity.

Keywords: liquid-liquid interfaces, surfactant concentrations, potential determining ions, residual oil mobilization

Procedia PDF Downloads 143
3594 FT-NIR Method to Determine Moisture in Gluten Free Rice-Based Pasta during Drying

Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra

Abstract:

Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control of pasta during large-scale production. A rapid Fourier transform near-infrared (FT-NIR) method was developed for determining moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured by an FT-NIR analyzer in the 4,000-12,000 cm-1 spectral range. Calibration and validation sets were designed for the conception and evaluation of the method's adequacy in the moisture content range of 10 to 15 percent (w.b.) of the pasta. The prediction models, based on partial least squares (PLS) regression, were developed in the near-infrared. Conventional criteria such as R2, the root mean square error of cross validation (RMSECV), the root mean square error of estimation (RMSEE) and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization and multiplicative scatter correction). Spectra of pasta samples were treated with different mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method had a very good correlation with the values determined via traditional methods (R2 = 0.983), which clearly indicates that FT-NIR methods could be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing (R2 = 0.9775), and the MMN pre-processing method was found most suitable, with a maximum coefficient of determination (R2) value of 0.9875 obtained for the calibration model developed.
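A minimal sketch of the calibration step, assuming scikit-learn in place of the authors' chemometrics software: min-max normalization of the spectra followed by PLS regression against reference moisture values; the data arrays and the number of latent variables are illustrative.

```python
# Sketch: min-max normalization of NIR spectra + PLS regression for moisture prediction.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import minmax_scale

# spectra: (n_samples, n_wavenumbers) absorbance matrix; moisture: reference values (% w.b.)
spectra = np.random.rand(150, 1200)          # stand-in for the measured calibration spectra
moisture = 10 + 5 * np.random.rand(150)      # stand-in for the reference moisture contents

X = minmax_scale(spectra, axis=1)            # min-max normalization of each spectrum
pls = PLSRegression(n_components=8)          # number of PLS factors, normally chosen by cross-validation
pls.fit(X, moisture)

predicted = pls.predict(minmax_scale(spectra[:5], axis=1)).ravel()  # moisture estimates for new samples
```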

Keywords: FT-NIR, pasta, moisture determination, food engineering

Procedia PDF Downloads 258
3593 Investigating the Effect of Orthographic Transparency on Phonological Awareness in Bilingual Children with Dyslexia

Authors: Sruthi Raveendran

Abstract:

Developmental dyslexia, characterized by reading difficulties despite normal intelligence, presents a significant challenge for bilingual children navigating languages with varying degrees of orthographic transparency. This study bridges a critical gap in dyslexia interventions for bilingual populations in India by examining how consistency and predictability of letter-sound relationships in a writing system (orthographic transparency) influence the ability to understand and manipulate the building blocks of sound in language (phonological processing). The study employed a computerized visual rhyme-judgment task with concurrent EEG (electroencephalogram) recording. The task compared reaction times, accuracy of performance, and event-related potential (ERP) components (N170, N400, and LPC) for rhyming and non-rhyming stimuli in two orthographies: English (opaque orthography) and Kannada (transparent orthography). As hypothesized, the results revealed advantages in phonological processing tasks for transparent orthography (Kannada). Children with dyslexia were faster and more accurate when judging rhymes in Kannada compared to English. This suggests that a language with consistent letter-sound relationships (transparent orthography) facilitates processing, especially for tasks that involve manipulating sounds within words (rhyming). Furthermore, brain activity measured by event-related potentials (ERP) showed less effort required for processing words in Kannada, as reflected by smaller N170, N400, and LPC amplitudes. These findings highlight the crucial role of orthographic transparency in optimizing reading performance for bilingual children with dyslexia. These findings emphasize the need for language-specific intervention strategies that consider the unique linguistic characteristics of each language. While acknowledging the complexity of factors influencing dyslexia, this research contributes valuable insights into the impact of orthographic transparency on phonological awareness in bilingual children. This knowledge paves the way for developing tailored interventions that promote linguistic inclusivity and optimize literacy outcomes for children with dyslexia.

Keywords: developmental dyslexia, phonological awareness, rhyme judgment, orthographic transparency, Kannada, English, N170, N400, LPC

Procedia PDF Downloads 7
3592 Filtering and Reconstruction System for Grey-Level Forensic Images

Authors: Ahd Aljarf, Saad Amin

Abstract:

Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and having noise added to them, either after alteration or when the image was taken initially; therefore, having a high-performance image processing system and its implementation is very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, harmed or even lost because of noise. For example, sending the image through a wireless channel can cause loss of bits. These types of errors can degrade the visual display quality of the forensic images. Two image problems are covered: noise and lost blocks. Information that gets transmitted through any means of communication may suffer alteration from its original state or even lose important data due to channel noise. Therefore, a developed system is introduced to improve the quality and clarity of the forensic images.
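As a minimal illustration of the noise-filtering stage (not necessarily the filter developed in the paper), a median filter is a common choice for impulse-type noise on a grey-level image:

```python
# Sketch: median filtering of a grey-level forensic image to suppress impulse noise.
import numpy as np
from scipy.ndimage import median_filter

def denoise_grey_image(image, kernel_size=3):
    # image: 2-D uint8 array (grey-level forensic image); kernel size is an assumption.
    return median_filter(image, size=kernel_size)

noisy = np.random.randint(0, 256, (64, 64)).astype(np.uint8)  # stand-in for a noisy image
cleaned = denoise_grey_image(noisy)
```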

Keywords: image filtering, image reconstruction, image processing, forensic images

Procedia PDF Downloads 365
3591 A Query Optimization Strategy for Autonomous Distributed Database Systems

Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam

Abstract:

A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing uses a communication network for transmitting data between sites and is one of the challenges in the database world. The development of sophisticated query optimization technology is the reason for the commercial success of database systems; its complexity and cost increase with an increasing number of relations in the query. Mariposa, query trading and query trading with processing task-trading are strategies developed for autonomous distributed database systems, but they cause high optimization cost because of the involvement of all nodes in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT that makes the seller nodes with the lowest cost gradually receive higher priorities, in order to reduce the optimization time. We implement our proposed strategy and present the results and analysis based on those results.

Keywords: autonomous strategies, distributed database systems, high priority, query optimization

Procedia PDF Downloads 523
3590 Investigation of the Unbiased Characteristic of Doppler Frequency to Different Antenna Array Geometries

Authors: Somayeh Komeylian

Abstract:

Array signal processing techniques have recently been developed in a variety of applications for enhancing receiver performance by restraining the power of jamming and interference signals. In this scenario, biases induced in the antenna array receiver significantly degrade the accurate estimation of the carrier phase. Since the carrier phase is obtained by integrating the frequency, we have obtained the unbiased Doppler frequency for high-precision estimation of the carrier phase. The unbiased characteristic of the Doppler frequency with respect to jamming power and other interference signals allows achieving a highly accurate estimation of the carrier phase. In this study, we have rigorously investigated the unbiased characteristic of the Doppler frequency with respect to variations in the antenna array geometry. The simulation results verify that the Doppler frequency remains unbiased and accurate under variations of the antenna array geometry.

Keywords: array signal processing, unbiased doppler frequency, GNSS, carrier phase, and slowly fluctuating point target

Procedia PDF Downloads 159
3589 Preparation on Sentimental Analysis on Social Media Comments with Bidirectional Long Short-Term Memory Gated Recurrent Unit and Model Glove in Portuguese

Authors: Leonardo Alfredo Mendoza, Cristian Munoz, Marco Aurelio Pacheco, Manoela Kohler, Evelyn Batista, Rodrigo Moura

Abstract:

Natural Language Processing (NLP) techniques are increasingly able to interpret the feelings and reactions of a person to a product or service. Sentiment analysis has become a fundamental tool for this interpretation but has few applications in languages other than English. This paper presents a sentiment analysis classification for Portuguese, based on comments from social networks in Portuguese. A word-embedding representation was used with a 50-dimension GloVe pre-trained model, generated from a corpus completely in Portuguese. To generate this classification, bidirectional long short-term memory (BI-LSTM) and bidirectional Gated Recurrent Unit (GRU) models are used, reaching results of 99.1%.
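A minimal sketch of the BI-LSTM/GRU classifier described above, built with Keras; a randomly initialized 50-dimension embedding layer stands in for the Portuguese GloVe vectors, and loading the actual pre-trained embedding matrix and tokenized comments is assumed rather than shown.

```python
# Sketch: embedding -> bidirectional LSTM -> bidirectional GRU -> binary sentiment output.
import tensorflow as tf

vocab_size, embedding_dim, max_len = 20000, 50, 100   # illustrative sizes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,), dtype="int32"),               # padded token-id sequences
    tf.keras.layers.Embedding(vocab_size, embedding_dim),          # would hold the GloVe weights
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),                # positive vs. negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```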

Keywords: natural language processing, sentiment analysis, bidirectional long short-term memory, BI-LSTM, gated recurrent unit, GRU

Procedia PDF Downloads 159
3588 Thermal Image Segmentation Method for Stratification of Freezing Temperatures

Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka

Abstract:

The study uses an image analysis technique employing thermal imaging to measure the percentage of areas with various temperatures on a freezing surface. An image segmentation method using threshold values is applied to a sequence of images recording the freezing process. The phenomenon is transient, and temperatures vary quickly as the surface reaches the freezing point and completes the freezing process. Freezing salt water is subject to salt rejection, which makes the freezing point dynamic and dependent on the salinity at the phase interface. For a specific freezing area, nucleation starts from one side and ends at the other, which causes a dynamic and transient temperature in that area. Thermal cameras are able to reveal differences in temperature due to their sensitivity to infrared radiance. Using the experimental setup, a video is recorded by a thermal camera to monitor radiance and temperatures during the freezing process. Image processing techniques are applied to all frames to detect and classify temperatures on the surface. An image segmentation method is used to find contours with the same temperature on the icing surface. Each segment is obtained using the temperature range that appears in the image and the corresponding pixel values. Using the contours extracted from the image and the camera parameters, stratified areas with different temperatures are calculated. To observe temperature contours on the icing surface using the thermal camera, a salt water sample is dropped on a cold surface with a temperature of -20°C. A thermal video is recorded for 2 minutes to observe the temperature field. Examining the results obtained by the method against the experimental observations verifies the accuracy and applicability of the method.
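A minimal sketch of the threshold-based stratification step: pixels of a calibrated thermal frame are binned into temperature ranges and the area fraction of each range is computed; the pixel-to-temperature calibration and the bin edges are assumptions for illustration.

```python
# Sketch: stratify a thermal frame into temperature ranges and compute area fractions.
import numpy as np

def stratify_temperatures(temperature_map, bin_edges):
    # temperature_map: 2-D array of per-pixel temperatures (deg C) from the thermal camera.
    total = temperature_map.size
    fractions = {}
    for low, high in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (temperature_map >= low) & (temperature_map < high)
        fractions[(low, high)] = mask.sum() / total   # area fraction within this range
    return fractions

frame = np.random.uniform(-20, 5, (240, 320))          # stand-in for a calibrated thermal frame
print(stratify_temperatures(frame, bin_edges=[-20, -15, -10, -5, 0, 5]))
```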

Keywords: ice contour boundary, image processing, image segmentation, salt ice, thermal image

Procedia PDF Downloads 320
3587 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery

Authors: Evans Belly, Imdad Rizvi, M. M. Kadam

Abstract:

Satellite imagery is one of the emerging technologies extensively utilized in various applications such as detection/extraction of man-made structures, monitoring of sensitive areas, creating graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building and non-building regions (roads, vegetation, etc.) are investigated, with building extraction as the main focus. Once all the landscape regions are collected, a trimming process is performed to eliminate regions that arise from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are those extracted from sensors with a resolution of less than 1 meter (VHR). This method provides an efficient way to produce good results. The additional overhead of intermediate processing is eliminated without compromising output quality, easing the processing steps required and the time consumed.
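A minimal sketch of the final labeling step, assuming SciPy's connected-component labeling: after shadow and vegetation masking, candidate regions are labeled and small non-building segments are trimmed away by area; the binary mask and the area threshold are illustrative assumptions.

```python
# Sketch: connected-component labeling of building candidates with area-based trimming.
import numpy as np
from scipy import ndimage

def label_buildings(candidate_mask, min_area=50):
    labels, n = ndimage.label(candidate_mask)                       # connected-component labeling
    areas = ndimage.sum(candidate_mask, labels, index=range(1, n + 1))
    keep = {i + 1 for i, a in enumerate(areas) if a >= min_area}    # trim small non-building blobs
    return np.where(np.isin(labels, list(keep)), labels, 0)

mask = np.random.rand(128, 128) > 0.7    # stand-in for the building/non-building mask
building_labels = label_buildings(mask)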

Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery

Procedia PDF Downloads 314
3586 Evaluation of Mechanical Properties and Analysis of Rapidly Heat Treated M-42 High Speed Steel

Authors: R. N. Karthik Babu, R. Sarvesh, A. Rajendra Prasad, G. Swaminathan

Abstract:

M42 is a molybdenum-series high-speed alloy steel widely used because of its better hot hardness and wear resistance. These steels are conventionally heat treated in a salt bath furnace with up to three stages of preheating and predetermined soaking and holding periods. Such methods often involve long processing periods and a large amount of energy consumed. In this study, M42 steel samples were heat treated by rapidly heating the specimens to the austenising temperature of 1260 °C and cooling them conventionally by quenching in a neutral salt bath at a temperature of 550 °C, with the aid of a hybrid microwave furnace. As metals reflect microwaves, they cannot be heated directly when placed in a microwave furnace. The technology used herein requires the specimens to be placed in a crucible lined with SiC, which is a good absorber of microwaves; the SiC lining heats the metal through radiation, which facilitates volumetric heating of the metal. A sample of similar dimensions was heat treated conventionally and cooled in the same manner. A conventional tempering process was then carried out on both samples, which were analysed for various parameters such as micro-hardness, processing time, etc. Microstructure analysis and scanning electron microscopy were also carried out. The objective of the study is to show that similar or better properties, with substantial time and energy savings and cost cutting, are achievable by rapid heat treatment in hybrid microwave furnaces. It is observed that the heat treatment is done with substantial time and energy savings, and also with a slight improvement in the mechanical properties of the heat-treated tool steel.

Keywords: rapid heating, heat treatment, metal processing, microwave heating

Procedia PDF Downloads 286
3585 Wavelet Coefficients Based on Orthogonal Matching Pursuit (OMP) Based Filtering for Remotely Sensed Images

Authors: Ramandeep Kaur, Kamaljit Kaur

Abstract:

In recent years, remote sensing technology has been growing rapidly. Image enhancement is one of the most commonly used image processing operations. Noise reduction plays a very important role in digital image processing, and various techniques have been put forward to reduce the noise of remote sensing images. Noise reduction using wavelet coefficients based on Orthogonal Matching Pursuit (OMP) has less impact on the edges than available methods, but it is not as well established as dedicated edge-preservation techniques. So, in this paper, we provide a new technique, minimum-patch-based noise reduction OMP, which reduces the noise in an image and uses an edge-preservation patch that preserves the edges of the image, presenting superior results to the existing OMP technique. Experimental results show that the proposed minimum patch approach outperforms existing techniques.
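As a minimal illustration of wavelet-domain noise reduction for a remote-sensing image (the general idea only, not the proposed minimum-patch OMP method), the image can be decomposed, its detail coefficients thresholded, and the result reconstructed with PyWavelets:

```python
# Sketch: 2-D wavelet decomposition, soft-thresholding of detail sub-bands, reconstruction.
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db2", level=2, threshold=10.0):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Keep the approximation, soft-threshold every detail sub-band.
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, threshold, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

noisy = np.random.rand(128, 128) * 255   # stand-in for a noisy remote-sensing image
clean = wavelet_denoise(noisy)
```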

Keywords: image denoising, minimum patch, OMP, WCOMP

Procedia PDF Downloads 389
3584 Performance-Based Quality Evaluation of Database Conceptual Schemas

Authors: Janusz Getta, Zhaoxi Pan

Abstract:

Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. It is evident that different conceptual schemas provide different logical schemas, and the performance of user applications strongly depends on the logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. First, we show the format. Then, the paper proposes a new specification of object algebra for the representation of conceptual-level database applications. Transformation of conceptual schemas and expressions of object algebra into an implementation schema, and implementation in a particular database system, allow for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. Then we describe an experiment as a proof of concept for the evaluation procedure presented in the paper.

Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing

Procedia PDF Downloads 292
3583 Bacterial Recovery of Copper Ores

Authors: Zh. Karaulova, D. Baizhigitov

Abstract:

At the Aktogay deposit, the oxidized ore section has been developed since 2015; by now, the reserves of easily enriched ore are decreasing, and a large amount of copper-poor, difficult-to-enrich ore has accumulated in the dumps of the KAZ Minerals Aktogay deposit, which is unprofitable to mine using traditional mining methods. Hence, another technology needs to be implemented, which will significantly expand the raw material base of copper production in Kazakhstan and ensure the efficient use of natural resources. Heap and dump bacterial recovery are the most acceptable technologies for processing low-grade secondary copper sulfide ores. The test objects were copper ores of the Aktogay deposit and the chemolithotrophic bacteria Leptospirillum ferrooxidans (L.f.), Acidithiobacillus caldus (A.c.) and Sulfobacillus acidophilus (S.a.), which were used as mixed cultures in bacterial oxidation systems. They can stay active in the 20-40°C temperature range. These bacteria are the most extensively studied and widely used in sulfide mineral recovery technology. Biocatalytic acceleration was achieved as a result of the bacteria oxidizing iron sulfides to form iron sulfate, which subsequently underwent chemical oxidation to the ferric sulfate form. The following results have been achieved at the initial stage: the goal was to grow and maintain the activity of the bacterial cultures under laboratory conditions. These bacteria grew best within the pH 1.2-1.8 range, with light stirring and in an aerated environment. The optimal growth temperature was 30-33°C. The growth rate decreased by one-half for each 4-5°C fall in temperature from 30°C. At best, the number of bacteria doubled every 24 hours. Typically, the maximum concentration of cells that can be grown in a ferrous solution is about 10⁷/ml. A further step researched in this case was the adaptation of the microorganisms to the environment of certain metals. This was followed by mass production of the inoculum and maintenance for further cultivation on a factory scale. This was done by adding sulfide concentrate, allowing the bacteria to convert the ferrous sulfate as indicated by the Eh (>600 mV), then diluting to double the volume and adding concentrate to achieve the same metal level. This process was repeated until the desired metal level and volumes were achieved. The final stage of bacterial recovery was the transportation and irrigation of the secondary sulfide copper ores of the oxidized ore section. In conclusion, the project was implemented at the Aktogay mine, since the bioleaching process is prolonged. Besides, the method of bacterial recovery might compete well with existing non-biological methods of extraction of metals from ores.

Keywords: bacterial recovery, copper ore, bioleaching, bacterial inoculum

Procedia PDF Downloads 74
3582 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm

Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang

Abstract:

Aiming at the problem that the target signal is difficult to detect in a strong ground clutter environment, this paper proposes a clutter suppression algorithm based on the combination of singular value decomposition and the Mallat fast wavelet algorithm. The method first carries out singular value decomposition on the radar echo data matrix and realizes the initial separation of target and clutter through threshold processing of the singular values. It then carries out wavelet decomposition on the echo data to find the target location and adopts a discard method to select the appropriate decomposition layers to reconstruct the target signal, which ensures minimum loss of target information while suppressing the clutter. Verification on measured data shows that the method has a significant effect on target extraction under low SCR and that target reconstruction can be realized without prior position information of the target; the method also provides a certain enhancement of the output SCR compared with the traditional single-wavelet processing method.
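A minimal sketch of the first stage, assuming NumPy: singular value decomposition of the echo data matrix, with the largest singular values (dominant ground clutter) zeroed before reconstruction; the number of clutter components and the toy echo matrix are illustrative assumptions.

```python
# Sketch: SVD-based clutter suppression of a pulses x range-bins echo data matrix.
import numpy as np

def suppress_clutter(echo_matrix, n_clutter=1):
    U, s, Vh = np.linalg.svd(echo_matrix, full_matrices=False)
    s[:n_clutter] = 0.0            # discard the strongest (clutter-dominated) components
    return (U * s) @ Vh            # reconstruct the echo data with clutter suppressed

pulses, range_bins = 64, 256
# Toy echo: weak noise-like returns plus a strong rank-1 ground-clutter component.
echo = np.random.randn(pulses, range_bins) + 10 * np.outer(np.ones(pulses), np.random.randn(range_bins))
filtered = suppress_clutter(echo, n_clutter=1)
```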

Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR

Procedia PDF Downloads 118
3581 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems

Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran

Abstract:

Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm that is robust against illumination changes and noise, based on an adaptive Gaussian mixture model (GMM), and provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) is not significant. On the contrary, this method is based on forming a meticulous background image and exploiting it for separating moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or is detected in real time by forming a mathematical or exponential average of successive images. The proposed scheme can offer low image degradation. The simulation results demonstrate a high degree of performance for the proposed method.
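As a minimal illustration of adaptive-GMM foreground detection (using OpenCV's MOG2 implementation rather than the exact model proposed in the paper; the video file name is an assumption):

```python
# Sketch: per-pixel adaptive mixture-of-Gaussians background subtraction on a video stream.
import cv2

cap = cv2.VideoCapture("surveillance.avi")   # illustrative file name
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    foreground_mask = subtractor.apply(frame)   # per-pixel foreground/background decision
    # further processing: morphological cleaning, blob extraction, tracking, etc.
cap.release()
```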

Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model

Procedia PDF Downloads 516
3580 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity

Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle

Abstract:

Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.

Keywords: complex-valued signal processing, synthetic aperture radar, 2-D radar imaging, compressive sensing, sparse Bayesian learning

Procedia PDF Downloads 131
3580 Graphical User Interface for Presenting Matlab Work for Reduction of Chromatic Dispersion Using Digital Signal Processing for Optical Communication

Authors: Muhammad Faiz Liew Abdullah, Bhagwan Das, Nor Shahida, Abdul Fattah Chandio

Abstract:

This study presents the design features of a Graphical User Interface (GUI) for chromatic dispersion (CD) reduction using digital signal processing (DSP) techniques. The GUI is specially designed for the Windows platform. The simulation results obtained from MATLAB are presented via this GUI. After importing results from MATLAB into the GUI, it will present the work on any Windows 7 and later platform without MATLAB software. The first part of the GUI contains the research methodology block diagram, and in the second part, the output of each stage is shown in a separate area reserved for result display. Each stage of the methodology has captions to display the results. This GUI will be very helpful during presentations: instead of making slides, it will present all the work easily in the absence of other software such as MATLAB, LabVIEW or MS PowerPoint. The GUI is designed using C programming in MS Visual Studio.

Keywords: Matlab simulation results, C programming, MS Visual Studio, chromatic dispersion

Procedia PDF Downloads 462