Search results for: analog signal processing
3971 Characterization of Volatiles of Botrytis cinerea in Blueberry Using Solid Phase Micro Extraction and Gas Chromatography Mass Spectrometry
Authors: Ahmed Auda, Manjree Agarwal, Giles Hardy, Yonglin Ren
Abstract:
Botrytis cinerea is a major pathogen of many plants. It can attack a wide range of plant parts, including buds, flowers, leaves, stems, and fruit. However, B. cinerea can be confused with other diseases that cause the same damage. There are many species of Botrytis, and more than one strain of each. Botrytis may infect the foliage of nursery stock stored through winter in damp conditions. There are no known resistant plants. Botrytis must have a nutrient or food source before it infests the plant; nutrients leaking from wounded plant parts or from dying tissue, such as old flower petals, provide what is required. From this food base, the fungus becomes more aggressive and invades healthy tissue, and a dark to light brown rot forms in the diseased tissue. High humidity favors the growth of this fungus. We suppose, however, that selection pressure can act on the morphological and neurophysiological filter properties of the receiver and on both the biochemical and the physiological regulation of the signal. Communication is implied when signal and receiver evolve toward more and more specific matching. On the other hand, receivers respond to portions of a body odor bouquet that is released into the environment not as an (intentional) signal but as an unavoidable consequence of metabolic activity or tissue damage. Each year, Botrytis species cause considerable economic losses to plant crops. Even with the application of strict quarantine and control measures, these fungi can still find their way into crops and lead to the imposition of onerous restrictions on exports. Blueberry fruit mould caused by fungal infection usually results in major losses during post-harvest storage. Therefore, management of infection in the early stages of disease development is necessary to minimize losses. The overall purpose of this study is to develop sensitive, cheap, quick, and robust diagnostic techniques for the detection of B. cinerea in blueberry.
The specific aim was to investigate the performance of volatile organic compounds (VOCs) in the detection and discrimination of blueberry fruits infected by fungal pathogens, with an emphasis on Botrytis in the early post-harvest storage stage.
Keywords: botrytis cinerea, blueberry, GC/MS, VOCs
Procedia PDF Downloads 241
3970 An Analysis of the Relations between Aggregates’ Shape and Mechanical Properties throughout the Railway Ballast Service Life
Authors: Daianne Fernandes Diogenes
Abstract:
Railway ballast aggregates’ shape properties and size distribution can be directly affected by several factors, such as traffic, fouling, and maintenance processes, which cause breakage and wearing, leading to the accumulation of fine particles through the ballast layer. This research aims to analyze the influence of traffic, the tamping process, and sleepers’ stiffness on aggregates' shape and mechanical properties, by using traditional and digital image processing (DIP) techniques and cyclic tests, such as resilient modulus (RM) and permanent deformation (PD). Aggregates were collected in different phases of the railway service life: (i) right after the crushing process; (ii) after construction, for the aggregates positioned below the sleepers; and (iii) after 5 years of operation. An increase in the percentage of cubic particles was observed for materials (ii) and (iii), providing a better interlocking, increasing stiffness, and reducing axial deformation after 5 years of service when compared to the initial conditions.
Keywords: digital image processing, mechanical behavior, railway ballast, shape properties
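The cubic-particle percentage reported above can be sketched in a few lines. Note that the 3:1 elongation/flakiness threshold and the particle dimensions below are illustrative assumptions for the sketch, not values taken from the study:

```python
def classify_particle(length, width, thickness):
    """Classify an aggregate particle as 'cubic' or 'non-cubic' from its three
    principal dimensions (any consistent unit).

    A particle is treated as cubic when neither the elongation ratio
    (length/width) nor the flakiness ratio (width/thickness) exceeds a chosen
    threshold; the 3:1 threshold used here is an illustrative assumption.
    """
    elongation = length / width
    flakiness = width / thickness
    return "cubic" if elongation < 3.0 and flakiness < 3.0 else "non-cubic"

def percent_cubic(particles):
    """Percentage of cubic particles in a list of (L, W, T) tuples."""
    cubic = sum(1 for p in particles if classify_particle(*p) == "cubic")
    return 100.0 * cubic / len(particles)
```

Comparing this percentage across the three sampling phases (after crushing, after construction, after 5 years) quantifies the shape change described in the abstract.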
Procedia PDF Downloads 122
3969 Digital Transformation and Environmental Disclosure in Industrial Firms: The Moderating Role of the Top Management Team
Authors: Yongxin Chen, Min Zhang
Abstract:
As industrial enterprises are the primary source of national pollution, environmental information disclosure is a crucial way for them to demonstrate to stakeholders the work they have done in fulfilling their environmental responsibilities and accepting social supervision. In the era of the digital economy, many companies, actively embracing the opportunities that come with digital transformation, have begun to apply digital technology to information collection and disclosure within the enterprise. However, little is known about the relationship between digital transformation and environmental disclosure. Drawing on information processing theory, this study investigates how enterprise digital transformation affects environmental disclosure in 643 Chinese industrial companies. Intriguingly, the depth (size) and breadth (diversity) of environmental disclosure increase linearly with the growth of collection, processing, and analytical capabilities during digital transformation, whereas the volume of data grows exponentially, leading to a marginal increase in the economic and environmental costs of utilizing, storing, and managing data. In our empirical findings, linearly increasing benefits and marginal costs create a distinctive inverted U-shaped relationship between the degree of digital transformation and environmental disclosure in the Chinese industrial sector. In addition, based on upper echelons theory, we propose that a top management team with high stability and managerial capability will invest more effort and expense in improving environmental disclosure quality, lowering the carbon footprint caused by digital technology, and maintaining data security. In both contexts, the increasing marginal cost curves become steeper, weakening the inverted U-shaped slope between digital transformation and environmental disclosure.
Keywords: digital transformation, environmental disclosure, the top management team, information processing theory, upper echelon theory
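An inverted U-shaped relationship of this kind is typically tested by fitting a quadratic term and checking its sign. A minimal sketch on synthetic data (the data and the peak location are illustrative assumptions, not the study's estimates):

```python
import numpy as np

def fit_inverted_u(x, y):
    """Fit y = b2*x^2 + b1*x + b0 and report whether the relationship is an
    inverted U (negative quadratic coefficient) plus its turning point."""
    b2, b1, b0 = np.polyfit(x, y, 2)
    turning_point = -b1 / (2.0 * b2)
    return b2 < 0, turning_point

# Synthetic illustration: disclosure first rises, then falls, with the
# degree of digital transformation; peak placed at x = 0.6 by construction.
x = np.linspace(0.0, 1.0, 50)
y = -4.0 * (x - 0.6) ** 2 + 2.0
is_inverted_u, peak = fit_inverted_u(x, y)
```

A moderation effect such as the one attributed to the top management team would show up as a change in the recovered `b2` (a steeper or flatter curve) across subsamples.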
Procedia PDF Downloads 142
3968 Review on Wear Behavior of Magnesium Matrix Composites
Authors: Amandeep Singh, Niraj Bala
Abstract:
In recent decades, lightweight materials such as magnesium matrix composites have become a hot topic in materials research due to their excellent mechanical and physical properties. However, relatively little work has been done on the wear behavior of these composites. Magnesium matrix composites have wide applications in the automobile and aerospace sectors. In this review, an attempt has been made to collect the literature related to the wear behavior of magnesium matrix composites fabricated through various processing techniques such as stir casting, powder metallurgy, and friction stir processing. The effects of different reinforcements, reinforcement content, reinforcement size, wear load, sliding speed, and time have been studied in detail by different researchers, and the wear mechanisms under different experimental conditions have been reviewed in detail. The wear resistance of magnesium and its alloys can be enhanced by the addition of different reinforcements, and further enhanced by increasing the percentage of added reinforcement. An increase in applied load during wear testing leads to an increase in the wear rate of magnesium composites.
Keywords: hardness, magnesium matrix composites, reinforcement, wear
Procedia PDF Downloads 332
3967 Artificial Cells Capable of Communication by Using Polymer Hydrogel
Authors: Qi Liu, Jiqin Yao, Xiaohu Zhou, Bo Zheng
Abstract:
The first artificial cell was produced by Thomas Chang in the 1950s, when he was trying to make a mimic of red blood cells. Since then, many different types of artificial cells have been constructed using one of two approaches: a so-called bottom-up approach, which aims to create a cell from scratch, and a top-down approach, in which genes are sequentially knocked out from organisms until only the minimal genome required for sustaining life remains. In this project, the bottom-up approach was used to build a new cell-free expression system that mimics an artificial cell capable of protein expression and of communicating with other cells. The artificial cells constructed from the bottom-up approach are usually lipid vesicles, polymersomes, hydrogels, or aqueous droplets containing the nucleic acids and transcription-translation machinery. However, lipid vesicle based artificial cells capable of communication present several issues in cell communication research: (1) lipid vesicles normally lose important functions such as protein expression within a few hours; (2) the lipid membrane allows the permeation of only small molecules and limits the types of molecules that can be sensed and released to the surrounding environment for chemical communication; (3) lipid vesicles are prone to rupture due to imbalance of the osmotic pressure. To address these issues, hydrogel-based artificial cells were constructed in this work. To construct the artificial cell, polyacrylamide hydrogel was functionalized with an Acrylate PEG Succinimidyl Carboxymethyl Ester (ACLT-PEG2000-SCM) moiety on the polymer backbone. Proteinaceous factors can then be immobilized on the polymer backbone by the reaction between the primary amines of proteins and the N-hydroxysuccinimide esters (NHS esters) of ACLT-PEG2000-SCM, while the plasmid template and ribosomes were encapsulated inside the hydrogel particles.
Because the artificial cell can continuously express protein given a supply of nutrients and energy, artificial cell-artificial cell and artificial cell-natural cell communication could be achieved by combining the artificial cell vector with designed plasmids. The plasmids were designed with reference to the quorum sensing (QS) system of bacteria, which relies largely on cognate acyl-homoserine lactone (AHL) / transcription factor pairs. In one communication pair, the “sender” is the artificial or natural cell that produces the AHL signal molecule by synthesizing the corresponding signal synthase, which catalyzes the conversion of S-adenosyl-L-methionine (SAM) into AHL, while the “receiver” is the artificial or natural cell that senses the quorum sensing signaling molecule from the “sender” and in turn expresses the gene of interest. In the experiment, GFP was first immobilized inside the hydrogel particle to prove that the functionalized hydrogel particles could be used for protein binding. After that, successful artificial cell-artificial cell and artificial cell-natural cell communication was demonstrated by recording the increase in fluorescence signal. The hydrogel-based artificial cell designed in this work can help in the study of the complex communication systems of bacteria, and it can be further developed for therapeutic applications.
Keywords: artificial cell, cell-free system, gene circuit, synthetic biology
Procedia PDF Downloads 152
3966 Direct Conversion of Crude Oils into Petrochemicals under High Severity Conditions
Authors: Anaam H. Al-ShaikhAli, Mansour A. Al-Herz
Abstract:
The research leverages the proven HS-FCC technology to directly crack crude oils into petrochemical building blocks. Crude oils were subjected to an optimized hydro-processing step in which metal contaminants and sulfur were reduced to a level acceptable for feeding the crudes into the HS-FCC process. The hydro-processing is carried out in a fixed-bed reactor composed of three layers of catalysts: the crude oil passes through a demetallization catalyst, followed by a desulfurization catalyst, and finally a de-aromatization catalyst. The hydro-processing was conducted at an optimized liquid hourly space velocity (LHSV), temperature, and pressure for optimal reduction of metals and sulfur from the crudes. The hydro-processed crudes were then fed into a micro activity testing (MAT) unit to simulate the HS-FCC technology. The catalytic cracking of crude oils was conducted over tailored catalyst formulations under an optimized catalyst/oil ratio and cracking temperature for optimal production of total light olefins.
Keywords: petrochemical, catalytic cracking, catalyst synthesis, HS-FCC technology
Procedia PDF Downloads 93
3965 ZigBee Wireless Sensor Nodes with Hybrid Energy Storage System Based on Li-Ion Battery and Solar Energy Supply
Authors: Chia-Chi Chang, Chuan-Bi Lin, Chia-Min Chan
Abstract:
Most ZigBee sensor networks to date make use of nodes with limited processing, communication, and energy capabilities. Energy consumption is of great importance in wireless sensor applications, as their nodes are commonly battery-driven. Once ZigBee nodes are deployed outdoors, limited power may render a sensor network useless before its purpose is complete. At present, there are two strategies for long node and network lifetime. The first is to save as much energy as possible: energy consumption is minimized by switching nodes from active mode to sleep mode and by using routing protocols with ultra-low energy consumption. The second is to evaluate the energy consumption of sensor applications as accurately as possible, since an erroneous energy model may render a ZigBee sensor network useless before the batteries are changed. In this paper, we present a ZigBee wireless sensor node with four key modules: a processing and radio unit, an energy harvesting unit, an energy storage unit, and a sensor unit. The processing unit uses a CC2530 for controlling the sensor, carrying out the routing protocol, and performing wireless communication with other nodes. The harvesting unit uses a 2 W solar panel to provide lasting energy for the node. The storage unit consists of a rechargeable 1200 mAh Li-ion battery and a battery charger using a constant-current/constant-voltage algorithm. Our solution to extend node lifetime was implemented, and a long-term sensor network test was used to exhibit the functionality of the solar powered system.
Keywords: ZigBee, Li-ion battery, solar panel, CC2530
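The constant-current/constant-voltage charging algorithm mentioned above can be sketched as a simple two-phase simulation. All cell parameters here (linear voltage model, taper rate, cutoff) are illustrative assumptions for the sketch, not measurements of the actual 1200 mAh pack or the paper's charger:

```python
def cc_cv_charge(capacity_mah, i_cc_ma=600.0, v_start=3.0, v_max=4.2,
                 i_cutoff_ma=60.0, dt_h=0.01):
    """Sketch of a constant-current/constant-voltage Li-ion charge cycle.

    CC phase: a fixed current is applied while the cell voltage (modeled here
    as rising linearly, hitting v_max at ~80% charge) is below the limit.
    CV phase: the voltage is held at v_max while the current tapers
    exponentially down to the cutoff.
    """
    charged_mah = 0.0
    v = v_start
    while v < v_max:                      # CC phase: fixed current
        charged_mah += i_cc_ma * dt_h
        v = v_start + (v_max - v_start) * charged_mah / (0.8 * capacity_mah)
    i = i_cc_ma
    while i > i_cutoff_ma:                # CV phase: current taper
        charged_mah += i * dt_h
        i *= 0.9                          # illustrative taper rate
    return charged_mah, v_max

total_mah, v_end = cc_cv_charge(1200.0)
```

The two-phase structure is the essential point: the CC phase delivers the bulk of the charge quickly, while the CV phase tops the cell off without exceeding the safe voltage limit.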
Procedia PDF Downloads 374
3964 Peak Frequencies in the Collective Membrane Potential of a Hindmarsh-Rose Small-World Neural Network
Authors: Sun Zhe, Ruggero Micheletto
Abstract:
As discussed extensively in many studies, noise in neural networks plays an important role in the functioning and time evolution of the system. The mechanism by which noise induces stochastic resonance, enhancing and influencing certain operations, is not clarified, nor is the mechanism of information storage and coding. With the present research we study the role of noise, focusing especially on the frequency peaks in a three-variable Hindmarsh-Rose small-world network. We investigated the response of the network to external noise and demonstrate that a variation in signal-to-noise ratio of about 10 dB induces an increase in the membrane potential signal of about 15%, averaged over the whole network. We also considered the integral of the whole membrane potential as a paradigm of internal noise, the noise generated by the brain network itself. We showed that this internal noise is attenuated with the size of the network or with the number of random connections. By means of Fourier analysis we found that it has distinct frequency peaks; moreover, increasing the size of the network by introducing more neurons reduced the maximum frequencies generated by the network, whereas increasing the number of random connections (determined by the small-world probability p) led to a trend toward higher frequencies. This study may give clues on how networks utilize noise to alter the collective behaviour of the system in their operations.
Keywords: neural networks, stochastic processes, small-world networks, discrete Fourier analysis
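Locating spectral peaks in a collective membrane potential by discrete Fourier analysis can be sketched as follows; the synthetic 7 Hz signal and the sampling rate are illustrative assumptions, not the network's actual dynamics:

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Return the frequency (Hz) of the largest spectral peak of a real
    signal, using the real-input discrete Fourier transform."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    spectrum[0] = 0.0                      # ignore the DC component
    return freqs[np.argmax(spectrum)]

# Synthetic stand-in for a collective membrane potential:
# a 7 Hz rhythm buried in Gaussian noise.
rng = np.random.default_rng(0)
fs = 200.0
t = np.arange(0.0, 10.0, 1.0 / fs)
v = np.sin(2 * np.pi * 7.0 * t) + 0.3 * rng.standard_normal(t.size)
peak_hz = dominant_frequency(v, fs)
```

Repeating this analysis while varying network size or rewiring probability p would reproduce the kind of peak-shift comparison described in the abstract.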
Procedia PDF Downloads 291
3963 Epistemic Emotions during Cognitive Conflict: Associations with Metacognitive Feelings in High Conflict Scenarios
Authors: Katerina Nerantzaki, Panayiota Metallidou, Anastasia Efklides
Abstract:
The aim of the study was to investigate: (a) changes in the intensity of various epistemic emotions during cognitive processing in a decision-making task and (b) their associations with the metacognitive feelings of difficulty and confidence. One hundred and fifty-two undergraduate university students were asked individually to read, in the E-Prime environment, decision-making scenarios about moral dilemmas concerning self-driving cars, which differed in the level of conflict they produced, and then to make a choice between two options. The participants were further asked to rate on a four-point scale four epistemic emotions (surprise, curiosity, confusion, and wonder) and two metacognitive feelings (feeling of difficulty and feeling of confidence) after making their choice in each scenario. Changes in cognitive processing due to the level of conflict affected the intensity of the specific epistemic emotions differently, and epistemic emotions were interrelated with metacognitive feelings.
Keywords: confusion, curiosity, epistemic emotions, metacognitive experiences, surprise
Procedia PDF Downloads 79
3962 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research deployed initial knowledge of data acquisition and embedded systems to develop a weather data acquisition device, using an LM35 sensor to measure weather parameters, together with artificial intelligence (an artificial neural network, ANN) and a statistical approach (autoregressive integrated moving average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data mining (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root mean square error (RMSE), mean absolute error (MAE), coefficient of determination (R²), and mean percentage error (MPE) were deployed as standard evaluation metrics for the performance of the models in the prediction of precipitation. The results show that the developed device has an efficiency of 96% and is also compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN prediction of precipitation (rainfall) for two months (May and June 2017) revealed a disparity error of 1.59%, while that of ARIMA was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
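The four evaluation metrics named above have standard definitions and can be computed directly; the rainfall values in the usage example below are made-up illustrations, not the study's data:

```python
import math

def forecast_errors(actual, predicted):
    """Compute RMSE, MAE, the coefficient of determination R^2, and the
    mean percentage error (MPE) for paired actual/predicted series."""
    n = len(actual)
    mean_a = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1.0 - ss_res / ss_tot
    # MPE keeps the sign of the errors, so it reveals systematic bias.
    mpe = 100.0 / n * sum((a - p) / a for a, p in zip(actual, predicted))
    return rmse, mae, r2, mpe

# Illustrative rainfall values (mm), not measurements from the study.
rmse, mae, r2, mpe = forecast_errors([10.0, 20.0, 30.0], [12.0, 18.0, 30.0])
```

Unlike RMSE and MAE, MPE can cancel positive and negative errors, which is why it is reported alongside the magnitude-based metrics rather than instead of them.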
Procedia PDF Downloads 92
3961 Video Compression Using Contourlet Transform
Authors: Delara Kazempour, Mashallah Abasi Dezfuli, Reza Javidan
Abstract:
Video compression is used for channels with limited bandwidth and for storage devices of limited capacity. One of the most popular approaches in video compression is the use of different transforms. The discrete cosine transform is one such method, but it suffers from problems such as blocking and noise artifacts and high distortion, which adversely affect the compression ratio. The wavelet transform is another approach that balances compression and quality better than cosine transforms, but its ability to capture curve curvature is very limited. Because of the importance of compression and the problems of the cosine and wavelet transforms, the contourlet transform has become popular in video compression. In the newly proposed method, we used the contourlet transform for video image compression. The contourlet transform can preserve image details better than the previous transforms because it is multi-scale and oriented, and it can capture discontinuities such as edges; with this approach we lose less data than with previous approaches. The contourlet transform captures the discrete spatial structure of an image and is well suited to representing two-dimensional smooth images; it produces compressed images with a high compression ratio along with texture and edge preservation. Finally, the results show that for the majority of images the mean square error and peak signal-to-noise ratio of the new contourlet-based method are improved compared to the wavelet transform, but for most images these parameters are better for the cosine transform than for the contourlet-based method.
Keywords: video compression, contourlet transform, discrete cosine transform, wavelet transform
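The two quality metrics used for the comparison above, mean squared error and peak signal-to-noise ratio, are computed as follows (the pixel values in the usage example are illustrative, not images from the study):

```python
import math

def mse_psnr(original, reconstructed, max_value=255.0):
    """Mean squared error and peak signal-to-noise ratio between two equally
    sized images given as flat sequences of pixel intensities.

    PSNR = 10 * log10(max_value^2 / MSE); identical images give infinite PSNR.
    """
    n = len(original)
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / n
    psnr = float("inf") if mse == 0 else 10.0 * math.log10(max_value ** 2 / mse)
    return mse, psnr

# Illustrative 4-pixel "images": a perfect and an imperfect reconstruction.
mse0, psnr0 = mse_psnr([100, 110, 120, 130], [100, 110, 120, 130])
mse1, psnr1 = mse_psnr([100, 110, 120, 130], [110, 120, 130, 140])
```

Lower MSE and higher PSNR indicate better reconstruction, which is the sense in which the abstract compares the contourlet, wavelet, and cosine transforms.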
Procedia PDF Downloads 444
3960 Research on Load Balancing Technology for Web Service Mobile Host
Authors: Yao Lu, Xiuguo Zhang, Zhiying Cao
Abstract:
In this paper, the load balancing idea is applied to the Web service mobile host. The main idea of load balancing is to establish a one-to-many mapping mechanism: an entrance maps a request to a plurality of processing nodes in order to divide and assign the processing. Because the mobile host is a resource-constrained environment, some Web services cannot be completed on the mobile host alone. When the mobile host's resources are not sufficient to complete a request, the load balancing scheduler divides the request into a plurality of sub-requests and transfers them to different auxiliary mobile hosts. Each auxiliary mobile host executes its sub-requests, and the results are returned to the mobile host. The service request integrator receives the results of the sub-requests from the auxiliary mobile hosts and integrates them; in the end, the complete response is returned to the client. Experimental results show that the technology adopted in this paper can complete requests with higher efficiency.
Keywords: Dinic, load balancing, mobile host, web service
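The one-to-many division step can be sketched with a greedy capacity fill. Note that this greedy policy and the abstract "work unit" capacities are illustrative assumptions; the paper's scheduler (its keywords suggest a Dinic max-flow formulation) is not reproduced here:

```python
def divide_request(request_size, host_capacity, helper_capacities):
    """Split a request that exceeds the mobile host's capacity into
    sub-requests for auxiliary hosts (one-to-many mapping).

    Returns a list of (worker, share) pairs; capacities and sizes are in
    abstract work units. Raises if total capacity is insufficient.
    """
    assignments = []
    remaining = request_size
    share = min(host_capacity, remaining)     # mobile host takes what it can
    assignments.append(("mobile-host", share))
    remaining -= share
    for idx, cap in enumerate(helper_capacities):
        if remaining <= 0:
            break
        share = min(cap, remaining)           # greedy fill of each helper
        assignments.append((f"aux-{idx}", share))
        remaining -= share
    if remaining > 0:
        raise ValueError("insufficient total capacity for the request")
    return assignments

plan = divide_request(100, 40, [30, 30, 30])
```

The integrator described in the abstract would then merge the per-worker results in the same order before replying to the client.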
Procedia PDF Downloads 328
3959 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System
Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi
Abstract:
Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to enhance the bit-error-rate (BER) of PACE-DCO-OFDM. Results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance compared to the conventional system without pilot-assisted channel estimation. Simulation results show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm estimates the channel more accurately and achieves better BER performance than the LS-based PACE-DCO-OFDM and the traditional system without PACE. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ with LS-PACE, while it is about 2×10⁻¹ for the system without a PACE scheme.
Keywords: channel estimation, OFDM, pilot-assist, VLC
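The LS pilot estimator has a one-line form, H_hat[k] = Y[k]/X[k] at the pilot subcarriers. The sketch below pairs it with a deliberately simple nearest-pilot interpolation as a stand-in; the paper's LMMSE smoothing (which additionally exploits channel statistics) is not reproduced here, and the pilot values are illustrative:

```python
def ls_channel_estimate(received_pilots, transmitted_pilots):
    """Least-squares channel estimate at the pilot subcarriers:
    H_hat[k] = Y[k] / X[k], with complex subcarrier samples."""
    return [y / x for y, x in zip(received_pilots, transmitted_pilots)]

def interpolate_channel(h_pilots, n_subcarriers, pilot_spacing):
    """Spread the pilot estimates over all subcarriers by nearest-pilot
    interpolation (a crude substitute for LMMSE smoothing)."""
    return [h_pilots[min(k // pilot_spacing, len(h_pilots) - 1)]
            for k in range(n_subcarriers)]

# Illustrative pilots: known transmitted symbols and their received versions.
h_hat = ls_channel_estimate([2 + 2j, 4j], [1 + 1j, 2])
h_full = interpolate_channel([1, 3], n_subcarriers=4, pilot_spacing=2)
```

Dividing the received data subcarriers by `h_full` then equalizes the channel before demodulation, which is the role PACE plays in the BER comparison above.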
Procedia PDF Downloads 180
3958 Plasma Chemical Gasification of Solid Fuel with Mineral Mass Processing
Authors: V. E. Messerle, O. A. Lavrichshev, A. B. Ustimenko
Abstract:
Currently and in the foreseeable future (up to 2100), the global economy is oriented toward the use of organic fuels, mostly solid fuels, which account for 40% of electric power generation. Therefore, the development of technologies for their effective and environmentally friendly application is a priority problem. This work presents the results of thermodynamic and experimental investigations of a plasma technology for the processing of low-grade coals. The use of this technology for producing target products (synthesis gas, hydrogen, technical carbon, and valuable components of the mineral mass of coals) meets the modern environmental and economic requirements applied to basic industrial sectors. The plasma technology of coal processing for the production of synthesis gas from the coal organic mass (COM) and valuable components from the coal mineral mass (CMM) is highly promising. Its essence is heating the coal dust with reducing electric-arc plasma to the complete gasification temperature, at which the COM converts into synthesis gas free from ash particles, nitrogen oxides, and sulfur. At the same time, oxides of the CMM are reduced by the carbon residue, producing valuable components such as technical silicon, ferrosilicon, aluminum, and silicon carbide, as well as microelements of rare metals, such as uranium, molybdenum, vanadium, and titanium. Thermodynamic analysis of the process was made using the versatile computation program TERRA. Calculations were carried out in the temperature range 300 - 4000 K at a pressure of 0.1 MPa. Bituminous coal with an ash content of 40% and a heating value of 16,632 kJ/kg was taken for the investigation. The gaseous phase of the coal processing products consists basically of synthesis gas, with a concentration of up to 99 vol.% at 1500 K. CMM components convert completely from the condensed phase into the gaseous phase at temperatures above 2600 K.
At temperatures above 3000 K, the gaseous phase consists basically of Si, Al, Ca, Fe, and Na, and the compounds SiO, SiH, AlH, and SiS; the latter compounds dissociate into the relevant elements with increasing temperature. Complex coal conversion for the production of synthesis gas from the COM and valuable components from the CMM was investigated using a versatile experimental plant whose main element was a plug-flow plasma reactor. The material and thermal balances helped to find the integral indicators of the process. Plasma-steam gasification of the low-grade coal with CMM processing gave a synthesis gas yield of 95.2%, carbon gasification of 92.3%, and coal desulfurization of 95.2%. The reduced material of the CMM was found in the slag in the form of ferrosilicon as well as silicon and iron carbides. The maximum reduction of the CMM oxides, reaching 47%, was observed in the slag from the walls of the plasma reactor in the areas of maximum temperature. The synthesis gas thus produced can be used for the synthesis of methanol, as a high-calorific reducing gas instead of blast-furnace coke, or as a power gas for thermal power plants. The reduced material of the CMM can be used in metallurgy.
Keywords: gasification, mineral mass, organic mass, plasma, processing, solid fuel, synthesis gas, valuable components
Procedia PDF Downloads 608
3957 Emotional Analysis for Text Search Queries on Internet
Authors: Gemma García López
Abstract:
The goal of this study is to analyze whether search queries carried out in search engines such as Google can offer emotional information about the user who performs them. Knowing the user's emotional state can be a key to achieving maximum personalization of content and detecting worrying behaviors. To this end, two studies were carried out using tools with advanced natural language processing techniques. The first study determines whether a query can be classified as positive, negative, or neutral, while the second extracts emotional content from words and applies the categorical and dimensional models for the representation of emotions. In addition, we use search queries in Spanish and English to establish similarities and differences between the two languages. The results revealed that text search queries performed by users on the Internet can be classified emotionally. This allows us to better understand the emotional state of the user at the time of the search, which could enable adapting the technology and personalizing the responses to different emotional states.
Keywords: emotion classification, text search queries, emotional analysis, sentiment analysis in text, natural language processing
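The positive/negative/neutral classification of the first study can be illustrated with a minimal lexicon-based sketch. This is a stand-in for the advanced NLP tools the study actually used, and the tiny word lists are illustrative assumptions:

```python
def classify_query(query, positive_words, negative_words):
    """Classify a search query as 'positive', 'negative', or 'neutral'
    by counting lexicon hits among its tokens."""
    tokens = query.lower().split()
    score = (sum(t in positive_words for t in tokens)
             - sum(t in negative_words for t in tokens))
    if score > 0:
        return "positive"
    return "negative" if score < 0 else "neutral"

# Illustrative micro-lexicons; a real system would use a full sentiment
# lexicon or a trained classifier per language (Spanish and English here).
POS = {"happy", "best", "love", "great"}
NEG = {"sad", "worst", "hate", "alone"}
```

A production pipeline would add tokenization, negation handling, and per-language lexicons, but the polarity-scoring idea is the same.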
Procedia PDF Downloads 141
3956 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 1: Overview and Activities in Chemical Processing Facility
Authors: Kazunori Nomura, Hiromichi Ogi, Masaumi Nakahara, Sou Watanabe, Atsuhiro Shibata
Abstract:
The Chemical Processing Facility of the Japan Atomic Energy Agency is a basic research field for advanced back-end technology development using actual high-level radioactive materials, such as irradiated fuels from the fast reactor and high-level liquid waste from the reprocessing plant. As is the nature of a research facility, various kinds of chemical reagents have been used for fundamental tests. Most of them were treated properly and stored in the liquid waste vessel equipped in the facility, but some were not treated and remained in the experimental space as a kind of legacy waste, which must be treated safely. Meanwhile, we formulated the Medium- and Long-Term Management Plan of Japan Atomic Energy Agency Facilities. This comprehensive plan designates the Chemical Processing Facility as one of the facilities to be decommissioned, and even before the plan is executed, treatment of the legacy waste is a necessary step for the decommissioning operation. Under these circumstances, we launched a collaborative research project called the STRAD project, which stands for Systematic Treatment of Radioactive liquid waste for Decommissioning, in order to develop treatment processes for the wastes of the nuclear research facility. In this project, decomposition methods for chemicals causing troublesome phenomena such as corrosion and explosion have been developed, and there is a prospect of decomposing them in the facility by simple methods. Solidification of aqueous or organic liquid wastes after decomposition has been studied by adding cement or coagulants. Furthermore, we treated experimental tools of various materials, making an effort to stabilize and compact them before packing into the waste container, which is expected to decrease the number of solid waste transport operations and to widen the operation space. Some achievements of these studies will be shown in this paper.
The project is expected to contribute beneficial waste management outcomes that can be shared worldwide.
Keywords: chemical processing facility, medium- and long-term management plan of JAEA facilities, STRAD project, treatment of radioactive waste
Procedia PDF Downloads 142
3955 Mitigation of Interference in Satellite Communications Systems via a Cross-Layer Coding Technique
Authors: Mario A. Blanco, Nicholas Burkhardt
Abstract:
An important problem in satellite communication systems operating in the Ka and EHF frequency bands is the overall degradation in link performance of mobile terminals due to various impairments of the link/channel, such as fading, blockage of the link to the satellite (especially in urban environments), and intentional as well as other types of interference. In this paper, we focus primarily on the interference problem, and we develop a very efficient and cost-effective solution based on the use of fountain codes. We first introduce a satellite communications (SATCOM) terminal uplink interference channel model that is classically used against communication systems employing spread-spectrum waveforms. We then consider the use of fountain codes, with a focus on Raptor codes, as our main mitigation technique to combat the degradation in link/receiver performance due to the interference signal. The performance of the receiver is obtained in terms of average probability of bit and message error rate as a function of the bit energy-to-noise density ratio, Eb/N0, and other parameters of interest, via a combination of analysis and computer simulations, and we show that the use of fountain codes is extremely effective in overcoming the effects of intentional interference on the performance of the receiver and associated communication links. We then show that this technique can be extended to mitigate other types of SATCOM channel degradations, such as those caused by channel fading, shadowing, and hard blockage of the uplink signal.
Keywords: SATCOM, interference mitigation, fountain codes, turbo codes, cross-layer
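BER-versus-Eb/N0 curves of the kind described above are usually judged against the uncoded baseline. A minimal sketch of that baseline for BPSK over an AWGN channel follows; this is only the textbook reference curve, not the Raptor-coded performance from the paper:

```python
import math

def bpsk_ber(ebn0_db):
    """Uncoded BPSK bit error rate over an AWGN channel:
    BER = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0_linear = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0_linear))

# Baseline curve: BER falls steeply as Eb/N0 grows.
curve = [(db, bpsk_ber(db)) for db in (0.0, 5.0, 10.0)]
```

A fountain-coded link would sit well to the left of this curve (lower required Eb/N0 for the same message error rate), which is how the mitigation gain in the paper is quantified.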
Procedia PDF Downloads 361
3954 Photo-Electrochemical/Electro-Fenton Coupling Oxidation System with Fe/Co-Based Anode and Cathode Metal-Organic Frameworks Derivative Materials for Sulfamethoxazole Treatment
Authors: Xin Chen, Xinyong Li, Qidong Zhao, Dong Wang
Abstract:
A new coupling system was constructed by combining a photo-electrochemical cell with an electro-Fenton cell (PEC-EF). The electrode materials in this system were derived from a MnyFe₁₋yCo Prussian-Blue-Analog (PBA). Mn₀.₄Fe₀.₆Co₀.₆₇-N@C spin-coated on carbon paper served as the gas-diffusion cathode, and Mn₀.₄Fe₀.₆Co₀.₆₇O₂.₂ spin-coated on fluorine-doped tin oxide glass (FTO) as the anode. The two separated cells could degrade sulfamethoxazole (SMX) simultaneously, and the coupling mechanisms by which PEC and EF enhance the degradation efficiency were investigated. Continuous on-site generation of H₂O₂ at the cathode through an oxygen reduction reaction (ORR) was confirmed with a rotating ring-disk electrode (RRDE). The electron transfer number (n) of the ORR with Mn₀.₄Fe₀.₆Co₀.₆₇-N@C was 2.5 in the selected potential and pH range. The photo-electrochemical properties of Mn₀.₄Fe₀.₆Co₀.₆₇O₂.₂ were systematically studied, and the material displayed a good response to visible light. The photoinduced electrons at the anode can transfer to the cathode for further use. Efficient photo-electro-catalytic performance was observed in degrading SMX: almost 100% SMX removal was achieved in 120 min. This work not only provides a highly effective technique for antibiotic treatment but also reveals the synergic effect between PEC and EF.
Keywords: electro-fenton, photo-electrochemical, synergic effect, sulfamethoxazole
Procedia PDF Downloads 180
3953 In Situ Volume Imaging of Cleared Mice Seminiferous Tubules Opens New Window to Study Spermatogenic Process in 3D
Authors: Lukas Ded
Abstract:
Studying tissue structure and histogenesis in the natural, 3D context is a challenging but highly beneficial process. Contrary to the classical approach of physical tissue sectioning and subsequent imaging, it enables studying the relationships of individual cellular and histological structures in their native context. Recent developments in tissue clearing approaches and in microscopic volume imaging and data processing enable the application of these methods in the areas of developmental and reproductive biology as well. Here, using the CLARITY tissue clearing procedure and 3D confocal volume imaging, we optimized a protocol for clearing, staining, and imaging of mouse seminiferous tubules isolated from the testes without a cardiac perfusion procedure. Our approach enables high-magnification, fine-resolution axial imaging of the whole diameter of the seminiferous tubules with practically unlimited lateral imaging length. Hence, large continuous pieces of a seminiferous tubule can be scanned and digitally reconstructed for the study of single-tubule seminiferous stages using nuclear dyes. Furthermore, antibodies and various molecular dyes can be used for molecular labeling of individual cellular and subcellular structures, and the resulting 3D images can greatly increase our understanding of the spatiotemporal aspects of seminiferous tubule development and sperm ultrastructure formation. Finally, our newly developed algorithms for 3D data processing enable massively parallel processing of a large number of individual cell and tissue fluorescent signatures and the building of robust spermatogenic models under physiological and pathological conditions.
Keywords: CLARITY, spermatogenesis, testis, tissue clearing, volume imaging
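As a minimal illustration of one routine volume-imaging operation (a generic sketch, not the authors' specific 3D pipeline), a cleared-tissue confocal z-stack can be collapsed into a 2-D overview image by taking the brightest voxel along the axial direction, a maximum intensity projection:

```python
import numpy as np

def max_intensity_projection(volume, axis=0):
    """Collapse a 3-D stack (z, y, x) of fluorescence intensities into a 2-D
    overview by keeping the brightest voxel along the chosen axis."""
    return np.asarray(volume).max(axis=axis)
```

Such projections are typically used for a quick overview before running full 3-D segmentation of the nuclear-dye signal.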
Procedia PDF Downloads 136
3952 Mobile Microscope for the Detection of Pathogenic Cells Using Image Processing
Authors: P. S. Surya Meghana, K. Lingeshwaran, C. Kannan, V. Raghavendran, C. Priya
Abstract:
One of the most basic and powerful tools in all of science and medicine is the light microscope, the fundamental device for laboratory as well as research purposes. With improving technology, portable, economical, and user-friendly instruments are in high demand, and the conventional microscope fails to live up to this emerging trend. Moreover, adequate access to healthcare is not widely available, especially in developing countries, and the most basic step towards curing a malady is the diagnosis of the disease itself. The main aim of this paper is to diagnose malaria with the most common device, the cell phone, which has proved to be an immediate solution for many modern-day needs, since the development of wireless infrastructure allows computing and communicating on the move. This opens up the opportunity to develop novel imaging, sensing, and diagnostic platforms that use mobile phones as an underlying platform to address the global demand for accurate, sensitive, cost-effective, and field-portable measurement devices for use in remote and resource-limited settings around the world.
Keywords: cellular, hand-held, health care, image processing, malarial parasites, microscope
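A core image-processing step in this kind of diagnosis is counting stained parasites, which appear as dark blobs in a thin blood smear. The sketch below (a generic formulation, not the paper's exact pipeline; the threshold value is an assumption) thresholds the image and counts 4-connected dark regions:

```python
import numpy as np

def count_dark_blobs(img, thresh):
    """Count connected regions darker than `thresh` (4-connectivity, flood fill).
    In a thin-smear image, each dark blob is a candidate stained parasite."""
    img = np.asarray(img)
    mask = img < thresh
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    blobs = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                blobs += 1                      # found a new blob; flood-fill it
                stack = [(y, x)]
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
    return blobs
```

A real mobile pipeline would add illumination correction and size filtering, but the thresholding-plus-labeling structure stays the same.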
Procedia PDF Downloads 267
3951 FT-NIR Method to Determine Moisture in Gluten Free Rice-Based Pasta during Drying
Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra
Abstract:
Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control of pasta during large-scale production. A rapid Fourier transform near-infrared (FT-NIR) method was developed for determining moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples, and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured by an FT-NIR analyzer in the 4,000-12,000 cm⁻¹ spectral range. Calibration and validation sets were designed for the conception and evaluation of the method's adequacy in the moisture content range of 10 to 15 percent (wet basis) of the pasta. The prediction models, based on partial least squares (PLS) regression, were developed in the near-infrared. Conventional criteria such as R², the root mean square error of cross-validation (RMSECV), the root mean square error of estimation (RMSEE), and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization, and multiplicative scatter correction). Spectra of pasta samples were treated with different mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method correlated very well with the values determined via traditional methods (R² = 0.983), which clearly indicated that FT-NIR methods could be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing (R² = 0.9775).
The MMN pre-processing method was found most suitable, and a maximum coefficient of determination (R²) of 0.9875 was obtained for the calibration model developed.
Keywords: FT-NIR, pasta, moisture determination, food engineering
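The min-max normalization (MMN) pre-treatment named above can be sketched as follows; this is the generic formulation of MMN applied per spectrum, not necessarily the exact implementation used in the study:

```python
import numpy as np

def minmax_normalize(spectra):
    """Scale each spectrum to the [0, 1] range (min-max normalization, MMN).
    `spectra` is a 2-D array: one row per sample, one column per wavenumber."""
    spectra = np.asarray(spectra, dtype=float)
    mins = spectra.min(axis=1, keepdims=True)
    maxs = spectra.max(axis=1, keepdims=True)
    return (spectra - mins) / (maxs - mins)
```

Normalizing each spectrum this way removes overall intensity offsets before the PLS regression is fitted.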
Procedia PDF Downloads 258
3950 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
A DNA barcode is a short mitochondrial DNA fragment whose nucleotides are each made up of three subunits: a phosphate group, a sugar, and one of the nucleic bases (A, T, C, and G). Barcodes provide a good source of the information needed to classify living species, an intuition that has been confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes; this task has to be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete. To make this type of analysis feasible, heuristics, like progressive alignment, have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable; this method avoids the complex problem of form and structure in different classes of organisms. The method is evaluated on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first is called transformation and is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) codification of the DNA barcodes, Fourier transform, and power spectrum signal processing.
The second is called approximation and is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, which is realized by applying a hierarchical classification algorithm.
Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)
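The transformation phase (EIIP codification followed by a Fourier transform and power spectrum) can be sketched as below. The EIIP nucleotide values are the standard published ones; the function itself, including the DC removal, is an illustrative sketch rather than the authors' exact implementation:

```python
import numpy as np

# Standard EIIP values: the electron-ion interaction pseudopotential of each base.
EIIP = {"A": 0.1260, "T": 0.1335, "C": 0.1340, "G": 0.0806}

def barcode_power_spectrum(seq):
    """Map a DNA barcode to its EIIP numerical signal and return the power
    spectrum over the positive frequencies."""
    signal = np.array([EIIP[b] for b in seq.upper()])
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.fft(signal)) ** 2   # power spectrum
    return spectrum[: len(seq) // 2]             # keep positive frequencies
```

The resulting spectra are the fixed-length numerical signatures that the wavelet-network approximation phase then works on. As a sanity check, a period-4 sequence such as "ATCGATCG" concentrates its power at one quarter of the sampling frequency.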
Procedia PDF Downloads 318
3949 Investigating the Effect of Orthographic Transparency on Phonological Awareness in Bilingual Children with Dyslexia
Authors: Sruthi Raveendran
Abstract:
Developmental dyslexia, characterized by reading difficulties despite normal intelligence, presents a significant challenge for bilingual children navigating languages with varying degrees of orthographic transparency. This study bridges a critical gap in dyslexia interventions for bilingual populations in India by examining how consistency and predictability of letter-sound relationships in a writing system (orthographic transparency) influence the ability to understand and manipulate the building blocks of sound in language (phonological processing). The study employed a computerized visual rhyme-judgment task with concurrent EEG (electroencephalogram) recording. The task compared reaction times, accuracy of performance, and event-related potential (ERP) components (N170, N400, and LPC) for rhyming and non-rhyming stimuli in two orthographies: English (opaque orthography) and Kannada (transparent orthography). As hypothesized, the results revealed advantages in phonological processing tasks for transparent orthography (Kannada). Children with dyslexia were faster and more accurate when judging rhymes in Kannada compared to English. This suggests that a language with consistent letter-sound relationships (transparent orthography) facilitates processing, especially for tasks that involve manipulating sounds within words (rhyming). Furthermore, brain activity measured by event-related potentials (ERP) showed less effort required for processing words in Kannada, as reflected by smaller N170, N400, and LPC amplitudes. These findings highlight the crucial role of orthographic transparency in optimizing reading performance for bilingual children with dyslexia. These findings emphasize the need for language-specific intervention strategies that consider the unique linguistic characteristics of each language. 
While acknowledging the complexity of factors influencing dyslexia, this research contributes valuable insights into the impact of orthographic transparency on phonological awareness in bilingual children. This knowledge paves the way for developing tailored interventions that promote linguistic inclusivity and optimize literacy outcomes for children with dyslexia.
Keywords: developmental dyslexia, phonological awareness, rhyme judgment, orthographic transparency, Kannada, English, N170, N400, LPC
Procedia PDF Downloads 9
3948 Short-Term Effects of Environmentally Relevant Concentrations of Organic UV Filters on Signal Crayfish Pacifastacus Leniusculus
Authors: Viktoriia Malinovska, Iryna Kuklina, Katerina Grabicova, Milos Buric, Pavel Kozak
Abstract:
Personal care products, including organic UV filters, are considered emerging contaminants, and their toxic effects have been a concern for the last decades. Sunscreen compounds continually enter surface waters via sewage treatment, due to incomplete removal, and during human recreational and laundry activities. Despite the occurrence of organic UV filters in the freshwater environment, little is known about their impacts on aquatic biota. In this study, environmentally relevant concentrations of 5-benzoyl-4-hydroxy-2-methoxybenzenesulfonic acid (BP-4, 2.5 µg/L) and 2-phenylbenzimidazole-5-sulfonic acid (PBSA, 3 µg/L) were used to evaluate the cardiac and locomotor responses of the signal crayfish Pacifastacus leniusculus over a short time period. The effects of these compounds were evident in the experimental animals. Specimens exposed to both tested compounds exhibited significantly greater changes in distance moved and time spent moving than controls. Significant differences in changes in mean heart rate were detected in both the PBSA and BP-4 experimental groups compared to control groups. Such behavioral and physiological alterations demonstrate the ecological effects of the selected sunscreen compounds over a short time period. Since evidence of the impacts of sunscreen compounds is scarce, knowledge of how organic UV filters influence aquatic organisms is of key importance for future research.
Keywords: aquatic pollutants, behavior, freshwaters, heart rate, invertebrate
Procedia PDF Downloads 105
3947 Filtering and Reconstruction System for Grey-Level Forensic Images
Authors: Ahd Aljarf, Saad Amin
Abstract:
Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and having noise added to them, either after alteration or when the image was taken initially; therefore, a high-performance image processing system and its implementation are very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, harmed, or even lost because of noise. For example, sending an image through a wireless channel can cause loss of bits. These types of errors generally degrade the visual display quality of forensic images. Two image problems are covered: noise and block loss. Information transmitted through any means of communication may suffer alteration from its original state or even lose important data due to channel noise. Therefore, a developed system is introduced to improve the quality and clarity of forensic images.
Keywords: image filtering, image reconstruction, image processing, forensic images
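A standard filter for the kind of impulse noise described above, and one plausible component of such a filtering system (not necessarily the authors' exact pipeline), is the median filter. A minimal sketch:

```python
import numpy as np

def median_filter(img, k=3):
    """Suppress impulse ("salt and pepper") noise with a k x k median filter.
    Pixels near the border use a clipped window."""
    img = np.asarray(img)
    h, w = img.shape
    r = k // 2
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            window = img[max(0, y - r): y + r + 1, max(0, x - r): x + r + 1]
            out[y, x] = np.median(window)   # median is robust to isolated outliers
    return out
```

Unlike a mean filter, the median preserves edges while discarding isolated corrupted pixels, which matters when the filtered image is to serve as evidence.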
Procedia PDF Downloads 366
3946 A Query Optimization Strategy for Autonomous Distributed Database Systems
Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam
Abstract:
A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing uses a communication network for transmitting data between sites and is one of the main challenges in the database world. The development of sophisticated query optimization technology is a major reason for the commercial success of database systems, and its complexity and cost increase with the number of relations in the query. Mariposa, query trading, and query trading with processing task-trading are strategies developed for autonomous distributed database systems, but they incur high optimization cost because all nodes are involved in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT that gives the seller nodes with the lowest cost gradually higher priorities, in order to reduce the optimization time. We implemented our proposed strategy and present the results and an analysis based on those results.
Keywords: autonomous strategies, distributed database systems, high priority, query optimization
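The idea of gradually favoring low-cost seller nodes can be sketched as a single trading round; this is an illustrative toy, not the published K-QTPT algorithm, and the ranking rule and data shapes are assumptions:

```python
def trade_round(bids, priorities, k):
    """One query-trading round: rank seller nodes by (cost, -priority) and
    raise the priority of the winners, so cheap nodes are gradually favored
    in later rounds. `bids` maps node name to its offered cost."""
    ranked = sorted(bids, key=lambda n: (bids[n], -priorities.get(n, 0)))
    winners = ranked[:k]
    for n in winners:
        priorities[n] = priorities.get(n, 0) + 1   # winners gain priority
    return winners
```

Restricting later rounds to high-priority sellers is what cuts the number of nodes involved in plan generation, and hence the optimization time.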
Procedia PDF Downloads 524
3945 An Intelligent Scheme Switching for MIMO Systems Using Fuzzy Logic Technique
Authors: Robert O. Abolade, Olumide O. Ajayi, Zacheaus K. Adeyemo, Solomon A. Adeniran
Abstract:
Link adaptation is an important strategy for achieving robust wireless multimedia communications based on quality of service (QoS) demand. Scheme switching in multiple-input multiple-output (MIMO) systems is an aspect of link adaptation; it involves selecting among different MIMO transmission schemes or modes so as to adapt to varying radio channel conditions for the purpose of achieving QoS delivery. However, finding the most appropriate switching method in MIMO links is still a challenge, as existing methods are either computationally complex or not always accurate. This paper presents an intelligent switching method for a MIMO system consisting of two schemes, transmit diversity (TD) and spatial multiplexing (SM), using a fuzzy logic technique. In this method, two channel quality indicators (CQI), namely the average received signal-to-noise ratio (RSNR) and the received signal strength indicator (RSSI), are measured and passed as inputs to the fuzzy logic system, which then gives a decision, an inference. The switching decision of the fuzzy logic system is fed back to the transmitter to switch between the TD and SM schemes. Simulation results show that the proposed fuzzy logic-based switching technique outperforms the conventional static switching technique in terms of bit error rate and spectral efficiency.
Keywords: channel quality indicator, fuzzy logic, link adaptation, MIMO, spatial multiplexing, transmit diversity
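A minimal sketch of such a fuzzy switch is below. The membership ranges and the 0.5 decision threshold are illustrative assumptions, not the paper's values; the structure (fuzzify both CQIs, combine with a Mamdani-style AND, defuzzify to a scheme) is the point:

```python
def membership_high(x, lo, hi):
    """Piecewise-linear membership of x in a 'high' fuzzy set over [lo, hi]."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def choose_scheme(rsnr_db, rssi_dbm):
    """Switch to spatial multiplexing (SM) when both channel quality
    indicators are sufficiently 'high'; otherwise use transmit diversity (TD)."""
    mu_snr = membership_high(rsnr_db, 5.0, 20.0)      # assumed RSNR range
    mu_rssi = membership_high(rssi_dbm, -90.0, -60.0)  # assumed RSSI range
    firing = min(mu_snr, mu_rssi)                      # Mamdani AND
    return "SM" if firing >= 0.5 else "TD"
```

The appeal of the fuzzy formulation is that the TD/SM boundary degrades gracefully near the thresholds instead of flipping on a single hard cutoff.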
Procedia PDF Downloads 152
3944 Preparation on Sentimental Analysis on Social Media Comments with Bidirectional Long Short-Term Memory Gated Recurrent Unit and Model Glove in Portuguese
Authors: Leonardo Alfredo Mendoza, Cristian Munoz, Marco Aurelio Pacheco, Manoela Kohler, Evelyn Batista, Rodrigo Moura
Abstract:
Natural Language Processing (NLP) techniques are increasingly powerful at interpreting people's feelings about and reactions to a product or service. Sentiment analysis has become a fundamental tool for this interpretation but has few applications in languages other than English. This paper presents a sentiment analysis classifier for Portuguese, built on a base of social network comments in Portuguese. A word embedding representation was used with a 50-dimension GloVe model pre-trained on a corpus entirely in Portuguese. To generate this classification, bidirectional long short-term memory (BI-LSTM) and bidirectional Gated Recurrent Unit (GRU) models are used, reaching results of 99.1%.
Keywords: natural language processing, sentiment analysis, bidirectional long short-term memory, BI-LSTM, gated recurrent unit, GRU
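The core recurrence of the GRU named above can be sketched as a single forward step in plain NumPy (biases omitted for brevity; this is the textbook GRU cell, not the authors' trained model):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU time step. x: input embedding (e.g. a 50-d GloVe vector),
    h: previous hidden state. W* act on the input, U* on the hidden state."""
    z = sigmoid(Wz @ x + Uz @ h)                 # update gate
    r = sigmoid(Wr @ x + Ur @ h)                 # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))     # candidate state
    return (1 - z) * h + z * h_tilde             # interpolate old and new state
```

A bidirectional GRU simply runs this recurrence over the comment's embedding sequence in both directions and concatenates the two final states before the classification layer.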
Procedia PDF Downloads 159
3943 Thermal Image Segmentation Method for Stratification of Freezing Temperatures
Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses an image analysis technique employing thermal imaging to measure the percentage of areas with various temperatures on a freezing surface. An image segmentation method using threshold values is applied to a sequence of images recording the freezing process. The phenomenon is transient, and temperatures vary quickly as the surface reaches the freezing point and completes the freezing process. Freezing salt water is subject to salt rejection, which makes the freezing point dynamic and dependent on the salinity at the phase interface. For a specific freezing area, nucleation starts on one side and ends at the other, which causes a dynamic and transient temperature in that area. Thermal cameras are able to reveal differences in temperature due to their sensitivity to infrared radiance. Using an experimental setup, a video is recorded by a thermal camera to monitor radiance and temperatures during the freezing process. Image processing techniques are applied to all frames to detect and classify temperatures on the surface, and an image segmentation method is used to find contours with the same temperature on the icing surface. Each segment is obtained using the temperature range appearing in the image and the corresponding pixel values. Using the contours extracted from the image and the camera parameters, stratified areas with different temperatures are calculated. To observe temperature contours on the icing surface with the thermal camera, a salt water sample is dropped on a cold surface at a temperature of -20°C. A thermal video is recorded for 2 minutes to observe the temperature field. Examining the results obtained by the method against the experimental observations verifies the accuracy and applicability of the method.
Keywords: ice contour boundary, image processing, image segmentation, salt ice, thermal image
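The threshold-based stratification step can be sketched as follows: given a per-pixel temperature map from one thermal frame, report the percentage of the surface falling into each temperature band. The band boundaries here are illustrative, not the study's calibrated values:

```python
import numpy as np

def stratify_temperatures(temp_map, bands):
    """Return the percentage of the image area in each temperature band.
    temp_map: 2-D array of per-pixel temperatures (°C);
    bands: list of (low, high) threshold pairs, low inclusive, high exclusive."""
    temp_map = np.asarray(temp_map, dtype=float)
    total = temp_map.size
    return [100.0 * np.count_nonzero((temp_map >= lo) & (temp_map < hi)) / total
            for lo, hi in bands]
```

Running this over every frame of the recorded video yields the time evolution of each temperature stratum as the freezing front advances.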
Procedia PDF Downloads 321
3942 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies extensively utilized in various applications, such as the detection/extraction of man-made structures, monitoring of sensitive areas, and the creation of graphic maps. The approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the main focus on building extraction. Once all the landscape regions are collected, a trimming process is performed to eliminate regions that arise from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are those acquired from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results: the overhead of intermediate processing is eliminated, without compromising the quality of the output, to reduce the processing steps required and the time consumed.
Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery
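The trimming step, discarding labeled regions too small to be buildings, can be sketched as below. This is a generic area-based filter over a label image (0 for background, 1..n for candidate regions); the minimum-area criterion is an assumption, not the paper's exact rule:

```python
import numpy as np

def trim_by_area(labels, min_pixels):
    """Relabel a labeled region image, dropping regions smaller than
    `min_pixels` (likely non-building clutter such as cars or vegetation).
    Surviving regions are renumbered 1..m in ascending original label order."""
    out = np.zeros_like(labels)
    next_label = 0
    for lbl in np.unique(labels):
        if lbl == 0:
            continue                              # skip background
        region = labels == lbl
        if np.count_nonzero(region) >= min_pixels:
            next_label += 1
            out[region] = next_label              # keep, with a compact new label
        # undersized regions are simply left as background
    return out
```

At sub-meter resolution a minimum footprint in pixels maps directly to a minimum building footprint in square meters, which is what makes this simple filter effective.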
Procedia PDF Downloads 314