Search results for: Wavelet Decomposition.
88 Economic Growth Relations to Domestic and International Air Passenger Transport in Brazil
Authors: Manoela Cabo da Silva, Elton Fernandes, Ricardo Pacheco, Heloisa Pires
Abstract:
This study examined cointegration and causal relationships between economic growth and regular domestic and international passenger air transport in Brazil. Total passengers embarked and disembarked were used as a proxy for air transport activity and gross domestic product (GDP) as a proxy for economic development. The analysis spanned the period from 2000 to 2015 for domestic passenger traffic and from 1995 to 2015 for international traffic. The results confirm the hypothesis that there is cointegration between the passenger traffic series and economic development, showing a bi-directional Granger causal relationship between domestic traffic and economic development and a unidirectional influence of economic growth on international passenger air transport demand. Variance decomposition of the series showed that domestic air transport was far more important than international transport in promoting economic development in Brazil.
Keywords: Air passenger transport, cointegration, economic growth, GDP, Granger causality.
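As a hedged illustration of the type of analysis described in this abstract, the sketch below runs an Engle-Granger cointegration test and Granger-causality tests with statsmodels on two synthetic series; the series names and lengths are placeholders, not the authors' Brazilian data.

```python
# Sketch: cointegration and Granger-causality testing with statsmodels.
# Synthetic placeholder series stand in for GDP and passenger traffic.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(0)
n = 84                                            # placeholder number of observations
trend = np.cumsum(rng.normal(0.5, 1.0, n))        # shared stochastic trend
gdp = trend + rng.normal(0, 0.5, n)               # GDP proxy
passengers = 0.8 * trend + rng.normal(0, 0.5, n)  # air-passenger proxy

# Engle-Granger cointegration test: a small p-value suggests cointegration.
t_stat, p_value, _ = coint(gdp, passengers)
print(f"cointegration p-value: {p_value:.3f}")

# Granger causality: does the second column help predict the first (and vice versa)?
grangercausalitytests(np.column_stack([gdp, passengers]), maxlag=2)
grangercausalitytests(np.column_stack([passengers, gdp]), maxlag=2)
```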
87 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials
Authors: Sajjad Farashi
Abstract:
Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting the spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are then used as features for sorting. For optimal selection of the model coefficients, a self-organizing feature map is used. The results show that modeling spikes with the nonlinear autoregressive model outperforms its linear counterpart. Moreover, the features extracted from the coefficients of the exponential autoregressive model are better than wavelet-based features, giving more compact and well-separated clusters. For spikes that differ only in small-scale structure, where principal component analysis fails to produce separated clouds in the feature space, the proposed method obtains well-separated clusters, which removes the need for complex classifiers.
Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.
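As a hedged sketch of the modeling idea in this abstract: for a fixed nonlinearity parameter gamma, an exponential autoregressive (ExpAR) model is linear in its remaining coefficients, so ordinary least squares over a gamma grid can stand in for the genetic-algorithm search used in the paper. The waveform and model order below are placeholders.

```python
# Sketch: fitting an ExpAR(p) model x_t = sum_i (a_i + b_i*exp(-gamma*x_{t-1}^2)) x_{t-i}
# by least squares over a grid of gamma values (a simple stand-in for the GA).
import numpy as np

def expar_design(x, p, gamma):
    """Regression matrix with columns [x_{t-1}..x_{t-p}, w*x_{t-1}..w*x_{t-p}]."""
    rows, targets = [], []
    for t in range(p, len(x)):
        w = np.exp(-gamma * x[t - 1] ** 2)
        lags = x[t - p:t][::-1]               # x_{t-1}, ..., x_{t-p}
        rows.append(np.concatenate([lags, w * lags]))
        targets.append(x[t])
    return np.array(rows), np.array(targets)

def fit_expar(x, p=4, gammas=np.linspace(0.1, 10.0, 50)):
    best = None
    for g in gammas:
        A, y = expar_design(x, p, g)
        coef = np.linalg.lstsq(A, y, rcond=None)[0]
        sse = float(np.sum((A @ coef - y) ** 2))
        if best is None or sse < best[0]:
            best = (sse, g, coef)
    return best  # (sse, gamma, [a_1..a_p, b_1..b_p])

t = np.linspace(0, 1, 200)
spike = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.4 * np.exp(-((t - 0.45) / 0.1) ** 2)
sse, gamma, coef = fit_expar(spike)
print(f"gamma={gamma:.2f}, SSE={sse:.4g}, coefficients={np.round(coef, 3)}")
```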
86 Microbial Production of Levan using Date Syrup and Investigation of Its Properties
Authors: Marzieh Moosavi-Nasab, Behnaz Layegh, Ladan Aminlari, Mohammad B. Hashemi
Abstract:
Levan, an exopolysaccharide, was produced by Microbacterium laevaniformans, and its yield was characterized as a function of the concentrations of date syrup and sucrose and of the fermentation time. The optimum condition for levan production was a 20% sucrose concentration for 48 h and a 25% date syrup concentration for 48 h. The results show that an increase in fermentation time caused a decrease in levan production at all concentrations of date syrup tested. Under these conditions, after 48 h levan production reached 48.9 g/L in the sucrose medium and 10.48 g/L in the date syrup medium. The effect of pH on the yield of the purified levan was examined, and the optimum pH for levan production was determined to be 6.0. Levan was composed mainly of fructose residues when analyzed by TLC and FT-IR spectroscopy. Date syrup is a cheap substrate widely available in Iran and has potential for levan production. The thermal stability of levan was assessed by Thermogravimetric Analysis (TGA), which revealed an onset of decomposition near 49 °C for the levan produced from sucrose and 51 °C for the levan from date syrup. DSC results showed a single Tg at 98 °C for levan produced from sucrose and at 206 °C for levan from date syrup.
Keywords: Date syrup, fermentation, levan, Microbacterium laevaniformans.
85 A Numerical Investigation of Lamb Wave Damage Diagnosis for Composite Delamination Using Instantaneous Phase
Authors: Haode Huo, Jingjing He, Rui Kang, Xuefei Guan
Abstract:
This paper presents a study of Lamb wave damage diagnosis of composite delamination using instantaneous phase data. Numerical experiments are performed using the finite element method. Different sizes of delamination damage are modeled in the finite element package ABAQUS. Lamb wave excitation and response data are obtained using a pitch-catch configuration. Empirical mode decomposition is employed to extract the intrinsic mode functions (IMFs). The Hilbert–Huang transform is applied to each of the resulting IMFs to obtain the instantaneous phase information. Baseline data for healthy plates are also generated using the same procedure. The size of the delamination is correlated with the instantaneous phase change for damage diagnosis. It is observed that the unwrapped instantaneous phase of the response signal shows a consistent trend with increasing delamination size.
Keywords: Delamination, Lamb wave, finite element method, EMD, instantaneous phase.
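A hedged sketch of the signal-processing step described above: EMD followed by the Hilbert transform to obtain unwrapped instantaneous phase. It assumes the PyEMD package ("EMD-signal") and SciPy are available, and uses a synthetic tone burst rather than the ABAQUS responses.

```python
# Sketch: instantaneous phase of a Lamb-wave-like signal via EMD + Hilbert transform.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 1.0e6                        # 1 MHz sampling (placeholder)
t = np.arange(0, 2e-3, 1 / fs)
burst = np.sin(2 * np.pi * 50e3 * t) * np.exp(-((t - 4e-4) / 1e-4) ** 2)
signal = burst + 0.05 * np.random.default_rng(1).normal(size=t.size)

imfs = EMD().emd(signal, t)       # intrinsic mode functions, highest frequency first

# Analytic signal of each IMF -> unwrapped instantaneous phase.
phases = [np.unwrap(np.angle(hilbert(imf))) for imf in imfs]

# A simple damage-sensitive feature: phase difference of the dominant IMF vs. a baseline.
baseline_phase = np.unwrap(np.angle(hilbert(burst)))
phase_shift = phases[0] - baseline_phase
print("mean phase change (rad):", float(np.mean(phase_shift)))
```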
84 Biodegradation of Lignocellulosic Residues of Water Hyacinth (Eichhornia crassipes) and Response Surface Methodological Approach to Optimize Bioethanol Production Using Fermenting Yeast Pachysolen tannophilus NRRL Y-2460
Authors: A. Manivannan, R. T. Narendhirakannan
Abstract:
The objective of this research was to investigate the biodegradation of water hyacinth (Eichhornia crassipes) for bioethanol production, using a dilute-acid pretreatment (1% sulfuric acid) that gives high hemicellulose decomposition and the yeast Pachysolen tannophilus as the ethanol-producing strain. A maximum ethanol yield of 1.14 g/L, with a yield coefficient of 0.24 g g-1 and a productivity of 0.015 g l-1 h-1, was compared with the predicted value of 32.05 g/L obtained by Central Composite Design (CCD). The maximum ethanol yield coefficient was comparable to those obtained through enzymatic saccharification and fermentation of acid hydrolysate in a fully equipped fermentor. Although the maximum ethanol concentration was low at laboratory scale, improving the lignocellulosic ethanol yield is necessary for large-scale production.
Keywords: Acid hydrolysis, Biodegradation, Hemicellulose, Pachysolen tannophilus, Water hyacinth.
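As a hedged illustration of the response-surface step mentioned above, the sketch below fits a second-order response-surface model to central-composite-design-style data with scikit-learn. The factor names, coded levels, and yield values are hypothetical placeholders, not the authors' experimental design.

```python
# Sketch: fitting a quadratic response-surface model of ethanol yield to CCD-style data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Coded factor levels (e.g. acid concentration, temperature, time) and responses (made up).
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([0.6, 0.8, 0.7, 1.0, 0.9, 1.1, 1.0, 1.14, 1.05, 1.02])  # g/L, placeholder

quad = PolynomialFeatures(degree=2, include_bias=True)
model = LinearRegression(fit_intercept=False).fit(quad.fit_transform(X), y)

# Predict the yield at a candidate optimum point in coded units.
x_opt = np.array([[0.5, 0.5, 0.8]])
print("predicted yield (g/L):", float(model.predict(quad.transform(x_opt))[0]))
```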
83 Improved Feature Processing for Iris Biometric Authentication System
Authors: Somnath Dey, Debasis Samanta
Abstract:
Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex and computationally very expensive process. In the overall processing of an iris-based biometric authentication system, feature processing is an important task: we extract iris features, which are ultimately used in matching. Since the number of iris features is large and the computation time increases with it, the challenge is to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching processes. We apply the Daubechies D4 wavelet with 4 decomposition levels to extract features from iris images. These features are encoded with 2 bits each by quantizing them into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy, and we match the iris templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
Keywords: Iris recognition, biometric, feature processing, pattern recognition, pattern matching.
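A hedged sketch of the pipeline described above: a four-level wavelet decomposition followed by 2-bit quantization and a weighted similarity measure. It uses PyWavelets, where the 4-tap Daubechies filter called "D4" in the paper is named "db2"; the iris strip is a random placeholder and the 304-bit template length is not reproduced exactly.

```python
# Sketch: wavelet-based iris feature extraction with 2-bit quantization.
import numpy as np
import pywt

rng = np.random.default_rng(0)
iris_strip = rng.random((64, 256))                 # normalized iris region (placeholder)

# Four-level 2D decomposition; keep the coarse approximation as the feature vector.
coeffs = pywt.wavedec2(iris_strip, wavelet="db2", level=4)
features = coeffs[0].ravel()

# Quantize each coefficient into 4 levels -> 2 bits per feature.
edges = np.quantile(features, [0.25, 0.5, 0.75])
codes = np.digitize(features, edges)               # values in {0, 1, 2, 3}
template_bits = np.unpackbits(codes.astype(np.uint8)[:, None], axis=1)[:, -2:].ravel()

def weighted_similarity(a, b, w=None):
    """Weighted fraction of matching bits (uniform weights as a placeholder)."""
    w = np.ones_like(a, dtype=float) if w is None else w
    return float(np.sum(w * (a == b)) / np.sum(w))

print("template length (bits):", template_bits.size)
print("self-similarity:", weighted_similarity(template_bits, template_bits))
```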
82 Implementation of SU-MIMO and MU-MIMO GTD-System under Imperfect CSI Knowledge
Authors: Parit Kanjanavirojkul, Kiatwarakorn Keeratishananond, Prapun Suksompong
Abstract:
We study the performance of a compressed beamforming weights feedback technique in a generalized triangular decomposition (GTD) based MIMO system. GTD is a beamforming technique that enjoys QoS flexibility. The technique, however, performs at its optimum only when full knowledge of the channel state information (CSI) is available at the transmitter, which is impossible in a real system, where there are channel estimation errors and limited feedback. We suggest a way to implement quantized beamforming weights feedback, which can significantly reduce the feedback data, on a GTD-based MIMO system and investigate the performance of the system. Interestingly, we found that compressed beamforming weights feedback does not degrade the BER performance of the system at low input power, while channel estimation error and quantization do. For comparison, GTD is more sensitive to compression and quantization, while SVD is more sensitive to channel estimation error. We also explore the performance of a GTD-based MU-MIMO system and find that the BER performance starts to degrade significantly at around -20 dB channel estimation error.
Keywords: MIMO, MU-MIMO, GTD, imperfect CSI.
81 Phenolic-Based Chemical Production from Catalytic Depolymerization of Alkaline Lignin over Fumed Silica Catalyst
Authors: S. Totong, P. Daorattanachai, N. Laosiripojana
Abstract:
Lignin depolymerization into phenolic-based chemicals is an interesting process for utilizing lignin and upgrading its value. In this study, the depolymerization reaction was performed to convert alkaline lignin into smaller-molecule compounds. Fumed SiO₂ was used as a catalyst to improve catalytic activity in lignin decomposition. The important parameters of the depolymerization process (i.e., reaction temperature, reaction time, etc.) were also investigated. In addition, gas chromatography with mass spectrometry (GC-MS), gas chromatography with a flame ionization detector (GC-FID), and Fourier transform infrared spectroscopy (FT-IR) were used to analyze and characterize the lignin products. It was found that the fumed SiO₂ catalyst gave good catalytic activity in lignin depolymerization. The main products from the catalytic depolymerization were guaiacol, syringol, vanillin, and phenols. Additionally, metals supported on fumed SiO₂, such as Cu/SiO₂ and Ni/SiO₂, increased the catalytic activity in terms of phenolic product yield.
Keywords: Alkaline lignin, catalytic, depolymerization, fumed SiO2, phenolic-based chemicals.
80 Robust Digital Cinema Watermarking
Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi
Abstract:
With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques and name it "traceable watermarking", because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user. In our method, we embed the watermark into the lowest part of each image frame of the decoded video by using a hardware LSI. Digital cinema is an important application for traceable watermarking, since digital cinema systems make use of watermarking technology during content encoding, encryption, transmission, decoding, and all intermediate processes. The watermark is embedded into randomly selected movie frames using hash functions, and the embedded watermark information can be extracted from the decoded video data without any need to access the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed, simplicity, and structural robustness.
Keywords: Decoder, digital content, JPEG2000 frame, System-on-Chip, traceable watermark, hash function, CRC-32.
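A hedged, software-only sketch of the two ideas mentioned above: selecting frames with a keyed hash and embedding watermark bits into the least-significant bits of the lowest part of each selected frame. This is a generic NumPy illustration, not the authors' hardware (LSI) implementation; the key, payload, frame size, and selection rule are placeholders.

```python
# Sketch: hash-based frame selection and LSB watermark embedding/extraction.
import hashlib
import numpy as np

KEY = b"placeholder-key"
payload_bits = np.unpackbits(np.frombuffer(b"user-42", dtype=np.uint8))

def frame_selected(index: int) -> bool:
    """Deterministically pick roughly 1 in 8 frames from a keyed hash."""
    digest = hashlib.sha256(KEY + index.to_bytes(4, "big")).digest()
    return digest[0] % 8 == 0

def embed(frame: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Overwrite the LSBs of the first len(bits) pixels of the bottom row."""
    marked = frame.copy()
    row = marked[-1]                       # "lowest part" of the frame
    row[: bits.size] = (row[: bits.size] & 0xFE) | bits
    return marked

def extract(frame: np.ndarray, n_bits: int) -> np.ndarray:
    return frame[-1, :n_bits] & 1

frames = [np.random.default_rng(i).integers(0, 256, (480, 640), dtype=np.uint8)
          for i in range(16)]
selected = [i for i in range(len(frames)) if frame_selected(i)]
watermarked = [embed(f, payload_bits) if i in selected else f
               for i, f in enumerate(frames)]
print("frames carrying the mark:", selected)
if selected:
    recovered = extract(watermarked[selected[0]], payload_bits.size)
    print("payload recovered:", bool(np.array_equal(recovered, payload_bits)))
```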
79 Source of Oseltamivir Resistance Due to R152K Mutation of Influenza B Virus Neuraminidase: Molecular Modeling
Authors: J. Tengrang, T. Rungrotmongkol, S. Hannongbua
Abstract:
Every 2-3 years the influenza B virus causes epidemics. Neuraminidase (NA) is an important target for influenza drug design. Although oseltamivir, an oral neuraminidase inhibitor, has shown good inhibitory efficiency against the wild-type influenza B virus, lower susceptibility of the R152K mutant has been reported. A better understanding of oseltamivir efficiency against the wild-type influenza B NA and of resistance in the R152K mutant could be useful for rational drug design. Here, the two complex systems of wild-type and R152K NAs with oseltamivir bound were studied using molecular dynamics (MD) simulations. Based on 5-ns MD simulations, the loss of a notable hydrogen bond and the decrease in the per-residue decomposition energy contributed to the drug by the mutated residue K152, compared with R152 in the wild-type, were found to be a primary source of the high-level oseltamivir resistance caused by the R152K mutation.
Keywords: Influenza B neuraminidase, molecular dynamics simulation, oseltamivir resistance, R152K mutant.
78 Contribution to the Analytical Study of Barrier Surface Waves: Decomposition of the Solution
Authors: T. Zitoun, M. Bouhadef
Abstract:
When a partially or completely immersed solid moves in a liquid such as water, it undergoes a force called hydrodynamic drag. Reducing this force, i.e. making water slide better over submerged bodies, has always been an objective of hydrodynamic engineers. This paper deals with the examination of the different terms composing the analytical solution of the flow over an obstacle embedded at the bottom of a hydraulic channel. We have chosen a linear method to study a two-dimensional flow over an obstacle, in order to understand the evolution of the drag. We set the following assumptions: incompressible, inviscid fluid; irrotational flow; obstacle height small compared to the water depth. These assumptions overcome the difficulties associated with modelling these waves. We mathematically formulate the equations that determine the stream function and then the free-surface equation. A similar method is used to determine the exact analytical solution for an obstacle in the shape of a sinusoidal arch.
Keywords: Free-surface wave, inviscid fluid, analytical solution, hydraulic channel.
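For orientation, a hedged sketch of the classical linearized potential-flow formulation that these assumptions lead to, written here with a uniform upstream velocity U, undisturbed depth H, perturbation stream function ψ, and bottom obstacle profile z = b(x); the paper's exact notation and decomposition of the solution may differ.

```latex
\[
\nabla^{2}\psi = 0 \quad \text{in } 0 < z < H,
\]
\[
\psi(x,0) = -\,U\,b(x) \quad \text{(linearized condition on the bottom obstacle)},
\]
\[
U^{2}\,\frac{\partial \psi}{\partial z}(x,H) - g\,\psi(x,H) = 0 \quad \text{(combined linearized free-surface condition)},
\]
\[
\eta(x) = -\,\frac{\psi(x,H)}{U} \quad \text{(free-surface elevation)}.
\]
```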
77 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D’Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: Data assimilation, parallel algorithm, GPU architectures, ocean models.
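As a hedged illustration of the underlying variational problem, the sketch below minimizes the standard 3D-Var cost function on a toy state vector with SciPy. The background, observation operator, and covariances are small synthetic placeholders, not the DD-DA ocean model or its GPU implementation.

```python
# Sketch: minimizing the 3D-Var cost function
#   J(x) = 0.5*(x-xb)' B^-1 (x-xb) + 0.5*(Hx-y)' R^-1 (Hx-y)
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 20, 8                                  # state and observation sizes (placeholders)
xb = rng.normal(size=n)                       # background state
H = rng.normal(size=(m, n)) / np.sqrt(n)      # linear observation operator
x_true = xb + 0.3 * rng.normal(size=n)
y = H @ x_true + 0.05 * rng.normal(size=m)    # observations
B_inv = np.eye(n) / 0.3**2                    # inverse background covariance
R_inv = np.eye(m) / 0.05**2                   # inverse observation covariance

def cost_and_grad(x):
    db, do = x - xb, H @ x - y
    J = 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do
    grad = B_inv @ db + H.T @ (R_inv @ do)
    return J, grad

res = minimize(cost_and_grad, xb, jac=True, method="L-BFGS-B")
print("analysis error:  ", float(np.linalg.norm(res.x - x_true)))
print("background error:", float(np.linalg.norm(xb - x_true)))
```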
76 Scope of BOD, Nitrogen and Phosphorous Removal through Plant-Soil Interaction in the Wetland
Authors: Debabrata Mazumder
Abstract:
Constructed and natural wetlands are being used extensively to treat different types of wastewater, including domestic wastewater. Considerable removal efficiency has been achieved for a variety of pollutants such as BOD, nitrogen and phosphorus in wetlands. Wetland treatment appears to be the best choice for the treatment or pre-treatment of wastewater because of its low maintenance cost and simplicity of operation. Wetlands are natural exporters of organic carbon on account of the decomposition of organic matter. Emergent plants like reeds, bulrushes and cattails are commonly used in constructed wetlands, providing a surface for bacterial growth, filtration of solids, nutrient uptake and oxygenation to promote nitrification as well as denitrification. The present paper explored different scopes for organic matter (BOD), nitrogen and phosphorus removal from wastewater through wetlands. Emphasis is given to the soil chemistry for tracing the behavior of carbon, nitrogen and phosphorus in the wetland. Due consideration is also given to the viability of upgrading the BOD, nitrogen and phosphorus removal efficiency through different classical modifications of the wetland.
Keywords: BOD removal, modification, nitrogen removal, phosphorous removal, wetland.
75 Soil Compaction in Tropical Organic Farming Systems and Its Impact on Natural Soil-Borne Disease Suppression: Challenges for Management
Authors: Ishak, L., McHenry, M. T., Brown, P. H.
Abstract:
Organic farming systems still depend on intensive, mechanical soil tillage. Frequent passes by machinery traffic cause substantial soil compaction that threatens soil health. Adopting practices such as reduced tillage and retention of organic matter on the soil surface is considered an effective way to control soil compaction. In tropical regions, however, the decomposition of soil organic matter and the turnover of soil carbon in the topsoil layer are accelerated by the oscillation of drying and wetting. It is hypothesized, therefore, that the rapid reduction in soil organic matter hastens the potential for compaction to occur in organic farming systems. Compaction changes soil physical properties and, as a consequence, has been implicated as a causal agent in the inhibition of natural disease suppression in soils. Here we describe relationships between soil management in organic vegetable systems, soil compaction, and the declining capacity of soils to suppress pathogenic microorganisms.
Keywords: Organic farming systems, soil compaction, soil disease suppression, tropical regions.
74 An Effective Noise Resistant FM Continuous-Wave Radar Vital Sign Signal Detection Method
Authors: Lu Yang, Meiyang Song, Xiang Yu, Wenhao Zhou, Chuntao Feng
Abstract:
To address the problem that human vital sign signals extracted by frequency-modulated continuous-wave (FMCW) radar are susceptible to noise interference and suffer from low reconstruction accuracy, a detection scheme for these signals is proposed. First, an improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) algorithm is applied to decompose the radar-extracted thoracic signals into several intrinsic mode functions (IMFs) with different spatial scales, and then the IMF components are optimized by a backpropagation (BP) neural network improved by an immune genetic algorithm (IGA). The simulation results show that this scheme can effectively separate the noise, accurately extract the respiratory and heartbeat signals, and improve the reconstruction accuracy and signal-to-noise ratio of the vital sign signals.
Keywords: Frequency modulated continuous wave radar, ICEEMDAN, BP Neural Network, vital signs signal.
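A hedged sketch of the decomposition-plus-network idea in this abstract. PyEMD's CEEMDAN is used as an available stand-in for the ICEEMDAN variant, and scikit-learn's MLPRegressor stands in for the IGA-tuned BP network; the chest signal is a synthetic placeholder, not radar data.

```python
# Sketch: CEEMDAN decomposition of a respiration+heartbeat-like signal, then a
# small neural network recombines the IMFs toward a clean reference.
import numpy as np
from PyEMD import CEEMDAN
from sklearn.neural_network import MLPRegressor

fs = 20.0                                   # Hz (placeholder sampling rate)
t = np.arange(0, 30, 1 / fs)
resp = 4.0 * np.sin(2 * np.pi * 0.25 * t)   # ~15 breaths/min
heart = 0.5 * np.sin(2 * np.pi * 1.2 * t)   # ~72 beats/min
chest = resp + heart + 0.3 * np.random.default_rng(0).normal(size=t.size)

imfs = CEEMDAN()(chest)                     # rows: IMFs from high to low frequency

# Map the IMFs (one sample per time step) back to the clean reference signal.
X, target = imfs.T, resp + heart
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, target)
recon = net.predict(X)
snr = 10 * np.log10(np.sum(target**2) / np.sum((recon - target)**2))
print(f"reconstruction SNR: {snr:.1f} dB")
```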
73 Decoupled, Reduced Order Model for Double Output Induction Generator Using Integral Manifolds and Iterative Separation Theory
Authors: M. Sedighizadeh, A. Rezazadeh
Abstract:
This paper presents a technique for improving the computational efficiency of simulating double output induction generators (DOIG) with two rotor circuits when stator transients are to be included. Iterative decomposition is used to separate the flux-linkage equations into decoupled fast and slow subsystems, after which the model order of the fast subsystems is reduced by neglecting the heavily damped fast transients caused by the second rotor circuit, using integral manifold theory. The two decoupled subsystems, along with the equation for the very slowly changing slip, constitute a three-time-scale model of the machine, which increases computational speed. Finally, the proposed reduced-order method is compared with other conventional methods in linear and nonlinear modes, and it is shown to be better regarding simulation accuracy and speed.
Keywords: DOIG, iterative separation, integral manifolds, reduced order.
72 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU in one period that are used as inputs in a future period. Those outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and the analytical power of the traditional DDEA models. In this paper, using the concept of piecewise linear inputs and outputs, the authors propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.
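For background on the optimization underlying DEA, a hedged sketch of the classical static, input-oriented CCR envelopment model solved as a linear program with SciPy; the dynamic piecewise-linear extension proposed in the paper is not reproduced, and the input/output data are placeholders.

```python
# Sketch: static input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])  # DMUs x inputs
Y = np.array([[1.0],      [1.0],      [1.5],      [2.0]])        # DMUs x outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o: int) -> float:
    """min theta  s.t.  X^T lam <= theta*x_o,  Y^T lam >= y_o,  lam >= 0."""
    c = np.concatenate(([1.0], np.zeros(n)))          # decision variables: [theta, lam]
    A_inputs = np.hstack((-X[o][:, None], X.T))       # X^T lam - theta*x_o <= 0
    A_outputs = np.hstack((np.zeros((s, 1)), -Y.T))   # -Y^T lam <= -y_o
    A_ub = np.vstack((A_inputs, A_outputs))
    b_ub = np.concatenate((np.zeros(m), -Y[o]))
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```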
71 Effects of Additives on Thermal Decompositions of Carbon Black/High Density Polyethylene Compounds
Authors: Orathai Pornsunthorntawee, Wareerom Polrut, Nopphawan Phonthammachai
Abstract:
In the present work, the effects of additives, including the content of the added antioxidants (AO) and the type of the selected metallic stearate (either calcium stearate (CaSt) or zinc stearate (ZnSt)), on the thermal stability of carbon black (CB)/high density polyethylene (HDPE) compounds were studied. The results showed that the AO content played a key role in the thermal stability of the CB/HDPE compounds: the higher the AO content, the higher the thermal stability. Although the CaSt-containing compounds were slightly superior to those with ZnSt in terms of thermal stability, the solid residue remaining after CaSt was heated to 600 °C (mainly calcium carbonate (CaCO3), as characterized by the X-ray diffraction (XRD) technique) seemed to catalyze the decomposition of CB in the HDPE-based compounds. Hence, the quantification of CB in the CaSt-containing compounds with a muffle furnace gave an inaccurate CB content, much lower than the actual value. This phenomenon was negligible in the ZnSt-containing system.
Keywords: Antioxidant, Stearate, Carbon black, Polyethylene.
70 Use of Natural Fibers in Landfill Leachate Treatment
Authors: J. F. Marina Araujo, F. Marcus Vinicius Araujo, R. Daniella Mulinari
Abstract:
Because the leachate resulting from waste decomposition in landfills has a polluting potential a hundred times greater than that of domestic sewage, it is considered an environmental problem requiring treatment before disposal. Seeking to improve this situation, this project proposes the treatment of landfill leachate using natural fibers combined with advanced oxidation processes. The selected natural fibers were palm, coconut and banana fibers. These materials give sustainability to the project because, besides having adsorbent capacity, they are often discarded as waste. The study was conducted at laboratory scale. In the trials, the effluents were characterized in terms of Chemical Oxygen Demand (COD), turbidity and color. The results indicate that the approach is technically promising: under strongly oxidative conditions, the use of certain natural fibers to reduce pollutants in the leachate achieved COD removals between 67.9% and 90.9%, turbidity removals between 88.0% and 99.7%, and color removals between 67.4% and 90.4%. The expectation is to continue evaluating the efficiency of other natural fibers in association with other landfill leachate treatment processes.
Keywords: Landfill leachate, chemical treatment, natural fibers, advanced oxidation processes.
69 Active Segment Selection Method in EEG Classification Using Fractal Features
Authors: Samira Vafaye Eslahi
Abstract:
A BCI (Brain Computer Interface) is a communication machine that translates brain messages into computer commands. With the help of computer programs, these machines can recognize the tasks that are imagined. Feature extraction is an important stage of EEG classification that affects the accuracy and the computation time of signal processing. In this study we process the signal in three steps: active segment selection, fractal feature extraction, and classification. One of the great challenges in BCI applications is to improve classification accuracy and computation time together. In this paper, we have used Student's 2D sample t-statistics on continuous wavelet transforms for active segment selection to reduce the computation time. At the next level, features are extracted using well-known fractal dimension estimators of the signal, namely the Katz and Higuchi fractal dimensions. In the classification stage we used the ANFIS (Adaptive Neuro-Fuzzy Inference System), FKNN (Fuzzy K-Nearest Neighbors), LDA (Linear Discriminant Analysis), and SVM (Support Vector Machines) classifiers. We found that the active segment selection method reduces the computation time, and that fractal dimension features with ANFIS analysis on the selected active segments give the best results among the investigated methods for EEG classification.
Keywords: EEG, Student’s t- statistics, BCI, Fractal Features, ANFIS, FKNN.
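A hedged sketch of one of the fractal features named above, the Katz fractal dimension of a 1-D signal segment; the EEG segment here is a synthetic placeholder.

```python
# Sketch: Katz fractal dimension of a signal segment.
import numpy as np

def katz_fd(x: np.ndarray) -> float:
    """Katz fractal dimension: FD = log10(n) / (log10(n) + log10(d / L)),
    where L is the total curve length, d the maximum distance from the first
    sample, and n the number of steps (unit spacing between samples)."""
    x = np.asarray(x, dtype=float)
    steps = np.sqrt(1.0 + np.diff(x) ** 2)
    L = steps.sum()
    n = len(steps)
    d = np.max(np.sqrt(np.arange(1, len(x)) ** 2 + (x[1:] - x[0]) ** 2))
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
segment = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
print(f"Katz FD: {katz_fd(segment):.3f}")
```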
68 Solar Radiation Time Series Prediction
Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs
Abstract:
A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location an hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Owing to their ability to act as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled direct normal irradiance field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation field was preprocessed with a discrete wavelet transform with the aim of removing noise from the measurements. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model's accuracy.
Keywords: Artificial Neural Networks, Resilient Propagation, Solar Radiation, Time Series Forecasting.
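A hedged sketch of the discrete-wavelet-transform denoising step mentioned above, using PyWavelets. The series, the 'db4' wavelet choice, and the soft universal threshold are illustrative placeholders, not the paper's configuration.

```python
# Sketch: DWT denoising of a solar-radiation-like hourly series.
import numpy as np
import pywt

rng = np.random.default_rng(0)
hours = np.arange(24 * 30)                                  # 30 days, hourly
clean = np.clip(np.sin(np.pi * (hours % 24) / 24), 0, None) # daily irradiance shape
noisy = clean + 0.1 * rng.normal(size=hours.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745              # noise estimate from finest scale
thresh = sigma * np.sqrt(2 * np.log(noisy.size))            # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                 for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: noisy.size]

print("RMSE before:", float(np.sqrt(np.mean((noisy - clean) ** 2))))
print("RMSE after: ", float(np.sqrt(np.mean((denoised - clean) ** 2))))
```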
67 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology
Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner
Abstract:
Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an imminent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are therefore utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.
Keywords: Health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics.
66 Development of a Nano-Alumina-Zirconia Composite Catalyst as an Active Thin Film in Biodiesel Production
Authors: N. Marzban, J. K. Heydarzadeh M. Pourmohammadbagher, M. H. Hatami, A. Samia
Abstract:
A nano-alumina-zirconia composite catalyst was synthesized by a simple aqueous sol-gel method using AlCl3.6H2O and ZrCl4 as precursors. Thermal decomposition of the precursor and the subsequent formation of γ-Al2O3 and t-ZrO2 were investigated by thermal analysis. XRD analysis showed that the γ-Al2O3 and t-ZrO2 phases were formed at 700 °C. FT-IR analysis also indicated that the phase transition to γ-Al2O3 occurred, in corroboration with the X-ray studies. TEM analysis of the calcined powder revealed spherical particles in the range of 8-12 nm. The nano-alumina-zirconia composite particles were mesoporous and uniformly distributed in their crystalline phase. In order to measure the catalytic activity, an esterification reaction was carried out. Biodiesel, as a renewable fuel, was formed in a continuous packed-column reactor, in which free fatty acid (FFA) was esterified with ethanol over the heterogeneous catalyst. It was found that the synthesized γ-Al2O3/ZrO2 composite has the potential to be used as a heterogeneous base catalyst for biodiesel production processes.
Keywords: Nano-alumina-zirconia, composite catalyst, thin film, biodiesel.
65 Development of a Neural Network based Algorithm for Multi-Scale Roughness Parameters and Soil Moisture Retrieval
Authors: L. Bennaceur Farah, I. R. Farah, R. Bennaceur, Z. Belhadj, M. R. Boussema
Abstract:
The overall objective of this paper is to retrieve soil surface parameters, namely roughness and soil moisture (related to the dielectric constant), by inverting the radar backscattered signal from natural soil surfaces. Because the classical description of roughness using statistical parameters like the correlation length does not lead to satisfactory results for predicting radar backscattering, we used a multi-scale roughness description based on the wavelet transform and the Mallat algorithm. In this description, the surface is considered as a superposition of a finite number of one-dimensional Gaussian processes, each having a spatial scale. A second step of this study consisted of adapting a direct model simulating radar backscattering, namely the small perturbation model, to this multi-scale surface description. We investigated the impact of this description on radar backscattering through a sensitivity analysis of the backscattering coefficient to the multi-scale roughness parameters. To perform the inversion of the small perturbation multi-scale scattering model (MLS SPM), we used a multi-layer neural network architecture trained by the backpropagation learning rule. The inversion leads to satisfactory results, with a relative uncertainty of 8%.
Keywords: Remote sensing, rough surfaces, inverse problems, SAR, radar scattering, neural networks, fractals.
64 Mechanisms of Organic Contaminants Uptake and Degradation in Plants
Authors: E. Kvesitadze, T. Sadunishvili, G. Kvesitadze
Abstract:
As a result of urbanization, the unpredictable growth of industry and transport, the production of chemicals, military activities, etc., the concentration of anthropogenic toxicants spread in nature exceeds all permissible standards. Most dangerous among these contaminants are organic compounds having great persistence, bioaccumulation and toxicity, together with a prominent occurrence in the environment and the food chain. Among natural ecological tools, plants, which still occupy above 40% of the world's land, were until recently considered organisms with only a limited ecological potential, merely accumulating contaminants of different structures in their biomass and partially volatilizing them. However, analysis of the experimental data of the last two decades has revealed the essential role of plants in environmental remediation, due to their ability to carry out intracellular degradation processes leading to partial or complete decomposition of the carbon skeleton of contaminants of different structures. Though phytoremediation technologies are still in research and development, various applications have already been used successfully. The paper aims to analyze the mechanisms of organic contaminant uptake and detoxification in plants, the less studied issue in the evaluation and exploitation of the potential of plants for environmental remediation.
Keywords: Organic contaminants, detoxification, metalloenzymes, plant ultrastructure.
63 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparsifying transforms and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) allowing accurate reconstruction in this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
Keywords: Telemedicine, fetal electrocardiogram, compressed sensing, joint sparse reconstruction, block sparse signal.
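A hedged sketch of the compressed-sensing setup described above: a Bernoulli measurement matrix with standard orthogonal matching pursuit (from scikit-learn) as a simpler stand-in for the JBMOLS algorithm. The sparse test signal is synthetic, and the Rbio5.5 sparsifying transform of the paper is not reproduced here.

```python
# Sketch: compressed sensing with a Bernoulli measurement matrix and OMP recovery.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                         # signal length, measurements, sparsity

x = np.zeros(n)                              # k-sparse signal (placeholder for
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)  # wavelet coefficients)

Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)  # Bernoulli measurement matrix
y = Phi @ x                                  # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(Phi, y)
x_hat = omp.coef_

err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"compression ratio: {n / m:.2f}, relative reconstruction error: {err:.2e}")
```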
62 Fabrication and Characterization of CdS Nanoparticles Annealed by using Different Radiations
Authors: Aneeqa Sabah, Saadat Anwar Siddiqi, Salamat Ali
Abstract:
The systematic manipulation of the shapes and sizes of inorganic compounds greatly benefits various application fields, including optics, magnetics, electronics, catalysis and medicine; however, shape control has been much more difficult to achieve. Hence, the exploration of novel methods for the preparation of differently shaped nanoparticles is a challenging research area. Nanostructures of the II-VI semiconductor cadmium sulphide (CdS) with different morphologies (such as acicular, mesoporous and spherical shapes) and crystallite sizes varying from 11 to 16 nm were successfully synthesized by aqueous chemical precipitation of Cd2+ ions with S2- ions homogeneously released from the decomposition of cadmium sulphate (CdSO4) and thioacetamide (CH3CSNH2), with annealing under different radiations (microwave, ultrasonic and sunlight), and systematic research has been done on the various factors affecting the controlled growth rate of the CdS nanoparticles. The obtained nanomaterials were characterized by X-ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR), thermogravimetric (DSC-TGA) analysis and Scanning Electron Microscopy (SEM). The results indicate that the particle size increases with increasing reaction time, but the grain size decreases with increasing molar ratio.
Keywords: CdS nanoparticles, morphology, oxidation, radiations.
61 Geometric Data Structures and Their Selected Applications
Authors: Miloš Šeda
Abstract:
Finding the shortest path between two positions is a fundamental problem in transportation, routing, and communications applications. In robot motion planning, the robot should pass around the obstacles touching none of them, i.e. the goal is to find a collision-free path from a starting to a target position. This task has many specific formulations depending on the shape of obstacles, allowable directions of movement, knowledge of the scene, etc. Research on path planning has yielded many fundamentally different approaches to its solution, mainly based on various decomposition and roadmap methods. In this paper, we show a possible use of visibility graphs in point-to-point motion planning in the Euclidean plane and an alternative approach using Voronoi diagrams that decreases the probability of collisions with obstacles. The second application area investigated here focuses on problems of finding minimal networks connecting a set of given points in the plane, using either only straight connections between pairs of points (minimum spanning tree) or allowing the addition of auxiliary points to the set to obtain shorter spanning networks (minimum Steiner tree).
Keywords: Motion planning, spanning tree, Steiner tree, Delaunay triangulation, Voronoi diagram.
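A hedged sketch of two of the structures discussed above, built with SciPy: a Delaunay triangulation of random points (whose dual is the Voronoi diagram) and a Euclidean minimum spanning tree computed on the Delaunay edges, using the standard fact that the Euclidean MST is a subgraph of the Delaunay triangulation. The point set is a random placeholder.

```python
# Sketch: Delaunay triangulation and Euclidean minimum spanning tree with SciPy.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
pts = rng.random((30, 2))

tri = Delaunay(pts)

# Collect the unique edges of the triangulation with their Euclidean lengths.
edges = set()
for a, b, c in tri.simplices:
    edges.update({tuple(sorted((a, b))), tuple(sorted((b, c))), tuple(sorted((a, c)))})
rows, cols = zip(*edges)
weights = np.linalg.norm(pts[list(rows)] - pts[list(cols)], axis=1)

graph = csr_matrix((weights, (rows, cols)), shape=(len(pts), len(pts)))
mst = minimum_spanning_tree(graph)
print("MST total length:", float(mst.sum()))
print("number of MST edges:", mst.nnz)
```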
60 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue with imperfect CSI is to keep the outage probability of each user's achievable rate below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods have been used to solve the transmit optimization problem under imperfect CSI. Here, convex restriction methods based on a decomposition-based large deviation inequality and on a Bernstein-type inequality are used to handle the optimization problem under imperfect CSI. These methods achieve improved output quality and lower complexity, and they provide a safe, tractable approximation of the original rate outage constraints. Based on these implementations, the performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: Imperfect channel state information, outage probability, multiuser multi-input single-output.
59 Agent/Group/Role Organizational Model to Simulate an Industrial Control System
Authors: Noureddine Seddari, Mohamed Belaoued, Salah Bougueroua
Abstract:
The modeling of complex systems is generally based on the decomposition of their components into sub-systems that are easier to handle. This division has to be made in a methodical way. In this paper, we introduce the modeling and simulation of an industrial control system based on the Multi-Agent System (MAS) methodology AALAADIN and, more particularly, on its underlying conceptual model Agent/Group/Role (AGR). In this division using the AGR model, the overall system is decomposed into sub-systems in order to improve the understanding of regulation and control systems and to simplify the implementation of the resulting agents and their groups, which are implemented using the Multi-Agents Development KIT (MAD-KIT) platform. This approach appears to us to be the most appropriate for modeling this type of system because, thanks to the use of MAS, it is possible to model real systems in which very complex behaviors emerge from relatively simple and local interactions between many different individuals. A MAS is therefore well adapted to describing a system from the standpoint of the activity of its components, that is to say when the behavior of the individuals is complex (difficult to describe with equations). The main aim of this approach is to take advantage of the performance, scalability and robustness that are intuitively provided by MAS.
Keywords: Complex systems, modeling and simulation, industrial control system, MAS, AALAADIN, AGR, MAD-KIT.
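A hedged, conceptual sketch of the Agent/Group/Role organizational idea described above, written in Python. It only illustrates agents joining groups and taking roles; it is not MAD-KIT (a Java platform), and the group and role names for an industrial control loop are hypothetical.

```python
# Sketch: a minimal Agent/Group/Role (AGR) organizational structure.
from collections import defaultdict

class Organization:
    def __init__(self):
        # group -> role -> set of agent names
        self.groups = defaultdict(lambda: defaultdict(set))

    def request_role(self, agent: str, group: str, role: str) -> None:
        self.groups[group][role].add(agent)

    def members(self, group: str, role: str):
        return sorted(self.groups[group][role])

class Agent:
    def __init__(self, name: str, org: Organization):
        self.name, self.org = name, org

    def join(self, group: str, role: str) -> None:
        self.org.request_role(self.name, group, role)

org = Organization()
for name, role in [("sensor-1", "measurer"), ("pid-1", "controller"),
                   ("valve-1", "actuator"), ("scada", "supervisor")]:
    Agent(name, org).join("regulation-loop", role)

print(org.members("regulation-loop", "controller"))   # ['pid-1']
```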