Search results for: curvelet transform
133 Synthesis, Characterization and Performance Study of Newly Developed Amine Polymeric Membrane (APM) for Carbon Dioxide (CO2) Removal
Authors: Rizwan Nasir, Hilmi Mukhtar, Zakaria Man, Dzeti Farhah Mohshim
Abstract:
Carbon dioxide is strongly associated with the greenhouse effect and, due to its corrosive nature, is an undesirable compound. A variety of physical-chemical processes are available for the removal of carbon dioxide. Previous work in this field has established that the alkanolamine group has the capability to remove carbon dioxide. This study therefore combined polymeric membranes and alkanolamine solutions to fabricate amine polymeric membranes (APM) for removing carbon dioxide (CO2). The study examines the effect of three types of amines: monoethanolamine (MEA), diethanolamine (DEA), and methyldiethanolamine (MDEA). The effect of each alkanolamine on the morphology and performance of polyether sulfone (PES) polymeric membranes was studied. Flat-sheet membranes were fabricated by the solvent evaporation method, adding polymer and the different alkanolamine solutions to N-Methyl-2-pyrrolidone (NMP) solvent. The final membranes were characterized using Field Emission Scanning Electron Microscopy (FESEM), Fourier Transform Infrared spectroscopy (FTIR), and Thermo-Gravimetric Analysis (TGA), and their separation performance was studied. The PES-DEA and PES-MDEA membranes showed good ability to remove carbon dioxide.
Keywords: Amine Polymeric membrane, Alkanolamine solution, CO2 Removal, Characterization.
132 Parallel-computing Approach for FFT Implementation on Digital Signal Processor (DSP)
Authors: Yi-Pin Hsu, Shin-Yu Lin
Abstract:
An efficient parallel formulation on a digital signal processor can improve algorithm performance. The butterfly structure plays an important role in the fast Fourier transform (FFT), because its symmetric form is suitable for hardware implementation. Although the structure is symmetric, performance is reduced by the data-dependent flow characteristic. Recent research on novel memory reference reduction methods (NMRRM) for the FFT focuses on reducing memory references to twiddle factors, but the data-dependent property still remains. In this paper, we propose a parallel-computing approach for FFT implementation on a digital signal processor (DSP) that is based on a data-independent property while still retaining low memory reference. The proposed method combines the final two steps of the NMRRM FFT into a novel data-independent structure, and it is well suited to multi-operation-unit DSPs and dual-core systems. We applied the proposed method to a low-memory-reference radix-2 FFT algorithm on a TI TMS320C64x DSP. Experimental results show the method reduces clock cycles by 33.8% compared with the NMRRM FFT implementation while keeping the low-memory-reference property.
Keywords: Parallel-computing, FFT, low-memory reference, TIDSP.
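The butterfly structure the abstract refers to can be sketched in a few lines. The following is a generic textbook radix-2 decimation-in-time FFT in Python, not the authors' TMS320C64x implementation; the function name is illustrative.

```python
import cmath

def fft_radix2(x):
    """Iterative radix-2 decimation-in-time FFT (length must be a power of two)."""
    n = len(x)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    # Bit-reversal permutation puts inputs in butterfly order.
    a = list(x)
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            a[i], a[j] = a[j], a[i]
    # Butterfly stages: each stage combines pairs using twiddle factors.
    size = 2
    while size <= n:
        w_step = cmath.exp(-2j * cmath.pi / size)
        for start in range(0, n, size):
            w = 1.0
            for k in range(size // 2):
                u = a[start + k]
                v = a[start + k + size // 2] * w
                a[start + k] = u + v
                a[start + k + size // 2] = u - v
                w *= w_step
        size *= 2
    return a
```

Each inner loop iteration is one butterfly; the data-dependent flow the abstract mentions arises because a stage's outputs feed the next stage's inputs.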
131 A Business Model Design Process for Social Enterprises: The Critical Role of the Environment
Authors: Hadia Abdel Aziz, Raghda El Ebrashi
Abstract:
Business models are shaped by their design space, i.e., the environment in which they are designed to be implemented. The rapidly changing economic, technological, political, regulatory and market environment severely affects business logic. This is particularly true for social enterprises, whose core mission is to transform their environments, and whose whole business logic therefore revolves around the interchange between the enterprise and its environment. The context in which a social business operates imposes business design constraints while, at the same time, opening up new design opportunities. That context is also affected to a great extent by the impact that successful enterprises generate; this continuous loop of interaction needs to be managed through a dynamic capability in order to generate a lasting, powerful impact. This conceptual research synthesizes and analyzes the literature on social enterprise, social enterprise business models, business model innovation, business model design, and the open system view theory to propose a new business model design process for social enterprises that takes into account the critical role of environmental factors. This process would help the social enterprise develop a dynamic capability that ensures the alignment of its business model with its environmental context, thus maximizing its probability of success.
Keywords: Social enterprise, business model, business model design.
130 A Mapping Approach of Code Generation for ARINC653-Based Avionics Software
Authors: Lu Zou, Dianfu MA, Ying Wang, Xianqi Zhao
Abstract:
Avionics software architecture has transitioned from a federated avionics architecture to integrated modular avionics (IMA). ARINC 653 (Avionics Application Standard Software Interface) is a software specification for space and time partitioning in safety-critical avionics real-time operating systems. Methods to transform abstract avionics application logic into an executable model have been proposed, but with little consideration of the code-generation input and output models specific to the ARINC 653 platform, or of the inner-task synchronous dynamic interaction order. In this paper, we propose an AADL-based model-driven design methodology that automatically generates a Cµ executable model on the ARINC 653 platform from an ARINC 653 architecture defined in AADL653, in order to facilitate the development of avionics software built on an ARINC 653 OS. This paper presents the mapping rules between AADL653 elements and Cµ language elements, defines the code-generation rules, and designs an automatic Cµ code generator. We then use a case study to illustrate our approach. Finally, we discuss related work and future research directions.
Keywords: IMA, ARINC653, AADL653, code generation.
129 Pension Plan Member’s Investment Strategies with Transaction Cost and Couple Risky Assets Modelled by the O-U Process
Authors: Udeme O. Ini, Edikan E. Akpanibah
Abstract:
This paper studies the optimal investment strategies for a plan member (PM) in a defined contribution (DC) pension scheme with transaction costs, taxes on invested funds, and a couple of risky assets (stocks) under the Ornstein-Uhlenbeck (O-U) process. The PM’s portfolio is assumed to consist of a risk-free asset and two risky assets, where the two risky assets are driven by the O-U process. The Legendre transformation and dual theory are used to transform the resultant optimal control problem, a nonlinear partial differential equation (PDE), into a linear PDE, which is then solved for explicit solutions of the optimal investment strategies of a PM exhibiting constant absolute risk aversion (CARA), using a change-of-variable technique. Furthermore, theoretical analysis is used to study the influence of some sensitive parameters on the optimal investment strategies, with the observation that the optimal investment strategies for the two risky assets increase with increasing dividend and decrease with increases in the tax on invested funds, the risk-aversion coefficient, the initial fund size and the transaction cost.
Keywords: Ornstein-Uhlenbeck process, portfolio management, Legendre transforms, CARA utility.
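The Ornstein-Uhlenbeck dynamics that drive the two risky assets can be simulated with a simple Euler-Maruyama step. This is a generic sketch of the O-U process itself, not the paper's optimal-control solution; the parameter names are illustrative.

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, n_steps, seed=0):
    """Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW:
    mean-reverting toward mu at speed theta with volatility sigma."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        x += theta * (mu - x) * dt + sigma * dw     # drift + diffusion
        path.append(x)
    return path
```

With sigma set to zero the path decays deterministically to the long-run mean mu, which is a quick sanity check on the drift term.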
128 Solid Dispersions of Cefixime Using β-Cyclodextrin: Characterization and in vitro Evaluation
Authors: Nagasamy Venkatesh Dhandapani, Amged Awad El-Gied
Abstract:
Cefixime, a BCS class II drug, is insoluble in water but freely soluble in acetone and in alcohol. The aqueous solubility of cefixime is poor, and it exhibits an exceptionally slow intrinsic dissolution rate. In the present study, cefixime and β-Cyclodextrin (β-CD) solid dispersions were prepared with a view to studying the effect and influence of β-CD on the solubility and dissolution rate of this poorly water-soluble drug. The phase solubility profile revealed that the solubility of cefixime was increased in the presence of β-CD and was classified as AL-type. The effect of variables, such as the drug:carrier ratio, was studied. The solid dispersions were physically characterized by Fourier transform infrared spectroscopy (FT-IR) and differential scanning calorimetry (DSC). These studies revealed a distinct loss of drug crystallinity in the solid molecular dispersions, ostensibly accounting for the enhancement of the dissolution rate in distilled water. Drug release from the prepared solid dispersions exhibited first-order kinetics. Solid dispersions of cefixime showed a 6.77-fold increase in dissolution rate over the pure drug.
Keywords: Cefixime, β-Cyclodextrin, solid dispersions, kneading method, dissolution, release kinetics.
127 A Frequency Grouping Approach for Blind Deconvolution of Fairly Motionless Sources
Authors: E. S. Gower, T. Tsalaile, E. Rakgati, M. O. J. Hawksford
Abstract:
A frequency grouping approach for multi-channel instantaneous blind source separation (I-BSS) of convolutive mixtures is proposed that achieves lower net residual inter-symbol interference (ISI) and inter-channel interference (ICI) than the conventional short-time Fourier transform (STFT) approach. Starting in the time domain, STFTs are taken with overlapping windows to convert the convolutive mixing problem into instantaneous mixing in the frequency domain. Mixture samples at the same frequency but from different STFT windows are grouped together, forming unique frequency groups. The individual frequency group vectors are input to the I-BSS algorithm of choice, and the output samples are dispersed back to their respective STFT windows. After applying the inverse STFT, the resulting time-domain signals are used to construct the complete source estimates via the weighted overlap-add (WOLA) method. The proposed algorithm is tested for source deconvolution given two mixtures, and simulated alongside the STFT approach to illustrate its superiority for fairly motionless sources.
Keywords: Blind source separation, short-time Fourier transform, weighted overlap-add method.
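The grouping step can be illustrated with a toy STFT: samples at the same frequency bin across overlapping windows are collected into one group vector. A minimal sketch with a rectangular window and a naive DFT, not the paper's implementation:

```python
import cmath

def stft(signal, win_len, hop):
    """STFT with a rectangular window: returns a list of per-frame DFT vectors."""
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        frame = signal[start:start + win_len]
        spectrum = [sum(frame[n] * cmath.exp(-2j * cmath.pi * k * n / win_len)
                        for n in range(win_len)) for k in range(win_len)]
        frames.append(spectrum)
    return frames

def frequency_groups(frames):
    """Group samples of the same frequency bin across STFT frames,
    as the frequency-grouping step does before I-BSS."""
    n_bins = len(frames[0])
    return [[frame[k] for frame in frames] for k in range(n_bins)]
```

Each group vector would then be passed to the chosen I-BSS algorithm, and its outputs dispersed back to the originating frames.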
126 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall
Authors: Sanjib Kr Pal, S. Bhattacharyya
Abstract:
Mixed convection of Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A co-ordinate transformation method is used to transform the computational domain into an orthogonal co-ordinate system. The governing equations in the computational domain are solved through a pressure-correction based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson number (0.1 ≤ Ri ≤ 5), nanoparticle volume concentration (0.0 ≤ ϕ ≤ 0.2), amplitude (0.0 ≤ α ≤ 0.1) of the thick wavy bottom wall, and wave number (ω), at a fixed Reynolds number. The results show that the heat transfer rate increases remarkably on adding the nanoparticles. The heat transfer rate depends on the wavy wall amplitude and wave number, and decreases with increasing Richardson number for fixed amplitude and wave number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.
Keywords: Entropy generation, mixed convection, conjugate heat transfer, numerical, nanofluid, wall waviness.
125 Automated Optic Disc Detection in Retinal Images of Patients with Diabetic Retinopathy and Risk of Macular Edema
Authors: Arturo Aquino, Manuel Emilio Gegundez, Diego Marin
Abstract:
In this paper, a new methodology to detect the optic disc (OD) automatically in retinal images from patients at risk of being affected by Diabetic Retinopathy (DR) and Macular Edema (ME) is presented. The detection procedure comprises two independent methodologies. On one hand, a location methodology obtains a pixel that belongs to the OD using image contrast analysis and structure filtering techniques; on the other hand, a boundary segmentation methodology estimates a circular approximation of the OD boundary by applying mathematical morphology, edge detection techniques and the Circular Hough Transform. The methodologies were tested on a set of 1200 images composed of 229 retinographies from patients affected by DR with risk of ME, 431 with DR and no risk of ME, and 540 images of healthy retinas. The location methodology obtained a 98.83% success rate, whereas the OD boundary segmentation methodology obtained a good circular OD boundary approximation in 94.58% of cases. The average computational time measured over the whole set was 1.67 seconds for OD location and 5.78 seconds for OD boundary segmentation.
Keywords: Diabetic retinopathy, macular edema, optic disc, automated detection, automated segmentation.
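The Circular Hough Transform used in the boundary step votes, for each edge pixel, over all centres that could place a circle of a given radius through it; the most-voted cell wins. A toy fixed-radius version (the paper's pipeline also involves morphology and edge detection, which are omitted here):

```python
import math

def hough_circles(edge_points, radius, shape):
    """Accumulate votes for circle centres of one fixed radius and
    return the most-voted centre (a toy Circular Hough Transform)."""
    h, w = shape
    acc = {}
    for (x, y) in edge_points:
        # Every centre at distance `radius` from this edge point gets a vote.
        for t in range(360):
            a = int(round(x - radius * math.cos(math.radians(t))))
            b = int(round(y - radius * math.sin(math.radians(t))))
            if 0 <= a < w and 0 <= b < h:
                acc[(a, b)] = acc.get((a, b), 0) + 1
    return max(acc, key=acc.get)
```

A full implementation would sweep a range of radii and keep the best (centre, radius) pair, which yields the circular OD boundary approximation.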
124 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
DNA barcodes provide a good source of the information needed to classify living species. The classification problem must be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence-similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete; such methods remain computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. In fact, our method avoids the complex problem of form and structure in different classes of organisms. The empirical data and their classification performance are compared with other methods. In this study, we present our system, which consists of three phases. The first phase, transformation, is composed of three sub-steps: Electron-Ion Interaction Pseudopotential (EIIP) codification of the DNA barcodes, Fourier transform, and power spectrum signal processing. The second phase, approximation, is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third phase, classification of the DNA barcodes, is realized by applying a hierarchical classification algorithm.
Keywords: DNA Barcode, Electron-Ion Interaction Pseudopotential, Multi Library Wavelet Neural Networks.
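The transformation phase (EIIP codification followed by Fourier transform and power spectrum) is straightforward to sketch. The EIIP constants below are the commonly published per-nucleotide values, and a naive DFT stands in for an FFT; this is an illustration, not the authors' code.

```python
import cmath

# Commonly published EIIP values for the four nucleotides.
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def eiip_power_spectrum(seq):
    """Map a DNA barcode to its EIIP numeric signal and return the
    power spectrum |DFT|^2 -- the 'transformation' phase of the pipeline."""
    signal = [EIIP[b] for b in seq.upper()]
    n = len(signal)
    spectrum = [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)) for k in range(n)]
    return [abs(s) ** 2 for s in spectrum]
```

The resulting spectra would feed the MLWNN approximation phase as fixed-length numeric features.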
123 Robust Digital Cinema Watermarking
Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi
Abstract:
With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques, which we name “traceable watermarking” because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user. In our method, we embed the watermark in the lowest part of each image frame of the decoded video using a hardware LSI. Digital cinema is an important application for traceable watermarking, since a digital cinema system makes use of watermarking technology during content encoding, encryption, transmission, decoding and all the intermediate processes of the system. The watermark is embedded into randomly selected movie frames using hash functions, and the embedded watermark information can be extracted from the decoded video data without access to the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed, simplicity and robust structure.
Keywords: Decoder, digital content, JPEG2000 frame, System-on-Chip, traceable watermark, hash function, CRC-32.
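The combination of hash-driven frame selection and embedding in the "lowest part" of the frame data can be illustrated with a software toy. The paper uses a hardware LSI and wavelet-domain embedding; the LSB scheme and function names below are illustrative stand-ins only.

```python
import hashlib
import random

def select_frames(n_frames, key, count):
    """Pick embedding frames pseudo-randomly, seeded by a hash of the key,
    so only a key holder can locate the watermarked frames."""
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")
    return sorted(random.Random(seed).sample(range(n_frames), count))

def embed_lsb(pixels, bits):
    """Embed watermark bits in the least-significant bit of pixel values."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_lsb(pixels, n_bits):
    """Recover the watermark bits without needing the original frame."""
    return [p & 1 for p in pixels[:n_bits]]
```

Extraction here needs no original data, which mirrors the blind-extraction property the abstract claims.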
122 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option prices with Black–Scholes models with jumps guarantees that market movement is taken into account. However, such models can only be solved numerically, and not all numerical methods are efficient for them, because the payoffs are nonsmooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton’s and Kou’s jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
Keywords: Integral differential equations, L-stable methods, pricing European options, jump-diffusion model.
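The O(M²) to O(M log M) reduction comes from the convolution theorem: a circulant matrix-vector product becomes a pointwise product in the Fourier domain. A sketch using a plain DFT for clarity (a real solver would call an FFT routine; this is not the paper's code):

```python
import cmath

def dft(v, inverse=False):
    """Naive DFT / inverse DFT (stands in for an FFT library call)."""
    n = len(v)
    sign = 1j if inverse else -1j
    out = [sum(v[t] * cmath.exp(sign * 2 * cmath.pi * k * t / n)
               for t in range(n)) for k in range(n)]
    return [o / n for o in out] if inverse else out

def circulant_matvec(first_col, x):
    """Multiply a circulant matrix (given by its first column) by x via
    the convolution theorem: C x = IDFT(DFT(c) * DFT(x))."""
    fc, fx = dft(first_col), dft(x)
    return [y.real for y in dft([a * b for a, b in zip(fc, fx)], inverse=True)]
```

With an FFT in place of `dft`, the three transforms cost O(M log M) each, versus O(M²) for the direct matrix-vector product.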
121 Analysis of Vocal Fold Vibrations from High-Speed Digital Images Based On Dynamic Time Warping
Authors: A. I. A. Rahman, Sh-Hussain Salleh, K. Ahmad, K. Anuar
Abstract:
Analysis of vocal fold vibration is essential for understanding the mechanism of voice production and for improving the clinical assessment of voice disorders. This paper presents a Dynamic Time Warping (DTW) based approach to analyze and objectively classify vocal fold vibration patterns. The proposed technique was designed and implemented on a Glottal Area Waveform (GAW) extracted from high-speed laryngeal images by delineating the glottal edges for each image frame. Feature extraction from the GAW was performed using Linear Predictive Coding (LPC). Several types of voice reference templates from simulations of clear, breathy, fry, pressed and hyperfunctional voice productions were used. The patterns of the reference templates were first verified using the analytical signal generated through Hilbert transformation of the GAW. Samples from normal speakers’ voice recordings were then used to evaluate and test the effectiveness of this approach. The classification of the voice patterns using the technique of LPC and DTW gave an accuracy of 81%.
Keywords: Dynamic Time Warping, Glottal Area Waveform, Linear Predictive Coding, High-Speed Laryngeal Images, Hilbert Transform.
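DTW aligns two sequences of different lengths by warping the time axis to minimise the cumulative cost of matched samples. A standard dynamic-programming sketch over scalar features (the paper matches LPC feature vectors, so a vector distance would replace the absolute difference):

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of: insertion, deletion, match on the warping path.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Classification then assigns a test GAW to the reference template with the smallest DTW distance.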
120 Papain Immobilized Polyurethane Film as Antimicrobial Food Package
Authors: M. Cynthya, V. Prabhawathi, D. Mukesh
Abstract:
Food contamination occurs during post-process handling. This leads to spoilage and growth of pathogenic microorganisms in the food, thereby reducing its shelf life or spreading food-borne diseases. Several methods have been tried, one of which is the use of antimicrobial packaging. Here, papain, a protease enzyme, is covalently immobilized with the help of glutaraldehyde on polyurethane and used as a food wrap to protect food from microbial contamination. Covalent immobilization of papain was achieved at a pH of 7.4, a temperature of 4°C, a glutaraldehyde concentration of 0.5%, an incubation time of 24 h, and 50 mg of papain. The formation of -C=N- observed in the Fourier transform infrared spectrum confirmed the immobilization of the enzyme on the polymer. The immobilized enzyme retained higher activity than the native free enzyme. The modified polyurethane showed better reduction of Staphylococcus aureus biofilm than the bare polymer film (an eight-fold reduction in live colonies, a two-fold reduction in protein and a six-fold reduction in carbohydrates). Its efficacy was studied by wrapping it over S. aureus-contaminated cottage cheese (paneer) and cheese, stored at a temperature of 4°C for 7 days. The modified film reduced the bacterial contamination eight-fold compared to the bare film. FTIR also indicated reductions in lipids, sugars and proteins in the biofilm.
Keywords: Cheese, Papain, polyurethane, Staphylococcus aureus.
119 Optimal Model Order Selection for Transient Error Autoregressive Moving Average (TERA) MRI Reconstruction Method
Authors: Abiodun M. Aibinu, Athaur Rahman Najeeb, Momoh J. E. Salami, Amir A. Shafie
Abstract:
An alternative to the use of the Discrete Fourier Transform (DFT) for Magnetic Resonance Imaging (MRI) reconstruction is the use of parametric modeling techniques. This method is suitable for problems in which the image can be modeled by explicit known source functions with a few adjustable parameters. Despite the success reported in the use of modeling techniques as an alternative MRI reconstruction technique, two important problems constitute challenges to the applicability of this method: estimation of the model order and determination of the model coefficients. In this paper, five suggested methods of evaluating the model order have been evaluated: the Final Prediction Error (FPE), Akaike Information Criterion (AIC), Residual Variance (RV), Minimum Description Length (MDL), and Hannan and Quinn (HNQ) criterion. These criteria were evaluated on MRI data sets using the Transient Error Reconstruction Algorithm (TERA). The result for each criterion is compared to the result obtained with a fixed-order technique, and three measures of similarity were evaluated. The results obtained show that the use of MDL gives the highest measure of similarity to the fixed-order technique.
Keywords: Autoregressive Moving Average (ARMA), Magnetic Resonance Imaging (MRI), parametric modeling, transient error.
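The criteria compared above all trade residual fit against a complexity penalty; AIC and MDL differ mainly in the penalty term. A hedged sketch using the common Gaussian-residual forms (`rss_by_order` maps a candidate order to its residual sum of squares; names are illustrative):

```python
import math

def aic(rss, n, k):
    """Akaike Information Criterion for k parameters, n samples, residual RSS."""
    return n * math.log(rss / n) + 2 * k

def mdl(rss, n, k):
    """Minimum Description Length: like AIC but with a log(n) penalty per parameter."""
    return n * math.log(rss / n) + k * math.log(n)

def best_order(rss_by_order, n, criterion):
    """Pick the model order that minimises the chosen criterion."""
    return min(rss_by_order, key=lambda k: criterion(rss_by_order[k], n, k))
```

Because MDL's per-parameter penalty grows with the sample count, it tends to choose lower orders than AIC when extra parameters buy only a marginal fit improvement.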
118 Spectroscopic Determination of Functionalized Active Principles from Coleus aromaticus Benth Leaf Extract Using Ionic Liquids
Authors: Zharama M. Llarena
Abstract:
Green chemistry for plant extraction of active principles is the main interest of many researchers concerned with climate change. While classical organic solvents are detrimental to our environment, greener alternatives such as ionic liquids are very promising for sustainable organic chemistry. This study focused on the determination of the functional groups observed in the main constituents from the ionic liquid extracts of Coleus aromaticus Benth leaves using FT-IR spectroscopy. Moreover, this research aimed to determine the best ionic liquid for separating functionalized plant constituents from the leaves of Coleus aromaticus Benth using Fourier Transform Infrared Spectroscopy. Coleus aromaticus Benth leaf extracts in different ionic liquids elucidated pharmacologically important functional groups present in the major constituents of the plant, namely rosmarinic acid, caffeic acid and chlorogenic acid. Based on the distinctive appearance of the functional groups in the spectrum and the highest % transmittance, potassium chloride-glycerol is the best ionic liquid for green extraction.
Keywords: Coleus aromaticus, ionic liquid, rosmarinic acid, caffeic acid, chlorogenic acid.
117 Synthesis and Physicochemical Characterization of Biomimetic Scaffold of Gelatin/Zn-Incorporated 58S Bioactive Glass
Authors: Seyed Mohammad Hosseini, Amirhossein Moghanian
Abstract:
The main purpose of this research was to design a biomimetic system by the freeze-drying method for evaluating the effect of adding 5 and 10 mol.% of zinc (Zn) to 58S bioactive glass and gelatin (5ZnBG/G and 10ZnBG/G) in terms of structural and biological changes. The structural analyses of the samples were performed by X-ray diffraction (XRD), scanning electron microscopy (SEM) and Fourier-transform infrared (FTIR) spectroscopy. Also, 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and alkaline phosphatase (ALP) activity tests were carried out to investigate MC3T3-E1 cell behavior. The SEM results demonstrated the spherical shape of the formed hydroxyapatite (HA) phases, and HA characteristic peaks were detected by XRD spectroscopy after 3 days of immersion in simulated body fluid (SBF) solution. Meanwhile, the FTIR spectra proved that the intensity of the P–O peaks for 5ZnBG/G was greater than for 10ZnBG/G and the control samples. Moreover, the results of the ALP activity test illustrated that the optimal amount of Zn (5ZnBG/G) caused a considerable enhancement in bone cell growth. Taken together, the scaffold with 5 mol.% Zn was introduced as the optimal sample because of its higher biocompatibility, in vitro bioactivity and growth of MC3T3-E1 cells in comparison with the other samples for bone tissue engineering.
Keywords: Scaffold, gelatin, modified bioactive glass, ALP, bone tissue engineering.
116 Fast Wavelet Image Denoising Based on Local Variance and Edge Analysis
Authors: Gaoyong Luo
Abstract:
The wavelet-transform approach has been widely used for image denoising due to its multi-resolution nature, its ability to produce high levels of noise reduction, and the low level of distortion introduced. However, by removing noise, high-frequency components belonging to edges are also removed, which blurs the signal features. This paper proposes a new method of image noise reduction based on local variance and edge analysis. The analysis is performed by dividing an image into 32 x 32 pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of being computationally efficient and minimizing boundary effects. Adaptive thresholding by local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image corresponding to the boundaries of objects. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise, and is suitable to be adapted for different classes of images and types of noise. The proposed algorithm provides a potential solution, with parallel computation, for real-time or embedded system applications.
Keywords: Edge strength, fast lifting wavelet, image denoising, local variance.
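The variance-adaptive thresholding idea can be sketched with a BayesShrink-style rule: estimate the signal variance in a local block, derive a threshold from it, and soft-threshold the wavelet coefficients. This is a generic stand-in for illustration, not the paper's exact rule:

```python
import math

def local_variance(block):
    """Sample variance of a local block of wavelet coefficients."""
    mean = sum(block) / len(block)
    return sum((x - mean) ** 2 for x in block) / len(block)

def soft_threshold(coeffs, sigma_noise):
    """Soft-threshold coefficients with a variance-adaptive threshold:
    t = sigma_noise^2 / sigma_signal (BayesShrink-style)."""
    var = local_variance(coeffs)
    signal_var = max(var - sigma_noise ** 2, 1e-12)   # estimated signal variance
    t = sigma_noise ** 2 / math.sqrt(signal_var)
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]
```

Blocks with high local variance (likely edges) get a small threshold and keep their detail, while low-variance blocks are smoothed more aggressively.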
115 Growth and Characterization of L-Asparagine (LAS) Crystal Admixture of Paranitrophenol (PNP): A NLO Material
Authors: Grace Sahaya Sheba, P. Omegala Priyakumari, M. Gunasekaran
Abstract:
L-asparagine admixture paranitrophenol (LAPNP) single crystals were grown successfully by the solution method with the slow evaporation technique at room temperature. Crystals of size 12 mm × 5 mm × 3 mm were obtained in 15 days. The grown crystals were brown and transparent. The solubility of the grown samples was determined at various temperatures. The lattice parameters of the grown crystals were determined by the X-ray diffraction technique. The reflection planes of the sample were confirmed by the powder X-ray diffraction study, and the diffraction peaks were indexed. Fourier transform infrared (FTIR) studies were used to confirm the presence of various functional groups in the crystals. The UV-visible absorption spectrum was recorded to study the optical transparency of the grown crystal. The nonlinear optical (NLO) property of the grown crystal was confirmed by the Kurtz-Perry powder technique, and a study of its second harmonic generation efficiency in comparison with potassium dihydrogen phosphate (KDP) has been made. The mechanical strength of the crystal was estimated by the Vickers hardness test. The grown crystals were subjected to thermogravimetric and differential thermal analysis (TG/DTA). The dielectric behavior of the sample was also studied.
Keywords: Characterization, Microhardness, Non-linear optical materials, Solution growth, Spectroscopy, XRD.
114 Energy Detection Based Sensing and Primary User Traffic Classification for Cognitive Radio
Authors: Urvee B. Trivedi, U. D. Dalal
Abstract:
As wireless communication services grow rapidly, the seriousness of spectrum utilization has been rising gradually. An emerging technology, cognitive radio, has come out to solve today’s spectrum scarcity problem. To support the spectrum reuse functionality, secondary users are required to sense the radio frequency environment and, once the primary users are found to be active, to vacate the channel within a certain amount of time. Therefore, spectrum sensing is of significant importance. Once sensing is done, different prediction rules apply to classify the traffic pattern of the primary user. Primary users follow two types of traffic patterns: periodic and stochastic ON-OFF patterns. A cognitive radio can learn the patterns in different channels over time. Two types of classification methods are discussed in this paper: classification by edge detection and classification using the autocorrelation function. The edge detection method has high accuracy but cannot tolerate sensing errors. Autocorrelation-based classification is applicable in real environments, as it can tolerate some amount of sensing error.
Keywords: Cognitive radio (CR), probability of detection (PD), probability of false alarm (PF), primary user (PU), secondary user (SU), Fast Fourier Transform (FFT), signal-to-noise ratio (SNR).
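Both stages (energy-based sensing, then autocorrelation-based classification of the primary user's ON-OFF pattern) can be sketched compactly. The thresholds below are illustrative, not values from the paper:

```python
def energy_detect(samples, threshold):
    """Energy detector: declare the primary user present when the
    average sample energy exceeds a threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

def autocorrelation(x, lag):
    """Normalised sample autocorrelation of an occupancy sequence at a lag."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    den = sum((v - mean) ** 2 for v in x)
    return num / den if den else 0.0

def looks_periodic(occupancy, period):
    """Periodic ON-OFF traffic shows a strong autocorrelation peak at its period."""
    return autocorrelation(occupancy, period) > 0.5
```

Stochastic ON-OFF traffic yields no strong autocorrelation peak at any lag, which is how the two pattern types are told apart.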
113 Arabic Literature as a Tool for Educational Transformation in Nigeria
Authors: Abdulfatah A Raji
Abstract:
This paper starts with definitions of literature, Arabic literature, and transformation, and goes on to highlight the components of educational transformation. The general history of Arabic literature is discussed, with a focus on the transformations it underwent from the pre-Islamic period through the Quranic era and Abbasid literature to the renaissance period, in which the modernization of Arabic literature started in Egypt. The paper also traces the spread of Arabic literature in Nigeria from the pre-colonial era, during the Kanuri rulers, to the Jihad of Usman Dan Fodio, and the development of literature that led to the Teachers’ Colleges and Bayero University in Northern Nigeria, as well as the establishment of primary and post-primary schools by Muslim organizations in many cities and towns of the western part of Nigeria. Literary criticism is also discussed in relation to Arabic literature. Poetry of eminent poets is cited to show its importance for educational transformation in Nigerian literature, and lessons from the cited Arabic poetry are highlighted, including: motivation to behave well and to tolerate others, a better spirit of interaction, and love and co-existence among different sexes, religions, etc. All these can help in developing a better educational transformation in Nigeria, which can in turn help in conducting research for national development. The paper recommends compulsory Arabic literature at all levels of the nation’s educational system, as well as publication of Arabic books and journals, to encourage peace in this era of conflicts and further transform Nigeria’s educational system for the better.
Keywords: Arabic, literature, peace, development, Nigeria.
112 Preparation and Conductivity Measurements of LSM/YSZ Composite Solid Oxide Electrolysis Cell Anode Materials
Authors: Christian C. Vaso, Rinlee Butch M. Cervera
Abstract:
One of the most promising anode materials for solid oxide electrolysis cell (SOEC) application is Sr-doped LaMnO3 (LSM), which is known to have high electronic conductivity but low ionic conductivity. To increase the ionic conductivity, or the diffusion of ions through the anode, yttria-stabilized zirconia (YSZ), which has good ionic conductivity, is proposed to be combined with LSM to create a composite electrode and obtain a mixed ionic and electronic conducting anode. In this study, composites of lanthanum strontium manganite and YSZ oxide, La0.8Sr0.2MnO3/Zr0.92Y0.08O2 (LSM/YSZ), with different wt.% compositions of LSM and YSZ were synthesized by solid-state reaction. The prepared composite samples of 60, 50, and 40 wt.% LSM (with 40, 50, and 60 wt.% YSZ, respectively) were fully characterized for their microstructure by powder X-ray diffraction (XRD), thermogravimetric analysis (TGA), Fourier transform infrared (FTIR), and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) analyses. Surface morphology of the samples via SEM analysis revealed a well-sintered and densified pure LSM, while the LSM/YSZ composite samples were more porous. Electrochemical impedance measurements in the intermediate temperature range (500-700 °C) were also performed and revealed that the 50 wt.% LSM with 50 wt.% YSZ (L50Y50) sample showed the highest total conductivity of 8.27 × 10⁻¹ S/cm at 600 °C, with an activation energy of 0.22 eV.
Keywords: Ceramics, microstructure, fuel cells, electrochemical impedance spectroscopy.
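To connect the reported numbers, a thermally activated conductivity can be extrapolated from the single reported point (8.27 × 10⁻¹ S/cm at 600 °C with Ea = 0.22 eV). The simple Arrhenius form σ(T) = σ₀·exp(−Ea/kBT) used below is an assumption for illustration; the paper itself reports only the measured values.

```python
import math

KB_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_sigma(sigma_ref, t_ref_c, ea_ev, t_c):
    """Extrapolate conductivity from a reference point, assuming
    sigma(T) = sigma0 * exp(-Ea / (kB * T)) with temperatures in Celsius."""
    t_ref_k = t_ref_c + 273.15
    t_k = t_c + 273.15
    sigma0 = sigma_ref * math.exp(ea_ev / (KB_EV * t_ref_k))
    return sigma0 * math.exp(-ea_ev / (KB_EV * t_k))

# Reported L50Y50 point: 8.27e-1 S/cm at 600 degC, Ea = 0.22 eV.
sigma_700 = arrhenius_sigma(8.27e-1, 600.0, 0.22, 700.0)
```

With this form the conductivity rises with temperature, as expected for a thermally activated mixed conductor.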
111 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman
Abstract:
With the development of hyperspectral imagery (HSI) technology, the spectral resolution of HSI has become denser, resulting in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Semantic interpretation is therefore a challenging task for HSI analysis, due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. To preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. From the TLPP transformation matrix, a weighted matrix is constructed to rank the different spectral bands according to their contribution scores, and the relevant bands are then adaptively selected based on this weighted matrix. The performance of the presented approach has been validated through several experiments, and the obtained results demonstrate its efficiency compared to various existing dimensionality reduction techniques. According to the experimental results, the approach can adaptively select the relevant spectral bands, improving the semantic interpretation of HSI.
Keywords: Band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation.
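The band-ranking step can be made concrete with a small sketch: given a weight matrix relating bands to projection components (in the paper, derived from the TLPP transformation matrix), each band is scored by its total absolute contribution and the top-k bands are kept. The 5-band × 3-component matrix below is invented for illustration and is not real TLPP output.

```python
def rank_bands(weights, k):
    """Score each spectral band by its summed absolute contribution
    across projection components, then return the top-k band indices
    in ascending order."""
    scores = [sum(abs(w) for w in band_row) for band_row in weights]
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(order[:k])

# Illustrative weight matrix: rows are bands, columns are components.
W = [
    [0.9, 0.1, 0.0],   # band 0: strong contribution
    [0.0, 0.0, 0.1],   # band 1: weak
    [0.4, 0.5, 0.3],   # band 2: strong
    [0.1, 0.1, 0.0],   # band 3: weak
    [0.2, 0.6, 0.4],   # band 4: strong
]
selected = rank_bands(W, 3)
```

Bands 0, 2, and 4 dominate the scores here, so `rank_bands(W, 3)` keeps exactly those three and discards the two low-contribution bands.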
110 Temperature Related Alterations to Mineral Levels and Crystalline Structure in Porcine Long Bone: Intense Heat vs. Open Flame
Authors: Caighley Logan, Suzzanne McColl
Abstract:
Investigations of fire-related fatalities, along with other research, have found that fires can have a detrimental effect on the mineral and crystalline structures within bone. This study focused on the mineral and crystalline structures within porcine bone samples to analyse the changes caused, with the intent of effectively 'reverse engineering' the data collected from burned bone samples to discover what may have happened. Using Fourier transform infrared (FTIR) spectroscopy and X-ray fluorescence (XRF), data were collected from a controlled source of intense heat (a muffle furnace) and from an open fire of similar temperature with a known ignition source, a gasoline lighter, set in a living-room arrangement inside a standard-size shipping container (2.5 m x 2.4 m). This approach analyses the changes to the samples and how they differ depending on the heat source. Results show significant differences in the levels of remaining minerals for each type of heat/burning (p ≤ 0.001), particularly phosphorus and calcium, as well as notable additions of elements and minerals absorbed from the surrounding materials, i.e., cerium (Ce), bromine (Br), and neodymium (Nd). The analysis techniques included provide validated results in conjunction with previous studies.
Keywords: Forensic anthropology, thermal alterations, porcine bone, FTIR, XRF.
109 Artificial Intelligence-Based Chest X-Ray Test of COVID-19 Patients
Authors: Dhurgham Al-Karawi, Nisreen Polus, Shakir Al-Zaidi, Sabah Jassim
Abstract:
The management of COVID-19 patients based on chest imaging is emerging as an essential tool for evaluating the spread of the pandemic that has gripped the global community. It has already been used to monitor the condition of COVID-19 patients with compromised respiratory status. The use of chest imaging for medical triage of patients showing moderate to severe clinical COVID-19 features has increased, owing to the fast spread of the pandemic to all continents and communities. This article demonstrates the development of machine learning techniques for testing COVID-19 patients using chest X-ray (CXR) images in near real time, distinguishing COVID-19 infection with a significantly high level of accuracy. The testing covered a combination of datasets of CXR images from COVID-19-positive patients, patients with viral and bacterial infections, and people with clear chests. The proposed AI scheme successfully distinguishes CXR scans of COVID-19-infected patients from CXR scans of viral and bacterial pneumonia as well as normal cases, with an average accuracy of 94.43%, sensitivity of 95%, and specificity of 93.86%. Predicted decisions are supported by visual evidence to help clinicians speed up the initial assessment of new suspected cases, especially in resource-constrained environments.
Keywords: COVID-19, chest x-ray scan, artificial intelligence, texture analysis, local binary pattern transform, Gabor filter.
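Among the keywords, the local binary pattern (LBP) transform is a classic texture descriptor; a minimal sketch of the basic 8-neighbour LBP code is given below. This is the generic operator, not the authors' full pipeline, and the clockwise bit ordering chosen here is one common convention among several.

```python
def lbp_code(patch):
    """8-neighbour local binary pattern code for the centre pixel of a
    3x3 grayscale patch: each neighbour >= centre contributes one bit,
    read clockwise from the top-left corner."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:
            code |= 1 << bit
    return code

# Example patch: the centre (25) is darker than six of its neighbours.
patch = [[10, 20, 30],
         [40, 25, 60],
         [70, 80, 90]]
code = lbp_code(patch)
```

Sliding this operator over an image and histogramming the resulting codes yields the texture features that a classifier can then consume.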
108 Optical Limiting Characteristics of Core-Shell Nanoparticles
Authors: G.Vinitha, A.Ramalingam
Abstract:
TiO2 nanoparticles were synthesized by the hydrothermal method at 180 °C from TiOSO4 aqueous solution at 1 mol/L concentration. The obtained products were coated with silica by means of a seeded polymerization technique for a coating time of 1440 minutes to obtain a well-defined TiO2@SiO2 core-shell structure. The uncoated and coated nanoparticles were characterized using X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FT-IR) to study their physico-chemical properties. Evidence from the XRD and FTIR results shows that SiO2 is homogeneously coated on the surface of the titania particles. The FTIR spectra show that an interaction exists between TiO2 and SiO2, resulting in the formation of Ti-O-Si chemical bonds at the interface of the TiO2 particles and the SiO2 coating layer. The nonlinear optical limiting properties of TiO2 and TiO2@SiO2 nanoparticles dispersed in ethylene glycol were studied at 532 nm using 5 ns Nd:YAG laser pulses. Three-photon absorption is responsible for the optical limiting characteristics of these nanoparticles, and the optical nonlinearity is enhanced in core-shell structures compared with their single counterparts. This effective three-photon absorption at this wavelength has potential application in fabricating optical limiting devices.
Keywords: hydrothermal method, optical limiting devices, seeded polymerization technique, three-photon absorption.
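For reference, the intensity propagation law underlying the three-photon absorption mechanism invoked above can be written in its standard form for a pure three-photon absorber with coefficient γ (the abstract reports the mechanism, not this explicit fit):

```latex
\frac{dI}{dz} = -\gamma I^{3}
\qquad\Longrightarrow\qquad
T = \frac{I(L)}{I_{0}} = \left(1 + 2\gamma I_{0}^{2} L\right)^{-1/2}
```

The transmittance T thus falls as the input irradiance I₀ grows, which is exactly the limiting behavior exploited in the proposed devices.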
107 Principle Components Updates via Matrix Perturbations
Authors: Aiman Elragig, Hanan Dreiwi, Dung Ly, Idriss Elmabrook
Abstract:
This paper presents a new approach to online principal component analysis (OPCA). Given a data matrix X ∈ R^(m×n), we characterize the online updates of its covariance as a matrix perturbation problem. Up to the principal components, it turns out that online updates of the batch PCA can be captured by a symmetric matrix perturbation of the batch covariance matrix. We show that as n → n0 ≫ 1, the batch covariance and its update become almost identical. Finally, we utilize our setup of online updates to find a bound on the angular distance between the principal components of X and those of its update.
Keywords: Online data updates, covariance matrix, online principal component analysis (OPCA), matrix perturbation.
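The batch-versus-online equivalence discussed above can be made concrete with the standard Welford-style covariance update, which folds one new sample into a running mean and a running sum of outer products. This is a generic illustration of online covariance maintenance, not the authors' perturbation bound.

```python
def update_covariance(mean, cov_sum, n, x):
    """Fold a new sample x into the running mean and the sum of outer
    products S = sum_i (x_i - mean)(x_i - mean)^T; the batch sample
    covariance is then S / (n - 1)."""
    d = len(x)
    n_new = n + 1
    delta = [x[j] - mean[j] for j in range(d)]
    mean_new = [mean[j] + delta[j] / n_new for j in range(d)]
    delta2 = [x[j] - mean_new[j] for j in range(d)]
    cov_sum_new = [[cov_sum[i][j] + delta[i] * delta2[j] for j in range(d)]
                   for i in range(d)]
    return mean_new, cov_sum_new, n_new

# Stream four 2-D samples, then form the covariance as in the batch case.
samples = [[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]]
mean, s, n = [0.0, 0.0], [[0.0, 0.0], [0.0, 0.0]], 0
for x in samples:
    mean, s, n = update_covariance(mean, s, n, x)
cov = [[s[i][j] / (n - 1) for j in range(2)] for i in range(2)]
```

After the stream, the online result matches the batch covariance of the same four samples exactly (up to floating-point rounding), which is the sense in which the batch estimate and its update coincide.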
106 Solar Radiation Time Series Prediction
Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs
Abstract:
A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location one hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Due to their ability as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were evaluated, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled direct normal irradiance field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation measurements were preprocessed with a discrete wavelet transform with the aim of removing noise. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model's accuracy.
Keywords: Artificial Neural Networks, Resilient Propagation, Solar Radiation, Time Series Forecasting.
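The wavelet preprocessing step can be illustrated with a one-level Haar transform, the simplest discrete wavelet: split the series into scaled pairwise averages and details, zero the small detail coefficients, and invert. The Haar choice, the threshold, and the sample series below are assumptions for illustration; the abstract does not specify the wavelet family used.

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising sketch for an even-length series:
    transform to (averages, details), zero details below the threshold,
    then apply the inverse transform."""
    s = 1.0 / math.sqrt(2.0)
    half = len(signal) // 2
    avg = [(signal[2 * i] + signal[2 * i + 1]) * s for i in range(half)]
    det = [(signal[2 * i] - signal[2 * i + 1]) * s for i in range(half)]
    det = [d if abs(d) > threshold else 0.0 for d in det]
    out = []
    for a, d in zip(avg, det):
        out.extend([(a + d) * s, (a - d) * s])
    return out

# Small jitters around two levels are treated as noise and smoothed away.
noisy = [4.0, 4.2, 8.0, 7.9, 4.1, 4.0, 8.2, 8.0]
smooth = haar_denoise(noisy, 0.5)
```

With a zero threshold the transform is perfectly invertible, so only the coefficients judged to be noise are affected; the large jumps between the 4 and 8 levels survive intact.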
105 Leveraging Hyperledger Iroha for the Issuance and Verification of Higher-Education Certificates
Authors: Vasiliki Vlachou, Christos Kontzinos, Ourania Markaki, Panagiotis Kokkinakos, Vagelis Karakolis, John Psarras
Abstract:
Higher education is resisting the pull of technology, especially as concerns the issuance and verification of degrees and certificates. It is widely known that education certificates are largely produced in paper form, making them vulnerable to damage, while holders of such certificates are dependent on the universities and other issuing organisations. QualiChain is an EU Horizon 2020 (H2020) research project aiming to transform and revolutionise the domain of public education and its ties with the job market by leveraging blockchain, analytics, and decision support to develop a platform for the verification and sharing of education certificates. Blockchain plays an integral part in the QualiChain solution by providing a trustworthy environment in which to store, share, and manage such accreditations. In the context of this paper, three prominent blockchain platforms (Ethereum, Hyperledger Fabric, Hyperledger Iroha) were considered as a means of experimentation for creating a system with the basic functionalities needed for trustworthy degree verification. The methodology and system developed and presented in this paper used Hyperledger Iroha and proved that this platform can be used to easily develop decentralized applications. Future papers will experiment further with other blockchain platforms and assess which has the best potential.
Keywords: Blockchain, degree verification, higher education certificates, Hyperledger Iroha.
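The core verification idea, anchoring a tamper-evident fingerprint of each certificate rather than the document itself, can be sketched independently of any particular ledger. The snippet below uses a plain SHA-256 digest over a canonical JSON encoding; it is a generic illustration with invented record fields and does not use the Hyperledger Iroha API, where the digest would be written to and queried from the chain.

```python
import hashlib
import json

def certificate_digest(cert):
    """SHA-256 fingerprint of a certificate record over a canonical JSON
    encoding (sorted keys, no whitespace), so the same record always
    hashes to the same value regardless of field order."""
    canonical = json.dumps(cert, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(presented, anchored):
    """Recompute the digest of a presented copy and compare it to the
    digest anchored at issuance time."""
    return certificate_digest(presented) == anchored

# Issuance: the university computes the digest and anchors it on-chain.
issued = {"holder": "Jane Doe", "degree": "MSc", "year": 2020}
anchored_digest = certificate_digest(issued)
```

An authentic copy reproduces the anchored digest, while changing any field (e.g. the degree title) yields a different digest and fails verification; holders are thus no longer dependent on the issuer for every check.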
104 Transformability in Post-Earthquake Houses in Iran: with Special Focus on Lar City
Authors: M. Parva, K. Dola, F. Pour Rahimian
Abstract:
Earthquakes are considered among the most catastrophic disasters in Iran, in terms of both short-term and long-term hazards. Due to the particular financial and time constraints in Iran, quickly constructed post-earthquake houses (PEHs) do not fulfill the minimum requirements of comfortable dwellings. Consequently, people often transform PEHs after they start to reside in them. However, a lack of understanding of the process, motivation, and results of housing transformation leads to the construction of houses unsuitable for future transformation, which are eventually demolished or abandoned. This study investigated housing transformations in the natural setting of post-earthquake Lar. This paper reports the results of a survey comparing normal-condition housing transformation with post-earthquake housing transformation, in order to reveal the factors that affect post-earthquake housing transformation in Iran. The findings propose the use of a combination of 'temporary' and 'permanent' housing reconstruction models in Iran to provide victims with basic but permanent post-disaster dwellings. It is also suggested that needs for future transformation should be predicted and addressed during the early stages of design and development. This study contributes to both research and practice regarding post-earthquake housing reconstruction in Iran by proposing new design approaches and guidelines.
Keywords: Housing transformation, Iran, Lar, post-earthquake housing.