Search results for: step gradient solvent system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20973

20013 Experimental Model for Instruction of Pre-Service Teachers in ICT Tools and E-Learning Environments

Authors: Rachel Baruch

Abstract:

This article describes the implementation of an experimental model for teaching ICT tools and digital environments in a teacher training college. In most Western educational systems, new programs have been developed to bridge the digital gap between teachers and students. Despite their achievements, these programs are limited by several factors: teachers in schools implement new methods that incorporate technological tools into the curriculum, but meanwhile the technology changes and advances; tool interfaces change frequently, some tools disappear, and new ones are invented. These conditions require an experimental model for training pre-service teachers. The appropriate method of instruction in the domain of ICT tools should be based on exposing learners to innovations, helping them gain experience, teaching them to deal with challenges and difficulties on their own, and training them. This study suggests principles for this approach and describes, step by step, the implementation of the model.

Keywords: ICT tools, e-learning, pre-service teachers, new model

Procedia PDF Downloads 469
20012 Keyframe Extraction Using Face Quality Assessment and Convolution Neural Network

Authors: Rahma Abed, Sahbi Bahroun, Ezzeddine Zagrouba

Abstract:

Due to the huge amount of data in videos, extracting the relevant frames has become a necessary step prior to performing face recognition. In this context, we propose a method for extracting keyframes from videos based on face quality and deep learning for a face recognition task. The method has two steps. First, we generate a face quality score for each face image using three face feature extractors: Gabor, LBP, and HOG. Second, we train a deep convolutional neural network in a supervised manner to select the frames with the best face quality. The obtained results show the effectiveness of the proposed method compared with state-of-the-art methods.
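The two-step pipeline can be sketched in miniature. The snippet below is a hypothetical stand-in that scores frames by mean gradient magnitude (a crude, HOG-like sharpness proxy) and ranks them; the paper itself combines Gabor, LBP, and HOG features and a trained CNN, neither of which is reproduced here.

```python
import numpy as np

def gradient_quality_score(face):
    """Mean gradient magnitude: a crude sharpness proxy standing in for
    the HOG-style extractor (the paper also uses Gabor and LBP)."""
    gy, gx = np.gradient(face.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def rank_keyframes(frames):
    """Indices of frames sorted from best to worst quality score."""
    scores = [gradient_quality_score(f) for f in frames]
    return sorted(range(len(frames)), key=lambda i: scores[i], reverse=True)
```

In the paper's pipeline, a CNN trained on such scores replaces the hand ranking; the sketch only illustrates the scoring-then-selection structure.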

Keywords: keyframe extraction, face quality assessment, face in video recognition, convolution neural network

Procedia PDF Downloads 241
20011 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use protocols including TCP, UDP, and HTTP/S to communicate with web servers and, eventually, with users. The data obtained from these devices may provide valuable information to users, but it mostly arrives in an unreadable format that must be processed to yield information and business intelligence. This data is not always current; it is mostly historical data. It is also not subject to the consistency and redundancy measures that most other data is. Most important to users is that the data be pre-processed into a readable format when it enters the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique commonly used is to pull data from the database server, process it, and push it back to the database server in a single step. Since processing the data usually takes some time, this keeps the database busy and locked for the duration of the processing, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists together with a pull-process-push technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact on CPU, storage, and processing-time performance.
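The three-step split described above might be sketched as follows. The in-memory `db` dict and the trivial decoding step are purely illustrative stand-ins for a real database server and protocol decoder; the point is that the database is only touched in steps 1 and 3, while the slow processing in step 2 happens against an in-memory array list.

```python
def pull(db):
    """Step 1: copy the raw rows out of the database into an array list;
    the database is only held for this brief read."""
    return list(db["raw"])

def process(rows):
    """Step 2: decode the rows in memory; the database stays unlocked
    for the whole (potentially slow) processing phase."""
    return [r.strip().upper() for r in rows]

def push(db, rows):
    """Step 3: write the readable rows back in one short transaction."""
    db["processed"] = rows

# A dict stands in for the database server; the "raw" rows mimic
# undecoded device telemetry.
db = {"raw": [" gps:12.5 ", " gprs:ok "], "processed": []}
push(db, process(pull(db)))
```

In the single-step technique the paper compares against, all three phases would run inside one database transaction, holding the lock for the full processing time.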

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 248
20010 Development of a Two-Step 'Green' Process for (-) Ambrafuran Production

Authors: Lucia Steenkamp, Chris V. D. Westhuyzen, Kgama Mathiba

Abstract:

Ambergris, and more specifically its oxidation product (–)-ambrafuran, is a scarce, valuable, and sought-after perfumery ingredient. The material is used as a fixative agent to stabilise perfumes in formulations by reducing the evaporation rate of volatile substances. Ambergris is a metabolic product of the sperm whale (Physeter macrocephalus L.), resulting from intestinal irritation. Chemically, (–)-ambrafuran is produced from the natural product sclareol in eight synthetic steps, using harsh and often toxic chemicals along the way. An overall yield of no more than 76% can be achieved in some routes, but it is generally lower. A new 'green' route has been developed in our laboratory in which sclareol, extracted from the clary sage plant, is converted to (–)-ambrafuran in two steps with an overall yield in excess of 80%. The first step uses a microorganism, Hyphozyma roseoniger, to bioconvert sclareol to an intermediate diol at substrate concentrations of up to 50 g/L. The yield varies between 67% and 90%, depending on the substrate concentration used. The purity of the diol product is 95%, and the diol is used without further purification in the next step. The intermediate diol is then cyclodehydrated to the final product (–)-ambrafuran using a zeolite, which is not harmful to the environment and is readily recycled. The yield of this step is 96%, and following a single recrystallization, the purity of the product is > 99.5%. A preliminary LC-MS study of the bioconversion identified several intermediates produced in the fermentation broth under oxygen-restricted conditions. Initially, a short-lived ketone is produced in equilibrium with a more stable pyranol, a key intermediate in the process. The latter is oxidised under Norrish type I cleavage conditions to yield an acetate, which is hydrolysed either chemically or by lipase action to afford the primary fermentation product, the intermediate diol. All the intermediates identified point to CYP450 action as the likely key enzyme(s) in the mechanism. This invention is an exceptional example of how the power of biocatalysis, combined with a mild, benign chemical step, can be deployed to replace the total chemical synthesis of a specific chiral antipode of a commercially relevant material.
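Assuming the usual convention that the overall yield of a multi-step route is the product of the step yields, the reported best-case figures are consistent with the stated "in excess of 80%":

```python
# If step yields combine multiplicatively, the two reported best-case
# step yields imply an overall yield above the stated 80% threshold.
step1 = 0.90   # bioconversion of sclareol to the diol (best case reported)
step2 = 0.96   # zeolite cyclodehydration to (-)-ambrafuran
overall = step1 * step2
print(f"{overall:.1%}")  # prints 86.4%
```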

Keywords: ambrafuran, biocatalysis, fragrance, microorganism

Procedia PDF Downloads 236
20009 Applying Renowned Energy Simulation Engines to Neural Control System of Double Skin Façade

Authors: Zdravko Eškinja, Lovre Miljanić, Ognjen Kuljača

Abstract:

This paper is an overview of simulation tools used to model the specific thermal dynamics that occur while controlling a double skin façade. Research was conducted on a simplified construction with a single zone in which one side is glazed. Heat flow and temperature responses are simulated in three different simulation tools: IDA-ICE, EnergyPlus, and HAMBASE. The excitation of the observed system, used in all simulations, was a temperature step in the exterior environment. Air infiltration, insulation, and other disturbances are excluded from this research. Although such isolated behaviour is not possible in reality, the experiments were carried out to gain novel information about heat flow transients that are not observable under regular conditions. The results revealed new possibilities for adapting the parameters of the neural network regulator. Alongside the numerical simulations, the same set-up was also tested in a real-time experiment with a 1:18 scaled model and a thermal chamber. The comparison analysis yields interesting conclusions about simulation accuracy in this particular case.
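A minimal sketch of the kind of excitation used: the response of a single lumped-capacitance zone to an exterior temperature step, integrated with explicit Euler. The time constant and temperatures below are illustrative, not taken from the paper, and a real double skin façade model would need many more states than this one.

```python
import numpy as np

def step_response(T0, T_ext, tau, t_end, dt=1.0):
    """Explicit-Euler integration of dT/dt = (T_ext - T) / tau for a
    single lumped thermal zone hit by an exterior temperature step."""
    n = int(t_end / dt)
    T = np.empty(n + 1)
    T[0] = T0
    for k in range(n):
        T[k + 1] = T[k] + dt * (T_ext - T[k]) / tau
    return T

# Illustrative values: zone at 20 degC, exterior stepped to 30 degC,
# 10-minute time constant, one-hour simulation at 1 s resolution.
T = step_response(T0=20.0, T_ext=30.0, tau=600.0, t_end=3600.0)
```

The transient `T` is the kind of trace the paper compares across IDA-ICE, EnergyPlus, and HAMBASE before feeding it to the neural regulator.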

Keywords: double skin façade, experimental tests, heat control, heat flow, simulated tests, simulation tools

Procedia PDF Downloads 236
20008 An Impregnated Active Layer Mode of Solution Combustion Synthesis as a Tool for the Solution Combustion Mechanism Investigation

Authors: Zhanna Yermekova, Sergey Roslyakov

Abstract:

Solution combustion synthesis (SCS) is a unique method that has repeatedly proved itself an effective and efficient approach for the versatile synthesis of a variety of materials. It has significant advantages: a relatively simple handling process, high rates of product synthesis, mixing of the precursors on a molecular level, and, as a result, fabrication of nanoproducts. Nowadays, the overwhelming majority of solution combustion investigations are performed through volume combustion synthesis (VCS), in which the entire liquid precursor is heated until combustion self-initiates throughout the volume. Fewer experiments are devoted to the steady-state self-propagating mode of SCS. Under that regime, the precursor solution is dried to a gel-like medium, and the gel is then locally ignited; a combustion wave propagates in a self-sustaining mode, as in conventional solid combustion synthesis. Even less attention is given to the impregnated active layer (IAL) mode of solution combustion, in which combustion of the precursors is initiated on the surface of, or inside, a third substance. This work aims to emphasize the underestimated role of the IAL mode of solution combustion synthesis in fundamental studies of combustion mechanisms. It also serves to popularize the technical terms and clarify the differences between them. To this end, the solution combustion synthesis of γ-FeNi (PDF#47-1417) alloy was accomplished in a short (seconds) one-step reaction of metal precursors with hexamethylenetetramine (HMTA) fuel. The idea that Ni plays a special role in the alloy formation was suggested and confirmed with a purposely organized set of experiments.
The first set of experiments was conducted in the conventional steady-state self-propagating mode of SCS; the alloy was synthesized as a single monophasic product. In the two other experiments, the synthesis was divided into two independent processes, which is possible under the IAL mode of solution combustion. The sequence of the process was changed according to the equations describing Experiments A and B below. Experiment A: Step 1. Fe(NO₃)₃·9H₂O + HMTA = FeO + gas products; Step 2. FeO + Ni(NO₃)₂·6H₂O + HMTA = Ni + FeO + gas products. Experiment B: Step 1. Ni(NO₃)₂·6H₂O + HMTA = Ni + gas products; Step 2. Ni + Fe(NO₃)₃·9H₂O + HMTA = Fe₃Ni₂ + traces (Ni + FeO). Based on the IAL experiment results, combustion of Fe(NO₃)₃·9H₂O on the surface of the Ni leads to alloy formation, while the presence of already-formed FeO does not affect the Ni(NO₃)₂·6H₂O + HMTA reaction in any way, Ni being the main product of that synthesis.

Keywords: alloy, hexamethylenetetramine, impregnated active layer mode, mechanism, solution combustion synthesis

Procedia PDF Downloads 137
20007 The Implementation of a Numerical Technique to Thermal Design of Fluidized Bed Cooler

Authors: Damiaa Saad Khudor

Abstract:

The paper describes an investigation of the thermal design of a fluidized bed cooler and the prediction of heat transfer rates among the media. It is devoted to the thermal design of such equipment and its application in industrial fields, and it outlines the strategy for the fluidization heat transfer mode and its implementation in industry. The procedure is used to furnish a complete design for a fluidized bed cooler for sodium bicarbonate. The total thermal load distribution between air-solid and water-solid along the cooler is calculated according to thermal equilibrium. A step-by-step technique was used to accomplish the thermal design, predicting the load and the air, solid, and water temperatures along the trough. The resulting design calls for a heat exchanger consisting of 65 horizontal tubes of 33.4 mm diameter and 4 m length inside the bed trough.
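The thermal-equilibrium load split can be illustrated with a toy sensible-heat balance. All figures below (mass flow, heat capacity, temperatures, and the 40/60 air/water split) are hypothetical and not taken from the paper; they only show the bookkeeping that the step-by-step design technique performs along the trough.

```python
def heat_load(m_dot, cp, t_in, t_out):
    """Sensible heat duty Q = m_dot * cp * (t_in - t_out), in kW for
    kg/s, kJ/(kg K), and degC inputs."""
    return m_dot * cp * (t_in - t_out)

# Hypothetical duty split for a sodium bicarbonate cooler: the solids
# load is balanced against the air and water sides at equilibrium.
q_solid = heat_load(m_dot=2.0, cp=1.05, t_in=120.0, t_out=40.0)
q_air = 0.4 * q_solid        # share carried off by the fluidizing air
q_water = q_solid - q_air    # remainder removed by the immersed tubes
```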

Keywords: fluidization, powder technology, thermal design, heat exchangers

Procedia PDF Downloads 516
20006 Vertical Distribution of the Monthly Average Values of the Air Temperature above the Territory of Kakheti in 2012-2017

Authors: Khatia Tavidashvili, Nino Jamrishvili, Valerian Omsarashvili

Abstract:

Studies of the vertical distribution of air temperature in the atmosphere have great value for the solution of different problems in meteorology and climatology (meteorological forecasting of showers, thunderstorms, and hail; weather modification; estimation of climate change; etc.). At the end of May 2015, after a 25-year interruption, the anti-hail service in Kakheti was restored. Therefore, in connection with climate change, the need arose for a detailed study of the contemporary regime of the vertical distribution of air temperature above this territory. In particular, this information is necessary for the optimal selection of rocket means for weather-modification work (hail suppression, regulation of atmospheric precipitation, etc.). Construction of detailed maps of the potential hail damage to agricultural crops, taking into account the sizes of hailstones in the clouds according to radar measurements and the height of the locality, is among the most important applications. At present, there is no aerological sounding of the atmosphere in Georgia. To solve the given problem, we processed air temperature profiles above Telavi up to 27 km above the earth's surface, gathered at four observation times (4, 10, 16, and 22 hours local time). From these data we obtained the vertical distribution of the monthly average air temperature above Kakheti in 2012-2017, from January to December, from 0.543 to 27 km above sea level at the four observation times. In particular: during January, the monthly average air temperature diminishes linearly from 2.6 °C at the earth's surface to -57.1 °C at a height of 10 km, then changes little up to a height of 26 km; the air temperature gradient in the atmospheric layer from 0.543 to 8 km is -6.3 °C/km; the height of the zero isotherm is 1.33 km.
During July, the air temperature diminishes linearly from 23.5 °C to -64.7 °C at a height of 17 km, then grows to -47.5 °C at a height of 27 km; the air temperature gradient is -6.1 °C/km; the height of the zero isotherm is 4.39 km, which is 0.16 km higher than in the 1960s.
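Assuming a linear temperature profile starting at the 0.543 km station height, the reported July surface temperature and gradient reproduce the stated zero-isotherm height almost exactly:

```python
def zero_isotherm_height(t_surface, lapse_rate, z_surface=0.543):
    """Height (km ASL) where T crosses 0 degC, for a linear profile
    T(z) = t_surface - lapse_rate * (z - z_surface)."""
    return z_surface + t_surface / lapse_rate

# July: 23.5 degC at the surface, lapse rate 6.1 degC/km.
h_july = zero_isotherm_height(t_surface=23.5, lapse_rate=6.1)  # ~4.40 km
```

This is close to the reported 4.39 km; the January profile is less linear near the surface, so the same check is not expected to match there.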

Keywords: hail, Kakheti, meteorology, vertical distribution of the air temperature

Procedia PDF Downloads 177
20005 Internet of Things Based Process Model for Smart Parking System

Authors: Amjaad Alsalamah, Liyakathunsia Syed

Abstract:

Transportation is an essential need for many people to reach work, school, and home; in many cities, the most common mode is driving a car. Driving can be an easy way to reach a destination and carry one's belongings in a reasonable time. However, finding a parking space can take a long time under the traditional system, which issues a paper ticket to each customer. The old system cannot guarantee a parking space for every customer, payment methods are not always available, and many customers struggle to find their car among numerous others. This research therefore focuses on providing an online smart parking system that saves time and money. The system provides flexible management for both the parking owner and the customers: it receives all requests via the online system and returns accurate results on available parking spaces and their locations.

Keywords: smart parking system, IoT, tracking system, process model, cost, time

Procedia PDF Downloads 340
20004 Fabrication of Pure and Doped MAPbI3 Thin Films by One Step Chemical Vapor Deposition Method for Energy Harvesting Applications

Authors: S. V. N. Pammi, Soon-Gil Yoon

Abstract:

In the present study, we report a facile chemical vapor deposition (CVD) method for perovskite MAPbI3 thin films doped with Br and Cl. We performed a systematic optimization of the CVD parameters (deposition temperature, working pressure, and annealing time and temperature) to obtain high-quality films of CH3NH3PbI3, CH3NH3PbI3-xBrx, and CH3NH3PbI3-xClx perovskite. Scanning electron microscopy and X-ray diffraction patterns showed that the perovskite films have a large grain size compared with traditional spin-coated thin films. To the best of our knowledge, there are very few reports on high-quality perovskite thin films doped with Br or Cl by one-step CVD, and there is scope for significant improvement in device efficiency. In addition, the band gap can be conveniently and widely tuned via the doping process. The deposition process produces perovskite thin films with large grain size, long diffusion length, and high surface coverage. The enhancement in output power of CVD-grown CH3NH3PbI3 (MAPbI3) films over spin-coated films, and the further enhancement achieved by doping, are demonstrated in detail. This facile one-step method for depositing perovskite thin films makes them a potential candidate for photovoltaic and energy harvesting applications.

Keywords: perovskite thin films, chemical vapor deposition, energy harvesting, photovoltaics

Procedia PDF Downloads 312
20003 Modeling Sorption and Permeation in the Separation of Benzene/ Cyclohexane Mixtures through Styrene-Butadiene Rubber Crosslinked Membranes

Authors: Hassiba Benguergoura, Kamal Chanane, Sâad Moulay

Abstract:

Pervaporation (PV), a membrane-based separation technology, has gained much attention because of its energy-saving capability and low cost, especially for the separation of azeotropic or close-boiling liquid mixtures. There are two crucial issues for industrial application of the pervaporation process. The first is developing membrane materials and tailoring membrane structure to obtain high pervaporation performance. The second is modeling pervaporation transport to better understand the aforementioned structure-pervaporation relationship. Many models have been proposed to predict the mass transfer process; among them, the solution-diffusion model is the most widely used for describing pervaporation transport, including the preferential sorption, diffusion, and evaporation steps. To model pervaporation transport, the permeation flux, which depends on the solubility and diffusivity of the components in the membrane, must be obtained first. Traditionally, the solubility is calculated according to Flory-Huggins theory. Separation of the benzene (Bz)/cyclohexane (Cx) mixture is industrially significant, and numerous papers have focused on the Bz/Cx system to assess the PV properties of membrane materials. Membranes with both high permeability and high selectivity are desirable for practical application, and several new polymers have been prepared to achieve both. Here, dense styrene-butadiene rubber (SBR) membranes cross-linked by chloromethylation were used in the separation of benzene/cyclohexane mixtures, and the impact of the chloromethylation reaction as a new method of cross-linking SBR on pervaporation performance is reported. In contrast to vulcanization with sulfur, this cross-linking takes place on the styrene units of the polymer chains via a methylene bridge. The partial pervaporative fluxes of benzene/cyclohexane mixtures in the SBR membrane were predicted using Fick's first law.
The predicted partial fluxes and the PV separation factor agreed well with the experimental data when Fick's law was integrated over the benzene concentration. The effects of feed concentration and operating temperature on the permeation flux predicted by the proposed model are investigated. The predicted permeation fluxes agree well with experimental data at lower benzene concentrations in the feed, but at higher benzene concentrations, the model overestimates the flux. Both predicted and experimental permeation fluxes increase with increasing operating temperature. Solvent sorption levels for benzene/cyclohexane mixtures in the SBR membrane were determined experimentally; the results showed that the sorption levels are strongly affected by the feed composition. The Flory-Huggins equation yields the higher R-squared coefficient for the sorption selectivity.
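A sketch of the flux-prediction step: for steady-state transport, integrating Fick's first law across the membrane gives J = (1/L) ∫ D(c) dc between the permeate- and feed-side concentrations. The exponential concentration dependence of D and all parameter values below are common modeling assumptions, not figures from the paper.

```python
import numpy as np

def partial_flux(D0, gamma, c_feed, c_perm, L):
    """Steady-state flux from Fick's first law with an exponential,
    concentration-dependent diffusivity D(c) = D0 * exp(gamma * c):
        J = (1/L) * integral of D(c) dc from c_perm to c_feed.
    """
    c = np.linspace(c_perm, c_feed, 1001)
    D = D0 * np.exp(gamma * c)
    dc = c[1] - c[0]
    integral = dc * (D.sum() - 0.5 * (D[0] + D[-1]))  # trapezoid rule
    return integral / L

# Hypothetical parameters for a benzene partial flux (permeate side ~0).
J = partial_flux(D0=1e-10, gamma=8.0, c_feed=0.20, c_perm=0.0, L=1e-4)
```

With gamma > 0, the flux grows faster than linearly with the feed-side concentration, which is one way such models can overestimate flux at high benzene content, as the paper observes.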

Keywords: benzene, cyclohexane, pervaporation, permeation, sorption modeling, SBR

Procedia PDF Downloads 333
20002 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, signal-to-noise ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation for the effects of Target Motion Parameters (TMPs) should therefore be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. The method is carried out in two major steps. In the first step, a discrete search over the whole acceleration-velocity lattice, within a specific interval, finds a coarse minimum of the entropy function. In the second step, a 1-D search over velocity is done in the neighborhood of that minimum, along several constant-acceleration lines, to enhance the accuracy of the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
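The two-step search might look like the following sketch, where a smooth surrogate function stands in for the actual HRRP entropy of the motion-compensated profile (computing the real entropy requires the radar returns, which are not available here); the lattice bounds and the true TMPs are invented for illustration.

```python
import numpy as np

def entropy_surrogate(v, a, v0=12.3, a0=-4.7):
    """Smooth stand-in for the HRRP entropy, minimised at the true
    (velocity, acceleration) pair (v0, a0)."""
    return np.log1p((v - v0) ** 2 + 0.5 * (a - a0) ** 2)

# Step 1: coarse discrete search over the acceleration-velocity lattice.
vs = np.linspace(0.0, 20.0, 41)
accs = np.linspace(-10.0, 10.0, 41)
V, A = np.meshgrid(vs, accs)
i, j = np.unravel_index(np.argmin(entropy_surrogate(V, A)), V.shape)
v_c, a_c = V[i, j], A[i, j]

# Step 2: fine 1-D velocity search along a few constant-acceleration lines.
candidates = ((entropy_surrogate(v, a), v, a)
              for a in (a_c - 0.5, a_c, a_c + 0.5)
              for v in np.linspace(v_c - 0.5, v_c + 0.5, 201))
_, v_hat, a_hat = min(candidates, key=lambda t: t[0])
```

The coarse lattice keeps step 1 cheap; step 2 then refines only the velocity, which is the structure the paper describes.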

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 153
20001 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study aims to investigate the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia, using household- and plot-level data. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous: different unobservable characteristics may affect renting-out decisions. The appropriate method of analysis was therefore instrumental variable estimation. The family of instrumental variable estimators used comprises two-stage least squares (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit). In addition, a two-step method for handling a binary endogenous variable is applied: the first step is a probit model that includes the instruments, and the second step is maximum likelihood estimation (MLE; the 'etregress' command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rent plots relative to owner-operated plots, which supports the Marshallian inefficiency principle in sharecropping. The difference in fertilizer use per hectare could be explained by a lack of incentivized, detailed contract forms: giving a larger share of the output to the tenant under a sharecropping contract would motivate the tenant to apply more fertilizer to rented plots to maximize production, because most sharecropping arrangements split output equally between tenants and landlords.
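A minimal hand-rolled 2SLS on synthetic data illustrates why instrumenting matters here: the endogenous regressor is correlated with the unobservable that also drives the outcome, so plain OLS would be biased, while the two-stage estimate recovers the true coefficient. All data below are simulated; nothing here is the study's data or its Stata specification.

```python
import numpy as np

def tsls(y, x, z):
    """Two-stage least squares, one endogenous regressor, one instrument
    (intercepts included in both stages)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]      # first stage
    X_hat = np.column_stack([np.ones(n), x_hat])
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]       # second stage

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                      # instrument
u = rng.normal(size=n)                      # unobservable driving both
x = 0.8 * z + u + rng.normal(size=n)        # endogenous tenure variable (toy)
y = 1.0 - 0.5 * x + u + rng.normal(size=n)  # fertilizer use per hectare
b0, b1 = tsls(y, x, z)                      # b1 should be near -0.5
```

Regressing y on x directly would load the correlation between x and u onto the slope; the first stage strips that correlation out of x before the second-stage fit.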

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 91
20000 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in exploratory analysis of biomedical data has increased exponentially. For nutritional biologists, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for managing pathologies in the food sector (for example, intolerances and allergies, cholesterol metabolism, diabetic pathologies, arterial hypertension, obesity, and breathing and sleep problems). In this research work, a system was created capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and was tailored to the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists of drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules, based on time series and visual analytics techniques, that allow evaluation of the complete picture of the situation and the evolution of the diet assigned for specific pathologies.

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 31
19999 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain

Authors: Zachary Blanks, Solomon Sonya

Abstract:

Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to terrorist networks elsewhere in the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task: adequately patrolling an entire area (spanning several thousand kilometers) with the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easy to implement, to help learn how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area, etc.) interact with each other, in the hope of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well compared with other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and make significant predictions across a varied data set. Specifically, we apply these methods to improve the accuracy of the adopted prediction models (logistic regression, support vector machine, etc.). Finally, we assess the performance of the models and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research groups at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, applying game theory and machine learning algorithms to develop more efficient ways of reducing poaching.
This research introduces ensemble methods (random forests and stochastic gradient boosting) and applies them to real-world poaching data gathered from Ugandan rain forest park rangers. Second, we consider the effect of data imputation on both the performance of the various algorithms and the general accuracy of the method itself when applied to a dependent variable in which a large number of observations are missing. Third, we provide an alternate approach to predicting the probability of observing poaching, both by season and by month. The results of this research are very promising. We conclude that by using stochastic gradient boosting to predict observations of non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire seasons, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous season-level prediction schedules.
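Predictive mean matching, one of the imputation methods named above, can be sketched as follows. This is a simplified single-predictor, single-imputation version for illustration, not the authors' implementation (which uses multiple imputation over many features).

```python
import numpy as np

def pmm_impute(x, y, k=5, seed=0):
    """Predictive mean matching: fit OLS on the observed rows, then fill
    each missing y with an *observed* value drawn at random from the k
    donors whose predicted means are closest to the missing row's mean."""
    rng = np.random.default_rng(seed)
    y = y.astype(float).copy()
    miss = np.isnan(y)
    A = np.column_stack([np.ones(len(x)), x])
    beta = np.linalg.lstsq(A[~miss], y[~miss], rcond=None)[0]
    pred = A @ beta
    obs_pred, obs_y = pred[~miss], y[~miss]
    for i in np.flatnonzero(miss):
        donors = np.argsort(np.abs(obs_pred - pred[i]))[:k]
        y[i] = obs_y[rng.choice(donors)]
    return y

# Toy data: a linear relationship with two missing responses.
x = np.arange(20.0)
y = 2.0 * x
y[[5, 13]] = np.nan
y_filled = pmm_impute(x, y, k=3)
```

Because donors are real observed values, PMM never invents implausible responses, which is one reason it is popular for survey-style data like patrol observations.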

Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection

Procedia PDF Downloads 295
19998 Efficient L-Xylulose Production Using Whole-Cell Biocatalyst With NAD+ Regeneration System Through Co-Expression of Xylitol Dehydrogenase and NADH Oxidase in Escherichia Coli

Authors: Mesfin Angaw Tesfay

Abstract:

L-Xylulose is a potentially valuable rare sugar used as a starting material for antiviral and anticancer drug development in the pharmaceutical industry. It exists at very low concentrations in nature and has to be synthesized from cheap starting materials, such as xylitol, through biotechnological approaches. In this study, cofactor engineering and deep eutectic solvents were applied to improve the efficiency of L-xylulose production from xylitol. A water-forming NAD+-regenerating enzyme (NADH oxidase) from Streptococcus mutans ATCC 25175 was introduced into E. coli together with the xylitol-4-dehydrogenase (XDH) of Pantoea ananatis, yielding recombinant cells harboring the vector pETDuet-xdh-SmNox. Further, three deep eutectic solvents (DESs), choline chloride/glycerol (ChCl/G), choline chloride/urea (ChCl/U), and choline chloride/ethylene glycol (ChCl/EG), were employed to facilitate the conversion of xylitol to L-xylulose. The co-expression system exhibited optimal activity at 37 °C and pH 8.5, and the addition of Mg2+ enhanced the catalytic activity 1.19-fold. Co-expression of NADH oxidase with the XDH enzyme increased the L-xylulose concentration and productivity from xylitol, as well as the intracellular NAD+ concentration. Two of the DESs used (ChCl/U and ChCl/EG) had positive effects on product yield, while ChCl/G had an inhibiting effect. The optimal concentration of ChCl/U was 2.5%, which increased the L-xylulose yield compared with the control without DES. In a 1 L fermenter, the final concentration and productivity of L-xylulose from 50 g/L of xylitol reached 48.45 g/L and 2.42 g/L·h, respectively, the highest reported to date. Overall, this study presents a suitable approach for large-scale production of L-xylulose from xylitol using the engineered E. coli cells.
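Assuming a 1:1 molar conversion of xylitol to L-xylulose (an assumption, using standard molecular weights), the reported titer corresponds to a near-quantitative molar yield, and the titer and productivity together imply a run of roughly 20 hours:

```python
# Standard molecular weights; assumes a 1:1 xylitol -> L-xylulose conversion.
MW_XYLITOL, MW_XYLULOSE = 152.15, 150.13
max_titer = 50.0 * MW_XYLULOSE / MW_XYLITOL   # g/L at complete conversion
molar_yield = 48.45 / max_titer               # fraction of theoretical, ~0.98
batch_time = 48.45 / 2.42                     # hours implied by 2.42 g/L.h
```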

Keywords: xylitol-4-dehydrogenase, NADH oxidase, L-xylulose, xylitol, co-expression, deep eutectic solvents

Procedia PDF Downloads 30
19997 Linac Quality Controls Using an Electronic Portal Imaging Device

Authors: Domingo Planes Meseguer, Raffaele Danilo Esposito, Maria Del Pilar Dorado Rodriguez

Abstract:

Monthly quality control checks for a radiation therapy linac may be performed in a simple and efficient way once they have been standardized and protocolized. On the other hand, these checks, despite being mandatory, require non-negligible execution time, both machine time and operator time. The amount of disposable material that may be needed, together with the commercial software often used to perform them, must also be taken into account. With the aim of optimizing and standardizing the mechanical-geometric checks and the multileaf collimator checks, we implemented a protocol that makes use of the electronic portal imaging device (EPID) available on our linacs. The software guides the user step by step through the whole procedure. Acquired images are automatically analyzed by our programs, all of them written using only free software.
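As a hypothetical illustration of the kind of automated analysis such a protocol can rely on (this is not the authors' code; the synthetic image and values are invented), the following sketch locates field edges at the 50% intensity level along a central-axis profile of a portal image:

```python
# Hypothetical sketch of one automated EPID check: locating the radiation
# field edges at the 50% intensity level along a central-axis profile.
# The synthetic image and pixel values are invented for illustration.
WIDTH, HEIGHT = 100, 100
image = [[1.0 if 25 <= x < 75 and 20 <= y < 80 else 0.0
          for x in range(WIDTH)] for y in range(HEIGHT)]  # idealized open field

profile = image[50]                           # crossline central-axis profile
half_max = max(profile) / 2.0
inside = [x for x, v in enumerate(profile) if v > half_max]
left_edge, right_edge = inside[0], inside[-1]
field_width_px = right_edge - left_edge

print(left_edge, right_edge, field_width_px)  # compare against a tolerance
```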

Keywords: quality control checks, linac, radiation oncology, medical physics, free software

Procedia PDF Downloads 203
19996 Quintic Spline Method for Variable Coefficient Fourth-Order Parabolic Partial Differential Equations

Authors: Reza Mohammadi, Mahdieh Sahebi

Abstract:

We develop a method based on polynomial quintic splines for the numerical solution of fourth-order non-homogeneous parabolic partial differential equations with variable coefficients. By using polynomial quintic splines at off-step points in the space direction and finite differences in the time direction, we obtain two three-level implicit methods. A stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method. Numerical comparison with existing methods shows the superiority of the presented scheme.
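A representative model problem of this class (the specific equation and coefficients treated in the paper may differ) is the variable-coefficient fourth-order parabolic equation

```latex
\frac{\partial^{2} u}{\partial t^{2}} + \mu(x)\,\frac{\partial^{4} u}{\partial x^{4}} = f(x,t),
\qquad a < x < b,\; t > 0,
```

with appropriate initial and boundary conditions; the quintic spline supplies the spatial derivatives at off-step points while finite differences advance the solution in time.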

Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points, stability analysis

Procedia PDF Downloads 369
19995 Branched Chain Amino Acid Kinesio PVP Gel Tape from Extract of Pea (Pisum sativum L.) Based on Ultrasound-Assisted Extraction Technology

Authors: Doni Dermawan

Abstract:

Modern sports competition, as a consequence of the increasing business and entertainment value of sport, demands that athletes always perform with excellent physical endurance. Prolonged and intensive physical exercise may pose a risk of muscle tissue damage caused by an increase in the enzyme creatine kinase. Branched-chain amino acids (BCAAs) are essential amino acids composed of leucine, isoleucine, and valine, which serve to maintain muscle tissue, support the immune system, and prevent loss of coordination and muscle pain. The pea (Pisum sativum L.) is a leguminous plant rich in BCAAs: every gram of pea protein contains 82.7 mg of leucine, 56.3 mg of isoleucine, and 56.0 mg of valine. This research aims to develop a BCAA preparation from pea extract, applied in a Kinesio tape dosage form with a PVP gel, using ultrasound-assisted extraction. The method used in writing this paper is the Cochrane Collaboration Review, which includes literature studies, testing the quality of the studies, data collection characteristics, analysis, interpretation of results, and clinical trials, as well as recommendations for further research. Extraction of the BCAAs in pea is done using ultrasound-assisted extraction with optimization variables including the extraction solvent (0.1% NaOH), temperature (20-25 °C), time (15-30 minutes), power (80 W), and ultrasonic frequency (35 kHz). The advantages of this extraction method are a high level of solvent penetration into the cell membrane and an increased mass-transfer rate, making the BCAA separation process more efficient. The BCAA extract is then applied to a PVP (polyvinylpyrrolidone) gel composed of PVP K30 and HPMC K100 dissolved in 10 mL of water-methanol (1:1) v/v.
In the Kinesio tape PVP gel preparation, the BCAAs in the gel are absorbed into the muscle tissue and joints through tensile force, which stimulates muscle circulation with variable pressure so that the muscle can increase its biomechanical movement and avoid the damage signaled by the muscle enzyme creatine kinase. Analysis and evaluation of the preparation include interaction, thickness, weight uniformity, humidity, water vapor permeability, active substance levels, content uniformity, percentage elongation, stability testing, release profile, in vitro permeation, and in vivo skin irritation testing.
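The stated amino-acid composition implies the following back-of-envelope BCAA content per gram of pea protein:

```python
# Rough arithmetic from the stated composition: BCAA content per gram of
# pea protein (values quoted in the abstract, in mg per g of protein).
leucine, isoleucine, valine = 82.7, 56.3, 56.0
total_bcaa = leucine + isoleucine + valine

print(f"total BCAA ~ {total_bcaa:.1f} mg/g protein "
      f"({total_bcaa / 10:.2f} % of protein by mass)")
```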

Keywords: branched chain amino acid, BCAA, Kinesio tape, pea, PVP gel, ultrasound-assisted extraction

Procedia PDF Downloads 292
19994 Comparative Production of Secondary Metabolites by Prunus africana (Hook. F.) Kalkman Provenances in Cameroon and Some Associated Endophytic Fungi

Authors: Gloria M. Ntuba-Jua, Afui M. Mih, Eneke E. T. Bechem

Abstract:

Prunus africana (Hook. F.) Kalkman, commonly known as Pygeum or African cherry, belongs to the Rosaceae family. It is a medium to large evergreen tree with a spreading crown of 10 to 20 m. It is used by traditional medical practitioners for the treatment of over 45 ailments in Cameroon and sub-Saharan Africa. In modern medicine, it is used in the treatment of benign prostatic hyperplasia (BPH), i.e., prostate gland hypertrophy (enlarged prostate gland). This is possible because of its ability to produce secondary metabolites that are believed to have bioactivity against these ailments. The ready international market for the sale of Prunus bark, uncontrolled exploitation, illegal harvesting using inappropriate techniques, and poor timing of harvesting have contributed enormously to making the plant endangered. It is known to harbor a large number of endophytic fungi with the potential to produce secondary metabolites similar to those of the parent plant. Alternative sourcing of medicinal principles through endophytic fungi requires succinct knowledge of these fungi and would serve as a conservation measure for Prunus africana by reducing dependence on Prunus bark for such metabolites. This work therefore sought to compare the production of some major secondary metabolites by P. africana and some of its associated endophytic fungi. The leaves and stem bark of the plant from different provenances were soaked in methanol for 72 hr to yield the methanolic crude extracts. Phytochemical screening of the methanolic crude extracts using standard procedures revealed the presence of tannins, flavonoids, terpenoids, saponins, phenolics, and steroids.
Pure cultures of some predominantly isolated endophyte species from the different Prunus provenances, such as Curvularia sp. and Morphospecies P001, were also grown in potato dextrose broth (PDB) for 21 days and later extracted with methylene dichloride (MDC) solvent after 24 hr to produce crude culture extracts. Qualitative assessment of the crude culture extracts showed the presence of tannins, terpenoids, phenolics, and steroids, particularly β-sitosterol (a major bioactive metabolite), as did the plant tissues. Qualitative analysis by thin-layer chromatography (TLC) was done to confirm and compare the production of β-sitosterol (as a marker compound) in the crude extracts of the plant and the endophytes. Samples were loaded on TLC silica gel aluminium-backed plates (Kieselgel 60 F254, 0.2 mm, Merck) using an acetone/hexane (3.0:7.0) solvent system and visualized under an ultraviolet lamp (UV254 and UV360). TLC revealed that leaves had a higher concentration of β-sitosterol, in terms of band intensity, than stem bark from the different provenances. The intensity of the β-sitosterol bands in the culture extracts of the endophytes was comparable to that of the plant extracts, except for Curvularia sp., whose band was very faint. The ability of these fungi to make β-sitosterol was confirmed by TLC analysis, the compound having chromatographic properties (retention factor) similar to those of the β-sitosterol standard. The ability of these major endophytes to produce secondary metabolites similar to those of the host has therefore been demonstrated. There is, therefore, the potential to develop an in vitro production system for Prunus secondary metabolites, thereby enhancing the species' conservation.
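The retention-factor matching mentioned above is simple arithmetic; a small sketch with hypothetical migration distances:

```python
# Retention-factor arithmetic used to match sample spots to the
# β-sitosterol standard on TLC. Distances are hypothetical values in mm.
def retention_factor(spot_mm: float, solvent_front_mm: float) -> float:
    """Rf = distance migrated by the compound / distance of the solvent front."""
    return spot_mm / solvent_front_mm

rf_standard = retention_factor(32.0, 80.0)   # β-sitosterol standard
rf_sample = retention_factor(31.5, 80.0)     # leaf-extract spot

# Spots are considered a match when their Rf values agree closely.
print(rf_standard, rf_sample, abs(rf_standard - rf_sample) < 0.02)
```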

Keywords: Cameroon, endophytic fungi, Prunus africana, secondary metabolite

Procedia PDF Downloads 239
19993 An Extended Domain-Specific Modeling Language for Marine Observatory Relying on Enterprise Architecture

Authors: Charbel Aoun, Loic Lagadec

Abstract:

A sensor network (SN) can be considered as performing two phases: (1) observation/measurement, i.e., the accumulation of gathered data at each sensor node; and (2) transfer of the collected data to some processing center (e.g., fusion servers) within the SN. An underwater sensor network is, accordingly, a sensor network deployed underwater that monitors underwater activity. The deployed sensors, such as hydrophones, are responsible for registering underwater activity and transferring it to more advanced components. This process of data exchange between the aforementioned components defines the marine observatory (MO) concept, which provides information on ocean state, phenomena, and processes. The first step towards implementing this concept is defining the environmental constraints and the required tools and components (marine cables, smart sensors, data fusion servers, etc.). The logical and physical components used in these observatories perform critical functions such as the localization of underwater moving objects. These functions can be orchestrated with other services (e.g., military or civilian reaction). In this paper, we present an extension to our MO meta-model that is used to generate a design tool (ArchiMO). We propose new constraints to be taken into consideration at design time and illustrate our proposal with an example from the MO domain. Additionally, we generate the corresponding simulation code using our self-developed domain-specific model compiler. On the one hand, this illustrates our approach of relying on an Enterprise Architecture (EA) framework that respects multiple views, the perspectives of stakeholders, and domain specificity. On the other hand, it helps reduce both the complexity and the time spent in the design activity, while preventing design-modeling errors when porting this activity to the MO domain.
In conclusion, this work aims to demonstrate that the design of complex systems can be improved through the use of MDE technologies and a domain-specific modeling language with its associated tooling. The major improvement is an early validation step, via models and simulation, that consolidates the system design.

Keywords: smart sensors, data fusion, distributed fusion architecture, sensor networks, domain specific modeling language, enterprise architecture, underwater moving object, localization, marine observatory, NS-3, IMS

Procedia PDF Downloads 182
19992 Development of Analytical Systems for Nurses in Kenya

Authors: Peris Wanjiku

Abstract:

The objective of this paper is to describe the development and implications of a national nursing workforce analytical system in Kenya. Findings: Creating a national electronic nursing workforce analytical system provides more reliable information on nurses' national demographics, migration patterns, and workforce capacity and efficiency. Data analysis is most useful for human resources for health (HRH) planning when workforce capacity data can be linked to worksite staffing requirements. As a result of establishing this database, the Kenya Ministry of Health has improved its capability to assess its nursing workforce and document important workforce trends, such as out-migration. Current data identify the United States as the leading recipient country of Kenyan nurses, and the overwhelming majority of Kenyan nurses who decide to out-migrate are among Kenya's most qualified. Conclusions: The Kenya nursing database is a first step toward facilitating evidence-based decision-making in HRH, and it is unique among developing countries in sub-Saharan Africa. Establishing an electronic workforce database requires long-term investment and sustained support by national and global stakeholders.

Keywords: analytical, information, health, migration

Procedia PDF Downloads 101
19991 Hybrid Control Strategy for Nine-Level Asymmetrical Cascaded H-Bridge Inverter

Authors: Bachir Belmadani, Rachid Taleb, M’hamed Helaimi

Abstract:

Multilevel inverters are widely used in high-power electronic applications because of their ability to generate very good waveform quality, their reduced switching frequency, and the low voltage stress across their power devices. This paper presents a hybrid pulse-width modulation (HPWM) strategy for a uniform-step asymmetrical cascaded H-bridge nine-level inverter (USACHB9LI). The HPWM approach is compared to the well-known sinusoidal pulse-width modulation (SPWM) strategy. Simulation results demonstrate the better performance and technical advantages of the HPWM controller in feeding a high-power induction motor.
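For reference, the baseline SPWM strategy that HPWM is compared against reduces to a comparator between a sinusoidal reference and a triangular carrier; a minimal sketch with arbitrary frequencies and modulation index (not the paper's simulation parameters):

```python
import math

# Minimal illustration of carrier-based sinusoidal PWM for one switch pair:
# the gate signal is high whenever the sinusoidal reference exceeds the
# triangular carrier. Frequencies and modulation index are arbitrary here.
f_ref, f_carrier, m = 50.0, 1050.0, 0.8    # Hz, Hz, modulation index

def triangle(t: float, f: float) -> float:
    """Unit triangular carrier in [-1, 1]."""
    phase = (t * f) % 1.0
    return 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase

def gate(t: float) -> int:
    ref = m * math.sin(2 * math.pi * f_ref * t)
    return 1 if ref > triangle(t, f_carrier) else 0

# Sample one 20 ms fundamental cycle at 100 kHz; average duty is ~50%.
samples = [gate(n / 100000) for n in range(2000)]
print(sum(samples))
```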

Keywords: uniform step asymmetrical cascaded h-bridge high-level inverter, hybrid pwm, sinusoidal pwm, high power induction motor

Procedia PDF Downloads 573
19990 Numerical Simulation of a Combined Impact of Cooling and Ventilation on the Indoor Environmental Quality

Authors: Matjaz Prek

Abstract:

The impact of three different combinations of cooling and ventilation systems on indoor environmental quality (IEQ) has been studied. A comparison of chilled-ceiling cooling combined with displacement ventilation, cooling with a fan coil unit, and cooling with flat-wall displacement outlets was performed. All three combinations were evaluated against whole-body and local thermal comfort criteria as well as ventilation effectiveness. The comparison was made on the basis of numerical simulation with DesignBuilder and Fluent, carried out in two steps. First, the DesignBuilder software environment was used to model the building's thermal performance and evaluate the interaction between the environment and the building; the heat gains of the building and of the individual space, as well as the heat losses through the boundary surfaces of the room, were calculated. In the second step, the Fluent software environment was used to simulate the response of the indoor environment, evaluating the interaction between building and occupant using the simulation results obtained in the first step. Among the systems studied, the ceiling cooling system combined with displacement ventilation was found to be the most suitable, as it offers a high level of thermal comfort with adequate ventilation effectiveness. Fan coil cooling proved inadequate from the standpoint of thermal comfort, whereas flat-wall displacement outlets were inadequate from the standpoint of ventilation effectiveness. The study shows the need to evaluate the indoor environment not solely from the energy-use point of view but also from the point of view of indoor environmental quality.
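One common index behind the ventilation-effectiveness comparisons above is the contaminant-removal effectiveness; a minimal sketch with illustrative concentrations (not values from the study):

```python
# Contaminant-removal form of ventilation effectiveness, one common index
# for comparing systems like those above (concentrations are illustrative):
#   eps = (C_exhaust - C_supply) / (C_mean - C_supply)
def ventilation_effectiveness(c_exhaust: float, c_supply: float,
                              c_mean: float) -> float:
    return (c_exhaust - c_supply) / (c_mean - c_supply)

# Displacement ventilation typically yields eps > 1 (pollutants leave
# before fully mixing); complete mixing gives eps ~ 1.
print(ventilation_effectiveness(500.0, 400.0, 470.0))
```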

Keywords: cooling, ventilation, thermal comfort, ventilation effectiveness, indoor environmental quality, IEQ, computational fluid dynamics

Procedia PDF Downloads 189
19989 Critical Conditions for the Initiation of Dynamic Recrystallization Prediction: Analytical and Finite Element Modeling

Authors: Pierre Tize Mha, Mohammad Jahazi, Amèvi Togne, Olivier Pantalé

Abstract:

Large-size forged blocks made of medium-carbon high-strength steels are extensively used in the automotive industry as dies for the production of bumpers and dashboards through the plastic injection process. The manufacturing of these large blocks starts with ingot casting, followed by open-die forging and a quench-and-temper heat treatment to achieve the desired mechanical properties, and numerical simulation is widely used nowadays to predict these properties before the experiment. However, the temperature gradient inside the specimen remains challenging: the temperature through the material is not uniform before loading, yet simulations commonly impose a constant temperature on the assumption that it has homogenized after some holding time. To be close to the experiment, the real temperature distribution through the specimen is therefore needed before the mechanical loading. We present here a robust algorithm that calculates the temperature gradient within the specimen, thus representing the real temperature distribution before deformation. Indeed, most numerical simulations assume a uniform temperature, which is not really the case because the surface and core temperatures of the specimen are not identical. Another feature that influences the mechanical properties of the specimen is recrystallization, which strongly depends on the deformation conditions and the type of deformation, such as upsetting or cogging. Upsetting and cogging are the stages where the greatest deformations are observed, and many microstructural phenomena can be observed there, such as recrystallization, which requires in-depth characterization. Complete dynamic recrystallization plays an important role in the final grain size during the process and therefore helps improve the mechanical properties of the final product.
Thus, identifying the conditions for the initiation of dynamic recrystallization remains relevant. The temperature distribution within the sample and the strain rate both influence recrystallization initiation, so developing a technique to predict its onset remains challenging. In this perspective, we propose, in addition to the algorithm that yields the temperature distribution before the loading stage, an analytical model for determining the initiation of recrystallization. These two techniques are implemented in the Abaqus finite element software via the UAMP and VUHARD subroutines for comparison with a simulation in which an isothermal temperature is imposed. An artificial neural network (ANN) model describing the plastic behavior of the material is also implemented via the VUHARD subroutine. In the simulations, the temperature distribution inside the material and the initiation of recrystallization are properly predicted and compared to literature models.
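The temperature-initialization idea can be sketched as a relaxation of the 1D heat equation between fixed surface and core values; a minimal illustration (geometry and material values are generic, not the paper's data, and the real algorithm operates on the 3D finite element mesh):

```python
# Hedged sketch of the temperature-initialization step: relax a 1D explicit
# finite-difference heat equation from fixed surface/core boundary values to
# obtain a non-uniform profile across the specimen before loading.
# Material values and geometry are illustrative, not the paper's data.
alpha = 1.2e-5                      # thermal diffusivity, m^2/s (steel-like)
L, n = 0.1, 21                      # half-thickness (m), grid points
dx = L / (n - 1)
dt = 0.4 * dx * dx / alpha          # stable explicit time step (r = 0.4 < 0.5)

T = [1200.0] * n                    # initial core temperature, K
T[0] = 900.0                        # cooled surface, K

for _ in range(2000):               # march toward a smooth gradient
    Tn = T[:]
    for i in range(1, n - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i - 1] - 2 * T[i] + T[i + 1])
    T = Tn

print(round(T[0]), round(T[n // 2]), round(T[-1]))  # surface < mid < core
```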

Keywords: dynamic recrystallization, finite element modeling, artificial neural network, numerical implementation

Procedia PDF Downloads 82
19988 An Intelligent Troubleshooting System and Performance Evaluator for Computer Networks

Authors: Iliya Musa Adamu

Abstract:

This paper seeks to develop an expert system that troubleshoots computer networks and evaluates network system performance, so as to reduce the workload on technicians and increase the efficiency and effectiveness of the solutions offered for computer network problems. The platform of the system was developed using ASP.NET, the code was implemented in Visual Basic, and it was integrated with SQL Server 2005. The knowledge base was represented using production rules, and the search method used in developing the network troubleshooting expert system is a forward-chaining rule-based system. This software tool offers the advantage of providing an immediate solution to most computer network problems encountered by computer users.
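A forward-chaining rule-based system of the kind described can be sketched in a few lines; the rules below are invented examples, not the system's actual knowledge base (which is implemented in Visual Basic with SQL Server):

```python
# Minimal illustration of forward chaining: rules fire when all their
# premises are in working memory, adding conclusions until a fixed point.
# The rules and facts here are invented examples.
rules = [
    ({"no_connectivity", "link_light_off"}, "check_cable"),
    ({"no_connectivity", "link_light_on"}, "check_ip_config"),
    ({"check_ip_config", "dhcp_disabled"}, "assign_static_ip"),
]

def forward_chain(facts: set) -> set:
    derived = set(facts)
    changed = True
    while changed:                  # keep firing rules until nothing new fires
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

result = forward_chain({"no_connectivity", "link_light_on", "dhcp_disabled"})
print(result)
```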

Keywords: expert system, forward chaining rule based system, network, troubleshooting

Procedia PDF Downloads 653
19987 Fake News Detection for Korean News Using Machine Learning Techniques

Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions in the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible to identify it manually. In this context, researchers have tried to develop automated fake news detection using machine learning techniques over the past years, but to the best of our knowledge no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted to quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). In the second step, classifiers are trained using the values produced in step 1. As classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news items from Seoul National University's FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important as well as which classifiers are effective in detecting Korean fake news.
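Step 1 of the pipeline can be illustrated with a bare-bones TF-IDF computation in pure Python (the documents are English placeholders, and the study additionally uses topic modeling and then trains classifiers on the resulting vectors):

```python
import math
from collections import Counter

# Sketch of step 1 (quantifying news text) using plain TF-IDF.
# The documents below are placeholders, not the FactCheck data.
docs = [
    "president announces new economic policy",
    "president secretly replaced by actor claims source",
    "new policy boosts economic growth",
]
tokenized = [d.split() for d in docs]
n_docs = len(tokenized)
df = Counter(term for doc in tokenized for term in set(doc))  # document freq.

def tfidf(doc: list) -> dict:
    tf = Counter(doc)
    return {t: (tf[t] / len(doc)) * math.log(n_docs / df[t]) for t in tf}

vectors = [tfidf(d) for d in tokenized]
# Terms unique to one article score highest; shared terms approach zero.
print(sorted(vectors[1].items(), key=lambda kv: -kv[1])[:3])
```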

Keywords: fake news detection, Korean news, machine learning, text mining

Procedia PDF Downloads 278
19986 Synthesis of Liposomal Vesicles by a Novel Supercritical Fluid Process

Authors: Wen-Chyan Tsai, Syed S. H. Rizvi

Abstract:

Organic solvent residues are always associated with liposomes produced by traditional techniques such as the thin-film hydration and reverse-phase evaporation methods, which limits the applications of these vesicles in the pharmaceutical, food, and cosmetic industries. Our objective was to develop a novel, benign process of liposomal microencapsulation using supercritical carbon dioxide (SC-CO2) as the sole phospholipid-dissolving medium and a green substitute for organic solvents. This process consists of supercritical fluid extraction followed by rapid expansion through a nozzle and automatic cargo suction. Lecithin and cholesterol mixed in a 10:1 mass ratio were dissolved in SC-CO2 at 20 ± 0.5 MPa and 60 °C. After at least two hours of equilibration, the lecithin/cholesterol-laden SC-CO2 was passed through a 1000-micron nozzle and immediately mixed with the cargo solution to form liposomes. Liposomal microencapsulation was conducted at three pressures (8.27, 12.41, 16.55 MPa), three temperatures (75, 83, and 90 °C), and two flow rates (0.25 ml/s and 0.5 ml/s). Liposome size, zeta potential, and encapsulation efficiency were characterized as functions of the operating parameters. The average liposome size varied from 400-500 nm to 1000-1200 nm as the pressure was increased from 8.27 to 16.55 MPa. At 12.41 MPa, 90 °C, and a 0.25 ml/s loading rate of 0.2 M glucose cargo, the highest encapsulation efficiency of 31.65% was achieved. Under a confocal laser scanning microscope, large unilamellar vesicles and multivesicular vesicles were observed to make up the majority of the liposomal emulsion. This new approach is a rapid and continuous process for bulk production of liposomes using a green solvent.
Based on the results to date, it is feasible to apply this technique to encapsulate hydrophilic compounds inside the aqueous core, as well as lipophilic compounds in the phospholipid bilayers of the liposomes, for controlled release, solubility improvement, and targeted therapy of bioactive compounds.
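The encapsulation-efficiency figure quoted above follows from simple ratio arithmetic; the amounts below are illustrative, only the ratio matters:

```python
# Encapsulation-efficiency arithmetic behind figures like the reported
# 31.65 % (amounts are illustrative; EE% = encapsulated / total x 100).
def encapsulation_efficiency(encapsulated_mg: float, total_mg: float) -> float:
    return encapsulated_mg / total_mg * 100.0

print(encapsulation_efficiency(31.65, 100.0))
```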

Keywords: liposome, microencapsulation, supercritical carbon dioxide, non-toxic process

Procedia PDF Downloads 434
19985 Model Evaluation of Thermal Effects Created by Cell Membrane Electroporation

Authors: Jiahui Song

Abstract:

The use of very high electric fields (~100 kV/cm or higher) with pulse durations in the nanosecond range is a recent development. Such electric pulses have been used as tools to generate electroporation, which has many biomedical applications. Most studies of electroporation have ignored possible thermal effects because of the short duration of the applied voltage pulses; however, membrane temperature gradients ranging from 0.2×10⁹ to 10⁹ K/m have been predicted. This research focuses on thermal gradients as drivers of electroporative enhancement, even though the actual temperatures might not change appreciably from their equilibrium levels. The dynamics of pore formation under an externally applied electric field is studied on the basis of molecular dynamics (MD) simulations using the GROMACS package. Different temperatures are assigned to various regions to simulate the appropriate temperature gradients. GROMACS provides the force fields for the lipid membrane, which is taken to comprise dipalmitoyl-phosphatidyl-choline (DPPC) molecules. A water model mimics the aqueous environment surrounding the membrane. Velocities of water and membrane molecules are generated randomly at each simulation run according to a Maxwellian distribution. For statistical significance, a total of eight MD simulations are carried out with different starting molecular velocities. MD simulation shows no pore formed in a 10-ns snapshot for a DPPC membrane held at a uniform temperature of 295 K after a 0.4 V/nm electric field is applied. A nano-sized pore is clearly seen in a 10-ns snapshot of the same geometry but with the top and bottom membrane surfaces kept at 300 and 295 K, respectively. For the same applied electric field, the formation of nanopores is thus clearly demonstrated, but only in the presence of a temperature gradient.
The MD simulation results show enhanced electroporative effects arising from thermal gradients. The study suggests that the temperature gradient is a secondary driver, with the electric field being the primary cause of electroporation.
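The Maxwellian velocity assignment described above can be sketched as follows (the molecular mass is a generic illustrative value, and GROMACS performs this initialization internally):

```python
import math
import random

# Sketch of Maxwellian velocity initialization: each Cartesian velocity
# component is drawn from a Gaussian with variance kT/m. The molecular
# mass and region temperatures are illustrative values only.
kB = 1.380649e-23          # Boltzmann constant, J/K
m = 2.99e-26               # kg, roughly one water molecule

def maxwell_velocity(temp_k: float) -> tuple:
    sigma = math.sqrt(kB * temp_k / m)   # per-component std dev, m/s
    return tuple(random.gauss(0.0, sigma) for _ in range(3))

random.seed(0)
# Give two membrane-facing regions slightly different temperatures,
# mimicking the 300 K / 295 K gradient used in the simulations.
v_hot = [maxwell_velocity(300.0) for _ in range(20000)]
v_cold = [maxwell_velocity(295.0) for _ in range(20000)]

def mean_sq_speed(vs):
    return sum(vx * vx + vy * vy + vz * vz for vx, vy, vz in vs) / len(vs)

# Each mean-square speed should sit near its kinetic-theory value 3kT/m.
print(round(mean_sq_speed(v_hot)), round(mean_sq_speed(v_cold)))
```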

Keywords: nanosecond, electroporation, thermal effects, molecular dynamics

Procedia PDF Downloads 85
19984 Synthesis of Electrospun Polydimethylsiloxane (PDMS)/Polyvinylidene Fluoriure (PVDF) Nanofibrous Membranes for CO₂ Capture

Authors: Wen-Wen Wang, Qian Ye, Yi-Feng Lin

Abstract:

Carbon dioxide emissions are expected to increase continuously, resulting in climate change and global warming. As a result, CO₂ capture has attracted a large amount of research attention. Among the various CO₂ capture methods, membrane technology has proven highly efficient in capturing CO₂ because it can be scaled up and requires low energy consumption and a small footprint for gas separation. In this study, a membrane contactor combining chemical absorption with a membrane process is used for post-combustion CO₂ capture. In the membrane contactor system, highly porous and water-repellent nanofibrous membranes, prepared by a simple electrospinning process, serve as the gas-liquid interface for CO₂ absorption. In this work, we successfully prepared porous polyvinylidene fluoride (PVDF) membranes by electrospinning and used them for the CO₂ capture application. However, the pristine PVDF nanofibrous membranes were wetted by the amine absorbents, decreasing the CO₂ absorption flux, so hydrophobic polydimethylsiloxane (PDMS) was added to the PVDF nanofibrous membranes to improve their solvent resistance. To further increase the hydrophobicity and the CO₂ absorption flux, more hydrophobic surfaces of the PDMS/PVDF nanofibrous membranes were obtained by grafting fluoroalkylsilane (FAS) onto the membrane surface. The highest CO₂ absorption flux of the PDMS/PVDF nanofibrous membranes was reached after four rounds of FAS modification. The PDMS/PVDF nanofibrous membranes with 60 wt% PDMS withstood long, continuous CO₂ absorption and regeneration experiments.
This demonstrates that the as-prepared PDMS/PVDF nanofibrous membranes could potentially be used for large-scale CO₂ absorption during the post-combustion process in power plants.

Keywords: CO₂ capture, electrospinning process, membrane contactor, nanofibrous membranes, PDMS/PVDF

Procedia PDF Downloads 276