Search results for: ϵ - constraint method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18897

14517 Strength Investigation of Liquefied Petroleum Gas Cylinders: Dynamic Loads

Authors: Moudar Zgoul, Hashem Alkhaldi

Abstract:

A large number of transportable LPG cylinders are manufactured annually for domestic use. These LPG cylinders are manufactured from mild steel and filled with a maximum of 12.5 kg of liquefied gas under an internal pressure of 0.6 N/mm² at a temperature of 50°C. Many millions of such LPG cylinders are in daily use, mainly for purposes of space heating, water heating, and cooking. Thereby, they are exposed to severe conditions that can lead to their failure. Each year no fewer than 5000 of these LPG cylinders fail, and some of those failures cause damage and loss of lives and property. In this work, LPG cylinders were investigated; stress calculations and deformations under dynamic (impact) loadings were carried out to simulate the effects of such loads on the cylinders while in service. Analysis of the LPG cylinders was carried out using the finite element method; shell and cylindrical elements were used at the top, bottom, and in the middle (weld region), permitting elastic-plastic analysis of a thin-walled LPG cylinder. Variables such as the maximum stresses and maximum deflections under the effect of impact loading were investigated in this work. Results showed that the maximum stresses reach 680 MPa when the cylinder is dropped from a 3 m height. The maximum radial deformation occurs at the cylinder’s top in the case of a top-position impact. This information should be useful for enhancing the strength of such cylinders and for prolonging their service life.
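
For orientation only (the abstract does not report the cylinder radius or wall thickness), the quasi-static membrane stresses in a thin-walled cylinder under internal pressure follow the standard pressure-vessel relations; this is a background sketch, not the paper's finite element model:

```latex
% Thin-walled cylinder under internal pressure p, mean radius r, wall thickness t (background sketch)
\sigma_{\theta} = \frac{p\,r}{t} \quad \text{(hoop)}, \qquad
\sigma_{z} = \frac{p\,r}{2t} \quad \text{(axial)}
```

The impact stresses of up to 680 MPa reported above come from the dynamic finite element analysis, not from these closed-form relations.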

Keywords: dynamic analysis, finite element method, impact load, LPG cylinders

Procedia PDF Downloads 307
14516 Perceived Environmental Effects of Charcoal Production among Rural Dwellers in Rainforest and Guinea Savannah Agro-Ecological Zones of Nigeria

Authors: P. O. Eniola, S. O. Odebode

Abstract:

Charcoal production poses serious environmental problems in most developing countries of the world. Hence, the study assessed the perceived environmental effects of charcoal production (CP) among rural dwellers in the rainforest and guinea savannah (GS) zones of Nigeria. A multi-stage sampling procedure was used to select 83 and 85 charcoal producers in the GS and rainforest zones, respectively. Eighteen statements on the perceived environmental effects of charcoal production were compiled. Data were collected through a structured interview schedule and analysed using both descriptive and inferential statistics. Descriptive analysis showed that the mean age was 43 years; 90.5% of respondents were male, 90.6% were married, and 35.3% had no formal education. The majority (80.0%) of the respondents use the earth mound method of CP, and 52.9% produced between 32 and 32,000 kg of charcoal per annum. Respondents perceived that charcoal production could lead to erosion (62.7%), reduce the trees available for future use (62.4%) and reduce the available air in the environment (54.1%). A significant difference existed in the perceived environmental effects of charcoal production between the rainforest and guinea savannah agro-ecological zones (F=14.62). There is a need for the government to quickly work on other available and affordable alternative household energy sources.

Keywords: deforestation, energy, earth mound method, environment

Procedia PDF Downloads 378
14515 Evaluation of Expected Annual Loss Probabilities of RC Moment Resisting Frames

Authors: Saemee Jun, Dong-Hyeon Shin, Tae-Sang Ahn, Hyung-Joon Kim

Abstract:

Building loss estimation methodologies, which have advanced considerably in recent decades, are usually used to estimate the socio-economic impacts resulting from seismic structural damage. In accordance with these methods, this paper presents the evaluation of the annual loss probability of a reinforced concrete moment resisting frame designed according to the Korean Building Code. The annual loss probability is defined by (1) a fragility curve obtained from a capacity spectrum method similar to the method adopted in HAZUS, and (2) a seismic hazard curve derived from annual frequencies of exceedance per peak ground acceleration. Seismic fragilities are computed to calculate the annual loss probability of a certain structure using functions depending on structural capacity, seismic demand, structural response and the probability of exceeding damage state thresholds. This study carried out a nonlinear static analysis to obtain the capacity of an RC moment resisting frame selected as a prototype building. The analysis results show that the annual probability of extensive structural damage in the prototype building is estimated at 0.004%.
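
As a sketch of how such an annual damage-state probability is typically assembled from the two ingredients named above (the exact formulation used by the authors is not given in the abstract):

```latex
% Annual frequency of reaching damage state DS: convolution of fragility and hazard
\lambda_{DS} \;=\; \int_{0}^{\infty} P\!\left[\,DS \mid IM = a\,\right]\,
\left|\frac{d\lambda_{IM}(a)}{da}\right|\, da
```

where P[DS | IM = a] is the fragility curve and λ_IM(a) is the annual frequency of exceedance of the ground-motion intensity measure (here, peak ground acceleration).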

Keywords: expected annual loss, loss estimation, RC structure, fragility analysis

Procedia PDF Downloads 390
14514 Minimum Vertices Dominating Set Algorithm for Secret Sharing Scheme

Authors: N. M. G. Al-Saidi, K. A. Kadhim, N. A. Rajab

Abstract:

Over the past decades, computer networks and data communication systems have been developing fast, so the necessity to protect transmitted data is a challenging issue, and data security has become a serious problem nowadays. A secret sharing scheme is a method which allows a master key to be distributed among a finite set of participants in such a way that only certain authorized subsets of participants can reconstruct the original master key. To create a secret sharing scheme, many mathematical structures have been used; the most widely used structure is the one based on graph theory (graph access structure). Subsequently, many researchers have tried to find efficient schemes based on graph access structures. In this paper, we propose a novel efficient construction of a perfect secret sharing scheme for a uniform access structure. The dominating set of vertices in a regular graph is used for this construction in the following way: each vertex represents a participant, and each minimum independent dominating subset represents a minimal qualified subset. Some relations between the dominating set, graph order and regularity are derived and can be used to demonstrate the possibility of using a dominating set to construct a secret sharing scheme. The information rate, which is used as a measure of the efficiency of such schemes, is calculated to show that the proposed method yields improved values.
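
To make the graph-access-structure idea concrete, the following minimal Python sketch enumerates the minimum independent dominating sets of a small regular graph by brute force. The example graph (the Petersen graph) and the exhaustive search are illustrative assumptions only; they are not the authors' construction or their efficiency results.

```python
from itertools import combinations

# The Petersen graph (3-regular, 10 vertices) as an adjacency list; used purely for illustration.
graph = {
    0: {1, 4, 5}, 1: {0, 2, 6}, 2: {1, 3, 7}, 3: {2, 4, 8}, 4: {0, 3, 9},
    5: {0, 7, 8}, 6: {1, 8, 9}, 7: {2, 5, 9}, 8: {3, 5, 6}, 9: {4, 6, 7},
}

def is_independent(subset):
    # No two chosen vertices are adjacent.
    return all(v not in graph[u] for u, v in combinations(subset, 2))

def is_dominating(subset):
    # Every vertex is chosen or adjacent to a chosen vertex.
    covered = set(subset) | {w for v in subset for w in graph[v]}
    return covered == set(graph)

def minimum_independent_dominating_sets():
    # Smallest size at which independent dominating sets exist.
    for size in range(1, len(graph) + 1):
        found = [set(s) for s in combinations(graph, size)
                 if is_independent(s) and is_dominating(s)]
        if found:
            return found

qualified = minimum_independent_dominating_sets()
print(qualified)  # each set plays the role of one minimal qualified subset of participants
```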

Keywords: secret sharing scheme, dominating set, information rate, access structure, rank

Procedia PDF Downloads 380
14513 Optimization of Cacao Fermentation in Davao Philippines Using Sustainable Method

Authors: Ian Marc G. Cabugsa, Kim Ryan Won, Kareem Mamac, Manuel Dee, Merlita Garcia

Abstract:

An optimized cacao fermentation technique was developed for the cacao farmers of Davao City, Philippines. Cacao samples with weights ranging from 150-250 kilograms were collected from various cacao farms in Davao City and Zamboanga City, Philippines. Different fermentation techniques were used, varying the design of the sweat box, pre-fermentation conditioning, the number of days of fermentation and the number of turns. As the beans fermented, their temperature was regularly monitored using a digital thermometer. The resultant cacao beans were assessed by physical and chemical means. For the physical assessment, the bean cut test, bean count test, and sensory test were used. Quantification of theobromine, caffeine, and antioxidants in the form of equivalent quercetin was used for the chemical assessment. Both theobromine and caffeine were analyzed using an HPLC method, while the antioxidants were analyzed spectrometrically. To come up with the best fermentation procedure, the different assessments were given priority coefficients, wherein the physical tests (taste, cut, and bean count) were given priority over the results of the chemical tests. The result of the study was an optimized fermentation protocol that is readily adaptable and transferable to any cacao cooperative or group in Mindanao or even the Philippines as a whole.

Keywords: cacao, fermentation, HPLC, optimization, Philippines

Procedia PDF Downloads 437
14512 Numerical and Experimental Investigation of a Mechanical System with a Pendulum

Authors: Andrzej Mitura, Krzysztof Kecik, Michal Augustyniak

Abstract:

This paper presents numerical and experimental research on a nonlinear two-degrees-of-freedom system. The tested system consists of a mechanical oscillator (the primary subsystem) with an attached pendulum (the secondary subsystem). The oscillator is suspended on a linear (or nonlinear) coil spring and a nonlinear magnetorheological damper, and it is excited kinematically. The attached pendulum can be used to reduce the vibration of the primary subsystem or for energy harvesting. The numerical and experimental investigations showed that the pendulum can perform several types of motion, for example: chaotic motion, a constant position in the lower or upper configuration (stable inverted pendulum), rotation, and symmetrical or asymmetrical swinging vibrations. The main objective of this study is to determine the influence of the system parameters on enlarging the zone in which the pendulum rotates. As a final effect, a semi-active control method to switch the pendulum solution to rotation is proposed. The magnetorheological damper is applied to implement this method. Continuous rotation of the pendulum is desirable for the recovery of energy. The work is financed by Grant no. 0234/IP2/2011/71 from the Polish Ministry of Science and Higher Education in the years 2012-2014.

Keywords: autoparametric vibrations, chaos and rotation control, magnetorheological damper

Procedia PDF Downloads 362
14511 Binary Decision Diagram Based Methods to Evaluate the Reliability of Systems Considering Failure Dependencies

Authors: Siqi Qiu, Yijian Zheng, Xin Guo Ming

Abstract:

In many reliability and risk analyses, component failures are assumed to be independent. However, in reality, ignoring failure dependencies among components may render the results of reliability and risk analysis incorrect. There are two principal ways to incorporate failure dependencies in system reliability and risk analysis: implicit and explicit methods. In the implicit method, failure dependencies can be modeled by joint probabilities, correlation values or conditional probabilities. In the explicit method, certain types of dependencies can be modeled in a fault tree as mutually independent basic events for specific component failures. In this paper, explicit and implicit methods based on binary decision diagrams (BDD) are proposed to evaluate the reliability of systems considering failure dependencies. The obtained results prove the equivalence of the proposed implicit and explicit methods. It is found that the consideration of failure dependencies decreases the reliability of systems. This observation is intuitive, because more components fail due to failure dependencies. The consideration of failure dependencies helps designers to reduce the dependencies between components during the design phase to make the system more reliable.

Keywords: reliability assessment, risk assessment, failure dependencies, binary decision diagram

Procedia PDF Downloads 454
14510 Thermodynamic Modeling of Three Pressure Level Reheat HRSG, Parametric Analysis and Optimization Using PSO

Authors: Mahmoud Nadir, Adel Ghenaiet

Abstract:

The main purpose of this study is the thermodynamic modeling, parametric analysis, and optimization of a three-pressure-level reheat HRSG (Heat Recovery Steam Generator) using the PSO (Particle Swarm Optimization) method. In this paper, a parametric analysis followed by a thermodynamic optimization is presented. The chosen objective function is the specific work of the steam cycle, which, in the case of a combined cycle (CC), may be a good criterion for thermodynamic performance analysis, in contrast to conventional steam turbines, for which thermal efficiency is also an important criterion. Technological constraints such as the maximal steam cycle temperature, the minimal steam fraction at the steam turbine outlet, the maximal steam pressure, the minimal stack temperature, the minimal pinch point, and the maximal superheater effectiveness are also considered. The parametric analyses permitted an understanding of the effect of the design parameters and the constraints on the variation of the steam cycle specific work. The PSO algorithm was used successfully in the HRSG optimization; the achieved results are in accordance with those of previous studies in which genetic algorithms were used. Moreover, this method is easy to implement compared with the other methods.
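
As a minimal sketch of the PSO mechanics referred to above (a generic, bounded maximization of an objective f; the toy objective, bounds, and coefficient values are illustrative assumptions, not the authors' HRSG model or constraints):

```python
import numpy as np

def pso_maximize(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic particle swarm optimizer (maximization) over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))   # particle positions
    v = np.zeros_like(x)                                        # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                              # box constraints
        vals = np.array([f(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Toy stand-in for "steam cycle specific work" as a function of two design variables.
objective = lambda p: -(p[0] - 1.2) ** 2 - (p[1] - 0.4) ** 2 + 3.0
best_x, best_val = pso_maximize(objective, bounds=[(0.0, 3.0), (0.0, 1.0)])
print(best_x, best_val)
```

In practice, the technological constraints listed in the abstract would enter either as bounds on the design variables or as penalty terms added to the objective.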

Keywords: combined cycle, HRSG thermodynamic modeling, optimization, PSO, steam cycle specific work

Procedia PDF Downloads 368
14509 Framework to Quantify Customer Experience

Authors: Anant Sharma, Ashwin Rajan

Abstract:

Customer experience is measured today by defining a set of metrics and KPIs, setting up thresholds and defining triggers across those thresholds. While this is an effective way of measuring against a Key Performance Indicator (referred to as KPI in the rest of the paper), this approach cannot capture the various nuances that make up the overall customer experience. Customers consume a product or service at various levels, which is reflected not only in metrics like Customer Satisfaction or Net Promoter Score but also in other measurements like recurring revenue, frequency of service usage, e-learning and depth of usage. Here we explore an alternative method of measuring customer experience by flipping the traditional view. Rather than rolling customers up to a metric, we roll up metrics to hierarchies and then measure customer experience. This method allows any team to quantify customer experience across multiple touchpoints in a customer’s journey. We make use of various data sources which contain information for metrics like CXSAT, NPS, renewals, and depth of service usage collected across a customer lifecycle. These data can be mined systematically to get linkages between different data points like geographies, business groups, products and time. Additional views can be generated by blending synthetic contexts into the data to show trends and top/bottom types of reports. We have created a framework that allows us to measure customer experience using the above logic.

Keywords: analytics, customer experience, BI, business operations, KPIs, metrics

Procedia PDF Downloads 58
14508 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm

Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang

Abstract:

In order to solve the problem of sample impoverishment in the traditional particle filter algorithm and achieve accurate state estimation in a nonlinear system, a particle filter method based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the bee foraging and optimization process and moves particles toward the high-likelihood region of the posterior probability to improve the rationality of the particle distribution. The opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker (following) bees, which makes the algorithm escape local extrema quickly and continue searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of particle filters, ensure the diversity of particles, and improve the rationality of the particle distribution.
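
A minimal sketch of the opposition-based learning (OBL) initialization mentioned above, in its generic form over box bounds (the population size, bounds, and fitness function are illustrative assumptions, not the authors' particle filter setup):

```python
import numpy as np

def obl_initialize(fitness, bounds, pop_size=20, seed=0):
    """Opposition-based learning init: keep the fitter half of each point and its opposite."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    opposite = lo + hi - pop                     # opposite point: x_opp = a + b - x
    both = np.vstack([pop, opposite])
    fit = np.array([fitness(x) for x in both])
    best_idx = np.argsort(fit)[-pop_size:]       # keep the pop_size fittest candidates
    return both[best_idx]

# Example: initialize a 2-D food-source population for an ABC-style optimizer (maximization).
sources = obl_initialize(lambda x: -np.sum(x ** 2), bounds=[(-5.0, 5.0), (-5.0, 5.0)])
print(sources.shape)  # (20, 2)
```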

Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm

Procedia PDF Downloads 128
14507 Next-Generation Lunar and Martian Laser Retro-Reflectors

Authors: Simone Dell'Agnello

Abstract:

There are laser retroreflectors on the Moon and no laser retroreflectors on Mars. Here we describe the design, construction, qualification and imminent deployment of next-generation, optimized laser retroreflectors on the Moon and on Mars (where they will be the first ones). These instruments are positioned by time-of-flight measurements of short laser pulses, the so-called 'laser ranging' technique. Data analysis is carried out with PEP, the Planetary Ephemeris Program of CfA (Center for Astrophysics). Since 1969, Lunar Laser Ranging (LLR) to the Apollo/Lunokhod laser retro-reflector (CCR) arrays has supplied accurate tests of General Relativity (GR) and new gravitational physics: possible variation of the gravitational constant (Gdot/G), the weak and strong equivalence principle, gravitational self-energy (Parametrized Post-Newtonian parameter beta), geodetic precession, and the inverse-square force law; it can also constrain gravitomagnetism. Some of these measurements also allowed for testing extensions of GR, including spacetime torsion and non-minimally coupled gravity. LLR has also provided significant information on the composition of the deep interior of the Moon. In fact, LLR first provided evidence of the existence of a fluid component of the deep lunar interior. In 1969, CCR arrays contributed a negligible fraction of the LLR error budget. Since laser station range accuracy has improved by more than a factor of 100, the current arrays now dominate the error budget, because of lunar librations and their multi-CCR geometry. We developed a next-generation, single, large CCR, MoonLIGHT (Moon Laser Instrumentation for General relativity High-accuracy Test), unaffected by librations, which supports an improvement of the space segment of the LLR accuracy by up to a factor of 100. INFN also developed INRRI (INstrument for landing-Roving laser Retro-reflector Investigations), a microreflector to be laser-ranged by orbiters. Their performance is characterized at the SCF_Lab (Satellite/lunar laser ranging Characterization Facilities Lab, INFN-LNF, Frascati, Italy) for their deployment on the lunar surface or in cislunar space. They will be used to accurately position landers, rovers, hoppers and orbiters of Google Lunar X Prize and space agency missions, thanks to LLR observations from stations of the International Laser Ranging Service in the USA, France and Italy. INRRI was launched in 2016 with the ESA mission ExoMars (Exobiology on Mars) EDM (Entry, descent and landing Demonstration Module), deployed on the Schiaparelli lander, and is proposed for the ExoMars 2020 Rover. Based on an agreement between NASA and ASI (Agenzia Spaziale Italiana), another microreflector, LaRRI (Laser Retro-Reflector for InSight), was delivered to JPL (Jet Propulsion Laboratory) and integrated on NASA’s InSight Mars Lander in August 2017 (launch scheduled in May 2018). Another microreflector, LaRA (Laser Retro-reflector Array), will be delivered to JPL for deployment on the NASA Mars 2020 Rover. The first lunar landing opportunities will be from early 2018 (with TeamIndus) to late 2018 with commercial missions, followed by opportunities with space agency missions, including the proposed deployment of MoonLIGHT and INRRI on NASA’s Resource Prospector and its evolutions. In conclusion, we will significantly extend the CCR Lunar Geophysical Network and populate the Mars Geophysical Network. These networks will enable significantly improved tests of GR.

Keywords: general relativity, laser retroreflectors, lunar laser ranging, Mars geodesy

Procedia PDF Downloads 254
14506 The Pore–Scale Darcy–Brinkman–Stokes Model for the Description of Advection–Diffusion–Precipitation Using Level Set Method

Authors: Jiahui You, Kyung Jae Lee

Abstract:

Hydraulic fracturing fluid (HFF) is widely used in shale reservoir production. HFF contains diverse chemical additives, which result in the dissolution and precipitation of minerals through multiple chemical reactions. In this study, a new pore-scale Darcy–Brinkman–Stokes (DBS) model coupled with the Level Set Method (LSM) is developed to address the microscopic phenomena occurring during the iron–HFF interaction by numerically describing mass transport, chemical reactions, and pore structure evolution. The new model is developed based on OpenFOAM, an open-source platform for computational fluid dynamics. Here, the DBS momentum equation is used to solve for velocity by accounting for the fluid-solid mass transfer, and an advection-diffusion equation is used to compute the distribution of injected HFF and iron. The reaction-induced pore evolution is captured by applying the LSM, where the solid-liquid interface is updated by solving the level set function and re-initializing it to a signed distance function. Then, a smoothed Heaviside function gives a smoothed solid-liquid interface over a narrow band with a fixed thickness. The stated equations are discretized by the finite volume method, while the re-initialization equation is discretized by the central difference method. A Gauss linear upwind scheme is used to solve the level set function, and the Pressure-Implicit with Splitting of Operators (PISO) method is used to solve the momentum equation. The numerical result is compared with the 1-D analytical solution of the fluid-solid interface for reaction-diffusion problems. Sensitivity analysis is conducted for various Damkohler numbers (DaII) and Peclet numbers (Pe). We categorize the Fe (III) precipitation into three patterns as a function of DaII and Pe: symmetrical smoothed growth, unsymmetrical growth, and dendritic growth. Pe and DaII significantly affect the location of precipitation, which is critical in determining the injection parameters of hydraulic fracturing. When DaII<1, the precipitation occurs uniformly on the solid surface in both the upstream and downstream directions. When DaII>1, the precipitation occurs mainly on the solid surface in the upstream direction. When Pe>1, Fe (II) is transported deep into the pores and precipitates inside them. When Pe<1, the precipitation of Fe (III) occurs mainly on the solid surface in the upstream direction, and it is easily precipitated inside the small pore structures. The porosity-permeability relationship is subsequently presented. This pore-scale model allows high confidence in the description of Fe (II) dissolution, transport, and Fe (III) precipitation. The model shows fast convergence and requires a low computational load. The results can provide reliable guidance for injecting HFF in shale reservoirs to avoid clogging and wellbore pollution. Understanding Fe (III) precipitation and Fe (II) release and transport behaviors gives rise to a highly efficient hydraulic fracturing project.
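
For reference, the two standard level set relations alluded to above — interface advection and re-initialization toward a signed distance function — are commonly written as follows (a generic sketch, not the exact discretized form used in the paper):

```latex
% Level set advection of the interface by the velocity field u
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = 0
% Re-initialization toward a signed distance function (|\nabla\phi| \to 1)
\frac{\partial \phi}{\partial \tau} = \operatorname{sign}(\phi_{0})\left(1 - |\nabla\phi|\right)
```

The solid-liquid interface is the zero level set φ = 0, smoothed over a narrow band of fixed thickness by a regularized Heaviside function H_ε(φ).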

Keywords: reactive transport, shale, kerogen, precipitation

Procedia PDF Downloads 152
14505 Assessment of Some Local Clay Minerals Used for the Production of Floor Tiles: Panacea for Economic Growth

Authors: Ekenyem Stan Chinweike

Abstract:

The suitability of some clay deposits in south eastern Nigeria (Unwana, Ekebedi and Nsu) as materials for the production of floor tiles was investigated. The clay samples were analyzed using the wet classical method to determine their chemical composition. Floor tile test specimens were produced using a standard method. The test specimens were tested for physical properties such as compressive strength and porosity at temperature levels of 1050°C and 1150°C. The chemical analysis showed the following results: Unwana (SiO2 52.24%, Al2O3 27.20%, Fe2O3 7%, TiO2 1.52%), Ekebedi (SiO2 58.53%, Al2O3 28.42%, Fe2O3 7%, TiO2 1.12%), Nsu (SiO2 58.16%, Al2O3 28.42%, Fe2O3 1.89%, TiO2 0.82%). The compressive strengths of the Unwana, Ekebedi and Nsu clays at 1050°C are 15 MPa, 13.75 MPa and 13.5 MPa, respectively. At 1150°C, the values are 16.2 MPa and 16.0 MPa for the Ekebedi and Nsu clays, respectively. The porosities of the Unwana, Ekebedi and Nsu clays at 1050°C are 31.57%, 23.15% and 24.21%, respectively. At 1150°C, the values are 23.65% and 24.75% for Ekebedi and Nsu, respectively. All three clays can be used for the production of tiles, but Ekebedi has the highest compressive strength, which makes it the most suitable clay for the production of floor tiles when compared with floor tiles of the same nominal size stipulated by the ASTM standard.

Keywords: feldspar, quartz, porosity, compressive strength, clay minerals

Procedia PDF Downloads 363
14504 A Method for Solid-Liquid Separation of Cs+ from Radioactive Waste by Using Ionic Liquids and Extractants

Authors: J. W. Choi, S. Y. Cho, H. J. Lee, W. Z. Oh, S. J. Choi

Abstract:

Ionic liquids (ILs), which are alternatives to conventional organic solvents, were used for the extraction of Cs ions. ILs, as useful environmentally friendly green solvents, have recently been applied as replacements for traditional volatile organic compounds (VOCs) in the liquid/liquid extraction of heavy metal ions as well as organic and inorganic species and pollutants. Thus, ionic liquids were used for the extraction of Cs ions from liquid radioactive waste. In most cases, Cs ions are present in radioactive wastes at very low concentrations, approximately less than 1 ppm. Therefore, unlike established extraction systems, the required amount of ILs as extractant is comparatively very small. This extraction method involves a cation exchange mechanism in which a Cs ion transfers to the organic phase and binds to one crown ether by chelation, in exchange for a single IL cation, IL_cation+, transferring to the aqueous phase. This extraction system showed solid-liquid separation, in which the ionic liquid 1-ethyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide (C2mimTf2N) and the crown ether dicyclohexano-18-crown-6 (DCH18C6) were both used in very small amounts as solvent and extractant, respectively. A 30 mM CsNO3 solution was used as the simulated waste solution containing cesium ions. Generally, in liquid-liquid extraction, the molar ratio of CE:Cs+:ILs is 1:5~10:>100, while our applied molar ratio of CE:Cs+:ILs was 1:2:1~10. The quantities of CE and Cs ions were fixed at 0.6 and 1.2 mmol, respectively. The precipitation phenomenon showed two kinds of separation: solid-liquid separation at the ratios 1:2:1 and 1:2:2, and solid-liquid-liquid separation (3 phases) at the ratios 1:2:5 and 1:2:10. In the last system, the 3 phases were precipitate-ionic liquid-aqueous. The precipitate was verified to consist of Cs+, DCH18C6 and Tf2N-, based on the cation exchange mechanism. We analyzed the precipitate using scanning electron microscopy with X-ray microanalysis (SEM-EDS), an elemental analyser, Fourier transform infrared spectroscopy (FT-IR) and differential scanning calorimetry (DSC). The experimental results demonstrated an easy extraction method and confirmed the composition of the solid precipitate. We also found that the complex formation ratio of Cs+ to DCH18C6 is 0.88:1 regardless of the C2mimTf2N quantity.

Keywords: extraction, precipitation, solid-liquid separation, ionic liquid, precipitate

Procedia PDF Downloads 404
14503 Kluyveromyces marxianus ABB S8 as Yeast-Based Technology to Manufacture Low FODMAP Baking Good

Authors: Jordi Cuñé, Carlos de Lecea, Laia Marti

Abstract:

Small molecules known as fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs) are quickly fermented in the large intestine after being poorly absorbed in the small intestine. There is evidence that individuals suffering from functional gastrointestinal disorders, like irritable bowel syndrome (IBS), observe an improvement while following a diet low in FODMAPs. Because wheat has a relatively high fructan content, it is a key source of FODMAPs in our diet. A yeast-based method was created in this study to lower the FODMAP content of (whole wheat) bread. In contrast to fermentation by regular baker's yeast, the combination of Kluyveromyces marxianus ABB S7 with Saccharomyces cerevisiae allowed a 60% reduction of the fructan content without the appearance of other substrates categorized as FODMAPs (excess fructose or polyols). The final FODMAP content in the developed whole wheat bread would allow its classification as a safe product for sensitive people, according to international consensus. Cocultures of S. cerevisiae and K. marxianus were established in order to ensure sufficient CO₂ generation; larger quantities of gas were produced due to the strains' synergistic relationship. Thus, this method works well for lowering the levels of FODMAPs in bread.

Keywords: Kluyveromyces marxianus, bakery, bread, FODMAP, IBS, functional gastrointestinal disorders

Procedia PDF Downloads 39
14502 Preparation of Polylactide Nanoparticles by Supercritical Fluid Technology

Authors: Jakub Zágora, Daniela Plachá, Karla Čech Barabaszová, Sylva Holešová, Roman Gábor, Alexandra Muñoz Bonilla, Marta Fernández García

Abstract:

The development of new antimicrobial materials that are not toxic to higher living organisms is a major challenge today. Newly developed materials can have high application potential in biomedicine, coatings, packaging, etc. A combination of the commonly used biopolymer polylactide with cationic polymers seems to be very successful in the fight against antimicrobial resistance [1]. PLA will play a key role in fulfilling the intention set out in the New Deal announced by the EU Commission, as it is a bioplastic that is easily degradable, recyclable, and mass-produced. Also, the development of 3D printing in the context of this initiative, and the actual use of PLA as one of the main materials for this printing, make the technology around the preparation and modification of PLA quite logical. Moreover, an environmentally friendly and energy-saving technology, the supercritical fluid process (SFP), will be used for their preparation. In a first approach, polylactide nano- and microparticles and structures were prepared by supercritical fluid extraction. The RESS (rapid expansion of supercritical solution) method is easier to optimize and shows better particle size control. In contrast, a highly porous structure was obtained using the SAS (supercritical antisolvent) method. In a second part, the antimicrobial biobased polymer was introduced by SFP.

Keywords: polylactide, antimicrobial polymers, supercritical fluid technology, micronization

Procedia PDF Downloads 170
14501 Single-Molecule Analysis of Structure and Dynamics in Polymer Materials by Super-Resolution Technique

Authors: Hiroyuki Aoki

Abstract:

The physical properties of polymer materials are dependent on the conformation and molecular motion of a polymer chain. Therefore, the structure and dynamic behavior of the single polymer chain have been the most important concerns in the field of polymer physics. However, it has been impossible to directly observe the conformation of a single polymer chain in a bulk medium. In the current work, novel techniques to study the conformation and dynamics of a single polymer chain are proposed. Since a fluorescence method is extremely sensitive, fluorescence microscopy enables the direct detection of a single molecule. However, the structure of a polymer chain as large as 100 nm cannot be resolved by conventional fluorescence methods because of the diffraction limit of light. In order to observe the single chains, we developed a method for labeling polymer materials with a photo-switchable dye and applied super-resolution microscopy. Real-space conformational analysis of single polymer chains with a spatial resolution of 15-20 nm was achieved. The super-resolution microscopy enables us to obtain three-dimensional coordinates; therefore, we succeeded in conformational analysis in three dimensions. Direct observation by nanometric optical microscopy would reveal detailed information on the molecular processes in various polymer systems.

Keywords: polymer materials, single molecule, super-resolution techniques, conformation

Procedia PDF Downloads 288
14500 Mesoporous Nanocomposites for Sustained Release Applications

Authors: Daniela Istrati, Alina Morosan, Maria Stanca, Bogdan Purcareanu, Adrian Fudulu, Laura Olariu, Alice Buteica, Ion Mindrila, Rodica Cristescu, Dan Eduard Mihaiescu

Abstract:

Our present work is related to the synthesis, characterization and applications of new nanocomposite materials based on mesoporous silica systems. The nanocomposite support was obtained using a specific step-by-step multilayer structure buildup synthetic route, characterized by XRD (X-Ray Diffraction), TEM (Transmission Electron Microscopy), FT-IR (Fourier Transform Infrared Spectrometry) and BET (Brunauer-Emmett-Teller method), and loaded with Salvia officinalis plant extract obtained by a hydro-alcoholic extraction route. The sustained release of the target compounds was studied by a modified LC method, showing low release profiles, as expected for the high-specific-surface-area support. The obtained results were further correlated with the in vitro / in vivo behavior of the nanocomposite material and recommend the mesoporous silica nanocomposites as good candidates for biomedical applications. Acknowledgements: This study has been funded by the Research Project PN-III-P2-2.1-PTE-2016-0160, 49-PTE / 2016 (PROZECHIMED) and Project Number PN-III-P4-ID-PCE-2016-0884 / 2017.

Keywords: biomedical, mesoporous, nanocomposites, natural products, sustained release

Procedia PDF Downloads 201
14499 Improve Closed Loop Performance and Control Signal Using Evolutionary Algorithms Based PID Controller

Authors: Mehdi Shahbazian, Alireza Aarabi, Mohsen Hadiyan

Abstract:

Proportional-Integral-Derivative (PID) controllers are the most widely used controllers in industry because of their simplicity and robustness. Different values of the PID parameters produce different step responses, so an increasing amount of literature is devoted to the proper tuning of PID controllers. The problem merits further investigation, as traditional tuning methods produce large control signals that can damage the system, whereas tuning methods based on evolutionary algorithms improve the control signal and the closed-loop performance. In this paper, three tuning methods for PID controllers are studied: Ziegler-Nichols, which is a traditional tuning method, and two tuning methods based on evolutionary algorithms, namely the genetic algorithm and particle swarm optimization. To examine the validity of the PSO and GA tuning methods, a comparative analysis of a DC motor plant is studied. Simulation results reveal that the evolutionary-algorithm-based tuning methods improve the control signal amplitude and the quality factors of the closed-loop system, such as rise time, integral absolute error (IAE) and maximum overshoot.
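
For comparison with the evolutionary tuning, the classical Ziegler-Nichols closed-loop (ultimate-cycle) rules mentioned above can be sketched as follows; the ultimate gain Ku and ultimate period Tu in the example call are placeholder values, not the paper's DC motor data:

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classical Ziegler-Nichols closed-loop tuning rules (ultimate gain Ku, ultimate period Tu)."""
    Kp = 0.6 * Ku          # proportional gain
    Ti = 0.5 * Tu          # integral time
    Td = 0.125 * Tu        # derivative time
    Ki = Kp / Ti           # parallel-form integral gain
    Kd = Kp * Td           # parallel-form derivative gain
    return {"Kp": Kp, "Ki": Ki, "Kd": Kd}

print(ziegler_nichols_pid(Ku=2.0, Tu=0.8))  # placeholder Ku, Tu
```

The GA and PSO tuners, by contrast, search the (Kp, Ki, Kd) space directly, typically by minimizing a performance index such as the IAE, ∫|e(t)| dt, evaluated on a simulation of the plant.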

Keywords: evolutionary algorithm, genetic algorithm, particle swarm optimization, PID controller

Procedia PDF Downloads 466
14498 Selecting Answers for Questions with Multiple Answer Choices in Arabic Question Answering Based on Textual Entailment Recognition

Authors: Anes Enakoa, Yawei Liang

Abstract:

Question Answering (QA) is one of the most important and demanding tasks in the field of Natural Language Processing (NLP). In QA systems, the answer generation task produces a list of candidate answers to the user's question, of which only one answer is correct. Answer selection is one of the main components of QA; it is concerned with selecting the best answer choice from the candidate answers suggested by the system. However, the selection process can be very challenging, especially in Arabic, due to its particularities. To address this challenge, an approach is proposed to answer questions with multiple answer choices for Arabic QA systems based on Textual Entailment (TE) recognition. The developed approach employs a Support Vector Machine that considers lexical, semantic and syntactic features in order to recognize the entailment between the generated hypotheses (H) and the text (T). A set of experiments has been conducted for performance evaluation, and the overall performance of the proposed method reached an accuracy of 67.5% with a C@1 score of 80.46%. The obtained results are promising and demonstrate that the proposed method is effective for the TE recognition task.

Keywords: information retrieval, machine learning, natural language processing, question answering, textual entailment

Procedia PDF Downloads 135
14497 In-situ Oxygen Enrichment for Underground Coal Gasification

Authors: Adesola O. Orimoloye, Edward Gobina

Abstract:

Membrane separation technology is still considered an emerging technology in the mining sector and does not yet have the widespread acceptance that it has in other industrial sectors. Underground Coal Gasification (UCG), wherein coal is converted to gas in situ, is a safer alternative to mining that retains all pollutants underground, making the process environmentally friendly. In-situ combustion of coal for power generation allows access to more of the physical global coal resource than would be included in current economically recoverable reserve estimates. Where mining is no longer taking place, for economic or geological reasons, controlled gasification permits in-situ exploitation of coal seams (again, a reaction of coal to form a synthesis gas). The oxygen supply stage is one of the most expensive parts of any gasification project, but the use of membranes is a potentially attractive approach for producing oxygen-enriched air. In this study, a variety of cost-effective membrane materials that give optimal oxygen concentrations in the range of interest were designed and tested at diverse operating conditions. An oxygen-enriched atmosphere improves the combustion temperature, but a decline is observed if the oxygen concentration exceeds the optimum. The experimental results also reveal the preparation method, apparatus and performance of the fabricated membrane.

Keywords: membranes, oxygen-enrichment, gasification, coal

Procedia PDF Downloads 445
14496 Runoff Estimation Using NRCS-CN Method

Authors: E. K. Naseela, B. M. Dodamani, Chaithra Chandran

Abstract:

GIS and remote sensing techniques facilitate accurate estimation of surface runoff from a watershed. In the present study, an attempt has been made to evaluate the applicability of the Natural Resources Conservation Service Curve Number (NRCS-CN) method using GIS and remote sensing techniques in the upper Krishna basin (69,425 sq. km). Landsat 7 satellite data (30 m resolution) for the year 2012 were used for the preparation of the land use/land cover (LU/LC) map. The hydrologic soil group was mapped on a GIS platform. The weighted curve numbers (CN) for all 5 subcatchments were calculated on the basis of LU/LC type and hydrologic soil class in the area, considering the antecedent moisture condition. Monthly rainfall data were available for 58 rain gauge stations. An overlay technique was adopted for generating the weighted curve number. Results of the study show that land use changes determined from satellite images are useful in studying the runoff response of the basin. The results showed that there is no significant difference between observed and estimated runoff depths. For each subcatchment, statistically positive correlations were detected between observed and estimated runoff depth (0.6

Keywords: curve number, GIS, remote sensing, runoff
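
The core of the curve number method referenced in this abstract is a pair of closed-form relations for potential retention and direct runoff. A minimal sketch follows (standard SCS/NRCS-CN formulas in millimetres; the CN and rainfall values in the example call are placeholders, not the basin's data):

```python
def scs_runoff_mm(p_mm, cn, lam=0.2):
    """NRCS-CN direct runoff depth (mm) for event rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0          # potential maximum retention S (mm)
    ia = lam * s                      # initial abstraction, commonly Ia = 0.2 S
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_runoff_mm(p_mm=75.0, cn=80))  # placeholder rainfall depth and weighted CN
```

The weighted CN fed into such a relation is exactly what the LU/LC map, hydrologic soil groups and antecedent moisture condition determine for each subcatchment.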

Procedia PDF Downloads 522
14495 Perception of Nursing Care of Patients in a University Hospital

Authors: Merve Aydin, Mağfiret Kara Kaşikçi

Abstract:

Aim: To determine the perceptions of inpatients about nursing care at Farabi Hospital, KTU. Material and Method: This research was conducted with 277 patients, selected by a probability sampling method from a known universe using sample-size formulas, who had been hospitalized for at least 14 days in internal and surgical clinics (excluding the pediatric, psychiatry and intensive care unit services) between January and March 2014 in KTU Farabi Hospital. The data were collected using a patient characteristics form and the nursing care perception scale for patients. In the evaluation of the data, percentage, mean, Mann-Whitney U, Student's t and Kruskal-Wallis tests were applied. Results: The average score the patients obtained on the nursing care perception scale is 62.64±10.08. 48.7% of patients regard nursing care as good and 36.8% regard it as very good. 19% of the patients regard nursing care as poor. When age, sex, occupation, marital status, educational background, residential place, income level, hospitalization period, hospitalization clinic and having a hospital attendant were compared with the average nursing care perception score, the differences among the averages were not statistically significant (p > 0.05). The average nursing care perception score was found to be greater in those with a chronic disease (p < 0.05). Conclusion: The patients' perception score for nursing care is above the midpoint of the lowest and highest possible scores. The great majority of patients regard nursing care as good or very good.

Keywords: hospital, patient, perception of nursing care, nursing care

Procedia PDF Downloads 379
14494 Determination of Inflow Performance Relationship for Naturally Fractured Reservoirs: Numerical Simulation Study

Authors: Melissa Ramirez, Mohammad Awal

Abstract:

The Inflow Performance Relationship (IPR) of a well is a relation between the oil production rate and the flowing bottom-hole pressure. This relationship is an important tool for petroleum engineers to understand and predict well performance. In the petroleum industry, IPR correlations are used to design and evaluate well completions, optimize well production, and design artificial lift. The most commonly used IPR correlation models are those of Vogel and Wiggins; these models are applicable to homogeneous and isotropic reservoir data. In this work, a new IPR model is developed to determine the inflow performance relationship of oil wells in a naturally fractured reservoir. A 3D black-oil reservoir simulator is used to develop the oil mobility function for the studied reservoir. Based on simulation runs, four flow rates are run to record the oil saturation and calculate the relative permeability for a naturally fractured reservoir. The new method uses the result of a well test analysis along with permeability and pressure-volume-temperature data in the fluid flow equations to obtain the oil mobility function. Comparisons between the new method and two popular correlations for non-fractured reservoirs indicate the necessity of developing and using an IPR correlation developed specifically for a fractured reservoir.
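
For context, the Vogel correlation mentioned above, against which such new models are usually benchmarked, has the well-known dimensionless form (quoted here as background; it is not the new correlation developed in the paper):

```latex
% Vogel's inflow performance relationship for a solution-gas-drive well
\frac{q_o}{q_{o,\max}} \;=\; 1 \;-\; 0.2\left(\frac{p_{wf}}{\bar{p}_r}\right)
\;-\; 0.8\left(\frac{p_{wf}}{\bar{p}_r}\right)^{2}
```

where q_o is the oil rate at flowing bottom-hole pressure p_wf, q_o,max is the rate at zero bottom-hole pressure, and p̄_r is the average reservoir pressure.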

Keywords: inflow performance relationship, mobility function, naturally fractured reservoir, well test analysis

Procedia PDF Downloads 251
14493 Evaluation of Uniformity for Gafchromic Sheets for Film Dosimetry

Authors: Fayzan Ahmed, Saad Bin Saeed, Abdul Qadir Jangda

Abstract:

Gafchromic™ sheets are extensively used for the QA of intensity-modulated radiation therapy and other in-vivo dosimetry. Intra-sheet non-uniformity of the scanner as well as of the film causes undesirable fluctuations which are reflected in dosimetry. The aim of this study is to define a systematic and robust method to investigate the intra-sheet uniformity of unexposed Gafchromic sheets and of the region of interest (ROI) of the scanner. Sheets of lot No. A05151201 were scanned before and after the expiry period with the EPSON™ XL10000 scanner in transmission mode, landscape orientation and 72 dpi resolution. An ROI of 8 x 10 inches, equal to the sheet dimensions, in the center of the scanner was used to acquire images with full transmission, blocked transmission and with sheets in place. 500 virtual grids, created in MATLAB®, were imported as macros into ImageJ (1.49m, Wayne Rasband) to analyze the images. In order to remove edge effects, the outer 86 grids were excluded from the analysis. The standard deviations of the blocked transmission and full transmission are 0.38% and 0.66%, confirming a high uniformity of the scanner. Expired and non-expired sheets have standard deviations of 2.18% and 1.29%, showing that uniformity decreases after expiry. The results are promising and indicate a good potential of this method to be used as a uniformity check for the scanner and for unexposed Gafchromic sheets.
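
A minimal sketch of the grid-based uniformity statistic described above: divide a scanned image into virtual grid cells, drop an outer margin of cells, and report the relative standard deviation of the cell means. The grid counts and margin here are illustrative assumptions, not the exact 500-grid ImageJ macro used in the study:

```python
import numpy as np

def intra_sheet_uniformity(image, n_rows=20, n_cols=25, margin=2):
    """Relative std (in %) of per-cell mean pixel values, excluding an outer margin of cells."""
    h, w = image.shape
    cell_means = [
        image[r * h // n_rows:(r + 1) * h // n_rows,
              c * w // n_cols:(c + 1) * w // n_cols].mean()
        for r in range(margin, n_rows - margin)
        for c in range(margin, n_cols - margin)
    ]
    cell_means = np.asarray(cell_means)
    return 100.0 * cell_means.std() / cell_means.mean()

# Example with a synthetic 'scan' standing in for the EPSON transmission image.
scan = np.random.default_rng(0).normal(40000, 300, size=(720, 576))
print(intra_sheet_uniformity(scan))
```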

Keywords: IMRT, film dosimetry, virtual grids, uniformity

Procedia PDF Downloads 472
14492 Prioritization in Modern Portfolio Management - An Action Design Research Approach to Method Development for Scaled Agility

Authors: Jan-Philipp Schiele, Karsten Schlinkmeier

Abstract:

Allocation of scarce resources is a core process of traditional project portfolio management. However, with the popularity of agile methodology, established concepts and methods of portfolio management are reaching their limits and need to be adapted. Consequently, the question arises of how the process of resource allocation can be managed appropriately in scaled agile environments. The prevailing framework SAFe offers Weighted Shortest Job First (WSJF) as a prioritization technique, but established companies are still looking for methodical adaptations that apply WSJF to portfolio prioritization in a more goal-oriented way, aligned with their needs in practice. In this paper, the relevant problem of prioritization in portfolios is conceptualized from the perspective of coordination and related mechanisms to support resource allocation. Further, an Action Design Research (ADR) project with case studies in a finance company is outlined to develop a practically applicable yet scientifically sound prioritization method based on coordination theory. The ADR project will be flanked by consortium research with various practitioners from the financial and insurance industry. Preliminary design requirements indicate that the use of a feedback loop leads to better team- and executive-level coordination in the prioritization process.
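
A minimal sketch of the WSJF calculation that the paper adapts (the standard SAFe formulation: cost of delay divided by job size; the backlog items and scores below are made up for illustration):

```python
def wsjf(business_value, time_criticality, risk_opportunity, job_size):
    """Weighted Shortest Job First: cost of delay / job size (relative scores)."""
    cost_of_delay = business_value + time_criticality + risk_opportunity
    return cost_of_delay / job_size

# Hypothetical backlog items scored with relative estimates.
backlog = [
    {"item": "Epic A", "bv": 8, "tc": 5, "rr": 3, "size": 5},
    {"item": "Epic B", "bv": 13, "tc": 3, "rr": 2, "size": 8},
    {"item": "Epic C", "bv": 5, "tc": 8, "rr": 5, "size": 3},
]
ranked = sorted(backlog, key=lambda i: wsjf(i["bv"], i["tc"], i["rr"], i["size"]), reverse=True)
for entry in ranked:
    print(entry["item"], round(wsjf(entry["bv"], entry["tc"], entry["rr"], entry["size"]), 2))
```

The methodical adaptations discussed in the paper concern how such scores are produced and coordinated across teams and executives, not the arithmetic itself.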

Keywords: scaled agility, portfolio management, prioritization, business-IT alignment

Procedia PDF Downloads 179
14491 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes

Authors: J. J. Vargas, N. Prieto, L. A. Toro

Abstract:

Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve the processes. Nonetheless, in some applications it is necessary to include an estimation of efficiency. In this paper, the ability to assess the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. In depth, a Bayesian estimation was performed to calculate the posterior probability distribution of parameters such as the means and the variance-covariance matrix. This technique allows the data set to be analysed without the need for the hypothetical large sample implied in the problem, and it can be treated as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter distributions. Each resulting vector was used by the stochastic DEA model over several cycles to establish the distribution of the efficiency measures for each DMU (decision-making unit). A control limit was calculated with the obtained model, and if a DMU presents a low efficiency level, the system efficiency is out of control. In the efficiency calculation, a global optimum was reached, which ensures model reliability.
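
To illustrate the DEA building block that feeds the control chart, the following sketch solves the standard input-oriented CCR envelopment LP for each decision-making unit with scipy; the input/output data are toy values, and the Bayesian sampling and control-limit steps of the paper are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR DEA efficiency of DMU o. X: (m inputs, n DMUs), Y: (s outputs, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Outputs: sum_j lambda_j * y_rj >= y_ro  (written as <= with flipped signs)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy data: 2 inputs, 1 output, 4 DMUs (one column per DMU).
X = np.array([[4.0, 7.0, 8.0, 4.0],
              [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(X.shape[1])])
```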

Keywords: data envelopment analysis, DEA, Multivariate control chart, rejection simulation method

Procedia PDF Downloads 365
14490 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation

Authors: Muhammad Zubair Khan, Yugyung Lee

Abstract:

Deep learning has recently achieved an enormous response in semantic image segmentation. The previously developed U-Net-inspired architectures operate with repeated stride and pooling operations, leading to spatial data loss. Also, these methods lack a mechanism for establishing long-range pixel connections to preserve context knowledge and reduce spatial loss in prediction. This article develops an encoder-decoder architecture with a bi-directional LSTM embedded in long skip-connections and densely connected convolution blocks. The network non-linearly combines the feature maps across encoder-decoder paths to find dependency and correlation between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applies batch normalization to reduce internal covariate shift in the data distributions. The empirical evidence shows a promising response for our method compared with other semantic segmentation techniques.

Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network

Procedia PDF Downloads 88
14489 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing

Authors: Carolina Gouveia, José Vieira, Pedro Pinho

Abstract:

Cardiopulmonary signal monitoring without the use of contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and continuous monitoring of vital signs in bedridden patients. This system also has applications in the vehicular environment to monitor the driver, in order to avoid a possible accident in case of cardiac failure. Thus, the bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the distance change between the radar antennas and the person’s chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee patient immobilization; hence, their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breath rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to prove that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
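
For reference, the Doppler-effect principle invoked above is usually written, for a CW bio-radar observing chest-wall displacement x(t), as a quadrature baseband model of the following generic form (standard symbols, not values from the paper):

```latex
% Quadrature baseband signals of a CW Doppler radar monitoring chest-wall motion x(t)
B_I(t) = A_I \cos\!\left(\frac{4\pi\, x(t)}{\lambda} + \varphi_0\right) + DC_I, \qquad
B_Q(t) = A_Q \sin\!\left(\frac{4\pi\, x(t)}{\lambda} + \varphi_0\right) + DC_Q
```

The I/Q samples thus trace an arc of an ellipse centred at (DC_I, DC_Q); fitting that ellipse is what allows the DC offsets to be estimated and removed before the phase, and hence the breathing motion, is demodulated.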

Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR

Procedia PDF Downloads 124
14488 Critical Success Factors for Sustainable Smart City Project in India

Authors: Debasis Sarkar

Abstract:

The development of a Smart City depends upon the development of its infrastructure in a smart way. Primarily based on the ideology of the fourth industrial revolution, a Smart City project should have smart governance, smart health care, smart buildings, smart transportation, smart mobility, smart energy, smart technology and smart citizens. Considering the current state of cities in India, it has become essential to decide the specific parameters which would govern the development of a Smart City project. It has been observed that there are significant parameters beyond Information and Communication Technology (ICT) which govern the development of a Smart City project. This paper is an attempt to identify the Critical Success Factors (CSF) which are significantly responsible for the development of a Smart City project in Western India. Responses to a questionnaire survey were analyzed on the basis of a Likert scale. They were further critically evaluated with the help of the Factor Comparison Method (FCM) and the Analytical Hierarchy Process (AHP). The project authorities need to incorporate Building Information Modeling (BIM) to make the smart city project more collaborative. To make the project more sustainable, the use of fly ash in the concrete, reduced usage of cement and steel, and the use of alternative fuels like biodiesel are recommended.
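
A minimal sketch of the AHP step referred to above: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and consistency is checked via the consistency ratio. The 3x3 comparison matrix below is a made-up example, not the study's survey data, and the random index values are the commonly cited Saaty figures.

```python
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty RI values

def ahp_weights(pairwise):
    """Priority vector (principal eigenvector) and consistency ratio of a pairwise matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                   # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr

# Hypothetical pairwise comparison of three CSF candidates (e.g. governance vs. mobility vs. energy).
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print(weights.round(3), round(cr, 3))  # CR < 0.1 is conventionally considered acceptable
```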

Keywords: analytical hierarchical process, building information modeling, critical success factors, factor comparison method

Procedia PDF Downloads 237