Search results for: pre-bored pile system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17709

11139 A Combined Activated Sludge-Filtration-Ozonation Process for Abattoir Wastewater Treatment

Authors: Pello Alfonso-Muniozguren, Madeleine Bussemaker, Ralph Chadeesingh, Caryn Jones, David Oakley, Judy Lee, Devendra Saroj

Abstract:

Current industrialized livestock agriculture grows every year, leading to an increase in the generation of wastewater that varies considerably in organic content and microbial population. Suitable wastewater treatment methods are therefore required to ensure the wastewater quality meets regulations before discharge. In the present study, a combined lab-scale activated sludge-filtration-ozonation system was used to treat pre-treated abattoir wastewater. A hydraulic retention time of 24 hours and a solids retention time of 13 days were used for the activated sludge process, followed by a filtration step (4-7 µm) and ozone as tertiary treatment. Average reductions of 93% and 98% were achieved for Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD), respectively, giving final values of 128 mg/L COD and 12 mg/L BOD. For Total Suspended Solids (TSS), the average reduction reached 99%, bringing the final value down to 3 mg/L. Additionally, a 98% reduction in Phosphorus (P) and complete inactivation of Total Coliforms (TC) were obtained after 17 min of ozonation. For Total Viable Counts (TVC), a drastic reduction (6 log inactivation) was observed with 30 min of ozonation at an ozone dose of 71 mg O3/L. Overall, the combined process was sufficient to meet discharge requirements without further treatment for the measured parameters (COD, BOD, TSS, P, TC, and TVC).
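The reported reductions are simple mass-balance percentages. A minimal sketch; the influent COD of 1830 mg/L is a hypothetical value chosen to be consistent with the reported 93% reduction and 128 mg/L effluent, and is not stated in the abstract:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a pollutant across a treatment train."""
    return 100.0 * (c_in - c_out) / c_in

# hypothetical influent COD consistent with the reported figures
print(round(removal_efficiency(1830.0, 128.0), 1))  # → 93.0
```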

Keywords: abattoir wastewater, activated sludge, ozone, wastewater treatment

Procedia PDF Downloads 280
11138 Carbon Capture and Storage in Geological Formation, its Legal, Regulatory Imperatives and Opportunities in India

Authors: Kalbende Krunal Ramesh

Abstract:

Carbon Capture and Storage (CCS) technology provides a veritable platform to bridge the gap between the seemingly irreconcilable twin global challenges of ensuring a secure, reliable and diversified energy supply and mitigating climate change by reducing atmospheric emissions of carbon dioxide. The main aim of this paper is to frame a proper regulatory policy, flexible enough in law for both government and private companies, and to explore the opportunities in this sector. India's total annual emissions were 1725 Mt CO2 in 2011, comprising 6% of total global emissions, so controlling greenhouse gas emissions is very important for environmental protection. This paper discusses the regulatory policies and technologies adopted by several countries for the successful use of CCS. The geology of sedimentary basins in India, ranging from category I to category IV and deep water, is briefly studied, and the potential for mature CCS technology is reviewed; areas not suitable for CO2 storage using presently mature technologies are also surveyed. A CCS and Clean Development Mechanism framework is developed for India, considering aspects from research and development, project appraisal, approval and validation, implementation, monitoring and verification, carbon credits issued, and a cap-and-trade system, together with its storage potential. The opportunities in oil and gas operations, the power sector, and the transport sector are discussed briefly.

Keywords: carbon credit issued, cap and trade system, carbon capture and storage technology, greenhouse gas

Procedia PDF Downloads 433
11137 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation exist in the literature; different representation approaches yield different outputs, and some may estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A, the uncertainty characterization subproblem, is addressed in this study. The challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of a given computational model. We approach the problem with two methodologies: sampling-based uncertainty propagation with first order error analysis, and Percentile-Based Optimization (PBO). The NASA Langley MUQC subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed. The challenge problem classifies each uncertain parameter as one of three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced. (ii) An epistemic uncertainty modeled as a fixed but poorly known physical quantity lying within a given interval; this uncertainty is reducible. (iii) A parameter that may be aleatory but for which sufficient data are not available to model it adequately as a single random variable; for example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but can be assumed to lie within some intervals.
This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainty, each being an unknown element of a known interval; this uncertainty is reducible. The study observes that, due to practical limitations and computational expense, sampling in the sampling-based methodology is not exhaustive, so that methodology has a high probability of underestimating the output bounds. An optimization-based strategy that converts uncertainty described by interval data into a probabilistic framework is therefore necessary; in this study that is achieved with PBO.
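The double-loop (nested) sampling idea behind the p-box description can be sketched as follows. The quadratic `response` model, the interval for the mean, and the sample sizes are hypothetical illustrations, not the challenge problem's actual model:

```python
import random

def response(x):
    # hypothetical model: a simple function of the uncertain input
    return x * x + 1.0

def pbox_bounds(mu_interval, sigma=1.0, n_outer=50, n_inner=2000, pct=95, seed=0):
    """Double-loop sampling: the outer loop draws the poorly known mean from
    its interval (epistemic), the inner loop samples the aleatory variable,
    and we track the spread of the empirical 95th-percentile response."""
    rng = random.Random(seed)
    lo, hi = float("inf"), float("-inf")
    for _ in range(n_outer):
        mu = rng.uniform(*mu_interval)  # epistemic draw
        samples = sorted(response(rng.gauss(mu, sigma)) for _ in range(n_inner))
        p = samples[int(n_inner * pct / 100) - 1]  # empirical percentile
        lo, hi = min(lo, p), max(hi, p)
    return lo, hi

low, high = pbox_bounds((-1.0, 1.0))
print(low, high)  # interval bounding the 95th-percentile response
```

Because the inner sampling is finite, the printed interval can underestimate the true bounds, which is exactly the limitation the abstract attributes to the sampling-based methodology.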

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 240
11136 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems

Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos

Abstract:

Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is mandatory, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and the number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8 and Windows 10), Apple's Mac and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in its CVSS record in the National Vulnerability Database. In this study the risk from vulnerabilities in each of the five operating systems is compared. The Risk Indexes used are developed based on a Markov model to evaluate the risk of each vulnerability. The statistical methodology and underlying mathematical approach are described. Initially, parametric procedures were conducted; however, violations of some statistical assumptions were observed, so the need for non-parametric approaches was recognized. A total of 6838 recorded vulnerabilities were considered in the analysis. Based on the risk associated with all the vulnerabilities considered, a statistically significant difference was found among average risk levels for some operating systems, indicating that, by our method and under its assumptions and limitations, some operating systems have been more exposed to risk than others. The relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
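A non-parametric comparison of risk levels across OS groups can be illustrated with a hand-rolled Kruskal-Wallis H statistic (without tie correction). The CVSS-like scores below are invented for illustration and are not the paper's data:

```python
from itertools import chain

def rank(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_h(groups):
    """Kruskal-Wallis H statistic for k independent groups (no tie correction)."""
    data = list(chain.from_iterable(groups))
    n = len(data)
    ranks = rank(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += sum(r) ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# hypothetical exploitability scores for three OS groups
windows = [8.6, 10.0, 8.6, 6.8, 10.0]
mac = [4.9, 6.8, 4.9, 3.4, 6.8]
linux = [6.8, 4.9, 8.6, 6.8, 4.9]
print(kruskal_h([windows, mac, linux]))
```

A large H relative to the chi-square critical value for k−1 degrees of freedom is what justifies claims of a statistically significant difference among the groups.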

Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system

Procedia PDF Downloads 183
11135 Aerodynamic Modelling of Unmanned Aerial System through Computational Fluid Dynamics: Application to the UAS-S45 Balaam

Authors: Maxime A. J. Kuitche, Ruxandra M. Botez, Arthur Guillemin

Abstract:

As Unmanned Aerial Systems have found diverse uses in both military and civil aviation, interest in obtaining accurate aerodynamic models has grown enormously. Recent modeling techniques are procedures using optimization algorithms and statistics that require many flight tests and are therefore extremely demanding in terms of cost. This paper presents a procedure to estimate the aerodynamic behavior of an unmanned aerial system numerically, using computational fluid dynamics analysis. The study was performed using an unstructured mesh obtained from a grid convergence analysis at a Mach number of 0.14 and an angle of attack of 0°. The flow around the aircraft was described using a standard k-ω turbulence model, and the Reynolds-Averaged Navier-Stokes (RANS) equations were solved using the ANSYS FLUENT software. The method was applied to the UAS-S45, designed and manufactured by Hydra Technologies in Mexico. The lift, drag, and pitching moment coefficients were obtained at different angles of attack for several flight conditions defined in terms of altitude and Mach number. The results of the computational fluid dynamics analysis were compared with results obtained using the DATCOM semi-empirical procedure. This comparison indicated that our approach is highly accurate and that the aerodynamic model obtained could be useful for estimating the flight dynamics of the UAS-S45.
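The lift, drag, and pitching moment coefficients mentioned above are all formed the same way: the CFD-integrated force (or moment) is divided by the dynamic pressure and a reference area (and a reference length, for moments). A sketch with hypothetical numbers, not UAS-S45 data:

```python
def aero_coefficient(force_n, rho_kg_m3, v_m_s, s_ref_m2):
    """Non-dimensionalize a force: C = F / (q * S), with q = 0.5 * rho * V^2."""
    q = 0.5 * rho_kg_m3 * v_m_s ** 2  # dynamic pressure (Pa)
    return force_n / (q * s_ref_m2)

# sea-level air, roughly Mach 0.14 (~48 m/s), 2 m^2 reference wing area
cl = aero_coefficient(force_n=1200.0, rho_kg_m3=1.225, v_m_s=48.0, s_ref_m2=2.0)
print(cl)
```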

Keywords: aerodynamic modelling, CFD Analysis, ANSYS FLUENT, UAS-S45

Procedia PDF Downloads 375
11134 Research Developments in Vibration Control of Structure Using Tuned Liquid Column Dampers: A State-of-the-Art Review

Authors: Jay Gohel, Anant Parghi

Abstract:

A tuned liquid column damper (TLCD) is a passive variant of the tuned mass damper in which a liquid takes the place of the solid mass. A TLCD consists of a U-shaped tube with an orifice that produces damping against the liquid motion in the tube. This paper provides a state-of-the-art review of the vibration control of wind- and earthquake-excited structures using liquid dampers. The paper also discusses the theoretical background of the TLCD, the history of liquid dampers, and the existing experimental, numerical, and analytical literature. The review covers different TLCD configurations, viz. the single TLCD, multi tuned liquid column damper (MTLCD), TLCD-Interior (TLCDI), tuned liquid column ball damper (TLCBD), tuned liquid column ball gas damper (TLCBGD), and pendulum liquid column damper (PLCD). The dynamic characteristics of these TLCD configurations and their effectiveness in reducing structural vibration are discussed. The effectiveness of semi-active TLCDs is also discussed with reference to experimental and analytical results. In addition, the review provides numerous examples of TLCDs implemented to control vibration in real structures. Based on this comprehensive review of the literature, important conclusions are drawn and needs for future research on vibration control of structures using TLCDs are identified.
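The "tuned" in TLCD refers to matching the liquid column's natural frequency, f = (1/2π)·√(2g/L) with L the total liquid column length, to the structural mode being damped; this is the classical design relation, and the 0.5 Hz target mode below is an arbitrary example:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def tlcd_natural_frequency(column_length_m):
    """Natural frequency (Hz) of the liquid column: f = (1/2pi) * sqrt(2g/L)."""
    return math.sqrt(2 * G / column_length_m) / (2 * math.pi)

# invert the formula to size the column for a 0.5 Hz structural mode
target_hz = 0.5
length_m = 2 * G / (2 * math.pi * target_hz) ** 2
print(length_m)  # required total liquid column length (m)
```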

Keywords: earthquake, wind, tuned liquid column damper, passive response control, structures

Procedia PDF Downloads 208
11133 Adequacy of Antenatal Care and Its Relationship with Low Birth Weight in Botucatu, São Paulo, Brazil: A Case-Control Study

Authors: Cátia Regina Branco da Fonseca, Maria Wany Louzada Strufaldi, Lídia Raquel de Carvalho, Rosana Fiorini Puccini

Abstract:

Background: Birth weight reflects gestational conditions and development during the fetal period. Low birth weight (LBW) may be associated with antenatal care (ANC) adequacy and quality. The purpose of this study was to analyze ANC adequacy and its relationship with LBW in the Unified Health System in Brazil. Methods: A case-control study was conducted in Botucatu, São Paulo, Brazil, 2004 to 2008. Data were collected from secondary sources (the Live Birth Certificate), and primary sources (the official medical records of pregnant women). The study population consisted of two groups, each with 860 newborns. The case group comprised newborns weighing less than 2,500 grams, while the control group comprised live newborns weighing greater than or equal to 2,500 grams. Adequacy of ANC was evaluated according to three measurements: 1. Adequacy of the number of ANC visits adjusted to gestational age; 2. Modified Kessner Index; and 3. Adequacy of ANC laboratory studies and exams summary measure according to parameters defined by the Ministry of Health in the Program for Prenatal and Birth Care Humanization. Results: Analyses revealed that LBW was associated with the number of ANC visits adjusted to gestational age (OR = 1.78, 95% CI 1.32-2.34) and the ANC laboratory studies and exams summary measure (OR = 4.13, 95% CI 1.36-12.51). According to the modified Kessner Index, 64.4% of antenatal visits in the LBW group were adequate, with no differences between groups. Conclusions: Our data corroborate the association between inadequate number of ANC visits, laboratory studies and exams, and increased risk of LBW newborns. No association was found between the modified Kessner Index as a measure of adequacy of ANC and LBW. This finding reveals the low indices of coverage for basic actions already well regulated in the Health System in Brazil. 
Despite the association found in the study, we cannot conclude that LBW would be prevented only by an adequate ANC, as LBW is associated with factors of complex and multifactorial etiology. The results could be used to plan monitoring measures and evaluate programs of health care assistance during pregnancy, at delivery and to newborns, focusing on reduced LBW rates.
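The odds ratios and 95% confidence intervals reported above come from standard 2x2-table arithmetic for case-control data. A minimal sketch with an invented table; the counts are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a, b = cases exposed/unexposed; c, d = controls exposed/unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: LBW cases vs controls by inadequate ANC visits
or_, lo, hi = odds_ratio_ci(20, 10, 10, 20)
print(or_, lo, hi)
```

An interval whose lower bound exceeds 1 (as for OR = 1.78, 95% CI 1.32-2.34 above) is what indicates a statistically significant association.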

Keywords: low birth weight, antenatal care, prenatal care, adequacy of health care, health evaluation, public health system

Procedia PDF Downloads 431
11132 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, we attempted to identify several heart rhythm disorders from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial-immune-system-based artificial neural network (AIS-ANN) and particle-swarm-optimization-based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers relative to ANN and AIS alone. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) data for each of the RR intervals were found. These data were then combined into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF); a discrete wavelet transform was applied to each of the two groups in a pair, and after data reduction two different data sets, with 9 and 27 features, were obtained from each. Afterwards, the data were first shuffled within themselves, and then the 4-fold cross validation method was applied to create the training and testing data. The training and testing accuracy rates and training times are compared with each other. As a result, the performance of the hybrid classification systems AIS-ANN and PSO-ANN was seen to be close to that of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.
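The feature pipeline described (wavelet decomposition for data reduction, then 4-fold cross validation) can be sketched with a one-level Haar transform, the simplest discrete wavelet. The RR-interval values are invented, and the study's actual wavelet and feature counts may differ:

```python
import math
import random

def haar_step(signal):
    """One level of a Haar discrete wavelet transform: approximation and
    detail coefficients (halving the number of samples)."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal) - 1, 2)]
    return a, d

def kfold_indices(n, k=4, seed=0):
    """Shuffle sample indices, then split them into k disjoint test folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

rr = [0.8, 0.82, 0.79, 1.1, 0.5, 0.81, 0.8, 0.83]  # hypothetical RR intervals (s)
approx, detail = haar_step(rr)
folds = kfold_indices(len(rr), k=4)
```

Each fold serves once as the test set while the remaining three train the classifier, which is the scheme used to produce the accuracy comparisons.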

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 442
11131 The Justice of Resource Allocation for People with Disability Based on Activity and Participation Functioning: A Cross-Sectional Study of a National Population

Authors: Chia-Feng Yen, Shyang-Woei Lin

Abstract:

Background: In Taiwan, people with disability can obtain national social welfare services after evaluation. All subsidies and in-kind services are specified in the People with Disabilities Rights Protection Act. The new disability eligibility determination system based on the ICF has been in operation for five years in Taiwan, but there have been no systematic outcomes discussing the relationships between the evaluation results of activity and participation functioning (AP functioning) and the ratification of social services for people with disability. Decision-making on welfare resource allocation lies with local government, so ratification can be affected by resource variation across areas (local governments). The purposes of this study are to compare the ratification rate between different areas (the equity of allocation) and to understand the ratification of social services for people with disability after the needs assessment stage, which can help predict resource allocation for local governments in the future. Methods: A cross-sectional design was used, with data from the Disability Eligibility Determination System in Taiwan between 2013/11/04 and 2015/01/12. All samples were evaluated with FUNDES-adult version 7th, and all were over 18 years old. The samples were collected face to face by physicians and AP evaluators. Results: In the needs assessment stage, the welfare ratification rates differ significantly between local governments for samples with similar impairment and AP functioning.

Keywords: allocation, activity and participation, people with disability, justice

Procedia PDF Downloads 168
11130 Improving Heat Pipe Thermal Performance in HVAC Systems Using CFD Modeling

Authors: A. Ghanami, M. Heydari

Abstract:

A heat pipe is a simple heat transfer device that combines conduction and phase change phenomena to control heat transfer without any need for an external power source. At the hot surface of the heat pipe, the liquid phase absorbs heat and changes to vapor. The vapor flows to the condenser region and, with the loss of heat, changes back to liquid; due to gravitational force, the liquid phase then flows back to the evaporator section. In HVAC systems the working fluid is chosen based on the operating temperature. The heat pipe has a significant capability to reduce humidity in HVAC systems, and any HVAC system that uses a heater, humidifier or dryer is a suitable candidate for the utilization of heat pipes. Generally, heat pipes have three main sections: condenser, adiabatic region and evaporator. Investigating and optimizing heat pipe operation in order to increase efficiency is crucial. In the present article, a parametric study is performed to improve heat pipe performance: the heat capacity of the heat pipe is investigated with respect to geometrical and confining parameters. For better observation of heat pipe operation in HVAC systems, a CFD simulation with an Eulerian-Eulerian multiphase approach is also performed. The results show that the heat transfer capacity of the heat pipe is highest for water as the working fluid at an operating temperature of 340 K. It is also shown that a vertical orientation of the heat pipe enhances its heat transfer capacity.

Keywords: heat pipe, HVAC system, grooved heat pipe, heat pipe limits

Procedia PDF Downloads 482
11129 Endoscopic Treatment of Patients with Large Bile Duct Stones

Authors: Yuri Teterin, Lomali Generdukaev, Dmitry Blagovestnov, Peter Yartcev

Abstract:

Introduction: By "large biliary stones" we refer to stones over 1.5 cm, for which standard transpapillary lithoextraction techniques are unsuccessful. Electrohydraulic and laser contact lithotripsy under SpyGlass control have been actively applied over the last decade to improve the results of endoscopic treatment. Aims and Methods: Between January 2019 and July 2022, the N.V. Sklifosovsky Research Institute of Emergency Care treated 706 patients diagnosed with choledocholithiasis who underwent removal of biliary stones from the common bile duct. In 57 (8.1%) of these patients, the use of a Dormia basket or a biliary stone extraction balloon was technically unsuccessful due to the size of the stones (more than 15 mm in diameter), which required their destruction. Mechanical lithotripsy was used in 35 patients, and electrohydraulic and laser lithotripsy under the SpyGlass direct visualization system in 26 patients. Results: The efficiency of mechanical lithotripsy was 72%. Complications in this group were observed in 2 patients: in both cases, acute pancreatitis developed on day one after lithotripsy and resolved by day three with conservative therapy (Clavien-Dindo grade 2). The efficiency of contact lithotripsy was 100%, with no complications observed in this group; the bilirubin level normalized on day 3-4. Conclusion: Our study showed the efficacy and safety of electrohydraulic and laser lithotripsy under SpyGlass control in a well-defined group of patients with large bile duct stones.

Keywords: contact lithotripsy, choledocholithiasis, SpyGlass, cholangioscopy, laser, electrohydraulic system, ERCP

Procedia PDF Downloads 80
11128 Monte Carlo Simulation of X-Ray Spectra in Diagnostic Radiology and Mammography Using MCNP4C

Authors: Sahar Heidary, Ramin Ghasemi Shayan

Abstract:

The Monte Carlo N-Particle radiation transport computer code (MCNP4C) was used to simulate x-ray spectra in diagnostic radiology and mammography. Electrons were transported until they slowed down and stopped in the target, and both bremsstrahlung and characteristic x-ray production were considered. The x-ray spectra predicted by several computational models used in the diagnostic radiology and mammography energy range were evaluated by comparison with measured spectra, and their effect on the calculation of absorbed dose and effective dose (ED) to the adult ORNL hermaphroditic phantom was quantified. The models comprise empirical models (TASMIP and MASMIP), semi-empirical models (X-rayb&m, X-raytbc, XCOMP, IPEM, Tucker et al., and Blough et al.), and Monte Carlo codes (EGS4, ITS3.0, and MCNP4C). Images obtained using synchrotron radiation (SR) with both screen-film and a CR system were compared with images of the same test objects obtained with digital mammography equipment. In view of the good quality of the results obtained, the CR system was used in two mammographic examinations with SR. For each mammography unit, the protocol yielded bilateral mediolateral oblique (MLO) and craniocaudal (CC) mammograms obtained in a woman with fatty breasts and a woman with dense breasts. Reviewers assessed the image sets and recorded the specific deficiencies that led to a decision to reject the images produced.

Keywords: mammography, monte carlo, effective dose, radiology

Procedia PDF Downloads 131
11127 Investigation of the EEG Signal Parameters during Epileptic Seizure Phases in Consequence to the Application of External Healing Therapy on Subjects

Authors: Karan Sharma, Ajay Kumar

Abstract:

An epileptic seizure is a condition in which electrical charge in the brain flows abruptly, resulting in abnormal activity by the subject. One percent of the world's population experiences epileptic seizure attacks. Due to the abrupt flow of charge, the EEG (electroencephalogram) waveforms change: many spikes and sharp waves appear in the EEG signals. Detecting epileptic seizures by conventional methods is time-consuming, and many methods have evolved to detect them automatically. The initial part of this paper reviews techniques used to detect epileptic seizures automatically. Automatic detection is based on feature extraction and classification patterns; for better accuracy, decomposition of the signal is required before feature extraction. Researchers calculate a number of parameters using different techniques, e.g., approximate entropy, sample entropy, fuzzy approximate entropy, intrinsic mode functions, cross-correlation, etc., to discriminate between a normal signal and an epileptic seizure signal. The main objective of this review is to present the variations in the EEG signals at both stages: (i) interictal (recording between epileptic seizure attacks) and (ii) ictal (recording during an epileptic seizure), using the most appropriate methods of analysis to provide better healthcare diagnosis. The paper then investigates the effects of a noninvasive healing therapy on the subjects by studying the EEG signals using the latest signal processing techniques. The study was conducted with Reiki as the healing technique, considered beneficial for restoring balance in cases of body-mind alterations associated with an epileptic seizure. Reiki is practiced around the world and is recommended in various health services as a treatment approach. Reiki is an energy medicine, specifically a biofield therapy developed in Japan in the early 20th century.
It is a system involving the laying on of hands to stimulate the body's natural energetic system, and earlier studies have shown an apparent connection between Reiki and the autonomic nervous system. The Reiki sessions were applied by an experienced therapist, and EEG signals were measured at baseline, during the session, and post-intervention to assess effective epileptic seizure control or its elimination altogether.

Keywords: EEG signal, Reiki, time consuming, epileptic seizure

Procedia PDF Downloads 406
11126 Value of Willingness to Pay for a Quality-Adjusted Life Years Gained in Iran; A Modified Chained-Approach

Authors: Seyedeh-Fariba Jahanbin, Hasan Yusefzadeh, Bahram Nabilou, Cyrus Alinia

Abstract:

Background: In the absence of an established Willingness to Pay per additional Quality-Adjusted Life Year (QALY) gained based on the preferences of Iran's general public, the cost-effectiveness of health system interventions is unclear, making it challenging to apply economic evaluation to health resources priority setting. Methods: We measured this cost-effectiveness threshold with the participation of 2854 individuals from five provinces, each representing an income quintile, using a modified Time Trade-Off-based Chained-Approach. In this online empirical survey, to elicit health utility values, participants were randomly assigned to one of two health scenarios, green (21121) and yellow (22222), designed based on the previously validated EQ-5D-3L questionnaire. Results: Across the two health state versions, mean values for one QALY gained (rounded) ranged from $6740-$7400 and $6480-$7120, respectively, for the aggregate and trimmed models, equivalent to 1.18-1.35 times GDP per capita. Log-linear multivariate OLS regression analysis confirmed that respondents were more likely to pay if their income, disutility, and education level were higher than those of their counterparts. Conclusions: In the health system of Iran, any intervention with an incremental cost-effectiveness ratio equal to or less than 7402.12 USD will be considered cost-effective.
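The threshold's intended use is a simple decision rule: an intervention is cost-effective when its incremental cost-effectiveness ratio (ICER) does not exceed the WTP value. A sketch; the intervention's extra cost and QALY gain below are hypothetical:

```python
THRESHOLD_USD_PER_QALY = 7402.12  # threshold reported in the abstract

def icer(delta_cost_usd, delta_qalys):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost_usd / delta_qalys

def is_cost_effective(delta_cost_usd, delta_qalys,
                      threshold=THRESHOLD_USD_PER_QALY):
    return icer(delta_cost_usd, delta_qalys) <= threshold

# a hypothetical intervention costing $3000 more and adding 0.5 QALY
print(is_cost_effective(3000.0, 0.5))  # ICER = 6000 USD/QALY → True
```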

Keywords: willingness to pay, QALY, chained-approach, cost-effectiveness threshold, Iran

Procedia PDF Downloads 85
11125 Determining Water Quantity from Sprayer Nozzle Using Particle Image Velocimetry (PIV) and Image Processing Techniques

Authors: M. Nadeem, Y. K. Chang, C. Diallo, U. Venkatadri, P. Havard, T. Nguyen-Quang

Abstract:

Uniform distribution of agro-chemicals is highly important, because a significant fraction of agro-chemicals, for example pesticides, is lost during spraying due to non-uniform droplets and off-target drift. Improving the efficiency of spray patterns for different cropping systems would reduce energy use and costs and minimize environmental pollution. In this paper, we examine water jet patterns in order to study the performance and uniformity of water distribution during the spraying process, and we present a method to quantify the water amount from a sprayer jet using a Particle Image Velocimetry (PIV) system. The results of the study will be used to optimize sprayer or nozzle design for chemical application. For this study, ten sets of images were acquired using the following PIV system settings: double frame mode, a trigger rate of 4 Hz, and 500 µs between pulsed signals. The sets contained different numbers of double-framed images (10, 20, 30, 40, 50, 60, 70, 80, 90 and 100) at eight different pressures: 25, 50, 75, 100, 125, 150, 175 and 200 kPa. The PIV images were analysed using custom-made image processing software for droplet and volume calculations. The results showed good agreement between manual and PIV measurements and suggested that the PIV technique coupled with image processing can be used for precise quantification of flow through nozzles. The results also revealed that measuring fluid flow through PIV is reliable and accurate for sprayer patterns.
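The droplet-and-volume step of such image processing typically reduces each segmented droplet to an equivalent diameter and, assuming sphericity, a volume. The pixel area and image scale below are made-up numbers, not values from the paper's software:

```python
import math

def equivalent_diameter_um(pixel_area, um_per_px):
    """Diameter of the circle with the same area as the segmented droplet."""
    area_um2 = pixel_area * um_per_px ** 2
    return 2.0 * math.sqrt(area_um2 / math.pi)

def droplet_volume_pl(pixel_area, um_per_px):
    """Volume in picolitres, assuming a spherical droplet (1 um^3 = 1e-3 pL)."""
    d = equivalent_diameter_um(pixel_area, um_per_px)
    return (math.pi / 6.0) * d ** 3 * 1e-3

# a droplet covering 100 pixels at a scale of 10 um per pixel
v = droplet_volume_pl(100, 10.0)
print(v)
```

Summing such per-droplet volumes over a frame, and over frames per unit time, gives the flow estimate that is compared against manual measurements.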

Keywords: image processing, PIV, quantifying the water volume from nozzle, spraying pattern

Procedia PDF Downloads 237
11124 Isolation of a Bacterial Community with High Removal Efficiencies of the Insecticide Bendiocarb

Authors: Eusebio A. Jiménez-Arévalo, Deifilia Ahuatzi-Chacón, Juvencio Galíndez-Mayer, Cleotilde Juárez-Ramírez, Nora Ruiz-Ordaz

Abstract:

Bendiocarb is a known toxic xenobiotic that presents acute and chronic risks for freshwater invertebrates and estuarine and marine biota; thus, the treatment of water contaminated with the insecticide is of concern. In this paper, a bacterial community with the capacity to grow on bendiocarb as its sole carbon and nitrogen source was isolated by enrichment techniques in batch culture from samples of a composting plant located in the northeast of Mexico City. Eight cultivable bacteria were isolated from the microbial community and identified by PCR amplification of 16S rDNA: Pseudoxanthomonas spadix (NC_016147.2, 98%), Ochrobactrum anthropi (NC_009668.1, 97%), Staphylococcus capitis (NZ_CP007601.1, 99%), Bosea thiooxidans (NZ_LMAR01000067.1, 99%), Pseudomonas denitrificans (NC_020829.1, 99%), Agromyces sp. (NZ_LMKQ01000001.1, 98%), Bacillus thuringiensis (NC_022873.1, 97%), and Pseudomonas alkylphenolia (NZ_CP009048.1, 98%). NCBI accession numbers and percentages of similarity are indicated in parentheses; these bacteria were regarded as the isolated species for having the best similarity matches. The ability of the immobilized bacterial community to degrade bendiocarb was evaluated in a packed bed biofilm reactor using volcanic stone fragments (tezontle) as support. The reactor system was operated in batch mode with mineral salts medium and 30 mg/L of bendiocarb as the carbon and nitrogen source. With this system, an overall removal efficiency (ηbend) of around 90% was reached.

Keywords: bendiocarb, biodegradation, biofilm reactor, carbamate insecticide

Procedia PDF Downloads 279
11123 Wireless Information Transfer Management and Case Study of a Fire Alarm System in a Residential Building

Authors: Mohsen Azarmjoo, Mehdi Mehdizadeh Koupaei, Maryam Mehdizadeh Koupaei, Asghar Mahdlouei Azar

Abstract:

The increasing prevalence of wireless networks in our daily lives has made them indispensable. The aim of this research is to investigate the management of information transfer in wireless networks and the integration of renewable solar energy resources in a residential building. The focus is on the transmission of electricity and information through wireless networks, as well as the utilization of sensors and wireless fire alarm systems. The research employs a descriptive approach to examine the transmission of electricity and information over a network with electric and optical telephone lines, and the transmission of signals from sensors and wireless fire alarm systems via radio waves. The methodology includes a detailed analysis of the security, comfort conditions, and costs of using wireless networks and renewable solar energy resources. The study reveals that it is feasible to transmit electricity over a network cable, using two of its wire pairs, without the need for separate power cabling. Additionally, the integration of renewable solar energy systems in residential buildings can reduce dependence on traditional energy carriers. The use of sensors and wireless remote information processing can enhance the safety and efficiency of energy usage in buildings and the surrounding spaces.

Keywords: renewable energy, intelligentization, wireless sensors, fire alarm system

Procedia PDF Downloads 54
11122 MIM and Experimental Studies of the Thermal Drift in an Ultra-High Precision Instrument for Dimensional Metrology

Authors: Kamélia Bouderbala, Hichem Nouira, Etienne Videcoq, Manuel Girault, Daniel Petit

Abstract:

Thermal drifts caused by the power dissipated by the mechanical guiding systems constitute the main limit to enhancing the accuracy of an ultra-high precision cylindricity measuring machine. For this reason, a high-precision compact prototype has been designed to simulate the behaviour of the instrument. It ensures in situ calibration of four capacitive displacement probes by comparison with four laser interferometers. The set-up includes three heating wires to simulate the power dissipated by the mechanical guiding systems, four additional heating wires located between each laser interferometer head and its respective holder, 19 platinum resistance thermometers (Pt100) to observe the temperature evolution inside the set-up, and four Pt100 sensors to monitor the ambient temperature. A Reduced Model (RM) based on the Modal Identification Method (MIM) was developed and optimized by comparison with the experimental results. Thereafter, time-dependent tests were performed under several conditions to measure the temperature variation at 19 fixed positions in the system, and the measurements were compared to the calculated RM results. The RM results show good agreement with experiment and reproduce the temperature variations well, demonstrating the value of the proposed RM for evaluating the thermal behaviour of the system.
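A reduced model of this kind is typically a low-order linear state-space system driven by the heating powers; the sketch below shows only the generic form (the authors' identified matrices are not given in the abstract, so F, G and H here are placeholders):

```python
import numpy as np

def simulate_reduced_model(F, G, H, u, dt, x0=None):
    """Integrate a low-order thermal state model dX/dt = F X + G u(t),
    T(t) = H X(t), with explicit Euler steps.  F: (n, n) reduced
    dynamics, G: (n, p) input matrix for the p heating powers,
    H: (m, n) output matrix mapping n states to m sensor temperatures."""
    n = F.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    temps = []
    for u_k in u:                                # u: sequence of inputs
        x = x + dt * (F @ x + G @ np.atleast_1d(u_k))
        temps.append(H @ x)
    return np.array(temps)                       # (steps, m) history
```

With identified matrices, the same loop predicts the temperature drift at the 19 Pt100 positions for any heating scenario.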

Keywords: modal identification method (MIM), thermal behavior and drift, dimensional metrology, measurement

Procedia PDF Downloads 396
11121 Thermal Vacuum Chamber Test Result for CubeSat Transmitter

Authors: Fitri D. Jaswar, Tharek A. Rahman, Yasser A. Ahmad

Abstract:

A CubeSat in low earth orbit (LEO) mainly uses an ultra high frequency (UHF) transmitter with fixed radio frequency (RF) output power to download telemetry and payload data. The transmitter consumes a large amount of electrical energy during transmission, considering the limited size of a CubeSat. A transmitter with power control ability was therefore designed to optimize the signal-to-noise ratio (SNR) and power consumption. In this paper, a thermal vacuum chamber (TVAC) test is performed to validate the performance of the UHF-band transmitter with power control capability. The TVAC is used to simulate the conditions a satellite experiences in the outer space environment. The test was conducted at the Laboratory of Spacecraft Environment Interaction Engineering, Kyushu Institute of Technology, Japan, using four thermal cycles between +60°C and -20°C, with a chamber pressure below 10⁻⁵ Pa. During the test, the UHF transmitter was integrated in a CubeSat configuration with other CubeSat subsystems such as the on-board computer (OBC), power module, and satellite structure. The system was validated and verified through its frequency stability and RF output power. The transmitter output power was tested from 0.5 W to 2 W according to the satellite's modes of operation and power limitations. The measured frequency stability was better than 2 ppm over the tested operating temperature range, and the test demonstrates that the RF output power is adjustable under thermal vacuum conditions.
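Frequency stability in ppm is simply the fractional deviation from the nominal carrier; a sketch, with a 437.5 MHz nominal frequency chosen purely for illustration (the abstract does not state the carrier):

```python
def stability_ppm(f_measured_hz, f_nominal_hz):
    """Fractional frequency deviation in parts per million (ppm)."""
    return 1e6 * (f_measured_hz - f_nominal_hz) / f_nominal_hz

# e.g. a hypothetical 437.5 MHz downlink drifting by 875 Hz is a 2 ppm shift
```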

Keywords: communication system, CubeSat, SNR, UHF transmitter

Procedia PDF Downloads 264
11120 The Quantitative Optical Modulation of Dopamine Receptor-Mediated Endocytosis Using an Optogenetic System

Authors: Qiaoyue Kuang, Yang Li, Mizuki Endo, Takeaki Ozawa

Abstract:

G protein-coupled receptors (GPCRs) are the largest family of receptor proteins that detect molecules outside the cell and activate cellular responses. Of the GPCRs, dopamine receptors, which recognize extracellular dopamine, are essential to mammals due to their roles in numerous physiological events, including autonomic movement, hormonal regulation, emotions, and the reward system in the brain. To precisely understand the physiological roles of dopamine receptors, it is important to spatiotemporally control the signaling they mediate, which is strongly dependent on their surface expression. Conventionally, chemically induced interactions were applied to trigger the endocytosis of cell surface receptors; however, these methods are subject to diffusion and therefore lack temporal and spatial precision. To further understand receptor-mediated signaling and to control the plasma membrane expression of receptors, an optogenetic tool called the E-fragment was developed. The C-terminus of the light-sensitive photosensory protein cryptochrome 2 (CRY2) was attached to β-arrestin, and the E-fragment was generated by fusing the C-terminal peptide of the vasopressin receptor (V2R) to CRY2's binding partner protein CIB. The CRY2-CIB heterodimerization triggered by blue light stimulation brings β-arrestin to the vicinity of membrane receptors and results in receptor endocytosis. In this study, the E-fragment system was applied to dopamine receptors 1 and 2 (DRD1 and DRD2) to control dopamine signaling. First, confocal fluorescence microscopy qualitatively confirmed the light-induced endocytosis of E-fragment-fused receptors. Second, a NanoBiT bioluminescence assay verified quantitatively that the surface amount of E-fragment-labeled receptors decreased after light treatment. Finally, GloSensor bioluminescence assay results suggested that the E-fragment-dependent light-induced endocytosis decreased cAMP production in DRD1 signaling and attenuated the inhibitory effect of DRD2 on cAMP production. The developed optogenetic tool was able to induce receptor endocytosis with external light, providing opportunities to further understand numerous physiological activities by controlling receptor-mediated signaling spatiotemporally.

Keywords: dopamine receptors, endocytosis, G protein-coupled receptors, optogenetics

Procedia PDF Downloads 102
11119 Fuzzy Multi-Objective Approach for Emergency Location Transportation Problem

Authors: Bidzina Matsaberidze, Anna Sikharulidze, Gia Sirbiladze, Bezhan Ghvaberidze

Abstract:

In the modern world, emergency management decision support systems are actively used by state organizations, which deal with extreme and abnormal events and must provide optimal and safe management of the supplies needed by civil and military facilities in geographical areas affected by disasters, earthquakes, fires and other accidents, weapons of mass destruction, terrorist attacks, etc. These kinds of extreme events cause significant losses and damage to infrastructure. In such cases, intelligent support technologies are very important for the quick and optimal location-transportation of emergency services in order to avoid further losses. Timely service from emergency service centers to the affected disaster regions (the response phase) is a key task of the emergency management system, and research in this field holds an important place in decision-making problems. Our goal was to create an expert knowledge-based intelligent support system to serve as an assistant tool providing optimal solutions for the above-mentioned problem. The inputs to the mathematical model of the system are objective data as well as expert evaluations; the outputs are solutions of the Fuzzy Multi-Objective Emergency Location-Transportation Problem (FMOELTP) for disaster regions. The development and testing of the intelligent support system were done on the example of an experimental disaster region (for a geographical zone of Georgia) generated using simulation modeling. Four objectives are considered in our model. The first objective minimizes the expected total transportation duration of needed products. The second minimizes the total selection unreliability index of the opened humanitarian aid distribution centers (HADCs). The third minimizes the number of agents needed to operate the opened HADCs. The fourth minimizes the non-covered demand over all demand points. Possibility chance constraints and objective constraints were constructed based on objective-subjective data. The FMOELTP was formulated in a static and fuzzy environment, since the decisions must be taken immediately after the disaster (within a few hours) with the information available at that moment. It is assumed that the requests for products are estimated by homeland security organizations, or their experts, based upon their experience and their evaluation of the disaster's seriousness. Estimated transportation times take into account the routing access difficulty of the region and the infrastructure conditions. We propose an epsilon-constraint method for finding exact solutions to the problem, and it is proved that this approach generates the exact Pareto front of the multi-objective location-transportation problem addressed. For large problem dimensions, the exact method can require long computing times; we therefore also propose an approximate method that imposes a number of stopping criteria on the exact method, and for large dimensions of the FMOELTP an Estimation of Distribution Algorithm (EDA) approach is developed.
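The epsilon-constraint idea can be sketched on a finite candidate set: minimise one objective while bounding the other by a sweeping threshold ε, and collect the resulting Pareto points. This is an illustrative bi-objective reduction of the four-objective model, with names of our own choosing:

```python
def epsilon_constraint_front(solutions, f1, f2):
    """Trace the Pareto front of a bi-objective minimisation problem
    over a finite solution set: minimise f1 subject to f2(x) <= eps,
    sweeping eps over all attainable values of f2."""
    front = set()
    for eps in sorted({f2(x) for x in solutions}):
        feasible = [x for x in solutions if f2(x) <= eps]
        best = min(feasible, key=f1)          # exact subproblem solve
        front.add((f1(best), f2(best)))
    return sorted(front)
```

In the full FMOELTP each subproblem would be a fuzzy location-transportation program rather than a set lookup, but the sweep structure is the same.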

Keywords: epsilon-constraint method, estimation of distribution algorithm, fuzzy multi-objective combinatorial programming problem, fuzzy multi-objective emergency location/transportation problem

Procedia PDF Downloads 321
11118 Wind Energy Harvester Based on Triboelectricity: Large-Scale Energy Nanogenerator

Authors: Aravind Ravichandran, Marc Ramuz, Sylvain Blayac

Abstract:

With the rapid development of wearable electronics and sensor networks, batteries cannot meet sustainable energy requirements due to their limited lifetime, size and degradation. Ambient energy such as wind has been considered an attractive energy source due to its abundance, ubiquity, and accessibility in nature. With miniaturization leading to high power density and robustness, the triboelectric nanogenerator (TENG) has been conceived as a promising technology for harvesting mechanical energy to power small electronics. Despite these attractive properties, TENG integration in large-scale applications is still unexplored. In this work, a state-of-the-art TENG design based on a wind venturi system is demonstrated for use in complex environments. When wind enters the air gap of the homemade TENG venturi system, a thin flexible polymer repeatedly contacts with and separates from the electrodes. This device structure makes the TENG suitable for large-scale harvesting without massive volume. Multiple stacking not only amplifies the output power but also enables multi-directional wind utilization. The system converts ambient mechanical energy to electricity with a 400 V peak voltage, rapidly charging a 1000 mF supercapacitor. Its future implementation in arrays of devices could aid environmentally friendly, large-scale clean energy production, and the proposed design is supported by exhaustive material testing. The relation between the interfacial micro- and nanostructures and the enhancement of electrical performance is studied comparatively. Nanostructures are more beneficial for the effective contact area, but they are less suitable for anti-adhesion due to their smaller restoring force; considering these issues, nano-patterning is proposed for further enhancement of the effective contact area. Given its simple fabrication, outstanding performance, robustness and low cost, we believe that the TENG can open up great opportunities not only for powering small electronics but also for large-scale energy harvesting, complementary to solar energy in remote areas.
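The energy banked in the storage element follows the standard relation E = CV²/2; a back-of-envelope helper (illustrative values only, not measurements from this work):

```python
def capacitor_energy_j(capacitance_f, voltage_v):
    """Energy (in joules) stored in a capacitor of capacitance C
    (farads) charged to voltage V (volts): E = 0.5 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v**2

# e.g. a 1 F supercapacitor charged to 5 V stores 12.5 J
```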

Keywords: triboelectric nanogenerator, wind energy, vortex design, large scale energy

Procedia PDF Downloads 213
11117 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter

Authors: Jisun Lee, Jay Hyoun Kwon

Abstract:

As an alternative way to compensate for INS (inertial navigation system) errors in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined to complement the weaknesses of a single geophysical data type and to improve the stability of positioning. The main process for compensating the INS error using the geophysical database was constructed on the basis of the EKF (Extended Kalman Filter). In detail, two types of combination methods, centralized and decentralized filters, were applied to assess the pros and cons of each algorithm and to find more robust results. The performance of each navigation algorithm was evaluated in simulation by supposing that the aircraft flies with a precise geophysical database and sensors along nine different trajectories. In particular, the results were compared to those from navigation referenced to a single geophysical database, to check the improvement due to the combination of heterogeneous geophysical databases. It was found that the overall navigation performance was improved, but not all trajectories produced better navigation results from combining gravity gradient with terrain data. It was also found that the centralized filter generally showed more stable results; this is because the weight allocation for the decentralized filter could not be optimized due to the local inconsistency of the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
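The EKF measurement update at the core of such database-referenced navigation can be sketched generically; here h(x) stands for the predicted measurement (e.g. gravity gradient or terrain height interpolated from the database at the estimated position) and H for its Jacobian. This is an illustrative form, not the authors' exact formulation:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update.  x, P: state estimate and covariance;
    z: the geophysical measurement; h(x): predicted measurement;
    H: Jacobian of h at x; R: measurement noise covariance."""
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

A centralized filter would stack the gravity-gradient and terrain measurements into one z and R; a decentralized scheme runs separate updates and fuses the estimates afterwards.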

Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain

Procedia PDF Downloads 349
11116 Biogas Production from Kitchen Waste for a Household Sustainability

Authors: Vuiswa Lucia Sethunya, Tonderayi Matambo, Diane Hildebrandt

Abstract:

South Africa's informal settlements produce tonnes of kitchen waste (KW) per year, which is dumped into landfills. These landfill sites are normally located in close proximity to the households of poor communities, and young children from those communities end up playing on them, which poses health hazards because of the methane, carbon dioxide and sulphurous gases produced. To reduce the large amount of organic material deposited in landfills, and to provide a cleaner environment for the community and especially its children, an energy conversion process, anaerobic digestion of the organic waste to produce biogas, was implemented. In this study, the digestion of various kitchen wastes was investigated in order to understand and develop a system suitable for household use to produce biogas for cooking. Three sets of waste of different nutritional composition, as found in the waste streams of a household, were digested at mesophilic temperature (35°C). These sets of KW were co-digested with cow dung at different ratios to observe the microbial behaviour and the system's stability in a laboratory-scale system. Gas chromatography-flame ionization detector analyses were performed to identify and quantify the organic compounds in the liquid samples from co-digested and mono-digested food waste. Acetic acid, propionic acid, butyric acid and valeric acid were the fatty acids studied; acetic acid (1.98 g/L), propionic acid (0.75 g/L) and butyric acid (2.16 g/L) were the most prevalent. The organic acid results suggest that KW can be an innovative substitute for animal manure in biogas production. The fast degradation period, in which the microbes break down the organic compounds to produce fatty acids during the anaerobic process, also makes KW a better feedstock during periods of high energy demand. The C/N ratio analysis showed that, of the three waste streams, the first, containing vegetables (55%), fruits (16%), meat (25%) and pap (4%), yielded the most methane-rich biogas, 317 mL/g of volatile solids (VS), at a C/N ratio of 21.06. This shows that a household will require a heterogeneous, nutrient-balanced waste composition in the digester to achieve the best biogas yield to sustain its cooking needs.
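The reported specific yield converts directly into expected gas volume per mass of volatile solids fed; a small helper (the function name is ours):

```python
def biogas_litres(vs_fed_g, specific_yield_ml_per_g=317.0):
    """Biogas volume (litres) expected from a given mass of volatile
    solids, using a measured specific yield in mL biogas per g VS
    (317 mL/g VS was the best stream reported here)."""
    return vs_fed_g * specific_yield_ml_per_g / 1000.0
```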

Keywords: anaerobic digestion, biogas, kitchen waste, household

Procedia PDF Downloads 200
11115 Engineering Topology of Construction Ecology in Urban Environments: Suez Canal Economic Zone

Authors: Moustafa Osman Mohammed

Abstract:

Integrating sustainability outcomes draws attention to construction ecology in the design review of urban environments, so that they comply with the Earth system and its integral physical, chemical and biological components. Naturally, the exchange patterns of industrial ecology follow consistent, periodic cycles that preserve energy and material flows in the Earth system. When engineering topology affects internal and external processes in system networks, it postulates the valence of the first-level spatial outcome (project compatibility success); these instrumentalities depend in turn on the second-level outcome (participant security satisfaction). The construction ecology approach feeds back energy from resource flows between biotic and abiotic components across the Earth's ecosystems. These spatial outcomes provide an innovation, as they entail a wide range of interactions that state, regulate and feed back 'topology' to flow as an 'interdisciplinary equilibrium' of ecosystems. The interrelation dynamics of ecosystems perform a process at a certain location, within an appropriate time, characterizing their unique structure in 'equilibrium patterns', such as the biosphere, and collecting a composite structure of many distributed feedback flows. These interdisciplinary systems regulate their dynamics within complex structures, and these dynamic mechanisms regulate physical and chemical properties to enable a gradual, prolonged incremental pattern that develops a stable structure. The engineering topology of construction ecology for integrated sustainability outcomes offers an interesting tool for ecologists and engineers in the simulation paradigm, as an initial form of development structure within compatible computer software. This approach argues from ecology, resource savings, static load design, finance and other pragmatic reasons; from an artistic or architectural perspective, these are not decisive. The paper describes an attempt to unify analytic and analogical spatial modeling in developing urban environments as a relational setting, using optimization software, applied to an example of integrated industrial ecology in which the construction process is based on a topology optimization approach.

Keywords: construction ecology, industrial ecology, urban topology, environmental planning

Procedia PDF Downloads 130
11114 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground) and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices such as smart meters to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available power quality monitor (PQM) placement methods. This paper presents a design of a PQM that is capable of integrating into an existing distributed automation system (DAS) infrastructure to take advantage of available placement methodologies. The monitoring component of the design was implemented and installed to monitor an existing LV network, and data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana was modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze power quality disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.
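A monitor node classifies disturbances from the per-unit RMS voltage; a simplified classifier using the commonly used 0.9/1.1 pu magnitude thresholds (the paper's own detection logic is not given, so this is an assumed sketch):

```python
def classify_pq_event(v_rms_pu):
    """Rough classification of an RMS voltage sample (in per-unit)
    into common power-quality disturbance classes, using the widely
    used 0.1 / 0.9 / 1.1 pu magnitude limits."""
    if v_rms_pu < 0.1:
        return "interruption"
    if v_rms_pu < 0.9:
        return "sag"
    if v_rms_pu > 1.1:
        return "overvoltage"
    return "normal"
```

A full monitor would also track event duration to separate momentary sags from sustained undervoltage.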

Keywords: power quality, remote monitoring, distributed automation system, economic evaluation, LV network

Procedia PDF Downloads 349
11113 State, Public Policies, and Rights: Public Expenditure and Social and Welfare Policies in America, as Opposed to Argentina

Authors: Mauro Cristeche

Abstract:

This paper approaches the intervention of the American state in the social arena, and the modeling of its rights system, from the Argentinian experience, by observing the characteristics of its federal budgetary system, the evolution of social public spending and welfare programs in recent years, labor and poverty statistics, and changes in the structure of the labor market. The analysis combines different methodologies and sources: in-depth interviews with specialists, analysis of theoretical and mass-media material, and statistical sources. Among the results, the tendency toward state interventionism (what has been called the 'nationalization of social life') is quite evident in the United States and manifests itself in multiple forms. The bibliography consulted and the experts interviewed point to this increase in state presence in historical terms (beyond short-term setbacks): growth in public spending, fiscal pressure, public employment, protective and control mechanisms, the extension of welfare policies to poor sectors, etc. In fact, despite the significant differences between the two countries, the United States and Argentina show common patterns of behavior in these phenomena. On the other hand, the dissimilarities are also important, some of them determined by each country's own political history. The influence of political parties on the economic model seems more decisive in the United States than in Argentina, where the tendency toward state interventionism is more stable. The centrality of health spending is evident in America, while in Argentina that discussion is concentrated on the social security system and public education. The biggest problem of the labor market in the United States is deskilling as a consequence of technological development, while in Argentina it is a result of the market's weakness. Another big difference is the huge American public spending on defense. Furthermore, the more federal character of the American state is a factor of differential analysis against the centralized Argentine state. American public employment (around 10%) is considerably lower than the Argentinian (around 18%). The social statistics show differences, but inequality and poverty have been growing as a trend in both countries over recent decades: according to public figures, poverty stands at 14% in the United States and 33% in Argentina. American public spending is substantial (welfare spending and total public spending represent around 12% and 34% of GDP, respectively), but somewhat lower than the Latin American or European averages. In both cases, the tendency toward underemployment and unemployment through deskilling has not yet assumed serious gravity. Probably one of the most important aspects of the analysis is that private initiative and public intervention are much more intertwined in the United States, which makes state intervention more 'fuzzy', while in Argentina the difference is clearer. Finally, the power of capital accumulation, and more specifically of the industrial and service sectors, in the United States, which continue to be the engine of the economy, marks a great difference from Argentina, which relies on its agro-industrial strength and its public sector.

Keywords: state intervention, welfare policies, labor market, system of rights, United States of America

Procedia PDF Downloads 131
11112 Enhancing Thai In-Service Science Teachers' Technological Pedagogical Content Knowledge Integrating Local Context and Sufficiency Economy into Science Teaching

Authors: Siriwan Chatmaneerungcharoen

Abstract:

An emerging body of '21st century skills', such as adaptability, complex communication skills, technology skills and the ability to solve non-routine problems, is valuable across a wide range of jobs in the national economy. Within the Thai context, a focus on the Philosophy of Sufficiency Economy is integrated into science education. Thai science education has advocated infusing 21st century skills and the Philosophy of Sufficiency Economy into the school curriculum, and several educational levels have launched such efforts. Developing science teachers with the proper knowledge is therefore the most important factor in achieving these goals. The purposes of this study were to develop 40 cooperating science teachers' Technological Pedagogical Content Knowledge (TPACK), which is essential to teachers' career development, and to develop a professional development model integrating a co-teaching model and a coaching system (Co-TPACK). Forty volunteer in-service teachers who served as cooperating science teachers participated in this study for two years. Data sources throughout the research project consisted of teacher reflections, classroom observations, semi-structured interviews, situational interviews, questionnaires and document analysis. An interpretivist framework was used to analyze the data. The findings indicate that, at the beginning, the teachers understood only the meaning of the Philosophy of Sufficiency Economy but did not know how to integrate it into their science classrooms; mostly, they preferred lecture-based and experimental teaching styles. As the Co-TPACK progressed, the teachers blended their teaching styles and learning evaluation methods. Co-TPACK consists of three cycles: a student teachers' preparation cycle, a cooperating science teachers cycle, and a collaboration cycle (co-teaching, co-planning, co-evaluating and the coaching system). Co-TPACK enables the 40 cooperating science teachers, the student teachers and the university supervisor to exchange their knowledge and experience of teaching science through many communication channels, including online ones. They made greater use of Phuket context-integrated lessons and of technology-integrated teaching and learning that make the Philosophy of Sufficiency Economy explicit. Their sustained development is shown in their lesson plans and teaching practices.

Keywords: technological pedagogical content knowledge, philosophy of sufficiency economy, professional development, coaching system

Procedia PDF Downloads 464
11111 The Review of Permanent Downhole Monitoring System

Authors: Jing Hu, Dong Yang

Abstract:

With the increasingly difficult development and operating environments of exploration, there are many new challenges in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and to provide data and assurance for the completion and production of high-cost, complex wells. A key technology for providing this assurance and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point monitoring, whereas fiber optic sensing systems based on the Bragg grating principle offer a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring in real time, at any time, and without well intervention. Continuous data acquisition is performed along the entire wellbore. An integrated package with the downhole pressure gauge, packer, and surface system can also provide real-time dynamic monitoring of the pressure in selected downhole sections, avoiding well intervention and eliminating the production delays and operational risks of conventional surveys. Real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.
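A fiber Bragg grating encodes temperature changes as Bragg-wavelength shifts; a sketch of the inverse conversion, assuming a typical sensitivity of about 10 pm/°C near 1550 nm (a textbook ballpark value, not a figure from this review; real sensors are calibrated individually):

```python
def fbg_temperature_change_c(delta_lambda_pm, sensitivity_pm_per_c=10.0):
    """Convert a measured Bragg-wavelength shift (picometres) into a
    temperature change (degrees C), given the grating's thermal
    sensitivity in pm/degC (assumed ~10 pm/degC around 1550 nm)."""
    return delta_lambda_pm / sensitivity_pm_per_c
```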

Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital-oilfield

Procedia PDF Downloads 79
11110 Artificial Membrane Comparison for Skin Permeation in Skin PAMPA

Authors: Aurea C. L. Lacerda, Paulo R. H. Moreno, Bruna M. P. Vianna, Cristina H. R. Serra, Airton Martin, André R. Baby, Vladi O. Consiglieri, Telma M. Kaneko

Abstract:

The modified Franz cell is the most widely used model for in vitro permeation studies; however, it still presents some disadvantages. Thus, alternative methods have been developed, such as Skin PAMPA, a bio-artificial membrane system that has been applied to estimate the skin penetration of xenobiotics based on a high-throughput (HT) permeability model. Skin PAMPA's greatest advantage is that it allows more tests to be carried out quickly and inexpensively. The membrane system mimics the characteristics of the stratum corneum, the primary skin barrier, whose barrier properties are given by corneocytes embedded in a multilamellar lipid matrix. This layer is the main penetration route through the paracellular permeation pathway and consists of a mixture of cholesterol, ceramides, and fatty acids as the dominant components; however, there is no consensus on the membrane composition. The objective of this work was to compare the performance of different bio-artificial membranes for permeation studies in the Skin PAMPA system. Material and methods: To mimic the lipid composition present in the human stratum corneum, six membranes were developed. The membrane composition was an equimolar mixture of cholesterol, ceramides 1-O-C18:1, C22, and C20, plus fatty acids C20 and C24. The membrane integrity assay was based on the transport of Brilliant Cresyl Blue, which has low permeability, and Lucifer Yellow, which has very poor permeability and should effectively be completely rejected. Membrane characterization was performed using confocal laser Raman spectroscopy, with a laser stabilized at 785 nm, a 10-second integration time and 2 accumulations. The membrane behaviour in the PAMPA system was statistically evaluated, and all of the compositions showed integrity and suitable permeability. The confocal Raman spectra, obtained in the 800-1200 cm⁻¹ region associated with the C-C stretches of the carbon scaffold of the stratum corneum lipids, showed a similar pattern for all the membranes. The equimolar ratio of ceramides, long-chain fatty acids and cholesterol yielded lipid mixtures capable of self-organization, similar to that occurring in the stratum corneum. Conclusion: The bio-artificial membranes studied for Skin PAMPA were similar, with properties comparable to the stratum corneum.
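Permeability in PAMPA-type assays is usually summarized as an apparent permeability coefficient, Papp = (dQ/dt) / (A * C0); a minimal helper (names and example units are ours, not from the paper):

```python
def apparent_permeability(dq_dt_ug_per_s, area_cm2, c0_ug_per_ml):
    """Apparent permeability coefficient Papp (cm/s) from the steady
    permeation rate dQ/dt (ug/s), the membrane area A (cm^2) and the
    donor concentration C0 (ug/mL; 1 mL = 1 cm^3)."""
    c0_ug_per_cm3 = c0_ug_per_ml
    return dq_dt_ug_per_s / (area_cm2 * c0_ug_per_cm3)
```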

Keywords: bio-artificial membranes, comparison, confocal Raman, skin PAMPA

Procedia PDF Downloads 509