Search results for: residual volume
2280 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their location accuracy performance over time. The analysis uses field failure data and the Weibull distribution to determine reliability and, in turn, to understand improvements or degradations over time. It begins by scrutinizing the location accuracy error, i.e., the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and by identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is used to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, which can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in assessing whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and the wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull curve fitting. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the onset of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications. Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
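The two statistical ingredients named in this abstract, the Laplace trend test (to locate the end of the infant mortality phase) and Weibull fitting (to quantify reliability in the operational phase), can be illustrated with a short sketch. This is not the authors' implementation; the failure times below are hypothetical, and scipy's weibull_min is used as a generic fitting routine.

```python
import numpy as np
from scipy import stats

def laplace_test(failure_times, observation_end):
    """Laplace trend test for a time-truncated observation window (0, T].
    U < 0 suggests a decreasing failure intensity (infant mortality fading),
    U ~ 0 a stable operational phase, U > 0 an increasing intensity (wear-out)."""
    t = np.asarray(failure_times, dtype=float)
    n = len(t)
    u = (t.mean() - observation_end / 2.0) / (observation_end * np.sqrt(1.0 / (12.0 * n)))
    p_value = 2.0 * (1.0 - stats.norm.cdf(abs(u)))  # two-sided normal approximation
    return u, p_value

def weibull_parameters(failure_times):
    """Two-parameter Weibull fit: beta < 1 infant mortality, beta ~ 1 random
    failures, beta > 1 wear-out; eta is the characteristic life."""
    beta, _, eta = stats.weibull_min.fit(failure_times, floc=0.0)
    return beta, eta

# Hypothetical times (days) at which the location-accuracy RMS error exceeded its limit
failures = [12.0, 30.0, 95.0, 210.0, 400.0, 620.0, 900.0]
u, p = laplace_test(failures, observation_end=1000.0)
beta, eta = weibull_parameters(failures)
print(f"Laplace U = {u:.2f} (p = {p:.3f}); Weibull beta = {beta:.2f}, eta = {eta:.0f} days")
print(f"R(365 d) = {np.exp(-(365.0 / eta) ** beta):.3f}")  # reliability at one year
```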
Procedia PDF Downloads 65
2279 ENDO-β-1,4-Xylanase from Thermophilic Geobacillus stearothermophilus: Immobilization Using Matrix Entrapment Technique to Increase the Stability and Recycling Efficiency
Authors: Afsheen Aman, Zainab Bibi, Shah Ali Ul Qader
Abstract:
Introduction: Xylan is a heteropolysaccharide composed of xylose monomers linked together through 1,4 linkages within a complex xylan network. Owing to the wide applications of xylan hydrolytic products (xylose, xylobiose and xylooligosaccharides), researchers are focusing on the development of various strategies for efficient xylan degradation. One of the most important strategies is the use of heat-tolerant biocatalysts, which act as strong and specific cleaving agents. Therefore, the exploration of microbial pools from extremely diversified ecosystems is considerably vital. Microbial populations from extreme habitats are keenly explored for the isolation of thermophilic entities. These thermozymes usually demonstrate fast hydrolytic rates, can produce high yields of product and are less prone to microbial contamination. Another possibility for degrading xylan continuously is the use of an immobilization technique. The current work is an effort to merge both positive aspects: the thermozyme and the immobilization technique. Methodology: Geobacillus stearothermophilus was isolated from a soil sample collected near a blast furnace site. This thermophile is capable of producing a thermostable endo-β-1,4-xylanase which cleaves xylan effectively. In the current study, this thermozyme was immobilized within a synthetic and a non-synthetic matrix for continuous production of metabolites using an entrapment technique. The kinetic parameters of the free and immobilized enzyme were studied. For this purpose, calcium alginate and polyacrylamide beads were prepared. Results: For the synthesis of the immobilized beads, sodium alginate (40.0 gL-1) and calcium chloride (0.4 M) were combined. The temperature (50°C) and pH (7.0) optima of the immobilized enzyme remained the same for xylan hydrolysis; however, the enzyme-substrate catalytic reaction time increased from 5.0 to 30.0 minutes compared to the free counterpart. The diffusion limit of high molecular weight xylan (corncob) caused a decline in the Vmax of the immobilized enzyme from 4773 to 203.7 U min-1, whereas the Km value increased from 0.5074 to 0.5722 mg ml-1 with reference to the free enzyme. Immobilized endo-β-1,4-xylanase showed stability at high temperatures compared to the free enzyme. It retained 18% and 9% residual activity at 70°C and 80°C, respectively, whereas the free enzyme completely lost its activity at both temperatures. The immobilized thermozyme displayed sufficient recycling efficiency and can be reused for up to five reaction cycles, indicating that this enzyme can be a plausible candidate in the paper processing industry. Conclusion: This thermozyme showed good immobilization yield and operational stability for hydrolyzing high molecular weight xylan. However, its immobilization properties can be improved further by immobilizing it on different supports for industrial purposes. Keywords: immobilization, reusability, thermozymes, xylanase
Procedia PDF Downloads 374
2278 Building Bricks Made of Fly-Ash Mixed with Sand or Ceramic Dust: Synthesis and a Comparative Study
Authors: Md. R. Shattique, Md. T. Zaki, Md. G. Kibria
Abstract:
Fly-ash bricks give a comprehensive solution towards recycling of fly-ash, and since no firing is required to produce them, they are also eco-friendly bricks; little or no carbon dioxide is emitted during their entire production cycle. As bricks are the most essential and widely utilized building materials in the construction industry, the significance of developing an alternative eco-friendly brick is substantial in modern times. In this paper, the manufacturing and potential utilization of fly-ash building bricks have been studied, and they were found to be a prospective substitute for fired clay bricks, which contribute greatly to polluting the environment. In addition, a comparison between sand-made and ceramic-dust-made fly-ash bricks has been carried out experimentally. The ceramic-dust-made bricks show higher compressive strength at lower unit volume weight compared to sand-made fly-ash bricks. Moreover, the water absorption capacity of ceramic-dust fly-ash bricks was lower than that of sand-made bricks. Finally, a statistical comparison between fired clay bricks and fly-ash bricks was carried out. All the requirements for good quality building bricks are met by the fly-ash bricks. All the facts from this study point out that these bricks offer a new opportunity as an alternative building material. Keywords: coal fly-ash, ceramic dust, burnt clay bricks, sand, gypsum, absorption capacity, unit volume weight, compressive strength
Procedia PDF Downloads 423
2277 CFD Simulation of the Pressure Distribution in the Upper Airway of an Obstructive Sleep Apnea Patient
Authors: Christina Hagen, Pragathi Kamale Gurmurthy, Thorsten M. Buzug
Abstract:
CFD simulations are performed in the upper airway of a patient suffering from obstructive sleep apnea (OSA), a sleep-related breathing disorder characterized by repetitive partial or complete closures of the upper airways. The simulations are aimed at getting a better understanding of the pathophysiological flow patterns in an OSA patient. The simulation is compared to medical data of a sleep endoscopic examination under sedation. A digital model consisting of surface triangles of the upper airway is extracted from the MR images by a region growing segmentation process, followed by careful manual refinement. The computational domain includes the nasal cavity, with the nostrils as the inlet areas, and the pharyngeal volume, with an outlet underneath the larynx. At the nostrils, a flat inflow velocity profile is prescribed by choosing the velocity such that a volume flow rate of 150 ml/s is reached. Behind the larynx, at the outlet, a pressure of -10 Pa is prescribed. The stationary incompressible Navier-Stokes equations are numerically solved using finite elements. A grid convergence study has been performed. The results show an amplification of the maximal velocity to about 2.5 times the inlet velocity at a constriction of the pharyngeal volume in the area of the tongue. The same region also shows the highest pressure drop, of about 5 Pa. This is in agreement with the sleep endoscopic examinations of the same patient under sedation, which show complete contractions in the area of the tongue. CFD simulations can become a useful tool in the diagnosis and therapy of obstructive sleep apnea by giving insight into the patient’s individual fluid dynamical situation in the upper airways, providing a better understanding of the disease where experimental measurements are not feasible. Within this study, it could be shown, on the one hand, that constriction areas within the upper airway lead to a significant pressure drop and, on the other hand, that the area of the pressure drop agrees well with the area of contraction. Keywords: biomedical engineering, obstructive sleep apnea, pharynx, upper airways
Procedia PDF Downloads 306
2276 Uranoplasty Using Tongue Flap for Bilateral Clefts
Authors: Saidasanov Saidazal Shokhmurodovich, Topolnickiy Orest Zinovyevich, Afaunova Olga Arturovna
Abstract:
Relevance: Bilateral congenital cleft is one of the most complex forms of all clefts, which makes it difficult to choose a surgical method of treatment. During primary operations to close the hard and soft palate, there is a shortage of soft tissue, which is also lacking during standard uranoplasty; these factors aggravate the rehabilitation period of patients. Materials and methods: The results of surgical treatment of children with bilateral cleft who underwent uranoplasty using a tongue flap were analyzed. The study used clinical and statistical methods, which allowed the tasks to be addressed on the basis of the principles of evidence-based medicine. Results and discussion: In our study, 15 patients were included, who underwent surgical treatment consisting of two-stage uranoplasty using a tongue flap. Of these, 9 were boys and 6 were girls, aged 2.5 to 6 years. The first stage was veloplasty; the second stage was uranoplasty using a tongue flap. In all patients, the width of the cleft ranged from 1.6 to 2.8 cm. All patients in this group were prepared orthodontically. Using this method, the surgeon can achieve maximum narrowing of the palatopharyngeal ring, a long soft palate, and complete closure of the hard palate and alveolar process; the mucous membrane of the nasal cavity is also sutured, which creates good conditions for the next stage of osteoplastic surgery. Based on the results obtained, patients show positive outcomes when working with a speech therapist. In all patients, the dynamics were positive, without complications. Conclusions: Based on our observation, tongue flap uranoplasty is one of the effective techniques for patients with wide clefts of the hard and soft palate. The use of a tongue flap makes it possible to reduce the number of reoperations and improve the quality of social adaptation of this group of patients, which is one of the important stages of rehabilitation. Upon completion of the stages of rehabilitation, all patients showed maximum improvement in functional, anatomical and social indicators. Keywords: congenital cleft lips and palate, bilateral cleft, child surgery, maxillofacial surgery
Procedia PDF Downloads 121
2275 Nanoenergetic Materials as Effective Heat Energy Sources for Enhanced Gas Generators
Authors: Sang Beom Kim, Kyung Ju Kim, Myung Hoon Cho, Ji Hoon Kim, Soo Hyung Kim
Abstract:
In this study, we systematically investigated the effect of nanoscale energetic materials in formulations of aluminum nanoparticles (Al NPs; heat source) and copper oxide nanoparticles (CuO NPs; oxidizer) on the combustion and gas-generating properties of sodium azide microparticles (NaN3 MPs; gas-generating agent) for potential applications in gas generators. The burn rate of the NaN3 MP/CuO NP composite powder was only ~0.3 m/s. However, the addition of Al NPs to the NaN3 MP/CuO NP matrix increased the burn rate to ~5.3 m/s. In addition, the N2 gas volume flow rate generated by the ignition of the NaN3 MP/CuO NP composite powder was only ~0.6 L/s, which was significantly increased to ~3.9 L/s by adding Al NPs to the NaN3 MP/CuO NP composite powder. This suggests that the highly reactive Al NPs, with the assistance of CuO NPs, were effective heat-generating sources enabling the complete thermal decomposition of NaN3 MPs upon ignition. Al NPs were highly effective in the gas generators because of the increased reactivity induced by the reduced particle size. Finally, we successfully demonstrated that a homemade airbag with a specific volume of ~140 mL could be rapidly and fully inflated by the thermal activation of the nanoscale energetic material-added gas-generating agent (i.e., the NaN3 MP/Al NP/CuO NP composite) within the standard time of ~50 ms for airbag inflation. Keywords: nanoenergetic materials, aluminum nanoparticles, copper oxide nanoparticles, gas generators
Procedia PDF Downloads 367
2274 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 3: Volume Reduction and Stabilization of Solid Waste
Authors: Masaumi Nakahara, Sou Watanabe, Hiromichi Ogi, Atsuhiro Shibata, Kazunori Nomura
Abstract:
In the Japan Atomic Energy Agency, three types of experimental research (advanced reactor fuel reprocessing, radioactive waste disposal, and nuclear fuel cycle technology) have been carried out at the Chemical Processing Facility. The facility has generated high level radioactive liquid and solid wastes in hot cells. The high level radioactive solid waste is divided into three main categories: flammable waste, non-flammable waste, and solid reagent waste. Plastic products are categorized as flammable waste and are melted with a heating mantle. The non-flammable waste is cut with a band saw machine to reduce its volume. Among the solid reagent waste, the adsorbent used in the experiments is heated, and the extractant is decomposed for stabilization. All high level radioactive solid wastes in the hot cells are packed in a high level radioactive solid waste can. The can is then transported to the 2nd High Active Solid Waste Storage in the Tokai Reprocessing Plant in the Japan Atomic Energy Agency. Keywords: high level radioactive solid waste, advanced reactor fuel reprocessing, radioactive waste disposal, nuclear fuel cycle technology
Procedia PDF Downloads 159
2273 Development and Modelling of Cellulose Nano-Crystal from Agricultural Wastes for Adsorptive Removal of Pharmaceuticals in Wastewater
Authors: Abubakar Muhammad Hammari, Usman Dadum Hamza, Maryam Ibrahim, Kabir Garba, Idris Muhammad Misau
Abstract:
Pharmaceuticals are increasingly present in water systems, posing threats to ecosystems and human health. The effective treatment of pharmaceutical wastewater presents a significant challenge due to the complex and diverse organic and inorganic contaminants it contains. Conventional treatment methods often struggle to completely remove these pollutants due to their stability and water solubility, leading to environmental concerns and potential health risks. This research proposes the use of cellulose nanocrystals (CNCs) derived from agricultural waste as efficient and sustainable adsorbents for pharmaceutical wastewater treatment. CNCs offer high surface area, biodegradability, and low cost compared to existing options. This study evaluates the production, characterization, adsorption properties, and reusability of cellulose nanocrystals (CNCs) derived from waste paper (CNC-WP), rice husk (CNC-RH), and groundnut shell (CNC-GS). The percentage yield of CNCs was highest from wastepaper at 50.67%, followed by groundnut shell at 33.40% and rice husk at 26.46%. X-ray diffraction (XRD) confirmed the cellulose crystalline structure across all samples while scanning electron microscopy (SEM) revealed a needle-like morphology with size distribution variations. Energy-dispersive X-ray spectroscopy (EDX) identified carbon and oxygen as the primary elements, with minor residual inorganic materials varying by source. BET analysis indicated high surface areas for all CNCs, with CNC-RH exhibiting the highest value (464.592 m²/g), suggesting a more porous structure. The pore sizes of all samples fell within the meso-pore range (2.108 nm to 2.153 nm). Adsorption studies focused on metronidazole (MNZ) removal using CNC-WP. Isotherm models, including Langmuir and Sips, described the equilibrium between MNZ concentration and adsorption onto CNC-WP, showing the best fit with R² values exceeding 0.95. The adsorption process was favourable, with monolayer coverage and potential binding energy heterogeneity. Kinetic modelling identified the pseudo-second-order model as the best fit (R² = 1, SSE = 5.00 x 10-₇), indicating chemisorption as the predominant mechanism. Thermodynamic analysis revealed negative ΔG values at all temperatures, indicating spontaneous adsorption, with more favourable adsorption at higher temperatures. The adsorption process was exothermic, as indicated by negative ΔH values. Reusability studies demonstrated that CNC-WP retained high MNZ removal efficiency, with a modest decrease from 99.59% to 89.11% over ten regeneration cycles. This study highlights the efficiency of wastepaper as a raw material for CNC production and its potential for effective and reusable MNZ adsorption.Keywords: cellulose nanocrystals (CNCs), adsorption efficiency, metronidazole removal, reusability
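To make the isotherm and kinetics modelling concrete, the sketch below fits the Langmuir isotherm and the integrated pseudo-second-order kinetic model mentioned above with scipy. It is only an illustration: the equilibrium and kinetic data points are invented, and the fitted values do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    # Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)
    return qmax * KL * Ce / (1.0 + KL * Ce)

def pseudo_second_order(t, k2, qe):
    # Integrated pseudo-second-order kinetics: qt = k2 * qe^2 * t / (1 + k2 * qe * t)
    return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)

# Hypothetical MNZ data on CNC-WP: equilibrium concentration Ce (mg/L) vs uptake qe (mg/g)
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qe = np.array([18.0, 35.0, 52.0, 68.0, 80.0])
(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[100.0, 0.1])
r2 = 1.0 - np.sum((qe - langmuir(Ce, qmax, KL)) ** 2) / np.sum((qe - qe.mean()) ** 2)
print(f"Langmuir fit: qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg, R^2 = {r2:.3f}")

# Hypothetical uptake-versus-time data (min, mg/g) for the kinetic fit
t = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
qt = np.array([20.0, 33.0, 48.0, 60.0, 68.0, 72.0])
(k2, qe_fit), _ = curve_fit(pseudo_second_order, t, qt, p0=[0.01, 75.0])
print(f"Pseudo-second-order fit: k2 = {k2:.4f} g/(mg*min), qe = {qe_fit:.1f} mg/g")
```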
Procedia PDF Downloads 5
2272 Devulcanization of Waste Rubber Using Thermomechanical Method Combined with Supercritical CO₂
Authors: L. Asaro, M. Gratton, S. Seghar, N. Poirot, N. Ait Hocine
Abstract:
Rubber waste disposal is an environmental problem. Particularly, many researches are centered in the management of discarded tires. In spite of all different ways of handling used tires, the most common is to deposit them in a landfill, creating a stock of tires. These stocks can cause fire danger and provide ambient for rodents, mosquitoes and other pests, causing health hazards and environmental problems. Because of the three-dimensional structure of the rubbers and their specific composition that include several additives, their recycling is a current technological challenge. The technique which can break down the crosslink bonds in the rubber is called devulcanization. Strictly, devulcanization can be defined as a process where poly-, di-, and mono-sulfidic bonds, formed during vulcanization, are totally or partially broken. In the recent years, super critical carbon dioxide (scCO₂) was proposed as a green devulcanization atmosphere. This is because it is chemically inactive, nontoxic, nonflammable and inexpensive. Its critical point can be easily reached (31.1 °C and 7.38 MPa), and residual scCO₂ in the devulcanized rubber can be easily and rapidly removed by releasing pressure. In this study thermomechanical devulcanization of ground tire rubber (GTR) was performed in a twin screw extruder under diverse operation conditions. Supercritical CO₂ was added in different quantities to promote the devulcanization. Temperature, screw speed and quantity of CO₂ were the parameters that were varied during the process. The devulcanized rubber was characterized by its devulcanization percent and crosslink density by swelling in toluene. Infrared spectroscopy (FTIR) and Gel permeation chromatography (GPC) were also done, and the results were related with the Mooney viscosity. The results showed that the crosslink density decreases as the extruder temperature and speed increases, and, as expected, the soluble fraction increase with both parameters. The Mooney viscosity of the devulcanized rubber decreases as the extruder temperature increases. The reached values were in good correlation (R= 0.96) with de the soluble fraction. In order to analyze if the devulcanization was caused by main chains or crosslink scission, the Horikx's theory was used. Results showed that all tests fall in the curve that corresponds to the sulfur bond scission, which indicates that the devulcanization has successfully happened without degradation of the rubber. In the spectra obtained by FTIR, it was observed that none of the characteristic peaks of the GTR were modified by the different devulcanization conditions. This was expected, because due to the low sulfur content (~1.4 phr) and the multiphasic composition of the GTR, it is very difficult to evaluate the devulcanization by this technique. The lowest crosslink density was reached with 1 cm³/min of CO₂, and the power consumed in that process was also near to the minimum. These results encourage us to do further analyses to better understand the effect of the different conditions on the devulcanization process. The analysis is currently extended to monophasic rubbers as ethylene propylene diene monomer rubber (EPDM) and natural rubber (NR).Keywords: devulcanization, recycling, rubber, waste
Procedia PDF Downloads 390
2271 A New Intelligent, Dynamic and Real Time Management System of Sewerage
Authors: R. Tlili Yaakoubi, H.Nakouri, O. Blanpain, S. Lallahem
Abstract:
The current tools for real time management of sewer systems are based on two software tools: weather forecasting software and hydraulic simulation software. The use of the former is an important source of imprecision and uncertainty; the use of the latter imposes long decision time steps because of its computation time requirements. As a consequence, the obtained results generally differ from those expected. The major idea of this project is to change the basic paradigm by approaching the problem from the 'automatic control' side rather than from the 'hydrology' side. The objective is to make it possible to run a large number of simulations in very short times (a few seconds), replacing weather forecasts by directly using the real-time measured pluviometric data. The aim is to reach a system where decisions are made from reliable data and where error correction is permanent. A first model of control laws was realized and tested with different return-period rainfalls. The gains obtained in rejected volume vary from 19 to 100%. A new algorithm was then developed to optimize calculation time and thus to overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The obtained gains are 40% in the total volume rejected to the natural environment and 65% in the number of discharges. Keywords: automation, optimization, paradigm, RTC
Procedia PDF Downloads 301
2270 A Novel Computer-Generated Hologram (CGH) Achieved Scheme Generated from Point Cloud by Using a Lens Array
Authors: Wei-Na Li, Mei-Lan Piao, Nam Kim
Abstract:
We propose a novel computer-generated hologram (CGH) scheme, wherein the CGH is generated from a point cloud that is obtained through a mapping relationship from a series of elemental images captured from a real three-dimensional (3D) object by using a lens array. This scheme is composed of three procedures: mapping from elemental images to a point cloud, hologram generation, and hologram display. A mapping method is devised to obtain virtual volume data (a point cloud) from a series of elemental images. This mapping method consists of two steps. Firstly, the coordinate (x, y) pairs and their numbers of appearances are calculated from the series of sub-images, which are generated from the elemental images. Secondly, a series of corresponding coordinates (x, y, z) are calculated from the elemental images. Then a hologram is generated from the volume data calculated in the previous two steps. Eventually, a spatial light modulator (SLM) and a green laser beam are utilized to display this hologram and reconstruct the original 3D object. In this paper, in order to achieve a more autostereoscopic display of a real 3D object, we successfully obtained the actual depth data of every discrete point of the real 3D object and overcame the inherent drawbacks of the depth camera by obtaining the point cloud from the elemental images. Keywords: elemental image, point cloud, computer-generated hologram (CGH), autostereoscopic display
Procedia PDF Downloads 585
2269 Methods Used to Achieve Airtightness of 0.07 Ach@50Pa for an Industrial Building
Authors: G. Wimmers
Abstract:
The University of Northern British Columbia needed a new laboratory building for the Master of Engineering in Integrated Wood Design Program and its new Civil Engineering Program. Since the University is committed to reducing its environmental footprint and because the Master of Engineering Program is actively involved in research of energy efficient buildings, the decision was made to request the energy efficiency of the Passive House Standard in the Request for Proposals. The building is located in Prince George in Northern British Columbia, a city located at the northern edge of climate zone 6 with an average low between -8 and -10.5 in the winter months. The footprint of the building is 30m x 30m with a height of about 10m. The building consists of a large open space for the shop and laboratory with a small portion of the floorplan being two floors, allowing for a mezzanine level with a few offices as well as mechanical and storage rooms. The total net floor area is 1042m² and the building’s gross volume 9686m³. One key requirement of the Passive House Standard is the airtight envelope with an airtightness of < 0.6 ach@50Pa. In the past, we have seen that this requirement can be challenging to reach for industrial buildings. When testing for air tightness, it is important to test in both directions, pressurization, and depressurization, since the airflow through all leakages of the building will, in reality, happen simultaneously in both directions. A specific detail or situation such as overlapping but not sealed membranes might be airtight in one direction, due to the valve effect, but are opening up when tested in the opposite direction. In this specific project, the advantage was the overall very compact envelope and the good volume to envelope area ratio. The building had to be very airtight and the details for the windows and doors installation as well as all transitions from walls to roof and floor, the connections of the prefabricated wall panels and all penetrations had to be carefully developed to allow for maximum airtightness. The biggest challenges were the specific components of this industrial building, the large bay door for semi-trucks and the dust extraction system for the wood processing machinery. The testing was carried out in accordance with EN 132829 (method A) as specified in the International Passive House Standard and the volume calculation was also following the Passive House guideline resulting in a net volume of 7383m3, excluding all walls, floors and suspended ceiling volumes. This paper will explore the details and strategies used to achieve an airtightness of 0.07 ach@50Pa, to the best of our knowledge the lowest value achieved in North America so far following the test protocol of the International Passive House Standard and discuss the crucial steps throughout the project phases and the most challenging details.Keywords: air changes, airtightness, envelope design, industrial building, passive house
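As a small worked example of what the quoted value means: the air-change rate at 50 Pa is the leakage air flow at the test pressure divided by the net volume, so the figures given above can be turned into a flow rate. A minimal sketch, using only the numbers stated in the abstract:

```python
# n50 = V50_flow / V_net  (air changes per hour at the 50 Pa test pressure)
v_net = 7383.0            # m^3, net volume following the Passive House convention (from the text)
n50 = 0.07                # 1/h, measured air-change rate at 50 Pa (from the text)
v50_flow = n50 * v_net    # m^3/h, leakage air flow at 50 Pa
print(f"Leakage flow at 50 Pa: {v50_flow:.0f} m3/h ({v50_flow / 3600.0:.3f} m3/s)")
# roughly 517 m3/h, i.e. about 0.14 m3/s for the whole building envelope
```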
Procedia PDF Downloads 148
2268 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction
Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini
Abstract:
Fetal cardiac monitoring by fetal electrocardiogram (fECG) can provide significant clinical information about the health condition of the fetus. Despite this potential, the use of fECG in clinical practice has so far been quite limited due to the difficulties in its measurement. The recovery of the fECG from signals acquired non-invasively by electrodes placed on the maternal abdomen is a challenging task, because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings, which exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and of finding the linear combinations of preprocessed abdominal signals that maximize this fQI (quality index optimization, QIO). It aims at improving the performance of the most commonly adopted methods for fECG extraction, usually based on maternal ECG (mECG) estimation and cancellation. The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; maternal ECG (mECG) extraction and maternal QRS detection; mECG component approximation and cancellation by weighted principal component analysis; fECG extraction by fQI maximization and fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing the mECG, estimated by principal component analysis (PCA), from the abdominal signals and applying independent component analysis (ICA) to the residual signals. Both methods were developed and tuned using 69 one-minute-long abdominal measurements with fetal QRS annotations from dataset A provided by the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and the ICA-based methods were compared by analyzing two databases of abdominal maternal ECG available on the PhysioNet site. The first is the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains fetal QRS annotations, thus allowing a quantitative performance comparison; the second is the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain fetal QRS annotations, so that the comparison between the two methods can only be qualitative. On the annotated database ADdb, the QIO method provided the performance indexes Sens=0.9988, PPA=0.9991, F1=0.9989, surpassing the ICA-based one, which provided Sens=0.9966, PPA=0.9972, F1=0.9969. The comparison on NIdb was performed by defining an index of quality for the fetal RR series. This index of quality was higher for the QIO-based method than for the ICA-based one in 35 records out of 55 cases of the NIdb. The QIO-based method gave very high performance with both databases. The results of this study foresee the application of the algorithm in a fully unsupervised way for implementation in wearable devices for self-monitoring of fetal health. Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable
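The central step, searching for the channel combination that maximizes a pseudo-periodicity quality index, can be sketched as follows. The actual fQI used by the authors is not specified in the abstract, so the autocorrelation-based index and the Nelder-Mead search below are stand-ins, and residual_channels is assumed to hold the abdominal leads after mECG cancellation.

```python
import numpy as np
from scipy.optimize import minimize

def quality_index(signal, fs):
    """Stand-in pseudo-periodicity index: the peak of the normalized
    autocorrelation within a plausible fetal RR range (0.3-0.6 s)."""
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0] + 1e-12
    lo, hi = int(0.3 * fs), int(0.6 * fs)
    return acf[lo:hi].max()

def extract_fecg(residual_channels, fs):
    """Find channel weights maximizing the quality index of the combined
    residual; residual_channels has shape (n_channels, n_samples)."""
    n_ch = residual_channels.shape[0]

    def neg_fqi(w):
        w = w / (np.linalg.norm(w) + 1e-12)   # keep the objective scale-invariant
        return -quality_index(w @ residual_channels, fs)

    res = minimize(neg_fqi, np.ones(n_ch) / n_ch, method="Nelder-Mead")
    best_w = res.x / np.linalg.norm(res.x)
    return best_w @ residual_channels, -res.fun
```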
Procedia PDF Downloads 281
2267 The Admitting Hemogram as a Predictor for Severity and in-Hospital Mortality in Acute Pancreatitis
Authors: Florge Francis A. Sy
Abstract:
Acute pancreatitis (AP) is an inflammatory condition of the pancreas with local and systemic complications. Severe acute pancreatitis (SAP) has a higher mortality rate. Laboratory parameters like the neutrophil-to-lymphocyte ratio (NLR), red cell distribution width (RDW), and mean platelet volume (MPV) have been associated with SAP but with conflicting results. This study aims to determine the predictive value of these parameters on the severity and in-hospital mortality of AP. This retrospective, cross-sectional study was done in a private hospital in Cebu City, Philippines. One-hundred five patients were classified according to severity based on the modified Marshall scoring. The admitting hemogram, including the NLR, RDW, and MPV, was obtained from the complete blood count (CBC). Cut-off values for severity and in-hospital mortality were derived from the ROC. Association between NLR, RDW, and MPV with SAP and mortality were determined with a p-value of < 0.05 considered significant. The mean age for AP was 47.6 years, with 50.5% being male. Most had an unknown cause (49.5%), followed by a biliary cause (37.1%). Of the 105 patients, 23 patients had SAP, and 4 died. Older age, longer in-hospital duration, congestive heart failure, elevated creatinine, urea nitrogen, and white blood cell count were seen in SAP. The NLR was associated with in-hospital mortality using a cut-off of > 10.6 (OR 1.133, 95% CI, p-value 0.003) with 100% sensitivity, 70.3% specificity, 11.76% PPV and 100% NPV (AUC 0.855). The NLR was not associated with SAP. The RDW and MPV were not associated with SAP and mortality. The admitting NLR is, therefore, an easily accessible parameter that can predict in-hospital mortality in acute pancreatitis. Although the present study did not show an association of NLR with SAP nor RDW and MPV with both SAP and mortality, further studies are suggested to establish their clinical value.Keywords: acute pancreatitis, mean platelet volume, neutrophil-lymphocyte ratio, red cell distribution width
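For readers unfamiliar with the reported indexes, the sketch below shows how sensitivity, specificity, PPV and NPV follow from dichotomizing the admitting NLR at the 10.6 cutoff; the patient vectors are placeholders, not the study data.

```python
import numpy as np

def diagnostic_performance(nlr, died, cutoff=10.6):
    """2x2-table indexes for predicting in-hospital mortality from NLR > cutoff."""
    pred = np.asarray(nlr) > cutoff
    truth = np.asarray(died, dtype=bool)
    tp = np.sum(pred & truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    tn = np.sum(~pred & ~truth)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn)}

# Hypothetical admitting NLR values and outcomes (1 = died in hospital)
nlr = [15.2, 8.4, 12.0, 5.1, 22.7, 9.9]
died = [1, 0, 0, 0, 1, 0]
print(diagnostic_performance(nlr, died))
```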
Procedia PDF Downloads 125
2266 The Application of Cellulose-Based Halloysite-Carbon Adsorbent to Remove Chloroxylenol from Water
Authors: Laura Frydel
Abstract:
Chloroxylenol is a common ingredient in disinfectants. Due to the use of this compound in large amounts, it is more and more often detected in rivers, sewage, and also in human body fluids. In recent years, there have been concerns about the potentially harmful effects of chloroxylenol on human health and the environment. This paper presents the synthesis, a brief characterization and the use of a halloysite-carbon adsorbent for the removal of chloroxylenol from water. The template in the halloysite-carbon adsorbent was acid treated bleached halloysite, and the carbon precursor was cellulose dissolved in zinc (II) chloride, which was dissolved in 37% hydrochloric acid. The FTIR spectra before and after the adsorption process allowed to determine the presence of functional groups, bonds in the halloysite-carbon composite, and the binding mechanism of the adsorbent and adsorbate. The morphology of the bleached halloysite sample and the sample of the halloysite-carbon adsorbent were characterized by scanning electron microscopy (SEM) with surface analysis by X-ray dispersion spectrometry (EDS). The specific surface area, total pore volume and mesopore and micropore volume were determined using the ASAP 2020 volumetric adsorption analyzer. Total carbon and total organic carbon were determined for the halloysite-carbon adsorbent. The halloysite-carbon adsorbent was used to remove chloroxylenol from water. The degree of removal of chloroxylenol from water using the halloysite-carbon adsorbent was about 90%. Adsorption studies show that the halloysite-carbon composite can be used as an effective adsorbent for removing chloroxylenol from water.Keywords: adsorption, cellulose, chloroxylenol, halloysite
Procedia PDF Downloads 191
2265 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field
Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar
Abstract:
A new force field is designed for propagation of the parametric contour into deep narrow cortical fold in the application of knowledge based reconstruction of cerebral cortex from MR image of brain. Designing of this force field is highly inspired by the Generalized Gradient Vector Flow (GGVF) model and markedly differs in manipulation of image information in order to determine the direction of propagation of the contour. While GGVF uses edge map as its main driving force, the newly designed force field uses the map of distance between zero valued pixels and their nearest non-zero valued pixel as its main driving force. Hence, it is called Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is forceful propagation of the contour beyond spurious convergence due to partial volume effect (PVE) in to narrow sulcal fold. Being function of the corresponding non-zero pixel value, the force field has got an inherent property to determine spuriousness of the edge automatically. It is effectively applied along with some morphological processing in the application of cortical reconstruction to breach the hindrance of PVE in narrow sulci where conventional GGVF fails.Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain
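Although the abstract gives no implementation detail beyond the definition, the driving quantity it describes (the distance between each zero-valued pixel and its nearest non-zero pixel, together with that pixel's value) can be computed with a standard Euclidean distance transform. The sketch below is one assumed way to build such a map; scipy.ndimage is used for convenience.

```python
import numpy as np
from scipy import ndimage

def znzd_map(image):
    """Zero-Non-Zero Distance map: for every zero-valued pixel, the Euclidean
    distance to the nearest non-zero pixel, plus that pixel's value, which the
    force field can use to judge the spuriousness of an edge."""
    zero_mask = (image == 0)
    # distance_transform_edt measures, inside the True region, the distance to
    # the nearest False element, i.e. to the nearest non-zero image pixel here.
    dist, inds = ndimage.distance_transform_edt(zero_mask, return_indices=True)
    nearest_nonzero_value = image[inds[0], inds[1]]
    return dist, nearest_nonzero_value

# Tiny illustration on a synthetic 2D slice
img = np.zeros((5, 5), dtype=float)
img[0, 0], img[4, 4] = 200.0, 50.0
distances, nearest_values = znzd_map(img)
print(distances.round(2))
print(nearest_values)
```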
Procedia PDF Downloads 398
2264 NENU2PHAR: PHA-Based Materials from Micro-Algae for High-Volume Consumer Products
Authors: Enrique Moliner, Alba Lafarga, Isaac Herraiz, Evelina Castellana, Mihaela Mirea
Abstract:
NENU2PHAR (GA 887474) is an EU-funded project aimed at the development of polyhydroxyalkanoates (PHAs) from micro-algae. These biobased and biodegradable polymers are being tested and validated in different high-volume market applications including food packaging, cosmetic packaging, 3D printing filaments, agro-textiles and medical devices, counting on the support of key players like Danone, BEL Group, Sofradim or IFG. At the moment the project has achieved to produce PHAs from micro-algae with a cumulated yield around 17%, i.e. 1 kg PHAs produced from 5.8 kg micro-algae biomass, which in turn capture 11 kg CO₂ for growing up. These algae-based plastics can therefore offer the same environmental benefits than current bio-based plastics (reduction of greenhouse gas emissions and fossil resource depletion), using a 3rd generation biomass feedstock that avoids the competition with food and the environmental impacts of agricultural practices. The project is also dealing with other sustainability aspects like the ecodesign and life cycle assessment of the plastic products targeted, considering not only the use of the biobased plastics but also many other ecodesign strategies. This paper will present the main progresses and results achieved to date in the project.Keywords: NENU2PHAR, Polyhydroxyalkanoates, micro-algae, biopolymer, ecodesign, life cycle assessment
Procedia PDF Downloads 91
2263 Investigation of Changes of Physical Properties of the Poplar Wood in Radial and Longitudinal Axis at Chaaloos Zone
Authors: Afshin Veisi
Abstract:
In this study, the physical properties of poplar wood (Populus sp.) were analyzed in the longitudinal and radial directions of the stem. Three Populus alba trees were cut in the Chaloos zone and, from each tree, three discs were selected at 130 cm height, at half of the tree height, and under the crown. Test samples from pith to bark (heartwood to sapwood) were prepared from these discs for measuring the properties involved, such as wet, dry and critical specific gravity, porosity, and volumetric shrinkage and swelling, based on the ASTM standard, and the data in the radial and longitudinal directions of the trunk were statistically analyzed. The variations of wet, dry and critical specific gravity in the radial direction were, respectively, an irregular increase, an increase and an increase, and in the longitudinal direction, respectively, an irregular decrease, an irregular increase and an increase. Moisture content and porosity showed, respectively, an irregular increase and a decrease in the radial direction, and an irregular decrease and stability in the longitudinal direction from bottom to top. Volumetric shrinkage and swelling showed an irregular decrease in the radial direction and a regular decrease in the longitudinal direction. Keywords: poplar wood, physical properties, shrinkage, swelling, critical specific gravity, wet specific gravity, dry specific gravity
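The measured quantities follow standard definitions (the study cites the ASTM standard); as a reminder of how they are computed, here is a minimal sketch with hypothetical sample masses and volumes, not the study's data.

```python
def wood_physical_properties(green_vol_cm3, ovendry_vol_cm3, green_mass_g, ovendry_mass_g):
    """Standard volumetric shrinkage/swelling, moisture content and dry specific gravity."""
    shrinkage = 100.0 * (green_vol_cm3 - ovendry_vol_cm3) / green_vol_cm3     # % of green volume
    swelling = 100.0 * (green_vol_cm3 - ovendry_vol_cm3) / ovendry_vol_cm3    # % of oven-dry volume
    moisture_content = 100.0 * (green_mass_g - ovendry_mass_g) / ovendry_mass_g
    dry_specific_gravity = ovendry_mass_g / ovendry_vol_cm3   # relative to water (1 g/cm^3)
    return shrinkage, swelling, moisture_content, dry_specific_gravity

# Hypothetical poplar sample
print(wood_physical_properties(green_vol_cm3=32.0, ovendry_vol_cm3=28.5,
                               green_mass_g=28.0, ovendry_mass_g=14.5))
```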
Procedia PDF Downloads 278
2262 Efficacy of Tranexamic Acid on Blood Loss after Primary Total Hip Replacement: A Case-Control Study in 154 Patients
Authors: Fedili Benamar, Belloulou Mohamed Lamine, Ouahes Hassane, Ghattas Samir
Abstract:
Introduction: Perioperative blood loss is a frequent cause of complications in total hip replacement (THR). The present prospective study assessed the efficacy of tranexamic acid (Exacyl(®)) in reducing blood loss in primary THR. Hypothesis: Tranexamic acid reduces blood loss in THR. Material and method: This is a prospective randomized study on the effectiveness of Exacyl (tranexamic acid) in total hip replacement surgery performed with a standardized technique between 2019 and September 2022. It involved 154 patients, of whom 84 received a single injection of Exacyl (group 1) at a dosage of 10 mg/kg over 20 minutes during the perioperative period. All patients received postoperative thromboprophylaxis with enoxaparin 0.4 ml subcutaneously. All patients were admitted to the post-interventional intensive care unit for 24 hours for monitoring and pain management as per the service protocol. Results: Of the 154 patients, 84 received a single injection of Exacyl (group 1) and 70 did not receive Exacyl perioperatively (group 2). The average age was 57 +/- 15 years. The distribution by gender was nearly equal, with 56% male and 44% female. The distribution according to the ASA score was as follows: 20.2% ASA1, 82.3% ASA2, and 17.5% ASA3. There was a significant difference in the average volume of intraoperative and postoperative bleeding during the 48 hours. The average bleeding volume for group 1 (Exacyl) was 614 +/- 228 ml, while the average bleeding volume for group 2 was 729 +/- 300 ml, with a chi-square test of 6.35 and a p-value < 0.01, which is highly significant. The ANOVA test showed an F-statistic of 7.11 and a p-value of 0.008. A Bartlett test revealed a chi-square of 6.35 and a p-value < 0.01. In group 1 (patients who received Exacyl), 73% had bleeding of less than 750 ml (group A) and 26% had bleeding exceeding 750 ml (group B). In group 2 (patients who did not receive Exacyl perioperatively), 52% had bleeding of less than 750 ml (group A) and 47% had bleeding exceeding 750 ml (group B). Thus, the use of Exacyl reduced perioperative bleeding and specifically decreased the risk of severe bleeding exceeding 750 ml by 43%, with a relative risk (RR) of 1.37 and a p-value < 0.01. The transfusion rate was 1.19% in group 1 (Exacyl), whereas it was 10% in group 2 (no Exacyl). It can be stated that the use of Exacyl resulted in a reduction in perioperative blood transfusion, with an RR of 0.1 and a p-value of 0.02. Conclusions: The use of Exacyl significantly reduced perioperative bleeding in this type of surgery. Keywords: tranexamic acid, blood loss, anesthesia, total hip replacement, surgery
Procedia PDF Downloads 77
2261 Conception of a Regulated, Dynamic and Intelligent Sewerage in Ostrevent
Authors: Rabaa Tlili Yaakoubi, Hind Nakouri, Olivier Blanpain
Abstract:
The current tools for real time management of sewer systems are based on two software tools: weather forecasting software and hydraulic simulation software. The use of the former is an important source of imprecision and uncertainty; the use of the latter imposes long decision time steps because of its computation time requirements. As a consequence, the obtained results generally differ from those expected. The major idea of the CARDIO project is to change the basic paradigm by approaching the problem from the 'automatic control' side rather than from the 'hydrology' side. The objective is to make it possible to run a large number of simulations in very short times (a few seconds), replacing weather forecasts by directly using the real-time measured pluviometric data. The aim is to reach a system where decisions are made from reliable data and where error correction is permanent. A first model of control laws was realized and tested with different return-period rainfalls. The gains obtained in rejected volume vary from 40 to 100%. A new algorithm was then developed to optimize calculation time and thus to overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The obtained gains are 60% in the total volume rejected to the natural environment and 80% in the number of discharges. Keywords: RTC, paradigm, optimization, automation
Procedia PDF Downloads 284
2260 Effect of Plastic Deformation on the Carbide-Free Bainite Transformation in Medium C-Si Steel
Authors: Mufath Zorgani, Carlos Garcia-Mateo, Mohammad Jahazi
Abstract:
In this study, the influence of pre-strained austenite on the extent of the isothermal bainite transformation in a medium-carbon, high-silicon steel was investigated. Different amounts of deformation were applied to the austenite at 600°C right before quenching to the region where the isothermal bainitic transformation is activated. Four temperatures of 325, 350, 375, and 400°C, with the same holding time of 1800 s at each temperature, were selected to investigate the extent of the isothermal bainitic transformation. The results showed that deformation-free austenite transforms to a higher volume fraction of carbide-free bainite (CFB) when the isothermal transformation temperature is reduced from 400 to 325°C, whereas the introduction of plastic deformation in the austenite prior to the formation of bainite invariably delays the transformation under the same or identical isothermal treatment. On the other hand, when the isothermal transformation temperature and the deformation increase, the volume fraction and the plate thickness of the bainite decrease and the amount of retained austenite increases. The retained austenite is mostly blocky in shape owing to the smaller amount of transformed bainite. Moreover, the plate-like bainite cannot be resolved when the amount of deformation reaches 30% and the isothermal transformation temperatures are 375 and 400°C. The amount of retained austenite and the percentage of its transformation to martensite during the final cooling stage play a significant role in the variation of the hardness level for the different thermomechanical regimes. Keywords: ausforming, carbide free bainite, dilatometry, microstructure
Procedia PDF Downloads 129
2259 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice and others. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. The computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams of University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method, combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM), by features such as pixel value, spatial distribution, shape and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied in all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation realized by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to quantify MS volume proved to be robust, fast, and efficient, when compared with manual segmentation. Furthermore, it avoids the intra and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases. Keywords: maxillary sinus, support vector machine, region growing, volume quantification
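The segmentation chain summarized above (SVM-detected seed pixels, region growing, morphological clean-up, slice-by-slice accumulation of the volume) can be illustrated with a simple intensity-based region-growing sketch. This is not the authors' Matlab® tool: the 4-connectivity, the intensity tolerance and the seed handling are assumptions made only for illustration.

```python
import numpy as np
from collections import deque
from scipy import ndimage

def region_grow(slice_hu, seeds, tolerance=150.0):
    """Grow a region from seed pixels (e.g. SVM detections): accept 4-connected
    neighbours whose intensity stays within `tolerance` of the seed mean."""
    grown = np.zeros(slice_hu.shape, dtype=bool)
    ref = np.mean([slice_hu[p] for p in seeds])
    queue = deque(seeds)
    for p in seeds:
        grown[p] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < slice_hu.shape[0] and 0 <= nx < slice_hu.shape[1]
                    and not grown[ny, nx]
                    and abs(slice_hu[ny, nx] - ref) <= tolerance):
                grown[ny, nx] = True
                queue.append((ny, nx))
    # Morphological opening as a stand-in for the false-positive clean-up step
    return ndimage.binary_opening(grown)

# The MS volume would then be the voxel count summed over all slices times the voxel volume.
```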
Procedia PDF Downloads 504
2258 Boussinesq Model for Dam-Break Flow Analysis
Authors: Najibullah M, Soumendra Nath Kuiry
Abstract:
Dams and reservoirs are valued for their estimable contributions to irrigation, water supply, flood control, electricity generation, etc., which enhance the prosperity and wealth of society across the world. Meanwhile, a dam breach could cause a devastating flood that threatens human lives and property. Fortunately, failures of large dams remain very rare events. Nevertheless, a number of occurrences have been recorded in the world, corresponding on average to one to two failures worldwide every year. Some of those accidents have had catastrophic consequences. So it is crucial to predict the dam-break flow for emergency planning and preparedness, as it poses a high risk to life and property. To mitigate the adverse impact of a dam break, modeling is necessary to gain a good understanding of the temporal and spatial evolution of dam-break floods. This study deals mainly with one-dimensional (1D) dam-break modeling. Less commonly used in the hydraulic research community, another possible option for modeling rapidly varied dam-break flows is the extended Boussinesq equations (BEs), which can describe the dynamics of short waves with reasonable accuracy. Unlike the shallow water equations (SWEs), the BEs take into account wave dispersion and the non-hydrostatic pressure distribution. To capture the dam-break oscillations accurately, a numerical scheme of at least fourth-order accuracy is needed to discretize the third-order dispersion terms present in the extended BEs. The scope of this work is therefore to develop a 1D Boussinesq model, fourth-order accurate in both space and time, for dam-break flow analysis using a combined finite-volume/finite-difference scheme. The spatial discretization of the flux and dispersion terms is achieved through a combination of finite-volume and finite-difference approximations: the flux term is solved using a finite-volume discretization, whereas the bed source and dispersion terms are discretized using a centered finite-difference scheme. Time integration is achieved in two stages, namely a third-order Adams-Bashforth predictor stage and a fourth-order Adams-Moulton corrector stage. The 1D Boussinesq model was implemented in Python 2.7.5. The performance of the developed model was evaluated by comparison with the volume-of-fluid (VOF) based commercial model ANSYS-CFX. The developed model is used to analyze the risk of cascading dam failures similar to the Panshet dam failure that took place in Pune, India, in 1961. Moreover, this model can predict wave overtopping more accurately than shallow water models for the design of coastal protection structures. Keywords: Boussinesq equation, coastal protection, dam-break flow, one-dimensional model
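The two-stage time integration named above (third-order Adams-Bashforth predictor followed by a fourth-order Adams-Moulton corrector) can be sketched generically as follows. The right-hand-side function rhs, standing for the discretized flux, source and dispersion operators, is assumed; this is not the authors' Python 2.7.5 code.

```python
import numpy as np

def ab3_am4_step(rhs, u, f_hist, dt):
    """One predictor-corrector step. f_hist holds the RHS evaluated at the three
    previous time levels [f_n, f_nm1, f_nm2]; the first steps of a run would be
    bootstrapped with a lower-order or Runge-Kutta scheme."""
    f_n, f_nm1, f_nm2 = f_hist
    # Predictor: 3rd-order Adams-Bashforth
    u_pred = u + dt / 12.0 * (23.0 * f_n - 16.0 * f_nm1 + 5.0 * f_nm2)
    # Corrector: 4th-order Adams-Moulton, using the RHS at the predicted state
    f_pred = rhs(u_pred)
    u_new = u + dt / 24.0 * (9.0 * f_pred + 19.0 * f_n - 5.0 * f_nm1 + f_nm2)
    return u_new, [rhs(u_new), f_n, f_nm1]

# Tiny check on du/dt = -u (exact solution exp(-t))
rhs = lambda u: -u
dt, u = 0.1, np.exp(-0.2)                      # start at t = 0.2 with exact history
f_hist = [-np.exp(-0.2), -np.exp(-0.1), -1.0]  # f at t = 0.2, 0.1, 0.0
u, f_hist = ab3_am4_step(rhs, u, f_hist, dt)
print(u, np.exp(-0.3))                         # should agree to ~1e-6
```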
Procedia PDF Downloads 232
2257 Backwash Optimization for Drinking Water Treatment Biological Filters
Authors: Sarra K. Ikhlef, Onita Basu
Abstract:
Natural organic matter (NOM) removal efficiency in drinking water treatment biological filters can be highly influenced by backwashing conditions. Backwashing removes the accumulated biomass and particles in order to regenerate the biological filters' removal capacity and prevent excessive headloss buildup. A lab-scale system consisting of 3 biological filters was used in this study to examine the implications of different backwash strategies on biological filtration performance. The backwash procedures were evaluated based on their impacts on dissolved organic carbon (DOC) removal, biological filter biomass, backwash water volume usage, and particle removal. Results showed that under nutrient-limited conditions, the simultaneous use of air and water under collapse-pulsing conditions led to a DOC removal of 22%, which was significantly higher (p>0.05) than the 12% removal observed under water-only backwash conditions. Employing a bed expansion of 20% under nutrient-supplemented conditions, compared with a 30% reference bed expansion using the same water volume, led to similar DOC removals. On the other hand, utilizing a higher bed expansion (40%) led to significantly lower DOC removals (23%). Also, a backwash strategy that reduced backwash water volume usage by about 20% resulted in DOC removals similar to those observed with the reference backwash. The backwash procedures investigated in this study showed no consistent impact on biological filter biomass concentrations as measured by the phospholipid and adenosine tri-phosphate (ATP) methods. Moreover, neither of these two analyses showed a direct correlation with DOC removal. On the other hand, dissolved oxygen (DO) uptake showed a direct correlation with DOC removal. The addition of the extended terminal subfluidization wash (ETSW) demonstrated no apparent impact on DOC removal. ETSW also successfully eliminated the filter ripening sequence (FRS). As a result, the additional water usage from implementing ETSW was compensated for by water savings after restart. Results from this study provide insight to researchers and water treatment utilities on how to better optimize the backwashing procedure and thereby the overall biological filtration process. Keywords: biological filtration, backwashing, collapse pulsing, ETSW
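For illustration, the two headline quantities of the evaluation — percentage DOC removal across a filter and the relative backwash water saving of a candidate strategy — reduce to simple ratios. The sketch below uses made-up influent/effluent concentrations and backwash volumes, not data from the study.

```python
# Hedged sketch of the reported quantities; all numbers are illustrative placeholders.
def doc_removal_percent(doc_influent_mg_l, doc_effluent_mg_l):
    # DOC removal efficiency across the biological filter
    return 100.0 * (doc_influent_mg_l - doc_effluent_mg_l) / doc_influent_mg_l

def water_savings_percent(reference_backwash_volume_l, candidate_backwash_volume_l):
    # relative backwash water saving versus the reference procedure
    return 100.0 * (1.0 - candidate_backwash_volume_l / reference_backwash_volume_l)

print(f"DOC removal: {doc_removal_percent(3.2, 2.5):.1f}%")        # ~22%
print(f"water saving: {water_savings_percent(120.0, 96.0):.1f}%")  # ~20% less backwash water
```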
Procedia PDF Downloads 274
2256 Using Geopolymer Technology on Stabilization and Reutilization the Expansion Behavior Slag
Authors: W. H. Lee, T. W. Cheng, K. Y. Lin, S. W. Huang, Y. C. Ding
Abstract:
Basic oxygen furnace (BOF) slag and electric arc furnace (EAF) slag are by-products of iron making and steel making, respectively. Each of these slags is produced at over 100 million tons annually in Taiwan. These slags have excellent engineering properties, such as high hardness and density, high compressive strength and a low abrasion ratio, and can replace natural aggregate in building materials. However, both BOF and EAF slag suffer from an expansion problem because they contain free lime. The purpose of this study was to stabilize BOF and EAF slag using geopolymer technology, in the hope of preventing and solving the expansion problem. The experimental results showed that geopolymer technology can successfully solve and prevent the expansion problem. The main properties of the stabilized slags are analyzed with regard to their use as building materials. An autoclave is used to study the volume stability of the specimens. The compressive strength of geopolymer mortar with BOF/EAF slag reaches over 21 MPa after curing for 28 days. After autoclave testing, the volume expansion does not exceed 0.2%, and the compressive strength can increase to over 35 MPa. These results were successfully applied at a ready-mixed concrete plant, with the same results obtained as at laboratory scale. This gives encouragement that stabilized and reutilized BOF/EAF slag could serve as a feasible replacement for natural fine aggregate when geopolymer technology is used. Keywords: BOF slag, EAF slag, autoclave test, geopolymer
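The autoclave acceptance check implied by the reported 0.2% figure amounts to a simple relative volume change. The sketch below uses hypothetical specimen volumes, not measurements from the study.

```python
# Hedged sketch of the volume-stability check; specimen volumes are hypothetical.
def volume_expansion_percent(volume_before_cm3, volume_after_cm3):
    # relative volume change of a specimen across the autoclave test
    return 100.0 * (volume_after_cm3 - volume_before_cm3) / volume_before_cm3

expansion = volume_expansion_percent(1000.0, 1001.5)   # hypothetical measurements
print(f"expansion = {expansion:.2f}% -> {'pass' if expansion <= 0.2 else 'fail'}")
```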
Procedia PDF Downloads 134
2255 Stability Analysis of Stagnation-Point Flow past a Shrinking Sheet in a Nanofluid
Authors: Amin Noor, Roslinda Nazar, Norihan Md. Arifin
Abstract:
In this paper, a numerical and theoretical study has been performed for the stagnation-point boundary layer flow and heat transfer towards a shrinking sheet in a nanofluid. A mathematical nanofluid model that takes into account the effect of the nanoparticle volume fraction is considered. The governing nonlinear partial differential equations are transformed into a system of nonlinear ordinary differential equations using a similarity transformation, and the resulting system is then solved numerically using the function bvp4c from Matlab. Numerical results are obtained for the skin friction coefficient, the local Nusselt number, and the velocity and temperature profiles for some values of the governing parameters, namely the nanoparticle volume fraction Φ, the shrinking parameter λ and the Prandtl number Pr. Three different types of nanoparticles are considered, namely Cu, Al2O3 and TiO2. It is found that solutions do not exist for larger shrinking rates, and that dual (upper and lower branch) solutions exist when λ < -1.0. A stability analysis has been performed to show which branch of solutions is stable and physically realizable. It is found that the upper branch solutions are stable while the lower branch solutions are unstable. Keywords: heat transfer, nanofluid, shrinking sheet, stability analysis, stagnation-point flow
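The solution step described above — reducing the PDEs to a similarity ODE system and solving it with a boundary value solver — can be sketched with SciPy's solve_bvp in place of Matlab's bvp4c. The sketch uses the classical momentum similarity equation for stagnation-point flow past a stretching/shrinking sheet, f''' + f f'' + 1 - (f')² = 0 with f(0) = 0, f'(0) = λ and f'(∞) = 1; the nanofluid model in the paper carries additional volume-fraction-dependent coefficients that are omitted here, and the value of λ is illustrative.

```python
# Hedged sketch: classical stagnation-point similarity equation solved with solve_bvp.
import numpy as np
from scipy.integrate import solve_bvp

lam = -0.5           # shrinking parameter (illustrative; dual solutions need lam < -1)
eta_inf = 10.0       # truncation of the semi-infinite similarity domain

def odes(eta, y):
    # y[0] = f, y[1] = f', y[2] = f''  ->  f''' = -(f f'' + 1 - f'^2)
    return np.vstack([y[1], y[2], -(y[0] * y[2] + 1.0 - y[1] ** 2)])

def bcs(y0, yinf):
    # f(0) = 0, f'(0) = lam, f'(eta_inf) = 1
    return np.array([y0[0], y0[1] - lam, yinf[1] - 1.0])

eta = np.linspace(0.0, eta_inf, 200)
guess = np.zeros((3, eta.size))
guess[0] = eta - (1.0 - lam) * (1.0 - np.exp(-eta))   # integral of the f' guess
guess[1] = 1.0 - (1.0 - lam) * np.exp(-eta)           # smooth guess meeting both BCs
guess[2] = (1.0 - lam) * np.exp(-eta)
sol = solve_bvp(odes, bcs, eta, guess)
print("converged:", sol.success, " f''(0) =", float(sol.sol(0.0)[2]))  # tied to skin friction
```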
Procedia PDF Downloads 382
2254 Ridership Study for the Proposed Installation of Automatic Guide-way Transit (AGT) System along Sapphire Street in Balanga City, Bataan
Authors: Nelson Andres, Meeko C. Masangcap, John Denver D. Catapang
Abstract:
Balanga City, as the heart of Bataan, is a growing city now developing at a fast pace. The growth of commerce in the city results in an increase in commuters who travel back and forth through the city, leading to congestion. Consequently, queuing of vehicles along national roads and even on the highways of the city has become a regular occurrence. This common scenario of commuters flocking to the city and of private and public vehicles moving bumper to bumper, especially during rush hours, greatly affects the flow of traffic and is now a burden not only to the commuters but also to the government, which is trying to address this dilemma. In view of these conditions, the implementation of an elevated automated guide-way transit (AGT) system is seen as a possible solution to help decongest the affected parts of Balanga City. In response to the problem, the researchers determine whether it is feasible to build an elevated guide-way transit in the vicinity of Sapphire Street in Balanga City, Bataan. Specifically, the study aims to determine who the riders will be based on their demographic profile, where trips will be generated and distributed, the times when the volume of passengers usually peaks, and the estimated volume of passengers. Statistical analysis is applied to the data gathered to find out whether there is a significant relationship between the demographic profile of the respondents and their preference for having an elevated railway transit in the City of Balanga. Keywords: ridership, AGT, railway, elevated track
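One conventional way to test the stated question — whether the respondents' demographic profile is related to their preference for the elevated AGT — is a chi-square test of independence on a contingency table of counts. The table in the sketch below is an illustrative placeholder, not survey data from the study.

```python
# Hedged sketch of a chi-square independence test; the counts are made up.
from scipy.stats import chi2_contingency

# rows: age groups (e.g. <25, 25-40, >40); columns: prefers the AGT? (yes, no)
observed = [[85, 35],
            [60, 40],
            [30, 50]]
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("demographic profile and AGT preference appear to be related")
else:
    print("no significant relationship detected at the 5% level")
```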
Procedia PDF Downloads 83
2253 Investigation of Bubble Growth During Nucleate Boiling Using CFD
Authors: K. Jagannath, Akhilesh Kotian, S. S. Sharma, Achutha Kini U., P. R. Prabhu
Abstract:
The boiling process is characterized by the rapid formation of vapour bubbles at the solid–liquid interface (nucleate boiling) from pre-existing vapour or gas pockets. Computational fluid dynamics (CFD) is an important tool for studying bubble dynamics. In the present study, a CFD simulation has been carried out to determine the bubble detachment diameter and the terminal velocity of the bubble. The volume of fluid method is used to model the bubble and the surrounding liquid by solving a single set of momentum equations and tracking the volume fraction of each fluid throughout the domain. In the simulation, the bubble is generated by allowing water vapour to enter a cylinder filled with liquid water through an inlet at the bottom. After the bubble is fully formed, it detaches from the surface and rises, accelerating due to the net balance between the buoyancy force and viscous drag. Finally, when these forces exactly balance each other, the bubble attains a constant terminal velocity. The bubble detachment diameter and the terminal velocity are captured by the monitor function provided in FLUENT. The detachment diameter and terminal velocity obtained are compared with established results based on the shape of the bubble, and good agreement is found between the simulation results and the established correlations. Keywords: bubble growth, computational fluid dynamics, detachment diameter, terminal velocity
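The terminal-velocity condition invoked above (buoyancy exactly balanced by drag) can be written as a simple fixed-point problem, since the drag coefficient depends on the rise velocity through the Reynolds number. The sketch below uses the Schiller-Naumann drag correlation and nominal water properties as assumptions; it is an analytical cross-check of the kind of value the VOF simulation reports, not the simulation itself.

```python
# Hedged sketch of the buoyancy-drag balance; properties and correlation are assumptions.
import math

rho_l, rho_v = 998.0, 0.6      # liquid / vapour density, kg/m^3 (assumed)
mu_l = 1.0e-3                  # liquid viscosity, Pa.s (assumed)
g = 9.81
d = 2.0e-3                     # assumed detachment diameter, m

def drag_coefficient(re):
    # Schiller-Naumann correlation for Re < 1000 (spherical bubble assumption)
    return 24.0 / re * (1.0 + 0.15 * re ** 0.687) if re < 1000.0 else 0.44

u = 0.1                        # initial guess, m/s
for _ in range(100):           # fixed-point iteration: Cd depends on Re(u)
    re = rho_l * u * d / mu_l
    cd = drag_coefficient(re)
    # buoyancy = drag  =>  (pi/6) d^3 (rho_l - rho_v) g = Cd (pi/8) rho_l d^2 u^2
    u_new = math.sqrt(4.0 * g * d * (rho_l - rho_v) / (3.0 * cd * rho_l))
    if abs(u_new - u) < 1e-8:
        break
    u = u_new
print(f"terminal velocity ~ {u:.3f} m/s for a {d*1e3:.1f} mm bubble")
```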
Procedia PDF Downloads 387
2252 Numerical Simulation of Three-Dimensional Cavitating Turbulent Flow in Francis Turbines with ANSYS
Authors: Raza Abdulla Saeed
Abstract:
In this study, the three-dimensional cavitating turbulent flow in a complete Francis turbine is simulated using a mixture model for cavity/liquid two-phase flows. The numerical analysis is carried out using ANSYS CFX software release 12, and the standard k-ε turbulence model is adopted. The computational fluid domain consists of the spiral casing, stay vanes, guide vanes, runner and draft tube, and is discretized with a three-dimensional unstructured tetrahedral mesh. The finite volume method (FVM) is used to solve the governing equations of the mixture model. Results for cavitation on the runner blades under three different boundary conditions are presented and discussed. From the numerical results it is found that the numerical method was successfully applied to simulate the cavitating two-phase turbulent flow through a Francis turbine, and that cavitation is clearly predicted in the form of water vapor formation inside the turbine. Comparing the numerical predictions with a real runner shows that the region of higher vapor volume fraction obtained by simulation is consistent with the region of cavitation damage on the runner. Keywords: computational fluid dynamics, hydraulic Francis turbine, numerical simulation, two-phase mixture cavitation model
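The basic criterion behind the cavitation prediction — vapour forms where the local static pressure falls below the vapour pressure, often summarized by a cavitation number — can be sketched as follows. The pressures and characteristic velocity are illustrative values, not settings of the reported ANSYS CFX runs.

```python
# Hedged sketch of the cavitation criterion; all values are illustrative assumptions.
rho = 998.0          # water density, kg/m^3
p_vap = 3_170.0      # water vapour pressure at ~25 C, Pa
p_ref = 101_325.0    # reference pressure, Pa
u_ref = 25.0         # characteristic runner velocity, m/s

def cavitation_number(p_ref, p_vap, rho, u_ref):
    # dimensionless margin against cavitation for the operating point
    return (p_ref - p_vap) / (0.5 * rho * u_ref ** 2)

def cavitates(p_local):
    # the mixture model converts liquid to vapour where local pressure drops below p_vap
    return p_local < p_vap

print("sigma =", round(cavitation_number(p_ref, p_vap, rho, u_ref), 3))
for p in (50_000.0, 5_000.0, 2_000.0):
    print(f"p = {p:>8.0f} Pa -> cavitation: {cavitates(p)}")
```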
Procedia PDF Downloads 562
2251 Finite Volume Method Simulations of GaN Growth Process in MOVPE Reactor
Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski
Abstract:
In the present study, numerical simulations of heat and mass transfer during the gallium nitride growth process in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Existing knowledge about the phenomena occurring in the MOVPE process makes it possible to produce high-quality nitride-based semiconductors. However, the process parameters of MOVPE reactors can vary within certain ranges. The main goal of this study is optimization of the process and improvement of the quality of the obtained crystal. In order to investigate this subject, a series of computer simulations has been performed. Numerical simulations of heat and mass transfer in the GaN epitaxial growth process have been carried out to determine the growth rate for various mass flow rates and pressures of the reagents. Since it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during the process, modeling is the only way to understand the process precisely. The main heat transfer mechanisms during the MOVPE process are convection and radiation. Correlation of the modeling results with experiment allows optimal process parameters to be determined for obtaining crystals of the highest quality. Keywords: finite volume method, semiconductors, epitaxial growth, metalorganic vapor phase epitaxy, gallium nitride
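As an order-of-magnitude companion to the growth-rate predictions, the sketch below estimates a GaN growth rate from the group-III (TMGa) molar flow in the mass-transport-limited regime. The flow rate, incorporation efficiency and deposition area are assumed values, not parameters of the AIX-200/4RF-S simulations.

```python
# Hedged, order-of-magnitude growth-rate estimate; inputs are assumed values.
M_GAN = 83.73        # molar mass of GaN, g/mol
RHO_GAN = 6.15       # density of GaN, g/cm^3

def growth_rate_um_per_h(tmga_flow_mol_min, incorporation_eff, deposition_area_cm2):
    # deposited volume per minute, spread over the deposition area
    volume_rate_cm3_min = incorporation_eff * tmga_flow_mol_min * M_GAN / RHO_GAN
    thickness_rate_cm_min = volume_rate_cm3_min / deposition_area_cm2
    return thickness_rate_cm_min * 1.0e4 * 60.0      # cm/min -> um/h

# e.g. 30 umol/min TMGa, 30% incorporated, 80 cm^2 susceptor area (all assumed)
print(f"{growth_rate_um_per_h(30e-6, 0.3, 80.0):.2f} um/h")
```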
Procedia PDF Downloads 400