Search results for: erosion and wear volume
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3611

2231 Investigation of Changes of Physical Properties of the Poplar Wood in Radial and Longitudinal Axis at Chaaloos Zone

Authors: Afshin Veisi

Abstract:

In this study, the physical properties of poplar wood (Populus sp.) were analyzed in the longitudinal and radial directions of the stem. Three Populus alba trees were cut in the Chaloos zone, and from each tree three discs were selected: at 130 cm, at half the tree height, and under the crown. From these discs, test samples were prepared from pith to bark (heartwood to sapwood) to measure the properties of interest, namely wet, dry, and critical specific gravity, porosity, volume shrinkage, and swelling, according to the ASTM standard; the data were statistically analyzed in the radial and longitudinal directions of the trunk. The wet, dry, and critical specific gravity varied in the radial direction as an irregular increase, an increase, and an increase, respectively, and in the longitudinal direction as an irregular decrease, an irregular increase, and an increase, respectively. Moisture content and porosity showed, in the radial direction, an irregular increase and a decrease, respectively, and in the longitudinal direction from bottom to top, an irregular decrease and stability. Volume shrinkage and swelling varied irregularly in the radial direction and decreased regularly along the longitudinal axis.
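The properties measured above follow standard wood-science definitions; a minimal sketch of the ASTM-style calculations (with illustrative sample values, not the study's measurements, and an assumed cell-wall density) is:

```python
# Hedged sketch of the property calculations named above (specific gravity,
# porosity, volumetric shrinkage); sample masses and volumes are
# illustrative, not the study's data.

def specific_gravity(oven_dry_mass_g, volume_cm3, water_density=1.0):
    """Oven-dry mass over the mass of an equal volume of water."""
    return oven_dry_mass_g / (volume_cm3 * water_density)

def volumetric_shrinkage(green_volume, oven_dry_volume):
    """Percent volume loss from green to oven-dry condition."""
    return (green_volume - oven_dry_volume) / green_volume * 100.0

def porosity(sg_dry, cell_wall_density=1.53):
    """Void fraction, assuming a wood cell-wall density of ~1.53 g/cm3."""
    return 1.0 - sg_dry / cell_wall_density

sg = specific_gravity(7.2, 18.0)
print(round(sg, 2), round(porosity(sg), 3))
print(round(volumetric_shrinkage(18.0, 16.2), 1))   # percent shrinkage
```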

Keywords: poplar wood, physical properties, shrinkage, swelling, critical specific gravity, wet specific gravity, dry specific gravity

Procedia PDF Downloads 277
2230 Efficacy of Tranexamic Acid on Blood Loss after Primary Total Hip Replacement: A Case-Control Study in 154 Patients

Authors: Fedili Benamar, Belloulou Mohamed Lamine, Ouahes Hassane, Ghattas Samir

Abstract:

Introduction: Perioperative blood loss is a frequent cause of complications in total hip replacement (THR). The present prospective study assessed the efficacy of tranexamic acid (Exacyl®) in reducing blood loss in primary THR. Hypothesis: Tranexamic acid reduces blood loss in THR. Material and methods: This prospective randomized study of the effectiveness of Exacyl (tranexamic acid) in total hip replacement surgery, performed with a standardized technique, ran between 2019 and September 2022. It involved 154 patients, of whom 84 received a single perioperative injection of Exacyl (group 1) at a dosage of 10 mg/kg over 20 minutes. All patients received postoperative thromboprophylaxis with enoxaparin 0.4 ml subcutaneously, and all were admitted to the post-interventional intensive care unit for 24 hours for monitoring and pain management as per the service protocol. Results: Of the 154 patients, 84 received a single injection of Exacyl (group 1) and 70 did not receive Exacyl perioperatively (group 2). The average age was 57 +/- 15 years. The distribution by gender was nearly equal, with 56% male and 44% female. The distribution according to the ASA score was 20.2% ASA1, 82.3% ASA2, and 17.5% ASA3. There was a significant difference in the average volume of intraoperative and postoperative bleeding during the first 48 hours: 614 +/- 228 ml in group 1 (Exacyl) versus 729 +/- 300 ml in group 2 (chi-square 6.35, p < 0.01, highly significant). The ANOVA test gave F = 7.11 (p = 0.008), and a Bartlett test gave a chi-square of 6.35 (p < 0.01). In group 1 (patients who received Exacyl), 73% had bleeding of less than 750 ml (group A) and 26% had bleeding exceeding 750 ml (group B). In group 2 (patients who did not receive Exacyl perioperatively), 52% had bleeding of less than 750 ml (group A) and 47% had bleeding exceeding 750 ml (group B). Thus, the use of Exacyl reduced perioperative bleeding and, in particular, decreased the risk of severe bleeding exceeding 750 ml by 43%, with a relative risk (RR) of 1.37 and p < 0.01. The transfusion rate was 1.19% in group 1 (Exacyl) versus 10% in group 2 (no Exacyl); the use of Exacyl therefore reduced perioperative blood transfusion, with an RR of 0.1 and p = 0.02. Conclusions: The use of Exacyl significantly reduced perioperative bleeding in this type of surgery.
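The relative-risk figures quoted above come from standard 2x2 contingency arithmetic; a hedged sketch with invented counts (not the study's exact table) is:

```python
# Hedged sketch of the relative-risk arithmetic reported above; the 2x2
# counts are invented for illustration and are not the study's exact data.

def relative_risk(events_a, total_a, events_b, total_b):
    """RR = event risk in group A divided by event risk in group B."""
    return (events_a / total_a) / (events_b / total_b)

# e.g. severe bleeding in 22 of 84 treated vs 33 of 70 untreated patients
rr = relative_risk(22, 84, 33, 70)
print(round(rr, 2))   # RR < 1 means lower risk in the treated group
```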

Keywords: tranexamic acid, blood loss, anesthesia, total hip replacement, surgery

Procedia PDF Downloads 77
2229 Conception of a Regulated, Dynamic and Intelligent Sewerage in Ostrevent

Authors: Rabaa Tlili Yaakoubi, Hind Nakouri, Olivier Blanpain

Abstract:

Current tools for the real-time management of sewer systems rely on two pieces of software: weather forecasting software and hydraulic simulation software. The former is an important source of imprecision and uncertainty, while the latter imposes large decision time steps because of its computation time, so that the results obtained generally differ from those expected. The central idea of the CARDIO project is to change the basic paradigm by approaching the problem from the control ("automatic") side rather than the hydrological side. The objective is to make it possible to run a large number of simulations in a very short time (a few seconds), replacing weather forecasts by directly using real-time measured rainfall data. The aim is to reach a system in which decisions are made from reliable data and errors are corrected continuously. A first set of control laws was designed and tested with rainfalls of different return periods; the gains in rejected volume ranged from 40 to 100%. A new algorithm was then developed to reduce computation time and thus overcome the combinatorial problem encountered in the first approach. Finally, this new algorithm was tested against a 16-year rainfall series; the gains were 60% in the total volume discharged to the natural environment and 80% in the number of discharges.

Keywords: RTC, paradigm, optimization, automation

Procedia PDF Downloads 284
2228 Effect of Plastic Deformation on the Carbide-Free Bainite Transformation in Medium C-Si Steel

Authors: Mufath Zorgani, Carlos Garcia-Mateo, Mohammad Jahazi

Abstract:

In this study, the influence of pre-strained austenite on the extent of the isothermal bainite transformation in a medium-carbon, high-silicon steel was investigated. Different amounts of deformation were applied at 600°C to the austenite immediately before quenching to the region where the isothermal bainitic transformation is active. Four temperatures (325, 350, 375, and 400°C), with the same holding time of 1800 s at each, were selected to investigate the extent of the isothermal bainitic transformation. The results showed that deformation-free austenite transforms to a higher volume fraction of carbide-free bainite (CFB) as the isothermal transformation temperature is reduced from 400 to 325°C, while introducing plastic deformation into the austenite prior to bainite formation invariably delays the same isothermal treatment. Moreover, as the isothermal transformation temperature and the amount of deformation increase, the volume fraction and plate thickness of bainite decrease and the amount of retained austenite increases. The retained austenite is mostly blocky-shaped owing to the smaller amount of transformed bainite. The plate-like bainite can no longer be resolved when the deformation reaches 30% at isothermal transformation temperatures of 375 and 400°C. The amount of retained austenite, and the fraction of it that transforms to martensite during the final cooling stage, play a significant role in the variation of the hardness level between the different thermomechanical regimes.
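Transformed volume fractions of the kind discussed above are typically extracted from dilatometry (listed in the keywords) via a lever rule between the dilatation baselines; a sketch with assumed strain values, not the paper's data, is:

```python
# Illustrative lever-rule estimate of the transformed volume fraction from
# a dilatometry signal; the strain values are assumptions for illustration.

def transformed_fraction(strain, strain_austenite, strain_full):
    """Linear lever rule between the untransformed-austenite baseline and
    the fully transformed baseline at the holding temperature."""
    return (strain - strain_austenite) / (strain_full - strain_austenite)

f_bainite = transformed_fraction(strain=0.0032,
                                 strain_austenite=0.0010,
                                 strain_full=0.0050)
print(round(f_bainite, 2))   # fraction of bainite formed so far
```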

Keywords: ausforming, carbide free bainite, dilatometry, microstructure

Procedia PDF Downloads 128
2227 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, aid thermoregulation, and impart resonance to the voice, among other roles; the real function of the MS thus remains uncertain. Furthermore, MS anatomy is complex and varies from person to person, and many diseases may affect the development of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, enabling quantitative analysis. However, this is not always possible in the clinical routine, and when it is, it involves much effort and/or time. Therefore, a convenient, robust, and practical tool correlated with the MS volume is needed to allow clinical applicability. The methods currently available for MS segmentation are manual or semi-automatic, and manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) based on features such as pixel value, spatial distribution, and shape.
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, yielding the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist, using Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between variables, the Bland-Altman analyses showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify the MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice; thus, it may be useful in the diagnosis and treatment determination of MS diseases.
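The seed-based region-growing step described above can be sketched on a toy 2D slice; in this hedged sketch the SVM detection stage is replaced by a hand-picked seed, the intensity tolerance is illustrative, and the morphological cleanup would follow as a separate pass:

```python
# Minimal sketch of seed-based region growing on a toy 2D slice; the seed
# and tolerance are illustrative, not the paper's parameters.
from collections import deque

def region_grow(img, seed, tol=0.1):
    """4-connected region growing: collect pixels within tol of the seed value."""
    rows, cols = len(img), len(img[0])
    seed_val = img[seed[0]][seed[1]]
    grown, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in grown
                    and abs(img[nr][nc] - seed_val) <= tol):
                grown.add((nr, nc))
                frontier.append((nr, nc))
    return grown

slice_img = [[0.0] * 8 for _ in range(8)]
for r in range(2, 6):
    for c in range(2, 6):
        slice_img[r][c] = 1.0        # bright "sinus" region
slice_img[0][7] = 1.0                 # disconnected false positive

region = region_grow(slice_img, seed=(3, 3))
print(len(region))                    # 16: the disconnected speck is excluded
```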

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 504
2226 Boussinesq Model for Dam-Break Flow Analysis

Authors: Najibullah M, Soumendra Nath Kuiry

Abstract:

Dams and reservoirs are valued for their contributions to irrigation, water supply, flood control, electricity generation, etc., which raise the prosperity and wealth of societies across the world. At the same time, a dam breach can cause a devastating flood that threatens human lives and property. Failures of large dams remain, fortunately, very rare events; nevertheless, a number have been recorded worldwide, corresponding on average to one or two failures every year, some with catastrophic consequences. It is therefore essential to predict dam-break flow for emergency planning and preparedness, as it poses a high risk to life and property. To mitigate the adverse impact of a dam break, modeling is necessary to gain a good understanding of the temporal and spatial evolution of dam-break floods. This study deals mainly with one-dimensional (1D) dam-break modeling. A less common option in the hydraulic research community for modeling rapidly varied dam-break flows is the extended Boussinesq equations (BEs), which describe the dynamics of short waves with reasonable accuracy. Unlike the shallow water equations (SWEs), the BEs take into account wave dispersion and the non-hydrostatic pressure distribution. To capture the dam-break oscillations accurately, a numerical scheme of at least fourth-order accuracy is needed to discretize the third-order dispersion terms present in the extended BEs. The scope of this work is therefore to develop a 1D Boussinesq model, fourth-order accurate in both space and time, for dam-break flow analysis using a combined finite-volume / finite-difference scheme.
The flux term is solved using a finite-volume discretization, whereas the bed source and dispersion terms are discretized using a centered finite-difference scheme. Time integration is achieved in two stages, namely a third-order Adams-Bashforth predictor stage and a fourth-order Adams-Moulton corrector stage. The 1D Boussinesq model is implemented in Python 2.7.5. The performance of the developed model is evaluated by comparing its predictions with the volume of fluid (VOF) based commercial model ANSYS-CFX. The developed model is used to analyze the risk of cascading dam failures similar to the 1961 Panshet dam failure in Pune, India. Moreover, this model can predict wave overtopping more accurately than shallow water models when designing coastal protection structures.
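The two-stage time integration named above can be sketched on a scalar test problem; this is a hedged illustration of the Adams-Bashforth/Adams-Moulton predictor-corrector idea, not the authors' Boussinesq implementation:

```python
# Sketch of a third-order Adams-Bashforth predictor followed by a
# fourth-order Adams-Moulton corrector, demonstrated on the scalar test
# ODE y' = -y rather than the Boussinesq system itself.
import math

def ab3_am4_step(f, t, y_hist, f_hist, dt):
    """Advance one step; y_hist/f_hist hold at least the last 3 levels."""
    y_n = y_hist[-1]
    # Adams-Bashforth 3 predictor
    y_p = y_n + dt / 12.0 * (23 * f_hist[-1] - 16 * f_hist[-2] + 5 * f_hist[-3])
    # Adams-Moulton 4 corrector, evaluated with the predicted slope
    f_p = f(t + dt, y_p)
    return y_n + dt / 24.0 * (9 * f_p + 19 * f_hist[-1]
                              - 5 * f_hist[-2] + f_hist[-3])

f = lambda t, y: -y
dt = 0.1
ys = [math.exp(-i * dt) for i in range(3)]    # exact start-up values
fs = [f(i * dt, y) for i, y in enumerate(ys)]
for i in range(2, 30):                         # march from t = 0.2 to t = 3.0
    y_new = ab3_am4_step(f, i * dt, ys, fs, dt)
    ys.append(y_new)
    fs.append(f((i + 1) * dt, y_new))
print(abs(ys[-1] - math.exp(-3.0)))            # global error at t = 3
```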

Keywords: Boussinesq equation, coastal protection, dam-break flow, one-dimensional model

Procedia PDF Downloads 232
2225 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, the coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellite and Unmanned Aerial Vehicles - UAVs) and in situ data (from field surveys). It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. This service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, integrating data from the Copernicus satellite and drone/unmanned aerial vehicles, validated by existing online in-situ data. Since WORSICA is operational using the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. In addition, the private sector will be able to use the service, but some usage costs may be applied, depending on the type of computational resources needed by each application/user. 
The service has three main sub-services: i) coastline detection; ii) inland water detection; and iii) water leak detection in irrigation networks. In the present study, an application of the service to the Óbidos lagoon in Portugal is shown, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas at no additional cost. The service implements several distinct methodologies based on water indexes (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with tidal data obtained from the FES model, the system can estimate a coastline at the corresponding level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas: i) emergencies, by providing fast access to inundated areas to support rescue operations; ii) support of management decisions on the operation of hydraulic infrastructures, to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leaks in difficult-to-access irrigation networks, promoting their fast repair.
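The water indexes mentioned above are simple band ratios; a hedged sketch of two of them (with illustrative reflectance values, not Sentinel data) is:

```python
# Hedged sketch of the water-index computations (NDWI, MNDWI) the service
# builds on; the band reflectances below are illustrative, not real data.
import numpy as np

def ndwi(green, nir):
    """McFeeters NDWI: (green - NIR) / (green + NIR); water tends to be > 0."""
    return (green - nir) / (green + nir)

def mndwi(green, swir):
    """Modified NDWI, using a short-wave infrared band instead of NIR."""
    return (green - swir) / (green + swir)

green = np.array([0.10, 0.08])     # [water pixel, land pixel]
nir = np.array([0.02, 0.30])
water_mask = ndwi(green, nir) > 0.0
print(water_mask.tolist())          # [True, False]
```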

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 126
2224 Backwash Optimization for Drinking Water Treatment Biological Filters

Authors: Sarra K. Ikhlef, Onita Basu

Abstract:

Natural organic matter (NOM) removal efficiency in drinking water treatment biological filters can be strongly influenced by backwashing conditions. Backwashing removes accumulated biomass and particles in order to regenerate the filters' removal capacity and prevent excessive headloss buildup. In this study, a lab-scale system consisting of 3 biological filters was used to examine the implications of different backwash strategies for biological filtration performance. The backwash procedures were evaluated based on their impacts on dissolved organic carbon (DOC) removal, filter biomass, backwash water volume usage, and particle removal. Results showed that under nutrient-limited conditions, the simultaneous use of air and water under collapse-pulsing conditions led to a DOC removal of 22%, significantly higher (p < 0.05) than the 12% removal observed under water-only backwash conditions. Under nutrient-supplemented conditions, a bed expansion of 20%, using the same water volume as a 30% reference bed expansion, led to similar DOC removals, whereas a higher bed expansion (40%) led to significantly lower DOC removals (23%). Also, a backwash strategy that reduced backwash water usage by about 20% resulted in DOC removals similar to those of the reference backwash. The backwash procedures investigated showed no consistent impact on biomass concentrations as measured by the phospholipid and adenosine triphosphate (ATP) methods, and neither analysis correlated directly with DOC removal; dissolved oxygen (DO) uptake, in contrast, did correlate directly with DOC removal. The addition of the extended terminal subfluidization wash (ETSW) had no apparent impact on DOC removals, and it successfully eliminated the filter ripening sequence (FRS).
As a result, the additional water used by ETSW was compensated by water savings after restart. Results from this study provide insight to researchers and water treatment utilities on how to better optimize the backwashing procedure, and thereby the overall biological filtration process.
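The DOC removal percentages compared above reduce to a simple influent/effluent metric; a sketch with illustrative concentrations (not the study's measurements) is:

```python
# Simple sketch of the DOC removal metric used to compare backwash
# strategies; the influent/effluent concentrations (mg/L) are illustrative.

def doc_removal_percent(doc_in, doc_out):
    """Percent dissolved organic carbon removed across the biofilter."""
    return (doc_in - doc_out) / doc_in * 100.0

print(round(doc_removal_percent(3.2, 2.5), 1))   # ~22% removal
```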

Keywords: biological filtration, backwashing, collapse pulsing, ETSW

Procedia PDF Downloads 273
2223 Using Geopolymer Technology for the Stabilization and Reutilization of Slag with Expansion Behavior

Authors: W. H. Lee, T. W. Cheng, K. Y. Lin, S. W. Huang, Y. C. Ding

Abstract:

Basic oxygen furnace (BOF) slag and electric arc furnace (EAF) slag are by-products of iron making and steel making, each produced at over 100 million tons annually in Taiwan. Both types of slag have good engineering properties, such as high hardness and density, high compressive strength, and a low abrasion ratio, and can replace natural aggregate in building materials. However, both BOF and EAF slag suffer from an expansion problem because they contain free lime. The purpose of this study was to stabilize BOF and EAF slag using geopolymer technology, in the hope of preventing and solving the expansion problem. The experimental results showed that geopolymer technology can successfully solve and prevent the expansion problem. The main properties of the resulting materials were analyzed with regard to their use as building materials, and an autoclave was used to study the volume stability of the specimens. The compressive strength of geopolymer mortar with BOF/EAF slag reached over 21 MPa after curing for 28 days. After autoclave testing, the volume expansion did not exceed 0.2%, and the compressive strength even grew to over 35 MPa. These results were successfully reproduced at a ready-mixed concrete plant, with the same outcomes as at laboratory scale. They give encouragement that stabilized and reutilized BOF/EAF slag could serve as a feasible replacement for natural fine aggregate through geopolymer technology.

Keywords: BOF slag, EAF slag, autoclave test, geopolymer

Procedia PDF Downloads 133
2222 Stability Analysis of Stagnation-Point Flow past a Shrinking Sheet in a Nanofluid

Authors: Amin Noor, Roslinda Nazar, Norihan Md. Arifin

Abstract:

In this paper, a numerical and theoretical study has been performed for the stagnation-point boundary layer flow and heat transfer towards a shrinking sheet in a nanofluid. The mathematical nanofluid model in which the effect of the nanoparticle volume fraction is taken into account is considered. The governing nonlinear partial differential equations are transformed into a system of nonlinear ordinary differential equations using a similarity transformation which is then solved numerically using the function bvp4c from Matlab. Numerical results are obtained for the skin friction coefficient, the local Nusselt number as well as the velocity and temperature profiles for some values of the governing parameters, namely the nanoparticle volume fraction Φ, the shrinking parameter λ and the Prandtl number Pr. Three different types of nanoparticles are considered, namely Cu, Al2O3 and TiO2. It is found that solutions do not exist for larger shrinking rates and dual (upper and lower branch) solutions exist when λ < -1.0. A stability analysis has been performed to show which branch solutions are stable and physically realizable. It is also found that the upper branch solutions are stable while the lower branch solutions are unstable.
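The bvp4c-based similarity solve described above can be sketched with SciPy's `solve_bvp` on a simplified constant-property stagnation-point equation; this is a hedged illustration that omits the nanoparticle volume-fraction terms of the full model, and the value of the shrinking/stretching parameter is chosen for illustration:

```python
# Hedged sketch of a similarity-equation boundary-value solve, here for
# f''' + f f'' + 1 - f'^2 = 0 with f(0) = 0, f'(0) = lambda, f'(inf) = 1;
# the nanofluid volume-fraction terms are omitted for simplicity.
import numpy as np
from scipy.integrate import solve_bvp

lam = 0.5            # shrinking/stretching parameter (illustrative value)
eta_inf = 10.0       # numerical stand-in for infinity

def rhs(eta, y):
    f, fp, fpp = y
    return np.vstack([fp, fpp, -f * fpp - 1.0 + fp ** 2])

def bc(y0, yinf):
    return np.array([y0[0], y0[1] - lam, yinf[1] - 1.0])

eta = np.linspace(0.0, eta_inf, 50)
guess = np.zeros((3, eta.size))
guess[0] = eta                        # free-stream-like initial profile
guess[1] = 1.0
sol = solve_bvp(rhs, bc, eta, guess)
print(sol.status, round(float(sol.sol(0.0)[2]), 3))   # f''(0), a skin-friction proxy
```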

Keywords: heat transfer, nanofluid, shrinking sheet, stability analysis, stagnation-point flow

Procedia PDF Downloads 382
2221 Ridership Study for the Proposed Installation of Automatic Guide-way Transit (AGT) System along Sapphire Street in Balanga City, Bataan

Authors: Nelson Andres, Meeko C. Masangcap, John Denver D. Catapang

Abstract:

Balanga City, the heart of Bataan, is a growing city now developing at a fast pace. The growth of commerce in the city results in an increase in commuters who travel back and forth through the city, leading to congestion. Consequently, queuing of vehicles along national roads and even on the highways of the city has become a regular occurrence. This common scenario of commuters flocking to the city and of private and public vehicles going bumper to bumper, especially during rush hours, greatly affects traffic flow and is now a burden not only to the commuters but also to the government, which is trying to address this dilemma. Given these conditions, the implementation of an elevated automated guideway transit (AGT) system is seen as a possible solution to help decongest the affected parts of Balanga City. In response to the problem, the researchers assess whether an elevated guideway transit in the vicinity of Sapphire Street in Balanga City, Bataan is feasible. Specifically, the study aims to determine who the riders will be, based on their demographic profile; where trips are generated and distributed; the times when passenger volume peaks; and the estimated volume of passengers. Statistical analysis is applied to the gathered data to find out whether there is a significant relationship between the demographic profile of the respondents and their preference for having an elevated railway transit in the City of Balanga.
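The relationship test described above is typically a chi-square test of independence on a contingency table; a hedged sketch with invented counts (not survey data from Balanga City) is:

```python
# Hedged sketch of a chi-square test of independence of the kind applied
# in the study; the contingency-table counts are invented for illustration.
from scipy.stats import chi2_contingency

# rows: daily commuter vs occasional traveler; columns: prefers AGT vs not
table = [[45, 15],
         [30, 30]]
chi2, p, dof, expected = chi2_contingency(table)
print(dof, round(chi2, 2), p < 0.05)
```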

Keywords: ridership, AGT, railway, elevated track

Procedia PDF Downloads 81
2220 Investigation of Bubble Growth During Nucleate Boiling Using CFD

Authors: K. Jagannath, Akhilesh Kotian, S. S. Sharma, Achutha Kini U., P. R. Prabhu

Abstract:

The boiling process is characterized by the rapid formation of vapour bubbles at the solid-liquid interface (nucleate boiling) from pre-existing vapour or gas pockets. Computational fluid dynamics (CFD) is an important tool for studying bubble dynamics. In the present study, a CFD simulation has been carried out to determine the bubble detachment diameter and its terminal velocity. The volume of fluid method is used to model the bubble and its surroundings by solving a single set of momentum equations and tracking the volume fraction of each fluid throughout the domain. In the simulation, a bubble is generated by allowing water vapour to enter a cylinder filled with liquid water through an inlet at the bottom. Once fully formed, the bubble detaches from the surface and rises, accelerating under the net balance between the buoyancy force and viscous drag; when these forces exactly balance each other, it attains a constant terminal velocity. The bubble detachment diameter and the terminal velocity are captured by the monitor function provided in FLUENT, and are compared with established results based on the shape of the bubble, with good agreement between the simulation and the established correlations.
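The force balance described above can be checked with a back-of-the-envelope estimate; in this hedged sketch the drag coefficient is an assumed constant, whereas in reality it depends on the bubble Reynolds number and shape:

```python
# At terminal velocity, buoyancy equals drag; for a spherical bubble this
# gives u = sqrt(4 g d (rho_l - rho_v) / (3 Cd rho_l)). Cd is an assumed
# constant here, and the fluid properties are illustrative values.
import math

def terminal_velocity(d, rho_l=998.0, rho_v=0.6, g=9.81, cd=0.95):
    """Terminal rise velocity (m/s) of a spherical bubble of diameter d (m)."""
    return math.sqrt(4.0 * g * d * (rho_l - rho_v) / (3.0 * cd * rho_l))

u = terminal_velocity(d=0.005)   # 5 mm bubble in water
print(round(u, 3))               # m/s
```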

Keywords: bubble growth, computational fluid dynamics, detachment diameter, terminal velocity

Procedia PDF Downloads 385
2219 Numerical Simulation of Three-Dimensional Cavitating Turbulent Flow in Francis Turbines with ANSYS

Authors: Raza Abdulla Saeed

Abstract:

In this study, the three-dimensional cavitating turbulent flow in a complete Francis turbine is simulated using a mixture model for cavity/liquid two-phase flows. The numerical analysis is carried out using ANSYS CFX release 12, and the standard k-ε turbulence model is adopted. The computational fluid domain consists of the spiral casing, stay vanes, guide vanes, runner, and draft tube, and is discretized with a three-dimensional unstructured tetrahedral mesh. The finite volume method (FVM) is used to solve the governing equations of the mixture model. Results for cavitation on the runner's blades under three different boundary conditions are presented and discussed. The numerical results show that the method successfully simulates the cavitating two-phase turbulent flow through a Francis turbine, with cavitation clearly predicted in the form of water vapor formation inside the turbine. Comparing the numerical predictions with a real runner shows that the region of higher vapor volume fraction obtained by simulation is consistent with the region of cavitation damage on the runner.
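The mixture-model closure referred to above treats the two-phase flow as a single fluid whose properties are volume-fraction-weighted averages; a minimal sketch (with illustrative densities for water vapor and liquid water) is:

```python
# Minimal sketch of the homogeneous mixture-model closure for cavity/liquid
# two-phase flow: mixture properties are volume-fraction weighted averages
# of the phase properties. The densities below are illustrative values.

def mixture_density(alpha_v, rho_v=0.017, rho_l=998.0):
    """alpha_v is the vapor volume fraction in a cell (0 = pure liquid)."""
    return alpha_v * rho_v + (1.0 - alpha_v) * rho_l

print(round(mixture_density(0.3), 2))   # strongly reduced density in a cavitating cell
```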

Keywords: computational fluid dynamics, hydraulic Francis turbine, numerical simulation, two-phase mixture cavitation model

Procedia PDF Downloads 560
2218 The Interactive Wearable Toy "+Me" for the Therapy of Children with Autism Spectrum Disorders: Preliminary Results

Authors: Beste Ozcan, Valerio Sperati, Laura Romano, Tania Moretta, Simone Scaffaro, Noemi Faedda, Federica Giovannone, Carla Sogos, Vincenzo Guidetti, Gianluca Baldassarre

Abstract:

+me is an experimental interactive toy with the appearance of a soft, pillow-like, panda. Shape and consistency are designed to arise emotional attachment in young children: a child can wear it around his/her neck and treat it as a companion (i.e. a transitional object). When caressed on paws or head, the panda emits appealing, interesting outputs like colored lights or amusing sounds, thanks to embedded electronics. Such sensory patterns can be modified through a wirelessly connected tablet: by this, an adult caregiver can adapt +me responses to a child's reactions or requests, for example, changing the light hue or the type of sound. The toy control is therefore shared, as it depends on both the child (who handles the panda) and the adult (who manages the tablet and mediates the sensory input-output contingencies). These features make +me a potential tool for therapy with children with Neurodevelopmental Disorders (ND), characterized by impairments in the social area, like Autism Spectrum Disorders (ASD) and Language Disorders (LD): as a proposal, the toy could be used together with a therapist, in rehabilitative play activities aimed at encouraging simple social interactions and reinforcing basic relational and communication skills. +me was tested in two pilot experiments, the first one involving 15 Typically Developed (TD) children aged in 8-34 months, the second one involving 7 children with ASD, and 7 with LD, aged in 30-48 months. In both studies a researcher/caregiver, during a one-to-one, ten-minute activity plays with the panda and encourages the child to do the same. The purpose of both studies was to ascertain the general acceptability of the device as an interesting toy that is an object able to capture the child's attention and to maintain a high motivation to interact with it and with the adult. Behavioral indexes for estimating the interplay between the child, +me and caregiver were rated from the video recording of the experimental sessions. 
Preliminary results show how, on average, participants from the three groups exhibit good engagement: they touch, caress, and explore the panda and show enjoyment when they manage to trigger its luminous and sound responses. During the experiments, children tend to imitate the caregiver's actions on +me, often looking (and smiling) at him/her. Interesting behavioral differences between the TD, ASD, and LD groups are scored: for example, ASD participants produce fewer smiles, both at the panda and at the caregiver, than the TD group, while LD scores stand between those of the ASD and TD subjects. These preliminary observations suggest that the interactive toy +me is able to raise and maintain the interest of toddlers and can therefore reasonably be used as a supporting tool during therapy, to stimulate pivotal social skills such as imitation, turn-taking, eye contact, and social smiles. Interestingly, the young age of participants, along with the behavioral differences between groups, seems to suggest a further potential use of the device: a tool for early differential diagnosis (the average age of a child

Keywords: autism spectrum disorders, interactive toy, social interaction, therapy, transitional wearable companion

Procedia PDF Downloads 123
2217 Finite Volume Method Simulations of GaN Growth Process in MOVPE Reactor

Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski

Abstract:

In the present study, numerical simulations of heat and mass transfer during the gallium nitride growth process in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Existing knowledge about the phenomena occurring in the MOVPE process makes it possible to produce high-quality nitride-based semiconductors. However, the process parameters of MOVPE reactors can vary within certain ranges. The main goal of this study is optimization of the process and improvement of the quality of the obtained crystal. To investigate this subject, a series of computer simulations has been performed. Numerical simulations of heat and mass transfer in the GaN epitaxial growth process have been performed to determine the growth rate for various mass flow rates and pressures of reagents. Since it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during the process, modeling is the only way to understand the process precisely. The main heat transfer mechanisms during the MOVPE process are convection and radiation. Correlation of the modeling results with experiment allows optimal process parameters to be determined for obtaining crystals of the highest quality.
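As a minimal illustration of the finite volume method named in the title, the sketch below solves 1D transient heat conduction with an explicit finite-volume update on a uniform grid. It is purely illustrative: the actual reactor model is three-dimensional and includes convection and radiation, and all material values and boundary temperatures here are hypothetical.

```python
import numpy as np

# 1D transient heat conduction with an explicit finite-volume scheme.
# Illustrative only: the real MOVPE reactor model is 3D and includes
# convection and radiation; all values below are hypothetical.
n = 50                        # number of control volumes
L = 0.1                       # domain length [m]
dx = L / n
alpha = 1e-5                  # thermal diffusivity [m^2/s]
dt = 0.3 * dx**2 / alpha      # time step within the explicit stability limit
T = np.full(n, 300.0)         # initial temperature field [K]
T_left, T_right = 1300.0, 300.0   # fixed wall temperatures [K]

for _ in range(20000):
    # flux balance over each control volume; ghost values enforce the
    # Dirichlet boundary conditions at the cell faces
    Tp = np.concatenate(([2 * T_left - T[0]], T, [2 * T_right - T[-1]]))
    T += alpha * dt / dx**2 * (Tp[2:] - 2 * Tp[1:-1] + Tp[:-2])

# at steady state the profile is linear between the two wall temperatures
print(round(0.5 * (T[n // 2 - 1] + T[n // 2]), 1))  # approaches 800.0, the wall mean
```

At steady state the finite-volume solution recovers the linear conduction profile between the two fixed boundary temperatures, a standard sanity check for diffusion solvers.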

Keywords: Finite Volume Method, semiconductors, epitaxial growth, metalorganic vapor phase epitaxy, gallium nitride

Procedia PDF Downloads 398
2216 Formal History Teaching and Lifeworld Literacies: Developing Transversal Skills as Embodied Learning Outcomes in Historical Research Projects

Authors: Paul Flynn, Luke O’Donnell

Abstract:

There is a pressing societal need for educators in formal and non-formal settings to develop pedagogical frameworks, programmes, and interventions that support the development of transversal skills for life beyond the classroom. These skills include communication, collaboration, interpersonal relationship building, problem-solving, and planning and organizational skills: lifeworld literacies encountered first hand. This is particularly true for young people aged between 15 and 18. This demographic represents both the future of society and those best positioned to take advantage of well-designed, structured educational supports within and across formal and non-formal settings. Secondary school history has been identified as an appropriate area of study which deftly develops many of the transversal skills so crucial to positive societal engagement. However, in the formal context, students often challenge history's relevance to their own lived experience and dismiss it as a study option. In response to such challenges, teachers will often design stimulating lessons which are generally well-received. That said, some students continue to question modern-day connections, presenting a persistent and pervasive classroom distraction. The continuing decline in the numbers opting to study second-level history indicates an erosion of what should be a critical opportunity to develop all-important lifeworld literacies within formal education. In contrast, students readily acknowledge relevance in non-formal settings, where many participants meaningfully engage with history by way of student-focused activities. Furthermore, many do so without predesigned pedagogical aids that support transversal skills development as embodied learning outcomes. As this paper will present, there is a dearth of work pertaining to the curricular subject of history and its embodied learning outcomes, including lifeworld literacies, in formal and non-formal settings. 
While it is frequently challenging to reconcile formal (often defined by strict curricula and examination processes) and non-formal engagement with history, opportunities do exist. In the Irish context, this is exemplified by a popular university outreach programme: Breaking the SEAL. This programme supports second-level history students as they fulfill curriculum requirements in completing a research study report (RSR). The RSR is a student-led research project drawing on communication skills, collaboration with peers and teachers, interpersonal relationships, problem-solving, and planning and organizational skills. Completion of this process has been widely recognized as excellent preparation not only for higher education (third level) but for work-life demands as well. Within a formal education setting, the RSR harnesses non-formal learning virtues and exposes students to limited aspects of independent learning that relate to a professional work setting: a lifeworld literacy. Breaking the SEAL provides opportunities for students to enhance their lifeworld literacy by engaging in an independent research and learning process within the protective security of the classroom and its teacher. This paper will highlight the critical role this programme plays in preparing participating students (n=315) for life after compulsory education and presents examples of how lifeworld literacies may be developed through a scaffolded process of historical research and reporting anchored in non-formal contexts.

Keywords: history, education, literacy, transversal skills

Procedia PDF Downloads 168
2215 Titanium Alloys for Cryogenic Gas Bottle Applications: A Comparative Study

Authors: Bhanu Pant, Sanjay H. Upadhyay

Abstract:

Titanium alloys, owing to their high specific strength coupled with excellent resistance to corrosion in many severe environments, find extensive usage in the aerospace sector. Alpha and beta-lean titanium alloys have the additional characteristic of exhibiting high toughness, with an NTS/UTS ratio greater than one, down to liquid oxygen and liquid helium temperatures. The cryogenic stage of high-performance rockets utilizes cryo-fluid-submerged pressurizing tanks to improve the volume-to-mass performance factor. A superior volume-to-mass ratio is achieved for LH2-submerged pressurizing tanks as compared to those submerged in LOX. Such high-efficiency tanks for LH2-submerged application necessitate the use of the difficult-to-process alpha-type Ti5Al2.5Sn-ELI alloy, which requires close control of process parameters during tank development. In the present paper, a comparison of this alpha-type cryogenic titanium alloy is made with the conventional alpha-beta Ti6Al4V-ELI alloy, which is usable down to LOX temperatures. Specific challenges faced during the development of these cryogenic pressurizing tanks for a launch vehicle, based on the authors' experience with the comparatively less-studied alpha Ti5Al2.5Sn-ELI alloy, are included in the paper.

Keywords: cryogenic tanks, titanium Alloys, NTS/UTS ratio, alpha and alpha-beta ELI alloys

Procedia PDF Downloads 62
2214 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Generally, fire severity is assessed with the Normalized Burn Ratio (NBR) index. This is performed manually by comparing images acquired before and after the fire: the bitemporal difference of the preprocessed satellite images yields the dNBR. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using the classification levels proposed by the USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to perform burnt area severity mapping regularly with a medium-spatial-resolution sensor (10-20 m). This tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly. 
The advantage of this tool is the ability to obtain burn area severity over large extents and more extended temporal periods. Two case studies were carried out to demonstrate the performance of the tool. The Blue Mountains National Park forest affected by the Australian fire season between 2019 and 2020 is used to describe the workflow of the WWSAT. At this site, more than 7809 km² of burnt area was detected using Sentinel-2 data, giving an error below 6.5% when compared with the area mapped in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km² had burned, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These burnt areas can also be verified by visual inspection of the cloud-free images generated by WWSAT. The tool is cost-effective for calculating burnt area, since the satellite images are free and the cost of field surveys is avoided.
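The NBR/dNBR computation at the core of the tool can be sketched in a few lines. The arrays below are made-up reflectance values standing in for the NIR and SWIR bands (for Sentinel-2, commonly B8 and B12); the real tool works on cloud-free composites inside GEE.

```python
import numpy as np

# Sketch of the NBR/dNBR workflow described above, on synthetic reflectances.
def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

# pre-fire: healthy vegetation (high NIR); post-fire: burnt middle column
nir_pre   = np.array([[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]])
swir_pre  = np.array([[0.1, 0.1, 0.1], [0.1, 0.1, 0.1]])
nir_post  = np.array([[0.5, 0.2, 0.5], [0.5, 0.2, 0.5]])
swir_post = np.array([[0.1, 0.4, 0.1], [0.1, 0.4, 0.1]])

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
burnt = dnbr >= 0.1           # threshold quoted in the abstract
print(burnt.sum())            # → 2 pixels classified as burnt
```

Finer severity classes (low through high) are then obtained by bucketing dNBR against the USGS class thresholds rather than a single cut-off.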

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 235
2213 Effect of Surfactant on Thermal Conductivity of Ethylene Glycol/Silver Nanofluid

Authors: E. C. Muhammed Irshad

Abstract:

Nanofluids are a new class of solid-liquid colloidal mixtures consisting of nanometer-sized (<100 nm) solid particles suspended in heat transfer fluids such as water or ethylene/propylene glycol. Nanofluids offer excellent scope for enhancing the thermal conductivity of common heat transfer fluids, which in turn enhances the heat transfer coefficient. In the present study, silver nanoparticles were dispersed in an ethylene glycol-water mixture. Low volume concentrations (0.05%, 0.1%, and 0.15%) of silver nanofluids were synthesized. The thermal conductivity of these nanofluids was determined with a thermal property analyzer (KD2 Pro apparatus), and the heat transfer coefficient was found experimentally. Initially, the thermal conductivity and viscosity of the nanofluids were calculated with various correlations at different concentrations and compared. The thermal conductivity of the silver nanofluid at 0.05% and 0.1% concentrations of silver nanoparticles increased by 23.3% and 27.7% with Sodium Dodecyl Sulfate (SDS), and by 33.6% and 36.7% with Polyvinylpyrrolidone (PVP), respectively. The nanofluid remains stable for two days, after which the particles start to settle due to the high density of silver. Nevertheless, it shows good improvement in thermal conductivity at low volume concentrations, and better improvement with the Polyvinylpyrrolidone (PVP) surfactant than with Sodium Dodecyl Sulfate (SDS).
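One of the classical correlations such measurements are compared against is Maxwell's effective-medium model. A minimal sketch, with illustrative property values (bulk silver is about 429 W/m·K; roughly 0.38 W/m·K is assumed here for an ethylene glycol-water base fluid; neither value is taken from the study):

```python
# Maxwell's classical correlation for the effective thermal conductivity
# of a dilute particle suspension. Property values are illustrative only.
def maxwell_k_eff(k_f, k_p, phi):
    """k_f: base fluid and k_p: particle conductivity [W/m·K]; phi: volume fraction."""
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

k_f, k_p = 0.38, 429.0
for phi in (0.0005, 0.001, 0.0015):   # 0.05%, 0.1%, 0.15% by volume
    k = maxwell_k_eff(k_f, k_p, phi)
    print(f"phi={phi:.4%}  k_eff={k:.4f} W/m·K  (+{k / k_f - 1:.2%})")
```

At these dilute loadings Maxwell's model predicts an enhancement of only about 3φ, i.e., well under 1%, far below the 20-37% measured here; that gap is precisely why dispersion quality and surfactant effects are the focus of the study.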

Keywords: k-thermal conductivity, sodium dodecyl sulfate, vinyl pyrrolidone, mechatronics engineering

Procedia PDF Downloads 313
2212 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation

Authors: Constantin Z. Leshan

Abstract:

Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics, and they allow the teleportation of matter. All massive bodies emit a flux of holes which curves spacetime; increasing the concentration of holes leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops: outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties alone. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle, and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls': simple mechanical motion is impossible over small-scale distances, and it is impossible to 'trace' a straight line in a discontinuous spacetime because it contains impassable holes. Spacetime 'boils' continually due to the appearance of vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since the body disappears in one volume and reappears in another, random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). 
It is shown that Hole Teleportation does not violate causality or special relativity, owing to its random nature and other properties. Although Hole Teleportation has a random nature, it could be used for the colonization of extrasolar planets with the help of a method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from a vessel without permitting another body to occupy its volume.

Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps

Procedia PDF Downloads 425
2211 Calibration of Syringe Pumps Using Interferometry and Optical Methods

Authors: E. Batista, R. Mendes, A. Furtado, M. C. Ferreira, I. Godinho, J. A. Sousa, M. Alvares, R. Martins

Abstract:

Syringe pumps are commonly used for drug delivery in hospitals and clinical environments. These instruments are critical in neonatology and oncology, where any variation in the flow rate and drug dosing quantity can lead to severe incidents and even the death of the patient. It is therefore very important to determine the accuracy and precision of these devices using suitable calibration methods. The Volume Laboratory of the Portuguese Institute for Quality (LVC/IPQ) uses two different methods to calibrate syringe pumps from 16 nL/min up to 20 mL/min. The interferometric method uses an interferometer to monitor the distance travelled by the pusher block of the syringe pump in order to determine the flow rate. Knowing the internal diameter of the syringe with very high precision, the travelled distance, and the time needed for that travel, it is possible to calculate the flow rate of the fluid inside the syringe and its uncertainty. As an alternative to the gravimetric and interferometric methods, a methodology based on optical technology was also developed to measure flow rates. This method relies mainly on measuring the increase in volume of a drop over time. The objective of this work is to compare the results of the calibration of two syringe pumps using the different methodologies described above. The obtained results were consistent across the three methods used. The uncertainty values were very similar for all three methods, being higher for the optical drop method due to setup limitations.
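The interferometric principle reduces to simple geometry: the flow rate equals the syringe's plunger cross-section times the measured pusher velocity, Q = (π d²/4) · (distance/time). A sketch with illustrative numbers (not LVC/IPQ calibration data):

```python
import math

# Flow rate inferred from interferometric pusher-block displacement.
# All numbers are illustrative, not laboratory data.
def flow_rate_ul_per_min(d_mm, displacement_mm, elapsed_s):
    """Syringe inner diameter d_mm [mm]; returns flow rate in microlitres/min."""
    area_mm2 = math.pi * (d_mm / 2.0) ** 2       # plunger cross-section
    volume_mm3 = area_mm2 * displacement_mm      # 1 mm^3 == 1 microlitre
    return volume_mm3 / elapsed_s * 60.0

# e.g. a 10 mm diameter syringe whose pusher advances 1.273 mm in one minute
q = flow_rate_ul_per_min(10.0, 1.273, 60.0)
print(f"{q:.1f} uL/min")   # → 100.0 uL/min
```

In the real calibration, the uncertainty budget combines the diameter, displacement, and timing uncertainties of this same relation.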

Keywords: calibration, flow, interferometry, syringe pump, uncertainty

Procedia PDF Downloads 109
2210 Materials for Electrically Driven Aircrafts: Highly Conductive Carbon-Fiber Reinforced Epoxy Composites

Authors: Simon Bard, Martin Demleitner, Florian Schonl, Volker Altstadt

Abstract:

For an electrically driven aircraft whose engine is based on semiconductors, alternative materials are needed. To avoid hotspots in the materials, thermally conductive polymers are necessary. Nevertheless, the mechanical properties of these materials should be retained. Herein, three years of work in a project with Airbus and Siemens are presented. Different strategies have been pursued to achieve conductive fiber-reinforced composites: metal-coated carbon fibers, pitch-based fibers, and particle-loaded matrices have been investigated. In addition, a combination of copper-coated fibers and a conductive matrix has been successfully tested for its conductivity and mechanical properties. First, prepregs were produced with a laboratory-scale prepreg line, which can handle materials with a maximum width of 300 mm. These materials were then processed into fiber-reinforced laminates. For the PAN-fiber-reinforced laminates, it could be shown that there is a strong dependency between fiber volume content and thermal conductivity. Laminates with 50 vol% of carbon fiber offer a conductivity of 0.6 W/mK; those with 66 vol% of fiber, a thermal conductivity of 1 W/mK. With pitch-based fiber, the conductivity increases to 1.5 W/mK at 61 vol% of fiber, compared to 0.81 W/mK with the same amount of fibers produced from PAN (+83% in conductivity). The thermal conductivity of PAN-based composites with 50 vol% of fiber is 0.6 W/mK; their nickel-coated counterparts with the same fiber volume content offer a conductivity of 1 W/mK, an increase of 66%.

Keywords: carbon, electric aircraft, polymer, thermal conductivity

Procedia PDF Downloads 163
2209 Landslide Study Using Unmanned Aerial Vehicle and Resistivity Survey at Bkt Kukus, Penang Island, Malaysia

Authors: Kamal Bahrin Jaafar

Abstract:

The study area is located at Bukit Kukus, Penang, where construction of a twin-road project is ongoing. A landslide occurred on 19th October 2018, causing fatalities. The purpose of this study is to determine the causes of the failure and to estimate the volume of failed material and the balance remaining. The study comprises unmanned aerial vehicle (UAV) sensing and a resistivity survey. The resistivity survey consisted of three 200 m lines, with the depth of penetration into the subsurface not exceeding 35 m. The UAV results show the current view of the site condition. Based on the resistivity results, the dominant layer in the study area consists of residual soil/filling material with a thickness of more than 35 m. Three selected cross sections from the construction drawings are overlain on the current cross sections to better understand the condition of the subsurface profile. The comparison shows a clear difference between the past and present topography. Combining the previous data with the current condition, the calculated volume of failure is 85,000 m³, and the balance is 50,000 m³. In conclusion, the failure occurred because the contractor conducted the construction works without following the construction drawings supplied by the consultant. In addition, the failure was triggered by geological conditions, such as a fault, that should have been considered prior to the commencement of work.
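The failure-volume estimate above rests on differencing past and present topography. A minimal sketch of that computation on synthetic elevation grids (cell size, elevations, and the depression are all hypothetical; the real surfaces came from the construction drawings and the UAV survey):

```python
import numpy as np

# Failure volume from elevation differencing: sum the elevation loss per
# grid cell and multiply by the cell footprint area. Synthetic data only.
cell_area = 4.0                               # m^2 per cell (2 m x 2 m grid)
dem_before = np.full((100, 100), 120.0)       # pre-event elevation [m]
dem_after = dem_before.copy()
dem_after[20:70, 30:80] -= 3.0                # 50 x 50 cells lowered by 3 m

loss = np.clip(dem_before - dem_after, 0, None)   # material removed per cell
volume = float(loss.sum() * cell_area)            # [m^3]
print(volume)   # → 30000.0
```

Clipping negative differences to zero separates removed material from any deposition, so the same differencing with the sign reversed yields the deposited (balance) volume.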

Keywords: UAV, landslide, resistivity survey, cause of failure

Procedia PDF Downloads 114
2208 Study on Compressive Strength and Setting Time of Fly Ash Concrete after Slump Recovery Using Superplasticizer

Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert

Abstract:

Fresh concrete that is bound to be rejected because of belated use, whether from delays in the construction process or from traffic delaying concrete delivery, can recover its slump and be used once again by introducing a second dose of superplasticizer (naphthalene-based, type F) into the system. However, adding superplasticizer as a solution for recovering otherwise unusable slump-loss concrete may affect other concrete properties. Therefore, this paper observed the setting time and compressive strength of concrete after being re-dosed with the chemical admixture type F (superplasticizer, naphthalene-based) for slump recovery. The concrete used in this study was fly ash concrete with fly ash replacements of 0%, 30%, and 50%. The concrete mix designed for the test specimens was prepared with a paste content (ratio of volume of cement to volume of void in the aggregate) of 1.2 and 1.3, a water-to-binder ratio (w/b) in the range of 0.3 to 0.58, and an initial dose of superplasticizer (SP) ranging from 0.5 to 1.6%. The setting times of the concrete were tested both before and after re-dosing, with different amounts of the second dose and different times of dosing. The research concluded that the addition of a second dose of superplasticizer increases both initial and final setting times according to the dosage added. For fly ash concrete, the prolongation effect was higher as the fly ash replacement increased, reaching a maximum of about 4 hours. In the case of compressive strength, the re-dosed concrete showed strength fluctuations within an acceptable range of ±10%.

Keywords: compressive strength, fly ash concrete, second dose of superplasticizer, setting times

Procedia PDF Downloads 281
2207 Effect of Different Media and Planting Time on the Cuttings of Cherry (Prunus Avium L.) Rootstock Colt Under the Agro-Climatic Conditions of the Temperate Region

Authors: Sajjad Ali Khan, Gohar Ayub, Khalil Ur Rahman, Muhammad Sajid, Mumtaz Farooq, Mohammad Irshad, Haider Ali

Abstract:

A trial was carried out to determine the effect of different soil media and planting times on cuttings of the cherry (Prunus avium L.) rootstock Colt at the Agriculture Research Institute (ARI), Mingora, Swat, during winter 2011. The experiment was laid out in a Randomized Complete Block Design (RCBD) with a split-plot arrangement and was replicated three times. Soil media (silt, garden soil, and silt + garden soil + FYM) were assigned to main plots, whereas planting dates (1st Jan, 11th Jan, 21st Jan, 1st Feb, 11th Feb, 21st Feb, and 2nd March) were assigned to subplots. The data recorded on sprouting percentage, shoot diameter per cutting, number of leaves per cutting, rootstock height (cm), survival percentage, number of roots, root length (cm), root volume (cm³), and root weight (g) were significantly affected by the different soil media. Maximum sprouting percentage (100%), shoot diameter (1.72 mm), number of leaves per cutting (76.74), rootstock height (104.36 cm), survival percentage (41.67%), number of roots (76.35), root length (11.28 cm), root volume (4.43 cm³), and root weight (4.64 g) were recorded in medium M3 (garden soil + silt + FYM). A significant response to the various planting dates was observed for most of the vegetative and rooting attributes of the cherry rootstock Colt. The 1st January plantation showed maximum sprouting percentage (100%), shoot diameter (1.99 mm), number of leaves (81.46), rootstock height (126.24 cm), and survival percentage (58.12%), whereas the 11th January plantation showed more roots (94.43), greater root length (10.60 cm), root volume (3.68 cm³), and root weight (3.71 g). Based on the results of the experimental work, it is recommended that cherry cuttings be planted in early January in the soil medium silt + garden soil + FYM for better growth and development under the agro-climatic conditions of the temperate region.

Keywords: soil media, cherry rootstock, planting dates, growth parameters

Procedia PDF Downloads 98
2206 Geometric Intuition and Formalism in Passing from Indivisibles to Infinitesimals: Pascal and Leibniz

Authors: Remus Titiriga

Abstract:

The paper focuses on Pascal's indivisibles evolving into Leibniz's infinitesimals. It starts with parallel developments by the two savants in combinatorics (triangular numbers for Pascal and the harmonic triangle for Leibniz) and their application to determining the sums of mathematical series. It follows with a focus on the geometrical contributions of Pascal. He considered the cycloid and other mechanical curves the epitome of geometric comprehensibility in a series of challenging problems he posed to the mathematical world. Pascal provided the solutions in 1658, in a volume published under the pseudonym of Dettonville, using indivisibles and ratios between curved and straight lines. In the third part, the research follows the impact of this volume on Leibniz as the initial impetus for the elaboration of modern calculus as an algorithmic method disjoint from geometrical intuition. The paper then analyses the further steps and argues that Leibniz's developments relate to his philosophical frame (the search for a characteristica universalis, the principle of continuity, and the principle of sufficient reason), different from Pascal's, and that this frame shaped the mathematical problems and their solutions. At this stage in Leibniz's evolution, the infinitesimals replaced the indivisibles proper. The last part of the paper starts with a speculation: what if? Could Pascal, had he lived longer, have accomplished the same feat? The document uses Pascal's reconstructed philosophical frame to formulate a positive answer. It also proposes to teach calculus with indivisibles and infinitesimals, mimicking Pascal's and Leibniz's achievements.
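The link between the two triangles can be made concrete with the series Huygens famously posed to the young Leibniz: the sum of the reciprocals of Pascal's triangular numbers T_n = n(n+1)/2, which Leibniz summed by telescoping, the same mechanism that generates the entries of his harmonic triangle:

```latex
\[
\sum_{n=1}^{\infty} \frac{1}{T_n}
  \;=\; \sum_{n=1}^{\infty} \frac{2}{n(n+1)}
  \;=\; 2 \sum_{n=1}^{\infty} \left( \frac{1}{n} - \frac{1}{n+1} \right)
  \;=\; 2 .
\]
```

Each partial sum collapses to 2(1 - 1/(N+1)), so the series converges to 2 without any appeal to geometric intuition, an early instance of the algorithmic spirit the paper attributes to Leibniz.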

Keywords: indivisibles, infinitesimals, characteristic triangle, the principle of continuity

Procedia PDF Downloads 129
2205 A Geometric Interpolation Scheme in Overset Meshes for the Piecewise Linear Interface Calculation Volume of Fluid Method in Multiphase Flows

Authors: Yanni Chang, Dezhi Dai, Albert Y. Tong

Abstract:

Piecewise linear interface calculation (PLIC) schemes are widely used in the volume-of-fluid (VOF) method to capture interfaces in numerical simulations of multiphase flows. Dynamic overset meshes can be especially useful in applications involving component motions and complex geometric shapes. In the present study, the VOF value of an acceptor cell is evaluated in a geometric way that transfers the fraction field between the meshes precisely, using interfaces reconstructed from the corresponding donor elements. In most overset interpolation schemes for continuous flow variables, the acceptor cell value is evaluated as a weighted average of its donors, with the weighting factors obtained by various algebraic methods. Unlike continuous flow variables, the VOF field is a step function near the interfaces, ranging from zero to unity rapidly. A geometric interpolation scheme for the VOF field in overset meshes for the PLIC-VOF method is proposed in this paper. It has been tested successfully in quadrilateral/hexahedral overset meshes by employing several VOF advection tests with imposed solenoidal velocity fields. The proposed algorithm has been shown to yield higher accuracy in mass conservation and interface reconstruction compared with three algebraic alternatives.
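The idea of geometric evaluation can be illustrated on a single acceptor cell: given a donor's reconstructed linear (PLIC) interface n·x = c, the acceptor's VOF value is the fluid fraction that this interface cuts from the acceptor cell. The sketch below evaluates that fraction by dense sub-sampling rather than by the exact polygon truncation the paper uses; the cell geometry and interface are hypothetical.

```python
import numpy as np

# Fluid fraction of a square acceptor cell cut by a donor's reconstructed
# linear interface n . x = c (fluid on the n . x <= c side). Evaluated by
# sub-sampling; a production scheme would truncate the polygon exactly.
def cell_fraction(normal, c, x0, y0, h, samples=400):
    """Fluid fraction of the cell [x0, x0+h] x [y0, y0+h]."""
    xs = x0 + (np.arange(samples) + 0.5) / samples * h   # sample-point centers
    ys = y0 + (np.arange(samples) + 0.5) / samples * h
    X, Y = np.meshgrid(xs, ys)
    return float(np.mean(normal[0] * X + normal[1] * Y <= c))

# donor interface: the vertical line x = 0.5, fluid on the left
frac = cell_fraction((1.0, 0.0), 0.5, 0.0, 0.0, 1.0)
print(round(frac, 3))   # → 0.5
```

An algebraic weighted average of donor fractions would smear this sharp cut across several cells, which is exactly the diffusion of the interface that the geometric transfer avoids.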

Keywords: interpolation scheme, multiphase flows, overset meshes, PLIC-VOF method

Procedia PDF Downloads 176
2204 3D Seismic Acquisition Challenges in the NW Ghadames Basin Libya, an Integrated Geophysical Sedimentological and Subsurface Studies Approach as a Solution

Authors: S. Sharma, Gaballa Aqeelah, Tawfig Alghbaili, Ali Elmessmari

Abstract:

During the acquisition of 2D (2007) and 3D (2021) seismic data in the northwest region of the Ghadames Basin, Libya, there were abrupt discontinuities in the brute stack at the northernmost locations. In both campaigns, complete loss of fluid circulation was seen in these regions during up-hole drilling. Geophysics, sedimentology, and shallow subsurface geology were integrated to investigate what was causing the seismic signal to disappear at shallow depths. The Upper Cretaceous Nalut Formation is the near-surface or surface formation in the studied area. It is distinguished by abnormally high resistivity in all the neighboring wells. The Nalut Formation in all the nearby wells from the present study, and from a previous outcrop study, suggests a lithology of dolomite and chert/flint in nodular or layered forms. There are also reports of karstic caverns, vugs, and thick cracks, which together produce the high resistivity. Four up-hole samples analyzed for microfacies revealed a near-coastal to tidal environment. Monotonous, very porous, algal (Chara)-infested deposits up to 30 feet thick are seen in two up-hole sections; these deposits are interpreted as scattered, continental algal travertine mounds. Chert/flint, dolomite, and calcite in varying amounts are confirmed by XRD analysis. The high resistivity of the Nalut Formation, thought to be connected to the sea level drop that created the paleokarst layer, can be tracked regionally. It is abruptly overlain by a blanket marine transgressive deposit caused by rapid sea level rise: a regional, relatively highly radioactive layer of argillaceous limestone. The close proximity of the examined area to the mountainous, E-W trending ridges of northern Libya facilitated recent freshwater circulation, which later enhanced cavern development and mineralization in the paleokarst layer. 
Seismic signal loss at shallow depth is caused by the extremely heterogeneous mineralogy of the pore fill, or by its absence. The scattering effect of shallow karstic layers on seismic signals is well documented. Higher velocity inflection points at shallower depths in the northern part and at deeper intervals in the southern part, in both cases at the Nalut level, demonstrate the layer's influence on the seismic signal. During the Permian-Carboniferous, the Ghadames Basin underwent uplift and extensive erosion, which left this karstic layer of the Nalut Formation at a shallow depth in the northern part of the studied area, weakening the acoustic signal, whereas in the southern part of the 3D acquisition area the Nalut Formation remained at a deeper interval without affecting the seismic signal. Measures taken during seismic processing to deal with this signal loss have produced visible improvements. This study recommends using denser spacing or dynamite to circumvent the karst layer in comparable geographic areas in order to prevent signal loss at shallow depths.

Keywords: well logging, seismic data acquisition, seismic data processing, up-holes

Procedia PDF Downloads 86
2203 Optimization of Two Quality Characteristics in Injection Molding Processes via Taguchi Methodology

Authors: Joseph C. Chen, Venkata Karthik Jakka

Abstract:

The main objective of this research is to optimize tensile strength and dimensional accuracy in injection molding processes using Taguchi Parameter Design. An L16 orthogonal array (OA) is used in the Taguchi experimental design, with five control factors at four levels each and vibration as a non-controllable (noise) factor. A total of 32 experiments were designed to obtain the optimal parameter setting for the process. The optimal parameters identified for shrinkage are shot volume, 1.7 cubic inch (A4); mold temperature, 130 ºF (B1); hold pressure, 3200 psi (C4); injection speed, 0.61 inch³/sec (D2); and hold time, 14 seconds (E2). The optimal parameters identified for tensile strength are shot volume, 1.7 cubic inch (A4); mold temperature, 160 ºF (B4); hold pressure, 3100 psi (C3); injection speed, 0.69 inch³/sec (D4); and hold time, 14 seconds (E2). The Taguchi-based optimization framework was systematically and successfully implemented to obtain an adjusted optimal setting in this research. The mean shrinkage of the confirmation runs is 0.0031%, and the tensile strength was found to be 3148.1 psi. Both outcomes are substantial improvements over the baseline, and defects have been further reduced in the injection molding process.
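The analysis step behind a Taguchi design of this kind is the signal-to-noise (S/N) ratio computed per experimental run: smaller-the-better for shrinkage, larger-the-better for tensile strength. A minimal sketch follows; the replicate values shown are hypothetical illustrations, not the study's raw data.

```python
import math

def sn_smaller_the_better(values):
    """Taguchi S/N ratio when lower responses are better (e.g. shrinkage)."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

def sn_larger_the_better(values):
    """Taguchi S/N ratio when higher responses are better (e.g. tensile strength)."""
    return -10 * math.log10(sum(1 / (v * v) for v in values) / len(values))

# Hypothetical replicate measurements for one L16 run
shrinkage = [0.0031, 0.0030, 0.0033]   # percent
tensile = [3148.1, 3120.5, 3155.0]     # psi

print(round(sn_smaller_the_better(shrinkage), 2))
print(round(sn_larger_the_better(tensile), 2))
```

The factor level with the highest mean S/N ratio across the orthogonal array is selected as optimal for each control factor.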

Keywords: injection molding processes, Taguchi parameter design, tensile strength, high-density polyethylene (HDPE)

Procedia PDF Downloads 196
2202 Total Parenteral Nutrition Wastage: A Retrospective Cohort Study in a Small District General Hospital

Authors: Muhammad Faizan Butt, Maria Ambreen Tahir, Joshua James Pilkington, A. A. Warsi

Abstract:

Background: Total parenteral nutrition (TPN) use within the NHS is crucial in the prevention of malnourishment. TPN prescriptions are tailored to an individual patient’s needs, but TPN bags come in fixed sizes, so minimizing wastage has financial and sustainability implications for the health service. The aim of the study is to assess current prescribing practices, quantify the volume of TPN wastage, and identify its causes. Methodology: A retrospective cohort study of TPN prescriptions over a period of 1 year (Jan-Dec 2022) was performed. All patients prescribed TPN who had been admitted under a general surgery consultant in a small district hospital were included. Data were extracted from hospital electronic records and dietician charts, summarised descriptively, and reasons for TPN wastage were explored. Results: 49 patients were identified. The median length of TPN prescription was 8 days, giving a total of 608 prescriptions. Of the bags prescribed, 258, 169, and 181 were 10g (2500ml), 14g (2000ml), and 18g (2000ml), respectively. The mean volume wasted from each type of bag was 634ml, 634ml, and 648ml, respectively. Reasons for TPN wastage identified were: no loss (25%), smaller bags not available (53.6%), step-down regime (8.1%), and other (12.2%). Conclusion: This study has identified that the current stocking and prescribing of TPN within a district general hospital leads to a significant mean wastage of 638.2ml per bag. The commonest reason for wastage is the non-availability of a more appropriately sized bag.
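The headline wastage figure follows directly from the per-bag counts and mean wasted volumes reported above; a minimal sketch of the weighted-average arithmetic:

```python
# (bags prescribed, mean wasted ml per bag) for each bag size in the study
bags = {
    "10g / 2500ml": (258, 634),
    "14g / 2000ml": (169, 634),
    "18g / 2000ml": (181, 648),
}

total_bags = sum(n for n, _ in bags.values())
total_waste_ml = sum(n * w for n, w in bags.values())
mean_waste_ml = total_waste_ml / total_bags

print(total_bags, round(mean_waste_ml, 1))  # 608 bags, 638.2 ml mean wastage
```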

Keywords: general surgery, TPN, sustainability, wastage

Procedia PDF Downloads 75