Search results for: mechanical properties
1741 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue
Authors: Rachel Y. Zhang, Christopher K. Anderson
Abstract:
A critical aspect of revenue management is a firm’s ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexities of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-Nearest-Neighbors, Support Vector Machine, Regression Tree, and Artificial Neural Network algorithms. The out-of-sample performances of the above approaches to forecasting hotel demand are illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A’s data) with models using market-level data (prices, review scores, location, chain scale, etc. for all hotels within the market). The proposed models will be valuable for hotel revenue prediction given the basic characteristics of a hotel property, or can be applied to performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel’s revenue performance, which would have considerable potential usefulness in both revenue prediction and evaluation.
Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine
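As a rough, hypothetical illustration of the benchmarking workflow described in this abstract (the synthetic data, feature columns and model settings below are assumptions and do not reproduce the proprietary Las Vegas market data), a minimal Python sketch comparing the four named algorithms on an out-of-sample split might look like this:

```python
# Illustrative sketch only: synthetic data stand in for the proprietary market-level
# transactions; the model choices mirror the four algorithms named in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n = 2000
# Hypothetical market-level features: nightly price, review score, location cluster, chain scale
X = np.column_stack([
    rng.uniform(80, 400, n),
    rng.uniform(2.5, 5.0, n),
    rng.integers(0, 5, n),
    rng.integers(1, 7, n),
])
# Synthetic demand: decreasing in price, increasing in review score, plus noise
y = 500 - 0.8 * X[:, 0] + 60 * X[:, 1] + 10 * X[:, 2] + rng.normal(0, 20, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "KNN": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=10)),
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5)),
    "RegressionTree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 16),
                                      max_iter=2000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"{name}: out-of-sample MAPE = {mape:.3f}")
```

Extra columns describing all hotels in the market could be appended to mimic the firm-level versus market-level comparison discussed in the abstract.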
Procedia PDF Downloads 133
1740 Non-Destructive Testing of Carbon Fiber Reinforced Plastic by Infrared Thermography Methods
Authors: W. Swiderski
Abstract:
Composite materials are one answer to the growing demand for materials with better construction and service parameters. Composite materials also permit the conscious shaping of desirable properties to an extent beyond the reach of metals, ceramics or polymers. In recent years, composite materials have been used widely in aerospace, energy, transportation, medicine, etc. Fiber-reinforced composites, including carbon fiber, glass fiber and aramid fiber, have become a major structural material. The typical defect arising during manufacture and operation is delamination damage of layered composites. When delamination damage of the composites spreads, it may lead to composite fracture. One of the many methods used in non-destructive testing of composites is active infrared thermography. In active thermography, it is necessary to deliver energy to the examined sample in order to obtain significant temperature differences indicating the presence of subsurface anomalies. To detect possible defects in composite materials, different methods of thermal stimulation can be applied to the tested material; these include heating lamps, lasers, eddy currents, microwaves or ultrasound. The use of a suitable source of thermal stimulation on the test material can have a decisive influence on whether defects are detected. Samples of multilayer carbon composites were prepared with deliberately introduced defects for comparative purposes. Very thin defects of different sizes and shapes, made of Teflon or copper and having a thickness of 0.1 mm, were screened. Non-destructive testing was carried out using the following sources of thermal stimulation: heating lamp, flash lamp, ultrasound and eddy currents. The results are reported in the paper.
Keywords: non-destructive testing, IR thermography, composite material, thermal stimulation
Procedia PDF Downloads 259
1739 On the Solution of Boundary Value Problems Blended with Hybrid Block Methods
Authors: Kizito Ugochukwu Nwajeri
Abstract:
This paper explores the application of hybrid block methods for solving boundary value problems (BVPs), which are prevalent in various fields such as science, engineering, and applied mathematics. Traditional numerical approaches, such as finite difference and shooting methods, often encounter challenges related to stability and convergence, particularly in the context of complex and nonlinear BVPs. To address these challenges, we propose a hybrid block method that integrates features from both single-step and multi-step techniques. This method allows for the simultaneous computation of multiple solution points while maintaining high accuracy. Specifically, we employ a combination of polynomial interpolation and collocation strategies to derive a system of equations that captures the behavior of the solution across the entire domain. By directly incorporating boundary conditions into the formulation, we enhance the stability and convergence properties of the numerical solution. Furthermore, we introduce an adaptive step-size mechanism to optimize performance based on the local behavior of the solution. This adjustment allows the method to respond effectively to variations in solution behavior, improving both accuracy and computational efficiency. Numerical tests on a variety of boundary value problems demonstrate the effectiveness of the hybrid block methods. These tests showcase significant improvements in accuracy and computational efficiency compared to conventional methods, indicating that our approach is robust and versatile. The results suggest that this hybrid block method is suitable for a wide range of applications in real-world problems, offering a promising alternative to existing numerical techniques.
Keywords: hybrid block methods, boundary value problem, polynomial interpolation, adaptive step-size control, collocation methods
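For orientation only, the sketch below poses and solves a simple boundary value problem with a standard collocation solver (SciPy's solve_bvp); it is not the hybrid block method proposed in the abstract, and the test problem, mesh and tolerance are assumptions, but it illustrates how boundary conditions are built directly into the formulation as described above:

```python
# Baseline sketch: a standard collocation BVP solve, shown only to illustrate posing a
# BVP with the boundary conditions built into the formulation. NOT the hybrid block method.
import numpy as np
from scipy.integrate import solve_bvp

# Test problem: y'' = -y on [0, pi/2], y(0) = 0, y(pi/2) = 1 (exact solution y = sin x)
def rhs(x, y):
    return np.vstack([y[1], -y[0]])

def bc(ya, yb):
    return np.array([ya[0] - 0.0, yb[0] - 1.0])

x = np.linspace(0.0, np.pi / 2, 11)          # initial mesh
y_init = np.zeros((2, x.size))               # initial guess
sol = solve_bvp(rhs, bc, x, y_init, tol=1e-8)

x_fine = np.linspace(0.0, np.pi / 2, 201)
max_err = np.max(np.abs(sol.sol(x_fine)[0] - np.sin(x_fine)))
print(f"converged: {sol.success}, max abs error vs sin(x): {max_err:.2e}")
```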
Procedia PDF Downloads 31
1738 Ultrasound-Assisted Sol – Gel Synthesis of Nano-Boehmite for Biomedical Purposes
Authors: Olga Shapovalova, Vladimir Vinogradov
Abstract:
Among many different sol-gel matrices, only alumina can be successfully injected parenterally into the human body. This is not surprising, because boehmite (aluminium oxyhydroxide) is the metal oxide approved by the FDA and EMA for intravenous and intramuscular administration, and it has also long been used as an adjuvant in the production of many modern vaccines. In our earlier study, it was shown that the denaturation temperature of enzymes entrapped in a sol-gel boehmite matrix increases by 30–60 °C while preserving the initial activity. This makes such matrices more attractive for long-term storage of non-stable drugs. In the current work we present an ultrasound-assisted sol-gel synthesis of nano-boehmite. This method provides a bio-friendly, very stable, highly homogeneous alumina sol using only water and aluminium isopropoxide as a precursor. Many parameters of the synthesis were studied in detail: time of ultrasound treatment, ultrasound frequency, surface area, pore and nanoparticle size, zeta potential and others. Here we investigated the stability of the colloidal sols and the textural properties of the final composites as a function of the ultrasonic treatment time. The chosen ultrasonic treatment time was between 30 and 180 minutes. Surface area, average pore diameter and total pore volume of the final composites were measured with a Nova 1200 (Quantachrome) surface area and pore size analyzer. It was shown that the matrices with an ultrasonic treatment time of 90 minutes have the largest surface area, 431 ± 24 m2/g. On the other hand, such matrices have a lower stability in comparison with the samples with an ultrasonic treatment time of 120 minutes, which have a surface area of 390 ± 21 m2/g. It was shown that stable sols could be formed only after 120 minutes of ultrasonic treatment; otherwise, a white precipitate of boehmite is formed. We conclude that the optimal ultrasonic treatment time is 120 minutes.
Keywords: boehmite matrix, stabilisation, ultrasound-assisted sol-gel synthesis
Procedia PDF Downloads 267
1737 Photocatalytic Degradation of Organic Pollutant Reacting with Tungstates: Role of Microstructure and Size Effect on Oxidation Kinetics
Authors: A. Taoufyq, B. Bakiz, A. Benlhachemi, L. Patout, D. V. Chokouadeua, F. Guinneton, G. Nolibe, A. Lyoussi, J-R. Gavarri
Abstract:
Currently, photocatalytic reactions occurring under solar illumination have attracted worldwide attention due to a tremendous set of environmental problems. Taking the sunlight into account, it is indispensable to develop highly effective visible-light-driven photocatalysts. Nanostructured materials such as the MxM’1-xWO6 system are widely studied due to their interesting piezoelectric, dielectric and catalytic properties. These materials can be used in photocatalysis for environmental applications, such as wastewater treatment. The aim of this study was to investigate the photocatalytic activity of polycrystalline phases of bismuth tungstate of formula Bi2WO6. Polycrystalline samples were elaborated using a coprecipitation technique followed by a calcination process at different temperatures (300, 400, 600 and 900 °C). The obtained polycrystalline phases have been characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), and transmission electron microscopy (TEM). Crystal cell parameters and cell volume depend on the elaboration temperature. High-resolution electron microscopy images and image simulations, associated with X-ray diffraction data, allowed the lattices and space group Pca21 to be confirmed. The photocatalytic activity of the as-prepared samples was studied by irradiating aqueous solutions of Rhodamine B, associated with Bi2WO6 additives having variable crystallite sizes. The photocatalytic activity of such bismuth tungstates increased as the crystallite sizes decreased. The high specific area of the photocatalytic particles obtained at 300 °C seems to condition the degradation kinetics of RhB.
Keywords: bismuth tungstate, crystallite sizes, electron microscopy, photocatalytic activity, X-ray diffraction
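A minimal sketch of how the RhB degradation kinetics mentioned above are commonly quantified, assuming a pseudo-first-order model ln(C0/C) = k·t; the time and concentration values below are invented placeholders, not data from this study:

```python
# Hypothetical example: fit a pseudo-first-order rate constant k from ln(C0/C) = k * t.
# The time/concentration values are illustrative placeholders, not measured data.
import numpy as np

t = np.array([0, 15, 30, 45, 60, 90, 120], dtype=float)   # irradiation time, min
c = np.array([1.00, 0.82, 0.66, 0.55, 0.44, 0.30, 0.20])   # C/C0 of Rhodamine B

# Linear least-squares fit of ln(C0/C) versus t; the slope is the apparent rate constant
y = np.log(1.0 / c)
k, intercept = np.polyfit(t, y, 1)
print(f"apparent first-order rate constant k = {k:.4f} 1/min")
```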
Procedia PDF Downloads 449
1736 A Framework for Auditing Multilevel Models Using Explainability Methods
Authors: Debarati Bhaumik, Diptish Dey
Abstract:
Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions & statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence.
Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics
Procedia PDF Downloads 94
1735 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) has been considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work was aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min RR interval windows and then four specific features were calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find important features to discriminate between AF and Normal Sinus Rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection method holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
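A toy sketch of the PCA plus LVQ pipeline on 1-min RR windows is given below; the synthetic RR series, the four HRV-style features and the single-prototype LVQ1 update are assumptions for illustration and are not the exact features or network used in the study:

```python
# Illustrative sketch: synthetic RR windows, assumed HRV features, minimal LVQ1 on PCA scores.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

def rr_features(rr):
    """Four simple HRV features from a 1-min RR-interval window (seconds)."""
    diff = np.diff(rr)
    return np.array([
        rr.mean(),                       # mean RR
        rr.std(),                        # SDNN
        np.sqrt(np.mean(diff ** 2)),     # RMSSD
        np.mean(np.abs(diff) > 0.05),    # pNN50
    ])

def make_window(af):
    # Normal sinus rhythm: regular RR; AF: irregular RR with higher variability
    base = rng.uniform(0.7, 1.0)
    jitter = 0.15 if af else 0.03
    return np.clip(base + rng.normal(0, jitter, 75), 0.3, 1.8)

X = np.array([rr_features(make_window(af)) for af in ([0] * 300 + [1] * 300)])
y = np.array([0] * 300 + [1] * 300)

Z = PCA(n_components=2).fit_transform((X - X.mean(0)) / X.std(0))

# Minimal LVQ1: one prototype per class, pulled toward / pushed away from training samples
protos = np.array([Z[y == c].mean(0) for c in (0, 1)])
lr = 0.05
for _ in range(30):
    for z, label in zip(Z, y):
        w = np.argmin(np.linalg.norm(protos - z, axis=1))   # winning prototype
        protos[w] += lr * (z - protos[w]) if w == label else -lr * (z - protos[w])

pred = np.argmin(np.linalg.norm(Z[:, None, :] - protos[None], axis=2), axis=1)
print(f"training accuracy of the toy PCA + LVQ1 pipeline: {(pred == y).mean():.3f}")
```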
Procedia PDF Downloads 267
1734 Environmentally Friendly KOH and NH4OH-KOH Pulping of Rice Straw
Authors: Omid Ghaffarzadeh Mollabashi, Sara Khorshidi, Hossein Kermanian Seyed, Majid Zabihzadeh
Abstract:
The main problem that hinders the intensive use of non-wood raw materials in the papermaking industry is the environmental pollution caused by black liquor. As a matter of fact, black liquor from non-wood pulping is discharged to the environment due to the lack of recovery. Traditionally, NaOH pulping produces Na-based black liquor that may increase soil erosion and reduce soil permeability. By substituting KOH/NH4OH for NaOH as the cooking liquor, K and N can act as a soil fertilizer while offering an environmentally acceptable disposal alternative. For this purpose, rice straw samples were pulped under the following conditions. Constant factors were: straw weight of 100 grams (oven-dry basis), liquor-to-straw ratio of 7:1 and maximum temperatures of 170 and 180 ºC. Variable factors for the KOH cooks were: KOH dosages of 14, 17 and 20% on oven-dry straw and times at maximum temperature of 60 and 90 minutes. For the KOH-NH4OH cooks, KOH dosages of 5 and 10% and NH4OH dosages of 25 and 35%, both based on oven-dry straw, were applied, and the time at maximum temperature was 90 minutes. Yields of the KOH and KOH-NH4OH pulp samples ranged from 37.28 to 48.62 and 45.63 to 48.08 percent, respectively. In addition, Kappa numbers ranged from 21.91 to 29.85 and 55.15 to 56.25, respectively. In comparison with soda, soda-AQ, cold soda, kraft, EDA (dissolving) and De-Ethylene Glycol (dissolving) pulps, the burst and tensile indices of the KOH pulp were higher under similar cooking conditions. With the exception of soda pulps, the tear index of the mentioned pulp is higher than that of all compared treatments. Therefore, it can be concluded that the KOH pulping method is an appropriate choice for making paper from rice straw. Also, compared to KOH-NH4OH, the KOH pulping method is the more appropriate choice because of better pulping results.
Keywords: environmentally friendly process, rice straw, NH4OH-KOH pulping, pulp properties
Procedia PDF Downloads 270
1733 Improving the Exploitation of Fluid in Elastomeric Polymeric Isolator
Authors: Haithem Elderrat, Huw Davies, Emmanuel Brousseau
Abstract:
Elastomeric polymer foam has been used widely in the automotive industry, especially for isolating unwanted vibrations. Such material is able to absorb unwanted vibration due to its combination of elastic and viscous properties. However, the ‘creep effect’, poor stress distribution and susceptibility to high temperatures are the main disadvantages of such a system. In this study, improvements in the performance of elastomeric foam as a vibration isolator were investigated using the concept of Foam Filled Fluid (FFFluid). In FFFluid devices, the foam takes the form of capsule shapes and is mixed with viscous fluid, while the mixture is contained in a closed vessel. When the FFFluid isolator is affected by vibrations, energy is absorbed due to the elastic strain of the foam. As the foam is compressed, there is also movement of the fluid, which contributes to further energy absorption as the fluid shears. Also, and depending on the design adopted, the packaging could attenuate vibration through energy absorption via friction and/or elastic strain. The present study focuses on the advantages of the FFFluid concept over dry polymeric foam in the role of vibration isolation. This comparison between the performance of dry foam and the FFFluid was made according to experimental procedures. The paper concludes by evaluating the performance of the FFFluid isolator in the suspension system of a light vehicle. One outcome of this research is that the FFFluid may be preferable to elastomer isolators in certain applications, as it enables a reduction in the effects of high temperatures and of ‘creep effects’, thereby improving reliability and load distribution. The stiffness coefficient of the system increased by about 60% when using an FFFluid sample. The technology represented by the FFFluid is therefore considered by this research to be suitable for application in the suspension system of a light vehicle.
Keywords: FFFluid, dry foam, anti-vibration devices, elastomeric polymer foam
Procedia PDF Downloads 339
1732 Characterization and PCR Detection of Selected Strains of Psychrotrophic Bacteria Isolated from Raw Milk
Authors: Kidane workelul, Li xu, Xiaoyang Pang, Jiaping Lv
Abstract:
Dairy products are exceptionally good media for the growth of microorganisms because of their high nutritional content. There are several ways that milk might get contaminated throughout the milking process, including how the raw milk is transported and stored, as well as how long it is kept before being processed. Psychrotrophic bacteria are among those that can deteriorate the quality of milk, mainly through their heat-resistant protease and lipase enzymes. For this research, 8 selected strains of psychrotrophic bacteria (Enterococcus hirae, Pseudomonas fluorescens, Pseudomonas azotoformans, Pseudomonas putida, Exiguobacterium indicum, Pseudomonas paralactice, Acinetobacter indicum, Serratia liquefaciens) were chosen and their characteristics determined following the research methodology protocol. The 8 selected strains were cultured, plated, incubated, and their genomic DNA extracted and amplified. The purpose of the study was to identify their psychrotrophic properties, test for lipase hydrolysis, determine their optimal incubation temperature, design primers targeting the conserved region of the lipA gene of the P. fluorescens strain, optimize primer specificity and sensitivity, and carry out PCR detection of lipase-positive strains using the designed primers. Based on the findings, all 8 selected strains isolated from stored raw milk are psychrotrophic bacteria; 6 of the strains, all except 2, are positive for lipase hydrolysis; their optimal temperature is 20 to 30 °C; and the designed primers are highly specific, amplifying only the lipase-positive strains and not the others. Thus, the result is promising and could help in detecting psychrotrophic bacteria producing heat-resistant enzymes (lipase) at an early stage, before the milk is processed, which will save production losses for the dairy industry.
Keywords: dairy industry, heat-resistant, lipA, milk, primer and psychrotrophic
Procedia PDF Downloads 64
1731 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam
Authors: Sahand Golmohammadi, Sana Hosseini Shirazi
Abstract:
Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems of underground structures in rock, including tunnels. In this method, six main parameters of the rock mass are required, namely the rock quality designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and stress reduction factor (SRF). In this regard, in order to achieve a reasonable and optimal design, identifying the parameters that govern the stability of the mentioned structures is one of the most important goals and most necessary actions in rock engineering. Therefore, it is necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the whole system. In this research, we attempted to determine the most effective parameters (key parameters) among the six rock mass parameters of the Q-system using the rock engineering system (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is, in fact, a method by which one can determine the degree of cause and effect of a system's parameters by constructing an interaction matrix. In this research, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to construct the interaction matrix of the Q-system. For this purpose, instead of using the conventional methods, which are always accompanied by shortcomings such as uncertainty, the Q-system interaction matrix was coded using a technique that is in fact a statistical analysis of the data, determining the correlation coefficients between them. Thus, the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters of the Q-system, the SRF and Jr parameters exert the maximum and minimum influence on the system, respectively, while the RQD and Jw parameters are the most and least influenced by the system, respectively. Therefore, by developing this method, a more accurate relation for rock mass classification can be obtained by weighting the required parameters in the Q-system.
Keywords: Q-system, rock engineering system, statistical analysis, rock mass, tunnel
Procedia PDF Downloads 73
1730 Sulfonic Acid Functionalized Ionic Liquid in Combinatorial Approach: A Recyclable and Water Tolerant-Acidic Catalyst for Friedlander Quinoline Synthesis
Authors: Jafar Akbari
Abstract:
Quinolines are very important compounds, partly because of their pharmacological properties, which lead to wide applications in medicinal chemistry. Notable among them are antimalarial drugs, anti-inflammatory agents, antiasthmatic, antibacterial, antihypertensive, and tyrosine kinase inhibiting agents. Despite the use of quinolines in pharmaceutical and other industries, comparatively few methods for their preparation have been reported. The Friedlander annulation is one of the simplest and most straightforward methods for the synthesis of polysubstituted quinolines. Although modified methods employing Lewis or Brønsted acids have been reported for the synthesis of quinolines, the development of a water-stable acidic catalyst for quinoline synthesis is quite desirable. One of the most remarkable features of ionic liquids is that the yields can be optimized by changing the anions or the cations. Recently, sulfonic acid functionalized ionic liquids have been used as solvent-catalysts for several organic reactions. We herein report a one-pot domino approach for the synthesis of quinoline derivatives in the Friedlander manner using a TSIL as catalyst. These ILs are miscible in water, and their homogeneous system is readily separated from the reaction product, combining the advantages of both homogeneous and heterogeneous catalysis. In this reaction, the catalyst plays a dual role: it ensures effective condensation and cyclization of the 2-aminoaryl ketone with the second carbonyl group, and it also promotes the aromatization to the final product. Various types of quinolines were prepared from 2-aminoaryl ketones and β-ketoesters/ketones in 85-98% yields using the catalytic system of SO3-H functionalized ionic liquid/H2O. More importantly, the catalyst could be easily recycled five times without much loss of activity.
Keywords: antimalarial drugs, green chemistry, ionic liquid, quinolines
Procedia PDF Downloads 210
1729 The Effect of Ultrasound as Pre-Treatment for Drying of Red Delicious and Golden Delicious Apples
Authors: Gulcin Yildiz
Abstract:
Drying (dehydration) is the process of removing water from food in order to preserve it, and is an alternative for reducing post-harvest losses of fruits. Different pre-treatment methods, such as ultrasound, have been developed for fruit drying. If no pre-treatment is done, the fruits will continue to darken after they are dried. However, the effects of ultrasound as a pre-treatment for the drying of apples have not been well documented. This study was undertaken to investigate the effect of ultrasound as a pre-treatment before oven drying of red delicious and golden delicious apples. Red delicious and golden delicious apples were dried at different temperatures. Before performing drying experiments in an oven at 50, 75 and 100 °C, ultrasound pre-treatment was applied for 5, 10, and 15 minutes. Colors of the dried apples were measured with a Minolta Chroma Meter CR-300 (Minolta Camera Co. Ltd., Osaka, Japan) by holding the device vertically against the surface of the samples. Content of total phenols was determined spectrophotometrically with the Folin-Ciocalteu assay, and the antioxidant capacity was evaluated using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) assay. The samples (both red delicious and golden delicious apples) with longer ultrasound treatment produced higher weight loss due to the changes in tissue structure. However, lower phenolic content and antioxidant capacity were observed for the samples with longer ultrasound pre-treatment. The highest total phenolic content (TPC) was determined in apples dried at 75 °C with 5 minutes of ultrasound pre-treatment, and the lowest TPC was determined in apples dried at 50 °C with 15 minutes of ultrasound pre-treatment, which were subjected to the longest ultrasound pre-treatment and drying. The combination of 5 min of ultrasound pre-treatment and 75 °C oven-drying proved to be the best combination for an energy-efficient process. This combination exhibited good antioxidant properties as well. The present study clearly demonstrated that applying ultrasound as a pre-treatment for the drying of apples is an effective process in terms of the quality of dried products, time, and energy.
Keywords: golden delicious apples, red delicious apples, total phenolic content, ultrasound
Procedia PDF Downloads 296
1728 Probabilistic Analysis of Bearing Capacity of Isolated Footing using Monte Carlo Simulation
Authors: Sameer Jung Karki, Gokhan Saygili
Abstract:
The allowable bearing capacity of foundation systems is determined by applying a factor of safety to the ultimate bearing capacity. Conventional ultimate bearing capacity calculation routines are based on deterministic input parameters, where the nonuniformity and inhomogeneity of soil and site properties are not accounted for; hence, probability calculus and statistical analysis are not directly applied in conventional foundation engineering. It is assumed that the factor of safety, typically as high as 3.0, incorporates the uncertainty of the input parameters. This factor of safety is estimated based on subjective judgement rather than objective facts; it is an ambiguous term. Hence, a probabilistic analysis of the bearing capacity of an isolated footing on a clayey soil was carried out using the Monte Carlo simulation method. This simulated model was compared with the traditional discrete model. It was found that the bearing capacity of the soil was higher for the simulated model than for the discrete model, which was verified by a sensitivity analysis. As the number of simulations was increased, there was a significant percentage increase in the bearing capacity compared with the discrete bearing capacity. The bearing capacity values obtained by simulation were found to follow a normal distribution. Using the traditional factor of safety of 3, the allowable bearing capacity had a lower probability (0.03717) of occurring in the field, compared to a higher probability (0.15866) when using the simulation-derived factor of safety of 1.5. This means the traditional factor of safety gives a bearing capacity that is less likely to be available in the field. This shows the subjective nature of the factor of safety, and hence a probabilistic method is suggested to address the variability of the input parameters in bearing capacity equations.
Keywords: bearing capacity, factor of safety, isolated footing, Monte Carlo simulation
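As an illustration of the Monte Carlo approach described above (assuming a Terzaghi-type bearing capacity expression for a square footing on clay under undrained conditions; the geometry and input statistics are placeholders, not the study's values), a minimal sketch could be:

```python
# Illustrative Monte Carlo sketch: Terzaghi form for a square footing on clay, phi = 0.
# All distributions and numbers are placeholders, not the paper's data.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

Nc, Nq = 5.7, 1.0              # Terzaghi factors for phi = 0
Df, B = 1.5, 2.0               # embedment depth and footing width, m (assumed)

# Random input parameters: undrained cohesion and unit weight treated as random variables
c = rng.lognormal(mean=np.log(50.0), sigma=0.25, size=n_sim)   # kPa
gamma = rng.normal(18.0, 0.8, size=n_sim)                      # kN/m3

q_ult = 1.3 * c * Nc + gamma * Df * Nq                         # ultimate capacity, kPa

# Deterministic (discrete) estimate with mean inputs, then allowable capacities
q_det = 1.3 * 50.0 * Nc + 18.0 * Df * Nq
for fs in (3.0, 1.5):
    q_allow = q_det / fs
    p_short = np.mean(q_ult < q_allow)    # probability the simulated capacity falls short
    print(f"FS = {fs}: allowable q = {q_allow:.1f} kPa, "
          f"P(ultimate capacity < allowable) = {p_short:.4f}")

print(f"simulated mean q_ult = {q_ult.mean():.1f} kPa vs deterministic {q_det:.1f} kPa")
```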
Procedia PDF Downloads 187
1727 Laser Writing on Vitroceramic Disks for Petabyte Data Storage
Authors: C. Busuioc, S. I. Jinga, E. Pavel
Abstract:
The continuous need for more non-volatile memories with higher storage capacity, smaller dimensions and weight, as well as lower costs, has led to the exploration of optical lithography on active media, as well as patterned magnetic composites. In this context, optical lithography is a technique that can provide a significant decrease of the information bit size to the nanometric scale. However, there are some restrictions that arise from the need to break the optical diffraction limit. Major achievements have been obtained by employing a vitroceramic material as the active medium and a laser beam operated at low power for the direct writing procedure. Thus, optical discs with ultra-high density were fabricated by a conventional melt-quenching method starting from analytical purity reagents. They were subsequently used for 3D recording based on their photosensitive features. Naturally, the next step consists in the elucidation of the composition and structure of the active centers, in correlation with the use of silver and rare-earth compounds for the synthesis of the optical supports. This has been accomplished by modern characterization methods, namely transmission electron microscopy coupled with selected area electron diffraction, scanning transmission electron microscopy and electron energy loss spectroscopy. The influence of laser diode parameters, silver concentration and fluorescent compound formation on the writing process and the final material properties was investigated. The results indicate performance in terms of capacity two orders of magnitude higher than other reported information storage systems. Moreover, the fluorescent photosensitive vitroceramics may be integrated into other applications which appeal to nanofabrication as the driving force in the electronics and photonics fields.
Keywords: data storage, fluorescent compounds, laser writing, vitroceramics
Procedia PDF Downloads 225
1726 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need improvement to anticipate the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting those challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmarked datasets. First, the acquired raw data are filtered with z-normalization and denoising methods. Then, we adopted statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features for feature extraction from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate for the self-annotated dataset, while the HARTH and KU-HAR datasets achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines, respectively. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
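A minimal sketch of the mRMR feature selection step followed by a small neural network classifier is shown below on synthetic data; the dataset, the number of selected features and the network size are assumptions, not the configuration used for the IM-WSHA, HARTH or KU-HAR experiments:

```python
# Minimal sketch of greedy mRMR (maximum relevance, minimum redundancy) feature selection
# followed by a small neural network, on synthetic data; all settings are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=600, n_features=30, n_informative=8,
                           n_redundant=10, random_state=0)

def mrmr(X, y, k):
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    remaining = [j for j in range(X.shape[1]) if j != selected[0]]
    while len(selected) < k:
        scores = []
        for j in remaining:
            redundancy = np.mean([
                mutual_info_regression(X[:, [s]], X[:, j], random_state=0)[0]
                for s in selected])
            scores.append(relevance[j] - redundancy)   # mRMR difference criterion
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

features = mrmr(X, y, k=8)
X_tr, X_te, y_tr, y_te = train_test_split(X[:, features], y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0))
clf.fit(X_tr, y_tr)
print(f"selected features: {features}")
print(f"test accuracy with mRMR-selected features: {clf.score(X_te, y_te):.3f}")
```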
Procedia PDF Downloads 20
1725 Characterization of Candlenut Shells and Its Application to Remove Oil and Fine Solids of Produced Water in Nutshell Filters of Water Cleaning Plant
Authors: Annur Suhadi, Haris B. Harahap, Zaim Arrosyidi, Epan, Darmapala
Abstract:
Oilfields under waterflood often face the problem of injector plugging, either by internal filtration or by an external filter cake built up inside pore throats. The content of suspended solids must be reduced to the required level by filtration, since corrective action for plugging is very costly. The performance of nutshell filters, where filtration takes place, is good when using pecan and walnut shells. Candlenut shells were used instead of pecan and walnut shells since they are abundant in Indonesia, Malaysia, and East Africa. Physical and chemical properties of walnut, pecan, and candlenut shells were tested and the results compared. Testing, using full-scale nutshell filters, was conducted to determine the oil content, turbidity, and suspended solids removal, based on the designed flux rate. The performance of candlenut shells, deep-bedded in nutshell filters for the filtration process, was monitored. Cleaned water leaving the nutshell filters had total suspended solids of 17 ppm, while the oil content could be reduced to 15.1 ppm. Turbidity, using candlenut shells, was below the specification for injection water, which is less than 10 Nephelometric Turbidity Units (NTU). Turbidity of water leaving the nutshell filter ranged from 1.7 to 5.0 NTU on various dates of operation. Walnut, pecan, and candlenut shells had moisture contents of 8.98 wt%, 10.95 wt%, and 9.95 wt%, respectively. The porosity of walnut, pecan, and candlenut shells was significantly affected by moisture content. Candlenut shells had a toluene solubility of 7.68 wt%, which is much higher than that of walnut shells, reflecting more crude oil adsorption. The hardness of candlenut shells was 2.5-3 Mohs, which is close to the hardness of walnut shells. This is an advantage in guaranteeing filter cake cleaning by the fluidization process during backwashing.
Keywords: candlenut shells, filtration, nutshell filter, pecan shells, walnut shells
Procedia PDF Downloads 111
1724 Effect of Fire Retardant Painting Product on Smoke Optical Density of Burning Natural Wood Samples
Authors: Abdullah N. Olimat, Ahmad S. Awad, Faisal M. AL-Ghathian
Abstract:
Natural wood is used in many applications in Jordan, such as furniture, partition construction, and cupboards. Smoke produced by the combustion of selected wood samples was studied experimentally. Smoke generated from the burning of natural wood is considered a major cause of death in furniture fires. The critical parameter for life safety in fires is the time available for escape, so the visual obscuration due to smoke release during fire is taken into consideration. The effect of smoke produced by burning wood depends on the amount of smoke released in case of fire, and the amount of smoke produced affects the time available for the occupants to escape. To protect the lives of building occupants during fire growth, fire retardant painting products were tested. The tested samples of natural wood include beech, ash, beech pine, and white beech pine. A smoke density chamber manufactured by Fire Testing Technology was used to measure the smoke properties. The test procedure was carried out according to ISO 5659. A non-flaming vertical radiant heat flux of 25 kW/m2 was applied to the wood samples in a horizontal orientation. The main objective of the current study is to carry out experimental tests on samples of natural woods to evaluate the capability to escape in case of fire and the fire safety requirements. Specific optical density, transmittance, thermal conductivity, and mass loss are the main measured parameters. Comparisons between painted and unpainted samples were also carried out for the selected wood samples.
Keywords: extinction coefficient, optical density, transmittance, visibility
Procedia PDF Downloads 237
1723 Microstructure Analysis of Ti-6Al-4V Friction Stir Welded Joints
Authors: P. Leo, E. Cerri, L. Fratini, G. Buffa
Abstract:
The Friction Stir Welding process uses an inert rotating mandrel and a force on the mandrel normal to the plane of the sheets to generate frictional heat. The heat and the stirring action of the mandrel create a bond between the two sheets without melting the base metal. As a matter of fact, the use of a solid-state welding process limits the occurrence of defects due to the presence of gas in the melt pool and avoids the negative effects of the metallurgical transformations strictly connected with the change of phase. The industrial importance of the Ti-6Al-4V alloy is well known. It provides an exceptionally good balance of strength, ductility, fatigue and fracture properties, together with good corrosion resistance and good metallurgical stability. In this paper, the authors analyze the microstructure of friction stir welded joints of Ti-6Al-4V processed at the same travel speed (35 mm/min) but at different rotation speeds (300-500 rpm). The microstructure of the base material (BM), as revealed by both optical and scanning electron microscopy, is not homogeneous. It is characterized by a distorted α/β lamellar microstructure together with smashed zones of fragmented β layers and β retained grain boundary phase. The BM was welded in the as-received state, without any previous heat treatment. Even the microstructure of the transverse and longitudinal sections of the joints is not homogeneous. Close to the top of the weld cross sections, a much finer microstructure than the initial condition has been observed, while in the center of the joints the microstructure is less refined. Along the longitudinal sections, the microstructure is characterized by equiaxed grains and lamellae. Both the length and area fraction of lamellae increase with distance from the longitudinal axis. The hardness of the joints is higher than that of the BM. As the process temperature increases, the average microhardness slightly decreases.
Keywords: friction stir welding, microhardness, microstructure, Ti-6Al-4V
Procedia PDF Downloads 381
1722 Nanopharmaceutical: A Comprehensive Appearance of Drug Delivery System
Authors: Mahsa Fathollahzadeh
Abstract:
The various nanoparticles employed in drug delivery applications include micelles, liposomes, solid lipid nanoparticles, polymeric nanoparticles, functionalized nanoparticles, nanocrystals, cyclodextrins, dendrimers, and nanotubes. Micelles, composed of amphiphilic block copolymers, can encapsulate hydrophobic molecules, allowing for targeted delivery. Liposomes, vesicular structures made up of phospholipids, can encapsulate both hydrophobic and hydrophilic molecules, providing a flexible platform for delivering therapeutic agents. Solid lipid nanoparticles (SLNs) and nanostructured lipid carriers (NLCs) are designed to improve the stability and bioavailability of lipophilic drugs. Polymeric nanoparticles, such as poly(lactic-co-glycolic acid) (PLGA), are biodegradable and can be engineered to release drugs in a controlled manner. Functionalized nanoparticles, coated with targeting ligands or antibodies, can specifically target diseased cells or tissues. Nanocrystals, engineered to have specific surface properties, can enhance the solubility and bioavailability of poorly soluble drugs. Cyclodextrins, doughnut-shaped molecules with hydrophobic cavities, can form complexes with hydrophobic molecules, allowing for improved solubility and bioavailability. Dendrimers, branched polymers with a central core, can be designed to deliver multiple therapeutic agents simultaneously. Nanotubes and metallic nanoparticles, such as gold nanoparticles, offer real-time tracking capabilities and can be used to detect biomolecular interactions. The use of these nanoparticles has revolutionized the field of drug delivery, enabling targeted and controlled release of therapeutic agents, reduced toxicity, and improved patient outcomes.
Keywords: nanotechnology, nanopharmaceuticals, drug-delivery, proteins, ligands, nanoparticles, chemistry
Procedia PDF Downloads 51
1721 Characterization of the Microorganisms Associated with Pleurotus ostreatus and Pleurotus tuber-regium Spent Mushroom Substrate
Authors: Samuel E. Okere, Anthony E. Ataga
Abstract:
Introduction: The microbial ecology of Pleurotus ostreatus and Pleurotus tuber-regium spent mushroom substrate (SMS) was characterized to determine other ways of utilizing it. Materials and Methods: The microbiological properties of the spent mushroom substrate were determined using standard methods. This study was carried out at the Microbiology Laboratory, University of Port Harcourt, Rivers State, Nigeria. Results: Quantitative microbiological analysis revealed that Pleurotus ostreatus spent mushroom substrate (POSMS) contained 7.9×10⁵ and 1.2×10³ cfu/g of total heterotrophic bacteria and total fungi, respectively, while Pleurotus tuber-regium spent mushroom substrate (PTSMS) contained 1.38×10⁶ and 9.0×10² cfu/g of total heterotrophic bacteria and total fungi, respectively. The fungal species encountered in Pleurotus tuber-regium spent mushroom substrate (PTSMS) include Aspergillus and Cladosporium species, while Aspergillus and Penicillium species were encountered in Pleurotus ostreatus spent mushroom substrate (POSMS). The bacterial species encountered in Pleurotus tuber-regium spent mushroom substrate include Bacillus, Acinetobacter, Alcaligenes, Actinobacter, and Pseudomonas species, while Bacillus, Actinobacteria, Aeromonas, Lactobacillus and Aerococcus species were encountered in Pleurotus ostreatus spent mushroom substrate (POSMS). Conclusion: Therefore, based on the findings from this study, it can be concluded that spent mushroom substrate contains microorganisms that can be utilized both in the bioremediation of oil-polluted soils, as it contains important hydrocarbon-utilizing microorganisms such as Penicillium, Aspergillus and Bacillus species, and as a source of plant growth-promoting rhizobacteria (PGPR) such as Pseudomonas and Bacillus species, which can induce resistance in plants. However, further studies are recommended, especially to characterize these microorganisms molecularly.
Keywords: characterization, microorganisms, mushroom, spent substrate
Procedia PDF Downloads 161
1720 The Roles of Muslim Scholars in Minifying Religious Extremism for Religious Tolerance and Peace Building in Nigeria
Authors: Mukhtar Sarkin-Kebbi
Abstract:
Insurgency, religious extremism and other related religious crises have become hydra-headed in Nigeria, causing the destruction of human lives and property worth billions of naira. As a result, millions of people have been displaced and millions of children are out of school, most of them from the Muslim community. The wrong teaching and misinterpretation of Islam by some in the Muslim community fuel the spread of extremist ideology, hatred among Muslim sects and towards non-Muslims, and the emergence of extremist groups like Boko Haram. For a multi-religious country like Nigeria to realise its development in all human aspects, there must be unity and religious tolerance. Many agree that changing the ideologies of insurgents and religious extremists will require an intellectual role backed by a vigorous campaign. Muslim scholars can play a vital role in promoting social reform and peaceful coexistence. This paper discusses the importance of unity among the Muslim community and religious tolerance in light of the Qur’an and the Hadith. The paper also reviews the relationship between Muslims and non-Muslims during the lifetime of the Prophet (S.A.W.) in order to serve as an exemplary model. Contemporary issues such as religious extremism, sectarianism, intolerance and their consequences are examined. To minify religious intolerance and extremism, the paper identifies the roles to be played by Muslim scholars, with references from the Qur’an and Sunnah. The paper concludes that to realise overall human development and eternal salvation, Muslims should shun any religious crises and embrace unity and religious tolerance. Finally, the paper recommends, among other things, that only pious and learned scholars should be allowed to preach in any religious gathering, that Muslims should exercise patience and tolerance in dealing with Muslims and non-Muslims, and that Muslims should live by example following the teaching of the Qur’an and the Sunnah of the Prophet (S.A.W.).
Keywords: Muslim scholars, peace building, religious extremism, religious tolerance
Procedia PDF Downloads 213
1719 Effect of Good Agriculture Management Practices and Constraints on Grape Farming: A Case Study in Mirbachakot, Kalakan and Shakardara Districts Kabul, Afghanistan
Authors: Mohammad Mirwais Yusufi
Abstract:
Skillful management is one of the most important success factors for today’s farms. When a farm is well managed, it can generate funds for its sustainability. Grape is one of the most widely grown fruits in the world and one of the most important cash crops with high production potential in Afghanistan as well. While there are several organizations intervening for the improvement of this cash crop, the quality and quantity are still not satisfactory for producers and external markets, and the situation has not changed over the years. Therefore, a survey was conducted in 2017 with 60 grape growers, supported by questionnaires, in the Mirbachakot, Kalakan and Shakardara districts of Kabul province. The purpose was to get an understanding of the current socio-demographic characteristics of farmers, management methods, constraints, farm size, yield and the contribution of grape farming to household income. Findings indicate that grape farming was predominantly male (83.3%, versus 16.6% female) and that small-scale farmers were the main grape producers, with 60% having less than 1 ha of land under grape production. Likewise, 50% had more than 10 years and 33.3% between 1 and 5 years of experience in grape farming. The high level of illiteracy and the prevalence of diseases had a significant effect on the growth, yield and quality of grapes. The results showed that vineyard management operations to protect grapes from mechanical damage are very poor or completely absent. Compared with developed countries, where table grape is one of the fruits with the highest input of technology, in developing countries the cost of labor is low but the purchase of equipment is very expensive due to the financial situation. Hence, the low quality and quantity of grapes are influenced by poor management methods, such as the non-availability of experts and the lack of technical guidance in the study site. The study therefore suggests that improved agricultural extension services and managerial skills could contribute to addressing these problems.
Keywords: constraints, effect, management, Kabul
Procedia PDF Downloads 112
1718 Low Voltage and High Field-Effect Mobility Thin Film Transistor Using Crystalline Polymer Nanocomposite as Gate Dielectric
Authors: Debabrata Bhadra, B. K. Chaudhuri
Abstract:
The operation of organic thin film transistors (OFETs) at low voltage is currently a prevailing issue. We have fabricated an anthracene thin-film transistor (TFT) with an ultrathin layer (~450 nm) of poly(vinylidene fluoride) (PVDF)/CuO nanocomposite as the gate insulator. We obtained a device with excellent electrical characteristics at low operating voltages (<1 V). Films with different numbers of layers were also prepared to achieve the best optimization of the ideal gate insulator with various static dielectric constants (εr). Capacitance density, leakage current at 1 V gate voltage and the electrical characteristics of OFETs with single and multilayer films were investigated. This device was found to have the highest field-effect mobility of 2.27 cm2/Vs, a threshold voltage of 0.34 V, an exceptionally low subthreshold slope of 380 mV/decade and an on/off ratio of 10⁶. Such a favorable combination of properties means that these OFETs can be operated successfully at voltages below 1 V. A very simple fabrication process has been used, along with a stepwise poling process for enhancing the pyroelectric effects on the device performance. The output characteristics of the OFET after poling changed and exhibited a linear current-voltage relationship, showing evidence of large polarization. The temperature-dependent response of the device was also investigated. The stable performance of the OFET after the poling operation makes it reliable in temperature sensor applications. Such high-ε CuO/PVDF gate dielectrics appear to be highly promising candidates for organic non-volatile memory and sensor field-effect transistors (FETs).
Keywords: organic field effect transistors, thin film transistor, gate dielectric, organic semiconductor
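For context, field-effect mobility and threshold voltage are commonly extracted from the saturation-regime transfer curve via Id = (W·Ci·μ/2L)(Vg − Vt)²; the sketch below uses invented device parameters and a synthetic curve (not the reported device data) to show the extraction:

```python
# Hypothetical sketch: extract field-effect mobility and threshold voltage from a
# saturation-regime transfer curve, Id = (W*Ci*mu / 2L) * (Vg - Vt)^2.
# Device geometry, capacitance and current values are invented placeholders.
import numpy as np

W, L = 1e-3, 50e-6          # channel width and length, m (assumed)
Ci = 3.0e-4                 # gate dielectric capacitance per area, F/m^2 (assumed)

# Synthetic transfer characteristic in saturation (Vg in volts, Id in amperes)
Vg = np.linspace(0.4, 1.0, 13)
mu_true, Vt_true = 2.27e-4, 0.34                 # m^2/Vs and V, used to generate data
Id = (W * Ci * mu_true / (2 * L)) * np.clip(Vg - Vt_true, 0, None) ** 2

# Linear fit of sqrt(Id) vs Vg above threshold: slope^2 * 2L / (W*Ci) gives the mobility,
# and the x-intercept of the fitted line gives the threshold voltage.
mask = Vg > Vt_true + 0.1
slope, intercept = np.polyfit(Vg[mask], np.sqrt(Id[mask]), 1)
mu = 2 * L * slope**2 / (W * Ci)                 # m^2/Vs
Vt = -intercept / slope
print(f"extracted mobility = {mu * 1e4:.2f} cm^2/Vs, threshold voltage = {Vt:.2f} V")
```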
Procedia PDF Downloads 244
1717 Laboratory Evaluation of Asphalt Concrete Prepared with Over Burnt Brick Aggregate Treated by Zycosoil
Authors: D. Sarkar, M. Pal, A. K. Sarkar
Abstract:
Asphaltic concrete for pavement construction in India is produced using crushed stone, gravel, etc. as aggregate. In the north-eastern region of India, there is a scarcity of stone aggregate. Therefore, road engineers are always in search of an alternative material that can replace the regularly used aggregate. The purpose of this work was to evaluate the utilization of substandard or marginal aggregates in flexible pavement construction. The investigation was undertaken to evaluate the effects of using lower quality aggregates, such as over burnt brick aggregate, on the preparation of asphalt concrete for flexible pavements. The scope of this work included a review of the available literature and existing data, a laboratory evaluation organized to determine the effects of marginal aggregates and potential techniques to upgrade these substandard materials, and a laboratory evaluation of these upgraded marginal aggregate asphalt mixtures. Over burnt brick aggregates are water susceptible and can lead to moisture damage. Moisture damage is the progressive loss of functionality of the material owing to the loss of the adhesion bond between the asphalt binder and the aggregate surface. Hence, Zycosoil as an anti-stripping additive was evaluated in this study. This study summarizes the results of the laboratory evaluation carried out to investigate the properties of asphalt concrete prepared with Zycosoil-modified over burnt brick aggregate. Marshall specimens were prepared with stone aggregate, Zycosoil-modified stone aggregate, over burnt brick aggregate and Zycosoil-modified over burnt brick aggregate. Results show that the addition of Zycosoil to stone aggregate increased stability by 6%, and the addition of Zycosoil to over burnt brick aggregate increased stability by 30%.
Keywords: asphalt concrete, over burnt brick aggregate, Marshall stability, Zycosoil
Procedia PDF Downloads 357
1716 Optimization of Enzymatic Hydrolysis of Cooked Porcine Blood to Obtain Hydrolysates with Potential Biological Activities
Authors: Miguel Pereira, Lígia Pimentel, Manuela Pintado
Abstract:
Animal blood is a major by-product of slaughterhouses and still represents a cost and environmental problem in some countries. To be eliminated, blood must be stabilised by cooking, and afterwards the slaughterhouses have to pay for its incineration. In order to reduce elimination costs and valorise the high protein content, the aim of this study was the optimization of hydrolysis conditions, in terms of enzyme ratio and time, in order to obtain hydrolysates with biological activity. Two enzymes were tested in this assay: pepsin and proteases from Cynara cardunculus (cardosins). The latter has the advantage of being widely used in the Portuguese dairy industry and has a low price. The screening assays were carried out over a time range between 0 and 10 h and using an enzyme/reaction volume ratio between 0 and 5%. The assays were performed at the optimal conditions of pH and temperature for each enzyme: 55 °C at pH 5.2 for cardosins and 37 °C at pH 2.0 for pepsin. After reaction, the hydrolysates were evaluated by FPLC (Fast Protein Liquid Chromatography) and tested for their antioxidant activity by the ABTS method. FPLC chromatograms showed different profiles when comparing the enzymatic reactions with the control (no enzyme added). The chromatograms exhibited new peaks of lower MW that were not present in the control samples, demonstrating hydrolysis by both enzymes. Regarding the antioxidant activity, the best results for both enzymes were obtained using an enzyme/reaction volume ratio of 5% during 5 h of hydrolysis. However, extending the reaction did not significantly affect the antioxidant activity, which is industrially relevant with respect to process cost. In conclusion, enzymatic blood hydrolysis can be a better alternative to the current elimination process, allowing the industry to reuse an ingredient with biological properties and economic value.
Keywords: antioxidant activity, blood, by-products, enzymatic hydrolysis
Procedia PDF Downloads 509
1715 A Simulation-Based Method for Evaluation of Energy System Cooperation between Pulp and Paper Mills and a District Heating System: A Case Study
Authors: Alexander Hedlund, Anna-Karin Stengard, Olof Björkqvist
Abstract:
A step towards reducing greenhouse gases and energy consumption is to integrate the energy systems of several industries. This work is based on a case study of the integration of pulp and paper mills with a district heating system in Sundsvall, Sweden. Present research shows that it is possible to make a significant reduction in the electricity demand of the mechanical pulping process. However, the profitability of the efficiency measures could be an issue, as the excess steam recovered from the refiners decreases with the electricity consumption. A consequence is that the fuel demand for steam production will increase. If the fuel price is similar to the electricity price, this would reduce the profit of such a project. If the paper mill can be integrated with a district heating system, it is possible to upgrade excess heat from a nearby kraft pulp mill to process steam via the district heating system in order to avoid the additional fuel need. The concept is investigated using a simulation model describing both the mass and energy balance as well as the operating margin. Three scenarios were analyzed: reference, electricity reduction and energy substitution. The simulations show that the total input to the system is lowest in the energy substitution scenario. Additionally, in the energy substitution scenario the steam from the incineration boiler covers not only the steam shortage but also a part of the steam produced using the biofuel boiler, the cooling tower connected to the incineration boiler is no longer needed, and the excess heat can cover the whole district heating load throughout the year. The study shows a substantial economic advantage if all stakeholders act together as one system. However, costs and benefits are unequally shared between the actors. This means that there is a need for new business models in order to share the system costs and benefits.
Keywords: energy system, cooperation, simulation method, excess heat, district heating
Procedia PDF Downloads 2261714 A Geometrical Multiscale Approach to Blood Flow Simulation: Coupling 2-D Navier-Stokes and 0-D Lumped Parameter Models
Authors: Azadeh Jafari, Robert G. Owens
Abstract:
In this study, a geometrical multiscale approach, in which the 2-D Navier-Stokes equations and constitutive equations are coupled with 0-D lumped parameter models, is investigated. A multiscale approach suggests a natural way of coupling detailed local models (in the flow domain) with coarser models able to describe the dynamics over a large part, or even the whole, of the cardiovascular system at acceptable computational cost. We introduce a new velocity correction scheme to decouple the velocity computation from the pressure computation. To evaluate the capability of the new scheme, the results obtained with Neumann outflow boundary conditions on the velocity and Dirichlet outflow boundary conditions on the pressure are compared with those obtained using the coupling with the lumped parameter model. Comprehensive studies have been carried out on the sensitivity of the numerical scheme to the initial conditions, the elasticity, and the number of spectral modes. Improvement of the computational algorithm, with stable convergence, has been demonstrated for at least moderate Weissenberg numbers. We comment on the mathematical properties of the reduced model, its limitations in yielding realistic and accurate numerical simulations, and its contribution to a better understanding of microvascular blood flow. We also discuss the sophistication and reliability of multiscale models for computing correct boundary conditions at the outflow boundaries of a section of the cardiovascular system of interest. In this respect, the geometrical multiscale approach can be regarded as a new method for solving a class of biofluids problems whose application goes significantly beyond the one addressed in this work.Keywords: geometrical multiscale models, haemorheology model, coupled 2-D Navier-Stokes 0-D lumped parameter modeling, computational fluid dynamics
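To make the 2-D/0-D coupling concrete, the sketch below integrates a simple two-element Windkessel-type lumped parameter model whose pressure could be fed back as the outflow boundary condition of a 2-D solver; the parameter values and the placeholder flow-rate function are assumptions for illustration only, not the authors' scheme.

```python
# Hedged sketch (not the authors' code): a two-element Windkessel-type 0-D
# lumped parameter model supplying an outflow pressure to a 2-D flow solver.
# The 2-D Navier-Stokes step is represented only by a placeholder flow rate.
import math

R = 1.0e3    # peripheral resistance (hypothetical units)
C = 1.0e-4   # vessel compliance (hypothetical units)
dt = 1.0e-3  # time step shared with the 2-D solver
p = 0.0      # 0-D pressure state, fed back to the 2-D outflow boundary

def outflow_rate_from_2d(t: float) -> float:
    """Placeholder for the flow rate integrated over the 2-D outflow boundary."""
    return 1.0 + 0.5 * math.sin(2.0 * math.pi * t)

for step in range(2000):
    t = step * dt
    q = outflow_rate_from_2d(t)   # 2-D model -> 0-D model (flow rate)
    p += dt * (q - p / R) / C     # explicit Euler step of C dp/dt = q - p/R
    # p would now be imposed as the outflow pressure of the 2-D model

print(f"0-D outflow pressure after {2000 * dt:.1f} s: {p:.1f}")
```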
Procedia PDF Downloads 3611713 Assessment of Land Use Land Cover Change-Induced Climatic Effects
Authors: Mahesh K. Jat, Ankan Jana, Mahender Choudhary
Abstract:
Rapid population and economic growth have resulted in large-scale land use land cover (LULC) changes. Changes in the biophysical properties of the Earth's surface and their impact on climate are of primary concern nowadays. Different approaches, ranging from location-based relationships to modelling of earth surface-atmosphere interactions with techniques such as surface energy balance (SEB) modelling, have been used in the recent past to examine the relationship between changes in Earth surface land cover and climatic characteristics such as temperature and precipitation. A remote sensing-based model, the Surface Energy Balance Algorithm for Land (SEBAL), has been used to estimate the surface heat fluxes over the Mahi Bajaj Sagar catchment (India) from 2001 to 2020. Landsat ETM and OLI satellite data are used to model the SEB of the area. Changes in observed precipitation and temperature, obtained from the India Meteorological Department (IMD), have been correlated with changes in the surface heat fluxes to understand the relative contribution of LULC change to changes in these climatic variables. Results indicate a noticeable impact of LULC changes on the climatic variables, aligned with the respective changes in the SEB components. Results suggest that precipitation increases at a rate of 20 mm/year. The maximum and minimum temperatures decrease and increase at 0.007 °C/year and 0.02 °C/year, respectively, while the average temperature increases at 0.009 °C/year. Changes in latent heat flux and sensible heat flux correlate positively with precipitation and temperature, respectively. Variation in the surface heat fluxes thus influences the climate parameters and offers a plausible explanation for the observed climatic changes, so SEB modelling is helpful for understanding LULC change and its impact on climate.Keywords: LULC, sensible heat flux, latent heat flux, SEBAL, landsat, precipitation, temperature
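For reference, SEBAL-type models close the surface energy balance by taking the latent heat flux as the residual of net radiation, soil heat flux and sensible heat flux; the short sketch below illustrates this bookkeeping with hypothetical per-pixel values that are not taken from the study.

```python
# Illustrative sketch of the bookkeeping used in SEBAL-type models
# (values are hypothetical, W/m^2): the latent heat flux is obtained as the
# residual of the surface energy balance, LE = Rn - G - H.

def latent_heat_flux(net_radiation: float, soil_heat_flux: float,
                     sensible_heat_flux: float) -> float:
    """Latent heat flux as the residual of the surface energy balance."""
    return net_radiation - soil_heat_flux - sensible_heat_flux

# Hypothetical per-pixel values before and after a LULC change
le_vegetated = latent_heat_flux(620.0, 90.0, 240.0)   # e.g. vegetated pixel
le_built_up  = latent_heat_flux(600.0, 120.0, 310.0)  # e.g. built-up pixel

print(f"LE vegetated: {le_vegetated} W/m^2, LE built-up: {le_built_up} W/m^2")
```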
Procedia PDF Downloads 1161712 Numerical Methodology to Support the Development of a Double Chamber Syringe
Authors: Lourenço Bastos, Filipa Carneiro, Bruno Vale, Rita Marques, Joana Silva, Ricardo Freitas, Ângelo Marques, Sara Cortez, Alberta Coelho, Pedro Parreira, Liliana Sousa, Anabela Salgueiro, Bruno Silva
Abstract:
Flushing is considered an adequate technique to reduce the risk of infection during the clinical practice of venous catheterization. Nonetheless, there is still a lack of adherence to this method, in part due to the complexity of the procedure. The SeringaDuo project aimed to develop an innovative double-chamber syringe for the intravenous sequential administration of drugs and serums. This device serves to improve adherence to the practice, both by reducing the number of manipulations needed, which also improves patient safety, and by simplifying the task, thereby promoting the flushing practice among health professionals. To assist in the development of this innovative syringe, a numerical methodology was developed and validated to predict the syringe's mechanical and flow behavior during the fluid loading and administration phases, as well as to evaluate the material behavior during its production. For this, three commercial numerical simulation packages were used, namely ABAQUS, ANSYS/FLUENT, and MOLDFLOW. This methodology aimed to evaluate the feasibility of the concepts and to optimize the geometries of the syringe's components, creating an iterative product development process based on numerical simulations and validated by the production of prototypes. Through this methodology, it was possible to achieve a final design that fulfils all the characteristics and specifications defined. This iterative process based on numerical simulations is a powerful tool for product development that allows fast and accurate results to be obtained without the strict need for prototypes: consecutive concepts are constructed and evaluated until an optimized solution that fulfils all the predefined specifications and requirements is obtained.Keywords: venous catheterization, flushing, syringe, numerical simulation
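A very schematic sketch of the simulate-evaluate-update loop described above; the evaluation functions, acceptance criteria and design parameter are placeholders standing in for the actual ABAQUS/ANSYS FLUENT/MOLDFLOW runs and project specifications.

```python
# Schematic sketch (assumed workflow, not the project's actual tooling) of an
# iterative product development loop driven by numerical simulations.

def run_simulations(design: dict) -> dict:
    """Placeholder for structural, flow and moulding simulations of one concept."""
    # In practice these would be ABAQUS / ANSYS FLUENT / MOLDFLOW runs.
    return {"max_stress": 40.0 / design["wall_thickness"],
            "fill_ok": design["wall_thickness"] >= 0.8}

def meets_specifications(results: dict) -> bool:
    """Placeholder acceptance check against the predefined specifications."""
    return results["max_stress"] <= 45.0 and results["fill_ok"]

design = {"wall_thickness": 0.6}  # initial concept (hypothetical parameter, mm)
for iteration in range(20):
    results = run_simulations(design)
    if meets_specifications(results):
        break
    design["wall_thickness"] = round(design["wall_thickness"] + 0.1, 1)  # update concept

print(f"Accepted design after {iteration + 1} iterations: {design}")
```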
Procedia PDF Downloads 167