Search results for: Butterworth low pass filter
306 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea
Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi
Abstract:
Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in world forecasting centers to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major challenge due to the complex topography; these areas also contain different types of climate. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while NCEP/NCAR data are provided every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were examined. The results showed that NCEP/NCAR better captures the intensity of atmospheric systems, while ERA5 is better suited for extracting parameter values at specific points. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the NCEP/NCAR sea surface temperature product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS. However, due to their time lag, they are not suitable for forecast centers; their application is in research and in the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis, but NCEP/NCAR data are available from 1948 onward and are appropriate for long-term research.
Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow
Procedia PDF Downloads 123

305 Mitigating Biofouling on Reverse Osmosis Membranes: Applying Greener Preservatives to Biofilm Treatment
Authors: Anna Curtin, Matthew Thibodeau, Heather Buckley
Abstract:
Water scarcity is characterized by a lack of access to clean and affordable drinking water, as well as water for hygienic and economic needs. The number of people affected by water scarcity is expected to increase in the coming years due to climate change, population growth, and pollution, among other factors. In response, scientists are pursuing cost-effective drinking water treatment methods, often with a focus on alternative water sources. Desalination of seawater via reverse osmosis (RO) is one promising alternative method; however, it is limited significantly by biofouling of the filtration membrane. Biofouling is the buildup of microorganisms in a biofilm at the water-membrane interface. It clogs the membrane, decreasing the efficiency of filtration and consequently increasing operational and maintenance costs. Although effective, existing chemical treatment methods can damage the membrane, shortening its lifespan; promote antibiotic resistance; and harm humans and the environment if they pass through the membrane into the permeate. The current project applies safer preservatives used in home and personal care products to RO membranes to investigate their biofouling treatment efficacy. To date, many of these safer preservatives have only been tested on planktonic cells in suspension cultures, not on cells in biofilms. The results of suspension culture tests are not applicable to biofouling scenarios because organisms in the planktonic phase exhibit different morphological, chemical, and metabolic characteristics than those in a biofilm; testing the antifoulant efficacy of safer preservatives on biofilms will therefore provide results more applicable to biofouling on RO membranes. To do this, biofilms will be grown in 96-well plates, and minimum inhibitory concentrations (MIC90) and log-reductions will be calculated for various safer preservatives. Results from these tests will be used to guide doses for tests of safer preservatives in a bench-scale RO system.
Keywords: reverse osmosis, biofouling, preservatives, antimicrobial, safer alternative, green chemistry
Procedia PDF Downloads 144

304 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, and spatial analysis. A laser scanning system generates an irregularly spaced three-dimensional point cloud. Raw ALS data consist mainly of ground points (representing the bare earth) and non-ground points (representing buildings, trees, cars, etc.). Removing all non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is a difficult and challenging task, as the canopy prevents laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate a weight to each point. Furthermore, unlike most existing methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) is 0.35 m.
Keywords: airborne laser scanning, digital terrain models, filtering, forested areas
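To make the weighted-spline idea concrete, here is a minimal 1-D transect sketch in Python, assuming an iterative reweighting scheme in which returns sitting above the current fit (likely canopy hits) are progressively down-weighted so the spline relaxes onto the terrain. The function, the asymmetric weight, and all parameters are illustrative, not the authors' exact algorithm.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def classify_ground_1d(x, z, n_iter=5, smooth=None, g=2.0):
    """Iteratively fit a weighted smoothing spline to a 1-D ALS transect.

    Points far *above* the current fit (likely canopy returns) are
    down-weighted so the spline relaxes onto the terrain surface.
    """
    order = np.argsort(x)
    x, z = x[order], z[order]
    w = np.ones_like(z)
    for _ in range(n_iter):
        spline = UnivariateSpline(x, z, w=w, s=smooth)
        resid = z - spline(x)
        sigma = np.std(resid) + 1e-9
        # Asymmetric weight: penalize positive residuals (above-ground hits)
        w = np.where(resid > 0, np.exp(-g * resid / sigma), 1.0)
    ground = np.abs(z - spline(x)) < sigma  # final ground/non-ground label
    return ground, spline

# Example: synthetic transect with "canopy" spikes above a smooth terrain
x = np.linspace(0, 100, 400)
terrain = 0.05 * x + 2 * np.sin(x / 15)
canopy = np.random.rand(400) < 0.3
z = terrain + np.where(canopy, np.random.uniform(5, 20, 400), np.random.normal(0, 0.1, 400))
mask, _ = classify_ground_1d(x, z)
print(f"{mask.sum()} of {len(z)} returns labeled ground")
```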
Procedia PDF Downloads 139

303 Problem Solving: Process or Product? A Mathematics Approach to Problem Solving in Knowledge Management
Authors: A. Giannakopoulos, S. B. Buckley
Abstract:
Problem solving in any field is recognised as a prerequisite for any advancement in knowledge; in South Africa, for example, it is one of the seven critical outcomes of education, together with critical thinking. Since a systematic approach to problem solving was initiated in mathematics by the great mathematician George Polya (the father of problem solving), more detailed and comprehensive approaches to problem solving have been developed. This paper is based on the findings of the authors and subsequent recommendations for further research in problem solving and critical thinking. Although the study was done in mathematics, there is little doubt by now that mathematics is involved to a greater or lesser extent in all fields, from symbols, to variables, to equations, to logic, to critical thinking. It therefore stands to reason that mathematical principles and learning cannot be divorced from any field. In knowledge management situations, the types of problems are similar to mathematics problems, varying from simple to analogical to complex, and from well-structured to ill-structured. While simple problems can be solved by employees by adhering to prescribed sequential steps (the process), analogical and complex problems cannot be proceduralised, and that diminishes the organisation's capacity for knowledge creation and innovation. The low efficiency in some organisations and the low pass rates in mathematics prompted the authors to view problem solving as a product. The authors argue that using mathematical approaches to problem solving in knowledge management, and treating problem solving as a product, will empower employees through further training to tackle analogical and complex problems. The question the authors asked was: if it is true that problem solving and critical thinking are indeed basic skills necessary for the advancement of knowledge, why is there so little knowledge management (KM) literature about them, how they are connected, and how they advance KM? This paper concludes with a conceptual model based on generally accepted principles of knowledge acquisition (developing a learning organisation) and of knowledge creation, sharing, dissemination and storage, the five pillars of knowledge management (KM). The model also expands on Gray's framework on KM practices and problem solving and opens the door to a new approach to training employees in general and domain-specific problems, which can be adapted in any type of organisation.
Keywords: critical thinking, knowledge management, mathematics, problem solving
Procedia PDF Downloads 596

302 A Meta-Analysis of the Academic Achievement of Students With Emotional/Behavioral Disorders in Traditional Public Schools in the United States
Authors: Dana Page, Erica McClure, Kate Snider, Jenni Pollard, Tim Landrum, Jeff Valentine
Abstract:
Extensive research has been conducted on students with emotional and behavioral disorders (EBD) and their rates of challenging behavior. In the past, however, less attention has been given to their academic achievement and outcomes. Recent research examining outcomes for students with EBD has indicated that these students receive lower grades, are less likely to pass classes, and experience higher rates of school dropout than students without disabilities and students with other high-incidence disabilities. Given that between 2% and 20% of the school-age population is likely to have EBD (though many may not be identified as such), this is no small problem. Despite the need for increased examination of this population's academic achievement, research on the actual performance of students with EBD has been minimal. This study reports the results of a meta-analysis of the limited research examining the academic achievement of students with EBD, including effect sizes of assessment scores and discussion of moderators potentially impacting academic outcomes. The researchers conducted a thorough literature search to identify potentially relevant documents before screening studies for inclusion in the systematic review. Screening identified 35 studies that reported results of academic assessments for students with EBD. These studies were then coded to extract descriptive data across multiple domains, including student placement, participant demographics, and academic assessment scores. Results indicated possible collinearity between EBD disability status and lower academic assessment scores, despite a lack of association between EBD eligibility and lower cognitive ability. Quantitative analysis of assessment results yielded effect sizes for the academic achievement of student participants, indicating lower performance levels and potential moderators (e.g., race, socioeconomic status, and gender) impacting student academic performance. In addition to discussing the results of the meta-analysis, implications and areas for future research, policy, and practice are discussed.
Keywords: students with emotional behavioral disorders, academic achievement, systematic review, meta-analysis
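As a concrete illustration of the effect sizes referred to above, the following Python sketch computes Hedges' g, a standardized mean difference with small-sample correction, from hypothetical group summary statistics; the scores and sample sizes are made up for illustration.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) with small-sample correction."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # Hedges' correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

# Hypothetical reading scores: students with EBD vs. a comparison group
g, v = hedges_g(m1=92.0, sd1=14.0, n1=40, m2=101.0, sd2=13.0, n2=45)
print(f"g = {g:.2f}, 95% CI = [{g - 1.96*np.sqrt(v):.2f}, {g + 1.96*np.sqrt(v):.2f}]")
```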
Procedia PDF Downloads 69

301 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications
Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso
Abstract:
The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO radar sensor, an optical camera, and a dedicated set of software algorithms encompassing interferometry, tomography, and photogrammetry. The MIMO radar sensor proposed in this work provides extremely high sensitivity to displacements, making the system able to react to tiny deformations (down to tens of microns) on time scales spanning from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped on the azimuth-range directions with notable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, which is distributed in 3D space, is projected onto a 2D plane, where each pixel has as coordinates the line-of-sight distance and the cross-range azimuthal angle. At the same time, the high-performing processing unit allows the system to sense the observed scene with remarkably short refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped onto a 2D digital optical image through photogrammetric techniques, allowing easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user with high-frequency three-dimensional motion/vibration estimates for each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
Keywords: interferometry, MIMO RADAR, SAR, tomography
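The repeat-pass interferometric step rests on the standard relation between phase change and line-of-sight (LOS) displacement, d = -λΔφ/(4π). A minimal Python sketch follows, with an assumed wavelength and sign convention (the abstract does not state the radar band), purely for orientation.

```python
import numpy as np

# Minimal sketch (assumed parameters): converting repeat-pass interferometric
# phase to line-of-sight displacement, d = -lambda * dphi / (4 * pi).
WAVELENGTH = 0.0175  # metres; assumed Ku-band radar (~17 GHz), illustrative only

def los_displacement(phase_t0, phase_t1, wavelength=WAVELENGTH):
    """Return LOS displacement (m) between two acquisitions, per pixel."""
    dphi = np.angle(np.exp(1j * (phase_t1 - phase_t0)))  # wrap to (-pi, pi]
    return -wavelength * dphi / (4 * np.pi)

# Example: a 0.1 rad phase change corresponds to ~0.14 mm of LOS motion
print(los_displacement(np.array([0.0]), np.array([0.1])))
```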
Procedia PDF Downloads 195

300 The BNCT Project Using the Cf-252 Source: Monte Carlo Simulations
Authors: Marta Błażkiewicz-Mazurek, Adam Konefał
Abstract:
The project can be divided into three main parts: i. modeling the Cf-252 neutron source and conducting an experiment to verify the correctness of the obtained results, ii. design of the BNCT system infrastructure, and iii. analysis of the results from the logical detector. Modeling of the Cf-252 source included designing the shape and size of the source as well as the energy and spatial distribution of the emitted neutrons. Two options were considered: a point source and a cylindrical spatial source. The energy distribution corresponded to various spectra taken from the specialized literature, and directionally isotropic neutron emission was simulated. The simulation results were compared with experimental values determined by the activation detector method using indium foils and cadmium shields; the relative fluence rates of thermal and resonance neutrons were compared at chosen locations in the vicinity of the source. The second part of the project, modeling the BNCT infrastructure, consisted of developing a simulation program taking into account all the essential components of this system. Materials with neutron-moderating, absorbing, and backscattering properties were adopted in the project. Additionally, a gamma radiation filter was introduced into the beam output system. The analysis of the simulation results obtained with a logical detector located at the beam exit of the BNCT infrastructure covered the neutron energies and their spatial distribution. Optimization of the system involved changing its size and materials to obtain a suitably collimated beam of thermal neutrons.
Keywords: BNCT, Monte Carlo, neutrons, simulation, modeling
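A minimal Python sketch of the source-sampling step: Watt-spectrum energies drawn by rejection sampling, plus isotropic emission directions. The Watt parameters are commonly quoted values for Cf-252 spontaneous fission (e.g., in MCNP documentation) and are treated here as assumptions; this illustrates the ingredients, not the project's actual Monte Carlo code.

```python
import numpy as np

# Watt spectrum p(E) ~ exp(-E/a) * sinh(sqrt(b*E)); a, b assumed for Cf-252
A_WATT, B_WATT = 1.025, 2.926  # MeV, 1/MeV

def watt_pdf(e):
    return np.exp(-e / A_WATT) * np.sinh(np.sqrt(B_WATT * e))

def sample_source(n, e_max=15.0, rng=np.random.default_rng(0)):
    """Return (energies in MeV, unit direction vectors) for n source neutrons."""
    grid = np.linspace(1e-4, e_max, 4000)
    p_max = watt_pdf(grid).max()
    energies = np.empty(n)
    filled = 0
    while filled < n:                      # rejection sampling of the Watt pdf
        e = rng.uniform(0, e_max, n)
        keep = e[rng.uniform(0, p_max, n) < watt_pdf(e)][: n - filled]
        energies[filled:filled + len(keep)] = keep
        filled += len(keep)
    mu = rng.uniform(-1, 1, n)             # isotropic emission: uniform cos(theta)
    phi = rng.uniform(0, 2 * np.pi, n)
    s = np.sqrt(1 - mu**2)
    dirs = np.column_stack((s * np.cos(phi), s * np.sin(phi), mu))
    return energies, dirs

e, d = sample_source(10000)
print(f"mean source energy ~ {e.mean():.2f} MeV")  # ~2 MeV, typical for Cf-252
```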
Procedia PDF Downloads 29

299 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery
Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang
Abstract:
Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1 to October 31, 2022. Univariate and multivariate logistic regression were used to identify independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients, and bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated in the derivation and validation groups. Results: A total of 1340 vaginal deliveries were enrolled, with 81 (6.04%) having PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical laceration were all independent risk factors for hemorrhage (P < 0.05). The areas under the ROC curves (AUC) of the derivation and validation groups were 0.817 and 0.821, respectively, indicating good discrimination. The two calibration curves showed that nomogram predictions and observed outcomes were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram
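A nomogram of this kind is a graphical rendering of a multivariable logistic model. The following Python sketch, on synthetic data with hypothetical coefficients, shows the underlying fit-and-discriminate workflow (logistic regression plus ROC AUC); it does not reproduce the study's data or model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic cohort mimicking the predictor set described above (all values invented)
rng = np.random.default_rng(0)
n = 1340
X = np.column_stack([
    rng.integers(0, 2, n),          # history of uterine surgery (0/1)
    rng.integers(0, 2, n),          # induction of labor (0/1)
    rng.normal(8, 3, n),            # duration of first stage of labor (h)
    rng.normal(3.3, 0.4, n),        # neonatal weight (kg)
    rng.normal(10, 2, n),           # WBC count (10^9/L)
    rng.integers(0, 2, n),          # cervical laceration (0/1)
])
logit = (-5.6 + 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.10 * X[:, 2]
         + 0.3 * X[:, 3] + 0.02 * X[:, 4] + 0.6 * X[:, 5])  # hypothetical coefficients
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)  # ~6% PPH rate

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"apparent AUC = {auc:.3f}")  # internal validation would use bootstrapping
```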
Procedia PDF Downloads 77

298 Ergonomic Adaptations in Visually Impaired Workers - A Literature Review
Authors: Kamila Troper, Pedro Mestre, Maria Lurdes Menano, Joana Mendonça, Maria João Costa, Sandra Demel
Abstract:
Introduction: Visual impairment is a problem that affects hundreds of thousands of people all over the world. Although it is possible for a visually impaired person to do most jobs, the right training, technological assistance, and emotional support are essential. Ergonomics can solve many of these problems through relatively simple choices of positioning, lighting, and workplace design; a little forethought can make a tremendous difference to the ease with which a person with an impairment functions. Objectives: To review the main ergonomic adaptation measures reported in the literature in order to promote better working conditions and safety measures for the visually impaired. Methodology: This was an exploratory-descriptive, qualitative systematic literature review. The main databases used were PubMed, BIREME, and LILACS, with articles and studies published between 2000 and 2021. Results: Based on the theoretical framework of ergonomic analysis of work, the main restructurings of the physical space of workstations were: accessibility facilities and assistive technologies; screen readers that capture information from a computer and send it in real time to a speech synthesizer or Braille terminal; installation of software with voice recognition; monitors with enlarged screens; magnification software; adequate lighting; and magnifying lenses, in addition to recommendations regarding signage and clearance of the places through which the visually impaired pass. Conclusions: Employability rates for people with visual impairments (both those who are blind and those who have low vision) are low and continue to be a concern worldwide and a topic of international research interest. Although numerous authors have identified barriers to employment and proposed strategies to remediate or circumvent those barriers, people with visual impairments continue to experience high rates of unemployment.
Keywords: ergonomic adaptations, visual impairments, ergonomic analysis of work, systematic review
Procedia PDF Downloads 182

297 Towards Developing A Rural South African Child Into An Engineering Graduate With Conceptual And Critical Thinking Skills
Authors: Betty Kibirige
Abstract:
Students entering the University of Zululand (UNIZULU) Science Faculty mostly arrive with skills that allowed them to prepare for exams and pass them in order to satisfy the requirements for entry into a tertiary institution. Some students hail from deep rural schools with limited facilities, while others come from well-resourced schools. Personal experience has shown that a student may spend their whole time at a tertiary institution relying on the same skills acquired in high school as a sure means of entering the next level of their development, namely a postgraduate programme. While it is apparent that, at this point in human history, it is totally impossible to teach all the possible content in any one subject, many academics approach teaching and learning from the traditional point of view. It therefore became necessary to explore ways of developing a graduate who can approach life with skills that allow them to navigate knowledge by applying conceptual and critical thinking. Recently, the Science Faculty at the University of Zululand introduced two engineering programs. In an endeavour to develop engineering graduates at this institution who are able to tackle problem-solving amid the present-day excess of available information, it became necessary to study and review the approaches used by various academics in order to settle on a possible best approach to the challenge at hand. This paper focuses on developing a deep rural child into a graduate with conceptual and critical thinking skills as major attributes possessed upon graduation. For this purpose, various approaches were studied, and a combination of these approaches was repackaged to form an approach that may appear novel to UNIZULU and the rural child, especially in the engineering discipline. The approach was checked by offering quiz questions to students participating in an engineering module, observing test scores in the targeted module, and making comparative studies. Test results are discussed in the article. It was concluded that students' graduate attributes could indeed be tailored subconsciously to include conceptual and critical thinking skills, but through more than one approach, depending mainly on the student's high school background.
Keywords: graduate attributes, conceptual skills, critical thinking skills, traditional approach
Procedia PDF Downloads 242

296 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated; the number of patients in each tumor stage group, i.e., I-II, III, or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed to extract 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with one-versus-one SVM. Texture analysis of FDG-PET images might thus be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and to automatically classify tumor stage and subtype.
Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis
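A minimal Python analogue of the described pipeline (GLCM texture features, sequential forward selection, then k-NN) on synthetic 8-bit patches. The study itself used MATLAB and a much richer 51-feature set, so this is only a sketch of the workflow, with invented data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def glcm_features(img):
    """Four GLCM properties at two angles -> 8-dimensional feature vector."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# 42 synthetic "tumor ROIs" belonging to two texture classes
labels = np.arange(42) % 2
X = np.array([
    glcm_features(rng.integers(0, 128 if lab else 256, (32, 32), dtype=np.uint8))
    for lab in labels
])

knn = KNeighborsClassifier(n_neighbors=3)
sfs = SequentialFeatureSelector(knn, n_features_to_select=5, direction="forward")
X_sel = sfs.fit_transform(X, labels)            # sequential forward selection
print("CV accuracy:", cross_val_score(knn, X_sel, labels, cv=5).mean())
```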
Procedia PDF Downloads 326

295 Risk-Sharing Financing of Islamic Banks: Better Shielded against Interest Rate Risk
Authors: Mirzet SeHo, Alaa Alaabed, Mansur Masih
Abstract:
In theory, risk-sharing-based financing (RSF) is considered a cornerstone of Islamic finance and is argued to render Islamic banks more resilient to shocks. In practice, however, this feature of Islamic financial products is almost negligible; instead, debt-based instruments with conventional-like features have overwhelmed the nascent industry. In addition, the framework of present-day economic, regulatory, and financial reality inevitably exposes Islamic banks in dual banking systems to the problems of conventional banks, including, but not limited to, interest rate risk. Empirical evidence has, thus far, confirmed such exposures, despite Islamic banks' interest-free operations. This study applies system GMM to model the determinants of RSF and finds that RSF is insensitive to changes in interest rates. Hence, our results support the 'stability' view of risk-sharing-based financing, suggesting RSF as the way forward for risk management at Islamic banks in the absence of widely accepted Shariah-compliant hedging instruments. Further support for the stability view comes from evidence of counter-cyclicality: unlike debt-based lending, which inflates artificial asset bubbles through credit expansion during the upswing of business cycles, RSF is negatively related to GDP growth. Our results also imply a significantly strong relationship between risk-sharing deposits and RSF. However, the pass-through of these deposits to RSF is economically low: only about 40% of risk-sharing deposits are channeled to risk-sharing financing. This raises questions about the validity of the industry's claim that depositors accustomed to conventional banking shun risk sharing, and it signals potential for better balance sheet management at Islamic banks. Overall, our findings suggest that, on the one hand, Islamic banks can gain 'independence' from conventional banks and interest rates through risk-sharing products, the potential for which is enormous; on the other hand, RSF could enable policy makers to improve systemic stability and restrain excessive credit expansion through its countercyclical features.
Keywords: Islamic banks, risk-sharing, financing, interest rate, dynamic system GMM
Procedia PDF Downloads 316

294 Meditation and Insight Interpretation Using Quantum Circle Based on Experiment and Quantum Relativity Formalism
Authors: Somnath Bhattachryya, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin
Abstract:
In this study and research on meditation and insight, the design of, and experiments with, electronic circuits that manipulate the meditators' mental circles, called the chakras, so that they have the same size is proposed. The circuit is a 4-port device, called an add-drop multiplexer, that mirrors the meditation structure known as the four foundations of mindfulness. An AC power signal is then used as an input instead of the meditation time function, and various behaviors are studied with a method of successive re-filtering of the signal, like the noble eightfold path. The procedure starts by inputting a signal at a frequency that causes the wave velocity on the perimeter of the circuit to give particles the speed of light in a vacuum. The signal changes between electromagnetic waves and matter waves according to the velocity (frequency) until it reaches the relativistic limit. The electromagnetic waves are transformed into photons with wave-particle properties overcoming the limits of the speed of light. The matter wave, in contrast, travels to the other side and cannot pass through the relativistic limit; it is called a shadow signal (echo), which can gain power from increasing speed but cannot exceed the speed of light, or insight. In the experiment, only the side where the velocity is positive, i.e., where the speed exceeds that of light, or the corresponding frequency, indicates intelligence. The other side (echo) can be obtained by changing the input signal to the other side of the circuit to get the same result, but without intelligence or speed beyond light. The setup is also used to study the stretching and contraction of time and wormholes, which can be applied to teleporting, Bose-Einstein condensates, teleprinting, and the quantum telephone. Teleporting can happen throughout the system with the wave-particle and the echo: when the speed of the particle is faster than the stretching or contraction of time, the particle submerges into the wormhole and, once the destination and time are determined, travels through it; in a wormhole, time can be set in the future or the past. The experimental results using the microstrip circuit have been found to accord with the principle of quantum relativity, which can be further developed, both as tools and for meditation practitioners, toward quantum technology.
Keywords: quantum meditation, insight picture, quantum circuit, absolute time, teleportation
Procedia PDF Downloads 64

293 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility
Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos
Abstract:
The presence of sulphur compounds like hydrogen sulphide and mercaptans is one of the reasons why waste-water treatment and waste management are associated with odour emissions. In this context, a method for quantifying these compounds helps in optimizing treatments aimed at their elimination, namely biofiltration processes. The aim of this study was the development of a method for quantifying odorous gases in air samples from waste treatment plants. A method based on head-space solid-phase microextraction (HS-SPME) coupled with gas chromatography with flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the analysis was performed on a GC 2010 Plus A from Shimadzu with a sulphur filter detector: splitless mode (0.3 min), with the column temperature program starting at 60 ºC and increasing by 15 ºC/min to 100 ºC (2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas dilution unit (digital Hovagas G2 Multi Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the gas to be diluted and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration and automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration in the range 1-20 ppm for H2S and in the ranges 0.02-0.1 ppm and 1-3.5 ppm for MM. Repeated quantification of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.
Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification
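A minimal sketch of the quantification step: fitting an external calibration line to the gas-dilution standards and back-calculating an unknown sample. The peak areas below are made up, and a straight line is assumed for simplicity; in practice the FPD's sulphur response is often non-linear, so a quadratic or log-log fit may be required.

```python
import numpy as np

# Hypothetical calibration standards prepared with the gas dilution unit
std_conc = np.array([1, 5, 10, 15, 20])             # H2S standards, ppm
std_area = np.array([210, 1040, 2130, 3180, 4190])  # GC-FPD peak areas (invented)

slope, intercept = np.polyfit(std_conc, std_area, 1)
r2 = np.corrcoef(std_conc, std_area)[0, 1] ** 2
print(f"area = {slope:.1f} * ppm + {intercept:.1f}, R^2 = {r2:.4f}")

sample_area = 1550.0                                # unknown biofilter-inlet sample
print(f"H2S ~ {(sample_area - intercept) / slope:.2f} ppm")
```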
Procedia PDF Downloads 476

292 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced composites (CFRP) used as aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure is damaged by the excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology in both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
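The particle filtering step can be sketched as a bootstrap filter over a hidden damage state observed through noisy resistance measurements. In the Python sketch below, the propagation and measurement functions are simple placeholders standing in for the resistor-network model; everything is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(x):          # damage growth model (assumed random walk)
    return x + rng.normal(0, 0.01, size=x.shape)

def measure(x):            # maps damage state -> predicted resistance (stub)
    return 1.0 + 2.0 * x

def particle_filter(measurements, n_particles=1000, meas_sigma=0.05):
    """Bootstrap particle filter: propagate, weight by likelihood, resample."""
    x = rng.uniform(0, 0.2, n_particles)           # initial damage particles
    estimates = []
    for z in measurements:
        x = propagate(x)
        w = np.exp(-0.5 * ((z - measure(x)) / meas_sigma) ** 2)  # likelihood
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)          # resample
        x = x[idx]
        estimates.append(x.mean())
    return np.array(estimates)

# Synthetic growing damage observed through noisy resistance readings
true_damage = np.linspace(0.05, 0.5, 50)
z = measure(true_damage) + rng.normal(0, 0.05, 50)
print(particle_filter(z)[-5:])   # tracked damage estimates near the end
```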
Procedia PDF Downloads 195

291 Language Choice and Language Maintenance of Northeastern Thai Staff in Suan Sunandha Rajabhat University
Authors: Napasri Suwanajote
Abstract:
The purposes of this research were to analyze and evaluate success factors in the OTOP production process for the development of a learning center on the OTOP production process, based on the Sufficiency Economy Philosophy, for sustainable quality of life. The research was designed as a qualitative study gathering information from 30 OTOP producers in Bangkontee District, Samudsongkram Province. They were all interviewed on three main topics. Part 1 concerned the production process, including 1) production, 2) product development, 3) community strength, 4) marketing possibility, and 5) product quality. Part 2 evaluated the relevant success factors, including 1) analysis of the success factors, 2) evaluation of the strategy based on the Sufficiency Economy Philosophy, and 3) the model of a learning center on the OTOP production process based on the Sufficiency Economy Philosophy for sustainable quality of life. The results showed that production did not affect the environment and had the potential to maintain standard-quality production using raw materials sourced within the country. Regarding product and community strength over the past year, it was found that there was no appropriate packaging showing product identity according to global market standards; producers needed training on packaging, especially for food and drink products. Regarding product quality and product specification, the products were certified by the local OTOP standard, and there should be a responsible organization to help uncertified producers pass the standard. However, there was a problem of food contamination hazardous to consumers; producers should cooperate with the government sector or educational institutes involved in food processing to reach the FDA standard. The results from the small group discussion showed that the community expected higher education and a better standard of living; problems reported by the community included informal debt and drugs. Eight steps were identified in developing the model of a learning center on the OTOP production process based on the Sufficiency Economy Philosophy for sustainable quality of life.
Keywords: production process, OTOP, sufficiency economy philosophy, language choice
Procedia PDF Downloads 237

290 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; perhaps this model will reveal another insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
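For reference, the classical estimator mentioned above can be illustrated with a scalar Kalman filter tracking the latent level of an AR(1) series; all parameters in this Python sketch are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, q, r = 0.9, 0.1, 0.5          # AR coefficient, process noise, measurement noise

# Simulate a latent AR(1) state and noisy observations of it
T = 200
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)

# Kalman filter recursion: predict, then correct with the Kalman gain
x_hat, p = 0.0, 1.0
estimates = []
for z in y:
    x_pred, p_pred = phi * x_hat, phi**2 * p + q       # predict
    k = p_pred / (p_pred + r)                          # Kalman gain
    x_hat = x_pred + k * (z - x_pred)                  # update
    p = (1 - k) * p_pred
    estimates.append(x_hat)

print(f"RMSE filtered: {np.sqrt(np.mean((np.array(estimates) - x)**2)):.3f}")
print(f"RMSE raw obs:  {np.sqrt(np.mean((y - x)**2)):.3f}")
```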
Procedia PDF Downloads 469

289 Analysis of Waiting Time and Driver Fatigue at Manual Toll Plazas and Suggestion of an Automated Toll Tax Collection System
Authors: Muhammad Dawood Idrees, Maria Hafeez, Arsalan Ansari
Abstract:
Toll tax collection is one of the earliest methods of tax collection and revenue generation. This revenue is utilized for the development and maintenance of road networks and of connecting roads and highways across the country. Pakistan is a large country covering a wide area of land, and road networks and motorways are an important means of connecting its cities. Every day, millions of people use motorways and have to stop at toll plazas to pay toll tax, as the majority of toll plazas collect toll tax manually. The purpose of this study is to calculate the waiting time of vehicles on the Karachi-Hyderabad (M-9) motorway; Karachi is the biggest city in Pakistan, and hundreds of thousands of people use this route to reach other cities. Currently, toll collection is a manual system, which is a major cause of long waiting times at the toll plaza. This study calculates the waiting time of vehicles, the fuel consumed while waiting, and the manpower employed at the toll plaza, since the whole process is manual and also leads to the mental and physical fatigue of drivers. All of these wasted resources are quantified, and a feasible automatic toll tax collection system is proposed that is beneficial not only in reducing waiting time but also in reducing fuel consumption, manpower, and physical and mental fatigue. A cost comparison in terms of wastage is also shown between the manual and automatic (E-Z Pass) toll collection systems. The results of this study reveal that if an automatic toll collection system is implemented on the Karachi-Hyderabad motorway (M-9), there will be a significant reduction in vehicle waiting time, which in turn reduces fuel consumption, environmental pollution, and the mental and physical fatigue of drivers. All of these reductions are also calculated in terms of money (Pakistani rupees), and it is found that millions of rupees can be saved by using an automatic toll collection system, which will improve the economy of the country.
Keywords: toll tax collection, waiting time, wastages, driver fatigue
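The kind of waiting-time estimate involved can be sketched with a simple queueing model. The snippet below assumes an M/M/1 queue (the study's own method may differ) and illustrative arrival and service rates for a manual booth versus an electronic lane:

```python
# Minimal sketch, assuming an M/M/1 queue: mean waiting time Wq = rho / (mu - lambda)
def mm1_wait_in_queue(arrival_rate, service_rate):
    """Mean time (s) a vehicle waits before service in an M/M/1 queue."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        return float("inf")           # queue grows without bound
    return rho / (service_rate - arrival_rate)

lam = 1 / 12                          # one vehicle arriving every 12 s (illustrative)
manual = mm1_wait_in_queue(lam, 1 / 10)   # ~10 s manual transaction
etag = mm1_wait_in_queue(lam, 1 / 3)      # ~3 s electronic pass-through
print(f"manual booth:    {manual:.0f} s average wait")   # ~50 s
print(f"electronic lane: {etag:.0f} s average wait")     # ~1 s
```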
Procedia PDF Downloads 150

288 Ecological-Economics Evaluation of Water Treatment Systems
Authors: Hwasuk Jung, Seoi Lee, Dongchoon Ryou, Pyungjong Yoo, Seokmo Lee
Abstract:
The Nakdong River, used as the drinking water source for the Busan metropolitan city, is vulnerable in terms of water management because industrial areas are located on the upper Nakdong River. Most citizens of Busan think that the water quality of the Nakdong River is not good, so they boil their tap water or use home filters before drinking it, which imposes unnecessary individual costs on Busan citizens. Intake sources need to be diversified to reduce this cost and to replace the weak water source. Against this background, this study carried out environmental accounting of the Namgang dam water treatment system compared with the Nakdong River water treatment system, using the emergy analysis method to support reasonable decision-making. Emergy analysis evaluates both the natural environment and human economic activities quantitatively, in an equal unit of measure. The emergy transformity of Namgang dam water was 1.16 times larger than that of Nakdong River water; Namgang dam water shows a larger emergy transformity because of its better water quality. The emergy used in making 1 m3 of tap water in the Namgang dam water treatment system was 1.26 times larger than in the Nakdong River water treatment system, owing to the construction cost of a new pipeline for the intake of Namgang dam water. If the Won used in making 1 m3 of tap water in the Nakdong River water treatment system is set to 1, the Namgang dam water treatment system used 1.66; in Em-won terms, the Namgang dam system used 1.26. The cost-benefit ratio in Em-won was thus smaller than that in Won. When emergy analysis is used, which accounts for the benefits of the natural environment such as the good water quality of the Namgang dam, the Namgang dam water treatment system could be a good alternative for diversifying the intake source.
Keywords: emergy, emergy transformity, Em-won, water treatment system
Procedia PDF Downloads 305

287 Three Issues for Integrating Artificial Intelligence into Legal Reasoning
Authors: Fausto Morais
Abstract:
Artificial intelligence has been widely used in law. Programs are able to classify suits, identify decision-making patterns, predict outcomes, and formalize legal arguments. In Brazil, the artificial intelligence system Victor has been classifying cases according to the Supreme Court's standards. When those programs perform such tasks, they simulate a kind of legal decision and legal argument, raising doubts about how artificial intelligence can be integrated into legal reasoning. Taking this into account, the following three issues are identified: the problem of hypernormatization, the argument of legal anthropocentrism, and artificial legal principles. Hypernormatization can be seen in the Brazilian legal context in the Supreme Court's usage of the Victor program. This program has generated efficiency and consistency; on the other hand, there is a real risk of over-standardizing factual and normative legal features. Legal clerks and programmers should therefore work together to develop an adequate way to model legal language in computational code. If this is possible, intelligent programs may enact legal decisions automatically in easy cases, and in this picture, the legal anthropocentrism argument takes place. That argument holds that only human beings should enact legal decisions, because human beings have a conscience, free will, and self-unity. In spite of that, it is possible to argue against the anthropocentrism argument and to show how intelligent programs may overcome human problems like flawed cognition, emotions, and lack of memory. In this way, intelligent machines could pass legal decisions automatically by classification, as Victor does in Brazil, because they are bound by legal patterns and should not deviate from them. Notwithstanding, artificially intelligent programs can be helpful beyond easy cases. In hard cases, they are able to identify legal standards and legal arguments by using machine learning, provided that a dataset of legal decisions on the particular matter is available, which is a reality in the Brazilian judiciary. With this procedure, artificially intelligent programs can support a human decision in hard cases, providing legal standards and arguments based on empirical evidence. Those legal features carry argumentative weight in legal reasoning and should serve as references for judges when deciding whether to maintain or overturn a legal standard.
Keywords: artificial intelligence, artificial legal principles, hypernormatization, legal anthropocentrism argument, legal reasoning
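The case-classification capability mentioned at the outset is, at its core, supervised text classification. A toy Python sketch follows, with invented suit descriptions and labels; it is not Victor's actual model, data, or categories.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented mini-corpus: suit summaries labeled by precedent category
train_texts = [
    "appeal on pension adjustment under social security rules",
    "tax assessment dispute over federal levy",
    "pension benefit recalculation requested by retiree",
    "challenge to tax enforcement procedure",
]
train_labels = ["social_security", "tax", "social_security", "tax"]

# TF-IDF features feeding a linear classifier
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_texts, train_labels)
print(clf.predict(["retiree appeals recalculation of pension benefits"]))
```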
Procedia PDF Downloads 145

286 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images
Authors: Sophia Shi
Abstract:
Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. The mortality rate of AKI patients in the ICU is high, and onset is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model predicting AKI that takes advantage of two types of data. De-identified patient data from the MIMIC-III database, and de-identified kidney images with corresponding patient records from the Beijing Hospital of the Ministry of Health, were collected. Using data features including serum creatinine, among others, two numeric models were built using the MIMIC and Beijing Hospital data, and an image-only model was built with the hospital ultrasounds. Convolutional neural networks (CNNs) were used, VGG and ResNet for the numeric data and ResNet for the image data, and they were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input. This input enters another CNN block and then two fully connected layers, ending in a binary output after a softmax layer and additional post-processing. The hybrid model successfully predicted AKI: its highest AUROC was 0.953, with an accuracy of 90% and an F1-score of 0.91. This model could be implemented in urgent clinical settings such as the ICU to aid doctors by assessing the risk of AKI shortly after the patient's admission, so that doctors can take preventative measures and diminish mortality risks and severe kidney damage.
Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG
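The fusion step, concatenating the features of the image branch and the tabular branch before a shared head, can be sketched as follows in PyTorch. The layer sizes and the simplified image branch are assumptions for illustration; the paper's actual branches are VGG/ResNet backbones.

```python
import torch
import torch.nn as nn

class HybridAKINet(nn.Module):
    """Two-branch network: image features + tabular features -> binary logit."""
    def __init__(self, n_tabular=16):
        super().__init__()
        self.image_branch = nn.Sequential(      # stand-in for a ResNet backbone
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(), nn.Linear(16 * 4 * 4, 64),
        )
        self.tabular_branch = nn.Sequential(
            nn.Linear(n_tabular, 64), nn.ReLU(),
        )
        self.head = nn.Sequential(              # fused features -> binary logit
            nn.Linear(64 + 64, 32), nn.ReLU(), nn.Linear(32, 1),
        )

    def forward(self, image, tabular):
        fused = torch.cat([self.image_branch(image),
                           self.tabular_branch(tabular)], dim=1)  # concatenation
        return self.head(fused)                 # train with BCEWithLogitsLoss

model = HybridAKINet()
logit = model(torch.randn(8, 1, 64, 64), torch.randn(8, 16))
print(logit.shape)  # torch.Size([8, 1])
```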
Procedia PDF Downloads 131

285 Effects of Essential Oils on the Intestinal Microflora of Termite (Heterotermes indicola)
Authors: Ayesha Aihetasham, Najma Arshad, Sobia Khan
Abstract:
Damage caused by subterranean termites is of major concern today. Termites are mainly treated with pesticides, which has resulted in several health- and environment-related problems. For this reason, plant-derived natural products, specifically essential oils, have been evaluated for termite control. The aim of the present study was to investigate the antitermitic potential of six essential oils against the subterranean termite Heterotermes indicola. A no-choice bioassay was used to assess the termiticidal action of the essential oils. Further, the gut from each treated termite group was extracted and analyzed for reductions in the numbers of protozoa and bacteria, by the protozoal count method using a haemocytometer and by viable bacterial plate counts (dilution method), respectively. In the no-choice bioassay, Foeniculum vulgare oil caused the highest degree of mortality (90% average mortality at a 10 mg oil concentration, i.e., 10 mg/0.42 g weight of filter paper). The least mortality was caused by Citrus sinensis oil (43.33% average mortality at 10 mg/0.42 g). The highest activity was verified for Foeniculum vulgare, followed by Eruca sativa, Trigonella foenum-graecum, Peganum harmala, Syzygium cumini, and Citrus sinensis. The essential oil that caused the greatest reduction in the number of protozoa was P. harmala, followed by T. foenum-graecum and E. sativa. For the bacterial count, E. sativa oil produced the maximum decrease in bacterial numbers (6.4×10⁹ CFU/ml). It is concluded that F. vulgare, E. sativa, and P. harmala essential oils are highly effective against the H. indicola termite and its gut microflora.
Keywords: bacterial count, essential oils, Heterotermes indicola, protozoal count
Procedia PDF Downloads 246

284 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the 'kidneys of our planet', they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and the loss of native species are ailing our wetlands, there is a lack of uniform data collection, and research on correlations between satellite data and in situ measurements is sparse. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate seven spectral indices (SIs), and average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 parameters (OPs), whose average values were likewise calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It demonstrates that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication and can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
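The Fourier step can be sketched in a few lines of Python: take the FFT of a monthly SI series and keep the dominant frequencies and amplitudes as ML features. The series below is synthetic, with an annual cycle baked in for illustration.

```python
import numpy as np

# Synthetic 23-year monthly spectral-index series with a seasonal cycle
months = np.arange(23 * 12)
rng = np.random.default_rng(0)
si = 0.4 + 0.1 * np.sin(2 * np.pi * months / 12) + 0.02 * rng.normal(size=months.size)

# FFT of the mean-removed series; frequencies in cycles per month
spectrum = np.fft.rfft(si - si.mean())
freqs = np.fft.rfftfreq(months.size, d=1.0)
amps = np.abs(spectrum) / months.size

top = np.argsort(amps)[::-1][:3]                 # three dominant components
for i in top:
    period = np.inf if freqs[i] == 0 else 1 / freqs[i]
    print(f"period ~ {period:.1f} months, amplitude {amps[i]:.3f}")
# The ~12-month component dominates, as expected for a seasonal signal.
```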
Procedia PDF Downloads 104

283 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence in the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions that can be used for evidence-based practice and that are extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach acceptable state-of-the-art performance based on a two-stage bootstrapping process involving filtering relevant articles and then identifying articles that support the outcome requested in the PICO question. Moreover, we report experiments that empower our bootstrapping techniques with patch attention to the most important keywords in the clinical case and the PICO questions. Our attention-patched bootstrapping shows the relevancy of the collected evidence, as measured by entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
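A minimal sketch of extractive Q&A over a retrieved abstract using the Hugging Face transformers library; the checkpoint and the PICO-style passage are illustrative stand-ins, not the authors' fine-tuned models or data.

```python
from transformers import pipeline

# Off-the-shelf extractive QA model (illustrative checkpoint choice)
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

# Invented PICO-style passage standing in for a retrieved PubMed abstract
abstract = ("In adults with type 2 diabetes (Patient), metformin (Intervention) "
            "compared with sulfonylureas (Comparison) reduced cardiovascular "
            "mortality over five years (Outcome).")

question = "Which intervention reduced cardiovascular mortality?"
print(qa(question=question, context=abstract))
# -> a dict like {'answer': 'metformin', 'score': ..., 'start': ..., 'end': ...}
```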
Procedia PDF Downloads 43

282 Quantitative Proteome Analysis and Bioactivity Testing of New Zealand Honeybee Venom
Authors: Maryam Ghamsari, Mitchell Nye-Wood, Kelvin Wang, Angela Juhasz, Michelle Colgrave, Don Otter, Jun Lu, Nazimah Hamid, Thao T. Le
Abstract:
Bee venom, a complex mixture of peptides, proteins, enzymes, and other bioactive compounds, has been widely studied for its therapeutic applications. This study investigated the proteins present in New Zealand (NZ) honeybee venom (BV) using bottom-up proteomics. Two sample digestion techniques, in-solution digestion and filter-aided sample preparation (FASP), were employed to identify the optimal method for protein digestion. Sequential Window Acquisition of All Theoretical Mass Spectra (SWATH-MS) analysis was conducted to quantify the protein composition of NZ BV and investigate variations across collection years. Our results revealed a high protein content (158.12 µg/mL), with the FASP method yielding a larger number of identified proteins (125) than in-solution digestion (95). SWATH-MS indicated melittin and phospholipase A2 as the most abundant proteins. Significant variations in protein composition were observed across samples from different years (2018, 2019, 2021), with implications for the venom's bioactivity. In vitro testing demonstrated immunomodulatory and antioxidant activities, with a viable range for cell growth established at 1.5-5 µg/mL. The study underscores the value of proteomic tools in characterizing bioactive compounds in bee venom, paving the way for deeper exploration of their therapeutic potential. Further research is needed to fractionate the venom and elucidate the mechanisms of action of the identified bioactive components.
Keywords: honeybee venom, proteomics, bioactivity, fractionation, SWATH-MS, melittin, phospholipase A2, New Zealand, immunomodulatory, antioxidant
Procedia PDF Downloads 39

281 A Sub-Conjunctival Injection of Rosiglitazone for Anti-Fibrosis Treatment after Glaucoma Filtration Surgery
Authors: Yang Zhao, Feng Zhang, Xuanchu Duan
Abstract:
Trans-differentiation of human Tenon fibroblasts (HTFs) to myofibroblasts and fibrosis of episcleral tissue are the most common reasons for the failure of glaucoma filtration surgery, and treatment options such as antimetabolites are limited and carry side effects such as leakage of the filtering bleb, infection, hypotony, and endophthalmitis. Rosiglitazone, a specific thiazolidinedione, is a synthetic high-affinity ligand for PPAR-γ that has been used in the treatment of type 2 diabetes and has been found to have pleiotropic functions against inflammatory responses, cell proliferation, and tissue fibrosis, benefiting a variety of disease models, including animal myocardium and steatohepatitis models. Here, in vitro, we cultured primary HTFs, stimulated them with TGF-β to induce myofibrogenic trans-differentiation, and then treated the cells with rosiglitazone to assess the fibrogenic response. In vivo, we used a rabbit glaucoma model to establish the formation of post-trabeculectomy scarring; we then administered a subconjunctival injection of rosiglitazone beside the filtering bleb, after which protein levels, mRNA, and immunofluorescence of fibrogenic markers were checked, and the condition of the filtering bleb was assessed. In vitro, we found that rosiglitazone could suppress the proliferation and migration of fibroblasts through macroautophagy via the TGF-β/Smad signaling pathway. In vivo, on postoperative day 28, the mean number of fibroblasts in the rosiglitazone injection group was the lowest, with the least collagen content and connective tissue growth factor. Rosiglitazone effectively controlled human and rabbit fibroblasts in vivo and in vitro, and its subconjunctival application may represent an effective new avenue for the prevention of scarring after glaucoma surgery.
Keywords: fibrosis, glaucoma, macroautophagy, rosiglitazone
Procedia PDF Downloads 274

280 Partnerships for Environmental Sustainability: An Effective Multistakeholder Governance Regime for Oil and Gas Producing Areas
Authors: Joy Debski
Abstract:
Due to the varying degrees of the problems posed by global warming, environmental sustainability dominates international discourse. The aims and expectations of international initiatives have proven particularly challenging to put into practice in developing nations. Stricter measures are urgently needed to reduce human exploitation of the environment, but putting them into practice has proven difficult. Relatively recent information from the Climate Accountability Institute and academic researchers shows that fossil fuel companies are major contributors to the climate crisis. Host communities in oil and gas producing areas, particularly in developing nations, have grown hostile toward both oil and gas companies and government policies. It is now essential that the three main stakeholders, the government, the oil and gas sector, and host communities, cooperate to achieve the shared objective of environmental sustainability. This research therefore advocates a governance system for Nigeria that facilitates achieving the goal of environmental sustainability. This objective is achieved through the research's examination of the main institutional framework for environmental sustainability, its evaluation of the strategies used by major oil companies to increase stakeholder engagement in environmental sustainability, and its examination of the involvement of host communities in environmental sustainability. The study reveals that while environmental sustainability is important to the identified stakeholders, it is challenging to accomplish without an informed synergy. Hence, the research advocates the centralisation of CSR through a CSR commission for environmental sustainability, whose mandate would be to facilitate, partner with, and endorse companies. The commission is strongly advised to incorporate host community liaison offices into the process of negotiating contracts with oil and gas firms, as well as to play a facilitative role in helping firms adhere to both domestic and international regulations. These recommendations can benefit Nigerian policymakers in reviving their so-far unsuccessful efforts to pass CSR legislation. Through the research-proposed CSR department, with competent training and stakeholder engagement strategies, oil and gas companies can enhance and centralise their environmental sustainability goals. Finally, the CSR Commission's expertise would give host communities more leverage when negotiating their memoranda of understanding with oil and gas companies.
Keywords: environmental sustainability, corporate social responsibility, CSR, oil and gas, Nigeria
Procedia PDF Downloads 82279 Formulation of Suppositories Using Allanblackia Floribunda Butter as a Base
Authors: Mary Konadu
Abstract:
The rectal route of drug administration is attractive to formulators because it can avoid hepatic first-pass metabolism, decrease gastrointestinal side effects, and avoid the undesirable effects of meals on drug absorption. Suppositories are recognized as an alternative to the oral route when the patient is comatose or unable to swallow, or when the drug produces nausea or vomiting. Effective drug delivery with appropriate pharmaceutical excipients is key to producing clinically useful preparations. The high cost of available excipients, coupled with their other disadvantages, has led to the exploration of potential excipients from natural sources. Allanblackia floribunda butter, a naturally occurring lipid, is used for medicinal, culinary, and cosmetic purposes. Three methods were used to extract the oil: solvent (hexane) extraction, traditional hot water extraction, and cold (screw) press extraction. The extracts were analyzed for their physicochemical properties and mineral content, and the oil was used as a base to formulate paracetamol and diclofenac suppositories. Quality control tests were carried out on the formulated suppositories. The percentage oil yields for the hexane, hot water, and cold press extracts were 50.40 ± 0.00, 37.36 ± 0.00, and 20.48 ± 0.00, respectively. The acid value, saponification value, iodine value, and free fatty acid content were 1.159 ± 0.065, 208.51 ± 8.450, 49.877 ± 0.690, and 0.583 ± 0.032, respectively, for the hexane extract; 3.480 ± 0.055, 204.672 ± 2.863, 49.04 ± 0.76, and 1.747 ± 0.028, respectively, for the hot water/traditional extract; and 4.43 ± 0.055, 192.05 ± 1.56, 49.96 ± 0.29, and 2.23 ± 0.03, respectively, for the cold press extract. Calcium, sodium, magnesium, potassium, and iron were found in the A. floribunda butter extracts. The uniformity of weight, hardness, disintegration time, and uniformity of content were within acceptable ranges, and the melting point ranges of all suppositories were satisfactory. The cumulative drug release (%) at 45 minutes for the paracetamol suppositories was 90.19 ± 0.00 (hot water extract), 93.75 ± 0.00 (cold press extract), and 98.16 ± 0.00 (hexane extract); the diclofenac sodium suppositories had cumulative percentage releases of 81.60 ± 0.00 (hot water extract), 95.33 ± 0.00 (cold press extract), and 99.20 ± 0.00 (hexane extract). The physicochemical parameters obtained in this study show that Allanblackia floribunda seed oil is edible and can be used as a suppository base. The suppository formulations were successful, and the quality control tests conformed to pharmacopoeial standards.Keywords: allanblackia floribunda, paracetamol, diclofenac, suppositories
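As a quick consistency check on the figures reported above, note that acid value and free fatty acid content are linked by a standard conversion: %FFA expressed as oleic acid equals the acid value (mg KOH/g) multiplied by the ratio of the molar mass of oleic acid to ten times that of KOH, roughly 0.503. The short Python sketch below applies this conversion to the reported acid values; the conversion factor is standard oil chemistry, not a figure taken from the abstract.

```python
# Consistency check (not from the paper): convert acid value (mg KOH/g oil)
# to % free fatty acid expressed as oleic acid.
# %FFA = AV * MW_oleic / (10 * MW_KOH) = AV * 282.46 / 561.1, about AV * 0.503

MW_OLEIC = 282.46   # g/mol, oleic acid
MW_KOH = 56.11      # g/mol, potassium hydroxide

def ffa_as_oleic(acid_value: float) -> float:
    """Convert acid value (mg KOH/g) to %FFA expressed as oleic acid."""
    return acid_value * MW_OLEIC / (10 * MW_KOH)

# Reported acid values for the three A. floribunda extracts
extracts = {"hexane": 1.159, "hot water": 3.480, "cold press": 4.43}
for name, av in extracts.items():
    print(f"{name}: AV = {av} mg KOH/g -> %FFA = {ffa_as_oleic(av):.3f}")
# Gives roughly 0.583, 1.752, and 2.230, matching the reported free fatty
# acid values (0.583, 1.747, 2.23) within rounding and stated error.
```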
Procedia PDF Downloads 122278 Effect of Varying Zener-Hollomon Parameter (Temperature and Flow Stress) and Stress Relaxation on Creep Response of Hot Deformed AA3104 Can Body Stock
Authors: Oyindamola Kayode, Sarah George, Roberto Borrageiro, Mike Shirran
Abstract:
Our industrial partner has identified a phenomenon in which AA3104 can body stock (CBS) transfer bars sag during transport of the slab from the breakdown mill to the finishing mill. Excessive sag results in the bottom of the slab scuffing on the roller table, producing surface defects on the final product. It has been found that increasing the strain rate on the final breakdown-mill pass yields a slab that resists sag. The creep response of material hot deformed at different Zener-Hollomon parameter values needs to be evaluated experimentally to better understand the operating mechanism. This study investigates the phenomenon through laboratory simulation of breakdown-mill conditions at various strain rates using the Gleeble at the UCT Centre for Materials Engineering. The experiments determine the creep response for a range of conditions and quantify the associated microstructure (sub-grain size, grain structure, etc.). The experimental matrices were based on conditions approximating industrial hot breakdown rolling and were carried out on the Gleeble 3800 at the Centre for Materials Engineering, University of Cape Town. Plane strain compression samples were used for this series of tests, at an applied load that allows better contact and exaggerated creep displacement. A tantalum barrier layer was used for increased conductivity and decreased risk of anvil welding. One set of tests with no in-situ hold time was performed, in which the samples were quenched immediately after deformation. The samples were retained for microstructural analysis: micrographs from light microscopy (LM), quantitative data and images from scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX), and sub-grain size and grain structure from electron backscatter diffraction (EBSD).Keywords: aluminium alloy, can-body stock, hot rolling, creep response, Zener-Hollomon parameter
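For readers unfamiliar with the parameter in the title: the Zener-Hollomon parameter combines strain rate and deformation temperature into a single temperature-compensated strain rate, Z = strain rate × exp(Q/RT). The sketch below computes Z for a few illustrative breakdown-mill conditions; the activation energy Q and the strain-rate and temperature values are assumptions chosen for illustration, not values from this study.

```python
import math

R = 8.314          # J/(mol*K), universal gas constant
Q = 156_000        # J/mol: assumed activation energy for hot deformation
                   # of aluminium alloys; the study's fitted value may differ.

def zener_hollomon(strain_rate: float, temp_c: float) -> float:
    """Z = strain_rate * exp(Q / (R*T)), with T in kelvin."""
    t_k = temp_c + 273.15
    return strain_rate * math.exp(Q / (R * t_k))

# Hypothetical breakdown-mill conditions (illustrative, not the study's):
for rate in (1.0, 5.0, 20.0):            # strain rates, 1/s
    for temp in (450.0, 500.0):          # deformation temperatures, deg C
        print(f"rate {rate:>5.1f} /s, {temp:.0f} C -> "
              f"Z = {zener_hollomon(rate, temp):.3e}")
# Raising the strain rate or lowering the temperature raises Z, which is
# why a higher final-pass strain rate corresponds to a different Z regime.
```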
Procedia PDF Downloads 86277 TEA and Its Working Methodology in the Biomass Estimation of Poplar Species
Authors: Pratima Poudel, Austin Himes, Heidi Renninger, Eric McConnel
Abstract:
Populus spp. (poplar) are among the fastest-growing trees in North America, making them suitable for a range of applications: they can achieve high yields on short rotations and regenerate by coppice. Furthermore, poplar undergoes biochemical conversion to fuels without undue complexity, making it one of the most promising purpose-grown woody perennial energy sources. Employing wood-based biomass for bioenergy offers numerous benefits, including reduced greenhouse gas (GHG) emissions compared with non-renewable traditional fuels, the preservation of robust forest ecosystems, and economic prospects for rural communities. To better understand the potential of poplar as a biomass feedstock for biofuel in the southeastern US, we conducted a techno-economic assessment (TEA), an analytical approach that integrates the technical and economic factors of a production system to evaluate its economic viability. The TEA focused on a short rotation coppice system employing single-pass cut-and-chip harvesting for poplar. It encompassed all costs associated with establishing dedicated poplar plantations, including land rent, site preparation, planting, fertilizers, and herbicides. Additionally, we performed a sensitivity analysis to evaluate how different costs affect the economic performance of the poplar cropping system; this analysis determined the minimum average delivered selling price of one metric ton of biomass necessary to achieve a desired rate of return over the cropping period. To inform the TEA, data on establishment, crop care activities, and crop yields were derived from a field study conducted at the Mississippi Agricultural and Forestry Experiment Station's Bearden Dairy Research Center in Oktibbeha County and the Pontotoc Ridge-Flatwood Branch Experiment Station in Pontotoc County.Keywords: biomass, populus species, sensitivity analysis, technoeconomic analysis
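The minimum-selling-price calculation described in the abstract can be illustrated with a small sketch: discount all per-hectare costs and all harvested tonnes at the target rate of return, then solve for the delivered price at which the project breaks even (NPV = 0). Every number in the example below (rotation length, costs, yields, discount rate) is a placeholder, not data from this study.

```python
# Minimal sketch of the minimum-selling-price logic behind a TEA: find the
# delivered price ($/Mg) at which the NPV of the poplar cropping system is
# zero at a target rate of return. All inputs below are placeholders.

def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 = establishment)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def min_selling_price(costs_by_year, harvest_yields, rate):
    """Price per Mg at which discounted revenue equals discounted cost.

    costs_by_year: cost ($/ha) incurred in each year
    harvest_yields: biomass ($Mg/ha) delivered in each year (0 if no harvest)
    """
    pv_cost = npv(costs_by_year, rate)       # discounted $/ha
    pv_yield = npv(harvest_yields, rate)     # discounted Mg/ha
    return pv_cost / pv_yield

# Hypothetical 6-year, two-harvest coppice cycle (illustrative values only)
costs = [1500, 250, 250, 250, 250, 250, 250]   # $/ha: establishment + annual care
yields = [0, 0, 0, 20, 0, 0, 20]               # Mg/ha harvested in years 3 and 6
print(f"Minimum delivered price: ${min_selling_price(costs, yields, 0.06):.2f}/Mg")
```

A sensitivity analysis of the kind the abstract describes would simply re-run this calculation while varying one input at a time (land rent, yield, discount rate) and record how the break-even price moves.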
Procedia PDF Downloads 83