Search results for: cloud remove
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1350

300 Molecular Dynamic Simulation of CO2 Absorption into Mixed Aqueous Solutions MDEA/PZ

Authors: N. Harun, E. E. Masiren, W. H. W. Ibrahim, F. Adam

Abstract:

The amine absorption process is an approach for mitigating CO2 in the flue gas produced by power plants. This process is the most common system used in the chemical and oil industries for gas purification to remove acid gases. One of the challenges of this process is the high energy requirement for solvent regeneration to release CO2. In the past few years, mixed alkanolamines have received increasing attention. In most cases, the mixtures contain N-methyldiethanolamine (MDEA) as the base amine with the addition of one or two more reactive amines such as PZ. The reason for applying such amine blends is to combine the high reaction rate of CO2 with the activator and the advantage of the low heat of regeneration of MDEA. Several experimental and simulation studies have been undertaken to understand this process using the blended MDEA/PZ solvent. Despite those studies, the mechanism of CO2 absorption into aqueous MDEA is not well understood, and the knowledge available in the open literature is limited. The aim of this study is to investigate the intermolecular interactions of the MDEA/PZ blend using Molecular Dynamics (MD) simulation. MD simulations were run at 313 K and 1 atm using an NVE ensemble for 200 ps and an NVT ensemble for 1 ns. The results were interpreted in terms of Radial Distribution Function (RDF) analysis for two systems of interest, i.e., binary and ternary. The binary system describes the interaction between amine and water molecules, while the ternary system is used to determine the interaction between the amine and CO2 molecules. For the binary system, it was observed that the –OH group of MDEA is more attracted to water molecules than the –NH group of MDEA. The –OH group of MDEA can form hydrogen bonds with water, which assists the solubility of MDEA in water. The intermolecular interaction probability of the –OH and –NH groups of MDEA with CO2 is higher in blended MDEA/PZ than in single MDEA. These findings show that the PZ molecule acts as an activator, promoting the intermolecular interaction between MDEA and CO2. Thus, blending MDEA with PZ is expected to increase the CO2 absorption rate and reduce the regeneration heat requirement.
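The RDF analysis described above can be made concrete with a short sketch. The function below computes g(r) for a set of particle coordinates in a cubic periodic box using minimum-image distances; the function name, bin width, and cutoff are illustrative choices, not taken from the paper, and a production analysis would use an MD analysis toolkit rather than this pure-Python double loop.

```python
import math

def rdf(positions, box, dr=0.2, r_max=5.0):
    """Radial distribution function g(r) for particles in a cubic periodic box."""
    n = len(positions)
    nbins = int(r_max / dr)
    hist = [0] * nbins
    for i in range(n):
        for j in range(i + 1, n):
            # minimum-image distance under periodic boundary conditions
            d2 = 0.0
            for a, b in zip(positions[i], positions[j]):
                delta = b - a
                delta -= box * round(delta / box)
                d2 += delta * delta
            r = math.sqrt(d2)
            if r < r_max:
                hist[int(r / dr)] += 2  # count the pair once for each particle
    rho = n / box ** 3  # ideal-gas number density used for normalization
    g = []
    for k, count in enumerate(hist):
        # spherical shell volume between k*dr and (k+1)*dr
        shell = 4.0 / 3.0 * math.pi * (((k + 1) * dr) ** 3 - (k * dr) ** 3)
        g.append(count / (n * rho * shell))
    return g
```

A peak in g(r) at a given separation, e.g. between the MDEA –OH oxygen and a water oxygen, indicates a preferred interaction distance such as a hydrogen bond.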

Keywords: amine absorption process, blend MDEA/PZ, CO2 capture, molecular dynamic simulation, radial distribution function

Procedia PDF Downloads 290
299 Phytotechnologies for Use and Reconstitution of Contaminated Sites

Authors: Olga Shuvaeva, Tamara Romanova, Sergey Volynkin, Valentina Podolinnaya

Abstract:

The green chemistry concept focuses on preventing environmental pollution caused by human activity. However, there are many contaminated areas in the world that pose a serious threat to ecosystems, and in accordance with the principles of green chemistry, the need to clean these areas should not be forgotten. Furthermore, the waste material often contains valuable components whose extraction by traditional wet chemical technologies is inefficient from both the economic and the environmental protection standpoints. Plants may be successfully used to 'scavenge' a range of metals from polluted land sites in an approach that allows both processes, phytoremediation and phytomining, to be carried out in conjunction. The goal of the present work was to study the bioaccumulation ability of floating macrophytes such as water hyacinth and pondweed toward Hg, Ba, Cd, Mo, and Pb as pollutants in an aquatic medium, and of terrestrial plants (birch, reed, and cane) toward gold and silver as valuable components. A peculiarity of this research is that the plants grew under extreme conditions (the pH of drainage and pore waters was about 2.5). The study was conducted on the territory of the Ursk tailings (Southwestern Siberia, Russia), formed as a result of the cyanidation of primary polymetallic ores. The waste material consists mainly (~80%) of pyrite (FeS₂) and barite (BaSO₄); the raw minerals included FeAsS, HgS, PbS, and Ag₂S as minor phases. It has been shown that water hyacinth demonstrates a high ability to accumulate different metals and, what is especially important, to remove mercury from polluted waters, with a BCF value of more than 1000. As for gold, its concentrations in reed and cane growing near the waste material were estimated as 500 and 900 μg∙kg⁻¹, respectively. It was also found that the plants can survive under the extreme conditions of an acidic environment, and hence we can assume that there is, in principle, an opportunity to use them to extract valuable substances from areas where mining waste dumps are buried.
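The bioconcentration factor (BCF) cited for mercury removal is a simple ratio of the metal concentration accumulated in plant tissue to the concentration in the surrounding water. The sketch below uses hypothetical numbers for illustration; they are not measurements from the study.

```python
def bioconcentration_factor(c_plant, c_water):
    """BCF = metal concentration in plant tissue / concentration in water.

    Units must match (e.g. mg/kg dry tissue vs. mg/L water, assuming
    1 L of water weighs ~1 kg so the ratio is dimensionless).
    """
    return c_plant / c_water

# Illustrative only: a plant holding 1.2 mg/kg Hg drawn from water at
# 0.001 mg/L gives BCF = 1200, above the >1000 threshold reported
# for water hyacinth in the abstract.
```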

Keywords: bioaccumulation, gold, heavy metals, mine tailings

Procedia PDF Downloads 167
298 Saving Energy through Scalable Architecture

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

In this paper, we focus on the importance of scalable architecture for data centers and buildings in general in helping an enterprise achieve environmental sustainability. A scalable architecture helps in many ways: it adapts to business and user requirements and promotes high-availability and disaster-recovery solutions that are cost-effective and low-maintenance. Scalable architecture also plays a vital role in the three core areas of sustainability: economic, environmental, and social, also known as the three pillars of a sustainability model. If the architecture is scalable, it has many advantages. For example, scalable architecture helps businesses and industries adapt to changing technology, drives innovation, promotes platform independence, and builds resilience against natural disasters. Most importantly, a scalable architecture helps industries adopt cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps reduce carbon emissions through advanced monitoring and metering capabilities. Scalable architectures help reduce waste by optimizing designs to utilize materials efficiently, minimize resource use, and decrease carbon footprints through low-impact, environmentally friendly materials. In this paper, we also emphasize the importance of a cultural shift toward the reuse and recycling of natural resources to keep the ecosystem balanced and maintain a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied relates to data centers.

Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change

Procedia PDF Downloads 97
297 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there is a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that the Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity, and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile, and ARIMA) will be presented, and results and performance metrics discussed.
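The feature engineering step the abstract describes can be sketched in outline. The helper below assembles lagged-consumption values and the four weather covariates the paper highlights into rows for a tree-based regressor; the function name, lag count, and dictionary keys are illustrative assumptions, and the paper's actual pipeline was implemented in R with Azure Machine Learning.

```python
def make_features(load, weather, n_lags=3):
    """Build (X, y) rows from hourly load and weather history.

    load    -- hourly consumption readings (list of floats)
    weather -- one dict per hour with keys temp, wind, humidity, dew_point
    Each row holds the n_lags previous loads plus that hour's weather;
    the target is the current hour's load.
    """
    X, y = [], []
    for t in range(n_lags, len(load)):
        row = [load[t - k] for k in range(1, n_lags + 1)]  # lagged loads
        w = weather[t]
        row += [w["temp"], w["wind"], w["humidity"], w["dew_point"]]
        X.append(row)
        y.append(load[t])
    return X, y
```

The resulting rows can be fed to any boosted-tree or quantile-forest implementation.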

Keywords: time-series, feature engineering methods for forecasting, energy demand forecasting, Azure Machine Learning

Procedia PDF Downloads 290
296 Evaluation of Surface Roughness Condition Using App Roadroid

Authors: Diego de Almeida Pereira

Abstract:

The roughness index of a road is considered the most important parameter of pavement quality, as it is closely related to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that emerged from the international evaluation of pavements coordinated by the World Bank; for purposes of road acceptance in Brazil, the current limit value is 2.7 m/km. This work makes use of the e.IRI parameter, obtained with the Roadroid app for smartphones running the Android operating system. This application was chosen for its practical user interaction, its own cloud data storage, and the support it gives to universities around the world. Data were collected once a month for six months. The studies began in March 2018, during the rainy season, which worsens road conditions and also offered the opportunity to track the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed: BR-020, BR-040, BR-060, and BR-070, which connect the Federal District (the area where Brasilia is located) and its surroundings, chosen for their economic and touristic importance; two are under federal administration and two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research contrasts the comfort and safety conditions of roads under private concession, where users pay a fee to the concessionaires to travel on a road that meets the minimum requirements for usage, with the quality of service offered on roads under Federal Government jurisdiction. Finally, data collected by the National Department of Transport Infrastructure (DNIT) with a laser profilometer are contrasted with data obtained with Roadroid, checking the app's applicability, practicality, and cost-effectiveness, considering its limitations.
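Checking measured roughness against the Brazilian acceptance limit quoted above is a simple comparison per section, sketched below; the function name is hypothetical, and only the 2.7 m/km default comes from the abstract.

```python
def classify_sections(iri_values, limit=2.7):
    """Flag road sections whose IRI (m/km) exceeds the acceptance limit.

    iri_values -- one IRI (or e.IRI) reading per section, in m/km
    Returns (section_index, iri) pairs for the failing sections.
    """
    return [(i, iri) for i, iri in enumerate(iri_values) if iri > limit]
```

Sections at exactly the limit are treated as acceptable here; a stricter reading of the criterion could use >= instead.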

Keywords: roadroid, international roughness index, Brazilian roads, pavement

Procedia PDF Downloads 79
295 Adsorptive Membrane for Hemodialysis: Potential, Future Prospection and Limitation of MOF as Nanofillers

Authors: Musawira Iftikhar

Abstract:

The field of membrane materials is highly dynamic due to constantly evolving materials requirements to address challenges such as biocompatibility, protein-bound uremic toxins, blood coagulation, auto-immune responses, oxidative stress, and poor clearance of uremic toxins. Hemodialysis is a membrane filtration process that is currently necessary for the daily living of patients with ESRD. Tens of millions of people with ESRD have benefited from hemodialysis over the past 60 to 70 years, both in terms of safeguarding life and extending lifespan. Beyond the challenges associated with the efficiency and separative properties of the membranes, ensuring hemocompatibility, that is, the safe circulation of blood outside the body for four hours every two days, remains a persistent challenge. This review explores the evolving field of Metal-Organic Frameworks (MOFs) and their applications in hemodialysis, offering a comprehensive examination of the various MOFs employed to address challenges inherent in traditional hemodialysis methodologies. The review includes experimental work with various MOFs as fillers, such as UiO-66, HKUST-1, MIL-101, and ZIF-8, which together lead to improved adsorption capacities for a range of uremic toxins and proteins. Furthermore, this review highlights how effectively MOF-based hemodialysis membranes remove a variety of uremic toxins, including p-cresol, urea, creatinine, and indoxyl sulfate, and discusses potential filler choices for the future. Future research efforts should focus on refining synthesis techniques, enhancing toxin selectivity, and investigating the long-term durability of MOF-based membranes. With these considerations, MOFs emerge as transformative materials in the quest to develop advanced and efficient hemodialysis technologies, holding the promise to significantly enhance patient outcomes and redefine the landscape of renal therapy.

Keywords: membrane, hemodialysis, metal organic frameworks, separation, protein adsorption

Procedia PDF Downloads 50
294 An Improved Two-dimensional Ordered Statistical Constant False Alarm Detection

Authors: Weihao Wang, Zhulin Zong

Abstract:

Two-dimensional ordered-statistic constant false alarm rate (CFAR) detection is a widely used method for detecting weak target signals in radar signal processing applications. The method is based on analyzing the statistical characteristics of the noise and clutter present in the radar signal and then using this information to set an appropriate detection threshold. In this approach, the reference cells surrounding the cell under test are divided into several reference subunits. These subunits are used to estimate the noise level and adjust the detection threshold, with the aim of maintaining a constant false alarm rate. By using an ordered-statistic approach, the method is able to effectively suppress the influence of clutter and noise, resulting in a low false alarm rate. The detection process involves a number of steps: filtering the input radar signal to remove noise and clutter, estimating the noise level from the statistical characteristics of the reference subunits, and finally setting the detection threshold based on the estimated noise level. One of the main advantages of two-dimensional ordered-statistic CFAR detection is its ability to detect weak target signals in the presence of strong clutter and noise. This is achieved by carefully analyzing the statistical properties of the signal and using an ordered-statistic estimate of the noise level to adjust the detection threshold. In conclusion, two-dimensional ordered-statistic CFAR detection is a powerful technique for detecting weak target signals in radar signal processing applications: by dividing the reference region into several subunits and using an ordered-statistic noise estimate to set the threshold, it effectively suppresses the influence of clutter and noise while maintaining a low false alarm rate.
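A one-dimensional simplification of the ordered-statistic CFAR detector makes the procedure concrete: the noise level is estimated as the k-th ordered reference sample around each cell under test (excluding guard cells), and a detection is declared when the cell exceeds a scaled version of that estimate. The window sizes, k, and scale factor below are illustrative defaults, and the paper's method operates on a two-dimensional reference window.

```python
def os_cfar(signal, num_ref=8, num_guard=2, k=6, scale=3.0):
    """1-D ordered-statistic CFAR detector (sketch).

    For each cell under test, take num_ref reference cells split evenly
    on both sides, skipping num_guard guard cells on each side, sort
    them, and use the k-th smallest as the noise estimate.
    """
    detections = []
    half = num_ref // 2
    for i in range(half + num_guard, len(signal) - half - num_guard):
        left = signal[i - num_guard - half : i - num_guard]
        right = signal[i + num_guard + 1 : i + num_guard + 1 + half]
        ref = sorted(left + right)
        noise = ref[k - 1]  # k-th order statistic as the noise level
        if signal[i] > scale * noise:
            detections.append(i)
    return detections
```

Because the estimate is an order statistic rather than a mean, a strong interfering target falling inside the reference window does not inflate the threshold as long as it lands above the k-th position.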

Keywords: two-dimensional, ordered statistical, constant false alarm, detection, weak target signals

Procedia PDF Downloads 76
293 Reduction of Fermentation Duration of Cassava to Remove Hydrogen Cyanide

Authors: Jean Paul Hategekimana, Josiane Irakoze, Eugene Niyonzima, Annick Ndekezi

Abstract:

Cassava (Manihot esculenta Crantz) is a root crop containing an anti-nutritive factor known as cyanide. The compound can be removed by numerous processing methods, such as boiling, fermentation, blanching, and sun drying, to avoid the possibility of cyanide poisoning; inappropriate processing can lead to disease and death. Cassava-based dishes are consumed in different ways in the regions where cassava is cultivated, according to culture and preference. However, some have been shown to be unsafe based on high cyanide levels. The current study aimed to address the problem of high cyanide in cassava consumed in Rwanda. The study was conducted to determine the effect of slicing, blanching, and soaking time on reducing the fermentation duration of cassava needed for removal of hydrogen cyanide (HCN, in mg/g). Cassava was sliced into three different portion sizes (1 cm, 2 cm, and 5 cm). The first portions were naturally fermented for seven days; each portion was removed from the soaking tanks every 24 hours, oven dried at 60°C, and milled to obtain naturally fermented cassava flours. Other portions of 1 cm, 2 cm, and 5 cm were blanched for 2, 5, and 10 min, respectively, and each was similarly dried at 60°C and milled to produce blanched cassava flour. Other blanched portions followed the previous fermentation steps. The last portions, which formed the control, were simply chopped. Cyanide content and starch content in mg/100g were investigated. The analysis of the different detoxification treatments found that the usual fermentation can be used, but slicing the portions reduces their size and eases the diffusion of hydrogen cyanide out of the root: fermentation was then complete in four days and reduced the total hydrogen cyanide content of the cassava by 94.44% (significant at p<0.05), to a level safe for consumption. Blanching combined with fermentation is recommended as more effective, since it completed hydrogen cyanide removal in three days with a 95.56% reduction (significant at p<0.05), to a level safe for consumption.

Keywords: cassava, cyanide, blanching, drying, fermentation

Procedia PDF Downloads 61
292 Phase Composition Analysis of Ternary Alloy Materials for Gas Turbine Applications

Authors: Mayandi Ramanathan

Abstract:

Gas turbine blades see the most aggressive thermal stress conditions within the engine, due to high turbine entry temperatures in the range of 1500 to 1600°C. The blades rotate at very high rotational speeds and remove a significant amount of thermal power from the gas stream. At high temperatures, the major component failure mechanism is creep. During its service over time under high thermal loads, the blade will deform, lengthen, and rupture. High strength and stiffness in the longitudinal direction up to elevated service temperatures are certainly the most needed properties of turbine blades and gas turbine components. The proposed advanced Ti alloy material needs a process that provides a strategic orientation of metallic ordering, uniformity in composition, and high metallic strength. The chemical composition of the proposed Ti alloy material (25% Ta/(Al+Ta) ratio), unlike Ti-47Al-2Cr-2Nb, has less excess Al, which could otherwise limit the service life of turbine blades. The properties and performance of Ti-47Al-2Cr-2Nb and Ti-6Al-4V will be compared with those of the proposed Ti alloy material to generalize the performance metrics of various gas turbine components. This paper summarizes the effects of additive manufacturing and heat treatment process conditions on changes in the phase composition, grain structure, lattice structure, tensile strength, creep strain rate, thermal expansion coefficient, and fracture toughness of the material at different temperatures. Based on these results, additive manufacturing and heat treatment process conditions will be optimized to fabricate a turbine blade with a Ti-43Al matrix alloyed with an optimized amount of refractory Ta metal. Improvement in the service temperature of the turbine blades and the dependence of corrosion resistance on the coercivity of the alloy material will be reported. A correlation between phase composition and creep strain rate will also be discussed.

Keywords: high temperature materials, aerospace, specific strength, creep strain, phase composition

Procedia PDF Downloads 113
291 Toward Indoor and Outdoor Surveillance using an Improved Fast Background Subtraction Algorithm

Authors: El Harraj Abdeslam, Raissouni Naoufal

Abstract:

The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving object detection and tracking is background subtraction. Many background subtraction approaches have been suggested, but they are sensitive to illumination changes, and the solutions proposed to bypass this problem are time-consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: handling illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the contrast amplification in each region to a user-determined maximum. It thus mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For experimental testing, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
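As a minimal stand-in for the per-pixel modeling step, the sketch below maintains a single running Gaussian per pixel, whereas the paper models each channel with a K=5 GMM, and labels a pixel foreground when it deviates by more than k standard deviations from the model; the function name, learning rate, and threshold are illustrative assumptions.

```python
def update_background(mean, var, frame, alpha=0.05, k=2.5):
    """Single-Gaussian running background model (simplified from a GMM).

    mean, var -- per-pixel model state, updated in place
    frame     -- current intensity values, one per pixel
    Returns a boolean foreground mask for the frame.
    """
    fg = []
    for p, (m, v) in enumerate(zip(mean, var)):
        x = frame[p]
        is_fg = (x - m) ** 2 > (k ** 2) * v  # outside k-sigma band
        fg.append(is_fg)
        if not is_fg:
            # only background pixels update the model (exponential forgetting)
            mean[p] = (1 - alpha) * m + alpha * x
            var[p] = (1 - alpha) * v + alpha * (x - mean[p]) ** 2
    return fg
```

A full pipeline would run CLAHE on each frame before this step and clean the resulting mask with morphological erosion and dilation, as the abstract describes.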

Keywords: video surveillance, background subtraction, contrast limited histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes

Procedia PDF Downloads 254
290 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia

Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman

Abstract:

Through progress in pavement design, a method was developed that is titled the Mechanistic-Empirical Pavement Design Guide (MEPDG). Nowadays, the road and highway network in Saudi Arabia is evolving as a result of increases in traffic volume, and the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementing the MEPDG for local pavement design requires calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for calibration of the MEPDG in central Saudi Arabia. Thus, the first goal is to collect data for flexible pavement design from the local conditions of the Riyadh region. Since the collected data must be converted into MEPDG input data, the main goal of this paper is the analysis of the collected data. The data analysis includes processing each of the following: truck classification, traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axle types (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are given in this paper, providing an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
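One of the traffic computations listed, projecting cumulative design-lane truck traffic from AADTT with a compound annual growth factor, can be sketched as follows. The function name and the directional and lane distribution defaults are illustrative assumptions, not values from the paper.

```python
def design_lane_trucks(aadtt, growth_rate, years, lane_factor=0.9, dir_factor=0.5):
    """Cumulative trucks in the design lane over the design period.

    aadtt       -- base-year Annual Average Daily Truck Traffic (both directions)
    growth_rate -- compound annual growth rate, e.g. 0.04 for 4%
    years       -- design period in years
    """
    total = 0.0
    for year in range(years):
        # trucks in that year, grown from the base year
        total += aadtt * (1 + growth_rate) ** year * 365
    return total * dir_factor * lane_factor
```

In a full MEPDG input set, this total would further be split by vehicle class (VCD), month (MAFi), hour, and axle-load spectrum (ALDF).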

Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh

Procedia PDF Downloads 223
289 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed a lot, especially with regard to planning the factory building itself. Factory planning has the task of designing the products, plants, processes, organization, areas, and building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity, and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for different use cases are analyzed. The investigation covers point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data; it examines which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and the display of sensor data. In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented, and the systematic selection of digital factory models with the corresponding application cases is thereby evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 103
288 Investigation of Shear Strength, and Dilative Behavior of Coarse-grained Samples Using Laboratory Test and Machine Learning Technique

Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari

Abstract:

Coarse-grained soils are well known and commonly used in a wide range of geotechnical projects, including high earth dams and embankments, for their high shear strength. The most important engineering property of these soils is the friction angle, which represents the interlocking between soil particles and is applied widely in designing and constructing such earth structures. The friction angle and dilative behavior of coarse-grained soils can be estimated from empirical correlations with in-situ testing and the physical properties of the soil, or measured directly in the laboratory by performing direct shear or triaxial tests. Unfortunately, large-scale testing is difficult, challenging, and expensive, and is not possible in most soil mechanics laboratories. It is therefore common to remove the large particles before testing, but the result cannot be counted as an exact estimate of the parameters and behavior of the original soil. This paper describes a new methodology in which the particle grading distribution of a well-graded gravel sample is scaled down so that it can be tested in an ordinary direct shear apparatus, and a machine learning method is used to estimate the stress-strain behavior, friction angle, and dilative behavior of the original coarse-grained soil, considering its confining pressure and relative density. A total of 72 direct shear tests were performed at 6 different sizes, 3 different confining pressures, and 4 different relative densities. The Multivariate Adaptive Regression Splines (MARS) technique was used to develop an equation for predicting shear strength and dilative behavior based on the size distribution of the coarse-grained soil particles. An uncertainty analysis was also performed in order to examine the reliability of the proposed equation.
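MARS models are built from hinge basis functions of the form max(0, x - t), so a fitted single-variable model can be evaluated with a few lines of code. The sketch below shows the evaluation step only; the coefficients and knots in the test are invented for illustration and are not the equation developed in the paper.

```python
def hinge(u):
    """MARS basis function max(0, u)."""
    return max(0.0, u)

def mars_predict(x, intercept, terms):
    """Evaluate a fitted single-variable MARS model.

    terms -- list of (coef, knot, direction) tuples, where direction +1
             gives coef * max(0, x - knot) and -1 gives coef * max(0, knot - x)
    """
    y = intercept
    for coef, knot, direction in terms:
        y += coef * hinge(direction * (x - knot))
    return y
```

Fitting chooses the knots and coefficients by a forward/backward stepwise search; the paper's multivariate equation would simply carry several such terms, one per input variable such as particle size, confining pressure, and relative density.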

Keywords: MARS, coarse-grained soil, shear strength, uncertainty analysis

Procedia PDF Downloads 158
287 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud

Authors: Sharda Kumari, Saiman Shetty

Abstract:

Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation, which indicates that a human is present in the vehicle and can disable automation; this is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, Customer Relationship Management (CRM) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruption, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Applying analytics to CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, and thus has significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.

Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation

Procedia PDF Downloads 104
286 Comparative Efficacy of Gas Phase Sanitizers for Inactivating Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes on Intact Lettuce Heads

Authors: Kayla Murray, Andrew Green, Gopi Paliyath, Keith Warriner

Abstract:

Introduction: It is now acknowledged that control of human pathogens associated with fresh produce requires an integrated approach of several interventions, as opposed to relying on post-harvest washes to remove field-acquired contamination. To this end, current research is directed towards identifying interventions that can be applied at different points in leafy green processing. Purpose: In the following, the efficacy of different gas phase treatments for decontaminating whole lettuce heads during pre-processing storage was evaluated. Methods: Whole Cos lettuce heads were spot inoculated with L. monocytogenes, E. coli O157:H7, or Salmonella spp. The inoculated lettuce heads were then placed in a treatment chamber and exposed to ozone, chlorine dioxide, or hydroxyl radicals for different time periods under a range of relative humidities. Survivors of the treatments were enumerated, and sensory analysis was performed on the treated lettuce. Results: Ozone gas reduced L. monocytogenes by 2 log10 after ten minutes of exposure, with Salmonella and E. coli O157:H7 being decreased by 0.66 and 0.56 log cfu, respectively. Chlorine dioxide gas treatment reduced L. monocytogenes and Salmonella on lettuce heads by 4 log cfu but only supported a 0.8 log cfu reduction in E. coli O157:H7 numbers. In comparison, hydroxyl radicals supported a 2.9 to 4.8 log cfu reduction of model human pathogens inoculated onto lettuce heads, but required extended exposure times and relative humidity < 0.8. Significance: Of the gas phase sanitizers tested, chlorine dioxide and hydroxyl radicals are the most effective. The latter process holds the most promise based on ease of delivery, worker safety, and preservation of lettuce sensory characteristics. Although the exposure time for hydroxyl radicals was relatively long (24 h), this should not be considered a limitation, given that the intervention is applied in store rooms or in transport containers during transit.
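The log reductions reported above follow from the standard definition, the base-10 logarithm of the ratio of viable counts before and after treatment, sketched here with an illustrative function name:

```python
import math

def log_reduction(n_before, n_after):
    """Log10 reduction in viable counts (cfu) after a sanitizer treatment."""
    return math.log10(n_before / n_after)

# e.g. a drop from 10**6 to 10**4 cfu is a 2-log reduction, comparable
# to the ozone result for L. monocytogenes reported in the abstract.
```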

Keywords: gas phase sanitizers, iceberg lettuce heads, leafy green processing

Procedia PDF Downloads 407
285 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries

Authors: Gaurav Kumar Sinha

Abstract:

In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
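The predictive monitoring described above can be illustrated with a minimal, self-contained sketch: a rolling z-score rule that flags consumption readings deviating sharply from the recent baseline. The window size, threshold, and data below are illustrative assumptions, not part of the study; a production system would run this kind of rule on streamed meter data in the cloud.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=24, threshold=3.0):
    """Flag readings whose deviation from the trailing-window mean
    exceeds `threshold` standard deviations (a simple z-score rule)."""
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hourly consumption in kWh: a stable daily sawtooth with one injected spike.
baseline = [50.0 + (i % 24) * 0.5 for i in range(72)]
baseline[60] = 200.0  # simulated faulty-equipment spike
print(detect_anomalies(baseline))
```

Only the injected spike is flagged; the regular daily pattern stays within three standard deviations of its own history.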

Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency

Procedia PDF Downloads 60
284 Radar Track-based Classification of Birds and UAVs

Authors: Altilio Rosa, Chirico Francesco, Foglia Goffredo

Abstract:

In recent years, the number of Unmanned Aerial Vehicles (UAVs) has increased significantly. The rapid development of commercial and recreational drones makes them an important part of our society. Despite the growing list of their applications, these vehicles pose a serious threat to civil and military installations: detection, classification and neutralization of such flying objects have become an urgent need. Radar is an effective remote sensing tool for detecting and tracking flying objects, but scenarios characterized by a high number of tracks related to flying birds make the drone detection task especially challenging: the operator's PPI is cluttered with a huge number of potential threats, and his reaction time can be severely affected. Flying birds show velocities, radar cross-sections and, in general, characteristics similar to those of UAVs. Given the absence of a single feature able to distinguish UAVs from birds, this paper uses a multiple-feature approach, in which an original feature selection technique is developed to feed binary classifiers trained to distinguish birds and UAVs. Radar tracks acquired in the field from different UAVs and birds performing various trajectories were used to extract specifically designed target movement-related features based on velocity, trajectory and signal strength. An optimization strategy based on a genetic algorithm is also introduced to select the optimal subset of features and to estimate the performance of several classification algorithms (neural network, SVM, logistic regression, etc.), both in terms of the number of selected features and the misclassification error. Results show that the proposed methods are able to reduce the dimension of the data space and to remove almost all non-drone false targets with a suitable classification accuracy (higher than 95%).
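The movement-related features mentioned above can be sketched as follows. The specific features chosen here (mean speed, speed variability, path straightness) and the sample tracks are illustrative assumptions, not the authors' exact feature set; straight, steady motion is loosely drone-like, while erratic motion is loosely bird-like.

```python
import math

def track_features(points):
    """Movement features from a radar track given as (t, x, y) samples.
    Returns mean speed, speed variability, and path straightness
    (net displacement / total path length; ~1.0 for straight flight)."""
    speeds, path_len = [], 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        path_len += d
        speeds.append(d / (t1 - t0))
    mean_v = sum(speeds) / len(speeds)
    var_v = sum((v - mean_v) ** 2 for v in speeds) / len(speeds)
    net = math.hypot(points[-1][1] - points[0][1], points[-1][2] - points[0][2])
    return {"mean_speed": mean_v, "speed_var": var_v,
            "straightness": net / path_len if path_len else 0.0}

# A straight, constant-speed track vs. a weaving one (hypothetical data).
drone = [(t, 10.0 * t, 0.0) for t in range(10)]
bird = [(t, 5.0 * t, 8.0 * math.sin(t)) for t in range(10)]
print(track_features(drone)["straightness"], track_features(bird)["straightness"])
```

Feature vectors of this kind would then be passed to the genetic-algorithm feature selector and the binary classifiers described in the abstract.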

Keywords: birds, classification, machine learning, UAVs

Procedia PDF Downloads 214
283 A Method for Precise Vertical Position of the Implant When Using Computerized Surgical Guides and Bone Reduction

Authors: Abraham Finkelman

Abstract:

Computerized surgical guides (CSGs) have been proven to be a predictable way to place dental implants, with relatively high accuracy compared to the treatment plan. Using a bone-supported CSG allows us to make the necessary changes to the hard tissue before and after implant placement. The CSG gives us an accurate position for drilling, and during implant placement it allows us to alter the vertical position of the implant, thereby altering the final position of the abutment and avoiding the risk of damage to adjacent anatomical structures. Any changes required to the bone level can be made prior to fixation of the CSG using a reduction guide, which incurs extra surgical fees and requires a second surgical guide. Any changes to the bone level after implant placement risk damaging the implant neck surface. The technique consists of a universal system that allows us to remove the excess bone around the implant sockets prior to implant placement, which then enables us to place the implant in the vertical position with accuracy, as planned with the CSG. The system consists of hollow pins of different lengths and diameters, depending on the implant system being used: lengths of 6 mm-16 mm and diameters of 2.6 mm-4.8 mm. Upon completion of the drilling, the pin is inserted into the implant socket using the insertion tool. Once the insertion tool has been unscrewed from the pin, we can continue with the bone reduction. The bone reduction can be done using conventional methods after removal of all the excess bone around the pin. The insertion tool is then screwed into the pin, and the pin is removed. We now have the new bone level at the crest of the implant socket, which is our mark for the vertical position of the implant.
In some cases, when the implant is located very close to anatomical structures, any deviation in the vertical position of the implant during surgery can cause damage to those structures, creating irreversible damage such as paresthesia or dysesthesia of the mandibular nerve. If we are planning for immediate loading and have made our temporary restoration based on the computerized plan, deviation in the vertical position of the implant will affect the position of the abutment, compromising the fit of the temporary prosthesis and extending the working time until the prosthesis is adapted to the new position.

Keywords: bone reduction, computer aided navigation, dental implant placement, surgical guides

Procedia PDF Downloads 325
282 Forensic Medical Capacities of Research of Saliva Stains on Physical Evidence after Washing

Authors: Saule Mussabekova

Abstract:

Recent advances in genetics have sharply increased the capacity to produce reliable evidence in forensic examinations. Thus, traces of biological origin are important sources of information about a crime. Currently, sexual offenses have increased around the world, and among them are those in which the criminals use various detergents to remove traces of their crime. A feature of modern synthetic detergents is the presence of biological additives, i.e., enzymes. Enzymes purposefully destroy stains of biological origin. To study the nature and extent of the impact of modern washing powders on saliva stains on physical evidence, specially prepared test specimens of different types of tissue, to which saliva had been applied, were examined. Materials and Methods: Washing machines from well-known household appliance manufacturers with different production characteristics, together with advertised brands of washing powder, were used for the test washing. Over 3,500 experimental samples were tested. After washing, the traces of saliva were identified using modern forensic research methods. Results: The dependence of saliva trace detection and identification on the washing program, type of washing machine, and washing powder used was revealed. The results of experimental and practical expert studies have shown that in most cases it is not possible to draw conclusions in the identification of saliva traces on physical evidence after washing. This is a consequence of the effect of biological additives and other additional factors on traces of saliva during washing. Conclusions: On the basis of the results of the study, the feasibility of examining saliva stains on physical evidence after washing is established.
The use of modern molecular genetic methods makes it possible to partially solve the problems arising in the study of laundered evidence. Additional study of physical evidence after washing facilitates the detection and investigation of sexual offenses against women and children.

Keywords: saliva research, modern synthetic detergents, laundry detergents, forensic medicine

Procedia PDF Downloads 215
281 Objective Assessment of the Evolution of Microplastic Contamination in Sediments from a Vast Coastal Area

Authors: Vanessa Morgado, Ricardo Bettencourt da Silva, Carla Palma

Abstract:

The environmental pollution by microplastics is well recognized. Microplastics have already been detected in various matrices from distinct environmental compartments worldwide, some from remote areas. Various methodologies and techniques have been used to determine microplastics in such matrices, for instance, sediment samples from the ocean bottom. To determine microplastics in a sediment matrix, the sample is typically sieved through a 5 mm mesh, digested to remove the organic matter, and density-separated to isolate microplastics from the denser part of the sediment. The physical analysis of microplastics consists of visual analysis under a stereomicroscope to determine particle size, colour, and shape. The chemical analysis is performed with an infrared spectrometer coupled to a microscope (micro-FTIR), allowing identification of the chemical composition of the microplastic, i.e., the type of polymer. Creating legislation and policies to control and manage (micro)plastic pollution is essential to protect the environment, namely the coastal areas. Regulation is defined from the known relevance and trends of the pollution type. This work discusses the assessment of contamination trends in a 700 km² oceanic area affected by contamination heterogeneity, sampling representativeness, and the uncertainty of the analysis of the collected samples. The methodology developed consists of objectively identifying meaningful variations of microplastic contamination by Monte Carlo simulation of all uncertainty sources. This work allowed us to conclude unequivocally that the contamination level of the studied area did not vary significantly between two consecutive years (2018 and 2019) and that PET microplastics are the major polymer type. The comparison of contamination levels was performed at a 99% confidence level. The developed know-how is crucial for the objective and binding determination of microplastic contamination in relevant environmental compartments.
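A minimal sketch of the Monte Carlo comparison described above, assuming normally distributed combined (sampling plus analysis) uncertainties and entirely hypothetical contamination levels; the paper's actual uncertainty budget is richer than a single standard deviation per year.

```python
import random

def mc_difference(mean_a, sd_a, mean_b, sd_b, n=100_000, seed=1):
    """Monte Carlo test of whether two contamination levels differ:
    propagate each year's combined uncertainty, then check whether the
    99 % interval of the simulated difference spans zero."""
    rng = random.Random(seed)
    diffs = sorted(rng.gauss(mean_b, sd_b) - rng.gauss(mean_a, sd_a)
                   for _ in range(n))
    lo, hi = diffs[int(0.005 * n)], diffs[int(0.995 * n)]
    return lo, hi, not (lo <= 0.0 <= hi)  # True -> significant change

# Hypothetical levels (items per kg of sediment) for 2018 vs. 2019.
lo, hi, significant = mc_difference(120.0, 15.0, 128.0, 18.0)
print(significant)
```

With uncertainties this large relative to the year-to-year shift, the 99 % interval of the difference spans zero, mirroring the paper's "no significant variation" conclusion.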

Keywords: measurement uncertainty, micro-ATR-FTIR, microplastics, ocean contamination, sampling uncertainty

Procedia PDF Downloads 87
280 Monitoring Deforestation Using Remote Sensing and GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle, storing about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud-free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (near IR - Red)/(near IR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
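The NDVI computation used above can be sketched directly from its definition. The reflectance values below are hypothetical, not taken from the study's imagery; a drop in mean NDVI between two acquisition dates is the signal interpreted as loss of green biomass.

```python
def ndvi(nir, red):
    """Pixel-wise NDVI = (NIR - Red) / (NIR + Red); ranges from -1 to 1,
    with dense green vegetation typically well above zero."""
    return [(n - r) / (n + r) if (n + r) else 0.0 for n, r in zip(nir, red)]

# Hypothetical reflectance values for four pixels at two dates.
nir_t1, red_t1 = [0.50, 0.48, 0.45, 0.10], [0.08, 0.09, 0.10, 0.09]
nir_t2, red_t2 = [0.50, 0.20, 0.15, 0.10], [0.08, 0.15, 0.12, 0.09]
mean_t1 = sum(ndvi(nir_t1, red_t1)) / 4
mean_t2 = sum(ndvi(nir_t2, red_t2)) / 4
print(mean_t1 - mean_t2)  # positive difference suggests biomass loss
```

In practice the same per-pixel computation runs over whole georeferenced raster bands rather than short lists.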

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1192
279 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle, storing about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud- and aerosol-free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (near IR - Red)/(near IR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 397
278 Harvesting Energy from Lightning Strikes

Authors: Vaishakh Medikeri

Abstract:

Lightning, a marvelous, spectacular and awesome phenomenon of nature, is one of the greatest energy sources left unharnessed through the ages. A single lightning bolt contains about 15 billion joules of energy. This huge amount of energy cannot be harnessed completely, but it can be partially. This paper proposes to harness the energy from lightning strikes. Across the globe, the frequency of lightning is 40-50 flashes per second, about 1.4 billion flashes per year in total, each carrying an average energy of about 15 billion joules. When a lightning bolt strikes the ground, a tremendous amount of energy is transferred to the earth, propagating in the form of concentric circular energy waves. These waves have a frequency of about 7.83 Hz. Harvesting the lightning bolt directly seems impossible, but harvesting the energy waves produced by the lightning is considerably easier. This can be done using the tricoil energy harnesser, a new device proposed by the author. We know that a lightning bolt seeks the path of minimum resistance down to the earth. For this, we can erect a lightning rod about 100 meters high. The lightning rod is attached to the tricoil energy harnesser, which contains three coils whose centers are collinear and which all lie parallel to the ground. The first coil has one end connected to the lightning rod and the other end grounded. A secondary coil is wound on the first coil, with one end grounded and the other end pointing towards the ground, left unconnected and placed slightly above the ground, so that this end of the coil produces more intense currents, and hence more intense energy waves. The first coil produces very strong magnetic fields and induces them in the second and third coils. Along with the magnetic fields induced by the first coil, the energy waves, which are currents, also flow through the second and third coils.
The second and third coils are connected to a generator, which in turn is connected to a capacitor that stores the electrical energy. The first coil is placed in the middle of the second and third coils. The stored energy can be used for the transmission of electricity. This new technique of harnessing lightning strikes would be most efficient in places with a higher probability of lightning strikes. Since we are using a sufficiently tall lightning rod, the probability of cloud-to-ground strikes is increased. If the proposed apparatus were implemented, it would be a great source of pure and clean energy.
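The abstract's headline figures admit a quick sanity check. The conversion below uses only the numbers stated (1.4 billion flashes per year, 15 billion joules each) and says nothing about what fraction of that energy is actually capturable, which the abstract itself notes is only partial.

```python
# Back-of-envelope figures from the abstract (global averages, not measurements):
FLASHES_PER_YEAR = 1.4e9       # ~40-50 flashes per second worldwide
ENERGY_PER_FLASH_J = 15e9      # ~15 billion joules per bolt

total_j = FLASHES_PER_YEAR * ENERGY_PER_FLASH_J   # joules per year
total_twh = total_j / 3.6e15                      # 1 TWh = 3.6e15 J
print(round(total_twh))  # annual global lightning energy in TWh
```

This works out to roughly a few thousand terawatt-hours per year worldwide, which puts the "greatest energy source left unharnessed" claim in context: the total is real but diffuse, delivered in microsecond bursts scattered across the globe.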

Keywords: generator, lightning rod, tricoil energy harnesser, harvesting energy

Procedia PDF Downloads 376
277 Reverse Osmosis Application on Sewage Tertiary Treatment

Authors: Elisa K. Schoenell, Cristiano De Oliveira, Luiz R. H. Dos Santos, Alexandre Giacobbo, Andréa M. Bernardes, Marco A. S. Rodrigues

Abstract:

Water is an indispensable natural resource, which must be preserved for human activities as well as for ecosystems. However, sewage discharge has been contaminating water resources. Conventional treatment, such as physicochemical treatment followed by biological processes, has not been efficient at completely degrading persistent organic compounds, such as medicines and hormones. Therefore, the use of advanced technologies for sewage treatment has become urgent and necessary. The aim of this study was to apply reverse osmosis (RO) as tertiary sewage treatment at a wastewater treatment plant (WWTP) in southern Brazil. 200 L of sewage pre-treated by a wetland with aquatic macrophytes were collected. The sewage was treated in an RO pilot plant using a polyamide membrane, model BW30-4040 (DOW FILMTEC), with a 7.2 m² membrane area. In order to avoid damage to the equipment, the system contains a pleated polyester filter with a 5 µm pore size. A pressure of 8 bar was applied until a concentration factor of 5 was achieved, obtaining 80% recovery of permeate at a concentrate flow rate of 10 L.min-1. Samples of the sewage pre-treated at the WWTP and of the permeate and concentrate generated by RO were analyzed for physicochemical parameters and by gas chromatography (GC) for qualitative analysis of organic compounds. The results proved that the sewage treated at the WWTP does not comply with the phosphorus and nitrogen limits of Brazilian legislation. Besides this, many organic compounds were found in this sewage, such as benzene, which is carcinogenic. Analyzing the permeate results, it was verified that RO as a tertiary sewage treatment was efficient at removing physicochemical contaminants, achieving 100% removal of iron, copper, zinc and phosphorus, 98% of color, 91% of BOD and 62% of ammoniacal nitrogen.
RO was capable of removing organic compounds; however, the presence of some organic compounds was verified in the RO permeate, showing that RO could not remove all organic compounds from the sewage. It should be considered that the permeate showed lower-intensity peaks in the chromatogram in comparison to the WWTP sewage. It is important to note that the concentrate generated by RO needs treatment before its disposal in the environment.
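The stated 80% recovery and five-fold concentration are consistent with a simple RO mass balance, sketched here for the 200 L batch described above; this is a bookkeeping check, not part of the study's methodology.

```python
def ro_balance(feed_l, recovery):
    """Simple RO mass balance: permeate and concentrate volumes and the
    volumetric concentration factor (VCF) of the reject stream."""
    permeate = feed_l * recovery
    concentrate = feed_l - permeate
    vcf = feed_l / concentrate
    return permeate, concentrate, vcf

# The pilot run in the abstract: 200 L feed, 80 % permeate recovery.
p, c, vcf = ro_balance(200.0, 0.80)
print(p, c, vcf)  # 160.0 40.0 5.0
```

The VCF of 5 explains why the concentrate requires further treatment: rejected solutes end up five times more concentrated than in the feed.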

Keywords: organic compounds, reverse osmosis, sewage treatment, tertiary treatment

Procedia PDF Downloads 197
276 Operating Parameters and Costs Assessments of a Real Fishery Wastewater Effluent Treated by Electrocoagulation Process

Authors: Mirian Graciella Dalla Porta, Humberto Jorge José, Danielle de Bem Luiz, Regina de F. P. M. Moreira

Abstract:

Similar to most processing industries, fish processing produces large volumes of wastewater, which contains especially organic contaminants, salts and oils dispersed therein. Different processes have been used for the treatment of fishery wastewaters, but the most commonly used are chemical coagulation and flotation. These techniques are well known, but sometimes the characteristics of the treated effluent do not comply with legal discharge standards. Electrocoagulation (EC) is an electrochemical process that can be used to treat wastewaters in terms of both organic matter and nutrient removal. The process is based on the use of sacrificial electrodes, such as aluminum, iron or zinc, which are oxidized to produce metal ions that coagulate and react with organic matter and nutrients in the wastewater. While EC processes are effective for the treatment of several types of wastewaters, applications have been limited due to high energy demands and high current densities. Generally, the EC process can be performed without additional chemicals or pre-treatment, but its costs should be reduced for it to become more widely applicable. In this work, we studied the treatment of real wastewater from a fishmeal industry by the electrocoagulation process. Removal efficiencies for chemical oxygen demand (COD), total organic carbon (TOC), turbidity, and phosphorus and nitrogen concentrations were determined as a function of the operating conditions, such as pH, current density and operating time. The optimum operating conditions were determined to be an operating time of 10 minutes, a current density of 100 A.m-2, and an initial pH of 4.0. COD, TOC, phosphorus concentration, and turbidity removal efficiencies at the optimum operating conditions were higher than 90% for the aluminum electrode. Operating costs at the optimum conditions were calculated as US$ 0.37/m3 (US$ 0.038/kg COD) for the Al electrode.
These results demonstrate that the EC process is a promising technology for removing nutrients from fishery wastewaters, as the process has both a high nutrient removal efficiency and low energy requirements.
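Operating costs of this kind are commonly estimated from electrical energy plus dissolved electrode mass per volume treated. The sketch below uses a generic EC cost model with entirely hypothetical inputs, not the authors' actual cost calculation or data.

```python
def ec_operating_cost(voltage_v, current_a, time_s, volume_m3,
                      energy_price, electrode_kg, electrode_price):
    """Generic EC cost model (an assumption here, not the paper's exact one):
    cost per m3 = (energy consumed * unit energy price
                   + electrode mass dissolved * unit electrode price) / volume."""
    energy_kwh = voltage_v * current_a * time_s / 3.6e6  # J -> kWh
    return (energy_kwh * energy_price
            + electrode_kg * electrode_price) / volume_m3

# Hypothetical numbers for a 10-minute batch treating 0.01 m3.
cost = ec_operating_cost(voltage_v=12.0, current_a=5.0, time_s=600,
                         volume_m3=0.01, energy_price=0.10,
                         electrode_kg=0.002, electrode_price=2.0)
print(round(cost, 3))  # US$ per m3 treated
```

In a real study the electrode mass would come from Faraday's law or from weighing the electrodes before and after the run.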

Keywords: electrocoagulation, fish, food industry, wastewater

Procedia PDF Downloads 241
275 Special Education in the South African Context: A Bio-Ecological Perspective

Authors: Suegnet Smit

Abstract:

Prior to 1994, special education in South Africa was marginalized and fragmented. Moving away from a medical-model approach to special education, the government after 1994 promoted an inclusive approach as a means to transform education in general, and special education in particular. This transformation, however, is moving at too slow a pace for learners with barriers to learning and development to benefit fully from their education. The goal of the Department of Basic Education is to minimize, remove, and prevent barriers to learning and development in the educational setting by attending to the unique needs of the individual learner. However, the implementation of inclusive education is problematic, and general education remains poor. This paper highlights the historical development of special education in South Africa, underpinned by a bio-ecological perspective. Problematic areas within the systemic levels of the education system are highlighted in order to indicate how the interactive processes within these levels affect special needs learners on the personal dimension of the bio-ecological approach. As part of the methodology, a thorough document analysis was conducted on information collected from a large body of research literature, which included academic articles, reports, policies, and policy reviews. Through qualitative analysis, data were grouped and categorized according to the bio-ecological model systems, which revealed various successes and challenges within the education system. The challenges inhibit change, growth, and development for the child who experiences barriers to learning. From these findings, it is established that special education in South Africa has been, and still is, on a bumpy road. Sadly, the transformation envisaged by implementing inclusive education is still a dream, not yet fully realized.
Special education seems to be stuck at what is, and the education system has not moved forward significantly enough to reach what special education should and could be. The gap between the vision of inclusive quality education for all and the current reality is still too wide. Problems encountered at all levels of the education system cause a funnel effect downward to learners with special educational needs, with negative effects on the development of these learners.

Keywords: bio-ecological perspective, education systems, inclusive education, special education

Procedia PDF Downloads 143
274 2016 Taiwan's 'Health and Physical Education Field of 12-Year Basic Education Curriculum Outline (Draft)' Reform and Its Implications

Authors: Hai Zeng, Yisheng Li, Jincheng Huang, Chenghui Huang, Ying Zhang

Abstract:

Strong children make a strong country, and the development of children's basketball is a strategic advantage. Common forms of basketball equipment have struggled to meet the needs of teaching the game to young children; basketball teaching aids in forms appropriate for 3-6-year-old children are a way to break through the bottlenecks of teaching children's basketball, a critical path to making teaching more enjoyable, and a necessary requirement for the development of early childhood basketball. Using literature review, questionnaires, focus group interviews, and comparative analysis, this study examines 12 kinds of basketball teaching aids used domestically and abroad (cloud computing MINI basketball, adjustable MINI basketball hoop, MINI basketball court, shooting-assist paw-print ball, dribble goggles, dribbling machine, cartoon shooting machine, rebounding machine, contact mat, elastic belt, ladder, fitness ball), researching their contribution to enjoyment and to improving young children's shooting, dribbling, offensive and defensive rebounding, and transition techniques. The results show that using appropriate forms of children's basketball teaching aids can effectively increase children's enjoyment of the basketball game and improve specific techniques in a targeted way, and that different types of aids enrich the content of children's basketball games from different perspectives. It is recommended to produce aids based on children's color psychology, using cartoon designs and environmentally friendly materials, to increase research efforts on children's basketball aids, and to encourage physical education teachers to apply such aids.

Keywords: health and physical education field of curriculum outline, health fitness, sports and health curriculum reform, Taiwan, twelve years basic education

Procedia PDF Downloads 389
273 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code auditing is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and lead to system compromise, data leakage, or denial of service. C and C++ open-source code is now available in order to create a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we retain the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning models such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model that can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure the performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as features, but require longer execution times, as the word embedding algorithm adds some complexity to the overall system.
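The "minimal intermediate representation" step can be illustrated with a toy tokenizer that strips comments and renames user identifiers to generic tokens, so a downstream model learns code structure rather than variable names. This is a simplified stand-in, not the authors' actual preprocessing pipeline, and the keyword list here is a small illustrative subset of C.

```python
import re

def minimal_ir(source):
    """Sketch of a minimal intermediate representation for C-like source:
    remove comments, then replace every non-keyword identifier with a
    generic VARn token (same identifier -> same token)."""
    keywords = {"if", "else", "for", "while", "return", "int", "char",
                "void", "strcpy", "malloc", "sizeof"}
    source = re.sub(r"//.*|/\*.*?\*/", " ", source, flags=re.S)
    tokens, rename = [], {}
    for tok in re.findall(r"[A-Za-z_]\w*|\S", source):
        if re.fullmatch(r"[A-Za-z_]\w*", tok) and tok not in keywords:
            rename.setdefault(tok, f"VAR{len(rename)}")
            tok = rename[tok]
        tokens.append(tok)
    return tokens

code = "void f(char *dst) { strcpy(dst, input); /* risky */ }"
print(minimal_ir(code))
```

Token sequences of this form would then be embedded (e.g., with GloVe or fastText) before being fed to the sequence classifiers named in the abstract.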

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 84
272 Nickel Removal from Industrial Wastewater by Eucalyptus Leaves and Poplar Ashes

Authors: Negin Bayat, Nahid HasanZadeh

Abstract:

Effluents of industries such as metalworking, battery manufacturing, and mining contain heavy metals and pose serious problems for both human health and the environment. These contaminants include cadmium, copper, zinc, nickel, chromium, lead, and cyanide. Various physicochemical and biological methods are used to remove heavy metals, including sedimentation, coagulation, flotation, chemical precipitation, filtration, membrane processes (reverse osmosis and nanofiltration), ion exchange, and adsorption with activated carbon. These methods are generally either expensive or ineffective. In recent years, considerable attention has therefore been given to the removal of heavy metal ions from solution by adsorption onto discarded, low-cost materials. In this applied study, nickel removal by adsorption onto powdered eucalyptus leaves and poplar ash was investigated. The effect of various parameters on metal removal, such as pH, adsorbent dosage, contact time, and stirring speed, was studied in batch (discontinuous) mode. The experiments were conducted in aqueous solutions at the laboratory scale; the optimum adsorption conditions were determined, and the study was then repeated on real wastewater samples, with the nickel concentration measured before and after the adsorption process. In all experiments, samples were filtered after the set contact time and the residual nickel was measured with an atomic absorption spectrometer at a wavelength of 382 nm. The results showed that increasing either the adsorbent dosage or the pH increased the metal removal rate. Nickel removal increased during the first 60 minutes; thereafter, the adsorption rate remained constant and equilibrium was reached. The desired removal rate was achieved with 40 mg of adsorbent per 100 mL of solution at pH 9.5.
According to the results, the best removal in this study, 99.76%, was obtained at a 40 mg dose of the combined eucalyptus leaf and poplar ash adsorbent. This combination can thus serve as an inexpensive and effective adsorbent for the removal of nickel from aqueous solutions.
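The reported 99.76% figure follows from the standard batch removal-efficiency formula, R = (C₀ − Cₑ)/C₀ × 100, where C₀ and Cₑ are the initial and equilibrium nickel concentrations. A minimal sketch in Python; the concentration values below are hypothetical, chosen only to reproduce the reported percentage, not data from the study:

```python
def removal_efficiency(c_initial, c_final):
    """Percent of metal removed in a batch adsorption run.

    c_initial, c_final: nickel concentrations (mg/L) before and
    after adsorption, e.g. from atomic absorption spectrometry.
    """
    return (c_initial - c_final) / c_initial * 100.0


# Hypothetical example: 50 mg/L reduced to 0.12 mg/L gives 99.76% removal.
print(round(removal_efficiency(50.0, 0.12), 2))
```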

Keywords: adsorption, wastewater, nickel, poplar ash, eucalyptus leaf, treatment

Procedia PDF Downloads 12
271 Influence of the Moisture Content on the Flowability of Fine-Grained Iron Ore Concentrate

Authors: C. Lanzerstorfer, M. Hinterberger

Abstract:

The iron content of the ore used is crucial for the productivity and coke consumption rate in blast furnace pig iron production. Therefore, most iron ore deposits are processed in beneficiation plants to increase the iron content and remove impurities. In several comminution stages, the particle size of the ore is reduced to ensure that the iron oxides are physically liberated from the gangue. Subsequently, physical separation processes are applied to concentrate the iron ore. The fine-grained ore concentrates produced need to be transported, stored, and processed. For smooth operation of these processes, the flow properties of the material are crucial. The flowability of powders depends on several properties of the material: grain size, grain size distribution, grain shape, and moisture content. The flowability of powders can be measured using ring shear testers. In this study, the influence of the moisture content on the flowability of the Krivoy Rog magnetite iron ore concentrate was investigated. Dry iron ore concentrate was mixed with varying amounts of water to produce samples with moisture contents in the range of 0.2 to 12.2%. The flowability of the samples was investigated using a Schulze ring shear tester. At all measured values of the normal stress (1.0 kPa – 20 kPa), the flowability decreased significantly from dry ore to a moisture content of approximately 3-5%. At higher moisture contents, the flowability was nearly constant, while at the maximum moisture content the flowability improved at high values of the normal stress only. The results also showed improving flowability with increasing consolidation stress for all moisture content levels investigated. The wall friction angle of the material against carbon steel (S235JR) and against an ultra-high molecular weight, low-pressure polyethylene (Robalon) was also investigated. The wall friction angle increased significantly from dry ore to a moisture content of approximately 3%.
For higher moisture content levels, the wall friction angles were nearly constant. Generally, the wall friction angle was approximately 4° lower at the higher wall normal stress.
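Ring shear testers such as the Schulze instrument characterise flowability by the flow index ffc = σ₁/σc, the ratio of the consolidation stress to the unconfined yield strength; the observation that flowability improves with increasing consolidation stress corresponds to ffc growing with σ₁. A small Python sketch of this calculation together with Jenike's conventional flowability classes; the stress values in the example are illustrative, not measured data from this study:

```python
def flow_index(sigma_1, sigma_c):
    """Flow index ffc = consolidation stress / unconfined yield strength."""
    return sigma_1 / sigma_c


def jenike_class(ffc):
    """Jenike's conventional flowability classification by flow index."""
    if ffc < 1:
        return "not flowing"
    elif ffc < 2:
        return "very cohesive"
    elif ffc < 4:
        return "cohesive"
    elif ffc < 10:
        return "easy-flowing"
    return "free-flowing"


# Illustrative: at 20 kPa consolidation stress with a 4 kPa unconfined
# yield strength, ffc = 5, i.e. the bulk solid is classed as easy-flowing.
print(jenike_class(flow_index(20.0, 4.0)))
```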

Keywords: iron ore concentrate, flowability, moisture content, wall friction angle

Procedia PDF Downloads 316