Search results for: mitigation techniques
6324 Ground Improvement Using Deep Vibro Techniques at Madhepura E-Loco Project
Authors: A. Sekhar, N. Ramakrishna Raju
Abstract:
This paper presents the results of ground improvement using deep vibro techniques with a combination of sand and stone columns, performed on a highly liquefaction-susceptible site (70 to 80% sand strata, the balance silt) with low bearing capacity due to high settlements, located in earthquake zone V (as per the IS code) at Madhepura, Bihar, in the northern part of India. Initially, bored cast-in-situ/precast piles and stone/sand columns were envisaged. However, after detailed analysis, it was concluded that deep vibro techniques with a combination of sand and stone columns were an excellent solution for addressing liquefaction and improving bearing capacity simultaneously under the given site conditions, possibly for the first time in India. First, after a detailed soil investigation, pre-treatment eCPT tests were conducted to evaluate the potential depth of liquefaction and to plan the densification of the silty sandy soils needed to improve the factor of safety against liquefaction. Trial tests were then carried out at the site using the deep vibro compaction technique with combinations of sand and stone columns, at different column spacings in a triangular pattern and with different holding times during each lift of the vibro probe up to ground level. Different spacings and timings were tried in order to determine the most effective combination for achieving uniform, maximum densification of the saturated loose silty sandy soils over the entire treated area. Post-treatment eCPT tests and plate load tests were then conducted at all trial locations to identify the spacing and timing of the sand and stone columns that achieved the required factor of safety against liquefaction and the desired bearing capacities with reduced settlements for the construction of industrial structures. A review of these results showed that the ground layers were densified more than expected, with an improved factor of safety against liquefaction and good bearing capacities at the given settlements, as per IS codal provisions.
The cost-effectiveness of using the deep vibro technique with sand columns alone (avoiding stone) was also worked out for lightly loaded single-storied structures, and the results were found satisfactory for supporting lightly loaded foundations. The key achievement of this technique is the simultaneous mitigation of liquefaction and improvement of bearing capacity, with settlements reduced to acceptable limits as per IS: 1904-1986, up to a depth of 19 m. To the best of our knowledge, this was executed for the first time in India.
Keywords: ground improvement, deep vibro techniques, liquefaction, bearing capacity, settlement
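The factor-of-safety check referred to in this abstract can be illustrated with the simplified cyclic-stress-ratio procedure commonly used with CPT data. This is an editorial sketch, not the authors' calculation: the peak ground acceleration, stress values, and CRR below are hypothetical example numbers.

```python
# Simplified liquefaction factor-of-safety check (Seed-Idriss style).
# Illustrative only; all input values are hypothetical, not from the paper.

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd, with a simple
    depth-dependent stress-reduction coefficient rd."""
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr, csr):
    """FS = CRR / CSR; FS > 1 means the layer is not expected to liquefy."""
    return crr / csr

# Hypothetical layer at 10 m depth in seismic zone V conditions
csr = cyclic_stress_ratio(a_max_g=0.36, sigma_v=180.0, sigma_v_eff=100.0,
                          depth_m=10.0)
fs = factor_of_safety(crr=0.25, csr=csr)
print(round(csr, 3), round(fs, 2))
```

Densification by vibro compaction raises the CPT-derived CRR of the treated layer, which is how the pre/post eCPT comparison translates into an improved factor of safety.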
Procedia PDF Downloads 199
6323 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
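The clustering stage described here can be sketched in a few lines of scikit-learn. This is a hedged illustration, not the authors' pipeline: the real feature extraction (spectral centroid, filter-bank RMS, MFCCs) is replaced by random stand-in vectors, and the dataset size is reduced.

```python
# Sketch of the k-means + k-NN stage: cluster per-clip feature vectors,
# then search for nearest neighbours of an exemplar clip.
# Features are synthetic stand-ins, not real hydrophone descriptors.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 16))   # 200 clips x 16 features (stand-in)

# Two hypothesized classes: 'primarily anthrophony' vs 'primarily biophony'
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
labels = kmeans.labels_

# k-nearest-neighbour search around one (anecdotally labeled) exemplar clip
nn = NearestNeighbors(n_neighbors=5).fit(features)
dist, idx = nn.kneighbors(features[:1])
print(labels.shape, idx.shape)
```

In practice the heuristic pruning step would then inspect the members of each cluster and the neighbours of labeled exemplars to discard false positives.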
Procedia PDF Downloads 172
6322 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018
Authors: Mário Ernesto Sitoe, Orlando Zacarias
Abstract:
University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of the students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion or retention. To this end, a database of real student data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques: K-nearest neighbor, random forest, and logistic regression. To allow training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods Bagging and Stacking were used. After comparing the results obtained by the three classifiers, logistic regression using Bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
Keywords: evasion and retention, cross-validation, bagging, stacking
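The best-performing configuration reported (bagged logistic regression evaluated with seven-fold cross-validation) can be sketched directly in scikit-learn rather than Weka. The data below are synthetic; the real study used 388 student records from DAU/DMI.

```python
# Sketch of the reported winner: logistic regression inside Bagging,
# scored with 7-fold cross-validation on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 388 synthetic students with 10 made-up admission/performance features
X, y = make_classification(n_samples=388, n_features=10, random_state=0)

model = BaggingClassifier(LogisticRegression(max_iter=1000),
                          n_estimators=10, random_state=0)
scores = cross_val_score(model, X, y, cv=7, scoring="accuracy")  # seven folds
print(round(scores.mean(), 3))
```

Bagging trains each logistic regression on a bootstrap resample and averages the votes, which is the bias/variance-reduction mechanism the abstract refers to.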
Procedia PDF Downloads 87
6321 National Digital Soil Mapping Initiatives in Europe: A Review and Some Examples
Authors: Dominique Arrouays, Songchao Chen, Anne C. Richer-De-Forges
Abstract:
Soils are at the crossroads of many issues such as food and water security, sustainable energy, climate change mitigation and adaptation, biodiversity protection, and human health and well-being. They deliver many ecosystem services that are essential to life on Earth. Therefore, there is a growing demand for soil information on a national and global scale. Unfortunately, many countries do not have detailed soil maps, and, where they exist, these maps are generally based on more or less complex and often non-harmonized soil classifications. An estimate of their uncertainty is also often missing. Thus, they are not easy to understand and often not properly used by end-users. There is therefore an urgent need to provide end-users with spatially exhaustive grids of essential soil properties, together with an estimate of their uncertainty. One way to achieve this is digital soil mapping (DSM). The concept of DSM relies on the hypothesis that soils and their properties are not randomly distributed, but depend on the main soil-forming factors: climate, organisms, relief, parent material, time (age), and position in space. All these forming factors can be approximated using several exhaustive spatial products such as climatic grids, remote sensing products or vegetation maps, digital elevation models, geological or lithological maps, spatial coordinates of soil information, etc. Thus, DSM generally relies on models calibrated with existing observed soil data (point observations or maps) and so-called "ancillary covariates" derived from other available spatial products. The model is then generalized to grid cells where soil parameters are unknown in order to predict them, and the prediction performance is validated using various methods. With the growing demand for soil information at national and global scales and the increase in available spatial covariates, national and continental DSM initiatives are continuously multiplying.
This short review illustrates the main national and continental advances in Europe, the diversity of the approaches and databases that are used, the validation techniques, and the main scientific and other issues. Examples from several countries illustrate the variety of products delivered during the last ten years. The scientific production on this topic is continuously increasing, and new models and approaches are developed at an incredible speed. Most digital soil mapping (DSM) products rely mainly on machine learning (ML) prediction models and/or the use of pedotransfer functions (PTF), in which calibration data come from soil analyses performed in labs or from existing conventional maps. However, some scientific issues remain to be solved, as well as political and legal ones related, for instance, to data sharing and to different laws in different countries. Other issues relate to communication with end-users and to education, especially on the use of uncertainty. Overall, the progress is very important, and the willingness of institutes and countries to join their efforts is increasing. Harmonization issues still remain, mainly due to differences in classifications or in laboratory standards between countries. However, numerous initiatives are ongoing at the EU level and also at the global level. All this progress is scientifically stimulating and also promising in providing tools to improve and monitor soil quality at the country, EU, and global levels.
Keywords: digital soil mapping, global soil mapping, national and European initiatives, global soil mapping products, mini-review
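The calibrate-then-generalize DSM workflow described above (fit a model on observed soil points with exhaustive covariates, then predict on unsampled grid cells) can be sketched as follows. All data here are synthetic stand-ins; the covariate names are illustrative, not from any national product.

```python
# Minimal DSM sketch: ML model calibrated on point observations with
# gridded covariates, validated by cross-validation, then generalized
# to a prediction grid. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
# 500 calibration points x 4 covariates (e.g. elevation, slope, NDVI, rainfall)
covariates = rng.normal(size=(500, 4))
soil_property = covariates @ [1.5, -0.7, 0.3, 2.0] + rng.normal(scale=0.5,
                                                                size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(model, covariates, soil_property, cv=5, scoring="r2")
model.fit(covariates, soil_property)

grid = rng.normal(size=(1000, 4))   # covariates at unsampled grid cells
prediction = model.predict(grid)
print(round(r2.mean(), 2), prediction.shape)
```

A full DSM product would additionally attach an uncertainty estimate to each predicted cell, e.g. via quantile forests or model ensembles.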
Procedia PDF Downloads 188
6320 Improving the Strength Characteristics of Soil Using Cotton Fibers
Authors: Bindhu Lal, Karnika Kochal
Abstract:
Clayey soil contains clay minerals with traces of metal oxides and organic matter and exhibits properties such as low drainage, high plasticity, and shrinkage. To overcome these issues, various soil reinforcement techniques are used to increase the stiffness, water tightness, and bearing capacity of the soil. Such techniques include cementation, bituminization, freezing, fiber inclusion, geosynthetics, nailing, etc. Reinforcement of soil with fibers has been a cost-effective solution to soil improvement problems. An experimental study was undertaken involving the inclusion of cotton waste fibers in clayey soil as reinforcement at different fiber contents (1%, 1.5%, 2%, and 2.5% by weight), analyzing the effect on the unconfined compressive strength of the soil. Two categories of soil were taken, comprising natural clay and clay mixed with 5% sodium bentonite by weight. The soil specimens were subjected to Proctor compaction and unconfined compression tests. The results show that fiber inclusion has a strikingly positive impact on the compressive strength and axial strain at failure of the soil. Compressive strength was found to be directly proportional to fiber content, with the effect being more pronounced at lower water content.
Keywords: bentonite clay, clay, cotton fibers, unconfined compressive strength
Procedia PDF Downloads 185
6319 A Review of Benefit-Risk Assessment over the Product Lifecycle
Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris
Abstract:
Benefit-risk assessment (BRA) is a valuable tool that takes place in multiple stages during a medicine's lifecycle, and this assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used during approval decisions and in post-approval settings and to identify possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and by regulatory agencies over the past five years have been examined. BRA implies the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials, through the authorization procedure, to post-marketing surveillance and health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint-specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders’ preferences (utilities). All these approaches share two common goals: to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, complexity, the extent to which it is established, and the ease of interpreting its results should be considered. Despite widespread and long-standing use, BRA is subject to debate, suffers from a number of limitations, and is still under development.
The use of formal, systematic structured approaches to BRA for regulatory decision-making and quantitative methods to support BRA during the product lifecycle is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.
Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches
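One of the quantitative "metrics/utilities" approaches catalogued in this review, a simple multi-criteria weighted-utility score, can be sketched as follows. The endpoints, weights, and utility values below are entirely hypothetical and serve only to show the arithmetic of such a model.

```python
# Hedged sketch of a multi-criteria (weighted-utility) benefit-risk score.
# Endpoints, weights, and utilities are made-up example values, not from
# any real assessment.
def benefit_risk_score(endpoints):
    """Weighted average of normalized endpoint utilities.
    Utilities lie in 0..1 with higher = more favorable (risk endpoints are
    assumed already inverted so that a low event rate scores high)."""
    total_weight = sum(w for w, _ in endpoints.values())
    return sum(w * u for w, u in endpoints.values()) / total_weight

drug_a = {
    "symptom relief": (0.4, 0.80),          # (weight, utility)
    "mortality benefit": (0.3, 0.60),
    "serious adverse events": (0.2, 0.70),  # risk endpoint, utility inverted
    "discontinuation": (0.1, 0.50),
}
print(round(benefit_risk_score(drug_a), 3))
```

Real frameworks (e.g., MCDA-based ones) add preference elicitation, uncertainty propagation, and sensitivity analysis on top of this basic weighted sum.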
Procedia PDF Downloads 161
6318 A Metallography Study of Secondary A226 Aluminium Alloy Used in Automotive Industries
Authors: Lenka Hurtalová, Eva Tillová, Mária Chalupová, Juraj Belan, Milan Uhríčik
Abstract:
The secondary alloy A226 is used for many automotive castings produced by mould casting and high-pressure die-casting. This alloy has excellent castability, good mechanical properties, and cost-effectiveness. The production of primary aluminium alloys is a heavy source of environmental pollution. The European Union calls for emission reduction and reduced energy consumption, and therefore for increased production of recycled (secondary) aluminium cast alloys. This contribution deals with the influence of recycling on the quality of castings made from A226 in the automotive industry. The properties of castings made from secondary aluminium alloys were compared with the required properties of primary aluminium alloys. The effect of recycling on microstructure was observed using a combination of different analytical techniques (light microscopy after black-white etching, scanning electron microscopy (SEM) after deep etching, and energy-dispersive X-ray analysis (EDX)). These techniques were used to identify the various structure parameters, which were then used to compare the secondary alloy microstructure with the primary alloy microstructure.
Keywords: A226 secondary aluminium alloy, deep etching, mechanical properties, recycling foundry aluminium alloy
Procedia PDF Downloads 546
6317 Reducing Crash Risk at Intersections with Safety Improvements
Authors: Upal Barua
Abstract:
Crash risk at intersections is a critical safety issue. This paper examines the effectiveness of removing an existing offset at an intersection by realignment in reducing crashes. The Empirical Bayes method was applied in a before-and-after study to assess the effect of this safety improvement. The Transportation Safety Improvement Program in the Austin Transportation Department completed several safety improvement projects at high-crash intersections with a view to reducing crashes. One of the common safety improvement techniques applied was the realignment of intersection approaches to remove an existing offset. This paper illustrates how this safety improvement technique is applied at a high-crash intersection from inception to completion. It also highlights the significant crash reductions achieved by this technique, quantified with the Empirical Bayes method in a before-and-after study. The results showed that realignment of intersection approaches to remove an existing offset can reduce crashes by 53%. The paper also features the state-of-the-art techniques applied in the planning, engineering, design, and construction of this safety improvement, the key factors driving its success, and the lessons learned in the process.
Keywords: crash risk, intersection, off-set, safety improvement technique, before-and-after study, empirical Bayes method
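The core Empirical Bayes calculation behind such a before-and-after study can be sketched as below. This is a simplified editorial illustration, not the paper's analysis: the safety performance function (SPF) prediction, overdispersion parameter, and crash counts are hypothetical.

```python
# Highly simplified Empirical Bayes before-and-after sketch.
# The EB estimate blends the observed crash count with an SPF prediction,
# weighted by the SPF's overdispersion. All numbers are hypothetical.
def eb_expected_crashes(observed, spf_predicted, overdispersion, years):
    """EB estimate = w * mu + (1 - w) * observed, with w = 1/(1 + k * mu),
    where mu is the SPF-predicted total over the study period."""
    mu_total = spf_predicted * years
    w = 1.0 / (1.0 + overdispersion * mu_total)
    return w * mu_total + (1.0 - w) * observed

# Expected crashes in the before period, correcting for regression-to-the-mean
expected_before = eb_expected_crashes(observed=30, spf_predicted=8.0,
                                      overdispersion=0.2, years=3)
observed_after = 14.0   # hypothetical crashes recorded after realignment
reduction = 100.0 * (1.0 - observed_after / expected_before)
print(round(expected_before, 2), round(reduction, 1))
```

The weight w pulls high observed counts back toward the SPF prediction, which is what guards the before-and-after comparison against regression-to-the-mean bias.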
Procedia PDF Downloads 249
6316 Electrochemical and Theoretical Quantum Approaches on the Inhibition of C1018 Carbon Steel Corrosion in Acidic Medium Containing Chloride Using Newly Synthesized Phenolic Schiff Bases Compounds
Authors: Hany M. Abd El-Lateef
Abstract:
Two novel Schiff bases, 5-bromo-2-[(E)-(pyridin-3-ylimino)methyl]phenol (HBSAP) and 5-bromo-2-[(E)-(quinolin-8-ylimino)methyl]phenol (HBSAQ), have been synthesized. They have been characterized by elemental analysis and spectroscopic techniques (UV-Vis, IR, and NMR). Moreover, the molecular structures of the HBSAP and HBSAQ compounds were determined by the single-crystal X-ray diffraction technique. The inhibition activity of HBSAP and HBSAQ for carbon steel in 3.5% NaCl + 0.1 M HCl, for both short and long immersion times and at different temperatures (20-50 ºC), was investigated using electrochemistry and surface characterization. The potentiodynamic polarization shows that the inhibitor molecules are adsorbed preferentially on the cathodic sites. The efficiency increases with increasing inhibitor concentration (92.8% at the optimal concentration of 10⁻³ M for HBSAQ). Adsorption of the inhibitors on the carbon steel surface was found to obey the Langmuir adsorption isotherm, with a physical/chemical nature of the adsorption, as also shown by scanning electron microscopy. Further, electronic structure calculations using quantum chemical methods were found to be in good agreement with the results of the experimental studies.
Keywords: carbon steel, Schiff bases, corrosion inhibition, SEM, electrochemical techniques
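The Langmuir isotherm test invoked in this abstract is usually performed by plotting C/θ against C and checking for a straight line of slope near unity. A hedged sketch of that fit follows; the concentration/efficiency pairs are made-up example data, not the paper's measurements.

```python
# Illustrative Langmuir isotherm fit: C/theta = 1/K_ads + C, where
# theta = IE(%)/100 is the surface coverage. Example data are fabricated
# for demonstration; only the final efficiency echoes the abstract's 92.8%.
import numpy as np

conc = np.array([1e-5, 5e-5, 1e-4, 5e-4, 1e-3])        # inhibitor conc., M
efficiency = np.array([45.0, 62.0, 71.0, 86.0, 92.8])  # % inhibition
theta = efficiency / 100.0                             # surface coverage

# Linear form: plot C/theta vs C; slope ~ 1 indicates Langmuir behavior,
# intercept = 1/K_ads gives the adsorption equilibrium constant.
slope, intercept = np.polyfit(conc, conc / theta, 1)
K_ads = 1.0 / intercept
print(round(slope, 2), f"{K_ads:.3g}")
```

A large K_ads (and, from ΔG°ads = -RT ln(55.5 K_ads), a strongly negative free energy of adsorption) is what distinguishes chemisorption-leaning behavior from purely physical adsorption.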
Procedia PDF Downloads 395
6315 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals
Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor
Abstract:
This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. 
This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers
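The meta-classifier idea discussed in the second approach is commonly realized as stacking: base classifiers trained on the signal features, combined by a meta-learner. The sketch below uses synthetic stand-ins for EEG feature vectors; it illustrates the architecture, not any specific study's pipeline.

```python
# Sketch of a stacking ensemble for brain-signal classification.
# Base learners (SVM, random forest) feed their predictions to a
# logistic-regression meta-classifier. EEG features are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=30, n_informative=10,
                           random_state=0)   # stand-in EEG feature vectors
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(),    # the meta-classifier
    cv=5)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
print(round(acc, 3))
```

The feature-selection side (e.g., a genetic algorithm choosing influential features or electrodes) would simply shrink the columns of X before this stage.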
Procedia PDF Downloads 81
6314 Electrokinetic Application for the Improvement of Soft Clays
Authors: Abiola Ayopo Abiodun, Zalihe Nalbantoglu
Abstract:
The electrokinetic application (EKA), a relatively modern chemical treatment, has potential for in-situ ground improvement in an open field or under existing structures. It utilizes a low electrical gradient to transport electrolytic chemical ions between bespoke electrodes inserted in fine-grained, low-permeability soft soils. The paper investigates the efficacy of the EKA as a mitigation technique for soft clay beds. The laboratory model of the EKA comprises a rectangular plexiglass test tank, electrolyte compartments, geosynthetic electrodes, and a direct-current supply. Within this setup, the EK effects resulting from the exchange of ions between the anolyte (anodic) and catholyte (cathodic) ends through the tested soil were examined by basic laboratory testing methods. The treated soft soil properties were investigated as a function of the anode-to-cathode distance and curing period. The test results showed changes in the physical and engineering properties of the treated soft soils. The significant changes in the physicochemical and electrical properties suggest that these changes can be utilized as a monitoring technique to evaluate the improvement in the engineering properties of EK-treated soft clay soils.
Keywords: electrokinetic, electrolytes, exchange ions, geosynthetic electrodes, soft soils
Procedia PDF Downloads 320
6313 Laboratory Model Tests on Encased Group Columns
Authors: Kausar Ali
Abstract:
There are several ground treatment techniques that may meet the twin objectives of increasing bearing capacity while simultaneously reducing settlements, but the use of stone columns is one of the most suitable techniques for flexible structures such as embankments and oil storage tanks that can tolerate some settlement, and it is used worldwide. However, when stone columns in very soft soils are loaded, they undergo excessive settlement due to the low lateral confinement provided by the soft soil, leading to failure of the structure. The poor performance of stone columns under these conditions can be improved by encasing the columns with a suitable geosynthetic. In this study, the effect of reinforcement on the bearing capacity of composite soil has been investigated by conducting laboratory model tests on floating and end-bearing long stone columns with an l/d ratio of 12. The columns were reinforced by providing geosynthetic encasement over varying column lengths (the upper 25%, 50%, 75%, and 100% of the column length). A group of columns was used instead of a single column because, in the field, columns are always installed in groups. The tests indicate that encasement over the full column length gives higher failure stress than encasement over a partial column length for both floating and end-bearing long columns. The performance of end-bearing columns was found to be much better than that of floating columns.
Keywords: geosynthetic, ground improvement, soft clay, stone column
Procedia PDF Downloads 438
6312 Reinventing Urban Governance: Sustainable Transport Solutions for Mitigating Climate Risks in Smart Cities
Authors: Jaqueline Nichi, Leila Da Costa Ferreira, Fabiana Barbi Seleguim, Gabriela Marques Di Giulio, Mariana Barbieri
Abstract:
The transport sector is responsible for approximately 55% of global greenhouse gas (GHG) emissions, in addition to pollution and other negative externalities, such as road accidents and congestion, that impact the routine of those who live in large cities. The objective of this article is to discuss the application and use of distinct mobility technologies as climate adaptation and mitigation measures in the context of smart cities in the Global South. The documentary analysis is combined with 22 semi-structured interviews with managers who work with mobility technologies in the public and private sectors and in civil society organizations, exploring multilevel-governance solutions for smart, low-carbon mobility based on a case study of the city of São Paulo, Brazil. The hypothesis that innovation and technology to mitigate and adapt to climate impacts are not yet sufficient to make mobility more sustainable was confirmed. The results indicate four relevant aspects for advancing a climate agenda in smart cities: integrated planning, co-production of knowledge, experiments in governance, and new means of financing to guarantee the sustainable sociotechnical transition of the sector.
Keywords: urban mobility, climate change, smart cities, multilevel governance
Procedia PDF Downloads 61
6311 Ilorin Traditional Architecture as a Good Example of a Green Building Design
Authors: Olutola Funmilayo Adekeye
Abstract:
The traditional African practice of architecture can be said to be deeply rooted in green architecture in concept, design, and execution. A study of the ancient building techniques in the Ilorin Emirate reveals a prominently eco-centric approach to green architecture principles. In the pre-colonial era, before the introduction of modern architecture and Western building materials, Nigerian traditional communities built their houses to meet their cultural, religious, and social needs using mainly indigenous building materials such as mud (Amo), cow dung (Boto), straw (Koriko), and palm fronds (Imo-Ope), to mention a few. This research attempts to identify the various techniques of applying traditional African principles of green architecture to Ilorin traditional buildings. It examines and assesses several case studies to understand the extent to which green architecture principles have been applied to traditional building designs that are still preserved today in Ilorin, Nigeria. Furthermore, this study intends to answer many questions, which can be summarized into two basic ones: (1) What aspects of what today are recognized as important green architecture principles have been applied to Ilorin traditional buildings? (2) To what extent have the principles of green architecture applied to Ilorin traditional buildings been ways of demonstrating a cultural attachment to the earth as an expression of the African sense of the human being as one with nature?
Keywords: green architecture, Ilorin, traditional buildings, design principles, ecocentric, application
Procedia PDF Downloads 557
6310 Designing Form, Meanings, and Relationships for Future Industrial Products. Case Study Observation of PAD
Authors: Elisabetta Cianfanelli, Margherita Tufarelli, Paolo Pupparo
Abstract:
The dialectical mediation between desires and objects or between mass production and consumption continues to evolve over time. This relationship is influenced both by variable geometries of contexts that are distant from the mere design of product form and by aspects rooted in the very definition of industrial design. In particular, the overcoming of macro-areas of innovation in the technological, social, cultural, formal, and morphological spheres, supported by recent theories in critical and speculative design, seems to be moving further and further away from the design of the formal dimension of advanced products. The articulated fabric of theories and practices that feed the definition of “hyperobjects”, and no longer objects describes a common tension in all areas of design and production of industrial products. The latter are increasingly detached from the design of the form and meaning of the same in mass productions, thus losing the quality of products capable of social transformation. For years we have been living in a transformative moment as regards the design process in the definition of the industrial product. We are faced with a dichotomy in which there is, on the one hand, a reactionary aversion to the new techniques of industrial production and, on the other hand, a sterile adoption of the techniques of mass production that we can now consider traditional. This ambiguity becomes even more evident when we talk about industrial products, and we realize that we are moving further and further away from the concepts of "form" as a synthesis of a design thought aimed at the aesthetic-emotional component as well as the functional one. The design of forms and their contents, as statutes of social acts, allows us to investigate the tension on mass production that crosses seasons, trends, technicalities, and sterile determinisms. 
The design culture has always determined the formal qualities of objects as a sum of aesthetic characteristics and functional and structural relationships that define a product as a coherent unit. The contribution proposes a reflection and a series of practical research experiences on the form of advanced products. This form is understood as a kaleidoscope of relationships: the search for an identity, the desire for democratization, and, between these two, the exploration of the aesthetic factor. The study of form also corresponds to the study of production processes, technological innovations, the definition of standards, distribution, advertising, and the vicissitudes of taste and lifestyles. Specifically, we investigate how the genesis of new forms for new meanings introduces a change in the related innovative production techniques. It therefore becomes fundamental to investigate, through the reflections and case studies presented in the contribution, the new techniques of production and elaboration of product forms as a new immanent and determining element in the design process.
Keywords: industrial design, product advanced design, mass productions, new meanings
Procedia PDF Downloads 127
6309 Regional Flood-Duration-Frequency Models for Norway
Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu
Abstract:
Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often, design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required for different flood durations. A statistical approach to this problem is the development of a regression model for extremes in which some of the parameters depend on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the generalized extreme value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in developing the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV
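The distributional building block of a QDF model, a GEV fitted to annual-maximum floods and inverted at a target return period, can be sketched with SciPy. The data below are synthetic; a full regional QDF model additionally makes the GEV parameters covariate- and duration-dependent.

```python
# Minimal single-site sketch: fit a GEV to annual maxima and read off the
# T-year design flood as the (1 - 1/T) quantile. Synthetic data only.
# Note: SciPy's shape parameter c is the negative of the hydrological xi.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# 60 synthetic annual maximum floods (m^3/s) at one site
annual_max = genextreme.rvs(c=-0.1, loc=100, scale=30, size=60,
                            random_state=rng)

shape, loc, scale = genextreme.fit(annual_max)
T = 100                                    # return period in years
design_flood = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
print(round(design_flood, 1))
```

The regional problem the abstract raises appears exactly here: when shape, loc, and scale become regression functions of covariates and duration, the GEV's parameter-dependent support makes it hard to choose link functions that keep every observation inside the fitted distribution's support.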
Procedia PDF Downloads 75
6308 Field Deployment of Corrosion Inhibitor Developed for Sour Oil and Gas Carbon Steel Pipelines
Authors: Jeremy Moloney
Abstract:
A major oil and gas operator in western Canada producing approximately 50,000 BOE per day of sour fluids was experiencing increased water production along with decreased oil production over several years. The higher water volumes being produced meant an increase in the operator’s incumbent corrosion inhibitor (CI) chemical requirements but with reduced oil production revenues. Thus, a cost-effective corrosion inhibitor solution was sought to deliver enhanced corrosion mitigation of the carbon steel pipeline infrastructure at reduced chemical injection dose rates. This paper presents the laboratory work conducted on the development of a corrosion inhibitor under the operator’s simulated sour operating conditions and the subsequent field testing of the product. The new CI not only provided extremely good levels of general and localized corrosion inhibition and outperformed the incumbent CI under the laboratory test conditions, but did so at vastly lower concentrations. In turn, the novel CI allowed field chemical injection rates to be optimized and reduced by 40% compared with the incumbent whilst maintaining superior corrosion protection, resulting in significant cost savings and associated sustainability benefits for the operator.
Keywords: carbon steel, sour gas, hydrogen sulphide, localized corrosion, pitting, corrosion inhibitor
Procedia PDF Downloads 90
6307 Single-Molecule Analysis of Structure and Dynamics in Polymer Materials by Super-Resolution Technique
Authors: Hiroyuki Aoki
Abstract:
The physical properties of polymer materials depend on the conformation and molecular motion of the polymer chain. Therefore, the structure and dynamic behavior of the single polymer chain have been among the most important concerns in the field of polymer physics. However, it has been impossible to directly observe the conformation of a single polymer chain in a bulk medium. In the current work, novel techniques to study the conformation and dynamics of a single polymer chain are proposed. Since fluorescence methods are extremely sensitive, fluorescence microscopy enables the direct detection of a single molecule. However, the structure of a polymer chain as large as 100 nm cannot be resolved by conventional fluorescence methods because of the diffraction limit of light. In order to observe single chains, we developed a method for labeling polymer materials with a photo-switchable dye together with super-resolution microscopy. Real-space conformational analysis of single polymer chains with a spatial resolution of 15-20 nm was achieved. The super-resolution microscopy enables us to obtain three-dimensional coordinates; therefore, we achieved conformational analysis in three dimensions. Direct observation by nanometric optical microscopy would reveal detailed information on the molecular processes in various polymer systems.
Keywords: polymer materials, single molecule, super-resolution techniques, conformation
Procedia PDF Downloads 309
6306 Real Estate Price Classification Using Machine Learning Techniques
Authors: Hadeel Sulaiman Alamri, Mohamed Maher Ben Ismail, Ouiem Bchir
Abstract:
The continued advances in Artificial Intelligence (AI) and Machine Learning (ML) have boosted the interest of tax authorities in developing smart solutions as efficient alternatives to their current fraud detection mechanisms. In particular, the real estate data collected by the administrations has promoted efforts to develop advanced analytics models aimed at detecting fraudulent real estate transactions. Specifically, supervised and unsupervised machine learning techniques have been applied to the available large datasets to improve overall taxpayer compliance. This research introduces a machine learning approach intended to classify land and building prices in Saudi Arabia. Specifically, it groups reported real estate transactions into homogeneous groups based on relevant features, classifying land and building prices by Saudi city, neighborhood, and schema. The outcomes of the clustering task are then fed into a supervised machine learning process to categorize future real estate transactions into “Fair”, “Under-valued”, or “Over-valued” classes. The experimental findings indicate that associating clustering algorithms with a Random Forest (RF) model yields an accuracy of 99%.
Keywords: classification, clustering, machine learning, real estate price
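The two-stage pipeline described above (cluster, then classify) could be built with scikit-learn's KMeans and RandomForestClassifier. As a dependency-free sketch of the underlying idea only, the toy code below groups transactions into homogeneous groups and labels each one by its deviation from the group's median price per unit area; the field names, grouping keys, and 20% tolerance are all assumptions for illustration, not the paper's method:

```python
from statistics import median

def label_transactions(transactions, tolerance=0.2):
    """Group transactions by (city, schema), then label each one Fair /
    Under-valued / Over-valued by its deviation from the group's median
    price per square metre.  Field names are illustrative."""
    groups = {}
    for t in transactions:
        groups.setdefault((t["city"], t["schema"]), []).append(t)
    labels = {}
    for group in groups.values():
        med = median(t["price"] / t["area"] for t in group)
        for t in group:
            ratio = (t["price"] / t["area"]) / med
            if ratio < 1 - tolerance:
                labels[t["id"]] = "Under-valued"
            elif ratio > 1 + tolerance:
                labels[t["id"]] = "Over-valued"
            else:
                labels[t["id"]] = "Fair"
    return labels

# Synthetic transactions; the third is priced at half the group norm
deals = [
    {"id": 1, "city": "Riyadh", "schema": "A", "price": 500_000, "area": 100},
    {"id": 2, "city": "Riyadh", "schema": "A", "price": 520_000, "area": 100},
    {"id": 3, "city": "Riyadh", "schema": "A", "price": 250_000, "area": 100},
]
labels = label_transactions(deals)
```

In the actual study the grouping comes from learned clusters and the labeling from a trained Random Forest rather than a fixed tolerance band.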
Procedia PDF Downloads 10
6305 Systems of Liquid Organic Fertilizer Application with Respect to Environmental Impact
Authors: Hidayatul Fitri, Petr Šařec
Abstract:
The use of organic fertilizer is increasing, and application must be conducted accurately to provide the right benefits for plants and maintain soil health; improper fertilizer application can cause problems for both plants and the environment. This study investigated liquid organic fertilizer application, particularly digestate, at different doses with respect to mitigating adverse environmental impacts, improving water infiltration, and increasing crop yields. The experiment comprised eight variants with different digestate doses and covered emission monitoring and soil physical properties. Shallow injection (5 cm depth) was confirmed as an appropriate technique for applying liquid fertilizer into the soil. Gas emissions showed low concentrations and declined gradually over time, as demonstrated by two measurements taken immediately after application and on the following day. The applied digestate dose significantly affected NH3 volatilization, with emission concentrations decreasing by about 40% from the first to the second measurement. Winter wheat production increased significantly under digestate application with additional N fertilizer. The study suggests long-term digestate application to obtain further improvement of soil properties such as bulk density, penetration resistance, and hydraulic conductivity.
Keywords: liquid organic fertilizer, digestate, application, ammonia, emission
Procedia PDF Downloads 292
6304 Air Quality Analysis Using Machine Learning Models Under Python Environment
Authors: Salahaeddine Sbai
Abstract:
Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or to forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyze air quality change over northern Morocco and evaluate the impact of climate change on agriculture.
Keywords: air quality, machine learning models, pollution, pollutant emissions
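As a minimal stand-in for the kind of model described above, the sketch below fits a one-variable least-squares regression to synthetic pollutant-versus-temperature data; the data values and the linear form are illustrative only, not the study's models:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Synthetic history: an ozone proxy (µg/m3) rising with temperature (°C)
temps = [10, 15, 20, 25, 30]
ozone = [40, 50, 60, 70, 80]
a, b = fit_linear(temps, ozone)
predicted = a + b * 22  # forecast for a 22 °C day
```

Real air quality models would of course use many predictors (wind, emissions, terrain) and nonlinear learners, but the train-on-history, predict-forward loop is the same.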
Procedia PDF Downloads 96
6303 Imp_hist-Si: Improved Hybrid Image Segmentation Technique for Satellite Imagery to Decrease the Segmentation Error Rate
Authors: Neetu Manocha
Abstract:
Image segmentation is a technique in which a picture is partitioned into distinct parts with similar features that belong to the same objects. Various segmentation strategies have been proposed recently by prominent researchers, but after thorough research, the authors observed that the older methods generally do not decrease the segmentation error rate. The authors then developed the technique HIST-SI to decrease segmentation error rates, in which cluster-based and threshold-based segmentation techniques are merged. To improve the results of HIST-SI, the authors added filtering and linking steps, naming the resulting technique Imp_HIST-SI. The goal of this research is to find a new technique that decreases segmentation error rates and produces much better results than the HIST-SI technique. For testing the proposed technique, a dataset from Bhuvan, a National Geoportal developed and hosted by ISRO (Indian Space Research Organisation), is used. Experiments are conducted using the Scikit-image and OpenCV tools of Python, and performance is evaluated and compared against various existing image segmentation techniques on several metrics, i.e., Mean Squared Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
Keywords: satellite image, image segmentation, edge detection, error rate, MSE, PSNR, HIST-SI, linking, filtering, Imp_HIST-SI
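The two evaluation metrics named above have standard definitions, which a short sketch can make concrete (the pixel values and the 255 peak are illustrative):

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, max_val=255):
    """Peak signal-to-noise ratio in dB; infinite for identical inputs."""
    e = mse(a, b)
    return math.inf if e == 0 else 10 * math.log10(max_val ** 2 / e)

# Toy 4-pixel reference vs. segmentation output, each pixel off by 10
original = [0, 50, 100, 150]
segmented = [10, 60, 110, 160]
err = mse(original, segmented)       # 100.0
quality = psnr(original, segmented)  # about 28.1 dB
```

Lower MSE and higher PSNR indicate a segmentation closer to the reference, which is the sense in which the paper compares Imp_HIST-SI against existing techniques.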
Procedia PDF Downloads 143
6302 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
Procedia PDF Downloads 94
6301 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful data from enormous and complicated datasets can be termed "big data." Big data mining and analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the concepts of data mining and analysis and recently created knowledge discovery techniques, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 114
6300 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of the enterprise is rapidly changing, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
By using process mining techniques, major evaluation criteria such as utilization of workstations, existence of bottleneck workstations, critical process route patterns, and work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of the generated production schedule, the quality of production schedules of manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
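One of the evaluation criteria named above, workstation utilization, can be computed directly from event-log records of the form (workstation, start, end). A minimal sketch, with invented log entries and station names (real process mining tools additionally perform process discovery, conformance checking, and route-pattern analysis):

```python
from datetime import datetime

def utilization(events, horizon_start, horizon_end):
    """Busy-time fraction per workstation from (station, start, end)
    event-log records; the station with the highest value is a
    bottleneck candidate."""
    horizon = (horizon_end - horizon_start).total_seconds()
    busy = {}
    for station, start, end in events:
        busy[station] = busy.get(station, 0.0) + (end - start).total_seconds()
    return {station: t / horizon for station, t in busy.items()}

def hour(h):
    """Helper: a timestamp at hour h on an arbitrary fixed day."""
    return datetime(2024, 1, 1, h)

# Invented one-shift event log: the press runs 6 of 8 hours, the lathe 2
log = [
    ("press", hour(8), hour(12)),
    ("press", hour(13), hour(15)),
    ("lathe", hour(9), hour(11)),
]
util = utilization(log, hour(8), hour(16))
bottleneck = max(util, key=util.get)
```

A schedule whose utilizations are badly unbalanced, or whose busiest station sits near 100%, would fail the work-load-balance and bottleneck criteria described above.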
Procedia PDF Downloads 283
6299 Human-Tiger Conflict in Chitwan National Park, Nepal
Authors: Abishek Poudel
Abstract:
Human-tiger conflict is a serious issue between local people and the park authority, and the conflict situation potentially plays a negative role in park management. The study aimed (1) to determine the trend and nature of human-tiger conflicts and (2) to understand people's perceptions of and mitigation measures for tiger conservation. Both primary and secondary information were used to determine human-tiger conflicts in Chitwan National Park. Systematic random sampling with 5% intensity was used to collect villagers' perceptions regarding human-tiger conflicts. The study sites, Rajahar and Ayodhyapuri VDCs, were selected based on the frequencies of human attacks and livestock depredation, respectively. Human casualties from tiger attacks have increased over the last five years, whereas livestock depredation has decreased. Reportedly, between 2008 and 2012, tigers killed 22 people, injured 10, and killed at least 213 livestock. Conflict was less common in the park and more intense in the sub-optimal habitats of the Buffer Zone. Goats were the most vulnerable livestock, followed by cattle. Livestock grazing and human intrusion into tiger habitat were the causes of conflicts. Developing local stewardship and support for tiger conservation, livestock insurance, and simplification of the compensation policy may help reduce human-tiger conflicts.
Keywords: livestock depredation, sub-optimal habitat, human-tiger, local stewardship
Procedia PDF Downloads 465
6298 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNN) have demonstrated high performance in image analysis, but oftentimes, there is only structured data available regarding a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. In applying a single neural network for analyzing multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNN in multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data is transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. This final image is finally analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
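A minimal sketch of the tabular-to-image preprocessing step described above: each feature is min-max scaled to the 0-255 pixel range and laid out on a small grid. The grid size, zero-padding scheme, and feature values are assumptions for illustration, not the authors' transformation:

```python
def row_to_image(row, mins, maxs, size=4):
    """Scale each feature of a tabular row to 0-255 and lay the values out
    on a size x size grid (zero-padded), giving a single-channel 'image'
    a CNN could consume alongside real images."""
    pixels = [
        round(255 * (v - lo) / (hi - lo)) if hi > lo else 0
        for v, lo, hi in zip(row, mins, maxs)
    ]
    pixels += [0] * (size * size - len(pixels))  # pad unused cells
    return [pixels[i * size:(i + 1) * size] for i in range(size)]

# Three invented features with invented per-feature ranges
img = row_to_image([3.0, 10.0, 0.5], mins=[0, 0, 0], maxs=[6, 10, 1], size=4)
```

In the fusion step described in the abstract, such a synthetic channel would then be stacked or merged with the existing image material before being fed to the CNN.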
Procedia PDF Downloads 128
6297 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is not a defective bean, but neither is it a normal one: it forms as a single, relatively round seed in the coffee cherry instead of the usual flat-sided pair of beans, and it has a distinct value and flavor. To improve the taste of the coffee, it is necessary to separate peaberries from normal beans before roasting the green coffee beans; otherwise, the flavors of the beans will be mixed and degraded. During roasting, all beans should be uniform in shape, size, and weight; otherwise, larger beans take longer to roast through. Peaberries have a different size and shape even when they have the same weight as normal beans, and they roast more slowly, so neither size nor weight provides a good criterion for selecting them. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. The peaberry, on the other hand, is very difficult to pick out even for trained specialists, because its shape and color are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal and peaberry beans as part of the sorting system. As a first step, we applied Deep Convolutional Neural Networks (CNN) and Support Vector Machines (SVM) as machine learning techniques to discriminate peaberries from normal beans. As a result, better performance was obtained with the CNN than with the SVM for discriminating peaberries. The network trained on high-performance CPUs and GPUs in this work will then be installed on an inexpensive, computationally modest Raspberry Pi system.
We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
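As noted above, peaberries are round while normal beans are flat-sided. A crude geometric pre-filter based on silhouette aspect ratio illustrates the kind of handcrafted feature a classical pipeline might use; the 0.85 threshold and the measurements below are invented, and the paper's CNN/SVM models instead learn such cues from the images directly:

```python
def is_peaberry_candidate(width_mm, height_mm, threshold=0.85):
    """Flag beans whose silhouette is nearly round.  Peaberries are round
    while normal beans are flat-sided, so the aspect ratio of the bean's
    bounding box is a crude separating feature.  The 0.85 threshold is
    purely illustrative, not from the paper."""
    ratio = min(width_mm, height_mm) / max(width_mm, height_mm)
    return ratio >= threshold

round_bean = is_peaberry_candidate(8.0, 7.6)  # ratio 0.95: candidate
flat_bean = is_peaberry_candidate(9.0, 6.0)   # ratio 0.67: not a candidate
```

Such a rule alone is far too weak for sorting, which is exactly why the study turns to learned image classifiers.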
Procedia PDF Downloads 148
6296 Operationalizing the Concept of Community Resilience through Community Capitals Framework-Based Index
Authors: Warda Ajaz
Abstract:
This study uses the ‘Community Capitals Framework’ (CCF) to develop a community resilience index that can serve as a useful tool for measuring the resilience of communities in diverse contexts and backgrounds. CCF is an important analytical tool for assessing holistic community change. The framework identifies seven major types of community capitals: natural, cultural, human, social, political, financial, and built, and claims that communities that have been successful in supporting healthy, sustainable community and economic development have paid attention to all of these capitals. The framework therefore proposes to study community development through the identification of assets in these major capitals (stock), investment in these capitals (flow), and the interaction between these capitals. Capital-based approaches have been used extensively to assess community resilience, especially in the context of natural disasters and extreme events. This study therefore identifies key indicators for estimating each of the seven capitals through an extensive literature review and then develops an index to calculate a community resilience score. The CCF-based community resilience index presents an innovative way of operationalizing the concept of community resilience and will contribute toward decision-relevant research on adaptation to and mitigation of community vulnerabilities to climate change-induced and other adverse events.
Keywords: adverse events, community capitals, community resilience, climate change, economic development, sustainability
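A composite index over the seven capitals can be sketched as a weighted average of normalized capital scores. The equal default weights and the sample scores below are illustrative assumptions, not the study's calibrated indicators:

```python
CAPITALS = ["natural", "cultural", "human", "social",
            "political", "financial", "built"]

def resilience_index(scores, weights=None):
    """Composite index over the seven community capitals.  Each score is
    an already-normalized value in [0, 1]; equal weights by default.
    The aggregation rule is illustrative, not the study's index."""
    if weights is None:
        weights = {c: 1.0 for c in CAPITALS}
    total = sum(weights[c] for c in CAPITALS)
    return sum(weights[c] * scores[c] for c in CAPITALS) / total

# Invented normalized capital scores for a hypothetical community
scores = {"natural": 0.8, "cultural": 0.6, "human": 0.7, "social": 0.9,
          "political": 0.5, "financial": 0.4, "built": 0.6}
index = resilience_index(scores)
```

In practice each capital score would itself be built from the literature-derived indicators the abstract mentions, and the weights would reflect their relative importance.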
Procedia PDF Downloads 271
6295 Role of Nano-Technology on Remediation of Poly- and Perfluoroalkyl Substances Contaminated Soil and Ground Water
Authors: Leila Alidokht
Abstract:
PFAS (poly- and perfluoroalkyl substances) are a large collection of environmentally persistent organic chemicals of industrial origin that have a negative influence on human health and ecosystems. Many distinct PFAS (on the order of thousands) are utilized in a wide range of applications, and there is no comprehensive source of information on the many different compounds and their roles in diverse applications. Facilities are increasingly looking into ways to reduce waste from cleanup projects. PFAS are widespread in the environment, have been found in a wide range of human biomonitoring investigations, and are a rising source of regulatory concern for federal, state, and local governments. Nanotechnology has the potential to contribute considerably to the creation of cleaner, greener technologies with considerable environmental and health benefits. Nanotechnology approaches are being studied for their potential to provide pollution management and mitigation options, as well as to increase the effectiveness of standard environmental cleanup procedures. Various nanoparticles have proven useful in removing certain pollutants from their original environment, such as sewage spills and landmines. Furthermore, they pose a low hazard during production and can thus be explored further in the future to make them more compatible with lower production costs.
Keywords: PFOS, PFOA, PFAS, soil remediation
Procedia PDF Downloads 117