Search results for: smelting techniques
5635 Comparative Analysis of Control Techniques Based Sliding Mode for Transient Stability Assessment for Synchronous Multicellular Converter
Authors: Rihab Hamdi, Amel Hadri Hamida, Fatiha Khelili, Sakina Zerouali, Ouafae Bennis
Abstract:
This paper presents a comparative performance study of sliding mode control (SMC) for closed-loop voltage regulation of a parallel-connected direct current to direct current (DC-DC) three-cell buck converter operating in continuous conduction mode (CCM): SMC based on pulse-width modulation (PWM) is compared against SMC based on hysteresis modulation (HM), in which an adaptive feedforward technique is adopted. On the one hand, for the PWM-based SM, the approach is to incorporate a fixed-frequency PWM scheme which is effectively a variant of SM control. On the other hand, for the HM-based SM, an adaptive feedforward control is introduced that makes the hysteresis band in the hysteresis modulator of the SM controller variable, with the aim of restricting the switching frequency variation under any change of the line input voltage or output load. The results obtained under load change, input change and reference change clearly demonstrate a similar dynamic response for both proposed techniques; their effectiveness lies in fast and smooth tracking of the desired output voltage. The PWM-based SM technique greatly improves the dynamic behavior and is slightly advantageous compared to the HM-based SM technique, as well as providing stability under all operating conditions. Simulation studies in the MATLAB/Simulink environment have been performed to verify the concept.
Keywords: DC-DC converter, hysteresis modulation, parallel multi-cells converter, pulse-width modulation, robustness, sliding mode control
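As an illustration of the hysteresis-modulation idea described above, the following sketch simulates a single buck cell under a sliding mode controller with a fixed hysteresis band. All component values, the surface slope lam, and the band h are assumed for illustration and are not taken from the paper; an adaptive feedforward scheme would additionally scale h with the input voltage to hold the switching frequency roughly constant.

```python
import numpy as np

# Illustrative buck-cell parameters (assumed, not from the paper)
Vin, L, C, R = 24.0, 220e-6, 470e-6, 10.0
Vref = 12.0         # desired output voltage
lam = 1e4           # sliding-surface slope: s = de/dt + lam * e
h = 0.5             # fixed hysteresis half-band (adaptive FF would vary this)
dt, T = 1e-7, 5e-3  # integration step and horizon

iL, vC, u = 0.0, 0.0, 0
for _ in range(int(T / dt)):
    e = Vref - vC                  # voltage error
    de = -(iL - vC / R) / C        # de/dt = -dvC/dt
    s = de + lam * e               # sliding surface
    # Hysteresis modulation: toggle the switch only when s leaves the band
    if s > h:
        u = 1
    elif s < -h:
        u = 0
    # Forward-Euler step of the buck-cell dynamics
    iL += dt * (u * Vin - vC) / L
    vC += dt * (iL - vC / R) / C

print(f"steady-state output ~ {vC:.2f} V (target {Vref} V)")
```

In a three-cell parallel converter, the same surface would be evaluated per cell with interleaved switching; this sketch keeps one cell to show the mechanism.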
Procedia PDF Downloads 167

5634 Advances in Genome Editing and Future Prospects for Sorghum Improvement: A Review
Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie Teklu
Abstract:
Recent developments in targeted genome editing have accelerated genetic research and opened new possibilities for improving crops for better yields and quality. Given the significance of cereal crops as a primary source of food for the global population, the utilization of contemporary genome editing techniques like CRISPR/Cas9 is timely and crucial. CRISPR/Cas technology has enabled targeted genomic modifications, revolutionizing genetic research and exploration. The application of CRISPR/Cas9 gene editing to enhance sorghum is particularly vital given the current ecological, environmental, and agricultural challenges exacerbated by climate change. As sorghum is one of the main staple foods of our region and is known to be a resilient crop with a high potential to overcome the above challenges, the application of genome editing technology will enhance the investigation of gene functionality. CRISPR/Cas9 enables the improvement of desirable sorghum traits, including nutritional value, yield, resistance to pests and diseases, and tolerance to various abiotic stresses. Furthermore, CRISPR/Cas9 has the potential to perform intricate editing, reshape existing elite sorghum varieties, and introduce new genetic variations. However, current research primarily focuses on improving the efficacy of the CRISPR/Cas9 system in successfully editing endogenous sorghum genes, making it a feasible and successful undertaking in sorghum improvement. Recent advancements in CRISPR/Cas9 techniques have further empowered researchers to modify additional genes in sorghum with greater efficiency. Successful application and advancement of CRISPR techniques in sorghum will aid not only in gene discovery and the creation of novel traits that regulate gene expression and functional genomics but also in facilitating site-specific integration events. The purpose of this review is, therefore, to elucidate the current advances in sorghum genome editing and highlight its potential in addressing food security issues. It also assesses the efficiency of CRISPR-mediated improvement and its long-term effects on crop improvement and host resistance against parasites, including tissue-specific activity and the ability to induce resistance. This review ends by emphasizing the challenges and opportunities of CRISPR technology in combating parasitic plants and proposing directions for future research to safeguard global agricultural productivity.
Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield
Procedia PDF Downloads 38

5633 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra tree regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectra data are randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data are randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
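The repeated random train/test protocol described above can be sketched as follows. The models, split ratio, and synthetic spectra here are stand-ins (the paper's data and exact hyperparameters are not reproduced); the point is the ten-fold repetition of the random split and the per-model averaging of scores.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.metrics import r2_score

# Synthetic stand-ins for NIR spectra (samples x wavelengths) and glucose labels
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 600))
y = X[:, :10].sum(axis=1) * 20 + rng.normal(scale=2.0, size=200)

models = {
    "SVMR": SVR(kernel="rbf", C=100.0),
    "PLSR": PLSRegression(n_components=10),
    "ETR":  ExtraTreesRegressor(n_estimators=200, random_state=0),
    "RFR":  RandomForestRegressor(n_estimators=200, random_state=0),
}

# Repeat the random train/test split ten times, as described in the abstract
for name, model in models.items():
    scores = []
    for seed in range(10):
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=seed)
        model.fit(Xtr, ytr)
        scores.append(r2_score(yte, np.asarray(model.predict(Xte)).ravel()))
    print(f"{name}: mean R^2 over 10 splits = {np.mean(scores):.3f}")
```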
Procedia PDF Downloads 94

5632 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and to report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. With the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based and neural network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
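The watchlist-screening step described above amounts to fuzzy string matching between OCR output and list entries. A minimal sketch, assuming a tiny in-memory stand-in for the OFAC list and using only the standard library (a real pipeline would first run an OCR engine such as pytesseract's image_to_string over the scanned document):

```python
import difflib

# Stand-in for the OFAC watchlist; the real list is much larger
WATCHLIST = ["IVAN PETROV", "ACME TRADING LLC", "JOHN DOE"]

def screen_name(extracted: str, threshold: float = 0.85):
    """Fuzzy-match an OCR-extracted name against the watchlist.

    OCR output often contains character-level noise ('1' for 'I', '0' for 'O'),
    so exact matching would miss true hits; a similarity ratio tolerates it.
    """
    hits = []
    for entry in WATCHLIST:
        score = difflib.SequenceMatcher(None, extracted.upper(), entry).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

print(screen_name("ACME TRAD1NG LLC"))   # -> [('ACME TRADING LLC', 0.94)]
print(screen_name("JANE ROE"))           # -> [] (no hit above threshold)
```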
Procedia PDF Downloads 144

5631 The Utilization of Tea Extract within the Realm of the Food Industry
Authors: Raana Babadi Fathipour
Abstract:
Tea, a beverage widely cherished across the globe, has captured the interest of scholars following recent acknowledgement of its noteworthy health advantages. Of particular significance is its proven ability to ward off ailments such as cancer and cardiovascular afflictions. Moreover, within the realm of culinary creations, lipid oxidation poses a significant challenge for food product development. In light of these concerns, the present discourse turns its attention towards exploring the diverse methodologies employed in extracting polyphenols from various types of tea leaves and examining their utility within the vast landscape of the ever-evolving food industry. Based on the discoveries unearthed in this comprehensive investigation, it has been determined that the fundamental constituents of tea are polyphenols possessing intrinsic health-enhancing properties. These include an assortment of catechins, namely epicatechin, epigallocatechin, epicatechin gallate, and epigallocatechin gallate. Moreover, gallic acid, flavonoids, flavonols and theaflavins have also been detected within this aromatic beverage. Of the myriad components examined in this study's analysis, the catechins emerge as particularly beneficial. Multiple techniques have emerged over time to successfully extract key compounds from tea plants, including solvent-based extraction methodologies, microwave-assisted water extraction approaches and ultrasound-assisted extraction techniques. In particular, consideration is given to the microwave-assisted water extraction method as a viable scheme which effectively procures valuable polyphenols from tea extracts. This methodology appears adaptable for implementation within sectors such as dairy production along with the meat and oil industries alike.
Keywords: Camellia sinensis, extraction, food application, shelf life, tea
Procedia PDF Downloads 70

5630 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review
Authors: Faisal Muhibuddin, Ani Dijah Rahajoe
Abstract:
This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and other clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.
Keywords: data mining, classification algorithm, Naïve Bayes, K-means clustering, K-nearest neighbour, crime, data analysis, systematic literature review
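To make the division of labour between these algorithms concrete, the sketch below applies the supervised classifiers (Naïve Bayes, K-nearest neighbour) and the unsupervised one (K-means) to a synthetic stand-in for engineered crime-record features; the dataset, feature count, and class count are assumptions for illustration only.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

# Synthetic stand-in for crime-record features (e.g. time, location, modus operandi)
X, y = make_classification(n_samples=500, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# Supervised classification: predict the crime category of unseen records
for clf in (GaussianNB(), KNeighborsClassifier(n_neighbors=5)):
    pred = clf.fit(Xtr, ytr).predict(Xte)
    print(type(clf).__name__, "accuracy:", round(accuracy_score(yte, pred), 3))

# Unsupervised clustering: group incidents (e.g. hotspots) without labels
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```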
Procedia PDF Downloads 65

5629 Application of Electrochemical Impedance Spectroscopy to Monitor the Steel/Soil Interface During Cathodic Protection of Steel in Simulated Soil Solution
Authors: Mandlenkosi George Robert Mahlobo, Tumelo Seadira, Major Melusi Mabuza, Peter Apata Olubambi
Abstract:
Cathodic protection (CP) has been widely considered a suitable technique for mitigating corrosion of buried metal structures. Considerable effort has gone into developing techniques, in particular non-destructive techniques, for monitoring and quantifying the effectiveness of CP to ensure the sustainability and performance of buried steel structures. The aim of this study was to investigate the evolution of the electrochemical processes at the steel/soil interface during the application of CP on steel in simulated soil. Carbon steel was subjected to electrochemical tests in NS4 solution, used to simulate soil conditions, for 4 days before CP was applied for a further 11 days. A previously modified non-destructive voltammetry technique was applied before and after the application of CP to measure the corrosion rate. Electrochemical impedance spectroscopy (EIS), in combination with mathematical modeling through equivalent electric circuits, was applied to determine the electrochemical behavior at the steel/soil interface. The measured corrosion rate was found to have decreased from 410 µm/yr to 8 µm/yr between days 5 and 14 because of the applied CP. Equivalent electrical circuits were successfully constructed and used to adequately model the EIS results. The modeling of the obtained EIS results revealed the formation of corrosion products via a mixed activation-diffusion mechanism during the first 4 days, while the activation mechanism prevailed in the presence of CP, resulting in a protective film. X-ray diffraction analysis confirmed the presence of corrosion products and the predominant protective film corresponding to the calcareous deposit.
Keywords: carbon steel, cathodic protection, NS4 solution, voltammetry, EIS
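Fitting an equivalent circuit to an EIS spectrum is, in essence, a complex nonlinear least-squares problem. The sketch below fits a simple Randles cell (solution resistance Rs in series with a charge-transfer resistance Rct in parallel with a double-layer capacitance Cdl) to a synthetic spectrum; the circuit choice and all parameter values are assumptions for illustration, and the diffusion-controlled regime reported before CP would additionally need a Warburg element.

```python
import numpy as np
from scipy.optimize import least_squares

def randles_z(params, omega):
    """Impedance of a simple Randles cell: Rs + (Rct || Cdl)."""
    Rs, Rct, Cdl = params
    return Rs + Rct / (1 + 1j * omega * Rct * Cdl)

# Synthetic EIS spectrum standing in for a steel/soil-interface measurement
omega = 2 * np.pi * np.logspace(-2, 4, 60)        # 10 mHz .. 10 kHz
true_params = (20.0, 800.0, 1e-4)                 # ohm, ohm, farad (assumed)
z_meas = randles_z(true_params, omega)

def residuals(p):
    dz = randles_z(p, omega) - z_meas
    return np.concatenate([dz.real, dz.imag])     # real-valued residual vector

fit = least_squares(residuals, x0=[10.0, 500.0, 1e-5])
print("fitted Rs, Rct, Cdl:", fit.x)
```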
Procedia PDF Downloads 64

5628 Review of Downscaling Methods in Climate Change and Their Role in Hydrological Studies
Authors: Nishi Bhuvandas, P. V. Timbadiya, P. L. Patel, P. D. Porey
Abstract:
Recent perceived climate variability raises concerns about unprecedented hydrological phenomena and extremes. The distribution and circulation of the waters of the Earth become increasingly difficult to determine because of the additional uncertainty related to anthropogenic emissions. According to the sixth Intergovernmental Panel on Climate Change (IPCC) Technical Paper, on climate change and water, changes in the large-scale hydrological cycle have been related to an increase in the observed temperature over several decades. Although much previous research on the effect of climate change on hydrology provides a general picture of possible global hydrological change, new tools and frameworks for modelling hydrological series with nonstationary characteristics at finer scales are required for assessing climate change impacts. Of the downscaling techniques, dynamic downscaling is usually based on the use of Regional Climate Models (RCMs), which generate finer resolution output based on atmospheric physics over a region using General Circulation Model (GCM) fields as boundary conditions. However, RCMs are not expected to capture the observed spatial precipitation extremes at a fine cell scale or at a basin scale. Statistical downscaling derives a statistical or empirical relationship between the variables simulated by the GCMs, called predictors, and station-scale hydrologic variables, called predictands. The main focus of the paper is on the need for using statistical downscaling techniques for the projection of local hydrometeorological variables under climate change scenarios. The projections can then serve as an input source to various hydrologic models to obtain streamflow, evapotranspiration, soil moisture and other hydrological variables of interest.
Keywords: climate change, downscaling, GCM, RCM
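In its simplest regression-based form, the statistical downscaling transfer function just described can be sketched as below. The predictors, predictand, and linear model are illustrative stand-ins (real studies screen predictors and often use nonlinear or weather-generator methods); the key pattern is calibrating on a historical period and validating on a held-out one before applying the function to GCM scenario output.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-ins: grid-scale GCM predictors (e.g. pressure, humidity,
# temperature fields) and a station-scale predictand (e.g. monthly rainfall)
rng = np.random.default_rng(1)
predictors = rng.normal(size=(360, 5))                 # 30 years, monthly
predictand = (predictors @ np.array([3.0, -1.5, 0.8, 0.0, 2.1])
              + rng.normal(scale=1.0, size=360))

# Calibrate the transfer function on the first 20 years...
model = LinearRegression().fit(predictors[:240], predictand[:240])
# ...validate on the held-out decade before using GCM scenario fields
print("validation R^2:", round(model.score(predictors[240:], predictand[240:]), 3))

# Projection step: feed scenario-run predictors through the same function
scenario = rng.normal(size=(120, 5))                   # assumed future fields
local_projection = model.predict(scenario)
```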
Procedia PDF Downloads 406

5627 Accessibility Assessment of School Facilities Using Geospatial Technologies: A Case Study of District Sheikhupura
Authors: Hira Jabbar
Abstract:
Education is vital for the inclusive growth of an economy and a critical contributor to investment in human capital. Like other developing countries, Pakistan is facing enormous challenges regarding the provision of public facilities, improper infrastructure planning, an accelerating rate of population growth and poor accessibility. Rapid advancements and innovations in GIS and RS techniques have proved to be useful tools for better planning and decision-making to encounter these challenges. Therefore, the present study incorporates GIS and RS techniques to investigate the spatial distribution of school facilities, identify settlements with served and unserved populations, find potential areas for new schools based on population, and develop an accessibility index to evaluate accessibility to schools. For this purpose, high-resolution WorldView imagery was used to map the road network, settlements and school facilities and to generate school accessibility for each level. Landsat 8 imagery was utilized to extract the built-up area by applying pre- and post-processing models, and LandScan 2015 was used to analyze population statistics. Service area analysis was performed using the Network Analyst extension in ArcGIS v10.3, and the results were evaluated for served and underserved areas and populations. An accessibility tool was used to evaluate a set of potential destinations to determine which is the most accessible given the population distribution. Findings of the study may contribute to facilitating town planners and education authorities in understanding the existing patterns of school facilities. It is concluded that GIS and remote sensing can be effectively used in urban transport and facility planning.
Keywords: accessibility, geographic information system, LandScan, WorldView
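Outside of ArcGIS, the core of a service area analysis is a multi-source shortest-path computation over the road network. A minimal sketch with an assumed toy network (node names, travel times, and the 20-minute cutoff are all illustrative):

```python
import networkx as nx

# Toy road network: nodes are settlements/junctions, weights are travel minutes
G = nx.Graph()
G.add_weighted_edges_from([
    ("school_A", "v1", 5), ("v1", "v2", 10), ("v2", "v3", 12),
    ("school_B", "v3", 4), ("v1", "v4", 25),
])

schools = ["school_A", "school_B"]
cutoff = 20  # assumed service-area threshold in minutes

# Travel time from every settlement to its nearest school (multi-source Dijkstra)
dist = nx.multi_source_dijkstra_path_length(G, schools, weight="weight")
for node in sorted(G.nodes):
    t = dist.get(node, float("inf"))
    print(f"{node}: {t:>5.1f} min ->", "served" if t <= cutoff else "underserved")
```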
Procedia PDF Downloads 325

5626 The Art of Contemporary Arabic Calligraphy in Oman: Salman Alhajri as an Example
Authors: Salman Amur Alhajri
Abstract:
Purpose: This paper explores the art of contemporary Arabic calligraphy in Oman. It explains the aesthetic features of Arabic calligraphy as a unique icon of Islamic art. This paper also explores the profile of one Omani artist, Salman Alhajri, as an example of Omani artists who have developed unique styles in this art stream. Methodology and approach: The paper is based on a theoretical study using a descriptive and case-study approach. Omani artists are fascinated by the art forms of Arabic calligraphy, which combine both spiritual meaning and aesthetic beauty. Artist Salman Alhajri is an example of a contemporary Arabic artist who uses Arabic calligraphy as the main theme in his art. Dr. Alhajri is trying to introduce the beauty of Arabic letters from a new aesthetic point of view. He also aims to create unusual visual effects that viewers can easily interact with. Even though words and phrases appear in Alhajri’s artwork, they do not convey direct meanings: viewers can create their own meanings or expressions from them by appreciating the compositions of the artwork. Results: Arabic writing is directly related to the identity of Omani artists and their cultural background. This paper shows how the beauty of Arabic letters comes from their indefinite possibilities in designing calligraphic expressions, even within a single word, because letters can be stretched and transformed in various ways to create different compositions. Omani artists are interested in employing new media applications in this kind of practice to find new techniques for creating artwork based on Arabic writing. It is really important for all Omani artists to practice this art style because Arabic calligraphy and its flexibility introduce infinite possibilities that invite further exploration and investigation.
Keywords: Islamic art, contemporary Arabic calligraphy, new techniques, Omani artist
Procedia PDF Downloads 360

5625 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist in doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT and Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves high predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embedding, based on a pre-trained clinical model, to balance the samples in each class in the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
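The fine-tuning step has the usual sequence-classification shape. A minimal sketch with the Hugging Face transformers library is below; the checkpoint name, the two toy notes, and all training arguments are assumptions for illustration, not the paper's actual data or configuration.

```python
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Assumed clinicalBERT-style base checkpoint
NAME = "emilyalsentzer/Bio_ClinicalBERT"
tok = AutoTokenizer.from_pretrained(NAME)
model = AutoModelForSequenceClassification.from_pretrained(NAME, num_labels=2)

notes = ["pt improving, extubated overnight", "worsening hypoxia, pressors started"]
labels = [0, 1]          # 0 = survived, 1 = deceased (outcome proxy for sentiment)
enc = tok(notes, truncation=True, padding=True, return_tensors="pt")

class NoteDataset(torch.utils.data.Dataset):
    def __len__(self):
        return len(labels)
    def __getitem__(self, i):
        item = {k: v[i] for k, v in enc.items()}
        item["labels"] = torch.tensor(labels[i])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="covid_icu_bert", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=NoteDataset(),
)
trainer.train()
```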
Procedia PDF Downloads 206

5624 Prevalence and Associated Factors of Periodontal Disease among Diabetes Patients in Addis Ababa, Ethiopia, 2018
Authors: Addisu Tadesse Sahile, Tennyson Mgutshini
Abstract:
Background: Periodontal disease is a common, complex, inflammatory disease characterized by the destruction of the tooth-supporting soft and hard tissues of the periodontium, and a major public health problem across developed and developing countries. Objectives: The study was aimed at assessing the prevalence of periodontal disease and associated factors among diabetes patients in Addis Ababa, Ethiopia, 2018. Methods: An institution-based cross-sectional study was conducted on 388 diabetes patients selected by a systematic random sampling method from March to May 2018. The study was conducted at two conveniently selected public hospitals in Addis Ababa. Data were collected with a pre-tested, structured and translated questionnaire and then entered into SPSS version 23 software for analysis. Descriptive statistics as a summary, together with chi-square tests and binary logistic regression, were applied to identify factors associated with periodontal disease. A 95% CI with a p-value less than 5% was used as the level of significance. Results: Ninety-one percent (n=353) of participants had periodontal disease on oral examination of six regions, while only 9% (n=35) of participants were free of periodontal disease. The number of tooth brushings per day, correct brushing technique, malocclusion, and defective fillings were associated with periodontal disease at p < 0.05. Conclusion and recommendation: A higher prevalence of periodontal disease among diabetes patients was observed. The frequency of tooth brushing, correct brushing technique, malocclusion and defective fillings were associated with periodontal disease. Emphasis has to be given to the oral health of diabetes patients by every concerned body so as to control the current high burden of periodontal disease in diabetes.
Keywords: periodontal disease, risk factors, diabetes mellitus, Addis Ababa
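The analysis pipeline named above (chi-square for association, binary logistic regression for adjusted effects) translates directly into code. A minimal sketch on synthetic stand-in data (the variable names and n = 388 mirror the abstract; the values are random):

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm

# Synthetic stand-in for the survey of 388 diabetes patients
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "periodontal":       rng.integers(0, 2, 388),   # 1 = disease present
    "brush_twice_daily": rng.integers(0, 2, 388),
    "correct_technique": rng.integers(0, 2, 388),
})

# Chi-square test of association between brushing frequency and disease
table = pd.crosstab(df["brush_twice_daily"], df["periodontal"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# Binary logistic regression; exponentiated coefficients are odds ratios
X = sm.add_constant(df[["brush_twice_daily", "correct_technique"]])
result = sm.Logit(df["periodontal"], X).fit(disp=0)
print(np.exp(result.params))   # odds ratios; CIs via result.conf_int()
```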
Procedia PDF Downloads 128

5623 Effect of Spontaneous Ripening and Drying Techniques on the Bioactive Activities Peel of Plantain (Musa paradisiaca) Fruit
Authors: Famuwagun A. A., Abiona O. O., Gbadamosi S.O., Adeboye O. A., Adebooye O. C.
Abstract:
The need to provide more information on the perceived bioactive status of the peel of the plantain fruit informed the design of this research. Mature plantain fruits were harvested and allowed to ripen spontaneously. Samples of plantain fruit were taken every fortnight, and the peels were removed. The peels were dried using two different drying techniques (oven drying and sun drying) and milled into powdery forms. Other samples were picked and processed in a similar manner on the first, third, seventh and tenth day until the peels of the fruits were fully ripe, resulting in eight different samples. The antioxidative properties of the samples were evaluated using different assays (DPPH, FRAP, MCA, HRSA, SRSA, ABTS, ORAC), along with inhibitory activities against enzymes related to diabetes (alpha-amylase and alpha-glucosidase) and inhibition of the angiotensin-converting enzyme (ACE). The results showed that peels of plantain fruits sun-dried on the 7th day of ripening exhibited greater inhibition of free radicals, which enhanced their antioxidant activities and resulted in greater inhibition of the alpha-amylase and alpha-glucosidase enzymes. Also, the oven-dried sample of the peel on the 7th day of ripening had a greater phenolic content than the other samples, which resulted in higher inhibition of the angiotensin-converting enzyme when compared with the other samples. The results showed that even though the unripe peel of the plantain fruit is assumed to have excellent bioactive activities, the fruit should be allowed to ripen for seven days after maturity and harvesting before the peel is consumed, so as to derive maximum benefit from it.
Keywords: functional ingredient, diabetics, hypertension, functional foods
Procedia PDF Downloads 51

5622 Experimental Study of the Behavior of Elongated Non-spherical Particles in Wall-Bounded Turbulent Flows
Authors: Manuel Alejandro Taborda Ceballos, Martin Sommerfeld
Abstract:
Transport phenomena and the dispersion of non-spherical particles in turbulent flows are found everywhere in industrial applications and processes. Powder handling, pollution control, pneumatic transport and particle separation are just some examples where the particles encountered are not only spherical. These types of multiphase flows are wall-bounded and mostly highly turbulent. The particles found in these processes are rarely spherical but may have various shapes (e.g., fibers and rods). Although research related to the behavior of regular non-spherical particles in turbulent flows has been carried out for many years, it is still necessary to refine models, especially near walls, where the fiber-wall interaction completely changes the particle behavior. Imaging-based experimental studies on dispersed particle-laden flows have been applied for many decades for detailed experimental analysis. These techniques have the advantage that they provide field information in two or three dimensions, but they have a lower temporal resolution compared to point-wise techniques such as PDA (phase-Doppler anemometry) and derivations therefrom. The imaging techniques applied in dispersed two-phase flows are extensions of classical PIV (particle image velocimetry) and PTV (particle tracking velocimetry), and the main emphasis has been the simultaneous measurement of the velocity fields of both phases. In a similar way, such data should also provide adequate information for validating the proposed models. Available experimental studies on the behavior of non-spherical particles are uncommon and mostly based on planar light-sheet measurements. Especially for elongated non-spherical particles, however, three-dimensional measurements are needed to fully describe their motion and to provide sufficient information for the validation of numerical computations. To further provide detailed experimental results allowing validation of numerical calculations of non-spherical particle dispersion in turbulent flows, a test facility was built around a horizontal closed water channel. Into this horizontal main flow, a small cross-jet laden with fiber-like particles was injected, driven solely by gravity. The dispersion of the fibers was measured by applying imaging techniques based on an LED array for backlighting and high-speed cameras. For obtaining the fluid velocity fields, an almost neutrally buoyant tracer was used. The discrimination between tracer and fibers was done based on image size, which was also the basis for determining fiber orientation with respect to the inertial coordinate system. The synchronous measurement of fluid velocity and fiber properties also allows the collection of statistics of fiber orientation, velocity fields of tracer and fibers, the angular velocity of the fibers and the orientation between the fiber and the instantaneous relative velocity. Consequently, an experimental study of the behavior of elongated non-spherical particles in wall-bounded turbulent flows was achieved. A comprehensive analysis was developed, especially near the wall region, where hydrodynamic wall interaction effects (e.g., collision or lubrication) and abrupt changes of particle rotational velocity exist. This subsequently allowed the behavior of non-spherical particles to be predicted numerically within the frame of the Euler/Lagrange approach, where the particles are treated as 'point particles'.
Keywords: crossflow, non-spherical particles, particle tracking velocimetry, PIV
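The size-based discrimination between tracers and fibers, and the estimation of fiber orientation from the images, can be sketched with standard image-processing primitives. Everything below (the threshold flags, the area split, the synthetic frame) is an assumption for illustration; it shows the idea of segmenting a backlit frame, splitting objects by projected area, and reading fiber orientation from a fitted ellipse.

```python
import cv2
import numpy as np

def classify_particles(frame: np.ndarray, area_split: float = 50.0):
    """Split segmented objects into tracer dots and elongated fibers by area,
    and estimate each fiber's in-plane orientation from a fitted ellipse.
    `frame` is a grayscale uint8 image with dark particles on a bright
    backlit background (hence the inverted Otsu threshold)."""
    _, bw = cv2.threshold(frame, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    tracers, fibers = [], []
    for c in contours:
        if cv2.contourArea(c) < area_split:
            tracers.append(c)
        elif len(c) >= 5:                       # fitEllipse needs >= 5 points
            (cx, cy), (w, h), angle = cv2.fitEllipse(c)
            fibers.append({"centroid": (cx, cy), "angle_deg": angle,
                           "aspect": max(w, h) / max(min(w, h), 1e-6)})
    return tracers, fibers

frame = (np.random.rand(128, 128) * 255).astype(np.uint8)  # stand-in frame
tracers, fibers = classify_particles(frame)
print(len(tracers), "tracer candidates,", len(fibers), "fiber candidates")
```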
Procedia PDF Downloads 86

5621 Implementing 3D Printing for 3D Digital Modeling in the Classroom
Authors: Saritdikhun Somasa
Abstract:
3D printing fabrication has empowered many artists in many fields. Artists who work in stop motion, 3D modeling, toy design, product design, sculpture, and fine arts become one-stop-shop operations, where they can design, prototype, and distribute their designs for commercial or fine art purposes. The author has developed a digital sculpting course that combines digital software, peripheral hardware, and 3D printing with traditional sculpting concepts and techniques to address the complexities of this multifaceted process, allowing the students to produce complex 3D-printed work. The author will detail the preparation and planning of pre- to post-process 3D printing elements, including software, materials, space, equipment, tools, and schedule considerations for small- to medium-sized figurine design statues in a semester-long class. In addition, the author provides insight into the challenges of teaching in a non-studio space that requires students to work intensively on post-printed models to assemble parts and to finish and refine the 3D-printed surface. Even though this paper focuses on the 3D printing processes and techniques for small- to medium-sized design statue projects for the Digital Media program, the author hopes the paper will benefit other fields of study such as craft practices, product design, and fine-arts programs. Other schools that might implement 3D printing and fabrication in their programs will find helpful information in this paper, such as a teaching plan, choices of equipment and materials, adaptation to non-studio spaces, and putting together a complete and well-resolved project for students.
Keywords: 3D digital modeling, 3D digital sculpting, 3D modeling, 3D printing, 3D digital fabrication
Procedia PDF Downloads 103

5620 The Relationship between Renewable Energy, Real Income, Tourism and Air Pollution
Authors: Eyup Dogan
Abstract:
One criticism of the energy-growth-environment literature, to the best of our knowledge, is that only a few studies analyze the influence of tourism on CO₂ emissions even though the tourism sector is closely related to the environment. The other criticism concerns the selection of methodology. Panel estimation techniques that fail to consider both heterogeneity and cross-sectional dependence across countries can cause forecasting errors. To fill the mentioned gaps in the literature, this study analyzes the impacts of real GDP, renewable energy and tourism on the level of carbon dioxide (CO₂) emissions for the top 10 most-visited countries around the world. This study focuses on the top 10 touristic (most-visited) countries because they have received about half of the worldwide tourist arrivals in recent years and are among the top countries in the 'Renewables Energy Country Attractiveness Index (RECAI)'. By looking at Pesaran’s CD test and the average growth rates of variables for each country, we detect the presence of cross-sectional dependence and heterogeneity. Hence, this study uses second-generation econometric techniques (the cross-sectionally augmented Dickey-Fuller (CADF) and cross-sectionally augmented IPS (CIPS) unit root tests, the LM bootstrap cointegration test, and the DOLS and FMOLS estimators), which are robust to the mentioned issues; the reported results are therefore accurate and reliable. It is found that renewable energy mitigates pollution, whereas real GDP and tourism contribute to carbon emissions. Thus, regulatory policies are necessary to increase awareness of sustainable tourism. In addition, the use of renewable energy and the adoption of clean technologies in the tourism sector, as well as in producing goods and services, play significant roles in reducing the level of emissions.
Keywords: air pollution, tourism, renewable energy, income, panel data
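Pesaran's CD statistic mentioned above has a closed form that is easy to compute from an N x T panel of regression residuals: CD = sqrt(2T / (N(N-1))) times the sum of the pairwise cross-unit correlations, which is asymptotically standard normal under cross-sectional independence. A minimal sketch with an assumed synthetic panel (10 countries, 40 periods, a common shock inducing dependence):

```python
import numpy as np

def pesaran_cd(residuals: np.ndarray) -> float:
    """Pesaran (2004) CD statistic for an N x T panel of residuals.
    Under the null of cross-sectional independence, CD ~ N(0, 1)."""
    N, T = residuals.shape
    corr = np.corrcoef(residuals)            # N x N pairwise correlations
    upper = corr[np.triu_indices(N, k=1)]    # rho_ij for i < j
    return float(np.sqrt(2.0 * T / (N * (N - 1))) * upper.sum())

rng = np.random.default_rng(3)
common_shock = rng.normal(size=40)                      # shared global factor
panel = rng.normal(size=(10, 40)) + 0.8 * common_shock  # dependent units
print("CD =", round(pesaran_cd(panel), 2))              # far from 0 => reject
```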
Procedia PDF Downloads 184

5619 Smart Safari: Safari Guidance Mobile Application
Authors: D. P. Lawrence, T. M. M. D. Ariyarathna, W. N. K. De Silva, M. D. S. C. De Silva, Lasantha Abeysiri, Pradeep Abeygunawardhna
Abstract:
Safari traveling is one of the most popular hobbies all over the world. In Sri Lanka, 'Yala' is the second-largest national park and a prime destination for safaris. Many local and foreign travelers come to go on safari in 'Yala'. However, 'Yala' does not have a mobile application built to provide travelers with the important features they want from the safari experience. To overcome these difficulties, the proposed mobile application adds those identified features to make the work of travelers, guides, and administration easier. The proposed safari traveling guidance mobile application is called 'SMART SAFARI' for the 'Yala' National Park in Sri Lanka. The application provides four facilities for travelers as well as guides. As the first facility, the guide and traveler can view the created map of the park, and the guide can add temporary locations of animals and special locations on the map. This is a Geographic Information System (GIS) used to capture, analyze, and display geographical data. The second facility generates optimal paths through the park according to the travelers' requirements by using machine learning techniques. In the third facility, the traveler can get information about animals using an animal identification system by capturing an image of the animal. In the fourth facility, the traveler can add reviews and a rating, and view those comments under categorized sections within a pre-defined score range. With those facilities, this user-friendly mobile application provides the user with a better safari traveling experience, and it will likely help to develop the tourism culture in Sri Lanka.
Keywords: animal identification system, geographic information system, machine learning techniques, pre-defined score range
Procedia PDF Downloads 133

5618 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling
Authors: Vibha Devi, Shabina Khanam
Abstract:
Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 linolenic essential fatty acids in the ratio of 3:1, which is a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell development and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various conditions of parameters: temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm and amount of co-solvent (0-10) % of solvent flow rate, through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques, which create a large number of datasets through resampling from the original dataset and analyze these data to check the validity of the obtained results. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement; here, the resample size is therefore 31 (eliminating one observation), repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample; here, the resample size is 32, repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, coefficient of variation and standard error of the mean. For the ω-6 linoleic acid concentration, the mean value was approx. 58.5 for both resampling methods, which is the average (central value) of the sample means of all data points. Similarly, for the ω-3 linolenic acid concentration, the mean was observed as 22.5 through both resampling methods. Variance exhibits the spread of the data from the mean; a greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approx. 1%), low standard error of the mean (< 0.8) and low coefficient of variation (< 0.2) reflect the accuracy of the sample for prediction. All estimator values of the coefficient of variation, standard deviation and standard error of the mean are found within the 95% confidence interval.
Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation
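Both resampling schemes described above are a few lines of NumPy. The sketch below applies them to a synthetic stand-in for the 32 CCD runs (the sample itself is random; only the sizes, 31-of-32 leave-one-out and 100 bootstrap replicates, follow the abstract). Note the (n-1)/n inflation factor in the jackknife standard error, which compensates for the low variability of leave-one-out replicates.

```python
import numpy as np

rng = np.random.default_rng(4)
sample = rng.normal(loc=58.5, scale=1.0, size=32)   # stand-in for 32 CCD runs
n = len(sample)

# Jackknife: leave one observation out -> n replicates of size n - 1
jack = np.array([np.delete(sample, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

# Bootstrap: resample with replacement -> 100 replicates of size n
boot = np.array([rng.choice(sample, size=n, replace=True).mean()
                 for _ in range(100)])
se_boot = boot.std(ddof=1)

print(f"sample mean               = {sample.mean():.2f}")
print(f"jackknife SE of the mean  = {se_jack:.3f}")
print(f"bootstrap SE of the mean  = {se_boot:.3f}")
print(f"coefficient of variation  = {sample.std(ddof=1) / sample.mean():.4f}")
```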
Procedia PDF Downloads 141

5617 Strengthening of Reinforced Concrete Columns Using Advanced Composite Materials to Resist Earthquakes
Authors: Mohamed Osama Hassaan
Abstract:
Recent earthquakes have demonstrated the vulnerability of older reinforced concrete buildings to failure under imposed seismic loads. Accordingly, the need to strengthen existing reinforced concrete structures, mainly columns, to resist high seismic loads has increased. Conventional strengthening techniques, such as using steel plates, steel angles and concrete overlays, are used to achieve the required increase in strength or ductility; however, techniques using advanced composite materials have also become established. The column's splice zone is the most critical zone to fail under seismic loads. Three types of splice zone failure can be observed under seismic action, namely failure of the flexural plastic hinge region, shear failure, and failure due to a short lap splice. A lapped splice transfers the force from one bar to another through the concrete surrounding both bars. At any point along the splice, force is transferred from one bar by bond to the surrounding concrete and also by bond to the other bar of the pair forming the splice. The integrity of the lap splice depends on the development of an adequate bond length. R.C. columns built in seismic regions are expected to undergo a large number of inelastic deformation cycles while maintaining the overall strength and stability of the structure. This can be ensured by proper confinement of the concrete core. The last type of failure is the focus of this research. There are insufficient studies that address the problem of strengthening existing reinforced concrete columns at the splice zone through confinement with advanced composite materials. Accordingly, more investigation is needed regarding the seismic behavior of reinforced concrete columns strengthened with the new generation of composite materials, such as carbon, glass and aramid fiber polymers.
Keywords: strengthening, columns, advanced composite materials, earthquakes
Procedia PDF Downloads 78

5616 A Study of Binding Methods and Techniques in Safavid Era Emphasizing on Iran Shahnamehs (16-18th Century AD/10-12th Century AH)
Authors: Ashrafosadat Mousavi Laer, Elaheh Moravej
Abstract:
The art of binding was simple and elementary at the beginning of Islam. This art thrived gradually and continued its development as an independent art. Identification of the binding techniques and the materials used in covers, together with investigation of the arrays, gives us indexes for better identification of the different doctrines and methods of that time. The catalogers of manuscripts usually pay attention to four items: gender, color, artistic elegance, injury, and exquisiteness of the cover. The criterion for the classification of covers is their artistic nature and gender. The 15th century AD (9th century AH) was the period of the development of the binding art, in which the most beautiful covers were produced by the so-called method of 'burning'. In the 16th century AD (10th century AH), in the Safavid era, the art changed completely and a fundamental evolution occurred in the technique and method of binding. The greatest change in this art was the extensive use of stamps, made mostly of steel and copper. These stamps were pressed against leather, and the resulting covers were called 'beat' covers. In this paper, the writing and bookbinding of about 32 Shahnamehs of the Safavid era available in Iranian libraries and museums are studied. An analytical-statistical study shows that four methods have been used, including beat, burning, mosaic, and oily. 69 percent of the covers of these copies are cardboard with a leather coating (goatskin) and have been produced by the burning and beat methods. The reasons are that these two methods were common in the Safavid era and performing them was only feasible on leather; the most desirable and commonly used leather of that time was goatskin, which was the best option for the durability of the cover legend and for preserving the book, being more durable because it is made of goat skin. In addition, it provided a suitable opportunity for the binding artist's creativity and innovation.
Keywords: Shahnameh, Safavid era, bookbinding, beat cover, burning cover
Procedia PDF Downloads 238

5615 Fuzzy Climate Control System for Hydroponic Green Forage Production
Authors: Germán Díaz Flórez, Carlos Alberto Olvera Olvera, Domingo José Gómez Meléndez, Francisco Eneldo López Monteagudo
Abstract:
In recent decades, population growth has exerted great pressure on natural resources. Two of the scarcest and most difficult to obtain resources, arable land and water, are closely interrelated in satisfying the demand for food production. In Mexico, the agricultural sector accounts for more than 70% of water consumption. Therefore, maximizing the efficiency of current production systems is inescapable. It is essential to utilize techniques and tools that enable significant savings of water, labor and fertilizer. In this study, we present a production module for hydroponic green forage (HGF), which is a viable alternative for the production of livestock feed in semi-arid and arid zones. In addition to the forage production module, the equipment has a climate and irrigation control system operated with photovoltaics. The climate control, irrigation and power management are based on fuzzy control techniques. Fuzzy control provides an accurate method for the design of controllers for nonlinear dynamic physical phenomena such as temperature and humidity, as well as others such as lighting level, aeration and irrigation, using heuristic information. This work first describes the production of hydroponic green forage, suitable weather conditions and fertigation, and subsequently presents the design of the production module and the design of the controller. A simulation of the behavior of the production module and the end results of the actual operation of the equipment are presented, demonstrating the easy design, flexibility, robustness and low cost that this equipment represents for the primary sector.
Keywords: fuzzy, climate control system, hydroponic green forage, forage production module
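The fuzzy control loop referred to above follows the usual fuzzify / evaluate rules / defuzzify pattern. A minimal zero-order Sugeno sketch in plain Python is shown below; the membership breakpoints, the four-rule base, and the output fan-speed singletons are all assumptions for illustration, not the system's actual rule base.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fan_speed(temp_c: float, humidity_pct: float) -> float:
    """Zero-order Sugeno controller: rule strengths weight crisp fan settings."""
    hot   = tri(temp_c, 22, 30, 38)
    cool  = tri(temp_c, 10, 18, 26)
    dry   = tri(humidity_pct, 0, 20, 50)
    humid = tri(humidity_pct, 40, 80, 100)

    # Assumed rule base: (firing strength, output singleton in % fan speed)
    rules = [
        (min(hot, humid), 100.0),   # hot  AND humid -> full ventilation
        (min(hot, dry),    70.0),   # hot  AND dry   -> high
        (min(cool, humid), 40.0),   # cool AND humid -> medium
        (min(cool, dry),   10.0),   # cool AND dry   -> low
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0   # weighted-average defuzzification

print(fan_speed(29.0, 75.0))   # hot and humid greenhouse -> high fan speed
```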
Procedia PDF Downloads 397

5614 Enhancing Project Performance Forecasting using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
Accurate forecasting of project performance metrics is crucial for successfully managing and delivering urban road reconstruction projects. Traditional methods often rely on static baseline plans and fail to consider the dynamic nature of project progress and external factors. This research proposes a machine learning-based approach to forecast project performance metrics, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category in an urban road reconstruction project. The proposed model utilizes time series forecasting techniques, including Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance based on historical data and project progress. The model also incorporates external factors, such as weather patterns and resource availability, as features to enhance the accuracy of forecasts. By applying the predictive power of machine learning, the performance forecasting model enables proactive identification of potential deviations from the baseline plan, which allows project managers to take timely corrective actions. The research aims to validate the effectiveness of the proposed approach using a case study of an urban road reconstruction project, comparing the model's forecasts with actual project performance data. The findings of this research contribute to the advancement of project management practices in the construction industry, offering a data-driven solution for improving project performance monitoring and control.
Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, earned value management
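For the ARIMA half of the approach described above, a minimal sketch looks as follows. The monthly cost-variance series, the (1, 1, 1) order, and the six-period horizon are assumptions for illustration; external factors such as weather or resource availability would enter the same model through the exog argument of statsmodels' ARIMA.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly cost-variance series for one WBS category (illustrative)
rng = np.random.default_rng(5)
cv = pd.Series(np.cumsum(rng.normal(0.5, 2.0, 36)),
               index=pd.date_range("2021-01-31", periods=36, freq="M"))

model = ARIMA(cv, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=6)   # next six reporting periods
print(forecast.round(2))             # flag periods that breach a CV threshold
```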
Procedia PDF Downloads 49

5613 Development of Medical Intelligent Process Model Using Ontology Based Technique
Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu
Abstract:
An urgent demand for creative solutions has been created by the rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making. As a solution to this problem, the creation of a Medical Intelligent Process Model (MIPM) utilizing ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of an MIPM using ontology-based techniques is motivated by the lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, which ontology-based techniques can provide. The aim of this work is to develop a structured and knowledge-driven framework that leverages an ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. For the effective implementation of this work, we used the following materials, methods and tools: the medical dataset for testing our model was obtained from Kaggle, and the ontology-based technique was used with a confusion matrix, MySQL, Python, Hypertext Markup Language (HTML), Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to test results on the new system using the confusion matrix, both the accuracy and the overall effectiveness of the medical intelligent process improved significantly, by 20%, compared to the previous system. Therefore, using the model is recommended for healthcare professionals.
Keywords: ontology-based, model, database, OOADM, healthcare
Procedia PDF Downloads 78

5612 A Comparison and Discussion of Modern Anaesthetic Techniques in Elective Lower Limb Arthroplasties
Authors: P. T. Collett, M. Kershaw
Abstract:
Introduction: The discussion regarding which method of anesthesia provides better results for lower limb arthroplasty is a continuing debate. Multiple meta-analyses have been performed with no clear consensus. The current recommendation is to use neuraxial anesthesia for lower limb arthroplasty; however, the evidence to support this decision is weak. The Enhanced Recovery After Surgery (ERAS) society has recommended that either technique can be used as part of a multimodal anesthetic regimen. A local study was performed to see whether current anesthetic practice correlates with the current recommendations and to evaluate the efficacy of the different techniques utilized. Method: 90 patients who underwent total hip or total knee replacements at Nevill Hall Hospital between February 2019 and July 2019 were reviewed. Data collected included the anesthetic technique, day one opiate use, pain score, and length of stay. The data were collected from anesthetic charts and the pain team follow-up forms. Analysis: The average age of patients undergoing lower limb arthroplasty was 70. Of those, 83% (n=75) received a spinal anesthetic and 17% (n=15) received a general anesthetic. For patients undergoing knee replacement under general anesthetic, the average day one pain score was 2.29, and 1.94 if a spinal anesthetic was performed. For hip replacements, the scores were 1.87 and 1.8, respectively. There was no statistical significance between these scores. Day 1 opiate usage was significantly higher in knee replacement patients who were given a general anesthetic (45.7 mg IV morphine equivalent) vs. those who were operated on under spinal anesthetic (19.7 mg). This difference was not noticeable in hip replacement patients. There was no significant difference in length of stay between the two anesthetic techniques. Discussion: There was no significant difference in the day one pain scores between patients who received a general or a spinal anesthetic for either knee or hip replacements. The higher pain scores in the knee replacement group overall are consistent with this being a more painful procedure. This is a small patient population, which means any difference between the two groups is unlikely to be representative of a larger population. The pain scale has 4 points, which makes it difficult to identify a significant difference between pain scores. Conclusion: There is currently little standardization between the different anesthetic approaches utilized in Nevill Hall Hospital. This is likely due to the lack of adherence to a standardized anesthetic regimen. The ERAS society recommends a standard anesthetic protocol as a core component, and the results of this study together with the ERAS guidance will support the implementation of a new health-board-wide ERAS protocol.
Keywords: anaesthesia, orthopaedics, intensive care, patient centered decision making, treatment escalation
Procedia PDF Downloads 127

5611 Using Visualization Techniques to Support Common Clinical Tasks in Clinical Documentation
Authors: Jonah Kenei, Elisha Opiyo
Abstract:
Electronic health records, as a repository of patient information, are nowadays the most commonly used technology to record, store and review patient clinical records and perform other clinical tasks. However, the accurate identification and retrieval of relevant information from clinical records is a difficult task due to the unstructured nature of clinical documents, characterized in particular by a lack of clear structure. Medical practice is therefore facing a challenge due to the rapid growth of health information in electronic health records (EHRs), mostly in narrative text form. As a result, it is becoming important to effectively manage the growing amount of data for a single patient. Consequently, there is currently a need to visualize electronic health records (EHRs) in a way that aids physicians in clinical tasks and medical decision-making. Applying text visualization techniques to unstructured clinical narrative texts is a new area of research that aims to provide better information extraction and retrieval to support clinical decision support in scenarios where the data generated continue to grow. Clinical datasets in electronic health records (EHRs) offer a lot of potential for training accurate statistical models to classify facets of information, which can then be used to improve patient care and outcomes. However, in many clinical note datasets, the unstructured nature of the clinical texts is a common problem. This paper examines the very issue of taking raw clinical texts and mapping them into meaningful structures that can support healthcare professionals utilizing narrative texts. Our work is the result of a collaborative design process that was aided by empirical data collected through formal usability testing.
Keywords: classification, electronic health records, narrative texts, visualization
Procedia PDF Downloads 118

5610 Assessment of Work-Related Stress and Its Predictors in Ethiopian Federal Bureau of Investigation in Addis Ababa
Authors: Zelalem Markos Borko
Abstract:
Work-related stress is a reaction that occurs when work pressures become excessive. Therefore, unless properly managed, stress leads to high employee turnover, decreased performance, illness and absenteeism. Yet, little has been addressed regarding work-related stress and its predictors in the study area. Therefore, the objective of this study was to assess stress prevalence and its predictors in the study area. To that effect, a cross-sectional study was conducted on 281 employees from the Ethiopian Federal Bureau of Investigation selected by stratified random sampling techniques. Survey questionnaire scales were employed to collect data. Data were analyzed using percentages, Pearson correlation coefficients, simple linear regression, multiple linear regression, independent t-tests and one-way ANOVA statistical techniques. In the present study, 13.9% of participants faced high stress, whereas 13.5% of participants faced low stress, and the remaining 72.6% of officers experienced moderate stress. There was no significant group difference among workers due to age, gender, marital status, educational level, years of service or police rank. This study concludes that factors such as role conflict, performance over-utilization, role ambiguity, and qualitative and quantitative role overload together predict 39.6% of work-related stress. This indicates that 60.4% of the variation in stress is explained by other factors, so additional research should be done to identify further predictors of stress. To prevent occupational stress among the police, the Ethiopian Federal Bureau of Investigation should develop stress reduction and management strategies based on these factors.
Keywords: work-related stress, Ethiopian federal bureau of investigation, predictors, Addis Ababa
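The multiple-regression step that yields the 39.6% figure above corresponds to the R-squared of an ordinary least squares fit of stress on the five predictors. A minimal sketch on synthetic stand-in data (predictor names and n = 281 mirror the abstract; the values and coefficients are random):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the survey of 281 officers
rng = np.random.default_rng(6)
df = pd.DataFrame(rng.normal(size=(281, 5)),
                  columns=["role_conflict", "overutilization", "role_ambiguity",
                           "qual_overload", "quant_overload"])
df["stress"] = (0.4 * df["role_conflict"] + 0.3 * df["overutilization"]
                + 0.2 * df["role_ambiguity"] + rng.normal(scale=1.0, size=281))

X = sm.add_constant(df.drop(columns="stress"))
fit = sm.OLS(df["stress"], X).fit()
print("R^2 =", round(fit.rsquared, 3))   # share of variance explained (cf. 39.6%)
print(fit.params.round(3))               # per-predictor effect estimates
```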
Procedia PDF Downloads 705609 Characterization and Monitoring of the Yarn Faults Using Diametric Fault System
Authors: S. M. Ishtiaque, V. K. Yadav, S. D. Joshi, J. K. Chatterjee
Abstract:
The DIAMETRIC FAULTS system has been developed to capture bi-directional images of yarn continuously and sequentially and to provide a detailed classification of faults. A novel mathematical framework built on the acquired bi-directional images forms the basis of fault classification into four broad categories, namely Thick1, Thick2, Thin and Normal Yarn. A discretised version of the Radon transform is used to convert the bi-directional images into one-dimensional signals. The images were divided into training and test sets. A Karhunen–Loève Transformation (KLT) basis is computed from the signals of the training images for each fault class, retaining the six highest-energy eigenvectors. The fault class of a test image is identified by computing the Euclidean distance between its signal and the signal's projection onto the KLT basis of each fault class in the training set, assigning the class with the smallest distance. An accuracy of about 90% is achieved in detecting the correct fault class with the various techniques. The four broad fault classes were further subclassified into four subgroups based on user-set boundary limits for fault length and fault volume, where the fault cross-sectional area and the fault length define the total fault volume. A distinct distribution of faults in terms of their volume and physical dimensions was found, which can be used for monitoring yarn faults. The configuration-based characterization and classification show that spun yarn faults arising from mass variation exhibit distinct characteristics in terms of their contours, sizes and shapes, apart from their frequency of occurrence.Keywords: Euclidean distance, fault classification, KLT, Radon Transform
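A minimal sketch of the KLT classification step described above: a per-class basis is built from the six highest-energy eigenvectors, and a test signal is assigned to the class whose subspace reconstructs it with the smallest Euclidean residual. The signal length, class statistics and all data below are invented placeholders, not the yarn images from the study.

```python
# KLT (PCA) reconstruction-error classifier over 1-D signals, e.g. Radon
# projections of yarn images. All signals here are random placeholders.
import numpy as np

def klt_basis(signals, k=6):
    """Return (top-k eigenvector basis, class mean) for one fault class."""
    mean = signals.mean(axis=0)
    # SVD of centered data; rows of vt are covariance eigenvectors by energy
    _, _, vt = np.linalg.svd(signals - mean, full_matrices=False)
    return vt[:k].T, mean          # basis: (n_features, k)

def classify(signal, bases):
    """Assign the class whose KLT subspace reconstructs the signal best."""
    dists = {}
    for label, (basis, mean) in bases.items():
        centered = signal - mean
        projection = basis @ (basis.T @ centered)   # onto class subspace
        dists[label] = np.linalg.norm(centered - projection)  # residual
    return min(dists, key=dists.get)

rng = np.random.default_rng(1)
classes = ["Thick1", "Thick2", "Thin", "Normal"]
train = {c: rng.normal(i, 1.0, size=(50, 128)) for i, c in enumerate(classes)}
bases = {c: klt_basis(sigs) for c, sigs in train.items()}
print(classify(rng.normal(2, 1.0, 128), bases))     # expect "Thin" here
```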
Procedia PDF Downloads 2655608 Spatial Assessment of Creek Habitats of Marine Fish Stock in Sindh Province
Authors: Syed Jamil H. Kazmi, Faiza Sarwar
Abstract:
The Indus delta of Sindh Province forms the largest creeks zone of Pakistan. The Sindh coast starts from the mouth of the Hab River and terminates at the Sir Creek area. In this paper, we consider the major creeks from the site of Bin Qasim Port in Karachi to the jetty of Keti Bunder in Thatta District. A general decline in the mangrove forest has been observed within the span of the last 25 years. Unprecedented human interventions have badly damaged the creek habitats, including haphazard urban development, industrial and sewage disposal, illegal cutting of the mangrove forest, and reduced, inconsistent freshwater flow, mainly from the Jhang and Indus rivers. These activities not only harm the creek habitats but have also substantially affected the fish stock. Fishing is the main livelihood of the coastal people, but under the above-mentioned threats it too is under enormous pressure, with unchecked overutilization of the fish resources. This pressure becomes almost unbearable when combined with deleterious fishing methods, an uncontrolled fleet size, increasing trash and by-catch of juveniles, and illegal mesh sizes. Alongside these anthropogenic interventions, the study area lies in the red zone of tropical cyclones and active seismicity, causing floods and sea intrusion, damaging the mangrove forests and devastating the fish stock. In order to sustain the natural resources of the Indus Creeks, this study was initiated with the support of FAO, WWF and NIO; its main purpose was to develop a geospatial dataset for fish stock assessment. The study was spread over a year (2013-14) on a monthly basis and mainly included a detailed fish stock survey, water analysis and several other environmental analyses. The environmental analysis also included habitat classification of the study area, carried out through remote sensing techniques over a 22-year time series (1992-2014). Furthermore, out of the 252 species collected, fifteen species from the estuarine and marine groups were short-listed, and their weight, health and growth at each creek were analyzed in the SPSS system against the GIS data. Habitat suitability analysis was then conducted by deriving surface topographic and aspect variables through different GIS techniques. The output variables were overlaid in a GIS system to measure creek productivity, yielding the following classes: extremely productive, highly productive, productive, moderately productive and less productive. This study demonstrates the use of geospatial tools for evaluating fisheries resources and mapping creek habitat risk zones. It also shows that geospatial technologies are highly beneficial for identifying areas of high environmental risk in the Sindh Creeks. The study clearly found that creeks with high rugosity are more productive than creeks with low rugosity. The study area has immense potential to boost the economy of Pakistan in terms of fish export if geospatial techniques are implemented instead of conventional techniques.Keywords: fish stock, geo-spatial, productivity analysis, risk
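To illustrate the rugosity point, a minimal sketch of one common rugosity-style measure (the ratio of surface area to planimetric area on a gridded elevation surface); the abstract does not state which rugosity formula was used, and the raster values below are made up.

```python
# Rugosity-style roughness per cell on a gridded elevation surface.
# The small array stands in for a bathymetry/DEM raster; values invented.
import numpy as np

def rugosity(dem, cell=30.0):
    """Approximate surface-to-planimetric area ratio per cell."""
    dz_dy, dz_dx = np.gradient(dem, cell)       # elevation slopes
    # a cell tilted by slope (dz_dx, dz_dy) has area sqrt(1 + |grad|^2)
    # relative to its flat footprint
    return np.sqrt(1.0 + dz_dx**2 + dz_dy**2)

dem = np.array([[5., 5., 6.],
                [5., 8., 9.],
                [6., 9., 12.]])
r = rugosity(dem)
print(r.mean())   # > 1; higher mean rugosity marks a rougher creek bed
```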
Procedia PDF Downloads 2455607 Utility of Geospatial Techniques in Delineating Groundwater-Dependent Ecosystems in Arid Environments
Authors: Mangana B. Rampheri, Timothy Dube, Farai Dondofema, Tatenda Dalu
Abstract:
Identifying and delineating groundwater-dependent ecosystems (GDEs) is critical to a sound understanding of their spatial distribution as well as to groundwater allocation. However, this distribution is inadequately understood due to the limited data available for most areas of concern. This study therefore addresses the gap using remotely sensed data, the analytical hierarchy process (AHP) and in-situ data to identify and delineate GDEs in the Khakea-Bray Transboundary Aquifer. Our study developed a GDEs index that integrates seven explanatory variables, namely the Normalized Difference Vegetation Index (NDVI), the Modified Normalized Difference Water Index (MNDWI), land use and land cover (LULC), slope, the Topographic Wetness Index (TWI), flow accumulation and curvature. The GDEs map was delineated using the weighted overlay tool in the ArcGIS environment and spatially classified into two classes, namely GDEs and non-GDEs. The results showed that only 1.34% (721.91 km²) of the area is characterised by GDEs. Finally, groundwater level (GWL) data were used for validation through correlation analysis. Our results indicated that 1) GDEs are concentrated in the northern, central and south-western parts of the study area, and 2) the GDE classes do not overlap with the GWLs recorded at the 22 boreholes in the area. Nevertheless, the results demonstrate that GDEs can be delineated in the study area using remote sensing and GIS techniques along with AHP. The results of this study further contribute to identifying priority areas where appropriate water conservation programs, as well as strategies for sustainable groundwater development, can be implemented.Keywords: analytical hierarchy process (AHP), explanatory variables, groundwater-dependent ecosystems (GDEs), Khakea-Bray transboundary aquifer, Sentinel-2
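A minimal sketch of the weighted-overlay step, assuming illustrative AHP-style weights over the seven named layers and an assumed index threshold to split GDE from non-GDE cells; the study's actual weights and threshold are not given in the abstract, and the raster values below are random placeholders.

```python
# Weighted overlay of seven normalized raster layers into a GDE index,
# then a binary GDE / non-GDE map. Weights and threshold are assumptions.
import numpy as np

rng = np.random.default_rng(2)
shape = (100, 100)
layers = {name: rng.random(shape) for name in
          ["ndvi", "mndwi", "lulc", "slope", "twi", "flow_acc", "curvature"]}

# illustrative pairwise-comparison weights; in an AHP scheme they sum to 1
weights = {"ndvi": 0.30, "mndwi": 0.20, "lulc": 0.15, "slope": 0.10,
           "twi": 0.10, "flow_acc": 0.10, "curvature": 0.05}

def normalize(a):
    """Rescale a layer to [0, 1] so weights are comparable across layers."""
    return (a - a.min()) / (a.max() - a.min())

gde_index = sum(weights[k] * normalize(v) for k, v in layers.items())
gde_map = gde_index > 0.6     # assumed threshold separating the two classes
print(f"GDE share of the area: {gde_map.mean():.2%}")
```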
Procedia PDF Downloads 1085606 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and, potentially, in the subsequent risk assessment of different types of structures. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. This study therefore comparatively investigates the potential benefits of employing other machine learning techniques in ground motion prediction, namely Artificial Neural Networks, Random Forests, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing these effects as random effects in the proposed models, reducing the aleatory uncertainty. All the algorithms are trained on a selected database of 4,528 ground motions, covering 376 seismic events with magnitudes 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. This database was chosen because of the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for them. The accuracy of the models in predicting intensity measures, their generalization capability to future data, and their usability are discussed in the evaluation. The results indicate that the algorithms satisfy physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest outperforming the other algorithms; the conventional method, however, remains the better tool when limited data are available.Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
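A minimal sketch of one of the compared approaches: a Random Forest regressor predicting log-PGA from magnitude, hypocentral distance and a site proxy. The feature set, the toy attenuation relation and all training values are assumptions for illustration, not the study's 4,528-record database, and the random-effects adjustment described above is omitted.

```python
# Random Forest ground-motion sketch: log10(PGA) from source, path and
# site features. Training data are synthetic stand-ins, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 2000
mag = rng.uniform(3.0, 5.8, n)            # magnitude range in the abstract
r_hyp = rng.uniform(4.0, 500.0, n)        # hypocentral distance (km)
vs30 = rng.uniform(200.0, 800.0, n)       # assumed site proxy (m/s)
# toy attenuation relation: grows with magnitude, decays with distance
log_pga = (1.2 * mag - 1.8 * np.log10(r_hyp) - 0.002 * vs30
           + rng.normal(0, 0.3, n))

X = np.column_stack([mag, r_hyp, vs30])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, log_pga)

scenario = np.array([[5.0, 20.0, 400.0]])  # M5 event at 20 km, firm site
print(f"predicted log10(PGA) = {model.predict(scenario)[0]:.2f}")
```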
Procedia PDF Downloads 122