Search results for: NDE techniques
5680 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant
Authors: Michael Smalenberger
Abstract:
Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate the parameters necessary for the implementation of the artificial intelligence component and the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and the probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using common estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage, while transaction-level data are scant. Once a user's transaction-level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to reach mastery of a knowledge component, the associated implications for learning curves, and the relevance to instruction time.
Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation
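The BKT machinery the abstract refers to can be sketched in a few lines. The update equations below are the standard ones from the BKT literature; the parameter values and the response sequence are purely illustrative and are not taken from the study.

```python
def bkt_update(p_mastery, correct, p_slip, p_guess, p_transit):
    """One Bayesian Knowledge Tracing step: Bayesian posterior given the
    observed response, followed by the learning-transition update."""
    if correct:
        # P(L | correct): mastered and did not slip, vs. unmastered lucky guess
        evidence = p_mastery * (1 - p_slip) + (1 - p_mastery) * p_guess
        posterior = p_mastery * (1 - p_slip) / evidence
    else:
        # P(L | incorrect): mastered but slipped, vs. unmastered and wrong
        evidence = p_mastery * p_slip + (1 - p_mastery) * (1 - p_guess)
        posterior = p_mastery * p_slip / evidence
    # chance of learning the skill between practice opportunities
    return posterior + (1 - posterior) * p_transit

p = 0.5                                 # prior P(L0), e.g. from a self-assessment
for obs in [True, True, False, True]:   # hypothetical response sequence
    p = bkt_update(p, obs, p_slip=0.1, p_guess=0.2, p_transit=0.1)
print(round(p, 3))
```

With scant observations the trajectory of `p` is dominated by the prior, which is why a direct self-assessment of `P(L0)` is most useful early on.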
Procedia PDF Downloads 174
5679 Synchrotron Based Techniques for the Characterization of Chemical Vapour Deposition Overgrowth Diamond Layers on High Pressure, High Temperature Substrates
Authors: T. N. Tran Thi, J. Morse, C. Detlefs, P. K. Cook, C. Yıldırım, A. C. Jakobsen, T. Zhou, J. Hartwig, V. Zurbig, D. Caliste, B. Fernandez, D. Eon, O. Loto, M. L. Hicks, A. Pakpour-Tabrizi, J. Baruchel
Abstract:
The ability to grow boron-doped diamond epilayers of high crystalline quality is a prerequisite for the fabrication of diamond power electronic devices, in particular high voltage diodes and metal-oxide-semiconductor (MOS) transistors. Boron-doped and intrinsic diamond layers are homoepitaxially overgrown by microwave-assisted chemical vapour deposition (MWCVD) on single crystal high pressure, high temperature (HPHT) grown bulk diamond substrates. Various epilayer thicknesses were grown, with dopant concentrations ranging from 10²¹ atoms/cm³ at nanometre thickness in the case of 'delta doping', up to 10¹⁶ atoms/cm³ at 50 µm thickness for high electric field drift regions. The crystalline quality of these overgrown layers as regards defects, strain and distortion is critical for device performance through its relation to the final electrical properties (Hall mobility, breakdown voltage, etc.). In addition to the optimization of the epilayer growth conditions in the MWCVD reactor, other important questions related to the crystalline quality of the overgrown layer(s) are: 1) what is the dependence on the bulk quality and surface preparation methods of the HPHT diamond substrate? 2) how do defects already present in the substrate crystal propagate into the overgrown layer? 3) what types of new defects are created during overgrowth, what are their growth mechanisms, and how can these defects be avoided? 4) how can we relate, in a quantitative manner, parameters related to the measured crystalline quality of the boron-doped layer to the electronic properties of final processed devices? We describe synchrotron-based techniques developed to address these questions. These techniques allow the visualization of local defects and crystal distortion, complementing the data obtained by other well-established analysis methods such as AFM, SIMS and Hall conductivity measurements.
We have used Grazing Incidence X-ray Diffraction (GIXRD) at the ID01 beamline of the ESRF to study lattice parameters and damage (strain, tilt and mosaic spread) both in diamond substrate near-surface layers and in thick (10–50 µm) overgrown boron-doped diamond epilayers. Micro- and nano-section topography has been carried out at the BM05 and ID06 beamlines of the ESRF using rocking curve imaging techniques to study defects which have propagated from the substrate into the overgrown layer(s) and their influence on final electronic device performance. These studies were performed using various commercially sourced HPHT-grown diamond substrates, with the MWCVD overgrowth carried out at the Fraunhofer IAF, Germany. The synchrotron results are in good agreement with low-temperature (5 K) cathodoluminescence spectroscopy carried out on the grown samples using an Inspect F50 FESEM fitted with an IHR spectrometer.
Keywords: synchrotron X-ray diffraction, crystalline quality, defects, diamond overgrowth, rocking curve imaging
Procedia PDF Downloads 262
5678 Aerodynamic Analysis by Computational Fluids Dynamics in Building: Case Study
Authors: Javier Navarro Garcia, Narciso Vazquez Carretero
Abstract:
Eurocode 1, part 1-4, wind actions, includes in its article 1.5 the possibility of using numerical calculation methods to obtain information on the loads acting on a building. On the other hand, analysis using computational fluid dynamics (CFD) is already in widespread use in aerospace, aeronautical, and industrial applications. The application of techniques based on CFD analysis to buildings, to study their aerodynamic behavior, now opens a whole alternative field of possibilities for civil engineering and architecture: optimization of the results with respect to those obtained by applying the regulations, the possibility of obtaining information on pressures and velocities at any point of the model at any instant, the analysis of turbulence, and the possibility of modeling any geometry or configuration. The present work compares the results obtained on a building, with respect to its aerodynamic behavior, from a mathematical model based on CFD analysis with the results obtained by applying Eurocode 1, part 1-4, wind actions. It is verified that the results obtained by CFD techniques constitute an optimization of the wind action acting on the building with respect to the wind action obtained by applying Eurocode 1, part 1-4, wind actions. In order to carry out this verification, a 45 m high truncated pyramid building with a square base has been taken as a case study. The mathematical CFD model, based on finite volumes, has been calculated using the commercial application FLUENT with a scale-resolving simulation (SRS) large eddy simulation (LES) turbulence model for an atmospheric boundary layer wind with a turbulent component in the direction of the flow.
Keywords: aerodynamic, CFD, computational fluid dynamics, computational mechanics
Procedia PDF Downloads 138
5677 Implementation of the Quality Management System and Development of Organizational Learning: Case of Three Small and Medium-Sized Enterprises in Morocco
Authors: Abdelghani Boudiaf
Abstract:
The profusion of studies relating to the concept of organizational learning shows the importance that has been given to this concept in the management sciences. A few years ago, companies leaned towards ISO 9001 certification; this requires the implementation of a quality management system (QMS). For this objective to be achieved, companies must have a set of skills, which pushes them to develop learning through continuous training. The results of empirical research have shown that implementation of a QMS in a company promotes the development of learning, and several types of learning are developed in this way. Since skills development is normative in the context of the quality approach, companies are obliged to qualify and improve the skills of their human resources. Continuous training is the keystone for developing the necessary learning. To carry out continuous training, companies need to be able to identify their real needs by developing training plans based on well-defined training engineering. The training process obviously goes through several stages. Initially, training has a general aspect, that is to say, it focuses on topics and actions of a general nature. Subsequently, it becomes more targeted and precise, to accompany the evolution of the QMS and to support the changes decided along the way (changes of working methods, practices, objectives, mentality, etc.). To address our research question, we opted for a qualitative research method. It should be noted that the case study method crosses several data collection techniques to explain and understand a phenomenon. Three company cases were studied as part of this research work using different data collection techniques related to this method.
Keywords: changing mentalities, continuing training, organizational learning, quality management system, skills development
Procedia PDF Downloads 110
5676 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
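As a rough illustration of the dimensionality-reduction half of the proposed pipeline, the sketch below builds a Gaussian affinity matrix, row-normalizes it into a Markov transition matrix, and embeds the data with its leading non-trivial eigenvectors. The kernel bandwidth `epsilon` and the random test data are assumptions for illustration; a production implementation would also handle complex eigenpairs and sparse kernels.

```python
import numpy as np

def diffusion_map(X, epsilon, n_components=2):
    """Embed rows of X via the leading non-trivial eigenvectors of the
    row-normalized (random-walk) Gaussian kernel matrix."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-sq_dists / epsilon)           # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)      # Markov transition matrix
    eigvals, eigvecs = np.linalg.eig(P)       # P is row-stochastic, not symmetric
    order = np.argsort(-eigvals.real)
    eigvals, eigvecs = eigvals.real[order], eigvecs.real[:, order]
    # eigvecs[:, 0] is (near-)constant with eigenvalue 1, so it is dropped
    return eigvecs[:, 1:n_components + 1] * eigvals[1:n_components + 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                 # 50 points in 10-D
Y = diffusion_map(X, epsilon=20.0)
print(Y.shape)                                # prints (50, 2)
```

A DPM trained in this 2-D diffusion coordinate system would then generate new points that can be lifted back to the ambient space, as the abstract describes.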
Procedia PDF Downloads 111
5675 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. 
The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes
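A minimal sketch of the modeling workflow described above, using synthetic data in place of the study's project dataset; the feature set and the assumed cost-impact relationship are invented for illustration only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 600
# synthetic project tasks: productivity rate, scope-change magnitude,
# baseline duration (days), and change timing (fraction of schedule elapsed)
X = np.column_stack([
    rng.uniform(0.5, 1.5, n),
    rng.uniform(0.0, 0.5, n),
    rng.uniform(5, 60, n),
    rng.uniform(0.0, 1.0, n),
])
# assumed ground truth: larger and later changes on long tasks cost more
cost_impact = (30 * X[:, 1] + 0.3 * X[:, 2] / X[:, 0]
               + 20 * X[:, 1] * X[:, 3] + rng.normal(0, 1, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, cost_impact, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(round(mean_squared_error(y_te, pred), 2), round(r2_score(y_te, pred), 2))
```

The same pattern extends to the other models named in the abstract (linear, ridge, random forest, XGBoost), with `model.feature_importances_` providing the feature-importance ranking discussed in the text.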
Procedia PDF Downloads 44
5674 The Colorectal Cancer in Patients of Eastern Algeria
Authors: S. Tebibel, C. Mechati, S. Messaoudi
Abstract:
Algeria is currently experiencing the same rate of cancer progression as that registered in recent years in Western countries. Colorectal cancer, which increasingly constitutes a major public health problem, is the most common form of cancer after breast and cervical cancer in women and prostate cancer in men. Our work is based on a retrospective study of cases of colorectal cancer across eastern Algeria. Our goal is to carry out an epidemiological, histological and immunohistochemical study to investigate different techniques for the diagnosis of colorectal cancer and their interest and specificity in detecting the disease. The study includes 110 patients (aged between 20 and 87 years) with colorectal cancer, for whom inclusion and exclusion criteria were established. In our study, colorectal cancer shows a male predominance, with a sex ratio of 1.99, and the most affected age group is between 50 and 59 years. We noted that the colon cancer rate is higher than the rectal cancer rate, with frequencies of 60.91% and 39.09%, respectively. In the colon cancer series, Lieberkühn adenocarcinoma (ADK) is histologically the most represented type, at 85.07% of all cases; in contrast, the proportion of mucinous ADK (mucoid colloid) is only 1.49%. Well-differentiated ADKs are very frequent in our series, representing 83.58% of cases, while moderately and poorly differentiated adenocarcinomas account for 2.99% and 0.05% of cases, respectively. Among the histological varieties of rectal ADK, Lieberkühn ADK is the most common histological form in our cohort, at 76.74%, while the mucoid colloid form accounts for 13.95%. Research on the mutation of the gene encoding K-ras, a major step in the targeted therapy of colorectal cancers, is underway in our study.
Colorectal cancer is the subject of much promising research: the evaluation of new therapies (antiangiogenic monoclonal antibodies), the search for predictors of sensitivity to chemotherapy, and new prognostic markers using techniques of molecular biology and proteomics.
Keywords: adenocarcinoma, age, colorectal cancer, epidemiology, histological section, sex
Procedia PDF Downloads 345
5673 Dosimetric Comparison among Different Head and Neck Radiotherapy Techniques Using PRESAGE™ Dosimeter
Authors: Jalil ur Rehman, Ramesh C. Tailor, Muhammad Isa Khan, Jahnzeeb Ashraf, Muhammad Afzal, Geofferry S. Ibbott
Abstract:
Purpose: The purpose of this analysis was to investigate the dose distributions of different techniques (3D-CRT, IMRT and VMAT) for head and neck cancer using a 3-dimensional dosimeter, the PRESAGE™ dosimeter. Materials and Methods: Computed tomography (CT) scans of the Radiological Physics Center (RPC) head and neck anthropomorphic phantom, with both the RPC standard insert and the PRESAGE™ insert, were acquired separately with a Philips CT scanner, and both CT scans were exported via DICOM to the Pinnacle version 9.4 treatment planning system (TPS). Each plan was delivered twice to the RPC phantom, first containing the RPC standard insert with TLD and film dosimeters, and then containing the PRESAGE™ insert with the 3-D dosimeter, using a Varian TrueBeam linear accelerator. After irradiation, the standard insert, including point dose measurements (TLD) and planar Gafchromic® EBT film measurements, was read using the RPC standard procedure. The 3D dose distribution from PRESAGE™ was read out with the Duke Midsized Optical Scanner dedicated to the RPC (DMOS-RPC). Dose volume histograms (DVH) and mean and maximum doses for organs at risk were calculated and compared among the head and neck techniques. The prescription dose was the same for all head and neck radiotherapy techniques, 6.60 Gy/fraction. Beam profile comparison and gamma analysis were used to quantify agreement among the film measurement, the PRESAGE™ measurement and the calculated dose distribution. Quality assurance of all plans was performed using the ArcCHECK method. Results: VMAT delivered lower mean and maximum doses to organs at risk (spinal cord, parotid) than IMRT and 3DCRT. This dose distribution was verified by absolute dose measurements using a thermoluminescent dosimeter (TLD) system.
The central axial, sagittal and coronal planes were evaluated using 2D gamma map criteria (±5%/3 mm), and the results were 99.82% (axial), 99.78% (sagittal) and 98.38% (coronal) for the VMAT plan; the agreement between PRESAGE™ and Pinnacle was better than for the IMRT and 3D-CRT plans, excluding a 7 mm rim at the edge of the dosimeter. Profiles showed good agreement for all plans between film, PRESAGE™ and Pinnacle; 3D gamma analysis was performed for the PTV and OARs, and VMAT and 3DCRT gave better agreement than IMRT. Conclusion: VMAT delivered lower mean and maximum doses to organs at risk and better PTV coverage during head and neck radiotherapy. The TLD, EBT film and PRESAGE™ dosimeter results suggest that VMAT was better for the treatment of head and neck cancer than IMRT and 3D-CRT.
Keywords: RPC, 3DCRT, IMRT, VMAT, EBT2 film, TLD, PRESAGE™
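The 2D gamma map evaluation mentioned above can be illustrated with a brute-force 1D analogue. The profiles below are synthetic, and the implementation is a sketch of the globally normalized variant of the gamma index, not the RPC's actual analysis code.

```python
import numpy as np

def gamma_pass_rate(measured, calculated, positions_mm, dose_tol=0.05, dist_mm=3.0):
    """Global 1-D gamma analysis: for each calculated point, search all
    measured points for the minimum combined dose-difference/distance metric,
    then report the percentage of points with gamma <= 1."""
    d_ref = calculated.max()                        # global normalization dose
    gammas = []
    for r, d in zip(positions_mm, calculated):
        dose_term = ((measured - d) / (dose_tol * d_ref)) ** 2
        dist_term = ((positions_mm - r) / dist_mm) ** 2
        gammas.append(np.sqrt(dose_term + dist_term).min())
    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)

x = np.linspace(-50, 50, 201)                       # positions, mm
calc = np.exp(-(x / 25.0) ** 2)                     # idealized dose profile
meas = calc * 1.02                                  # 2% uniform dose offset
print(gamma_pass_rate(meas, calc, x))               # within 5%/3 mm -> 100.0
```

Real 2D/3D gamma analysis applies the same minimum-search over a dose plane or volume; dedicated tools interpolate between grid points rather than comparing raw samples.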
Procedia PDF Downloads 397
5672 The Effect of Corporate Governance on Financial Stability and Solvency Margin for Insurance Companies in Jordan
Authors: Ghadeer A.Al-Jabaree, Husam Aldeen Al-Khadash, M. Nassar
Abstract:
This study aimed at investigating the effect of a well-designed corporate governance system on the financial stability of insurance companies listed on the ASE. Further, this study provides a comprehensive model for evaluating and analyzing insurance companies' financial position and prospects, and for comparing the degree of application of corporate governance provisions among Jordanian insurance companies. In order to achieve the goals of the study, the whole population of 27 listed insurance companies was examined through the variables of board of directors, audit committee, internal and external auditors, board and management ownership, and blockholders' identities. Statistical analyses were performed in SPSS, where descriptive statistics such as means and standard deviations were used to describe the variables, while the F-test and ANOVA were used to test the hypotheses of the study. The study revealed a significant effect of the corporate governance variables (except for local companies not listed on the ASE) on financial stability within the control variables, especially the debt ratio (leverage). It also showed that concentration in motor third-party insurance does not have a significant effect on insurance companies' financial stability during the study period. Moreover, the study concludes that the global financial crisis affected the investment side of insurance companies, with an insignificant effect on the technical side. Finally, some recommendations were presented, such as enhancing the laws and regulations that support the appropriate application of corporate governance, working to activate transparency in financial statement disclosures, and focusing on supporting the companies' technical provisions rather than only on the profit side.
Keywords: corporate governance, financial stability and solvency margin, insurance companies, Jordan
Procedia PDF Downloads 490
5671 Permeability Prediction Based on Hydraulic Flow Unit Identification and Artificial Neural Networks
Authors: Emad A. Mohammed
Abstract:
The concept of hydraulic flow units (HFU) has been used for decades in the petroleum industry to improve the prediction of permeability. This concept is strongly related to the flow zone indicator (FZI), which is a function of the reservoir rock quality index (RQI). Both indices are based on the porosity and permeability of reservoir core samples. It is assumed that core samples with similar FZI values belong to the same HFU. Thus, after dividing the porosity-permeability data based on the HFU, transformations can be done in order to estimate permeability from porosity. The conventional practice is to use the power-law transformation with conventional HFUs, where the percentage of error is considerably high. In this paper, a neural network technique is employed as a soft-computing transformation method to predict permeability instead of the power-law method, in order to avoid this higher percentage of error. The technique is based on HFU identification, where the method of Amaefule et al. (1993) is utilized. In this regard, the Kozeny–Carman (K–C) model and the modified K–C model of Hasan and Hossain (2011) are employed. A comparison is made between the two transformation techniques for the two porosity-permeability models. Results show that the modified K–C model helps in getting better results, with a lower percentage of error in predicting permeability. The results also show that the use of artificial intelligence techniques gives more accurate predictions than the power-law method. This study was conducted on a heterogeneous, complex carbonate reservoir in Oman. Data were collected from seven wells to obtain the permeability correlations for the whole field. The findings of this study will help in getting a better estimation of the permeability of a complex reservoir.
Keywords: permeability, hydraulic flow units, artificial intelligence, correlation
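The HFU workflow referenced above rests on two standard relations from Amaefule et al. (1993). A small sketch, with permeability assumed in mD and porosity as a fraction:

```python
import numpy as np

def flow_zone_indicator(k_md, phi):
    """RQI and FZI from the Amaefule et al. (1993) relations:
    RQI = 0.0314 * sqrt(k / phi)   (k in mD, phi a fraction, RQI in micrometres)
    FZI = RQI / phi_z, with normalized porosity phi_z = phi / (1 - phi)."""
    rqi = 0.0314 * np.sqrt(k_md / phi)
    phi_z = phi / (1.0 - phi)
    return rqi, rqi / phi_z

k = np.array([1.0, 10.0, 100.0])     # permeability, mD (illustrative values)
phi = np.array([0.10, 0.15, 0.20])   # porosity, fraction
rqi, fzi = flow_zone_indicator(k, phi)
# samples with similar FZI are grouped into one hydraulic flow unit,
# e.g. by clustering log(FZI); each HFU then gets its own phi-to-k transform
print(np.round(fzi, 3))
```

In the paper's workflow, a neural network trained per HFU replaces the per-HFU power-law fit as the porosity-to-permeability transform.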
Procedia PDF Downloads 138
5670 3D-printing for Ablation Planning in Patients Undergoing Atrial Fibrillation Ablation: 3D-GALA Trial
Authors: Terentes Printzios Dimitrios, Loanna Gourgouli, Vlachopoulos Charalambos
Abstract:
Aims: Atrial fibrillation (AF) remains one of the major causes of stroke, heart failure, sudden death and cardiovascular morbidity. Ablation techniques are becoming more appealing after the latest results of randomized trials showing their overall clinical benefit. On the other hand, imaging techniques and the frontier application of 3D printing are emerging as a valuable ally for cardiac procedures. However, no randomized trial has directly assessed the impact of preprocedural imaging, and especially 3D printing guidance, on AF ablation. The present study is designed to investigate for the first time the effect of 3D printing of the heart on the safety and effectiveness of the ablation procedure. Methods and design: The 3D-GALA trial is a randomized, open-label, controlled, multicentre clinical trial with 2 parallel groups, designed to enroll a total of 100 patients undergoing cryo-balloon ablation for paroxysmal and persistent AF. Patients will be randomized with an allocation ratio of 1:1 to either a preprocedural MRI scan of the heart with 3D printing of the left atrium and pulmonary veins followed by cryoablation, or standard cryoablation without imaging. Patients will be followed up to 6 months after the index procedure. The primary outcome measure is the reduction of radiation dose and contrast amount during pulmonary vein isolation. Secondary endpoints will include the percentage of atrial fibrillation relapse on 24-hour Holter electrocardiogram monitoring at 6 months after initial treatment. Discussion: To our knowledge, the 3D-GALA trial will be the first study to provide evidence about the clinical impact of preprocedural imaging and 3D printing before cryoablation.
Keywords: atrial fibrillation, cardiac MRI, cryoablation, 3D printing
Procedia PDF Downloads 178
5669 Comparative Analysis of Control Techniques Based Sliding Mode for Transient Stability Assessment for Synchronous Multicellular Converter
Authors: Rihab Hamdi, Amel Hadri Hamida, Fatiha Khelili, Sakina Zerouali, Ouafae Bennis
Abstract:
This paper features a comparative performance study of sliding mode controllers (SMC) for closed-loop voltage control of direct current to direct current (DC-DC) three-cell buck converters connected in parallel, operating in continuous conduction mode (CCM): SMC based on pulse-width modulation (PWM) versus SMC based on hysteresis modulation (HM), where an adaptive feedforward technique is adopted. For the PWM-based SMC, the approach is to incorporate a fixed-frequency PWM scheme which is effectively a variant of SM control. For the HM-based SMC, an adaptive feedforward control that makes the hysteresis band variable in the hysteresis modulator of the SM controller is introduced, with the aim of restricting switching-frequency variation in the case of any change of the line input voltage or output load. The results obtained under load, input and reference changes clearly demonstrate a similar dynamic response for both proposed techniques: fast and smooth tracking of the desired output voltage. The PWM-based SM technique shows slightly better dynamic behavior than the HM-based SM technique, and provides stability in all operating conditions. Simulation studies in the MATLAB/Simulink environment have been performed to verify the concept.
Keywords: DC-DC converter, hysteresis modulation, parallel multi-cells converter, pulse-width modulation, robustness, sliding mode control
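To give a feel for HM-based sliding mode control, here is a deliberately simplified single-cell buck converter simulation with a fixed hysteresis band. All component values, the surface gain, and the band are hypothetical; this is not the paper's three-cell converter nor its adaptive feedforward scheme, which would adjust `BAND` online to hold the switching frequency steady.

```python
# Toy hysteresis-modulation sliding mode control of one DC-DC buck cell,
# integrated with forward Euler. Parameters are illustrative only.
V_IN, V_REF = 24.0, 12.0            # input and reference voltages (V)
L_H, C_F, R_LOAD = 1e-3, 100e-6, 10.0
LAM = 5000.0                        # sliding surface s = LAM*e + de/dt
BAND = 1e3                          # fixed hysteresis band on s
DT, STEPS = 1e-6, 30000             # 30 ms of simulated time

i_l, v_o, u = 0.0, 0.0, 1           # inductor current, output voltage, switch
for _ in range(STEPS):
    e = V_REF - v_o
    de_dt = -(i_l - v_o / R_LOAD) / C_F
    s = LAM * e + de_dt
    if s > BAND:
        u = 1                       # inject energy from the source
    elif s < -BAND:
        u = 0                       # free-wheel through the diode
    i_l += DT * (u * V_IN - v_o) / L_H
    v_o += DT * (i_l - v_o / R_LOAD) / C_F
print(round(v_o, 2))
```

On the sliding surface the voltage error decays with time constant 1/LAM; widening `BAND` lowers the switching frequency at the cost of larger ripple, which is exactly the trade-off the adaptive feedforward scheme manages.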
Procedia PDF Downloads 169
5668 Advances in Genome Editing and Future Prospects for Sorghum Improvement: A Review
Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie Teklu
Abstract:
Recent developments in targeted genome editing have accelerated genetic research and opened new possibilities for improving crops for better yield and quality. Given the significance of cereal crops as a primary source of food for the global population, the utilization of contemporary genome editing techniques like CRISPR/Cas9 is timely and crucial. CRISPR/Cas technology has enabled targeted genomic modifications, revolutionizing genetic research and exploration. The application of gene editing through CRISPR/Cas9 to enhance sorghum is particularly vital given the current ecological, environmental, and agricultural challenges exacerbated by climate change. As sorghum is one of the main staple foods of our region and is known to be a resilient crop with a high potential to overcome the above challenges, the application of genome editing technology will enhance the investigation of gene functionality. CRISPR/Cas9 enables the improvement of desirable sorghum traits, including nutritional value, yield, resistance to pests and diseases, and tolerance to various abiotic stresses. Furthermore, CRISPR/Cas9 has the potential to perform intricate editing, reshape existing elite sorghum varieties, and introduce new genetic variations. However, current research primarily focuses on improving the efficacy of the CRISPR/Cas9 system in successfully editing endogenous sorghum genes, making it a feasible and successful undertaking in sorghum improvement. Recent advancements in CRISPR/Cas9 techniques have further empowered researchers to modify additional genes in sorghum with greater efficiency. Successful application and advancement of CRISPR techniques in sorghum will aid not only in gene discovery and the creation of novel traits that regulate gene expression and functional genomics but also in facilitating site-specific integration events.
The purpose of this review is, therefore, to elucidate current advances in sorghum genome editing and to highlight their potential in addressing food security issues. It also assesses the efficiency of CRISPR-mediated improvement and its long-term effects on crop improvement and host resistance against parasites, including tissue-specific activity and the ability to induce resistance. The review ends by emphasizing the challenges and opportunities of CRISPR technology in combating parasitic plants and by proposing directions for future research to safeguard global agricultural productivity.
Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield
Procedia PDF Downloads 42
5667 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra tree regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectra data are randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy.
Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data are randomly divided into training and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels. Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
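The evaluation metrics named above (correlation coefficient R and determination coefficient R²) can be sketched as follows. This is a minimal illustration on made-up reference/predicted glucose values, not the study's data or its six regression models:

```python
import math

def pearson_r(y_true, y_pred):
    # Pearson correlation coefficient R between reference and predicted values.
    n = len(y_true)
    mt, mp = sum(y_true) / n, sum(y_pred) / n
    cov = sum((a - mt) * (b - mp) for a, b in zip(y_true, y_pred))
    st = math.sqrt(sum((a - mt) ** 2 for a in y_true))
    sp = math.sqrt(sum((b - mp) ** 2 for b in y_pred))
    return cov / (st * sp)

def r_squared(y_true, y_pred):
    # Determination coefficient R² = 1 - SS_res / SS_tot.
    mt = sum(y_true) / len(y_true)
    ss_res = sum((a - b) ** 2 for a, b in zip(y_true, y_pred))
    ss_tot = sum((a - mt) ** 2 for a in y_true)
    return 1 - ss_res / ss_tot

# Hypothetical reference glucose levels (20 mg/dl steps) vs. model output.
y_true = [20, 40, 60, 80, 100]
y_pred = [21, 39, 61, 79, 101]
r = pearson_r(y_true, y_pred)
r2 = r_squared(y_true, y_pred)
```

A model meeting the abstract's reported thresholds would show R > 0.99 and R² > 0.985 on such held-out data, averaged over the ten random splits.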
Procedia PDF Downloads 95
5666 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. With the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting the data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, neural network-based, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and stroke width transform (SWT) approaches. We evaluate each technique, discussing their strengths and constraints. Also, we emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the office of foreign assets control (OFAC) watchlist. We will also discuss how OCR helps to overcome language barriers in AML compliance. We also address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on the data from previous research studies, which illustrate the effectiveness of OCR-based AML. Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
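The watchlist-comparison step described above can be sketched with a simple fuzzy string match, which tolerates the character errors OCR typically introduces. This is a toy sketch using the standard library's `difflib`; the names, threshold, and watchlist are illustrative assumptions, and production screening systems use far more sophisticated entity-resolution logic:

```python
from difflib import SequenceMatcher

def screen_name(extracted, watchlist, threshold=0.75):
    """Return watchlist entries whose similarity to the OCR-extracted
    name meets the threshold, tolerating OCR misreads."""
    hits = []
    for entry in watchlist:
        score = SequenceMatcher(None, extracted.lower(), entry.lower()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 3)))
    return hits

# 'Jonh Smlth' simulates an OCR misread of a watchlisted name.
watchlist = ["John Smith", "Jane Doe"]
matches = screen_name("Jonh Smlth", watchlist)
```

In practice the threshold trades false positives against missed hits, so flagged names would go to a human reviewer rather than being auto-blocked.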
Procedia PDF Downloads 146
5665 The Utilization of Tea Extract within the Realm of the Food Industry
Authors: Raana Babadi Fathipour
Abstract:
Tea, a beverage widely cherished across the globe, has captured the interest of scholars with its recent acknowledgement for possessing noteworthy health advantages. Of particular significance is its proven ability to ward off ailments such as cancer and cardiovascular afflictions. Moreover, within the realm of culinary creations, lipid oxidation poses a significant challenge for food product development. In light of these aforementioned concerns, this present discourse turns its attention towards exploring diverse methodologies employed in extracting polyphenols from various types of tea leaves and examining their utility within the vast landscape of the ever-evolving food industry. Based on the discoveries unearthed in this comprehensive investigation, it has been determined that the fundamental constituents of tea are polyphenols possessed of intrinsic health-enhancing properties. This includes an assortment of catechins, namely epicatechin, epigallocatechin, epicatechin gallate, and epigallocatechin gallate. Moreover, gallic acid, flavonoids, flavonols and theaflavins have also been detected within this aromatic beverage. Of these myriad components rigorously examined in this study's analysis, catechin emerges as particularly beneficial. Multiple techniques have emerged over time to successfully extract key compounds from tea plants, including solvent-based extraction methodologies, microwave-assisted water extraction approaches and ultrasound-assisted extraction techniques. In particular, consideration is given to the microwave-assisted water extraction method as a viable scheme which effectively procures valuable polyphenols from tea extracts. This methodology appears adaptable for implementation within sectors such as dairy production along with meat and oil industries alike. Keywords: camellia sinensis, extraction, food application, shelf life, tea
Procedia PDF Downloads 72
5664 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review
Authors: Faisal Muhibuddin, Ani Dijah Rahajoe
Abstract:
This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and other clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification. Keywords: data mining, classification algorithm, naïve bayes, k-means clustering, k-nearest neighbour, crime, data analysis, systematic literature review
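One of the techniques surveyed above, K-nearest neighbour, can be sketched in a few lines. The incident features and crime labels below are made-up toy data, not drawn from the review:

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label). Classify query by majority
    vote among the k nearest neighbours (Euclidean distance)."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical incidents: (normalized hour of day, normalized report
# volume) mapped to a crime class.
train = [
    ((0.90, 0.10), "burglary"), ((0.80, 0.20), "burglary"),
    ((0.85, 0.15), "burglary"),
    ((0.30, 0.90), "fraud"), ((0.35, 0.80), "fraud"),
    ((0.25, 0.85), "fraud"),
]
label = knn_classify(train, (0.82, 0.12), k=3)
```

The same train/query interface would accept richer feature vectors built from the structured and unstructured sources the review discusses.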
Procedia PDF Downloads 68
5663 Application of Electrochemical Impedance Spectroscopy to Monitor the Steel/Soil Interface During Cathodic Protection of Steel in Simulated Soil Solution
Authors: Mandlenkosi George Robert Mahlobo, Tumelo Seadira, Major Melusi Mabuza, Peter Apata Olubambi
Abstract:
Cathodic protection (CP) has been widely considered a suitable technique for mitigating corrosion of buried metal structures. Plenty of efforts have been made in developing techniques, in particular non-destructive techniques, for monitoring and quantifying the effectiveness of CP to ensure the sustainability and performance of buried steel structures. The aim of this study was to investigate the evolution of the electrochemical processes at the steel/soil interface during the application of CP on steel in simulated soil. Carbon steel was subjected to electrochemical tests with NS4 solution used as simulated soil conditions for 4 days before applying CP for a further 11 days. A previously modified non-destructive voltammetry technique was applied before and after the application of CP to measure the corrosion rate. Electrochemical impedance spectroscopy (EIS), in combination with mathematical modeling through equivalent electric circuits, was applied to determine the electrochemical behavior at the steel/soil interface. The measured corrosion rate was found to have decreased from 410 µm/yr to 8 µm/yr between days 5 and 14 because of the applied CP. Equivalent electrical circuits were successfully constructed and used to adequately model the EIS results. The modeling of the obtained EIS results revealed the formation of corrosion products via a mixed activation-diffusion mechanism during the first 4 days, while the activation mechanism prevailed in the presence of CP, resulting in a protective film. The x-ray diffraction analysis confirmed the presence of corrosion products and the predominant protective film corresponding to the calcareous deposit.Keywords: carbon steel, cathodic protection, NS4 solution, voltammetry, EIS
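The equivalent-circuit modeling step described above can be illustrated with the simplest single-time-constant cell (a simplified Randles circuit: solution resistance in series with a charge-transfer resistance and double-layer capacitance in parallel). The parameter values below are assumed for illustration only and are not the study's fitted values:

```python
import math

def randles_impedance(freq_hz, r_s, r_ct, c_dl):
    """Complex impedance of a simplified Randles cell:
    Z(w) = R_s + R_ct / (1 + j*w*R_ct*C_dl)."""
    omega = 2 * math.pi * freq_hz
    return r_s + r_ct / (1 + 1j * omega * r_ct * c_dl)

# Illustrative values for a steel/soil interface (assumed, not fitted).
R_S, R_CT, C_DL = 20.0, 500.0, 1e-5  # ohm, ohm, farad

z_low = randles_impedance(1e-4, R_S, R_CT, C_DL)   # low frequency
z_high = randles_impedance(1e7, R_S, R_CT, C_DL)   # high frequency
```

At high frequency |Z| collapses to R_s, while at low frequency it approaches R_s + R_ct; fitting such a circuit to the measured spectra is what yields the charge-transfer resistance whose growth under CP signals a protective film.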
Procedia PDF Downloads 65
5662 Review of Downscaling Methods in Climate Change and Their Role in Hydrological Studies
Authors: Nishi Bhuvandas, P. V. Timbadiya, P. L. Patel, P. D. Porey
Abstract:
Recent perceived climate variability raises concerns with unprecedented hydrological phenomena and extremes. Distribution and circulation of the waters of the Earth become increasingly difficult to determine because of additional uncertainty related to anthropogenic emissions. According to the sixth Intergovernmental Panel on Climate Change (IPCC) Technical Paper on Climate Change and Water, changes in the large-scale hydrological cycle have been related to an increase in the observed temperature over several decades. Although much previous research on the effect of climate change on hydrology provides a general picture of possible global hydrological change, new tools and frameworks for modelling hydrological series with nonstationary characteristics at finer scales are required for assessing climate change impacts. Of the downscaling techniques, dynamic downscaling is usually based on the use of Regional Climate Models (RCMs), which generate finer resolution output based on atmospheric physics over a region using General Circulation Model (GCM) fields as boundary conditions. However, RCMs are not expected to capture the observed spatial precipitation extremes at a fine cell scale or at a basin scale. Statistical downscaling derives a statistical or empirical relationship between the variables simulated by the GCMs, called predictors, and station-scale hydrologic variables, called predictands. The main focus of the paper is on the need for using statistical downscaling techniques for projection of local hydrometeorological variables under climate change scenarios. The projections can then serve as an input source to various hydrologic models to obtain streamflow, evapotranspiration, soil moisture and other hydrological variables of interest. Keywords: climate change, downscaling, GCM, RCM
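The predictor-predictand relationship at the heart of statistical downscaling can be sketched, in its simplest linear form, as a transfer function calibrated on the historical period and then applied to a scenario value. The numbers below are synthetic and the single-predictor linear model is a deliberate simplification of the multi-predictor methods the review covers:

```python
def fit_transfer(predictor, predictand):
    """Least-squares linear predictor -> predictand relation, the
    simplest statistical downscaling transfer function."""
    n = len(predictor)
    mx = sum(predictor) / n
    my = sum(predictand) / n
    sxx = sum((x - mx) ** 2 for x in predictor)
    sxy = sum((x - mx) * (y - my) for x, y in zip(predictor, predictand))
    slope = sxy / sxx
    return slope, my - slope * mx

# Calibration: GCM grid-scale predictor (synthetic) vs. observed
# station-scale rainfall (synthetic, exactly 4x + 2 here).
gcm_hist = [2.0, 3.0, 4.0, 5.0, 6.0]
station_hist = [10.0, 14.0, 18.0, 22.0, 26.0]
slope, intercept = fit_transfer(gcm_hist, station_hist)

# Application: downscale a future GCM scenario value to station scale.
station_future = intercept + slope * 7.0
```

The downscaled series produced this way is what would then feed the hydrologic models mentioned above to obtain streamflow and related variables.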
Procedia PDF Downloads 408
5661 Accessibility Assessment of School Facilities Using Geospatial Technologies: A Case Study of District Sheikhupura
Authors: Hira Jabbar
Abstract:
Education is vital for the inclusive growth of an economy and a critical contributor to investment in human capital. Like other developing countries, Pakistan is facing enormous challenges regarding the provision of public facilities, improper infrastructure planning, an accelerating rate of population growth and poor accessibility. Rapid advancements and innovations in GIS and RS techniques have proved to be useful tools for better planning and decision-making to counter these challenges. Therefore, the present study incorporates GIS and RS techniques to investigate the spatial distribution of school facilities, identifies settlements with served and unserved populations, finds potential areas for new schools based on population and develops an accessibility index to evaluate accessibility to schools. For this purpose, high-resolution WorldView imagery was used to develop the road network, settlements and school facilities and to generate school accessibility for each level. Landsat 8 imagery was utilized to extract built-up area by applying pre- and post-processing models, and LandScan 2015 was used to analyze population statistics. Service area analysis was performed using the network analyst extension in ArcGIS 10.3v, and results were evaluated for served and underserved areas and populations. An accessibility tool was used to evaluate a set of potential destinations to determine which is the most accessible with the given population distribution. Findings of the study may help town planners and education authorities understand the existing patterns of school facilities. It is concluded that GIS and remote sensing can be effectively used in urban transport and facility planning. Keywords: accessibility, geographic information system, landscan, worldview
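The served/unserved classification behind a service-area analysis can be sketched without GIS software. The sketch below substitutes straight-line (great-circle) distance for the road-network distance ArcGIS computes, and the coordinates, populations, and 2 km service radius are all hypothetical:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def classify_settlements(settlements, schools, service_km=2.0):
    """Mark each settlement served if any school lies within service_km."""
    result = {}
    for name, (lat, lon, pop) in settlements.items():
        d = min(haversine_km(lat, lon, s_lat, s_lon)
                for s_lat, s_lon in schools)
        result[name] = ("served" if d <= service_km else "unserved", pop)
    return result

# Hypothetical points loosely placed around Sheikhupura district.
schools = [(31.71, 73.98), (31.75, 74.05)]
settlements = {
    "A": (31.712, 73.982, 1200),   # a few hundred metres from a school
    "B": (31.90, 74.30, 800),      # tens of kilometres away
}
status = classify_settlements(settlements, schools)
```

Network (road) distance, as used in the study, is always at least as long as the straight-line distance, so this sketch would over-estimate the served population.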
Procedia PDF Downloads 327
5660 The Art of Contemporary Arabic Calligraphy in Oman: Salman Alhajri as an Example
Authors: Salman Amur Alhajri
Abstract:
Purpose: This paper explores the art of contemporary Arabic calligraphy in Oman. It explains the aesthetic features of Arabic calligraphy as a unique icon of Islamic art. This paper also explores the profile of one Omani artist, Salman Alhajri, as an example of Omani artists who have developed unique styles in this art stream. Methodology and approach: The paper is based on a theoretical study using a descriptive and case-study approach. Omani artists are fascinated by the art forms of Arabic calligraphy, which combine both spiritual meaning and aesthetic beauty. Artist Salman Alhajri is an example of a contemporary Arabic artist who uses Arabic calligraphy as the main theme in his art. Dr. Alhajri is trying to introduce the beauty of Arabic letters from a new aesthetic point of view. He also aims to create unusual visual effects that viewers can easily interact with. Even though words and phrases appear in Alhajri’s artwork, they are not conveying direct meanings: viewers can create their own meaning or expressions from them by appreciating the compositions of the artwork. Results: Arabic writing is directly related to the identity of Omani artists and their cultural background. This paper shows how the beauty of Arabic letters comes from its indefinite possibilities in designing calligraphic expressions, even within a single word, because letters can be stretched and transformed in various ways to create different compositions. Omani artists are interested in employing new media applications in this kind of practice to find new techniques for creating artwork based on Arabic writing. It is really important for all Omani artists to practice this art style because Arabic calligraphy and its flexibility introduce infinite possibilities that involve further exploration and investigation.Keywords: Islamic art, contemporary Arabic calligraphy, new techniques, Omani artist
Procedia PDF Downloads 362
5659 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, and other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) techniques and machine learning techniques applied to clinician notes can assist in doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note that was not previously seen by clinicalBERT and Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embedding that is based on a pre-trained clinical model to balance the samples in each class in the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92). Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
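The evaluation side of the abstract can be made concrete: the precision figures reported there come from computing per-class scores and averaging them. The sketch below shows macro-averaged precision/recall/F1 on a hypothetical two-class label set (it is not the paper's data or its BERT pipeline):

```python
def macro_scores(y_true, y_pred):
    """Per-class precision/recall/F1, macro-averaged so each class is
    weighted equally regardless of its size."""
    labels = sorted(set(y_true) | set(y_pred))
    p_sum = r_sum = f_sum = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        p_sum += prec
        r_sum += rec
        f_sum += f1
    n = len(labels)
    return p_sum / n, r_sum / n, f_sum / n

# 1 = positive clinical sentiment, 0 = negative (hypothetical labels).
macro_p, macro_r, macro_f1 = macro_scores([1, 1, 0, 0], [1, 0, 0, 0])
```

Macro-averaging matters here precisely because the survived/deceased classes are imbalanced, which is also what motivates the paper's augmentation step.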
Procedia PDF Downloads 209
5658 Prevalence and Associated Factors of Periodontal Disease among Diabetes Patients in Addis Ababa, Ethiopia, 2018
Authors: Addisu Tadesse Sahile, Tennyson Mgutshini
Abstract:
Background: Periodontal disease is a common, complex, inflammatory disease characterized by the destruction of tooth-supporting soft and hard tissues of the periodontium, and a major public health problem across developed and developing countries. Objectives: The study was aimed at assessing the prevalence of periodontal disease and associated factors among diabetes patients in Addis Ababa, Ethiopia, 2018. Methods: An institution-based cross-sectional study was conducted on 388 diabetes patients selected by a systematic random sampling method from March to May 2018. The study was conducted at two conveniently selected public hospitals in Addis Ababa. Data were collected with a pre-tested, structured and translated questionnaire and then entered into SPSS version 23 software for analysis. Descriptive statistics as a summary, along with chi-square and binary logistic regression to identify factors associated with periodontal disease, were applied. A 95% CI with a p-value less than 5% was used as the level of significance. Results: Ninety-one percent (n=353) of participants had periodontal disease, with oral examination performed in six regions, while only 9% (n=35) of participants were free of periodontal disease. The number of tooth brushings per day, correct techniques of brushing, malocclusion, and defective fillings were associated with periodontal disease at p < 0.05. Conclusion and recommendation: A higher prevalence of periodontal disease among diabetes patients was observed. The frequency of tooth brushing, correct techniques of brushing, malocclusion and defective fillings were associated with periodontal disease. Emphasis has to be given to the oral health of diabetes patients by every concerned body so as to control the current higher burden of periodontal disease in diabetes. Keywords: periodontal disease, risk factors, diabetes mellitus, Addis Ababa
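The chi-square association test mentioned in the methods reduces, for a 2x2 exposure-vs-disease table, to a one-line statistic. The counts below are invented for illustration and are not the study's actual table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. brushing frequency vs. disease status:
    chi2 = n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts: exposed/unexposed rows, diseased/healthy columns.
chi2 = chi_square_2x2(10, 20, 30, 40)
```

With one degree of freedom, a statistic above 3.841 would correspond to p < 0.05, the significance level the study uses.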
Procedia PDF Downloads 130
5657 Effect of Spontaneous Ripening and Drying Techniques on the Bioactive Activities of the Peel of Plantain (Musa paradisiaca) Fruit
Authors: Famuwagun A. A., Abiona O. O., Gbadamosi S.O., Adeboye O. A., Adebooye O. C.
Abstract:
The need to provide more information on the perceived bioactive status of the peel of plantain fruit informed the design of this research. Mature plantain fruits were harvested, and the fruits were allowed to ripen spontaneously. Samples of plantain fruit were taken every fortnight, and the peels were removed. The peels were dried using two different drying techniques (oven drying and sun drying) and milled into powdery forms. Other samples were picked and processed in a similar manner on the first, third, seventh and tenth day until the peels of the fruits were fully ripened, resulting in eight different samples. The anti-oxidative properties of the samples using different assays (DPPH, FRAP, MCA, HRSA, SRSA, ABTS, ORAC), inhibitory activities against enzymes related to diabetes (alpha-amylase and alpha-glucosidase) and inhibition against angiotensin-converting enzymes (ACE) were evaluated. The results showed that peels of plantain fruits on the 7th day of ripening and sun-dried exhibited greater inhibition against free radicals, which enhanced their antioxidant activities, resulting in greater inhibition against alpha-amylase and alpha-glucosidase enzymes. Also, the oven-dried sample of the peel of plantain fruit on the 7th day of ripening had greater phenolic contents than the other samples, which also resulted in higher inhibition against angiotensin-converting enzymes when compared with other samples. The results showed that even though the unripe peel of plantain fruit is assumed to contain excellent bioactive activities, the fruit should be allowed to ripen for seven days after maturity and harvesting before the peel is consumed, so as to derive maximum benefit from the peel. Keywords: functional ingredient, diabetics, hypertension, functional foods
Procedia PDF Downloads 52
5656 Experimental Study of the Behavior of Elongated Non-spherical Particles in Wall-Bounded Turbulent Flows
Authors: Manuel Alejandro Taborda Ceballos, Martin Sommerfeld
Abstract:
Transport phenomena and dispersion of non-spherical particles in turbulent flows are found everywhere in industrial applications and processes. Powder handling, pollution control, pneumatic transport and particle separation are just some examples where the particles encountered are not only spherical. These types of multiphase flows are wall-bounded and mostly highly turbulent. The particles found in these processes are rarely spherical but may have various shapes (e.g., fibers and rods). Although research related to the behavior of regular non-spherical particles in turbulent flows has been carried out for many years, it is still necessary to refine models, especially near walls, where the fiber-wall interaction completely changes the particle behavior. Imaging-based experimental studies on dispersed particle-laden flows have been applied for many decades for a detailed experimental analysis. These techniques have the advantage that they provide field information in two or three dimensions, but have a lower temporal resolution compared to point-wise techniques such as PDA (phase-Doppler anemometry) and derivations therefrom. The applied imaging techniques in dispersed two-phase flows are extensions of classical PIV (particle image velocimetry) and PTV (particle tracking velocimetry), and the main emphasis was the simultaneous measurement of the velocity fields of both phases. In a similar way, such data should also provide adequate information for validating the proposed models. Available experimental studies on the behavior of non-spherical particles are uncommon and mostly based on planar light-sheet measurements. Especially for elongated non-spherical particles, however, three-dimensional measurements are needed to fully describe their motion and to provide sufficient information for validation of numerical computations.
To further provide detailed experimental results allowing a validation of numerical calculations of non-spherical particle dispersion in turbulent flows, a water channel test facility was built around a horizontal closed water channel. Into this horizontal main flow, a small cross-jet laden with fiber-like particles was injected, which was also solely driven by gravity. The dispersion of the fibers was measured by applying imaging techniques based on an LED array for backlighting and high-speed cameras. For obtaining the fluid velocity fields, almost neutrally buoyant tracer particles were used. The discrimination between tracer and fibers was done based on image size, which was also the basis to determine fiber orientation with respect to the inertial coordinate system. The synchronous measurement of fluid velocity and fiber properties also allows the collection of statistics of fiber orientation, velocity fields of tracer and fibers, the angular velocity of the fibers and the orientation between fiber and instantaneous relative velocity. Consequently, an experimental study of the behavior of elongated non-spherical particles in wall-bounded turbulent flows was achieved, and a comprehensive analysis was developed, especially near the wall region, where hydrodynamic wall interaction effects (e.g., collision or lubrication) and abrupt changes of particle rotational velocity exist. This subsequently allowed us to predict numerically the behavior of non-spherical particles within the frame of the Euler/Lagrange approach, where the particles are treated as “point particles”. Keywords: crossflow, non-spherical particles, particle tracking velocimetry, PIV
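The image-size discrimination and orientation determination described above can be sketched in two small functions. The area threshold and blob geometry below are invented for illustration; a real PTV pipeline would work on segmented camera images rather than pre-extracted numbers:

```python
import math

def classify_blob(area_px, tracer_max_area=25):
    """Tracers image as small spots; fibers as larger elongated blobs.
    Discriminate on projected image area in pixels (threshold assumed)."""
    return "tracer" if area_px <= tracer_max_area else "fiber"

def fiber_orientation_deg(x1, y1, x2, y2):
    """In-plane orientation of a fiber from its imaged endpoints,
    measured against the horizontal and folded into [0, 180)."""
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return angle % 180.0

kind = classify_blob(120)                    # large elongated blob
theta = fiber_orientation_deg(0, 0, 10, 10)  # endpoints of one fiber
```

Repeating this per frame and per camera is what builds up the orientation and angular-velocity statistics mentioned in the abstract.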
Procedia PDF Downloads 87
5655 Implementing 3D Printing for 3D Digital Modeling in the Classroom
Authors: Saritdikhun Somasa
Abstract:
3D printing fabrication has empowered many artists in many fields. Artists who work in stop motion, 3D modeling, toy design, product design, sculpture, and fine arts become one-stop shop operations–where they can design, prototype, and distribute their designs for commercial or fine art purposes. The author has developed a digital sculpting course that fosters digital software, peripheral hardware, and 3D printing with traditional sculpting concept techniques to address the complexities of this multifaceted process, allowing the students to produce complex 3d-printed work. The author will detail the preparation and planning for pre- to post-process 3D printing elements, including software, materials, space, equipment, tools, and schedule consideration for small to medium figurine design statues in a semester-long class. In addition, the author provides insight into teaching challenges in the non-studio space that requires students to work intensively on post-printed models to assemble parts, finish, and refine the 3D printed surface. Even though this paper focuses on the 3D printing processes and techniques for small to medium design statue projects for the Digital Media program, the author hopes the paper will benefit other fields of study such as craft practices, product design, and fine-arts programs. Other schools that might implement 3D printing and fabrication in their programs will find helpful information in this paper, such as a teaching plan, choices of equipment and materials, adaptation for non-studio spaces, and putting together a complete and well-resolved project for students.Keywords: 3D digital modeling, 3D digital sculpting, 3D modeling, 3D printing, 3D digital fabrication
Procedia PDF Downloads 105
5654 The Relationship between Renewable Energy, Real Income, Tourism and Air Pollution
Authors: Eyup Dogan
Abstract:
One criticism of the energy-growth-environment literature, to the best of our knowledge, is that only a few studies analyze the influence of tourism on CO₂ emissions even though the tourism sector is closely related to the environment. The other criticism is the selection of methodology. Panel estimation techniques that fail to consider both heterogeneity and cross-sectional dependence across countries can cause forecasting errors. To fill the mentioned gaps in the literature, this study analyzes the impacts of real GDP, renewable energy and tourism on the levels of carbon dioxide (CO₂) emissions for the top 10 most-visited countries around the world. This study focuses on the top 10 touristic (most-visited) countries because they have received about half of the worldwide tourist arrivals in recent years and are among the top ones in the 'Renewable Energy Country Attractiveness Index (RECAI)'. By looking at Pesaran’s CD test and the average growth rates of variables for each country, we detect the presence of cross-sectional dependence and heterogeneity. Hence, this study uses second-generation econometric techniques (the cross-sectionally augmented Dickey-Fuller (CADF) and cross-sectionally augmented IPS (CIPS) unit root tests, the LM bootstrap cointegration test, and the DOLS and FMOLS estimators) which are robust to the mentioned issues. Therefore, the reported results are accurate and reliable. It is found that renewable energy mitigates the pollution whereas real GDP and tourism contribute to carbon emissions. Thus, regulatory policies are necessary to increase the awareness of sustainable tourism. In addition, the use of renewable energy and the adoption of clean technologies in the tourism sector, as well as in producing goods and services, play significant roles in reducing the levels of emissions. Keywords: air pollution, tourism, renewable energy, income, panel data
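Pesaran's CD test mentioned above aggregates the pairwise correlations of the units' series into a single statistic, CD = sqrt(2T / (N(N-1))) times the sum of correlations over all pairs i < j. A minimal sketch on synthetic panel data (perfectly co-moving series, so the dependence is obvious):

```python
import math

def pearson(x, y):
    # Pairwise Pearson correlation between two series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def pesaran_cd(panel):
    """Pesaran CD statistic for N series of length T (balanced panel):
    CD = sqrt(2T / (N(N-1))) * sum of pairwise correlations, i < j."""
    n = len(panel)
    t = len(panel[0])
    rho_sum = sum(pearson(panel[i], panel[j])
                  for i in range(n) for j in range(i + 1, n))
    return math.sqrt(2.0 * t / (n * (n - 1))) * rho_sum

# Three identical synthetic series: every pairwise correlation is 1.
base = [float(v) for v in range(12)]
cd = pesaran_cd([base, base, base])
```

Under the null of cross-sectional independence the statistic is approximately standard normal, so a large value like the one here is what motivates switching to second-generation estimators.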
Procedia PDF Downloads 184
5653 Smart Safari: Safari Guidance Mobile Application
Authors: D. P. Lawrence, T. M. M. D. Ariyarathna, W. N. K. De Silva, M. D. S. C. De Silva, Lasantha Abeysiri, Pradeep Abeygunawardhna
Abstract:
Safari traveling is one of the most famous hobbies all over the world. In Sri Lanka, 'Yala' is the second-largest national park and an excellent place to go for a safari. Many local and foreign travelers come to go on safari in 'Yala'. But 'Yala' does not have a mobile application made to facilitate travelers with some important features that they want in the safari experience. To overcome these difficulties, the proposed mobile application adds those identified features to make the work of travelers, guides, and administrators easier. The proposed safari traveling guidance mobile application is called 'SMART SAFARI' for the 'Yala' National Park in Sri Lanka. There are four facilities in this mobile application provided for travelers as well as guides. As the first facility, the guide and traveler can view the created map of the park, and the guide can add temporary locations of animals and special locations on the map. This is a Geographic Information System (GIS) to capture, analyze, and display geographical data. The second facility is to generate optimal paths through the park according to the travelers' requirements by using machine learning techniques. In the third part, the traveler can get information about animals using an animal identification system by capturing an image of the animal. As the other facility, the traveler will be able to add reviews and a rating and view those comments under categorized sections and a pre-defined score range. With those facilities, this user-friendly mobile application provides the user with a better experience in safari traveling, and it will probably help to develop tourism culture in Sri Lanka. Keywords: animal identification system, geographic information system, machine learning techniques, pre-defined score range
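The optimal-path facility described above can be illustrated with a classical shortest-path baseline. The sketch below uses Dijkstra's algorithm over a tiny made-up track graph (the junction names and distances are invented); the app itself proposes machine learning on top of such routing:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over a weighted road graph {node: [(neighbour, km), ...]}.
    Returns (total_km, path)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical track junctions inside the park (names are made up).
park = {
    "gate": [("waterhole", 4.0), ("junction", 2.0)],
    "junction": [("waterhole", 1.0), ("leopard_rock", 5.0)],
    "waterhole": [("leopard_rock", 2.0)],
}
km, route = shortest_path(park, "gate", "leopard_rock")
```

Traveler requirements (e.g. preferring tracks past recent animal sightings) could be folded in by adjusting the edge weights before routing.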
Procedia PDF Downloads 134
5652 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling
Authors: Vibha Devi, Shabina Khanam
Abstract:
Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in the ratio of 3:1, which is a rare and most desired ratio that enhances the quality of hemp oil. These components are beneficial for the development of cell and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and are a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various conditions of parameters: temperature (40 - 80) °C, pressure (200 - 350) bar, flow rate (5 - 15) g/min, particle size (0.430 - 1.015) mm and amount of co-solvent (0 - 10) % of solvent flow rate, through central composite design (CCD). CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each data set to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques, which create a large number of data sets through resampling from the original dataset and analyze these data to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement. For jackknife resampling, the sample size is 31 (eliminating one observation), which is repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample. For bootstrap resampling, the sample size is 32, which was repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, coefficient of variation and standard error of the mean.
For the ω-6 linoleic acid concentration, the mean value was approximately 58.5 % for both resampling methods, which is the average (central value) of the sample means over all data points. Similarly, for the ω-3 linolenic acid concentration, the mean was observed as 22.5 % through both resamplings. Variance measures the spread of the data about the mean; a larger variance indicates a wider range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66 %) and 6 for ω-3 linolenic acid (ranging from 16.71 to 26.2 %). Further, the low standard deviation (approximately 1 %), low standard error of the mean (< 0.8), and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All estimator values of the variation coefficient, standard deviation, and standard error of the mean are found within the 95 % confidence interval.
Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation
Procedia PDF Downloads 141
5651 Strengthening of Reinforced Concrete Columns Using Advanced Composite Materials to Resist Earthquakes
Authors: Mohamed Osama Hassaan
Abstract:
Recent earthquakes have demonstrated the vulnerability of older reinforced concrete buildings to failure under imposed seismic loads. Accordingly, the need to strengthen existing reinforced concrete structures, mainly columns, to resist high seismic loads has increased. Conventional strengthening techniques, such as steel plates, steel angles, and concrete overlays, are used to achieve the required increase in strength or ductility; however, techniques using advanced composite materials are now established. The column's splice zone is the zone most prone to failure under seismic loads. Three types of splice-zone failure can be observed under seismic action: failure of the flexural plastic hinge region, shear failure, and failure due to a short lap splice. A lapped splice transfers force from one bar to another through the concrete surrounding both bars: at any point along the splice, force is transferred from one bar by bond to the surrounding concrete and also by bond to the other bar of the pair forming the splice. The integrity of the lap splice therefore depends on the development of adequate bond length. R.C. columns built in seismic regions are expected to undergo a large number of inelastic deformation cycles while maintaining the overall strength and stability of the structure, which can be ensured by proper confinement of the concrete core. This research focuses on the last type of failure. There are insufficient studies addressing the strengthening of existing reinforced concrete columns at the splice zone through confinement with advanced composite materials. Accordingly, more investigation is needed into the seismic behavior of reinforced concrete columns strengthened with the new generation of composite materials, such as carbon fiber polymer, glass fiber polymer, and aramid fiber polymer.
Keywords: strengthening, columns, advanced composite materials, earthquakes
Procedia PDF Downloads 78