Search results for: de-noising techniques
5653 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform
Authors: Omaima N. Ahmad AL-Allaf
Abstract:
Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authorship problems. Digital image watermarking techniques hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and to maintain data quality. In this paper we discuss two approaches to image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with each approach separately to transform the cover image for the embedding process. Both PSO and GA use the correlation coefficient to detect the high-energy coefficients of the original image and then hide the watermark bits there. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. The PSO approach gave better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach gave a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations and 3×3 blocks. The results indicate that small block sizes can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarked image. The best PSO results were obtained with a swarm size of 100.
Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform
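As an illustration of the imperceptibility metrics quoted above, the following is a minimal NumPy sketch of how MSE and PSNR are computed between a cover image and its watermarked version. The image arrays and the toy perturbation are synthetic stand-ins; the real inputs would come from the DWT embedding step, and the peak value of 255 assumes 8-bit images.

```python
import numpy as np

def mse(original, watermarked):
    """Mean squared error between two images of identical shape."""
    diff = original.astype(np.float64) - watermarked.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, watermarked, peak=255.0):
    """Peak signal-to-noise ratio in dB, assuming 8-bit images (peak=255)."""
    err = mse(original, watermarked)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)

# Illustrative usage with a random 8-bit image standing in for the cover image;
# the perturbation below is only a toy substitute for embedded watermark bits.
cover = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
marked = cover.copy()
marked[::8, ::8] ^= 1
print("MSE:", mse(cover, marked), "PSNR (dB):", psnr(cover, marked))
```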
Procedia PDF Downloads 226
5652 Pre-Implementation of Total Body Irradiation Using Volumetric Modulated Arc Therapy: Full Body Anthropomorphic Phantom Development
Authors: Susana Gonçalves, Joana Lencart, Anabela Gregório Dias
Abstract:
Introduction: In combination with chemotherapy, Total Body Irradiation (TBI) is most often used as part of the conditioning regimen prior to allogeneic hematopoietic stem cell transplantation. Conventional TBI techniques have long application times and non-conformal beam application, with no ability to individually spare organs at risk. Our institution intends to start using Volumetric Modulated Arc Therapy (VMAT) techniques to increase the homogeneity of the delivered radiation. As a first approach, a dosimetric plan was performed on a computed tomography (CT) scan of a Rando Alderson anthropomorphic phantom (head and torso), using a set of six arcs distributed along the phantom. However, a full-body anthropomorphic phantom is essential to carry out technique validation and implementation. Our aim is to define the physical and chemical characteristics and the ideal manufacturing procedure for the upper and lower limbs of our anthropomorphic phantom, in order to later validate TBI using VMAT. Materials and Methods: To study the best fit between our phantom and the limbs, a CT scan of the Rando Alderson anthropomorphic phantom was acquired. The CT was performed on GE Healthcare equipment (model Optima CT580 W), with a slice thickness of 2.5 mm. This CT was also used to assess the electron density of soft tissue and bone through Hounsfield unit (HU) analysis. Results: The CT images were analyzed and measurements were made for the ideal upper and lower limbs. The upper limbs should be built to the following dimensions: 43 cm length and 7 cm diameter (next to the shoulder section). The lower limbs should be built to the following dimensions: 79 cm length and 16.5 cm diameter (next to the thigh section). As expected, soft tissue and bone have very different electron densities. This is important when choosing and analyzing different materials to best represent soft tissue and bone characteristics. The approximate HU values for soft tissue and bone should be 35 HU and 250 HU, respectively. Conclusion: At the moment, several compounds based on different types of resins and additives are being developed in order to control and mimic the densities of the various tissue constituents. Concurrently, several manufacturing techniques are being explored to make it possible to produce the upper and lower limbs in a simple and inexpensive way, in order to finally carry out a systematic and appropriate study of total body irradiation. This preliminary study was a good starting point to demonstrate the feasibility of TBI with VMAT.
Keywords: TBI, VMAT, anthropomorphic phantom, tissue equivalent materials
Procedia PDF Downloads 80
5651 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it describes the ability of second-grade students to write review texts without applying the Lesson Study model at SMPN 18 Makassar. Second, it describes the ability of second-grade students to write review texts when the Lesson Study model is applied at SMPN 18 Makassar. Third, it tests the effectiveness of the Lesson Study model in writing review texts at SMPN 18 Makassar. This research used a true experimental, posttest-only group design involving two groups: one control class and one experimental class. The research population comprised all 250 second-grade students at SMPN 18 Makassar, distributed across 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, also consisting of 30 students. The research instruments were observations and tests. The collected data were analyzed using descriptive statistics and inferential statistics of the t-test type, processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) obtained a score above 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of 7.5, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review texts. The comparison of the control and experimental classes shows that the t-count is greater than the t-table value (2.411 > 1.667), so the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
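The hypothesis test reported above (t-count compared with a tabled critical value) corresponds to an independent-samples t-test. The study used SPSS 21; as a hedged stand-in, the sketch below shows the equivalent computation with SciPy on hypothetical score arrays for the two classes of 30 students each (the real scores came from the study's rubric).

```python
import numpy as np
from scipy import stats

# Hypothetical writing scores for the two classes of 30 students each.
control = np.random.default_rng(1).normal(7.2, 0.8, 30)       # without Lesson Study
experimental = np.random.default_rng(2).normal(7.9, 0.7, 30)  # with Lesson Study

# Independent-samples t-test (equal variances assumed, as in the classical test).
t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=True)

# One-tailed critical value at alpha = 0.05 with n1 + n2 - 2 degrees of freedom.
df = len(control) + len(experimental) - 2
t_crit = stats.t.ppf(0.95, df)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}, t-table = {t_crit:.3f}")
# The alternative hypothesis is accepted when t exceeds the tabled value.
```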
Procedia PDF Downloads 201
5650 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A based on transformer models like BERT and GPT to answer PICO clinical questions for evidence-based practice, drawing on sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process that first filters relevant articles and then identifies articles supporting the requested outcome expressed by the PICO question. Moreover, we report experiments that empower our bootstrapping techniques with patch attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping patched with attention shows the relevance of the collected evidence based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
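The first bootstrapping stage described above filters articles relevant to a PICO question. The abstract uses transformer models for this step; the sketch below is only a simplified stand-in that ranks candidate abstracts against a PICO question with TF-IDF cosine similarity. The question and abstracts are made up for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical PICO question and candidate PubMed abstracts; the paper uses
# transformer models (BERT/GPT), so TF-IDF here is only a stand-in for the
# first-stage "filter relevant articles" step.
pico_question = ("In adults with type 2 diabetes (P), does metformin (I) "
                 "compared with sulfonylureas (C) reduce cardiovascular events (O)?")
abstracts = [
    "Metformin therapy and cardiovascular outcomes in type 2 diabetes ...",
    "Sulfonylurea use and risk of hypoglycaemia in elderly patients ...",
    "Statins for primary prevention of cardiovascular disease ...",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([pico_question] + abstracts)

# Rank abstracts by similarity to the PICO question and keep the top hits.
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
for score, text in sorted(zip(scores, abstracts), reverse=True):
    print(f"{score:.3f}  {text[:60]}")
```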
Procedia PDF Downloads 43
5649 Overview of Pre-Analytical Lab Errors in a Tertiary Care Hospital at Rawalpindi, Pakistan
Authors: S. Saeed, T. Butt, M. Rehan, S. Khaliq
Abstract:
Objective: To determine the frequency of pre-analytical errors in samples taken from patients for various lab tests at Fauji Foundation Hospital, Rawalpindi. Material and Methods: All laboratory specimens received for diagnostic purposes from indoor and outdoor patients of Fauji Foundation Hospital, Rawalpindi were included. The total number of samples received in the lab was recorded in the computerized program made for the hospital. All errors observed in the pre-analytical process, including patient identification, sampling techniques, test collection procedures, and specimen transport/processing and storage, were recorded in the log book kept for the purpose. Results: A total of 476616 specimens were received in the lab during the period of study, including 237931 from outdoor and 238685 from indoor patients. Forty-one percent of the samples (n=197976) revealed pre-analytical discrepancies. The discrepancies included hemolyzed samples (34.8%), clotted blood (27.8%), incorrect samples (17.4%), unlabeled samples (8.9%), insufficient specimens (3.9%), request forms without an authorized signature (2.9%), empty containers (3.9%), and tube breakage during centrifugation (0.8%). Most of these pre-analytical discrepancies were observed in samples received from the wards, revealing inappropriate sample collection by the ward medical staff; most outdoor samples are collected by lab staff who are properly trained for sample collection. Conclusion: It is mandatory to educate phlebotomists and paramedical staff, particularly those performing duties in the wards, regarding the timing and techniques of sampling, the appropriate container to use, and early delivery of samples to the lab in order to reduce pre-analytical errors.
Keywords: pre-analytical lab errors, tertiary care hospital, hemolyzed, paramedical staff
Procedia PDF Downloads 204
5648 Preparation of Conductive Composite Fiber by the Reduction of Silver Particles onto Hydrolyzed Polyacrylonitrile Fiber
Authors: Z. Okay, M. Kalkan Erdoğan, M. Şahin, M. Saçak
Abstract:
Polyacrylonitrile (PAN) is one of the most common and cheapest fiber-forming polymers because of its high strength and high abrasion resistance. Alkaline hydrolysis of PAN fiber can form products with conjugated sequences of –C=N–, acrylamide, sodium acrylate, and amidine. In this study, PAN fiber was hydrolyzed in a sodium hydroxide solution, and this hydrolyzed PAN (HPAN) fiber was used to prepare a conductive composite fiber with silver particles. Electrically conductive PAN fiber has the potential to be used in a variety of materials such as antistatic materials, life jackets and static charge reducing products. We monitored the change in the weight loss of the PAN fiber with hydrolysis time. A weight loss of 60% was observed after 7 h of hydrolysis under the investigated conditions, but the fiber lost its fibrous structure. A hydrolysis time of 5 h was found to be suitable in terms of preserving the fibrous structure. The change in the conductivity of the composite with the preparation conditions, such as hydrolysis time and silver ion concentration, was studied. PAN fibers with different degrees of hydrolysis were treated with aqueous solutions containing different concentrations of silver ions under continuous stirring at 20 °C for 30 min, and a composite with a maximum conductivity of 2 S/cm could be prepared. The antibacterial properties of the silver-containing conductive HPAN fibers were also investigated. While the hydrolysis of the PAN fiber was characterized by FTIR and SEM techniques, the silver reduction process of the HPAN fiber was investigated by SEM and TGA-DTA techniques. The SEM micrographs showed that the surface of the HPAN fiber was rougher and much more corroded than that of the PAN fiber.
Keywords: composite, conducting polymer, fiber, polyacrylonitrile
Procedia PDF Downloads 478
5647 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but at the same time, vulnerabilities and security threats have increased significantly. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews were conducted with 30 information security experts to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on these results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool for reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and the continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
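The intrusion-detection component mentioned above uses models such as Random Forest evaluated by accuracy and sensitivity. The following is a minimal scikit-learn sketch of that evaluation loop; the flow features and labels are synthetic stand-ins, and the encryption (AES/RSA) and neural-network parts of the proposed model are not shown.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

# Synthetic stand-in for labelled network-flow records (benign = 0, attack = 1);
# a real deployment would use engineered features from traffic logs.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))          # 10 hypothetical flow features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# Accuracy and sensitivity (recall on the attack class), two of the
# indicators reported in the study.
print("accuracy:", accuracy_score(y_test, pred))
print("sensitivity:", recall_score(y_test, pred))
```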
Procedia PDF Downloads 29
5646 Homogenization of a Non-Linear Problem with a Thermal Barrier
Authors: Hassan Samadi, Mustapha El Jarroudi
Abstract:
In this work, we consider the homogenization of a non-linear problem in a periodic medium with two connected periodic media exchanging a heat flux throughout their common interface. The interfacial exchange coefficient λ is assumed to tend to zero or to infinity at a rate λ=λ(ε) as the size ε of the basic cell tends to zero. Three homogenized problems are determined according to a critical value depending on λ and ε. Our method is based on Γ-convergence techniques.
Keywords: variational methods, epiconvergence, homogenization, convergence technique
Procedia PDF Downloads 525
5645 Contactless Electromagnetic Detection of Stress Fluctuations in Steel Elements
Authors: M. A. García, J. Vinolas, A. Hernando
Abstract:
Steel is nowadays one of the most important structural materials because of its outstanding mechanical properties. Therefore, in order to pursue a sustainable economic model and optimize the use of extensive resources, new methods to monitor and prevent the failure of steel-based facilities are required. Classical mechanical tests, such as building testing, are invasive and destructive. Moreover, for facilities where the steel element is embedded (as in reinforced concrete), these techniques are not directly applicable. Hence, non-invasive monitoring techniques that prevent failure without altering the structural properties of the elements are required. Among them, electromagnetic methods are particularly suitable for the non-invasive inspection of the mechanical state of steel-based elements. Magnetoelastic coupling effects induce a modification of the electromagnetic properties of an element under applied stress. Since most steels are ferromagnetic because of their large Fe content, it is possible to inspect their structure and state in a non-invasive way. We present here a distinct electromagnetic method for the contactless evaluation of internal stress in steel-based elements. In particular, this method relies on measuring the magnetic induction between two coils with the steel specimen between them. We found that stress-induced changes in the electromagnetic properties of the steel specimen altered the measured induction, allowing us to detect stresses well below half the elastic limit of the material. Hence, it represents an outstanding non-invasive method for preventing failure in steel-based facilities. We describe the theoretical model, present experimental results to validate it, and finally show a practical application for the detection of stress and inhomogeneities in train railways.
Keywords: magnetoelastic, magnetic induction, mechanical stress, steel
Procedia PDF Downloads 50
5644 Design, Synthesis and Evaluation of 4-(Phenylsulfonamido)Benzamide Derivatives as Selective Butyrylcholinesterase Inhibitors
Authors: Sushil Kumar Singh, Ashok Kumar, Ankit Ganeshpurkar, Ravi Singh, Devendra Kumar
Abstract:
In the spectrum of neurodegenerative diseases, Alzheimer's disease (AD) is characterized by the presence of amyloid β plaques and neurofibrillary tangles in the brain. It results in cognitive and memory impairment due to the loss of cholinergic neurons, which is considered one of the contributing factors. Donepezil, an acetylcholinesterase (AChE) inhibitor that also inhibits butyrylcholinesterase (BuChE) and improves memory and the brain's cognitive functions, is the most successful and most widely prescribed drug for treating the symptoms of AD. The present work is based on the design of selective BuChE inhibitors using computational techniques. Machine learning models were trained using classification algorithms, followed by screening of a diverse chemical library of compounds. Various molecular modelling and simulation techniques were used to obtain the virtual hits. The amide derivatives of 4-(phenylsulfonamido)benzoic acid were synthesized and characterized using 1H and 13C NMR, FTIR and mass spectrometry. The enzyme inhibition assays were performed on equine plasma BuChE and electric eel AChE by the method developed by Ellman et al. Compounds 31, 34, 37, 42, 49, 52 and 54 were found to be active against equine BuChE. N-(2-chlorophenyl)-4-(phenylsulfonamido)benzamide and N-(2-bromophenyl)-4-(phenylsulfonamido)benzamide (compounds 34 and 37) displayed IC50 values of 61.32 ± 7.21 and 42.64 ± 2.17 nM against equine plasma BuChE. Ortho-substituted derivatives were more active against BuChE. Furthermore, the ortho-halogen and ortho-alkyl substituted derivatives were the most active of all, with minimal AChE inhibition. The compounds were selective toward BuChE.
Keywords: Alzheimer disease, butyrylcholinesterase, machine learning, sulfonamides
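The "train a classifier, then screen a chemical library" workflow mentioned above can be sketched generically with scikit-learn. The descriptor matrix and activity labels below are entirely hypothetical; in practice the descriptors would be computed from molecular structures (e.g., with a cheminformatics toolkit) and the labels would come from curated bioactivity data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical training set: 500 molecules x 64 descriptors, with binary
# activity labels (1 = BuChE inhibitor).
X_train = rng.normal(size=(500, 64))
y_train = (X_train[:, 0] - X_train[:, 1] > 0).astype(int)

model = RandomForestClassifier(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

# Screen a hypothetical library and rank molecules by predicted probability
# of activity; the top-ranked virtual hits would then be modelled, synthesised
# and assayed, as in the study.
library = rng.normal(size=(10000, 64))
scores = model.predict_proba(library)[:, 1]
top_hits = np.argsort(scores)[::-1][:10]
print("indices of top-ranked virtual hits:", top_hits)
```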
Procedia PDF Downloads 139
5643 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers
Authors: Neliswa Dyosi
Abstract:
Purpose – Using an integrated Technology-Organization-Environment (TOE) framework and the model of technology appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach – The study adopts the interpretivist philosophical paradigm. Multiple case studies will be adopted as the research strategy. For data collection, the study follows a qualitative approach. Qualitative data will be collected from six retailers in various industries. Semi-structured interviews and documents will be used as the data collection techniques. Purposive and snowball sampling techniques will be used to identify participants within the organizations. Data will be analyzed using thematic analysis. Originality/value – Using a deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. The study contributes to theory development by integrating the MTA and TOE frameworks as a means of understanding the technology adoption behaviors of organizations, in this case retailers. This is also the first study that takes an integrated approach, combining the Technology-Organization-Environment (TOE) framework and the MTA framework, to understand the adoption and use of a payment method. South Africa is ranked among the top ten countries in the world for cryptocurrency adoption. There is, however, still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will contribute to the existing literature as bitcoin cryptocurrency gains popularity as an alternative payment method across the globe.
Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation
Procedia PDF Downloads 136
5642 Land Cover, Land Surface Temperature, and Urban Heat Island Effects in Tropical Sub Saharan City of Accra
Authors: Eric Mensah
Abstract:
The effects of the rapid urbanisation of tropical sub-Saharan developing cities on local and global climate are of great concern because of the negative impacts of Urban Heat Island (UHI) effects. The importance of urban parks, vegetative cover and forest reserves in these tropical cities has been undervalued, with rapid degradation and loss of vegetative cover to urban development continuing to increase daily mean temperatures and change local climatic conditions. Using Landsat data of the same months and period intervals, the spatial variations of land cover change, temperature, and vegetation were examined to determine how vegetation moderates local temperature and how urbanisation has affected daily mean temperatures over the past 12 years. The remote sensing techniques of maximum likelihood supervised classification, land surface temperature retrieval, and the normalised difference vegetation index were used to analyse and create the land use/land cover (LULC), land surface temperature (LST), and vegetation and non-vegetation cover maps, respectively. Results from the study showed an increase in daily mean temperature of 0.80 °C as a result of a rapid increase in urban area of 46.13 sq. km and a loss of vegetative cover of 46.24 sq. km between 2005 and 2017. The LST map also shows the existence of UHI within the urban areas of Accra and the potential mitigating effects offered by forest and vegetative cover, as demonstrated by the cool islands around the Achimota ecological forest and the University of Ghana botanical gardens.
Keywords: land surface temperature, climate, remote sensing, urbanisation
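Of the remote sensing steps listed above, the vegetation index is the simplest to express compactly. The sketch below computes the normalised difference vegetation index from red and near-infrared reflectance arrays; the arrays and the vegetation threshold are hypothetical, and Landsat band numbering (which differs between sensors) is deliberately left out.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Hypothetical reflectance arrays standing in for Landsat NIR and red bands.
nir_band = np.random.default_rng(0).uniform(0.2, 0.6, (100, 100))
red_band = np.random.default_rng(1).uniform(0.05, 0.3, (100, 100))

index = ndvi(nir_band, red_band)
vegetation_mask = index > 0.3   # illustrative threshold for vegetated pixels
print("mean NDVI:", index.mean(), "vegetated fraction:", vegetation_mask.mean())
```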
Procedia PDF Downloads 320
5641 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering
Authors: Sara Hasani
Abstract:
This research focuses on natural sudden onset disasters, characterised as 'occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities'. Based on panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where the data were available and could be cross-examined from various humanitarian sources. The records were then filtered to 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed as a combination of pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis used to validate the results further. The results indicate that there is a relationship between a disaster's human impact and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward which could predict a disaster's human impact based on its severity rank in the early hours of a disaster strike. The predictions in this model are outlined as worst- and best-case scenarios, which together give the lower and higher ranges of the prediction. The necessity of developing such a predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating needs at the time of the disaster has yet to be developed. This can further be used to allocate resources in the response phase of the disaster, when data are scarce.
Keywords: disaster management, natural disaster, pattern recognition, prediction
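A minimal stand-in for the "cluster historic records, then read off best/worst-case ranges for a new event" idea is sketched below using k-means on the five predictors. All records, variable scalings and percentile choices are hypothetical; the study itself used the historic disaster record with a combination of pattern recognition and rule-based clustering rather than plain k-means.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical records: [disaster-type code, HDI, DRI, population (millions),
# population density], plus an observed fatality count used only for the ranges.
X = np.column_stack([
    rng.integers(0, 5, 500), rng.uniform(0.3, 0.95, 500),
    rng.uniform(0, 10, 500), rng.uniform(0.1, 200, 500),
    rng.uniform(5, 2000, 500),
])
fatalities = rng.lognormal(mean=4, sigma=1.5, size=500)

scaler = StandardScaler()
km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(scaler.fit_transform(X))

# For a new disaster, assign it to the nearest cluster and report the
# best-case / worst-case fatality range observed in that cluster.
new_event = np.array([[2, 0.55, 6.0, 30.0, 450.0]])
cluster = km.predict(scaler.transform(new_event))[0]
members = fatalities[km.labels_ == cluster]
print("predicted range:", np.percentile(members, 10), "-", np.percentile(members, 90))
```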
Procedia PDF Downloads 153
5640 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, a homogeneity measure appropriate for event history data with multiple risks is needed. We consider the use of deviance and modified Cox-Snell residuals as measures of impurity in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which the homogeneity measures were based on martingale residuals. A data structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results for univariate competing risks revealed that using deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance technique. Bone marrow transplant data and a double-blinded randomized clinical trial, conducted in order to compare two treatments for patients with prostate cancer, were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from the empirical studies of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (deviance=16.6498) performs better than both the martingale residual (deviance=160.3592) and the deviance residual (deviance=556.8822) for both the event of interest and the competing risks. Additionally, the prostate cancer results also reveal the better performance of the proposed model over the existing one for both causes; interestingly, the Cox-Snell residual (MSE=0.01783563) outperforms both the martingale residual (MSE=0.1853148) and the deviance residual (MSE=0.8043366). Moreover, these results validate those obtained from the Monte Carlo studies.
Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree
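The residuals compared above are related to one another in a simple way: given each subject's event indicator and estimated cumulative hazard (which is the Cox-Snell residual), the martingale and deviance residuals follow directly. The NumPy sketch below shows those standard relations on hypothetical values; it does not reproduce the tree-building or competing-risks machinery of the paper.

```python
import numpy as np

# Hypothetical event indicators (1 = event of interest) and estimated
# cumulative hazards H_i evaluated at each subject's follow-up time.
delta = np.array([1, 0, 1, 1, 0, 1])
H = np.array([0.8, 0.3, 1.5, 0.2, 0.9, 2.1])   # Cox-Snell residuals r_i = H_i

# Martingale residual: observed minus expected number of events.
martingale = delta - H

# Deviance residual: a symmetrised transform of the martingale residual.
deviance = np.sign(martingale) * np.sqrt(
    -2.0 * (martingale + delta * np.log(np.where(delta > 0, delta - martingale, 1.0)))
)

for r, m, d in zip(H, martingale, deviance):
    print(f"Cox-Snell {r:.2f}  martingale {m:+.2f}  deviance {d:+.2f}")
```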
Procedia PDF Downloads 272
5639 Erector Spinae Plane Block versus Paravertebral Block in Breast Surgery
Authors: Widad Kouachi, Nacera Benmouhoub
Abstract:
Background: The erector spinae plane block (ESP) and the thoracic paravertebral block (PVB) are two widely used regional anesthesia techniques in breast cancer surgery. Both techniques aim to improve postoperative pain management and reduce opioid consumption. However, comparative data on their efficacy in oncologic breast surgery remain limited. Objectives: This study aims to compare the efficacy of ESP and PVB in terms of postoperative pain control, patient satisfaction, and opioid consumption in breast cancer surgery. Methods: A randomized, double-blind trial was conducted involving 100 patients undergoing oncologic breast surgery. Patients were randomly assigned to two groups: 50 received ESP, and 50 received PVB. Postoperative pain scores (at rest and during movement), opioid consumption, patient satisfaction, and hospital length of stay were recorded and analyzed. Results: Both ESP and PVB provided effective postoperative analgesia. No significant difference in pain scores was observed between the two groups within the first 24 hours. However, ESP showed a notable advantage in managing chronic postoperative pain at the 6-month follow-up. Opioid consumption was lower in both groups compared with patients without a block. No significant differences in complication rates or hospital stay were noted between the groups. Conclusion: ESP and PVB offer comparable efficacy for immediate postoperative pain control in breast cancer surgery. Nevertheless, ESP may have a superior role in managing long-term pain. Further research is needed to explore the mechanisms behind the observed differences in chronic pain outcomes.
Keywords: pain assessment, breast surgery, PVB block, ESP block
Procedia PDF Downloads 30
5638 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant
Authors: Michael Smalenberger
Abstract:
Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate the parameters necessary for the implementation of the artificial intelligence component and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and the probability of mastery, none directly and reliably asks the user to self-assess them. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant in the early stages of ITS usage, while transaction-level data are scant. Once a user's transaction-level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings relate to the number of exercises necessary to reach mastery of a knowledge component, the associated implications for learning curves, and their relevance to instruction time.
Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation
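For reference, the standard BKT update that the estimated (or self-assessed) guess, slip and transit parameters feed into is shown below: a Bayesian posterior on mastery given the observed response, followed by the learning transition. The parameter values and answer sequence are purely illustrative.

```python
def bkt_update(p_learned, correct, guess, slip, transit):
    """One Bayesian Knowledge Tracing step: posterior given the observation,
    then the learning transition to the next practice opportunity."""
    if correct:
        posterior = (p_learned * (1 - slip)) / (
            p_learned * (1 - slip) + (1 - p_learned) * guess)
    else:
        posterior = (p_learned * slip) / (
            p_learned * slip + (1 - p_learned) * (1 - guess))
    return posterior + (1 - posterior) * transit

# Illustrative parameters (early on these could come from self-assessment,
# later replaced by estimates from transaction-level data).
p_mastery, guess, slip, transit = 0.3, 0.2, 0.1, 0.15

for outcome in [True, True, False, True]:   # a hypothetical answer sequence
    p_mastery = bkt_update(p_mastery, outcome, guess, slip, transit)
    print(f"P(mastery) = {p_mastery:.3f}")
```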
Procedia PDF Downloads 172
5637 Synchrotron Based Techniques for the Characterization of Chemical Vapour Deposition Overgrowth Diamond Layers on High Pressure, High Temperature Substrates
Authors: T. N. Tran Thi, J. Morse, C. Detlefs, P. K. Cook, C. Yıldırım, A. C. Jakobsen, T. Zhou, J. Hartwig, V. Zurbig, D. Caliste, B. Fernandez, D. Eon, O. Loto, M. L. Hicks, A. Pakpour-Tabrizi, J. Baruchel
Abstract:
The ability to grow boron-doped diamond epilayers of high crystalline quality is a prerequisite for the fabrication of diamond power electronic devices, in particular high voltage diodes and metal-oxide-semiconductor (MOS) transistors. Boron-doped and intrinsic diamond layers are homoepitaxially overgrown by microwave-assisted chemical vapour deposition (MWCVD) on single crystal high pressure, high temperature (HPHT) grown bulk diamond substrates. Various epilayer thicknesses were grown, with dopant concentrations ranging from 10²¹ atoms/cm³ at nanometre thickness in the case of 'delta doping' to 10¹⁶ atoms/cm³ at 50 µm thickness for high electric field drift regions. The crystalline quality of these overgrown layers as regards defects, strain and distortion is critical for device performance through its relation to the final electrical properties (Hall mobility, breakdown voltage...). In addition to the optimization of the epilayer growth conditions in the MWCVD reactor, other important questions related to the crystalline quality of the overgrown layer(s) are: 1) what is the dependence on the bulk quality and surface preparation methods of the HPHT diamond substrate? 2) how do defects already present in the substrate crystal propagate into the overgrown layer? 3) what types of new defects are created during overgrowth, what are their growth mechanisms, and how can these defects be avoided? 4) how can we relate, in a quantitative manner, parameters describing the measured crystalline quality of the boron-doped layer to the electronic properties of the final processed devices? We describe synchrotron-based techniques developed to address these questions. These techniques allow the visualization of local defects and crystal distortion, which complements the data obtained by other well-established analysis methods such as AFM, SIMS and Hall conductivity measurements. We have used Grazing Incidence X-ray Diffraction (GIXRD) at the ID01 beamline of the ESRF to study lattice parameters and damage (strain, tilt and mosaic spread) both in the near-surface layers of diamond substrates and in thick (10–50 µm) overgrown boron-doped diamond epilayers. Micro- and nano-section topography has been carried out at both the BM05 and ID06 beamlines of the ESRF using rocking curve imaging techniques to study defects which have propagated from the substrate into the overgrown layer(s) and their influence on final electronic device performance. These studies were performed using various commercially sourced HPHT-grown diamond substrates, with the MWCVD overgrowth carried out at the Fraunhofer IAF, Germany. The synchrotron results are in good agreement with low-temperature (5 K) cathodoluminescence spectroscopy carried out on the grown samples using an Inspect F5O FESEM fitted with an IHR spectrometer.
Keywords: synchrotron X-ray diffraction, crystalline quality, defects, diamond overgrowth, rocking curve imaging
Procedia PDF Downloads 261
5636 Aerodynamic Analysis by Computational Fluid Dynamics in Building: Case Study
Authors: Javier Navarro Garcia, Narciso Vazquez Carretero
Abstract:
Eurocode 1, part 1-4, wind actions, includes in its article 1.5 the possibility of using numerical calculation methods to obtain information on the loads acting on a building. On the other hand, analysis using computational fluid dynamics (CFD) is already in widespread use in aerospace, aeronautical, and industrial applications. The application of techniques based on CFD analysis to the study of the aerodynamic behavior of a building now opens a whole alternative field of possibilities for civil engineering and architecture: optimization of the results with respect to those obtained by applying the regulations, the possibility of obtaining information on pressures and velocities at any point of the model at each moment, the analysis of turbulence, and the possibility of modeling any geometry or configuration. The present work compares the results obtained for the aerodynamic behavior of a building from a mathematical model based on CFD analysis with the results obtained by applying Eurocode 1, part 1-4, wind actions. It is verified that the results obtained by CFD techniques represent an optimization of the wind action acting on the building with respect to the wind action obtained by applying Eurocode 1, part 1-4, wind actions. To carry out this verification, a 45 m high truncated pyramid building with a square base was considered. The mathematical CFD model, based on finite volumes, was calculated using the FLUENT commercial application with a scale-resolving simulation (SRS) large eddy simulation (LES) turbulence model for an atmospheric boundary layer wind with a turbulent component in the direction of the flow.
Keywords: aerodynamic, CFD, computational fluid dynamics, computational mechanics
Procedia PDF Downloads 137
5635 Implementation of the Quality Management System and Development of Organizational Learning: Case of Three Small and Medium-Sized Enterprises in Morocco
Authors: Abdelghani Boudiaf
Abstract:
The profusion of studies relating to the concept of organizational learning shows the importance that has been given to this concept in the management sciences. A few years ago, companies leaned towards ISO 9001 certification, which requires the implementation of a quality management system (QMS). For this objective to be achieved, companies must have a set of skills, which pushes them to develop learning through continuous training. The results of empirical research have shown that implementation of a QMS in a company promotes the development of learning. It should also be noted that several types of learning are developed in this way. Since skills development is normative in the context of the quality approach, companies are obliged to qualify and improve the skills of their human resources. Continuous training is the keystone for developing the necessary learning. To carry out continuous training, companies need to be able to identify their real needs by developing training plans based on well-defined engineering. The training process obviously goes through several stages. Initially, training has a general aspect, that is to say, it focuses on topics and actions of a general nature. Subsequently, it is carried out in a more targeted and precise way to accompany the evolution of the QMS and also to implement the changes decided on each time (change of working method, change of practices, change of objectives, change of mentality, etc.). To address our research problem, we opted for a qualitative research method. It should be noted that the case study method combines several data collection techniques to explain and understand a phenomenon. Three company cases were studied as part of this research work using different data collection techniques related to this method.
Keywords: changing mentalities, continuing training, organizational learning, quality management system, skills development
Procedia PDF Downloads 110
5634 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
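The dimensionality-reduction half of the approach described above can be sketched in a few lines of NumPy: build an RBF affinity matrix, normalise it into a Markov transition matrix, and keep the leading non-trivial eigenvectors as the embedding coordinates. The data below are synthetic, and the kernel bandwidth is an illustrative choice; the DPM half of the method is not shown.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2):
    """Minimal diffusion map: RBF kernel, row-stochastic normalisation,
    then the leading non-trivial eigenvectors of the Markov matrix."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / epsilon)                # affinity matrix
    P = K / K.sum(axis=1, keepdims=True)     # row-normalised transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)           # sort eigenvalues in decreasing order
    vals, vecs = vals.real[order], vecs.real[:, order]
    # skip the trivial constant eigenvector (eigenvalue 1)
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1]

# Synthetic high-dimensional-style data lying near a 1-D curve (noisy helix).
t = np.linspace(0, 4 * np.pi, 300)
X = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])
X += 0.02 * np.random.default_rng(0).normal(size=X.shape)

embedding = diffusion_map(X, epsilon=0.5, n_components=2)
print(embedding.shape)   # (300, 2) low-dimensional coordinates
```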
Procedia PDF Downloads 107
5633 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes
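The evaluation loop described above (several regressors, a train/test split, cross-validation, and MSE/R² scoring) maps directly onto scikit-learn. The sketch below uses synthetic stand-ins for the task attributes and omits XGBoost (an external dependency) and hyperparameter tuning; it only illustrates the comparison workflow, not the study's actual dataset or results.

```python
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for task attributes (productivity rate, estimated cost,
# duration, scope-change magnitude, scope-change timing) and the cost impact.
X = rng.normal(size=(400, 5))
y = 2.0 * X[:, 3] + 0.8 * X[:, 0] * X[:, 3] + rng.normal(scale=0.3, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "Linear": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "DecisionTree": DecisionTreeRegressor(random_state=0),
    "RandomForest": RandomForestRegressor(n_estimators=200, random_state=0),
    "GradientBoosting": GradientBoostingRegressor(random_state=0),
}

for name, model in models.items():
    cv = cross_val_score(model, X_tr, y_tr, cv=5, scoring="r2").mean()
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name:17s} CV R2={cv:.3f}  test MSE={mean_squared_error(y_te, pred):.3f}"
          f"  test R2={r2_score(y_te, pred):.3f}")
```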
Procedia PDF Downloads 39
5632 The Colorectal Cancer in Patients of Eastern Algeria
Authors: S. Tebibel, C. Mechati, S. Messaoudi
Abstract:
Algeria is currently experiencing the same rate of cancer progression as that registered in recent years in Western countries. Colorectal cancer, which increasingly constitutes a major public health problem, is the most common form of cancer after breast and cervical cancer in women and prostate cancer in men. Our work is based on a retrospective study of colorectal cancer cases across eastern Algeria. Our goal is to carry out an epidemiological, histological and immunohistochemical study to investigate the different techniques for the diagnosis of colorectal cancer and their respective interest and specificity in detecting the disease. The study includes 110 patients (aged between 20 and 87 years) with colorectal cancer, for whom the inclusion and exclusion criteria were established. In our study, colorectal cancer shows a male predominance, with a sex ratio of 1.99, and the most affected age group is between 50 and 59 years. We noted that the colon cancer rate is higher than the rectal cancer rate, with frequencies of 60.91% and 39.09%, respectively. In the colon cancer series, Lieberkühn adenocarcinoma (ADK) is histologically the most common type, accounting for 85.07% of all cases. In contrast, the proportion of mucinous ADK (mucinous colloid) is only 1.49%. Well-differentiated ADKs are very frequent in our series, representing 83.58% of cases, while moderately and poorly differentiated adenocarcinomas account for 2.99% and 0.05%, respectively. Among the histological varieties of rectal ADK, Lieberkühn ADK is the most common histological form in our cohort, at 76.74%, while mucinous colloid accounts for 13.95%. The search for mutations in the gene encoding K-ras, a major step in the targeted therapy of colorectal cancers, is underway in our study. Colorectal cancer is the subject of much promising research: the evaluation of new therapies (antiangiogenic monoclonal antibodies), the search for predictors of sensitivity to chemotherapy, and new prognostic markers using molecular biology and proteomics techniques.
Keywords: adenocarcinoma, age, colorectal cancer, epidemiology, histological section, sex
Procedia PDF Downloads 344
5631 Dosimetric Comparison among Different Head and Neck Radiotherapy Techniques Using PRESAGE™ Dosimeter
Authors: Jalil ur Rehman, Ramesh C. Tailor, Muhammad Isa Khan, Jahnzeeb Ashraf, Muhammad Afzal, Geofferry S. Ibbott
Abstract:
Purpose: The purpose of this analysis was to investigate the dose distributions of different techniques (3D-CRT, IMRT and VMAT) for head and neck cancer using the 3-dimensional dosimeter called the PRESAGETM dosimeter. Materials and Methods: Computed tomography (CT) scans of the Radiological Physics Center (RPC) head and neck anthropomorphic phantom, with both the RPC standard insert and the PRESAGETM insert, were acquired separately with a Philips CT scanner, and both CT scans were exported via DICOM to the Pinnacle version 9.4 treatment planning system (TPS). Each plan was delivered twice to the RPC phantom, first containing the RPC standard insert with TLD and film dosimeters, and then again containing the PRESAGETM insert with the 3-D dosimeter, using a Varian TrueBeam linear accelerator. After irradiation, the standard insert, including the point dose measurements (TLD) and the planar Gafchromic® EBT film measurement, was read using the RPC standard procedure. The 3D dose distribution from PRESAGETM was read out with the Duke Midsized Optical Scanner dedicated to the RPC (DMOS-RPC). The dose volume histogram (DVH) and the mean and maximal doses for organs at risk were calculated and compared among the head and neck techniques. The prescription dose was the same for all head and neck radiotherapy techniques, 6.60 Gy/fraction. Beam profile comparison and gamma analysis were used to quantify agreement among the film measurement, the PRESAGETM measurement and the calculated dose distribution. Quality assurance of all plans was performed using the ArcCHECK method. Results: VMAT delivered lower mean and maximum doses to the organs at risk (spinal cord, parotid) than IMRT and 3D-CRT. This dose distribution was verified by absolute dose measurements using the thermoluminescent dosimeter (TLD) system. The central axial, sagittal and coronal planes were evaluated using 2D gamma map criteria (±5%/3 mm), and the results were 99.82% (axial), 99.78% (sagittal) and 98.38% (coronal) for the VMAT plan; the agreement between PRESAGE and Pinnacle was better than for the IMRT and 3D-CRT plans, excluding a 7 mm rim at the edge of the dosimeter. Profiles showed good agreement among film, PRESAGE and Pinnacle for all plans, and 3D gamma analysis was performed for the PTV and OARs, with VMAT and 3D-CRT giving better agreement than IMRT. Conclusion: VMAT delivered lower mean and maximal doses to organs at risk and better PTV coverage during head and neck radiotherapy. The TLD, EBT film and PRESAGETM dosimeters suggest that VMAT was better for the treatment of head and neck cancer than IMRT and 3D-CRT.
Keywords: RPC, 3DCRT, IMRT, VMAT, EBT2 film, TLD, PRESAGETM
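The gamma criteria quoted above (±5%/3 mm) combine a dose-difference tolerance and a distance-to-agreement tolerance into a single index. The sketch below implements a deliberately simplified one-dimensional global gamma calculation on hypothetical profiles; the actual 2D/3D gamma maps in the study were computed with dedicated dosimetry software, so this only illustrates the definition of the metric.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.05, dta=3.0):
    """Simplified 1-D global gamma index.
    dd: dose-difference criterion as a fraction of the reference maximum.
    dta: distance-to-agreement criterion in mm."""
    d_norm = dd * ref_dose.max()
    gamma = np.empty_like(ref_dose)
    for i, (r, dr) in enumerate(zip(ref_pos, ref_dose)):
        dist2 = ((eval_pos - r) / dta) ** 2
        dose2 = ((eval_dose - dr) / d_norm) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

# Hypothetical measured (reference) and calculated (evaluated) dose profiles.
x = np.linspace(-50, 50, 201)                        # positions in mm
measured = np.exp(-(x / 30.0) ** 4) * 6.6            # Gy, toy profile
calculated = np.exp(-((x - 0.5) / 30.0) ** 4) * 6.7  # slight shift and scaling

g = gamma_1d(x, measured, x, calculated)
print("gamma passing rate:", np.mean(g <= 1.0) * 100, "%")
```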
Procedia PDF Downloads 395
5630 The Effect of Corporate Governance on Financial Stability and Solvency Margin for Insurance Companies in Jordan
Authors: Ghadeer A.Al-Jabaree, Husam Aldeen Al-Khadash, M. Nassar
Abstract:
This study aimed at investigating the effect of a well-designed corporate governance system on the financial stability of insurance companies listed on the ASE. Further, this study provides a comprehensive model for evaluating and analyzing insurance companies' financial position and prospects, and for comparing the degree of application of corporate governance provisions among Jordanian insurance companies. In order to achieve the goals of the study, the whole population of 27 listed insurance companies was examined through the variables of board of directors, audit committee, internal and external auditors, board and management ownership, and blockholder identities. Statistical methods were applied with alternative techniques using SPSS: descriptive statistics such as means and standard deviations were used to describe the variables, while the F-test and ANOVA (analysis of variance) were used to test the hypotheses of the study. The study revealed a significant effect of the corporate governance variables, except for local companies not listed on the ASE, on financial stability within the control variables, especially the debt ratio (leverage). It also showed that concentration in motor third-party insurance does not have a significant effect on insurance companies' financial stability during the study period. Moreover, the study concludes that the global financial crisis affected the investment side of insurance companies, with an insignificant effect on the technical side. Finally, some recommendations were presented, such as enhancing the laws and regulations that support the appropriate application of corporate governance, working to activate transparency in the disclosure of financial statements, and focusing on supporting the companies' technical provisions rather than focusing only on the profit side.
Keywords: corporate governance, financial stability and solvency margin, insurance companies, Jordan
Procedia PDF Downloads 489
5629 Permeability Prediction Based on Hydraulic Flow Unit Identification and Artificial Neural Networks
Authors: Emad A. Mohammed
Abstract:
The concept of hydraulic flow units (HFU) has been used for decades in the petroleum industry to improve the prediction of permeability. This concept is strongly related to the flow zone indicator (FZI), which is a function of the reservoir rock quality index (RQI). Both indices are based on the porosity and permeability of reservoir core samples. It is assumed that core samples with similar FZI values belong to the same HFU. Thus, after dividing the porosity-permeability data according to HFU, transformations can be applied in order to estimate permeability from porosity. The conventional practice is to use the power-law transformation with conventional HFU, where the percentage of error is considerably high. In this paper, a neural network technique is employed as a soft computing transformation method to predict permeability instead of the power-law method, in order to avoid this higher percentage of error. This technique is based on HFU identification, where the method of Amaefule et al. (1993) is utilized. In this regard, the Kozeny–Carman (K–C) model and the modified K–C model of Hasan and Hossain (2011) are employed. A comparison is made between the two transformation techniques for the two porosity-permeability models. The results show that the modified K–C model helps in obtaining better results, with a lower percentage of error in predicting permeability. The results also show that the use of artificial intelligence techniques gives more accurate predictions than the power-law method. This study was conducted on a heterogeneous, complex carbonate reservoir in Oman. Data were collected from seven wells to obtain the permeability correlations for the whole field. The findings of this study will help in obtaining a better estimation of the permeability of a complex reservoir.
Keywords: permeability, hydraulic flow units, artificial intelligence, correlation
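A compact sketch of the two routes compared above is given below: the RQI/FZI relations of Amaefule et al. (1993), a simplified HFU assignment (quantile binning of log FZI rather than a formal clustering of the field data), the per-HFU power-law back-transform, and a neural-network regressor as the soft-computing alternative. Porosity and permeability values are synthetic, so the printed errors are only illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic core data: porosity (fraction) and permeability (mD).
phi = rng.uniform(0.05, 0.30, 300)
fzi_true = rng.choice([0.5, 1.5, 4.0], size=300)              # three hidden HFUs
k = 1014.0 * fzi_true**2 * phi**3 / (1.0 - phi) ** 2           # Amaefule relation
k *= rng.lognormal(sigma=0.2, size=300)                        # measurement noise

# Hydraulic flow unit indices (Amaefule et al., 1993):
rqi = 0.0314 * np.sqrt(k / phi)        # reservoir quality index
phi_z = phi / (1.0 - phi)              # normalized porosity
fzi = rqi / phi_z                      # flow zone indicator

# Conventional route: assign HFUs (here by quantile binning of log FZI) and
# apply the power-law transform k = 1014 * FZI_mean^2 * phi^3 / (1 - phi)^2.
hfu = np.digitize(np.log(fzi), np.quantile(np.log(fzi), [1/3, 2/3]))
k_power = np.empty_like(k)
for u in np.unique(hfu):
    fzi_mean = fzi[hfu == u].mean()
    k_power[hfu == u] = 1014.0 * fzi_mean**2 * phi[hfu == u]**3 / (1 - phi[hfu == u])**2

# Neural-network alternative: predict log-permeability from porosity and HFU label.
X = np.column_stack([phi, hfu])
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
mlp.fit(X, np.log(k))
k_nn = np.exp(mlp.predict(X))

print("power-law mean abs log error:", np.mean(np.abs(np.log(k_power / k))))
print("neural net mean abs log error:", np.mean(np.abs(np.log(k_nn / k))))
```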
Procedia PDF Downloads 136
5628 3D-printing for Ablation Planning in Patients Undergoing Atrial Fibrillation Ablation: 3D-GALA Trial
Authors: Terentes Printzios Dimitrios, Loanna Gourgouli, Vlachopoulos Charalambos
Abstract:
Aims: Atrial fibrillation (AF) remains one of the major causes of stroke, heart failure, sudden death and cardiovascular morbidity. Ablation techniques are becoming more appealing after the latest results of randomized trials showing an overall clinical benefit. On the other hand, imaging techniques and the frontier application of 3D printing are emerging as valuable allies for cardiac procedures. However, no randomized trial has directly assessed the impact of preprocedural imaging, and especially 3D printing guidance, on AF ablation. The present study is designed to investigate for the first time the effect of 3D printing of the heart on the safety and effectiveness of the ablation procedure. Methods and design: The 3D-GALA trial is a randomized, open-label, controlled, multicentre clinical trial with 2 parallel groups, designed to enroll a total of 100 patients undergoing cryoballoon ablation for paroxysmal and persistent AF. Patients will be randomized with an allocation ratio of 1:1 to preprocedural cardiac MRI and 3D printing of the left atrium and pulmonary veins followed by cryoablation, versus standard cryoablation without imaging. Patients will be followed up to 6 months after the index procedure. The primary outcome measure is the reduction of radiation dose and contrast volume during pulmonary vein isolation. Secondary endpoints will include the percentage of atrial fibrillation relapse on 24-hour Holter electrocardiogram monitoring at 6 months after the initial treatment. Discussion: To our knowledge, the 3D-GALA trial will be the first study to provide evidence about the clinical impact of preprocedural imaging and 3D printing before cryoablation.
Keywords: atrial fibrillation, cardiac MRI, cryoablation, 3-d printing
Procedia PDF Downloads 177
5627 Comparative Analysis of Control Techniques Based Sliding Mode for Transient Stability Assessment for Synchronous Multicellular Converter
Authors: Rihab Hamdi, Amel Hadri Hamida, Fatiha Khelili, Sakina Zerouali, Ouafae Bennis
Abstract:
This paper presents a comparative performance study of sliding mode controllers (SMC) for the closed-loop voltage control of a DC-DC three-cell buck converter connected in parallel, operating in continuous conduction mode (CCM): an SMC based on pulse-width modulation (PWM) versus an SMC based on hysteresis modulation (HM), in which an adaptive feedforward technique is adopted. On the one hand, for the PWM-based SM, the approach is to incorporate a fixed-frequency PWM scheme, which is effectively a variant of SM control. On the other hand, for the HM-based SM, an adaptive feedforward control is introduced that makes the hysteresis band variable in the hysteresis modulator of the SM controller, with the aim of restricting the switching frequency variation in the case of any change in the line input voltage or the output load. The results obtained under load changes, input changes and reference changes clearly demonstrate a similar dynamic response for both proposed techniques; their effectiveness lies in fast and smooth tracking of the desired output voltage. The PWM-based SM technique greatly improves the dynamic behaviour and is slightly advantageous compared with the HM-based SM technique, as well as providing stability under all operating conditions. Simulation studies have been performed in the MATLAB/Simulink environment to verify the concept.
Keywords: DC-DC converter, hysteresis modulation, parallel multi-cells converter, pulse-width modulation, robustness, sliding mode control
Procedia PDF Downloads 167
5626 Advances in Genome Editing and Future Prospects for Sorghum Improvement: A Review
Authors: Micheale Yifter Weldemichael, Hailay Mehari Gebremedhn, Teklehaimanot Hailesslasie Teklu
Abstract:
Recent developments in targeted genome editing have accelerated genetic research and opened new possibilities to improve crops for better yield and quality. Given the significance of cereal crops as a primary source of food for the global population, the utilization of contemporary genome editing techniques such as CRISPR/Cas9 is timely and crucial. CRISPR/Cas technology has enabled targeted genomic modifications, revolutionizing genetic research and exploration. Applying CRISPR/Cas9 gene editing to enhance sorghum is particularly vital given the current ecological, environmental, and agricultural challenges exacerbated by climate change. As sorghum is one of the main staple foods of our region and is known to be a resilient crop with a high potential to overcome these challenges, the application of genome editing technology will enhance the investigation of gene functionality. CRISPR/Cas9 enables the improvement of desirable sorghum traits, including nutritional value, yield, resistance to pests and diseases, and tolerance to various abiotic stresses. Furthermore, CRISPR/Cas9 has the potential to perform intricate editing, reshape existing elite sorghum varieties, and introduce new genetic variation. Current research primarily focuses on improving the efficacy of the CRISPR/Cas9 system in editing endogenous sorghum genes, making it a feasible and successful undertaking in sorghum improvement. Recent advancements in CRISPR/Cas9 techniques have further empowered researchers to modify additional sorghum genes with greater efficiency. Successful application and advancement of CRISPR techniques in sorghum will aid not only in gene discovery and the creation of novel traits that regulate gene expression and functional genomics but also in facilitating site-specific integration events. The purpose of this review is, therefore, to elucidate the current advances in sorghum genome editing and highlight its potential in addressing food security issues. It also assesses the efficiency of CRISPR-mediated improvement and its long-term effects on crop improvement and host resistance against parasites, including tissue-specific activity and the ability to induce resistance. The review ends by emphasizing the challenges and opportunities of CRISPR technology in combating parasitic plants and proposing directions for future research to safeguard global agricultural productivity. Keywords: CRISPR/Cas9, genome editing, quality, sorghum, stress, yield
Procedia PDF Downloads 38
5625 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, namely support vector machine regression (SVMR), partial least squares regression, extra tree regression (ETR), random forest regression, extreme gradient boosting, and a principal component analysis-neural network (PCA-NN), are employed to predict glucose concentration. The NIR spectral data are randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaged scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of otherwise indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding glucose concentration references are measured in increments of 20 mg/dl. The data are randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels. Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
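As an illustration of the evaluation protocol described above (repeated random train/test splits and R² scoring of several regressors), the following is a minimal scikit-learn sketch. It is not the study's code: the synthetic spectra, the model hyperparameters, and the 80/20 split are assumptions standing in for the measured NIR data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

# Synthetic stand-in for measured NIR spectra: 200 samples x 600 wavelengths,
# glucose references in 20 mg/dl increments (illustrative data, not the study's).
rng = np.random.default_rng(0)
glucose = np.repeat(np.arange(0, 400, 20), 10).astype(float)
X = (glucose[:, None] * rng.normal(1.0, 0.02, (glucose.size, 600)) * 1e-3
     + rng.normal(0, 0.01, (glucose.size, 600)))

models = {
    "SVMR": SVR(kernel="rbf", C=100.0),
    "PLSR": PLSRegression(n_components=10),
    "ETR": ExtraTreesRegressor(n_estimators=200, random_state=0),
}

# Repeat the random split ten times, as in the abstract, and average R^2.
for name, model in models.items():
    scores = []
    for seed in range(10):
        Xtr, Xte, ytr, yte = train_test_split(
            X, glucose, test_size=0.2, random_state=seed)
        model.fit(Xtr, ytr)
        scores.append(r2_score(yte, np.ravel(model.predict(Xte))))
    print(f"{name}: mean R^2 = {np.mean(scores):.3f}")
```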
Procedia PDF Downloads 94
5624 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and to report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based and neural network-based recognition, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text against the Office of Foreign Assets Control (OFAC) watchlist, and we discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML. Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
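As a simplified illustration of OCR-assisted watchlist screening, the sketch below extracts text from a scanned document with Tesseract (via pytesseract, assumed to be installed) and fuzzy-matches it against a toy watchlist. The watchlist entries, the input file name, and the 0.85 similarity threshold are illustrative assumptions; a production system would load the full OFAC SDN list and use more robust entity matching.

```python
import difflib
import pytesseract               # assumes the Tesseract OCR engine is installed
from PIL import Image

# Illustrative watchlist excerpt; a real system would load the OFAC SDN list.
WATCHLIST = ["JOHN DOE TRADING LLC", "ACME SHELL HOLDINGS", "JANE Q SMITH"]

def extract_lines(image_path: str) -> list[str]:
    """Run OCR on a scanned document and return its non-empty lines."""
    text = pytesseract.image_to_string(Image.open(image_path))
    return [line.strip().upper() for line in text.splitlines() if line.strip()]

def screen_against_watchlist(lines: list[str], threshold: float = 0.85):
    """Flag OCR lines whose fuzzy similarity to a watchlist entry exceeds the threshold."""
    hits = []
    for line in lines:
        for entry in WATCHLIST:
            ratio = difflib.SequenceMatcher(None, line, entry).ratio()
            if ratio >= threshold:
                hits.append((line, entry, round(ratio, 2)))
    return hits

if __name__ == "__main__":
    lines = extract_lines("scanned_invoice.png")   # hypothetical input file
    for line, entry, score in screen_against_watchlist(lines):
        print(f"possible match: '{line}' ~ '{entry}' (score {score})")
```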
Procedia PDF Downloads 144