Search results for: computer assisted classification
2923 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models
Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi
Abstract:
This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and an IoT sensory network. Our approach involves addressing the critical environmental factors essential for preserving a plant’s well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients like nitrogen, phosphorus, and potassium. Central to our methodology is the use of computer vision technology, particularly a night vision camera. The captured data are then compared against a reference database containing different health statuses. This comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By incorporating this AI-based decision-making approach, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.
Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control
Procedia PDF Downloads 54
2922 Assessment of a Coupled Geothermal-Solar Thermal Based Hydrogen Production System
Authors: Maryam Hamlehdar, Guillermo A. Narsilio
Abstract:
To enhance the feasibility of utilising geothermal hot sedimentary aquifers (HSAs) for clean hydrogen production, one approach is the implementation of solar-integrated geothermal energy systems. This detailed modelling study conducts a thermo-economic assessment of an advanced Organic Rankine Cycle (ORC)-based hydrogen production system that uses low-temperature geothermal reservoirs, with a specific focus on hot sedimentary aquifers (HSAs), over a 30-year period. In the proposed hybrid system, solar-thermal energy is used to raise the temperature of the water extracted from the geothermal production well. This temperature increase leads to a higher steam output, powering the turbine and subsequently enhancing the electricity output for running the electrolyser. Thermodynamic modelling of a parabolic trough solar (PTS) collector is developed and integrated with the modelling of a geothermal-based configuration. This configuration includes a closed regenerator cycle (CRC), proton exchange membrane (PEM) electrolyser, and thermoelectric generator (TEG). Following this, the study investigates the impact of solar energy use on the temperature enhancement of the geothermal reservoir. It assesses the resulting consequences on the lifecycle performance of the hydrogen production system in comparison with a standalone geothermal system. The results indicate that, with an appropriate solar collector area, a combined solar-geothermal hydrogen production system outperforms a standalone geothermal system in both cost and rate of production. These findings underscore that a solar-assisted geothermal hybrid system holds the potential to generate lower-cost hydrogen with enhanced efficiency, thereby boosting the appeal of numerous low- to medium-temperature geothermal sources for hydrogen production.
Keywords: clean hydrogen production, integrated solar-geothermal, low-temperature geothermal energy, numerical modelling
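The electricity-to-hydrogen step at the end of the chain can be sketched with the usual energy balance. The efficiency and heating-value figures below are generic assumptions for a PEM electrolyser, not the paper's modelled values:

```python
def hydrogen_rate_kg_per_h(p_elec_kw, electrolyser_eff=0.7,
                           hhv_kwh_per_kg=39.4):
    # mass flow of H2 from electrical power: eta * P / HHV
    # (0.7 efficiency and 39.4 kWh/kg HHV are illustrative assumptions)
    return p_elec_kw * electrolyser_eff / hhv_kwh_per_kg

# e.g. 1 MW of ORC electrical output feeding the PEM electrolyser
rate = hydrogen_rate_kg_per_h(1000.0)
```

Raising the ORC's electrical output with solar heat increases this rate linearly, which is why the hybrid configuration improves the production rate alongside cost.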
Procedia PDF Downloads 69
2921 The Use of Thermal Infrared Wavelengths to Determine the Volcanic Soils
Authors: Levent Basayigit, Mert Dedeoglu, Fadime Ozogul
Abstract:
In this study, an application was carried out to identify volcanic soils using remote sensing. The study area was located on the Golcuk formation in Isparta, Turkey. The thermal bands of a Landsat 7 image were used for processing. A climate model based on the water index was implemented in ERDAS Imagine software together with pixel-based image classification. The Soil Moisture Index (SMI) was modeled using the surface temperature (Ts) obtained from the thermal bands and the vegetation index (NDVI) derived from Landsat 7. Surface moisture values were grouped and classified using a scoring system. The thematic layers were compared with field studies. Consequently, the different moisture levels of volcanic soils proved to be indicators for their identification and separation. These thermal wavelengths are preferable bands for separating volcanic soils using moisture and temperature models.
Keywords: Landsat 7, soil moisture index, temperature models, volcanic soils
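The SMI and scoring steps can be sketched as follows. The abstract does not give the exact formulation used in ERDAS Imagine, so this assumes a common Ts-based SMI definition scaled between wet and dry edges; the class thresholds are illustrative:

```python
def ndvi(nir, red):
    # normalized difference vegetation index from Landsat band reflectances
    return (nir - red) / (nir + red)

def soil_moisture_index(ts, ts_min, ts_max):
    # a common SMI formulation: scale surface temperature Ts between the
    # wet (ts_min) and dry (ts_max) edges of the Ts/NDVI space
    return (ts_max - ts) / (ts_max - ts_min)

def moisture_class(smi):
    # illustrative scoring of SMI into discrete moisture levels
    if smi < 0.33:
        return "low"
    if smi < 0.66:
        return "medium"
    return "high"

smi = soil_moisture_index(ts=300.0, ts_min=290.0, ts_max=310.0)
level = moisture_class(smi)
```

Per-pixel application of these functions over the thermal and NDVI layers would produce the thematic moisture map that is then compared against field observations.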
Procedia PDF Downloads 306
2920 A Five-Year Follow-up Survey Using Regression Analysis Finds Only Maternal Age to Be a Significant Medical Predictor for Infertility Treatment
Authors: Lea Stein, Sabine Rösner, Alessandra Lo Giudice, Beate Ditzen, Tewes Wischmann
Abstract:
For many couples, bearing children is a consistent life goal; however, it cannot always be fulfilled. Undergoing infertility treatment does not guarantee pregnancies and live births. Couples have to deal with miscarriages and sometimes even discontinue infertility treatment. Significant medical predictors for the outcome of infertility treatment have yet to be fully identified. To further our understanding, a cross-sectional five-year follow-up survey was undertaken, in which 95 women and 82 men who had been treated at the Women’s Hospital of Heidelberg University participated. Binary logistic regressions and parametric and non-parametric methods were used to determine the relevance of biological factors (infertility diagnoses, maternal and paternal age) and lifestyle factors (smoking, drinking, overweight and underweight) for the outcome of infertility treatment (clinical pregnancy, live birth, miscarriage, dropout rate). During infertility treatment, 72.6% of couples became pregnant and 69.5% were able to give birth; 27.5% of couples suffered miscarriages, and 20.5% decided to discontinue an unsuccessful fertility treatment. The binary logistic regression models for clinical pregnancies, live births and dropouts were statistically significant for maternal age, whereas paternal age, maternal and paternal BMI, smoking, infertility diagnoses and infections showed no significant predictive effect on any of the outcome variables. The results confirm an effect of maternal age on infertility treatment, whereas the relevance of other medical predictors remains unclear. Further investigations should be considered to increase our knowledge of medical predictors.
Keywords: advanced maternal age, assisted reproductive technology, female factor, male factor, medical predictors, infertility treatment, reproductive medicine
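A minimal sketch of the kind of binary logistic regression used here, fitting treatment success against maternal age. The cohort data and the fitted coefficients below are synthetic and illustrative, not the study's:

```python
import math

def fit_logistic(xs, ys, lr=0.05, epochs=4000):
    # gradient-descent fit of P(y=1) = 1 / (1 + exp(-(b0 + b1*x)))
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# synthetic cohort: treatment success (1) more frequent at younger ages
ages = [26, 28, 30, 31, 33, 34, 36, 38, 40, 42, 44]
success = [1, 1, 1, 1, 1, 1, 0, 1, 0, 0, 0]
# centre the predictor around 35 to aid convergence
b0, b1 = fit_logistic([a - 35 for a in ages], success)
```

A negative slope b1 corresponds to the study's finding that the odds of pregnancy or live birth decrease with maternal age; in practice one would report exp(b1) as an odds ratio with a confidence interval.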
Procedia PDF Downloads 110
2919 Intelligent Prediction of Breast Cancer Severity
Authors: Wahab Ali, Oyebade K. Oyedotun, Adnan Khashman
Abstract:
Breast cancer remains a threat to women in view of its survival rates, early diagnosis, and mortality statistics. So far, research has shown that most survivors of breast cancer are those diagnosed early. Breast cancer is usually categorized into stages which indicate its severity and the corresponding survival rates for patients. Investigations show that the later the stage at diagnosis, the lower the chance of survival; hence the early diagnosis of breast cancer becomes imperative, and consequently the application of novel technologies to achieve it. Over the years, mammograms have been used in the diagnosis of breast cancer, but the inconclusive deductions made from such scans lead either to false negatives, where cancer patients may be left untreated, or to false positives, where unnecessary biopsies are carried out. This paper presents the application of artificial neural networks to the prediction of the severity of a breast tumour (whether benign or malignant) using mammography reports and other factors related to breast cancer.
Keywords: breast cancer, intelligent classification, neural networks, mammography
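As a sketch of the classification idea only (not the authors' trained network or the real mammography features), a tiny feedforward network separating two synthetic feature clusters standing in for "benign" and "malignant" cases:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in for mammography-derived features: two separable clusters
X = np.vstack([rng.normal(0.0, 0.5, (30, 2)),    # "benign"
               rng.normal(3.0, 0.5, (30, 2))])   # "malignant"
y = np.array([0.0] * 30 + [1.0] * 30).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one hidden layer of 4 units, trained with plain gradient descent
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)       # hidden activations
    p = sigmoid(h @ W2 + b2)       # predicted P(malignant)
    dz2 = (p - y) / len(X)         # cross-entropy gradient at the output
    dh = dz2 @ W2.T
    W2 -= lr * (h.T @ dz2); b2 -= lr * dz2.sum(axis=0)
    dz1 = dh * h * (1 - h)         # backpropagate through the hidden layer
    W1 -= lr * (X.T @ dz1); b1 -= lr * dz1.sum(axis=0)

accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

Real mammography features are far less cleanly separable, which is exactly why the false-negative/false-positive trade-off discussed above matters when setting the 0.5 decision threshold.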
Procedia PDF Downloads 487
2918 Computer-Aided Ship Design Approach for Non-Uniform Rational Basis Spline Based Ship Hull Surface Geometry
Authors: Anu S. Nair, V. Anantha Subramanian
Abstract:
This paper presents a surface development and fairing technique combining the features of a modern computer-aided design tool, namely the Non-Uniform Rational Basis Spline (NURBS), with an algorithm to obtain a rapidly faired hull form. Some of the older series-based designs give the sectional area distribution, as in the Wageningen-Lap series. Others, such as FORMDATA, give more comprehensive offset data points. Nevertheless, this basic data still requires fairing to obtain an acceptable faired hull form. This method uses the input of the sectional area distribution as an example and arrives at the faired form. Characteristic section shapes define any general ship hull form in the entrance, parallel mid-body and run regions. The method defines a minimum of control points at each section and, using the golden-section search method or the bisection method, the section shape converges to the one with the prescribed sectional area with a minimized error in the area fit. The section shapes are combined into the evolving faired surface by NURBS, typically in 20 iterations. The advantage of the method is that it is fast and robust and evolves the faired hull form through minimal iterations. The curvature criterion check for the hull lines shows the evolution of the smooth faired surface. The method is applicable to hull forms from any parent series, and the evolved form can be evaluated for hydrodynamic performance, as is done in more modern design practice. The method can handle complex shapes such as that of the bulbous bow. Surface patches developed fit together at their common boundaries with curvature continuity and a fairness check. The development is coded in MATLAB, and an example illustrates the method. The most important advantage is speed: the rapid iterative fairing of the hull form.
Keywords: computer-aided design, methodical series, NURBS, ship design
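The area-fit step can be sketched in isolation. Here a single shape parameter n of an illustrative section with half-breadth y(z) = (B/2)(1 − (z/T)^n) is found by bisection so that the section area matches a prescribed value; the paper's actual control-point parameterization and NURBS surface machinery are richer than this toy form:

```python
def section_area(beam, draft, n, samples=1000):
    # midpoint-rule integral of the full-section area for
    # half-breadth y(z) = (beam/2) * (1 - (z/draft)**n)
    dz = draft / samples
    half = sum((beam / 2) * (1 - ((i + 0.5) * dz / draft) ** n) * dz
               for i in range(samples))
    return 2 * half

def fit_section(beam, draft, target_area, tol=1e-6):
    # bisection on n: area grows monotonically with n toward beam*draft
    lo, hi = 0.05, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if section_area(beam, draft, mid) < target_area:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# example: beam 10 m, draft 5 m, prescribed sectional area 40 m^2
n = fit_section(10.0, 5.0, 40.0)
```

For this parameterization the area is beam × draft × n/(n+1) analytically, so a target of 40 m² should drive n toward 4; the same converge-on-area loop, run per station, yields the control points that NURBS then blends into the faired surface.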
Procedia PDF Downloads 169
2917 An Approach of Computer Modalities for Exploration of Hieroglyphics Substantial in an Investigation
Authors: Aditi Chauhan, Neethu S. Mohan
Abstract:
In the modern era, advancements and the digitalization of technology have transformed crime scene investigation. Rapidly improving investigative techniques have changed the means of identifying suspects. Identification of the person is one of the most significant aspects, and personal authentication is the key to security and reliability in society. Since the early 1990s, people have relied on comparing handwriting through its class and individual characteristics. But in today’s 21st century, we need more reliable means to identify individuals through handwriting. Approaches employing computational modalities have lately proved promising in the examination of handwriting evidence in investigations. Various software packages such as FISH, WRITEON, PIKASO, and the CEDAR-FOX system identify and verify an associated quantitative measure of the similarity between two samples. Research to date has been confined to identifying the authorship of the samples concerned, but the prospects associated with the use of computational modalities might help to identify disguised, forged, or otherwise altered or modified writing. Considering the applications of such models, similar work is sure to attract a plethora of research in the immediate future. It also has a promising role in national security: documents exchanged among terrorists can be brought under surveillance, revealing their source.
Keywords: documents, identity, computational system, suspect
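The "quantitative measure of the similarity between two samples" such systems report is often a distance or cosine score over extracted handwriting features; a sketch with invented feature values (the named systems' actual feature sets and metrics are proprietary):

```python
import math

def cosine_similarity(u, v):
    # cosine of the angle between two feature vectors: 1.0 = identical direction
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# illustrative features: slant (deg), letter height, spacing, pen pressure
questioned = [12.0, 3.1, 1.8, 0.7]
known      = [11.5, 3.0, 1.9, 0.8]
score = cosine_similarity(questioned, known)
```

A score near 1.0 supports common authorship of the questioned and known samples; an examiner would calibrate the decision threshold against a reference population of writers.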
Procedia PDF Downloads 176
2916 Nonmedical Determinants of Congenital Heart Diseases in Children from the Perspective of Mothers: A Qualitative Study in Iran
Authors: Maryam Borjali
Abstract:
Introduction. Mortality due to noncommunicable diseases has increased in the world today with the advent of demographic shifts, growing age, and changing lifestyle patterns, which have been affected by economic and social crises. Congenital heart defects are one form of disease that has raised infant mortality worldwide. The objective of the present study was to identify nonmedical determinants related to this abnormality from the mothers’ perspectives. Methods. This research was a qualitative study, and the data collection method was semistructured interviews with mothers of children with congenital heart diseases referred to the Shahid Rajaei Heart Hospital in Tehran, Iran. A thematic analysis approach was employed to analyze the transcribed documents, assisted by MAXQDA Plus version 12. Results. Four general themes and ten subthemes, including social contexts (social harms, social interactions, and social necessities), psychological contexts (mood disorders and mental well-being), cultural contexts (unhealthy lifestyle, family culture, and poor parental health behaviors), and environmental contexts (living area and polluted air), were extracted from interviews with mothers of children with congenital heart diseases. Conclusions. The results suggest that factors such as childhood poverty, lack of parental awareness of congenital diseases, lack of proper nutrition and health facilities, education, and lack of medical supervision during pregnancy were most strongly related to the birth of children with congenital heart disease from the mothers’ perspective. In this regard, targeted and intersectoral collaborations are proposed to address the nonmedical determinants related to the incidence of congenital heart diseases.
Keywords: congenital_cou, cultural, social, platform
Procedia PDF Downloads 99
2915 Set-point Performance Evaluation of Robust Back-Stepping Control Design for a Nonlinear Electro-Hydraulic Servo System
Authors: Maria Ahmadnezhad, Seyedgharani Ghoreishi
Abstract:
Electrohydraulic servo (EHS) systems have been used in industry in a wide range of applications. Their dynamics are highly nonlinear and also exhibit a large extent of model uncertainty and external disturbance. In this thesis, a robust back-stepping control (RBSC) scheme is proposed to overcome the problem of disturbances and system uncertainties effectively and to improve the set-point performance of EHS systems. In order to implement the proposed control scheme, the system uncertainties in EHS systems are considered to be the total leakage coefficient and the effective oil volume. In addition, in order to obtain the virtual controls for stabilizing the system, the update rule for the system uncertainty term is derived from the Lyapunov control function (LCF). To verify the performance and robustness of the proposed control system, computer simulation is carried out in Matlab/Simulink. The simulations show that the RBSC system produces the desired set-point performance and is robust to the disturbances and system uncertainties of EHS systems.
Keywords: electro-hydraulic servo system, back-stepping control, robust back-stepping control, Lyapunov redesign
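A minimal numerical sketch of robust back-stepping on a generic second-order plant with a bounded disturbance. The actual EHS dynamics, leakage-coefficient and oil-volume uncertainty terms, and the LCF-based update rule from the paper are not reproduced here; the gains and disturbance are illustrative:

```python
import math

def simulate(k1=2.0, k2=2.0, rho=2.0, dt=1e-3, t_end=10.0):
    # plant: x1' = x2, x2' = u + d(t), with |d| <= 0.5 unknown to the controller
    x1, x2 = 1.0, 0.0           # initial offset from the set-point at 0
    t = 0.0
    while t < t_end:
        d = 0.5 * math.sin(t)   # bounded disturbance
        # back-stepping: virtual control alpha = -k1*x1, error z2 = x2 - alpha
        z2 = x2 + k1 * x1
        alpha_dot = -k1 * x2
        # robust term rho*tanh(z2/eps) dominates the disturbance bound
        u = -x1 - k2 * z2 + alpha_dot - rho * math.tanh(z2 / 0.01)
        x1 += dt * x2           # forward-Euler integration of the plant
        x2 += dt * (u + d)
        t += dt
    return x1, x2

x1_final, x2_final = simulate()
```

With the robust gain rho chosen above the disturbance bound, the Lyapunov function V = (x1² + z2²)/2 decreases outside a small neighbourhood of the set-point, which is what drives the state toward zero despite the sinusoidal disturbance.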
Procedia PDF Downloads 1004
2914 Plasma Engineered Nanorough Substrates for Stem Cells in vitro Culture
Authors: Melanie Macgregor-Ramiasa, Isabel Hopp, Patricia Murray, Krasimir Vasilev
Abstract:
Stem cell based therapies are one of the greatest promises of new-age medicine due to their potential to help cure dreaded conditions such as cancer, diabetes and even auto-immune diseases. However, establishing suitable in vitro culture materials that allow control of stem cell fate remains a challenge. Amongst the factors influencing stem cell behavior, substrate chemistry and nanotopography are particularly critical. In this work, we used plasma-assisted surface modification methods to produce model substrates with tailored nanotopography and controlled chemistry. Three different sizes of gold nanoparticles were bound to amine-rich plasma polymer layers to produce homogeneous and gradient surface nanotopographies. The outer chemistry of the substrate was kept constant for all substrates by depositing a thin layer of our patented biocompatible polyoxazoline plasma polymer on top of the nanofeatures. For the first time, protein adsorption and stem cell behaviour (mouse kidney stem cells and mesenchymal stem cells) were evaluated on nanorough plasma-deposited polyoxazoline thin films. Compared to other nitrogen-rich coatings, polyoxazoline plasma polymer supports the covalent binding of proteins. Moderate surface nanoroughness, in both size and density, triggers cell proliferation. In association with the polyoxazoline coating, cell proliferation is further enhanced on nanorough substrates. Results are discussed in terms of the substrates’ wetting properties. These findings provide valuable insights into the mechanisms governing the interactions between stem cells and their growth support.
Keywords: nanotopography, stem cells, differentiation, plasma polymer, oxazoline, gold nanoparticles
Procedia PDF Downloads 280
2913 Antioxidative Maillard Reaction Products Derived from Gelatin Hydrolysate of Unicorn Leatherjacket Skin
Authors: Supatra Karnjanapratum, Soottawat Benjakul
Abstract:
Gelatin hydrolysate, especially from marine resources, has been known to possess antioxidative activity. Nevertheless, the activity is still low in comparison with commercially available antioxidants. Maillard reactions can be used to increase the antioxidative activity of gelatin hydrolysate, in which the numerous amino groups can be involved in glycation. In the present study, gelatin hydrolysate (GH) from unicorn leatherjacket skin, prepared using glycyl endopeptidase with a prior autolysis-assisted process, was used for the preparation of Maillard reaction products (MRPs) under dry conditions. The impacts of different factors, including type of saccharide, GH-to-saccharide ratio, incubation temperature, relative humidity (RH) and time, on the antioxidative activity of the MRPs were investigated. MRPs prepared using the mixture of GH and galactose showed the highest antioxidative activity, as determined by both ABTS radical scavenging activity and ferric reducing antioxidant power, during heating (0-48 h) at 60 °C with 65% RH, compared with those derived from the other saccharides tested. A GH-to-galactose ratio of 2:1 (w/w) yielded the MRPs with the highest antioxidative activity, followed by the ratios of 1:1 and 1:2, respectively. When the effects of incubation temperature (50, 60, 70 °C) and RH (55, 65, 75%) were examined, the highest browning index and absorbance at 280 nm were found at 70 °C, regardless of RH. The pH and free amino group content of the MRPs decreased, with a concomitant increase in antioxidative activity, as the reaction time increased. The antioxidative activity of the MRPs generally increased with increasing temperature, and the highest antioxidative activity was found when an RH of 55% was used. Based on electrophoresis of the MRPs, polymerization along with the formation of high molecular weight material was observed. The optimal condition for preparing antioxidative MRPs was heating the mixture of GH and galactose (2:1) at 70 °C and 55% RH for 36 h. Therefore, the antioxidative activity of GH was improved by the Maillard reaction, and the resulting MRPs could be used as a natural antioxidant in food products.
Keywords: antioxidative activity, gelatin hydrolysate, Maillard reaction, unicorn leatherjacket
Procedia PDF Downloads 248
2912 Bridging the Divide: Mixed-Method Analysis of Student Engagement and Outcomes in Diverse Postgraduate Cohorts
Authors: A. Knox
Abstract:
Student diversity in postgraduate classes poses major challenges for educators seeking to encourage student engagement and desired learning outcomes. This paper outlines the impact of a set of teaching initiatives aimed at addressing challenges associated with teaching and learning in an environment characterized by diversity in the student cohort. The study examines postgraduate students completing the core capstone unit within a specialized business degree. Although relatively small, the student cohort is highly diverse in terms of the cultural backgrounds represented, prior learning and/or qualifications, as well as the duration and type of work experience relevant to the degree. The wide range of cultures, existing knowledge and experience creates enormous challenges with respect to students’ learning needs and outcomes. Subsequently, a suite of teaching innovations has been adopted to enhance curriculum content/delivery and the design of assessments. This paper explores the impact of these specific teaching and learning practices, examining the ways they have supported students’ diverse needs and enhanced students’ learning outcomes. Data from surveys and focus groups are used to assess the effectiveness of these practices. The results highlight the effectiveness of peer-assisted learning, cultural competence-building, and advanced assessment options in addressing diverse student needs and enhancing student engagement and learning outcomes. These findings suggest that such practices would benefit students’ learning in environments marked by diversity in the student cohort. Specific recommendations are offered for other educators working with diverse classes.
Keywords: assessment design, curriculum content, curriculum delivery, student diversity
Procedia PDF Downloads 110
2911 Numerical Study on Parallel Rear-Spoiler on Super Cars
Authors: Anshul Ashu
Abstract:
Computers are applied to vehicle aerodynamics in two ways: Computational Fluid Dynamics (CFD) and Computer Aided Flow Visualization (CAFV). Of the two, CFD is chosen here because it presents results with computer graphics. The simulation of the flow field around the vehicle is one of the important CFD applications. The flow field can be solved numerically using panel methods, the k-ε method, and direct simulation methods. The spoiler is a tool in vehicle aerodynamics used to minimize unfavorable aerodynamic effects around the vehicle, and the parallel spoiler is a set of two spoilers designed in such a manner that it can effectively reduce drag. In this study, the standard k-ε model is used to simulate the external flow field around simplified versions of the Bugatti Veyron, Audi R8 and Porsche 911. The flow simulation is done for variable Reynolds numbers and consists of three different levels: first over the model without a rear spoiler, second over the model with a single rear spoiler, and third over the model with a parallel rear-spoiler. The second and third levels vary the following parameters: the shape of the spoiler, the angle of attack and the attachment position. A thorough analysis of the simulation results has been carried out, and a new parallel spoiler has been designed. It shows a modest improvement in vehicle aerodynamics, with a decrease in aerodynamic drag and lift, and hence leads to better fuel economy and traction force for the model.
Keywords: drag, lift, flow simulation, spoiler
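The payoff the study targets, lower drag force and hence fuel use, follows the standard relation F_D = ½ ρ v² C_d A. A small sketch comparing the three configurations with illustrative drag coefficients (not values from the simulations):

```python
def drag_force(cd, frontal_area_m2, speed_ms, rho=1.225):
    # aerodynamic drag F = 0.5 * rho * v^2 * Cd * A (air at sea level)
    return 0.5 * rho * speed_ms ** 2 * cd * frontal_area_m2

# illustrative coefficients for the three simulated levels (assumed values)
configs = {
    "no spoiler": 0.36,
    "single rear spoiler": 0.35,
    "parallel rear spoiler": 0.34,
}
v = 250 / 3.6   # 250 km/h in m/s
forces = {name: drag_force(cd, 2.0, v) for name, cd in configs.items()}
```

Because drag grows with the square of speed, even a small C_d reduction from the parallel spoiler translates into a noticeable force (and fuel) saving at supercar speeds.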
Procedia PDF Downloads 500
2910 Non Enzymatic Electrochemical Sensing of Glucose Using Manganese Doped Nickel Oxide Nanoparticles Decorated Carbon Nanotubes
Authors: Anju Joshi, C. N. Tharamani
Abstract:
Diabetes is one of the leading causes of death at present and remains an important concern as the prevalence of the disease increases at an alarming rate. It is therefore crucial to diagnose glucose levels accurately in order to develop efficient therapeutics for diabetes. Thanks to convenient and compact self-testing, continuous monitoring of glucose is feasible nowadays. Enzyme-based electrochemical sensing of glucose is quite popular because of its high selectivity, but it suffers from drawbacks such as complicated purification and immobilization procedures, denaturation, high cost, and low sensitivity due to indirect electron transfer. Hence, designing a robust enzyme-free platform using transition metal oxides remains crucial for the efficient and sensitive determination of glucose. In the present work, manganese-doped nickel oxide nanoparticles (Mn-NiO) have been synthesized onto the surface of multiwalled carbon nanotubes using a simple microwave-assisted approach for non-enzymatic electrochemical sensing of glucose. The morphology and structure of the synthesized nanostructures were characterized using scanning electron microscopy (SEM) and X-ray diffraction (XRD). We demonstrate that the synthesized nanostructures show enormous potential for the electrocatalytic oxidation of glucose with high sensitivity and selectivity. Cyclic voltammetry and square wave voltammetry studies suggest superior sensitivity and selectivity of the Mn-NiO decorated carbon nanotubes towards the non-enzymatic determination of glucose. A linear response between the peak current and the concentration of glucose has been found in the concentration range of 0.01 μM-10000 μM, which suggests the potential efficacy of Mn-NiO decorated carbon nanotubes for the sensitive determination of glucose.
Keywords: diabetes, glucose, Mn-NiO decorated carbon nanotubes, non-enzymatic
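The linear current-concentration response reported above is what a calibration routine exploits: fit a line to known standards, then invert it to read unknown samples. A sketch using ordinary least squares on synthetic calibration points (the numbers are illustrative, not the sensor's measured values):

```python
def linear_fit(xs, ys):
    # ordinary least-squares slope and intercept for y = a*x + b
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

# illustrative calibration: peak current (uA) vs glucose concentration (uM)
conc = [10, 100, 1000, 5000, 10000]
current = [0.12, 1.05, 10.4, 51.8, 103.9]
slope, intercept = linear_fit(conc, current)

def estimate_concentration(i_peak):
    # invert the calibration line to read an unknown sample
    return (i_peak - intercept) / slope
```

In practice the sensitivity (slope per electrode area) and the limit of detection would be reported from this fit along with its R² over the 0.01 μM-10000 μM range.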
Procedia PDF Downloads 236
2909 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics
Authors: Hongliang Zhang
Abstract:
The use of chance procedures and randomizers in poetry-writing can be traced back to surrealist works which, by appealing to Sigmund Freud's theories, were still logocentric. In the 1960s, random permutation and combination were used extensively by the Oulipo, John Cage and Jackson Mac Low, which further deconstructed the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext which must be co-authored by readers. At the same time, the classical theories have been updated by cybernetics and media theories. N. Katherine Hayles reworked Jacques Lacan's concept of 'the floating signifier' into 'the flickering signifier', arguing that technology per se has become a part of textual production. This paper makes a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands over the dual tasks of interpretation and writing to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but also raised the issue of the human-machine relationship. It is also a significant feature of cybertext that the productive process of the text is full of randomness.
Keywords: cybertext, digital poetry, poetry generator, semiotics
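A toy illustration of the chance procedure such generators rely on: random selection over a small lexicon, with the seed standing in for the reader's "co-authoring" input. The lexicon and structure here are invented, not drawn from any generator the paper discusses:

```python
import random

# three word slots per line, in the spirit of permutation poetics
SLOTS = [
    ["silver", "hollow", "burning", "quiet"],
    ["rivers", "machines", "letters", "mirrors"],
    ["dissolve", "remember", "flicker", "return"],
]

def generate_poem(seed, n_lines=4):
    # the seed makes the chance procedure reproducible: the same reader
    # input always regenerates the same text
    rng = random.Random(seed)
    lines = []
    for _ in range(n_lines):
        lines.append(" ".join(rng.choice(slot) for slot in SLOTS))
    return "\n".join(lines)

poem = generate_poem(42)
```

The seed-dependence is the semiotic point: the "same" work exists only as a space of possible texts until a reading event fixes one realization.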
Procedia PDF Downloads 175
2908 ExactData Smart Tool for Marketing Analysis
Authors: Aleksandra Jonas, Aleksandra Gronowska, Maciej Ścigacz, Szymon Jadczak
Abstract:
ExactData is a smart tool which helps with meaningful marketing content creation. It helps marketers achieve this by analyzing the text of an advertisement before and after its publication on social media sites like Facebook or Instagram. In our research we focus on four areas of natural language processing (NLP): grammar correction, sentiment analysis, irony detection and advertisement interpretation. Our research has identified a considerable lack of NLP tools for the Polish language that specifically aid online marketers. In light of this, our research team has set out to create a robust and versatile NLP tool for the Polish language. The primary objective of our research is to develop a tool that can perform a range of language processing tasks in this language, such as sentiment analysis, text classification, text correction and text interpretation. Our team has been working diligently to create a tool that is accurate, reliable, and adaptable to the specific linguistic features of Polish, and that can provide valuable insights for a wide range of marketers' needs. In addition to the Polish language version, we are also developing an English version of the tool, which will enable us to expand the reach and impact of our research to a wider audience. Another area of focus in our research involves tackling the challenge of the limited availability of linguistically diverse corpora for non-English languages, which presents a significant barrier to the development of NLP applications. One approach we have been pursuing is the translation of existing English corpora, which would enable us to use the wealth of linguistic resources available in English for other languages. Furthermore, we are looking into other methods, such as gathering language samples from social media platforms. By analyzing the language used in social media posts, we can collect a wide range of data that reflects the unique linguistic characteristics of specific regions and communities, which can then be used to enhance the accuracy and performance of NLP algorithms for non-English languages. In doing so, we hope to broaden the scope and capabilities of NLP applications. Our research focuses on several key NLP techniques, including sentiment analysis, text classification, text interpretation and text correction. To ensure the best possible performance for these techniques, we are evaluating and comparing different approaches and strategies for implementing them, exploring a range of methods, including transformers and convolutional neural networks (CNNs), to determine which are most effective for different types of NLP tasks. By analyzing the strengths and weaknesses of each approach, we can identify the most effective techniques for specific use cases and further enhance the performance of our tool. Our research aims to create a tool which can provide a comprehensive analysis of advertising effectiveness, allowing marketers to identify areas for improvement and optimize their advertising strategies. The results of this study suggest that a smart tool for advertisement analysis can provide valuable insights for businesses seeking to create effective advertising campaigns.
Keywords: NLP, AI, IT, language, marketing, analysis
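One of the listed tasks, sentiment analysis, can be sketched with a simple lexicon-based scorer. The real tool uses transformer and CNN models with Polish-language resources; the English lexicon below is an invented stand-in for illustration only:

```python
# invented mini-lexicon; a production system would learn polarity instead
POSITIVE = {"great", "love", "excellent", "amazing", "best"}
NEGATIVE = {"bad", "terrible", "hate", "worst", "boring"}

def sentiment_score(text):
    # crude polarity: (+1 per positive word, -1 per negative) / word count
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return score / len(words)
```

A marketer-facing tool would run such a scorer over an ad's text before and after publication and compare the polarity against engagement metrics; lexicon methods also make the irony-detection problem visible, since ironic ads score with the wrong sign.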
Procedia PDF Downloads 86
2907 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate between electroencephalogram (EEG) signals recorded in different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the inherent pathways such as the peripheral nervous system or skeletal muscles. Attention level is a common index used as a control signal in BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method achieves better identification of EEG attention level than the original EEG signals between different concentration states. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals appropriate information for discriminating attention from relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain-computer interface, electroencephalography, rehabilitation
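The per-IMF feature inspected above is band power from the FFT spectrum. The EMD sifting itself is omitted here (a library such as PyEMD would typically supply it); the β-band power computation can be sketched directly on a synthetic signal standing in for one IMF:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    # total spectral power of x in the [f_lo, f_hi) band via the FFT
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    return float(power[(freqs >= f_lo) & (freqs < f_hi)].sum())

fs = 256                        # Hz, a typical EEG sampling rate (assumed)
t = np.arange(0, 4, 1.0 / fs)   # one 4 s epoch
# synthetic "attention" epoch: strong 20 Hz (beta) over weak 10 Hz (alpha)
sig = np.sin(2 * np.pi * 20 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)

beta = band_power(sig, fs, 13, 30)
alpha = band_power(sig, fs, 8, 13)
```

In the paper's pipeline this computation would be applied to each IMF rather than the raw epoch, with the β power of IMF3 serving as the attention-level feature.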
Procedia PDF Downloads 3912906 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical sources of data. Using social media data such as tweets for syndromic surveillance has become increasingly popular, aided by open platforms for collecting data and by the microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel is presented using tweets data analytics. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet allocation (LDA) engine is adapted for supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.Keywords: Syndromic surveillance, Tweets, Machine Learning, data mining, Latent Dirichlet allocation (LDA), Influenza
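The LDA step named in this abstract can be sketched with scikit-learn. This minimal example only shows the unsupervised topic-inference step on four invented toy tweets (the paper adapts the LDA engine further for supervised classification, which is not reproduced here).

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Invented toy tweets: two flu-like, two weather-related.
tweets = [
    "fever cough flu symptoms today",
    "bad cough and fever all week",
    "sunny weather in dubai today",
    "great weather at the beach",
]
counts = CountVectorizer().fit_transform(tweets)

# Fit a two-topic LDA model; random_state fixes the stochastic initialisation.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
doc_topics = lda.transform(counts)   # per-tweet topic distribution, rows sum to 1
```

Thresholding or classifying on `doc_topics` is then one way to flag flu-related tweet volume per city.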
Procedia PDF Downloads 1162905 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
In order to solve memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex task. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.Keywords: mutex task generation, data augmentation, meta-learning, text classification
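The core idea — the same feature receiving different labels in different tasks, so no fixed feature-to-label mapping can be memorised — can be sketched as below. The helper `make_mutex_task`, the features, and the permutations are invented for illustration and are not the paper's implementation.

```python
import numpy as np

# Generate a task variant by permuting the label ids: the same feature is
# assigned different labels across tasks, making the tasks mutually exclusive.
def make_mutex_task(features, labels, perm):
    perm = np.asarray(perm)
    return features, perm[np.asarray(labels)]

features = np.array([[0.1], [0.9], [0.5]])
labels = [0, 1, 0]

task_a = make_mutex_task(features, labels, perm=[0, 1])  # identity mapping
task_b = make_mutex_task(features, labels, perm=[1, 0])  # labels swapped
```

In a meta-learning loop, sampling a fresh permutation per task forces the learner to infer the mapping from the support set rather than memorise it.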
Procedia PDF Downloads 1432904 A Weighted Approach to Unconstrained Iris Recognition
Authors: Yao-Hong Tsai
Abstract:
This paper presents a weighted approach to unconstrained iris recognition. Nowadays, commercial systems are usually characterized by strong acquisition constraints based on the subject’s cooperation. However, this is not always achievable in real scenarios in daily life. Researchers have focused on reducing these constraints while maintaining system performance through new techniques. To cope with large variation in the environment, the proposed iris recognition system incorporates two main improvements. First, to handle extremely uneven lighting conditions, statistics-based illumination normalization is applied to the eye region to increase the accuracy of the iris features. The detection of the iris image is based on the AdaBoost algorithm. Second, the weighted approach is designed using Gaussian functions of the distance to the center of the iris. Furthermore, a weighted local binary pattern (LBP) histogram is then applied to texture classification. Experiments showed that the proposed system provides users a more flexible and feasible way to interact with the verification system through iris recognition.Keywords: authentication, iris recognition, adaboost, local binary pattern
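The Gaussian-weighted LBP histogram described above can be sketched in numpy as follows. This is an illustrative reading of the method, not the authors' code: the 8-neighbour LBP variant, the weight `sigma`, and the function names are assumptions.

```python
import numpy as np

# 8-neighbour local binary pattern codes for the interior pixels of an image.
def lbp_image(img):
    c = img[1:-1, 1:-1]
    codes = np.zeros_like(c, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((n >= c).astype(np.uint8) << bit)
    return codes

# Gaussian weight map: pixels near the iris center count more.
def gaussian_weights(shape, sigma):
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    d2 = (yy - cy) ** 2 + (xx - cx) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Histogram of LBP codes, each occurrence weighted by its Gaussian weight.
def weighted_lbp_histogram(img, sigma=10.0):
    codes = lbp_image(img)
    w = gaussian_weights(codes.shape, sigma)
    return np.bincount(codes.ravel(), weights=w.ravel(), minlength=256)

hist = weighted_lbp_histogram(np.arange(1024).reshape(32, 32) % 7)
```

Two irises are then compared by a distance between their weighted histograms.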
Procedia PDF Downloads 2252903 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception
Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu
Abstract:
Opinion mining (OM) is one of the natural language processing (NLP) problems that determines the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM is usually collected from various social media platforms. In an era where social media has considerable control over companies’ futures, it is worth understanding social media and taking action accordingly. OM comes to the fore here as the scale of the discussion about companies increases and it becomes unfeasible to gauge opinion at the individual level. Thus, companies opt to automate this process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, commonly used in computer vision (CV) problems, has lately begun to shape NLP approaches and language models (LMs). This gave a sudden rise to the usage of pretrained language models (PTMs), which contain language representations obtained by training on large datasets using self-supervised learning objectives. The PTMs are further fine-tuned on a specialized downstream-task dataset to produce efficient models for various NLP tasks such as OM, NER (Named-Entity Recognition), Question Answering (QA), and so forth. In this study, the traditional and modern NLP approaches have been evaluated for OM by using a sizable corpus belonging to a large private company containing about 76,000 comments in Turkish: SVM with a bag of n-grams, and two chosen pre-trained models, multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT). 
The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks. It uses masked language modeling and next-sentence prediction tasks that allow bidirectional training of the transformers. During the training phase of the architecture, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since the experiments showed that their contribution to model performance was insignificant even though Turkish is a highly agglutinative and inflective language. The results show that using deep learning methods with pre-trained models and fine-tuning achieves about an 11% improvement over SVM for OM. The BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and SVM around 83%. The MUSE multilingual model shows better results than SVM, but it still performs worse than the monolingual BERT model.Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish
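The traditional baseline named in this abstract — an SVM over bag-of-n-gram features — can be sketched with scikit-learn as below. The four English toy comments are invented stand-ins; the study's actual corpus is 76,000 Turkish comments and is not reproduced here.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# SVM over unigram+bigram features: the classical OM baseline the study
# compares against MUSE and BERT.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # bag of unigrams and bigrams
    LinearSVC(),
)
comments = [
    "excellent service very happy",
    "fast delivery great support",
    "awful experience very unhappy",
    "slow delivery terrible support",
]
polarity = ["positive", "positive", "negative", "negative"]
model.fit(comments, polarity)
```

The fine-tuned PTMs replace both the feature extractor and the classifier in this pipeline, which is where the reported ~11% gain comes from.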
Procedia PDF Downloads 1462902 A Five-Year Experience of Intensity Modulated Radiotherapy in Nasopharyngeal Carcinomas in Tunisia
Authors: Omar Nouri, Wafa Mnejja, Fatma Dhouib, Syrine Zouari, Wicem Siala, Ilhem Charfeddine, Afef Khanfir, Leila Farhat, Nejla Fourati, Jamel Daoud
Abstract:
Purpose and Objective: The intensity modulated radiotherapy (IMRT) technique, associated with induction chemotherapy (IC) and/or concomitant chemotherapy (CC), is currently the recommended treatment modality for nasopharyngeal carcinomas (NPC). The aim of this study was to evaluate the therapeutic results and the patterns of relapse with this treatment protocol. Material and methods: A retrospective monocentric study of 145 patients with NPC treated between June 2016 and July 2021. All patients received IMRT with a simultaneous integrated boost (SIB) of 33 daily fractions at a dose of 69.96 Gy for the high-risk volume, 60 Gy for the intermediate-risk volume and 54 Gy for the low-risk volume. The high-risk volume dose was 66.5 Gy in children. Survival analysis was performed according to the Kaplan-Meier method, and the log-rank test was used to compare factors that may influence survival. Results: Median age was 48 years (11-80) with a sex ratio of 2.9. One hundred and twenty tumors (82.7%) were classified as stages III-IV according to the 2017 UICC TNM classification. Ten patients (6.9%) were metastatic at diagnosis. One hundred and thirty-five patients (93.1%) received IC, 104 of which (77%) were TPF-based (taxanes, cisplatin and 5-fluorouracil). One hundred and thirty-eight patients (95.2%) received CC, mostly cisplatin in 134 cases (97%). After a median follow-up of 50 months [22-82], 46 patients (31.7%) had a relapse: 12 (8.2%) experienced local and/or regional relapse after a median of 18 months [6-43], 29 (20%) experienced distant relapse after a median of 9 months [2-24] and 5 patients (3.4%) had both. Thirty-five patients (24.1%) died, including 5 (3.4%) from a cause other than their cancer. Three-year overall survival (OS), cancer-specific survival, disease-free survival, metastasis-free survival and loco-regional relapse-free survival were respectively 78.1%, 81.3%, 67.8%, 74.5% and 88.1%. Anatomo-clinical factors predicting OS were age > 50 years (88.7 vs. 
70.5%; p=0.004), diabetes history (81.2 vs. 66.7%; p=0.027), UICC N classification (100 vs. 95 vs. 77.5 vs. 68.8% respectively for N0, N1, N2 and N3; p=0.008), a lymph node biopsy (84.2 vs. 57%; p=0.05), and UICC TNM stage III-IV (93.8 vs. 73.6% respectively for stages I-II vs. III-IV; p=0.044). Therapeutic factors predicting OS were the number of CC courses (fewer than 4 courses: 65.8 vs. 86%; p=0.03; fewer than 5 courses: 71.5 vs. 89%; p=0.041), a weight loss > 10% during treatment (84.1 vs. 60.9%; p=0.021) and a total cumulative cisplatin dose, including IC and CC, < 380 mg/m² (64.4 vs. 87.6%; p=0.003). Radiotherapy delay and total duration did not significantly affect OS. No grade 3-4 late side effects were noted in the 127 evaluable patients (87.6%). The most common toxicity was dry mouth, which was grade 2 in 47 cases (37%) and grade 1 in 55 cases (43.3%).Conclusion: IMRT for nasopharyngeal carcinoma has provided a high loco-regional control rate over the last five years. However, distant relapses remain frequent and drive the prognosis. We identified several anatomo-clinical and therapeutic prognostic factors. Therefore, high-risk patients require a more aggressive therapeutic approach, such as radiotherapy dose escalation or the addition of adjuvant chemotherapy.Keywords: therapeutic results, prognostic factors, intensity-modulated radiotherapy, nasopharyngeal carcinoma
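The Kaplan-Meier method used for the survival analysis above can be illustrated with a from-scratch product-limit estimator. The follow-up times and event flags below are invented and unrelated to the study's cohort; this is only a sketch of the estimator itself.

```python
import numpy as np

# Product-limit (Kaplan-Meier) estimator.
# times: follow-up in months; events: 1 = event observed, 0 = censored.
def kaplan_meier(times, events):
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    curve, surv = [], 1.0
    for t, d in zip(times, events):
        if d == 1:
            surv *= (at_risk - 1) / at_risk   # survival drops only at event times
            curve.append((t, surv))
        at_risk -= 1                          # censored subjects leave the risk set
    return curve

# Toy data: events at 6, 12 and 24 months, one censoring at 18 months.
curve = kaplan_meier([6, 12, 18, 24], [1, 1, 0, 1])
```

The log-rank test then compares such curves between the factor levels reported above.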
Procedia PDF Downloads 642901 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with 30% of the original normal-image dataset size, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.Keywords: diffusion models, anomaly detection, data-centric, generative AI
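The one-class setting described above can be illustrated with a deliberately simplified stand-in: a reference set of normal samples, augmented with extra samples, scores a query by its distance to the nearest normal. Here jittered copies replace the diffusion-model generation step, and the feature vectors are invented — this sketches the evaluation setting, not DCADDM itself.

```python
import numpy as np

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=0.1, size=(50, 8))          # "collected" normal features
generated = normal[:5] + rng.normal(scale=0.01, size=(5, 8))   # stand-in for diffusion samples
reference = np.vstack([normal, generated])                     # augmented normal set

# One-class score: distance to the nearest normal sample (high = anomalous).
def anomaly_score(x, reference):
    return np.min(np.linalg.norm(reference - x, axis=1))

typical = np.zeros(8)        # lies inside the normal cluster
outlier = np.full(8, 3.0)    # far from it
```

Whether augmentation helps is then measured by how such scores separate normal from anomalous queries.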
Procedia PDF Downloads 822900 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea
Authors: Pavel Shcherban, Vlad Golovanov
Abstract:
Currently, there is active geological exploration and development of the subsoil of the Kaliningrad region shelf. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality, its viscosity, density, and fractional composition as accurately as possible. For the work considered here, gas chromatography is one of the most productive methods, allowing the rapid generation of a significant amount of initial data. This article examines aspects of applying the gas chromatography method to determine the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, as well as a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, which makes it possible to evaluate the identity of the deposits, to refine the amount of reserves and to make a number of assumptions about the genesis of the hydrocarbons under analysis.Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography
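The correlation-regression step named above amounts to comparing a chemical parameter measured for shelf deposits against the same parameter for onshore deposits. A minimal numpy sketch, with invented paired values (the variable names and the perfectly linear relation are for illustration only):

```python
import numpy as np

# Invented paired measurements of one parameter (e.g. density, g/cm^3).
shelf = np.array([0.84, 0.86, 0.88, 0.90, 0.93])
onshore = 1.02 * shelf - 0.01          # toy linear relation for the example

r = np.corrcoef(shelf, onshore)[0, 1]              # Pearson correlation coefficient
slope, intercept = np.polyfit(shelf, onshore, 1)   # least-squares regression line
```

High correlation and a near-unit slope would support the hypothesis that the shelf and onshore deposits share a common genesis.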
Procedia PDF Downloads 1572899 Classification Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno
Abstract:
The Banda Sea collision zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates. It is located in the eastern part of Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the mean square error (MSE). From these results, we identify the Poisson distribution of earthquakes in the BSCZ with the point process approach. The chi-square test and the Anscombe test are used in the process of identifying a Poisson distribution in the partitioned area. The data used are earthquakes with magnitude ≥ 6 on the Richter scale over the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help the Moluccas Province and surrounding local governments in preparing spatial planning documents related to disaster management.Keywords: Moluccas Banda Sea collision zone, earthquakes, mean square error, Poisson distribution, chi-square test, Anscombe test
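The chi-square goodness-of-fit step can be sketched as follows: observed counts of earthquakes per partition cell are compared with counts expected under a Poisson(λ) model. The observed counts and the rate below are invented for illustration; the final bin lumps the ≥4 tail so expected counts sum to the sample size.

```python
import math
import numpy as np

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

observed = np.array([12, 18, 10, 5, 3])   # toy bins: cells with 0, 1, 2, 3, >=4 events
n = observed.sum()
lam = 1.3                                 # rate estimated elsewhere (assumed here)

expected = np.array([n * poisson_pmf(k, lam) for k in range(4)])
expected = np.append(expected, n - expected.sum())    # lump the >=4 tail

# Chi-square statistic: large values reject the Poisson hypothesis.
chi2 = np.sum((observed - expected) ** 2 / expected)
```

Comparing `chi2` against the chi-square critical value for the appropriate degrees of freedom completes the test.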
Procedia PDF Downloads 3002898 Errors in Selected Writings of EFL Students: A Study of Department of English, Taraba State University, Jalingo, Nigeria
Authors: Joy Aworookoroh
Abstract:
Writing is one of the active skills in language learning. Students of English as a foreign language are expected to write efficiently and proficiently in the language; however, there are usually challenges to optimal performance and competence in writing. Errors in a foreign-language learning situation, on the other hand, are more positive than negative, as they provide a basis for addressing the students' limitations. This paper investigates the situation in the Department of English, Taraba State University, Jalingo. Students were administered a descriptive writing test across different levels of study. The target students are multilingual, with an L1 of Kuteb, Hausa or Junkun. The essays were assessed to identify the different kinds of errors in them and to classify those errors. Errors of correctness, clarity, engagement, and delivery were identified. However, the study found that the degree of errors decreases with the students' experience and exposure to an EFL classroom.Keywords: errors, writings, descriptive essay, multilingual
Procedia PDF Downloads 632897 Research Study on the Environmental Conditions in the Foreign
Authors: Vahid Bairami Rad, Shapoor Norazar, Moslem Talebi Asl
Abstract:
The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing and implementing innovative teaching methods in the classroom. Using teaching methods and technology together produces promising results, because the global technological scenario has paved the way for new pedagogies in the teaching-learning process. At the same time, methods that focus on students and the ways they learn can demonstrate logical ways of improving student achievement in English as a foreign language in Iran. The sample of the study was 90 students of the 10th grade of a high school located in Ardebil. A pretest-posttest equivalent-group design was used to compare the achievement of the groups. Students were divided into three groups: control-based, computer-based, and method-and-technology-based. A pretest and a posttest, each containing 30 items from the English textbook, were developed and administered, and the obtained data were analyzed. The results showed an important difference: the third group performed better than the other groups. On the basis of this result, it is recommended that teaching methods be combined with technology-based environments to improve teaching-learning capabilities.Keywords: method, technology based environment, computer based environment, english as a foreign language, student achievement
Procedia PDF Downloads 4742896 Detection of COVID-19 Cases From X-Ray Images Using Capsule-Based Network
Authors: Donya Ashtiani Haghighi, Amirali Baniasadi
Abstract:
Coronavirus disease (COVID-19) has spread rapidly all over the world since the end of 2019. Computed tomography (CT) scans and X-ray images are used to detect this disease. Different Deep Neural Network (DNN)-based diagnosis solutions have been developed, mainly based on Convolutional Neural Networks (CNNs), to accelerate the identification of COVID-19 cases. However, CNNs lose important information in intermediate layers and require large datasets. In this paper, a Capsule Network (CapsNet) is used instead, as Capsule Networks perform better than CNNs on small datasets. An accuracy of 0.9885, an f1-score of 0.9883, a precision of 0.9859, a recall of 0.9908, and an Area Under the Curve (AUC) of 0.9948 are achieved with the Capsule-based framework after hyperparameter tuning. Moreover, different dropout rates are investigated to decrease overfitting; a dropout rate of 0.1 shows the best results. Finally, we remove one convolution layer and decrease the number of trainable parameters to 146,752, which is a promising result.Keywords: capsule network, dropout, hyperparameter tuning, classification
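The dropout regulariser whose rate is tuned above can be illustrated in numpy with the standard "inverted dropout" formulation: at train time each activation is kept with probability 1 − rate and scaled by 1/(1 − rate), so the expected activation is unchanged and nothing needs rescaling at evaluation time. This is a generic sketch, not the paper's Capsule-Network code.

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    if not training or rate == 0.0:
        return x                      # evaluation: identity
    keep = 1.0 - rate
    mask = rng.random(x.shape) < keep
    return x * mask / keep            # zero some units, rescale the rest

rng = np.random.default_rng(0)
activations = np.ones((4, 4))
train_out = dropout(activations, rate=0.1, rng=rng)                 # some units zeroed
eval_out = dropout(activations, rate=0.1, rng=rng, training=False)  # unchanged
```

Sweeping `rate` over a small grid (as the paper does, settling on 0.1) is a standard way to trade off under- and over-regularisation.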
Procedia PDF Downloads 782895 Tracking Performance Evaluation of Robust Back-Stepping Control Design for a Nonlinear Electro-Hydraulic Servo System
Authors: Maria Ahmadnezhad, Mohammad Reza Soltanpour
Abstract:
Electrohydraulic servo (EHS) systems have been used in industry in a wide range of applications. Their dynamics are highly nonlinear and subject to a large extent of model uncertainties and external disturbances. In this thesis, a robust back-stepping control (RBSC) scheme is proposed to overcome the problem of disturbances and system uncertainties effectively and to improve the tracking performance of EHS systems. In order to implement the proposed control scheme, the system uncertainties in EHS systems are modeled as the total leakage coefficient and the effective oil volume. In addition, to obtain the virtual controls for stabilizing the system, the update rule for the system uncertainty term is derived from the Lyapunov control function (LCF). To verify the performance and robustness of the proposed control system, computer simulations of the proposed control system were executed in MATLAB/Simulink. The simulations showed that the RBSC system produces the desired tracking performance and is robust to the disturbances and system uncertainties of EHS systems.Keywords: electro hydraulic servo system, back-stepping control, robust back-stepping control, Lyapunov redesign
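The Lyapunov-based tracking idea underlying back-stepping can be illustrated on a toy first-order system (not the paper's EHS model or its full multi-step design): for dx/dt = f(x) + u, choosing u = −f(x) + ẋd − k·e with tracking error e = x − xd gives de/dt = −k·e, so the Lyapunov function V = e²/2 decays and the error shrinks. The drift `f`, trajectory, and gains below are invented for the sketch.

```python
import numpy as np

def simulate(k=5.0, dt=1e-3, steps=4000):
    f = lambda x: -0.5 * x + np.sin(x)    # invented nonlinear drift
    xd = lambda t: np.sin(t)              # desired trajectory
    xd_dot = lambda t: np.cos(t)
    x, errors = 1.5, []
    for i in range(steps):
        t = i * dt
        e = x - xd(t)
        u = -f(x) + xd_dot(t) - k * e     # cancel the drift, inject -k*e
        x += dt * (f(x) + u)              # forward-Euler integration
        errors.append(abs(e))
    return errors

errors = simulate()                       # |e| decays roughly as exp(-k*t)
```

Back-stepping applies this construction recursively through the state chain of a higher-order system, with the uncertainty update rule handling the terms that cannot be cancelled exactly.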
Procedia PDF Downloads 2962894 Food Bolus Obstruction: A Rural Hospital’s Experience
Authors: Davina Von Hagt, Genevieve Gibbons, Matt Henderson, Tom Bowles
Abstract:
Purpose: Food bolus obstructions are common emergency surgical presentations, but there is no established management guideline in a rural setting. Intervention usually involves endoscopic removal after initial medical management has failed; within a rural setting, this falls upon the general surgeon, and varied endoscopic techniques may be used. Methodology: The past fifty cases of food bolus obstruction managed at Albany Health Campus were retrospectively reviewed to assess endoscopic findings and techniques. Operation notes, histopathology, imaging, and patient notes were reviewed. Results: 50 patients underwent gastroscopy for food bolus obstruction from August 2017 to March 2021. Ages ranged from 11 months to 95 years, with the majority of patients aged between 30 and 70 years; 88% of patients were male. Meat was the most common bolus (20% unspecified, 20% steak, 10% chicken, 6% lamb, 4% sausage, 2% pork). At endoscopy, 12% were found not to have a food bolus obstruction. Two patients were found to have oesophageal cancer, and four patients had a stricture and required dilatation. A variety of methods were used to relieve the oesophageal obstruction, ranging from pushing the bolus through to the stomach (24 patients) to using an overtube (10 patients), a raptor (13 patients), and less common instruments such as a Roth net, basket, guidewire, and pronged grasper. One patient had an unsuccessful endoscopic retrieval and required theatre for laparoscopic-assisted removal with rendezvous endoscopic piecemeal removal via the oesophagus and gastrostomy. Conclusion: Food bolus obstruction is a common emergency presentation. Within the rural setting, management requires innovation and teamwork within the safe limits of local experience.Keywords: food bolus obstruction, regional hospital, surgical management, innovative surgical treatment
Procedia PDF Downloads 267