Search results for: EEG signals classification
1731 A New Full Adder Cell for High Performance Low Power Applications
Authors: Mahdiar Hosseighadiry, Farnaz Fotovatikhah, Razali Ismail, Mohsen Khaledian, Mehdi Saeidemanesh
Abstract:
In this paper, a new low-power, high-performance full adder is presented based on a new design method. The proposed method relies on pass-gate design and provides full-swing circuits with a minimum number of transistors. The method has been applied to the SUM, COUT and XOR-XNOR modules, resulting in rail-to-rail intermediate and output signals with no feedback transistors. The presented full adder cell has been simulated in 45 nm and 32 nm CMOS technologies using HSPICE, considering parasitic capacitances, and compared to several well-known designs from the literature. In addition, the proposed cell has been extensively evaluated with different output loads, supply voltages, temperatures, threshold voltages, and operating frequencies. Results show that it functions properly under all of these conditions and exhibits a lower power-delay product (PDP) than the other design styles.
Keywords: full adders, low-power, high-performance, VLSI design
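As a minimal sketch of the module decomposition described above (an XOR-XNOR stage feeding separate SUM and COUT modules), the following Python model checks the Boolean behaviour the cell must reproduce; it is an illustration only, and transistor-level details such as pass-gate sizing and full-swing restoration are outside its scope.

```python
from itertools import product

def xor_xnor(a: int, b: int) -> tuple[int, int]:
    """XOR-XNOR module: generates h = a XOR b and its complement."""
    h = a ^ b
    return h, 1 - h

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """SUM and COUT modules driven by the XOR-XNOR outputs (pass-gate style selection)."""
    h, h_bar = xor_xnor(a, b)
    s = (h & (1 - cin)) | (h_bar & cin)   # SUM: cin selects between h and its complement
    cout = (cin & h) | (a & h_bar)        # COUT: h selects between cin and a
    return s, cout

# exhaustive check against the arithmetic definition of a full adder
for a, b, cin in product((0, 1), repeat=3):
    s, cout = full_adder(a, b, cin)
    assert 2 * cout + s == a + b + cin
print("full adder truth table verified")
```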
Procedia PDF Downloads 392
1730 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra tree regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectra data is randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²)> 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data is randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. 
Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
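A minimal sketch of the regression comparison described above is given below. It assumes a spectra matrix X and glucose concentrations y are already available (replaced here with synthetic placeholders) and repeats random train/test splits ten times as the abstract describes; the original work's preprocessing and hyperparameters are not reproduced.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))                              # placeholder NIR spectra (200 samples x 100 wavelengths)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=200)   # placeholder glucose concentrations

models = {
    "SVMR": SVR(),
    "PLSR": PLSRegression(n_components=10),
    "ETR": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
}

# repeat the random split ten times, as in the abstract, and average the correlation coefficient R
for name, model in models.items():
    rs = []
    for seed in range(10):
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=seed)
        model.fit(X_tr, y_tr)
        y_hat = np.ravel(model.predict(X_te))
        rs.append(np.corrcoef(y_te, y_hat)[0, 1])
    print(f"{name}: mean R = {np.mean(rs):.3f}")
```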
Procedia PDF Downloads 97
1729 Improving Fingerprinting-Based Localization (FPL) System Using Generative Artificial Intelligence (GAI)
Authors: Getaneh Berie Tarekegn, Li-Chia Tai
Abstract:
With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and permeate every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, notepads, etc.). However, the environment affects satellite positioning systems, particularly indoors, in dense urban and suburban areas enclosed by skyscrapers, or when deep shadows obscure satellite signals. This is because (1) indoor environments are more complicated due to the many objects surrounding them; (2) reflection within a building is highly dependent on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals often cannot reach indoor environments, as GNSS signals do not have enough power to penetrate building walls. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. These challenges limit IoT applications. Consequently, precise, seamless, and ubiquitous Positioning, Navigation and Timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduces the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improves positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
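To make the fingerprint feature-extraction step concrete, the sketch below embeds hybrid WLAN/LTE received-signal-strength fingerprints with t-SNE, assuming a fingerprint matrix with one row per reference point; the array shapes are placeholders, and the radio-map GAN itself is not reproduced here.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# placeholder: 500 reference points, 40 WLAN APs + 10 LTE cells per fingerprint (RSS in dBm)
fingerprints = rng.uniform(-100, -30, size=(500, 50))

# normalise each feature, then project to a low-dimensional embedding that
# keeps the dominant structure while suppressing measurement noise
scaled = StandardScaler().fit_transform(fingerprints)
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(scaled)

print(embedding.shape)   # (500, 2) low-dimensional fingerprint features
```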
Procedia PDF Downloads 56
1728 Navigating Uncertainties in Project Control: A Predictive Tracking Framework
Authors: Byung Cheol Kim
Abstract:
This study explores a method for the signal-noise separation challenge in project control, focusing on the limitations of traditional deterministic approaches that use single-point performance metrics to predict project outcomes. We detail how traditional methods often overlook future uncertainties, resulting in tracking biases when reliance is placed solely on immediate data without adjustments for predictive accuracy. Our investigation led to the development of the Predictive Tracking Project Control (PTPC) framework, which incorporates network simulation and Bayesian control models to adapt more effectively to project dynamics. The PTPC introduces controlled disturbances to better identify and separate tracking biases from useful predictive signals. We demonstrate the efficacy of the PTPC with examples, highlighting its potential to enhance real-time project monitoring and decision-making and marking a significant shift towards more accurate project management practices.
Keywords: predictive tracking, project control, signal-noise separation, Bayesian inference
Procedia PDF Downloads 27
1727 Evaluation of Methodologies for Measuring Harmonics and Inter-Harmonics in Photovoltaic Facilities
Authors: Anésio de Leles Ferreira Filho, Wesley Rodrigues de Oliveira, Jéssica Santoro Gonçalves, Jorge Andrés Cormane Angarita
Abstract:
The increase in electric power demand in the face of environmental concerns has intensified the participation of renewable energy sources, such as photovoltaics, in the energy matrix of various countries. Due to their operational characteristics, they can generate time-varying harmonic and inter-harmonic distortions. For this reason, measurement methods based on traditional Fourier analysis, as proposed by IEC 61000-4-7, can provide inaccurate results. Considering these aspects, this work presents the results of a comparative evaluation between a methodology arising from the combination of the Prony method with the Kalman filter and another method based on the IEC 61000-4-30 and IEC 61000-4-7 standards. Synthetic signals and data acquired through measurements in a 50 kWp photovoltaic installation were employed in this study.
Keywords: harmonics, inter-harmonics, IEC 61000-4-7, parametric estimators, photovoltaic generation
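As a simplified sketch of the Fourier-based baseline, the snippet below computes harmonic amplitudes from a 10-cycle (200 ms at 50 Hz) DFT window, which gives the 5 Hz bin spacing used by IEC 61000-4-7; the standard's full grouping and smoothing rules, and the Prony/Kalman alternative, are not reproduced. The sampling rate and test signal are assumptions for illustration.

```python
import numpy as np

fs = 10_000          # sampling rate in Hz (assumed)
f1 = 50.0            # fundamental frequency
t = np.arange(int(fs * 0.2)) / fs   # one 10-cycle (200 ms) analysis window

# synthetic test signal: fundamental + 5th harmonic + a 182 Hz inter-harmonic
x = (np.sin(2 * np.pi * f1 * t)
     + 0.08 * np.sin(2 * np.pi * 5 * f1 * t)
     + 0.03 * np.sin(2 * np.pi * 182 * t))

spectrum = np.abs(np.fft.rfft(x)) * 2 / len(x)   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(x), d=1 / fs)        # 5 Hz bin spacing for a 200 ms window

for h in (1, 3, 5, 7):
    k = np.argmin(np.abs(freqs - h * f1))
    print(f"harmonic {h}: amplitude {spectrum[k]:.3f} (at {freqs[k]:.0f} Hz)")
```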
Procedia PDF Downloads 491
1726 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications
Authors: Atish Bagchi, Siva Chandrasekaran
Abstract:
Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but happening at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN)-based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, aiding asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials is planned for this model in other manufacturing industries.
Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning
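As a small illustration of the entropy and spectral-change features mentioned in the method, the sketch below computes a Shannon entropy of the amplitude distribution and a spectral centroid over sliding windows of a sensor series; it is a simplified stand-in under assumed window sizes and sampling rate, not the model's actual feature pipeline.

```python
import numpy as np

def shannon_entropy(window: np.ndarray, bins: int = 16) -> float:
    """Shannon entropy of the amplitude distribution within a window."""
    hist, _ = np.histogram(window, bins=bins)
    p = hist[hist > 0].astype(float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def spectral_centroid(window: np.ndarray, fs: float) -> float:
    """Centre of mass of the window's magnitude spectrum (tracks spectral change)."""
    mag = np.abs(np.fft.rfft(window - window.mean()))
    freqs = np.fft.rfftfreq(len(window), d=1 / fs)
    return float((freqs * mag).sum() / (mag.sum() + 1e-12))

rng = np.random.default_rng(0)
flow = np.sin(np.linspace(0, 200, 5000)) + 0.1 * rng.normal(size=5000)  # placeholder flow-sensor series
fs, win = 1.0, 256                                                      # assumed sampling rate and window length

features = np.array([
    (shannon_entropy(flow[i:i + win]), spectral_centroid(flow[i:i + win], fs))
    for i in range(0, len(flow) - win, win)
])
print(features.shape)   # (n_windows, 2) features that could feed the hybrid forecaster
```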
Procedia PDF Downloads 152
1725 Automatic Target Recognition in SAR Images Based on Sparse Representation Technique
Authors: Ahmet Karagoz, Irfan Karagoz
Abstract:
Synthetic Aperture Radar (SAR) is a radar mechanism that can be integrated into manned and unmanned aerial vehicles to create high-resolution images in all weather conditions, day and night. In this study, SAR images of military vehicles with different azimuth and descent angles are pre-processed in the first stage. The main purpose here is to reduce the strong speckle noise found in SAR images. For this, the Wiener adaptive filter, the mean filter, and the median filter are used to reduce the amount of speckle noise in the images without causing loss of data. During the image segmentation phase, pixel values are ordered so that the target vehicle region is separated from other regions containing unnecessary information. The target image is binarised by setting the brightest 20% of pixel values to 255 and all other pixel values to 0. In addition, a segmentation comparison is performed using appropriate parameters of the statistical region merging algorithm. In the feature extraction step, the feature vectors belonging to the vehicles are obtained by using Gabor filters with different orientation, frequency and angle values. A bank of Gabor filters is created by varying the orientation, frequency and angle parameters in order to extract the important, distinctive features of the images. Finally, the images are classified by the sparse representation method; in this study, the l₁-norm formulation of sparse representation is used. A joint database of the feature vectors generated from the target images of the military vehicle types is assembled side by side and transformed into matrix form. To classify the vehicles, the test image of each vehicle is converted to vector form and the l₁-norm analysis of the sparse representation method is applied against the existing database matrix. As a result, correct recognition is performed by matching the target images of military vehicles with the test images by means of the sparse representation method. A classification success of 97% is obtained on SAR images of the different military vehicle types.
Keywords: automatic target recognition, sparse representation, image classification, SAR images
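The classification step can be illustrated with the usual sparse-representation-based classification recipe: stack training feature vectors as dictionary columns, solve an l₁-regularised coding problem for a test vector, and pick the class with the smallest reconstruction residual. The sketch below uses scikit-learn's Lasso as the l₁ solver and random vectors as placeholders for the Gabor features; it is an assumed, simplified version of the procedure, not the authors' exact solver.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_classes, per_class, dim = 3, 20, 128

# placeholder Gabor feature vectors for each vehicle class, stacked as dictionary columns
dictionary = np.hstack([rng.normal(loc=c, size=(dim, per_class)) for c in range(n_classes)])
labels = np.repeat(np.arange(n_classes), per_class)

def src_classify(test_vec: np.ndarray, alpha: float = 0.01) -> int:
    """Sparse coding via l1 (Lasso), then class-wise reconstruction residuals."""
    code = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000).fit(dictionary, test_vec).coef_
    residuals = []
    for c in range(n_classes):
        mask = labels == c
        residuals.append(np.linalg.norm(test_vec - dictionary[:, mask] @ code[mask]))
    return int(np.argmin(residuals))

test = rng.normal(loc=1, size=dim)          # a placeholder feature vector from class 1
print("predicted class:", src_classify(test))
```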
Procedia PDF Downloads 369
1724 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis
Authors: H. Jung, N. Kim, B. Kang, J. Choe
Abstract:
History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to the uncertainties of initial reservoir models. Therefore, it is important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models that show the most similar or dissimilar well oil production rates (WOPR) with respect to the true values (10% for each). The other 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models which have a similar geological trend to the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by identifying the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. The proposed classification scheme, which integrates PCA, MDS, and SVM for regenerating reservoir models, can easily sort out reliable models which have a similar channel trend to the reference in the lowered dimension space.
Keywords: history matching, principal component analysis, reservoir modelling, support vector machine
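A compact sketch of the PCA → MDS → SVM selection pipeline described above is shown below; permeability fields are replaced with random placeholder data, and the 10%/10% labelling by WOPR error is mimicked with an assumed error vector, so it illustrates the workflow rather than the study's actual models.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
models = rng.normal(size=(100, 2500))        # placeholder: 100 reservoir models, flattened 50x50 permeability
wopr_error = rng.uniform(size=100)           # placeholder mismatch with observed WOPR

# 1) PCA keeps the principal geological variability
scores = PCA(n_components=20).fit_transform(models)
# 2) MDS projects the PCA scores onto a 2-D plane using Euclidean distances
plane = MDS(n_components=2, random_state=0).fit_transform(scores)

# 3) train an SVM on the 10% most similar and 10% most dissimilar models
order = np.argsort(wopr_error)
idx = np.concatenate([order[:10], order[-10:]])
y = np.array([1] * 10 + [0] * 10)            # 1 = low WOPR error, 0 = high
clf = SVC(kernel="rbf").fit(plane[idx], y)

selected = np.where(clf.predict(plane) == 1)[0]   # models kept for regeneration
print(len(selected), "models selected")
```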
Procedia PDF Downloads 163
1723 A Novel Method For Non-Invasive Diagnosis Of Hepatitis C Virus Using Electromagnetic Signal Detection: A Multicenter International Study
Authors: Gamal Shiha, Waleed Samir, Zahid Azam, Premashis Kar, Saeed Hamid, Shiv Sarin
Abstract:
A simple, rapid and non-invasive electromagnetic sensor (the C-FAST device) was patented for the diagnosis of HCV RNA. Aim: To test the validity of the device compared to standard HCV PCR. Subjects and Methods: The first phase was conducted as a pilot in Egypt on 79 participants; the second phase was carried out in five centers: one center from Egypt, two centers from Pakistan and two centers from India (800, 92 and 113 subjects, respectively). The third phase was conducted nationally as a multicenter study on 1600 participants to ensure its representativeness. Results: When compared to the PCR technique, the C-FAST device revealed a sensitivity of 95% to 100%, specificity of 95.5% to 100%, PPV of 89.5% to 100%, NPV of 95% to 100% and positive likelihood ratios of 21.8 to 38.5. Conclusion: This is practical evidence that HCV nucleotides emit electromagnetic signals that can be used for their identification. Compared to PCR, C-FAST is an accurate, valid and non-invasive device.
Keywords: C-FAST (a valid and reliable device), distant cellular interaction, electromagnetic signal detection, non-invasive diagnosis of HCV
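For reference, the reported validity measures can be computed from a 2x2 confusion matrix against PCR as the gold standard, as in the short sketch below; the counts used here are made-up placeholders, not the study's data.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, predictive values and LR+ from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1 - spec) if spec < 1 else float("inf"),
    }

# placeholder counts for illustration only
print(diagnostic_metrics(tp=95, fp=4, fn=5, tn=96))
```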
Procedia PDF Downloads 436
1722 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data
Authors: Sašo Pečnik, Borut Žalik
Abstract:
This paper presents a real-time visualization technique and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable the real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is executed entirely on the GPU, and is implemented using programmable shaders.
Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization
Procedia PDF Downloads 314
1721 Naïve Bayes: A Classical Approach for the Epileptic Seizures Recognition
Authors: Bhaveek Maini, Sanjay Dhanka, Surita Maini
Abstract:
Electroencephalography (EEG) is used worldwide to classify several types of epileptic seizures. Identifying an epileptic seizure by manual EEG analysis is a crucial task for the neurologist, as it takes a lot of effort and time. The risk of human error in EEG is always high, as acquiring the signals needs manual intervention. Disease diagnosis using machine learning (ML) has been explored continuously since its inception, and where large numbers of datasets have to be analyzed, ML acts as a boon for doctors. In this research paper, the authors propose two different ML models, i.e., logistic regression (LR) and Naïve Bayes (NB), to predict epileptic seizures based on general parameters. These two techniques are applied to the epileptic seizure recognition dataset available in the UCI ML repository. The algorithms are implemented with an 80:20 train-test ratio (80% for training and 20% for testing), and the performance of the models is validated by 10-fold cross-validation. The proposed study reports accuracies of 81.87% and 95.49% for LR and NB, respectively.
Keywords: epileptic seizure recognition, logistic regression, Naïve Bayes, machine learning
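A minimal reproduction of the evaluation protocol (80:20 split plus 10-fold cross-validation) with scikit-learn is sketched below; it assumes a feature matrix X and binary seizure labels y loaded from the UCI epileptic-seizure-recognition dataset, which are replaced here with synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 178))              # placeholder for EEG segment features
y = rng.integers(0, 2, size=1000)             # placeholder binary seizure labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, clf in (("LR", LogisticRegression(max_iter=1000)), ("NB", GaussianNB())):
    clf.fit(X_tr, y_tr)
    test_acc = clf.score(X_te, y_te)
    cv_acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: hold-out accuracy {test_acc:.3f}, 10-fold CV accuracy {cv_acc:.3f}")
```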
Procedia PDF Downloads 65
1720 Bandwidth Control Using Reconfigurable Antenna Elements
Authors: Sudhina H. K, Ravi M. Yadahalli, N. M. Shetti
Abstract:
Reconfigurable antennas represent a recent innovation in antenna design, a shift from classical fixed-form, fixed-function antennas to modifiable structures that can be adapted to fit the requirements of a time-varying system. The ability to control the operating band of an antenna system can have many useful applications. Systems that operate in an acquire-and-track configuration would benefit from active bandwidth control. In such systems, a wide-band search mode is first employed to find a desired signal, then a narrow-band track mode is used to follow only that signal. Utilizing active antenna bandwidth control, a single antenna can serve both the wide-band and narrow-band configurations, providing the rejection of unwanted signals within the antenna hardware. This ability to move a portion of the RF filtering out of the receiver and onto the antenna itself will also aid in reducing the complexity of the often expensive RF processing subsystems.
Keywords: designing methods, MEMS, stack, reconfigurable elements
Procedia PDF Downloads 276
1719 Active Features Determination: A Unified Framework
Authors: Meenal Badki
Abstract:
We address the issue of active feature determination, where the objective is to determine the set of examples on which additional data (such as lab tests) need to be gathered, given a large number of examples with some features (such as demographics) and some examples with all the features (such as the complete Electronic Health Record). We note that certain features may be more costly, unique, or laborious to gather. Our proposal is a general active learning approach that is independent of classifiers and similarity metrics. It allows us to identify examples that differ from the fully featured data set and to obtain all the features for the examples that match. Our comprehensive evaluation on four real clinical tasks shows the efficacy of this approach.
Keywords: feature determination, classification, active learning, sample-efficiency
Procedia PDF Downloads 80
1718 Comparative Analysis of SVPWM and the Standard PWM Technique for Three Level Diode Clamped Inverter fed Induction Motor
Authors: L. Lakhdari, B. Bouchiba, M. Bechar
Abstract:
Multi-level inverters represent an important innovation in the field of energy control at high voltage and power. The major advantage of all multi-level inverters is the improved spectral quality of their generated output signals. In recent years, various pulse width modulation techniques have been developed, among them sinusoidal pulse width modulation (SPWM) and space vector pulse width modulation (SVPWM). This work presents a detailed analysis of the comparative advantage of space vector pulse width modulation (SVPWM) over the standard SPWM technique for a three-level diode-clamped inverter feeding an induction motor. The comparison is based on the evaluation of the total harmonic distortion (THD).
Keywords: induction motor, multilevel inverters, SVPWM, SPWM, THD
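As a small, self-contained illustration of the comparison metric, the sketch below computes the THD of a waveform from its spectrum; the crude square wave used as input and the sampling parameters are assumptions for illustration, and the actual three-level inverter waveforms and modulation strategies of the study are not modelled here.

```python
import numpy as np

def thd(signal: np.ndarray, fs: float, f1: float, n_harmonics: int = 40) -> float:
    """Total harmonic distortion: RMS of harmonics 2..N divided by the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = amp(f1)
    harmonics = np.array([amp(h * f1) for h in range(2, n_harmonics + 1)])
    return float(np.sqrt((harmonics ** 2).sum()) / fundamental)

fs, f1 = 20_000, 50.0
t = np.arange(int(fs / f1) * 10) / fs        # ten fundamental cycles
v = np.sign(np.sin(2 * np.pi * f1 * t))      # crude two-level square wave as a stand-in waveform
print(f"THD = {100 * thd(v, fs, f1):.1f} %")
```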
Procedia PDF Downloads 341
1717 Use of Fractal Geometry in Machine Learning
Authors: Fuad M. Alkoot
Abstract:
The main component of a machine learning system is the classifier. Classifiers are mathematical models that can perform classification tasks for a specific application area. Additionally, many classifiers are often combined, using any of the available methods, to reduce the classifier error rate. The benefits gained from combining multiple classifier designs have motivated the development of diverse approaches to multiple classifiers. We aim to investigate the use of fractal geometry to develop an improved classifier combiner. Initially, we experiment with measuring the fractal dimension of data and use the results in the development of a combiner strategy.
Keywords: fractal geometry, machine learning, classifier, fractal dimension
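One common way to measure the fractal dimension of a data set, as mentioned above, is box counting: count occupied boxes at several scales and fit the slope of log(count) against log(1/size). The sketch below applies this to a 2-D point set; it is one of several possible estimators and not necessarily the one the authors use.

```python
import numpy as np

def box_counting_dimension(points: np.ndarray, sizes=(0.5, 0.25, 0.125, 0.0625)) -> float:
    """Estimate the fractal dimension of a point set scaled into the unit square."""
    pts = (points - points.min(axis=0)) / (np.ptp(points, axis=0) + 1e-12)
    counts = []
    for s in sizes:
        boxes = set(map(tuple, np.floor(pts / s).astype(int)))   # occupied boxes at scale s
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)
    return float(slope)

rng = np.random.default_rng(0)
data = rng.uniform(size=(2000, 2))            # a filled square should give a dimension near 2
print(f"estimated fractal dimension: {box_counting_dimension(data):.2f}")
```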
Procedia PDF Downloads 223
1716 Artificial Neural Networks Based Calibration Approach for Six-Port Receiver
Authors: Nadia Chagtmi, Nejla Rejab, Noureddine Boulejfen
Abstract:
This paper presents a calibration approach based on artificial neural networks (ANN) to determine the envelope signal (I+jQ) of a six-port based receiver (SPR). The memory effects (also called dynamic behavior) and the nonlinearity introduced by the diode-based power detectors are taken into consideration by the ANN. An experimental set-up was used to validate the efficiency of this method, which is confirmed by the obtained results in terms of waveforms. Moreover, the error vector magnitude (EVM) and the mean absolute error (MAE) were calculated in order to confirm and test the ANN's ability to achieve I/Q recovery using the output voltage detected by the power detectors. The baseband signal was recovered using the ANN with EVMs no higher than 1% and an MAE no higher than 17.26 when the SPR was excited with different types of signals, such as QAM (quadrature amplitude modulation) and LTE (Long Term Evolution) signals.
Keywords: six-port based receiver, calibration, nonlinearity, memory effect, artificial neural network
Procedia PDF Downloads 81
1715 Arabic Handwriting Recognition Using Local Approach
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Optical character recognition (OCR) plays a major role at the present time. It is capable of solving many serious problems and simplifying human activities. OCR dates back to the 1970s; since then, many solutions have been proposed, but unfortunately they supported little beyond Latin scripts. This work proposes a system for off-line Arabic handwriting recognition. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we also address the normalization problems we went through. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution through a segmentation algorithm.
Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM
Procedia PDF Downloads 79
1714 Health of Riveted Joints with Active and Passive Structural Health Monitoring Techniques
Authors: Javad Yarmahmoudi, Alireza Mirzaee
Abstract:
Many active and passive structural health monitoring (SHM) techniques have been developed for the detection of defects in plates. Generally, riveted joints hold the plates together, and their failure may cause accidents. In this study, well-known active and passive methods were modified for the evaluation of the health of the riveted joints between plates. The active method generated Lamb waves and monitored their propagation by using lead zirconate titanate (PZT) disks; the signals were analyzed using wavelet transformations. The passive method used Fiber Bragg Grating (FBG) sensors and evaluated the spectral characteristics of the signals by using the Fast Fourier Transform (FFT). The results indicated that the existing methods designed for the evaluation of the health of individual plates may be used for the inspection of riveted joints with software modifications.
Keywords: structural health monitoring, SHM, active SHM, passive SHM, fiber Bragg grating sensor, lead zirconate titanate, PZT
Procedia PDF Downloads 332
1713 Hybrid Knowledge Approach for Determining Health Care Provider Specialty from Patient Diagnoses
Authors: Erin Lynne Plettenberg, Jeremy Vickery
Abstract:
In an access-control situation, the role of a user determines whether a data request is appropriate. This paper combines vetted web mining and logic modeling to build a lightweight system for determining the role of a health care provider based only on their prior authorized requests. The model identifies provider roles with 100% recall from very little data. This shows the value of vetted web mining in AI systems, and suggests the impact of the ICD classification on medical practice.
Keywords: electronic medical records, information extraction, logic modeling, ontology, vetted web mining
Procedia PDF Downloads 178
1712 Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that have not evolved in nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task, since it requires discovering the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which is heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, suffer greatly from the vanishing-gradient and low-efficiency problems when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R² accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
Keywords: transformers, generative AI, gene expression design, classification
Procedia PDF Downloads 64
1711 Software Architectural Design Ontology
Authors: Muhammad Irfan Marwat, Sadaqat Jan, Syed Zafar Ali Shah
Abstract:
Software architecture plays a key role in software development, but the absence of a formal description of software architecture causes various impediments in software development. To cope with these difficulties, an ontology has been used as an artifact. This paper proposes an ontology for software architectural design based on the IEEE model for architecture description and the Kruchten 4+1 model for viewpoint classification. For the categorization of styles and views, ISO/IEC 42010 has been used. A corpus method has been used to evaluate the ontology. The main aim of the proposed ontology is to classify and locate software architectural design information.
Keywords: semantic-based software architecture, software architecture, ontology, software engineering
Procedia PDF Downloads 557
1710 Automatic Differential Diagnosis of Melanocytic Skin Tumours Using Ultrasound and Spectrophotometric Data
Authors: Kristina Sakalauskiene, Renaldas Raisutis, Gintare Linkeviciute, Skaidra Valiukeviciene
Abstract:
Cutaneous melanoma is a melanocytic skin tumour which has a very poor prognosis, as it is highly resistant to treatment and tends to metastasize. The thickness of a melanoma is one of the most important biomarkers for the stage of the disease, prognosis and surgery planning. In this study, we hypothesized that the automatic analysis of spectrophotometric images and high-frequency ultrasonic 2D data can improve the differential diagnosis of cutaneous melanoma and provide additional information about tumour penetration depth. This paper presents a novel complex automatic system for non-invasive melanocytic skin tumour differential diagnosis and penetration depth evaluation. The system is composed of region-of-interest segmentation in spectrophotometric images and high-frequency ultrasound data, quantitative parameter evaluation, informative feature extraction and classification with a linear regression classifier. The segmentation of the melanocytic skin tumour region in the ultrasound image is based on parametric integrated backscattering coefficient calculation. The segmentation of the optical image is based on Otsu thresholding. In total, 29 quantitative tissue characterization parameters were evaluated using the ultrasound data (11 acoustical, 4 shape and 15 textural parameters) together with 55 quantitative features of the dermatoscopic and spectrophotometric images (using total melanin, dermal melanin, blood and collagen SIAgraphs acquired with the spectrophotometric imaging device SIAscope). In total, 102 melanocytic skin lesions (including 43 cutaneous melanomas) were examined using the SIAscope and an ultrasound system with a 22 MHz centre frequency single-element transducer. The diagnosis and Breslow thickness (pT) of each melanocytic skin tumour were evaluated during routine histological examination after excision and used as a reference. The results of this study have shown that the automatic analysis of spectrophotometric and high-frequency ultrasound data can improve the non-invasive classification accuracy of early-stage cutaneous melanoma and provide supplementary information about tumour penetration depth.
Keywords: cutaneous melanoma, differential diagnosis, high-frequency ultrasound, melanocytic skin tumours, spectrophotometric imaging
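The optical-image segmentation step is described as Otsu thresholding; a minimal version with scikit-image is sketched below on a synthetic image (the ultrasound segmentation based on the integrated backscattering coefficient is not reproduced, and the image values are placeholders).

```python
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)
# synthetic "lesion" image: darker elliptical region on a brighter background
image = rng.normal(0.7, 0.05, size=(256, 256))
yy, xx = np.mgrid[:256, :256]
image[((yy - 128) / 60) ** 2 + ((xx - 128) / 40) ** 2 < 1] -= 0.3

t = threshold_otsu(image)          # global Otsu threshold
lesion_mask = image < t            # region of interest (darker than the threshold)
print(f"threshold = {t:.3f}, lesion pixels = {int(lesion_mask.sum())}")
```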
Procedia PDF Downloads 272
1709 Detection of Voltage Sag and Voltage Swell in Power Quality Using Wavelet Transforms
Authors: Nor Asrina Binti Ramlee
Abstract:
Voltage sags, voltage swells, high-frequency noise and voltage transients are kinds of disturbances in power quality; they are also known as power quality events. Equipment used in industry has become more sensitive to these events as its complexity increases, which makes delivering clean power to the consumer all the more important. To provide better service, accurate analysis of power quality is vital. Thus, this paper presents event detection focusing on voltage sag and swell. The method is developed by applying time-domain signal analysis using a wavelet transform approach in MATLAB. Four types of mother wavelet, namely Haar, Dmey, Daubechies, and Symlet, are used to detect the events. This project analyzed a real interrupted signal obtained from a 22 kV transmission line in Skudai, Johor Bahru, Malaysia. The signals are decomposed using these mother wavelets, and the best mother wavelet is the one that is capable of detecting the time location of the event accurately.
Keywords: power quality, voltage sag, voltage swell, wavelet transform
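A compact sketch of the detection idea, decomposing the waveform with a mother wavelet and looking for bursts in the detail coefficients at the sag boundaries, is given below using PyWavelets rather than MATLAB; the mother wavelet, level, threshold and synthetic sag are illustrative choices, not the study's exact settings.

```python
import numpy as np
import pywt

fs, f1 = 6400, 50.0
t = np.arange(int(fs * 0.4)) / fs                      # 0.4 s of a 50 Hz waveform
v = np.sin(2 * np.pi * f1 * t)
v[(t >= 0.15) & (t < 0.25)] *= 0.6                     # synthetic 40% voltage sag for 5 cycles

# single-level DWT: detail coefficients spike where the waveform changes abruptly
cA, cD = pywt.dwt(v, "db4")
detail = np.abs(cD)
threshold = 5 * detail.std()
events = np.where(detail > threshold)[0] * 2 / fs      # map coefficient index back to time (approx.)
print("disturbance boundaries detected near t =", np.round(events, 3), "s")
```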
Procedia PDF Downloads 376
1708 Networked Radar System to Increase Safety of Urban Railroad Crossing
Authors: Sergio Saponara, Luca Fanucci, Riccardo Cassettari, Ruggero Piernicola, Marco Righetto
Abstract:
The paper presents an innovative networked radar system for the detection of obstacles in a railway level crossing scenario. This Monitoring System (MS) is able to detect moving or stationary obstacles within the railway level crossing area automatically, avoiding the need for human surveillance. The MS is also connected to the National Railway Information and Signaling System to communicate the level crossing status in real time. The architecture is compliant with the highest Safety Integrity Level (SIL4) of the CENELEC standard. The number of radar sensors used is configurable at set-up time and depends on how large the level crossing area can be: at least two sensors are expected, and up to four can be used for larger areas. The whole processing chain that elaborates the output sensor signals, as well as the communication interface, is fully digital; it was designed in VHDL and implemented on a Xilinx Virtex 6.
Keywords: radar for safe mobility, railroad crossing, railway, transport safety
Procedia PDF Downloads 485
1707 Study of Cavitation Erosion of Pump-Storage Hydro Power Plant Prototype
Authors: Tine Cencič, Marko Hočevar, Brane Širok
Abstract:
An experimental investigation was carried out to detect cavitation in a pump-storage hydro power plant prototype suffering from cavitation in pump mode. Vibrations and acoustic emission on the housing of the turbine bearing, as well as pressure fluctuations in the draft tube, were measured, and the corresponding signals were recorded and analyzed. The analysis was based on the high-frequency content of the measured variables. The pump-storage hydro power plant prototype was operated at various input loads and Thoma numbers. Several cavitation estimators were evaluated according to the coefficient of determination between the Thoma number and each estimator, and the best results were achieved with a compound discharge coefficient cavitation estimator. The cavitation estimators were evaluated over several frequency intervals. In addition, a prediction of cavitation erosion was made in order to choose the appropriate maintenance and repair periods.
Keywords: cavitation erosion, turbine, cavitation measurement, fluid dynamics
Procedia PDF Downloads 420
1706 The Development of User Behavior in Urban Regeneration Areas by Utilizing the Floating Population Data
Authors: Jung-Hun Cho, Tae-Heon Moon, Sun-Young Heo
Abstract:
Many urban problems caused by urbanization and industrialization have occurred around the world. In particular, the creation of satellite towns, attributed to the rapid expansion of cities, has led to traffic problems and the hollowing-out of old towns, raising the need for urban regeneration in old towns along with the aging of existing urban infrastructure. To select urban regeneration priority regions for the strategic execution of urban regeneration in Korea, population size, the number of businesses, and the degree of deterioration were chosen as standards. These existing standards are limited in their ability to solve urban problems fundamentally and to cope with a rapidly changing reality. Therefore, it was necessary to add new indicators that can reflect the decline of the relevant cities and their conditions. In this regard, this study selected Busan Metropolitan City, Korea as the target area, a leading international port city where urban regeneration has been activated, much like Yokohama, Japan. Prior to setting urban regeneration priority regions, real conditions should be reflected, because uniform and uncharacterized projects have been implemented without a quantitative analysis of population behavior within the regions. For this reason, this study conducted a characterization analysis and type classification based on user behaviors, using the representative floating population from big data, which has recently become a hot issue across society. The target areas were analyzed in this study. While the 23 regions of the existing Busan Metropolitan City urban regeneration priority regions were classified into three types, the same 23 regions were classified into four types when the type classification was based on user behaviors. The four types were as follows: type (Ⅰ), young people - morning type; type (Ⅱ), the old and middle-aged - general type with sharply varying floating population; type (Ⅲ), the old and middle-aged - 24-hour type; and type (Ⅳ), the old and middle-aged with little floating population. Each region of the four types showed distinct characteristics, and the results based on user behaviors differed from those of the existing urban regeneration priority regions. According to the results, in type (Ⅰ) young people were the majority around the existing old built-up area, where the floating population at dawn is four times larger than in other areas. In type (Ⅱ), there were many old and middle-aged people around the existing built-up area and general neighborhoods, where the average floating population was larger than in other areas due to commuting, while in type (Ⅲ) there was no change in the floating population throughout 24 hours, although there were many old and middle-aged people around the existing general neighborhoods. Type (Ⅳ) includes the existing economy-based type, central built-up area type, and general neighborhood type, where old and middle-aged people were the majority, with a general commuting pattern and a smaller floating population.
Unlike the existing urban regeneration priority regions, these regions were subdivided according to type, and in this study, approach methods and basic orientations of urban regeneration were set to reflect reality to a certain degree, including indicators of the effective floating population, in order to identify the dynamic activity of urban areas and existing regeneration priority areas in connection with urban regeneration projects by region. It is therefore possible to make effective urban plans by offering a substantial basis built on scientific and quantitative data. To induce more realistic and effective regeneration projects, regeneration projects tailored to the present local conditions should be developed by reflecting those conditions in the formulation of urban regeneration strategic plans.
Keywords: floating population, big data, urban regeneration, urban regeneration priority region, type classification
Procedia PDF Downloads 217
1705 A Deep Learning Approach for the Predictive Quality of Directional Valves in the Hydraulic Final Test
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
The increasing use of deep learning applications in production is becoming a competitive advantage. Predictive quality enables the assurance of product quality by using data-driven forecasts via machine learning models as a basis for decisions on test results. The use of real Bosch production data along the value chain of hydraulic valves is a promising approach to classifying the leakage of directional valves.
Keywords: artificial neural networks, classification, hydraulics, predictive quality, deep learning
Procedia PDF Downloads 251
1704 Learning Example of a Biomedical Project from a Real Problem of Muscle Fatigue
Authors: M. Rezki, A. Belaidi
Abstract:
This paper deals with a method of learning to solve a real problem in biomedical engineering through a technical study of muscle fatigue. Electromyography (EMG) is a technique for evaluating and recording the electrical activity produced by skeletal muscles (from an anatomical and physiological viewpoint). EMG is used as a diagnostic tool for identifying neuromuscular diseases and for assessing low-back pain and muscle fatigue in general. In order to study the EMG signal for detecting fatigue in a muscle, we took a real problem that concerns the tram driver's grip on the handlebar. For the study, we used a typical autonomous platform in order to acquire signals in real time. In our case study, we were confronted with a complex problem in carrying out our experiments in a tram; this type of problem is recurrent among students. To teach our students the method for solving this kind of problem, we built a similar system. Through this study, we achieved several objectives, such as building the simulation equipment, studying the detection of muscle fatigue and, especially, learning how to manage a biomedical study.
Keywords: EMG, health platform, conductor's tram, muscle fatigue
Procedia PDF Downloads 319
1703 Riesz Mixture Model for Brain Tumor Detection
Authors: Mouna Zitouni, Mariem Tounsi
Abstract:
This research introduces an application of the Riesz mixture model to medical image segmentation for the accurate diagnosis and treatment of brain tumors. We propose a pixel classification technique based on the Riesz distribution, derived from an extended Bartlett decomposition. To our knowledge, this is the first study addressing this approach. The Expectation-Maximization (EM) algorithm is implemented for parameter estimation. A comparative analysis, using both synthetic and real brain images, demonstrates the superiority of the Riesz model over a recent method based on the Wishart distribution.
Keywords: EM algorithm, segmentation, Riesz probability distribution, Wishart probability distribution
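Since the Riesz distribution is not available in standard libraries, the sketch below illustrates the same pixel-classification-by-EM idea with a Gaussian mixture as a stand-in: fit the mixture to pixel intensities and label each pixel with its most likely component. It shows the general workflow only, not the Riesz-specific E- and M-steps, and the image is a synthetic placeholder.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic brain slice: background, healthy tissue and a brighter "tumour" blob
image = rng.normal(0.2, 0.03, size=(128, 128))
image[40:90, 30:100] = rng.normal(0.5, 0.04, size=(50, 70))
image[60:75, 60:80] = rng.normal(0.85, 0.03, size=(15, 20))

pixels = image.reshape(-1, 1)
gmm = GaussianMixture(n_components=3, random_state=0).fit(pixels)   # EM runs under the hood
labels = gmm.predict(pixels).reshape(image.shape)

tumour_class = int(np.argmax(gmm.means_.ravel()))   # brightest component taken as the tumour
print("tumour pixels:", int((labels == tumour_class).sum()))
```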
Procedia PDF Downloads 27
1702 Case Studies in Three Domains of Learning: Cognitive, Affective, Psychomotor
Authors: Zeinabsadat Haghshenas
Abstract:
Bloom’s Taxonomy has changed over the years. This paper concerns the revisions that have taken place in both its content and its terminology. It also contains case studies of using the cognitive domain of Bloom’s taxonomy in teaching geometric solids to secondary school students, affective objectives in a creative workshop for adults, and psychomotor objectives in fixing a malfunctioning refrigerator lamp. The important role of classifying objectives in adult education as a way to prevent memory loss is also pointed out.
Keywords: adult education, affective domain, cognitive domain, memory loss, psychomotor domain
Procedia PDF Downloads 473