Search results for: gaussian process classification model with multiclass
29260 Accuracy Analysis of the American Society of Anesthesiologists Classification Using ChatGPT
Authors: Jae Ni Jang, Young Uk Kim
Abstract:
Background: Chat Generative Pre-training Transformer-3 (ChatGPT; San Francisco, California, Open Artificial Intelligence) is an artificial intelligence chatbot based on a large language model designed to generate human-like text. As the usage of ChatGPT is increasing among less knowledgeable patients, medical students, and anesthesia and pain medicine residents or trainees, we aimed to evaluate the accuracy of ChatGPT-3 responses to questions about the American Society of Anesthesiologists (ASA) classification based on patients’ underlying diseases and assess the quality of the generated responses. Methods: A total of 47 questions were submitted to ChatGPT using textual prompts. The questions were designed for ChatGPT-3 to provide answers regarding ASA classification in response to common underlying diseases frequently observed in adult patients. In addition, we created 18 questions regarding the ASA classification for pediatric patients and pregnant women. The accuracy of ChatGPT’s responses was evaluated by cross-referencing with Miller’s Anesthesia, Morgan & Mikhail’s Clinical Anesthesiology, and the American Society of Anesthesiologists’ ASA Physical Status Classification System (2020). Results: Out of the 47 questions pertaining to adults, ChatGPT-3 provided correct answers for only 23, resulting in an accuracy rate of 48.9%. Furthermore, the responses provided by ChatGPT-3 regarding children and pregnant women were mostly inaccurate, as indicated by a 28% accuracy rate (5 out of 18). Conclusions: ChatGPT provided correct responses to questions relevant to the daily clinical routine of anesthesiologists in approximately half of the cases, while the remaining responses contained errors. Therefore, caution is advised when using ChatGPT to retrieve anesthesia-related information. Although ChatGPT may not yet be suitable for clinical settings, we anticipate significant improvements in ChatGPT and other large language models in the near future. Regular assessments of ChatGPT's ASA classification accuracy are essential due to the evolving nature of ChatGPT as an artificial intelligence entity. This is especially important because ChatGPT has a clinically unacceptable rate of error and hallucination, particularly in pediatric patients and pregnant women. The methodology established in this study may be used to continue evaluating ChatGPT. Keywords: American Society of Anesthesiologists, artificial intelligence, Chat Generative Pre-training Transformer-3, ChatGPT
Procedia PDF Downloads 462
29259 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components
Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea
Abstract:
Students' textual feedback can hold unique patterns and useful information about the learning process; it can reveal the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can form a key point for institutions’ decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using the part of speech (PoS) feature with four machine learning algorithms: support vector machines, decision tree, random forest, and naive Bayes. The proposed framework has two tasks: first, to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets, one subset tailored to assessment practices (assessment related) and the other containing the non-assessment related data. The second task is to use the same algorithms to build an optimal model for the whole data set, and for the new data subsets, to automatically detect their sentiment. The significance of this paper is to compare the performance of the above four algorithms using the part of speech feature with the performance of the same algorithms using the n-grams feature. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models, which consists of understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithm, interpreting mined patterns, and consolidating the discovered knowledge. The experiments in this paper show that the models which used both features performed very well on the first task. However, for the second task, models that used the part of speech feature underperformed in comparison with models that used unigrams and bigrams. Keywords: assessment, part of speech, sentiment analysis, student feedback
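A minimal sketch of the PoS-feature pipeline described in this abstract, written with NLTK and scikit-learn (both assumptions, not the authors' tooling); the feedback comments and labels below are hypothetical placeholders.

```python
# Sketch (assumed tooling): represent each feedback comment by its part-of-speech tag
# sequence and compare the four classifiers named in the abstract.
import nltk
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Newer NLTK versions may require additional data packages (e.g. punkt_tab).
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def pos_features(text: str) -> str:
    """Replace each token with its PoS tag, e.g. 'The labs were great' -> 'DT NNS VBD JJ'."""
    return " ".join(tag for _, tag in nltk.pos_tag(nltk.word_tokenize(text)))

# Hypothetical labelled feedback: 1 = assessment related, 0 = not assessment related.
feedback = ["The coursework deadline was far too tight",
            "Exam questions did not match the lecture material",
            "The assignment rubric was clear and fair",
            "The lecture room was always too cold",
            "Parking near the campus is a nightmare",
            "The library opening hours are too short"]
labels = [1, 1, 1, 0, 0, 0]

pos_texts = [pos_features(t) for t in feedback]
for name, clf in [("SVM", SVC()), ("DT", DecisionTreeClassifier()),
                  ("RF", RandomForestClassifier()), ("NB", MultinomialNB())]:
    model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), clf)
    print(name, cross_val_score(model, pos_texts, labels, cv=3).mean())
```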
Procedia PDF Downloads 142
29258 Process Modeling and Problem Solving: Connecting Two Worlds by BPMN
Authors: Gionata Carmignani, Mario G. C. A. Cimino, Franco Failli
Abstract:
Business Processes (BPs) are the key instrument to understand how companies operate at an organizational level, taking an as-is view of the workflow, and how to address their issues by identifying a to-be model. In recent years, the BP Model and Notation (BPMN) has become a de facto standard for modeling processes. However, this standard does not incorporate explicitly the Problem-Solving (PS) knowledge in the Process Modeling (PM) results. Thus, such knowledge cannot be shared or reused. Narrowing this gap is today a challenging research area. In this paper we present a framework able to capture the PS knowledge and to improve a workflow. This framework extends the BPMN specification by incorporating new general-purpose elements. A pilot scenario is also presented and discussed. Keywords: business process management, BPMN, problem solving, process mapping
Procedia PDF Downloads 411
29257 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece
Authors: N. Samarinas, C. Evangelides, C. Vrekos
Abstract:
The aim of this paper is the comparison of three different methods for producing fuzzy tolerance relations for rainfall data classification. More specifically, the three methods are the correlation coefficient, cosine amplitude, and max-min method. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of average monthly rainfall height. The three methods were used to express these data as a fuzzy relation. This fuzzy tolerance relation is reformed into an equivalence relation with max-min composition for all three methods. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations. Stations with high similarity can be utilized in water resource management scenarios interchangeably or to augment data from one to another. Due to the complexity of calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions in order to give reliable results. Keywords: classification, fuzzy logic, tolerance relations, rainfall data
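The cosine amplitude and max-min composition steps can be sketched in a few lines of NumPy; the rainfall matrix below is a synthetic stand-in for the seven-station, 20-year data set.

```python
# Sketch (assumed data): build a fuzzy tolerance relation with the cosine amplitude method,
# then reform it into an equivalence relation by repeated max-min composition.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(20, 120, size=(7, 12))      # hypothetical monthly rainfall averages, 7 stations

def cosine_amplitude(X):
    """r_ij = |sum_k x_ik x_jk| / sqrt(sum_k x_ik^2 * sum_k x_jk^2)."""
    num = np.abs(X @ X.T)
    norms = np.sqrt(np.sum(X**2, axis=1))
    return num / np.outer(norms, norms)

def max_min_composition(R, S):
    """(R o S)_ij = max_k min(R_ik, S_kj)."""
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

R = cosine_amplitude(X)                      # fuzzy tolerance relation (reflexive, symmetric)
T = R.copy()
while True:                                  # compose until transitive -> equivalence relation
    T2 = max_min_composition(T, T)
    if np.allclose(T2, T):
        break
    T = T2

alpha = 0.95                                 # degree of confidence (lambda-cut)
print((T >= alpha).astype(int))              # stations in the same block are grouped together
```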
Procedia PDF Downloads 314
29256 Spectroscopic, Molecular Structure and Electrostatic Potential, Polarizability, Hyperpolarizability, and HOMO–LUMO Analysis of Monomeric and Dimeric Structures of N-(2-Methylphenyl)-2-Nitrobenzenesulfonamide
Authors: A. Didaoui, N. Benhalima, M. Elkeurti, A. Chouaih, F. Hamzaoui
Abstract:
The monomer and dimer structures of the title molecule have been obtained from density functional theory (DFT) B3LYP method with 6-31G(d,p) as basis set calculations. The optimized geometrical parameters obtained by the B3LYP/6-31G(d,p) method show good agreement with experimental X-ray data. The polarizability and first order hyperpolarizability of the title molecule were calculated and interpreted. The intermolecular N–H•••O hydrogen bonds are discussed in the dimer structure of the molecule. The vibrational wave numbers and their assignments were examined theoretically using the Gaussian 03 set of quantum chemistry codes. The predicted frontier molecular orbital energies at the B3LYP/6-31G(d,p) level show that charge transfer occurs within the molecule. The frontier molecular orbital calculations clearly show the inverse relationship of the HOMO–LUMO gap with the total static hyperpolarizability. The results also show that the N-(2-Methylphenyl)-2-nitrobenzenesulfonamide molecule may have nonlinear optical (NLO) comportment with non-zero values. Keywords: DFT, Gaussian 03, NLO, N-(2-Methylphenyl)-2-nitrobenzenesulfonamide
Procedia PDF Downloads 549
29255 Stochastic Control of Decentralized Singularly Perturbed Systems
Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan
Abstract:
Designing a controller for stochastic decentralized interconnected large scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are other constraints on design procedures for distributed large scale systems. The quasi-steady state model investigated in this paper is a reduced order model of the original system obtained using singular perturbation techniques. This paper presents an optimal control synthesis to design an observer-based feedback controller by standard stochastic control theory techniques, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with less complexity and lower computation requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method. Keywords: decentralized, optimal control, output, singular perturbation
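For illustration only, the LQG design step can be sketched with SciPy on a hypothetical reduced-order (quasi-steady-state) model; the matrices below are not the paper's system.

```python
# Sketch: LQR state-feedback gain and Kalman filter gain via the continuous-time Riccati
# equations, on an assumed slow subsystem obtained after singular-perturbation reduction.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -3.0]])     # hypothetical slow-subsystem dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q, R = np.eye(2), np.array([[1.0]])          # state / control weights
W, V = 0.1 * np.eye(2), np.array([[0.01]])   # process / measurement noise covariances

# LQR gain for u = -Kx.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman filter gain (dual Riccati equation): L = Pf C^T V^{-1}.
Pf = solve_continuous_are(A.T, C.T, W, V)
L = Pf @ C.T @ np.linalg.inv(V)

print("LQR gain K =", K)
print("Kalman gain L =", L)
```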
Procedia PDF Downloads 367
29254 Process Assessment Model for Process Capability Determination Based on ISO/IEC 20000-1:2011
Authors: Harvard Najoan, Sarwono Sutikno, Yusep Rosmansyah
Abstract:
Most enterprises are now using information technology services as their assets to support business objectives. These kinds of services are provided by an internal service provider (inside the enterprise) or an external service provider (outside the enterprise). To deliver quality information technology services, the service provider (which from now on will be called ‘organization’), either internal or external, must have a standard for its service management system. At present, the standard that is recognized as best practice for the service management system of an organization is the international standard ISO/IEC 20000:2011. The most important part of this international standard is the first part, ISO/IEC 20000-1:2011 – Service Management System Requirements, because it contains 22 processes as requirements to be implemented in an organizational environment in order to build, manage and deliver quality service to the customer. Assessing organization management processes is the first step to implementing ISO/IEC 20000:2011 into the organization management processes. This assessment needs a Process Assessment Model (PAM) as an assessment instrument. PAM comprises two parts: a Process Reference Model (PRM) and a Measurement Framework (MF). The PRM is built by transforming the 22 processes of ISO/IEC 20000-1:2011, and the MF is based on ISO/IEC 33020. This assessment instrument was designed to assess the capability of the service management process in Divisi Teknologi dan Sistem Informasi (Information Systems and Technology Division), an internal organization of PT Pos Indonesia. The result of this assessment model can be proposed to improve the capability of the service management system. Keywords: ISO/IEC 20000-1:2011, ISO/IEC 33020:2015, process assessment, process capability, service management system
Procedia PDF Downloads 464
29253 Generating Synthetic Chest X-ray Images for Improved COVID-19 Detection Using Generative Adversarial Networks
Authors: Muneeb Ullah, Daishihan, Xiadong Young
Abstract:
Deep learning plays a crucial role in identifying COVID-19 and preventing its spread. To improve the accuracy of COVID-19 diagnoses, it is important to have access to a sufficient number of training images of CXRs (chest X-rays) depicting the disease. However, there is currently a shortage of such images. To address this issue, this paper introduces COVID-19 GAN, a model that uses generative adversarial networks (GANs) to generate realistic CXR images of COVID-19, which can be used to train identification models. Initially, a generator model is created that uses digressive channels to generate images of CXR scans for COVID-19. To differentiate between real and fake disease images, an efficient discriminator is developed by combining the dense connectivity strategy and instance normalization. This approach makes use of their feature extraction capabilities on CXR hazy areas. Lastly, the deep regret gradient penalty technique is utilized to ensure stable training of the model. With the use of 4,062 grape leaf disease images, the Leaf GAN model successfully produces 8,124 COVID-19 CXR images. The COVID-19 GAN model produces COVID-19 CXR images that outperform DCGAN and WGAN in terms of the Fréchet inception distance. Experimental findings suggest that the COVID-19 GAN-generated CXR images possess noticeable haziness, offering a promising approach to address the limited training data available for COVID-19 model training. When the dataset was expanded, CNN-based classification models outperformed other models, yielding higher accuracy rates than those of the initial dataset and other augmentation techniques. Among these models, ImagNet exhibited the best recognition accuracy of 99.70% on the testing set. These findings suggest that the proposed augmentation method is a solution to address overfitting issues in disease identification and can enhance identification accuracy effectively.Keywords: classification, deep learning, medical images, CXR, GAN.
Procedia PDF Downloads 95
29252 Efficient Schemes of Classifiers for Remote Sensing Satellite Imageries of Land Use Pattern Classifications
Authors: S. S. Patil, Sachidanand Kini
Abstract:
Classification of land use patterns is a compelling problem owing to the complexity and variability of remote sensing imagery data. An imperative line of research in remote sensing applications is exploited here to mine some of the significant spatially variable factors, such as land cover and land use, from satellite images of remote arid areas in Karnataka State, India. Diverse classification techniques, unsupervised and supervised, consisting of maximum likelihood, Mahalanobis distance, and minimum distance, are applied in Bellary District in Karnataka State, India, for the classification of the raw satellite images. The accuracy of the results is evaluated by visual comparison with standard maps containing ground truths. We initiated the analysis with the maximum likelihood technique, which gave the finest results, while both the minimum distance and Mahalanobis distance methods overvalued agricultural land areas. In spite of losing a few features due to the low resolution of the satellite images, high-quality accord between parameters extracted automatically from the developed maps and field observations was found. Keywords: Mahalanobis distance, minimum distance, supervised, unsupervised, user classification accuracy, producer's classification accuracy, maximum likelihood, kappa coefficient
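The three supervised decision rules compared above (minimum distance, Mahalanobis distance, maximum likelihood) can be sketched in NumPy; the per-class training pixels below are synthetic, not the Bellary District data.

```python
# Sketch of per-pixel supervised classification rules on multispectral data (assumed inputs).
import numpy as np

def fit(classes):
    """classes: dict label -> (n_pixels, n_bands) training samples; returns per-class stats."""
    return {label: (X.mean(axis=0), np.cov(X, rowvar=False)) for label, X in classes.items()}

def classify(x, stats, rule="maxlik"):
    best, best_score = None, -np.inf
    for label, (mu, cov) in stats.items():
        d = x - mu
        if rule == "mindist":                        # Euclidean distance to the class mean
            score = -d @ d
        elif rule == "mahalanobis":                  # Mahalanobis distance
            score = -d @ np.linalg.solve(cov, d)
        else:                                        # Gaussian maximum likelihood (log-density)
            score = -0.5 * (d @ np.linalg.solve(cov, d) + np.log(np.linalg.det(cov)))
        if score > best_score:
            best, best_score = label, score
    return best

rng = np.random.default_rng(1)
train = {"water": rng.normal(30, 5, (50, 4)),        # hypothetical 4-band training pixels
         "crop": rng.normal(80, 10, (50, 4))}
stats = fit(train)
print(classify(np.array([32.0, 28.0, 31.0, 29.0]), stats, rule="mahalanobis"))
```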
Procedia PDF Downloads 182
29251 Determination of Water Pollution and Water Quality with Decision Trees
Authors: Çiğdem Bakır, Mecit Yüzkat
Abstract:
With the increasing emphasis on water quality worldwide, the search for, and the market for, new and intelligent monitoring systems has grown. The current method is the laboratory process, where samples are taken from bodies of water and tests are carried out in laboratories. This method is time-consuming, a waste of manpower, and uneconomical. To solve this problem, we used machine learning methods to detect water pollution in our study. We created decision trees with the Orange3 software used in our study and tried to determine all the factors that cause water pollution. An automatic prediction model based on water quality was developed by taking many model inputs, such as water temperature, pH, transparency, conductivity, dissolved oxygen, and ammonia nitrogen, with machine learning methods. The proposed approach consists of three stages: preprocessing of the data used, feature detection, and classification. We evaluated the success of our study with different accuracy metrics and presented the results comparatively. In addition, we achieved approximately 98% success with the decision tree. Keywords: decision tree, water quality, water pollution, machine learning
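A minimal sketch of the classification stage, using scikit-learn in place of Orange3; the six inputs follow the abstract, but the data and the pollution rule are synthetic.

```python
# Sketch: decision tree on the six water-quality inputs named in the abstract (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

features = ["temperature", "pH", "transparency", "conductivity",
            "dissolved_oxygen", "ammonia_nitrogen"]
rng = np.random.default_rng(42)
X = rng.uniform([5, 6.0, 10, 100, 2, 0.0], [30, 9.0, 120, 900, 12, 2.0], size=(300, 6))
y = ((X[:, 4] < 5) | (X[:, 5] > 1.0)).astype(int)    # toy labelling rule: 1 = polluted

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("accuracy:", tree.score(X_te, y_te))
print(export_text(tree, feature_names=features))      # human-readable splits
```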
Procedia PDF Downloads 78
29250 Prediction Fluid Properties of Iranian Oil Field with Using of Radial Based Neural Network
Authors: Abdolreza Memari
Abstract:
In this article, in order to estimate the viscosity of crude oil, a numerical method has been used. We use this method to measure the crude oil's viscosity for three states: saturated oil's viscosity, viscosity above the bubble point, and viscosity under the saturation pressure. Then the crude oil's viscosity is estimated by using the KHAN model and the roller ball method. After that, using these data, which include the effective conditions in measuring viscosity, the viscosity estimated by the presented method is used to train a radial basis neural network. This network is a kind of two-layered artificial neural network whose hidden-layer activation function is the Gaussian function, and training algorithms are used to train it. After training the radial basis neural network, the results of the experimental method and of the artificial intelligence approach are compared. Having trained this network, we are able to estimate crude oil's viscosity without using the KHAN model and experimental conditions, and under any other condition, with acceptable accuracy. Results show that the radial basis neural network has a high capability of estimating crude oil viscosity; saving time and cost is another advantage of this investigation. Keywords: viscosity, Iranian crude oil, radial based, neural network, roller ball method, KHAN model
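A small sketch of a radial basis function network for viscosity regression (Gaussian hidden units centred by k-means, linear output layer); the inputs and target values are synthetic stand-ins, not the Iranian field data or the KHAN-model estimates.

```python
# Sketch of a two-layer RBF network: Gaussian activations over k-means centres, linear readout.
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, sigma):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma**2))              # Gaussian hidden-unit activations

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))                 # e.g. scaled pressure, temperature, gravity
y = 2.0 + 3.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)  # stand-in viscosity

centers = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
H = rbf_design(X, centers, sigma=0.3)
w, *_ = np.linalg.lstsq(np.hstack([H, np.ones((len(X), 1))]), y, rcond=None)  # linear output layer

H_new = rbf_design(X[:5], centers, sigma=0.3)
print(np.hstack([H_new, np.ones((5, 1))]) @ w)       # predicted viscosity for the first 5 samples
```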
Procedia PDF Downloads 499
29249 A Real-Time Moving Object Detection and Tracking Scheme and Its Implementation for Video Surveillance System
Authors: Mulugeta K. Tefera, Xiaolong Yang, Jian Liu
Abstract:
Detection and tracking of moving objects are very important in many application contexts, such as detection and recognition of people, visual surveillance, automatic generation of video effects, and so on. However, the task of detecting the real shape of an object in motion becomes tricky due to various challenges like dynamic scene changes, the presence of shadows, and illumination variations due to light switching. For such systems, once the moving object is detected, tracking is also a crucial step for applications used in military defense, video surveillance, human-computer interaction, and medical diagnostics, as well as in commercial fields such as video games. In this paper, an object present in a dynamic background is detected using adaptive mixture-of-Gaussians based analysis of the video sequences. Then the detected moving object is tracked using region-based moving object tracking and inter-frame differencing mechanisms to address the partial overlapping and occlusion problems. Firstly, the detection algorithm effectively detects and extracts the moving object target by enhancement and post-processing morphological operations. Secondly, the extracted object uses region-based moving object tracking and inter-frame difference to improve the tracking speed of real-time moving objects in different video frames. Finally, the plotting method was applied to display the detected moving objects effectively and describe the object’s motion being tracked. The experiment has been performed on image sequences acquired in both indoor and outdoor environments, and one stationary web camera has been used. Keywords: background modeling, Gaussian mixture model, inter-frame difference, object detection and tracking, video surveillance
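A compact sketch of the detection-and-tracking loop described above, assuming OpenCV's MOG2 adaptive Gaussian mixture background model combined with inter-frame differencing; the video path and thresholds are placeholders.

```python
# Sketch: adaptive mixture-of-Gaussians background subtraction + inter-frame difference,
# followed by region-based (contour) tracking. "video.mp4" is a placeholder path.
import cv2

cap = cv2.VideoCapture("video.mp4")
mog = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    fg = mog.apply(frame)                                   # mixture-of-Gaussians foreground mask
    _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)  # drop shadow label (value 127)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)       # post-processing: remove speckle noise
    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)                 # inter-frame difference
        _, diff = cv2.threshold(diff, 20, 255, cv2.THRESH_BINARY)
        fg = cv2.bitwise_and(fg, diff)                      # keep regions supported by both cues
    prev_gray = gray

    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 200:                        # track sufficiently large blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```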
Procedia PDF Downloads 475
29248 General Purpose Graphic Processing Units Based Real Time Video Tracking System
Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai
Abstract:
Real Time Video Tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by background detection and elimination process. Local regions of the image frame contain vital information of background and foreground. However, pixel-level processing of local regions consumes a good amount of computational time and memory space by traditional approaches. In our approach we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernel are influenced by local regions and are updated by inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as GeForce GTX 280, GeForce GTX 280 and Quadro K2000. The results are encouraging with maximum speed up 10X compared to sequential approach.Keywords: connected components, embrace threads, local weighted kernel, structuring elements
Procedia PDF Downloads 437
29247 Expert Review on Conceptual Design Model of Assistive Courseware for Low Vision (AC4LV) Learners
Authors: Nurulnadwan Aziz, Ariffin Abdul Mutalib, Siti Mahfuzah Sarif
Abstract:
This paper reports an ongoing project regarding the development of Conceptual Design Model of Assistive Courseware for Low Vision (AC4LV) learners. Having developed the intended model, it has to be validated prior to producing it as guidance for the developers to develop an AC4LV. This study requires two phases of validation process which are through expert review and prototyping method. This paper presents a part of the validation process which is findings from experts review on Conceptual Design Model of AC4LV which has been carried out through a questionnaire. Results from 12 international and local experts from various respectable fields in Human-Computer Interaction (HCI) were discussed and justified. In a nutshell, reviewed Conceptual Design Model of AC4LV was formed. Future works of this study are to validate the reviewed model through prototyping method prior to testing it to the targeted users.Keywords: assistive courseware, conceptual design model, expert review, low vision learners
Procedia PDF Downloads 544
29246 A Prediction Model of Adopting IPTV
Authors: Jeonghwan Jeon
Abstract:
With the advent of IPTV in fierce competition with existing broadcasting systems, predicting the extent of IPTV service adoption has emerged as an important issue. This paper aims to suggest a prediction model for adopting IPTV using Classification and Ranking Belief Simplex (CaRBS). A simplex plot method of representing data allows a clear visual representation of the degree to which the support from the variables interacts with the prediction of the objects. CaRBS is applied to survey data on IPTV adoption. Keywords: prediction, adoption, IPTV, CaRBS
Procedia PDF Downloads 410
29245 Job Shop Scheduling: Classification, Constraints and Objective Functions
Authors: Majid Abdolrazzagh-Nezhad, Salwani Abdullah
Abstract:
The job-shop scheduling problem (JSSP) is an important decision facing those involved in the fields of industry, economics and management. This problem is a class of combinatorial optimization problem known to be NP-hard. JSSPs deal with a set of machines and a set of jobs with various predetermined routes through the machines, where the objective is to assemble a schedule of jobs that minimizes certain criteria such as makespan, maximum lateness, and total weighted tardiness. Over the past several decades, interest in meta-heuristic approaches to address JSSPs has increased due to the ability of these approaches to generate solutions which are better than those generated from heuristics alone. This article provides the classification, constraints and objective functions imposed on JSSPs that are available in the literature. Keywords: job-shop scheduling, classification, constraints, objective functions
Procedia PDF Downloads 442
29244 Music Genre Classification Based on Non-Negative Matrix Factorization Features
Authors: Soyon Kim, Edward Kim
Abstract:
In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is being provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on the timbre features such as mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) the statistical features such as mean, variance, minimum, and maximum of the timbre features and (2) the modulation spectrum features such as spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. Not only these conventional basic long-term feature vectors, but also NMF based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, an entire full band spectrum was used. However, for NMF-BFV, only low band spectrum was used since high frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as a classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as statistical features and modulation spectrum features, the NMF features provided the increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required dimensionality of 460, but NMF-LSM and NMF-BFV required dimensionalities of 10 and 10, respectively. Combining the basic features, NMF-LSM and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced the significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)
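A simplified sketch of the per-genre NMF-feature idea on top of a generic non-negative timbre-feature matrix, using scikit-learn; audio feature extraction is assumed to have been done elsewhere, the data are synthetic, and the non-negative encoding is approximated by clipped least squares rather than the authors' exact procedure.

```python
# Sketch: learn one NMF basis per genre, encode each song by its weights against the stacked
# bases, and classify with an RBF-kernel SVM, comparing basic vs. basic+NMF features.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genres, songs_per_genre, n_feat = 4, 30, 60
X = np.vstack([np.abs(rng.normal(loc=g + 1, scale=1.0, size=(songs_per_genre, n_feat)))
               for g in range(n_genres)])             # non-negative "basic feature vectors"
y = np.repeat(np.arange(n_genres), songs_per_genre)

# Train one NMF basis vector per genre and stack them into a dictionary W (n_feat x n_genres).
bases = []
for g in range(n_genres):
    model = NMF(n_components=1, init="nndsvda", max_iter=500, random_state=0).fit(X[y == g])
    bases.append(model.components_[0])
W = np.vstack(bases).T

# Encode every song by weights against the stacked bases (least squares, clipped to >= 0).
H, *_ = np.linalg.lstsq(W, X.T, rcond=None)
nmf_features = np.clip(H.T, 0, None)                  # one weighting value per genre basis

combined = np.hstack([X, nmf_features])               # basic features + NMF features
print("basic:     ", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
print("basic+NMF: ", cross_val_score(SVC(kernel="rbf"), combined, y, cv=5).mean())
```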
Procedia PDF Downloads 301
29243 The Spectroscopic, Molecular Structure and Electrostatic Potential, Polarizability, Hyperpolarizability, and HOMO–LUMO Analysis of Monomeric and Dimeric Structures of N-(2-Methylphenyl)-2-Nitrobenzenesulfonamide
Authors: A. Didaoui, N. Benhalima, M. Elkeurti, A. Chouaih, F. Hamzaoui
Abstract:
The monomer and dimer structures of the title molecule have been obtained from density functional theory (DFT) B3LYP method with 6-31G(d,p) as basis set calculations. The optimized geometrical parameters obtained by B3LYP/6-31G(d,p) method show good agreement with experimental X-ray data. The polarizability and first order hyperpolarizability of the title molecule were calculated and interpreted. The intermolecular N–H•••O hydrogen bonds are discussed in dimer structure of the molecule. The vibrational wave numbers and their assignments were examined theoretically using the Gaussian 03 set of quantum chemistry codes. The predicted frontier molecular orbital energies at B3LYP/6-31G(d,p) method set show that charge transfer occurs within the molecule. The frontier molecular orbital calculations clearly show the inverse relationship of HOMO–LUMO gap with the total static hyperpolarizability. The results also show that N-(2-Methylphenyl)-2-nitrobenzenesulfonamide molecule may have nonlinear optical (NLO) comportment with non-zero values.Keywords: DFT, Gaussian 03, NLO, N-(2-Methylphenyl)-2-nitrobenzenesulfonamide, polarizability
Procedia PDF Downloads 325
29242 Spectral Clustering from the Discrepancy View and Generalized Quasirandomness
Authors: Marianna Bolla
Abstract:
The aim of this paper is to compare spectral, discrepancy, and degree properties of expanding graph sequences. As we can prove equivalences and implications between them and the definition of the generalized (multiclass) quasirandomness of Lovasz–Sos (2008), they can be regarded as generalized quasirandom properties akin to the equivalent quasirandom properties of the seminal Chung-Graham-Wilson paper (1989) in the one-class scenario. Since these properties are valid for deterministic graph sequences, irrespective of stochastic models, the partial implications also justify low-dimensional embedding of large-scale graphs and discrepancy-minimizing spectral clustering. Keywords: generalized random graphs, multiway discrepancy, normalized modularity spectra, spectral clustering
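As a small illustration of spectral clustering on the normalized Laplacian spectrum (one of the spectral objects discussed here), the sketch below recovers the two planted blocks of a synthetic stochastic block model graph.

```python
# Sketch: normalized-Laplacian spectral clustering on a two-block random graph (synthetic).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, k = 40, 2
blocks = np.repeat([0, 1], n // 2)
P = np.where(blocks[:, None] == blocks[None, :], 0.5, 0.05)    # stochastic block model
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                                    # symmetric adjacency, no loops

d = A.sum(axis=1)
L_sym = np.eye(n) - A / np.sqrt(np.outer(d, d))                # normalized Laplacian
eigvals, eigvecs = np.linalg.eigh(L_sym)
U = eigvecs[:, :k]                                             # k smallest eigenvectors
U = U / np.linalg.norm(U, axis=1, keepdims=True)               # row-normalize the embedding
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(U)
print("agreement with planted blocks:",
      max((labels == blocks).mean(), (labels != blocks).mean()))
```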
Procedia PDF Downloads 195
29241 Landslide and Liquefaction Vulnerability Analysis Using Risk Assessment Analysis and Analytic Hierarchy Process Implication: Suitability of the New Capital of the Republic of Indonesia on Borneo Island
Authors: Rifaldy, Misbahudin, Khalid Rizky, Ricky Aryanto, M. Alfiyan Bagus, Fahri Septianto, Firman Najib Wibisana, Excobar Arman
Abstract:
Indonesia is a country with a high level of disaster risk because it lies on the Ring of Fire, and several of its regions sit where three major tectonic plates meet. Disaster analysis must therefore be carried out continuously to assess the potential disasters that may occur; in this research, these are landslides and liquefaction. This research was conducted to analyze areas that are vulnerable to landslide and liquefaction hazards and their relationship to the assessment of moving the new capital of the Republic of Indonesia to the island of Kalimantan, with a total area of 612,267.22 km². The method in this analysis uses the Analytic Hierarchy Process and consistency ratio testing to decompose a complex and unstructured problem into several parameters by assigning values to them. The parameters used in this analysis are the slope, land cover, lithology distribution, wetness index, earthquake data, and peak ground acceleration. A weighted overlay was carried out over all these parameters using the percentage values obtained from the Analytic Hierarchy Process, with accuracy confirmed by the consistency ratio, so that the percentage of area in each vulnerability classification was obtained. Based on the analysis results, vulnerability classifications from high to very low were obtained: 918.40083 km² (0.15%) highly vulnerable, 127,045.44815 km² (20.75%) medium, 346,175.886188 km² (56.54%) low, and 138,127.484832 km² (22.56%) very low. This research is expected to be able to map landslide and liquefaction disasters on the island of Kalimantan and provide consideration of the suitability of regional development of the new capital of the Republic of Indonesia. Also, this research is expected to provide input for, or be applicable to, all regions analyzing landslide and liquefaction vulnerability or the suitability of development in certain regions. Keywords: analytic hierarchy process, Borneo Island, landslide and liquefaction, vulnerability analysis
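The Analytic Hierarchy Process step can be sketched as follows: derive parameter weights from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio (CR < 0.1); the comparison matrix below is hypothetical, not the study's expert judgements.

```python
# Sketch: AHP priority weights and consistency ratio for a hypothetical 4-parameter comparison
# (e.g. slope, lithology, land cover, peak ground acceleration on Saaty's 1-9 scale).
import numpy as np

A = np.array([[1,   3,   5,   2],
              [1/3, 1,   3,   1/2],
              [1/5, 1/3, 1,   1/4],
              [1/2, 2,   4,   1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w = w / w.sum()                                   # priority weights for the weighted overlay

n = A.shape[0]
lambda_max = eigvals.real[i]
CI = (lambda_max - n) / (n - 1)                   # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]      # Saaty's random index
print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))
```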
Procedia PDF Downloads 174
29240 Subjective versus Objective Assessment for Magnetic Resonance (MR) Images
Authors: Heshalini Rajagopal, Li Sze Chow, Raveendran Paramesran
Abstract:
Magnetic Resonance Imaging (MRI) is one of the most important medical imaging modalities. Subjective assessment of image quality is regarded as the gold standard for evaluating MR images. In this study, a database of 210 MR images, which contains ten reference images and 200 distorted images, is presented. The reference images were distorted with four types of distortions: Rician noise, Gaussian white noise, Gaussian blur and DCT compression. The 210 images were assessed by ten subjects. The subjective scores were presented as Difference Mean Opinion Scores (DMOS). The DMOS values were compared with four FR-IQA metrics. We used the Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank Order Correlation Coefficient (SROCC) to validate the DMOS values. The high PLCC and SROCC correlation values show that the DMOS values are close to the objective FR-IQA metrics. Keywords: magnetic resonance (MR) images, difference mean opinion score (DMOS), full reference image quality assessment (FR-IQA)
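The validation step reduces to correlating DMOS with an objective score; a sketch with SciPy is shown below, using PSNR on synthetic noisy images as a stand-in for the four FR-IQA metrics actually evaluated.

```python
# Sketch: PLCC and SROCC between subjective DMOS values and an objective metric (PSNR here),
# on synthetic distorted images; the DMOS values are hypothetical.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def psnr(ref, dist):
    mse = np.mean((ref.astype(float) - dist.astype(float)) ** 2)
    return 10 * np.log10(255.0**2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
dmos, scores = [], []
for sigma in np.linspace(2, 30, 15):                        # increasing Gaussian noise levels
    dist = np.clip(ref + rng.normal(0, sigma, ref.shape), 0, 255).astype(np.uint8)
    scores.append(psnr(ref, dist))
    dmos.append(20 + 2.0 * sigma + rng.normal(0, 1))        # hypothetical subjective DMOS

plcc, _ = pearsonr(scores, dmos)
srocc, _ = spearmanr(scores, dmos)
print(f"PLCC = {plcc:.3f}, SROCC = {srocc:.3f}")            # strong negative correlation expected
```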
Procedia PDF Downloads 457
29239 ANFIS Approach for Locating Faults in Underground Cables
Authors: Magdy B. Eteiba, Wael Ismael Wahba, Shimaa Barakat
Abstract:
This paper presents a fault identification, classification and fault location estimation method based on Discrete Wavelet Transform and Adaptive Network Fuzzy Inference System (ANFIS) for medium voltage cable in the distribution system. Different faults and locations are simulated by ATP/EMTP, and then certain selected features of the wavelet transformed signals are used as an input for a training process on the ANFIS. Then an accurate fault classifier and locator algorithm was designed, trained and tested using current samples only. The results obtained from ANFIS output were compared with the real output. From the results, it was found that the percentage error between ANFIS output and real output is less than three percent. Hence, it can be concluded that the proposed technique is able to offer high accuracy in both of the fault classification and fault location.Keywords: ANFIS, fault location, underground cable, wavelet transform
Procedia PDF Downloads 510
29238 Brain-Computer Interface Based Real-Time Control of Fixed Wing and Multi-Rotor Unmanned Aerial Vehicles
Authors: Ravi Vishwanath, Saumya Kumaar, S. N. Omkar
Abstract:
Brain-computer interfacing (BCI) is a technology that is almost four decades old, and it was developed solely for the purpose of developing and enhancing the impact of neuroprosthetics. However, in recent times, with the commercialization of non-invasive electroencephalogram (EEG) headsets, the technology has seen a wide variety of applications like home automation, wheelchair control, vehicle steering, etc. One of the latest developed applications is the mind-controlled quadrotor unmanned aerial vehicle. These applications, however, do not require a very high-speed response and give satisfactory results when standard classification methods like the Support Vector Machine (SVM) and Multi-Layer Perceptron (MLP) are used. Issues are faced when there is a requirement for high-speed control in the case of fixed-wing unmanned aerial vehicles, where such methods are rendered unreliable due to the low speed of classification. Such an application requires the system to classify data at high speeds in order to retain the controllability of the vehicle. This paper proposes a novel method of classification which uses a combination of the Common Spatial Paradigm and Linear Discriminant Analysis that provides an improved classification accuracy in real time. A non-linear SVM based classification technique has also been discussed. Further, this paper discusses the implementation of the proposed method on fixed-wing and VTOL unmanned aerial vehicles. Keywords: brain-computer interface, classification, machine learning, unmanned aerial vehicles
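A sketch of a CSP-plus-LDA pipeline of the kind proposed here (Common Spatial Patterns for spatial filtering, log-variance features, Linear Discriminant Analysis as the classifier); the EEG epochs are synthetic and the implementation is illustrative, not the authors' code.

```python
# Sketch: two-class CSP spatial filters + log-variance features + LDA on synthetic EEG epochs.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(epochs_a, epochs_b, n_pairs=2):
    """epochs_*: (n_trials, n_channels, n_samples). Returns 2*n_pairs spatial filters."""
    cov = lambda E: np.mean([X @ X.T / np.trace(X @ X.T) for X in E], axis=0)
    Ca, Cb = cov(epochs_a), cov(epochs_b)
    vals, vecs = eigh(Ca, Ca + Cb)                 # generalized eigenvalue problem
    idx = np.r_[np.argsort(vals)[:n_pairs], np.argsort(vals)[-n_pairs:]]
    return vecs[:, idx].T

def log_var_features(epochs, W):
    Z = np.einsum("fc,ncs->nfs", W, epochs)        # spatially filtered signals
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

rng = np.random.default_rng(0)
left = rng.normal(size=(40, 8, 250)) * np.linspace(1.5, 0.5, 8)[None, :, None]
right = rng.normal(size=(40, 8, 250)) * np.linspace(0.5, 1.5, 8)[None, :, None]

W = csp_filters(left, right)
X = np.vstack([log_var_features(left, W), log_var_features(right, W)])
y = np.array([0] * 40 + [1] * 40)
print("training accuracy:", LinearDiscriminantAnalysis().fit(X, y).score(X, y))
```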
Procedia PDF Downloads 281
29237 Detection of Internal Mold Infection of Intact Tomatoes by Non-Destructive, Transmittance VIS-NIR Spectroscopy
Authors: K. Petcharaporn
Abstract:
The external characteristics of tomatoes, such as freshness, color and size are typically used in quality control processes for tomatoes sorting. However, the internal mold infection of intact tomato cannot be sorted based on a visible assessment and destructive method alone. In this study, a non-destructive technique was used to predict the internal mold infection of intact tomatoes by using transmittance visible and near infrared (VIS-NIR) spectroscopy. Spectra for 200 samples contained 100 samples for normal tomatoes and 100 samples for mold infected tomatoes were acquired in the wavelength range between 665-955 nm. This data was used in conjunction with partial least squares-discriminant analysis (PLS-DA) method to generate a classification model for tomato quality between groups of internal mold infection of intact tomato samples. For this task, the data was split into two groups, 140 samples were used for a training set and 60 samples were used for a test set. The spectra of both normal and internally mold infected tomatoes showed different features in the visible wavelength range. Combined spectral pretreatments of standard normal variate transformation (SNV) and smoothing (Savitzky-Golay) gave the optimal calibration model in training set, 85.0% (63 out of 71 for the normal samples and 56 out of 69 for the internal mold samples). The classification accuracy of the best model on the test set was 91.7% (29 out of 29 for the normal samples and 26 out of 31 for the internal mold tomato samples). The results from this experiment showed that transmittance VIS-NIR spectroscopy can be used as a non-destructive technique to predict the internal mold infection of intact tomatoes.Keywords: tomato, mold, quality, prediction, transmittance
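The pretreatment and PLS-DA steps can be sketched with SciPy and scikit-learn; the transmittance spectra below are synthetic stand-ins for the 665-955 nm measurements, and the class threshold of 0.5 on the PLS response is an assumption.

```python
# Sketch: SNV + Savitzky-Golay pretreatment followed by PLS-DA on synthetic spectra.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum individually."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

rng = np.random.default_rng(0)
wl = np.linspace(665, 955, 120)
normal = np.exp(-((wl - 760) / 60) ** 2) + rng.normal(0, 0.02, (100, 120))
mold = 0.8 * np.exp(-((wl - 720) / 55) ** 2) + rng.normal(0, 0.02, (100, 120))
X = np.vstack([normal, mold])
y = np.array([0] * 100 + [1] * 100)                   # 0 = normal, 1 = internal mold

X = savgol_filter(snv(X), window_length=11, polyorder=2, axis=1)   # SNV + smoothing
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=140, stratify=y, random_state=0)

pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = (pls.predict(X_te).ravel() > 0.5).astype(int)  # PLS-DA: threshold the PLS response
print("test accuracy:", (pred == y_te).mean())
```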
Procedia PDF Downloads 361
29236 Singularization: A Technique for Protecting Neural Networks
Authors: Robert Poenaru, Mihail Pleşa
Abstract:
In this work, a solution that addresses the protection of pre-trained neural networks is developed: Singularization. This method involves applying permutations to the weight matrices of a pre-trained model, introducing a form of structured noise that obscures the original model’s architecture. These permutations make it difficult for an attacker to reconstruct the original model, even if the permuted weights are obtained. Experimental benchmarks indicate that the application of singularization has a profound impact on model performance, often degrading it to the point where retraining from scratch becomes necessary to recover functionality, which is particularly effective for securing intellectual property in neural networks. Moreover, unlike other approaches, singularization is lightweight and computationally efficient, which makes it well suited for resource-constrained environments. Our experiments also demonstrate that this technique performs efficiently in various image classification tasks, highlighting its broad applicability and practicality in real-world scenarios.Keywords: machine learning, ANE, CNN, security
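A toy sketch of the idea (not the authors' exact scheme): permute the rows and columns of each weight matrix of a pre-trained network with a secret key, so that the stored weights are useless unless the permutations are inverted.

```python
# Sketch: key-driven permutation of weight matrices and its inverse (illustrative only).
import numpy as np

def singularize(weights, key):
    """weights: list of (W, b) per layer. Returns permuted copies plus the permutations."""
    rng = np.random.default_rng(key)
    protected, perms = [], []
    for W, b in weights:
        p_out = rng.permutation(W.shape[0])          # shuffle output units
        p_in = rng.permutation(W.shape[1])           # shuffle input units
        protected.append((W[p_out][:, p_in], b[p_out]))
        perms.append((p_out, p_in))
    return protected, perms

def desingularize(protected, perms):
    restored = []
    for (W, b), (p_out, p_in) in zip(protected, perms):
        inv_out, inv_in = np.argsort(p_out), np.argsort(p_in)
        restored.append((W[inv_out][:, inv_in], b[inv_out]))
    return restored

layer = (np.arange(12.0).reshape(3, 4), np.array([0.1, 0.2, 0.3]))
protected, perms = singularize([layer], key=42)
restored = desingularize(protected, perms)
print(np.allclose(restored[0][0], layer[0]), np.allclose(restored[0][1], layer[1]))
```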
Procedia PDF Downloads 12
29235 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process
Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany
Abstract:
This paper discusses the application of the Time Delay Control (TDC) compensation technique in the juice extraction process in a sugar mill. The objective is to improve the control performance of the process and increase extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time delay effect in the process and improve control performance. The extraction efficiency is also significantly increased with the application of the TDC compensation technique. The proposed approach provides a practical solution for improving the juice extraction process in sugar mills, implemented in MATLAB. Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor
Procedia PDF Downloads 91
29234 Classification of Hyperspectral Image Using Mathematical Morphological Operator-Based Distance Metric
Authors: Geetika Barman, B. S. Daya Sagar
Abstract:
In this article, we proposed a pixel-wise classification of hyperspectral images using a mathematical morphology operator-based distance metric called “dilation distance” and “erosion distance”. This method involves measuring the spatial distance between the spectral features of a hyperspectral image across the bands. The key concept of the proposed approach is that the “dilation distance” is the maximum distance a pixel can be moved without changing its classification, whereas the “erosion distance” is the maximum distance that a pixel can be moved before changing its classification. The spectral signature of the hyperspectral image carries unique class information and shape for each class. This article demonstrates how easily the dilation and erosion distance can measure spatial distance compared to other approaches. This property is used to calculate the spatial distance between hyperspectral image feature vectors across the bands. The dissimilarity matrix is then constructed using both measures extracted from the feature spaces. The measured distance metric is used to distinguish between the spectral features of various classes and precisely distinguish between each class. This is illustrated using both toy data and real datasets. Furthermore, we investigated the role of flat vs. non-flat structuring elements in capturing the spatial features of each class in the hyperspectral image. In order to validate, we compared the proposed approach to other existing methods and demonstrated empirically that mathematical operator-based distance metric classification provided competitive results and outperformed some of them.Keywords: dilation distance, erosion distance, hyperspectral image classification, mathematical morphology
Procedia PDF Downloads 83
29233 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and Validation of Simulated Process Model is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) with the real system/process with respect to dynamic behaviour under steady state and transient conditions. The process of Verification and validation helps in qualifying the process simulator for the intended purpose whether it is for providing comprehensive training or design verification. In general, model verification is carried out by comparison of simulated component characteristics with the original requirement to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters either in standalone mode or integrated mode. A Full Scope Replica Operator Training Simulator for PFBR - Prototype Fast Breeder Reactor has been developed at IGCAR, Kalpakkam, INDIA named KALBR-SIM (Kalpakkam Breeder Reactor Simulator) wherein the main participants are engineers/experts belonging to Modeling Team, Process Design and Instrumentation and Control design team. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for PFBR operator training Simulator, the methodology followed for verifying the models, the reference documents and standards used etc. It details out the importance of internal validation by design experts, subsequent validation by external agency consisting of experts from various fields, model improvement by tuning based on expert’s comments, final qualification of the simulator for the intended purpose and the difficulties faced while co-coordinating various activities.Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state
Procedia PDF Downloads 264
29232 Process Simulation of 1-Butene Separation from C4 Mixture by Extractive Distillation
Authors: Muhammad Naeem, Abdulrahman A. Al-Rabiah, Wasif Mughees
Abstract:
In the technical C4 mixture, 1-butene and n-butane are very close to each other with regard to their boiling points, i.e. -6.3°C for 1-butene and -1°C for n-butane. An extractive distillation process is used for the separation of 1-butene from the existing C4 mixture. The solvent is the essential element of extractive distillation, and an appropriate solvent plays an important role in the process economy of extractive distillation. Aspen Plus has been applied as the simulator for the separation of these hydrocarbons. Moreover, the NRTL activity coefficient model was used in the simulation. This model indicated that the material balances in this separation process were accurate for several solvent flow rates. A mixture of acetonitrile and water was used as the solvent, and 99% pure 1-butene was separated. This simulation proposed a feed-to-solvent ratio of 1:7.9 and 15 plates for the solvent recovery column. Previously, the feed-to-solvent ratio was higher than this and the number of proposed plates was 30, which shows that the separation process can be economized. Keywords: extractive distillation, 1-butene, aspen plus, ACN solvent
Procedia PDF Downloads 542
29231 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary. B. Wills, Andy. M. Gravell
Abstract:
Communicating and managing customers’ requirements in software development projects play a vital role in the software development process. While it is difficult to do so locally, it is even more difficult to communicate these requirements over distributed boundaries and to convey them to multiple distribution customers. This paper discusses the communication of multiple distribution customers’ requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate that model by presenting the findings of a case study conducted with a company with customisation projects for 18 distributed customers. Then, we compare the outputs of the real case process and the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communication requirements over distributed organisational boundaries, and the delay in decision making and in the entire customisation process time.Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 428