Search results for: recurrent artificial neural network
4992 Empirical Evaluation of Gradient-Based Training Algorithms for Ordinary Differential Equation Networks
Authors: Martin K. Steiger, Lukas Heisler, Hans-Georg Brachtendorf
Abstract:
Deep neural networks and their variants form the backbone of many AI applications. Building on the so-called residual networks, a continuous formulation of such models as ordinary differential equations (ODEs) has proven advantageous, since different techniques may be applied that significantly increase the learning speed while enabling controlled trade-offs against the resulting error. For the evaluation of such models, high-performance numerical differential equation solvers are used, which also provide the gradients required for training. However, whether classical gradient-based methods are even applicable, or which one yields the best results, has not been discussed yet. This paper aims to remedy this situation by providing empirical results for different applications.
Keywords: deep neural networks, gradient-based learning, image processing, ordinary differential equation networks
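As a minimal illustration of the residual-to-ODE correspondence described above, the following Python sketch integrates a toy ODE block with a fixed-step forward Euler solver; the dynamics function, dimensions, and step counts are illustrative assumptions, and the paper itself relies on high-performance adaptive solvers rather than this hand-rolled integrator.

```python
# Minimal sketch (assumptions: a toy hidden state and a hand-written tanh
# dynamics function stand in for a trained network's layers).
import numpy as np

def f(h, t, theta):
    """Dynamics dh/dt = f(h, t; theta); here a simple tanh layer."""
    W, b = theta
    return np.tanh(W @ h + b)

def euler_ode_block(h0, theta, t0=0.0, t1=1.0, steps=10):
    """Fixed-step forward-Euler solve of the ODE block, the continuous
    analogue of stacking `steps` residual layers h <- h + dt * f(h)."""
    h, dt = h0.copy(), (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt, theta)
    return h

rng = np.random.default_rng(0)
theta = (rng.normal(size=(4, 4)) * 0.1, np.zeros(4))
h0 = rng.normal(size=4)
print(euler_ode_block(h0, theta, steps=10))   # coarse solve
print(euler_ode_block(h0, theta, steps=100))  # finer solve, smaller error
```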
Procedia PDF Downloads 170
4991 Artificial Intelligence in Disease Diagnosis
Authors: Shalini Tripathi, Pardeep Kumar
Abstract:
The method of translating observed symptoms into disease names is known as disease diagnosis. The ability to solve clinical problems in a complex manner is critical to a doctor's effectiveness in providing health care. The accuracy of his or her expertise is crucial to the survival and well-being of his or her patients. Artificial Intelligence (AI) has a huge economic influence depending on how well it is applied. In the medical sector, human brain-simulated intellect can help not only with classification accuracy, but also with reducing the diagnostic time, cost, and pain associated with pathology tests. In light of AI's present and prospective applications in the biomedical field, this paper identifies them based on potential benefits and risks, social and ethical consequences, and issues that might be contentious but have not been thoroughly discussed in publications and literature. Current apps, personal tracking tools, genetic tests and editing programmes, customizable models, web environments, virtual reality (VR) technologies and surgical robotics will all be investigated in this study. While AI holds a lot of potential in medical diagnostics, it is still a very new method, and many clinicians are uncertain about its reliability, specificity and how it can be integrated into clinical practice without jeopardising clinical expertise. To validate their effectiveness, more systematic refinement of these implementations, as well as training of physicians and healthcare facilities on how to effectively incorporate these strategies into clinical practice, will be needed.
Keywords: Artificial Intelligence, medical diagnosis, virtual reality, healthcare ethical implications
Procedia PDF Downloads 133
4990 Roughness Discrimination Using Bioinspired Tactile Sensors
Authors: Zhengkun Yi
Abstract:
Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robot systems with a key missing ability. However, as a major component of texture, roughness has rarely been explored. This paper presents an approach for tactile surface roughness discrimination, which includes two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for tactile surface roughness discrimination. The bioinspired fingertip is comprised of two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) elastic properties of the epidermis and dermis in human skin are replicated by the two PDMS layers with different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner’s corpuscles in terms of both location and response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8)% can be achieved using only one PVDF film sensor with a kNN (k = 9) classifier and the standard deviation feature.
Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination
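The following Python sketch illustrates the reported feature-plus-classifier pipeline, pairing a single standard-deviation feature with a kNN (k = 9) classifier; the synthetic "PVDF" signals and the number of trials per surface are assumptions for demonstration only, not the measured data of the study.

```python
# Minimal sketch (assumptions: synthetic vibration signals whose amplitude
# scales with Ra; the real study uses measured PVDF sensor data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

ra_values = [50, 25, 12.5, 6.3, 3.2, 1.6, 0.8, 0.4]  # micrometres
rng = np.random.default_rng(1)

X, y = [], []
for label, ra in enumerate(ra_values):
    for _ in range(30):                       # 30 sliding trials per surface
        signal = ra * rng.normal(size=2000)   # rougher surface -> larger vibration
        X.append([signal.std()])              # standard-deviation feature only
        y.append(label)
X, y = np.array(X), np.array(y)

knn = KNeighborsClassifier(n_neighbors=9)     # k = 9 as in the abstract
print(cross_val_score(knn, X, y, cv=5).mean())
```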
Procedia PDF Downloads 313
4989 ChatGPT as a “Foreign Language Teacher”: Attitudes of Tunisian English Language Learners
Authors: Leila Najeh Bel'Kiry
Abstract:
Artificial intelligence (AI) brought about many language chatbots, with ChatGPT being the most sophisticated thanks to its human-like linguistic capabilities. This aspect raises the idea of using ChatGPT in learning foreign languages. Starting from the premise that positions ChatGPT as a mediator between the language and the learner, functioning as a “ghost teacher” offering a peaceful and secure learning space, this study aims to explore the attitudes of Tunisian students of English towards ChatGPT as a “Foreign Language Teacher”. Forty-five students, in their third year of fundamental English at Tunisian universities and higher institutes, completed a Likert scale questionnaire consisting of thirty-two items and covering various aspects of language (phonology, morphology, syntax, semantics, and pragmatics). A scale ranging from 'Strongly Disagree' to 'Strongly Agree' is used to assess the participants' attitudes towards the integration of ChatGPT in learning a foreign language. Results indicate generally positive attitudes towards the reliance on ChatGPT in learning foreign languages, particularly for some components of language such as syntax, phonology, and morphology. However, learners show insecurity towards ChatGPT when it comes to pragmatics and semantics, where the artificial model may fail when dealing with deeper contextual and nuanced language levels.
Keywords: artificial language model, attitudes, foreign language learning, ChatGPT, linguistic capabilities, Tunisian English language learners
Procedia PDF Downloads 65
4988 Applying Sequential Pattern Mining to Generate Block for Scheduling Problems
Authors: Meng-Hui Chen, Chen-Yu Kao, Chia-Yu Hsu, Pei-Chann Chang
Abstract:
The main idea in this paper is to use sequential pattern mining to find information that is helpful for constructing high-performance solutions. This information is combined into units defined as blocks. Using the blocks to generate artificial chromosomes (ACs) can improve the structure of solutions. Estimation of Distribution Algorithms (EDAs) are adapted to solve the combinatorial problems. Although many such approaches are advantageous for this application, only some of them are used here to enhance its efficiency. Generating ACs from the mined patterns within an EDA can increase solution diversity. According to the experimental results, the proposed algorithm performs better on permutation flow-shop problems.
Keywords: combinatorial problems, sequential pattern mining, estimation of distribution algorithms, artificial chromosomes
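The following Python sketch illustrates the idea of mining "blocks" from elite permutations and assembling an artificial chromosome from them; counting adjacent job pairs stands in for full sequential pattern mining, and the random elite population is purely illustrative.

```python
# Minimal sketch (assumption: adjacency-frequency counting stands in for
# full sequential pattern mining; the elite permutations are random here).
import random
from collections import defaultdict

def mine_blocks(elite):
    """Count how often job j directly follows job i in the elite permutations."""
    freq = defaultdict(int)
    for perm in elite:
        for i, j in zip(perm, perm[1:]):
            freq[(i, j)] += 1
    return freq

def artificial_chromosome(freq, jobs):
    """Greedily chain jobs, preferring the most frequent successors (blocks)."""
    remaining = set(jobs)
    current = random.choice(jobs)
    chrom = [current]
    remaining.remove(current)
    while remaining:
        current = max(remaining, key=lambda j: freq.get((chrom[-1], j), 0))
        chrom.append(current)
        remaining.remove(current)
    return chrom

jobs = list(range(10))
elite = [random.sample(jobs, len(jobs)) for _ in range(20)]
print(artificial_chromosome(mine_blocks(elite), jobs))
```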
Procedia PDF Downloads 612
4987 Measures of Reliability and Transportation Quality on an Urban Rail Transit Network in Case of Links’ Capacities Loss
Authors: Jie Liu, Jinqu Cheng, Qiyuan Peng, Yong Yin
Abstract:
Urban rail transit (URT) plays a significant role in dealing with traffic congestion and environmental problems in cities. However, equipment failure and obstruction of links often lead to loss of URT links’ capacities in daily operation. This seriously affects the reliability and transport service quality of the URT network. In order to measure this influence, passengers are divided into three categories in case of links’ capacities loss. Passengers in category 1 are less affected by the loss of links’ capacities. Their travel is reliable since their travel quality is not significantly reduced. Passengers in category 2 are heavily affected by the loss of links’ capacities. Their travel is not reliable since their travel quality is reduced seriously. However, passengers in category 2 can still travel on the URT. Passengers in category 3 cannot travel on the URT because the passenger flow on their travel paths exceeds the available capacities. Their travel is not reliable. Thus, the proportion of passengers in category 1, whose travel is reliable, is defined as the reliability indicator of the URT network. The transport service quality of the URT network is related to passengers’ travel time, passengers’ transfer times, and whether seats are available to passengers. The generalized travel cost is a comprehensive reflection of travel time, transfer times, and travel comfort. Therefore, passengers’ average generalized travel cost is used as the transport service quality indicator of the URT network. The impact of links’ capacities loss on transport service quality is measured with passengers’ relative average generalized travel cost with and without links’ capacities loss. The proportion of passengers affected by a link and the betweenness of links are used to determine the important links in the URT network. The stochastic user equilibrium distribution model based on the improved logit model is used to determine passengers’ categories and calculate passengers’ generalized travel cost in case of links’ capacities loss; it is solved with the method of successive weighted averages. The reliability and transport service quality indicators of the URT network are calculated from the solution. Taking Wuhan Metro as a case, the reliability and transport service quality of the Wuhan metro network are measured with the indicators and method proposed in this paper. The results show that the proportion of passengers affected by a link can effectively identify important links, which have a great influence on the reliability and transport service quality of the URT network; the important links are mostly connected to transfer stations, and the passenger flow on important links is high; with an increasing number of failed links and proportion of capacity loss, the reliability of the network keeps decreasing, the proportion of passengers in category 3 keeps increasing, and the proportion of passengers in category 2 increases at first and then decreases; when the number of failed links and the proportion of capacity loss increase to a certain level, the decline in transport service quality levels off.
Keywords: urban rail transit network, reliability, transport service quality, links’ capacities loss, important links
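As an illustration of the two indicators defined above, the following Python sketch computes a generalized travel cost per passenger, sorts passengers into the three categories, and reports the reliability (share of category 1) and the average generalized cost; the cost weights, the category-1 tolerance, and the passenger records are assumptions, since the paper derives them from a logit-based stochastic user equilibrium assignment solved by the method of successive weighted averages.

```python
# Minimal sketch (assumptions: illustrative cost weights, tolerance, and
# passenger records; not the equilibrium assignment used in the paper).
VOT, TRANSFER_PENALTY, STANDING_PENALTY = 1.0, 10.0, 5.0

def generalized_cost(p):
    """Generalized travel cost = time + transfer penalty + comfort penalty."""
    return (VOT * p["time"] + TRANSFER_PENALTY * p["transfers"]
            + (0.0 if p["seated"] else STANDING_PENALTY))

def network_indicators(passengers, baseline_cost, tolerance=1.2):
    cat1 = cat2 = cat3 = 0
    costs = []
    for p in passengers:
        if not p["path_feasible"]:            # demand exceeds capacity: category 3
            cat3 += 1
            continue
        c = generalized_cost(p)
        costs.append(c)
        if c <= tolerance * baseline_cost:    # little affected: category 1
            cat1 += 1
        else:                                 # still travels, but degraded: category 2
            cat2 += 1
    reliability = cat1 / len(passengers)      # proportion of reliable travel
    avg_cost = sum(costs) / len(costs) if costs else float("inf")
    return reliability, avg_cost, (cat1, cat2, cat3)

passengers = [
    {"time": 30, "transfers": 1, "seated": True,  "path_feasible": True},
    {"time": 55, "transfers": 2, "seated": False, "path_feasible": True},
    {"time": 0,  "transfers": 0, "seated": False, "path_feasible": False},
]
print(network_indicators(passengers, baseline_cost=40.0))
```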
Procedia PDF Downloads 128
4986 AI-Based Technologies for Improving Patient Safety and Quality of Care
Authors: Tewelde Gebreslassie Gebreanenia, Frie Ayalew Yimam, Seada Hussen Adem
Abstract:
Patient safety and quality of care are essential goals of health care delivery, but they are often compromised by human errors, system failures, or resource constraints. In a variety of healthcare contexts, artificial intelligence (AI), a quickly developing field, can provide fresh approaches to enhancing patient safety and treatment quality. Artificial Intelligence (AI) has the potential to decrease errors and enhance patient outcomes by carrying out tasks that would typically require human intelligence. These tasks include the detection and prevention of adverse events; monitoring and warning patients and clinicians about changes in vital signs, symptoms, or risks; offering individualized and evidence-based recommendations for diagnosis, treatment, or prevention; and assessing and enhancing the effectiveness of health care systems and services. This study examines the state of the art and potential future applications of AI-based technologies for enhancing patient safety and care quality, as well as the opportunities and problems they present for patients, policymakers, researchers, and healthcare providers. In order to ensure the safe, efficient, and responsible application of AI in healthcare, the paper also addresses the ethical, legal, social, and technical challenges that must be addressed and regulated.
Keywords: artificial intelligence, health care, human intelligence, patient safety, quality of care
Procedia PDF Downloads 79
4985 Machine Learning in Patent Law: How Genetic Breeding Algorithms Challenge Modern Patent Law Regimes
Authors: Stefan Papastefanou
Abstract:
Artificial intelligence (AI) is an interdisciplinary field of computer science with the aim of creating intelligent machine behavior. Early approaches to AI were configured to operate in very constrained environments where the behavior of the AI system was previously determined by formal rules. Knowledge was presented as a set of rules that allowed the AI system to determine the results for specific problems; as a structure of if-else rules that could be traversed to find a solution to a particular problem or question. However, such rule-based systems typically have not been able to generalize beyond the knowledge provided. All over the world, and especially in IT-heavy jurisdictions such as the United States, the European Union, Singapore, and China, machine learning has developed into an immense asset, and its applications are becoming more and more significant. It has to be examined how such products of machine learning models can and should be protected by IP law and, for the purpose of this paper, patent law specifically, since it is the IP law regime closest to technical inventions and computing methods in technical applications. Genetic breeding models are currently less popular than recursive neural network methods and deep learning, but this approach can be more easily described by referring to the evolution of natural organisms, and, with increasing computational power, the genetic breeding method, as a subset of evolutionary algorithms, is expected to regain popularity. The research method focuses on the patentability (according to the world’s most significant patent law regimes such as China, Singapore, the European Union, and the United States) of AI inventions and machine learning. Questions of the technical nature of the problem to be solved, the inventive step as such, and the question of the state of the art and the associated obviousness of the solution arise in the current patenting processes. Most importantly, and the key focus of this paper, is the problem of patenting inventions that are themselves developed through machine learning. The inventor of a patent application must be a natural person or a group of persons according to the current legal situation in most patent law regimes. In order to be considered an 'inventor', a person must actually have developed part of the inventive concept. The mere application of machine learning or an AI algorithm to a particular problem should not be construed as the algorithm contributing to a part of the inventive concept. However, when machine learning or the AI algorithm has contributed to a part of the inventive concept, there is currently a lack of clarity regarding the ownership of artificially created inventions. Since not only all European patent law regimes but also the Chinese and Singaporean patent law approaches include identical terms, this paper ultimately offers a comparative analysis of the most relevant patent law regimes.
Keywords: algorithms, inventor, genetic breeding models, machine learning, patentability
Procedia PDF Downloads 109
4984 Sleep Apnea Hypopnea Syndrome Diagnosis Using Advanced ANN Techniques
Authors: Sachin Singh, Thomas Penzel, Dinesh Nandan
Abstract:
Accurate diagnosis of Sleep Apnea Hypopnea Syndrome (SAHS) is a difficult problem for human experts because of variability among persons and unwanted noise. This paper proposes the diagnosis of SAHS using airflow, ECG, pulse, and SaO2 signals. The features of each of these signal types are extracted using statistical methods and ANN learning methods. The extracted features are used to approximate the patient's Apnea Hypopnea Index (AHI) using sample signals in the model. Advanced signal processing is also applied to the snore sound signal to locate snore events, and the SaO2 signal is used to confirm whether a detected snore event is true or noise. Finally, the AHI is calculated from the true snore events detected. Experimental results show that the sensitivity can reach up to 96% and the specificity up to 96% for AHI greater than or equal to 5.
Keywords: neural network, AHI, statistical methods, autoregressive models
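For reference, the sketch below shows how an Apnea Hypopnea Index could be computed once events have been detected; the event list, minimum event duration, and total sleep time are illustrative assumptions, not outputs of the described ANN pipeline.

```python
# Minimal sketch (assumption: apnea/hypopnea events are already detected and
# given as (start, end) times in seconds).
def apnea_hypopnea_index(events, total_sleep_seconds, min_duration=10.0):
    """AHI = confirmed events per hour of sleep."""
    confirmed = [e for e in events if (e[1] - e[0]) >= min_duration]
    hours = total_sleep_seconds / 3600.0
    return len(confirmed) / hours

events = [(120, 135), (400, 408), (900, 925), (1500, 1512)]  # seconds
ahi = apnea_hypopnea_index(events, total_sleep_seconds=6 * 3600)
print(ahi, "SAHS suspected" if ahi >= 5 else "below threshold")
```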
Procedia PDF Downloads 121
4983 Online Authenticity Verification of a Biometric Signature Using Dynamic Time Warping Method and Neural Networks
Authors: Gałka Aleksandra, Jelińska Justyna, Masiak Albert, Walentukiewicz Krzysztof
Abstract:
An offline signature is a well-known, however not the safest, way to verify identity. Nowadays, to ensure proper authentication, e.g. in banking systems, multimodal verification is more widely used. In this paper, online signature analysis based on dynamic time warping (DTW) coupled with machine learning approaches is presented. In our research, signatures made with biometric pens were gathered. Signature features, as well as their forgeries, are described. For verification of authenticity, various methods were used, including convolutional neural networks using the DTW matrix and a multilayer perceptron using sums of DTW matrix paths. System efficiency has been evaluated on signatures and signature forgeries collected on the same day. Results are presented and discussed in this paper.
Keywords: dynamic time warping, handwritten signature verification, feature-based recognition, online signature
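The sketch below implements the classic DTW accumulated-cost matrix that the verification models above consume; the one-dimensional sine signals stand in for real multichannel biometric-pen recordings and are an assumption for illustration.

```python
# Minimal sketch (assumption: 1-D pen-pressure-like sequences; real signatures
# are multichannel and compared feature-wise).
import numpy as np

def dtw_matrix(a, b):
    """Accumulated-cost matrix of classic dynamic time warping."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[1:, 1:]

genuine = np.sin(np.linspace(0, 6, 120))
query   = np.sin(np.linspace(0, 6, 100) + 0.1)   # slightly warped genuine
forgery = np.sin(np.linspace(0, 3, 100))          # dissimilar attempt
print(dtw_matrix(genuine, query)[-1, -1])    # small distance -> likely genuine
print(dtw_matrix(genuine, forgery)[-1, -1])  # larger distance -> likely forged
```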
Procedia PDF Downloads 177
4982 Keypoint Detection Method Based on Multi-Scale Feature Fusion of Attention Mechanism
Authors: Xiaoxiao Li, Shuangcheng Jia, Qian Li
Abstract:
Keypoint detection has always been a challenge in the field of image recognition. This paper proposes a novel keypoint detection method called Multi-Scale Feature Fusion Convolutional Network with Attention (MFFCNA). We verified that multi-scale features with the attention mechanism module have better feature expression capability. Feature fusion between different scales enriches the information the network model can express and makes the network easier to converge. On our self-made street sign corner dataset, we validate the MFFCNA model with an accuracy of 97.8% and a recall of 81%, which are 5 and 8 percentage points higher than the HRNet network, respectively. On the COCO dataset, the AP is 71.9% and the AR is 75.3%, which are 3 and 2 percentage points higher than HRNet, respectively. Extensive experiments show that our method brings a remarkable improvement in keypoint recognition tasks, and the recognition effect is better than that of existing methods. Moreover, our method can be applied not only to keypoint detection but also to image classification and semantic segmentation with good generality.
Keywords: keypoint detection, feature fusion, attention, semantic segmentation
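A minimal PyTorch sketch of the general idea follows, fusing a high-resolution and a low-resolution feature map under a squeeze-and-excitation-style channel attention before a keypoint heatmap head; the channel counts, attention design, and upsampling choice are assumptions and do not reproduce the published MFFCNA architecture.

```python
# Minimal sketch (assumptions: SE-style channel attention, bilinear upsampling,
# and illustrative channel counts; not the published MFFCNA model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))          # squeeze: global average pool
        return x * w[:, :, None, None]           # excite: reweight channels

class MultiScaleFusion(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.attn = ChannelAttention(channels * 2)
        self.head = nn.Conv2d(channels * 2, 1, kernel_size=1)  # keypoint heatmap

    def forward(self, high_res, low_res):
        low_up = F.interpolate(low_res, size=high_res.shape[2:],
                               mode="bilinear", align_corners=False)
        fused = torch.cat([high_res, low_up], dim=1)  # fuse the two scales
        return self.head(self.attn(fused))

model = MultiScaleFusion()
heatmap = model(torch.randn(1, 32, 64, 64), torch.randn(1, 32, 32, 32))
print(heatmap.shape)  # torch.Size([1, 1, 64, 64])
```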
Procedia PDF Downloads 121
4981 Emerging Technology for Business Intelligence Applications
Authors: Hsien-Tsen Wang
Abstract:
Business Intelligence (BI) has long helped organizations make informed decisions based on data-driven insights and gain competitive advantages in the marketplace. In the past two decades, businesses witnessed not only the dramatically increasing volume and heterogeneity of business data but also the emergence of new technologies, such as Artificial Intelligence (AI), the Semantic Web (SW), Cloud Computing, and Big Data. It is plausible that the convergence of these technologies would bring more value out of business data by establishing linked data frameworks and connecting data in ways that enable advanced analytics and improved data utilization. In this paper, we first review and summarize current BI applications and methodology. Emerging technologies that can be integrated into BI applications are then discussed. Finally, we conclude with a proposed synergy framework that aims at achieving a more flexible, scalable, and intelligent BI solution.
Keywords: business intelligence, artificial intelligence, semantic web, big data, cloud computing
Procedia PDF Downloads 99
4980 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging
Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen
Abstract:
Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900-1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests, where the protein value has been determined by the FOSS Infratec NOVA, which is the industry gold standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. In the first dataset, protein regression is the problem to solve, while variety classification is the problem to solve in the second dataset. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted. These results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested. These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques
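Of the preprocessing techniques listed, standard normal variate (SNV) is easy to show concretely; the sketch below applies it per pixel spectrum to a random array standing in for a NIR-HSI cube, with the cube shape and band count being assumptions.

```python
# Minimal sketch (assumption: a random (rows, cols, bands) array stands in for
# a real NIR-HSI cube; SNV is applied independently to every pixel spectrum).
import numpy as np

def snv(cube):
    """Standard normal variate: centre and scale every pixel spectrum."""
    mean = cube.mean(axis=-1, keepdims=True)
    std = cube.std(axis=-1, keepdims=True)
    return (cube - mean) / (std + 1e-12)

cube = np.random.rand(64, 64, 224)     # e.g. 224 bands between 900 and 1700 nm
cube_snv = snv(cube)
print(cube_snv.mean(axis=-1).max(), cube_snv.std(axis=-1).min())  # ~0 and ~1
```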
Procedia PDF Downloads 100
4979 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions when compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
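The sketch below generates the kind of simulated training data such a calibrator needs: SIR trajectories paired with the parameters that produced them (here, the infected-only variant); the deterministic solver, parameter ranges, and population size are illustrative assumptions rather than the paper's ABM setup.

```python
# Minimal sketch (assumptions: a deterministic SIR solver and random parameter
# draws; a CNN/LSTM calibrator would be trained on arrays like these).
import numpy as np

def simulate_sir(beta, gamma, n=1000, i0=10, days=100):
    s, i, r = n - i0, i0, 0
    infected = []
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infected.append(i)
    return np.array(infected)

rng = np.random.default_rng(42)
params = rng.uniform([0.1, 0.05], [0.5, 0.2], size=(500, 2))   # (beta, gamma)
X = np.stack([simulate_sir(b, g) for b, g in params])          # curves: inputs
y = params                                                     # targets to learn
print(X.shape, y.shape)   # (500, 100) infected-only curves and their parameters
```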
Procedia PDF Downloads 27
4978 Dynamic Performance Analysis of Distribution/Sub-Transmission Networks with High Penetration of PV Generation
Authors: Cristian F.T. Montenegro, Luís F. N. Lourenço, Maurício B. C. Salles, Renato M. Monaro
Abstract:
More PV systems are connected to the electrical network each year. As the number of PV systems increases, some issues affecting grid operation have been identified. This paper studies the impacts of changes in solar irradiance on a distribution/sub-transmission network, considering variations due to moving clouds and daily cycles. Using MATLAB/Simulink software, a solar farm of 30 MWp was modeled and then connected to a test network. From the simulations, it has been determined that irradiance changes can have a significant impact on the grid by causing voltage fluctuations outside the allowable thresholds. This work discusses some local control strategies and grid reinforcements to mitigate the negative effects of the irradiance changes on the grid.
Keywords: reactive power control, solar irradiance, utility-scale PV systems, voltage fluctuations
Procedia PDF Downloads 461
4977 Synthesizing an Artificial Loess for Geotechnical Investigations of Collapsible Soil Behavior
Authors: Hamed Sadeghi, Pouya A. Panahi, Hamed Nasiri, Mohammad Sadeghi
Abstract:
Collapsible soils like loess comprise an important category of problematic soils for construction purposes and sustainable development. As a result, research on both the geological and geotechnical aspects of this type of soil has been in progress for decades. However, the considerable natural variability in the physical properties of in-situ loess strata, even within a single block sample, challenges fundamental laboratory investigations. The reason is that it is practically impossible to isolate the effect of a specific factor, such as void ratio, in fair comparisons and thus come to a reliable conclusion. In order to cope with this limitation, two types of artificially made dispersive and calcareous loess are introduced, which can be easily reproduced in any soil mechanics laboratory provided that all their constituents are known and controlled. The collapse potential is explored for a variety of soil water salinities and lime contents, and comparisons are made against the natural soil behavior. Trends are reported for the influence of pore water salinity on collapse potential under different osmotic flow conditions. The most important advantage of artificial loess is the ease of controlling the cementing agent content (such as calcite) or the dispersive potential when studying their influence on mechanical soil behavior.
Keywords: artificial loess, unsaturated soils, collapse potential, dispersive clays, laboratory tests
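For context, collapse potential is commonly reported as the wetting-induced change in specimen height relative to the initial height in an oedometer test; the short sketch below computes it under that assumption, with all measurement values invented for illustration.

```python
# Minimal sketch (assumption: collapse potential taken as the wetting-induced
# height change over the initial specimen height, CP = dH / H0 * 100; the
# numbers below are invented, not test data from the study).
def collapse_potential(height_before_wetting_mm, height_after_wetting_mm,
                       initial_height_mm):
    dH = height_before_wetting_mm - height_after_wetting_mm
    return 100.0 * dH / initial_height_mm        # percent

cp = collapse_potential(19.42, 18.95, 20.0)
print(f"collapse potential = {cp:.1f} %")        # larger values = more collapsible
```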
Procedia PDF Downloads 196
4976 Enhancement Method of Network Traffic Anomaly Detection Model Based on Adversarial Training With Category Tags
Authors: Zhang Shuqi, Liu Dan
Abstract:
To address problems of intelligent network anomaly traffic detection models, such as low detection accuracy caused by the lack of training samples and poor performance in small-sample attack detection, a classification model enhancement method, F-ACGAN (Flow Auxiliary Classifier Generative Adversarial Network), which introduces a generative adversarial network and adversarial training, is proposed. Generating adversarial data with category labels can enhance the training effect and improve classification accuracy and model robustness. F-ACGAN consists of three steps: feature preprocessing, which includes data type conversion, dimensionality reduction, and normalization; a generative adversarial network model with feature learning ability, whose sample generation is improved through adversarial iterations between generator and discriminator; and an adversarial disturbance factor along the gradient direction of the classification model, added to improve the diversity and antagonism of the generated data and to encourage the model to learn adversarial classification features. Experiments constructing a classification model on the UNSW-NB15 dataset show that, with the F-ACGAN enhancement of the base model, classification accuracy improves by 8.09% and the F1 score by 6.94%.
Keywords: data imbalance, GAN, ACGAN, anomaly detection, adversarial training, data augmentation
Procedia PDF Downloads 108
4975 Experimental Assessment of Artificial Flavors Production
Authors: M. Unis, S. Turky, A. Elalem, A. Meshrghi
Abstract:
The esterification kinetics of acetic acid with isopropanol in the presence of sulfuric acid as a homogeneous catalyst were studied with isothermal batch experiments at 60, 70, and 80 °C and at different molar ratios of isopropanol to acetic acid. Investigation of the reaction kinetics indicated that a low molar ratio favors the esterification reaction, because the reaction is acid-catalyzed. The maximum conversion, approximately 60.6%, was obtained at 80 °C for a molar ratio of 1:3 acid to alcohol. It was found that increasing the reaction temperature increases the rate constant and the conversion at a given molar ratio, which is attributed to the esterification being exothermic. The homogeneous reaction has been described with a simple power-law model. The chemical equilibrium calculated from the kinetic model is in agreement with the measured chemical equilibrium.
Keywords: artificial flavors, esterification, chemical equilibria, isothermal
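The sketch below integrates a reversible second-order power-law rate expression of the kind described, tracking acetic acid conversion over time for a 1:3 acid-to-alcohol charge; the rate constant and equilibrium constant are invented placeholders, not the fitted values from the experiments.

```python
# Minimal sketch (assumptions: reversible second-order power-law kinetics and
# placeholder rate/equilibrium constants; not the fitted values of the study).
import numpy as np

def conversion_profile(k_f, K_eq, ca0, cb0, t_end=7200.0, dt=1.0):
    """Integrate A + B <-> E + W with r = k_f * (Ca*Cb - Ce*Cw / K_eq)."""
    ca, cb, ce, cw = ca0, cb0, 0.0, 0.0
    times = np.arange(0.0, t_end, dt)
    x = np.empty_like(times)
    for n, _ in enumerate(times):
        r = k_f * (ca * cb - ce * cw / K_eq)
        ca, cb, ce, cw = ca - r * dt, cb - r * dt, ce + r * dt, cw + r * dt
        x[n] = (ca0 - ca) / ca0            # conversion of acetic acid
    return times, x

t, x = conversion_profile(k_f=2.0e-4, K_eq=4.0, ca0=1.0, cb0=3.0)  # 1:3 acid:alcohol
print(f"conversion after 2 h: {x[-1]:.2f}")   # approaches the equilibrium value
```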
Procedia PDF Downloads 335
4974 Integration of the AS-i Network in Laboratory Automation and Industrial Networks at IFCE
Authors: Jorge Fernandes Teixeira Filho, André Oliveira Alcantara Fontenele, Érick Aragão Ribeiro
Abstract:
The constant emergence of new technologies used in automated processes makes it necessary for teachers and trainers to apply new technologies in their classes. This paper presents an application of a new technology to be employed in a didactic plant, which represents an effluent treatment process located in a laboratory of a federal educational institution. In this work, all components to be placed in the automation laboratory were studied first, in order to determine how to program, parameterize, and organize the plant. The new technologies implemented in the process are essentially an AS-i network, a Profinet network, and a SCADA system, which represented a major innovation in the laboratory. The project makes it possible to carry out various laboratory practices on industrial networks and SCADA systems.
Keywords: automation, industrial networks, SCADA systems, lab automation
Procedia PDF Downloads 550
4973 Addressing the Exorbitant Cost of Labeling Medical Images with Active Learning
Authors: Saba Rahimi, Ozan Oktay, Javier Alvarez-Valle, Sujeeth Bharadwaj
Abstract:
Successful application of deep learning in medical image analysis necessitates unprecedented amounts of labeled training data. Unlike conventional 2D applications, radiological images can be three-dimensional (e.g., CT, MRI), consisting of many instances within each image. The problem is exacerbated when expert annotations are required for effective pixel-wise labeling, which incurs exorbitant labeling effort and cost. Active learning is an established research domain that aims to reduce labeling workload by prioritizing a subset of informative unlabeled examples to annotate. Our contribution is a cost-effective approach for 3D U-Net models that uses Monte Carlo sampling to analyze pixel-wise uncertainty. Experiments on the AAPM 2017 lung CT segmentation challenge dataset show that our proposed framework can achieve promising segmentation results by using only 42% of the training data.
Keywords: image segmentation, active learning, convolutional neural network, 3D U-Net
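A minimal sketch of Monte Carlo-based uncertainty scoring for active learning follows, assuming a tiny dropout classifier and random "unlabeled" feature vectors; the paper applies the analogous idea pixel-wise to 3D U-Net segmentations, which is not reproduced here.

```python
# Minimal sketch (assumptions: a toy dropout classifier and random unlabeled
# samples; the uncertainty measure here is predictive entropy).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Dropout(0.5),
                      nn.Linear(32, 2))

def mc_dropout_uncertainty(model, x, passes=20):
    model.train()                          # keep dropout active during inference
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(passes)])
    mean_p = probs.mean(dim=0)
    entropy = -(mean_p * mean_p.clamp_min(1e-12).log()).sum(dim=-1)
    return entropy                         # high entropy = informative sample

unlabeled = torch.randn(100, 16)
scores = mc_dropout_uncertainty(model, unlabeled)
to_annotate = scores.topk(10).indices      # send the 10 most uncertain to experts
print(to_annotate.tolist())
```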
Procedia PDF Downloads 157
4972 Neural Correlates of Attention Bias to Threat during the Emotional Stroop Task in Schizophrenia
Authors: Camellia Al-Ibrahim, Jenny Yiend, Sukhwinder S. Shergill
Abstract:
Background: Attention bias to threat plays a role in the development, maintenance, and exacerbation of delusional beliefs in schizophrenia, in which patients emphasize the threatening characteristics of stimuli and prioritise them for processing. Cognitive control deficits arise when task-irrelevant emotional information elicits attentional bias and obstructs optimal performance. This study investigates the neural correlates of the interference effect of linguistic threat and whether these effects are independent of delusional severity. Methods: Using event-related functional magnetic resonance imaging (fMRI), the neural correlates of the interference effect of linguistic threat during the emotional Stroop task were investigated, comparing patients with schizophrenia with high (N=17) and low (N=16) paranoid symptoms and healthy controls (N=20). Participants were instructed to identify the font colour of each word presented on the screen as quickly and accurately as possible. Stimulus types varied between threat-relevant, positive, and neutral words. Results: Group differences in whole-brain effects indicate decreased amygdala activity in patients with high paranoid symptoms compared with low paranoid patients and healthy controls. Region-of-interest (ROI) analysis validated our results within the amygdala and examined changes within the striatum, showing a pattern of reduced activation within the clinical group compared to healthy controls. Delusional severity was associated with significantly decreased neural activity in the striatum within the clinical group. Conclusion: Our findings suggest that the emotional interference mediated by the amygdala and striatum may reduce responsiveness to threat-related stimuli in schizophrenia and that attenuation of the fMRI blood-oxygen-level-dependent (BOLD) signal within these areas might be influenced by the severity of delusional symptoms.
Keywords: attention bias, fMRI, schizophrenia, Stroop
Procedia PDF Downloads 202
4971 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification
Authors: Samiah Alammari, Nassim Ammour
Abstract:
When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the previous tasks' data to retrain the model for each upcoming classification. Otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during the new task learning, and the second module tries to learn how to replicate the data of the previous tasks by discovering the latent data structure of the new task dataset. We conduct experiments on the Indian Pines HSI dataset. The results confirm the capability of the proposed method.
Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation
Procedia PDF Downloads 268
4970 Adhering to the Traditional Standard of Originality in the Era of Artificial Intelligence Copyright Protection
Authors: Xiaochen Mu
Abstract:
Whether in common law countries that adhere to the "commercial copyright theory" or in civil law countries that center around "author's rights," the standards for judging originality have undergone continuous adjustments in response to the development of information technology. The adherence to originality standards does not arbitrarily dictate that all types of works be judged according to a single standard of originality, nor does it rigidly ignore the changes in creative methods and dissemination models brought about by technology. Adjustments and interpretations should be allowed based on the different forms of expression of works. Appropriate adjustments and interpretations are our response to technological advancements. However, what should be upheld are the principles and bottom lines of these adjustments and interpretations, namely the legislative intent and purpose of copyright law, which are to encourage the creation and dissemination of outstanding cultural works and to promote the flourishing of culture.
Keywords: generative artificial intelligence, originality, works, copyright
Procedia PDF Downloads 45
4969 Analyzing Keyword Networks for the Identification of Correlated Research Topics
Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita
Abstract:
The production and publication of scientific works have increased significantly in recent years, with the Internet being the main means of access to and distribution of these works. Faced with this, there is a growing interest in understanding how scientific research has evolved, in order to explore this knowledge and encourage research groups to become more productive. Therefore, the objective of this work is to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, the keywords of each article in the study repository are extracted, and in this way the network is characterized, after which several social network analysis metrics are applied to identify the most prominent keywords.
Keywords: bibliometrics, data analysis, extraction and data integration, scientometrics
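A minimal sketch of the kind of keyword co-occurrence network and centrality metrics described follows; the toy keyword sets are assumptions, and the choice of degree and betweenness centrality is illustrative of, not prescribed by, the study.

```python
# Minimal sketch (assumption: a toy list of per-article keyword sets stands in
# for keywords extracted from a publication repository).
import itertools
import networkx as nx

papers = [
    {"neural network", "deep learning", "image processing"},
    {"deep learning", "data analysis"},
    {"data analysis", "bibliometrics", "neural network"},
]

G = nx.Graph()
for kws in papers:
    for a, b in itertools.combinations(sorted(kws), 2):   # co-occurrence edges
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

degree = dict(G.degree(weight="weight"))              # most connected keywords
betweenness = nx.betweenness_centrality(G)            # highest-impact keywords
print(max(degree, key=degree.get), max(betweenness, key=betweenness.get))
```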
Procedia PDF Downloads 260
4968 Monitoring of Water Quality Using Wireless Sensor Network: Case Study of Benue State of Nigeria
Authors: Desmond Okorie, Emmanuel Prince
Abstract:
Availability of potable water has been a global challenge, especially for developing continents and nations such as Africa and Nigeria. The World Health Organization (WHO) has produced the Guidelines for Drinking-water Quality (GDWQ), which aim at ensuring water safety from source to consumer. Potable water parameter tests include physical (colour, odour, temperature, turbidity), chemical (pH, dissolved solids), and biological (algae, phytoplankton) parameters. This paper discusses the use of wireless sensor networks to monitor water quality using efficient and effective sensors that have the ability to sense, process, and transmit sensed data. The integration of a wireless sensor network with a portable sensing device offers distributed sensing capability, on-site data measurement, and remote sensing abilities. The current water quality tests performed by government water quality institutions in Benue State, Nigeria, are carried out at problematic locations and require taking manual water samples to the institution's laboratory for examination. To automate the entire process based on a wireless sensor network, a system was designed. The system consists of a sensor node containing one pH sensor, one temperature sensor, a microcontroller, and a ZigBee radio, and a base station composed of a ZigBee radio and a PC. Due to the advancement of wireless sensor network technology, unexpected contamination events in water environments can be observed continuously. A local area network (LAN), a wireless local area network (WLAN), and Internet web-based solutions are also commonly used as gateway units for data communication via a local base computer using the standard Global System for Mobile communications (GSM). The improvements made in this development demonstrate a water quality monitoring system and the prospect of a more robust and reliable system in the future.
Keywords: local area network, pH measurement, wireless sensor network, ZigBee
Procedia PDF Downloads 174
4967 Histopathological Features of Basal Cell Carcinoma: A Ten Year Retrospective Statistical Study in Egypt
Authors: Hala M. El-hanbuli, Mohammed F. Darweesh
Abstract:
The incidence rates of any tumor vary hugely with geographical location. Basal Cell Carcinoma (BCC) is one of the most common skin cancers and has many histopathologic subtypes. Objective: The aim was to study the histopathological features of BCC cases received in the Pathology Department, Kasr El-Aini hospital, Cairo University, Egypt, during the period from Jan 2004 to Dec 2013, and to evaluate the clinical characteristics through the patient data available in the request sheets. Methods: Slides and data of BCC cases were collected from the archives of the pathology department, Kasr El-Aini hospital. All available slides were reviewed and the BCC cases were histologically classified according to WHO (2006). Results: A total of 310 cases of BCC were included, representing about 65% of the total number of malignant skin tumors examined in the department during the 10-year period. The age ranged from 8 to 84 years, with a mean age of 55.7 ± 15.5 years. Most of the patients (85%) were above the age of 40 years. There was a slight male predominance (55%). Ulcerated BCC was the most common gross picture (60%), followed by the nodular lesion (30%) and finally the ulcerated nodule (10%). Most of the lesions were situated in high-risk sites (77%), where the nose was the most common site (35%), followed by the periocular area (22%), then the periauricular area (15%), and finally the perioral area (5%). No lesion was reported outside the head. The tumor size was less than 2 centimeters in 65% of cases and 2-5 centimeters in the lesions' greatest dimension in the rest of the cases. Histopathological reclassification revealed that nodular BCC was the most common subtype (68%), followed by pigmented nodular BCC (18.75%). The histologic high-risk group represented 7.5%, with about half of these (3.75%) being basosquamous carcinoma. The total incidence of multiple BCC and second primaries was 12%. Recurrent BCC represented 8%. All of the recurrent BCC lesions belonged to the histologic high-risk group. Conclusion: Basal Cell Carcinoma was the most common skin cancer in the 10-year survey. Histopathological diagnosis and classification of BCC cases are essential for determining the tumor type and its biological behavior.
Keywords: basal cell carcinoma, high risk, histopathological features, statistical analysis
Procedia PDF Downloads 151
4966 Social Network Analysis as a Research and Pedagogy Tool in Problem-Focused Undergraduate Social Innovation Courses
Authors: Sean McCarthy, Patrice M. Ludwig, Will Watson
Abstract:
This exploratory case study examines the deployment of Social Network Analysis (SNA) in mapping community assets in an interdisciplinary, undergraduate, team-taught course focused on income-insecure populations in a rural area in the US. Specifically, it analyzes how students were taught to collect data on community assets and to visualize the connections between those assets using Kumu, an SNA data visualization tool. Further, the case study shows how social network data were also collected about student teams via their written communications in Slack, an enterprise messaging tool, which enabled instructors to manage and guide student research activity throughout the semester. The discussion presents how SNA methods can simultaneously inform both community-based research and social innovation pedagogy through the use of data visualization and collaboration-focused communication technologies.
Keywords: social innovation, social network analysis, pedagogy, problem-based learning, data visualization, information communication technologies
Procedia PDF Downloads 148
4965 Analysis and Performance of Handover in Universal Mobile Telecommunications System (UMTS) Network Using OPNET Modeller
Authors: Latif Adnane, Benaatou Wafa, Pla Vicent
Abstract:
Handover is of great significance for achieving seamless connectivity in wireless networks. This paper gives an overview of the main factors affected by soft and hard handover techniques. To understand the handover process in the Universal Mobile Telecommunications System (UMTS) network, different statistics are calculated. This paper focuses on the quality of service (QoS) of soft and hard handover in the UMTS network, which includes the analysis of received power, signal-to-noise ratio, throughput, delay traffic, traffic received, delay, total transmit load, end-to-end delay, and upload response time using the OPNET simulator.
Keywords: handover, UMTS, mobility, simulation, OPNET modeler
Procedia PDF Downloads 322
4964 Accounting for Downtime Effects in Resilience-Based Highway Network Restoration Scheduling
Authors: Zhenyu Zhang, Hsi-Hsien Wei
Abstract:
Highway networks play a vital role in post-disaster recovery for disaster-damaged areas. Damaged bridges in such networks can disrupt recovery activities by impeding the transportation of people, cargo, and reconstruction resources. Therefore, rapid restoration of damaged bridges is of paramount importance to long-term disaster recovery. In the post-disaster recovery phase, the key to restoration scheduling for a highway network is the prioritization of bridge-repair tasks. Resilience is widely used as a measure of a network's ability to recover and return to its pre-disaster level of functionality. In practice, highways will be temporarily blocked during the downtime of bridge restoration, leading to a decrease in highway-network functionality. Failure to take downtime effects into account can lead to overestimation of network resilience. Additionally, post-disaster recovery of highway networks is generally divided into emergency bridge repair (EBR) in the response phase and long-term bridge repair (LBR) in the recovery phase, and EBR and LBR differ in terms of restoration objectives, restoration duration, budget, etc. Distinguishing these two phases is important for precisely quantifying highway network resilience and generating suitable restoration schedules for highway networks in the recovery phase. To address the above issues, this study proposes a novel resilience quantification method for the optimization of long-term bridge repair schedules (LBRS), taking into account the impact of EBR activities and restoration downtime on a highway network's functionality. A time-dependent integer program with recursive functions is formulated for optimally scheduling LBR activities. Moreover, since uncertainty always exists in the LBRS problem, this paper extends the optimization model from the deterministic case to the stochastic case. A hybrid genetic algorithm that integrates a heuristic approach into a traditional genetic algorithm to accelerate the evolution process is developed. The proposed methods are tested using data from the 2008 Wenchuan earthquake, based on a regional highway network in Sichuan, China, consisting of 168 highway bridges on 36 highways connecting 25 cities/towns. The results show that, in this case, neglecting the bridge restoration downtime can lead to approximately 15% overestimation of highway network resilience. Moreover, accounting for the impact of EBR on network functionality can help to generate a more specific and reasonable LBRS. The theoretical and practical values are as follows. First, the proposed network recovery curve contributes to comprehensive quantification of highway network resilience by accounting for the impact of both restoration downtime and EBR activities on the recovery curves. Moreover, this study can improve highway network resilience from the organizational dimension by providing bridge managers with optimal LBR strategies.
Keywords: disaster management, highway network, long-term bridge repair schedule, resilience, restoration downtime
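To make the downtime effect concrete, the sketch below evaluates resilience as the normalised area under a functionality curve for a serial repair plan in which each task temporarily lowers functionality while its bridge is closed; the per-task gains, durations, and downtime losses are invented numbers, not values from the Wenchuan case study.

```python
# Minimal sketch (assumptions: serial repairs, illustrative per-task gains,
# durations, and downtime losses; not the paper's traffic-model-based values).
def resilience(schedule, q0=0.6, horizon=100.0):
    """Normalised area under the functionality curve for a serial repair plan.
    Each task: (functionality gain when done, repair duration, extra loss
    while the bridge is closed for repair)."""
    q, t, area = q0, 0.0, 0.0
    for gain, duration, downtime_loss in schedule:
        area += (q - downtime_loss) * duration   # degraded during the repair
        t += duration
        q += gain                                # restored once repair finishes
    area += q * (horizon - t)                    # remaining horizon at level q
    return area / horizon

plan_a = [(0.2, 10, 0.05), (0.15, 20, 0.02), (0.05, 15, 0.01)]
plan_b = list(reversed(plan_a))
print(resilience(plan_a), resilience(plan_b))   # prioritisation changes resilience
```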
Procedia PDF Downloads 151
4963 AI Ethical Values as Dependent on the Role and Perspective of the Ethical AI Code Founder - A Mapping Review
Authors: Moshe Davidian, Shlomo Mark, Yotam Lurie
Abstract:
With the rapid development of technology and the concomitant growth in the capability and power of Artificial Intelligence (AI) systems, the ethical challenges involved in these systems are also evolving and increasing. In recent years, various organizations, including governments, international institutions, professional societies, civic organizations, and commercial companies, have chosen to address these challenges by publishing ethical codes for AI systems. However, despite the apparent agreement that AI should be “ethical,” there is debate about the definition of “ethical artificial intelligence.” This study investigates the various AI ethical codes and their key ethical values. From the vast collection of codes that exist, it analyzes and compares 25 ethical codes that were found to be representative of different types of organizations. In addition, as part of its literature review, the study overviews data collected in three recent reviews of AI codes. The results of the analyses demonstrate a convergence around seven key ethical values. However, the key finding is that the different AI ethical codes ultimately reflect the type of organization that designed the code; i.e., the organization's role as regulator, user, or developer affects its view of what ethical AI is. The results show a relationship between the organization's role and the dominant values in its code. The main contribution of this study is the development of a list of the key values for all AI systems and of specific values that need to shape the development and design of AI systems, while also allowing for differences according to the organization for which the system is being developed. This will allow an analysis of AI values in relation to stakeholders.
Keywords: artificial intelligence, ethical codes, principles, values
Procedia PDF Downloads 108