Search results for: random features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5642

4832 Integrating Cyber-Physical System toward Advanced Intelligent Industry: Features, Requirements and Challenges

Authors: V. Reyes, P. Ferreira

Abstract:

In response to high levels of competitiveness, industrial systems have evolved to improve productivity. As a consequence, a rapid increase in production volume and, simultaneously, a customization process demand lower costs, greater variety, and consistent product quality. Reducing production cycle time, enabling customizability, and ensuring continuous quality improvement are key features of advanced intelligent industry. In this scenario, customers and producers will be able to participate in the ongoing production life cycle through real-time interaction. To achieve this vision, transparency, predictability, and adaptability are the key features that give industrial systems the capability to adapt to customer demands, modifying the manufacturing process through an autonomous response and acting preventively to avoid errors. The industrial system incorporates a diverse set of components that, in advanced industry, are expected to be decentralized, to communicate end to end, and to make their own decisions through feedback. The evolution towards advanced intelligent industry defines a set of stages that endow components with intelligence and enhance efficiency until the decision-making stage is reached. The integrated system follows an industrial cyber-physical system (CPS) architecture whose real-time integration, based on a set of enabling technologies, links the physical and virtual worlds to generate the digital twin (DT). This instance incorporates sensor data from the real into the virtual world and provides the transparency required for real-time monitoring and control, contributing to important features of the advanced intelligent industry while also improving sustainability. Taking the industrial CPS as the core technology toward the latest stage of advanced intelligent industry, this paper reviews and highlights the correlation and contributions of the enabling technologies for the operationalization of each stage on the path toward advanced intelligent industry. From this research, a real-time integration architecture for a cyber-physical system with applications to collaborative robotics is proposed, and the functionalities and issues required to endow the industrial system with adaptability are identified.

Keywords: cyber-physical systems, digital twin, sensor data, system integration, virtual model

Procedia PDF Downloads 103
4831 A Topological Approach for Motion Track Discrimination

Authors: Tegan H. Emerson, Colin C. Olson, George Stantchev, Jason A. Edelberg, Michael Wilson

Abstract:

Detecting small targets at range is difficult because there is not enough spatial information present in an image sub-region containing the target to use correlation-based methods to differentiate it from dynamic confusers present in the scene. Moreover, this lack of spatial information also disqualifies the use of most state-of-the-art deep learning image-based classifiers. Here, we use characteristics of target tracks extracted from video sequences as data from which to derive distinguishing topological features that help robustly differentiate targets of interest from confusers. In particular, we calculate persistent homology from time-delayed embeddings of dynamic statistics calculated from motion tracks extracted from a wide field-of-view video stream. In short, we use topological methods to extract features related to target motion dynamics that are useful for classification and disambiguation and show that small targets can be detected at range with high probability.
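
As a rough illustration of the time-delay embedding step described in this abstract, the sketch below embeds a scalar track statistic (here a synthetic per-frame speed signal) into a low-dimensional point cloud; the embedding dimension, delay, and signal are illustrative assumptions, and the persistent homology itself would then be computed on the resulting cloud with a separate library such as ripser.

```python
import numpy as np

def time_delay_embedding(x, dim=3, delay=4):
    """Embed a 1D signal into R^dim using lagged copies (Takens-style embedding)."""
    n = len(x) - (dim - 1) * delay
    if n <= 0:
        raise ValueError("signal too short for the chosen dim/delay")
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

# Example: a dynamic statistic (e.g., per-frame speed) computed from a motion track.
rng = np.random.default_rng(0)
speed = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)

cloud = time_delay_embedding(speed, dim=3, delay=4)
print(cloud.shape)  # (192, 3) point cloud; persistent homology is then computed on this cloud
```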

Keywords: motion tracks, persistence images, time-delay embedding, topological data analysis

Procedia PDF Downloads 99
4830 Determination of Klebsiella Pneumoniae Susceptibility to Antibiotics Using Infrared Spectroscopy and Machine Learning Algorithms

Authors: Manal Suleiman, George Abu-Aqil, Uraib Sharaha, Klaris Riesenberg, Itshak Lapidot, Ahmad Salman, Mahmoud Huleihel

Abstract:

Klebsiella pneumoniae is one of the most aggressive multidrug-resistant bacteria associated with human infections, resulting in high mortality and morbidity. Thus, for effective treatment, it is important to determine both the species of the infecting bacteria and their susceptibility to antibiotics. The methods currently used for diagnosing bacterial susceptibility to antibiotics are time-consuming (about 24 h following the first culture), so there is a clear need for rapid alternatives. Infrared spectroscopy is a well-known, sensitive, and simple method that can detect minor biomolecular changes in biological samples associated with developing abnormalities. The main goal of this study is to evaluate the potential of infrared spectroscopy in tandem with the Random Forest and XGBoost machine learning algorithms to diagnose the susceptibility of Klebsiella pneumoniae to antibiotics within approximately 20 minutes of the first culture. In this study, 1190 Klebsiella pneumoniae isolates were obtained from different patients with urinary tract infections. The isolates were measured with an infrared spectrometer, and the spectra were analyzed by the Random Forest and XGBoost algorithms to determine susceptibility to nine specific antibiotics. Our results confirm that it was possible to classify the isolates as sensitive or resistant to specific antibiotics with success rates of 80%-85% across the tested antibiotics. These results demonstrate the promising potential of infrared spectroscopy as a powerful diagnostic method for determining the susceptibility of Klebsiella pneumoniae to antibiotics.
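
The classification step described above can be sketched as follows, assuming each isolate's spectrum is stored as a row of absorbance values and each label encodes sensitivity or resistance to a single antibiotic; the array shapes, synthetic data, and the 80/20 split are placeholders, not the study's protocol, and the same pipeline would apply to an XGBoost classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic placeholder: 1190 isolates x 900 wavenumber bins (shapes are assumptions).
rng = np.random.default_rng(42)
X = rng.normal(size=(1190, 900))      # absorbance values per spectrum
y = rng.integers(0, 2, size=1190)     # 0 = sensitive, 1 = resistant (one antibiotic)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```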

Keywords: urinary tract infection (UTI), Klebsiella pneumoniae, bacterial susceptibility, infrared spectroscopy, machine learning

Procedia PDF Downloads 147
4829 The Role of Urban Development Patterns for Mitigating Extreme Urban Heat: The Case Study of Doha, Qatar

Authors: Yasuyo Makido, Vivek Shandas, David J. Sailor, M. Salim Ferwati

Abstract:

Mitigating extreme urban heat is challenging in a desert climate such as Doha, Qatar, since outdoor daytime temperatures are often too high for the human body to tolerate. Recent studies demonstrate that cities in arid and semiarid areas can exhibit 'urban cool islands', urban areas that are cooler than the surrounding desert. However, how temperatures vary with the time of day, and which factors drive temperature change, remain open questions. To address these questions, we examined the spatial and temporal variation of air temperature in Doha, Qatar by conducting multiple vehicle-based local temperature observations. We also employed three statistical approaches to model surface temperatures using relevant predictors, for three time periods: (1) Ordinary Least Squares, (2) Regression Tree Analysis, and (3) Random Forest. Although the most important determinant factors varied by day and time, distance to the coast was the significant determinant at midday. A 70%/30% holdout method was used to create a testing dataset, and the results were validated through Pearson's correlation coefficient. The Pearson's analysis suggests that the Random Forest model predicts the surface temperatures more accurately than the other methods. We conclude with recommendations about the types of development patterns that show the greatest potential for reducing extreme heat in arid climates.
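
A sketch of the validation scheme mentioned above (a 70%/30% holdout, a Random Forest fitted to the temperature observations, and Pearson's correlation between predicted and observed values on the held-out set); the predictor names and synthetic values are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Illustrative traverse data: each row is one observation point (all columns are assumed predictors).
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "dist_to_coast_m": rng.uniform(0, 20000, 500),
    "ndvi": rng.uniform(0, 0.6, 500),
    "building_height_m": rng.uniform(0, 30, 500),
})
df["temp_c"] = 45 - 0.0004 * df["dist_to_coast_m"] - 5 * df["ndvi"] + rng.normal(0, 1, 500)

X, y = df.drop(columns="temp_c"), df["temp_c"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30, random_state=0)  # 70%/30% holdout

rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
r, p = pearsonr(y_te, rf.predict(X_te))
print(f"Pearson r = {r:.3f} (p = {p:.3g})")
```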

Keywords: desert cities, tree-structure regression model, urban cool island, vehicle temperature traverse

Procedia PDF Downloads 377
4828 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network

Authors: Jia Xin Low, Keng Wah Choo

Abstract:

This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted into a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to hand-craft classification features for a supervised machine learning algorithm; instead, the features are determined automatically through training on the time series provided. The results show that the prediction model provides reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g., remote monitoring of PCG signals.
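
A minimal sketch of the segment-to-matrix idea, assuming each heart-beat segment is resampled to a fixed length whose square root is an integer (4096 samples giving a 64x64 intensity matrix); the segment length and the small CNN below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
import tensorflow as tf

SEG_LEN, SIDE = 4096, 64            # assumed fixed segment length and matrix side (64 * 64 = 4096)

def segment_to_matrix(segment):
    """Reshape a 1D PCG heart-beat segment into a square intensity matrix scaled to [0, 1]."""
    seg = np.resize(np.asarray(segment, dtype=float), SEG_LEN)  # pad/truncate to fixed length
    seg = (seg - seg.min()) / (np.ptp(seg) + 1e-8)
    return seg.reshape(SIDE, SIDE, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SIDE, SIDE, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # normal vs. abnormal heart sound
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```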

Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification

Procedia PDF Downloads 332
4827 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge for doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia has been time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnostic tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods, one of which is the AI approach. This approach has become a major trend in recent years, and several research groups have been working on such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia, and the results of this work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 images in total: 8491 images of abnormal cells and 5398 images of normal cells. In this paper, we design and implement a leukemia diagnostic system for a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, features fused from specific abstraction layers act as auxiliary features and lead to a further improvement in classification accuracy. Features extracted from the lower levels are combined into higher-dimensional feature maps to improve the discriminative capability of intermediate features and to mitigate vanishing or exploding gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model's performance and their pros and cons will be presented at the conference.
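
The transfer-learning feature-fusion idea can be sketched as below: pooled features from pretrained VGG19 and ResNet50 backbones are concatenated and fed to a small dense classifier; the pooling choice, feature dimensions, and classifier head are assumptions, not the authors' exact hybrid architecture.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import VGG19, ResNet50
from tensorflow.keras.applications.vgg19 import preprocess_input as vgg_prep
from tensorflow.keras.applications.resnet50 import preprocess_input as resnet_prep

# Pretrained backbones used as fixed feature extractors (global-average-pooled outputs).
vgg = VGG19(weights="imagenet", include_top=False, pooling="avg")        # 512-d features
resnet = ResNet50(weights="imagenet", include_top=False, pooling="avg")  # 2048-d features

def hybrid_features(images):
    """images: float array of shape (n, 224, 224, 3) holding raw RGB pixel values."""
    f1 = vgg.predict(vgg_prep(images.copy()), verbose=0)
    f2 = resnet.predict(resnet_prep(images.copy()), verbose=0)
    return np.concatenate([f1, f2], axis=1)            # fused 2560-d representation per image

# The fused features then feed a small dense head (normal vs. leukemic cell).
head = tf.keras.Sequential([
    tf.keras.Input(shape=(512 + 2048,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
head.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```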

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 176
4826 Evaluation of Gesture-Based Password: User Behavioral Features Using Machine Learning Algorithms

Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier

Abstract:

Graphical passwords have existed for decades. Their major advantage is that they are easier to remember than alphanumeric passwords. However, their disadvantage (especially for recognition-based passwords) is a smaller password space, making them more vulnerable to brute-force attacks; graphical passwords are also highly susceptible to shoulder surfing. The gesture-based password method that we developed is a grid-free, template-free method. In this study, we evaluated gesture-based passwords for usability and vulnerability, and the results are significant. We developed a gesture-based password application for data collection with two modes: creation mode and replication mode. In creation mode (Session 1), users were asked to create six different passwords and re-enter each password five times. In replication mode, users saw a password image created by another user for a fixed duration; three durations were used to mimic a shoulder-surfing attack: 5 seconds (Session 2), 10 seconds (Session 3), and 15 seconds (Session 4). After the timer expired, the password image was removed, and users were asked to replicate the password. A total of 74, 57, 50, and 44 users participated in Sessions 1, 2, 3, and 4, respectively. Machine learning algorithms were applied to determine whether a person is a genuine user or an imposter based on the password entered. Five algorithms were deployed to compare authentication performance: Decision Trees, Linear Discriminant Analysis, the Naive Bayes classifier, Support Vector Machines (SVMs) with a Gaussian radial basis kernel function, and K-Nearest Neighbors. Gesture-based password features vary from one entry to the next, which makes it difficult to distinguish between the creator and an intruder during authentication. For each password entered by a user, four features were extracted: password score, password length, password speed, and password size. All four features were normalized before being fed to a classifier. Three different classifiers were trained: Classifiers A, B, and C were trained and tested using data from the password creation session together with the password replication sessions with timers of 5, 10, and 15 seconds, respectively. The classification accuracies for Classifier A using the five ML algorithms are 72.5%, 71.3%, 71.9%, 74.4%, and 72.9%, respectively; for Classifier B, 69.7%, 67.9%, 70.2%, 73.8%, and 71.2%; and for Classifier C, 68.1%, 64.9%, 68.4%, 71.5%, and 69.8%. SVMs with a Gaussian radial basis kernel outperform the other ML algorithms for gesture-based password authentication. The results confirm that the shorter the duration of the shoulder-surfing attack, the higher the authentication accuracy. In conclusion, behavioral features extracted from gesture-based passwords lead to less vulnerable user authentication.
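
A sketch of the authentication step, assuming each password entry is reduced to the four features named above (score, length, speed, size) with a binary genuine/imposter label; the synthetic data and the SVM settings are illustrative.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Each row: [password_score, password_length, password_speed, password_size]; label 1 = genuine user.
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))
y = rng.integers(0, 2, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(MinMaxScaler(), SVC(kernel="rbf", gamma="scale", C=1.0))  # normalize, then RBF SVM
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```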

Keywords: authentication, gesture-based passwords, machine learning algorithms, shoulder-surfing attacks, usability

Procedia PDF Downloads 87
4825 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are major challenges for all types of media, especially social media. As large social networks such as Facebook and Twitter have admitted, there is a great deal of false information, fake likes and views, and duplicated accounts. Much of the information appearing on social media is doubtful and in some cases misleading, and it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the best techniques for reducing data size is feature selection, whose aim is to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information. The detection performance was improved in two respects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
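
The four-step idea can be sketched as follows; the rule for picking a representative feature per cluster (the column closest to its centroid) and the synthetic data are assumptions, since the abstract does not spell out the selection rule.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 300))      # 400 news items x 300 text features (illustrative)
y = rng.integers(0, 2, size=400)     # 1 = fake, 0 = real

# Steps 1-2: cluster the feature columns by their similarity across documents.
k = 30
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X.T)

# Step 3: keep one representative feature per cluster (the column closest to its centroid).
selected = []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(X.T[members] - km.cluster_centers_[c], axis=1)
    selected.append(members[np.argmin(dists)])
X_reduced = X[:, selected]

# Step 4: classify fake vs. real news on the reduced feature set with an SVM.
print("CV accuracy:", cross_val_score(SVC(kernel="linear"), X_reduced, y, cv=5).mean())
```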

Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine

Procedia PDF Downloads 160
4824 A Comparative Study of Social Entrepreneurship Centers in Universities of the World

Authors: Farnoosh Alami, Nazgol Azimi

Abstract:

Universities have recently paid much attention to the subject of social entrepreneurship. As a result, many highly ranked universities have established centers in this area. The present research investigates the vision and mission of the social entrepreneurship centers of universities ranked in the top 50 of the 2013 Shanghai ranking. It seeks the common goals and features of their missions, visions, and activities that have led to their present success. The investigation is based on the web content of the top 10 universities, six of which had social entrepreneurship centers. This is a qualitative study, and the findings are based on content analysis of documents. The findings confirm that education, research, talent development, innovative solutions, and support for social innovation are shared elements of these centers' visions. With regard to their missions, social participation, networking, and leader education are the most commonly shared features. Their common activities fall into five categories: education, research, support, promotion, and networking.

Keywords: comparative study, qualitative research, social entrepreneurship centers, universities in the world

Procedia PDF Downloads 282
4823 Wind Velocity Mitigation for Conceptual Design: A Spatial Decision Support Framework

Authors: Mohamed Khallaf, Hossein M Rizeei

Abstract:

Simulating wind pattern behavior over proposed urban features is critical in the early, conceptual design stage of both the architectural and urban disciplines. However, it is typically not possible for designers to explore the impact of wind flow profiles across new urban developments, due to a lack of real data and inaccurate estimation of building parameters. Modeling the details of existing and proposed urban features and testing them against wind flows is the missing piece of the conceptual design puzzle on which the architectural and urban disciplines can focus. This research aims to develop a spatial decision-support design method utilizing LiDAR, GIS, and performance-based wind simulation technology to mitigate wind-related hazards in a design by simulating alternative design scenarios at the pedestrian level prior to implementation in Sydney, Australia. The results of the experiment demonstrate the capability of the proposed framework to improve pedestrian comfort with respect to the wind profile.

Keywords: spatial decision-support design, performance-based wind simulation, LiDAR, GIS

Procedia PDF Downloads 106
4822 Combination between Intrusion Systems and Honeypots

Authors: Majed Sanan, Mohammad Rammal, Wassim Rammal

Abstract:

Today, security is a major concern. Intrusion detection and prevention systems and honeypots can be used to moderate attacks. Many researchers have proposed IDSs (intrusion detection systems) over time, and some combine the features of two or more IDSs into what are called hybrid intrusion detection systems. Most researchers combine the features of signature-based and anomaly-based detection methodologies. For a signature-based IDS, if an attacker attacks slowly and in an organized way, the attack may go undetected, since signatures include factors based on the duration of events and the attacker's actions do not match them. Sometimes there is no signature for an unknown attack, or the attacker strikes while the signature database is being updated; thus, signature-based IDSs fail to detect unknown attacks. Anomaly-based IDSs, in turn, suffer from many false-positive readings. There is therefore a need to hybridize IDSs so that they can overcome each other's shortcomings. In this paper, we propose a new approach to intrusion detection that is more efficient than traditional IDSs. The proposed IDS is based on honeypot technology and anomaly-based detection. We designed an architecture for the IDS in Packet Tracer and then implemented it in real time. The experimental results show that both the honeypot and the anomaly-based IDS have shortcomings on their own, but when the two technologies are hybridized, the proposed hybrid intrusion detection system (HIDS) is capable of overcoming these shortcomings with much enhanced performance. We present a modified hybrid intrusion detection system (HIDS) that combines the positive features of two different detection methodologies: honeypot methodology and anomaly-based intrusion detection. In the experiment, we first ran each intrusion detection system individually and then ran them together, recording data over time. From these data, we conclude that the resulting IDS detects intrusions much better than the existing IDSs.

Keywords: security, intrusion detection, intrusion prevention, honeypot, anomaly-based detection, signature-based detection, cloud computing, kfsensor

Procedia PDF Downloads 359
4821 Feature-Based Summarizing and Ranking from Customer Reviews

Authors: Dim En Nyaung, Thin Lai Lai Thein

Abstract:

Due to the rapid growth of the Internet, web opinion sources are dynamically emerging that are useful to both potential customers and product manufacturers for prediction and decision purposes. These are user-generated contents written in natural language in an unstructured, free-text form. Opinion mining techniques have therefore become popular for automatically processing customer reviews to extract product features and the user opinions expressed about them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve mining performance. In this paper, our work is dedicated to the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all features are determined by the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a certain feature. A probabilistic supervised learning model improves the results and is more flexible and effective.

Keywords: opinion mining, opinion summarization, sentiment analysis, text mining

Procedia PDF Downloads 312
4820 Evaluation and Analysis of ZigBee-Based Wireless Sensor Network: Home Monitoring as Case Study

Authors: Omojokun G. Aju, Adedayo O. Sule

Abstract:

The ZigBee wireless sensor and control network is one of the most widely deployed wireless technologies of recent years, because ZigBee is an open-standard, lightweight, low-cost, low-speed, low-power protocol that allows true interoperability between systems. It is built on the existing IEEE 802.15.4 protocol and therefore combines the IEEE 802.15.4 features with newly added features to meet the required functionalities, thereby finding applications in a wide variety of wireless networked systems. ZigBee's current focus is on embedded, general-purpose, inexpensive, self-organising networks that require low to medium data rates, a high number of nodes, and very low power consumption, such as home/industrial automation, embedded sensing, medical data collection, smart lighting, safety and security sensor networks, and monitoring systems. Although the ZigBee design specification includes security features to protect the confidentiality and integrity of data communication, security is normally traded off when simplicity and low cost are the goals. A lot of research has been carried out on ZigBee technology, with emphasis mainly placed on ZigBee network performance characteristics such as energy efficiency, throughput, robustness, packet delay, and delivery ratio in different scenarios and applications. This paper investigates and analyses the data accuracy, network implementation difficulties, and security challenges of ZigBee network applications in star-based and mesh-based topologies, with emphasis on a home monitoring application using ZigBee ProBee ZE-10 development boards for the network setup. The paper also exposes some factors that need to be considered when designing ZigBee network applications and suggests ways in which ZigBee networks can be designed to be more resilient to network attacks.

Keywords: home monitoring, IEEE 802.15.4, topology, wireless security, wireless sensor network (WSN), ZigBee

Procedia PDF Downloads 362
4819 Impact of Tablet Based Learning on Continuous Assessment (ESPRIT Smart School Framework)

Authors: Mehdi Attia, Sana Ben Fadhel, Lamjed Bettaieb

Abstract:

Mobile technology has become part of our daily lives and assists learners, whatever their level and age, in their learning process through various mobile devices (laptops, tablets, etc.). This paper presents a new tablet-based learning framework. The solution has been developed and tested at ESPRIT (Ecole Supérieure Privée d'Ingénierie et de Technologies), a Tunisian school of engineering, and is named ESSF: Esprit Smart School Framework. In this work, the main features of the proposed solution are listed, particularly its impact on the learners' evaluation process. Learner assessment has always been a critical component of the learning process, as it measures students' knowledge. However, traditional evaluation methods, in which the learner is evaluated once or twice a year, cannot reflect his or her real level, which is why a continuous assessment (CA) process becomes necessary. In this context, we show that ESSF offers many important features that enhance and facilitate the implementation of the CA process.

Keywords: continuous assessment, mobile learning, tablet based learning, smart school, ESSF

Procedia PDF Downloads 315
4818 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict corn yield from meteorological records. The prediction models used in this paper can be classified into model-driven and data-driven approaches, according to their modeling methodologies. Model-driven approaches are based on mechanistic crop modeling: they describe crop growth in interaction with the environment as dynamical systems. However, the calibration of such dynamical systems is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yields in many situations). It is tested with CORNFLO, a crop model for maize growth. The data-driven approach to yield prediction, on the other hand, is free of the complex biophysical process but imposes strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression, and Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network, and SVM regression). The dataset consists of 720 records of corn yield at county scale, provided by the United States Department of Agriculture (USDA), and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that, among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several additional perspectives: the mechanistic model can potentially help to identify the stresses suffered by the crop or the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
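
A sketch of the data-driven evaluation loop described above (5-fold cross-validation of a Random Forest scored with RMSEP and MAEP); the synthetic climate predictors and yields are placeholders for the USDA records, and the errors are reported here in absolute units rather than percentages.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(720, 12))                                 # 720 county records x 12 climate predictors (assumed)
y = 9.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, 720)    # synthetic corn yield (t/ha)

rmsep, maep = [], []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    rf.fit(X[train_idx], y[train_idx])
    err = rf.predict(X[test_idx]) - y[test_idx]
    rmsep.append(np.sqrt(np.mean(err ** 2)))   # root mean square error of prediction
    maep.append(np.mean(np.abs(err)))          # mean absolute error of prediction

print(f"RMSEP = {np.mean(rmsep):.3f}, MAEP = {np.mean(maep):.3f}")
```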

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 217
4817 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to loss of lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficient. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases were studied for analyzing the stability of slopes using the popular Finite Element Method, and the data thus obtained were used as training data for the supervised machine learning models. The input data were trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data were used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
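
One way to frame the workflow described above is sketched below: each FEM case is summarized by the external factors, the Factor of Safety (with an assumed threshold of 1.0) defines a stable/unstable label, and a Random Forest is evaluated on held-out cases; the synthetic relationship between the factors and the Factor of Safety is purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(11)
n = 600
# Assumed inputs per FEM case: degree of saturation (%), rainfall intensity (mm/h), seismic coefficient.
X = np.column_stack([
    rng.uniform(0, 100, n),
    rng.uniform(0, 50, n),
    rng.uniform(0.0, 0.3, n),
])
fos = 1.6 - 0.004 * X[:, 0] - 0.005 * X[:, 1] - 1.5 * X[:, 2] + rng.normal(0, 0.05, n)
y = (fos >= 1.0).astype(int)    # 1 = stable (Factor of Safety >= 1.0), 0 = unstable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=400, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```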

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 88
4816 Drug-Drug Interaction Prediction in Diabetes Mellitus

Authors: Rashini Maduka, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

Drug-drug interactions (DDIs) can happen when two or more drugs are taken together, and they have become a serious health issue due to adverse drug effects. In vivo and in vitro methods for identifying DDIs are time-consuming and costly, so in-silico approaches are preferred for DDI identification. Most machine learning models for DDI prediction use chemical and biological drug properties as features. However, some drug features are not available and are costly to extract, so it is better to perform automatic feature engineering. Furthermore, people who have diabetes often already suffer from other diseases and take more than one medicine at a time, so adverse drug effects may occur in diabetic patients and cause unpleasant reactions. In this study, we present a model with a graph convolutional autoencoder and a graph decoder, using a dataset from DrugBank version 5.1.3. The main objective of the model is to identify unknown interactions between antidiabetic drugs and the drugs taken by diabetic patients for other diseases. We use automatic feature engineering, with known DDIs as the only input to the model. Our model achieved 0.86 in AUC and 0.86 in AP.
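
Only the evaluation step is sketched here: given scores that a trained graph autoencoder/decoder assigns to candidate drug pairs and binary labels for known interactions, AUC and AP are computed as reported; the scores below are random placeholders, not model output.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(5)
labels = rng.integers(0, 2, size=2000)             # 1 = known DDI, 0 = assumed non-interacting pair
scores = 0.6 * labels + 0.4 * rng.random(2000)     # placeholder decoder scores in [0, 1]

print("AUC:", roc_auc_score(labels, scores))
print("AP :", average_precision_score(labels, scores))
```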

Keywords: drug-drug interaction prediction, graph embedding, graph convolutional networks, adverse drug effects

Procedia PDF Downloads 81
4815 Association of Musculoskeletal and Radiological Features with Clinical and Serological Findings in Systemic Sclerosis: A Single-Centre Registry Study

Authors: Rezvan Hosseinian

Abstract:

Aim: Systemic sclerosis (SSc) is a chronic connective tissue disease with the clinical hallmark of skin thickening and tethering. The correlation of musculoskeletal features with other parameters should be considered in SSc patients. Methods: We reviewed the records of all patients who had more than one visit and standard anteroposterior radiography of the hand. We used univariate analysis, and factors with p<0.05 were included in logistic regression to identify dependent factors. Results: Overall, 180 SSc patients were enrolled in our study, 161 (89.4%) of whom were women. The median age (IQR) was 47.0 years (16), and 52% had a diffuse subtype of the disease. In multivariate analysis, tendon friction rubs (TFRs) were associated with the presence of calcinosis, muscle tenderness, and flexion contracture (FC) on physical examination (p<0.05). Arthritis showed no differences between the two subtypes of the disease (p=0.98), and in multivariate analysis, there were no correlations between radiographic arthritis and serological and clinical features. The radiographic results indicated that disease duration correlated with joint erosion, acro-osteolysis, resorption of the distal ulna, calcinosis and radiologic FC (p<0.05). Acro-osteolysis was more frequent in the dcSSc subtype and in patients with TFRs and anti-TOPO I antibody. Radiologic FC showed an association with skin score, calcinosis and haematocrit <30% (p<0.05). Joint flexion on radiography was associated with disease duration, modified Rodnan skin score, calcinosis, and low haematocrit (p<0.01). Conclusion: Disease duration was a main dependent factor for developing joint erosion, acro-osteolysis, bone resorption, calcinosis, and flexion contracture on hand radiography. Acro-osteolysis presented in the severe form of the disease and was the only dependent variable associated with bone demineralization.

Keywords: disease subsets, hand radiography, joint erosion, sclerosis

Procedia PDF Downloads 67
4814 Association of Musculoskeletal and Radiological Features with Clinical and Serological Findings in Systemic Sclerosis: A Single-Centre Registry Study

Authors: Nasrin Azarbani

Abstract:

Aim: Systemic sclerosis (SSc) is a chronic connective tissue disease with the clinical hallmark of skin thickening and tethering. The correlation of musculoskeletal features with other parameters should be considered in SSc patients. Methods: We reviewed the records of all patients who had more than one visit and standard anteroposterior radiography of the hand. We used univariate analysis, and factors with p<0.05 were included in logistic regression to identify dependent factors. Results: Overall, 180 SSc patients were enrolled in our study, 161 (89.4%) of whom were women. The median age (IQR) was 47.0 years (16), and 52% had a diffuse subtype of the disease. In multivariate analysis, tendon friction rubs (TFRs) were associated with the presence of calcinosis, muscle tenderness, and flexion contracture (FC) on physical examination (p<0.05). Arthritis showed no differences between the two subtypes of the disease (p=0.98), and in multivariate analysis, there were no correlations between radiographic arthritis and serological and clinical features. The radiographic results indicated that disease duration correlated with joint erosion, acro-osteolysis, resorption of the distal ulna, calcinosis and radiologic FC (p<0.05). Acro-osteolysis was more frequent in the dcSSc subtype and in patients with TFRs and anti-TOPO I antibody. Radiologic FC showed an association with skin score, calcinosis and haematocrit <30% (p<0.05). Joint flexion on radiography was associated with disease duration, modified Rodnan skin score, calcinosis, and low haematocrit (p<0.01). Conclusion: Disease duration was a main dependent factor for developing joint erosion, acro-osteolysis, bone resorption, calcinosis, and flexion contracture on hand radiography. Acro-osteolysis presented in the severe form of the disease and was the only dependent variable associated with bone demineralization.

Keywords: sclerosis, disease subsets, joint erosion, musculoskeletal

Procedia PDF Downloads 49
4813 Computer-Aided Exudate Diagnosis for the Screening of Diabetic Retinopathy

Authors: Shu-Min Tsao, Chung-Ming Lo, Shao-Chun Chen

Abstract:

Many diabetes patients suffer from retinal complications of the disease; therefore, early detection and early treatment are important. In clinical examinations, the color fundus image is the most convenient and widely available examination method, and the status of the retina can be confirmed from the exudates appearing in the retinal image. However, routine screening of diabetic retinopathy from color fundus images is a time-consuming task for physicians. This study therefore proposed a computer-aided exudate diagnosis for the screening of diabetic retinopathy. After removing the vessels and optic disc from the retinal image, six quantitative features, including region number, region area, and gray-scale values, were extracted from the remaining regions for classification. All six features were evaluated to be statistically significant (p-value < 0.001). The accuracy of classifying the retinal images into normal and diabetic retinopathy reached 82%. Based on this system, the clinical workload could be reduced, and the examination procedure could be made more efficient.

Keywords: computer-aided diagnosis, diabetic retinopathy, exudate, image processing

Procedia PDF Downloads 253
4812 Asia Pacific University of Technology and Innovation

Authors: Esther O. Adebitan, Florence Oyelade

Abstract:

The Millennium Development Goals (MDGs) were initiated by the UN member nations' aspiration for the betterment of human life and are expressed as a set of numerical and time-bound targets. More recently, the aspiration has been shifting from mere achievement to the sustainability of the achieved MDGs beyond the 2015 target. The main objective of this study was to assess how much the hotel industry within the Nigerian Federal Capital Territory (FCT), as a member of the global community, is involved in the achievement of sustainable MDGs within the FCT. The study had two population groups, consisting of 160 hotels and the communities where these are located. A stratified random sampling technique was adopted to select 60 hotels based on a large, medium, and small hotel categorisation, while a simple random sampling technique was used to elicit information from 30 residents of three of the hotels' host communities. The study was guided by three research questions and two hypotheses aimed at ascertaining whether hotels see the need to be involved in, and have policies in pursuit of, achieving sustained MDGs, and at determining public opinion regarding hotels' contribution towards the achievement of the MDGs in their communities. A 22-item questionnaire was designed and administered to hotel managers, while an 11-item questionnaire was designed and administered to the hotels' host communities. Frequency distributions, percentages, and the chi-square test were used to analyse the data. Results showed no significant involvement of the hotel industry in achieving sustained MDGs in the FCT and a disconnect between the hotels and their immediate communities. The study recommended that hotels should, as part of their corporate social responsibility, pick at least one of the goals to work on in order to be involved in the attainment of enduring Millennium Development Goals.

Keywords: MDGs, hotels, FCT, host communities, corporate social responsibility

Procedia PDF Downloads 401
4811 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing

Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall

Abstract:

Cutting tools with ceramic inserts are often used when machining many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress, which leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during high-speed machining and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded with an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques and were then related to the ceramic insert's crater wear area.
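
The per-frame feature extraction (spark area and intensity) might look like the sketch below, using a simple grayscale threshold; the threshold value and the use of plain NumPy instead of the authors' image-processing pipeline are assumptions.

```python
import numpy as np

def spark_features(gray_frame, threshold=200):
    """Return (area in pixels, mean intensity, total intensity) of spark pixels in one frame.

    gray_frame: 2D uint8 array from the photograph, already converted to grayscale.
    """
    mask = gray_frame >= threshold                  # bright pixels assumed to be sparks
    area = int(mask.sum())
    mean_intensity = float(gray_frame[mask].mean()) if area else 0.0
    total_intensity = float(gray_frame[mask].sum())
    return area, mean_intensity, total_intensity

# Example with a synthetic frame; real frames would come from the recorded SLR photographs.
frame = (np.random.default_rng(2).random((480, 640)) * 255).astype(np.uint8)
print(spark_features(frame))
```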

Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear

Procedia PDF Downloads 282
4810 Exploring Individual Decision Making Processes and the Role of Information Structure in Promoting Uptake of Energy Efficient Technologies

Authors: Rebecca J. Hafner, Daniel Read, David Elmes

Abstract:

The current research applies decision-making theory to the problem of increasing the uptake of energy-efficient technologies in the marketplace, where uptake is currently slower than rational choice models would predict. Specifically, in two studies we apply the alignable/non-alignable features effect and explore the impact of varying information structure on consumers' preference for standard versus energy-efficient technologies. As researchers in the Interdisciplinary centre for Storage, Transformation and Upgrading of Thermal Energy (i-STUTE) are currently developing energy-efficient heating systems for homes and businesses, we focus on the context of home heating choice and compare preference for a standard condensing boiler versus an energy-efficient heat pump under experimental manipulations of the structure of prior information. In Study 1, we find that people prefer stronger alignable features when options are similar, an effect which is mediated by an increased tendency to infer that missing information is the same. Yet, in contrast to previous research, we find no effect of alignability on option preference when options differ. The methodological approach used here, the first of its kind to randomly allocate features as either alignable or non-alignable, highlights potential design effects in previous work. Study 2 is designed to explore the interaction between alignability and construal level as an explanation for the shift in attentional focus when options differ. Theoretical and applied implications for promoting energy-efficient technologies are discussed.

Keywords: energy-efficient technologies, decision-making, alignability effects, construal level theory, CO2 reduction

Procedia PDF Downloads 316
4809 The Causes and Effects of Delinquent Behaviour among Students in Juvenile Home: A Case Study of Osun State

Authors: Baleeqs O. Adegoke, Adeola O. Aburime

Abstract:

Juvenile delinquency is fast becoming one of the largest problems facing many societies, due to many different factors ranging from parental factors to bullying at school, all of which have led to different theoretical notions by different scholars. Delinquency is illegal or immoral behaviour, especially by a young person who behaves in a way that is illegal or that society does not approve of. The purpose of the study was to investigate the causes and effects of delinquent behaviour among adolescents in a juvenile home in Osun State. A descriptive survey research design was employed. A random sampling technique was used to select 100 adolescents in the juvenile home in Osun State, and questionnaires were developed and administered to them. The data collected were analyzed using frequency counts and percentages for the demographic data in section A, while the two research hypotheses postulated for this study were tested using the t-test at the 0.05 level of significance. Findings revealed that the greatest school-related effect of delinquent behaviour among the adolescents in the juvenile home, as reported by respondents, was their aggressive behaviour. Findings also revealed a significant difference in the causes and effects of delinquent behaviour among adolescents in the juvenile home in Osun State, and no significant difference in the causes and effects of delinquent behaviour among secondary school students in Osun based on gender. The following recommendations were made to address the findings of this study: more teachers should be appointed to the observation home so that the different age groups of delinquents can be taught; developing the infrastructure of short-stay homes and the observation home is a top priority; and regular counselling sessions are highly essential for these juveniles.

Keywords: behaviour, delinquency, juvenile, random sampling, statistical techniques, survey

Procedia PDF Downloads 174
4808 Multimodal Convolutional Neural Network for Musical Instrument Recognition

Authors: Yagya Raj Pandeya, Joonwhoan Lee

Abstract:

The dynamic behavior of music and video makes it difficult for a computer system to evaluate musical instrument playing in a video. Any television or film video clip with music information is a rich source for analyzing musical instruments using modern machine learning technologies. In this research, we integrate the audio and video information sources using convolutional neural networks (CNNs) and pass the network-learned features through a recurrent neural network (RNN) to preserve the dynamic behaviors of audio and video. We use different pre-trained CNNs for music and video feature extraction and then fine-tune each model. The music network uses a 2D convolutional network, and the video network uses 3D convolution (C3D). Finally, we concatenate the music and video features while preserving their time-varying characteristics. A long short-term memory (LSTM) network is used for long-term dynamic feature characterization, followed by late fusion with a generalized mean. The proposed audio-video multimodal neural network achieves better performance in recognizing musical instruments.
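
The late-fusion step alone is sketched below: per-class scores from the audio and video branches are combined with a generalized (power) mean, where p = 1 gives the arithmetic mean and larger p moves toward the maximum; the value of p and the class names are assumptions.

```python
import numpy as np

def generalized_mean_fusion(audio_scores, video_scores, p=3.0, eps=1e-8):
    """Late fusion of two non-negative per-class score vectors with the generalized (power) mean."""
    stacked = np.stack([audio_scores, video_scores])      # shape (2, n_classes)
    fused = ((stacked ** p).mean(axis=0) + eps) ** (1.0 / p)
    return fused / fused.sum()                            # renormalize to a distribution

audio = np.array([0.70, 0.20, 0.10])   # e.g., P(violin), P(piano), P(drums) from the audio branch
video = np.array([0.40, 0.50, 0.10])   # the same classes from the C3D/video branch
print(generalized_mean_fusion(audio, video))
```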

Keywords: multimodal, 3D convolution, music-video feature extraction, generalized mean

Procedia PDF Downloads 197
4807 Preventive Maintenance of Rotating Machinery Based on Vibration Diagnosis of Rolling Bearing

Authors: T. Bensana, S. Mekhilef

Abstract:

Vibration-based condition monitoring technology has been developing rapidly in recent years, suiting the maintenance of sophisticated and complicated machines. The ability of wavelet analysis to efficiently detect non-stationary, non-periodic, and transient features of the vibration signal makes it a valuable tool for condition monitoring. This paper presents a methodology for fault diagnosis of rolling element bearings based on the wavelet envelope power spectrum technique, analysed in both the time and frequency domains. In the time domain, the autocorrelation of the wavelet de-noised signal is applied to evaluate the period of the fault pulses. In the frequency domain, the wavelet envelope power spectrum is used to identify the fault frequencies, with the single-sided complex Laplace wavelet as the mother wavelet function. Results show the superiority of the proposed method and its effectiveness in extracting fault features from the raw vibration signal.
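
The two analysis steps named above can be approximated with standard tools as sketched below: an envelope power spectrum via the Hilbert transform to expose fault frequencies, and the autocorrelation of the envelope to estimate the fault-pulse period; the wavelet de-noising with a complex Laplace mother wavelet is not reproduced, and the synthetic signal and sampling rate are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs = 12_000                                     # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 105.0                              # assumed bearing fault frequency (Hz)

# Synthetic stand-in for a de-noised bearing signal: a 3 kHz resonance
# amplitude-modulated at the fault frequency, plus noise.
signal = (1 + 0.8 * np.sin(2 * np.pi * fault_freq * t)) * np.sin(2 * np.pi * 3000 * t)
signal += 0.2 * rng.standard_normal(len(t))

# Frequency domain: the envelope (Hilbert transform) power spectrum exposes the fault frequency.
envelope = np.abs(hilbert(signal))
power = np.abs(np.fft.rfft(envelope - envelope.mean())) ** 2
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print("dominant envelope frequency (Hz):", freqs[1:][np.argmax(power[1:])])

# Time domain: the first strong autocorrelation peak of the envelope estimates the fault-pulse period.
env0 = envelope - envelope.mean()
ac = np.correlate(env0, env0, mode="full")[len(env0) - 1:]
lag = np.argmax(ac[50:fs // 4]) + 50            # search lags between ~4 ms and 250 ms
print("estimated pulse period (s):", lag / fs)
```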

Keywords: preventive maintenance, fault diagnostics, rolling element bearings, wavelet de-noising

Procedia PDF Downloads 359
4806 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface

Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari

Abstract:

With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
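
For reference, the fully verified (no missing gold standard) nonparametric VUS estimator is sketched below as the fraction of correctly ordered class-1/2/3 triples; the bias-corrected estimators studied in the paper build on this quantity together with imputation or weighting machinery that is not reproduced here, and the tie-handling convention and synthetic data are assumptions.

```python
import numpy as np

def vus_empirical(x1, x2, x3):
    """Empirical volume under the ROC surface: proportion of class-1/2/3 triples with x1 < x2 < x3.

    x1, x2, x3: test results for subjects in diagnostic classes 1, 2 and 3 (all verified by the GS test).
    Ties are given half credit per tied comparison (one simple convention).
    """
    a = x1[:, None, None]
    b = x2[None, :, None]
    c = x3[None, None, :]
    ordered = ((a < b) + 0.5 * (a == b)) * ((b < c) + 0.5 * (b == c))
    return float(ordered.mean())

rng = np.random.default_rng(0)
x1 = rng.normal(0.0, 1.0, 40)   # e.g., healthy
x2 = rng.normal(1.0, 1.0, 35)   # intermediate
x3 = rng.normal(2.0, 1.0, 30)   # diseased
print("VUS:", vus_empirical(x1, x2, x3))   # 1/6 corresponds to a useless test, 1 to perfect ordering
```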

Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis

Procedia PDF Downloads 401
4805 Effect of Monotonically Decreasing Parameters on Margin Softmax for Deep Face Recognition

Authors: Umair Rashid

Abstract:

Softmax loss is normally used as the supervision signal in face recognition (FR) systems, and it boosts the separability of features. In the last two years, a number of techniques have been proposed that reformulate the original softmax loss to enhance the discriminative power of deep convolutional neural networks (DCNNs) for FR. To learn angularly discriminative features, cosine-margin-based softmax must be adjusted with a monotonically decreasing angular function, which is the main challenge for angular-based softmax. On that issue, we propose a monotonically decreasing element for cosine-margin-based softmax and discuss the effect of different monotonically decreasing parameters on the angular margin softmax for FR. We train the model on the publicly available CASIA-WebFace dataset using our proposed monotonically decreasing parameters for the cosine function, and tests on YouTube Faces (YTF), Labeled Faces in the Wild (LFW), VGGFace1, and VGGFace2 attain state-of-the-art performance.
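
A minimal NumPy sketch of a cosine-margin softmax of the general kind discussed (CosFace-style: L2-normalized features and class weights, a fixed margin m subtracted from the target cosine, and a scale s); the paper's proposed monotonically decreasing element is not reproduced here, and s and m are illustrative values.

```python
import numpy as np

def cosine_margin_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """CosFace-style large-margin cosine loss: subtract margin m from the target-class cosine.

    features: (n, d) embeddings; weights: (d, c) class weight vectors; labels: (n,) integer class ids.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                     # (n, c) cosine similarities
    rows = np.arange(len(labels))
    logits = s * cos
    logits[rows, labels] = s * (cos[rows, labels] - m)
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[rows, labels].mean()

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 128))
W = rng.normal(size=(128, 10))
y = rng.integers(0, 10, size=8)
print("loss:", cosine_margin_softmax_loss(feats, W, y))
```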

Keywords: deep convolutional neural networks, cosine margin face recognition, softmax loss, monotonically decreasing parameter

Procedia PDF Downloads 81
4804 Human-Wildlife Conflicts in Urban Areas of Zimbabwe

Authors: Davie G. Dave, Prisca H. Mugabe, Tonderai Mutibvu

Abstract:

Globally, human-wildlife conflicts (HWCs) are on the rise. Such is the case in urban areas of Zimbabwe, yet little has been documented about it. This study was done to provide insights into the occurrence of human-wildlife conflicts in urban areas. It was carried out in Harare, Bindura, Masvingo, Beitbridge, and Chiredzi to determine the cause, nature, extent, and frequency of occurrence of HWCs, to determine the key wildlife species involved in conflicts, and to examine the management practices used to combat wildlife conflicts in these areas. Several sampling techniques, encompassing multi-stage, stratified random, purposive, and simple random sampling, were employed for placing residential areas into three strata according to population density, selecting residential areas, and selecting the actual participants. Data were collected through a semi-structured questionnaire and key informant interviews. The results revealed that property destruction and crop damage were the most prevalent conflicts. Of the 15 animals that were cited, snakes, baboons, and monkeys were associated with the most conflicts. The occurrence of HWCs was mainly attributed to the increase in both animal and human populations. To curtail these HWCs, local people mainly used non-lethal methods, whilst lethal methods were used by the authorities for some of the reported cases. The majority of the conflicts were seasonal and less severe. Respondents raised growing concerns about wildlife conflicts, especially in areas with primates, such as Warren Park in Harare and Limpopo View in Beitbridge. There are HWC hotspots in urban areas, and to ameliorate this, a multi-action approach is needed that includes general awareness campaigns on HWCs and land use planning that creates green spaces to ease wildlife management.

Keywords: human-wildlife conflicts, mitigation measures, residential areas, types of conflicts, urban areas

Procedia PDF Downloads 50
4803 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still at an early stage, as this is a relatively new phenomenon that has attracted society's interest. Machine learning helps to solve complex problems and to build AI systems, especially in cases where we have tacit knowledge or knowledge that is not otherwise captured. We used machine learning algorithms for the identification of fake news, applying three classifiers: Passive Aggressive, Naive Bayes, and Support Vector Machine. Simple classification alone is not sufficient for fake news detection because generic classification methods are not specialized for fake news; with the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into the classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
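
A minimal sketch of the described pipeline: TF-IDF text features fed to the three named classifiers (Passive Aggressive, Naive Bayes, and a linear SVM); the toy corpus stands in for the publicly available datasets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy corpus standing in for the public fake-news datasets (label 1 = fake, 0 = real).
texts = [
    "shocking cure doctors do not want you to know",
    "government confirms new budget figures",
    "celebrity secretly replaced by clone",
    "central bank raises interest rates by a quarter point",
] * 50
labels = [1, 0, 1, 0] * 50

X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, test_size=0.3, random_state=0)

vec = TfidfVectorizer(stop_words="english")
Xtr, Xte = vec.fit_transform(X_tr), vec.transform(X_te)

for name, clf in [("Passive Aggressive", PassiveAggressiveClassifier(max_iter=1000)),
                  ("Naive Bayes", MultinomialNB()),
                  ("Linear SVM", LinearSVC())]:
    clf.fit(Xtr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(Xte)))
```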

Keywords: fake news detection, natural language processing, machine learning, classification techniques

Procedia PDF Downloads 146