Search results for: features based techniques

33516 Video Processing of a Football Game: Detecting Features of a Football Match for Automated Calculation of Statistics

Authors: Rishabh Beri, Sahil Shah

Abstract:

We applied a range of filters and processing steps to extract the various features of the football game, such as the field lines of the pitch. Another important aspect was detecting the players on the field and tagging them by team, distinguished by their jersey colours. Combining this extracted information about the players and the field allowed us to create a virtual field consisting of the playing field and the players mapped to their locations in it.
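
The jersey-colour tagging step can be illustrated with a short OpenCV sketch. This is a minimal illustration, not the authors' implementation: the frame path and the HSV colour range are placeholders that would be tuned to the actual footage and team colours.

```python
# Minimal sketch of tagging one team's players by jersey colour (OpenCV).
# The file name and the HSV range are illustrative assumptions.
import cv2
import numpy as np

frame = cv2.imread("match_frame.jpg")                    # placeholder frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Threshold one team's jersey colour (a red-ish range, assumed)
mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))

# Connected blobs above a minimum area are tagged as players of that team
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
players = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 150]
print(f"{len(players)} candidate players detected for this team")
```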

Keywords: Detect, Football, Players, Virtual

Procedia PDF Downloads 328
33515 Degradation of Heating, Ventilation, and Air Conditioning Components across Locations

Authors: Timothy E. Frank, Josh R. Aldred, Sophie B. Boulware, Michelle K. Cabonce, Justin H. White

Abstract:

Materials degrade at different rates in different environments depending on factors such as temperature, aridity, salinity, and solar radiation. Therefore, predicting asset longevity depends, in part, on the environmental conditions to which the asset is exposed. Heating, ventilation, and air conditioning (HVAC) systems are critical to building operations yet are responsible for a significant proportion of their energy consumption. HVAC energy use increases substantially with slight operational inefficiencies. Understanding the environmental influences on HVAC degradation in detail will inform maintenance schedules and capital investment, reduce energy use, and increase lifecycle management efficiency. HVAC inspection records spanning 14 years from 21 locations across the United States were compiled and associated with the climate conditions to which they were exposed. Three environmental features were explored in this study: average high temperature, average low temperature, and annual precipitation, as well as four non-environmental features. Initial insights showed no correlations between individual features and the rate of HVAC component degradation. Using neighborhood component analysis, however, the most critical features related to degradation were identified. Two models were considered, and results varied between them. However, longitude and latitude emerged as potentially the best predictors of average HVAC component degradation. Further research is needed to evaluate additional environmental features, increase the resolution of the environmental data, and develop more robust models to achieve more conclusive results.
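
The key analytical step, neighborhood component analysis, can be illustrated with scikit-learn. This is a hedged sketch, not the authors' code: the inspection data here are synthetic, and degradation is binned into toy classes purely so the analysis can run.

```python
# Minimal sketch of feature relevance via Neighborhood Components Analysis.
# The feature matrix and labels are synthetic stand-ins for the inspection data.
import numpy as np
from sklearn.neighbors import NeighborhoodComponentsAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: avg high temp, avg low temp, precipitation, plus 4 other features
X = rng.normal(size=(400, 7))
y = rng.integers(0, 3, 400)       # binned degradation rate (toy labels)

nca = NeighborhoodComponentsAnalysis(n_components=2, random_state=0)
nca.fit(StandardScaler().fit_transform(X), y)
# Large-magnitude weights in the learned transform flag the most critical features
importance = np.abs(nca.components_).sum(axis=0)
print(importance.round(2))
```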

Keywords: climate, degradation, HVAC, neighborhood component analysis

Procedia PDF Downloads 429
33514 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction

Authors: Ning Kang, Marius Doornenbal

Abstract:

Funding is a very important – if not crucial – resource for research projects. Usually, funding organizations will publish a description of the funded research to describe the scope of the funding award. Logically, we would expect research outcomes to align with this funding award. For that reason, we might be able to predict future research topics based on present funding award data. That said, it remains to be shown if and how future research topics can be predicted by using funding information. In this paper, we extract funding project information and the abstracts of the papers generated by those projects from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf and cosine similarity between these two sets of features. We show that the cosine similarity for the project–generated-papers group is greater than for the project–baseline group, and also that these two groups of similarities are significantly different. Based on this result, we conclude that funding information does correlate with the content of future research output for the funded project at the topical level. How funding really changes the course of science or of scientific careers remains an elusive question.
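
The core similarity computation can be sketched with scikit-learn. This is an illustrative stand-in, not the authors' pipeline: word n-grams replace the noun-phrase extraction, and the three documents are invented examples.

```python
# Minimal sketch of tf-idf + cosine similarity between an award and two papers.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

award = "novel machine learning methods for crop yield prediction"
project_paper = "machine learning models predicting crop yield from satellite data"
baseline_paper = "survey of medieval trade routes in northern europe"

vec = TfidfVectorizer(ngram_range=(1, 3))   # crude stand-in for noun-phrase features
X = vec.fit_transform([award, project_paper, baseline_paper])

sim = cosine_similarity(X[0], X[1:])        # award vs. each paper
print(f"project paper: {sim[0, 0]:.3f}, baseline paper: {sim[0, 1]:.3f}")
```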

Keywords: natural language processing, noun phrase, tf-idf, cosine similarity

Procedia PDF Downloads 244
33513 Ultrasonic Techniques to Characterize and Monitor Water-in-Oil Emulsion

Authors: E. A. Alshaafi, A. Prakash

Abstract:

Oil-water emulsions are commonly encountered in various industrial operations and at different stages of crude oil production and processing. Emulsions are often difficult to track and treat and can cause a number of costly problems which need to be avoided. The characteristics of the emulsion phase can vary with crude composition and the types of impurities present in the oil. The objectives of this study are the development of ultrasonic techniques to track and characterize the emulsion phase generated during production and cleaning of crude oil. The position of the emulsion layer is monitored with the help of ultrasonic probes suitably placed in the vessel. The sensitivity of the technique and its potential have been demonstrated based on extensive testing with different oil samples. The technique is also being developed to monitor emulsion phase characteristics such as stability, composition, and droplet size distribution. The ultrasonic parameters recorded are changes in acoustic velocity, signal attenuation, and its frequency spectrum. Emulsions have been prepared with a light mineral oil sample, and the effects of various factors, including mixing speed, temperature, surfactant, and solid particle concentrations, have been investigated. The applied frequency of the ultrasonic waves has been varied from 1 to 5 MHz to carry out a sensitivity analysis. Emulsion droplet structure is observed with optical microscopy, and stability is examined by tracking the changes in ultrasonic parameters with time. A model based on ultrasonic attenuation spectroscopy is being developed and tested to track changes in droplet size distribution with time.

Keywords: ultrasonic techniques, emulsion, characterization, droplet size

Procedia PDF Downloads 172
33512 Keypoints Extraction for Markerless Tracking in Augmented Reality Applications: A Case Study in Dar As-Saraya Museum

Authors: Jafar W. Al-Badarneh, Abdalkareem R. Al-Hawary, Abdulmalik M. Morghem, Mostafa Z. Ali, Rami S. Al-Gharaibeh

Abstract:

Archeological heritage is at the heart of each country’s national glory. Moreover, it can develop into a source of national income. Heritage management requires socially responsible marketing that achieves high visitor satisfaction while maintaining high site conservation. We have developed an Augmented Reality (AR) experience for heritage and cultural preservation at the Dar As-Saraya museum in Jordan. Our application of this notion relied on a markerless tracking approach. This approach uses a keypoint extraction technique, in which features of the environment are identified and defined in the system as keypoints. A set of these keypoints forms a tracker for an augmented object to be displayed and overlaid on a real scene at the Dar As-Saraya museum. We tested and compared several techniques for markerless tracking and then applied the best technique to complete a mosaic artifact with AR content. The successful results of our application open the door for applications in open archeological sites, where markerless tracking is most needed.
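
Keypoint extraction and matching of this kind can be sketched with OpenCV. This is a generic illustration, not the museum system itself: ORB stands in for whichever detector performed best in the comparison, and the image paths are placeholders.

```python
# Minimal sketch of keypoint extraction and descriptor matching (OpenCV ORB).
# The image file names are placeholders.
import cv2

ref = cv2.imread("mosaic_reference.jpg", cv2.IMREAD_GRAYSCALE)   # stored tracker view
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)     # live camera frame

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(ref, None)
kp2, des2 = orb.detectAndCompute(frame, None)

# Match binary descriptors with Hamming distance; good matches anchor the overlay
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} matches; best distance {matches[0].distance}")
```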

Keywords: augmented reality, cultural heritage, keypoints extraction, virtual recreation

Procedia PDF Downloads 336
33511 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has further matured and is gradually being applied in the fields of the Internet of Things, Unmanned Aerial Vehicle cluster communication, remote sensing scenarios, etc. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitter, we need to extract the semantic information of remote sensing images, but there are some problems. The traditional semantic communication system based on a Convolutional Neural Network (CNN) cannot take into account both the global and the local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based one, to extract the image semantic features. In this paper, we first perform pre-processing operations on the remote sensing images to improve their resolution, in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the preprocessed image. The Vision Transformer structure can better handle the huge data volume and extract better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear, so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and with image coding methods such as BPG and JPEG, to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
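
The wavelet pre-processing step lends itself to a short sketch using PyWavelets and OpenCV. This is a minimal illustration under stated assumptions: the Haar wavelet, the 2x scale factor, and the random placeholder band are choices made for the example, not details from the paper.

```python
# Minimal sketch of wavelet-based resolution enhancement: decompose, interpolate
# sub-bands (bilinear for high-freq, bicubic for low-freq), then reconstruct.
import numpy as np
import pywt
import cv2

img = np.random.rand(256, 256).astype(np.float32)   # placeholder remote sensing band

# Decompose into low-frequency (cA) and high-frequency (cH, cV, cD) sub-bands
cA, (cH, cV, cD) = pywt.dwt2(img, "haar")

scale = 2  # target upscaling factor, assumed
size = (cA.shape[1] * scale, cA.shape[0] * scale)
cA_up = cv2.resize(cA, size, interpolation=cv2.INTER_CUBIC)    # bicubic on low-freq
highs = [cv2.resize(c, size, interpolation=cv2.INTER_LINEAR)   # bilinear on high-freq
         for c in (cH, cV, cD)]

# Inverse transform reassembles a higher-resolution image
enhanced = pywt.idwt2((cA_up, tuple(highs)), "haar")
print(enhanced.shape)  # (512, 512) for a 256x256 input with scale 2
```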

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 77
33510 Profiling Risky Code Using Machine Learning

Authors: Zunaira Zaman, David Bohannon

Abstract:

This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tunable false-positive and false-negative rates, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS-Command Injection, cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques, and CNN models with ensemble modelling techniques, did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
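
The path-context representation at the heart of Code2Vec can be illustrated in miniature. The sketch below uses Python's ast module on Python source purely for brevity (the study targets Java and C++ codebases), and the leaf and path handling is simplified relative to a real extractor.

```python
# Minimal sketch of Code2Vec-style (token, path, token) contexts from an AST.
import ast
import itertools

def leaf_paths(tree):
    """Collect (root-to-leaf node-type path, leaf token) pairs from an AST."""
    paths = []
    def walk(node, prefix):
        label = type(node).__name__
        if isinstance(node, (ast.Name, ast.Constant, ast.arg)):   # treat as leaves
            token = getattr(node, "id", None) or getattr(node, "arg", None) \
                    or repr(node.value)
            paths.append((prefix + [label], str(token)))
            return
        for child in ast.iter_child_nodes(node):
            walk(child, prefix + [label])
    walk(tree, [])
    return paths

def path_contexts(source):
    """Enumerate (token, path, token) triples for every pair of leaves."""
    contexts = []
    for (p1, t1), (p2, t2) in itertools.combinations(leaf_paths(ast.parse(source)), 2):
        i = 0                      # walk down the shared ancestor prefix
        while i < min(len(p1), len(p2)) - 1 and p1[i] == p2[i]:
            i += 1
        path = p1[i - 1:][::-1] + p2[i:]   # up to common ancestor, then down
        contexts.append((t1, "^".join(path), t2))
    return contexts

for ctx in path_contexts("def f(x):\n    return x + 1")[:4]:
    print(ctx)
```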

Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties

Procedia PDF Downloads 105
33509 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings

Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir

Abstract:

Acute myocardial infarction is a major cause of death in the world. Therefore, its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pains, together with changes in the ST segment and T wave of the ECG, occur shortly before the start of myocardial infarction. In this study, a technique which detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings was constructed, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs are analyzed for each patient: the pre-inflation ECG, acquired before any catheter insertion, and the occlusion ECG, acquired during balloon inflation. By using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events using ST-T derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters by using the grid-search method and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain the optimal classification performance. Applying the developed classification technique to real ECG recordings shows that the proposed technique provides highly reliable detection of the anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy stage, the detection of acute myocardial ischemia based only on ECG recordings obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to detect outliers that would correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of different threshold values. For different discrimination threshold values and numbers of ECG segments, the probability of detection and the probability of false alarm are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for the GMM-based classification. Moreover, the comparison between the performances of the SVM- and GMM-based classification showed that SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
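
The per-patient hyperparameter tuning described above maps directly onto scikit-learn's grid search. The sketch below is illustrative only: the ST/T feature matrix and labels are synthetic stand-ins, and the parameter grid is an assumption.

```python
# Minimal sketch of SVM tuning with grid search and 10-fold cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                 # ST/T-derived features per ECG segment
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 1 = ischemic, 0 = non-ischemic (toy)

param_grid = {"kernel": ["linear", "rbf"],
              "C": [0.1, 1, 10, 100],
              "gamma": ["scale", 0.01, 0.1]}
search = GridSearchCV(SVC(), param_grid, cv=10)  # grid search with 10-fold CV
search.fit(X, y)
print(search.best_params_, f"CV accuracy: {search.best_score_:.3f}")
```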

Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine

Procedia PDF Downloads 160
33508 An Intelligent Baby Care System Based on IoT and Deep Learning Techniques

Authors: Chinlun Lai, Lunjyh Jiang

Abstract:

Due to the heavy burden and pressure of caring for infants, an integrated automatic baby watching system based on IoT smart sensing and deep learning machine vision techniques is proposed in this paper. By monitoring infant body conditions, such as heartbeat, breathing, body temperature, and sleeping posture, as well as the surrounding conditions, such as dangerous/sharp objects, light, noise, humidity, and temperature, the proposed system can analyze and predict obvious or potential dangerous conditions from the observed data and then adopt suitable actions in real time to protect the infant from harm, thus reducing the caregiver's burden and improving the safety and efficiency of the caring work. The experimental results show that the proposed system works successfully for infant care and can thus be implemented practically in various fields of daily life.

Keywords: baby care system, Internet of Things, deep learning, machine vision

Procedia PDF Downloads 223
33507 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories, on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag of Words approach. The score selected to represent the occurrence of the tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets for the four experiments, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram-based feature extraction. These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
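
The local-alignment scoring at the core of the approach is compact enough to sketch in full. Below is a standard Smith-Waterman score computation applied to token sequences; the scoring weights and the toy documents are illustrative, not the study's settings.

```python
# Minimal Smith-Waterman local alignment score over token sequences.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

# Token-level similarity between two toy documents
doc1 = "patient is obese with high bmi".split()
doc2 = "the patient is obese and has high bmi".split()
print(smith_waterman(doc1, doc2))
```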

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 296
33496 Study of the Tribological Behavior of Cutting Tools Based on Coating

Authors: A. Achour, L. Chekour, A. Mekroud

Abstract:

Tribology, the science of lubrication, friction, and wear, plays an important role at the "crossroads" of sciences opened up by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest. It covers all the sciences that deal with the contact between two loaded solids in relative motion. It thus lies at the intersection of more clearly established disciplines such as solid and fluid mechanics, rheology, thermal science, materials science, and chemistry. Its experimental approach is based on physics and on the processing of signals and images. The optimization of cutting-tool operating conditions must contribute significantly to the development and productivity of advanced automated machining techniques, because their implementation requires sufficient knowledge of how the process behaves, and in particular of the evolution of tool wear. In addition, technological advances have expanded the use of very hard, refractory materials of difficult machinability, requiring highly resistant tool materials. In this study, we present the wear behavior of a machining tool during the roughing operation as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the tool cutting edges using the latest techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.

Keywords: friction, wear, tool, cutting

Procedia PDF Downloads 330
33505 Design and Analysis of Proximity Fed Single Band Microstrip Patch Antenna with Parasitic Lines

Authors: Inderpreet Kaur, Sukhjit Kaur, Balwinder Singh Sohi

Abstract:

The design proposed in this paper mainly focuses on the implementation of a single-feed compact rectangular microstrip patch antenna (MSA) for single-band applications. The antenna presented here also works in dual-band mode, but its best performance is obtained when optimised to work in single-band mode. In this paper, a new feeding structure is applied in the patch antenna design to overcome undesirable features of earlier multilayer feeding structures while maintaining their interesting features. To make the proposed antenna more efficient, the antenna design parameters were optimized using HFSS's Optimetrics. For the proposed antenna, one resonant frequency was obtained at 6.03 GHz, with a bandwidth of 167 MHz and a return loss of -33.82 dB. The characteristics of the designed structure are investigated by using an FEM-based electromagnetic solver.

Keywords: bandwidth, return loss, parasitic lines, microstrip antenna

Procedia PDF Downloads 460
33504 Speech Emotion Recognition with Bi-GRU and Self-Attention based Feature Representation

Authors: Bubai Maji, Monorama Swain

Abstract:

Speech is considered an essential and most natural medium for interaction between machines and humans. However, extracting effective features for speech emotion recognition (SER) remains challenging. Previous studies show that temporal information is captured, but high-level temporal-feature learning has yet to be investigated. In this paper, we present an efficient novel method using the self-attention (SA) mechanism in combination with a Convolutional Neural Network (CNN) and a Bi-directional Gated Recurrent Unit (Bi-GRU) network to learn high-level temporal features. To further enhance the representation of the high-level temporal features, we integrate the Bi-GRU output with learnable-weight features via SA, improving performance. We evaluate our proposed method on our self-built SITB-OSED database and the IEMOCAP database. We report that the experimental results of our proposed method achieve state-of-the-art performance on both databases.
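
A compact model of this shape can be sketched in TensorFlow/Keras. This is a minimal sketch, assuming log-mel spectrogram inputs; the layer sizes, input shape, and the simple dot-score attention are illustrative choices, not the authors' configuration.

```python
# Minimal CNN + Bi-GRU + self-attention sketch for SER (shapes are assumptions).
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_ser_model(time_steps=300, n_mels=40, n_classes=7):
    inp = layers.Input(shape=(time_steps, n_mels))
    x = layers.Conv1D(64, 5, padding="same", activation="relu")(inp)     # local features
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)   # temporal context
    # Simple self-attention: score each time step, softmax, weighted sum
    scores = layers.Dense(1)(x)                       # (batch, time, 1)
    weights = layers.Softmax(axis=1)(scores)          # attention over time
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])
    out = layers.Dense(n_classes, activation="softmax")(context)
    return Model(inp, out)

model = build_ser_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```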

Keywords: Bi-GRU, 1D-CNNs, self-attention, speech emotion recognition

Procedia PDF Downloads 112
33503 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers. Therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system that indicates whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and the connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted; the local features are minutiae points, curvature orientation, and curve plateau, and the global features are signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which the features are enhanced before being fed into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
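
The pre-processing phase can be sketched with OpenCV and scikit-image. This is an illustrative pipeline only: the file name is a placeholder, Otsu thresholding stands in for whichever binarization the authors used, and the structuring-element size is an assumption.

```python
# Minimal sketch of signature pre-processing: scale, binarize, dilate, thin.
import cv2
import numpy as np
from skimage.morphology import skeletonize

img = cv2.imread("signature.png", cv2.IMREAD_GRAYSCALE)   # placeholder file
img = cv2.resize(img, (300, 150))                          # image scaling
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
binary = cv2.dilate(binary, np.ones((3, 3), np.uint8))     # dilation bridges ridge breaks
thinned = skeletonize(binary > 0)                          # thinning to 1-px strokes
print(thinned.sum(), "stroke pixels after thinning")
```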

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 144
33502 Bridge Health Monitoring: A Review

Authors: Mohammad Bakhshandeh

Abstract:

Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.

Keywords: structural health monitoring (SHM), bridge health monitoring (BHM), sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis

Procedia PDF Downloads 88
33501 Wastes of Oil Drilling: Treatment Techniques and Their Effectiveness

Authors: Abbas Hadj Abbas, Hacini Massaoud, Aiad Lahcen

Abstract:

In the Hassi Messaoud oil industry, water-based mud systems (WBM) are generally used for drilling the first phase. For the rest of the well, oil-based mud systems (OBM) are employed. In the field of oil exploration, a panoply of chemical products is employed in the formulation of drilling fluids. These components are of different natures, and their toxicity and biodegradability are ill-defined parameters; they are, however, discharged into nature. In addition to hydrocarbons (HC, such as diesel), which are a major constituent of oil-based mud, spills of a variety of other products and additives can also be observed on drilling sites. These wastes are usually stored in places called "crud wastes". They may cause major problems for the ecosystem. To treat these wastes, we considered two methods: solidification/stabilization (chemical) and thermal treatment. So that we could evaluate the treatment techniques, a series of analyses was performed on dozens of waste specimens before treatment. After that, and on the basis of our analyses of the wastes, we carried out diagnostics of the pollution before and after solidification/stabilization. Finally, we performed analyses before and after the thermal treatment to check the efficiency of the methods followed in the study.

Keywords: waste treatment, oil pollution, norms, drilling wastes

Procedia PDF Downloads 290
33500 An Evaluative Microbiological Risk Assessment of Drinking Water Supply in the Carpathian Region: Identification of Occurrent Hazardous Bacteria with Quantitative Microbial Risk Assessment Method

Authors: Anikó Kaluzsa

Abstract:

The author aims to introduce and analyze those microbiological safety hazards which indicate the presence of secondary contamination in the water supply system. Since drinking water is a primary food and a basic condition of life, special attention should be paid to its quality. Among the microbiological features that can be found in water, there are indicators which are clear evidence of water contamination; where these are present, there is no need to perform other diagnostics, because they properly prove the contamination of the given water supply section. Laboratory analysis can help, both technologically and temporally, to identify contamination, but it matters how long removal takes and whether the disinfection process takes place in time. Identifying factors that often occur in the same places, or whose chance of occurrence is greater than average, facilitates this work. Pathogen microbiological risk assessment, with the help of several features, determines the microbiological features most likely to occur in the Carpathian Basin. Among all the microbiological indicators that are recommended targets for routine inspection by the World Health Organization, the appearance of Escherichia coli in the water network is of paramount importance, as its presence indicates the potential presence of enteric pathogens or other contaminants in the water network. In addition, the author presents the steps of microbiological risk assessment, analyzing those pathogenic micro-organisms registered as the most critical.

Keywords: drinking water, E. coli, microbiological indicators, risk assessment, water safety plan

Procedia PDF Downloads 330
33499 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and Long-Term Evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed method can significantly improve positioning performance and reduce radio map construction costs.
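
The t-SNE feature extraction step can be sketched with scikit-learn. This is a minimal sketch under assumptions: the fingerprints are random placeholders for hybrid WLAN/LTE RSSI vectors, and the t-SNE settings are illustrative. Note also that vanilla t-SNE cannot embed new points, so a production system needs an out-of-sample extension.

```python
# Minimal sketch of t-SNE on a fingerprint matrix (reference points x RSSI features).
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
fingerprints = rng.normal(size=(500, 120))   # 500 reference points x 120 RSSI features

# t-SNE compresses the noisy high-dimensional fingerprints into a low-dimensional
# embedding that preserves local neighborhood structure between reference points.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(fingerprints)
print(embedding.shape)  # (500, 2)
```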

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 70
33498 Epileptic Seizure Prediction Focusing on Relative Change in Consecutive Segments of EEG Signal

Authors: Mohammad Zavid Parvez, Manoranjan Paul

Abstract:

Epilepsy is a common neurological disorder characterized by sudden, recurrent seizures. The electroencephalogram (EEG) is widely used to diagnose possible epileptic seizures. Many research works have been devoted to predicting epileptic seizures by analyzing EEG signals. Seizure prediction by analyzing EEG signals is a challenging task due to variations in the brain signals of different patients. In this paper, we propose a new approach for feature extraction based on phase correlation in EEG signals. In phase correlation, we calculate the relative change between two consecutive segments of an EEG signal and then combine the changes with neighboring signals to extract features. These features are then used to classify preictal/ictal and interictal EEG signals for seizure prediction. Experimental results show that the proposed method achieves a good prediction rate with greater consistency across different brain locations on the benchmark dataset, compared to existing state-of-the-art methods.
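
Phase correlation between consecutive segments can be sketched with NumPy's FFT. This is a generic formulation, not the authors' exact feature: the sampling rate, window length, and random placeholder trace are assumptions.

```python
# Minimal sketch of FFT-based phase correlation between two consecutive EEG segments.
import numpy as np

def phase_correlation(seg1, seg2):
    """Return the normalized cross-power spectrum peak (shift-invariant similarity)."""
    F1, F2 = np.fft.fft(seg1), np.fft.fft(seg2)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.real(np.fft.ifft(cross))      # impulse at the relative shift
    return corr.max()

fs = 256                                    # sampling rate (Hz), assumed
eeg = np.random.randn(10 * fs)              # placeholder EEG trace
seg1, seg2 = eeg[:fs], eeg[fs:2*fs]         # two consecutive 1-s segments
print(phase_correlation(seg1, seg2))
```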

Keywords: EEG, epilepsy, phase correlation, seizure

Procedia PDF Downloads 306
33497 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for false positive reduction in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information on a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach that uses figure-ground color segmentation has previously been presented in an effort to reduce false positives in object detection. In this paper, an extended version of that approach is presented, which adopts separate multipart foregrounds instead of a single prior foreground and performs the figure-ground color segmentation with each of the foregrounds. The multipart foregrounds include the parts of the head-shoulder shape and additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature that consists of the set of the resulting segmentations. Experimental results show that the presented method can discriminate more false positives than the single-prior-shape-based classifier, as well as detectors with local features. The improvement is possible because the presented approach can reduce the false positives that have the same colors in the head and shoulder foregrounds.

Keywords: pedestrian detection, color segmentation, false positive, feature extraction

Procedia PDF Downloads 278
33496 The Formulation of R&D Strategy for Biofuel Technology: A Case Study of the Aviation Industry in Iran

Authors: Maryam Amiri, Ali Rajabzade, Gholam Reza Goudarzi, Reza Heidari

Abstract:

Technological growth and environmental change are fast, and companies and industries therefore have a strong tendency to undertake R&D activities in order to participate actively in the market and achieve competitive advantage. The aviation industry and its subdivisions involve high-level technology and play a special role in the economic and social development of countries. Thus, for the aviation industry to acquire new technologies and compete with the aviation industries of other countries, R&D capability is required. Adopting an appropriate R&D strategy helps ensure that up-to-date world technologies can be attained. Biofuel technology is one of the newest technologies to attract worldwide discussion in the aviation industry. The purpose of this research has been the formulation of an R&D strategy for biofuel technology in the aviation industry of Iran. After reviewing the theoretical foundations of R&D methods and strategies, we classified R&D strategies into four main categories: internal R&D, collaborative R&D, outsourced R&D, and in-house R&D. Following this review, a model for formulating an R&D strategy aimed at developing biofuel technology in Iran's aviation industry was offered. With regard to the requirements and characteristics of the industry and the technology in the model, we present an integrated approach to R&D. Based on decision-making techniques and the analysis of structured expert opinion, four R&D strategies for different scenarios, aimed at developing biofuel technology in Iran's aviation industry, are recommended. In this research, based on the common features of the R&D implementation process, a logical classification of these methods is presented as R&D strategies. R&D strategies and their characteristics were then developed according to the experts. Finally, we introduce a model that considers the role of the aviation industry and biofuel technology in R&D strategies, and we formulate a specific R&D strategy for the various conditions and scenarios of the aviation industry.

Keywords: aviation industry, biofuel technology, R&D, R&D strategy

Procedia PDF Downloads 576
33495 3D Object Detection for Autonomous Driving: A Comprehensive Review

Authors: Ahmed Soliman Nagiub, Mahmoud Fayez, Heba Khaled, Said Ghoniemy

Abstract:

Accurate perception is a critical component in enabling autonomous vehicles to understand their driving environment. The acquisition of 3D information about objects, including their location and pose, is essential for achieving this understanding. This survey paper presents a comprehensive review of 3D object detection techniques specifically tailored for autonomous vehicles. The survey begins with an introduction to 3D object detection, elucidating the significance of the third dimension in perceiving the driving environment. It explores the types of sensors utilized in this context and the corresponding data extracted from these sensors. Additionally, the survey investigates the different types of datasets employed, including their formats and sizes, and provides a comparative analysis. Furthermore, the paper categorizes and thoroughly examines the perception methods employed for 3D object detection based on the diverse range of sensors utilized. Each method is evaluated based on its effectiveness in accurately detecting objects in a three-dimensional space. Additionally, the evaluation metrics used to assess the performance of these methods are discussed. By offering a comprehensive overview of 3D object detection techniques for autonomous vehicles, this survey aims to advance the field of perception systems. It serves as a valuable resource for researchers and practitioners, providing insights into the techniques, sensors, and evaluation metrics employed in 3D object detection for autonomous vehicles.

Keywords: computer vision, 3D object detection, autonomous vehicles, deep learning

Procedia PDF Downloads 60
33494 Analyzing Conflict Text, 'Akunyili Memo: State of the Nation': An Approach from CDA

Authors: Nengi A. H. Ejiobih

Abstract:

Conflict is one of the defining features of human societies. Often, the use or misuse of language in interaction is the genesis of conflict. As such, it is expected that when people use language they do so in socially determined ways and with almost predictable social effects. The objective of this paper was to examine the interests at work as manifested in language choice and collocations in conflict discourse. It also scrutinized the implications of linguistic features in conflict discourse as they concern ideology and power relations in political discourse in Nigeria. The methodology used for this paper is an approach from Critical Discourse Analysis, chosen for its multidisciplinary model of analysis; linguistic features and their implications were analysed. The datum used is a text from the Sunday Sun newspaper in Nigeria, West Africa, titled 'Akunyili Memo: State of the Nation'. The findings include the following: different ideologies are inherent in conflict discourse; power relations are produced, exercised, maintained, and reproduced throughout the discourse; and the use of pronouns in conflict discourse is valuable because it is used to initiate and maintain relationships in social context. This paper has provided evidence that, taking into consideration the nature of the social actions and the way these activities are translated into language, the meanings people convey by their words are identified by their immediate social, political, and historical conditions.

Keywords: conflicts, discourse, language, linguistic features, social context

Procedia PDF Downloads 478
33493 An Evaluation of Different Weed Management Techniques in Organic Arable Systems

Authors: Nicola D. Cannon

Abstract:

A range of field experiments was conducted from 1991 to 2017 on organic land at the Royal Agricultural University's Harnhill Manor Farm near Cirencester, UK, to explore the impact of different management practices on weed infestation in organic winter and spring wheat. The experiments were designed using randomised complete blocks, some with split-plot arrangements. Sowing date, variety choice, crop height, and crop establishment technique have all shown a significant impact on weed infestations. Other techniques have also been investigated, with less clear but still often significant effects on weed control, including grazing with sheep, undersowing with different legumes, and mechanical weeding techniques. Tillage treatments included traditional plough-based systems, minimum tillage, and direct drilling. Direct drilling had significantly higher weed dry matter than the other two techniques. Shorter wheat varieties containing the Rht1 or Rht2 dwarfing genes had higher weed populations than taller wheat without dwarfing genes. Early-sown winter wheat had greater weed dry matter than later-sown wheat. Grazing with sheep interacted strongly with sowing date, with shorter varieties and late sowing dates providing much less forage; grazing did, however, reduce weed biomass in June. Undersowing had mixed impacts, which were related to the success of establishment of the undersown legume crop. Weeds are most successfully controlled when a range of techniques is implemented to give the wheat crop the greatest chance of competing with weeds.

Keywords: crop establishment, drilling date, grazing, undersowing, varieties, weeds

Procedia PDF Downloads 183
33492 The Application and Relevance of Costing Techniques in Service Oriented Business Organisations: A Review of the Activity-Based Costing (ABC) Technique

Authors: Udeh Nneka Evelyn

Abstract:

The shortcomings of the traditional costing system, in terms of validity, accuracy, consistency, and relevance, have increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control, and decision making by management. Past studies on the activity-based costing (ABC) system have focused on manufacturing firms, leaving studies on service firms somewhat scanty. This paper reviewed the application and relevance of activity-based costing techniques in service-oriented business organisations by employing a qualitative research method which relied heavily on a literature review of past and current relevant articles focusing on activity-based costing (ABC). Findings suggest that ABC is not only appropriate for use in a manufacturing environment; it is also most appropriate for service organizations such as financial institutions, the healthcare industry, and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service. ABC, in very basic terms, may provide a very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers; more accurate product and service pricing; increased product profitability; and well-organized process costs.
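
The unit-costing idea described above reduces to simple arithmetic, illustrated below in Python. Every activity, cost-pool figure, and driver volume here is invented for the example; the point is only the mechanics of driver rates and cost assignment.

```python
# Toy illustration of activity-based unit costing for a banking service.
# All activities, cost-pool figures, and driver volumes are invented examples.
activity_cost_pools = {                 # annual cost of each activity
    "teller transaction": 400_000,
    "account maintenance": 250_000,
    "loan processing": 600_000,
}
driver_volumes = {                      # annual volume of each cost driver
    "teller transaction": 2_000_000,    # transactions
    "account maintenance": 500_000,     # account-months
    "loan processing": 20_000,          # applications
}
# Rate per unit of driver = pool cost / driver volume
rates = {a: activity_cost_pools[a] / driver_volumes[a] for a in activity_cost_pools}

# Cost of one service = sum of (units of each activity it consumes) x rate
checking_account = {"teller transaction": 30, "account maintenance": 12}
cost = sum(units * rates[a] for a, units in checking_account.items())
print(f"Annual ABC cost per checking account: {cost:.2f}")
```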

Keywords: profitability, activity-based costing (ABC), management accounting, manufacture

Procedia PDF Downloads 579
33491 Processing and Modeling of High-Resolution Geophysical Data for Archaeological Prospection, Nuri Area, Northern Sudan

Authors: M. Ibrahim Ali, M. El Dawi, M. A. Mohamed Ali

Abstract:

In this study, magnetic gradient surveying and ground geoelectrical methods were used together to explore archaeological features in the Nuri pyramids area. The research methods, procedures, and methodologies were applied in full during the study. The magnetic survey method was used to search for archaeological features using a Geoscan Fluxgate Gradiometer (FM36). The study area was divided into a number of equal squares (grids) of exactly 20 × 20 meters. These grids were merged at the end of the study to give a master grid for each region. The grids were further subdivided for sampling, using cells of typically 0.25 × 0.50 meters, in order to resolve archaeological features more specifically, including some small bipolar anomalies caused by buildings built from fired bricks. This resolution is important for monitoring many of the archaeological features, such as rooms. The master grid gives us an integrated map for easy presentation, and it also allows all the required processing operations using Geoscan Geoplot software. Parallel traverses were the main way of taking magnetic survey readings, in order to obtain high-quality data. The study area is very rich in old buildings that vary from small to very large. Owing to the sand dunes and the loose soil, most of these buildings are not visible from the surface. Because of the dry, sandy soil, there was no electrical contact between the ground surface and the electrodes. We tried to get electrical readings by adding salty water to the soil, but, unfortunately, we failed to confirm the magnetic readings with electrical readings as previously planned.

Keywords: archaeological features, independent grids, magnetic gradient, Nuri pyramid

Procedia PDF Downloads 482
33490 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work is on decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules to be used to disburse the scholarship. The system, based on the rules defined by the tree, is able to determine the class (status) to which an applicant belongs: Granted or Not Granted. Applicants who fall into the Granted class successfully acquire the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can be used to classify the applicants based on the rules from the tree-based classification was also developed. Tree-based classification is adopted because of its efficiency, effectiveness, and ease of comprehension. The system was tested with data from the National Information Technology Development Agency (NITDA) Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
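
A decision tree of this kind can be sketched with scikit-learn, including printing the learned rules in human-readable form. The applicant features, cutoffs, and the synthetic labeling rule below are invented placeholders, not NITDA's actual criteria.

```python
# Minimal sketch of tree-based classification for scholarship disbursement.
from sklearn.tree import DecisionTreeClassifier, export_text
import numpy as np

rng = np.random.default_rng(0)
# Features: [CGPA, household income (thousands), year of study]; label: 1 = Granted
X = np.column_stack([rng.uniform(1.0, 5.0, 200),
                     rng.uniform(50, 500, 200),
                     rng.integers(1, 5, 200)])
y = ((X[:, 0] > 3.5) & (X[:, 1] < 200)).astype(int)  # synthetic labeling rule

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
# The learned tree can be printed as generic, human-readable disbursement rules
print(export_text(tree, feature_names=["cgpa", "income", "year"]))
print(tree.predict([[4.2, 120, 2]]))  # -> [1], i.e., Granted
```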

Keywords: classification, data mining, decision tree, scholarship

Procedia PDF Downloads 372
33489 Developed Text-Independent Speaker Verification System

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Speech is a very convenient way of communication between people and machines. It conveys information about the identity of the talker. Since speaker recognition technology is increasingly securing our everyday lives, the objective of this paper is to develop two automatic text-independent speaker verification (TI-SV) systems using low-level spectral features and machine learning methods: (i) the first system is based on a support vector machine (SVM), which has been widely used in voice signal processing for speaker recognition, i.e., verifying the identity of a speaker based on voice characteristics; and (ii) the second is based on a Gaussian Mixture Model (GMM) and a Universal Background Model (UBM), combining different functions from different resources to complement the SVM-based system.
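
A GMM-UBM verification loop can be sketched with scikit-learn. This is a simplified illustration: the MFCC matrices are random placeholders, and a full system would MAP-adapt the UBM's means to the enrollment data rather than re-fitting a fresh model as done here.

```python
# Minimal GMM-UBM sketch, assuming MFCC frames are already extracted
# as (n_frames, n_coeffs) arrays; shapes and component count are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
background = rng.normal(size=(5000, 13))    # pooled MFCCs from many speakers
claimant = rng.normal(0.3, 1.0, (800, 13))  # enrollment MFCCs for target speaker
test = rng.normal(0.3, 1.0, (200, 13))      # test utterance MFCCs

ubm = GaussianMixture(n_components=32, covariance_type="diag",
                      random_state=0).fit(background)
# Simplified "adaptation": fit a speaker model directly on enrollment data
# (real GMM-UBM systems MAP-adapt the UBM means instead).
spk = GaussianMixture(n_components=32, covariance_type="diag",
                      random_state=0).fit(claimant)

# Verification score: average log-likelihood ratio of speaker model vs. UBM
llr = spk.score(test) - ubm.score(test)
print("accept" if llr > 0.0 else "reject", llr)
```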

Keywords: speaker verification, text-independent, support vector machine, Gaussian mixture model, cepstral analysis

Procedia PDF Downloads 56
33488 An Improved Modular Multilevel Converter Voltage Balancing Approach for Grid Connected PV System

Authors: Safia Bashir, Zulfiqar Memon

Abstract:

During the last decade, renewable energy sources, in particular solar photovoltaics (PV), have gained increased attention, and various PV converter topologies have emerged. Among these topologies, the modular multilevel converter (MMC) is considered one of the most promising for the grid-connected PV system, due to its modularity and transformerless features. When it comes to the safe operation of the MMC, the balancing of the submodule (SM) capacitor voltages plays a critical role. This paper proposes a balancing approach based on space vector PWM (SVPWM). Unlike existing techniques, this method generates the switching vectors for the MMC by using only one SVPWM for the upper arm. The lower arm switching vectors are obtained by taking the complement of the upper arm switching vectors. The use of one SVPWM not only simplifies the calculation but also helps in reducing the circulating current in the MMC. The proposed method is verified through simulation using Matlab/Simulink and compared with other available modulation methods. The results validate the ability of the suggested method to balance the SM capacitor voltages and reduce the circulating current, which will help in reducing the power loss of the PV system.

Keywords: capacitor voltage balancing, circulating current, modular multilevel converter, PV system

Procedia PDF Downloads 156
33487 Deciphering Orangutan Drawing Behavior Using Artificial Intelligence

Authors: Benjamin Beltzung, Marie Pelé, Julien P. Renoult, Cédric Sueur

Abstract:

To this day, it is not known whether drawing is a specifically human behavior or whether this behavior finds its origins in ancestor species. An interesting window onto this question is to analyze drawing behavior in species genetically close to humans, such as non-human primates. A good candidate for this approach is the orangutan, which shares 97% of our genes and exhibits multiple human-like behaviors. Focusing on figurative aspects may not be suitable for orangutans’ drawings, which may appear as scribbles but may still have meaning. A manual feature selection would introduce an anthropocentric bias, as the features selected by humans may not match those relevant for orangutans. In the present study, we used deep learning to analyze the drawings of a female orangutan named Molly († in 2011), who produced 1,299 drawings in the last five years of her life as part of a behavioral enrichment program at the Tama Zoo in Japan. We investigate multiple ways to decipher Molly’s drawings. First, we demonstrate the existence of differences between seasons by training a deep learning model to classify Molly’s drawings according to season. Then, to understand and interpret these seasonal differences, we analyze how the information spreads within the network, from shallow to deep layers, where early layers encode simple local features and deep layers encode more complex and global information. More precisely, we investigate the impact of feature complexity on classification accuracy through feature extraction fed to a Support Vector Machine. Last, we leverage style transfer to dissociate features associated with drawing style from those describing the representational content, and we analyze the relative importance of these two types of features in explaining seasonal variation. Content features were relevant for the classification, showing the presence of meaning in these non-figurative drawings and the ability of deep learning to decipher these differences. The style of the drawings was also relevant, as style features encoded enough information to yield classification better than chance. The accuracy of style features was higher for deeper layers, demonstrating the variation of style between seasons in Molly’s drawings. Through this study, we demonstrate how deep learning can help find meaning in non-figurative drawings and interpret these differences.
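
The layer-wise "features to SVM" probe can be sketched as follows. This is a hedged stand-in: VGG16 (with downloaded ImageNet weights) replaces whatever network the authors used, the drawings and season labels are random placeholders, and the two layer names merely exemplify shallow vs. deep features.

```python
# Minimal sketch: extract features from a shallow vs. a deep CNN layer and
# feed them to an SVM, comparing cross-validated classification accuracy.
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
images = np.random.rand(80, 224, 224, 3).astype("float32")  # placeholder drawings
seasons = np.random.randint(0, 4, 80)                        # placeholder labels

for layer_name in ["block1_conv2", "block5_conv3"]:          # shallow vs. deep
    feat_model = tf.keras.Model(base.input,
                                tf.keras.layers.GlobalAveragePooling2D()(
                                    base.get_layer(layer_name).output))
    feats = feat_model.predict(images, verbose=0)
    acc = cross_val_score(SVC(), feats, seasons, cv=5).mean()
    print(layer_name, f"CV accuracy: {acc:.2f}")
```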

Keywords: cognition, deep learning, drawing behavior, interpretability

Procedia PDF Downloads 163