Search results for: European Standard Classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8426

7706 Investigating The Effects of Utilizing Different Curing Agents on High-Performance Concrete

Authors: Mostafa M. Ahmed, Kotaro Nose, Takashi Fujii, Toshiki Ayano

Abstract:

This study sheds light on how six curing agents (No. 1 to No. 6) affect the properties of high-performance concrete (HPC): bleeding water, sprinkling water, an aqueous basic silica compound, a modified acrylic resin, an emulsion of solid wax and nonionic surfactant, and a water-based paraffin wax. The results are compared with specimens cured according to standard curing at 20 ± 3°C (JIS A 0203:2019). The specimens cured under the standard regime exhibit better compressive strength and higher freeze-thaw resistance than most non-standard-cured samples.

Keywords: curing agents, high-performance concrete, compressive strength, cumulative scaling, freeze-thaw resistance

Procedia PDF Downloads 69
7705 Predictive Analytics of Student Performance Determinants

Authors: Mahtab Davari, Charles Edward Okon, Somayeh Aghanavesi

Abstract:

Every institute of learning is interested in the performance of its enrolled students, and these performance levels determine the approach an institution may adopt in rendering academic services. The focus of this paper is to evaluate students' academic performance in given courses of study using machine learning methods. This study evaluated various supervised machine learning classification algorithms, namely Logistic Regression (LR), Support Vector Machine (SVM), Random Forest, Decision Tree, K-Nearest Neighbors, Linear Discriminant Analysis (LDA), and Quadratic Discriminant Analysis, using selected features to predict study performance. The accuracy, precision, recall, and F1 score obtained from 5-fold cross-validation were used to determine the best classification algorithm. SVM (with a linear kernel), LDA, and LR were identified as the best-performing methods. Using the LR model, this study also identified students' educational habits, such as reading and paying attention in class, as strong determinants of above-average performance; other important features include the student's academic history and work. Demographic factors such as age, gender, and high school graduation had no significant effect on a student's performance.
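
A minimal sketch of the evaluation loop described above, 5-fold cross-validation over several scikit-learn classifiers; the feature matrix here is synthetic, standing in for the study's student features, and the metric set mirrors the abstract.

```python
# Sketch only: X and y are synthetic stand-ins for the study's student data.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=12, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM (linear)": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
}
scoring = ["accuracy", "precision", "recall", "f1"]

for name, model in models.items():
    scores = cross_validate(model, X, y, cv=5, scoring=scoring)
    summary = ", ".join(
        f"{m}={scores['test_' + m].mean():.3f}" for m in scoring
    )
    print(f"{name}: {summary}")
```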

Keywords: student performance, supervised machine learning, classification, cross-validation, prediction

Procedia PDF Downloads 118
7704 Finite Element Modelling of Log Wall Corner Joints

Authors: Reza Kalantari, Ghazanfarah Hafeez

Abstract:

The paper presents the outcomes of numerical research on standard and dovetail corner joints under lateral loads, together with an overview of past research on log shear walls. To the authors' best knowledge, no specific design guidelines for log shear walls are currently available in the code, implying a need to investigate their performance. This research explores the performance of log shear wall corner joint systems of the standard and dovetail types using numerical methods based on research available in the literature. A parametric study examines the effect of the gap size provided between two orthogonal logs and of the wood and steel dowels provided as joinery between log courses. The research outcomes are force-displacement curves: the reaction forces vary by 8% with gap size for the standard joint and by 10% for the dovetail joint system.

Keywords: dovetail joint, finite element modelling, log shear walls, standard joint

Procedia PDF Downloads 208
7703 Deep Learning Approach to Trademark Design Code Identification

Authors: Girish J. Showkatramani, Arthi M. Krishna, Sashi Nareddi, Naresh Nula, Aaron Pepe, Glen Brown, Greg Gabel, Chris Doninger

Abstract:

Trademark examination and approval is a complex process that involves analysis and review of the design components of marks, such as their visual representation, as well as the associated textual data, such as the marks' descriptions. Currently, the process of identifying marks with similar visual representation is done manually at the United States Patent and Trademark Office (USPTO) and takes a considerable amount of time. Moreover, the accuracy of these searches depends heavily on the experts who assign the trademark design codes used to catalog the visual elements of a mark. In this study, we explore several methods to automate trademark design code classification. Based on the recent successes of convolutional neural networks in image classification, we have used several different convolutional neural networks, such as Google's Inception v3, Inception-ResNet-v2, and Xception. The study also looks into other techniques to augment the results from CNNs, such as using the Open Source Computer Vision Library (OpenCV) to pre-process the images. This paper reports the results of the various models trained on years of annotated trademark images.
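
As a rough illustration of the approach, the sketch below wires one of the named backbones (Inception-ResNet-v2, pretrained on ImageNet) to a new classification head with Keras. The number of design-code classes and the multi-label sigmoid head are assumptions, since a mark may carry several design codes; this is not the USPTO's actual model.

```python
# Hedged transfer-learning sketch; NUM_CODES is a hypothetical class count.
import tensorflow as tf

NUM_CODES = 100  # assumed number of design-code classes

base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=(299, 299, 3),
)
base.trainable = False  # freeze the backbone; fine-tune later if needed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.3),
    # Sigmoid head: one independent probability per design code (multi-label).
    tf.keras.layers.Dense(NUM_CODES, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["binary_accuracy"])
```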

Keywords: trademark design code, convolutional neural networks, trademark image classification, trademark image search, Inception-ResNet-v2

Procedia PDF Downloads 225
7702 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this entails increased use of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. This paper presents one possible modelling approach to such an integrative facility data model, based on ontology modelling. The complete ontology development process was described, starting from input data acquisition, through the definition of ontology concepts, to their population. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was then extended and populated. For the development of full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
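
A minimal sketch of the core-ontology/extension/population sequence using rdflib; the class and instance names are illustrative and not taken from the paper's airport models.

```python
# Illustrative only: toy facility ontology, not the paper's actual model.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

FAC = Namespace("http://example.org/facility#")  # hypothetical namespace
g = Graph()
g.bind("fac", FAC)

# Core facility concepts
g.add((FAC.Building, RDF.type, RDFS.Class))
g.add((FAC.EnergyMeter, RDF.type, RDFS.Class))

# Extension for a specific infrastructure (an airport terminal)
g.add((FAC.Terminal, RDFS.subClassOf, FAC.Building))

# Population with a concrete individual
g.add((FAC.TerminalT1, RDF.type, FAC.Terminal))
g.add((FAC.TerminalT1, RDFS.label, Literal("Terminal T1")))

print(g.serialize(format="turtle"))
```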

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 443
7701 The Mineralogy of Shales from the Pilbara and How Chemical Weathering Affects the Intact Strength

Authors: Arturo Maldonado

Abstract:

In the iron ore mining industry, the intact strength of rock units is defined using the uniaxial compressive strength (UCS). This parameter is very important for the classification of shale materials, allowing the split between rock and cohesive soils based on the magnitude of UCS. For this research, a UCS less than or equal to 1 MPa is assumed to be representative of soils. Several researchers have observed that UCS decreases as weathering progresses; moreover, since UCS is a directional property, its magnitude depends on rock fabric orientation. The paper therefore examines how the UCS of shales is affected by both weathering grade and bedding orientation. The mineralogy of the shales has been defined using hyperspectral and chemical assays to establish the mineral constituents of shale and other non-shale materials. Geological classification tools have been used to define distinct lithological types, and in this manner the author uses mineralogical datasets to recognize and isolate shales from other rock types and to develop ternary plots for fresh and weathered shales. The mineralogical classification of shales has reduced contamination by other lithology types and facilitated the study of the physical factors affecting the intact strength of shales, such as anisotropic strength due to bedding orientation. The analysis of the mineralogical characteristics of shales is perhaps the most important contribution of this paper to researchers who may wish to explore similar methods.

Keywords: rock mechanics, mineralogy, shales, weathering, anisotropy

Procedia PDF Downloads 48
7700 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies

Authors: Stefano Fantin

Abstract:

The present analysis stems from research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, this study examines the policy approach of Japan, the EU and a number of member states of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. The research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The analysis draws on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report on software vulnerability by the Centre for European Policy Studies. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems and the proposal for a Cybersecurity Act), none with a clear focus on the subject, makes it difficult for both national governments and end-users (software owners, researchers and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the member state level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia; this research explains what policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004; to date, two amendments to the framework can be registered (2014 and 2017). The framework is furthermore complemented by a series of instruments allowing researchers to disclose any new discovery responsibly. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research reveals two asymmetric policy approaches, time-wise and content-wise. The analysis therefore concludes with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries and citizens.

Keywords: cybersecurity, vulnerability, European Union, Japan

Procedia PDF Downloads 152
7699 Proposal for a Web System for the Control of Fungal Diseases in Grapes in Fruits Markets

Authors: Carlos Tarmeño Noriega, Igor Aguilar Alonso

Abstract:

Fungal diseases are common in vineyards; they reduce the quality of the products that can be sold, generating customer distrust towards the seller when buying fruit. Currently, technology allows fruits to be classified according to their characteristics thanks to artificial intelligence. This study proposes the implementation of a control system that identifies the main fungal diseases present in the Italia grape, making use of a convolutional neural network (CNN), OpenCV, and TensorFlow. The methodology was based on a review of 20 articles on quality control, classification, and recognition of fruits through artificial vision techniques.
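
A hedged sketch of such a pipeline: OpenCV handles image loading and normalization, and a small TensorFlow CNN performs the disease classification. The class list, image size, and network depth are assumptions, not the proposed system's specification.

```python
# Illustrative preprocessing + CNN sketch; classes and sizes are assumed.
import cv2
import numpy as np
import tensorflow as tf

CLASSES = ["healthy", "powdery_mildew", "downy_mildew", "botrytis"]  # hypothetical

def preprocess(path: str) -> np.ndarray:
    img = cv2.imread(path)                      # BGR uint8
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # match training colour order
    img = cv2.resize(img, (128, 128))
    return img.astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```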

Keywords: computer vision, convolutional neural networks, quality control, fruit market, OpenCV, TensorFlow

Procedia PDF Downloads 75
7698 On Dynamic Chaotic S-BOX Based Advanced Encryption Standard Algorithm for Image Encryption

Authors: Ajish Sreedharan

Abstract:

Security in the transmission and storage of digital images is important in today's image communications and confidential video conferencing. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. The Advanced Encryption Standard (AES) is a well-known block cipher with several advantages in data encryption; however, it is not suitable for real-time applications. This paper presents modifications to the Advanced Encryption Standard to achieve a higher level of security and better image encryption. The modifications adjust the ShiftRows transformation and use a dynamic chaotic S-box. In standard AES, the SubBytes, ShiftRows and MixColumns steps by themselves provide no security because they do not use the key; in the dynamic chaotic S-box based AES, SubBytes provides security because the S-box is constructed from the key. Experimental results verify that the proposed modification to the image cryptosystem is highly secure from the cryptographic viewpoint. The results also show that, compared with the original AES encryption algorithm, the modified algorithm gives better encryption results in terms of security against statistical attacks.
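
A simplified sketch of one way a key-dependent S-box can be derived from a chaotic map, here the logistic map; the construction is illustrative and not the paper's exact scheme.

```python
# Illustrative key-dependent S-box from a logistic-map chaotic sequence.
def chaotic_sbox(key: bytes) -> list[int]:
    # Seed the logistic map x_{n+1} = r * x_n * (1 - x_n) from the key,
    # keeping x in (0.05, 0.95) to avoid fixed points.
    x = (int.from_bytes(key, "big") % 10**8) / 10**8 * 0.9 + 0.05
    r = 3.99  # deep in the chaotic regime
    pairs = []
    for i in range(256):
        x = r * x * (1 - x)
        pairs.append((x, i))
    # Rank the chaotic values; the induced permutation of 0..255 is the S-box.
    return [i for _, i in sorted(pairs)]

sbox = chaotic_sbox(b"sixteen byte key")
assert sorted(sbox) == list(range(256))  # a valid bijective S-box
```

A different key yields a different permutation, which is what makes the substitution step key-dependent, unlike the fixed S-box of standard AES.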

Keywords: advanced encryption standard (AES), dynamic chaotic S-box, image encryption, security analysis, ShiftRows transformation

Procedia PDF Downloads 428
7697 The Planning Criteria of Block-Unit Redevelopment to Improve Residential Environment: Focused on Redevelopment Project in Seoul

Authors: Hong-Nam Choi, Hyeong-Wook Song, Sungwan Hong, Hong-Kyu Kim

Abstract:

In Korea, the elements that determine the quality of the residential environment are not only diverse but also show deviation across areas. However, these elements are often not considered; instead, development settles into a uniform style of residential environment focused on apartment housing construction and business-driven plans. Recently, block-unit redevelopment has become a standout alternative to standardized redevelopment projects, but construction remains inefficient because of indefinite planning criteria. This research therefore analyzes and categorizes the development methods and legal grounds of redevelopment project districts, plan determinants, and applicable standards. The purpose of this study is to provide a basis for a compatible analysis of planning standards in the future.

Keywords: shape restrictions, improvement of regulation, diversity of residential environment, classification of redevelopment project, planning criteria of redevelopment, special architectural district (SAD)

Procedia PDF Downloads 481
7696 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners plan testing and inspection resources at early phases of software development. However, a major challenge faced during the training of any classification model is the imbalanced nature of software quality data. A dataset with very few minority outcome categories leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas the majority may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. In order to empirically validate the alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
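
Since MetaCost learners are not part of the mainstream Python stack, the sketch below illustrates the resampling side of the comparison with SMOTE from imbalanced-learn on synthetic imbalanced data; the dataset shape and the Random Forest learner are assumptions, not the paper's setup.

```python
# Hedged sketch: SMOTE resampling kept inside each CV fold via imblearn's Pipeline.
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

clf = Pipeline([
    ("smote", SMOTE(random_state=0)),
    ("rf", RandomForestClassifier(random_state=0)),
])
# ROC-AUC is a more honest measure than raw accuracy on skewed classes.
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```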

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 415
7695 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has contributed to many cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction costs. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not fully solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological, machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported, and recommendations for future studies are proposed.

Keywords: BIM, construction projects, cost estimation, NRM, ontology

Procedia PDF Downloads 546
7694 Monitoring of Cannabis Cultivation with High-Resolution Images

Authors: Levent Basayigit, Sinan Demir, Burhan Kara, Yusuf Ucar

Abstract:

Cannabis is mostly used for drug production. In some countries, an excessive amount of illegal cannabis is cultivated and sold. Most illegal cannabis cultivation occurs on land far from settlements; on farmland, it is cultivated with other crops, surrounded by tall plants such as corn and sunflower, or grown with tall crops as a mixed culture. The common method for determining illegal cultivation areas is to investigate information obtained from people, which is not sufficient for remote areas, so more effective methods are needed. Remote sensing is one of the most important technologies for monitoring plant growth on land. The aim of this study was to develop an applicable method for monitoring cannabis cultivation areas using satellite imagery. For this purpose, cannabis was grown in plots, either alone or surrounded by corn and sunflower. The morphological characteristics of cannabis were recorded twice per month during the vegetation period, and a spectral signature library was created with a spectroradiometer. The parcels were monitored with high-resolution satellite imagery, and the cannabis cultivation areas were classified from the processed imagery. To separate the cannabis plots from the other plants, the multiresolution segmentation algorithm was found to be the most successful for classification, and the WorldView Improved Vegetative Index (WV-VI) was the most accurate method for monitoring plant density. As a result, an object-based classification method and vegetation indices were sufficient for monitoring cannabis cultivation in multi-temporal WorldView images.
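
As a simple illustration of the vegetation-index step, the sketch below computes NDVI per pixel with NumPy; NDVI stands in for WV-VI, whose exact band combination is sensor-specific and not reproduced here, and the band arrays are random placeholders.

```python
# Illustrative per-pixel vegetation index (NDVI as a stand-in for WV-VI).
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, safe against zero division."""
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

nir = np.random.rand(256, 256).astype("float32")  # placeholder NIR band
red = np.random.rand(256, 256).astype("float32")  # placeholder red band
dense_vegetation = ndvi(nir, red) > 0.6  # boolean mask of high plant density
```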

Keywords: Cannabis, drug, remote sensing, object-based classification

Procedia PDF Downloads 266
7693 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for Class-Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

Introduction: The problem of unbalanced data sets commonly appears in real-world applications. Due to unequal class distribution, many research papers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class application to class-imbalanced data of diabetes risk groups. Methods: Data on 599 staff members from a health project in a government hospital in Bangkok were obtained for the classification problem. The staff were diagnosed into one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, with the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped into 50 and 100 samples of 599 observations each for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups; both the original data and the bootstrap samples showed non-normality and unequal covariance matrices. The parametric linear and quadratic discriminant functions and the nonparametric k-nearest neighbors discriminant function were computed over the 50 and 100 bootstrap samples and applied to the original data. In finding the optimal classification rule, the prior probabilities were set either to equal proportions (0.33:0.33:0.33) or to unequal proportions of (0.90:0.05:0.05), (0.80:0.10:0.10) or (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities of {non-risk:risk:diabetic} as {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class-imbalanced data of diabetes risk groups.
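
A hedged sketch of the core comparison on synthetic imbalanced three-class data (the hospital data are not public): LDA and QDA with explicit prior probabilities versus k-NN, scored with balanced accuracy.

```python
# Illustrative comparison; the data are synthetic, not the Bangkok sample.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=599, n_features=6, n_informative=4,
                           n_redundant=0, n_classes=3,
                           weights=[0.90, 0.05, 0.05], random_state=0)

priors = [0.90, 0.05, 0.05]  # unequal priors, as in the study's best rule
models = {
    "LDA": LinearDiscriminantAnalysis(priors=priors),
    "QDA": QuadraticDiscriminantAnalysis(priors=priors),
    "3-NN": KNeighborsClassifier(n_neighbors=3),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
    print(name, score.mean())
```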

Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors

Procedia PDF Downloads 430
7692 2D Point Clouds Features from Radar for Helicopter Classification

Authors: Danilo Habermann, Aleksander Medella, Carla Cremon, Yusef Caceres

Abstract:

This paper analyzes the ability of 2D point cloud features to classify different models of helicopters using radar. The method does not need to estimate the blade length, the number of blades, or the period of their micro-Doppler signatures, and it is not necessary to generate spectrograms (or any other time-frequency image). This work transforms a radar return signal into a 2D point cloud and extracts features from it. Three classifiers are used to distinguish nine different helicopter models in order to analyze the performance of the features. The high accuracy obtained with each of the classifiers demonstrates that 2D point cloud features are very useful for classifying helicopters from radar signals.

Keywords: helicopter classification, point clouds features, radar, supervised classifiers

Procedia PDF Downloads 216
7691 Interest Rate Prediction with Taylor Rule

Authors: T. Bouchabchoub, A. Bendahmane, A. Haouriqui, N. Attou

Abstract:

This paper presents simulation results of Forex prediction model equations in order to give an approximate forecast of interest rates. First, Hall-Taylor (HT) equations have been used with the Taylor rule (TR) to adapt them to the European and American Forex markets. The initial Taylor rule equation is conceived for all Forex transactions in every state: it includes only one equation and six parameters. Here, the model has been used with the Hall-Taylor equations, initially comprising twelve equations, which have been reduced to only three. The analysis has been developed on the following basic macroeconomic variables: real exchange rate, investment wages, anticipated inflation, realized inflation, real production, interest rates, output gap and potential production. This model has been used specifically to study the impact of an inflation shock on key macroeconomic policy interest rates.
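
For reference, the baseline single-equation Taylor rule can be written and evaluated directly; the Hall-Taylor system used in the paper is richer, so this is only the textbook form with the conventional 0.5 weights.

```python
# Textbook Taylor rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*output_gap (percent).
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# An inflation shock of +1 point raises the prescribed policy rate by 1.5 points:
print(taylor_rate(inflation=2.0, output_gap=0.0))  # 4.0
print(taylor_rate(inflation=3.0, output_gap=0.0))  # 5.5
```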

Keywords: interest rate, Forex, Taylor rule, production, European Central Bank (ECB), Federal Reserve System (FED)

Procedia PDF Downloads 520
7690 Analyzing the Results of Buildings Energy Audit by Using Grey Set Theory

Authors: Tooraj Karimi, Mohammadreza Sadeghi Moghadam

Abstract:

Grey set theory has the advantage of using little data to analyze many factors, which makes it more appropriate for system study than traditional statistical regression, which requires massive data, normal distribution of the data and few variant factors. In this paper, grey clustering and the entropy of the coefficient vector of grey evaluations are used to analyze energy consumption in buildings of the Oil Ministry in Tehran. This article analyzes the results of energy audit reports, defining the most relevant system characteristic, the energy consumption of buildings, and the most important factors affecting it, in order to modify and improve them. According to the results of the model, the real Building Load Coefficient has been selected as the most important system characteristic, and the uncontrolled area of the building has been diagnosed as the factor with the greatest effect on a building's energy consumption. Grey clustering has been used here for two purposes: first, all building variables related to the energy audit are clustered into two main groups of indicators, reducing the number of variables; second, grey clustering with variable weights has been used to classify all buildings into three categories named 'no standard deviation', 'low standard deviation' and 'non-standard'. The entropy of the coefficient vector of grey evaluations is calculated to investigate the greyness of the results. It shows that among the 38 buildings surveyed, 3 cases are in the standard group, 24 cases are in the 'low standard deviation' group and 11 buildings are completely non-standard. In addition, the clustering greyness of 13 buildings is less than 0.5, and the average uncertainty of the clustering results is 66%.
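
A compact sketch of the grey relational grade computation that underlies grey clustering; the building indicator matrix is invented for shape only, and the distinguishing coefficient is the conventional 0.5.

```python
# Illustrative grey relational grades; data are placeholders, not audit results.
import numpy as np

def grey_relational_grades(reference: np.ndarray, series: np.ndarray,
                           zeta: float = 0.5) -> np.ndarray:
    """Grade of each row of `series` against `reference` (higher = closer)."""
    delta = np.abs(series - reference)          # absolute differences
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + zeta * d_max) / (delta + zeta * d_max)
    return coeff.mean(axis=1)                   # average coefficient per series

# Rows = buildings, columns = normalised audit indicators (illustrative).
buildings = np.array([[0.9, 0.8, 0.7],
                      [0.4, 0.5, 0.3],
                      [0.6, 0.9, 0.8]])
ideal = buildings.max(axis=0)                   # ideal "standard" building
print(grey_relational_grades(ideal, buildings))
```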

Keywords: energy audit, grey set theory, grey incidence matrixes, grey clustering, Iran oil ministry

Procedia PDF Downloads 370
7689 E-Hailing Taxi Industry Management Mode Innovation Based on the Credit Evaluation

Authors: Yuan-lin Liu, Ye Li, Tian Xia

Abstract:

Existing Chinese taxi management modes have several shortcomings. This paper suggests establishing a third-party comprehensive information management platform and puts forward a credit-based evaluation model. Four indicators are used to evaluate a driver's credit: the passengers' evaluation score, a driving behavior evaluation, the driver's average number of bad records, and a personal credit score. A weighted clustering method is used to derive a credit-level evaluation for taxi drivers. The management of the taxi industry is then based on credit level: a driver's grade is assigned according to their credit rating, which determines costs, income levels, market access, the useful period of the license, wage and bonus levels, as well as violation fines. These methods make the credit evaluation effective. In conclusion, more credit data will help to build a more accurate and detailed classification standard library.
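
A minimal sketch of a four-indicator weighted credit score with grade assignment of the kind described above; the weights and grade thresholds are illustrative assumptions, not the paper's calibration.

```python
# Illustrative weighted credit score; weights and thresholds are assumed.
WEIGHTS = {"passenger_score": 0.4, "driving_behavior": 0.3,
           "bad_records": 0.2, "personal_credit": 0.1}

def driver_credit(indicators: dict[str, float]) -> float:
    """All indicators normalised to [0, 1]; bad_records already inverted."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

def credit_level(score: float) -> str:
    return "A" if score >= 0.8 else "B" if score >= 0.6 else "C"

driver = {"passenger_score": 0.9, "driving_behavior": 0.85,
          "bad_records": 0.7, "personal_credit": 0.8}
score = driver_credit(driver)
print(score, credit_level(score))  # 0.835 A
```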

Keywords: credit, mobile internet, e-hailing taxi, management mode, weighted cluster

Procedia PDF Downloads 314
7688 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines, supplemented with health information about COVID-19 collected from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, then reproduced those same models with the feature added, evaluating with precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and an unchanged 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, the Complement model showed a 1.9% improvement in precision. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for the Naive Bayes.
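
A hedged reconstruction of the benchmark pipeline, TF-IDF features feeding a Complement Naive Bayes classifier in scikit-learn; the headlines and labels are placeholders, and the paper's sentiment feature would enter as an extra column alongside the TF-IDF matrix.

```python
# Illustrative benchmark (no sentiment feature); data are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import ComplementNB
from sklearn.pipeline import make_pipeline

headlines = ["Miracle cure reverses diabetes overnight",
             "CDC updates guidance on COVID-19 boosters"]
labels = [1, 0]  # 1 = fake, 0 = reliable (illustrative)

model = make_pipeline(TfidfVectorizer(stop_words="english"), ComplementNB())
model.fit(headlines, labels)
print(model.predict(["Vaccine shown safe in large trial"]))
```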

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 89
7687 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI

Authors: James Rigor Camacho, Wansu Lim

Abstract:

Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data such as electroencephalograms (EEGs) for EEG-based applications. EEG has been demonstrated to be the physiological signal yielding the highest classification accuracy for emotion recognition. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance computers that can process complex algorithms; they can collect, process, and store data on their own, and can run complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano, was used in the implementation. The cEEGrid, integrated with the open-source brain-computer interface platform (OpenBCI), is used to collect EEG signals. An EEG-based real-time emotion recognition system on edge AI is proposed in this paper. Machine learning-based classifiers are used to perform graphical spectrogram categorization of EEG signals and to predict emotional states from input data properties. The EEG signals are analyzed using the K-Nearest Neighbor (KNN) technique, a supervised learning method, until the emotional state is identified. In EEG signal processing, after each EEG signal has been received in real time and translated from the time to the frequency domain, the Fast Fourier Transform (FFT) is utilized to observe the frequency bands in each signal. To appropriately represent the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed. The next stage is to classify the selected features with the KNN technique to predict the emotion in the EEG data; arousal and valence datasets are used to train the parameters defined by the KNN technique. Because classification, recognition, and emotion prediction are conducted both online and locally on the edge, the KNN technique increased the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap on cost-effective and efficient real-time emotion recognition using a resource-constrained hardware device like the NVIDIA Jetson Nano, so that EEG-based emotion identification on the edge can be employed in applications across research and industry.
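
A condensed sketch of the feature step described above: FFT band powers of an EEG epoch feeding a k-NN classifier. The sampling rate, band edges, and labels are assumptions rather than the actual cEEGrid/OpenBCI configuration.

```python
# Illustrative FFT band-power features + k-NN; all data are synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

FS = 250  # assumed samples per second
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch: np.ndarray) -> list[float]:
    freqs = np.fft.rfftfreq(epoch.size, d=1 / FS)
    power = np.abs(np.fft.rfft(epoch)) ** 2
    return [power[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, FS * 2))        # 40 two-second epochs
X = np.array([band_powers(e) for e in epochs])
y = rng.integers(0, 2, size=40)                   # placeholder valence labels

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(knn.predict(X[:3]))
```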

Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors

Procedia PDF Downloads 102
7686 Human Capital Mobility of a Skilled Workforce: A Need for a Future of Europe

Authors: Tiron-Tudor Adriana, Farcas Teodora Viorica, Ciolomic Ioana Andreea

Abstract:

The issue of human capital mobility inside Europe is still an open one. Even though some tools have been created to make it easier to move from one country to another to work and study, the number of people doing so is very low because of the various factors presented in this paper. The 'Rethinking Education' agenda of the European Commission has opened the floor for new projects that can create steps towards a European language for skills, competences and qualifications. One of these projects is the Partnership for Exchange of experience in Student on-the-job Training (PEST). As part of this project, we are interested in the situation of human capital inside the EU and the elements created so far to support this mobility. The main objective of the project is to compare the four countries involved in the PEST project (Romania, Hungary, Finland, and Estonia) at the education and internship level. The results are helpful for the continuation of the project and for identifying where changes can and need to be made.

Keywords: ECVET, human capital mobility, partnership exchange, students on the job mobility, vocational education and training

Procedia PDF Downloads 417
7685 Globalisation's Effect on Environmental Activism: A Multi-Level Analysis of Individuals in European Countries

Authors: Dafni Kalatzi Pantera

Abstract:

How does globalisation affect environmental activism? Existing research on this relationship focuses on the influence of the world polity on individuals' willingness to participate in environmental movements. However, globalisation is a multidimensional process: it promotes pro-environmental ideas through the world polity, but it also fosters economic growth, which is considered antagonistic to the environment. This article models the way globalisation as a whole affects individuals' willingness to participate in environmental activism, and the main argument is that globalisation's impact is conditional on political ideology. To test this hypothesis, individual- and country-level data are used for European countries between 1981 and 2020. The results support the article's expectation: although globalisation has a positive overall impact on individuals' willingness to participate in environmental activism, once it interacts with political ideology, its influence differs across the ideological spectrum.
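
A deliberately simplified, single-level sketch of the interaction being tested; the study itself uses multilevel models with country effects, which this pooled logit with invented data ignores, and all variable names are hypothetical.

```python
# Illustrative pooled logit with an interaction term; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
glob = rng.uniform(40, 95, n)            # hypothetical globalisation index
ideo = rng.integers(1, 11, n)            # self-placement, 1 = left, 10 = right
xb = -4 + 0.05 * glob - 0.2 * ideo - 0.002 * glob * (ideo - 5)
activism = rng.binomial(1, 1 / (1 + np.exp(-xb)))

df = pd.DataFrame({"activism": activism, "globalisation": glob, "ideology": ideo})
model = smf.logit("activism ~ globalisation * ideology", data=df).fit(disp=0)
print(model.params["globalisation:ideology"])  # the conditional effect of interest
```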

Keywords: environmental activism, globalisation, political ideology, world polity

Procedia PDF Downloads 190
7684 National Defense and Armed Forces Development in the Member States of the Visegrad Group

Authors: E. Hronyecz

Abstract:

Guaranteeing the independence of the V4 member states, the protection of their national values and citizens, and the security of the Central and Eastern European region requires the development of national military capabilities. As a result, European countries have begun developing capabilities and forces, within which nations seek to strengthen the capabilities of their armies and make their interoperability more effective. One aspect of this is the upgrading of military equipment, personnel equipment, and other human resources. Based on the author's preliminary research, which included analyzing the scientific literature and relevant statistical data and consulting experts in the field, it can clearly be claimed for all four Visegrad states that a change of direction in the field of defense has been noticeable since the second half of the last decade. Collective defense has come to the forefront again, with military training, professionalism, and the radical modernization of technical equipment becoming crucial.

Keywords: armed forces, cooperation, development, Visegrad Group

Procedia PDF Downloads 127
7683 Deployment of Armed Soldiers in European Cities as a Source of Insecurity among Czech Population

Authors: Blanka Havlickova

Abstract:

In the last ten years, growing numbers of troops with machine guns have been serving on the streets of European cities. We can see them around government buildings, major transport hubs, synagogues, galleries and main tourist landmarks. Authorities declare that the main purpose of the armed soldiers' presence in European cities is to prevent terrorist attacks and to reassure tourists and the domestic population. The main objective of the following study is to find out whether the deployment of armed soldiers in European cities has a calming and reassuring effect on Czech citizens (whether the presence of armed soldiers makes the Czech population feel more secure) or rather becomes a stress factor (the presence of soldiers standing guard in full military fatigues recalls serious criminality and terrorist attacks, reflected in the fears and insecurity of the Czech population). The initial hypothesis of this study is connected with priming theory: the idea that exposure to an image (an armed soldier) makes us unconsciously focus on a topic connected with this image (terrorism). This paper is based on a quantitative public survey, carried out in the form of electronic questioning among citizens of the Czech Republic. Respondents answered 14 questions about two European cities, London and Paris. Besides general questions investigating the respondents' awareness of these cities, some questions focused on the fear respondents felt when picturing themselves leaving next Monday for the given city. The questions asking about travel fears and concerns were accompanied by different photos: some respondents were presented with photos of the Westminster Palace and the Eiffel Tower with ordinary citizens, while other respondents were presented with the same views that also included a soldier holding a machine gun. The main goal of this paper is to analyse and compare the concerns of these two groups of respondents and to find out if and how an armed soldier with a machine gun in front of the Westminster Palace or the Eiffel Tower affects the public's concerns about visiting the site. In other words, the aim is to confirm or rebut the hypothesis that the sight of a soldier with a machine gun in front of the Eiffel Tower or the Westminster Palace automatically triggers an association with a terrorist attack, leading to increased fear and insecurity among the Czech population.
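
A minimal sketch of the implied group comparison, fear ratings under the plain photo versus the soldier photo; the numbers are simulated placeholders, not survey results.

```python
# Illustrative two-sample comparison; ratings are simulated, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
fear_plain = rng.normal(3.0, 1.0, 120)    # e.g. 1-7 scale, photo without soldier
fear_soldier = rng.normal(3.4, 1.0, 120)  # photo with an armed soldier

t, p = stats.ttest_ind(fear_plain, fear_soldier)
print(f"t = {t:.2f}, p = {p:.4f}")  # a small p would support the priming hypothesis
```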

Keywords: terrorism, security measures, priming, risk perception

Procedia PDF Downloads 247
7682 Unionisation, Participation and Democracy: Forms of Convergence and Divergence between Union Membership and Civil and Political Activism in European Countries

Authors: Silvia Lucciarini, Antonio Corasaniti

Abstract:

The issue of democracy in capitalist countries has once again become the focus of debate in recent years, as a number of socio-economic and political tensions have triggered discussion of this topic from various perspectives and disciplines. Political developments, the rise of right-wing parties and populism, and the constant growth of inequalities in a context of welfare downsizing have led scholars to question whether European capitalist countries are really capable of creating and redistributing resources, and to look for elements that might make the democratic capital of European countries more dense. The aim of this work is to shed light on the trajectories, intensity and convergence or divergence between political and associative participation, on one hand, and union organization, on the other, as these constitute two of the main points of connection between the norms, values and actions that bind citizens to the state. Using the European Social Survey database, some studies have analysed degrees of unionization by investigating the relationship between systems of industrial relations and vulnerable groups (in terms of value-oriented practices or political participation). This paper instead investigates the relationship between union participation and civil/political participation, comparing union members and non-members and then distinguishing between employees and self-employed professionals to better understand participatory behaviors among different workers. The first component of the research employs a multilinear logistic model on a sample of 10 countries selected according to a grid that combines the industrial relations models identified by Visser (2006) and the welfare state systems identified by Esping-Andersen (1990). On the basis of this sample, we compare the choices made by workers and their propensity to join trade unions, together with their level of social and political participation, from 2002 to 2016. The second component verifies whether workers within the same system of industrial relations and welfare show a similar propensity to engage in civil participation through political bodies and associations, or whether these tendencies take on more specific and varied forms. The results will show: (1) whether political participation is higher among unionized workers than among the non-unionized; (2) what differences exist in unionisation and civil/political participation between self-employed, temporary and full-time employees; and (3) whether the trajectories within industrial relations and welfare models display greater inclusiveness and participation, thereby confirming or disproving the patterns documented among the different European countries.

Keywords: union membership, participation, democracy, industrial relations, welfare systems

Procedia PDF Downloads 135
7681 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative to N-gram feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the Bag-of-Words approach, with term frequency-inverse document frequency (TF-IDF) as the score representing the occurrence of tokens in each document. In order to extract features for classification, four experiments were conducted: the first used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences) and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset; the remaining 80%, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggested that SW performs better than N-gram based feature extraction. These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
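
A compact sketch of Smith-Waterman local alignment applied to token sequences, in the spirit of the feature extraction described above; the match/mismatch/gap scores are conventional defaults, not the paper's tuned parameters.

```python
# Illustrative Smith-Waterman over token sequences; scores are assumed defaults.
def smith_waterman(a: list, b: list, match=2, mismatch=-1, gap=-1) -> int:
    """Score of the best local alignment between token sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores never drop below zero.
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

doc1 = "patient has high body mass index".split()
doc2 = "elevated body mass index noted".split()
print(smith_waterman(doc1, doc2))  # shared region "body mass index" scores 6
```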

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 290
7680 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for supervised learning based facial expression recognition methods, because supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., within a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification would save computational power. In this work, we propose a lightweight neutral-vs-emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
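
A brief sketch of a Local Binary Pattern histogram of the kind used as the texture descriptor around each KE point (per the keywords), via scikit-image; the patch size and LBP parameters are assumptions.

```python
# Illustrative LBP histogram of a patch; parameters are assumed, not the paper's.
import numpy as np
from skimage.feature import local_binary_pattern

patch = np.random.randint(0, 256, (32, 32), dtype=np.uint8)  # placeholder patch
P, R = 8, 1                     # 8 neighbours at radius 1
lbp = local_binary_pattern(patch, P, R, method="uniform")
hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
print(hist)  # "uniform" LBP yields P + 2 = 10 pattern bins
```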

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 333
7679 Multi-Layer Perceptron and Radial Basis Function Neural Network Models for Classification of Diabetic Retinopathy Disease Using Video-Oculography Signals

Authors: Ceren Kaya, Okan Erkaymaz, Orhan Ayar, Mahmut Özer

Abstract:

Diabetes mellitus is a disease based on insulin hormone disorders that causes high blood glucose. Clinical findings show that diabetes can be diagnosed from electrophysiological signals obtained from the vital organs. Diabetic retinopathy is one of the most common eye diseases resulting from diabetes, and it is the leading cause of vision loss due to structural alteration of the retinal layer vessels. In this study, features of horizontal and vertical Video-Oculography (VOG) signals have been used to classify non-proliferative and proliferative diabetic retinopathy. Twenty-five features were acquired by applying the discrete wavelet transform to VOG signals taken from 21 subjects. Two models, based on the multi-layer perceptron (MLP) and the radial basis function (RBF), are recommended for the diagnosis of diabetic retinopathy; the proposed models can also detect the level of the disease. We show the comparative classification performance of the proposed models: the RBF model (100%) results in better classification performance than the MLP model (94%).
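
A hedged sketch of the wavelet feature step: a multi-level discrete wavelet transform of a VOG trace summarised by simple per-band statistics; the wavelet family, decomposition level, and statistics are assumptions, not the paper's exact 25-feature set.

```python
# Illustrative DWT feature extraction; the signal is a random placeholder.
import numpy as np
import pywt

signal = np.random.randn(1024)  # placeholder horizontal VOG trace
coeffs = pywt.wavedec(signal, wavelet="db4", level=4)

features = []
for band in coeffs:  # [cA4, cD4, cD3, cD2, cD1]
    features += [band.mean(), band.std(), np.sum(band ** 2)]
print(len(features))  # 5 sub-bands x 3 statistics = 15 features
```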

Keywords: diabetic retinopathy, discrete wavelet transform, multi-layer perceptron, radial basis function, video-oculography (VOG)

Procedia PDF Downloads 251
7678 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application to classify gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using convolutional and recursive neural networks has shown significant improvements in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles aim to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space; increasing the level of separation between classes in feature space improves the achievable classification accuracy. Feature extraction by a neural network is nonlinear and involves a variety of transformations and mathematical optimization, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4, and the readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, managed to improve the classification accuracy of the neural networks. The ability to discriminate gamma and neutron events in a single prediction approach has shown high accuracy using deep learning. The findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models through hyperparameter optimization enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
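
A short sketch of the windowed time-frequency preprocessing described above, using SciPy's spectrogram with a Hann window; the sampling rate and the pulse train are synthetic stand-ins for real detector output.

```python
# Illustrative spectrogram preprocessing; the waveform is a synthetic placeholder.
import numpy as np
from scipy import signal

fs = 1_000_000                      # assumed sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
pulses = np.random.randn(t.size)    # placeholder detector waveform

f, tt, Sxx = signal.spectrogram(pulses, fs=fs, window="hann",
                                nperseg=256, noverlap=128)
features = np.log1p(Sxx)            # log-scaled spectrogram as network input
print(features.shape)               # (frequencies, time frames)
```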

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 143
7677 The Reenactment of Historic Memory and the Ways to Read past Traces through Contemporary Architecture in European Urban Contexts: The Case Study of the Medieval Walls of Naples

Authors: Francesco Scarpati

Abstract:

Because of their long history, ranging from ancient times to the present day, European cities feature many historical layers, whose individual identities are represented by traces surviving in the urban design. However, urban transformations, in particular those produced by the property speculation of the 20th century, have often compromised the readability of these traces, resulting in a loss of the historical identities of the single layers. The purpose of this research is therefore a reflection on the theme of the reenactment of historical memory in stratified European contexts, and on how contemporary architecture can help to reveal the past signs of cities. The research starts from an analysis of a series of emblematic examples that have already provided original solutions to the described problem, ranging from the architectural detail scale to the urban and landscape scale. The results of these analyses are then applied to the case study of the city of Naples, an emblematic example of a stratified city of ancient Greek origin, where it is possible to read most of the traces of its transformations. Particular consideration is given to the trace of the medieval walls of the city, which once clearly divided the city from the surrounding fields and is no longer readable today. Finally, solutions and methods of intervention are proposed to ensure that the trace of the walls, read as a boundary, can be revealed through the contemporary project.

Keywords: contemporary project, historic memory, historic urban contexts, medieval walls, Naples, stratified cities, urban traces

Procedia PDF Downloads 263