Search results for: Affection support
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1805

335 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches to analyzing and visualizing data streams in real time are important for prompt decision making. Financial market trading and surveillance, large-scale emergency response, and crowd control are example scenarios that require real-time analytics and data visualization. This need has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required to process streaming data. Today, a range of tools implementing some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered general information, main techniques, challenges, and open issues. The techniques for streaming text visualization are identified in chronological order based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, and to discuss the problems and challenges of each identified tool.

Keywords: Information visualization, visual analytics, text mining, visual text analytics tools, big data visualization.

334 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using eigenvalues of the covariance matrix, the Circular Hough Transform (CHT), and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. A sparse matrix technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they save matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm, which exploits geometrical symmetry. After finding circular objects, the proposed method uses textons, the textures on the surface of the coins; a texton refers to a fundamental micro-structure in generic natural images and is a unique property of each coin. The method has been tested on several real-world images, including coin and non-coin images. Its performance is also evaluated based on its noise-withstanding capability.
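
The eigenvalue property used above can be illustrated in a few lines of NumPy; this is a hedged sketch of the idea, not the authors' implementation: for edge pixels sampled from a circle, the two eigenvalues of the coordinate covariance matrix are nearly equal, so their ratio serves as a circularity test.

```python
import numpy as np

def circularity(edge_pixels):
    """Ratio of the small to the large eigenvalue of the covariance matrix
    of edge-pixel coordinates; values near 1 suggest a circular region."""
    pts = np.asarray(edge_pixels, dtype=float)        # shape (N, 2): (x, y)
    cov = np.cov(pts, rowvar=False)                   # 2x2 covariance matrix
    small, large = np.sort(np.linalg.eigvalsh(cov))   # eigenvalues, ascending
    return small / large

theta = np.linspace(0, 2 * np.pi, 200)
circle = np.c_[50 + 20 * np.cos(theta), 50 + 20 * np.sin(theta)]
print(circularity(circle))   # ~1.0 for a full circle, much smaller for elongated blobs
```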

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.

333 Detecting Earnings Management via Statistical and Neural Network Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts, and managers. This research addresses the following question: Is there a significant difference between regression models and neural network models in predicting earnings management, and which leads to superior predictions? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. Overall, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean square errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior in earnings management. It is therefore more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.
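
A minimal sketch of this kind of comparison, assuming scikit-learn and synthetic data in place of the TSE sample (GRNN is not available in scikit-learn, so only the LR and MLP models are shown):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                        # stand-in for firm-level predictors
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)  # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(),
              MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, mean_squared_error(y_te, model.predict(X_te)))
```

On a nonlinear target like this, the MLP's mean square error comes out well below the linear model's, which is the pattern the paper reports for earnings management.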

Keywords: Earnings management, generalized regression neural networks, linear regression, multi-layer perceptron, Tehran stock exchange.

332 Evaluation of Coastal Erosion in the Jurisdiction of the Municipalities of Puerto Colombia and Tubará, Atlántico, Colombia in Google Earth Engine with Landsat and Sentinel 2 Images

Authors: Francisco Javier Reyes Salazar, Héctor Mauricio Ramírez

Abstract:

Coastal zones are home to mangrove swamps, coral reefs, and seagrass ecosystems, which are among the most biodiverse and fragile on the planet. These areas support a great diversity of marine life; they are also extraordinarily important to humans, providing food, water, wood, and other associated goods and services, and contributing to climate regulation. The lack of an automated model that generates information on the dynamics of coastline change and coastal erosion is identified as the central problem. In this paper, coastlines were determined from 1984 to 2020 on the Google Earth Engine platform from Landsat and Sentinel images. We then computed the Modified Normalized Difference Water Index (MNDWI) and used the Digital Shoreline Analysis System (DSAS) v5.0. Starting from the 2020 coastline, the 10-year prediction (year 2031) shows erosion of 238.32 hectares and accretion of 181.96 hectares; the 20-year prediction (year 2041) shows erosion of 544.04 hectares and accretion of 133.94 hectares. The erosion and accretion of Playa Muelle in the municipality of Puerto Colombia were established; it is expected to register the highest erosion. The land cover that presented the greatest change was artificialized territory.
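
A minimal MNDWI sketch with the Earth Engine Python API; the band names assume the Landsat 8 Collection 2 Level-2 product, the area of interest is only illustrative, and the paper's exact collections, filters, and reflectance scaling are not reproduced:

```python
import ee

ee.Initialize()

# Illustrative rectangle near Puerto Colombia; not the paper's exact footprint.
aoi = ee.Geometry.Rectangle([-75.05, 10.85, -74.85, 11.05])

composite = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
             .filterBounds(aoi)
             .filterDate('2020-01-01', '2020-12-31')
             .median())

# MNDWI = (green - SWIR1) / (green + SWIR1); water pixels come out positive.
mndwi = composite.normalizedDifference(['SR_B3', 'SR_B6']).rename('MNDWI')
water_mask = mndwi.gt(0)   # simple threshold separating water from land for shoreline extraction
```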

Keywords: Coastline, coastal erosion, MNDWI, Google Earth Engine, Colombia.

331 A TIPSO-SVM Expert System for Efficient Classification of TSTO Surrogates

Authors: Ali Sarosh, Dong Yun-Feng, Muhammad Umer

Abstract:

Fully reusable spaceplanes do not yet exist. This implies that design qualification for an optimized, highly integrated forebody-inlet configuration of a booster-stage vehicle cannot be based on archival data from other spaceplanes. This paper therefore proposes a novel TIPSO-SVM expert system methodology. A non-trivial problem related to the optimization and classification of a hypersonic forebody-inlet configuration, in conjunction with the mass model of a two-stage-to-orbit (TSTO) vehicle, is solved. The hybrid-heuristic machine learning methodology is based on a two-step improved particle swarm optimizer (TIPSO) algorithm and a two-step support vector machine (SVM) data classification method. The efficacy of the method is tested by first evolving an optimal configuration for the hypersonic compression system using the TIPSO algorithm and then classifying the results with the two-step SVM method. In the first step, extensive but unclassified mass-model training data for multiple optimized configurations are segregated and pre-classified to train the SVM. In the second step, the TIPSO-optimized mass-model data are classified using the trained SVM. Results showed remarkable improvement in the configuration and mass model, along with the sizing parameters.
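
For orientation, a generic particle swarm optimizer loop is sketched below; the paper's two-step TIPSO variant and its aerothermodynamic objective are not reproduced, and the sphere function stands in for the real fitness:

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Plain PSO: velocities pulled toward personal and global bests."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

print(pso(lambda z: np.sum(z ** 2), dim=3))   # sphere test function -> near the origin
```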

Keywords: TIPSO-SVM expert system, TIPSO algorithm, two-step SVM method, aerothermodynamics, mass-modeling, TSTO vehicle.

330 Emerging Wireless Standards - WiFi, ZigBee and WiMAX

Authors: Bhavneet Sidhu, Hardeep Singh, Amit Chhabra

Abstract:

The world of wireless telecommunications is rapidly evolving. Technologies under research and development promise to deliver more services to more users in less time. This paper presents the emerging technologies helping wireless systems grow from where we are today into our visions of the future. It covers the applications and characteristics of emerging wireless technologies: Wireless Local Area Networks (WiFi 802.11n), Wireless Personal Area Networks (ZigBee), and Wireless Metropolitan Area Networks (WiMAX). The purpose of this paper is to explain the impending 802.11n standard and how it will enable WLANs to support emerging media-rich applications. The paper also details how 802.11n compares with existing WLAN standards and offers strategies for users considering higher-bandwidth alternatives. The emerging IEEE 802.15.4 (ZigBee) standard aims to provide low-data-rate wireless communications with high-precision ranging and localization, employing UWB technologies for a low-power, low-cost solution. WiMAX (Worldwide Interoperability for Microwave Access) is a standard for wireless data transmission covering a range similar to that of cellular phone towers. With high performance in both distance and throughput, WiMAX technology could be a boon to Internet providers seeking to lead next-generation wireless Internet access. This paper also explores how these emerging technologies differ from one another.

Keywords: MIMO technology, WiFi, WiMAX, ZigBee.

329 The Role of Paraphrase in Interpreting Students’ Writing

Authors: Maya Lisa Aryanti, S.S., M.Hum.

Abstract:

Writing is the most challenging skill for students to develop, because besides developing their skill, it also helps them express themselves. This paper shows how paraphrasing helps in interpreting students' writing: syntactic units, tenses, and meanings change once the writing is paraphrased. The objectives of this research are to reveal inappropriate structures of syntactic units, to show what types of sentences the students often make, and to show how paraphrasing can help infer the message. The methodology of this research is descriptive and qualitative. Theories of linguistics are also included: the theory of syntax to describe syntactic units and tenses, and the theory of semantics to describe theories of meaning and how paraphrasing works. Theories of general linguistics, grammar, and writing are also provided to support the theories of syntax and semantics. The results of this research concern how the message is received in the end. The messages written in the students' essays were not clear because of improper structure of syntactic units and incorrect use of tenses. The students tend to use simple, compound, and complex sentences with a few mistakes in their writing, and they tend to create unnecessary phrases. Finally, this research shows how paraphrasing works to attain the complete meaning of a sentence.

Keywords: Paraphrase, meanings, syntactic units and tenses.

328 Digital Transformation of Payment Systems Using Field Service Management

Authors: Hamze Torabian, Mohammad Mehrabioun Mohammadi

Abstract:

Like many other industries, the payment industry has been affected by digital transformation, and digital transformation is crucial for it: the payment industry is considered a leader in digital and emerging technologies, and the digitalization of other industries, such as retail, health, and telecommunications, depends on the growth of digitalized payment systems. One technological innovation in service management is Field Service Management (FSM). Despite the widespread use of FSM in industries such as petrochemicals, health, and maintenance, this technology can also be adopted in the payment industry, transforming it into a more agile and efficient one. Accordingly, the present study pays close attention to the application of FSM in the payment industry. Given the importance of merchants' bargaining power in the payment industry, this study aims to use FSM in a digital transformation initiative with a targeted focus on providing real-time services to merchants. The research method consists of three parts. First, through a review of past research, applications of FSM in the payment industry are considered. Next, merchants' benefits in using FSM, such as emotional, functional, economic, and social benefits, are identified using in-depth interviews and content analysis. The related business model for helping the payment industry become more agile and efficient is considered in the final step. The results revealed the 10 main pillars required to realize the digital transformation of payment systems using FSM.

Keywords: Digital transformation, field service management, merchant support systems, payment industry.

327 Initial Experiences of the First Version of Slovene Sustainable Building Indicators That Are Based on Level(s)

Authors: Sabina Jordan, Miha Tomšič, Friderik Knez, Marjana Šijanec Zavrl

Abstract:

To determine the possibilities for the implementation of sustainable building indicators in Slovenia, testing of the first version of the indicators, developed in the CARE4CLIMATE project and based on the EU Level(s) framework, was carried out in 2022. Invited and interested stakeholders of the construction process were provided with video content and instructions on the Slovenian e-platform of sustainable building indicators. In addition, workshops and lectures with individual subjects were performed. The final phase of the training and testing procedure included a questionnaire, which was used to obtain the participants' opinions regarding the indicators. The analysis of the results of the testing, which focused on level 2, confirmed the key preliminary finding of the development group, namely that, due to the current lack of certain knowledge, data, and tools, not all indicators for this level are yet feasible in practice. The research also highlighted the need for more training and specialization of experts in this field. At the same time, it showed that the testing of the first version was itself a big challenge: only 30 experts fully participated and filled out the online questionnaire. This number seems alarmingly low at first glance, but compared with Level(s) testing in the EU member states, it is more than 50 times higher. Nevertheless, the further rollout of the indicators in Slovenia will require considerable effort and engagement, and state support will likely also be needed, for example in the form of financial mechanisms or incentives and/or a legislative basis.

Keywords: Sustainability, building indicator, project CARE4CLIMATE, alpha version SLO kTG, Level(s), sustainable construction stakeholders.

326 Perceived Benefits of Technology Enhanced Learning by Learners in Uganda: Three Band Benefits

Authors: Kafuko M. Maria, Namisango Fatuma, Byomire Gorretti

Abstract:

Mobile learning (m-learning) is steadily growing and has undoubtedly delivered benefits to learners and tutors in different learning environments. This paper investigates the variation in benefits derived from enhancing classroom learning through m-learning platforms in the context of a developing country, where m-learning is still in its initial stages. The study focused on how a basic technology-enhanced pedagogic innovation, such as cell-phone-based learning, enhances classroom learning from the learners' perspective. The paper explicitly indicates the opportunities that enhanced learning presents to a conventional learning environment such as a physical classroom. The findings were obtained through a survey of two universities in Uganda, in which data were quantitatively collected, analyzed, and presented in a three-banded diagram depicting the variation in the obtainable benefits. Learners indicated that the smartphone is the most commonly used device. Learners also indicated that straight lectures, student-to-student and student-to-lecturer communication, and accessing learning material and assignments are core activities. In a TEL environment supported by smartphones, learners indicated that they conveniently achieve the prior activities plus discussions and group work. Learners seemed not attracted to the possibility of using a TEL environment to take lectures or make class presentations. The lower attractiveness of these two factors may be due to the teacher-centered approach commonly applied in the country's education system.

Keywords: Technology enhanced learning, mobile learning, classroom learning, perceived benefits.

325 A Panel Cointegration Analysis for Macroeconomic Determinants of International Housing Market

Authors: Mei-Se Chien, Chien-Chiang Lee, Sin-Jie Cai

Abstract:

The main purpose of this paper is to investigate the long-run equilibrium and short-run dynamics of international housing prices when macroeconomic variables change. We apply Pedroni's panel cointegration tests, using unbalanced panel data for 33 countries over the period from 1980Q1 to 2013Q1, to examine the relationships among house prices and macroeconomic variables. Our panel cointegration test results support the existence of cointegration among these macroeconomic variables and house prices. The panel DOLS estimates further show that a 1% increase in economic activity, long-term interest rates, and construction costs causes house prices to change by 2.16%, -0.04%, and 0.22%, respectively, in the long run. Furthermore, increasing economic activity and construction costs have stronger impacts on house prices in lower-income countries than in higher-income countries. The results lead to the conclusion that, for lower-income countries, house price growth can be regarded as tracking economic growth. Finally, in the Americas, the coefficient of economic activity is the highest, which indicates that increasing economic activity causes a faster rise in house prices there than in other regions. There are some special cases in which the coefficients of interest rates are significantly positive, in the Americas and Asia.
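
Pedroni's panel tests and panel DOLS are not part of the standard Python stack; as an illustration of the underlying building block, a single-country Engle-Granger cointegration test with statsmodels on synthetic stand-in series might look like this:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 133                                    # quarterly observations, 1980Q1-2013Q1
activity = np.cumsum(rng.normal(size=n))   # I(1) proxy for economic activity
house = 2.0 * activity + rng.normal(scale=0.5, size=n)   # cointegrated house-price proxy

t_stat, p_value, _ = coint(house, activity)    # Engle-Granger cointegration test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # small p rejects 'no cointegration'
```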

Keywords: House prices, Macroeconomic Variables, Panel cointegration, Dynamic OLS.

324 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) has been used for the classification of a diabetes dataset. The aims of this study are to reduce the variance within attributes of the diabetes dataset and to improve the classification accuracy of classifier algorithms by transforming a nonlinearly separable dataset into a more linearly separable one. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and one of the most widely used clustering methods in data mining and machine learning applications. In this study, as the first stage, fuzzy C-means clustering was used to find the centers of the attributes in the Pima Indians Diabetes dataset, and the dataset was then weighted according to the ratios of the means of the attributes to their centers. Secondly, after the weighting process, support vector machine (SVM) and k-nearest neighbor (k-NN) classifiers were used to classify the weighted Pima Indians Diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians Diabetes dataset.
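
A minimal sketch of the per-attribute weighting step as read from the description above, with scikit-learn's k-means standing in for fuzzy C-means (which scikit-learn does not provide); the helper name is ours:

```python
import numpy as np
from sklearn.cluster import KMeans

def fcmaw_like_weighting(X, n_clusters=2):
    """Scale each attribute by the ratio of its mean to its cluster center
    (k-means stands in for the paper's fuzzy C-means; assumes non-zero
    centers, as with the positive-valued Pima attributes)."""
    Xw = X.astype(float).copy()
    for j in range(X.shape[1]):
        col = Xw[:, [j]]
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(col)
        center = km.cluster_centers_.mean()   # overall center of attribute j
        Xw[:, j] = col.ravel() * (col.mean() / center)
    return Xw
```

The weighted matrix would then be fed to the SVM and k-NN classifiers as in the paper.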

Keywords: Fuzzy C-means clustering, Fuzzy C-means clustering based attribute weighting, Pima Indians diabetes dataset, SVM.

323 Optimized Facial Features-based Age Classification

Authors: Md. Zahangir Alom, Mei-Lan Piao, Md. Shariful Islam, Nam Kim, Jae-Hyeung Park

Abstract:

The evaluation and measurement of human body dimensions are achieved by physical anthropometry. This research was conducted in view of the importance of anthropometric indices of the face in forensic medicine, surgery, and medical imaging. Its main goal is to optimize facial feature points by establishing mathematical relationships among facial features, and to use the optimized feature points for age classification. Since the selected facial feature points are located in the areas of the mouth, nose, eyes, and eyebrows in facial images, all desired facial feature points are extracted accurately. In the proposed method, sixteen Euclidean distances, vertical as well as horizontal, are calculated from the eighteen selected facial feature points, and mathematical relationships among the horizontal and vertical distances are established. Moreover, it is discovered that the facial feature distances follow a constant ratio through age progression: the distances between the specified feature points increase as a person ages from childhood, but the ratio of the distances does not change (d = 1.618). Finally, according to the proposed mathematical relationships, four independent feature distances related to eight feature points are selected from the sixteen distances and eighteen feature points, respectively. These four feature distances are used for age classification with a Support Vector Machine trained by the Sequential Minimal Optimization (SMO) algorithm, achieving around 96% accuracy. Experimental results show the proposed system is effective and accurate for age classification.
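
A small sketch of the distance-and-ratio features described above; the landmark coordinates and index pairs are hypothetical, as the paper's eighteen points and sixteen distances are not reproduced here:

```python
import numpy as np

def feature_distances(landmarks, pairs):
    """Euclidean distances between selected facial landmark pairs."""
    pts = np.asarray(landmarks, dtype=float)
    return np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in pairs])

landmarks = [(30, 40), (70, 40), (50, 60), (50, 90)]   # hypothetical (x, y) points
pairs = [(0, 1), (2, 3)]                               # one horizontal, one vertical distance
d = feature_distances(landmarks, pairs)
print(d, d[0] / d[1])   # the ratio, not the raw distance, is the age-stable quantity
```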

Keywords: 3D Face Model, Face Anthropometrics, Facial Features Extraction, Feature distances, SVM-SMO

322 Face Recognition Using Principal Component Analysis, K-Means Clustering, and Convolutional Neural Network

Authors: Zukisa Nante, Wang Zenghui

Abstract:

Face recognition is the problem of identifying or recognizing individuals in an image. This paper investigates a possible method to solve this problem: an amalgamation of Principal Component Analysis (PCA), K-means clustering, and a Convolutional Neural Network (CNN) for a face recognition system. It is trained and evaluated using the ORL dataset, which consists of 400 face images in 40 classes of 10 images each. Firstly, PCA enables the use of a smaller network, reducing the training time of the CNN: redundancy is removed and the variance is preserved with a smaller number of coefficients. Secondly, the K-means clustering model is trained on the PCA-compressed data, which selects K-means cluster centers with better characteristics. Lastly, the K-means features serve as initial values for the CNN and act as its input data. The accuracy and performance of the proposed method were tested against other face recognition techniques, namely PCA, Support Vector Machine (SVM), and k-Nearest Neighbour (kNN). During experimentation, our suggested method achieved the highest performance after 90 epochs: 99% accuracy, 99% F1-score, 99% precision, and 99% recall in 463.934 seconds. It outperformed PCA, which obtained 97%, and kNN, with 84%, in the conducted experiments. The method therefore proved efficient at identifying faces in images.
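
A sketch of the PCA-then-K-means front end on an ORL-style dataset; scikit-learn ships the Olivetti faces, which derive from ORL, and the CNN stage is omitted here, so the cluster-distance features are simply printed:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()                        # 400 images, 40 classes of 10
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, stratify=faces.target, random_state=0)

pca = PCA(n_components=100, whiten=True).fit(X_tr)    # keep dominant variance, shrink input
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

km = KMeans(n_clusters=40, n_init=10, random_state=0).fit(Z_tr)
F_tr, F_te = km.transform(Z_tr), km.transform(Z_te)   # distances to the 40 cluster centers
print(F_tr.shape, F_te.shape)                         # these features would feed the CNN
```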

Keywords: Face recognition, Principal Component Analysis, PCA, Convolutional Neural Network, CNN, Rectified Linear Unit, ReLU, feature extraction.

321 Optimal Simultaneous Sizing and Siting of DGs and Smart Meters Considering Voltage Profile Improvement in Active Distribution Networks

Authors: T. Sattarpour, D. Nazarpour

Abstract:

This paper investigates the effect of the simultaneous placement of distributed generations (DGs) and smart meters (SMs) on voltage profile improvement in active distribution networks (ADNs). Responsive loads have recently been a substantial center of attention in power system studies alongside DGs, and the existence of responsive loads in ADNs has an undeniable effect on the sizing and siting of DGs. For this reason, an optimal framework is proposed for the sizing and siting of DGs and SMs in ADNs. SMs are taken into consideration for the sake of successfully implementing demand response programs (DRPs), such as direct load control (DLC), with end-side consumers. Targeting voltage profile improvement, the optimization procedure is solved by a genetic algorithm (GA) and tested on the IEEE 33-bus distribution test system. Different scenarios were established, with variations in the number of DG units, individual or simultaneous placement of DGs and SMs, and an adaptive power factor (APF) mode for DGs to support reactive power. The obtained results confirm the significant effect of DRPs and the APF mode in determining the optimal size and site of DGs to be connected to the ADN, resulting in an improved voltage profile as well.
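
A toy genetic algorithm skeleton for the siting part of such a problem; this is illustrative only, since the fitness below is a placeholder, whereas the paper's fitness would run a power flow on the 33-bus feeder and score the resulting voltage profile:

```python
import numpy as np

rng = np.random.default_rng(0)
N_BUS, POP, GENS = 33, 40, 100                     # IEEE 33-bus system; GA settings illustrative

def fitness(chromosome):
    """Placeholder objective: prefer exactly three selected sites."""
    return -abs(chromosome.sum() - 3)

pop = rng.integers(0, 2, (POP, N_BUS))             # one bit per bus: 1 = place a DG/SM there
for _ in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]  # truncation selection
    cuts = rng.integers(1, N_BUS, POP // 2)
    kids = np.array([np.r_[a[:c], b[c:]]           # one-point crossover
                     for a, b, c in zip(parents, parents[::-1], cuts)])
    flip = rng.random(kids.shape) < 0.01           # bit-flip mutation
    kids[flip] ^= 1
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(c) for c in pop])]
print(best.nonzero()[0])                           # candidate buses for placement
```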

Keywords: Active distribution network (ADN), distributed generations (DGs), smart meters (SMs), demand response programs (DRPs), adaptive power factor (APF).

320 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service

Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

In telemedicine, the image repository service is important for increasing the accuracy of diagnostic support for medical personnel. This study compares two routing algorithms with respect to quality of service (QoS), in order to analyze the optimal performance when uploading and/or downloading medical images; it is part of a broader comparison of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. The Tabu Search and Simulated Annealing algorithms were chosen for their high usability in this type of application, and QoS is measured with the following metrics: delay, throughput, jitter, and latency. Routing tests were carried out on ten 40 MB images in Digital Imaging and Communications in Medicine (DICOM) format. The tests ran for ten minutes under different traffic conditions, for a total of 25 tests, from a server at the Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia, to a remote user at the Universidad de Santiago de Chile (USACH) in Chile. The results show that Tabu Search delivers better QoS performance than Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
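
For reference, a generic simulated-annealing loop of the kind being compared; the route, neighborhood move, and cost below are toy stand-ins, whereas a real cost would combine the measured delay, throughput, jitter, and latency:

```python
import math
import random

random.seed(0)

def simulated_annealing(cost, neighbor, state, t0=1.0, cooling=0.995, iters=5000):
    """Accept worse candidates with probability exp(-delta / T) while T cools."""
    best = current = state
    t = t0
    for _ in range(iters):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best

def swap(route):                        # neighborhood move: swap two relay nodes
    r = list(route)
    i, j = random.sample(range(len(r)), 2)
    r[i], r[j] = r[j], r[i]
    return r

route = list(range(8))                  # toy route over eight relay nodes
print(simulated_annealing(lambda r: sum(abs(a - b) for a, b in zip(r, r[1:])), swap, route))
```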

Keywords: Medical image, QoS, simulated annealing, Tabu search, telemedicine.

319 A Psychophysiological Evaluation of an Affective Recognition Technique Using Interactive Dynamic Virtual Environments

Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein

Abstract:

Recording psychological and physiological correlates of human performance within virtual environments, and interpreting their impact on human engagement, ‘immersion’ and related emotional or ‘affective’ states, is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different lengths (e.g. 2, 3, 5 seconds, etc.). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a three-dimensional space of valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process between affective clusters and emotion labels.

Keywords: Virtual Reality, affective computing, affective VR, emotion-based affective physiological database.

318 Analysis of the Influence of Reshoring on the Structural Behavior of Reinforced Concrete Beams

Authors: Keith Danila Aquino Neves, Júlia Borges dos Santos

Abstract:

There is little published research about the influence of execution methods on structural behavior. Structural analysis is typically based on the completed building, considering the actions of all forces under which it was designed. However, during construction, execution loads do not match those designed for, and in some cases loads begin to act before the concrete has reached its maximum strength. Changes to structural element support conditions may occur, resulting in unforeseen alterations to the structure’s behavior. Shoring is an example of a construction process that, if executed improperly, directly influences structural performance and may result in unpredicted cracks and displacements. The NBR 14931/2004 standard, which guides the execution of reinforced concrete structures, states that shoring must be executed in a way that avoids unpredicted loads, and that it may be removed only after prior analysis of the structure’s behavior by the professional responsible for the structural design. Differences in structural behavior are reduced for small spans. It is important to qualify and quantify how the incorrect placement of shores can compromise a structure’s safety. The results of this research allowed a more precise understanding of the relationship between spans and loads, for which the influence of execution processes can be considerable, and reinforced that civil engineering practice must be performed under a qualified professional, respecting existing standards’ guidelines.

Keywords: Structural analysis, structural behavior, reshoring, static scheme, reinforced concrete.

317 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock the right movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers’ demographic, behavioral, and social information to predict their movie genre preference. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from sample data, and their in-sample test errors were compared. Out-of-sample error was also compared under different Vapnik–Chervonenkis (VC) dimensions in the machine learning algorithm to detect and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases; the accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how machine learning can predict customers’ preferences from a small dataset, and they inform the design of prediction tools for such enterprises.
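
A minimal comparison in the spirit described, using scikit-learn with a synthetic stand-in for the customer features:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for demographic/behavioral/social features.
X, y = make_classification(n_samples=1000, n_features=12, n_informative=6, random_state=0)

for model in (SVC(kernel='rbf', gamma='scale'), LogisticRegression(max_iter=1000)):
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(type(model).__name__, f"{acc:.3f}")
```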

Keywords: Computational social science, movie preference, machine learning, SVM.

316 Storage Method for Parts from End of Life Vehicles' Dismantling Process According to Sustainable Development Requirements: Polish Case Study

Authors: M. Kosacka, I. Kudelska

Abstract:

The vehicle is one of the most influential and complex products worldwide, affecting people’s lives, the state of the environment, and the condition of the economy (all aspects of the sustainable development concept) during each stage of its lifecycle. With the increasing number of vehicles, there is growing potential for the management of End of Life Vehicles (ELVs), which are hazardous waste. From one point of view, an ELV should be managed to ensure risk elimination; from another, it should be treated as a source of valuable materials and spare parts. To recover materials and spare parts, recycling networks have been established, which are an example of sustainable policy realization at the national level. The basic unit in the Polish recycling network is the dismantling facility. The output material streams of dismantling stations include waste, which very often generates costs, and spare parts, which have the biggest potential for revenue creation. Both outputs are stored in warehouses, according to the law. Given the revenue and sustainability potential, a strong emphasis has been placed on the storage process. We present the concept of a storage method that takes into account the specifics of the dismantling facility in order to support decision making in line with the principles of sustainable development. The method was developed on the basis of a case study of one of the largest dismantling facilities in Poland.

Keywords: Dismantling, end of life vehicle, sustainability, storage.

315 Authenticity of Lipid and Soluble Sugar Profiles of Various Oat Cultivars (Avena sativa)

Authors: Marijana M. Ačanski, Kristian A. Pastor, Djura N. Vujić

Abstract:

The identification of lipid and soluble sugar components was performed in flour samples of different cultivars belonging to the common oat species (Avena sativa L.): spring oat, winter oat, and hulless oat. Fatty acids were extracted from the flour samples with n-hexane and derivatized into volatile methyl esters using TMSH (trimethylsulfonium hydroxide in methanol). Soluble sugars were then extracted from the defatted and dried oat flour samples with 96% ethanol and further derivatized into the corresponding TMS-oximes using hydroxylamine hydrochloride solution and BSTFA (N,O-bis-(trimethylsilyl)-trifluoroacetamide). The hexane and ethanol extracts of each oat cultivar were analyzed using a GC-MS system. Lipid and simple sugar compositions are very similar across all samples of the investigated cultivars. A chemometric tool was applied to the numeric values of the automatically integrated surface areas of the detected lipid and simple sugar components in their corresponding derivatized forms. Hierarchical cluster analysis shows a very high similarity between the investigated oat flour samples according to fatty acid content (0.9955) and a moderate similarity according to soluble sugar content (0.50). These preliminary results support the idea of establishing methods for oat flour authentication, and provide the means for distinguishing oat flour samples, regardless of variety, from flours made of other cereal species, just by lipid and simple sugar profile analysis.
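
A small sketch of hierarchical cluster analysis on such profiles, using SciPy with random numbers standing in for the integrated GC-MS peak areas:

```python
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
profiles = rng.random((6, 10))   # rows: flour samples; columns: integrated peak areas
labels = ['spring-1', 'spring-2', 'winter-1', 'winter-2', 'hulless-1', 'hulless-2']

Z = linkage(pdist(profiles, metric='correlation'), method='average')
tree = dendrogram(Z, labels=labels, no_plot=True)   # no_plot=False draws it via matplotlib
print(tree['ivl'])                                  # leaf order after clustering
```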

Keywords: Authentication, chemometrics, GC-MS, lipid and soluble sugar composition, oat cultivars.

314 Classification of Potential Biomarkers in Breast Cancer Using Artificial Intelligence Algorithms and Anthropometric Datasets

Authors: Aref Aasi, Sahar Ebrahimi Bajgani, Erfan Aasi

Abstract:

Breast cancer (BC) continues to be the most frequent cancer in females and causes the highest number of cancer-related deaths in women worldwide. Inspired by recent advances in studying the relationship between patient attributes and the disease, in this paper we investigate different classification methods for better diagnosis of BC in its early stages. Datasets from the University Hospital Centre of Coimbra were chosen, and several machine learning (ML) and neural network (NN) classifiers were studied. We selected favorable features among the nine attributes provided in the clinical dataset by using a random forest algorithm. The dataset contains both healthy controls and BC patients, and glucose, BMI, resistin, and age turned out to be the most important features, in that order. We then analyzed these features with various ML classifiers, including Decision Tree (DT), K-Nearest Neighbors (KNN), eXtreme Gradient Boosting (XGBoost), Logistic Regression (LR), Naive Bayes (NB), and Support Vector Machine (SVM), along with an NN-based Multi-Layer Perceptron (MLP) classifier. The results revealed that, among the different techniques, the SVM and MLP classifiers achieve the highest accuracy, 96% and 92%, respectively. These results suggest that the adopted procedure can be used effectively for the classification of cancer cells, and they encourage further experimental investigation with more data for other types of cancer.
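
A sketch of the feature-ranking step with scikit-learn; the attribute names follow the public Coimbra dataset, but the values here are synthetic, so the printed ranking is only illustrative:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
cols = ['age', 'bmi', 'glucose', 'insulin', 'homa',
        'leptin', 'adiponectin', 'resistin', 'mcp1']
X = pd.DataFrame(rng.normal(size=(116, 9)), columns=cols)   # 116 subjects, as in Coimbra
y = (X['glucose'] + 0.5 * X['bmi'] + rng.normal(scale=0.5, size=116) > 0).astype(int)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranking = sorted(zip(cols, rf.feature_importances_), key=lambda t: -t[1])
print(ranking[:4])   # top-ranked attributes, analogous to the paper's glucose/BMI/resistin/age
```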

Keywords: Breast cancer, health diagnosis, Machine Learning, biomarker classification, Neural Network.

313 The Taiwanese Institutional Arrangement for Coastal Management Due to Climate Change

Authors: Wen-Hong Liu, Hao-Tang Jhan, Kun-Lung Lin, Meng-Tsung Lee

Abstract:

Weather disasters have recently been frequent in Taiwan, causing loss of life and property. Excessive concentration of population and a lack of integrated planning have left the Taiwanese coastal zone facing the impacts of climate change directly. Compared to many countries that have already set up legislation, competent authorities, and national adaptation strategies, the capacity of coastal management in Taiwan to adapt to climate change is still insufficient. It is therefore necessary to establish a complete institutional arrangement for coastal management under climate change, in order to protect the environment and sustain socio-economic development. This paper first reviews the impact of climate change on the Taiwanese coastal zone. Secondly, the development of the Taiwanese institutional arrangement for coastal management is introduced. This is followed by an analysis of four dimensions of the institutional arrangement: legal basis, competent authority, scientific and financial support, and international cooperation. The results show that the Taiwanese government should: 1) integrate the climate change issue into the Coastal Act, Wetland Act, and Territorial Planning Act, and pass them; 2) establish a high-level competent authority for coastal management; 3) set up a climate change disaster coordination platform; 4) link scientific information and decision makers; 5) establish a climate change adjustment fund; 6) participate actively in international climate change organizations and meetings; and 7) cooperate with neighboring countries to exchange experiences.

Keywords: Climate Change, Coastal Zone Management, Institution Arrangement, Adaptation.

312 Steps towards the Development of National Health Data Standards in Developing Countries: An Exploratory Qualitative Study in Saudi Arabia

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian R. Murray

Abstract:

Health data standards today proliferate in ways that overlap and conflict, resulting in market confusion and increasing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps governments must undertake toward the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the establishment of a formal national reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most important is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan health data standards, particularly in developing countries.

Keywords: Interoperability, Case Study, Health Data Standards, Medical Data Exchange, Saudi Arabia.

311 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm, and on a software application called Training Builder, developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, from signal processing to data analysis, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data from a single patient retrieved from a publicly available EEG dataset.
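
A minimal example of the sliding-window feature extraction idea in NumPy; the four features below (mean, standard deviation, line length, spectral entropy) are common EEG choices and are our assumption, as the paper's exact feature set is not listed here:

```python
import numpy as np

def window_features(signal, fs, win_s=2.0, step_s=1.0):
    """Per-window features from a single-channel EEG trace."""
    win, step = int(win_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        psd = np.abs(np.fft.rfft(w)) ** 2
        p = psd / psd.sum()
        entropy = -(p * np.log2(p + 1e-12)).sum()        # spectral entropy
        feats.append([w.mean(), w.std(), np.abs(np.diff(w)).sum(), entropy])
    return np.array(feats)

fs = 256                                                  # typical EEG sampling rate (Hz)
eeg = np.random.default_rng(0).normal(size=fs * 10)       # 10 s of synthetic signal
print(window_features(eeg, fs).shape)                     # (n_windows, 4): rows for the MLP
```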

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.

310 Selecting Negative Examples for Protein-Protein Interaction

Authors: Mohammad Shoyaib, M. Abdullah-Al-Wadud, Oksam Chae

Abstract:

Proteomics is one of the largest areas of research in bioinformatics and medical science. An ambitious goal of proteomics is to elucidate the structure, interactions, and functions of all proteins within cells and organisms. Predicting Protein-Protein Interaction (PPI) is one of the crucial and decisive problems in current research, and genomic data offer both a great opportunity and many challenges for the identification of these interactions. Many methods have already been proposed in this regard. For in-silico identification, most methods require both positive and negative examples of protein interaction, and the quality of these examples is crucial to the final prediction accuracy. Positive examples are relatively easy to obtain from well-known databases, but the generation of negative examples is not a trivial task. Current PPI identification methods generate negative examples based on assumptions that are likely to affect their prediction accuracy; hence, if more reliable negative examples are used, PPI prediction methods may achieve even higher accuracy. Focusing on this issue, a graph-based negative example generation method is proposed that is simpler and more accurate than existing approaches. An interaction graph of the protein sequences is created, with the basic assumption that the longer the shortest path between two protein sequences in the interaction graph, the lower the possibility of their interaction. A well-established PPI detection algorithm employed with our negative examples increases accuracy by more than 10% in most cases, compared with the original negative pair selection method.
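
A minimal sketch of this graph-based selection rule with NetworkX, on a toy interaction graph; the distance threshold is an illustrative parameter:

```python
import itertools
import networkx as nx

# Toy interaction graph: nodes are proteins, edges are known interactions.
G = nx.Graph([('P1', 'P2'), ('P2', 'P3'), ('P3', 'P4'), ('P4', 'P5'), ('P5', 'P6')])

def negative_pairs(G, min_dist=4):
    """Non-adjacent pairs whose shortest path is at least min_dist."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    return [(u, v) for u, v in itertools.combinations(G.nodes, 2)
            if not G.has_edge(u, v) and dist[u].get(v, float('inf')) >= min_dist]

print(negative_pairs(G))   # [('P1', 'P5'), ('P1', 'P6'), ('P2', 'P6')]
```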

Keywords: Interaction graph, Negative training data, Protein-Protein interaction, Support vector machine.

309 A Framework for Enhancing Mobile Development Software for Rangsit University, Thailand

Authors: Thossaporn Thossansin

Abstract:

This paper presents the development of a mobile application for students of the Faculty of Information Technology, Rangsit University (RSU), Thailand. RSU has upgraded its enrollment process by improving its information systems, and students can easily download the RSU APP to access essential RSU information. The reason for having a mobile application is to help students access the system regardless of time and place. The objectives of this paper are: 1. to develop an application on the iOS platform for students of the Faculty of Information Technology, Rangsit University, Thailand; and 2. to obtain students' perceptions of the new mobile app. The target group comprises students from the freshman year to the senior year of the faculty. The new mobile application, called the RSU APP, was developed by the Department of Information Technology, Rangsit University. It contains useful features and various functionalities, particularly those that support students. The core contents of the app consist of RSU announcements, the calendar, events, activities, and e-books. The mobile app is developed on the iOS platform. User satisfaction was analyzed from interview data from 81 interviewees, as well as from a Google Forms survey involving 122 respondents. The results show that users are satisfied with the application, rating overall satisfaction at 4.67 (SD 0.52). The score for whether users can learn and use the application quickly is high, at 4.82 (SD 0.71). On the other hand, the lowest satisfaction rating concerns the app's forms and lists, at 4.01 (SD 0.45).

Keywords: Mobile application, development of mobile application, framework of mobile development, software development for mobile devices.

308 AI-Based Approaches for Task Offloading, Resource Allocation and Service Placement of IoT Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

In order to support the continued growth and critical latency requirements of IoT applications, and to overcome various obstacles of traditional data centers, Mobile Edge Computing (MEC) has emerged as a promising solution that extends cloud data processing and decision making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with conflicting optimization criteria: minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, and bandwidth) of mobile edge devices, while maintaining high performance (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making Task Offloading (TO), Resource Allocation (RA), and Service Placement (SP) complex processes; studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. This paper surveys recent multi-objective optimization (MOO) approaches to TO, SP, and RA in edge computing environments, particularly Artificial Intelligence (AI) approaches, aimed at satisfying the various objectives, constraints, and dynamic conditions of IoT applications.

Keywords: Mobile Edge Computing, Multi-Objective Optimization, Artificial Intelligence Approaches, Task Offloading, Resource Allocation, Service Placement.

307 Geophysical Investigation for Pre-Engineering Construction Works in Part of Ilorin, Northcentral Nigeria

Authors: O. Ologe, A. I. Augie

Abstract:

A geophysical investigation involving geoelectric depth sounding has been conducted as a pre-foundation study in part of Ilorin, Nigeria. The area is underlain by Precambrian basement complex rocks. Fifteen sounding stations were established along five traverses, and the three to five Vertical Electrical Soundings (VES) conducted along each traverse were subjected to computer iteration using IP2Win software. Three to five subsurface geologic layers were delineated in the study area: topsoil, with resistivity and thickness values ranging over 103-210 Ωm and 0-1 m; laterite (117-590 Ωm and 1-4.7 m); sandy clay (137-859 Ωm and 2.9-4.3 m); weathered basement (60.5-2539 Ωm and 3.2-10 m); and fresh basement (2253 Ωm-∞ and 7.1 m-∞), respectively. The resistivity pseudosection shows a continuous high-resistivity zone at the surface. From 0-5 m depth, the resistivity of this layer varies from 300-800 Ωm along traverses 1 and 2; hence, this layer is rated competent, as it can support engineering structures. However, along traverse 1, a very low-resistivity layer occurs between VES 5 and 15, with resistivity values ranging from 30-70 Ωm; this layer was rated incompetent based on the competence rating. This study revealed the importance of geophysical surveying as a pre-construction engineering survey at any civil engineering site, since it can reliably evaluate the competence of subsurface geomaterials.

Keywords: Competence rating, geoelectric, pseudosection, soil, vertical electrical sounding.

306 Implementation of Congestion Management Strategies on Arterial Roads: Case Study of Geelong

Authors: A. Das, L. Hitihamillage, S. Moridpour

Abstract:

Natural disasters are inevitable. Disasters such as floods, tsunamis, and tornadoes can be brutal, harsh, and devastating. In Australia, flooding is a major issue experienced in different parts of the country, and in such crises, delays in evacuation can decide between life and death for the people living in those regions. Congestion management can become a mammoth task if no steps are taken before such situations arise. In the past, strategies to manage congestion in such circumstances included converting road shoulders to extra lanes or changing the road geometry by adding more lanes. However, road expansion is no longer considered a viable way to resolve congestion problems; authorities avoid this option for many reasons, such as a lack of financial support and land space. They tend instead to focus on optimising the resources they already possess and on using traffic signals to overcome congestion problems. A traffic signal management strategy was considered a viable option to alleviate congestion problems in the City of Geelong, Victoria. An arterial road with signalised intersections is considered in this paper, and the traffic data required for modelling were collected from VicRoads. The traffic signalling software SIDRA was used to model the road from the information gathered from VicRoads. Various signal parameters were utilised to assess and improve the corridor performance, in order to achieve the best possible Level of Service (LOS) for the arterial road.

Keywords: Congestion, constraints, management, LOS.
