Search results for: learner support
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1918

328 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting

Authors: Kemal Polat

Abstract:

In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) is used for the classification of a diabetes disease dataset. The aims of this study are to reduce the variance within the attributes of the diabetes dataset and to improve the classification accuracy of the classifier algorithms by transforming a non-linearly separable dataset into a linearly separable one. The Pima Indians Diabetes dataset has two classes: normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of the K-means clustering method and is one of the most widely used clustering methods in data mining and machine learning applications. In the first stage of this study, fuzzy C-means clustering is used to find the centers of the attributes in the Pima Indians diabetes dataset, and the dataset is then weighted according to the ratios of the means of the attributes to their cluster centers. Secondly, after the weighting process, the classifier algorithms, including the support vector machine (SVM) and k-NN (k-nearest neighbor) classifiers, are used to classify the weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) obtains very promising results in the classification of the Pima Indians diabetes dataset.
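
A minimal sketch of the weighting idea described above, in Python (the per-attribute fuzzy C-means with two clusters, the averaging of the resulting centers and the helper names are illustrative assumptions rather than the paper's exact implementation):

import numpy as np

def fcm_centers_1d(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy C-means on a single attribute (1-D data); returns the cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # fuzzy memberships, rows sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
    return centers

def fcm_attribute_weighting(X, n_clusters=2):
    """Weight each attribute by the ratio of its mean to its FCM center(s)."""
    Xw = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        centers = fcm_centers_1d(X[:, j].astype(float), n_clusters)
        weight = X[:, j].mean() / centers.mean()   # assumption: average the FCM centers
        Xw[:, j] = X[:, j] * weight
    return Xw

# The weighted data would then be fed to the classifiers, e.g. with scikit-learn:
# from sklearn.svm import SVC
# clf = SVC().fit(fcm_attribute_weighting(X_train), y_train)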

Keywords: Fuzzy C-means clustering, Fuzzy C-means clustering based attribute weighting, Pima Indians diabetes dataset, SVM.

PDF Downloads: 1763
327 Optimized Facial Features-based Age Classification

Authors: Md. Zahangir Alom, Mei-Lan Piao, Md. Shariful Islam, Nam Kim, Jae-Hyeung Park

Abstract:

The evaluation and measurement of human body dimensions are achieved by physical anthropometry. This research was conducted in view of the importance of anthropometric indices of the face in forensic medicine, surgery, and medical imaging. The main goal of this research is to optimize facial feature points by establishing a mathematical relationship among facial features and to use the optimized feature points for age classification. Since the selected facial feature points are located in the areas of the mouth, nose, eyes and eyebrows in facial images, all desired facial feature points are extracted accurately. According to the proposed method, sixteen Euclidean distances are calculated from the eighteen selected facial feature points, both vertically and horizontally. The mathematical relationships among the horizontal and vertical distances are established. Moreover, it is also observed that the facial feature distances follow a constant ratio as age progresses: the distances between the specified feature points increase with age from childhood onwards, but the ratio of the distances does not change (d = 1.618). Finally, according to the proposed mathematical relationship, four independent feature distances related to eight feature points are selected from the sixteen distances and eighteen feature points, respectively. These four feature distances are used for age classification with a Support Vector Machine (SVM)-Sequential Minimal Optimization (SMO) algorithm, achieving around 96% accuracy. Experimental results show that the proposed system is effective and accurate for age classification.
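
A rough illustration of the distance-based features described above (the landmark index pairs are hypothetical, and scikit-learn's SVC, whose solver is SMO-based, stands in for the paper's SVM-SMO setup):

import numpy as np
from sklearn.svm import SVC

def feature_distances(points, pairs):
    """Euclidean distances between selected facial landmark pairs.
    points: (18, 2) array of (x, y) landmarks; pairs: list of landmark index pairs."""
    pts = np.asarray(points, dtype=float)
    return np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in pairs])

# Hypothetical index pairs for the four distances (eight feature points) used for
# classification; the paper's exact point selection is not reproduced here.
SELECTED_PAIRS = [(0, 1), (2, 3), (4, 5), (6, 7)]

def age_class_features(landmarks_per_face):
    return np.vstack([feature_distances(p, SELECTED_PAIRS) for p in landmarks_per_face])

# clf = SVC(kernel="rbf").fit(age_class_features(train_landmarks), train_age_labels)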

Keywords: 3D Face Model, Face Anthropometrics, Facial Features Extraction, Feature distances, SVM-SMO

PDF Downloads: 2047
326 Face Recognition Using Principal Component Analysis, K-Means Clustering, and Convolutional Neural Network

Authors: Zukisa Nante, Wang Zenghui

Abstract:

Face recognition is the problem of identifying or recognizing individuals in an image. This paper investigates a possible method to solve this problem. The method proposes an amalgamation of Principal Component Analysis (PCA), K-Means clustering, and a Convolutional Neural Network (CNN) for a face recognition system. It is trained and evaluated using the ORL dataset, which consists of 400 face images in 40 classes of 10 images each. Firstly, PCA enables the use of a smaller network: it removes redundancy and preserves the variance with a smaller number of coefficients, which reduces the training time of the CNN. Secondly, the K-Means clustering model is trained on the PCA-compressed data, which selects K-Means cluster centers with better characteristics. Lastly, the K-Means features serve as initial values and input data for the CNN. The accuracy and performance of the proposed method were tested against other face recognition (FR) techniques, namely PCA, Support Vector Machine (SVM), and K-Nearest Neighbour (kNN). During experimentation, our suggested method achieved the highest performance after 90 epochs: 99% accuracy, 99% F1-score, 99% precision, and 99% recall in 463.934 seconds. It outperformed PCA, which obtained 97%, and kNN, which obtained 84%, in the conducted experiments. Therefore, this method proved to be efficient in identifying faces in images.
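
One plausible reading of this pipeline is sketched below in Python, assuming flattened ORL images as input; the PCA dimensionality, the use of K-Means distances as the CNN input and the small 1-D CNN architecture are illustrative choices rather than the paper's exact configuration:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
import tensorflow as tf

def build_pipeline(X, n_components=60, n_clusters=40):
    """X: (n_samples, 92*112) flattened ORL face images."""
    pca = PCA(n_components=n_components).fit(X)        # drop redundancy, keep most variance
    Xp = pca.transform(X)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(Xp)
    # Distances to the K-Means centres serve as the representation fed to the CNN
    # (one plausible reading of "K-Means features act as input data").
    Xk = km.transform(Xp)[..., np.newaxis]             # shape: (n_samples, n_clusters, 1)
    return pca, km, Xk

def build_cnn(input_len, n_classes=40):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(input_len, 1)),
        tf.keras.layers.Conv1D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

# pca, km, Xk = build_pipeline(X_train)
# model = build_cnn(Xk.shape[1])
# model.compile("adam", "sparse_categorical_crossentropy", ["accuracy"])
# model.fit(Xk, y_train, epochs=90)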

Keywords: Face recognition, Principal Component Analysis, PCA, Convolutional Neural Network, CNN, Rectified Linear Unit, ReLU, feature extraction.

PDF Downloads: 505
325 Optimal Simultaneous Sizing and Siting of DGs and Smart Meters Considering Voltage Profile Improvement in Active Distribution Networks

Authors: T. Sattarpour, D. Nazarpour

Abstract:

This paper investigates the effect of simultaneous placement of DGs and smart meters (SMs) on voltage profile improvement in active distribution networks (ADNs). Responsive loads have recently received substantial attention in power system studies, alongside distributed generations (DGs). The existence of responsive loads in ADNs has an undeniable effect on the sizing and siting of DGs. For this reason, an optimal framework is proposed for sizing and siting DGs and SMs in ADNs. SMs are taken into consideration for the sake of successfully implementing demand response programs (DRPs), such as direct load control (DLC), with end-side consumers. With voltage profile improvement as the objective, the optimization procedure is solved by a genetic algorithm (GA) and tested on the IEEE 33-bus distribution test system. Different scenarios are established, with variations in the number of DG units, individual or simultaneous placement of DGs and SMs, and an adaptive power factor (APF) mode for DGs to support reactive power. The obtained results confirm the significant effect of DRPs and the APF mode in determining the optimal size and site of DGs to be connected in the ADN, resulting in an improved voltage profile as well.
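
A toy sketch of the GA search over DG buses and sizes is given below; the fitness uses a crude stand-in for a distribution load flow (the real study evaluates the voltage profile of the IEEE 33-bus system, including DRP and APF modes), so all numbers and the simple local-improvement model are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(1)
N_BUS, N_DG = 33, 2                       # IEEE 33-bus feeder, two DG units (illustrative)

# Toy stand-in for a load flow: each bus has a baseline voltage deviation |1 - V| (p.u.),
# and a DG placed at a bus shrinks the deviation there in proportion to its size.
base_dev = rng.uniform(0.02, 0.08, N_BUS)

def voltage_deviation(buses, sizes_mw):
    dev = base_dev.copy()
    for b, s in zip(buses, sizes_mw):
        dev[b] = max(dev[b] - 0.01 * s, 0.0)
    return dev.sum()                       # objective: total deviation, to be minimised

def random_individual():
    return (rng.choice(N_BUS, size=N_DG, replace=False),   # candidate DG buses
            rng.uniform(0.5, 2.0, N_DG))                   # candidate DG sizes (MW)

def ga(pop_size=40, generations=60, mutation=0.2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: voltage_deviation(*ind))  # rank by fitness
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            i, j = rng.integers(0, len(survivors), 2)
            (b1, s1), (b2, s2) = survivors[i], survivors[j]
            child_b = np.where(rng.random(N_DG) < 0.5, b1, b2)   # uniform crossover
            child_s = np.where(rng.random(N_DG) < 0.5, s1, s2)   # (duplicate buses possible)
            if rng.random() < mutation:                          # random re-siting mutation
                child_b[rng.integers(N_DG)] = rng.integers(N_BUS)
            children.append((child_b, child_s))
        pop = survivors + children
    return min(pop, key=lambda ind: voltage_deviation(*ind))

best_buses, best_sizes = ga()
print("best DG buses:", best_buses, "sizes (MW):", np.round(best_sizes, 2))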

Keywords: Active distribution network (ADN), distributed generations (DGs), smart meters (SMs), demand response programs (DRPs), adaptive power factor (APF).

PDF Downloads: 1770
324 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service

Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

In telemedicine, the image repository service is important for increasing the accuracy of the diagnostic support given to medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze their performance when loading and/or downloading medical images. The study focuses on comparing Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. For this, the Tabu Search and Simulated Annealing heuristics were chosen for their wide use in this type of application; QoS is measured using the following metrics: delay, throughput, jitter and latency. Routing tests were carried out on ten images of 40 MB in Digital Imaging and Communications in Medicine (DICOM) format. These tests ran for ten minutes under different traffic conditions, for a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia, to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search presents better QoS performance than Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.

Keywords: Medical image, QoS, simulated annealing, Tabu search, telemedicine.

PDF Downloads: 956
323 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments

Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein

Abstract:

Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impact on human engagement, ‘immersion’ and related emotional or ‘affective’ states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different lengths (e.g. 2, 3, 5, etc. seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a 3-dimensional valence-arousal-dominance space) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found between classification based on the affective clusters and classification based on the emotion labels.

Keywords: Virtual Reality, affective computing, affective VR, emotion-based affective physiological database.

PDF Downloads: 994
322 Analysis of the Influence of Reshoring on the Structural Behavior of Reinforced Concrete Beams

Authors: Keith Danila Aquino Neves, Júlia Borges dos Santos

Abstract:

There is little published research about the influence of execution methods on structural behavior. Structural analysis is typically based on the completed building, considering the actions of all forces for which it was designed. During construction, however, execution loads do not match the design loads, and in some cases loads begin to act before the concrete has reached its full strength. Changes to the support conditions of structural elements may occur, resulting in unforeseen alterations to the structure’s behavior. Shoring is an example of a construction process that, if executed improperly, directly influences structural performance and may result in unpredicted cracks and displacements. The NBR 14931/2004 standard, which guides the execution of reinforced concrete structures, states that shoring must be executed in a way that avoids unpredicted loads and may be removed only after analysis of the structure’s behavior by the professional responsible for the structural design. Differences in structural behavior are reduced for small spans. It is important to qualify and quantify how the incorrect placement of shores can compromise a structure’s safety. The results of this research allowed a more precise understanding of the relationship between spans and loads for which the influence of execution processes can be considerable, and reinforced that civil engineering practice must be carried out with the presence of a qualified professional, respecting existing standards’ guidelines.

Keywords: Structural analysis, structural behavior, reshoring, static scheme, reinforced concrete.

PDF Downloads: 776
321 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock suitable movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers’ demographic, behavioral and social information and to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik-Chervonenkis (VC) dimensions of the machine learning algorithm to detect and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers’ preferences with a small data set and to design prediction tools for these enterprises.
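
A minimal scikit-learn sketch of such a comparison is shown below; the train/test split, hyperparameters and feature preparation are assumptions, and the gap between in-sample and out-of-sample error serves as the practical proxy for the overfitting that the VC-dimension discussion addresses:

from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def compare_models(X, y):
    """X: customer demographic/behavioural/social features; y: preferred genre labels."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    models = {
        "gaussian_svm": SVC(kernel="rbf", gamma="scale", C=1.0),
        "logistic_regression": LogisticRegression(max_iter=1000),
    }
    report = {}
    for name, m in models.items():
        m.fit(X_tr, y_tr)
        report[name] = {
            "in_sample_error": 1 - m.score(X_tr, y_tr),
            # a large gap between the two errors hints at overfitting
            "out_of_sample_error": 1 - m.score(X_te, y_te),
        }
    return report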

Keywords: Computational social science, movie preference, machine learning, SVM.

PDF Downloads: 1649
320 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. Efficient NoSQL databases are needed to analyse such huge datasets effectively. This research integrates several datasets, which reduces query processing time and produces predictive visual artifacts, making it possible to analyse post-COVID-19 health and well-being outcomes and to evaluate the effectiveness of government efforts during the pandemic. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Spreading the datasets across a sharded database and indexing the individual shards enables effective data retrieval and analysis. The key goal is the analysis of connections between governmental activities, poverty levels, and post-pandemic well-being. We evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that support the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
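
A short pymongo sketch of the indexing and sharding setup described above; the database, collection and field names are illustrative, and it assumes a sharded cluster reached through a mongos router:

from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
db = client["post_pandemic"]

# 1. Indexing: a compound index on the fields most queries filter and sort on.
db.wellbeing.create_index([("country", ASCENDING), ("date", ASCENDING)])

# 2. Sharding: enable sharding for the database, then shard the collection on a hashed
#    key so documents spread evenly across shards as the dataset grows.
client.admin.command("enableSharding", "post_pandemic")
client.admin.command("shardCollection", "post_pandemic.wellbeing",
                     key={"country": "hashed"})

# An aggregation that benefits from both: each shard scans only its own indexed slice.
pipeline = [
    {"$match": {"date": {"$gte": "2021-01-01"}}},
    {"$group": {"_id": "$country", "avg_poverty": {"$avg": "$poverty_rate"}}},
]
print(list(db.wellbeing.aggregate(pipeline)))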

Keywords: COVID-19, big data, data analysis, indexing, NoSQL, sharding, scalability, poverty.

PDF Downloads: 67
319 Storage Method for Parts from End of Life Vehicles' Dismantling Process According to Sustainable Development Requirements: Polish Case Study

Authors: M. Kosacka, I. Kudelska

Abstract:

The vehicle is one of the most influential and complex products worldwide, affecting people’s lives, the state of the environment and the condition of the economy (all aspects of the sustainable development concept) during each stage of its lifecycle. With the increasing number of vehicles, there is a growing potential for the management of End of Life Vehicles (ELVs), which are hazardous waste. From one point of view, the ELV should be managed to eliminate risk, but from another, it should be treated as a source of valuable materials and spare parts. In order to recover materials and spare parts, recycling networks have been established, which are an example of sustainable policy realization at the national level. The basic object in the Polish recycling network is the dismantling facility. The output material streams of dismantling stations include waste, which very often generates costs, and spare parts, which have the biggest potential for revenue creation. Both outputs are stored in warehouses, in accordance with the law. Given this revenue and sustainability potential, a strong emphasis has been placed on the storage process. We present the concept of a storage method that takes into account the specifics of the dismantling facility in order to support decision-making with regard to the principles of sustainable development. The method was developed on the basis of a case study of one of the largest dismantling facilities in Poland.

Keywords: Dismantling, end of life vehicle, sustainability, storage.

PDF Downloads: 1366
318 Authenticity of Lipid and Soluble Sugar Profiles of Various Oat Cultivars (Avena sativa)

Authors: Marijana M. Ačanski, Kristian A. Pastor, Djura N. Vujić

Abstract:

The lipid and soluble sugar components in flour samples of different cultivars belonging to the common oat species (Avena sativa L.) were identified: spring oat, winter oat and hulless oat. Fatty acids were extracted from the flour samples with n-hexane and derivatized into volatile methyl esters using TMSH (trimethylsulfonium hydroxide in methanol). Soluble sugars were then extracted from the defatted and dried oat flour samples with 96% ethanol and further derivatized into the corresponding TMS-oximes using hydroxylamine hydrochloride solution and BSTFA (N,O-bis-(trimethylsilyl)-trifluoroacetamide). The hexane and ethanol extracts of each oat cultivar were analyzed using a GC-MS system. The lipid and simple sugar compositions are very similar in all samples of the investigated cultivars. A chemometric tool was applied to the numeric values of the automatically integrated surface areas of the detected lipid and simple sugar components in their corresponding derivatized forms. Hierarchical cluster analysis shows a very high similarity between the investigated oat flour samples according to their fatty acid content (0.9955), and a moderate similarity according to their soluble sugar content (0.50). These preliminary results support the idea of establishing methods for oat flour authentication and provide the means for distinguishing oat flour samples, regardless of the variety, from flour samples made of other cereal species, using only lipid and simple sugar profile analysis.
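
A small sketch of the chemometric step on the integrated peak areas; the peak-area values, the correlation-based distance and the average-linkage choice are assumptions used only to show how a similarity of about 0.9955 corresponds to a clustering distance:

import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Rows = oat flour samples, columns = automatically integrated peak areas of the
# derivatized fatty acids (or TMS-oxime sugars); the values here are made up.
peak_areas = np.array([
    [1.20, 0.34, 2.10, 0.05],   # spring oat
    [1.18, 0.36, 2.05, 0.06],   # winter oat
    [1.25, 0.33, 2.15, 0.05],   # hulless oat
])

# Correlation-based distance (1 - r): a distance of ~0.0045 corresponds to the
# ~0.9955 similarity reported for the fatty acid profiles.
dist = pdist(peak_areas, metric="correlation")
tree = linkage(dist, method="average")
d = dendrogram(tree, labels=["spring oat", "winter oat", "hulless oat"], no_plot=True)
print(d["ivl"])                 # leaf order of the resulting dendrogram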

Keywords: Authentication, chemometrics, GC-MS, lipid and soluble sugar composition, oat cultivars.

PDF Downloads: 1372
317 Classification of Potential Biomarkers in Breast Cancer Using Artificial Intelligence Algorithms and Anthropometric Datasets

Authors: Aref Aasi, Sahar Ebrahimi Bajgani, Erfan Aasi

Abstract:

Breast cancer (BC) continues to be the most frequent cancer in females and causes the highest number of cancer-related deaths in women worldwide. Inspired by recent advances in studying the relationship between different patient attributes and the disease, in this paper we investigate different classification methods for better diagnosis of BC in its early stages. Datasets from the University Hospital Centre of Coimbra were chosen, and different machine learning (ML)-based and neural network (NN) classifiers were studied. We selected favorable features among the nine provided attributes of the clinical dataset using a random forest algorithm. The dataset consists of both healthy controls and BC patients, and glucose, BMI, resistin, and age were found to be the most important attributes, in that order. Moreover, we analyzed these features with various ML-based classifier methods, including Decision Tree (DT), K-Nearest Neighbors (KNN), eXtreme Gradient Boosting (XGBoost), Logistic Regression (LR), Naive Bayes (NB), and Support Vector Machine (SVM), along with an NN-based Multi-Layer Perceptron (MLP) classifier. The results revealed that among these techniques, the SVM and MLP classifiers are the most accurate, at 96% and 92%, respectively. These results indicate that the adopted procedure can be used effectively for the classification of cancer cells, and they encourage further experimental investigations with more collected data for other types of cancer.
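
A compact scikit-learn sketch of this workflow; the file name, the number of retained features and the classifier hyperparameters are illustrative assumptions:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# The Coimbra dataset: nine attributes plus a "Classification" column
# (1 = healthy control, 2 = BC patient); the file name is hypothetical.
df = pd.read_csv("coimbra_breast_cancer.csv")
X, y = df.drop(columns="Classification"), df["Classification"]

# Rank attributes with a random forest and keep the most informative ones.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
top = X.columns[rf.feature_importances_.argsort()[::-1][:4]]
print("selected features:", list(top))          # e.g. Glucose, BMI, Resistin, Age

for name, clf in {
    "SVM": SVC(kernel="rbf"),
    "MLP": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
}.items():
    pipe = make_pipeline(StandardScaler(), clf)
    print(name, cross_val_score(pipe, X[top], y, cv=5).mean())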

Keywords: Breast cancer, health diagnosis, Machine Learning, biomarker classification, Neural Network.

PDF Downloads: 320
316 The Taiwanese Institutional Arrangement for Coastal Management Due to Climate Change

Authors: Wen-Hong Liu, Hao-Tang Jhan, Kun-Lung Lin, Meng-Tsung Lee

Abstract:

Weather disasters have recently been frequent in Taiwan, causing loss of life and property. Excessive concentration of population and a lack of integrated planning have left the Taiwanese coastal zone directly exposed to the impacts of climate change. Compared to many countries that have already set up legislation, competent authorities and national adaptation strategies, the capacity of coastal management in Taiwan to adapt to climate change is still insufficient. It is therefore necessary to establish a complete institutional arrangement for coastal management under climate change in order to protect the environment and sustain socio-economic development. This paper first reviews the impact of climate change on the Taiwanese coastal zone. Secondly, the development of the Taiwanese institutional arrangement for coastal management is introduced, followed by an analysis of four dimensions of that arrangement: legal basis, competent authority, scientific and financial support, and international cooperation. The results show that the Taiwanese government should: 1) integrate the climate change issue into the Coastal Act, Wetland Act and Territorial Planning Act and pass them; 2) establish a high-level competent authority for coastal management; 3) set up a climate change disaster coordination platform; 4) link scientific information and decision makers; 5) establish a climate change adjustment fund; 6) participate actively in international climate change organizations and meetings; and 7) cooperate with neighbouring countries to exchange experiences.

Keywords: Climate Change, Coastal Zone Management, Institutional Arrangement, Adaptation.

PDF Downloads: 1967
315 Steps towards the Development of National Health Data Standards in Developing Countries: An Exploratory Qualitative Study in Saudi Arabia

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian R. Murray

Abstract:

The health data standards that have proliferated today overlap and conflict to some extent, resulting in market confusion and increasing proprietary interests. The government’s role and support in the standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with implementing non-standard systems. The normative literature has not explored the different steps a government needs to undertake towards the development of national health data standards. Based on lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of: a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most importantly, change management is needed at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, particularly in developing countries.

Keywords: Interoperability, Case Study, Health Data Standards, Medical Data Exchange, Saudi Arabia.

PDF Downloads: 2002
314 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification in EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm, and on a software application called Training Builder that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
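
The sliding-window feature extraction can be pictured with the minimal sketch below; the specific features, window lengths and the small MLP are illustrative stand-ins for the 174 features and the Training Builder application described above:

import numpy as np
from sklearn.neural_network import MLPClassifier

def window_features(signal, fs, win_s=2.0, step_s=1.0):
    """Slide a window over one EEG channel and extract a few simple features
    (mean, standard deviation, line length, theta-band power ratio)."""
    win, step = int(win_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        x = signal[start:start + win]
        spectrum = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        theta = spectrum[(freqs >= 4) & (freqs < 8)].sum()
        total = spectrum[freqs < 40].sum() + 1e-12
        feats.append([x.mean(), x.std(), np.abs(np.diff(x)).sum(), theta / total])
    return np.array(feats)

# X = np.vstack([window_features(channel, fs=256) for channel in eeg_channels])
# y = per-window seizure / non-seizure labels aligned with the rows of X
# clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, y)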

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.

PDF Downloads: 1314
313 Quality Determinants of Client Satisfaction: A Case Study of Australian Consulting Engineers, Sydney, Australia

Authors: Elham S. Hasham, Anthony S. Hasham

Abstract:

The construction industry is one of Australia’s fastest growing industries, and its success depends on client satisfaction, with a focus on product determinants such as price and quality. Ensuring quality at every phase is a must, and building rapport with the client goes a long way. To capitalize on the growing demand for Engineering Consulting Firms (ECFs), it is imperative to stress the importance of customer satisfaction and excellence in standards and performance. Consequently, the emphasis should be on improving employee skills through various training provisions. Clients seek consistency and thus expect all services to be similar in quality and in their ability to meet clients’ needs. This calls for empowerment and comfortable work conditions to motivate employees and give them an incentive to deliver quality and excellent output. The methodology utilized is triangulation, a combination of quantitative and qualitative research. The case study, Australian Consulting Engineers (ACE), was established in Australia in 1995 and has operations throughout Australia, the Philippines, Europe, the UAE and KSA, with a branch in Lebanon. ACE is affiliated with key agencies and support organizations in the engineering industry and holds International Organization for Standardization (ISO) certifications in safety and quality management. The objective of this study, conducted in Australia, is significant as it sheds light on employee motivation and client satisfaction as imperative determinants of the success of an organization.

Keywords: Organizational behaviour, leadership, satisfaction, motivation, quality service.

PDF Downloads: 45
312 Selecting Negative Examples for Protein-Protein Interaction

Authors: Mohammad Shoyaib, M. Abdullah-Al-Wadud, Oksam Chae

Abstract:

Proteomics is one of the largest areas of research in bioinformatics and medical science. An ambitious goal of proteomics is to elucidate the structure, interactions and functions of all proteins within cells and organisms. Predicting Protein-Protein Interaction (PPI) is one of the crucial and decisive problems in current research. Genomic data offer a great opportunity, and at the same time many challenges, for the identification of these interactions. Many methods have already been proposed in this regard. For in-silico identification, most methods require both positive and negative examples of protein interaction, and the quality of these examples is crucial for the final prediction accuracy. Positive examples are relatively easy to obtain from well-known databases, but the generation of negative examples is not a trivial task. Current PPI identification methods generate negative examples based on assumptions that are likely to affect their prediction accuracy. Hence, if more reliable negative examples are used, PPI prediction methods may achieve even higher accuracy. Focusing on this issue, a graph-based negative example generation method is proposed, which is simple and more accurate than existing approaches. An interaction graph of the protein sequences is created. The basic assumption is that the longer the shortest path between two protein sequences in the interaction graph, the less likely they are to interact. A well-established PPI detection algorithm, employed with our negative examples, increases accuracy by more than 10% in most cases compared with the negative pair selection method originally used in that work.
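
A minimal sketch of the shortest-path selection rule, using networkx; the distance threshold, the cap on the number of pairs and the example protein names are assumptions:

import itertools
import networkx as nx

def negative_pairs(positive_interactions, min_path_len=4, max_pairs=1000):
    """Select protein pairs whose shortest-path distance in the interaction graph is at
    least min_path_len (or that are disconnected) as negative training examples."""
    g = nx.Graph(positive_interactions)          # edges = known (positive) interactions
    lengths = dict(nx.all_pairs_shortest_path_length(g))
    negatives = []
    for a, b in itertools.combinations(g.nodes, 2):
        d = lengths.get(a, {}).get(b)            # None means no connecting path at all
        if d is None or d >= min_path_len:
            negatives.append((a, b))
            if len(negatives) >= max_pairs:
                break
    return negatives

# pos = [("P53", "MDM2"), ("MDM2", "UBC"), ("P53", "ATM")]   # known interactions (toy)
# neg = negative_pairs(pos)   # pos/neg pairs, with sequence features, then train the SVM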

Keywords: Interaction graph, Negative training data, Protein-Protein interaction, Support vector machine.

PDF Downloads: 1702
311 A Framework for Enhancing Mobile Development Software for Rangsit University, Thailand

Authors: Thossaporn Thossansin

Abstract:

This paper presents the development of a mobile application for students of the Faculty of Information Technology, Rangsit University (RSU), Thailand. RSU is upgrading its enrollment process by improving its information systems, and students can easily download the RSU APP in order to access essential RSU information. The reason for having a mobile application is to help students access the system regardless of time and place. The objectives of this paper are: 1. to develop an application on the iOS platform for students of the Faculty of Information Technology, Rangsit University, Thailand; 2. to obtain the students’ perception of the new mobile app. The target group comprises students from the freshman to the senior year of the Faculty of Information Technology, Rangsit University. The new mobile application, called RSU APP, was developed by the Department of Information Technology, Rangsit University. It contains useful features and various functionalities, particularly those that support students. The core contents of the app are RSU’s announcements, calendar, events, activities, and e-books. The mobile app was developed on the iOS platform. User satisfaction was analyzed from interview data from 81 interviewees as well as from a Google Forms survey involving 122 respondents. The results show that users are satisfied with the application, giving an overall satisfaction score of 4.67 (SD 0.52). The score for whether users can learn and use the application quickly is also high, at 4.82 (SD 0.71). On the other hand, the lowest satisfaction rating concerns the app’s forms and lists, at 4.01 (SD 0.45).

Keywords: Mobile application, development of mobile application, framework of mobile development, software development for mobile devices.

PDF Downloads: 1696
310 AI-Based Approaches for Task Offloading, Resource Allocation and Service Placement of IoT Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

To support the continued growth and critical latency requirements of IoT applications, and to overcome various limitations of traditional data centers, Mobile Edge Computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes or in distant cloud data centers. However, we are often faced with conflicting optimization criteria: minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage and bandwidth) of mobile edge devices while keeping performance high (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making Task Offloading (TO), Resource Allocation (RA) and Service Placement (SP) complex processes, and studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of recent multi-objective optimization (MOO) approaches to TO, SP and RA in edge computing environments, particularly artificial intelligence (AI) based ones, that aim to satisfy the various objectives, constraints and dynamic conditions of IoT applications.

Keywords: Mobile Edge Computing, Multi-Objective Optimization, Artificial Intelligence Approaches, Task Offloading, Resource Allocation, Service Placement.

PDF Downloads: 514
309 Geophysical Investigation for Pre-Engineering Construction Works in Part of Ilorin, Northcentral Nigeria

Authors: O. Ologe, A. I. Augie

Abstract:

A geophysical investigation involving geoelectric depth sounding has been conducted as a pre-foundation study in part of Ilorin, Nigeria. The area is underlain by Precambrian basement complex rocks. Fifteen sounding stations were established along five traverses, and the Vertical Electrical Soundings (VES) conducted along each traverse (three to five per traverse) were subjected to computer iteration using the IP2Win software. Three to five subsurface geologic layers were delineated in the study area: topsoil with resistivity and thickness values ranging from 103-210 Ωm and 0-1 m; laterite (117-590 Ωm and 1-4.7 m); sandy clay (137-859 Ωm and 2.9-4.3 m); weathered layer (60.5-2539 Ωm and 3.2-10 m); and fresh basement (2253 Ωm-∞ and 7.1 m-∞), respectively. The resistivity pseudosection shows a continuous high-resistivity zone at the surface; the resistivity of this layer from 0-5 m depth varies from 300-800 Ωm along traverses 1 and 2. Hence, this layer is rated competent, as it has the ability to support engineering structures. However, along traverse 1, a very low resistivity layer occurs between VES 5 and 15, with resistivity values ranging from 30-70 Ωm; this layer was rated incompetent based on the competence rating. This study revealed the importance of geophysical surveying as a pre-construction engineering survey at any civil engineering site, since it can reliably evaluate the competence of the subsurface geomaterials.

Keywords: Competence rating, geoelectric, pseudosection, soil, vertical electrical sounding.

PDF Downloads: 559
308 Implementation of Congestion Management Strategies on Arterial Roads: Case Study of Geelong

Authors: A. Das, L. Hitihamillage, S. Moridpour

Abstract:

Natural disasters are inevitable, and disasters such as floods, tsunamis and tornadoes can be brutal, harsh and devastating. In Australia, flooding is a major issue experienced in different parts of the country. In such crises, delays in evacuation can decide the life and death of the people living in those regions, and congestion management can become a mammoth task if no steps are taken before such situations arise. In the past, many strategies were used to manage congestion in such circumstances, such as converting road shoulders to extra lanes or changing the road geometry by adding more lanes. However, road expansion is no longer considered a viable way of resolving congestion problems; authorities avoid this option for many reasons, such as a lack of financial support and land space, and tend instead to focus on optimising the resources they already possess and on using traffic signals to overcome congestion problems. A traffic signal management strategy was therefore considered a viable option to alleviate congestion problems in the City of Geelong, Victoria. An arterial road with signalised intersections is considered in this paper, and the traffic data required for modelling were collected from VicRoads. The traffic signalling software SIDRA was used to model the road from the information gathered from VicRoads. Various signal parameters were utilised to assess and improve the corridor performance in order to achieve the best possible Level of Service (LOS) for the arterial road.

Keywords: Congestion, constraints, management, LOS.

PDF Downloads: 963
307 Cultural Practices as a Coping Measure for Women who Terminated a Pregnancy in Adolescence: A Qualitative Study

Authors: Botshelo R. Sebola

Abstract:

Unintended pregnancy often results in pregnancy termination. Most countries have legalised the termination of a pregnancy and pregnant adolescents can visit designated clinics without their parents’ consent. In most African and Asian countries, certain cultural practices are performed following any form of childbirth, including abortion, and such practices are ingrained in societies. The aim of this paper was to understand how women who terminated a pregnancy during adolescence coped by embracing cultural practices. A descriptive multiple case study design was adopted for the study. In-depth, semi-structured interviews and reflective diaries were used for data collection. Participants were 13 women aged 20 to 35 years who had terminated a pregnancy in adolescence. Three women kept their soiled sanitary pads, burned them to ash and waited for the rainy season to scatter the ash in a flowing stream. This ritual was performed to appease the ancestors, ask them for forgiveness and as a send-off for the aborted foetus. Five women secretly consulted Sangoma (traditional healers) to perform certain rituals. Three women isolated themselves to perform herbal cleansings, and the last two chose not to engage in any sexual activity for one year, which led to the loss of their partners. This study offers a unique contribution to understanding the solitary journey of women who terminate a pregnancy. The study challenges healthcare professionals who work in clinics that offer pregnancy termination services to look beyond releasing the foetus to advocating and providing women with the necessary care and support in performing cultural practices.

Keywords: Adolescence, case study, cultural rituals, pregnancy.

PDF Downloads: 398
306 Resilient Manufacturing: Use of Augmented Reality to Advance Training and Operating Practices in Manual Assembly

Authors: L. C. Moreira, M. Kauffman

Abstract:

This paper outlines the results of experimental research on deploying an emerging augmented reality (AR) system for real-time task assistance (or work instructions) in highly customised and high-risk manual operations. The focus is on human operators’ training effectiveness and performance, and the aim is to test whether such technologies can help enhance knowledge retention and the accuracy of task execution in order to improve health and safety (H&S). An AR-enhanced assembly method is proposed and experimentally tested using a real industrial process as a case study: electric vehicle (EV) battery module assembly. The experimental results revealed that the proposed method improved training practices and performance, increasing knowledge retention from 40% to 84% and accuracy of task execution from 20% to 71%, compared to the traditional paper-based method. The results of this research validate and demonstrate how emerging technologies are advancing the choice between manual, hybrid and fully automated processes by promoting XR-assisted processes and the connected worker (a vision for Industry 4.0 and 5.0), and by supporting manufacturing in becoming more resilient in times of constant market change.

Keywords: Augmented reality, extended reality, connected worker, XR-assisted operator, manual assembly 4.0, industry 5.0, smart training, battery assembly.

PDF Downloads: 379
305 Modeling Spatial Distributions of Point and Nonpoint Source Pollution Loadings in the Great Lakes Watersheds

Authors: Chansheng He, Carlo DeMarchi

Abstract:

A physically based, spatially-distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years of 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates to the monitored total phosphorous load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as the input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces are developed to visualize the spatial and temporal distribution of the pollutant transport in support of water management programs.

Keywords: Distributed Large Basin Runoff Model, Great Lakes Watersheds, nonpoint source pollution, and point sources.

PDF Downloads: 1533
304 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text

Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni

Abstract:

The problem of entity relation discovery, a well-covered topic in the literature, consists in searching within unstructured sources (typically, text) in order to find connections among entities. These entities can be a whole dictionary, or a specific collection of named items. In many cases machine learning and/or text mining techniques are used for this goal. These approaches might be unfeasible in computationally challenging problems, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to create a graph of relations - a cooccurrence graph. Each cooccurrence highlights some degree of semantic correlation between the words, because it is more common to have related words close to each other than at opposite ends of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique into a Weighted-Distance Sliding Window, where each occurrence of two named items within the window is accounted for with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment to support this intuition by applying the technique to a data set consisting of the text of the Bible, split into verses.
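
A minimal sketch of the weighted-distance counting; the 1/distance weight is one plausible choice of a distance-decaying weight, not necessarily the exact function used in the paper:

from collections import defaultdict

def weighted_cooccurrence(tokens, window=10):
    """Build a cooccurrence graph where each pair of items seen within the window is
    weighted by 1/distance, so closer occurrences contribute stronger evidence."""
    weights = defaultdict(float)
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window, len(tokens))):
            a, b = sorted((tokens[i], tokens[j]))
            if a != b:
                weights[(a, b)] += 1.0 / (j - i)   # weight decays with token distance
    return weights

verse = "in the beginning god created the heaven and the earth".split()
print(weighted_cooccurrence(verse, window=5))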

Keywords: Cooccurrence graph, entity relation graph, unstructured text, weighted distance.

PDF Downloads: 684
303 Increasing Fishery Economic Added Value through Post Fishing Program: Cold Storage Program

Authors: Indrijuli Magsari Putri, Dicky R. Munaf

Abstract:

The purpose of this paper is to guide efforts to improve the economic added value of Indonesian fishery products through a post-fishing program, namely a cold storage program. Indonesia’s fisheries potential has been acknowledged worldwide: FAO (2009) listed Indonesia among the ten highest producers of fishery products in the world. Based on BPS (Statistics Indonesia) data, national fisheries production in 2011 reached 5.714 million tons, of which 93.55% came from marine fisheries and 6.45% from open waters. Waters make up two-thirds of Indonesian territory, which has given enormous benefits to Indonesia, especially its fishermen. Improving the economic level of fishermen requires efforts to develop fishery business units, and one of these efforts is improving the quality of products marketed at the regional and international levels. This certainly needs the support of various fishery facilities (from infrastructure to superstructure), one of which is cold storage. Given the many benefits of cold storage as a means of processing fishery resources, the Indonesia Maritime Security Coordinating Board (IMSCB), as one of the maritime institutions for maritime security and safety, has a program to empower coastal communities by encouraging the development of cold storage in middle and lower fishery business units. The development of cold storage facilities that are able to fulfil their role to the maximum requires the synergistic efforts of various parties.

Keywords: Cold Storage, Fish, Regulation.

PDF Downloads: 2116
302 Using SMS Mobile Technology to Assess the Mastery of Subject Content Knowledge of Science and Mathematics Teachers of Secondary Schools in Tanzania

Authors: Joel S. Mtebe, Aron Kondoro, Mussa M. Kissaka, Elia Kibga

Abstract:

Sub-Saharan Africa is described as the region with the second fastest growing mobile phone penetration in the world, higher than in the United States or the European Union. Mobile phones have been used to provide many opportunities to improve people’s lives in the region, such as in banking, marketing, entertainment, and paying for various bills such as water, TV, and electricity. However, the potential of mobile phones to enhance teaching and learning has not been explored. This study presents the experience of developing and delivering SMS-based quiz questions used to assess the mastery of subject content knowledge of science and mathematics secondary school teachers in Tanzania. The SMS quizzes were used as a follow-up support mechanism for 500 teachers who participated in a project to upgrade the subject content knowledge of science and mathematics teachers in Tanzania. Quizzes of 10-15 questions were sent to teachers each week for 8 weeks, and the results were analyzed using SPSS. The results show that teachers who participated in chemistry and biology performed better than those who participated in mathematics and physics. Teachers reported some challenges that led to poor performance. This research has several practical implications for those who are implementing or planning to use mobile phones in teaching and learning, especially in rural secondary schools in sub-Saharan Africa.

Keywords: Mobile learning, e-learning, educational technologies, SMS, secondary education, assessment.

PDF Downloads: 2067
301 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In the smart environment, when multiple users share the living space, if different service requirements from different users arise, then the context-aware system will have conflicting situations for making decisions about providing services. Therefore, the purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is to resolve those service conflicts among users. This study proposes developing a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user’s current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when the event is not single but overlaps with another, conflicts arise. This study adopts the “Multiple Events Correlation Matrix” in order to calculate the degree values of incidents and support values for each object. The matrix uses these values as the basis for making inferences for system service, and to further determine appropriate services when there is a conflict.

Keywords: Internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity.

PDF Downloads: 1190
300 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes

Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani

Abstract:

The development of methods to estimate gene functions is an important task in bioinformatics. One approach to annotation is the identification of the metabolic pathway that a gene is involved in. Since gene expression data reflect various intracellular phenomena, these data are considered to be related to gene functions. However, it has been difficult to estimate gene function with high accuracy, and the low accuracy of the estimation is thought to be caused by the difficulty of accurately measuring gene expression: even when measured under the same conditions, gene expression values usually vary. In this study, we propose a feature extraction method that focuses on this variability of gene expression to estimate genes’ metabolic pathways accurately. First, we estimated the distribution of each gene’s expression from replicate data. Next, we calculated the similarity between all gene pairs using KL divergence, a measure of the difference between distributions. Finally, we used the similarity vectors as feature vectors and trained a multiclass SVM to identify the genes’ metabolic pathways. To evaluate the developed method, we applied it to budding yeast and trained the multiclass SVM to identify seven metabolic pathways. As a result, the accuracy obtained with our developed method was higher than that obtained from the raw gene expression data. Thus, our developed method combined with KL divergence is useful for identifying genes’ metabolic pathways.
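
A small sketch of this feature construction; modelling each gene's replicate expression as a univariate Gaussian (so that the KL divergence has a closed form) and symmetrising the divergence are assumptions made for illustration:

import numpy as np
from sklearn.svm import SVC

def gaussian_kl(mu1, var1, mu2, var2):
    """Closed-form KL divergence between two univariate Gaussians, N(mu1,var1) || N(mu2,var2)."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def kl_feature_vectors(replicates):
    """replicates: (n_genes, n_replicates) expression matrix. Each gene's expression is
    modelled as a Gaussian estimated from its replicates; the feature vector of a gene is
    its symmetrised KL divergence to every other gene."""
    mu = replicates.mean(axis=1)
    var = replicates.var(axis=1) + 1e-9
    n = len(mu)
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            F[i, j] = 0.5 * (gaussian_kl(mu[i], var[i], mu[j], var[j])
                             + gaussian_kl(mu[j], var[j], mu[i], var[i]))
    return F

# F = kl_feature_vectors(expression_replicates)            # one row of F per gene
# clf = SVC(decision_function_shape="ovr").fit(F[train_idx], pathway_labels[train_idx])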

Keywords: Metabolic pathways, gene expression data, microarray, Kullback–Leibler divergence, KL divergence, support vector machines, SVM, machine learning.

PDF Downloads: 2336
299 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from accessing vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. Several techniques attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among these techniques.

Keywords: Co-scheduling, data-centric, data-intensive, data locality, in-memory storage, large scale.

PDF Downloads: 1491