Search results for: topography. Subject classification: 86 A 05
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1769

1199 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are major challenges for all types of media, especially social media. As large social networks such as Facebook and Twitter have admitted, there is a great deal of false information, fake likes and views, and duplicated accounts. Much of the information appearing on social media is doubtful and in some cases misleading, and it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the most effective techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method integrating K-means clustering and Support Vector Machine (SVM) approaches is proposed; it works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information. The detection performance was improved in two respects: detection runtime decreased, and classification accuracy increased thanks to the elimination of redundant features and the reduction of dataset dimensionality.
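
A minimal sketch of the four-step pipeline described in the abstract, assuming a generic labelled feature matrix; the placeholder data, the choice of absolute Pearson correlation as the feature similarity, and the number of clusters are illustrative assumptions rather than the authors' settings:

```python
# Sketch of the four-step pipeline: 1) compute feature-feature similarities,
# 2) cluster features with K-means, 3) keep one representative feature per cluster,
# 4) classify with an SVM on the reduced feature subset.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))          # placeholder news feature matrix
y = rng.integers(0, 2, size=500)        # placeholder fake/real labels

# Step 1: similarity between features (absolute Pearson correlation).
corr = np.abs(np.corrcoef(X, rowvar=False))

# Step 2: cluster the features using their correlation profiles.
k = 10
clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(corr)

# Step 3: pick the feature closest to each cluster centre as its representative.
selected = []
for c in range(k):
    members = np.where(clusters == c)[0]
    centre = corr[members].mean(axis=0)
    selected.append(members[np.argmin(np.linalg.norm(corr[members] - centre, axis=1))])

# Step 4: train an SVM on the reduced feature subset.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("accuracy on held-out data:", clf.score(X_te, y_te))
```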

Keywords: Fake news detection, feature selection, support vector machine, K-means clustering, machine learning, social media.

1198 Assessing Land Cover Change Trajectories in Olomouc, Czech Republic

Authors: Mukesh Singh Boori, Vít Voženílek

Abstract:

Olomouc is a unique and complex landscape with widespread forestation and varied land use. This research was conducted to assess important and complex land use change trajectories in the Olomouc region. Multi-temporal satellite data from 1991, 2001 and 2013 were used to extract land use/cover types by an object-oriented classification method. To achieve the objectives, three different aspects were examined: (1) calculating the quantity of each transition; (2) allocating location-based landscape patterns; and (3) comparing the land use/cover evaluation procedure. The land cover change trajectories show that 16.69% agriculture, 54.33% forest and 21.98% other areas (settlement, pasture and water bodies) were stable across all three time points. Approximately 30% of the study area remained the same land cover type from 1991 to 2013. Broad-scale political and socioeconomic factors also affected the rate and direction of landscape change. Distance from settlements was the most important predictor of land cover change trajectories. This shows that most landscape trajectories were driven by socio-economic activities and mainly led to positive changes in the ecological environment.

Keywords: Remote Sensing, land use/cover, Change trajectories, Image classification.

1197 Threshold Concepts in TESOL: A Thematic Analysis of Disciplinary Guiding Principles

Authors: Neil Morgan

Abstract:

The notion of Threshold Concepts has offered a fertile new perspective on the transformative effects of mastery of particular concepts on student understanding of subject matter and their developing identities as inductees into disciplinary discourse communities. Only by successfully traversing essential knowledge thresholds can neophytes achieve the more sophisticated understandings of subject matter possessed by mature members of a discipline. This paper uses thematic analysis of disciplinary guiding principles to identify nine candidate Threshold Concepts that appear to underpin effective TESOL practice. The relationship between these candidate TESOL Threshold Concepts, TESOL principles, and TESOL instructional techniques appears to be amenable to a schematic representation based on superordinate categories of TESOL practitioner concern and, as such, offers an alternative to the view of Threshold Concepts as a privileged subset of disciplinary core concepts. The paper concludes by exploring the potential of a Threshold Concepts framework to productively inform TESOL initial teacher education (ITE) and in-service education and training (INSET).

Keywords: TESOL, threshold concepts, TESOL principles, TESOL ITE/INSET, community of practice.

1196 Motor Imaginary Signal Classification Using Adaptive Recursive Bandpass Filter and Adaptive Autoregressive Models for Brain Machine Interface Designs

Authors: Vickneswaran Jeyabalan, Andrews Samraj, Loo Chu Kiong

Abstract:

A noteworthy point in the advancement of Brain Machine Interface (BMI) research is the ability to accurately extract features of brain signals and to classify them into targeted control actions with the simplest possible procedures, since the expected beneficiaries are people with disabilities. In this paper, a new feature extraction method combining adaptive band-pass filters and adaptive autoregressive (AAR) modelling is proposed and applied to the classification of right and left motor imagery signals extracted from the brain. The introduction of the adaptive band-pass filter improves the characterization of the autocorrelation functions of the AAR models, as it enhances and strengthens the EEG signal, which is noisy and stochastic in nature. Experimental results on the Graz BCI data set show that, with the proposed feature extraction method, LDA and SVM classifiers outperform other AAR approaches of the BCI 2003 competition in terms of mutual information, the competition criterion, and misclassification rate.
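
A simplified, non-adaptive sketch of the feature-extraction idea, in which a fixed Butterworth band-pass filter and Yule-Walker AR estimation stand in for the paper's adaptive recursive filter and adaptive AR modelling; the sampling rate, band edges, model order and placeholder EEG segment are assumptions:

```python
# Band-pass filter an EEG channel, then use autoregressive (AR) coefficients as
# classifier features (a static stand-in for the adaptive filtering/AAR in the paper).
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, lo=8.0, hi=30.0, fs=128.0, order=4):
    """Zero-phase Butterworth band-pass covering the mu/beta rhythms."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def ar_coefficients(x, order=6):
    """Yule-Walker AR coefficient estimate from the sample autocorrelation."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Placeholder single-trial, single-channel EEG segment.
rng = np.random.default_rng(1)
eeg = rng.normal(size=1024)
features = ar_coefficients(bandpass(eeg))   # feed these into an LDA/SVM classifier
print(features)
```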

Keywords: Adaptive autoregressive, adaptive bandpass filter, brain machine Interface, EEG, motor imaginary.

1195 Site Selection of Traffic Camera based on Dempster-Shafer and Bagging Theory

Authors: S. Rokhsari, M. Delavar, A. Sadeghi-Niaraki, A. Abed-Elmdoust, B. Moshiri

Abstract:

Traffic incidents adversely affect all parts of society, so equipping road networks with enough traffic devices can help decrease the number of accidents; choosing the best method for optimum site selection of these devices therefore helps implement a good monitoring system. This paper considers important criteria for the optimum site selection of traffic cameras based on aggregation methods such as Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need more traffic control, were identified for selecting important road links for traffic camera installation. Then, classification methods such as artificial neural network and decision tree algorithms were employed to classify road links based on their importance for camera installation. Finally, to improve the classifiers' results, aggregation methods based on Bagging and Dempster-Shafer theory were used.
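
A minimal sketch of Dempster's rule of combination, the fusion step named above, applied to two classifiers' evidence about a single road link; the frame of discernment and the mass values are illustrative only:

```python
# Dempster's rule of combination for fusing the evidence of two classifiers
# (e.g. a neural network and a decision tree) about one road link.
from itertools import product

THETA = frozenset({"install", "skip"})          # frame of discernment (illustrative)

def combine(m1, m2):
    """Dempster's rule: m(A) is proportional to the sum of m1(B)*m2(C) over B & C == A."""
    combined, conflict = {}, 0.0
    for (B, mB), (C, mC) in product(m1.items(), m2.items()):
        A = B & C
        if A:
            combined[A] = combined.get(A, 0.0) + mB * mC
        else:
            conflict += mB * mC
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Masses derived from two classifiers' confidence on the same link (illustrative values).
m_ann  = {frozenset({"install"}): 0.7, frozenset({"skip"}): 0.2, THETA: 0.1}
m_tree = {frozenset({"install"}): 0.6, frozenset({"skip"}): 0.3, THETA: 0.1}

print(combine(m_ann, m_tree))   # fused belief masses for camera installation
```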

Keywords: Aggregation, Bagging theory, Dempster-Shafer theory, Site selection

1194 Defining of the Shape of the Spine Using Moiré Method in Case of Patients with Scheuermann Disease

Authors: Petra Balla, Gabor Manhertz, Akos Antal

Abstract:

Nowadays, spinal deformities are very frequent problems among teenagers. Scheuermann disease is a one-dimensional deformity of the spine, yet its prevalence is over 11% among children. We used a traditional technique, the moiré method, for screening and diagnosing this type of spinal deformity. A LabVIEW program has been developed to evaluate the moiré pictures of patients with Scheuermann disease. Two different solutions were tested in this computer program: the extreme-point and the inflexion-point calculation methods. The results obtained with the two methods were compared, and both solutions appeared to be appropriate. Statistical results showed better efficiency for the extreme-point search method, where the average difference was only 6.09°.
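
An illustrative sketch of the two curve-evaluation ideas, locating extreme points (zero first derivative) versus inflexion points (zero second derivative) on a curve fitted to the back profile; the polynomial fit and the synthetic profile are assumptions, not the LabVIEW implementation:

```python
# Fit a smooth curve to the sagittal back profile recovered from the moiré picture,
# then locate either its extreme points (f'(z) = 0) or its inflexion points (f''(z) = 0).
import numpy as np

# Placeholder back-profile samples: depth of the back surface versus vertical position.
z = np.linspace(0.0, 1.0, 50)
depth = 0.05 * np.sin(2 * np.pi * z) + 0.01 * np.random.default_rng(2).normal(size=z.size)

poly = np.polynomial.Polynomial.fit(z, depth, deg=6)

def real_roots_in_range(p, lo, hi):
    """Real roots of polynomial p that fall inside [lo, hi]."""
    return [r.real for r in p.roots() if abs(r.imag) < 1e-9 and lo <= r.real <= hi]

extremes   = real_roots_in_range(poly.deriv(1), z.min(), z.max())   # extreme-point method
inflexions = real_roots_in_range(poly.deriv(2), z.min(), z.max())   # inflexion-point method
print("extreme points:", extremes)
print("inflexion points:", inflexions)
```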

Keywords: Spinal deformity, picture evaluation, moiré method, Scheuermann disease, curve detection, moiré topography.

1193 Current Status of Industry 4.0 in Material Handling Automation and In-house Logistics

Authors: Orestis K. Efthymiou, Stavros T. Ponis

Abstract:

In the last decade, a new industrial revolution seems to be emerging, supported once again by the rapid advancements of Information Technology in the area of Machine-to-Machine (M2M) communication, permitting large numbers of intelligent devices, e.g. sensors, to communicate with each other and take decisions with little or no human intervention. The advent of these technologies has triggered the emergence of a new category of hybrid (cyber-physical) manufacturing systems, combining advanced manufacturing techniques with innovative M2M applications based on the Internet of Things (IoT), under the umbrella term Industry 4.0. Even though the topic of Industry 4.0 has attracted much attention during the last few years, attempts to provide a systematic literature review of the subject are scarce. In this paper, we present the authors' initial study of the field, with a special focus on the use and applications of Industry 4.0 principles in material handling automation and in-house logistics. The research shows that, despite the vivid discussion around and attractiveness of the subject, many challenges and issues still have to be addressed before Industry 4.0 becomes standardized and widely applicable.

Keywords: Industry 4.0, internet of things, manufacturing systems, material handling, logistics.

1192 A Comparison of SVM-based Criteria in Evolutionary Method for Gene Selection and Classification of Microarray Data

Authors: Rameswar Debnath, Haruhisa Takahashi

Abstract:

An evolutionary method whose selection and recombination operations are based on generalization error bounds of the support vector machine (SVM) can select a subset of potentially informative genes for an SVM classifier very efficiently [7]. In this paper, we use the derivative of the error bound (first-order criterion) to select and recombine gene features in the evolutionary process, and compare its performance with that of the error bound itself (zero-order criterion). We also investigate several error bounds and their derivatives to compare performance and find the best criteria for gene selection and classification. We use 7 cancer-related human gene expression datasets to evaluate the performance of the zero-order and first-order criteria of the error bounds. Although both criteria follow the same strategy in theory, the experimental results identify the best criterion for microarray gene expression data.

Keywords: support vector machine, generalization error-bound, feature selection, evolutionary algorithm, microarray data

1191 Multivariate High Order Fuzzy Time Series Forecasting for Car Road Accidents

Authors: Tahseen A. Jilani, S. M. Aqil Burney, C. Ardil

Abstract:

In this paper, we present a new multivariate fuzzy time series forecasting method. This method assumes m factors, with one main factor of interest. The history of the past three years is used for making new forecasts. The new method is applied to forecasting the total number of car accidents in Belgium using four secondary factors. We also compare our proposed method with existing fuzzy time series forecasting methods. Experimentally, it is shown that our proposed method performs better than the existing methods. In practice, actuaries are interested in analyzing the patterns of casualties in road accidents; using fuzzy time series, they can define fuzzy premiums and fuzzy underwriting for car insurance and life insurance. The National Institute of Statistics, Belgium, provides a risk classification for each road region. Using this risk classification, we can predict the premium rate and underwriting of insurance policyholders.
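
The keywords refer to the average forecasting error rate (AFER); a small helper using its standard definition is sketched below, with illustrative figures rather than the Belgian accident data:

```python
# Average forecasting error rate (AFER): the mean of |actual - forecast| / actual, in percent.
def afer(actual, forecast):
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual   = [1200, 1150, 1180, 1210]   # e.g. yearly accident counts (illustrative)
forecast = [1185, 1170, 1165, 1220]
print(f"AFER = {afer(actual, forecast):.2f}%")
```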

Keywords: Average forecasting error rate (AFER), fuzziness of fuzzy sets, fuzzy If-Then rules, multivariate fuzzy time series.

1190 Machine Learning Approach for Identifying Dementia from MRI Images

Authors: S. K. Aruna, S. Chitra

Abstract:

This research paper presents a framework for classifying Magnetic Resonance Imaging (MRI) images for dementia. Dementia, an age-related cognitive decline, is indicated by degeneration of cortical and sub-cortical structures. Characterizing these morphological changes helps in understanding disease development and contributes to early prediction and prevention of the disease. Modelling that captures the brain's structural variability and remains valid for disease classification and interpretation is very challenging. Features are extracted using Gabor filters at orientations of 0°, 30°, 60° and 90° and the Gray Level Co-occurrence Matrix (GLCM). The features are then normalized and fused, and Independent Component Analysis (ICA) is used for feature selection. A Support Vector Machine (SVM) classifier with different kernels is evaluated for its efficiency in classifying dementia. The framework is evaluated using MRI images from the OASIS dataset, and the results show that the proposed feature-fusion classifier achieves higher classification accuracy.
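
A sketch of the texture-feature step using scikit-image (the 0.19+ API is assumed): Gabor responses at the four stated orientations plus GLCM statistics, computed here on a random placeholder slice; the Gabor frequency and GLCM settings are illustrative choices:

```python
# Gabor filter responses at four orientations plus GLCM statistics on a placeholder slice.
import numpy as np
from skimage.filters import gabor
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
slice_u8 = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)   # placeholder MRI slice

# Gabor responses at 0, 30, 60 and 90 degrees.
gabor_feats = []
for theta_deg in (0, 30, 60, 90):
    real, imag = gabor(slice_u8, frequency=0.2, theta=np.deg2rad(theta_deg))
    gabor_feats += [real.mean(), real.var()]

# GLCM statistics at the same four angles.
glcm = graycomatrix(slice_u8, distances=[1], angles=np.deg2rad([0, 30, 60, 90]),
                    levels=256, symmetric=True, normed=True)
glcm_feats = [graycoprops(glcm, prop).mean()
              for prop in ("contrast", "homogeneity", "energy", "correlation")]

features = np.array(gabor_feats + glcm_feats)   # fuse, then ICA selection + SVM
print(features.shape)
```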

Keywords: Magnetic resonance imaging, dementia, Gabor filter, gray level co-occurrence matrix, support vector machine.

1189 Artificial Generation of Visual Evoked Potential to Enhance Visual Ability

Authors: A. Vani, M. N. Mamatha

Abstract:

Visual signal processing in human beings occurs in the occipital lobe of the brain. The signals generated in the brain are universal for all human beings and are called Visual Evoked Potentials (VEP). Generally, visually impaired people lose sight because of severe damage to the eyes' natural photoreceptors, while the occipital lobe is still functioning. In this paper, a technique for artificially generating VEP is proposed to enhance the visual ability of the subject. The system uses electronic photoreceptors to capture an image, processes the image, and detects and recognizes the subject or object. The resulting voltage is further processed and can be transmitted wirelessly to a BioMEMS device implanted into the occipital lobe of the patient's brain. The proposed BioMEMS consists of an array of electrodes that generate neuron potentials similar to the VEP of sighted people. Thus, the neurons receive visual data from the BioMEMS, which helps generate partial vision or sight for the visually challenged patient.

Keywords: Visual evoked potential, OpenViBe, BioMEMS, Neuro prosthesis.

1188 Landscape Data Transformation: Categorical Descriptions to Numerical Descriptors

Authors: Dennis A. Apuan

Abstract:

Categorical data based on descriptions of the agricultural landscape impose some mathematical and analytical limitations. This problem, however, can be overcome by data transformation through a coding scheme and the use of a non-parametric multivariate approach. The present study describes data transformation from qualitative to numerical descriptors. In a collection of 103 random soil samples over a 60-hectare field, categorical data were obtained for the following variables: levels of nitrogen, phosphorus, potassium, pH, hue, chroma and value, together with data on topography, vegetation type, and the presence of rocks. The categorical data were coded, and Spearman's rho correlation was then calculated using PAST software ver. 1.78, on which Principal Component Analysis was based. Results revealed a successful data transformation, generating 1030 quantitative descriptors. Visualization based on the new set of descriptors showed clear differences among sites, and the amount of variation was successfully measured. Possible applications of data transformation are discussed.
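
A sketch of the transformation idea, assuming an illustrative ordinal coding scheme and three toy records; the paper itself performed these steps (Spearman's rho and PCA) in the PAST software:

```python
# Code categorical soil/landscape descriptions as ordinal integers, compute Spearman's
# rho between the coded variables, and run a PCA ordination of the samples.
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

codes = {"low": 1, "medium": 2, "high": 3,          # illustrative coding scheme
         "flat": 1, "rolling": 2, "steep": 3,
         "grass": 1, "shrub": 2, "trees": 3,
         "absent": 0, "present": 1}

# Each record: (nitrogen, phosphorus, potassium, topography, vegetation, rocks)
samples = [
    ("low",    "medium", "high",   "flat",    "grass", "present"),
    ("medium", "medium", "low",    "rolling", "shrub", "absent"),
    ("high",   "low",    "medium", "steep",   "trees", "absent"),
]
X = np.array([[codes[v] for v in row] for row in samples], dtype=float)

rho, _ = spearmanr(X)                           # rank correlation between coded variables
print("Spearman's rho matrix:\n", rho)

scores = PCA(n_components=2).fit_transform(X)   # ordination of the coded samples
print("PCA scores:\n", scores)
```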

Keywords: Data transformation, numerical descriptors, principal component analysis.

1187 Measuring Cognitive Load - A Solution to Ease Learning of Programming

Authors: Muhammed Yousoof, Mohd Sapiyan, Khaja Kamaluddin

Abstract:

Learning programming is difficult for many learners. Some studies have found that the main difficulty relates to cognitive load. Cognitive overload happens in programming due to the nature of the subject, which is intrinsically demanding on working memory, and due to the complexity of the subject itself. The problem is made worse by the poor instructional design methodology used in the teaching and learning process. Various efforts have been proposed to reduce the cognitive load, e.g. visualization software and the part-program method. Many computer-based systems have also been tried to tackle the problem. However, little success has been achieved in alleviating it, and more has to be done to overcome this hurdle. This research attempts to understand how cognitive load can be managed so as to reduce overloading. We propose a mechanism to measure cognitive load during the pre-instruction, in-instruction and post-instruction stages of learning. This mechanism is used to guide instruction: as the load changes, the instruction adapts itself to ensure cognitive viability. The mechanism could be incorporated as a sub-domain in the student model of various computer-based instructional systems to facilitate the learning of programming.

Keywords: Cognitive load, working memory, cognitive load measurement.

1186 Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud

Authors: Maha Shamseddine, Amjad Nusayr, Wassim Itani

Abstract:

In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied to user data when stored and processed in Cloud data centers. The proposed architecture consists of establishing user-configurable ‘sticky’ policies on the Graphical User Interface (GUI) data-bound components during the application development phase to specify the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control over the degree and scope of privacy enforcement, including the type of execution containers used to process the data in the Cloud. This not only enhances the privacy-awareness of the developed Cloud services, but also yields major savings in performance and energy efficiency, because the privacy mechanisms are applied solely to sensitive data units and not to all of the user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.

Keywords: Privacy enforcement, Platform-as-a-Service privacy awareness, cloud computing privacy.

1185 Automatic Building an Extensive Arabic FA Terms Dictionary

Authors: El-Sayed Atlam, Masao Fuketa, Kazuhiro Morita, Jun-ichi Aoe

Abstract:

Field Association (FA) terms are a limited set of discriminating terms that provide the knowledge to identify document fields, which is effective in document classification, similar-file retrieval and passage retrieval. The problem lies in the lack of an effective method to automatically extract relevant Arabic FA terms to build a comprehensive dictionary. Moreover, all previous studies are based on FA terms in English and Japanese, and extending FA terms to other languages such as Arabic would definitely strengthen further research. This paper presents a new method to extract Arabic FA terms from domain-specific corpora using part-of-speech (POS) pattern rules and corpora comparison. An experimental evaluation is carried out for 14 different fields using 251 MB of domain-specific corpora obtained from Arabic Wikipedia dumps and Alhyah news, selecting an average of 2,825 FA terms (single and compound) per field. From the experimental results, recall and precision are 84% and 79%, respectively. Therefore, this method selects a higher number of relevant Arabic FA terms at high precision and recall.

Keywords: Arabic Field Association Terms, information extraction, document classification, information retrieval.

1184 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ

Authors: Khaled Abduesslam. M, Mohammed Ali, Basher H Alsdai, Muhammad Nizam, Inayati

Abstract:

This paper presents voltage problem location classification using the performance of the Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England test system. The data were collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Simulation outputs such as voltage, phase angle, real power and reactive power were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system by considering increased load on the buses. To verify the proposed LS-SVM, its performance was compared to that of LVQ. The results showed that LS-SVM is faster and better than LVQ: LS-SVM achieved 0% misclassification whereas LVQ had 7.69% misclassification.

Keywords: IEEE 39 bus, Least Squares Support Vector Machine, Learning Vector Quantization, Voltage Collapse.

1183 The Solution of the Direct Problem of Electrical Prospecting with Direct Current under Conditions of Ground Surface Relief

Authors: Balgaisha Mukanova, Tolkyn Mirgalikyzy

Abstract:

The theory of interpretation of electromagnetic fields studied in direct-current electrical prospecting is mainly developed for observations on a horizontal surface. In practice, however, we often have to work on difficult terrain. Performing the interpretation without accounting for the influence of topography can produce non-existent anomalies on sections. This raises the problem of studying the impact of different shapes of ground surface relief on the results of electrical prospecting surveys. This research examines numerical solutions of the direct problem of electrical prospecting for two-dimensional and three-dimensional media, taking the terrain into account. The problem is solved using the method of integral equations, and the density of secondary currents on the relief surface is obtained.

Keywords: Ground surface relief, method of integral equations, numerical method.

1182 Series-Parallel Systems Reliability Optimization Using Genetic Algorithm and Statistical Analysis

Authors: Essa Abrahim Abdulgader Saleem, Thien-My Dao

Abstract:

The main objective of this paper is to optimize series-parallel system reliability using a Genetic Algorithm (GA) and statistical analysis, considering system reliability constraints that involve the redundancy levels of the selected components, total cost, and total weight. To perform this work, firstly, the mathematical model that maximizes system reliability subject to maximum system cost and maximum system weight constraints is presented; secondly, statistical analysis is used to optimize the GA parameters; and thirdly, the GA is used to optimize series-parallel system reliability. The objective is to determine the strategy for choosing the redundancy level of each subsystem that maximizes the overall system reliability subject to the total cost and total weight constraints. Finally, the reliability optimization results for the series-parallel system case study are shown, and comparisons with previous results are presented to demonstrate the performance of our GA.
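
The standard series-parallel reliability model that such a GA typically optimizes is sketched below; the component reliabilities, costs, weights and constraint limits are illustrative, and a GA would search over the redundancy vector n:

```python
# Series-parallel reliability: subsystem i with n_i redundant components of reliability
# r_i has reliability 1 - (1 - r_i)**n_i, and the series system is the product over
# subsystems. Infeasible redundancy vectors (cost/weight violations) get zero fitness.
import math

r = [0.90, 0.85, 0.95]          # component reliabilities per subsystem (illustrative)
cost = [4.0, 5.0, 3.0]          # cost per component
weight = [2.0, 3.0, 1.5]        # weight per component
COST_MAX, WEIGHT_MAX = 40.0, 25.0

def system_reliability(n):
    return math.prod(1.0 - (1.0 - ri) ** ni for ri, ni in zip(r, n))

def feasible(n):
    return (sum(c * ni for c, ni in zip(cost, n)) <= COST_MAX and
            sum(w * ni for w, ni in zip(weight, n)) <= WEIGHT_MAX)

def fitness(n):
    """GA fitness: system reliability, with infeasible redundancy vectors rejected."""
    return system_reliability(n) if feasible(n) else 0.0

print(fitness((2, 3, 2)))       # reliability of one candidate redundancy allocation
```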

Keywords: Genetic algorithm, optimization, reliability, statistical analysis.

1181 Clinical Decision Support for Disease Classification based on the Tests Association

Authors: Sung Ho Ha, Seong Hyeon Joo, Eun Kyung Kwon

Abstract:

Until recently, researchers have developed various tools and methodologies for effective clinical decision-making. Among these, chest pain diseases have been one of the important diagnostic issues, especially in emergency departments. To improve physicians' diagnostic ability, many researchers have developed diagnostic intelligence using machine learning and data mining. However, most conventional methodologies have been based on a single classifier for disease classification and prediction, which shows only moderate performance. This study utilizes an ensemble strategy that combines multiple different classifiers to help physicians diagnose chest pain diseases more accurately than before. Specifically, the ensemble strategy integrates decision trees, neural networks, and support vector machines. The ensemble models are applied to real-world emergency data, and the results show that their performance is superior to that of each single classifier.
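
The abstract does not state how the three classifiers are combined; a soft-voting ensemble in scikit-learn is one common realisation and is sketched below, with synthetic data standing in for the emergency-department records:

```python
# Soft-voting ensemble over a decision tree, a neural network and an SVM.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the chest-pain records.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)),
        ("svm", SVC(kernel="rbf", probability=True, random_state=0)),
    ],
    voting="soft",   # average the predicted class probabilities
)
ensemble.fit(X_tr, y_tr)
print("ensemble accuracy:", ensemble.score(X_te, y_te))
```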

Keywords: Diagnosis intelligence, ensemble approach, data mining, emergency department

1180 A Learning-Community Recommendation Approach for Web-Based Cooperative Learning

Authors: Jian-Wei Li, Yao-Tien Wang, Yi-Chun Chang

Abstract:

Cooperative learning has been defined as learners working together as a team to solve a problem, complete a task, or accomplish a common goal, which emphasizes the importance of interactions among members in promoting overall learning performance. With the popularity of social networks, cooperative learning is no longer limited to traditional classroom teaching activities. Since social networks make it easier to organize online learners, establish common shared visions, and advance learning interaction, online communities and online learning communities have triggered the establishment of web-based societies. Numerous studies have indicated that the collaborative learning community is a critical factor in enhancing learning performance. Hence, this paper proposes a learning-community recommendation approach, based on k-nearest neighbor (kNN) classification, that helps a learner join appropriate learning communities. To demonstrate the viability of the proposed approach, it is implemented to recommend learning communities for 117 students. The experimental results indicate that the proposed approach can effectively recommend appropriate learning communities for learners.
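
A minimal sketch of kNN-based community recommendation, assuming invented learner profile vectors and community labels:

```python
# Learners with known community memberships are stored as profile vectors; a new learner
# is assigned the community most common among the k nearest profiles.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
profiles = rng.normal(size=(117, 5))        # e.g. interest/skill scores of 117 students
communities = rng.integers(0, 4, size=117)  # existing learning-community labels

recommender = KNeighborsClassifier(n_neighbors=5).fit(profiles, communities)

new_learner = rng.normal(size=(1, 5))
print("recommended community:", recommender.predict(new_learner)[0])
print("membership probabilities:", recommender.predict_proba(new_learner)[0])
```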

Keywords: k-nearest neighbor classification, learning community, Cooperative/Collaborative Learning and Environments.

1179 A Review on Hydraulic and Morphological Characteristics in River Channels Due to Spurs

Authors: M. Alauddin, M. M. Hossain, M. N. Uddin, M. E. Haque

Abstract:

An optimal design of a spur is the first requirement for making it sustainable and ensuring it functions properly. In view of that, a thorough understanding of the hydro- and morpho-dynamics induced by spurs is essential. This paper presents a literature review on the effects of spurs with the aim of obtaining the most recent design criteria. Perpendicular and upstream-aligned impermeable spurs cause large disturbances to the flow and have less stability because of strong vortices and the associated scour. Downstream-aligned spurs minimize scour holes, but there is a chance of a strong return current, which can be controlled by allowing flow through the spurs. A series arrangement of spurs is important for achieving the desired results, with special care needed for the first one. Several equations for predicting the scour depth have been presented in the paper, but they have to be used carefully. The different flow environments developed by spurs are favorable for various aquatic species; however, it is important to maintain a nearly stable flow condition by providing stable spurs.

Keywords: Bed topography, flow pattern, scour, spur.

1178 Influence of Argon Gas Concentration in N2-Ar Plasma for the Nitridation of Si in Abnormal Glow Discharge

Authors: K. Abbas, R. Ahmad, I. A. Khan, S. Saleem, U. Ikhlaq

Abstract:

Nitriding of p-type Si samples by pulsed DC glow discharge is carried out for different Ar concentrations (30% to 90%) in nitrogen-argon plasma, whereas the other parameters, such as pressure (2 mbar), treatment time (4 hr) and power (175 W), are kept constant. The phase identification, crystal structure, crystallinity, chemical composition, surface morphology and topography of the nitrided layer are studied using X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), optical microscopy (OM), scanning electron microscopy (SEM) and atomic force microscopy (AFM), respectively. The XRD patterns reveal the development of different diffraction planes of Si3N4, confirming the formation of a polycrystalline layer. The FTIR spectrum confirms the formation of bonds between Si and N. The results reveal that the addition of Ar to the N2 plasma plays an important role in enhancing the production of active species, which facilitates nitrogen diffusion.

Keywords: Crystallinity, glow discharge, nitriding, sputtering.

1177 A Real-Time Specific Weed Recognition System Using Statistical Methods

Authors: Imran Ahmed, Muhammad Islam, Syed Inayat Ali Shah, Awais Adnan

Abstract:

The identification and classification of weeds are of major technical and economic importance in the agricultural industry. To automate these activities, a weed control system based on attributes such as shape, color and texture is feasible. The goal of this paper is to build a real-time, machine vision weed control system that can detect weed locations. To accomplish this objective, a real-time robotic system is developed to identify and locate outdoor plants using machine vision technology and pattern recognition. An algorithm is developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed algorithm has been tested on weeds at various locations, and the tests have shown the algorithm to be very effective in weed identification. Furthermore, the results show very reliable performance on weeds under varying field conditions. The analysis of the results shows over 90 percent classification accuracy over 140 sample images (broad and narrow), with 70 samples from each category of weeds.

Keywords: Weed detection, image processing, real-time recognition, standard deviation.

1176 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when facing large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently with classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

1175 Artificial Intelligence Techniques Applications for Power Disturbances Classification

Authors: K. Manimala, K. Selvi, R. Ahila

Abstract:

Artificial Intelligence (AI) methods are increasingly being used for problem solving. This paper concerns the use of AI-type learning machines for the power quality problem, a problem of general interest to power systems, which must provide quality power to all appliances. Electrical power of good quality is essential for the proper operation of electronic equipment such as computers and PLCs. Malfunction of such equipment may lead to loss of production or disruption of critical services, resulting in huge financial and other losses. It is therefore necessary that critical loads be supplied with electricity of acceptable quality. Recognizing the presence of a disturbance and classifying it into a particular type is the first step in combating the problem. In this work, two classes of AI methods for power quality data mining are studied: Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs). We show that SVMs are superior to ANNs in two critical respects: SVMs train and run an order of magnitude faster, and they give higher classification accuracy.

Keywords: back propagation network, power quality, probabilistic neural network, radial basis function support vector machine

1174 Optimal Maintenance Policy for a Partially Observable Two-Unit System

Authors: Leila Jafari, Viliam Makis, Akram Khaleghei G.B.

Abstract:

In this paper, we present a maintenance model of a two-unit series system with economic dependence. Unit #1, which is considered more expensive and more important, is subject to condition monitoring (CM) at equidistant, discrete time epochs, and unit #2, which is not subject to CM, has a general lifetime distribution. The multivariate observation vectors obtained through condition monitoring carry partial information about the hidden state of unit #1, which can be in a healthy or a warning state while operating. Only the failure state is assumed to be observable for both units. The objective is to find an optimal opportunistic maintenance policy minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the partially observable semi-Markov decision process framework. An effective computational algorithm for finding the optimal policy and the minimum average cost is developed and illustrated by a numerical example.

Keywords: Condition-Based Maintenance, Semi-Markov Decision Process, Multivariate Bayesian Control Chart, Partially Observable System, Two-unit System.

1173 Protein Graph Partitioning by Mutually Maximization of cycle-distributions

Authors: Frank Emmert-Streib

Abstract:

The classification of protein structure is commonly performed not for the whole protein but for structural domains, i.e., compact functional units preserved during evolution. Hence, a first step towards protein structure classification is the separation of the protein into its domains. We approach the problem of protein domain identification by proposing a novel graph-theoretical algorithm. We represent the protein structure as an undirected, unweighted and unlabeled graph whose nodes correspond to the secondary structure elements of the protein; this graph is called the protein graph. The domains are then identified as partitions of the graph corresponding to vertex sets obtained by maximizing an objective function that mutually maximizes the cycle distributions found in the partitions of the graph. Our algorithm does not utilize any information other than the cycle distributions to find the partitions. If a partition is found, the algorithm is iteratively applied to each of the resulting subgraphs. As a stopping criterion, we numerically calculate a significance level that indicates the stability of the predicted partition against a random rewiring of the protein graph; hence, the algorithm terminates its iterative application automatically. We present results for one- and two-domain proteins, compare them with the domains manually assigned in the SCOP database, and discuss the differences.

Keywords: Graph partitioning, unweighted graph, protein domains.

1172 Static Single Point Positioning Using The Extended Kalman Filter

Authors: I. Sarras, G. Gerakios, A. Diamantis, A. I. Dounis, G. P. Syrcos

Abstract:

Global Positioning System (GPS) technology is widely used today in the areas of geodesy and topography, as well as in aeronautics, mainly for military purposes. Due to the military usage of GPS, full access to and use of this technology is denied to civilian users, who must then work with a less accurate version. In this paper we focus on the estimation of the receiver coordinates (X, Y, Z) and clock bias (δtr) of a fixed point based on pseudorange measurements of a single GPS receiver. Utilizing the instantaneous coordinates of just four satellites and their clock offsets, and taking into account the atmospheric delays, we derive a set of pseudorange equations. The estimation of the four unknowns (X, Y, Z, δtr) is achieved by introducing an extended Kalman filter that processes, off-line, all the data collected from the receiver. Higher positioning accuracy is attained by appropriate tuning of the filter noise parameters and by including other forms of biases.
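
A sketch of the nonlinear pseudorange measurement model and the Jacobian that an extended Kalman filter linearises at each update; the satellite coordinates and trial state are placeholders, atmospheric delay terms are omitted, and the filter's predict/update steps are not shown:

```python
# Pseudorange measurement model: rho_i = ||p - s_i|| + c*dt_r for state [X, Y, Z, dt_r].
import numpy as np

C = 299_792_458.0                      # speed of light [m/s]

def h(state, sats):
    """Predicted pseudoranges for state = [X, Y, Z, dt_r] and satellite positions sats."""
    p, dt_r = state[:3], state[3]
    return np.linalg.norm(sats - p, axis=1) + C * dt_r

def H_jacobian(state, sats):
    """Jacobian of h: unit line-of-sight vectors (receiver minus satellite) and c."""
    p = state[:3]
    diff = p - sats
    ranges = np.linalg.norm(diff, axis=1, keepdims=True)
    return np.hstack([diff / ranges, np.full((len(sats), 1), C)])

# Four placeholder satellite positions (ECEF, metres) and a trial state.
sats = np.array([[15600e3, 7540e3, 20140e3],
                 [18760e3, 2750e3, 18610e3],
                 [17610e3, 14630e3, 13480e3],
                 [19170e3, 610e3, 18390e3]])
state = np.array([0.0, 0.0, 6371e3, 1e-6])
print(h(state, sats))
print(H_jacobian(state, sats).shape)    # (4 measurements, 4 state variables)
```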

Keywords: Extended Kalman filter, GPS, Pseudorange

1171 Content Based Image Retrieval of Brain MR Images across Different Classes

Authors: Abraham Varghese, Kannan Balakrishnan, Reji R. Varghese, Joseph S. Paul

Abstract:

Magnetic Resonance Imaging plays a vital role in the decision-diagnosis process for brain MR images. For an accurate diagnosis of brain-related problems, experts mostly compare T1-weighted and T2-weighted images, as the information presented in these two images is complementary. In this paper, a rotation- and translation-invariant form of the Local Binary Pattern (LBP) with additional gray-scale information is used to retrieve similar slices of T1-weighted images from T2-weighted images, or vice versa. Incorporating additional gray-scale information into the LBP extracts more local texture information. The accuracy of retrieval can be improved by extracting moment features of the LBP and reweighting the features based on user feedback. Retrieval is performed in a single-subject scenario, where similar images of a particular subject at a particular level are retrieved, and in a multiple-subject scenario, where relevant images at a particular level across subjects are retrieved.
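
A sketch of the retrieval idea using rotation-invariant uniform LBP histograms and a simple distance ranking (scikit-image); random arrays stand in for the T1/T2 slices, and the paper's additional grey-scale and moment features are omitted:

```python
# Describe each slice by a rotation-invariant uniform LBP histogram and rank candidate
# slices by histogram distance to the query slice.
import numpy as np
from skimage.feature import local_binary_pattern

P, R = 8, 1.0
N_BINS = P + 2                     # number of codes produced by the 'uniform' method

def lbp_histogram(img):
    codes = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=N_BINS, range=(0, N_BINS), density=True)
    return hist

rng = np.random.default_rng(5)
t2_query = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # query slice (e.g. T2)
t1_stack = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # candidates (e.g. T1)
            for _ in range(10)]

q = lbp_histogram(t2_query)
dists = [np.linalg.norm(q - lbp_histogram(s)) for s in t1_stack]
print("most similar slice index:", int(np.argmin(dists)))
```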

Keywords: Local Binary pattern (LBP), Modified Local Binary pattern (MOD-LBP), T1 and T2 weighted images, Moment features.

1170 Detecting HCC Tumor in Three Phasic CT Liver Images with Optimization of Neural Network

Authors: Mahdieh Khalilinezhad, Silvana Dellepiane, Gianni Vernazza

Abstract:

The aim of this work is to build a model based on tissue characterization that is able to discriminate pathological and non-pathological regions in three-phasic CT images. Based on feature selection in the different phases, we design a neural network system with an optimal number of neurons in the hidden layer. Our approach consists of three steps: feature selection, feature reduction, and classification. For each region of interest (ROI), six distinct sets of texture features are extracted: first-order histogram parameters, absolute gradient, run-length matrix, co-occurrence matrix, autoregressive model, and wavelet features, for a total of 270 texture features. When analyzing multiple phases, we show that the injection of liquid causes changes in the highly relevant features in each region. Our results demonstrate that phase 3 is the best for detecting HCC tumors for most of the features fed to the classification algorithm. The detection accuracy between the pathological and healthy classes obtained with first-order histogram parameters is 85% in phase 1, 95% in phase 2, and 95% in phase 3.

Keywords: Feature selection, Multi-phasic liver images, Neural network, Texture analysis.
