Search results for: Document Classification.
487 An Analysis of the Five Most Used Numerals and a Proposal for the Adoption of a Universally Acceptable Numeral (UAN)
Authors: Mufutau Ayinla Abdul-Yakeen
Abstract:
An analysis of the five most used numerals and a proposal for the adoption of a Universally Acceptable Numeral (UAN) arose from the researcher's inquiry into the need for a set of numerals that is universally accepted. The researcher sought the meaning of the first letter, "Nun", "ن", of the first verse of Suratul-Kalam (Chapter of the Pen), the sixty-eighth chapter of the Holy Qur'an. It was observed that, up to the moment of making this enquiry, there was no universally accepted, economical, explainable, linkable and consistent set of numerals used by all scientists. As a theoretical paper, an explanatory method is used to review five of the most used numeral systems (tally marks, Roman numerals, Hindu-Arabic, Arabic, and Chinese), from which the urgent need for a universally accepted, economical, explainable, linkable and consistent set of numerals arises. The study proposes the symbols ., I, \, _, L, U, =, C, O, 9, and 1. to be used as the numerals 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 and 10 respectively, as a universally acceptable, economical, explainable, linkable, sustainable, convertible and consistent set of numerals that originates from Islam. They can be called Islameconumerals or UAN. With the UAN, everything dropped, written, drawn and/or scribbled has meaning(s), as postulated by the first verse of Qur'an 68, and everyone can easily document all figures within the shortest period. It is suggested that there should be a discipline called Numeralnomics (the study of the optimum utilization of numerals) and that everybody should start using the UAN now, in order to know its strengths and weaknesses and to suggest a better and more acceptable set of numerals to interested readers. A similar study can be conducted for alphabets.
Keywords: Islameconumerals, economical, Universally Acceptable Numerals (UAN), numeralnomics.
486 An Approach for the Prediction of Cardiovascular Diseases
Authors: Nebi Gedik
Abstract:
Regardless of age or gender, cardiovascular illnesses are a serious health concern because of factors such as poor eating habits, stress, a sedentary lifestyle, demanding work schedules, alcohol use, and excess weight. Cardiovascular disease tends to occur suddenly and has a high rate of recurrence. Machine learning models can be implemented to assist healthcare systems in the accurate detection and diagnosis of cardiovascular disease (CVD) in patients. Improved heart failure prediction is one of the primary goals of researchers using the heart disease dataset. The purpose of this study is to identify the feature or features that offer the best classification prediction for CVD detection. The support vector machine classifier is used to compare each feature's performance. It has been determined which feature produces the best results.
Keywords: Cardiovascular disease, feature extraction, supervised learning, support vector machine.
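A minimal sketch of the per-feature comparison described in this abstract, assuming the heart-disease records are available as a CSV table with a binary target column (the file name and column layout are illustrative, not the paper's actual dataset): each feature is scored on its own with a support vector machine and cross-validated accuracy.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative dataset: any table with numeric features and a binary "target" column.
data = pd.read_csv("heart.csv")
X, y = data.drop(columns=["target"]), data["target"]

# Score each feature on its own with an SVM to see which one predicts CVD best.
scores = {}
for feature in X.columns:
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores[feature] = cross_val_score(clf, X[[feature]], y, cv=5).mean()

for feature, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{feature:20s} accuracy = {acc:.3f}")
```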
485 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We obtain competitive results using a duo-lingo-corpus-trained model, providing prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also identifies itself as a well-optimized deep neural network model and solution.
Keywords: Attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation.
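The Fairseq-trained model itself is not reproduced here; the following PyTorch sketch only illustrates the general shape of a CNN encoder feeding an RNN (GRU) decoder, with vocabulary sizes, dimensions, and random token ids as stand-in assumptions and the attention mechanism reduced to a pooled encoder summary for brevity.

```python
import torch
import torch.nn as nn

# Hypothetical sizes; a real model would use learned subword vocabularies as in Fairseq.
SRC_VOCAB, TGT_VOCAB, EMB, HID = 8000, 8000, 256, 512

class ConvEncoder(nn.Module):
    """Embeds the source sentence and applies a 1-D convolution over time."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)
        self.conv = nn.Conv1d(EMB, HID, kernel_size=3, padding=1)

    def forward(self, src):                               # src: (batch, src_len)
        x = self.embed(src).transpose(1, 2)               # (batch, EMB, src_len)
        return torch.relu(self.conv(x)).transpose(1, 2)   # (batch, src_len, HID)

class GRUDecoder(nn.Module):
    """Generates target tokens with a GRU conditioned on a pooled encoder summary
    (full attention over encoder states is omitted here for brevity)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.gru = nn.GRU(EMB + HID, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, tgt, enc):                          # tgt: (batch, tgt_len)
        emb = self.embed(tgt)                             # (batch, tgt_len, EMB)
        context = enc.mean(dim=1, keepdim=True).expand(-1, emb.size(1), -1)
        h, _ = self.gru(torch.cat([emb, context], dim=-1))
        return self.out(h)                                # (batch, tgt_len, TGT_VOCAB)

# Toy forward pass with random token ids; real training feeds tokenized parallel text.
src = torch.randint(0, SRC_VOCAB, (2, 10))
tgt = torch.randint(0, TGT_VOCAB, (2, 12))
logits = GRUDecoder()(tgt, ConvEncoder()(src))
print(logits.shape)  # torch.Size([2, 12, 8000])
```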
484 Artificial Neural Networks for Classifying Magnetic Measurements in Tokamak Reactors
Authors: A. Greco, N. Mammone, F. C. Morabito, M. Versaci
Abstract:
This paper is mainly concerned with the application of a novel technique of data interpretation to the characterization and classification of measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, Artificial Neural Networks have been exploited to classify magnetic variables useful for determining the shape and position of the plasma with a reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
Keywords: Tokamak, sensors, artificial neural network.
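As a hedged illustration of the classification step, the sketch below trains a small feed-forward network on a synthetic stand-in for the simulated magnetic measurements (the real ITER-geometry equilibrium database is not available here); the feature count and class labels are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: 2000 equilibria, 40 magnetic probe readings, 4 plasma-shape classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 40))
y = rng.integers(0, 4, size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# A small feed-forward network keeps the computational cost low, as in the paper's motivation.
model = make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```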
483 Stock Movement Prediction Using Price Factor and Deep Learning
Abstract:
The development of machine learning methods and techniques has opened doors for investigation in many areas such as medicine, economics, and finance. One active research area involving machine learning is stock market prediction. This paper considers multiple techniques and methods for stock movement prediction using historical prices or price factors. The paper explores the effectiveness of several deep learning frameworks for forecasting stock movement. Moreover, an architecture (TimeStock) is proposed which takes the representation of time into account apart from the price information itself. Our model achieves a promising result that shows a potential approach for the stock movement prediction problem.
Keywords: Classification, machine learning, time representation, stock prediction.
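TimeStock itself is not specified in the abstract, so the sketch below only illustrates the underlying idea of feeding a time representation alongside price information, using an assumed cyclical day-of-week encoding, a synthetic price series, and a plain scikit-learn classifier as a stand-in for the deep model.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Illustrative price series; a real experiment would load historical quotes instead.
dates = pd.date_range("2015-01-01", periods=1500, freq="B")
prices = pd.Series(np.cumsum(np.random.default_rng(1).normal(size=1500)) + 100, index=dates)

df = pd.DataFrame({"return": prices.pct_change()})
# One simple time representation: encode day-of-week cyclically so the week "wraps around".
df["dow_sin"] = np.sin(2 * np.pi * dates.dayofweek / 5)
df["dow_cos"] = np.cos(2 * np.pi * dates.dayofweek / 5)
# Label: does the next day's price move up (1) or down (0)?
df["up_next"] = (prices.shift(-1) > prices).astype(int)
df = df.dropna().iloc[:-1]   # drop the first row (no return) and the last (no next-day label)

X = df[["return", "dow_sin", "dow_cos"]]
y = df["up_next"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)

clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("directional accuracy:", clf.score(X_te, y_te))
```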
482 Development of Software Complex for Digitalization of Enterprise Activities
Authors: G. T. Balakayeva, K. K. Nurlybayeva, M. B. Zhanuzakov
Abstract:
In the proposed work, we have developed software and designed a software architecture for the implementation of enterprise business processes. The proposed software has a multi-level architecture built with a domain-specific tool. The developed architecture is a guarantor of the availability, reliability and security of the system and of the implementation of business processes, which are the basis for effective enterprise management. Automating business processes, automating the algorithmic stages of an enterprise, developing optimal algorithms for managing activities, controlling and monitoring, reducing risks and improving results help organizations achieve strategic goals quickly and efficiently. The software described in this article can connect to the corporate information system via two methods: a desktop client and a web client. The desktop client program runs on the company's work PCs and connects to the information system over the local network by calling the application server. Outside the organization, the user can interact with the information system via a web browser, which acts as a web client and connects to a web server. The developed software consists of several integrated modules that share resources and interact with each other through an API. The following technology stack was used during development: Node.js, React.js, MongoDB, Nginx, cloud technologies, Python.
Keywords: Algorithms, document processing, automation, integrated modules, software architecture, software design, information system.
481 Mineralogical Characterization and Petrographic Classification of the Soil of Casablanca City
Authors: I. Fahi, T. Remmal, F. El Kamel, B. Ayoub
Abstract:
The treatment of the geotechnical database of the region of Casablanca was difficult to achieve due to the heterogeneity of the nomenclature of the lithological formations composing its soil. It appears necessary to harmonize the nomenclature of the facies and to produce cartographic documents useful for construction projects and studies before any investment program. To achieve this, more than 600 surveys made by the Public Laboratory for Testing and Studies (LPEE) in the agglomeration of Casablanca were studied. Moreover, some local observations were made in different places of the metropolis. Each survey was the subject of a sheet containing the lithological succession, a macro- and microscopic description of the petrographic facies with photographic illustration, as well as measurements from geomechanical tests. In addition, an X-ray diffraction analysis was made in order to characterize the surficial formations of the region.
Keywords: Casablanca, guidebook, petrography, soil.
480 Performance Comparison and Evaluation of AdaBoost and SoftBoost Algorithms on Generic Object Recognition
Authors: Doaa Hegazy, Joachim Denzler
Abstract:
SoftBoost is a recently presented boosting algorithm, which trades off the size of the achieved classification margin and generalization performance. This paper presents a performance evaluation of the SoftBoost algorithm on the generic object recognition problem. An appearance-based generic object recognition model is used. The evaluation experiments are performed using a difficult object recognition benchmark. An assessment with respect to different degrees of label noise, as well as a comparison to the well-known AdaBoost algorithm, is performed. The obtained results reveal that SoftBoost is recommended in cases when the training data is known to have a high degree of noise; otherwise, AdaBoost can achieve better performance.
Keywords: SoftBoost algorithm, AdaBoost algorithm, generic object recognition.
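SoftBoost has no standard scikit-learn implementation, so this hedged sketch reproduces only the AdaBoost side of the comparison together with the label-noise protocol, on synthetic features standing in for the appearance-based object representation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for appearance-based object features.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
for noise in (0.0, 0.1, 0.2, 0.3):
    # Flip a fraction of the training labels to simulate label noise.
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise
    y_noisy[flip] = 1 - y_noisy[flip]
    acc = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_noisy).score(X_te, y_te)
    print(f"label noise {noise:.0%}: AdaBoost test accuracy = {acc:.3f}")
```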
479 Anomaly Detection and Characterization to Classify Traffic Anomalies Case Study: TOT Public Company Limited Network
Authors: O. Siriporn, S. Benjawan
Abstract:
This paper presents four unsupervised clustering algorithms, namely sIB, RandomFlatClustering, FarthestFirst, and FilteredClusterer, which previous works have not used for network traffic classification. The methodology, the results, the products of the clustering, and an evaluation of these algorithms are shown, with the efficiency of each algorithm measured by its accuracy. The efficiency of the algorithms is also considered in terms of the time each one takes to generate the clusters quickly and correctly. Our work studies and tests for the best algorithm by classifying traffic anomalies in network traffic with attributes that have not been used before. We analyze the algorithm that has the best efficiency, or the best learning, and compare it to the previously used K-Means. Our research will be used to develop a more efficient anomaly detection system in the future.
Keywords: Unsupervised, clustering, anomaly, machine learning.
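The Weka clusterers named above (sIB, RandomFlatClustering, FarthestFirst, FilteredClusterer) are not reproduced here; the sketch below shows only the K-Means baseline on an illustrative flow table, and how unusually small clusters can be flagged as candidate traffic anomalies.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative flow records; a real study would use exported NetFlow/packet features.
rng = np.random.default_rng(0)
flows = pd.DataFrame({
    "bytes":    rng.lognormal(8, 1, 5000),
    "packets":  rng.lognormal(3, 1, 5000),
    "duration": rng.exponential(2.0, 5000),
})

X = StandardScaler().fit_transform(flows)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Small clusters are candidate anomalies: they group flows unlike the bulk of the traffic.
sizes = pd.Series(labels).value_counts()
print(sizes)
print("smallest cluster flagged as anomalous:", sizes.idxmin())
```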
478 Learning of Class Membership Values by Ellipsoidal Decision Regions
Authors: Leehter Yao, Chin-Chin Lin
Abstract:
A novel method of learning complex fuzzy decision regions in the n-dimensional feature space is proposed. Through the fuzzy decision regions, a given pattern's class membership value for every class is determined, instead of the conventional crisp class the pattern belongs to. The n-dimensional fuzzy decision region is approximated by a union of hyperellipsoids. By explicitly parameterizing these hyperellipsoids, the decision regions are determined by estimating the parameters of each hyperellipsoid. A genetic algorithm (GA) is applied to estimate the parameters of each region component. With the global optimization ability of the GA, the learned decision region can be arbitrarily complex.
Keywords: Ellipsoid, genetic algorithm, decision regions, classification.
477 TOSOM: A Topic-Oriented Self-Organizing Map for Text Organization
Authors: Hsin-Chang Yang, Chung-Hong Lee, Kuo-Lung Ke
Abstract:
The self-organizing map (SOM) model is a well-known neural network model with a wide range of applications. The main characteristics of the SOM are two-fold, namely dimension reduction and topology preservation. Using a SOM, a high-dimensional data space will be mapped to some low-dimensional space. Meanwhile, the topological relations among the data will be preserved. With such characteristics, the SOM was usually applied to data clustering and visualization tasks. However, the SOM has the main disadvantage of needing to know the number and structure of neurons prior to training, which are difficult to determine. Several schemes have been proposed to tackle such deficiency. Examples are growing/expandable SOM, hierarchical SOM, and growing hierarchical SOM. These schemes can dynamically expand the map, and even generate hierarchical maps, during training. Encouraging results were reported. Basically, these schemes adapt the size and structure of the map according to the distribution of the training data. That is, they are data-driven or data-oriented SOM schemes. In this work, a topic-oriented SOM scheme which is suitable for document clustering and organization is developed. The proposed SOM automatically adapts the number as well as the structure of the map according to the identified topics. Unlike other data-oriented SOMs, our approach expands the map and generates the hierarchies according to both the topics and the characteristics of the neurons. The preliminary experiments give promising results and demonstrate the plausibility of the method.
Keywords: Self-organizing map, topic identification, learning algorithm, text clustering.
476 Timescape-Based Panoramic View for Historic Landmarks
Authors: H. Ali, A. Whitehead
Abstract:
Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama merged with geographical and historical information presents a unique challenge of dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Utilization of this panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising historic images and modern images of the digital age. These images have to be classified for subset selection to keep the more suitable images that chronologically document a landmark's history. Processing historic images captured using older analog technology under various capturing conditions represents a big challenge when they have to be used with modern digital images. Successful processing of historic images to prepare them for the next steps of temporal panorama creation represents an active contribution to cultural heritage preservation through the fulfillment of one of UNESCO's goals in the preservation and display of famous worldwide landmarks.
Keywords: Cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes.
475 Categorical Missing Data Imputation Using Fuzzy Neural Networks with Numerical and Categorical Inputs
Authors: Pilar Rey-del-Castillo, Jesús Cardeñosa
Abstract:
There are many situations where input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing data imputation from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, where the input variables for learning and classification are just numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation and a new architecture. The procedure is tested and compared with others using opinion poll data.
Keywords: Classifier, imputation techniques, fuzzy systems, fuzzy min-max neural networks.
474 A Classical Method of Optimizing Manufacturing Systems Using a Number of Industrial Engineering Techniques
Authors: John M. Ikome, Martha E. Ikome, Therese Van Wyk
Abstract:
Productivity optimization can significantly increase a company's output. It can take the form of corrective actions for ineffective activities, process simplification, and the reduction of variation, of response time, and of set-up time, all of which fall under the classification of waste within the manufacturing environment. Deriving a means to eliminate a number of these issues is of key importance for a manufacturing organization. This paper focuses on a number of industrial engineering techniques, including a cause-and-effect diagram, to identify and optimize the method or systems being used. Based on our results, there are a number of variations within the production processes that can significantly disrupt the expected output.
Keywords: Optimization, fishbone diagram, productivity.
473 A Study of Classification Models to Predict Drill-Bit Breakage Using Degradation Signals
Authors: Bharatendra Rai
Abstract:
Cutting tools are widely used in manufacturing processes, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can cause damage to the expensive workpiece being drilled and, at the same time, has a major impact on productivity. Predicting drill-bit breakage, therefore, is important in reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz. thrust force and torque. The methodology used involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage using degradation signals.
Keywords: Degradation signal, drill-bit breakage, random forest, multinomial logistic regression.
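A hedged sketch of the model comparison described above, with synthetic features standing in for the twenty thrust-force and torque degradation features (the class meanings are assumptions): a decision tree, a random forest, and a multinomial logistic regression are scored side by side.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 20 degradation features, 3 classes (e.g., healthy / worn / broken).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "multinomial logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:32s} CV accuracy = {acc:.3f}")
```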
472 Performance Appraisal System using Multifactorial Evaluation Model
Abstract:
Performance appraisal of employees is important in managing the human resources of an organization. With the change towards knowledge-based capitalism, maintaining talented knowledge workers is critical. However, management classification of "outstanding", "poor" and "average" performance may not be an easy decision. Besides that, superiors might also tend to judge the work performance of their subordinates informally and arbitrarily, especially without the existence of a system of appraisal. In this paper, we propose a performance appraisal system using a multifactorial evaluation model to deal with appraisal grades, which are often expressed vaguely in linguistic terms. The proposed model is for evaluating staff performance based on specific performance appraisal criteria. The project was a collaboration with an information and communication technology company in Malaysia, with reference to its performance appraisal process.
Keywords: Multifactorial evaluation model, performance appraisal system, decision support system.
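One common form of the multifactorial evaluation model aggregates a factor-weight vector with a factor-by-grade membership matrix; the sketch below uses assumed factor names, weights, and ratings purely for illustration, not the paper's data.

```python
import numpy as np

# Illustrative appraisal setup (factors, grades, weights and ratings are assumptions).
factors = ["quality of work", "teamwork", "punctuality", "initiative"]
grades = ["poor", "average", "outstanding"]

# W: relative importance of each factor (sums to 1).
W = np.array([0.4, 0.3, 0.2, 0.1])

# R[i][j]: degree to which factor i is rated as grade j for one employee,
# e.g. gathered from several raters and normalized per row.
R = np.array([
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
])

# Multifactorial evaluation: weighted aggregation of the per-factor grade memberships.
result = W @ R
for grade, mu in zip(grades, result):
    print(f"{grade:12s} membership = {mu:.2f}")
print("overall appraisal:", grades[int(np.argmax(result))])
```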
471 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination
Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan
Abstract:
The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, wherein a set of assumptions and methods for a given data set will be made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to produce an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), handwriting, confidence interval, repeatability, reproducibility.
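A simplified, hedged sketch of score-based LR computation with the two estimators named above, using synthetic similarity scores in place of real handwriting comparisons: the KDE route models the score densities under the prosecution and defense hypotheses, while the logistic-regression route converts a posterior into an LR under an assumed equal-prior setting.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic similarity scores: same-writer comparisons score higher on average.
same_writer = rng.normal(0.7, 0.1, 300)   # prosecution hypothesis (Hp)
diff_writer = rng.normal(0.4, 0.1, 300)   # defense hypothesis (Hd)

score = 0.65  # score of the questioned document against the suspect's writing

# Kernel density estimator: LR = p(score | Hp) / p(score | Hd).
lr_kde = gaussian_kde(same_writer)(score)[0] / gaussian_kde(diff_writer)(score)[0]

# Logistic regression: posterior odds equal the LR when the two classes are balanced.
X = np.concatenate([same_writer, diff_writer]).reshape(-1, 1)
y = np.concatenate([np.ones(300), np.zeros(300)])
p = LogisticRegression().fit(X, y).predict_proba([[score]])[0, 1]
lr_lor = p / (1 - p)

print(f"LR (KDE) = {lr_kde:.2f},  LR (LoR) = {lr_lor:.2f}")
```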
470 Scene Adaptive Shadow Detection Algorithm
Authors: Mohammed Ibrahim M, Anupama R.
Abstract:
Robustness is one of the primary performance criteria for an Intelligent Video Surveillance (IVS) system. One of the key factors in enhancing the robustness of dynamic video analysis is providing accurate and reliable means for shadow detection. If left undetected, shadow pixels may result in incorrect object tracking and classification, as they tend to distort localization and measurement information. Most of the algorithms proposed in the literature are computationally expensive, some to the extent of equalling the computational requirement of motion detection. In this paper, the homogeneity property of shadows is explored in a novel way for shadow detection. An adaptive division-image analysis (which highlights the homogeneity property of shadows), followed by a relatively simpler projection histogram analysis for penumbra suppression, is the key novelty in our approach.
Keywords: Homogeneity, penumbra, projection histogram, shadow correction.
469 Vulnerability of Groundwater Resources Selected for Emergency Water Supply
Authors: Frantisek Bozek, Alena Bumbova, Eduard Bakos
Abstract:
This paper deals with the vulnerability of the elements of hydrogeological structures and of the technological equipment that are applicable to groundwater resources. The vulnerability assessment stems from the application of a register of hazards and of the potential threat to individual water source elements within each type of hazard. The proposed procedure is a pattern for assessing the risks of disturbance, damage, or destruction of a water source by the identified natural or technological hazards and, consequently, for classifying these risks in relation to emergency water supply. The use of this procedure was verified on a selected groundwater resource in a particular region, which appears to be potentially useful for the crisis planning system.
Keywords: Hazard, hydrogeological structure, elements, index, sensitivity, water source, vulnerability.
468 Torque Based Selection of ANN for Fault Diagnosis of Wound Rotor Asynchronous Motor-Converter Association
Authors: Djalal Eddine Khodja, Boukhemis Chetate
Abstract:
In this paper, an automatic diagnosis system was developed to detect and locate, in real time, the defects of the wound rotor asynchronous machine associated with an electronic converter. For this purpose, we have processed the signals of the measured parameters (current and speed) to use them, firstly, as indicating variables of the machine defects under study and, secondly, as inputs to the Artificial Neural Network (ANN) for their classification in order to detect the type of defect in progress. Once a defect is detected, the information interpretation system gives the type of the defect and its place of appearance.
Keywords: Artificial Neural Networks (ANN), effective value (RMS), experimental results, failure detection, indicating values, motor-converter unit.
467 Hand Written Digit Recognition by Multiple Classifier Fusion based on Decision Templates Approach
Authors: Reza Ebrahimpour, Samaneh Hamedi
Abstract:
Classifier fusion may generate more accurate classification than each of the basic classifiers. Fusion is often based on fixed combination rules like the product, average, etc. This paper presents decision templates as a classifier fusion method for the recognition of handwritten English and Farsi numerals (1-9). The process involves extracting a feature vector on well-known image databases. The extracted feature vector is fed to a multiple classifier fusion scheme. A set of experiments was conducted to compare decision templates (DTs) with some combination rules. Results from decision templates reach 97.99% and 97.28% for Farsi and English handwritten digits, respectively.
Keywords: Decision templates, multi-layer perceptron, characteristic loci, principal component analysis (PCA).
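A hedged sketch of decision-template fusion on scikit-learn's small digit set, which stands in for the Farsi/English databases and the characteristic-loci features of the paper: each class's template is the mean decision profile of its training samples, and a test sample takes the class of the nearest template.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base = [MLPClassifier(max_iter=500, random_state=0),
        LogisticRegression(max_iter=2000),
        GaussianNB()]
for clf in base:
    clf.fit(X_tr, y_tr)

def decision_profile(x_rows):
    # Stack each classifier's class-probability outputs: shape (n_samples, n_clfs, n_classes).
    return np.stack([clf.predict_proba(x_rows) for clf in base], axis=1)

# One decision template per class: the mean profile of its training samples.
dp_train = decision_profile(X_tr)
templates = np.stack([dp_train[y_tr == c].mean(axis=0) for c in np.unique(y_tr)])

# Classify by nearest template (squared Euclidean distance between profiles).
dp_test = decision_profile(X_te)
dists = ((dp_test[:, None, :, :] - templates[None, :, :, :]) ** 2).sum(axis=(2, 3))
y_pred = dists.argmin(axis=1)
print("decision-template fusion accuracy:", (y_pred == y_te).mean())
```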
466 Deficiencies of Lung Segmentation Techniques using CT Scan Images for CAD
Authors: Nisar Ahmed Memon, Anwar Majid Mirza, S.A.M. Gilani
Abstract:
Segmentation is an important step in medical image analysis and classification for radiological evaluation or computer aided diagnosis. This paper presents the problem of inaccurate lung segmentation as observed in algorithms presented by researchers working in the area of medical image analysis. The different lung segmentation techniques have been tested using a dataset of 19 patients consisting of a total of 917 images. We obtained datasets of 11 patients from Akron University, USA, and of 8 patients from Aga Khan Medical University, Pakistan. After testing the algorithms against the datasets, the deficiencies of each algorithm have been highlighted.
Keywords: Computer Aided Diagnosis (CAD), mathematical morphology, medical image analysis, region growing, segmentation, thresholding.
465 Improving Academic Performance Prediction using Voting Technique in Data Mining
Authors: Ikmal Hisyam Mohamad Paris, Lilly Suriani Affendey, Norwati Mustapha
Abstract:
In this paper, we compare the accuracy of data mining methods for classifying students in order to predict a student's class grade. These predictions are most useful for identifying weak students and assisting management in taking remedial measures at early stages, so as to produce excellent graduates who will graduate with at least a second class upper. Firstly, we examine the accuracy of single classifiers on our data set and choose the best one, and then ensemble it with a weak classifier to produce a simple voting method. The presented results show that combining different classifiers outperforms single classifiers for predicting student performance.
Keywords: Classification, data mining, prediction, combination of multiple classifiers.
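A minimal sketch of the voting approach on synthetic student records (the features and base classifiers are assumptions, not the paper's data): individual classifiers are scored first, then combined with majority voting.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the student records (marks, demographics, etc.).
X, y = make_classification(n_samples=600, n_features=12, n_informative=6, random_state=0)

single = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
    "naive Bayes": GaussianNB(),
}
for name, clf in single.items():
    print(f"{name:20s} CV accuracy = {cross_val_score(clf, X, y, cv=5).mean():.3f}")

# Combine the classifiers through majority (hard) voting.
voter = VotingClassifier(estimators=list(single.items()), voting="hard")
print(f"{'voting ensemble':20s} CV accuracy = {cross_val_score(voter, X, y, cv=5).mean():.3f}")
```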
464 Using Fractional Factorial Designs for Variable Importance in Random Forest Models
Authors: Ewa M. Sztendur, Neil T. Diamond
Abstract:
Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables, with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
Keywords: Random Forests, Variable Importance, Fractional Factorial Designs, Student Attrition.
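The paper's regular fractional factorial design is not reproduced exactly; the hedged sketch below uses an orthogonal two-level (Hadamard) design on synthetic data to show the idea of permuting several variables per run and recovering each variable's importance from the design's main-effect contrasts.

```python
import numpy as np
from scipy.linalg import hadamard
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data with 7 candidate variables.
X, y = make_classification(n_samples=1500, n_features=7, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

design = hadamard(8)[:, 1:]          # 8 runs x 7 factors, entries +1 / -1
rng = np.random.default_rng(0)
losses = []
for run in design:
    X_run = X_te.copy()
    for j, level in enumerate(run):
        if level == 1:               # +1 means: permute (break) this variable in this run
            X_run[:, j] = rng.permutation(X_run[:, j])
    losses.append(1.0 - rf.score(X_run, y_te))

# Main effect of each variable = average loss when it is permuted minus when it is not.
losses = np.array(losses)
importance = np.array([losses[design[:, j] == 1].mean() - losses[design[:, j] == -1].mean()
                       for j in range(design.shape[1])])
for j, imp in enumerate(importance):
    print(f"x{j}: estimated importance = {imp:.3f}")
```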
463 Semantic Mobility Channel (SMC): Ubiquitous and Mobile Computing Meets the Semantic Web
Authors: José M. Cantera, Miguel Jiménez, Genoveva López, Javier Soriano
Abstract:
With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is exploited for either an individual element or a set of consecutive elements in a Web document and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).
Keywords: Semantic Web, ubiquitous and mobile computing, Web content transcoding, semantic mark-up, mobile computing, middleware and services.
462 Framework and Characterization of Physical Internet
Authors: Charifa Fergani, Adiba El Bouzekri El Idrissi, Suzanne Marcotte, Abdelowahed Hajjaji
Abstract:
Over the last few years, a new paradigm known as the Physical Internet has been developed and studied in logistics management. The purpose of this global and open system is to deal with the logistics grand challenge by setting up an efficient and sustainable Logistics Web. The purpose of this paper is to review scientific articles dedicated to the Physical Internet topic and to provide a clustering strategy that makes it possible to classify the literature on the Physical Internet, to follow its evolution, as well as to critique it. The classification is based on three factors: the Logistics Web, organization, and resources. Several papers about the Physical Internet have been classified and analyzed along the Logistics Web, resources, and organization views at the strategic, tactical, and operational levels, respectively. The developed cluster analysis shows which topics of the Physical Internet are currently the least covered. Future research directions are outlined for these topics.
Keywords: Logistics Web, Physical Internet, PI characterization, taxonomy.
461 Joint Use of Factor Analysis (FA) and Data Envelopment Analysis (DEA) for Ranking of Data Envelopment Analysis
Authors: Reza Nadimi, Fariborz Jolai
Abstract:
This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction in decision making units (DMUs). Data envelopment analysis, a popular linear programming technique, is useful for comparatively rating the operational efficiency of DMUs based on their deterministic (not necessarily stochastic) input-output data, while factor analysis techniques have been proposed as data reduction and classification techniques that can be applied within the DEA technique to reduce the input-output data. Numerical results reveal that the new approach shows a good consistency in ranking with DEA.
Keywords: Effectiveness, decision making, data envelopment analysis, factor analysis.
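The factor-analysis step is omitted here; the sketch below only illustrates the DEA side, solving the input-oriented CCR multiplier model as a linear program for each DMU on a small assumed input-output table.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data (assumed for illustration): 5 DMUs, 2 inputs, 2 outputs.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])  # inputs
Y = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 2.0], [2.0, 1.0], [1.0, 1.0]])  # outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR multiplier model: max u.y_o s.t. v.x_o = 1, u.y_j - v.x_j <= 0."""
    c = np.concatenate([-Y[o], np.zeros(m)])                    # linprog minimizes, so negate outputs
    A_ub = np.hstack([Y, -X])                                   # u.y_j - v.x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)   # v.x_o = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun                                             # efficiency score in (0, 1]

for o in range(n):
    print(f"DMU {o + 1}: CCR efficiency = {ccr_efficiency(o):.3f}")
```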
460 The Design and Applied of Learning Management System via Social Media on Internet: Case Study of Operating System for Business Subject
Authors: Pimploi Tirastittam, Sawanath Treesathon, Amornrath Ongkawat
Abstract:
A Learning Management System (LMS) is a system used to manage learning by grouping the content and learning activities between lecturer and learner, including online examination and evaluation. Nowadays, in the borderless learning era, learning activities can be accessed from anywhere in the world and at any time via information technology and media. The learner can easily access the knowledge, so differences in time and distance are no longer constraints for learning. The learning pattern used in this research integrates in-class learning and online learning via the Internet, with progress monitored by the learning management system, which creates a fast-response and accessible learning process via social media. In order to increase the capability and freedom of the learner, the system can show current and past learning documents, supports video conferencing, and also has a chat room for the learner and lecturer to interact with each other. The objectives of "The Design and Applied of Learning Management System via Social Media on Internet: Case Study of Operating System for Business Subject" are therefore to expand the opportunity for learning, to increase the efficiency of learning, and to increase the communication channels between lecturer and student. The data for this research were collected from 30 users of the system, students enrolled in the subject. The result of the research is at the "Very Good" level, which conforms to the hypothesis.
Keywords: Learning Management System, Social Media.
459 EHW from Consumer Point of View: Consumer-Triggered Evolution
Authors: Yerbol Sapargaliyev, Tatiana Kalganova
Abstract:
Evolvable Hardware (EHW) has been regarded as an adaptive system with a wide application market. The consumer market for any good requires diversity to satisfy consumers' preferences. Adaptation of EHW is a key technology that could provide an individual approach to every particular user. This situation raises a question: how to set the target for the evolutionary algorithm? Existing techniques do not allow the consumer to influence the evolutionary process; at the moment, only the designer is capable of influencing the evolution. The proposed consumer-triggered evolution overcomes this problem by introducing new features to EHW that help the adaptive system obtain targets during the consumer stage. A classification of EHW is given according to responsiveness, imitation of human behavior, and target circuit response. A home intelligent water heating system is considered as an example.
Keywords: Actuators, consumer-triggered evolution, evolvable hardware, sensors.
458 A New Face Recognition Method using PCA, LDA and Neural Network
Authors: A. Hossein Sahoolizadeh, B. Zargham Heidari, C. Hamid Dehghani
Abstract:
In this paper, a new face recognition method based on PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis) and neural networks is proposed. This method consists of four steps: i) preprocessing, ii) dimension reduction using PCA, iii) feature extraction using LDA, and iv) classification using a neural network. The combination of PCA and LDA is used to improve the capability of LDA when only a few image samples are available, and the neural classifier is used to reduce the number of misclassifications caused by classes that are not linearly separable. The proposed method was tested on the Yale face database. Experimental results on this database demonstrate the effectiveness of the proposed method for face recognition, with less misclassification in comparison with previous methods.
Keywords: Face recognition, principal component analysis, linear discriminant analysis, neural networks.
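A hedged sketch of the described pipeline using scikit-learn, with the Olivetti faces standing in for the Yale database (an assumption; the dataset is downloaded on first use) and illustrative layer sizes: PCA for dimension reduction, LDA for feature extraction, and a small neural network for classification.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Olivetti faces: 400 grayscale face images of 40 subjects, flattened to 4096 pixels.
X, y = fetch_olivetti_faces(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

# Pipeline mirroring the described steps: PCA reduction -> LDA features -> neural classifier.
model = make_pipeline(
    PCA(n_components=100, whiten=True, random_state=0),
    LinearDiscriminantAnalysis(),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0),
)
model.fit(X_tr, y_tr)
print("recognition accuracy:", model.score(X_te, y_te))
```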