Search results for: Candidate Sets
488 Regional Stability Analysis of Rotor-Ball Bearing and Rotor-Roller Bearing Systems Considering Switching Phenomena
Authors: Jafar Abbaszadeh Chekan, Kaveh Merat, Hassan Zohoor
Abstract:
In this study, the regional stability of a rigid rotor supported on rolling bearings with radial clearance is investigated. Because of the radial clearance of the bearings and the dynamic configuration of the system, each rolling element may either be in contact with both races (under compression) or lose contact. This change in the system dynamics makes it a switching system, which is a type of hybrid system. By adopting the Multiple Lyapunov Function theorem and using the Hamiltonian function as a candidate Lyapunov function, the stability of the system is analyzed. The purpose of this study is to examine the regional stability of rotor-roller bearing and rotor-ball bearing systems.
Keywords: Stability analysis, Rotor-rolling bearing systems, Switching systems, Multiple Lyapunov Function Method
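For readers unfamiliar with the switching argument used above, the Multiple Lyapunov Function conditions are usually stated as below; the notation (modes i, switch-in times t_{i,k}) is ours, and the Hamiltonian-based candidate mentioned in the abstract would play the role of the V_i.

```latex
% Switched system \dot{x} = f_{\sigma(t)}(x), \quad \sigma(t) \in \{1,\dots,m\}.
% Stability in the sense of Lyapunov follows if, for every mode i,
\frac{d}{dt} V_{\sigma(t)}\bigl(x(t)\bigr) \le 0
  \quad \text{while mode } \sigma(t) \text{ is active, and}
\qquad
V_i\bigl(x(t_{i,k+1})\bigr) \le V_i\bigl(x(t_{i,k})\bigr)
  \quad \text{for consecutive switch-in times } t_{i,k} < t_{i,k+1} \text{ of mode } i .
```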
487 Semantic Indexing Approach of a Corpora Based On Ontology
Authors: Mohammed Erritali
Abstract:
The growth in the volume of text data such as books and articles in libraries over the centuries has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstracting, indexing and the use of classification categories marked the birth of a new field of research called "Information Retrieval". Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus) so that a user can find those relevant to him, that is to say, the contents that match the user's information needs. This paper presents a new semantic indexing approach for a documentary corpus. The indexing process starts with a term weighting phase to determine the importance of the terms in the documents. The use of a thesaurus such as WordNet then allows moving to the conceptual level. Each candidate concept is evaluated by determining its level of representation of the document, that is to say, the importance of the concept relative to the other concepts of the document. Finally, the semantic index is constructed by attaching to each concept of the ontology the documents of the corpus in which these concepts are found.
Keywords: Semantic, indexing, corpora, WordNet, ontology.
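Since the abstract names the two indexing steps (term weighting, then lifting weighted terms to WordNet concepts) without giving formulas, the following is a minimal sketch of those steps using scikit-learn and NLTK; the naive first-synset sense choice and the toy documents are our own assumptions, not the paper's method.

```python
# Minimal sketch of the two indexing steps named in the abstract:
# (1) weight terms per document, (2) lift weighted terms to WordNet concepts.
# Library calls are standard scikit-learn / NLTK; scoring choices are illustrative only.
from collections import defaultdict
from sklearn.feature_extraction.text import TfidfVectorizer
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

docs = ["the library indexes books and articles",
        "information retrieval locates relevant documents"]

vec = TfidfVectorizer()
tfidf = vec.fit_transform(docs)
terms = vec.get_feature_names_out()

concept_index = defaultdict(set)        # concept (synset name) -> documents containing it
for d in range(tfidf.shape[0]):
    row = tfidf.getrow(d).toarray().ravel()
    for t_idx, weight in enumerate(row):
        if weight == 0.0:
            continue
        for syn in wn.synsets(terms[t_idx])[:1]:   # naive sense choice: first synset only
            concept_index[syn.name()].add(d)

print(dict(concept_index))
```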
486 Analyzing Multi-Labeled Data Based on the Role of a Concept against a Semantic Range
Authors: Masahiro Kuzunishi, Tetsuya Furukawa, Ke Lu
Abstract:
Classifying data hierarchically is an efficient approach to analyzing data. Data are usually classified into multiple categories, or annotated with a set of labels. To analyze multi-labeled data, such data must be specified by giving a set of labels as a semantic range. There are certain purposes for analyzing data. This paper shows which multi-labeled data should be the target of analysis for those purposes, and discusses the role of a label against a set of labels by investigating what changes when a label is added to the set of labels. These discussions yield methods for the advanced analysis of multi-labeled data based on the role of a label against a semantic range.
Keywords: Classification Hierarchies, Data Analysis, Multi-labeled Data, Orders of Sets of Labels
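The abstract does not formalize its selection rule, so the sketch below illustrates only one plausible reading of "specifying data by a set of labels as a semantic range": an item matches if its label set contains the range, and adding a label to the range narrows the selection. The data and the rule are hypothetical.

```python
# Illustrative only: one plausible reading of "a set of labels as a semantic range".
# Here an item matches the range if its label set contains every label in the range.
items = {
    "d1": {"sports", "news"},
    "d2": {"sports", "news", "europe"},
    "d3": {"sports"},
}

def select(items, semantic_range):
    return {k for k, labels in items.items() if semantic_range <= labels}

print(select(items, {"sports"}))            # {'d1', 'd2', 'd3'}
print(select(items, {"sports", "news"}))    # adding a label narrows the selection: {'d1', 'd2'}
```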
485 The Model of Blended Learning and Its Use at Foreign Language Teaching
Authors: A. A. Kudysheva, A. N. Kudyshev
Abstract:
This article considers the model of Blended Learning, its advantages in foreign language teaching, and some problems that can arise during its use. Blended Learning is a special organization of learning that combines classroom work with modern technologies in an electronic distance-teaching environment. Nowadays many European educational institutions and companies use this technology. Through this method, the student gets the opportunity to learn in a group (classroom) with a teacher and additionally at home at a convenient time; the student sets the optimal speed and intensity of the learning process; and the method helps the student to become disciplined and learn to work independently.
Keywords: Foreign language, information and communication technology (ICT), model of Blended Learning, virtual cool room, technophobia
484 Stability of a Special Class of Switched Positive Systems
Authors: Xiuyong Ding, Lan Shu, Xiu Liu
Abstract:
This paper is concerned with the existence of a linear copositive Lyapunov function (LCLF) for a special class of switched positive linear systems (SPLSs) composed of continuous- and discrete-time subsystems. Firstly, using the system matrices, we construct a special kind of matrices in an appropriate manner. Secondly, our results reveal that the Hurwitz stability of these matrices is equivalent to the existence of a common LCLF for arbitrary finite sets composed of continuous- and discrete-time positive linear time-invariant (LTI) systems. Finally, a simple example is provided to illustrate the implication of our results.
Keywords: Linear co-positive Lyapunov functions, positive systems, switched systems.
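For context, the standard linear copositive Lyapunov conditions for individual positive LTI subsystems are recalled below (v is an entrywise positive vector, and the inequalities are entrywise); the paper's particular matrix construction is not reproduced in the abstract, so this is textbook background rather than the authors' result.

```latex
% Candidate V(x) = v^{\top}x with v \succ 0. A single v is a common LCLF for the family if
v^{\top} A_i \prec 0
  \quad \text{for each continuous-time subsystem } \dot{x} = A_i x \ (A_i \text{ Metzler}),
\qquad
v^{\top} (B_j - I) \prec 0
  \quad \text{for each discrete-time subsystem } x(k+1) = B_j x(k) \ (B_j \ge 0).
```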
483 Incremental Algorithm to Cluster the Categorical Data with Frequency Based Similarity Measure
Authors: S. Aranganayagi, K. Thangavel
Abstract:
Clustering categorical data is more complicated than numerical clustering because of its special properties. Scalability and memory constraints are challenging problems in clustering large data sets. This paper presents an incremental algorithm to cluster categorical data. The frequencies of attribute values contribute much to clustering similar categorical objects. In this paper we propose new similarity measures based on the frequencies of attribute values and their cardinalities. The proposed measures and the algorithm are evaluated on data sets from the UCI data repository. The results show that the proposed method generates better clusters than the existing one.
Keywords: Clustering, Categorical, Incremental, Frequency, Domain
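The paper's exact similarity measure is not given in the abstract; the sketch below only illustrates the general idea of scoring a categorical record against a cluster by the frequencies of its attribute values in that cluster, with a simple incremental assignment loop. The scoring rule and the threshold are our own placeholders.

```python
# Illustrative incremental clustering of categorical records.
# Similarity of a record to a cluster = mean relative frequency of each of its
# attribute values within that cluster -- a placeholder, not the paper's measure.
from collections import Counter

def similarity(record, cluster):
    size = len(cluster["members"])
    return sum(cluster["freq"][i][v] / size for i, v in enumerate(record)) / len(record)

def add(record, cluster):
    cluster["members"].append(record)
    for i, v in enumerate(record):
        cluster["freq"][i][v] += 1

def incremental_cluster(records, threshold=0.5):
    clusters = []
    for rec in records:
        scored = [(similarity(rec, c), c) for c in clusters]
        best_score, best_cluster = max(scored, key=lambda s: s[0], default=(0.0, None))
        if best_cluster is not None and best_score >= threshold:
            add(rec, best_cluster)
        else:
            clusters.append({"members": [], "freq": [Counter() for _ in rec]})
            add(rec, clusters[-1])
    return clusters

data = [("red", "small"), ("red", "small"), ("blue", "large"), ("red", "medium")]
print([c["members"] for c in incremental_cluster(data)])
```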
482 Association Rules Mining and NOSQL Oriented Document in Big Data
Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub
Abstract:
Big Data refers to recent technologies for manipulating voluminous and unstructured data sets from multiple sources, and NoSQL appears to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The problem of finding association dependencies can be solved well with MapReduce. The goal of our work is to reduce the time for generating frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
Keywords: Apriori, Association rules mining, Big Data, data mining, Hadoop, Map Reduce, MongoDB, NoSQL.
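The abstract does not describe its MapReduce implementation in detail; the sketch below only mimics the shape of one map/shuffle/reduce counting pass for candidate itemsets in plain Python (no Hadoop or MongoDB involved), with made-up transactions and thresholds.

```python
# One MapReduce-style counting pass of Apriori, simulated in plain Python
# (this only illustrates the map -> shuffle -> reduce shape, not a cluster setup).
from collections import defaultdict
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
k, min_support = 2, 2

def map_phase(partition):
    # emit (candidate_itemset, 1) for every k-subset of each transaction
    for t in partition:
        for cand in combinations(sorted(t), k):
            yield cand, 1

def reduce_phase(pairs):
    counts = defaultdict(int)
    for cand, n in pairs:
        counts[cand] += n
    return {c: n for c, n in counts.items() if n >= min_support}

# the "shuffle" is implicit here: all mapper outputs are fed to one reducer
partitions = [transactions[:2], transactions[2:]]
emitted = [kv for p in partitions for kv in map_phase(p)]
print(reduce_phase(emitted))   # frequent 2-itemsets with support >= 2
```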
481 A Study on Linking Upward Substitution and Fuzzy Demands in the Newsboy-Type Problem
Authors: Pankaj Dutta, Debjani Chakraborty
Abstract:
This paper investigates the effect of product substitution in the single-period 'newsboy-type' problem in a fuzzy environment. It is supposed that the single-period problem operates under uncertainty in customer demand, which is described in imprecise terms and modelled by fuzzy sets. To perform this analysis, we consider the fuzzy model for two items with upward substitution. Upward substitutability is reasonable when products can be stored according to certain attribute levels such as quality, brand or package size. We show that the explicit consideration of this substitution opportunity increases the average expected profit. A computational study is performed to observe the benefits of product substitution.
Keywords: Fuzzy demand, Newsboy, Single-period problem, Substitution.
480 Flexible Laser Reduced Graphene Oxide/MnO2 Electrode for Supercapacitor Applications
Authors: Ingy N. Bkrey, Ahmed A. Moniem
Abstract:
We have succeeded in producing a high-performance, flexible graphene/manganese dioxide (G/MnO2) electrode coated on a flexible polyethylene terephthalate (PET) substrate. The graphene film is initially synthesized by drop-casting the graphene oxide (GO) solution on the PET substrate, followed by simultaneous reduction and patterning of the dried film using a carbon dioxide (CO2) laser beam with a power of 1.8 W. The potentiostatic anodic deposition method was used to deposit thin films of MnO2 with different loading masses of 10-50 and 100 μg.cm-2 on the pre-prepared graphene film. The electrodes were fully characterized in terms of structure, morphology, and electrochemical performance. A maximum specific capacitance of 973 F.g-1 was obtained when depositing 50 μg.cm-2 of MnO2 on the laser-reduced graphene oxide (rGO), denoted G/50MnO2, and over 92% of its initial capacitance was retained after 1000 cycles. The good electrochemical performance and long-term cycling stability make our proposed approach a promising candidate for supercapacitor applications.
Keywords: Electrode Deposition, Flexible, Graphene oxide, Graphene, High Power CO2 Laser, MnO2.
479 Analyzing the Factors Influencing Exclusive Breastfeeding Using the Generalized Poisson Regression Model
Authors: Cheika Jahangeer, Naushad Mamode Khan, Maleika Heenaye-Mamode Khan
Abstract:
Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first 6 months of life is of fundamental importance because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in developed countries, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we study the factors that influence exclusive breastfeeding and use the generalized Poisson regression model to analyze the practices of exclusive breastfeeding in Mauritius. We develop two sets of quasi-likelihood equations (QLE) to estimate the parameters.
Keywords: Exclusive breastfeeding, Regression model, Quasi-likelihood.
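For reference, the generalized Poisson distribution usually meant in this setting is Consul's two-parameter form below; the abstract does not restate the model, so the parameterization is quoted from the standard literature rather than from the paper.

```latex
P(Y = y) \;=\; \frac{\theta\,(\theta + \lambda y)^{\,y-1}\, e^{-\theta - \lambda y}}{y!},
  \qquad y = 0, 1, 2, \dots,
\qquad
\mathbb{E}[Y] = \frac{\theta}{1-\lambda}, \qquad
\operatorname{Var}(Y) = \frac{\theta}{(1-\lambda)^{3}},
% so \lambda > 0 captures over-dispersion and \lambda < 0 under-dispersion.
```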
478 Comparative Analysis of Classical and Parallel Inpainting Algorithms Based on Affine Combinations of Projections on Convex Sets
Authors: Irina Maria Artinescu, Costin Radu Boldea, Eduard-Ionut Matei
Abstract:
The paper is a comparative study of two classical variants of parallel projection methods for solving the convex feasibility problem against their equivalents that involve variable weights in the construction of the solutions. We used a graphical representation of these methods for inpainting a convex area of an image in order to investigate their effectiveness in image reconstruction applications. We also present a numerical analysis of the convergence of these four algorithms in terms of the average number of steps and execution time, in a classical CPU implementation and, alternatively, in a parallel GPU implementation.
Keywords: convex feasibility problem, convergence analysis, inpainting, parallel projection methods
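As a toy illustration of the iteration the paper compares, x_{k+1} = Σ_i w_i P_i(x_k), the sketch below fills a masked image region by a weighted combination of simple operators; the data-consistency and pixel-range sets, the local-averaging "regularity" step (which is only a stand-in, not a true projection onto a convex set) and the weights are our own choices, not the paper's algorithms.

```python
# Toy averaged (parallel) projection iteration x_{k+1} = sum_i w_i P_i(x_k)
# for filling a masked region of an image; the operators below are simple stand-ins.
import numpy as np

rng = np.random.default_rng(0)
img = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))          # ground truth: smooth gradient
mask = rng.random(img.shape) < 0.3                          # True where pixels are missing
observed = np.where(mask, 0.0, img)

def P_data(x):      # projection onto {x : x equals the observed values off the mask}
    return np.where(mask, x, observed)

def P_box(x):       # projection onto {x : 0 <= x <= 1}
    return np.clip(x, 0.0, 1.0)

def P_smooth(x):    # stand-in "regularity" step: local averaging inside the mask only
    pad = np.pad(x, 1, mode="edge")
    avg = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    return np.where(mask, avg, x)

x = observed.copy()
for k in range(200):
    w = np.array([0.5, 0.2, 0.3])                           # weights; could vary with k
    x = w[0] * P_data(x) + w[1] * P_box(x) + w[2] * P_smooth(x)

print("mean abs error in masked area:", float(np.abs(x - img)[mask].mean()))
```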
477 Zero Truncated Strict Arcsine Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
A zero-truncated model is usually used in modeling count data without zeros; it is the opposite of a zero-inflated model. Zero-truncated Poisson and zero-truncated negative binomial models have been discussed and used by researchers in analyzing the abundance of rare species and hospital stays. Zero-truncated models are also used as the base in developing hurdle models. In this study, we develop a new model, the zero-truncated strict arcsine model, which can be used as an alternative model for count data without zeros and with extra variation. Two simulated data sets and one real-life data set are fitted to the developed model. The results show that the model provides a good fit to the data. The maximum likelihood estimation method is used to estimate the parameters.
Keywords: Hurdle models, maximum likelihood estimation method, positive count data.
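The strict arcsine distribution itself is not reproduced in the abstract; what "zero truncated" refers to is the standard construction below, which restricts any count distribution with pmf p(y) to strictly positive counts.

```latex
% Zero truncation of a count distribution with pmf p(y):
P(Y = y \mid Y > 0) \;=\; \frac{p(y)}{1 - p(0)}, \qquad y = 1, 2, 3, \dots
```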
476 Fuzzy Modeling Tool for Creating a Component Model of Information System
Authors: Bogdan Walek, Jiri Bartos, Cyril Klimes, Jaroslav Prochazka, Pavel Smolka, Juraj Masar, Martin Pesl
Abstract:
This paper focuses on creating a component model of an information system under uncertainty. The paper identifies a problem in the current approach to component modeling and proposes a fuzzy tool that works with vague customer requirements and proposes components of the resulting component model. The proposed tool is verified on a specific information system, and the results are shown in the paper. After finding suitable sub-components of the resulting component model, the component model is visualised by the tool.
Keywords: Component, component model, fuzzy, fuzzy rules, fuzzy sets, information system, modelling, tool.
475 A Model for Collaborative COTS Software Acquisition (COSA)
Authors: Torsti Rantapuska, Sariseelia Sore
Abstract:
Acquiring commercial off-the-shelf (COTS) software applications is becoming routine in organizations. However, eliciting user requirements, finding candidate COTS products and making the decision is a complex task, especially for SMEs, which do not have the time and knowledge needed to do the task properly. The existing models intended to help decision makers were originally designed for professional use, so SMEs are obliged to rely on the software vendor's ability to solve the problem with the systems provided. In this paper, we develop a model for SMEs for the acquisition of COTS software products. A leading idea of the model is that an ICT investment is basically a change initiative and therefore should also be treated as a process of organizational learning. The model is designed with three objectives in mind: 1) business orientation, 2) agility, and 3) learning and knowledge management orientation. The model can be applied to ICT investments in SMEs that have a professional team leader with basic business and IT knowledge.
Keywords: COTS acquisition, ICT investment, organizational learning, ICT adoption.
474 A Rough-set Based Approach to Design an Expert System for Personnel Selection
Authors: Ehsan Akhlaghi
Abstract:
Effective employee selection is a critical component of a successful organization. Many important criteria for personnel selection, such as decision-making ability, adaptability, ambition, and self-organization, are naturally vague and imprecise to evaluate. Rough set theory (RST), as a new mathematical approach to vagueness and uncertainty, is a well-suited tool for dealing with qualitative data and various decision problems. This paper provides conceptual, descriptive, and simulation results, concentrating chiefly on human resources and personnel selection factors. The current research derives decision rules which are able to facilitate personnel selection and identifies several significant features based on an empirical study conducted in an IT company in Iran.
Keywords: Decision Making, Expert System, Personnel Selection, Rough Set Theory
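The abstract does not list the derived rules; as background for how rough set theory handles imprecise evaluations, the sketch below computes lower and upper approximations of a target concept ("suitable candidates") from the indiscernibility relation over a tiny hypothetical decision table of our own making.

```python
# Rough-set lower/upper approximations on a tiny hypothetical decision table.
# Candidates are described by two condition attributes; "suitable" is the target concept.
from collections import defaultdict

table = {                       # candidate -> (decision_making, adaptability)
    "c1": ("high", "high"),
    "c2": ("high", "high"),
    "c3": ("low",  "high"),
    "c4": ("low",  "low"),
}
suitable = {"c1", "c3"}         # note: c1 and c2 are indiscernible

# equivalence classes of the indiscernibility relation induced by the condition attributes
classes = defaultdict(set)
for name, attrs in table.items():
    classes[attrs].add(name)

lower = {x for c in classes.values() if c <= suitable for x in c}   # certainly suitable
upper = {x for c in classes.values() if c & suitable for x in c}    # possibly suitable

print("lower approximation:", lower)            # {'c3'}
print("upper approximation:", sorted(upper))    # ['c1', 'c2', 'c3']
print("boundary region:", sorted(upper - lower))
```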
473 On the Prediction of Transmembrane Helical Segments in Membrane Proteins
Abstract:
The prediction of transmembrane helical segments (TMHs) in membrane proteins is an important field in bioinformatics research. In this paper, a method based on the discrete wavelet transform (DWT) has been developed to predict the number and location of TMHs in membrane proteins. The protein with PDB code 1F88 was chosen as an example to describe the prediction of the number and location of TMHs by this method. A test data set containing 19 protein sequences in total was utilized to assess the effectiveness of the method. Compared with the prediction results of DAS, PRED-TMR2, SOSUI, HMMTOP2.0 and TMHMM2.0, the obtained results indicate that the presented method has higher prediction accuracy.
Keywords: Hydrophobicity, membrane protein, transmembrane helical segments, wavelet transform
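As an illustration of the DWT-based idea (smooth a residue hydrophobicity profile, then look for long hydrophobic stretches), the sketch below uses PyWavelets on an artificial sequence; the reduced hydrophobicity table (rough Kyte-Doolittle values), the wavelet choice, the thresholds and the 15-residue length rule are our assumptions, not the paper's parameters.

```python
# Sketch: locate candidate transmembrane helices from a smoothed hydrophobicity
# profile using a discrete wavelet transform (PyWavelets). The sequence, the
# reduced hydrophobicity table and all thresholds below are illustrative only.
import numpy as np
import pywt

kd = {"I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "A": 1.8,
      "G": -0.4, "S": -0.8, "K": -3.9, "D": -3.5, "R": -4.5}   # rough Kyte-Doolittle values

seq = "KDSG" * 6 + "LIVFAL" * 4 + "KRSD" * 6 + "AVLIFL" * 4 + "DSKG" * 6
profile = np.array([kd[a] for a in seq], dtype=float)

# DWT smoothing: keep the coarse coefficients, drop the two finest detail levels
coeffs = pywt.wavedec(profile, "db4", level=3)
coeffs[-1][:] = 0.0
coeffs[-2][:] = 0.0
smooth = pywt.waverec(coeffs, "db4")[: len(profile)]

# call a TMH wherever the smoothed profile stays hydrophobic for >= 15 residues
above = smooth > 1.0
segments, start = [], None
for i, flag in enumerate(np.append(above, False)):
    if flag and start is None:
        start = i
    elif not flag and start is not None:
        if i - start >= 15:
            segments.append((start, i - 1))
        start = None

print("predicted TMH segments (start, end):", segments)
```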
472 Structure and Properties of Meltblown Polyetherimide as High Temperature Filter Media
Authors: Gajanan Bhat, Vincent Kandagor, Daniel Prather, Ramesh Bhave
Abstract:
Polyetherimide (PEI), an engineering plastic with a very high glass transition temperature and excellent chemical and thermal stability, has been processed into controlled-porosity filter media of varying pore size, performance, and surface characteristics. A special grade of PEI was processed by melt blowing to produce microfiber nonwovens suitable as filter media. The resulting microfiber webs were characterized to evaluate their structure and properties. The fiber webs were further modified by hot pressing, a post-processing technique that reduces the pore size in order to improve the barrier properties of the resulting membranes. This ongoing research has shown that PEI can be a good candidate for filter media requiring high temperature and chemical resistance along with good mechanical properties. Also, by selecting the appropriate processing conditions, it is possible to achieve the desired filtration performance from this engineering plastic.
Keywords: Nonwovens, melt blowing, polyetherimide, filter media, microfibers.
471 Categorical Missing Data Imputation Using Fuzzy Neural Networks with Numerical and Categorical Inputs
Authors: Pilar Rey-del-Castillo, Jesús Cardeñosa
Abstract:
There are many situations where input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing data imputation from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, where the input variables for learning and classification are only numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation and a new architecture. The procedure is tested and compared with others using opinion poll data.
Keywords: Classifier, imputation techniques, fuzzy systems, fuzzy min-max neural networks.
470 An Engineering Approach to Forecast Volatility of Financial Indices
Authors: Irwin Ma, Tony Wong, Thiagas Sankar
Abstract:
By systematically applying different engineering methods, difficult financial problems become approachable. Using a combination of theory and techniques such as the wavelet transform, time series data mining, Markov chain based discrete stochastic optimization, and evolutionary algorithms, this work formulated a strategy to characterize and forecast non-linear time series. It attempted to extract typical features from the volatility data sets of the S&P100 and S&P500 indices, which include abrupt drops, jumps and other non-linearities. As a result, forecasting accuracy reached an average of over 75%, surpassing any other publicly available results on the forecast of any financial index.
Keywords: Discrete stochastic optimization, genetic algorithms, genetic programming, volatility forecast
469 Convergence and Divergence in Telephone Conversations: A Case of Persian
Authors: Anna Mirzaiyan, Vahid Parvaresh, Mahmoud Hashemian, Masoud Saeedi
Abstract:
People usually have a telephone voice, which means they adjust their speech to fit particular situations and to blend in with other interlocutors. The question is: do we speak differently to different people? This possibility has been suggested by social psychologists within Accommodation Theory [1]. Converging toward the speech of another person can be regarded as a polite speech strategy, while choosing a language not used by the other interlocutor can be considered the clearest example of speech divergence [2]. The present study sets out to investigate such processes in the course of everyday telephone conversations. Using Joos's [3] model of formality in spoken English, the researchers try to explore convergence to or divergence from the addressee. The results show that lexical choice, and subsequently patterns of style, vary intriguingly in concordance with the person being addressed.
Keywords: Convergence, divergence, lexical formality, speech accommodation.
468 Fast Segmentation for the Piecewise Smooth Mumford-Shah Functional
Authors: Yingjie Zhang
Abstract:
This paper is concerned with an improved algorithm based on the piecewise-smooth Mumford-Shah (MS) functional for efficient and reliable segmentation. In order to speed up convergence, an additional force is introduced at each time step to drive the evolution of the curves, instead of their being driven only by the extensions of the complementary functions u+ and u-. In our scheme, furthermore, the piecewise-constant MS functional is integrated to generate the extra force, based on a temporary image that is dynamically created by computing the union of u+ and u- during segmentation. Therefore, some drawbacks of the original algorithm, such as smaller objects generated by noise and the local minimum problem, are also eliminated or improved. The resulting algorithm has been implemented in Matlab and Visual C++, and its efficiency is demonstrated on several cases.
Keywords: Active contours, energy minimization, image segmentation, level sets.
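For reference, the piecewise-smooth Mumford-Shah energy underlying the algorithm is recalled below in its standard form (u_0 is the observed image, C the segmenting curve, μ and ν weighting parameters); the paper's specific extra-force term is not reproduced in the abstract.

```latex
E_{\mathrm{MS}}(u, C) \;=\; \int_{\Omega} \bigl(u - u_0\bigr)^{2} \,dx
  \;+\; \mu \int_{\Omega \setminus C} \lvert \nabla u \rvert^{2} \,dx
  \;+\; \nu \,\lvert C \rvert .
```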
467 A Reconfigurable Microstrip Patch Antenna with Polyphase Filter for Polarization Diversity and Cross Polarization Filtering Operation
Authors: Lakhdar Zaid, Albane Sangiovanni
Abstract:
A reconfigurable microstrip patch antenna with a polyphase filter for polarization diversity and cross-polarization filtering operation is presented in this paper. In our approach, a polyphase filter is used to obtain the four 90° phase-shifted outputs that feed a square microstrip patch antenna. The antenna can be switched between four states of polarization in transmission as well as in receiving mode. Switches are interconnected with the polyphase filter network to produce left-hand circular polarization, right-hand circular polarization, horizontal linear polarization, and vertical linear polarization. An additional advantage of using a polyphase filter is its capability for cross-polarization filtering in right-hand and left-hand circular polarization operation. The theoretical and simulated results demonstrate that the polyphase filter is a good candidate for driving a microstrip patch antenna to accomplish polarization diversity and cross-polarization filtering operation.
Keywords: Microstrip patch antenna, polyphase filter, circular polarization, linear polarization, reconfigurable antenna.
466 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques
Authors: Faisal Alshuwaier, Ali Areshey
Abstract:
Text mining techniques are generally applied for classifying text and finding fuzzy relations and structures in data sets. This research provides extensive text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism, which is based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch and bound (BB) method to simplify the texts.
Keywords: Extraction, Max-Prod, Fuzzy Relations, Text Mining, Memberships, Classification.
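The keywords mention max-prod composition; since the abstract does not give the formulas it uses, the sketch below only recalls the standard max-product composition of two fuzzy relations, with made-up membership matrices.

```python
# Standard max-product composition (R o S)(x, z) = max_y R(x, y) * S(y, z).
# The membership matrices are made up for illustration.
import numpy as np

R = np.array([[0.8, 0.3],          # fuzzy relation: terms -> concepts
              [0.1, 0.9]])
S = np.array([[0.7, 0.2, 0.0],     # fuzzy relation: concepts -> documents
              [0.4, 0.6, 0.9]])

composed = np.max(R[:, :, None] * S[None, :, :], axis=1)
print(composed)                    # fuzzy relation: terms -> documents
```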
465 The Establishment of Cause-System of Poor Construction Site Safety and Priority Analysis from Different Perspectives
Authors: Shirong Li, Xueping Xiang
Abstract:
Construction site safety in China has aroused comprehensive concern all over the world, and it is imperative to investigate the main causes of poor construction site safety. This paper divides the causes into four aspects, namely factors related to workers, objects, the environment and management, and sets up a system of accident cause elements based on the Delphi method. This is followed by the application of structural equation modeling to examine the importance of each aspect of the causes from the standpoints of the different roles involved in construction. The results indicate that all four aspects of the factors are in need of improvement, and that different roles have different ideas about the priority of those factors. The paper has instructive significance for practitioners seeking to take measures to improve construction site safety in China.
Keywords: Construction site safety, Delphi method, structural equation modeling, different perspectives.
464 Soft Computing based Retrieval System for Medical Applications
Authors: Pardeep Singh, Sanjay Sharma
Abstract:
With increasing data in medical databases, medical data retrieval is growing in popularity. Some of this analysis includes inducing propositional rules from databases using several soft computing techniques and then using these rules in an expert system. Diagnostic rules and information on features are extracted from clinical databases on diseases of congenital anomaly. This paper explains the latest soft computing techniques, including adaptive techniques that encompass an extensive group of methods applied in the medical domain for the discovery of data dependencies, the importance of features, patterns in sample data, and feature space dimensionality reduction. These approaches pave the way for new and interesting avenues of research in medical imaging and represent an important challenge for researchers.
Keywords: CBIR, GA, Rough sets, CBMIR, SVM.
463 Optimal Allocation of PHEV Parking Lots to Minimize Distribution System Losses
Authors: Mahmud Fotuhi-Firuzabad, Ali Abbaspour, Mohsen Mazidi, Mohamamd Rastegar
Abstract:
To tackle air pollution issues, Plug-in Hybrid Electric Vehicles (PHEVs) have been proposed as an appropriate solution. Charging a large number of PHEV batteries, if not controlled, would have negative impacts on the distribution system. The charging of these vehicles can be controlled centrally in parking lots, which offers a chance for better coordination than individual charging at home. In this paper, an optimization-based approach is proposed to determine the optimum PHEV parking capacities at candidate nodes of the distribution system. In so doing, a profile for the charging and discharging of PHEVs is developed in order to flatten the network load profile. This profile is then used in solving an optimization problem to minimize the distribution system losses. The outputs of the proposed method are the proper locations for PHEV parking lots and the optimum capacity of each lot. The application of the proposed method to the IEEE 34-node test feeder verifies the effectiveness of the method.
Keywords: Plug-in Hybrid Electric Vehicle (PHEV), PHEV parking lot, V2G.
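The abstract says a charging/discharging profile is built to flatten the network load before the loss-minimization step; as a toy illustration of load flattening only (not the paper's loss model or the IEEE 34-node feeder), the sketch below greedily allocates a fixed PHEV energy budget to the lowest-loaded hours. All numbers are invented.

```python
# Toy "valley filling" sketch: schedule a fixed daily PHEV energy budget into the
# hours with the lowest base load so that the aggregate profile is flattened.
# The load numbers, energy budget and greedy rule are illustrative only.
base_load = [40, 35, 30, 28, 30, 45, 60, 70, 75, 72, 68, 65,
             63, 62, 64, 68, 74, 80, 85, 82, 75, 65, 55, 45]   # MW per hour
phev_energy = 120.0                                            # MWh to deliver over the day
step = 1.0                                                     # MWh allocated per iteration

charge = [0.0] * 24
remaining = phev_energy
while remaining > 0:
    h = min(range(24), key=lambda i: base_load[i] + charge[i])  # current lowest-loaded hour
    charge[h] += step
    remaining -= step

total = [base_load[i] + charge[i] for i in range(24)]
print("peak-to-valley before:", max(base_load) - min(base_load))
print("peak-to-valley after: ", round(max(total) - min(total), 1))
```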
462 Identification of Service Quality Determinants in the Hotel Sector: A Conceptual Review
Authors: Asem M. Othman
Abstract:
The expansion of the hospitality industry has been distinctive in the 21st century. Services, by nature, are intangible; hence, service quality, in general, is a complicated process to measure and evaluate. Hotels, as a service sector and part of the hospitality industry, are growing rapidly. This research paper was carried out to identify the quality determinants that may affect hotel guests' perception of service quality. In this paper, each quality determinant is discussed, illustrated, and justified thoroughly via a systematic literature review. The paper sets the stage for measuring the significant influence of the service quality determinants on guest satisfaction. The knowledge contribution of this study is to propose to practitioners and/or hotel service providers fundamental elements to adopt in their policies.
Keywords: Hotel service, service quality, quality determinants, quality management.
461 Identification of MIMO Systems Using Neuro-Fuzzy Models with a Shuffled Frog Leaping Algorithm
Authors: Sana Bouzaida, Anis Sakly, Faouzi M'Sahli
Abstract:
In this paper, a TSK-type neuro-fuzzy inference system that combines the features of fuzzy sets and neural networks is applied to the identification of MIMO systems. The procedure for adapting the parameters of the TSK model employs a Shuffled Frog Leaping Algorithm (SFLA), which is inspired by the memetic evolution of a group of frogs searching for food. To demonstrate the accuracy and effectiveness of the proposed approach, two nonlinear systems are considered as the MIMO plant, and the results are compared with other learning methods based on the Particle Swarm Optimization algorithm (PSO) and the Genetic Algorithm (GA).
Keywords: Identification, Shuffled Frog Leaping Algorithm (SFLA), TSK-type neuro-fuzzy model.
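The abstract does not reproduce the SFLA update it uses; the sketch below shows the commonly cited core step (the worst frog in a memeplex leaps toward the memeplex best, then toward the global best, and is re-initialized if neither helps), with a stand-in sphere objective instead of the TSK identification error. The partitioning and all constants are our own simplifications.

```python
# Core SFLA step with a stand-in objective; not the paper's TSK identification setup.
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):                      # stand-in objective (sphere); lower is better
    return float(np.sum(x ** 2))

def leap(memeplex, global_best):
    """Move the worst frog toward the memeplex best, then toward the global best;
    re-initialize it randomly if neither leap improves its fitness."""
    scores = [fitness(f) for f in memeplex]
    b, w = int(np.argmin(scores)), int(np.argmax(scores))
    for target in (memeplex[b], global_best):
        candidate = memeplex[w] + rng.random() * (target - memeplex[w])
        if fitness(candidate) < scores[w]:
            memeplex[w] = candidate
            return
    memeplex[w] = rng.uniform(-5.0, 5.0, size=memeplex[w].shape)

# tiny run: 2 memeplexes of 5 two-dimensional frogs, shuffled every iteration
frogs = [rng.uniform(-5.0, 5.0, size=2) for _ in range(10)]
for _ in range(50):
    frogs.sort(key=fitness)
    global_best = frogs[0]
    memeplexes = [frogs[0::2], frogs[1::2]]          # rank-interleaved partition
    for m in memeplexes:
        leap(m, global_best)
    frogs = memeplexes[0] + memeplexes[1]            # shuffle back together

print("best fitness after search:", fitness(min(frogs, key=fitness)))
```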
460 Integral Domains and Their Algebras: Topological Aspects
Authors: Shai Sarussi
Abstract:
Let S be an integral domain with field of fractions F and let A be an F-algebra. An S-subalgebra R of A is called S-nice if R ∩ F = S and the localization of R with respect to S \ {0} is A. Denoting by W the set of all S-nice subalgebras of A, and defining a notion of open sets on W, one can view W as a T0-Alexandroff space; thus, the algebraic structure of W can be viewed from the point of view of topology. It is shown that every nonempty open subset of W has a maximal element, which is also a maximal element of W. Moreover, a supremum of an irreducible subset of W always exists. As a notable connection with valuation theory, one considers the case in which S is a valuation domain and A is an algebraic field extension of F; if S is indecomposed in A, then W is an irreducible topological space, and W contains a greatest element.
Keywords: Algebras over integral domains, Alexandroff topology, valuation domains, integral domains.
459 An Efficient Data Mining Approach on Compressed Transactions
Authors: Jia-Yu Dai, Don-Lin Yang, Jungpin Wu, Ming-Chuan Hung
Abstract:
In an era of knowledge explosion, the amount of data grows rapidly day by day. Since data storage is a limited resource, how to reduce the data space needed in the process becomes a challenging issue. Data compression provides a good solution, as it can lower the required space. Data mining has found many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process; however, they lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with a Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships between transactions to merge related transactions and builds a quantification table to prune candidate itemsets which cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
Keywords: Association rule, data mining, merged transaction, quantification table.
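The construction of the quantification table is not detailed in the abstract; the sketch below only illustrates the general idea it names: merging identical transactions with a count, then pruning candidate itemsets whose support, computed over the merged representation, falls below the threshold. The data and the merging rule are our own placeholders, not the M2TQT construction itself.

```python
# Illustrative only: merge duplicate transactions into (itemset, count) pairs and
# use the counts to prune infrequent candidate itemsets (Apriori-style support check).
from collections import Counter
from itertools import combinations

transactions = [("a", "b", "c"), ("a", "c"), ("a", "b", "c"), ("b", "c"), ("a", "c")]
min_support = 3

merged = Counter(frozenset(t) for t in transactions)        # merged transactions with counts
print("merged representation:", dict(merged))

def support(candidate):
    return sum(cnt for items, cnt in merged.items() if candidate <= items)

all_items = sorted(set().union(*merged))
candidates = [frozenset(c) for c in combinations(all_items, 2)]
frequent = [set(c) for c in candidates if support(c) >= min_support]
print("frequent 2-itemsets:", frequent)
```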