Search results for: stochastic search.
563 Heuristic Search Algorithm (HSA) for Enhancing the Lifetime of Wireless Sensor Networks
Authors: Tripatjot S. Panag, J. S. Dhillon
Abstract:
The lifetime of a wireless sensor network can be effectively increased by using scheduling operations. Once the sensors are randomly deployed, the task at hand is to find the largest number of disjoint sets of sensors such that every sensor set provides complete coverage of the target area. At any instant, only one of these disjoint sets is switched on, while all others are switched off. This paper proposes a heuristic search method to find the maximum number of disjoint sets that completely cover the region. A population of randomly initialized members is made to explore the solution space. A set of heuristics is applied to guide the members to a possible solution in their neighborhood. The heuristics accelerate the convergence of the algorithm. The best solution explored by the population is recorded and continuously updated. The proposed algorithm has been tested for applications that require sensing of multiple target points, referred to as point coverage applications. Results show that the proposed algorithm outperforms the existing algorithms: it always finds the optimal solution, and does so with fewer fitness function evaluations than the existing approaches.
Keywords: Coverage, disjoint sets, heuristic, lifetime, scheduling, wireless sensor networks, WSN.
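As a rough illustration of the scheduling idea above, the following Python sketch builds disjoint cover sets greedily; the circular sensing model, ranges, and coordinates are hypothetical, and the paper's population-based heuristic search is not reproduced here.

import random

def covers(sensor, target, sensing_range=1.0):
    """A sensor covers a target if the target lies within its sensing range."""
    return (sensor[0] - target[0]) ** 2 + (sensor[1] - target[1]) ** 2 <= sensing_range ** 2

def disjoint_cover_sets(sensors, targets, sensing_range=1.0):
    """Repeatedly build a set of unused sensors that covers every target point.
    Each returned set can be switched on alone while the others sleep."""
    unused = set(range(len(sensors)))
    cover_sets = []
    while True:
        chosen, uncovered = set(), set(range(len(targets)))
        while uncovered:
            # Pick the unused sensor that covers the most remaining targets.
            best = max(unused - chosen,
                       key=lambda s: sum(covers(sensors[s], targets[t], sensing_range)
                                         for t in uncovered),
                       default=None)
            if best is None:
                return cover_sets            # no sensors left to complete a cover
            gain = {t for t in uncovered if covers(sensors[best], targets[t], sensing_range)}
            if not gain:
                return cover_sets            # remaining sensors cannot finish a cover
            chosen.add(best)
            uncovered -= gain
        cover_sets.append(chosen)
        unused -= chosen                     # cover sets must stay disjoint

# Example: random deployment of 40 sensors monitoring 5 target points.
random.seed(0)
sensors = [(random.random() * 4, random.random() * 4) for _ in range(40)]
targets = [(random.random() * 4, random.random() * 4) for _ in range(5)]
print(len(disjoint_cover_sets(sensors, targets, sensing_range=1.5)), "disjoint cover sets")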
562 Dynamic Routing to Multiple Destinations in IP Networks using Hybrid Genetic Algorithm (DRHGA)
Authors: K. Vijayalakshmi, S. Radhakrishnan
Abstract:
In this paper, we propose a novel dynamic least-cost multicast routing protocol using a hybrid genetic algorithm for IP networks. Our protocol finds the multicast tree with minimum cost subject to delay, degree, and bandwidth constraints. The proposed protocol has the following features: i. a heuristic local search function has been devised and embedded into the normal genetic operation to increase the speed and to obtain the optimized tree; ii. it efficiently handles the dynamic situations that arise due to either a change in the multicast group membership or node/link failure; iii. two different crossover and mutation probabilities are used to maintain the diversity of solutions and achieve quick convergence. The simulation results show that our proposed protocol generates dynamic multicast trees with lower cost. Results also show that the proposed algorithm has a better convergence rate, a better dynamic request success rate, and less execution time than other existing algorithms. The effects of degree and delay constraints on the multicast tree have also been analyzed in terms of search success rate.
Keywords: Dynamic Group membership change, Hybrid Genetic Algorithm, Link / node failure, QoS Parameters.
561 Highly Accurate Target Motion Compensation Using Entropy Function Minimization
Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani
Abstract:
One of the defects of stepped frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal to Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation for the effects of the Target Motion Parameters (TMPs) should therefore be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization is proposed. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specified interval, seeks a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is performed around that minimum, along several constant-acceleration lines, in order to enhance the accuracy of the minimum point found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
Keywords: ATR, HRRP, motion compensation, SFW, TMP.
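A schematic sketch of the two-step search described in the abstract, assuming a caller-supplied routine profile_entropy(v, a) (a hypothetical name) that compensates the stepped-frequency returns for a candidate velocity and acceleration, rebuilds the HRRP, and returns its entropy; the grid bounds and step sizes below are illustrative.

import numpy as np

def coarse_to_fine_tmp_search(profile_entropy,
                              v_range=(-30.0, 30.0), a_range=(-5.0, 5.0),
                              coarse_points=31, fine_points=201):
    """Two-step estimate of target motion parameters (velocity, acceleration)
    by minimizing the entropy of the compensated range profile."""
    # Step 1: coarse discrete search over the whole velocity-acceleration lattice.
    v_grid = np.linspace(*v_range, coarse_points)
    a_grid = np.linspace(*a_range, coarse_points)
    ent = np.array([[profile_entropy(v, a) for a in a_grid] for v in v_grid])
    iv, ia = np.unravel_index(np.argmin(ent), ent.shape)
    v0 = v_grid[iv]

    # Step 2: refined 1-D search over velocity along a few constant-acceleration
    # lines around the coarse minimum.
    best = (ent[iv, ia], v0, a_grid[ia])
    dv = (v_range[1] - v_range[0]) / (coarse_points - 1)
    for a in a_grid[max(ia - 1, 0):ia + 2]:          # neighbouring acceleration lines
        for v in np.linspace(v0 - dv, v0 + dv, fine_points):
            e = profile_entropy(v, a)
            if e < best[0]:
                best = (e, v, a)
    return best[1], best[2]                          # estimated velocity, acceleration

# Toy usage with a synthetic entropy surface whose minimum lies near v = 7.3, a = -1.2.
v_hat, a_hat = coarse_to_fine_tmp_search(lambda v, a: (v - 7.3) ** 2 + 4 * (a + 1.2) ** 2)
print(round(v_hat, 2), round(a_hat, 2))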
560 Dynamics of Marital Status and Information Search through Consumer Generated Media: An Exploratory Study
Authors: Shivakumar Krishnamurti, Ruchi Agarwal
Abstract:
The study examines the influence of marital status on consumers of products and services who use blogs as a source of information. A pre-designed questionnaire was used to collect the primary data (experiences) from the respondents. Data were collected from one hundred and eighty-seven respondents residing in and around the Emirates of Sharjah and Dubai in the United Arab Emirates. The collected data were analyzed with the help of statistical tools such as averages, percentages, factor analysis, Student's t-test and the Structural Equation Modelling technique. The objectives of the study are to understand how married and unmarried or single consumers of products and services are motivated to use blogs as a source of information, to determine whether consumers of products and services, irrespective of their marital status, share their views and experiences with other bloggers, and to learn the respondents' future intentions towards blogging. The study revealed the following: the majority of the respondents are motivated to blog because they are willing to receive comments on what they post about services, they find blogs convenient for searching for information about services and products, blogging lets them share information on the symptoms of a disease or disorder that someone may experience as well as information about ready-to-cook mix products, and they are keen to spend more time blogging in the future.
Keywords: Blog, consumer, information, marital status.
559 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of this model consists of two phases: structure learning and parameter learning. For structure learning, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating the expert's knowledge into the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improved K2 algorithm, called K2 for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.
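For orientation, the sketch below shows the generic greedy score-and-search loop behind the K2 family, assuming a fixed node ordering and a caller-supplied local scoring function; it is not the K2PC variant itself, which additionally restricts candidates using parent-and-child sets and expert knowledge.

def k2_structure_search(nodes, score, max_parents=3):
    """Greedy K2-style structure learning.

    nodes       : list of variables in a fixed topological ordering.
    score(n, ps): caller-supplied local score of node n given parent set ps
                  (e.g. a Bayesian-Dirichlet metric computed from data).
    Returns a dict mapping each node to its learned parent set.
    """
    parents = {n: set() for n in nodes}
    for i, node in enumerate(nodes):
        candidates = set(nodes[:i])          # only predecessors in the ordering
        best = score(node, parents[node])
        improved = True
        while improved and len(parents[node]) < max_parents and candidates:
            improved = False
            # Try adding the single candidate parent that improves the score most.
            gains = {c: score(node, parents[node] | {c}) for c in candidates}
            cand, new_score = max(gains.items(), key=lambda kv: kv[1])
            if new_score > best:
                parents[node].add(cand)
                candidates.remove(cand)
                best = new_score
                improved = True
    return parents

# Toy usage: a made-up score that rewards including each node's immediate predecessor.
order = ["rain", "sprinkler", "wet_grass"]
toy_score = lambda n, ps: -len(ps) + 2 * ((order[order.index(n) - 1] in ps) if order.index(n) else 0)
print(k2_structure_search(order, toy_score))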
558 Mathematical Models of Flow Shop and Job Shop Scheduling Problems
Authors: Miloš Šeda
Abstract:
In this paper, mathematical models for the permutation flow shop scheduling and job shop scheduling problems are proposed. The first model is a mixed integer programming formulation. As the problem is NP-complete, this model can only be used for smaller instances for which an optimal solution can be computed. For large instances, another model is proposed which is suitable for solving the problem by stochastic heuristic methods. For the job shop scheduling problem, a mathematical model and its main representation schemes are presented.
Keywords: Flow shop, job shop, mixed integer model, representation scheme.
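As a small illustration of the permutation-based representation suitable for stochastic heuristics, the sketch below computes the makespan of a permutation flow shop schedule with the usual completion-time recursion; the processing times are made up, and exhaustive enumeration stands in for a heuristic search over permutations.

from itertools import permutations

def flow_shop_makespan(perm, p):
    """Makespan of a permutation flow shop schedule.

    perm : job processing order, e.g. (2, 0, 1)
    p    : p[j][m] = processing time of job j on machine m
    Completion times follow C[j][m] = max(C[prev_j][m], C[j][m-1]) + p[j][m].
    """
    machines = len(p[0])
    finish = [0.0] * machines                 # completion time of the last scheduled job on each machine
    for j in perm:
        for m in range(machines):
            earlier = finish[m - 1] if m else 0.0
            finish[m] = max(finish[m], earlier) + p[j][m]
    return finish[-1]

# Example: 3 jobs x 2 machines.
times = [[3, 2], [1, 4], [2, 2]]
best = min(permutations(range(3)), key=lambda s: flow_shop_makespan(s, times))
print(best, flow_shop_makespan(best, times))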
557 An Advanced Nelder Mead Simplex Method for Clustering of Gene Expression Data
Authors: M. Pandi, K. Premalatha
Abstract:
DNA microarray technology concurrently monitors the expression levels of thousands of genes during significant biological processes and across related samples. A better understanding of functional genomics is obtained by extracting the patterns hidden in gene expression data. This is handled by clustering, which reveals natural structures and identifies interesting patterns in the underlying data. In the proposed work, clustering of gene expression data is done through an Advanced Nelder Mead (ANM) algorithm. The Nelder Mead (NM) method is a method designed for optimization. In the NM method, the vertices of a simplex (a triangle in two dimensions) are considered as the candidate solutions, and several operations are performed on this simplex to obtain a better result. In the proposed work, the reflection and expansion operations are eliminated and a new operation called spread-out is introduced. The spread-out operation increases the global search area and thus provides a better optimization result. It produces three points, and the best among these three points is used to replace the worst point. The experimental results are analyzed on optimization benchmark test functions and gene expression benchmark datasets. The results show that ANM outperforms NM on both benchmarks.
Keywords: Spread out, simplex, multi-minima, fitness function, optimization, search area, monocyte, solution, genomes.
556 Using Gaussian Process in Wind Power Forecasting
Authors: Hacene Benkhoula, Mohamed Badreddine Benabdella, Hamid Bouzeboudja, Abderrahmane Asraoui
Abstract:
Wind is a random variable that is difficult to master; for this reason, we developed mathematical and statistical methods to model and forecast wind power. Gaussian Processes (GP) are one of the most widely used families of stochastic processes for modeling dependent data observed over time, over space, or over time and space. A GP describes an underlying process, formed by unobserved mechanisms, that is used to solve a problem. The purpose of this paper is to present how to forecast wind power by using a GP. The Gaussian process method for forecasting is presented. To validate the presented approach, a simulation in the MATLAB environment is given.
Keywords: Forecasting, Gaussian process, modeling, wind power.
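A bare-bones sketch of Gaussian process regression applied to a toy wind power series, assuming a squared-exponential kernel and illustrative hyperparameters; the abstract does not specify the kernel or the MATLAB implementation details.

import numpy as np

def rbf_kernel(a, b, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_forecast(x_train, y_train, x_test, noise_var=0.01, **kern):
    """GP posterior mean and variance at x_test given noisy training data."""
    K = rbf_kernel(x_train, x_train, **kern) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train, **kern)
    K_ss = rbf_kernel(x_test, x_test, **kern)
    L = np.linalg.cholesky(K)                         # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)      # predictive variance
    return mean, var

# Toy usage: hourly wind power observations, forecast the next few hours.
hours = np.arange(0.0, 24.0, 2.0)
power = 50 + 20 * np.sin(hours / 4.0) + np.random.default_rng(1).normal(0, 2, hours.size)
mu, var = gp_forecast(hours, power - power.mean(), np.array([25.0, 26.0, 27.0]),
                      noise_var=4.0, length_scale=3.0, signal_var=400.0)
print(np.round(mu + power.mean(), 1), np.round(np.sqrt(var), 1))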
555 Incorporating Semantic Similarity Measure in Genetic Algorithm: An Approach for Searching the Gene Ontology Terms
Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias, Hany T. Alashwal, Rohayanti Hassan, Farhan Mohamed
Abstract:
The most important property of the Gene Ontology is its terms. These controlled vocabularies are defined to provide consistent descriptions of gene products that are shareable and computationally accessible by humans, software agents, or other machine-readable metadata. Each term is associated with information such as a definition, synonyms, database references, amino acid sequences, and relationships to other terms. This information has made the Gene Ontology broadly applied in microarray and proteomic analysis. However, the process of searching the terms is still carried out using a traditional approach based on keyword matching. The weaknesses of this approach are that it ignores semantic relationships between terms and depends heavily on a specialist to find similar terms. Therefore, this study combines a semantic similarity measure and a genetic algorithm to perform a better retrieval process for searching semantically similar terms. The semantic similarity measure is used to compute the similarity strength between two terms. The genetic algorithm is then employed to perform batch retrievals and to handle the large search space of the Gene Ontology graph. Computational results are presented to show the effectiveness of the proposed algorithm.
Keywords: Gene Ontology, semantic similarity measure, genetic algorithm, ontology search.
554 Transmission Lines Loading Enhancement Using ADPSO Approach
Authors: M. Mahdavi, H. Monsef, A. Bagheri
Abstract:
Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm that is used to solve large-scale, discrete and nonlinear optimization problems. However, it has been observed that the standard DPSO algorithm suffers from premature convergence when solving a complex optimization problem like transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that the optimization of lines loading in transmission expansion planning with ADPSO is better than with DPSO from the precision viewpoint.
Keywords: ADPSO, TEP problem, lines loading optimization.
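For reference, the sketch below shows the basic binary DPSO scheme that ADPSO builds on, with a stand-in fitness over bit-encoded expansion plans; the advanced operators that ADPSO adds to counter premature convergence are not reproduced here.

import math
import random

def binary_pso(fitness, n_bits, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize fitness(bits) over {0,1}^n_bits with a basic binary PSO.
    Velocities pass through a sigmoid to give bit-flip probabilities."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # Sigmoid of the velocity is the probability that the bit becomes 1.
                pos[i][d] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-vel[i][d])) else 0
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy usage: pick candidate lines so that exactly 4 of 10 are built (stand-in objective).
plan, cost = binary_pso(lambda b: abs(sum(b) - 4), n_bits=10)
print(plan, cost)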
553 Piecewise Interpolation Filter for Effective Processing of Large Signal Sets
Authors: Anatoli Torokhti, Stanley Miklavcic
Abstract:
Suppose K_Y and K_X are large sets of observed and reference signals, respectively, each containing N signals. Is it possible to construct a filter F : K_Y → K_X that requires a priori information on only a few signals, p ≪ N, from K_X but performs better than the known filters based on a priori information on every reference signal from K_X? It is shown that the positive answer is achievable under quite unrestrictive assumptions. The device behind the proposed method is a special extension of the piecewise linear interpolation technique to the case of random signal sets. The proposed technique provides a single filter to process any signal from the arbitrarily large signal set. The filter is determined in terms of pseudo-inverse matrices, so that it always exists.
Keywords: Wiener filter, filtering of stochastic signals.
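A much simplified linear version of the idea, assuming the filter is the least-squares map from a few observed signals to their reference counterparts computed via a pseudo-inverse; the piecewise interpolation machinery of the paper is not reproduced, and the data below are synthetic.

import numpy as np

def pinv_filter(Y_train, X_train):
    """Least-squares filter F with F @ y approximating x, built from p training pairs only.

    Y_train, X_train : arrays of shape (p, n) holding p observed signals and
                       the p corresponding reference signals (p << N).
    The pseudo-inverse guarantees F exists even when Y_train is rank deficient.
    """
    return X_train.T @ np.linalg.pinv(Y_train.T)

# Toy usage: noisy observations of signals living on a low-dimensional subspace.
rng = np.random.default_rng(0)
n, p, N = 64, 5, 1000
basis = rng.normal(size=(3, n))
X_all = rng.normal(size=(N, 3)) @ basis                  # reference signals
Y_all = X_all + 0.1 * rng.normal(size=(N, n))            # observed (noisy) signals
F = pinv_filter(Y_all[:p], X_all[:p])                    # a priori info on p signals only
err = np.linalg.norm(Y_all[p:] @ F.T - X_all[p:]) / np.linalg.norm(X_all[p:])
print(f"relative estimation error on the remaining signals: {err:.3f}")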
552 Comparative Analysis of Transient-Fault Tolerant Schemes for Network on Chips
Authors: Muhammad Ali, Awais Adnan
Abstract:
Network on a chip (NoC) has been proposed as a viable solution to counter the inefficiency of buses in current VLSI on-chip interconnects. However, as silicon chips accommodate more transistors, the probability of transient faults is increasing, making fault tolerance a key concern in scaling chips. In packet-based communication on a chip, transient failures can corrupt the data packet and hence undermine the accuracy of data communication. In this paper, we present a comparative analysis of transient fault tolerant techniques including end-to-end, node-by-node, and stochastic communication based on the flooding principle.
Keywords: NoC, fault-tolerance, transient faults.
551 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper are to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). The importance of this is to inform and help schools know which Interoperability Maturity Model is best suited to their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012 to 2019, and the different types of Interoperability Maturity Models are compared in detail, including the background information, the levels of interoperability, and the areas for consideration in each Maturity Model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzing's and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model (based on the scoping review, in which only 24 papers out of the 740 publications initially identified in the field were found to be best suited for the paper). This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: Interoperability, Interoperability Maturity Model, School Management System, scoping review.
550 Fuzzy Relatives of the CLARANS Algorithm With Application to Text Clustering
Authors: Mohamed A. Mahfouz, M. A. Ismail
Abstract:
This paper introduces new algorithms (a fuzzy relative of the CLARANS algorithm, FCLARANS, and fuzzy c-medoids based on randomized search, FCMRANS) for fuzzy clustering of relational data. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function minimized by FCMdd by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may join the computation of the medoids, while the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results with different samples of the Reuters-21578 and Newsgroups (20NG) collections and generated datasets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient, and their outputs are almost the same as those of RFCMdd in terms of classification rate.
Keywords: Data mining, fuzzy clustering, relational clustering, medoid-based clustering, cluster analysis, unsupervised learning.
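A condensed sketch of the randomized medoid-swap idea, assuming fuzzy memberships computed as in fuzzy c-medoids over a precomputed relational dissimilarity matrix; acceptance here is plain greedy improvement rather than the full CLARANS neighbour-sampling scheme, and the data are synthetic.

import random

def fuzzy_objective(D, medoids, m=2.0):
    """Fuzzy within-cluster dissimilarity J = sum_i sum_k u_ik^m d(i, medoid_k)."""
    J = 0.0
    for i in range(len(D)):
        d = [max(D[i][mk], 1e-12) for mk in medoids]
        for dk in d:
            u = 1.0 / sum((dk / dj) ** (1.0 / (m - 1.0)) for dj in d)   # fuzzy membership
            J += (u ** m) * dk
    return J

def fclarans_like(D, c, max_swaps=500, seed=0):
    """Fuzzy clustering of relational data D (n x n dissimilarities) into c clusters
    by randomized swaps of medoids, accepted whenever the objective decreases."""
    rng = random.Random(seed)
    n = len(D)
    medoids = rng.sample(range(n), c)
    best = fuzzy_objective(D, medoids)
    for _ in range(max_swaps):
        out = rng.randrange(c)                        # medoid position to replace
        cand = rng.choice([i for i in range(n) if i not in medoids])
        trial = medoids[:out] + [cand] + medoids[out + 1:]
        J = fuzzy_objective(D, trial)
        if J < best:
            medoids, best = trial, J
    return medoids, best

# Toy usage: two well-separated groups of points turned into a dissimilarity matrix.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
D = [[abs(a[0] - b[0]) + abs(a[1] - b[1]) for b in pts] for a in pts]
print(fclarans_like(D, c=2))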
549 Delay-Distribution-Dependent Stability Criteria for BAM Neural Networks with Time-Varying Delays
Authors: J.H. Park, S. Lakshmanan, H.Y. Jung, S.M. Lee
Abstract:
This paper is concerned with delay-distribution-dependent stability criteria for bidirectional associative memory (BAM) neural networks with time-varying delays. Based on the Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is derived to guarantee the global asymptotic mean-square stability of the considered BAM neural networks. The criteria are formulated in terms of a set of linear matrix inequalities (LMIs), which can be checked efficiently by means of standard numerical packages. Finally, a numerical example and its simulation are given to demonstrate the usefulness and effectiveness of the proposed results.
Keywords: BAM neural networks, probabilistic time-varying delays, stability criteria.
548 Losses Analysis in TEP Considering Uncertainity in Demand by DPSO
Authors: S. Jalilzadeh, A. Kimiyaghalam, A. Ashouri
Abstract:
This paper presents a mathematical model and a methodology to analyze the losses in transmission expansion planning (TEP) under uncertainty in demand. The methodology is based on discrete particle swarm optimization (DPSO). DPSO is a useful and powerful stochastic evolutionary algorithm for solving large-scale, discrete and nonlinear optimization problems like TEP. The effectiveness of the proposed idea is tested on an actual transmission network of the Azerbaijan regional electric company, Iran. The simulation results show that considering the losses, even for transmission expansion planning of a network with low load growth, decreases operational costs considerably and lets the network satisfy the requirement of delivering electric power to load centers more reliably.
Keywords: DPSO, TEP, uncertainty.
547 An Agent-Based Approach to Immune Modelling: Priming Individual Response
Authors: Dimitri Perrin, Heather J. Ruskin, Martin Crane
Abstract:
This study focuses on examining why the range of experience with respect to HIV infection is so diverse, especially in regard to the latency period. An agent-based approach in modelling the infection is used to extract high-level behaviour which cannot be obtained analytically from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties, contributing to the individual disease experience, and is included in a network which mimics the chain of lymph nodes. The model also accounts for stochastic events such as viral mutations. The size and complexity of the model require major computational effort, and parallelisation methods are used.
Keywords: HIV, immune modelling, agent-based system, individual response.
546 Vehicle Routing Problem with Mixed Fleet of Conventional and Heterogenous Electric Vehicles and Time Dependent Charging Costs
Authors: Ons Sassi, Wahiba Ramdane Cherif-Khettaf, Ammar Oulamara
Abstract:
In this paper, we consider the vehicle routing problem with a mixed fleet of conventional and heterogeneous electric vehicles and time-dependent charging costs, denoted VRP-HFCC, in which a set of geographically scattered customers have to be served by a mixed fleet of vehicles composed of a heterogeneous fleet of Electric Vehicles (EVs), having different battery capacities and operating costs, and Conventional Vehicles (CVs). We include the possibility of charging EVs at the available charging stations during the routes in order to serve all customers. Each charging station offers a charging service with a known charger technology and time-dependent charging costs. Charging stations are also subject to operating time window constraints. EVs are not necessarily compatible with all available charging technologies, and partial charging is allowed. Intermittent charging at the depot is also allowed, provided that constraints related to the electricity grid are satisfied. The objective is to minimize the number of employed vehicles and then minimize the total travel and charging costs. In this study, we present a Mixed Integer Programming model and develop a Charging Routing Heuristic and a Local Search Heuristic based on the Inject-Eject routine with different insertion methods. All heuristics are tested on real data instances.
Keywords: charging problem, electric vehicle, heuristics, local search, optimization, routing problem.
545 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition, used to overcome the well-known phenomenon of the Curse of Dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRFDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRFDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: Binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct.
544 In Search of Innovation: Exploring the Dynamics of Innovation
Authors: Michal Lysek, Mike Danilovic, Jasmine Lihua Liu
Abstract:
HMS Industrial Networks AB has been recognized as one of the most innovative companies in the industrial communication industry worldwide. The creation of their Anybus innovation during the 1990s contributed considerably to the company's success. From inception, HMS' employees were innovating for the purpose of creating new business (the creation phase). After the Anybus innovation, they began the process of internationalization (the commercialization phase), which in turn led them to concentrate on cost reduction, product quality, delivery precision, operational efficiency, and increasing growth (the growth phase). As a result of this transformation, performing new radical innovations has become more complicated. The purpose of our research was to explore the dynamics of innovation at HMS from the aspect of key actors, activities, and events over the three phases, in order to understand what led to the creation of their Anybus innovation, and why it has become increasingly challenging for HMS to create new radical innovations for the future. Our research methodology was based on a longitudinal, retrospective study from the inception of HMS in 1988 to 2014, a single case study inspired by the grounded theory approach. We conducted 47 interviews and collected 1,024 historical documents for our research. Our analysis has revealed that HMS' success in creating the Anybus, and in developing a successful business around the innovation, was based on three main capabilities: cultivating customer relations on different managerial and organizational levels, inspiring business relations, and balancing complementary human assets for the purpose of business creation. The success of HMS has turned management's attention away from the past activities of key actors, their behavior, and how they influenced and stimulated the creation of radical innovations. Nowadays, they rhetorically focus on creativity and innovation, while their real actions put emphasis on growth, cost reduction, product quality, delivery precision, operational efficiency, and moneymaking. In the process of becoming an international company, HMS gradually refocused; in so doing they became profitable and successful, but they also forgot what made them innovative in the first place. Fortunately, HMS' management has come to realize that this is the case, and they are now in search of recapturing innovation once again. Our analysis indicates that HMS' management is facing several barriers to innovation related to path dependency and other lock-in phenomena. HMS' management has been captured, trapped in their mindset and actions, by the success of the past. But now their future has to be secured, and they have come to realize that moneymaking is not everything. In recent years, HMS' management has begun to search for innovation once more, in order to recapture its past capabilities for creating radical innovations, to unlock its perceptions of customer needs, to move away from counter-innovation-driven activities and events, to utilize the full potential of its employees, and to capture the innovation opportunity for the future.
Keywords: Barriers to innovation, dynamics of innovation, in search of excellence and innovation, radical innovation.
543 Fault Detection and Isolation in Attitude Control Subsystem of Spacecraft Formation Flying Using Extended Kalman Filters
Authors: S. Ghasemi, K. Khorasani
Abstract:
In this paper, the problem of fault detection and isolation in the attitude control subsystem of spacecraft formation flying is considered. In order to design the fault detection method, an extended Kalman filter, a nonlinear stochastic state estimation method, is utilized. Three fault detection architectures, namely centralized, decentralized, and semi-decentralized, are designed based on the extended Kalman filters. Moreover, residual generation and threshold selection techniques are proposed for these architectures.
Keywords: Formation flight of satellites, extended Kalman filter, fault detection and isolation, actuator fault.
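A generic sketch of how an extended Kalman filter can serve as a residual generator for fault detection, assuming caller-supplied dynamics and measurement models with their Jacobians and an illustrative threshold; the attitude dynamics and the three architectures of the paper are not modelled here.

import numpy as np

class EKFResidualGenerator:
    """Extended Kalman filter used as a residual generator: the normalized
    innovation is compared against a threshold to flag faults."""

    def __init__(self, f, F_jac, h, H_jac, Q, R, x0, P0):
        self.f, self.F_jac, self.h, self.H_jac = f, F_jac, h, H_jac
        self.Q, self.R, self.x, self.P = Q, R, x0, P0

    def step(self, u, z):
        # Prediction with the nonlinear model; covariance via the Jacobian.
        x_pred = self.f(self.x, u)
        F = self.F_jac(self.x, u)
        P_pred = F @ self.P @ F.T + self.Q
        # Residual (innovation) and its covariance.
        H = self.H_jac(x_pred)
        r = z - self.h(x_pred)
        S = H @ P_pred @ H.T + self.R
        K = P_pred @ H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ r
        self.P = (np.eye(len(self.x)) - K @ H) @ P_pred
        return float(r @ np.linalg.inv(S) @ r)   # normalized residual (chi-square-like)

# Toy usage: scalar attitude-like state; an actuator fault is injected at step 20.
f = lambda x, u: x + u                       # assumed discrete dynamics
h = lambda x: x                              # assumed direct measurement
ekf = EKFResidualGenerator(f, lambda x, u: np.eye(1), h, lambda x: np.eye(1),
                           Q=0.01 * np.eye(1), R=0.04 * np.eye(1),
                           x0=np.zeros(1), P0=np.eye(1))
rng, x_true, threshold = np.random.default_rng(2), np.zeros(1), 16.0
for k in range(40):
    u = np.array([0.1])
    fault = np.array([1.5]) if k >= 20 else np.array([0.0])
    x_true = f(x_true, u + fault)            # truth driven by the (possibly faulty) actuator
    z = h(x_true) + rng.normal(0, 0.2, 1)
    if ekf.step(u, z) > threshold:
        print("fault declared at step", k)
        break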
542 Information Retrieval: A Comparative Study of Textual Indexing Using an Oriented Object Database (db4o) and the Inverted File
Authors: Mohammed Erritali
Abstract:
The growth in the volume of text data, such as books and articles, in libraries over the centuries has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstraction, indexing and the use of classification categories marked the birth of a new field of research called "Information Retrieval". Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus), allowing a user to find the documents relevant to him, that is to say, the contents that match the information needs of the user. Most information retrieval models use a specific data structure to index a corpus, called the "inverted file" or "reverse index". This inverted file collects information on all terms over the corpus documents, specifying the identifiers of the documents that contain the term in question, the frequency of each term in the documents of the corpus, the positions of the occurrences of the word... In this paper we use an object-oriented database (db4o) instead of the inverted file; that is to say, instead of searching for a term in the inverted file, we search for it in the db4o database. The purpose of this work is to carry out a comparative study to see whether object-oriented databases can compete with the inverted index in terms of access speed and resource consumption, using a large volume of data.
Keywords: Information Retrieval, indexation, oriented object database (db4o), inverted file.
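A toy inverted file of the kind described above, assuming whitespace tokenization; each term maps to the documents containing it together with its per-document frequency and positions, which is the structure the paper compares against a db4o store.

from collections import defaultdict

def build_inverted_index(docs):
    """docs: dict doc_id -> text.  Returns term -> {doc_id: {"freq": n, "positions": [...]}}."""
    index = defaultdict(dict)
    for doc_id, text in docs.items():
        for pos, term in enumerate(text.lower().split()):
            entry = index[term].setdefault(doc_id, {"freq": 0, "positions": []})
            entry["freq"] += 1
            entry["positions"].append(pos)
    return index

def search(index, term):
    """Return the posting list of a term: the documents that contain it."""
    return index.get(term.lower(), {})

# Toy usage.
corpus = {1: "stochastic search methods for stochastic problems",
          2: "inverted file structures for information retrieval"}
idx = build_inverted_index(corpus)
print(search(idx, "stochastic"))   # {1: {'freq': 2, 'positions': [0, 4]}}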
541 Joint Use of Factor Analysis (FA) and Data Envelopment Analysis (DEA) for Ranking of Data Envelopment Analysis
Authors: Reza Nadimi, Fariborz Jolai
Abstract:
This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction in decision making units (DMUs). Data envelopment analysis (DEA), a popular linear programming technique, is useful for comparatively rating the operational efficiency of decision making units based on their deterministic (not necessarily stochastic) input-output data. Factor analysis techniques have been proposed as data reduction and classification techniques, which can be applied within the DEA technique to reduce the input-output data. Numerical results reveal that the new approach shows good consistency in ranking with DEA.
Keywords: Effectiveness, decision making, data envelopment analysis, factor analysis.
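A small sketch of the DEA building block, assuming the input-oriented CCR multiplier model solved with scipy.optimize.linprog; the factor-analysis reduction step is represented only by whatever reduced input-output matrices the caller passes in, and the data below are made up.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """CCR (input-oriented, multiplier form) efficiency of DMU k.

    X : (n_dmu, n_inputs) input data,  Y : (n_dmu, n_outputs) output data.
    maximize u.y_k  s.t.  v.x_k = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0.
    """
    n, n_in = X.shape
    n_out = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(n_in)])          # linprog minimizes, so negate
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(n_out), X[k]])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_out + n_in), method="highs")
    return -res.fun                                      # efficiency score in (0, 1]

# Toy usage: 4 DMUs, 2 (possibly factor-reduced) inputs, 1 output.
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 7.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, k), 3) for k in range(4)])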
540 A Strategy for a Robust Design of Cracked Stiffened Panels
Authors: Francesco Caputo, Giuseppe Lamanna, Alessandro Soprano
Abstract:
This work is focused on the numerical prediction of the fracture resistance of a flat stiffened panel made of the aluminium alloy 2024 T3 under a monotonic traction condition. The numerical simulations performed have been based on the micromechanical Gurson-Tvergaard (GT) model for ductile damage. The applicability of the GT model to this kind of structural problem has been studied and assessed by comparing numerical results, obtained by using the WARP 3D finite element code, with experimental data available in the literature. In the sequel, an in-house procedure is presented which aims to increase the residual strength of a cracked stiffened aluminium panel and which is based on the stochastic design improvement (SDI) technique; a complete application example is then given to illustrate the technique.
Keywords: Residual strength, R-Curve, Gurson model, SDI.
539 A Distributed Cognition Framework to Compare E-Commerce Websites Using Data Envelopment Analysis
Authors: C. lo Storto
Abstract:
This paper presents an approach based on the adoption of a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA) designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the needed search time determine the size of the effort, and hence the amount of cognitive cost, he/she has to sustain to perform the task. On the other hand, task performance and result achievement induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements for the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.
Keywords: Website, e-commerce, DEA, distributed cognition, evaluation, comparison.
538 An Activity Based Trajectory Search Approach
Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar
Abstract:
With the enormous increase in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based services are required. Location recommendation is one of the most highly demanded location-aware applications, particularly given the wide availability of location-aware social network applications such as Facebook check-ins, Foursquare, and others. In this paper, we present a new methodology for location recommendation. The proposed approach combines customary spatial attributes with other essential components, including shortest distance and user interests. We also introduce the notion of an "activity trajectory", which represents a trajectory that fulfills the set of activities the user is interested in performing. The proposed approach uses the associated distance value to select the trajectory(ies) with minimum cost (distance) and the spatial area to prune unneeded directions. The proposed algorithm utilizes the activity trajectory concept to recommend the N most similar trajectory(ies) that match the user's required activity pattern with the least traveling distance. To improve the performance of the proposed approach, parallel processing is applied through a MapReduce-based approach. Experiments based on real datasets were set up for assessing the proposed approach. The presented experiments show how the proposed approach beats other strategies, giving better precision and running time.
Keywords: Location-based recommendation, map-reduce, recommendation system, trajectory search.
537 European and International Bond Markets Integration
Authors: Dimitris Georgoutsos, Petros M. Migiakis
Abstract:
The current era is characterised by strengthened interactions among financial markets and increased capital mobility globally. In this frame, we examine the effects the international financial integration process has on the European bond markets. We perform a comparative study of the interactions of the European and international bond markets and exploit cointegration analysis results on the elimination of stochastic trends and the decomposition of the underlying long-run equilibria and short-run causal relations. Our investigation provides evidence on the relation between the European integration process and that of globalisation, viewed through the bond markets sector. Additionally, the structural formulation applied offers significant implications of the findings. All in all, our analysis offers a number of answers to crucial questions concerning the European bond markets integration process.
Keywords: financial integration, bond markets, cointegration
536 The Importance of Zenithal Lighting Systems for Natural Light Gains and for Local Energy Generation in Brazil
Authors: Ana Paula Esteves, Diego S. Caetano, Louise L. B. Lomardo
Abstract:
This paper presents an approach to the advantages of using adequate roofing of the zenithal lighting typology in various areas of architectural production, while at the same time drawing attention to the design concerns inherent in this choice of roofing in Brazil. Understanding that sustainability needs to cover several aspects, a roofing system such as a zenithal lighting system can contribute to the provision of better quality natural light for the interior of the building, which is related to good health and welfare; it can also contribute to sustainability and environmental needs when it allows the generation of energy through semi-transparent or opaque photovoltaic solutions and economizes on artificial lighting. When the energy balance in the building is positive, that is, when the building generates more energy than it consumes, it may fit into the Net Zero Energy Building concept. Zenithal lighting systems could be an important ally in Brazil, once the burden of heat gains is solved, participating in the set of pro-efficiency actions in the search for "zero energy buildings". The paper presents three comparative cases of buildings that have used this feature in search of better environmental performance, both in light comfort and in sustainability as a whole. Two of these buildings are examples in Europe: the Notley Green School in the UK and the Isofóton factory in Spain. The third building, with these shed-roof principles, is located in Brazil: the Ipel factory in São Paulo.
Keywords: Natural lighting, net zero energy building, sheds, semi-transparent photovoltaics.
535 Comparative Study of Ant Colony and Genetic Algorithms for VLSI Circuit Partitioning
Authors: Sandeep Singh Gill, Rajeevan Chandel, Ashwani Chandel
Abstract:
This paper presents a comparative study of Ant Colony and Genetic Algorithms for VLSI circuit bi-partitioning. Ant colony optimization is an optimization method based on the behaviour of social insects [27], whereas the genetic algorithm is an evolutionary optimization technique based on the Darwinian theory of natural evolution and its concept of survival of the fittest [19]. Both methods are stochastic in nature and have been successfully applied to solve many Non Polynomial hard problems. The results obtained show that Genetic Algorithms outperform the Ant Colony optimization technique when tested on the VLSI circuit bi-partitioning problem.
Keywords: Partitioning, genetic algorithm, ant colony optimization, non-polynomial hard, netlist, mutation.
534 Optimized Detection in Multi-Antenna System using Particle Swarm Algorithm
Authors: A. A. Khan, M. Naeem, S. Bashir, S. I. Shah
Abstract:
In this paper, we propose a Particle Swarm heuristic optimized Multi-Antenna (MA) system. Efficient MA system detection is performed using a robust stochastic evolutionary computation algorithm based on the movement and intelligence of swarms. This iterative particle swarm optimized (PSO) detector significantly reduces the computational complexity of the conventional Maximum Likelihood (ML) detection technique. The simulation results achieved with this proposed MA-PSO detection algorithm show near-optimal performance when compared with the ML-MA receiver. The performance of the proposed detector is convincingly better for higher order modulation schemes and large numbers of antennas, where the conventional ML detector becomes impractical.
Keywords: Multi-Antenna (MA), Multi-Input Multi-Output (MIMO), Particle Swarm Optimization (PSO), ML detection.
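A rough sketch of PSO-based detection, assuming a small real-valued channel with BPSK symbols and the ML metric ||y - Hx||^2 as the swarm fitness; the abstract does not specify the PSO variant or modulation, so these choices are illustrative.

import numpy as np

def pso_mimo_detect(H, y, n_particles=30, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Approximate ML detection: minimize ||y - H x||^2 over BPSK symbols x in {-1,+1}^n.
    Particles move in continuous space and are quantized to symbols for evaluation."""
    rng = np.random.default_rng(seed)
    n = H.shape[1]
    metric = lambda x: float(np.sum((y - H @ x) ** 2))   # ML metric
    pos = rng.uniform(-1, 1, (n_particles, n))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([metric(np.sign(p) + (p == 0)) for p in pos])
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, n)), rng.random((n_particles, n))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -1, 1)
        for i in range(n_particles):
            f = metric(np.sign(pos[i]) + (pos[i] == 0))  # quantize to {-1, +1}
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i].copy(), f
                if f < gbest_f:
                    gbest, gbest_f = pos[i].copy(), f
    return np.sign(gbest) + (gbest == 0)                 # detected BPSK vector

# Toy usage: 4x4 real MIMO channel, compare against the transmitted symbols.
rng = np.random.default_rng(3)
H = rng.normal(size=(4, 4))
x_true = rng.choice([-1.0, 1.0], size=4)
y = H @ x_true + 0.05 * rng.normal(size=4)
print(pso_mimo_detect(H, y), x_true)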