Search results for: data contents search
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8326


8086 Instruction Resource Recommendation Services for Elementary Schools in Taiwan

Authors: Hong-Ren Chen, Fang-Yu Yeh

Abstract:

Most previous research on recommendation systems has focused on electronic commerce applications. However, as all sectors actively promote the integration of information technology into instruction, the number of instruction resource websites on the Internet keeps growing, yet few of them offer a recommendation service, especially one aimed at teachers. This study established an instruction resource recommendation website that analyzes a teacher's teaching style and then immediately provides appropriate instruction resources. A questionnaire survey was used to collect teachers' suggestions and their satisfaction with the instruction resource contents and the recommendation results. The study shows that: (1) the website used the "Transactional Ability Inventory" to identify a teacher's style and provide appropriate instruction resources in a short time, reducing the effort of filtering data; (2) according to the content-satisfaction part of the survey, teachers of all four styles were largely satisfied with the contents of the recommended instruction resources, so the concept of developing instruction resources for different teaching styles is supported; (3) according to the recommendation-satisfaction part of the survey, teachers of all four styles were largely satisfied with the recommendation service, so the strategy of providing different results to teachers with different teaching styles is supported.

Keywords: Instruction resource, recommendation service, teaching style.

8085 Employment Promotion and Its Role in Counteracting Unemployment during the Financial Crisis in the USA

Authors: Beata Wentura-Dudek

Abstract:

In the United States in 2007-2010, before the crisis, labour market policy focused mainly on providing residents with unemployment insurance; after the recession this policy changed. The aim of the article was to present quantitative research identifying the labour market instruments that contributed most effectively to reducing unemployment during the crisis in the USA. The article presents research based on the analysis of available documents and statistical data. The results show that the most effective forms of counteracting unemployment at that time were direct job creation, job search assistance, subsidized employment, training, and employment promotion using new technologies, including social media.

Keywords: United States, financial crisis, unemployment, employment promotion, social media, job creation, training, labour market, employment agencies, lifelong learning, job search assistance, subsidized employment, companies, tax.

8084 Soccer Video Edition Using a Multimodal Annotation

Authors: Fendri Emna, Ben-Abdallah Hanêne, Ben-Hamadou Abdelmajid

Abstract:

In this paper, we present an approach for soccer video editing using a multimodal annotation. We propose to associate with each video sequence of a soccer match a textual document to be used for further exploitation such as search, browsing, and abstract editing. The textual document contains video metadata, match metadata, and match data. This document, generated automatically while the video is analyzed, segmented, and classified, can be enriched semi-automatically according to the user type and/or a specialized recommendation system.

Keywords: XML, Multimodal Annotation, recommendation system.

8083 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is defined as a probabilistic graphical model that represents a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. The K2 algorithm is one of the representative data-driven structure-learning algorithms; it is based on a score-and-search approach. In addition, integrating the expert's knowledge into the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. A real application of our model demonstrates its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
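
To make the score-and-search idea concrete, the following is a minimal Python sketch of a greedy K2-style parent search over binary data with a fixed node ordering. The scoring function, the toy data, and the two-parent limit are illustrative assumptions; this is not the K2PC variant or the expert-driven extension described in the abstract.

```python
# Minimal sketch of a K2-style score-and-search structure learner (assumptions:
# binary variables, a fixed node ordering, and the classic K2 scoring function).
import numpy as np
from math import lgamma
from itertools import product

def k2_score(data, child, parents):
    """Log K2 score of `child` given a candidate `parents` list (binary data)."""
    r = 2  # number of states of the child variable
    score = 0.0
    for cfg in product([0, 1], repeat=len(parents)):
        mask = np.ones(len(data), dtype=bool)
        for p, v in zip(parents, cfg):
            mask &= data[:, p] == v
        counts = [np.sum(data[mask, child] == k) for k in range(r)]
        n_ij = sum(counts)
        score += lgamma(r) - lgamma(n_ij + r) + sum(lgamma(c + 1) for c in counts)
    return score

def k2_parents(data, order, max_parents=2):
    """Greedy K2 search: for each node, add the parent that most improves the score."""
    parents = {v: [] for v in order}
    for i, child in enumerate(order):
        candidates = set(order[:i])
        best = k2_score(data, child, parents[child])
        improved = True
        while improved and len(parents[child]) < max_parents:
            improved = False
            gains = [(k2_score(data, child, parents[child] + [c]), c) for c in candidates]
            if gains:
                new_best, best_c = max(gains)
                if new_best > best:
                    best, improved = new_best, True
                    parents[child].append(best_c)
                    candidates.remove(best_c)
    return parents

# toy usage: 3 binary variables where X2 = X0 OR X1
rng = np.random.default_rng(0)
x0, x1 = rng.integers(0, 2, 500), rng.integers(0, 2, 500)
data = np.column_stack([x0, x1, (x0 | x1)])
print(k2_parents(data, order=[0, 1, 2]))   # expected: node 2 gets parents {0, 1}
```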

Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.

8082 An Evaluation Model for Semantic Enablement of Virtual Research Environments

Authors: Tristan O'Neill, Trina Myers, Jarrod Trevathan

Abstract:

The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search to support the synthesis requirements by automating latent linkages in the data and metadata. Presently, the benchmarks to aid the decision of which triplestore is best suited for use in an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix to evaluate the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores to rank them according to requirements of the TDH.
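
As an illustration of how a weighted decision matrix ranks candidates, here is a minimal Python sketch. The criteria weights, candidate names, and 0-10 scores are hypothetical placeholders, not the actual figures from the TDH evaluation.

```python
# Minimal sketch of a weighted decision matrix for ranking triplestores
# (all weights and scores below are illustrative placeholders).
criteria = {"interoperability": 0.30, "functionality": 0.25,
            "performance": 0.30, "support availability": 0.15}

# hypothetical candidate scores per criterion, on a 0-10 scale
candidates = {
    "Triplestore A": {"interoperability": 8, "functionality": 7,
                      "performance": 6, "support availability": 9},
    "Triplestore B": {"interoperability": 6, "functionality": 9,
                      "performance": 8, "support availability": 5},
}

def weighted_total(scores, weights):
    """Weighted sum of criterion scores; higher is better."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(candidates, key=lambda n: weighted_total(candidates[n], criteria),
                 reverse=True)
for name in ranking:
    print(name, round(weighted_total(candidates[name], criteria), 2))
```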

Keywords: Virtual research environment, Semantic Web, performance analysis, tropical data hub.

8081 Geochemistry of Tektites from Hainan Island and Northeast Thailand

Authors: Yung-Tan Lee, Ren-Yi Huang, Ju-Chin Chen, Jyh-Yi Shih, Wen-Feng Chang, Yen-Tsui Hu, Chih-Cheng Chen

Abstract:

Twenty-seven tektites from the Wenchang area, Hainan province (south China), and five tektites from the Khon Kaen area (northeast Thailand) were analyzed for major and trace element contents and Rb-Sr isotopic compositions. All the samples studied are splash-form tektites. The tektites of this study are characterized by high SiO2 contents, ranging from 71.95 to 74.07 wt%, which is consistent with previously published analyses of Australasian tektites. The trace element ratios Ba/Rb (avg. 3.89), Th/Sm (avg. 2.40), Sm/Sc (avg. 0.45), and Th/Sc (avg. 0.99) and the rare earth element (REE) contents of the tektites are similar to those of the average upper continental crust. Based on the chemical composition, it is suggested that the tektites in this study are derived from similar parental material and resemble post-Archean upper crustal rocks. The major and trace element abundances indicate that the parental material of the tektites may be a terrestrial sedimentary deposit. The tektites from the Wenchang area, Hainan Island, have high positive εSr(0) values ranging from 184.5 to 196.5, which indicates that their parental material had Sr isotopic compositions similar to old terrestrial sedimentary rocks and was not dominantly derived from recent young sediments (such as soil or loess). Based on Rb-Sr isotopic data, Blum (1992) [1] suggested that the depositional age of the sedimentary target materials is close to 170 Ma (Jurassic). According to the model suggested by Ho and Chen (1996) [2], mixing calculations for various amounts and combinations of target rocks have been carried out. We consider that the best fit for the tektites from the Wenchang area is a mixture of 47% shale, 23% sandstone, 25% greywacke, and 5% quartzite, while the best fit for the tektites from the Khon Kaen area is a mixture of 46% shale, 2% sandstone, 20% greywacke, and 32% quartzite.
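
A mixing calculation of this kind can be set up as a constrained least-squares problem: find non-negative end-member fractions, summing to one, whose weighted composition best matches the tektite composition. The sketch below illustrates the idea in Python with entirely hypothetical end-member and tektite compositions; it does not reproduce the Ho and Chen (1996) model or the paper's data.

```python
# Minimal sketch of a mixing calculation via non-negative least squares.
# All compositions below are hypothetical placeholders, not data from the paper.
import numpy as np
from scipy.optimize import nnls

# columns after transpose: shale, sandstone, greywacke, quartzite
# rows: hypothetical SiO2, Al2O3, FeO, K2O contents (wt%) of each end member
end_members = np.array([
    [62.0, 18.0, 6.5, 3.5],   # shale
    [78.0, 10.0, 3.0, 2.0],   # sandstone
    [68.0, 14.0, 5.0, 2.5],   # greywacke
    [95.0,  2.0, 0.5, 0.5],   # quartzite
]).T

tektite = np.array([73.0, 12.5, 4.5, 2.4])   # hypothetical tektite composition

# enforce that the fractions sum to 1 by appending a heavily weighted constraint row
w = 1e3
A = np.vstack([end_members, w * np.ones(end_members.shape[1])])
b = np.append(tektite, w)

fractions, _ = nnls(A, b)
for name, f in zip(["shale", "sandstone", "greywacke", "quartzite"], fractions):
    print(f"{name}: {100 * f:.1f}%")
```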

Keywords: Geochemistry, Hainan Island, Northeast Thailand, Tektites.

8080 Modeling of Nitrogen Solubility in Stainless Steel

Authors: Saeed Ghali, Hoda El-Faramawy, Mamdouh Eissa, Michael Mishreky

Abstract:

Scale-resistant austenitic stainless steel, X45CrNiW 18-9, has been developed, and modified steels were produced through partial and total replacement of nickel by nitrogen. These modified steels were produced in a 10 kg induction furnace under different nitrogen pressures and were cast into ingots. The produced modified stainless steels were forged, followed by air cooling. The phases of the modified stainless steels have been investigated using the Schaeffler diagram, a dilatometer, and microstructure observations. Both partial and total replacement of nickel using 0.33-0.50% nitrogen is effective in producing fully austenitic stainless steels. The nitrogen contents were determined and compared with those calculated using the Institute of Metal Science (IMS) equation. The results showed large deviations between the actual nitrogen contents and the values predicted by the IMS equation. Therefore, an equation has been derived based on chemical composition, pressure, and temperature at 1600 °C: [N%] = 0.0078 + 0.0406*X, where X is a function of chemical composition and nitrogen pressure. The derived equation has been used to calculate the nitrogen content of different steels using published data. The results reveal the difficulty of deriving a general equation for the prediction of nitrogen content covering different steel compositions, so it is necessary to use a narrow composition range.

Keywords: Solubility, nitrogen, stainless steel, Schaeffler.

8079 Using Interval Trees for Approximate Indexing of Instances

Authors: Khalil el Hindi

Abstract:

This paper presents a simple and effective method for approximate indexing of instances for instance-based learning. The method uses an interval tree to determine a good starting search point for the nearest neighbor. The search stops when an early stopping criterion is met. The method proved to be very effective, especially when only the first nearest neighbor is required.
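
The idea of starting the search near the query and stopping early can be illustrated with a short Python sketch. Here a sorted first coordinate plays the role of the interval tree, and the early-stopping test uses the 1-D gap as a lower bound on the full distance; this is a simplified, hypothetical analogue, not the paper's exact algorithm.

```python
# Minimal sketch of approximate nearest-neighbour search with a good starting
# point and an early-stopping test (simplified stand-in for the interval tree).
import bisect
import math
import random

def build_index(points):
    """Sort instances by their first coordinate."""
    return sorted(points, key=lambda p: p[0])

def nearest(index, query):
    """Scan outwards from the insertion point; stop once the 1-D gap on the
    sorted coordinate exceeds the best full distance found so far (a valid
    lower bound on the Euclidean distance)."""
    start = bisect.bisect_left([p[0] for p in index], query[0])
    best, best_d = None, math.inf
    left, right = start - 1, start
    while left >= 0 or right < len(index):
        gap_l = query[0] - index[left][0] if left >= 0 else math.inf
        gap_r = index[right][0] - query[0] if right < len(index) else math.inf
        if min(gap_l, gap_r) >= best_d:
            break  # early stop: nothing closer can remain
        if gap_l <= gap_r:
            cand, left = index[left], left - 1
        else:
            cand, right = index[right], right + 1
        d = math.dist(cand, query)
        if d < best_d:
            best, best_d = cand, d
    return best

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(1000)]
idx = build_index(pts)
print(nearest(idx, (0.5, 0.5)))
```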

Keywords: Instance-based learning, interval trees, k-NN algorithm, machine learning.

8078 Markov Chain Monte Carlo Model Composition Search Strategy for Quantitative Trait Loci in a Bayesian Hierarchical Model

Authors: Susan J. Simmons, Fang Fang, Qijun Fang, Karl Ricanek

Abstract:

Quantitative trait loci (QTL) experiments have yielded important biological and biochemical information necessary for understanding the relationship between genetic markers and quantitative traits. For many years, most QTL algorithms only allowed one observation per genotype. Recently, there has been an increasing demand for QTL algorithms that can accommodate more than one observation per genotypic distribution. The Bayesian hierarchical model is very flexible and can easily incorporate this information into the model. Herein, a methodology is presented that uses a Bayesian hierarchical model to capture the complexity of the data. Furthermore, the Markov chain Monte Carlo model composition (MC3) algorithm is used to search for and identify important markers. An extensive simulation study illustrates that the method captures the true QTL, even under non-normal noise and with up to 6 QTL.
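
The MC3 search itself can be sketched in a few lines: propose toggling a marker in or out of the current model and accept the move with a Metropolis rule on the model score. In the sketch below a BIC-based approximation stands in for the Bayesian hierarchical model's posterior, and the simulated data with two true QTL are purely illustrative.

```python
# Minimal sketch of an MC3-style search over marker subsets; -BIC/2 is used as
# a stand-in for the log model posterior of the hierarchical model.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 30                            # observations, candidate markers
X = rng.integers(0, 2, size=(n, m)).astype(float)
beta_true = np.zeros(m); beta_true[[3, 17]] = 2.0
y = X @ beta_true + rng.normal(0, 1, n)   # two true QTL at markers 3 and 17

def neg_half_bic(subset):
    """Higher is better; -BIC/2 approximates the log marginal likelihood."""
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    sigma2 = max(resid @ resid / n, 1e-12)
    return -0.5 * (n * np.log(sigma2) + A.shape[1] * np.log(n))

current = set()
score = neg_half_bic(current)
visit_counts = np.zeros(m)
for _ in range(5000):
    j = rng.integers(m)
    proposal = current ^ {j}                       # add or drop marker j
    new_score = neg_half_bic(proposal)
    if np.log(rng.random()) < new_score - score:   # Metropolis acceptance
        current, score = proposal, new_score
    for k in current:
        visit_counts[k] += 1

print("most visited markers:", np.argsort(visit_counts)[-3:])
```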

Keywords: Bayesian hierarchical model, Markov chain Monte Carlo model composition, quantitative trait loci.

8077 Enhanced Planar Pattern Tracking for an Outdoor Augmented Reality System

Authors: L. Yu, W. K. Li, S. K. Ong, A. Y. C. Nee

Abstract:

In this paper, a scalable augmented reality framework for handheld devices is presented. The framework is enabled by a server-client data communication structure, in which the search for tracking targets among a database of images is performed on the server side while pixel-wise 3D tracking is performed on the client side, which, in this case, is a handheld mobile device. Image search on the server side adopts a residual-enhanced image descriptor representation that gives the framework its scalability. The tracking algorithm on the client side is based on a gravity-aligned feature descriptor, which takes advantage of a sensor-equipped mobile device, and an optimized intensity-based image alignment approach that ensures the accuracy of 3D tracking. Automatic content streaming is achieved by using a key-frame selection algorithm, client working-phase monitoring, and standardized rules for content communication between the server and client. The recognition accuracy test performed on a standard dataset shows that the method adopted in the presented framework outperforms the Bag-of-Words (BoW) method used in some previous systems. Experimental tests conducted on a set of video sequences indicated real-time performance of the tracking system, with a frame rate of 15-30 frames per second. The framework is shown to be functional in practical situations with a demonstration application on a campus walk-around.

Keywords: Augmented reality framework, server-client model, vision-based tracking, image search.

8076 Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud

Authors: Maha Shamseddine, Amjad Nusayr, Wassim Itani

Abstract:

In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied to user data when stored and processed in Cloud data centers. The proposed architecture consists of establishing user-configurable 'sticky' policies on the Graphical User Interface (GUI) data-bound components during the application development phase to specify the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control over the degree and scope of privacy enforcement, including the type of execution containers used to process the data in the Cloud. This not only enhances the privacy awareness of the developed Cloud services, but also results in major savings in performance and energy efficiency, because the privacy mechanisms are applied only to sensitive data units rather than to all user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.
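
As a rough illustration of the sticky-policy idea, the Python sketch below binds a privacy class and a target execution container to each data-bound component and enforces protection only where the attached policy requires it. The class names, privacy levels, and enforcement actions are hypothetical and do not reflect the actual PaaS API built on Azure.

```python
# Minimal sketch of "sticky" privacy policies bound to GUI data components
# (hypothetical class names and privacy levels, for illustration only).
from dataclasses import dataclass
from enum import Enum

class PrivacyClass(Enum):
    PUBLIC = 0          # no enforcement
    CONFIDENTIAL = 1    # encrypt at rest
    SENSITIVE = 2       # encrypt at rest and process in an isolated container

@dataclass
class StickyPolicy:
    privacy_class: PrivacyClass
    container: str      # e.g. "shared" or "isolated"

@dataclass
class DataBoundComponent:
    name: str
    value: str
    policy: StickyPolicy

def enforce(component: DataBoundComponent) -> str:
    """Apply enforcement only where the attached policy requires it."""
    if component.policy.privacy_class is PrivacyClass.PUBLIC:
        return f"store({component.name}) as plain text"
    return (f"encrypt({component.name}) and route to "
            f"{component.policy.container} container")

form = [
    DataBoundComponent("nickname", "jdoe", StickyPolicy(PrivacyClass.PUBLIC, "shared")),
    DataBoundComponent("credit_card", "4111...", StickyPolicy(PrivacyClass.SENSITIVE, "isolated")),
]
for c in form:
    print(enforce(c))
```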

Keywords: Privacy enforcement, Platform-as-a-Service privacy awareness, cloud computing privacy.

8075 An Improved Conjugate Gradient Based Learning Algorithm for Back Propagation Neural Networks

Authors: N. M. Nawi, R. S. Ransing, M. R. Ransing

Abstract:

The conjugate gradient optimization algorithm is combined with the modified back propagation algorithm to yield a computationally efficient algorithm for training multilayer perceptron (MLP) networks (CGFR/AG). The computational efficiency is enhanced by adaptively modifying the initial search direction as described in the following steps: (1) modifying the standard back propagation algorithm by introducing a gain variation term in the activation function, (2) calculating the gradient of the error with respect to the weight and gain values, and (3) determining a new search direction using the information calculated in step (2). The performance of the proposed method is demonstrated by comparing its accuracy and computation time with those of the conjugate gradient algorithm used in the MATLAB neural network toolbox. The results show that the computational efficiency of the proposed method is better than that of the standard conjugate gradient algorithm.
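
The interplay of a gain term in the activation and a conjugate search direction can be sketched compactly. The Python example below trains a single sigmoid unit (standing in for the MLP) whose activation has a trainable gain, and builds each new search direction with a Polak-Ribiere conjugate-gradient update; the toy data, the fixed step size in place of a line search, and the single-unit model are illustrative simplifications of the paper's CGFR/AG method.

```python
# Minimal sketch of conjugate-gradient training with a gain term in the
# activation function (single sigmoid unit, illustrative settings).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
t = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # toy binary targets

def loss_and_grad(params):
    w, gain = params[:2], params[2]
    a = X @ w
    y = 1.0 / (1.0 + np.exp(-gain * a))           # sigmoid with trainable gain
    err = y - t
    loss = 0.5 * np.mean(err ** 2)
    delta = err * y * (1 - y) / len(t)            # dE/d(gain * pre-activation)
    grad_w = X.T @ (delta * gain)                 # gradient w.r.t. the weights
    grad_gain = np.sum(delta * a)                 # gradient w.r.t. the gain
    return loss, np.concatenate([grad_w, [grad_gain]])

params = np.array([0.1, -0.1, 1.0])
loss, g = loss_and_grad(params)
d = -g                                            # first direction: steepest descent
for epoch in range(200):
    params = params + 0.5 * d                     # fixed step instead of a line search
    loss, g_new = loss_and_grad(params)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere coefficient
    d = -g_new + beta * d                         # new conjugate search direction
    g = g_new
print("final loss:", round(loss, 4))
```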

Keywords: Adaptive gain variation, back-propagation, activation function, conjugate gradient, search direction.

8074 String Searching in Dispersed Files using MDS Convolutional Codes

Authors: A. S. Poornima, R. Aparna, B. B. Amberker, Prashant Koulgi

Abstract:

In this paper, we propose the use of convolutional codes for file dispersal. The proposed method is comparable in complexity to the Information Dispersal Algorithm proposed by M. Rabin and, for particular choices of (non-binary) convolutional codes, is almost as efficient as that algorithm in terms of controlling the expansion in total storage. Furthermore, our proposed dispersal method allows string search.

Keywords: Convolutional codes, file dispersal, file reconstruction, Information Dispersal Algorithm, string search.

8073 Multiple Power Flow Solutions Using Particle Swarm Optimization with Embedded Local Search Technique

Authors: P. Acharjee, S. K. Goswami

Abstract:

Particle Swarm Optimization (PSO) with elite PSO parameters has been developed for power flow analysis under practical constrained situations. Multiple solutions of the power flow problem are useful in voltage stability assessment of a power system. A method for determining multiple power flow solutions is presented using a hybrid of Particle Swarm Optimization (PSO) and a local search technique. The learning factors of the PSO algorithm are formulated depending upon the node power mismatch values, making the algorithm highly adaptive to power flow problems. The local search is applied to the pbest solution obtained by the PSO algorithm in each iteration. The proposed algorithm performs reliably and provides multiple solutions when applied to standard and ill-conditioned systems. The test results show that the performance of the proposed algorithm under critical conditions is better than that of conventional methods.
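
The hybrid structure, a PSO outer loop with a local search refining each particle's personal best, can be sketched as follows. The Rosenbrock function stands in for the power-mismatch objective, and the inertia and learning factors are generic fixed values rather than the adaptive, mismatch-dependent ones proposed in the paper.

```python
# Minimal sketch of PSO with a local search applied to each particle's pbest.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # stand-in for the power mismatch function
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def local_search(x, f, step=0.05, trials=10):
    """Small random perturbations around x; keep the best."""
    best_x, best_f = x, f
    for _ in range(trials):
        cand = best_x + rng.normal(0, step, size=x.shape)
        cf = objective(cand)
        if cf < best_f:
            best_x, best_f = cand, cf
    return best_x, best_f

n, dim = 20, 2
pos = rng.uniform(-2, 2, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_f = np.array([objective(p) for p in pos])
g = pbest[np.argmin(pbest_f)].copy()

for it in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    for i in range(n):
        f = objective(pos[i])
        if f < pbest_f[i]:
            pbest[i], pbest_f[i] = pos[i], f
        # refine the personal best with the embedded local search
        pbest[i], pbest_f[i] = local_search(pbest[i], pbest_f[i])
    g = pbest[np.argmin(pbest_f)].copy()

print("best point:", np.round(g, 3), "objective:", round(float(min(pbest_f)), 6))
```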

Keywords: Critical conditions, ill-conditioned systems, local search technique, multiple power flow solutions, particle swarm optimization.

8072 A Retrievable Genetic Algorithm for Efficient Solving of Sudoku Puzzles

Authors: Seyed Mehran Kazemi, Bahare Fatemi

Abstract:

Sudoku is a logic-based combinatorial puzzle game which is popular among people of different ages. Due to this popularity, computer programs are being developed to generate and solve Sudoku puzzles with different levels of difficulty. Several methods and algorithms have been proposed and used in different programs to efficiently solve Sudoku puzzles. Various search methods, such as stochastic local search, have been applied to this problem. The Genetic Algorithm (GA) is one of the algorithms that has been applied to this problem in different forms in several works in the literature. In these works, chromosomes with little or no information were considered, and the obtained results were not promising. In this paper, we propose a new way of applying a GA to this problem which uses more informed chromosomes than other works in the literature. We optimize the parameters of our GA using puzzles with different levels of difficulty. We then use the optimized parameter values to solve various puzzles and compare our results to another GA-based method for solving Sudoku puzzles.
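
For reference, here is a minimal GA for Sudoku in Python in which each row of a chromosome is always a permutation of 1-9 that respects the givens, and fitness counts duplicates in columns and 3x3 blocks. The sample puzzle, operator rates, and population size are illustrative; the chromosome encoding is deliberately simple and not the more informed representation proposed in the paper.

```python
# Minimal sketch of a GA for Sudoku with row-permutation chromosomes.
import random

random.seed(0)
PUZZLE = [
    [5,3,0,0,7,0,0,0,0],[6,0,0,1,9,5,0,0,0],[0,9,8,0,0,0,0,6,0],
    [8,0,0,0,6,0,0,0,3],[4,0,0,8,0,3,0,0,1],[7,0,0,0,2,0,0,0,6],
    [0,6,0,0,0,0,2,8,0],[0,0,0,4,1,9,0,0,5],[0,0,0,0,8,0,0,7,9],
]

def random_individual():
    """Fill the blanks of every row with a shuffled permutation of the missing digits."""
    grid = []
    for row in PUZZLE:
        missing = [d for d in range(1, 10) if d not in row]
        random.shuffle(missing)
        it = iter(missing)
        grid.append([c if c else next(it) for c in row])
    return grid

def conflicts(grid):
    """Number of duplicated digits over all columns and 3x3 blocks (0 = solved)."""
    total = 0
    for c in range(9):
        total += 9 - len({grid[r][c] for r in range(9)})
    for br in range(0, 9, 3):
        for bc in range(0, 9, 3):
            total += 9 - len({grid[br+i][bc+j] for i in range(3) for j in range(3)})
    return total

def crossover(a, b):
    """Child takes each row from one of the two parents."""
    return [random.choice((a[r], b[r]))[:] for r in range(9)]

def mutate(grid, rate=0.3):
    """Swap two non-given cells inside a random row."""
    if random.random() < rate:
        r = random.randrange(9)
        free = [c for c in range(9) if PUZZLE[r][c] == 0]
        if len(free) >= 2:
            i, j = random.sample(free, 2)
            grid[r][i], grid[r][j] = grid[r][j], grid[r][i]
    return grid

pop = [random_individual() for _ in range(150)]
for gen in range(300):
    pop.sort(key=conflicts)
    if conflicts(pop[0]) == 0:
        break
    survivors = pop[:50]
    pop = survivors + [mutate(crossover(*random.sample(survivors, 2)))
                       for _ in range(100)]
print("best conflict count after search:", conflicts(min(pop, key=conflicts)))
```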

Keywords: Genetic algorithm, optimization, solving Sudoku puzzles, stochastic local search.

8071 Music-Inspired Harmony Search Algorithm for Fixed Outline Non-Slicing VLSI Floorplanning

Authors: K. Sivasubramanian, K. B. Jayanthi

Abstract:

Floorplanning plays a vital role in the physical design process of Very Large Scale Integrated (VLSI) chips. It is an essential design step for estimating the chip area prior to the optimized placement of digital blocks and their interconnections. Since VLSI floorplanning is an NP-hard problem, many optimization techniques have been adopted in the literature. In this work, a music-inspired Harmony Search (HS) algorithm is used for fixed die-outline constrained floorplanning, with the aim of reducing the total chip area. HS draws inspiration from the musical improvisation process of searching for a perfect state of harmony. Initially, a B*-tree is used to generate the primary floorplan for the given rectangular hard modules, and then the HS algorithm is applied to obtain an optimal solution for an efficient floorplan. Experimental results for the HS algorithm are obtained for the MCNC benchmark circuits.
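
The core Harmony Search loop — memory consideration, pitch adjustment, and random improvisation — is shown below on a continuous toy objective. In the paper, HS operates on B*-tree floorplan encodings and minimizes chip area; the sphere function and the HMCR/PAR/bandwidth settings here are placeholder assumptions for illustration.

```python
# Minimal sketch of the Harmony Search metaheuristic on a toy objective.
import random

random.seed(0)
DIM, HMS = 5, 20                    # problem size and harmony memory size
HMCR, PAR, BW = 0.9, 0.3, 0.05      # memory considering rate, pitch adjust rate, bandwidth

def cost(x):                        # stand-in for chip area of a decoded floorplan
    return sum(v * v for v in x)

memory = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(HMS)]
for _ in range(5000):
    new = []
    for d in range(DIM):
        if random.random() < HMCR:                       # take a value from memory...
            v = random.choice(memory)[d]
            if random.random() < PAR:                    # ...and possibly pitch-adjust it
                v += random.uniform(-BW, BW)
        else:                                            # or improvise a fresh value
            v = random.uniform(-1, 1)
        new.append(v)
    worst = max(range(HMS), key=lambda i: cost(memory[i]))
    if cost(new) < cost(memory[worst]):                  # replace the worst harmony
        memory[worst] = new
best = min(memory, key=cost)
print("best cost:", round(cost(best), 6))
```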

Keywords: Floorplanning, harmony search, non-slicing floorplan, very large scale integrated circuits.

8070 A Decision Matrix for the Evaluation of Triplestores for Use in a Virtual Research Environment

Authors: Tristan O’Neill, Trina Myers, Jarrod Trevathan

Abstract:

The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search to support the synthesis requirements by automating latent linkages in the data and metadata. Presently, the benchmarks to aid the decision of which triplestore is best suited for use in an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix to evaluate the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores to rank them according to requirements of the TDH.

Keywords: Virtual research environment, Semantic Web, performance analysis, tropical data hub.

8069 GSA-Based Design of Dual Proportional Integral Load Frequency Controllers for Nonlinear Hydrothermal Power System

Authors: M. Elsisi, M. Soliman, M. A. S. Aboelela, W. Mansour

Abstract:

This paper considers the design of a Dual Proportional-Integral (DPI) Load Frequency Control (LFC) scheme using the gravitational search algorithm (GSA). The design is carried out for a nonlinear hydrothermal power system in which the generation rate constraint (GRC) and governor dead band are considered. Furthermore, time delays imposed by the governor-turbine, the thermodynamic process, and communication channels are investigated. GSA is utilized to search for optimal controller parameters by minimizing a time-domain based objective function. The GSA-based DPI has been compared to a Ziegler-Nichols based PI and a Genetic Algorithm (GA) based PI controller in order to demonstrate the superior efficiency of the proposed design. Simulations are carried out for a wide range of operating conditions and system parameter variations.
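
To show what the GSA search step looks like, the Python sketch below runs the standard mass, force, and velocity updates on a simple sphere objective. In the paper the decision variables are the dual PI controller gains and the objective is a time-domain LFC cost; the test function, population size, and decay schedule for the gravitational constant used here are illustrative assumptions.

```python
# Minimal sketch of the Gravitational Search Algorithm on a toy objective.
import numpy as np

rng = np.random.default_rng(0)
N, DIM, ITERS, G0 = 30, 4, 300, 100.0

def fitness(x):                       # stand-in for the time-domain LFC cost
    return np.sum(x * x, axis=-1)

pos = rng.uniform(-5, 5, (N, DIM))
vel = np.zeros((N, DIM))
for t in range(ITERS):
    fit = fitness(pos)
    best, worst = fit.min(), fit.max()
    # masses: better agents are heavier (normalised to sum to 1)
    m = (worst - fit) / (worst - best + 1e-12)
    M = m / (m.sum() + 1e-12)
    G = G0 * np.exp(-20 * t / ITERS)              # gravitational constant decays over time
    acc = np.zeros((N, DIM))
    for i in range(N):
        for j in range(N):
            if i != j:
                diff = pos[j] - pos[i]
                dist = np.linalg.norm(diff) + 1e-12
                acc[i] += rng.random() * G * M[j] * diff / dist
    vel = rng.random((N, DIM)) * vel + acc
    pos = pos + vel
print("best solution:", np.round(pos[np.argmin(fitness(pos))], 3))
```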

Keywords: Gravitational Search Algorithm (GSA), Load Frequency Control (LFC), Dual Proportional-Integral (DPI) controller.

8068 A Genetic Algorithm with Priority Selection for the Traveling Salesman Problem

Authors: Cha-Hwa Lin, Je-Wei Hu

Abstract:

The conventional GA combined with a local search algorithm, such as 2-OPT, forms a hybrid genetic algorithm (HGA) for the traveling salesman problem (TSP). However, geometric properties, which are problem-specific knowledge, can be used to improve the search process of the HGA. Some tour segments (edges) of a TSP are fine, while others may be too long to appear in a short tour. This knowledge can constrain the GA to work with fine tour segments and consider long tour segments less often. Consequently, a new algorithm, called the intelligent-OPT hybrid genetic algorithm (IOHGA), is proposed to improve the GA and the 2-OPT algorithm in order to reduce the search time for the optimal solution. Based on the geometric properties, all tour segments are assigned 2-level priorities to distinguish between good and bad genes. A simulation study was conducted to evaluate the performance of the IOHGA. The experimental results indicate that, in general, the IOHGA obtains near-optimal solutions in less time and with better accuracy than the hybrid genetic algorithm with simulated annealing (HGA(SA)).
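
The 2-OPT local search used inside such hybrids is easy to sketch: reverse a tour segment whenever swapping the two affected edges shortens the tour. The random cities and the first-improvement strategy below are illustrative choices; the priority-based edge filtering of the IOHGA is not included.

```python
# Minimal sketch of the 2-OPT local improvement step for the TSP.
import math
import random

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(30)]

def dist(a, b):
    return math.dist(cities[a], cities[b])

def tour_length(tour):
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def two_opt(tour):
    """Repeatedly reverse segments while an improving move exists."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % len(tour)]
                # gain of replacing edges (a,b) and (c,d) by (a,c) and (b,d)
                if dist(a, c) + dist(b, d) < dist(a, b) + dist(c, d) - 1e-12:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

tour = list(range(len(cities)))
random.shuffle(tour)
print("before:", round(tour_length(tour), 3))
print("after :", round(tour_length(two_opt(tour)), 3))
```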

Keywords: Traveling salesman problem, hybrid genetic algorithm, priority selection, 2-OPT.

8067 Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

Authors: R. Anita, V. Ganga Bharani, N. Nityanandam, Pradeep Kumar Sahoo

Abstract:

The explosive growth of the World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of deep web pages makes it impossible for traditional web crawlers to access deep web contents. This paper presents Deep iCrawl, a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first phase includes query analysis and query translation, and the second covers vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. This paper also aims at comparing the data items and presenting them in the required order.

Keywords: Crawler, Deep web, Web Database

8066 Prediction of Compressive Strength of SCC Containing Bottom Ash using Artificial Neural Networks

Authors: Yogesh Aggarwal, Paratibha Aggarwal

Abstract:

The paper presents a comparative performance study of models developed to predict 28-day compressive strength using neural network techniques, for data taken from the literature (ANN-I) and data developed experimentally for SCC containing bottom ash as partial replacement of fine aggregates (ANN-II). The data used in the models are arranged as six and eight input parameters, respectively, covering the contents of cement, sand, coarse aggregate, fly ash as partial replacement of cement, bottom ash as partial replacement of sand, water, water/powder ratio, and superplasticizer dosage; the output parameter is the 28-day compressive strength for ANN-I and the compressive strengths at 7, 28, 90, and 365 days for ANN-II. The importance of the different input parameters for predicting the strengths at various ages using the neural network is also given. The model developed from literature data could easily be extended to the experimental data, with bottom ash as partial replacement of sand, with some modifications.
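
A minimal version of such a strength predictor is sketched below: mix proportions in, 28-day strength out, in the spirit of ANN-I. The synthetic mix data, the hypothetical strength rule used to generate toy targets, and the network size are all placeholder assumptions; the paper's models are trained on literature and laboratory data.

```python
# Minimal sketch of a neural-network compressive strength predictor
# (synthetic placeholder data, not the paper's datasets).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# columns: cement, sand, coarse aggregate, fly ash, bottom ash, water (kg/m3)
X = rng.uniform([300, 700, 800, 0, 0, 150], [500, 900, 1000, 150, 100, 200], (200, 6))
# hypothetical rule of thumb used only to generate toy target strengths (MPa)
strength = 0.11 * X[:, 0] + 0.05 * X[:, 3] - 0.15 * X[:, 5] + rng.normal(0, 2, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X, strength)

mix = np.array([[400, 800, 900, 80, 50, 175]])   # a hypothetical trial mix
print("predicted 28-day strength (MPa):", round(float(model.predict(mix)[0]), 1))
```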

Keywords: Self compacting concrete, bottom ash, strength, prediction, neural network, importance factor.

8065 Solving Process Planning, Weighted Earliest Due Date Scheduling and Weighted Due Date Assignment Using Simulated Annealing and Evolutionary Strategies

Authors: Halil Ibrahim Demir, Abdullah Hulusi Kokcam, Fuat Simsir, Özer Uygun

Abstract:

Traditionally, three important manufacturing functions, namely process planning, scheduling, and due-date assignment, are performed sequentially and separately. Although there are numerous works on the integration of process planning and scheduling, and plenty of works focusing on scheduling with due-date assignment, there are only a few works on integrated process planning, scheduling, and due-date assignment. In the literature, due dates are usually determined without taking the weights of the customers into account; here, weighted due-date assignment is employed to obtain better performance. Jobs are scheduled according to the weighted earliest due date dispatching rule, and due dates are determined according to some popular due-date assignment methods, taking the weight of each job into account. Simulated Annealing, Evolutionary Strategies, Random Search, a hybrid of Random Search and Simulated Annealing, and a hybrid of Random Search and Evolutionary Strategies are applied as solution techniques. The three manufacturing functions are integrated step by step, and higher integration levels are found to be better. Search meta-heuristics are found to be very useful in improving the performance measure.
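
The two building blocks — a due-date assignment rule and weighted earliest-due-date dispatching — are illustrated in the short Python sketch below. The job data, the TWK-style due-date multiplier, the due-date/weight sorting key, and the weighted-tardiness cost are illustrative assumptions, not the integrated SA/ES formulation evaluated in the paper.

```python
# Minimal sketch of weighted-EDD dispatching with a TWK-style due-date rule.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    processing_time: float
    weight: float          # customer importance
    due_date: float = 0.0

jobs = [Job("J1", 4, 3), Job("J2", 2, 1), Job("J3", 6, 5), Job("J4", 3, 2)]

# TWK due-date assignment: due date proportional to total work content
k = 2.0
for job in jobs:
    job.due_date = k * job.processing_time

# weighted earliest-due-date dispatching: earlier due dates and heavier weights first
sequence = sorted(jobs, key=lambda j: j.due_date / j.weight)

clock, total_cost = 0.0, 0.0
for job in sequence:
    clock += job.processing_time
    tardiness = max(0.0, clock - job.due_date)
    total_cost += job.weight * tardiness
    print(f"{job.name}: finishes {clock}, due {job.due_date}, "
          f"weighted tardiness {job.weight * tardiness}")
print("total weighted tardiness:", total_cost)
```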

Keywords: Evolutionary strategies, hybrid searches, process planning, simulated annealing, weighted due-date assignment, weighted scheduling.

8064 Restartings: A Technique to Improve Classic Genetic Algorithms Performance

Authors: Grigorios N. Beligiannis, Georgios A. Tsirogiannis, Panayotis E. Pintelas

Abstract:

In this contribution, a way to enhance the performance of the classic Genetic Algorithm is proposed. The idea of restarting a Genetic Algorithm is applied in order to obtain better knowledge of the solution space of the problem. A new 'insertion' operator is introduced so as to exploit the information that has already been collected before the restarting procedure. Finally, numerical experiments comparing the performance of the classic Genetic Algorithm and the Genetic Algorithm with restartings, for some well-known test functions, are given.
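
A compact sketch of the restart-with-insertion idea is given below: after each run, the best individuals found so far are archived and re-inserted into the fresh population of the next run. The one-max objective and all operator rates are placeholders; the paper's test functions and exact insertion operator are not reproduced.

```python
# Minimal sketch of a GA with restartings and an 'insertion' of archived elites.
import random

random.seed(0)
BITS, POP, GENS, RESTARTS = 40, 60, 50, 4

def fitness(ind):                     # one-max: count of ones
    return sum(ind)

def evolve(population):
    for _ in range(GENS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, BITS)
            child = a[:cut] + b[cut:]                       # one-point crossover
            if random.random() < 0.3:                       # bit-flip mutation
                i = random.randrange(BITS)
                child[i] ^= 1
            children.append(child)
        population = parents + children
    return sorted(population, key=fitness, reverse=True)

archive = []                                                # best individuals across runs
for restart in range(RESTARTS):
    population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
    # 'insertion': seed the new run with knowledge collected before the restart
    for elite in archive[:5]:
        population[random.randrange(POP)] = elite[:]
    population = evolve(population)
    archive = sorted(archive + population[:5], key=fitness, reverse=True)[:5]
    print(f"restart {restart}: best fitness {fitness(archive[0])}")
```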

Keywords: Genetic Algorithms, Restartings, Search space exploration, Search space exploitation.

8063 On the Continuous Service of Distributed e-Learning System

Authors: Kazunari Meguro, Shinichi Motomura, Takao Kawamura, Kazunori Sugahara

Abstract:

In this paper, a backup and recovery technique is presented for Peer-to-Peer applications, such as the distributed asynchronous Web-Based Training system that we have previously proposed. In order to improve the scalability and robustness of this system, all contents and functions are realized as mobile agents. These agents are distributed across computers and can be located using a Peer-to-Peer network based on a modified Content-Addressable Network. In the proposed system, the entire service does not become unavailable even if some computers break down, but contents disappear when an agent disappears. As a solution to this issue, backups of agents are distributed to other computers. If a failure of a computer is detected, other computers continue the service using the backups of the agents that belonged to that computer.

Keywords: Distributed Multimedia Systems, e-Learning, P2P, Mobile Agent

8062 Optimizing PelletPAVE™ Rubberized Asphalt Mix Design Using Gyratory Compaction and Volumetrics

Authors: H. Al-Baghli

Abstract:

In this investigation, rubberized HMA technology was examined to address the most critical forms of pavement distress in the State of Kuwait, namely high-temperature rutting and moisture-induced raveling. The PelletPAVE™ additive was selected as the preferred technology, since it offered a convenient method of directly modifying the existing local HMA recipe without having to polymer-modify the bitumen. Experimental work using various PelletPAVE™ contents was carried out at the Kuwait Institute for Scientific Research (KISR) to design an optimum rubberized HMA formulation prior to conducting a pilot-scale road trial. With the aid of a gyratory compactor, the compaction and volumetric properties of HMAs containing 2.5% and 3.0% PelletPAVE™ additive were investigated at a range of bitumen contents, all by mass of total mix.

Keywords: Modified bitumen, rubberized hot mix asphalt, gyratory compaction, volumetric properties.

8061 Novel Hybrid Approaches For Real Coded Genetic Algorithm to Compute the Optimal Control of a Single Stage Hybrid Manufacturing Systems

Authors: M. Senthil Arumugam, M.V.C. Rao

Abstract:

This paper presents a novel two-phase hybrid optimization algorithm with hybrid genetic operators to solve the optimal control problem of a single-stage hybrid manufacturing system. The proposed hybrid real coded genetic algorithm (HRCGA) is developed in such a way that a simple real coded GA acts as a base-level search, which makes a quick decision to direct the search towards the optimal region, and a local search method is then employed to do fine tuning. The hybrid genetic operators involved in the proposed algorithm improve both the quality of the solution and the convergence speed. Phase 1 uses a conventional real coded genetic algorithm (RCGA), while Phase 2 employs optimization by direct search and systematic reduction of the size of the search region. A typical numerical example of an optimal control problem, with the number of jobs varying from 10 to 50, is included to illustrate the efficacy of the proposed algorithm. Several statistical analyses are carried out to compare the proposed algorithm with the conventional RCGA and PSO techniques. A hypothesis t-test and an analysis of variance (ANOVA) test are also carried out to validate the effectiveness of the proposed algorithm. The results clearly demonstrate that the proposed algorithm not only improves the solution quality but also converges to the optimal value faster. It outperforms the conventional real coded GA (RCGA) and the efficient particle swarm optimization (PSO) algorithm both in the quality of the optimal solution and in convergence to the actual optimum value.

Keywords: Hybrid systems, optimal control, real coded genetic algorithm (RCGA), particle swarm optimization (PSO), hybrid real coded GA (HRCGA), hybrid genetic operators.

8060 Fuzzy Relatives of the CLARANS Algorithm With Application to Text Clustering

Authors: Mohamed A. Mahfouz, M. A. Ismail

Abstract:

This paper introduces new algorithms (a fuzzy relative of the CLARANS algorithm, FCLARANS, and fuzzy c-medoids based on randomized search, FCMRANS) for fuzzy clustering of relational data. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function minimized by FCMdd by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may join the computation of medoids, while the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results with different samples of Reuters-21578, Newsgroups (20NG), and generated datasets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient, and their outputs are almost the same as those of RFCMdd in terms of classification rate.
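
The randomized medoid search at the heart of these algorithms can be sketched as follows: swap one current medoid with a random non-medoid and keep the swap only if the fuzzy objective improves. The Python sketch below uses Euclidean dissimilarities on toy 2-D data and fuzzy-c-means-style memberships as stand-ins; the paper's relational/text setting, initialization procedure, and exact membership update are not reproduced.

```python
# Minimal sketch of medoid-based fuzzy clustering with a CLARANS-style
# randomized search over medoid swaps (toy data, illustrative settings).
import numpy as np

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
K, m = 2, 2.0                     # number of clusters, fuzzifier

def memberships(medoid_idx):
    d = np.linalg.norm(data[:, None, :] - data[medoid_idx][None, :, :], axis=2) + 1e-9
    inv = d ** (-2.0 / (m - 1))
    return inv / inv.sum(axis=1, keepdims=True)       # shape (n_points, K)

def objective(medoid_idx):
    d = np.linalg.norm(data[:, None, :] - data[medoid_idx][None, :, :], axis=2) + 1e-9
    return np.sum(memberships(medoid_idx) ** m * d)   # fuzzy within-cluster dissimilarity

medoids = list(rng.choice(len(data), K, replace=False))
best = objective(medoids)
for _ in range(500):                                  # randomized neighbour search
    i = rng.integers(K)
    candidate = int(rng.integers(len(data)))
    if candidate in medoids:
        continue
    trial = medoids.copy()
    trial[i] = candidate
    cost = objective(trial)
    if cost < best:                                   # keep only improving swaps
        medoids, best = trial, cost
print("medoids:", [data[i].round(2).tolist() for i in medoids], "cost:", round(best, 2))
```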

Keywords: Data Mining, Fuzzy Clustering, Relational Clustering, Medoid-Based Clustering, Cluster Analysis, Unsupervised Learning.

8059 Spatial Query Localization Method in Limited Reference Point Environment

Authors: Victor Krebss

Abstract:

The task of object localization is one of the major challenges in creating intelligent transportation. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces large errors or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of wireless ad-hoc networks. Such a network allows estimating the distance between objects by measuring received signal levels and constructing a graph of distances in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite an abundance of well-known algorithms for solving the localization problem and significant research efforts, there are still many issues that are currently addressed only partially. In this paper, we propose a localization approach based on mapping the distance graph onto digital road map data. In fact, the problem is reduced to embedding the distance graph into a graph representing the geolocation data of the area. This makes it possible to localize objects, in some cases even when only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, which makes effective use of spatial indexing, optimized spatial search routines, and geometry functions.

Keywords: Intelligent Transportation System, Sensor Network, Localization, Spatial Query, GIS, Graph Embedding.

8058 A Scalable Media Job Framework for an Open Source Search Engine

Authors: Pooja Mishra, Chris Pollett

Abstract:

This paper explores efficient ways to implement various media-updating features such as news aggregation, video conversion, and bulk email handling. All of these jobs share the property that they are periodic in nature, and they all benefit from being handled in a distributed fashion. The data for these jobs also often comes from a social or collaborative source. We isolate the class of periodic, one-round map reduce jobs as a useful setting for describing and handling media-updating tasks. As such tasks are simpler than general map reduce jobs, programming them in a general map reduce platform could easily become tedious. This paper presents the MediaUpdater module of the Yioop Open Source Search Engine Web Portal, designed to handle such jobs via an extension of a PHP class. We describe how to implement various media-updating tasks in our system, as well as experiments carried out using these implementations on an Amazon Web Services cluster.
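
The pattern of a periodic, one-round map/reduce job can be sketched independently of Yioop. The Python sketch below is a hypothetical illustration only: the real MediaUpdater extends a PHP class inside Yioop, and the job names, periods, and feed URLs here are invented placeholders.

```python
# Minimal sketch of a periodic, one-round map/reduce media job (hypothetical API).
import time
from collections import defaultdict

class PeriodicMediaJob:
    """A job maps over its inputs, then reduces once, every `period` seconds."""
    period = 60

    def inputs(self):        # e.g. feed URLs, videos to convert, queued emails
        return []

    def map(self, item):     # yields (key, value) pairs for one input item
        yield ("noop", item)

    def reduce(self, key, values):
        return values

class NewsAggregationJob(PeriodicMediaJob):
    period = 300
    def inputs(self):
        return ["http://example.com/feed1", "http://example.com/feed2"]  # placeholders
    def map(self, url):
        yield ("headlines", f"fetched {url}")      # a real job would parse the feed
    def reduce(self, key, values):
        return sorted(values)

def run_once(job):
    """One round: map every input, group by key, reduce each group."""
    grouped = defaultdict(list)
    for item in job.inputs():
        for key, value in job.map(item):
            grouped[key].append(value)
    return {key: job.reduce(key, values) for key, values in grouped.items()}

def run_forever(jobs):
    """Simple scheduler that fires each job when its period has elapsed."""
    last_run = {job: 0.0 for job in jobs}
    while True:
        now = time.time()
        for job in jobs:
            if now - last_run[job] >= job.period:
                print(type(job).__name__, run_once(job))
                last_run[job] = now
        time.sleep(1)

print(run_once(NewsAggregationJob()))     # single round, without the scheduler loop
```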

Keywords: Distributed jobs framework, news aggregation, video conversion, email.

8057 Parameters Estimation of Double Diode Solar Cell Model

Authors: M. R. AlRashidi, K. M. El-Naggar, M. F. AlHajri

Abstract:

In this paper, a new technique based on pattern search optimization is proposed for estimating solar cell parameters. The estimated parameters are the generated photocurrent, saturation current, series resistance, shunt resistance, and ideality factor. The proposed approach is tested and validated using the double diode model to show its potential. The performance of the developed approach is promising, which signifies its potential as an estimation tool.
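
A compass-style pattern search for double diode parameters is sketched below: poll each (scaled) parameter up and down by the current step, move on improvement, and halve the step otherwise. The synthetic I-V data, the parameter scaling vector, and the stopping tolerance are illustrative assumptions and do not reproduce the paper's measured data or settings.

```python
# Minimal sketch of pattern (compass) search fitting a double diode solar cell
# model to synthetic I-V data (all numbers below are illustrative placeholders).
import numpy as np

VT = 0.0259                            # thermal voltage at ~300 K (V)
# characteristic magnitudes: Iph, Is1, Is2, Rs, Rsh, n1, n2
SCALE = np.array([1.0, 1e-7, 1e-6, 0.05, 50.0, 1.0, 2.0])

def residual(x, v, i):
    """Implicit double diode equation evaluated at measured (V, I) points."""
    iph, is1, is2, rs, rsh, n1, n2 = x * SCALE
    vd = v + i * rs
    d1 = is1 * (np.exp(np.clip(vd / (n1 * VT), None, 50)) - 1)
    d2 = is2 * (np.exp(np.clip(vd / (n2 * VT), None, 50)) - 1)
    return iph - d1 - d2 - vd / rsh - i

# synthetic data generated from the model with all scaled parameters equal to 1
vd = np.linspace(0.0, 0.42, 30)
i_data = 1.0 - 1e-7 * (np.exp(vd / VT) - 1) - 1e-6 * (np.exp(vd / (2 * VT)) - 1) - vd / 50.0
v_data = vd - i_data * 0.05

def sse(x):
    return float(np.sum(residual(x, v_data, i_data) ** 2))

x = np.array([0.7, 2.0, 0.5, 1.5, 0.8, 1.1, 0.9])     # rough initial guess (scaled)
step, best = 0.25, sse(x)
while step > 1e-4:                                     # compass/pattern search loop
    improved = False
    for k in range(len(x)):
        for s in (step, -step):
            trial = x.copy()
            trial[k] += s
            if trial[k] > 0:
                cost = sse(trial)
                if cost < best:
                    x, best, improved = trial, cost, True
    if not improved:
        step /= 2                                      # shrink the mesh and re-poll
print("estimated parameters:", np.round(x * SCALE, 8))
print("sum of squared residuals:", best)
```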

Keywords: Solar Cell, Parameter Estimation, Pattern Search.
