Search results for: Graphical approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5192

4952 Neural Network Ensemble-based Solar Power Generation Short-Term Forecasting

Authors: A. Chaouachi, R.M. Kamel, R. Ichikawa, H. Hayashi, K. Nagasaka

Abstract:

This paper presents the applicability of artificial neural networks to 24-hour-ahead solar power generation forecasting for a 20 kW photovoltaic system; the developed forecasts are suitable for reliable microgrid energy management. In total, four neural networks were proposed, namely a multi-layer perceptron, a radial basis function network, a recurrent network, and a neural network ensemble consisting of bagged networks. The forecasting reliability of the proposed networks was evaluated in terms of forecasting error performance using statistical and graphical methods. The experimental results showed that all the proposed networks achieved acceptable forecasting accuracy. In terms of comparison, the neural network ensemble gives the most precise forecasts among the networks considered. In fact, each network of the ensemble over-fits to some extent, and this leads to a diversity that enhances the noise tolerance and forecasting generalization performance relative to the conventional networks.
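A minimal sketch of the ensemble-of-bagged-networks idea is given below, assuming scikit-learn's MLPRegressor and randomly generated placeholder features in place of the paper's 20 kW PV data; it is an illustration of bagging neural networks for day-ahead forecasting, not the authors' implementation.

```python
# Minimal sketch: ensemble of bagged neural networks for day-ahead PV forecasting.
# Data, feature count and network sizes are illustrative placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: rows = days, X = weather/irradiance features,
# y = next-day PV output (kW); replace with real measurements.
X = rng.random((365, 8))
y = rng.random(365) * 20.0

n_members = 10
ensemble = []
for m in range(n_members):
    idx = rng.integers(0, len(X), len(X))            # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=m)
    net.fit(X[idx], y[idx])
    ensemble.append(net)

x_new = rng.random((1, 8))                           # features for the next day
forecast = np.mean([net.predict(x_new) for net in ensemble], axis=0)
print("24 h ahead PV forecast (kW):", float(forecast[0]))
```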

Keywords: Neural network ensemble, Solar power generation, 24 hour forecasting, Comparative study

4951 Development of a Wind Resource Assessment Framework Using Weather Research and Forecasting (WRF) Model, Python Scripting and Geographic Information Systems

Authors: Jerome T. Tolentino, Ma. Victoria Rejuso, Jara Kaye Villanueva, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang

Abstract:

Wind energy is rapidly emerging as an important source of electricity in the Philippines, although developing an accurate wind resource model is difficult. In this study, the Weather Research and Forecasting (WRF) Model, an open-source mesoscale Numerical Weather Prediction (NWP) model, was used to produce a one-year atmospheric simulation at 4 km resolution over the Ilocos Region of the Philippines. The annual mean wind speed data were extracted from the WRF output (netCDF) using a Python-based graphical user interface. Lastly, the wind resource assessment was produced using GIS software. The results of the study showed that Python scripts are more flexible than other post-processing tools for dealing with netCDF files. Using the WRF Model, Python, and Geographic Information Systems, a reliable wind resource map is produced.
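A short sketch of the netCDF post-processing step is given below, assuming the netCDF4 and NumPy packages; the file name is hypothetical and the U10/V10 variable names are those of a typical wrfout file, not values quoted from the paper (which used a Python-based GUI).

```python
# Sketch: annual mean 10 m wind speed from a WRF output file (netCDF4 + NumPy).
import numpy as np
from netCDF4 import Dataset

ds = Dataset("wrfout_ilocos_2014.nc")       # hypothetical 1-year WRF output
u10 = ds.variables["U10"][:]                # (time, south_north, west_east)
v10 = ds.variables["V10"][:]
speed = np.sqrt(u10 ** 2 + v10 ** 2)        # 10 m wind speed per time step
annual_mean = speed.mean(axis=0)            # average over the time dimension

np.savetxt("annual_mean_wind_speed.csv", annual_mean, delimiter=",")
ds.close()
```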

Keywords: Wind resource assessment, Weather Research and Forecasting (WRF) Model, Python, GIS software.

4950 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g. MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast region. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in the predictions can be quantified, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC-based framework performs well for automatic calibration.
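To make the ABC idea concrete, the sketch below shows generic ABC rejection sampling in Python (the paper works in R). The runoff model, priors and tolerance are illustrative placeholders; `simulate_runoff` merely stands in for the paper's time-area model with its four calibration parameters.

```python
# Generic ABC rejection sketch for calibrating a rainfall-runoff model.
import numpy as np

rng = np.random.default_rng(1)

def simulate_runoff(initial_loss, reduction_factor, t_conc, t_lag, n=200):
    # Placeholder model: a real implementation would route rainfall through
    # a time-area model using these four calibration parameters.
    t = np.arange(n)
    return np.maximum(0.0, reduction_factor * np.sin((t - t_lag) / t_conc) - initial_loss)

true_theta = (0.1, 0.8, 12.0, 5.0)
observed = simulate_runoff(*true_theta) + rng.normal(0, 0.02, 200)  # synthetic 'gauge'

def distance(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))        # RMSE as the ABC distance

accepted, eps = [], 0.1
for _ in range(50_000):
    theta = (rng.uniform(0, 1), rng.uniform(0, 1),   # illustrative priors over
             rng.uniform(1, 30), rng.uniform(0, 20)) # the four parameters
    if distance(simulate_runoff(*theta), observed) < eps:
        accepted.append(theta)

posterior = np.array(accepted)                        # approximate posterior draws
print("accepted:", len(posterior))
if len(posterior):
    print("posterior means:", posterior.mean(axis=0).round(2))
```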

Keywords: Automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform.

4949 On the EM Algorithm and Bootstrap Approach Combination for Improving Satellite Image Fusion

Authors: Tijani Delleji, Mourad Zribi, Ahmed Ben Hamida

Abstract:

This paper discusses the combination of the EM algorithm and the Bootstrap approach applied to the improvement of the satellite image fusion process. This novel satellite image fusion method, based on the estimation-theoretic EM algorithm and reinforced by the Bootstrap approach, was successfully implemented and tested. The sensor images are first split by a Bayesian segmentation method to determine a joint region map for the fused image. Then, the EM algorithm is used in conjunction with the Bootstrap approach to develop the bootstrap EM fusion algorithm, producing the fused target image. In this research, we propose to estimate the statistical parameters from the iterative equations of the EM algorithm using a set of representative Bootstrap samples of the images, whose sizes are determined from a new criterion called the 'hybrid criterion'. The results of our work show that using the Bootstrap EM (BEM) algorithm in image fusion improves the quality of the estimated parameters, which in turn improves the quality of the fused image, and reduces the computing time of the fusion process.
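The sketch below is only a toy illustration of combining bootstrap resampling with EM updates, applied to a one-dimensional two-component Gaussian mixture; it is not the paper's image-fusion estimator, and the fixed bootstrap size stands in for the paper's 'hybrid criterion'.

```python
# Toy "bootstrap EM": each EM iteration re-estimates the mixture parameters
# from a bootstrap resample of the data.
import numpy as np

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0, 1, 500), rng.normal(4, 1.5, 500)])

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
boot_size = 300                                   # assumed bootstrap sample size

for _ in range(50):
    sample = rng.choice(data, size=boot_size, replace=True)   # bootstrap step
    # E-step: responsibilities of each component for the bootstrap sample
    dens = np.vstack([w[k] * normal_pdf(sample, mu[k], sigma[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: update weights, means and standard deviations
    nk = resp.sum(axis=1)
    w = nk / boot_size
    mu = (resp * sample).sum(axis=1) / nk
    sigma = np.sqrt((resp * (sample - mu[:, None]) ** 2).sum(axis=1) / nk)

print("weights", w.round(2), "means", mu.round(2), "std devs", sigma.round(2))
```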

Keywords: Satellite image fusion, Bayesian segmentation, Bootstrap approach, EM algorithm.

4948 A New Spectral-based Approach to Query-by-Humming for MP3 Songs Database

Authors: Leon Fu, Xiangyang Xue

Abstract:

In this paper, we propose a new approach to query-by-humming, focusing on an MP3 song database. Since melody representation is much more difficult for MP3 songs than for symbolic performance data, we choose to extract feature descriptors from the vocal part of the songs. Our approach is based on signal filtering, sub-band spectral processing, MDCT coefficient analysis and peak energy detection, ignoring the background music as much as possible. Finally, we apply a dual dynamic programming algorithm for feature similarity matching. Experiments show its online performance in terms of precision and efficiency.

Keywords: DP, MDCT, MP3, QBH.

4947 An Efficient Approach to Mining Frequent Itemsets on Data Streams

Authors: Sara Ansari, Mohammad Hadi Sadreddini

Abstract:

The increasing importance of data streams arising in a wide range of advanced applications has led to the extensive study of mining frequent patterns. Mining data streams poses many new challenges, among which are the one-scan nature, the unbounded memory requirement and the high arrival rate of data streams. In this paper, we propose a new approach for mining itemsets on data streams. Our approach, SFIDS, has been developed based on the FIDS algorithm. The main aim was to keep some advantages of the previous approach while resolving some of its drawbacks, and consequently to improve run time and memory consumption. Our approach has the following advantages: it uses a lattice-like data structure for keeping frequent itemsets, and it separates regions from each other by deleting common nodes, which results in a decrease in search space, memory consumption and run time. Finally, considering the CPU constraint, when an increasing arrival rate of data overloads the system, SFIDS automatically detects this situation and discards some of the unprocessed data. We guarantee that the error of the results is bounded by a user pre-specified threshold, based on a probabilistic technique. Final results show that the SFIDS algorithm achieves about a 50% run time improvement over the FIDS approach.
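The SFIDS data structure itself is not described in the abstract; the sketch below only illustrates the stream setting it addresses, with a minimal sliding-window frequent itemset counter. Window size and support threshold are illustrative.

```python
# Minimal sliding-window frequent itemset counter over a transaction stream
# (illustrative only; not the SFIDS/FIDS structure).
from collections import Counter, deque
from itertools import combinations

WINDOW, MIN_SUPPORT = 1000, 0.02
window = deque()
counts = Counter()

def add(transaction):
    """Add one transaction; expire the oldest one when the window is full."""
    itemsets = [frozenset(c) for r in (1, 2)           # 1- and 2-itemsets only
                for c in combinations(sorted(transaction), r)]
    window.append(itemsets)
    counts.update(itemsets)
    if len(window) > WINDOW:
        counts.subtract(window.popleft())

def frequent():
    thresh = MIN_SUPPORT * len(window)
    return {s: c for s, c in counts.items() if c >= thresh}

# Example stream
for t in [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}]:
    add(t)
print(frequent())
```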

Keywords: Data stream, frequent itemset, stream mining.

4946 Mathematical Modeling and Analysis of Forced Vibrations in Micro-Scale Microstretch Thermoelastic Simply Supported Beam

Authors: Geeta Partap, Nitika Chugh

Abstract:

The present paper deals with the flexural vibrations of a homogeneous, isotropic, generalized micropolar microstretch thermoelastic thin Euler-Bernoulli beam resonator subjected to an exponentially time-varying load. Both axial ends of the beam are assumed to be simply supported. The governing equations have been solved analytically by applying the Laplace transform twice, with respect to time and space, respectively. The inversion of the Laplace transform in the time domain has been performed using the calculus of residues to obtain the deflection. The analytical results have been analyzed numerically with the help of MATLAB software for a magnesium-like material. The graphical representations and interpretations are discussed for the deflection of the beam under the simply supported boundary condition and for distinct values of time and space. The obtained results are easy to implement in the engineering analysis and design of resonators (sensors), modulators and actuators.

Keywords: Microstretch, deflection, exponential load, Laplace transforms, Residue theorem, simply supported.

4945 A Matching Algorithm of Minutiae for Real Time Fingerprint Identification System

Authors: Shahram Mohammadi, Ali Frajzadeh

Abstract:

Many matching algorithms with different characteristics have been introduced in recent years. For real-time systems these algorithms are usually based on minutiae features. In this paper we introduce a novel approach to feature extraction in which the extracted features are independent of the shift and rotation of the fingerprint, and at the same time the matching operation is performed more easily and with higher speed and accuracy. In this new approach, a reference point and a reference orientation are first determined for each fingerprint, and then, based on this information, the features are converted into polar coordinates. Due to the high speed and accuracy of this approach, the small volume of extracted features and the straightforward execution of the matching operation, this approach is highly appropriate for real-time applications.
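The polar conversion step can be summarized in a few lines; the sketch below assumes the reference point and reference orientation have already been detected (their detection is outside the sketch), and the minutiae coordinates are hypothetical.

```python
# Sketch: convert minutiae to shift/rotation-invariant polar coordinates
# relative to a reference point and reference orientation.
import numpy as np

def to_polar(minutiae, ref_xy, ref_angle):
    """minutiae: array of (x, y, theta) rows; returns (r, phi, d_theta) rows."""
    out = []
    for x, y, theta in minutiae:
        dx, dy = x - ref_xy[0], y - ref_xy[1]
        r = np.hypot(dx, dy)                                   # radial distance
        phi = (np.arctan2(dy, dx) - ref_angle) % (2 * np.pi)   # rotated angle
        d_theta = (theta - ref_angle) % (2 * np.pi)            # relative ridge angle
        out.append((r, phi, d_theta))
    return np.array(out)

# Hypothetical minutiae (x, y, ridge orientation in radians)
m = np.array([[120.0, 80.0, 0.6], [95.0, 140.0, 2.1]])
print(to_polar(m, ref_xy=(100.0, 100.0), ref_angle=0.3))
```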

Keywords: Matching, Minutiae, Reference point, Reference orientation

4944 A New Heuristic Approach for the Large-Scale Generalized Assignment Problem

Authors: S. Raja Balachandar, K. Kannan

Abstract:

This paper presents a heuristic approach to solve the Generalized Assignment Problem (GAP), which is NP-hard. It is worth mentioning that many researchers have developed algorithms for identifying the redundant constraints and variables in linear programming models. Some of these algorithms use the intercept matrix of the constraints to identify redundant constraints and variables prior to the start of the solution process. Here, a new heuristic approach based on the dominance property of the intercept matrix to find an optimal or near-optimal solution of the GAP is proposed. In this heuristic, redundant variables of the GAP are identified by applying the dominance property of the intercept matrix repeatedly. The heuristic is tested on 90 benchmark problems of sizes up to 4000, taken from the OR-Library, and the results are compared with the optimum solutions. The computational complexity of solving the GAP using this approach is shown to be O(mn²). The performance of our heuristic is compared with the best state-of-the-art heuristic algorithms with respect to the quality of the solutions. The encouraging results, especially for relatively large test problems, indicate that this heuristic approach can successfully be used for finding good solutions for highly constrained NP-hard problems.

Keywords: Combinatorial Optimization Problem, Generalized Assignment Problem, Intercept Matrix, Heuristic, Computational Complexity, NP-Hard Problems.

4943 Selection of Material for Gear Used in Fuel Pump Using Graph Theory and Matrix Approach

Authors: Sahil, Rajeev Saha, Sanjeev Kumar

Abstract:

Material selection is one of the key issues in the production of reliable, high-quality products in industry. A number of materials may be available for a single product, which makes material selection a difficult task. The aim of this paper is to select an appropriate material for a gear used in a fuel pump by using the Graph Theory and Matrix Approach (GTMA). GTMA is a logical and systematic approach that can be used to model and analyze various engineering systems. In the present work, four alternative materials and seven of their attributes are used to identify the best material for the given product.
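In GTMA-style ranking, each alternative is commonly scored by the permanent of an attribute matrix whose diagonal carries the alternative's normalized attribute values and whose off-diagonal entries carry relative attribute importances. The sketch below computes such an index by brute force for a small matrix; all numbers are placeholders, not the paper's gear-material data.

```python
# Illustrative GTMA-style selection index via the matrix permanent.
from itertools import permutations
import numpy as np

def permanent(a):
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

# Relative importance of 3 attributes (off-diagonal), shared by all materials
relative = np.array([[0.0, 0.6, 0.7],
                     [0.4, 0.0, 0.5],
                     [0.3, 0.5, 0.0]])

materials = {                      # normalized attribute values (placeholders)
    "steel":     [0.8, 0.6, 0.9],
    "cast iron": [0.6, 0.9, 0.5],
}
for name, attrs in materials.items():
    m = relative.copy()
    np.fill_diagonal(m, attrs)     # diagonal = this material's attribute values
    print(name, "index =", round(float(permanent(m)), 4))
```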

Keywords: Material, GTMA, MADM, digraph, decision making.

4942 Environmental Competency Framework: Development of a Modified 2-Tuple Delphi Approach

Authors: M. Bouri, L. Chraïbi, N. Sefiani

Abstract:

Currently, industries endeavor to align their environmental management systems with the ISO 14001:2015 international standard while preserving competitiveness and sustainability. A key driver for these industries is therefore to develop a skilled workforce that is able to implement, continuously improve and audit the environmental management system. The purpose of this paper is to provide an environmental competency framework that aims to identify, rank and categorize the competencies required by both environmental managers and auditors. This competency framework is expected to be useful during competency assessment, recruitment and training processes. To achieve this end, a modified 2-tuple Delphi approach is proposed, based on a combination of the modified Delphi approach and the 2-tuple linguistic representation model. The adopted approach takes the form of a series of questionnaires spread over multiple rounds in order to obtain a consensus among the different Moroccan experts participating in this study.

Keywords: Competency framework, Delphi, environmental competency, 2-tuple.

4941 A New Approach to Image Segmentation via Fuzzification of Rényi Entropy of Generalized Distributions

Authors: Samy Sadek, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis, Usama Sayed

Abstract:

In this paper, we propose a novel approach to image segmentation via fuzzification of the Rényi Entropy of Generalized Distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of the image and to locate the optimal threshold for segmentation. The proposed approach draws upon the postulate that the optimal threshold coincides with the maximum information content of the distribution. The contributions of the paper are as follows. First, the fuzzy REGD as a measure of the spatial structure of an image is introduced. Then, we propose an efficient entropic segmentation approach using the fuzzy REGD. Although the proposed approach belongs to the family of entropic segmentation approaches, which are commonly applied to grayscale images, it is adapted to be viable for segmenting color images. Lastly, diverse experiments on real images are carried out and show the superior performance of the proposed method.
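The sketch below shows the classical (non-fuzzy) Rényi entropy thresholding step on a grayscale histogram: the threshold maximizing the sum of the Rényi entropies of the two classes is selected. The paper's fuzzification of the REGD and its color-image extension are not reproduced; the histogram and alpha value are illustrative.

```python
# Sketch: Rényi-entropy thresholding of a grayscale histogram.
import numpy as np

def renyi_entropy(p, alpha):
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_threshold(hist, alpha=0.7):
    p = hist / hist.sum()
    best_t, best_val = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t], p[t:]
        w0, w1 = p0.sum(), p1.sum()
        if w0 == 0 or w1 == 0:
            continue
        val = renyi_entropy(p0 / w0, alpha) + renyi_entropy(p1 / w1, alpha)
        if val > best_val:
            best_t, best_val = t, val
    return best_t

# Hypothetical bimodal histogram of a 256-level image
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
hist, _ = np.histogram(img, bins=256, range=(0, 256))
print("optimal threshold:", renyi_threshold(hist))
```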

Keywords: Entropy of generalized distributions, entropy fuzzification, entropic image segmentation.

4940 Quadrature Formula for Sampled Functions

Authors: Khalid Minaoui, Thierry Chonavel, Benayad Nsiri, Driss Aboutajdine

Abstract:

This paper deals with efficient quadrature formulas involving functions that are observed only at fixed sampling points. The approach that we develop is derived from efficient continuous quadrature formulas, such as Gauss-Legendre or Clenshaw-Curtis quadrature. We select nodes at sampling positions that are as close as possible to those of the associated classical quadrature and update the quadrature weights accordingly. We supply the theoretical quadrature error formula for this new approach and show, through examples, its potential gain.
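A minimal sketch of this idea, under the assumption that the weights are recomputed by matching monomial moments (the paper's exact weight-update rule and error formula are not reproduced): Gauss-Legendre nodes on [-1, 1] are snapped to the nearest available samples and the weights are obtained from a Vandermonde system.

```python
# Sketch: quadrature restricted to fixed sampling positions.
import numpy as np

def sampled_gauss_weights(samples, n):
    nodes, _ = np.polynomial.legendre.leggauss(n)            # classical nodes
    chosen = np.array([samples[np.argmin(np.abs(samples - x))] for x in nodes])
    V = np.vander(chosen, n, increasing=True).T               # V[k, j] = x_j**k
    moments = np.array([(1.0 - (-1.0) ** (k + 1)) / (k + 1) for k in range(n)])
    weights = np.linalg.solve(V, moments)                     # exact on x**k, k < n
    return chosen, weights

samples = np.linspace(-1, 1, 41)                 # fixed sampling grid
x, w = sampled_gauss_weights(samples, 5)
approx = np.sum(w * np.exp(x))                   # quadrature of exp on [-1, 1]
print(approx, "vs exact", np.e - 1.0 / np.e)
```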

Keywords: Gauss-Legendre, Clenshaw-Curtis, quadrature, Peano kernel, irregular sampling.

4939 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) approach provides the rainfall amount of a given duration necessary to cause flooding. It is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on checking whether an oncoming flood exceeds the critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods because, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount exceeding the rainfall thresholds, is used to compute the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
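The Bayesian update itself is a one-line application of Bayes' rule; the numerical sketch below makes it explicit. The prior and likelihood tables are illustrative numbers, not values from the Posina study.

```python
# Minimal sketch of the Bayesian posterior probability of flooding.
prior_flood = 0.05                     # P(flood), e.g. from historical frequency

# P(AMC class, rainfall vs threshold | flood) and the same given no flood
likelihood_given_flood = {("wet", "above"): 0.60, ("wet", "below"): 0.25,
                          ("dry", "above"): 0.10, ("dry", "below"): 0.05}
likelihood_given_no_flood = {("wet", "above"): 0.10, ("wet", "below"): 0.30,
                             ("dry", "above"): 0.15, ("dry", "below"): 0.45}

def posterior_flood(amc, rainfall):
    evidence = (amc, rainfall)
    num = likelihood_given_flood[evidence] * prior_flood
    den = num + likelihood_given_no_flood[evidence] * (1.0 - prior_flood)
    return num / den

print("P(flood | wet AMC, rainfall above threshold) =",
      round(posterior_flood("wet", "above"), 3))
```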

Keywords: Flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina.

4938 An EOQ Model for Non-Instantaneous Deteriorating Items with Power Demand, Time Dependent Holding Cost, Partial Backlogging and Permissible Delay in Payments

Authors: M. Palanivel, R. Uthayakumar

Abstract:

In this paper, an Economic Order Quantity (EOQ) based model for non-instantaneously deteriorating items with Weibull-distributed deterioration and a power demand pattern is presented. In this model, the holding cost per unit of the item per unit time is assumed to be an increasing linear function of the time spent in storage. The retailer is allowed a trade-credit offer by the supplier to buy more items. Shortages are allowed and partially backlogged, and the backlogging rate depends on the waiting time for the next replenishment. The model minimizes the total inventory cost by finding the optimal time interval and the optimal order quantity. The optimal solution of the model is illustrated with the help of numerical examples. Finally, sensitivity analysis and graphical representations are given to demonstrate the model.

Keywords: Power demand pattern, Partial backlogging, Time dependent holding cost, Trade credit, Weibull deterioration.

4937 A Novel Approach to Persian Online Hand Writing Recognition

Authors: Ramin Halavati, Mansour Jamzad, Mahdieh Soleymani

Abstract:

The Persian (Farsi) script is fully cursive, and each character is written in several different forms depending on the preceding and following characters in the word. These complexities make automatic handwriting recognition of Persian a very hard problem, and few contributions have tried to address it. This paper presents a novel, practical approach to online recognition of Persian handwriting which is based on representing inputs and patterns with very simple visual features and on comparing these simple terms. The recognition approach is tested on a set of Persian words; the results were quite acceptable when the possible words were unknown, and almost all words were recognized correctly when they were chosen from a pre-specified list.

Keywords: Image Processing, Pattern Recognition.

4936 An Approach to Adaptive Load Balancing for RFID Middlewares

Authors: Heung Seok Chae, Jaegeol Park

Abstract:

Recently, there has been increasing interest in RFID systems, and they have been applied to various applications. Load balancing is a fundamental technique for providing scalability by moving workload from overloaded nodes to under-loaded nodes. This paper presents an approach to adaptive load balancing for RFID middlewares. The workload of an RFID middleware can vary considerably according to the location of the connected RFID readers and can change abruptly at a particular instant. The proposed approach takes these characteristics of RFID middlewares into account to provide efficient load balancing.

Keywords: RFID middleware, Adaptive load balancing.

4935 An Activity Based Trajectory Search Approach

Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar

Abstract:

With the enormous growth in the use of mobile applications and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based services are required. Location recommendation is one of the most demanded location-aware applications, particularly with the wide availability of location-aware social network applications such as Facebook check-ins, Foursquare, and others. In this paper, we present a new methodology for location recommendation. The proposed approach combines conventional spatial attributes with other essential factors, including shortest distance and user interests. We also introduce the notion of an "activity trajectory", a trajectory that fulfills the set of activities that the user is interested in doing. The approach uses the associated distance value to select the trajectory(ies) with minimum cost (distance) and uses the spatial area to prune unneeded directions. The proposed algorithm uses the notion of movement direction to recommend the N most similar trajectories that match the user's required activity pattern with the least travelling distance. To improve the performance of the proposed approach, parallel processing is applied through a MapReduce-based implementation. Experiments based on real data sets were set up to evaluate the proposed approach. The presented experiments show how the proposed approach outperforms other strategies, giving better precision and run time.

Keywords: Location-based recommendation, map-reduce, recommendation system, trajectory search.

4934 An Improved Prediction Model of Ozone Concentration Time Series Based On Chaotic Approach

Authors: N. Z. A. Hamid, M. S. M. Noorani

Abstract:

This study focuses on the development of prediction models for ozone concentration time series. The prediction model is built based on a chaotic approach. First, the chaotic nature of the time series is detected by means of the phase space plot and the Cao method. Then, the prediction model is built, and the local linear approximation method is used for forecasting. A traditional autoregressive linear prediction model is also built. Moreover, an improvement of the local linear approximation method is performed. The prediction models are applied to the hourly ozone time series observed at a benchmark station in Malaysia. A comparison of all models through the calculation of the mean absolute error, root mean squared error and correlation coefficient shows that the one with the improved prediction method is the best. Thus, the chaotic approach is a good approach for developing a prediction model for ozone concentration time series.
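The sketch below illustrates the basic (unimproved) local linear approximation step: the series is embedded in phase space, the nearest neighbours of the current state are found, and a local linear map to their successors is fitted. The embedding dimension, delay and neighbour count are illustrative; the paper selects the embedding via the Cao method, and the synthetic series stands in for the hourly ozone data.

```python
# Sketch: one-step local linear approximation forecast in reconstructed phase space.
import numpy as np

def embed(x, dim, tau):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def local_linear_forecast(x, dim=3, tau=1, k=10):
    vectors = embed(x, dim, tau)
    current = vectors[-1]
    past, targets = vectors[:-1], x[(dim - 1) * tau + 1:]   # successor values
    dist = np.linalg.norm(past - current, axis=1)
    idx = np.argsort(dist)[:k]                               # k nearest states
    A = np.column_stack([past[idx], np.ones(k)])             # affine local model
    coef, *_ = np.linalg.lstsq(A, targets[idx], rcond=None)
    return np.append(current, 1.0) @ coef                    # one-step forecast

ozone = np.sin(np.linspace(0, 60, 600)) + 0.05 * np.random.default_rng(4).normal(size=600)
print("next-hour forecast:", round(float(local_linear_forecast(ozone)), 3))
```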

Keywords: Chaotic approach, phase space, Cao method, local linear approximation method.

4933 Scalable Deployment and Configuration of High-Performance Virtual Clusters

Authors: Kyrre M Begnum, Matthew Disney

Abstract:

Virtualization and high performance computing have been discussed from a performance perspective in recent publications. We present and discuss a flexible and efficient approach to the management of virtual clusters. A virtual machine management tool is extended to function as a fabric for cluster deployment and management. We show how features such as saving the state of a running cluster can be used to avoid disruption. We also compare our approach to the traditional methods of cluster deployment and present benchmarks which illustrate the efficiency of our approach.

Keywords: Cluster management, clusters, high-performance, virtual machines, Xen

4932 Two-Stage Approach for Solving the Multi-Objective Optimization Problem on Combinatorial Configurations

Authors: Liudmyla Koliechkina, Olena Dvirna

Abstract:

The multi-objective optimization problem on combinatorial configurations is stated, and an approach to its solution is proposed. The problem is of interest as a combinatorial optimization problem with many criteria, which models many applied tasks. The approach to solving the multi-objective optimization problem on combinatorial configurations consists of two stages: the first reduces the multi-objective problem to a single-criterion one using existing multi-objective optimization methods; the second solves the resulting single-criterion combinatorial optimization problem by the horizontal combinatorial method. This approach provides the optimal solution to the multi-objective optimization problem on combinatorial configurations, taking into account additional restrictions, in a finite number of steps.

Keywords: Discrete set, linear combinatorial optimization, multi-objective optimization, multipermutation, Pareto solutions, partial permutation set, permutation, structural graph.

4931 An Approach to Task Modeling for User Interface Design

Authors: Costin Pribeanu

Abstract:

The model-based approach to user interface design relies on developing separate models capturing various aspects of users, tasks, the application domain, and presentation and dialog structures. This paper presents a task modeling approach for user interface design and aims at exploring the mappings between task, domain and presentation models. The basic idea of our approach is to identify typical configurations in task and domain models and to investigate how they relate to each other. A special emphasis is put on application-specific functions and on mappings between domain objects and operational task structures. In this respect, we address two layers in task decomposition: a functional (planning) layer and an operational layer.

Keywords: task modeling, user interface design, unit tasks, basic tasks, operational task model.

4930 Color Image Segmentation Using Competitive and Cooperative Learning Approach

Authors: Yinggan Tang, Xinping Guan

Abstract:

Color image segmentation can be considered as a clustering procedure in feature space. The k-means algorithm and its adaptive version, i.e. the competitive learning approach, are powerful tools for data clustering. However, k-means and competitive learning suffer from several drawbacks, such as the dead-unit problem and the need to pre-specify the number of clusters. In this paper, we explore the use of a competitive and cooperative learning approach to perform color image segmentation. In this approach, seed points not only compete with each other, but the winner also dynamically selects several of its nearest competitors to form a cooperative team that adapts to the input together; in this way the method can automatically select the correct number of clusters and avoid the dead-unit problem. Experimental results show that CCL obtains better segmentation results.

Keywords: Color image segmentation, competitive learning, cluster, k-means algorithm, competitive and cooperative learning.

4929 A Discrete Choice Modeling Approach to Modular Systems Design

Authors: Ivan C. Mustakerov, Daniela I. Borissova

Abstract:

The paper proposes an approach to the design of modular systems based on an original technique for modeling and formulating combinatorial optimization problems. The proposed approach is described using the example of personal computer configuration design. It takes into account the existing compatibility restrictions between the modules and can be extended and modified to reflect different functional and user requirements. The developed design modeling technique is used to formulate single-objective nonlinear mixed-integer optimization tasks. The practical applicability of the developed approach is numerically tested on the basis of real module data. Solutions of the formulated optimization tasks define the optimal configuration of the system that satisfies all compatibility restrictions and user requirements.

Keywords: Constrained discrete combinatorial choice, modular systems design, optimization problem, PC configuration.

4928 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore their preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. The computation of statistical parameters and forecasting with exponential smoothing lead to useful conclusions that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes in the coming years. The results of this paper and the computed forecast can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of the time series is employed, and the search for the best forecast for 2017 and 2018 provides the value of the smoothing coefficient. Microsoft Excel is used for all statistical computations and graphics.
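A minimal sketch of simple exponential smoothing with a grid search for the smoothing coefficient is shown below (the paper does this in Microsoft Excel). The arrival figures are placeholders, not the study's data.

```python
# Sketch: simple exponential smoothing with a grid-searched smoothing coefficient.
import numpy as np

def ses_forecast(y, alpha):
    level = y[0]
    errors = []
    for obs in y[1:]:
        errors.append(obs - level)                  # one-step-ahead error
        level = alpha * obs + (1 - alpha) * level   # smoothing update
    return level, np.mean(np.square(errors))        # next forecast, in-sample MSE

arrivals = np.array([14.9, 15.5, 16.4, 15.0, 15.5, 16.9,
                     17.9, 20.1, 22.0, 23.6, 24.8, 27.2])   # placeholder series

best_alpha = min(np.arange(0.05, 1.0, 0.05), key=lambda a: ses_forecast(arrivals, a)[1])
forecast, mse = ses_forecast(arrivals, best_alpha)
print(f"alpha = {best_alpha:.2f}, next-year forecast = {forecast:.1f} million")
```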

Keywords: Tourism, statistical methods, exponential smoothing, land spatial planning, economy, Microsoft Excel.

4927 TFRank: An Evaluation of Users Importance with Fractal Views in Social Networks

Authors: Fei Hao, Hai Wang

Abstract:

One of the research issues in social network analysis is to evaluate the position/importance of users in social networks. As information diffusion in social networks evolves, it is difficult to evaluate the importance of users using traditional approaches. In this paper, we propose an approach for evaluating user importance with a fractal view of social networks. In this approach, the global importance (fractal importance) and the local importance (topological importance) of nodes are considered. The basic idea is that the bigger the product of the fractal importance and the topological importance of a node, the more important the node is. We devise an algorithm called TFRank corresponding to the proposed approach. Finally, we evaluate TFRank experimentally. The experimental results demonstrate that TFRank correlates highly with the PageRank algorithm and a potential ranking algorithm, which shows the effectiveness and advantages of our approach.

Keywords: TFRank, Fractal Importance, Topological Importance, Social Network

4926 Recursive Wiener-Khintchine Theorem

Authors: Khalid M. Aamir, Mohammad A. Maud

Abstract:

The Power Spectral Density (PSD) computed by taking the Fourier transform of the auto-correlation function (Wiener-Khintchine theorem) gives better results, in the case of noisy data, than the Periodogram approach. However, the computational complexity of the Wiener-Khintchine approach is higher than that of the Periodogram approach. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the window under analysis. In this paper, a recursive version of the Wiener-Khintchine theorem is derived using the sliding DFT approach meant for the computation of the STFT. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
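The sliding DFT recursion that such a recursive formulation builds on is easy to state: when the window slides by one sample, each DFT bin is updated in O(1), so all N bins cost O(N) per shift. The sketch below shows only that recursion and checks it against a direct FFT; the paper's full recursive Wiener-Khintchine derivation is not reproduced.

```python
# Sketch: sliding DFT update, O(N) per one-sample window shift.
import numpy as np

N = 64
rng = np.random.default_rng(5)
x = rng.normal(size=1000)

k = np.arange(N)
twiddle = np.exp(2j * np.pi * k / N)

X = np.fft.fft(x[:N])                 # DFT of the initial window x[0:N]
for n in range(N, 200):
    # window slides from x[n-N:n] to x[n-N+1:n+1]
    X = (X - x[n - N] + x[n]) * twiddle

psd = np.abs(X) ** 2 / N              # periodogram-style PSD of the current window
direct = np.abs(np.fft.fft(x[200 - N:200])) ** 2 / N
print("max deviation from direct FFT:", np.max(np.abs(psd - direct)))
```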

Keywords: Power Spectral Density (PSD), Wiener-Khintchine Theorem, Periodogram, Short Time Fourier Transform (STFT), Sliding DFT.

4925 A Novel Steganographic Method for Gray-Level Images

Authors: Ahmad T. Al-Taani, Abdullah M. AL-Issa

Abstract:

In this work we propose a novel steganographic method for hiding information within the spatial domain of grayscale images. The proposed approach works by dividing the cover image into blocks of equal size and then embedding the message in the edge of each block, depending on the number of ones in the left four bits of the pixel. The proposed approach is tested on a database consisting of 100 different images. Experimental results, compared with other methods, show that the proposed approach can hide a larger amount of information and gives stego-images of good visual quality as judged by the human eye.
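For orientation, the sketch below shows plain least-significant-bit substitution on a grayscale image held as a NumPy array; it does not reproduce the paper's block/edge selection rule based on the left four bits of the pixel, and the cover image is synthetic.

```python
# Sketch: generic LSB embedding and extraction for a grayscale image.
import numpy as np

def embed_lsb(cover, message_bits):
    stego = cover.flatten().copy()
    if len(message_bits) > len(stego):
        raise ValueError("message too long for this cover")
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & 0xFE) | bit      # overwrite the least significant bit
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return [int(p) & 1 for p in stego.flatten()[:n_bits]]

cover = np.random.default_rng(6).integers(0, 256, size=(8, 8), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]                 # message: one byte
stego = embed_lsb(cover, bits)
print("recovered:", extract_lsb(stego, len(bits)))
```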

Keywords: Data Embedding, Cryptography, Watermarking, Steganography, Least Significant Bit, Information Hiding.

4924 New Approach to Spectral Analysis of High Bit Rate PCM Signals

Authors: J. P. Dubois

Abstract:

Pulse code modulation is a widespread technique in digital communication with significant impact on existing modern and proposed future communication technologies. Its widespread utilization is due to its simplicity and attractive spectral characteristics. In this paper, we present a new approach to the spectral analysis of PCM signals using Riemann-Stieltjes integrals, which is very accurate for high bit rates. This approach can serve as a model for similar spectral analysis of other competing modulation schemes.

Keywords: Coding, discrete Fourier, power spectral density, pulse code modulation, Riemann-Stieltjes integrals.

4923 Motion Recognition Based On Fuzzy WP Feature Extraction Approach

Authors: Keun-Chang Kwak

Abstract:

This paper is concerned with motion recognition based on a fuzzy WP (Wavelet Packet) feature extraction approach applied to Vicon physical data sets. For this purpose, we use an efficient fuzzy mutual-information-based WP transform for feature extraction. This method estimates the required mutual information using a novel approach based on fuzzy membership functions. The physical action data set includes 10 normal and 10 aggressive physical actions that measure human activity. The data have been collected from 10 subjects using the Vicon 3D tracker. The experiments consider running, sitting, and walking as the physical activity motions among the various activities. The experimental results reveal that the presented feature extraction approach shows good recognition performance.
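A minimal sketch of crisp wavelet-packet energy features for one motion channel is given below, assuming the PyWavelets package; the paper's fuzzy mutual-information-based selection is not reproduced, and the signal is synthetic rather than Vicon data.

```python
# Sketch: wavelet-packet sub-band energy features for one 1-D motion channel.
import numpy as np
import pywt

def wp_energy_features(signal, wavelet="db4", level=3):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric",
                            maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # terminal sub-bands
    energies = np.array([np.sum(np.square(node.data)) for node in nodes])
    return energies / energies.sum()                  # normalized band energies

rng = np.random.default_rng(7)
walking = np.sin(np.linspace(0, 40, 512)) + 0.1 * rng.normal(size=512)
print(wp_energy_features(walking).round(3))
```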

Keywords: Motion recognition, fuzzy wavelet packet, Vicon physical data.
