Search results for: Hausdorff-based metric
Paper Count: 186

96 Estimation of Component Reusability through Reusability Metrics

Authors: Aditya Pratap Singh, Pradeep Tomar

Abstract:

Software reusability is an essential characteristic of Component-Based Software (CBS), and component reusability is an important measure of how effectively components can be reused in CBS. The reusability attributes proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for estimating the reusability of black-box software components, along with metrics for Interface Complexity, Understandability, Customizability, and Reliability. An experiment estimating reusability is performed through a case study on a sample web application using a real-world component.

Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.

95 Electron Density Discrepancy Analysis of Energy Metabolism Coenzymes

Authors: Alan Luo, Hunter N. B. Moseley

Abstract:

Many macromolecular structure entries in the Protein Data Bank (PDB) have a range of regional (localized) quality issues, be it derived from X-ray crystallography, Nuclear Magnetic Resonance (NMR) spectroscopy, or other experimental approaches. However, most PDB entries are judged by global quality metrics like R-factor, R-free, and resolution for X-ray crystallography or backbone phi-psi distribution statistics and average restraint violations for NMR. Regional quality is often ignored when PDB entries are re-used for a variety of structurally based analyses. The binding of ligands, especially ligands involved in energy metabolism, is of particular interest in many structurally focused protein studies. Using a regional quality metric that provides chemically interpretable information from electron density maps, a significant number of outliers in regional structural quality was detected across X-ray crystallographic PDB entries for proteins bound to biochemically critical ligands. In this study, a series of analyses was performed to evaluate both specific and general potential factors that could promote these outliers. In particular, these potential factors were the minimum distance to a metal ion, the minimum distance to a crystal contact, and the isotropic atomic b-factor. To evaluate these potential factors, Fisher’s exact tests were performed, using regional quality criteria of outlier (top 1%, 2.5%, 5%, or 10%) versus non-outlier compared to a potential factor metric above versus below a certain outlier cutoff. The results revealed a consistent general effect from region-specific normalized b-factors but no specific effect from metal ion contact distances and only a very weak effect from crystal contact distance as compared to the b-factor results. These findings indicate that no single specific potential factor explains a majority of the outlier ligand-bound regions, implying that human error is likely as important as these other factors. Thus, all factors, including human error, should be considered when regions of low structural quality are detected. Also, the downstream re-use of protein structures for studying ligand-bound conformations should screen the regional quality of the binding sites. Doing so prevents misinterpretation due to the presence of structural uncertainty or flaws in regions of interest.
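The statistical test at the core of this analysis is straightforward to reproduce. Below is a minimal sketch, assuming a hypothetical 2x2 contingency table that cross-tabulates regional-quality outlier status against a b-factor cutoff; the counts and cutoffs are illustrative, not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table (illustrative counts only):
# rows: regional-quality outlier (e.g., top 5%) vs. non-outlier
# cols: normalized b-factor above cutoff vs. below cutoff
table = [[40, 10],     # outlier regions:     40 above cutoff, 10 below
         [200, 750]]   # non-outlier regions: 200 above cutoff, 750 below

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```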

Keywords: Biomacromolecular structure, coenzyme, electron density discrepancy analysis, X-ray crystallography.

94 A Metric Framework for Analysis of Quality of Object Oriented Design

Authors: Amandeep Kaur, Satwinder Singh, Dr. K. S. Kahlon

Abstract:

This paper studies the impact of object-oriented (OO) design on software quality characteristics, such as defect density and rework, by means of experimental validation. Encapsulation, inheritance, polymorphism, reusability, data hiding, and message passing are the major attributes of an object-oriented system, and they can act as indicators for evaluating the quality of such a system. Metrics are the well-known quantifiable way to express any attribute. Hence, this paper formulates a framework of metrics representing the attributes of an object-oriented system. Empirical data are collected from three projects based on the object-oriented paradigm to calculate the metrics.

Keywords: Object Oriented, Software metrics, Methods, Attributes, cohesion, coupling, Inheritance.

93 A Novel Web Metric for the Evaluation of Internet Trends

Authors: Radek Malinský, Ivan Jelínek

Abstract:

Web 2.0 (social networking, blogging, and online forums) can serve as a data source for social science research because it contains a vast amount of information from many different users. The volume of that information has been growing at a very high rate and is becoming a network of heterogeneous data, which makes information difficult to find and therefore of limited use. We have proposed a novel theoretical model for gathering and processing data from Web 2.0 that better reflects the semantic content of web pages. This article deals with the analysis part of the model and its use for content analysis of blogs. The introductory part of the article describes the methodology for gathering and processing data from blogs. The next part focuses on the evaluation and content analysis of blogs that write about a specific trend.

Keywords: Blog, Sentiment Analysis, Web 2.0, Webometrics

92 A Linear Use Case Based Software Cost Estimation Model

Authors: Hasan O. Farahneh, Ayman A. Issa

Abstract:

Software development is moving towards agility, with use cases and scenarios being used for requirements stories. Estimates of software cost are becoming even more important than before, because the effect of delays is much larger in the context of the successive short releases of agile development. This paper therefore reports on the development of a new linear use-case-based software cost estimation model that is applicable in the very early stages of software development and is based on a simple metric. Evaluation showed that the accuracy of estimates varies between 43% and 55% of the actual effort of historical test projects. These results outperformed those of well-known models when applied in the same context. Further work is being carried out to improve the performance of the proposed model by considering the effect of non-functional requirements.
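As a rough illustration of a linear, use-case-driven estimate (the paper's actual metric and coefficients are not reproduced here), one can fit effort against a simple use-case count and report the relative error; the project data below are invented.

```python
import numpy as np

# Hypothetical historical projects: (use-case count, actual effort in person-hours)
use_cases = np.array([12, 20, 35, 50, 64], dtype=float)
effort    = np.array([310, 520, 880, 1250, 1600], dtype=float)

# Fit a linear model: effort ~ a * use_cases + b
a, b = np.polyfit(use_cases, effort, deg=1)

# Magnitude of relative error for each historical project
predicted = a * use_cases + b
mre = np.abs(predicted - effort) / effort
print(f"effort ~ {a:.1f} * use_cases + {b:.1f}, mean MRE = {mre.mean():.2%}")
```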

Keywords: Metrics, Software Cost Estimation, Use Cases

91 Bug Localization on Single-Line Bugs of Apache Commons Math Library

Authors: Cherry Oo, Hnin Min Oo

Abstract:

Software bug localization is one of the most costly tasks in program repair. There is therefore strong demand for automated bug localization techniques that can guide programmers to the locations of bugs with little human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of program traces and producing a ranked list of the most likely buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our results show that the strong performance of a specific similarity coefficient, used to inspect the program spectra, is mostly effective for localizing single-line bugs.
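The abstract does not name the similarity coefficient, so the sketch below uses the Ochiai coefficient, a commonly used choice in spectrum-based fault localization, to rank program elements from per-test coverage spectra; the spectra are made up.

```python
import math

# Program spectra: for each element, how many failing (e_f) and passing (e_p)
# tests execute it; total_f is the total number of failing tests.
def ochiai(e_f, e_p, total_f):
    denom = math.sqrt(total_f * (e_f + e_p))
    return e_f / denom if denom > 0 else 0.0

spectra = {"stmt_1": (3, 1), "stmt_2": (1, 5), "stmt_3": (2, 0), "stmt_4": (0, 7)}
total_failing = 3

# Rank statements by suspiciousness, most suspicious first
ranking = sorted(spectra, key=lambda s: ochiai(*spectra[s], total_failing), reverse=True)
for name in ranking:
    print(name, round(ochiai(*spectra[name], total_failing), 3))
```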

Keywords: Software testing, fault localization, program spectra.

90 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new algorithms that are cost-effective in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and evaluated experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. It uses the City Block (Manhattan) distance and a new stopping criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
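A minimal sketch of a k-means-style loop that uses the City Block (Manhattan) distance for assignment and the per-dimension median for the update step (the usual minimizer under an L1 metric); this is a generic illustration, not the authors' modified algorithm or their stopping criterion.

```python
import numpy as np

def kmeans_manhattan(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to the center with the smallest L1 (City Block) distance
        dist = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = dist.argmin(axis=1)
        # Move each center to the per-dimension median of its cluster
        new_centers = np.array([np.median(X[labels == j], axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

X = np.random.default_rng(1).normal(size=(300, 4))   # placeholder data
labels, centers = kmeans_manhattan(X, k=3)
```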

Keywords: Pattern recognition, partitional clustering, K-means clustering, Manhattan distance, terrorism data analysis.

89 A New Efficient RNS Reverse Converter for the 4-Moduli Set {2^n, 2^n + 1, 2^n - 1, 2^(2n+1) - 1}

Authors: Edem K. Bankas, Kazeem A. Gbolagade

Abstract:

In this paper, we propose a new efficient reverse converter for the 4-moduli set {2^n, 2^n + 1, 2^n - 1, 2^(2n+1) - 1} based on a modified Chinese Remainder Theorem and Mixed Radix Conversion. Additionally, the resulting architecture is further reduced to obtain a reverse converter that utilizes only carry save adders, a multiplexer, and carry propagate adders. The proposed converter has an area cost of (12n + 2) FAs and (5n + 1) HAs with a delay of (9n + 6)t_FA + t_MUX. When compared with the state of the art, our proposal is shown to be faster, at the expense of slightly more hardware resources. Further, the Area-Time-squared metric was computed, which indicated that our proposed scheme outperforms the state-of-the-art reverse converter.
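The paper's contribution is the adder-level architecture, but the underlying arithmetic is a Chinese Remainder Theorem reconstruction. A plain-Python sketch of that reconstruction for the stated moduli set (not the carry-save/carry-propagate design) is shown below.

```python
def crt_reverse_convert(residues, moduli):
    """Reconstruct X from its residues for pairwise coprime moduli (plain CRT)."""
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(.., -1, m) is the modular inverse (Python 3.8+)
    return x % M

n = 4
moduli = [2**n, 2**n + 1, 2**n - 1, 2**(2*n + 1) - 1]   # {2^n, 2^n+1, 2^n-1, 2^(2n+1)-1}
X = 123456
residues = [X % m for m in moduli]
assert crt_reverse_convert(residues, moduli) == X
```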

Keywords: Modified Chinese Remainder Theorem, Mixed Radix Conversion, Reverse Converter, Carry Save Adder, Carry Propagate Adder.

88 Environmental Impact of Trade Sector Growth: Evidence from Tanzania

Authors: Mosses E. Lufuke

Abstract:

This paper investigates whether there is Granger causality running from trade to the environment, as evidenced in changing climatic conditions and land degradation. Using Tanzania as the reference, a VAR Granger-causality test was employed to examine the cause-and-effect relationship between trade and the environment. The changing climatic condition, proxied by nitrous oxide emissions (in thousand metric tons of CO2 equivalent), and land degradation, measured by the size of arable land, were tested against trade using both export and import variables. The results indicate that neither of the trade variables Granger-causes the variability in gas emissions or in arable land size. This suggests that trade concerns relating to the environment may have been internalized in domestic policies, offsetting any likely negative consequence.
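A minimal sketch of a bivariate Granger-causality check in Python using statsmodels; the two series are synthetic placeholders, not the Tanzanian trade or emissions data.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 100
exports = rng.normal(size=T)      # placeholder (stationary) trade series
emissions = rng.normal(size=T)    # placeholder (stationary) environmental series

# H0: "exports do not Granger-cause emissions".
# Column order matters: the second column is tested as a cause of the first.
data = np.column_stack([emissions, exports])
results = grangercausalitytests(data, maxlag=2)
for lag, (tests, _) in results.items():
    print(f"lag {lag}: F-test p-value = {tests['ssr_ftest'][1]:.3f}")
```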

Keywords: Trade, growth, impact, environment.

87 A New Categorization of Image Quality Metrics Based On a Model of Human Quality Perception

Authors: Maria Grazia Albanesi, Riccardo Amadeo

Abstract:

This study presents a new model of the human image quality assessment process. The aim is to highlight the foundations of the image quality metrics proposed in the literature by identifying the cognitive/physiological or mathematical principles of their development and their relation to the actual human quality assessment process. The model allows the creation of a novel categorization of objective and subjective image quality metrics. Our work includes an overview of the most used or most effective objective metrics in the literature and, for each of them, underlines its main characteristics with reference to the rationale of the proposed model and categorization. From the results of this operation, we highlight a problem that affects all the presented metrics: many aspects of human bias are not taken into account at all. We then propose a possible methodology to address this issue.

Keywords: Eye-Tracking, image quality assessment metric, MOS, quality of user experience, visual perception.

86 Uplink Throughput Prediction in Cellular Mobile Networks

Authors: Engin Eyceyurt, Josko Zec

Abstract:

Current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features, and counters. These networks are unmanageable with legacy methods, so an enhanced design and optimization approach that is increasingly reliant on machine learning is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal-to-Noise Ratio (SNR) are used to train models to estimate the expected uplink throughput. A prediction accuracy with a high coefficient of determination of 91.2% is obtained from measurements collected with a simple smartphone application.
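A minimal sketch of the kind of supervised regression pipeline described above: train on RSRP/RSRQ/SNR features and report the coefficient of determination (R^2). The data are synthetic stand-ins for drive-test measurements, and the regressor choice is arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 500
rsrp = rng.uniform(-120, -70, n)    # dBm
rsrq = rng.uniform(-20, -5, n)      # dB
snr = rng.uniform(-5, 30, n)        # dB
throughput = 2.0 * snr + 0.5 * (rsrp + 120) + rng.normal(0, 5, n)   # synthetic Mbps

X = np.column_stack([rsrp, rsrq, snr])
X_tr, X_te, y_tr, y_te = train_test_split(X, throughput, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out data: {r2_score(y_te, model.predict(X_te)):.3f}")
```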

Keywords: Drive test, LTE, machine learning, uplink throughput prediction.

85 Application of EEG Wavelet Power to Prediction of Antidepressant Treatment Response

Authors: Dorota Witkowska, Paweł Gosek, Lukasz Swiecicki, Wojciech Jernajczyk, Bruce J. West, Miroslaw Latka

Abstract:

In clinical practice, the selection of an antidepressant often degrades to lengthy trial-and-error. In this work we employ a normalized wavelet power of alpha waves as a biomarker of antidepressant treatment response. This novel EEG metric takes into account both non-stationarity and intersubject variability of alpha waves. We recorded resting, 19-channel EEG (closed eyes) in 22 inpatients suffering from unipolar (UD, n=10) or bipolar (BD, n=12) depression. The EEG measurement was done at the end of the short washout period which followed previously unsuccessful pharmacotherapy. The normalized alpha wavelet power of 11 responders was markedly different than that of 11 nonresponders at several, mostly temporoparietal sites. Using the prediction of treatment response based on the normalized alpha wavelet power, we achieved 81.8% sensitivity and 81.8% specificity for channel T4.
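A rough sketch of computing an alpha-band wavelet power from a single EEG channel with PyWavelets; the wavelet choice, frequency grid, and normalization (alpha power divided by total power) are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
import pywt

fs = 250.0                                    # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

# Continuous wavelet transform with a Morlet wavelet over ~1-30 Hz
target_freqs = np.arange(1.0, 30.0, 0.5)
scales = pywt.central_frequency("morl") * fs / target_freqs
coefs, freqs = pywt.cwt(eeg, scales, "morl", sampling_period=1 / fs)

power = np.abs(coefs) ** 2                    # wavelet power per (frequency, time)
alpha = (freqs >= 8) & (freqs <= 12)
normalized_alpha_power = power[alpha].sum() / power.sum()
print(f"normalized alpha wavelet power: {normalized_alpha_power:.3f}")
```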

Keywords: Alpha waves, antidepressant, treatment outcome, wavelet.

84 Awareness for Air Pollution Impacts on Lung Cancer in Southern California: A Pilot Study for Designed Smartphone Application

Authors: M. Mohammed Raoof, A. Enkhtaivan, H. Aljuaid

Abstract:

This study follows the design science research methodology to design and implement a smartphone application artifact. The developed artifact was evaluated in three phases, using the System Usability Scale (SUS) metric. The designed artifact aims to spread awareness about reducing air pollution, decreasing lung cancer development, and checking the air quality status in Southern California counties. Participants were recruited for a pilot study to facilitate awareness of air pollution. The study found that the smartphone application had a beneficial effect on the study's aims.
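For reference, the System Usability Scale mentioned above is scored in the standard way: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A small sketch with made-up responses:

```python
def sus_score(responses):
    """responses: ten answers on a 1-5 Likert scale (SUS items 1-10, in order)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical participant responses
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))   # -> 85.0
```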

Keywords: Air pollution, design science research, indoor air pollution, lung cancer, outdoor air pollution, smartphone application.

83 Blind Source Separation Using Modified Gaussian FastICA

Authors: V. K. Ananthashayana, Jyothirmayi M.

Abstract:

This paper addresses the problem of source separation in images. We propose a FastICA algorithm employing a modified Gaussian contrast function for blind source separation. Experimental results show that the proposed Modified Gaussian FastICA performs blind source separation effectively and yields better-quality images. A comparative study has been made with other popular existing algorithms. The peak signal-to-noise ratio (PSNR) and improved signal-to-noise ratio (ISNR) are used as metrics for evaluating image quality, and the ICA-specific Amari error is also used to measure the quality of separation.
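Of the evaluation metrics listed above, PSNR is the most standard; a minimal sketch of its computation between a reference image and a separated estimate (the arrays here are placeholders):

```python
import numpy as np

def psnr(reference, estimate, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two images of the same shape."""
    mse = np.mean((reference.astype(np.float64) - estimate.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64))
est = np.clip(ref + rng.normal(0, 5, size=ref.shape), 0, 255)
print(f"PSNR = {psnr(ref, est):.2f} dB")
```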

Keywords: Amari error, Blind Source Separation, Contrast function, Gaussian function, Independent Component Analysis.

82 A Complexity Measure for Java Bean based Software Components

Authors: Sandeep Khimta, Parvinder S. Sandhu, Amanpreet Singh Brar

Abstract:

Traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting CBSE. Researchers have proposed a wide range of complexity metrics for software systems; however, these metrics are not sufficient for components and component-based systems, being restricted to module-oriented and object-oriented systems. This study proposes to measure the complexity of JavaBean software components as a reflection of their quality, so that components can be adopted accordingly to make them more reusable. The proposed metric involves only the design issues of a component and does not consider packaging and deployment complexity. In this way, the complexity of software components can be kept within certain limits, which in turn helps enhance quality and productivity.

Keywords: JavaBean Components, Complexity, Metrics, Validation.

81 Some Laws of Rhythm Formulas of Ussuli in the Dancing Culture of People in the Middle and the Central Asia

Authors: G. Saitova, A. Mashurova, F. Mashurova

Abstract:

In the national and professional music of the oral tradition of many peoples of the East there are metric formulas called "ussuli", that is, rhythmic constructions of differing character and composition. Ussuli, translated from Arabic, means "law". The cultural contacts of the ancient and medieval inhabitants of Central Asia, India, China, East Turkestan, Iraq, Afghanistan, Turkey, and Iran have played a certain role in the formation of both the musical and the dancing heritage of each of these peoples. During theatrical shows, many dances were performed to the accompaniment of percussion instruments such as the nagra, dayulpaz, and doll. The above-mentioned instruments are used both as obligatory accompanying instruments in an orchestra and as solo instruments supporting dance acts. The dynamics of a dance composition's development, and at times the technique of its movements, depend on various combinations of ussuli and the manner in which they are performed.

Keywords: Dancing, plastic, rhythm, ussuli.

80 Impact of Fixation Time on Subjective Video Quality Metric: a New Proposal for Lossy Compression Impairment Assessment

Authors: M. G. Albanesi, R. Amadeo

Abstract:

In this paper, a new approach for quality assessment tasks in lossy compressed digital video is proposed. The research activity is based on the visual fixation data recorded by an eye tracker. The method involved both a new paradigm for subjective quality evaluation and the subsequent statistical analysis to match subjective scores provided by the observer to the data obtained from the eye tracker experiments. The study brings improvements to the state of the art, as it solves some problems highlighted in literature. The experiments prove that data obtained from an eye tracker can be used to classify videos according to the level of impairment due to compression. The paper presents the methodology, the experimental results and their interpretation. Conclusions suggest that the eye tracker can be useful in quality assessment, if data are collected and analyzed in a proper way.

Keywords: eye tracker, video compression, video quality assessment, visual attention

79 Life Time Based Analysis of MAC Protocols of Wireless Ad Hoc Networks in WSN Applications

Authors: R. Alageswaran, S. Selvakumar, P. Neelamegam

Abstract:

Wireless Sensor Networks (WSNs) are emerging because of developments in wireless communication technology and the miniaturization of hardware. A WSN consists of a large number of low-cost, low-power, multifunctional sensor nodes that monitor physical conditions such as temperature, sound, vibration, pressure, and motion. The MAC protocol used in sensor networks must be energy efficient and should aim to conserve energy during operation. In this paper, with the aim of analyzing MAC protocols from wireless ad hoc networks as applied to WSNs, simulation experiments were conducted in the Global Mobile Simulator (GloMoSim) software. The number of packets sent by regular nodes and received by the sink node under different deployment strategies, the total energy spent, and the network lifetime were chosen as the metrics for comparison. The simulation results show that the IEEE 802.11 protocol performs better than the CSMA and MACA protocols.

Keywords: CSMA, DCF, MACA, TelosB

78 Fuzzy Clustering Analysis in Real Estate Companies in China

Authors: Jianfeng Li, Feng Jin, Xiaoyu Yang

Abstract:

This paper applies a fuzzy clustering algorithm to classify real estate companies in China according to general financial indexes, such as income per share, share accumulation fund, net profit margin, weighted net assets yield, and shareholders' equity. By constructing and normalizing the initial partition matrix, obtaining the fuzzy similarity matrix with the Minkowski metric, and computing its transitive closure, the dynamic fuzzy clustering analysis of real estate companies shows clearly that the clustering result changes gradually as the threshold is reduced, and that the clusters bear a similar relationship to the prices of those companies in the stock market. In this way, the method is valuable for comparing the financial condition of real estate companies in order to identify good investment opportunities.
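A rough sketch of the pipeline described above: build a fuzzy similarity matrix from a Minkowski distance, take its max-min transitive closure, and cut it at a threshold lambda to obtain clusters. The similarity construction and the data are illustrative, not the paper's.

```python
import numpy as np

def fuzzy_clusters(X, p=1, lam=0.8):
    """Cluster rows of X via fuzzy similarity + max-min transitive closure + lambda-cut."""
    # Fuzzy similarity matrix from a Minkowski distance (one common construction)
    d = (np.abs(X[:, None, :] - X[None, :, :]) ** p).sum(axis=2) ** (1.0 / p)
    R = 1.0 - d / (d.max() + 1e-12)

    # Max-min transitive closure: compose R with itself until it stops changing
    while True:
        R2 = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        if np.allclose(R2, R):
            break
        R = R2

    # Lambda-cut: i and j fall in the same cluster when R[i, j] >= lam
    labels = -np.ones(len(X), dtype=int)
    cluster = 0
    for i in range(len(X)):
        if labels[i] == -1:
            labels[R[i] >= lam] = cluster
            cluster += 1
    return labels

X = np.random.default_rng(0).normal(size=(8, 5))   # placeholder financial indexes
print(fuzzy_clusters(X, lam=0.85))
```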

Keywords: Fuzzy clustering algorithm, data mining, real estate company, financial analysis.

77 Fuzzy Metric Approach for Fuzzy Time Series Forecasting based on Frequency Density Based Partitioning

Authors: Tahseen Ahmed Jilani, Syed Muhammad Aqil Burney, C. Ardil

Abstract:

In the last 15 years, a number of methods have been proposed for forecasting based on fuzzy time series. Most fuzzy time series methods have been presented for forecasting enrollments at the University of Alabama; however, the forecasting accuracy of the existing methods is not good enough. In this paper, we compare our proposed new method of fuzzy time series forecasting with existing methods. Our method is based on frequency-density-based partitioning of the historical enrollment data. The proposed method belongs to the kth-order and time-variant methods and achieves better forecasting accuracy for enrollments than the existing methods.

Keywords: Fuzzy logical groups, fuzzified enrollments, fuzzy sets, fuzzy time series.

76 A K-Means Based Clustering Approach for Finding Faulty Modules in Open Source Software Systems

Authors: Parvinder S. Sandhu, Jagdeep Singh, Vikas Gupta, Mandeep Kaur, Sonia Manhas, Ramandeep Sidhu

Abstract:

Prediction of fault-prone modules provides one way to support software quality engineering. Clustering is used to determine the intrinsic grouping in a set of unlabeled data, and among the various clustering techniques available in the literature, the K-means approach is the most widely used. This paper introduces a K-means-based clustering approach for finding the fault proneness of object-oriented systems. The contribution of this paper is that metric values of the JEdit open source software are used to generate rules for categorizing software modules as faulty or non-faulty, after which empirical validation is performed. The results are measured in terms of prediction accuracy, probability of detection, and probability of false alarms.
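The three evaluation measures named above come straight from the confusion matrix; a minimal sketch with placeholder labels:

```python
def fault_prediction_metrics(actual, predicted):
    """actual/predicted: sequences of 0 (non-faulty) and 1 (faulty) module labels."""
    tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
    tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))
    fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))
    fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    pd = tp / (tp + fn) if (tp + fn) else 0.0    # probability of detection (recall)
    pf = fp / (fp + tn) if (fp + tn) else 0.0    # probability of false alarm
    return accuracy, pd, pf

actual    = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]
print(fault_prediction_metrics(actual, predicted))
```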

Keywords: K-Means, Software Fault, Classification, Object-Oriented Metrics.

75 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal

Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert

Abstract:

We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
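The per-vertex error referred to above is the standard quadric error metric: each incident triangle contributes a plane quadric K = p p^T (with p = (a, b, c, d) the unit plane equation), and the error of a candidate position v is v_h^T Q v_h for the homogeneous v_h. A CPU-side sketch (the paper's GPU implementation and per-vertex boundaries are not reproduced):

```python
import numpy as np

def plane_quadric(v0, v1, v2):
    """Fundamental error quadric K = p p^T for the plane of triangle (v0, v1, v2)."""
    n = np.cross(v1 - v0, v2 - v0)
    n = n / np.linalg.norm(n)
    p = np.append(n, -np.dot(n, v0))        # plane (a, b, c, d) with unit normal
    return np.outer(p, p)

def vertex_error(position, incident_triangles):
    """Quadric error of a candidate position given the triangles around the vertex."""
    Q = sum(plane_quadric(*tri) for tri in incident_triangles)
    v_h = np.append(position, 1.0)          # homogeneous coordinates
    return float(v_h @ Q @ v_h)

# Tiny example: the error is ~0 at the vertex's own position and grows as a
# candidate position moves off the planes of its incident triangles.
v = np.array([0.0, 0.0, 0.1])
tris = [np.array([[0, 0, 0.1], [1, 0, 0.1], [0, 1, 0.1]], dtype=float),
        np.array([[0, 0, 0.1], [0, 1, 0.1], [-1, 0, 0.1]], dtype=float)]
print(vertex_error(v, tris), vertex_error(v + np.array([0, 0, 0.05]), tris))
```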

Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.

74 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the area under the ROC curve (AUC) as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the patient-specific decision path (PSDP) entropy-based and CART methods, the AUC-based patient-specific decision path models performed equivalently on AUC. Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
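A minimal sketch of scoring a single attribute by AUC, treating the attribute's values as scores for the positive class (made-up data, with sklearn's roc_auc_score standing in for whatever routine the authors used):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)                   # binary outcome
informative = y + rng.normal(0, 0.8, n)     # attribute correlated with the outcome
noise = rng.normal(0, 1, n)                 # uninformative attribute

attributes = {"informative": informative, "noise": noise}
# An AUC of 0.5 means no discrimination, so rank attributes by |AUC - 0.5|
scores = {name: abs(roc_auc_score(y, vals) - 0.5) for name, vals in attributes.items()}
print(scores, "-> select:", max(scores, key=scores.get))
```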

Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.

73 An Optimal Control Problem for Rigid Body Motions on Lie Group SO(2, 1)

Authors: Nemat Abazari, Ilgin Sager

Abstract:

In this paper, smooth trajectories are computed in the Lie group SO(2, 1) as a motion planning problem by assigning a Frenet frame to the rigid body system and optimizing a cost function for the elastic energy spent in tracking a timelike curve in Minkowski space. A method is proposed to solve a motion planning problem that minimizes the integral of the squared norm of the Darboux vector of a timelike curve. This method uses the coordinate-free Maximum Principle of optimal control, and the results lie in the theory of integrable Hamiltonian systems. The presence of several conserved quantities inherent in these Hamiltonian systems aids in the explicit computation of the rigid body motions.
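In symbols, the cost functional described above (a reconstruction from the abstract's wording, with omega the Darboux vector of the frame along the timelike curve, T the final time, and the inner product taken with respect to the Lorentz metric) reads:

```latex
J(\omega) \;=\; \frac{1}{2}\int_{0}^{T} \lVert \omega(t) \rVert^{2}\, dt
         \;=\; \frac{1}{2}\int_{0}^{T} \langle \omega(t), \omega(t) \rangle_{L}\, dt .
```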

Keywords: Optimal control, Hamiltonian vector field, Darboux vector, maximum principle, Lie group, rigid body motion, Lorentz metric.

72 Video Quality Assessment Methods: A Bird’s-Eye View

Authors: P. M. Arun Kumar, S. Chandramathi

Abstract:

The proliferation of multimedia technology and services in today's world provides ample research scope in the frontiers of visual signal processing. Widespread usage of video-based applications in heterogeneous environments needs viable methods of Video Quality Assessment (VQA). The evaluation of video quality not only depends on high QoS requirements but also emphasizes the need for the novel term 'QoE' (Quality of Experience), which treats video quality as user-centric. This paper discusses two vital video quality assessment methods, namely subjective and objective assessment methods. The evolution of various video quality metrics, their classification models, and their applications are reviewed in this work. Mean Opinion Score (MOS) based subjective measurements and algorithm-based objective metrics are discussed, and their challenges are outlined. Further, this paper explores the recent progress of VQA in emerging technologies such as mobile video and 3D video.

Keywords: 3D-Video, no reference metric, quality of experience, video quality assessment, video quality metrics.

71 Assessing and Visualizing the Stability of Feature Selectors: A Case Study with Spectral Data

Authors: R. Guzman-Martinez, Oscar Garcia-Olalla, R. Alaiz-Rodriguez

Abstract:

Feature selection plays an important role in applications with high-dimensional data. Assessing the stability of feature selection/ranking algorithms becomes an important issue when the dataset is small and the aim is to gain insight into the underlying process by analyzing the most relevant features. In this work, we propose a graphical approach that enables the analysis of the similarity between feature ranking techniques as well as their individual stability. Moreover, it works with any stability metric (Canberra distance, Spearman's rank correlation coefficient, Kuncheva's stability index, etc.). We illustrate this visualization technique by evaluating the stability of several feature selection techniques on a spectral binary dataset. Experimental results with a neural-based classifier show that stability and ranking quality may not be linked together, and both issues have to be studied jointly in order to offer answers to the domain experts.
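Two of the stability metrics named above are easy to state concretely: Spearman's rank correlation between two full rankings, and Kuncheva's consistency index between two selected subsets of size k out of n features, I_C(A, B) = (r*n - k^2) / (k*(n - k)) with r = |A ∩ B|. A small sketch with toy rankings and subsets:

```python
from scipy.stats import spearmanr

def kuncheva_index(subset_a, subset_b, n_features):
    """Kuncheva's consistency index for two feature subsets of equal size k."""
    a, b = set(subset_a), set(subset_b)
    k = len(a)
    if k != len(b) or not 0 < k < n_features:
        raise ValueError("subsets must have equal size k with 0 < k < n_features")
    r = len(a & b)
    return (r * n_features - k * k) / (k * (n_features - k))

# Feature ranks produced by two hypothetical ranking methods over 6 features
rho, _ = spearmanr([1, 2, 3, 4, 5, 6], [2, 1, 3, 5, 4, 6])

# Top-3 subsets selected by each method out of n = 6 features
print(f"Spearman rho = {rho:.2f}, "
      f"Kuncheva = {kuncheva_index({0, 1, 2}, {0, 2, 4}, n_features=6):.2f}")
```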

Keywords: Feature Selection Stability, Spectral data, Data visualization

70 Virtual Machines Cooperation for Impatient Jobs under Cloud Paradigm

Authors: Nawfal A. Mehdi, Ali Mamat, Hamidah Ibrahim, Shamala K. Syrmabn

Abstract:

The increase in the demand for IT resources diverts enterprises to the cloud as a cheap and scalable solution. The promises of cloud computing are achieved by using the virtual machine as a basic unit of computation. However, a virtual machine's predefined settings might not be enough to handle jobs' QoS requirements. This paper addresses the problem of mapping jobs that have critical start deadlines onto virtual machines that have predefined specifications. These virtual machines are hosted by physical machines and share a fixed amount of bandwidth. This paper proposes an algorithm that uses idle virtual machines' bandwidth to increase the quota of other virtual machines nominated as executors of urgent jobs. The algorithm and an empirical study are given to evaluate the impact of the proposed model on impatient jobs. The results show the importance of dynamic bandwidth allocation in virtualized environments and its effect on the throughput metric.

Keywords: Insufficient bandwidth, virtual machine, cloud provider, impatient jobs.

69 Using Multi-Arm Bandits to Optimize Game Play Metrics and Effective Game Design

Authors: Kenny Raharjo, Ramon Lawrence

Abstract:

Game designers have the challenging task of building games that engage players to spend their time and money on the game. There are an infinite number of game variations and design choices, and it is hard to systematically determine game design choices that will have positive experiences for players. In this work, we demonstrate how multi-arm bandits can be used to automatically explore game design variations to achieve improved player metrics. The advantage of multi-arm bandits is that they allow for continuous experimentation and variation, intrinsically converge to the best solution, and require no special infrastructure to use beyond allowing minor game variations to be deployed to users for evaluation. A user study confirms that applying multi-arm bandits was successful in determining the preferred game variation with highest play time metrics and can be a useful technique in a game designer's toolkit.
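A minimal epsilon-greedy multi-arm bandit sketch of the idea above, where each arm is a game design variation and the reward is an observed play-time metric; the reward model is simulated.

```python
import random

def epsilon_greedy_bandit(reward_fn, n_arms, rounds=10_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    counts = [0] * n_arms
    values = [0.0] * n_arms                     # running mean reward per arm
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)         # explore a random variation
        else:
            arm = max(range(n_arms), key=values.__getitem__)   # exploit the best so far
        reward = reward_fn(arm, rng)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts, values

# Simulated play time (minutes) per game variation; variation 2 is secretly best
def play_time(arm, rng):
    return rng.gauss([8.0, 9.5, 11.0][arm], 2.0)

counts, values = epsilon_greedy_bandit(play_time, n_arms=3)
print(counts, [round(v, 2) for v in values])
```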

Keywords: Game design, multi-arm bandit, design exploration and data mining, player metric optimization and analytics.

68 DWT Based Image Steganalysis

Authors: Indradip Banerjee, Souvik Bhattacharyya, Gautam Sanyal

Abstract:

'Steganalysis' is one of the challenging and attractive interests for researchers alongside the development of information hiding techniques. It is the procedure for detecting hidden information in a stego object created by a known steganographic algorithm. In this paper, a novel feature-based image steganalysis technique is proposed. Various statistical moments are used along with a similarity metric. The proposed steganalysis technique is designed on transformations in four wavelet domains: Haar, Daubechies, Symlets, and Biorthogonal. Each domain is subjected to various classifiers, namely K-nearest-neighbor, K* classifier, locally weighted learning, naive Bayes classifier, neural networks, decision trees, and support vector machines. The experiments are performed on a large set of pictures that are freely available in image databases. The system also makes predictions for different message length definitions.
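A minimal sketch of the feature-extraction step described above: a single-level 2D DWT in one of the named wavelet families (Haar here) followed by statistical moments of each subband; the classifiers themselves are not shown, and the exact feature set is an assumption.

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis

def dwt_moment_features(image, wavelet="haar"):
    """Mean, variance, skewness, kurtosis of each subband of a 1-level 2D DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    features = []
    for band in (cA, cH, cV, cD):
        flat = band.ravel()
        features.extend([flat.mean(), flat.var(), skew(flat), kurtosis(flat)])
    return np.array(features)

img = np.random.default_rng(0).integers(0, 256, size=(128, 128))
print(dwt_moment_features(img).shape)    # 4 subbands x 4 moments = 16 features
```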

Keywords: Steganalysis, Moments, Wavelet Domain, KNN, K*, LWL, Naive Bayes Classifier, Neural networks, Decision trees, SVM.

67 Planning Rigid Body Motions and Optimal Control Problem on Lie Group SO(2, 1)

Authors: Nemat Abazari, Ilgin Sager

Abstract:

In this paper, smooth trajectories are computed in the Lie group SO(2, 1) as a motion planning problem by assigning a Frenet frame to the rigid body system and optimizing a cost function for the elastic energy spent in tracking a timelike curve in Minkowski space. A method is proposed to solve a motion planning problem that minimizes the integral of the Lorentz inner product of the Darboux vector of a timelike curve. This method uses the coordinate-free Maximum Principle of optimal control, and the results lie in the theory of integrable Hamiltonian systems. The presence of several conserved quantities inherent in these Hamiltonian systems aids in the explicit computation of the rigid body motions.

Keywords: Optimal control, Hamiltonian vector field, Darboux vector, maximum principle, Lie group, rigid body motion, Lorentz metric.
