Search results for: eGov Simplification measure
1092 SimplexIS: Evaluating the Impact of e-Gov Simplification Measures in the Information System Architecture
Authors: Bruno Félix, André Vasconcelos, José Tribolet
Abstract:
Nowadays the population increasingly makes use of Information Technology (IT). In recent years the Portuguese government has therefore increased its focus on using IT to improve people's lives and began to develop a set of measures to enable the modernization of the Public Administration, thereby reducing the gap between the Public Administration and citizens. To this end, the Portuguese Government launched the Simplex Program. However, these SIMPLEX eGov measures, which have been implemented over the years, present a serious challenge: how to forecast their impact on the existing Information Systems Architecture (ISA). This research therefore focuses on automating the evaluation of the actual impact of implementing eGov Simplification and Modernization measures on the Information Systems Architecture. To carry out the evaluation we propose a Framework supported by several key concepts: Quality Factors, ISA modeling, a Multicriteria Approach, a Polarity Profile and Quality Metrics.
Keywords: Information System Architecture, Evaluation, eGov Simplification measure, Multicriteria Evaluation
Downloads: 1445
1091 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.
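For context, a minimal sketch of the per-vertex quadric error computation on which this kind of simplification is typically built is given below (the standard Garland-Heckbert quadric error metric; the authors' GPU kernels, per-vertex boundaries and ordering scheme are not reproduced, and the array layout is an illustrative assumption).

import numpy as np

def vertex_quadric_errors(vertices, faces):
    # vertices: (n, 3) float array; faces: (m, 3) int array of vertex indices
    n = len(vertices)
    quadrics = np.zeros((n, 4, 4))
    for face in faces:
        v0, v1, v2 = vertices[face]
        normal = np.cross(v1 - v0, v2 - v0)
        length = np.linalg.norm(normal)
        if length == 0:
            continue                                  # skip degenerate faces
        normal /= length
        plane = np.append(normal, -normal.dot(v0))    # plane coefficients (a, b, c, d)
        K = np.outer(plane, plane)                    # fundamental error quadric of the face
        for idx in face:
            quadrics[idx] += K                        # accumulate over incident faces
    homogeneous = np.hstack([vertices, np.ones((n, 1))])
    # error of vertex v is v_h^T Q v_h with v_h = (x, y, z, 1)
    return np.einsum('ni,nij,nj->n', homogeneous, quadrics, homogeneous)

Vertices with the lowest errors are then candidates for removal, subject to the per-vertex boundaries described above.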
Downloads: 2794
1090 Geometric Simplification Method of Building Energy Model Based on Building Performance Simulation
Authors: Yan Lyu, Yiqun Pan, Zhizhong Huang
Abstract:
In the design stage of a new building, an energy model of the building is often required to analyze its energy performance. In practice, a certain degree of geometric simplification is needed when establishing building energy models, since the detailed geometric features of a real building are hard to describe perfectly in most energy simulation engines, such as ESP-r, eQuest or EnergyPlus. Such detailed description is in fact unnecessary when a result of extremely high accuracy is not demanded. Therefore, this paper analyzed the relationship between the error of the simulation results from building energy models and the geometric simplification of the models. Two parameters are selected as the indices to characterize the geometric features in building energy simulation: the southward projected area and the total side surface area of the building. Based on this parameterization, an arbitrary column-shaped building can be simplified to a typical-shape (cuboid) building for energy modeling. The results of this study indicate that the geometric simplification causes no more than 7% prediction error in annual cooling/heating load for buildings whose ratio of southward projection length to total perimeter of the bottom is 0.25-0.35, which means the method is applicable for building performance simulation.
Keywords: building energy model, simulation, geometric simplification, design, regression
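As an illustration of the parameterization, a cuboid of the same height can be sized so that it preserves the two indices named in the abstract; the assumption that the building height is kept fixed, and the variable names, are ours rather than the paper's.

def equivalent_cuboid(south_projected_area, total_side_area, height):
    # Preserve the southward projected area: width * height = south_projected_area
    width = south_projected_area / height
    # Preserve the total side (facade) area: perimeter * height = total_side_area
    perimeter = total_side_area / height
    depth = (perimeter - 2.0 * width) / 2.0
    return width, depth, height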
Downloads: 623
1089 Incorporation of Long-Term Redundancy in ECG Time Domain Compression Methods through Curve Simplification and Block-Sorting
Authors: Bachir Boucheham, Youcef Ferdi, Mohamed Chaouki Batouche
Abstract:
We suggest a novel method to incorporate long-term redundancy (LTR) into signal time domain compression methods. The proposition is based on block-sorting and curve simplification, and is illustrated on the ECG signal as a post-processor for the FAN method. Test applications of the resulting FAN+ method on the MIT-BIH database show a substantial improvement of the compression ratio-distortion behavior and a higher quality reconstructed signal.
Keywords: ECG compression, Long-term redundancy, Block-sorting, Curve simplification.
Downloads: 1520
1088 3D Brain Tumor Segmentation Using Level-Sets Method and Meshes Simplification from Volumetric MR Images
Authors: K. Aloui, M. S. Naceur
Abstract:
The main objective of this paper is to provide an efficient tool for delineating brain tumors in three-dimensional magnetic resonance images. To achieve this goal, we use a level-sets approach to delineate three-dimensional brain tumors. We then introduce a compression scheme for 3D brain structures based on mesh simplification, adapted to the specific needs of telemedicine and to the limited capacity of network communication. We present the main stages of our system and preliminary results, which are very encouraging for clinical practice.
Keywords: Medical imaging, level-sets, compression, mesh simplification, telemedicine.
Downloads: 2132
1087 Necessary and Sufficient Condition for the Quaternion Vector Measure
Authors: Mei Li, Fahui Zhai
Abstract:
In this paper, the definitions of the quaternion measure and the quaternion vector measure are introduced. The relation between the quaternion measure and the complex vector measure, as well as the relation between the quaternion linear functional and the complex linear functional, are discussed. By using these relations, the necessary and sufficient condition to determine the quaternion vector measure is given.
Keywords: Quaternion, Quaternion measure, Quaternion vector measure, Quaternion Banach space, Quaternion linear functional.
Downloads: 1274
1086 Integrating Fast Karnaugh Map and Modular Neural Networks for Simplification and Realization of Complex Boolean Functions
Authors: Hazem M. El-Bakry
Abstract:
In this paper a new fast simplification method is presented. The method realizes Karnaugh maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross correlation, and simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design for neural networks. Neural networks are used because they are fault tolerant and as a result can recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components. This is done by using modular neural nets (MNNs) that divide the input space into several homogenous regions. The approach is applied to implement the XOR function, 16 one-bit logic functions, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and hardware requirements is achieved.
Keywords: Boolean Functions, Simplification, Karnaugh Map, Implementation of Logic Functions, Modular Neural Networks.
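The frequency-domain search can be illustrated with a small sketch: cross correlation of the map with a group template, computed as an inverse FFT of the product of one spectrum with the conjugate of the other. This is a generic FFT cross correlation, not the authors' exact formulation; the circular wrap-around of the FFT happens to match the wrap-around adjacency of a Karnaugh map.

import numpy as np

def detect_groups(kmap, template):
    # kmap: 2D 0/1 array for the Karnaugh map; template: 2D 0/1 array, e.g. a 2x2 block of ones
    padded = np.zeros_like(kmap, dtype=float)
    padded[:template.shape[0], :template.shape[1]] = template
    spectrum = np.fft.rfft2(kmap) * np.conj(np.fft.rfft2(padded))
    corr = np.fft.irfft2(spectrum, kmap.shape)       # circular cross correlation
    # offsets where the correlation equals the number of ones in the template cover a full group
    return np.argwhere(np.isclose(corr, template.sum()))

hits = detect_groups(np.array([[1, 1, 0, 0],
                               [1, 1, 0, 1],
                               [0, 0, 0, 1],
                               [0, 1, 1, 1]]), np.ones((2, 2)))   # finds the 2x2 group at (0, 0)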
Downloads: 1813
1084 Development of an Intelligent Tool for Planning the Operation
Authors: T. R. Alencar, P. T. Leite
Abstract:
Several optimization algorithms specifically applied to the problem of Operation Planning of Hydrothermal Power Systems have been developed and are in use. Although they provide solutions to the various problems encountered, these algorithms have some weaknesses, such as difficulties in convergence or simplification of the original formulation of the problem, owing to the complexity of the objective function. This paper therefore presents the development of a computational tool for solving the identified optimization problem that is also easy for the user to handle. Genetic Algorithms were adopted as the intelligent optimization technique, with Java as the programming language. The chromosomes were modeled first, then the fitness function of the problem and the operators involved were implemented, and finally the graphical interfaces for user access were developed. The program achieves coherent performance in solving the problem without requiring simplification of the calculations, together with ease of manipulating the simulation parameters and visualizing the output results.
Keywords: Energy, Optimization, Hydrothermal Power Systems, Genetic Algorithms
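A minimal sketch of the kind of genetic algorithm loop such a tool wraps is shown below. It is a generic skeleton in Python rather than the authors' Java implementation; the chromosome encoding, fitness function and operator details are placeholders.

import random

def genetic_algorithm(fitness, random_chromosome, pop_size=50, generations=200,
                      crossover_rate=0.8, mutation_rate=0.05):
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:pop_size // 2]                  # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:] if random.random() < crossover_rate else a[:]
            child = [g if random.random() > mutation_rate else random.random()
                     for g in child]                      # per-gene mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

Here a chromosome is assumed to be a list of values in [0, 1], for example normalized turbine discharges per plant and period, and fitness would encode the operation planning objective and constraints.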
Downloads: 1696
1083 A New Similarity Measure Based On Edge Counting
Authors: T. Slimani, B. Ben Yaghlane, K. Mellouli
Abstract:
In the field of concept similarity, the measure of Wu and Palmer [1] has the advantage of being simple to implement and of performing well compared to other similarity measures [2]. Nevertheless, the Wu and Palmer measure presents the following disadvantage: in some situations, the similarity of two elements of an IS-A ontology that are merely in the same neighborhood exceeds the similarity of two elements contained in the same hierarchy. This situation is inadequate within the information retrieval framework. To overcome this problem, we propose a new similarity measure based on the Wu and Palmer measure. Our objective is to obtain realistic results for concepts that are not located in the same hierarchy. The results obtained show that, compared to the Wu and Palmer approach, our measure offers gains in terms of relevance and execution time.
Keywords: Hierarchy, IS-A ontology, Semantic Web, Similarity Measure.
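For reference, the Wu and Palmer similarity on which the proposal builds can be sketched as follows (the standard formulation; the modified measure proposed in the paper is not reproduced here).

def wu_palmer_similarity(depth_c1, depth_c2, depth_lcs):
    # depth_lcs: depth of the least common subsumer of the two concepts in the IS-A hierarchy,
    # with all depths counted from the root
    return 2.0 * depth_lcs / (depth_c1 + depth_c2)

For two sibling concepts at depth 3 whose least common subsumer is at depth 2, the similarity is 2*2/(3+3), about 0.67.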
Downloads: 1486
1082 Improved K-Modes for Categorical Clustering Using Weighted Dissimilarity Measure
Authors: S. Aranganayagi, K. Thangavel
Abstract:
K-Modes is an extension of the K-Means clustering algorithm developed to cluster categorical data, in which the mean is replaced by the mode. The similarity measure proposed by Huang is the simple matching or mismatching measure. The weights of attribute values contribute much to clustering; thus, in this paper we propose a new weighted dissimilarity measure for K-Modes, based on the ratio of the frequency of attribute values in the cluster to their frequency in the data set. The new weighted measure is evaluated on data sets obtained from the UCI data repository. The results are compared with K-Modes and K-representative, and show that the new measure generates clusters with high purity.
Keywords: Clustering, categorical data, K-Modes, weighted dissimilarity measure
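One plausible form of such a frequency-ratio weighting, sketched purely for illustration (the exact weighting used in the paper may differ): a mismatch costs 1, while a match on the mode value costs less the more that value is concentrated in the cluster relative to the whole data set.

def weighted_dissimilarity(record, mode, cluster, dataset):
    # record, mode: sequences of categorical attribute values
    # cluster, dataset: lists of records used to count value occurrences
    d = 0.0
    for j, (x, m) in enumerate(zip(record, mode)):
        if x != m:
            d += 1.0                                   # plain mismatch
        else:
            in_cluster = sum(r[j] == m for r in cluster)
            in_dataset = sum(r[j] == m for r in dataset)
            d += 1.0 - in_cluster / in_dataset         # cluster-concentrated matches cost less
    return d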
Downloads: 3688
1081 Automatic Map Simplification for Visualization on Mobile Devices
Authors: Hang Yu
Abstract:
The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives: through the devices' add-on location-based services, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the interesting region while having a good grasp of the surrounding context; this is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information meaningless. The proposed solution is to apply map generalization that removes excessive information around the periphery, together with an automatic smoothing process that corrects the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
Keywords: Map simplification, visualization, mobile devices.
Downloads: 1435
1080 Further the Effectiveness of Software Testability Measure
Authors: Liang Zhao, Feng Wang, Bo Deng, Bo Yang
Abstract:
Software testability has been proposed to address the problem of the increasing cost of testing and the quality of software. A testability measure provides a quantified way to denote the testability of software. Since the 1990s, many testability measure models have been proposed to address this problem. By discussing the contradiction between domain testability and the domain range ratio (DRR), a new testability measure, semantic fault distance, is proposed and its validity is discussed.
Keywords: Software testability, DRR, Domain testability.
Downloads: 2045
1079 Finite Element Analysis of Sheet Metal Airbending Using Hyperform LS-DYNA
Authors: Himanshu V. Gajjar, Anish H. Gandhi, Harit K. Raval
Abstract:
Air bending is one of the important metal forming processes because of its simplicity and wide field of application. The accuracy of the analytical and empirical models reported for the analysis of bending processes is limited by simplifying assumptions and by neglecting the effect of dynamic parameters. A number of studies report finite element analysis (FEA) of V-bending, U-bending, and air V-bending processes. FEA of bending is found to be very sensitive to many physical and numerical parameters, and FE models must be computationally efficient for practical use. The reported work shows 3D FEA of the air bending process using Hyperform LS-DYNA and its comparison with published 3D FEA results of air bending in Ansys LS-DYNA and with experimental results. Observing the planar symmetry and assuming a plane strain condition, the air bending problem was modeled in 2D with a symmetric boundary condition across the width. Stress-strain results of the 2D FEA were compared with the 3D FEA results and with experiments. Simplifying the air bending problem from 3D to 2D resulted in a tremendous reduction in solution time with only a marginal effect on the stress-strain results. FE model simplification by studying the problem's symmetry is a more efficient and practical approach for the solution of more complex, large-dimension, slow forming processes.
Keywords: Air V-bending, Finite element analysis, Hyperform LS-DYNA, Planar symmetry.
Downloads: 3208
1078 Game Skill Measure for Mixed Games
Authors: Roman V. Yampolskiy
Abstract:
Games can be classified as games of skill, games of chance, or otherwise as mixed. This paper deals with scientifically classifying mixed games as more reliant on elements of chance or on elements of skill, and with ways to measure the amount of skill involved. This is predominantly useful for classifying games as legal or illegal in different jurisdictions based on local gaming laws. We propose a novel measure of the skill-to-chance ratio called the Game Skill Measure (GSM) and utilize it to calculate the skill component of a popular variant of Poker.
Keywords: Chance, Game, Skill, Luck.
Downloads: 1462
1077 A Similarity Measure for Clustering and its Applications
Authors: Guadalupe J. Torres, Ram B. Basnet, Andrew H. Sung, Srinivas Mukkamala, Bernardete M. Ribeiro
Abstract:
This paper introduces a measure of similarity between two clusterings of the same dataset produced by two different algorithms, or even the same algorithm (K-means, for instance, with different initializations usually produces different results in clustering the same dataset). We then apply the measure to calculate the similarity between pairs of clusterings, with special interest directed at comparing the similarity between various machine clusterings and human clustering of datasets. The similarity measure can thus be used to identify the best (in terms of most similar to human) clustering algorithm for a specific problem at hand. Experimental results pertaining to the text categorization problem of a Portuguese corpus (wherein a translation-into-English approach is used) are presented, as well as results on the well-known benchmark IRIS dataset. The significance and other potential applications of the proposed measure are discussed.
Keywords: Clustering Algorithms, Clustering Applications, Similarity Measures, Text Clustering
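The measure itself is not spelled out in the abstract; as a simple, standard baseline for comparing two clusterings of the same items, a pair-counting score such as the Rand index can be sketched as follows.

from itertools import combinations

def rand_index(labels_a, labels_b):
    # labels_a, labels_b: cluster assignments of the same items under two different clusterings
    pairs = list(combinations(range(len(labels_a)), 2))
    agreements = sum((labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
                     for i, j in pairs)                # pairs treated consistently by both
    return agreements / len(pairs)

For example, rand_index([0, 0, 1, 1], [1, 1, 0, 0]) is 1.0, since the two clusterings group the items identically up to relabeling.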
Downloads: 1570
1076 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making
Authors: I. Arockiarani
Abstract:
The focus of this paper is to furnish an entropy measure for neutrosophic sets and neutrosophic soft sets, which is a measure of the uncertainty that permeates discourse and systems. Various characterizations of entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.
Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.
Downloads: 933
1075 Similarity Measure Functions for Strategy-Based Biometrics
Authors: Roman V. Yampolskiy, Venu Govindaraju
Abstract:
The functioning of a biometric system depends in large part on the performance of the similarity measure function. Frequently a generalized similarity distance measure such as the Euclidean distance or the Mahalanobis distance is applied to the task of matching biometric feature vectors. However, the accuracy of a biometric system can often be greatly improved by designing a customized matching algorithm optimized for a particular biometric application. In this paper we propose a tailored similarity measure function for behavioral biometric systems based on expert knowledge of the feature-level data in the domain. We compare the performance of the proposed matching algorithm to that of other well-known similarity distance functions and demonstrate its superiority with respect to the chosen domain.
Keywords: Behavioral Biometrics, Euclidean Distance, Matching, Similarity Measure.
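For reference, the two generalized distances mentioned above can be sketched as follows; a tailored measure would replace or re-weight these using expert knowledge of the feature-level data.

import numpy as np

def euclidean_distance(x, y):
    return float(np.linalg.norm(np.asarray(x) - np.asarray(y)))

def mahalanobis_distance(x, y, cov):
    # cov: covariance matrix of the feature vectors, e.g. estimated from enrollment samples
    diff = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))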
Downloads: 1648
1074 Control-flow Complexity Measurement of Processes and Weyuker's Properties
Authors: Jorge Cardoso
Abstract:
Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we will focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.
Keywords: Business process measurement, workflow, complexity.
Downloads: 2695
1073 Application of Data Envelopment Analysis to Assess Quality Management Efficiency
Authors: Chuen Tse Kuah, Kuan Yew Wong, Farzad Behrouzi
Abstract:
This paper aims to give an illustration of the application of Data Envelopment Analysis (DEA) as a tool to assess Quality Management (QM) efficiency. A variant of DEA, the slack-based measure (SBM), is used for this purpose. From this study, it is found that DEA is suitable for measuring QM efficiency and for giving improvement suggestions to inefficient QM.
Keywords: Quality Management, Data Envelopment Analysis, Slack Based Measure, Efficiency Measurement.
Downloads: 2088
1072 Project Complexity Indices based on Topology Features
Authors: Amer A. Boushaala
Abstract:
The heuristic decision rules used for project scheduling will vary depending upon the project's size, complexity, duration, personnel, and owner requirements. The concept of project complexity has received little detailed attention. The need to differentiate between easy and hard problem instances, and the interest in isolating the fundamental factors that determine the computing effort required by these procedures, inspired a number of researchers to develop various complexity measures. In this study, the most common measures of project complexity are presented and a new measure of project complexity is developed. The main advantage of the proposed measure is that it considers size, shape and logic characteristics, time characteristics, resource demand and availability characteristics, as well as the number of critical activities and critical paths. The sensitivity of the proposed measure to the complexity of project networks has been tested and evaluated against the other complexity measures on the fifty project networks considered in this study. The developed measure showed more sensitivity to changes in the network data and gives accurate quantified results when comparing the complexities of networks.
Keywords: Activity networks, Complexity index, Network complexity measure, Network topology, Project network.
Downloads: 1680
1071 Documents Emotions Classification Model Based on TF-IDF Weighting Measure
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Emotion classification of text documents is applied to reveal whether a document expresses a particular emotion of its writer. While different supervised methods have previously been used for emotion classification of documents, in this research we present a novel model that supports the classification algorithms with the TF-IDF measure for more accurate results. Different experiments have been applied to show the applicability of the proposed model; the model succeeds in raising the accuracy percentage, according to the chosen metrics (precision, recall, and F-measure), by refining the lexicon, integrating lexicons using different perspectives, and applying the TF-IDF weighting measure over the classifying features. The proposed model has also been compared with other research to prove its competence in raising the results' accuracy.
Keywords: Emotion detection, TF-IDF, WEKA tool, classification algorithms.
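A minimal sketch of the TF-IDF weighting applied to the classifying features is given below (the classical formulation; the lexicon refinement and integration steps of the model are not shown).

import math
from collections import Counter

def tf_idf(documents):
    # documents: list of token lists; returns one {term: weight} dictionary per document
    n = len(documents)
    df = Counter(term for doc in documents for term in set(doc))   # document frequencies
    weighted = []
    for doc in documents:
        tf = Counter(doc)
        weighted.append({term: (count / len(doc)) * math.log(n / df[term])
                         for term, count in tf.items()})
    return weighted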
Downloads: 1723
1070 Application of a Similarity Measure for Graphs to Web-based Document Structures
Authors: Matthias Dehmer, Frank Emmert Streib, Alexander Mehler, Jürgen Kilian, Max Mühlhauser
Abstract:
Due to the tremendous amount of information provided by the World Wide Web (WWW), developing methods for mining the structure of web-based documents is of considerable interest. In this paper we present a similarity measure for graphs representing web-based hypertext structures. Our similarity measure is mainly based on a novel representation of a graph as linear integer strings, whose components represent structural properties of the graph. The similarity of two graphs is then defined as the optimal alignment of the underlying property strings. In this paper we apply the well-known technique of sequence alignment to solve a novel and challenging problem: measuring the structural similarity of generalized trees. In other words, we first transform our graphs, considered as high-dimensional objects, into linear structures. Then we derive similarity values from the alignments of the property strings in order to measure the structural similarity of generalized trees. Hence, we transform a graph similarity problem into a string similarity problem in order to develop an efficient graph similarity measure. We demonstrate that our similarity measure captures important structural information by applying it to two different test sets consisting of graphs representing web-based document structures.
Keywords: Graph similarity, hierarchical and directed graphs, hypertext, generalized trees, web structure mining.
Downloads: 1890
1069 A New Approach for Controlling Overhead Traveling Crane Using Rough Controller
Authors: Mazin Z. Othman
Abstract:
This paper presents the idea of a rough controller and applies it to the control of an overhead traveling crane system. The structure of such a controller is based on a suggested concept of a fuzzy logic controller, and a measure of fuzziness in rough sets is introduced. A comparison between the fuzzy logic controller and the rough controller is demonstrated, and the results of a simulation comparing the performance of both controllers are shown. From these results we infer that the performance of the proposed rough controller is satisfactory.
Keywords: Accuracy measure, Fuzzy Logic Controller (FLC), Overhead Traveling Crane (OTC), Rough Set Theory (RST), Roughness measure
Downloads: 1702
1068 Application of Intuitionistic Fuzzy Cross Entropy Measure in Decision Making for Medical Diagnosis
Authors: Shikha Maheshwari, Amit Srivastava
Abstract:
In medical investigations, uncertainty is a major challenge for doctors/experts when making decisions to identify diseases that share a common set of symptoms, and it has been increasing extensively in medical diagnosis problems. The theory of cross entropy for intuitionistic fuzzy sets (IFS) is an effective approach for coping with uncertainty in decision making for medical diagnosis problems. The main focus of this paper is to propose a new intuitionistic fuzzy cross entropy measure (IFCEM), which aids in reducing the uncertainty so that doctors/experts can make their decisions more easily in the context of a patient's disease. It is shown that the proposed measure has some elegant properties, which demonstrates its potency. Further, the efficiency and utility of the proposed measure are exemplified in detail using a real-life case study of disease diagnosis in medical science.
Keywords: Intuitionistic fuzzy cross entropy (IFCEM), intuitionistic fuzzy set (IFS), medical diagnosis, uncertainty.
Downloads: 2045
1067 A Bootstrap's Reliability Measure on Tests of Hypotheses
Authors: Al Jefferson J. Pabelic, Dennis A. Tarepe
Abstract:
Bootstrapping has gained popularity in different tests of hypotheses as an alternative to using the asymptotic distribution when one is not sure of the distribution of the test statistic under the null hypothesis. This method, in general, has two variants: the parametric and the nonparametric approaches. However, issues about the reliability of this method arise in many applications. This paper addresses the reliability issue by establishing a reliability measure in terms of quantiles with respect to the asymptotic distribution, when this is approximately correct. The test of hypotheses used is the F-test. The simulation results show that using nonparametric bootstrapping in the F-test gives better reliability than parametric bootstrapping at relatively higher degrees of freedom.
Keywords: F-test, nonparametric bootstrapping, parametric bootstrapping, reliability measure, tests of hypotheses.
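A minimal sketch of a nonparametric bootstrap applied to a one-way F-test is given below, for illustration only (the reliability measure proposed in the paper is not reproduced); the groups are re-centred so that resampling takes place under the null hypothesis of equal means.

import numpy as np
from scipy import stats

def bootstrap_f_test(groups, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    groups = [np.asarray(g, dtype=float) for g in groups]
    observed = stats.f_oneway(*groups).statistic
    grand_mean = np.concatenate(groups).mean()
    centred = [g - g.mean() + grand_mean for g in groups]   # impose the null hypothesis
    exceed = 0
    for _ in range(n_boot):
        resamples = [rng.choice(c, size=c.size, replace=True) for c in centred]
        exceed += stats.f_oneway(*resamples).statistic >= observed
    return (exceed + 1) / (n_boot + 1)                       # bootstrap p-value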
Downloads: 1696
1066 Optimal Convolutive Filters for Real-Time Detection and Arrival Time Estimation of Transient Signals
Authors: Michal Natora, Felix Franke, Klaus Obermayer
Abstract:
Linear convolutive filters are fast to calculate and to apply, and are thus often used for real-time processing of continuous data streams. In the case of transient signals, a filter has not only to detect the presence of a specific waveform, but to estimate its arrival time as well. In this study, a measure is presented which indicates the performance of detectors in achieving both of these tasks simultaneously. Furthermore, a new sub-class of linear filters within the class of filters which minimize the quadratic response is proposed. The proposed filters are more flexible than the existing ones, such as the adaptive matched filter or the minimum power distortionless response beamformer, and prove to be superior with respect to that measure in certain settings. Simulations of a real-time scenario confirm the advantage of these filters as well as the usefulness of the performance measure.
Keywords: Adaptive matched filter, minimum variance distortionless response, beamforming, Capon beamformer, linear filters, performance measure.
Downloads: 1522
1065 Video-Based Face Recognition Based On State-Space Model
Authors: Cheng-Chieh Chiang, Yi-Chia Chan, Greg C. Lee
Abstract:
This paper proposes a video-based framework for face recognition that identifies which faces appear in a video sequence. Our basic idea is like a tracking task: to track a selection of person candidates over time according to the observed visual features of the face images in video frames. Hence, we employ a state-space model to formulate video-based face recognition by dividing the problem into two parts: the likelihood and the transition measures. The likelihood measure recognizes whose face is currently being observed in the video frames, for which two-dimensional linear discriminant analysis is employed. The transition measure estimates the probability of changing from an incorrect recognition at the previous stage to the correct person at the current stage. Moreover, extra nodes associated with head nodes are incorporated into our proposed state-space model. Experimental results are also provided to demonstrate the robustness and efficiency of our proposed approach.
Keywords: 2DLDA, face recognition, state-space model, likelihood measure, transition measure.
Downloads: 1685
1064 On Generalized Exponential Fuzzy Entropy
Authors: Rajkumar Verma, Bhu Dev Sharma
Abstract:
In the present communication, the existing measures of fuzzy entropy are reviewed and a generalized parametric exponential fuzzy entropy is defined. Our study of the four essential properties, and some other properties, of the proposed measure clearly establishes its validity as an entropy.
Keywords: fuzzy sets, fuzzy entropy, exponential entropy, exponential fuzzy entropy.
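For reference, the classical (non-parametric) exponential fuzzy entropy of Pal and Pal, which measures of this kind generalize, can be sketched as follows; the parametric form proposed in the paper is not reproduced here.

import math

def exponential_fuzzy_entropy(memberships):
    # memberships: membership degrees mu_i in [0, 1] of the elements of a fuzzy set
    n = len(memberships)
    total = sum(mu * math.exp(1 - mu) + (1 - mu) * math.exp(mu) - 1 for mu in memberships)
    return total / (n * (math.sqrt(math.e) - 1))   # 0 for crisp sets, 1 when all mu_i = 0.5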
Downloads: 2855
1063 Composite Programming for Electric Passenger Car Selection in Multiple Criteria Decision Making
Authors: C. Ardil
Abstract:
This paper discusses the use of the composite programming method to identify the optimum electric passenger automobile in multiple criteria decision making. With the composite programming approach, a set of alternatives is compared using an optimality measure that gauges how far each of them is from the optimum solution. Some key factors (range, battery, engine, maximum speed, acceleration) that customers should consider when purchasing an electric passenger car for daily use are discussed. A numerical illustration is provided to demonstrate the validity and applicability of the proximity measure approach.
Keywords: electric passenger car selection, multiple criteria decision making, proximity measure method, composite programming, entropic weight method
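A minimal sketch of the distance-to-ideal comparison behind composite programming is given below, under illustrative assumptions: all criteria are already normalized to [0, 1] with larger being better, a single level of criteria is used, and the weights and example scores are invented for the illustration (in the paper the weights come from an entropic weighting method).

def composite_distance(alternative, ideal, weights, p=2.0):
    # alternative, ideal: dicts of normalized criterion scores; weights: dict summing to 1
    return sum(w * abs(ideal[c] - alternative[c]) ** p
               for c, w in weights.items()) ** (1.0 / p)

cars = {
    "car_a": {"range": 0.9, "battery": 0.8, "max_speed": 0.6, "acceleration": 0.7},
    "car_b": {"range": 0.7, "battery": 0.9, "max_speed": 0.8, "acceleration": 0.5},
}
ideal = {criterion: 1.0 for criterion in ["range", "battery", "max_speed", "acceleration"]}
weights = {"range": 0.4, "battery": 0.3, "max_speed": 0.15, "acceleration": 0.15}
best = min(cars, key=lambda name: composite_distance(cars[name], ideal, weights))
# the alternative closest to the ideal point is selected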
Downloads: 331