Search results for: information entropy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4073

4073 Normalization and Constrained Optimization of Measures of Fuzzy Entropy

Authors: K.C. Deshmukh, P.G. Khot, Nikhil

Abstract:

In the information-theoretic literature there is a need to compare different measures of fuzzy entropy, and this in turn gives rise to the need to normalize such measures. In this paper we discuss this need and develop some normalized measures of fuzzy entropy. Since it is also desirable to maximize entropy and to minimize directed divergence or distance, we explain how the different measures of fuzzy entropy can be optimized with this in mind.

Keywords: Fuzzy set, Uncertainty, Fuzzy entropy, Normalization, Membership function
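
As a concrete illustration of the kind of normalization discussed in the abstract (not the authors' specific measures), the sketch below computes the classical De Luca-Termini fuzzy entropy of a finite fuzzy set and a normalized version scaled to lie in [0, 1]; the membership values are hypothetical.

```python
import numpy as np

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set given by membership values mu."""
    mu = np.asarray(mu, dtype=float)
    # 0*log(0) is taken as 0, so clip safely away from 0 and 1 before taking logs.
    m = np.clip(mu, 1e-12, 1 - 1e-12)
    return float(-np.sum(m * np.log(m) + (1 - m) * np.log(1 - m)))

def normalized_fuzzy_entropy(mu):
    """Divide by the maximum attainable value n*ln(2) (all memberships equal to 0.5)."""
    return fuzzy_entropy(mu) / (len(mu) * np.log(2))

mu = [0.1, 0.5, 0.8, 0.3, 0.9]            # hypothetical membership grades
print(normalized_fuzzy_entropy(mu))        # 1.0 would mean maximal fuzziness
print(normalized_fuzzy_entropy([0.5] * 5)) # exactly 1.0
```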

4072 Information Entropy of Isospectral Hydrogen Atom

Authors: Anil Kumar, C. Nagaraja Kumar

Abstract:

The position-space and momentum-space information entropies of the hydrogen atom are evaluated exactly. Using the isospectral Hamiltonian approach, a family of isospectral potentials is constructed having the same energy eigenvalues as the original potential. The information entropy content is obtained in position space as well as in momentum space. It is shown that the information entropy content of each level can be re-arranged as a function of the deformation parameter.

Keywords: Information Entropy, BBM inequality, Isospectral Potential.
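
For a flavor of what such a calculation involves (for the ground state only, in atomic units, and without reproducing the paper's isospectral analysis), the following sketch numerically evaluates the position-space Shannon entropy S_r = -∫ ρ ln ρ d³r of the hydrogen 1s density:

```python
import numpy as np

# Hydrogen 1s state in atomic units: psi(r) = exp(-r)/sqrt(pi), density rho = |psi|^2.
r = np.linspace(1e-6, 40.0, 400_001)
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi

# S_r = -integral of rho*ln(rho) d^3r, with d^3r = 4*pi*r^2 dr for a spherically symmetric density.
integrand = -rho * np.log(rho) * 4.0 * np.pi * r**2
S_r = np.sum(integrand) * dr

print(S_r)   # close to the analytic value 3 + ln(pi) ~ 4.1447 for the ground state
```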

4071 On Generalized Exponential Fuzzy Entropy

Authors: Rajkumar Verma, Bhu Dev Sharma

Abstract:

In the present communication, the existing measures of fuzzy entropy are reviewed and a generalized parametric exponential fuzzy entropy is defined. Our study of the four essential properties, together with some others, clearly establishes the validity of the proposed measure as an entropy.

Keywords: fuzzy sets, fuzzy entropy, exponential entropy, exponential fuzzy entropy.

4070 Information Theoretical Analysis of Neural Spiking Activity with Temperature Modulation

Authors: Young-Seok Choi

Abstract:

This work assesses cortical and sub-cortical neural activity recorded from rodents using entropy- and mutual-information-based approaches to study how hypothermia affects neural activity. By applying multi-scale entropy and Shannon entropy, we quantify the degree of regularity embedded in the cortical and sub-cortical neurons and characterize how the entropy of these regions depends on temperature. We also study how the mutual information along the thalamocortical pathway depends on temperature. The latter is most likely an indicator of coupling between these highly connected structures in response to temperature manipulation leading to arousal after global cerebral ischemia.

Keywords: Spiking activity, entropy, mutual information, temperature modulation.
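
As a minimal sketch of the entropy and mutual-information estimates mentioned above (plug-in histogram estimators on hypothetical spike-count sequences, not the authors' processing pipeline):

```python
import numpy as np

def shannon_entropy(counts):
    """Plug-in Shannon entropy (in bits) from a histogram of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a 2-D histogram of paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    hx = shannon_entropy(joint.sum(axis=1))
    hy = shannon_entropy(joint.sum(axis=0))
    hxy = shannon_entropy(joint.ravel())
    return hx + hy - hxy

rng = np.random.default_rng(0)
cortex = rng.poisson(5.0, 2000)                       # hypothetical spike counts per bin
thalamus = cortex + rng.poisson(2.0, 2000)            # coupled to the cortical counts
print(mutual_information(cortex, thalamus))           # > 0 for coupled trains
print(mutual_information(cortex, rng.permutation(thalamus)))  # ~ 0 after shuffling
```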

4069 Extending the Quantum Entropy to Multidimensional Signal Processing

Authors: Youssef Khmou, Said Safi, Miloud Frikel

Abstract:

This paper treats different aspects of entropy measures in classical information theory and statistical quantum mechanics, and presents the possibility of extending the definition of von Neumann entropy to image and array processing. In the first part, we generalize the quantum entropy using the singular values of arbitrary rectangular matrices to measure randomness and the quality of a denoising operation; this new definition of entropy can be used to compare the performance of filtering methods. In the second part, we apply the concept of a pure state in the quantum formalism to generalize the maximum entropy method for the narrowband, far-field source localization problem. Several computer simulation results are presented to demonstrate the effectiveness of the proposed techniques.

Keywords: Von Neumann entropy, Filtering, array, DoA, Maximum Entropy Method.
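
One common way to carry the von Neumann idea over to an arbitrary rectangular matrix (a sketch consistent with the abstract's description, not necessarily the authors' exact normalization) is to treat the normalized singular values as a probability-like spectrum:

```python
import numpy as np

def svd_entropy(A, eps=1e-12):
    """Entropy of the normalized singular-value spectrum of a rectangular matrix A."""
    s = np.linalg.svd(np.asarray(A, dtype=float), compute_uv=False)
    p = s / (s.sum() + eps)          # normalize so the spectrum sums to 1
    p = p[p > eps]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(1)
clean = np.outer(rng.normal(size=64), rng.normal(size=64))   # rank-1 "image": low entropy
noisy = clean + 0.5 * rng.normal(size=(64, 64))              # added noise raises the entropy
print(svd_entropy(clean), svd_entropy(noisy))
```

A filtering method can then be scored by how far it pulls this entropy back down toward that of the clean reference.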

4068 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of this paper is to furnish entropy measures for neutrosophic sets and neutrosophic soft sets; such a measure quantifies the uncertainty that permeates the discourse and the system. Various characterizations of the entropy measures are derived. Further, we exemplify the concept by applying the entropy to several real-time decision-making problems.

Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.

4067 Entropy Analysis in a Bubble Column Based on Ultrafast X-Ray Tomography Data

Authors: Stoyan Nedeltchev, Markus Schubert

Abstract:

By means of the ultrafast X-ray tomography facility, data were obtained at different superficial gas velocities UG in a bubble column (0.1 m in ID) operated with an air-deionized water system at ambient conditions. Raw reconstructed images were treated by both the information entropy (IE) and the reconstruction entropy (RE) algorithms in order to identify the main transition velocities in a bubble column. The IE values exhibited two well-pronounced minima at UG=0.025 m/s and UG=0.085 m/s identifying the boundaries of the homogeneous, transition and heterogeneous regimes. The RE extracted from the central region of the column’s cross-section exhibited only one characteristic peak at UG=0.03 m/s, which was attributed to the transition from the homogeneous to the heterogeneous flow regime. This result implies that the transition regime is non-existent in the core of the column.

Keywords: Bubble column, ultrafast X-ray tomography, information entropy, reconstruction entropy.

4066 Assessing Complexity of Neuronal Multiunit Activity by Information Theoretic Measure

Authors: Young-Seok Choi

Abstract:

This paper provides a quantitative measure of time-varying multiunit neuronal spiking activity using an entropy-based approach. To recover the state embedded in the activity of a population of neurons, the discrete wavelet transform (DWT) is used to isolate the inherent spiking activity of the multiunit activity (MUA). Owing to the de-correlating property of the DWT, the spiking activity is preserved while the non-spiking component is reduced. By evaluating the entropy of the wavelet coefficients of the de-noised MUA, a multiresolution Shannon entropy (MRSE) of the MUA signal is developed. The proposed entropy was tested on both simulated noisy MUA and actual MUA recorded from the cortex in a rodent model. Simulation and experimental results demonstrate that the dynamics of a neuronal population can be quantified using the proposed entropy.

Keywords: Discrete wavelet transform, Entropy, Multiresolution, Multiunit activity.
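
A rough sketch of a wavelet-domain Shannon entropy in the spirit of the MRSE described above (using PyWavelets and relative sub-band energies; the wavelet, decomposition level and test signal are illustrative assumptions, not the authors' settings):

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Shannon entropy of the relative energy distribution across DWT sub-bands."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c**2) for c in coeffs])
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(2)
t = np.arange(4096) / 1000.0
structured = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
print(wavelet_entropy(structured))
print(wavelet_entropy(rng.normal(size=t.size)))   # white noise typically scores higher
```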

4065 Complexity of Multivalued Maps

Authors: David Sherwell, Vivien Visaya

Abstract:

We consider the topological entropy of maps that, in general, cannot be described by one-dimensional dynamics. In particular, we show that for a multivalued map F generated by single-valued maps, the topological entropy of any of the single-valued maps bounds the topological entropy of F from below.

Keywords: Multivalued maps, Topological entropy, Selectors

4064 Assessment of Multiscale Information for Short Physiological Time Series

Authors: Young-Seok Choi

Abstract:

This paper presents a multiscale information measure of the electroencephalogram (EEG) suited to analysis with short data lengths. A multiscale extension of permutation entropy (MPE) is capable of fully reflecting the dynamical characteristics of the EEG across different temporal scales. However, MPE yields imprecise estimates because of the coarse-graining procedure at large scales. We present an improved MPE measure that estimates entropy more accurately from a short time series. By computing the entropies of all coarse-grained time series at each scale and averaging them, we obtain the modified MPE (MMPE), which provides enhanced accuracy compared to MPE. Simulation and experimental studies confirm that MMPE outperforms MPE in terms of accuracy.

Keywords: Multiscale entropy, permutation entropy, EEG, seizure.
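
A compact sketch of the averaging idea described in the abstract (permutation entropy computed on every offset coarse-grained series at each scale and then averaged; the embedding dimension, scales and test signal are illustrative assumptions):

```python
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, m=3):
    """Normalized permutation entropy with embedding dimension m and delay 1."""
    patterns = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - m + 1):
        key = tuple(int(v) for v in np.argsort(x[i:i + m]))
        patterns[key] += 1
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

def modified_mpe(x, scale, m=3):
    """Average the permutation entropies of all `scale` offset coarse-grained series."""
    x = np.asarray(x, dtype=float)
    entropies = []
    for offset in range(scale):
        n = (len(x) - offset) // scale
        coarse = x[offset:offset + n * scale].reshape(n, scale).mean(axis=1)
        entropies.append(permutation_entropy(coarse, m))
    return float(np.mean(entropies))

rng = np.random.default_rng(3)
eeg_like = np.cumsum(rng.normal(size=1000))          # short, correlated test series
print([round(modified_mpe(eeg_like, s), 3) for s in (1, 2, 4, 8)])
```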

4063 Identification of the Main Transition Velocities in a Bubble Column Based on a Modified Shannon Entropy

Authors: Stoyan Nedeltchev, Markus Schubert

Abstract:

The gas holdup fluctuations in a bubble column (0.15 m in ID) have been recorded by means of a conductivity wire-mesh sensor in order to extract information about the main transition velocities. These parameters are very important for bubble column design, operation and scale-up. For this purpose, the classical definition of the Shannon entropy was modified and used to identify both the onset (at UG=0.034 m/s) of the transition flow regime and the beginning (at UG=0.089 m/s) of the churn-turbulent flow regime. The results were compared with the Kolmogorov entropy (KE) results. A slight discrepancy was found, namely the transition velocities identified by means of the KE were shifted to somewhat higher (0.045 and 0.101 m/s) superficial gas velocities UG.

Keywords: Bubble column, gas holdup fluctuations, Modified Shannon entropy, Kolmogorov entropy.

4062 Reasons for Non-Applicability of Software Entropy Metrics for Bug Prediction in Android

Authors: Arvinder Kaur, Deepti Chopra

Abstract:

Software entropy metrics for bug prediction have been validated on various software systems by different researchers. In our previous research, we validated that software entropy metrics calculated for a Mozilla subsystem predict future bugs reasonably well. In this study, the software entropy metrics are calculated for a subsystem of Android, and it is observed that these metrics are not suitable for bug prediction there. The results are compared with those for a Mozilla subsystem, and a comparison is made between the two software systems to determine the reasons why software entropy metrics are not applicable to Android.

Keywords: Android, bug prediction, mining software repositories, Software Entropy.
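
The "software entropy" in this line of work is commonly computed in the style of Hassan's entropy of source-code changes: the Shannon entropy of how a period's changes are spread across files. A minimal sketch with hypothetical change counts (not data from Android or Mozilla):

```python
import numpy as np

def change_entropy(changes_per_file):
    """Shannon entropy (normalized to [0, 1]) of a period's change distribution over files."""
    c = np.asarray(changes_per_file, dtype=float)
    c = c[c > 0]
    p = c / c.sum()
    h = -np.sum(p * np.log2(p))
    return float(h / np.log2(len(c))) if len(c) > 1 else 0.0

# Hypothetical numbers of commits touching each file during one release period.
focused_period = [40, 2, 1, 1]        # churn concentrated in one file -> low entropy
scattered_period = [11, 10, 12, 11]   # churn spread over many files -> entropy near 1
print(change_entropy(focused_period), change_entropy(scattered_period))
```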

4061 Analysis of EEG Signals Using Wavelet Entropy and Approximate Entropy: A Case Study on Depression Patients

Authors: Subha D. Puthankattil, Paul K. Joseph

Abstract:

Analyzing the brain signals of patients suffering from depression may lead to interesting observations of signal parameters that are quite different from those of normal controls. The present study adopts two different methods for the analysis of EEG signals acquired from depression patients and age- and sex-matched normal controls: a time-frequency domain method and a nonlinear method. The time-frequency domain analysis is realized using wavelet entropy, while approximate entropy is employed for the nonlinear analysis. Wavelet entropy and approximate entropy thus reveal the ability of the signal processing technique and the nonlinear method to differentiate the physiological aspects of the brain state.

Keywords: EEG, Depression, Wavelet entropy, Approximate entropy, Relative Wavelet energy, Multiresolution decomposition.
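
A short sketch of the approximate entropy (ApEn) statistic used in the nonlinear analysis, in its standard form with illustrative parameters m = 2 and r = 0.2·SD (the EEG-specific preprocessing of the paper is not reproduced):

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Standard ApEn(m, r) with r = r_factor * std(x) and Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])          # embedded vectors
        # C_i = fraction of vectors within distance r of vector i (self-match included).
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return float(phi(m) - phi(m + 1))

rng = np.random.default_rng(4)
t = np.arange(1000) / 100.0
print(approximate_entropy(np.sin(2 * np.pi * t)))        # regular signal: low ApEn
print(approximate_entropy(rng.normal(size=1000)))        # white noise: higher ApEn
```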

4060 Mean Codeword Lengths and Their Correspondence with Entropy Measures

Authors: R. K. Tuli

Abstract:

The objective of the present communication is to develop new genuine exponentiated mean codeword lengths and to study in depth the problem of correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we illustrate such a correspondence. In the literature we frequently come across inequalities that are used in information theory; keeping this in mind, we derive such inequalities via a coding-theoretic approach.

Keywords: Codeword, Code alphabet, Uniquely decipherable code, Mean codeword length, Uncertainty, Noiseless channel
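
One standard instance of the correspondence between entropy and mean codeword length is Shannon's bound H(P) ≤ L < H(P) + 1 for the code lengths l_i = ⌈-log2 p_i⌉. A small numerical check (the distribution is illustrative; the paper's exponentiated lengths are not reproduced here):

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])            # illustrative source distribution
lengths = np.ceil(-np.log2(p))                      # Shannon code lengths l_i = ceil(-log2 p_i)

H = -np.sum(p * np.log2(p))                         # source entropy in bits
L = np.sum(p * lengths)                             # mean codeword length
kraft = np.sum(2.0 ** (-lengths))                   # Kraft sum must not exceed 1

print(H, L, kraft)                                  # here H = L = 1.75 and the Kraft sum = 1
assert H <= L < H + 1 and kraft <= 1 + 1e-12
```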

4059 Codes and Formulation of Appropriate Constraints via Entropy Measures

Authors: R. K. Tuli

Abstract:

In the present communication, we develop suitable constraints for a given mean codeword length and the measures of entropy. This development proves that Renyi's entropy gives the minimum value of the log of the harmonic mean and the log of the power mean. We also develop an important relation between the best 1:1 code and the uniquely decipherable code by using different measures of entropy.

Keywords: Codeword, Instantaneous code, Prefix code, Uniquely decipherable code, Best one-one code, Mean codeword length

4058 Entropy Generation for Natural Convection in a Darcy-Brinkman Porous Cavity

Authors: Ali Mchirgui, Nejib Hidouri, Mourad Magherbi, Ammar Ben Brahim

Abstract:

The paper provides a numerical investigation of entropy generation due to natural convection in an inclined square porous cavity. The coupled equations of mass, momentum, energy and species conservation are solved using the control volume finite-element method. The effect of medium permeability and inclination angle on entropy generation is analysed. It was found that, depending on the Darcy number and the porous thermal Rayleigh number, the entropy generation can be due mainly to heat transfer or to fluid friction irreversibility, and that entropy generation reaches extremum values for specific inclination angles.

Keywords: Porous media, entropy generation, convection, numerical method.

4057 An Alternative Proof for the Topological Entropy of the Motzkin Shift

Authors: Fahad Alsharari, Mohd Salmi Md Noorani

Abstract:

A Motzkin shift is a mathematical model for constraints on genetic sequences. In terms of the theory of symbolic dynamics, the Motzkin shift is nonsofic, and therefore we cannot use Perron-Frobenius theory to calculate its topological entropy. The Motzkin shift M(M,N), which comes from language theory, is defined to be the shift system over an alphabet A that consists of N negative symbols, N positive symbols and M neutral symbols. For an x in the full shift, x is in the Motzkin subshift M(M,N) if and only if every finite block appearing in x has a non-zero reduced form. Therefore, the constraint on x cannot be bounded in length. K. Inoue has shown that the entropy of the Motzkin shift M(M,N) is log(M + N + 1). In this paper, a new direct method of calculating the topological entropy of the Motzkin shift is given without any measure-theoretical discussion.

Keywords: Motzkin shift, topological entropy.

4056 Entropy Based Data Hiding for Document Images

Authors: Swetha Kurup, Sridhar G., Sridhar V.

Abstract:

In this paper we present a novel technique for data hiding in binary document images. We use the concept of entropy to identify the least distortive areas specific to the document throughout the binary document image. The document image is treated like any other image, and the proposed method utilizes standard document characteristics for the embedding process. The proposed method minimizes the perceptual distortion due to embedding and allows watermark extraction without requiring any side information at the decoder end.

Keywords: Entropy, Steganography, Watermarking.

4055 Maximum Entropy Based Image Segmentation of Human Skin Lesion

Authors: Sheema Shuja Khattak, Gule Saman, Imran Khan, Abdus Salam

Abstract:

Image segmentation plays an important role in medical imaging applications; therefore, accurate methods are needed for the successful segmentation of medical images for the diagnosis and detection of various diseases. In this paper, we have used maximum entropy to achieve image segmentation, with the maximum entropy calculated using the Shannon, Renyi and Tsallis entropies. The novelty of this work lies in detecting the skin lesion caused by the bite of the sand fly, which causes the disease cutaneous leishmaniasis.

Keywords: Shannon, Maximum entropy, Renyi, Tsallis entropy.
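
A minimal sketch of Shannon-based maximum-entropy thresholding in the style of Kapur's method (one global threshold from a grayscale histogram; the Renyi/Tsallis variants and the lesion data are not reproduced, and the synthetic image is a stand-in):

```python
import numpy as np

def max_entropy_threshold(image):
    """Return the gray level t that maximizes H(background) + H(foreground)."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0 or pf == 0:
            continue
        b = p[:t][p[:t] > 0] / pb                  # class-conditional distributions
        f = p[t:][p[t:] > 0] / pf
        h = -np.sum(b * np.log(b)) - np.sum(f * np.log(f))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

rng = np.random.default_rng(5)
img = rng.normal(60, 10, (128, 128))               # synthetic dark background
img[:32, :32] = rng.normal(180, 15, (32, 32))      # synthetic bright "lesion" patch
img = np.clip(img, 0, 255).astype(np.uint8)
t = max_entropy_threshold(img)
print(t, (img >= t).mean())                        # threshold falls between the two modes
```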

4054 Application of the Statistical Conditional Entropy Function for the Definition of Cause-and-Effect Relations during Primary Soil Formation

Authors: Vladimir K. Mukhomorov

Abstract:

Within the framework of information theory, a statistical-probabilistic model is offered for the definition of cause-and-effect relations in coupled multicomponent subsystems. A quantitative parameter, defined through conditional and unconditional entropy functions, is introduced. The method is applied to the analysis of experimental data on the dynamics of change of the chemical-element composition of plant organs (roots, reproductive organs, leaves and stems). The experiment is directed at studying the temporal processes of primary soil formation and their connection with the dynamics of the redistribution of chemical elements among plant organs. This statistical-probabilistic model also allows the directions of the information streams between plant organs to be specified quantitatively and unambiguously.

Keywords: Chemical elements, entropy function, information, plants.
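
The conditional entropy at the heart of such an analysis can be estimated directly from a contingency table of joint observations. A minimal sketch with a hypothetical table, not the paper's data:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint_counts):
    """H(Y|X) = H(X,Y) - H(X) for a contingency table with rows indexed by X."""
    joint = np.asarray(joint_counts, dtype=float)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1)
    return entropy(p_xy.ravel()) - entropy(p_x)

# Hypothetical counts: rows = states of subsystem X, columns = states of subsystem Y.
table = np.array([[30, 5, 1],
                  [4, 25, 6],
                  [1, 7, 21]])
h_y_given_x = conditional_entropy(table)
h_y = entropy(table.sum(axis=0) / table.sum())
print(h_y_given_x, h_y)     # H(Y|X) < H(Y): knowing X reduces the uncertainty about Y
```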

4053 Entropic Measures of a Probability Sample Space and Exponential Type (α, β) Entropy

Authors: Rajkumar Verma, Bhu Dev Sharma

Abstract:

Entropy is a key measure in studies related to information theory and its many applications. Campbell was the first to recognize that the exponential of Shannon's entropy is just the size of the sample space when the distribution is uniform. The idea here is to study exponentials of Shannon's entropy and of those other entropy generalizations that involve a logarithmic function, for a probability distribution in general. In this paper, we introduce a measure of sample space, called the 'entropic measure of a sample space', with respect to the underlying distribution. It is shown, in both the discrete and continuous cases, that this new measure depends on the parameters of the distribution on the sample space - the same sample space having different entropic measures depending on the distributions defined on it. It is noted that Campbell's idea also applies to Rényi's parametric entropy of a given order. Since parameters play a role in providing suitable choices and extended applications, the paper studies parametric entropic measures of sample spaces as well. Exponential entropies related to Shannon's entropy and to those generalizations that have logarithmic functions, i.e. are additive, have been studied for wider understanding and applications. We propose and study exponential entropies corresponding to the non-additive entropies of type (α, β), which include the Havrda and Charvát entropy as a special case.

Keywords: Sample space, Probability distributions, Shannon's entropy, Rényi's entropy, Non-additive entropies.
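
Campbell's observation is easy to check numerically: exp(H) equals the number of outcomes for a uniform distribution and shrinks as the distribution concentrates, and the same exponentiation applies to Rényi's entropy. A small sketch with illustrative distributions only:

```python
import numpy as np

def shannon(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)   # requires alpha != 1

uniform = np.full(8, 1 / 8)
skewed = np.array([0.6, 0.2, 0.1, 0.05, 0.02, 0.01, 0.01, 0.01])

print(np.exp(shannon(uniform)))          # 8.0: the full size of the sample space
print(np.exp(shannon(skewed)))           # < 8: the "effective" size under this distribution
print(np.exp(renyi(skewed, alpha=2)))    # the Rényi order-2 analogue, also an effective count
```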

4052 Generalized Maximum Entropy Method for Cosmic Source Localization

Authors: Youssef Khmou, Said Safi, Miloud Frikel

Abstract:

The maximum entropy principle in spectral analysis has been used as an estimator of the direction of arrival (DoA) of electromagnetic or acoustic sources impinging on an array of sensors; indeed, the maximum entropy operator is very efficient when the signals of the radiating sources are ergodic, complex, zero-mean random processes, which is the case for cosmic sources. In this paper, we present a basic review of the maximum entropy method (MEM), which consists of a rank-one operator that is not a projector, and we elaborate a new operator which is full rank and is the sum of all possible projectors. Two-dimensional simulation results based on Monte Carlo trials demonstrate the resolution power of the new operator in cases where the MEM exhibits erroneous fluctuations.

Keywords: Maximum entropy, Cosmic source, Localization, operator, projector, azimuth, elevation, DoA, circular array.
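
For context, a sketch of the classical rank-one maximum entropy DoA spectrum for a uniform linear array is given below; the paper's full-rank sum-of-projectors operator is not reproduced, and the array geometry, source angles and noise level are illustrative assumptions.

```python
import numpy as np

def mem_spectrum(R, n_sensors, angles_deg):
    """Classical MEM pseudo-spectrum 1/|e1^H R^-1 a(theta)|^2 for a half-wavelength ULA."""
    w = np.linalg.inv(R)[:, 0]                         # first column of R^-1: a rank-one operator
    k = np.arange(n_sensors)
    a = np.exp(1j * np.pi * np.outer(k, np.sin(np.deg2rad(angles_deg))))
    return 1.0 / np.abs(w.conj() @ a) ** 2

rng = np.random.default_rng(6)
M, snapshots = 8, 500
true_doas = [-20.0, 25.0]                              # hypothetical source directions
k = np.arange(M)
A = np.exp(1j * np.pi * np.outer(k, np.sin(np.deg2rad(true_doas))))
S = (rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))) / np.sqrt(2)
N = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = A @ S + N
R = X @ X.conj().T / snapshots

angles = np.arange(-90.0, 90.0, 0.5)
P = mem_spectrum(R, M, angles)
peaks = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
top = sorted(peaks, key=lambda i: P[i])[-2:]
print(sorted(angles[i] for i in top))                  # should land near the true DoAs
```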

4051 A New Approach to Image Segmentation via Fuzzification of Rényi Entropy of Generalized Distributions

Authors: Samy Sadek, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis, Usama Sayed

Abstract:

In this paper, we propose a novel approach to image segmentation via fuzzification of the Rényi entropy of generalized distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of the image and to locate the optimal threshold desired by the segmentation. The proposed approach draws upon the postulate that the optimal threshold coincides with the maximum information content of the distribution. The contributions of the paper are as follows: initially, the fuzzy REGD is introduced as a measure of the spatial structure of an image; then, we propose an efficient entropic segmentation approach using the fuzzy REGD. Although the proposed approach belongs to the family of entropic segmentation approaches, which are commonly applied to grayscale images, it is adapted to be viable for segmenting color images as well. Lastly, diverse experiments on real images are carried out, showing the superior performance of the proposed method.

Keywords: Entropy of generalized distributions, entropy fuzzification, entropic image segmentation.

4050 Generalized Measures of Fuzzy Entropy and their Properties

Authors: K.C. Deshmukh, P.G. Khot, Nikhil

Abstract:

In the present communication, we propose some new generalized measures of fuzzy entropy based upon real parameters, discuss their desirable properties, and present these measures graphically. An important property, namely the monotonicity of the proposed measures, has also been studied.

Keywords: Fuzzy numbers, Fuzzy entropy, Characteristic function, Crisp set, Monotonicity.

4049 An Improved C-Means Model for MRI Segmentation

Authors: Ying Shen, Weihua Zhu

Abstract:

Medical images are important for identifying different diseases; for example, magnetic resonance imaging (MRI) can be used to investigate the brain, spinal cord, bones, joints, breasts, blood vessels, and heart. In medical image analysis, image segmentation is usually the first step, finding regions with similar color, intensity or texture so that the diagnosis can be further carried out based on these features. This paper introduces an improved C-means model to segment MRI images. The model is based on information entropy and evaluates the segmentation results by achieving global optimization. The contributions are significant in two respects. Firstly, a genetic algorithm (GA) is used to achieve global optimization in this model, which the fuzzy C-means clustering algorithm (FCMA) alone is not capable of. Secondly, the information entropy after segmentation is used for measuring the effectiveness of the MRI image processing. Experimental results show that the proposed model outperforms traditional approaches.

Keywords: Magnetic Resonance Image, C-means model, image segmentation, information entropy.

4048 Thermodynamic Analyses of Information Dissipation along the Passive Dendritic Trees and Active Action Potential

Authors: Bahar Hazal Yalçınkaya, Bayram Yılmaz, Mustafa Özilgen

Abstract:

Brain information transmission in the neuronal network occurs in the form of electrical signals. The neural network transmits information between neurons, or between neurons and target cells, by moving charged particles in a voltage field; a fraction of the energy utilized in this process is dissipated via entropy generation. Exergy loss and entropy generation models demonstrate the inefficiencies of the communication along the dendritic trees. In this study, neurons of four different animals were analyzed with a one-dimensional cable model with N=6 identical dendritic trees and M=3 orders of symmetrical branching. Each branch bifurcates symmetrically in accordance with the 3/2 power law in an infinitely long cylinder with the usual core conductor assumptions, where the membrane potential is conserved in the core conductor at all branching points. In the model, exergy loss and entropy generation rates are calculated for each branch of equivalent cylinders of electrotonic length (L) ranging from 0.1 to 1.5 for four different dendritic branches: the input branch (BI), the sister branch (BS) and two cousin branches (BC-1 and BC-2). Thermodynamic analysis with data from two different cat motoneuron studies shows that in both experiments nearly the same amount of exergy is lost while nearly the same amount of entropy is generated. The guinea pig vagal motoneuron loses twice as much exergy as the cat models, and the squid exergy loss and entropy generation are nearly tenfold those of the guinea pig vagal motoneuron model. The thermodynamic analysis shows that the energy dissipated in the dendritic trees is directly proportional to the electrotonic length, as are the exergy loss and entropy generation. Entropy generation and exergy loss vary not only between vertebrates and invertebrates but also within the same class. Additionally, the Na+ ion load of a single action potential, the metabolic energy utilization and its thermodynamic aspects are considered for the squid giant axon and the mammalian motoneuron models. The energy demand of the neurons is supplied in the form of adenosine triphosphate (ATP), and the exergy destruction and entropy generation upon ATP hydrolysis are calculated. ATP utilization, exergy destruction and entropy generation differ in each model depending on the variations in ion transport along the channels.

Keywords: ATP utilization, entropy generation, exergy loss, neuronal information transmittance.

4047 Entropy Generation Analysis of Free Convection Film Condensation on a Vertical Ellipsoid with Variable Wall Temperature

Authors: Sheng-An Yang, Ren-Yi Hung, Ying-Yi Ho

Abstract:

This paper performs a thermodynamic second-law analysis of the laminar film condensation of pure saturated vapor flowing in the direction of gravity over an ellipsoid with variable wall temperature. The analysis provides an understanding of how the geometric parameter (ellipticity) and the non-isothermal wall temperature variation amplitude A affect entropy generation during the film-wise condensation heat transfer process. To identify which irreversibilities are involved in this condensation process, we derive an expression for the entropy generation number in terms of the ellipticity and A. The results indicate that entropy generation increases with ellipticity. Furthermore, the irreversibility due to finite-temperature-difference heat transfer dominates over that due to condensate film flow friction, and the local entropy generation rate decreases with increasing A over the upper half of the ellipsoid, while it is enhanced with A around the rear lower half of the ellipsoid.

Keywords: Free convection, Non-isothermal, Thermodynamic second law, Entropy, Ellipsoid.

4046 A Family of Entropies on Interval-valued Intuitionistic Fuzzy Sets and Their Applications in Multiple Attribute Decision Making

Authors: Min Sun, Jing Liu

Abstract:

The entropy of intuitionistic fuzzy sets is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. Firstly, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures for IvIFSs defined independently by Zhang and by Wei, and we prove that the new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision-making situations where the alternatives on the attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the applications of the proposed method.

Keywords: Interval-valued intuitionistic fuzzy sets, interval-valued intuitionistic fuzzy entropy, multiple attribute decision making
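
For orientation, the sketch below shows the classical crisp entropy-weight construction that such entropy-based MADM methods build on: attributes whose values differ more across alternatives receive larger weights. The paper's IvIFS entropy family is not reproduced, and the decision matrix is hypothetical.

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Attribute weights from the Shannon entropy of the column-normalized decision matrix."""
    X = np.asarray(decision_matrix, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                             # normalize each attribute (column)
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -np.sum(P * logP, axis=0) / np.log(n)         # entropy of each attribute, in [0, 1]
    d = 1.0 - e                                        # degree of diversification
    return d / d.sum()

# Hypothetical ratings of 4 alternatives on 3 benefit attributes.
X = np.array([[7.0, 5.0, 9.0],
              [6.9, 8.0, 3.0],
              [7.1, 6.0, 6.0],
              [7.0, 9.0, 2.0]])
print(entropy_weights(X))   # attribute 1 (nearly identical ratings) receives almost no weight
```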

4045 Computing Entropy for Ortholog Detection

Authors: Hsing-Kuo Pao, John Case

Abstract:

Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences - in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data - better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion: the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.

Keywords: compression, decision tree, entropy, ortholog, ROC.
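
A common computable stand-in for Kolmogorov-complexity similarity is a compression-based distance such as the normalized compression distance (NCD). The sketch below uses zlib and illustrates the generic compression approach, including its weakness on very short sequences that the paper addresses; it does not reproduce the paper's conditional-entropy attributes, and the sequences are hypothetical.

```python
import zlib

def c(data: bytes) -> int:
    """Compressed size as a crude stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: smaller means more similar."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical sequences: a, a mutated copy of a, and an unrelated sequence.
a = b"ATGGCGTACGTTAGC" * 40
b = a.replace(b"TACG", b"TACC")
unrelated = b"GGGTTTAAACCCGGA" * 40
print(ncd(a, b), ncd(a, unrelated))        # the related pair should score lower

# On very short sequences the compressor overhead dominates and the signal degrades:
print(ncd(b"ATGGCG", b"ATGGCG"), ncd(b"ATGGCG", b"CCCTTT"))
```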

4044 On the Optimality Assessment of Nanoparticle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nanoparticles under the influence of an electric field in an Electrical Mobility Spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by the flow conditions, the geometry, the electric field and the particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced into the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain the particle trajectories in the device and thereby calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles, which transfer their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to the electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer the information content of the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, according to Shannon, is the 'average amount of information contained in an event, sample or character extracted from a data stream'. Evaluating the responses (signals) obtained via various configurations of detecting rings, the configuration that gave the best predictions of the size distributions of the injected particles was the modified configuration; it was also the one with the maximum amount of entropy, and a reasonable consistency was observed between the accuracy of the predictions and the entropy content of each configuration. In this method, the entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced into the simulations and the predicted size distributions were compared to the exact size distributions.

Keywords: Aerosol Nano-Particle, CFD, Electrical Mobility Spectrometer, Von Neumann entropy.
