Mean Codeword Lengths and Their Correspondence with Entropy Measures

Authors: R. K. Tuli

Abstract:

The objective of the present communication is to develop new exponentiated mean codeword lengths and to study in depth the problem of correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we illustrate such a correspondence. The literature contains many inequalities that are frequently used in information theory; keeping this in mind, we develop such inequalities via a coding-theoretic approach.
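The correspondence described above can be illustrated numerically. The following Python sketch (our illustration, not code from the paper) builds a binary Huffman code for a small probability distribution and checks three classical relations that underlie this line of work: the Kraft inequality, Shannon's noiseless-coding bounds H(P) ≤ L < H(P) + 1, and Campbell's bound relating the exponentiated mean codeword length of order t to the Rényi entropy of order α = 1/(1 + t):

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Build a binary Huffman code and return one codeword length per symbol."""
    # Heap entries: (probability, unique tiebreak id, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every symbol in a merged node gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_code_lengths(probs)          # -> [1, 2, 3, 3]

entropy = -sum(p * math.log2(p) for p in probs)        # Shannon entropy H(P)
mean_len = sum(p * l for p, l in zip(probs, lengths))  # mean codeword length L
kraft = sum(2.0 ** -l for l in lengths)                # Kraft sum

assert kraft <= 1.0 + 1e-12                 # Kraft inequality
assert entropy <= mean_len < entropy + 1    # noiseless coding theorem bounds

# Campbell's exponentiated mean length of order t vs. Rényi entropy
# of order alpha = 1/(1+t): L(t) >= H_alpha(P) for any Kraft code.
t = 1.0
alpha = 1.0 / (1.0 + t)
campbell_len = (1.0 / t) * math.log2(
    sum(p * 2.0 ** (t * l) for p, l in zip(probs, lengths)))
renyi = (1.0 / (1.0 - alpha)) * math.log2(sum(p ** alpha for p in probs))
assert campbell_len >= renyi - 1e-12

print(f"H = {entropy:.4f}, L = {mean_len:.4f}, "
      f"L(t=1) = {campbell_len:.4f}, H_0.5 = {renyi:.4f}")
```

For this distribution the Huffman code attains L = 1.9 bits against H(P) ≈ 1.846 bits, and the order-1 exponentiated length exceeds the order-½ Rényi entropy, as Campbell's theorem requires. The generalized lengths the paper develops refine exactly this kind of correspondence.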

Keywords: Codeword, Code alphabet, Uniquely decipherable code, Mean codeword length, Uncertainty, Noiseless channel

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1327873


References:


[1] Arimoto, S. (1971): "Information-theoretical considerations on estimation problems", Information and Control, 19, 181-194.
[2] Bhattacharya, A. (1943): "On a measure of divergence between two statistical populations defined by their probability distributions", Bulletin of the Calcutta Mathematical Society, 35, 99-109.
[3] Behara, M. and Chawla, J. S. (1974): "Generalized γ-entropy", Selecta Statistica Canadiana, 2, 15-38.
[4] Burg, J. P. (1972): "The relationship between maximum entropy spectra and maximum likelihood spectra", Modern Spectral Analysis, Childers, D. G. (ed.), pp. 130-131.
[5] Campbell, L. L. (1965): "A coding theorem and Rényi's entropy", Information and Control, 8, 423-429.
[6] Guiasu, S. and Picard, C. F. (1971): "Borne inférieure de la longueur utile de certains codes", Comptes Rendus de l'Académie des Sciences Paris, 273, 248-251.
[7] Havrda, J. H. and Charvát, F. (1967): "Quantification method of classification processes: Concept of structural α-entropy", Kybernetika, 3, 30-35.
[8] Kapur, J. N. (1986): "Four families of measures of entropy", Indian Journal of Pure and Applied Mathematics, 17, 429-449.
[9] Kapur, J. N. (1995): "Measures of Information and Their Applications", Wiley Eastern, New York.
[10] Kraft, L. G. (1949): "A device for quantizing, grouping, and coding amplitude modulated pulses", M.S. Thesis, Electrical Engineering Department, MIT.
[11] Longo, G. (1972): "Quantitative-Qualitative Measures of Information", Springer-Verlag, New York.
[12] Rényi, A. (1961): "On measures of entropy and information", Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, 1, 547-561.
[13] Shannon, C. E. (1948): "A mathematical theory of communication", Bell System Technical Journal, 27, 379-423, 623-659.
[14] Sharma, B. D. and Mittal, D. P. (1975): "New non-additive measures of entropy for discrete probability distributions", Journal of Mathematical Sciences, 10, 28-40.
[15] Sharma, B. D. and Mittal, D. P. (1977): "New non-additive measures of relative information", Journal of Combinatorics, Information and System Sciences, 2, 122-132.
[16] Varma, R. S. (1966): "Generalizations of Rényi's entropy of order α", Journal of Mathematical Sciences, 1, 34-48.