Codes and Formulation of Appropriate Constraints via Entropy Measures

Authors: R. K. Tuli

Abstract:

In the present communication, we develop suitable constraints for the given mean codeword length and the measures of entropy. This development shows that Renyi's entropy gives the minimum value of the logarithm of the harmonic mean and of the logarithm of the power mean. We also establish an important relation between the best 1:1 code and the uniquely decipherable code by using different measures of entropy.
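For context, a minimal sketch of the standard quantities underlying these results, following the cited works of Renyi (1961) and Campbell (1965); the notation (probabilities p_i, codeword lengths l_i, code alphabet size D, order alpha) is introduced here for illustration and is not taken from the paper itself:

\[
H_\alpha(P) = \frac{1}{1-\alpha}\,\log_D\!\Big(\sum_{i} p_i^{\alpha}\Big), \qquad \alpha > 0,\ \alpha \neq 1 \quad \text{(Renyi entropy of order } \alpha\text{)}
\]
\[
\sum_{i} D^{-l_i} \le 1 \quad \text{(Kraft inequality, satisfied by every uniquely decipherable code)}
\]
\[
L_t = \frac{1}{t}\,\log_D\!\Big(\sum_{i} p_i\, D^{t\, l_i}\Big), \qquad L_t \ \ge\ H_\alpha(P) \ \ \text{with} \ \ \alpha = \tfrac{1}{1+t} \quad \text{(Campbell's exponentiated mean length and bound)}
\]

In this setting, constraints on the mean codeword length translate into lower bounds expressed through such generalized means of the codeword lengths.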

Keywords: Codeword, Instantaneous code, Prefix code, Uniquely decipherable code, Best one-one code, Mean codeword length

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1329330


References:


[1] Campbell, L. L. (1965): "A coding theorem and Renyi's entropy", Information and Control, 8, 423-429.
[2] Cheng, J. and Huang, T. K. (2006): "New lower and upper bounds on the expected length of optimal one-to-one codes", Proceedings of the Data Compression Conference, 43-52.
[3] Cheng, J., Huang, T. K. and Weidmann, C. (2007): "New bounds on the expected length of optimal one-to-one codes", IEEE Trans. Inform. Theory, 53(5), 1884-1895.
[4] Feinstein, A. (1958): "Foundations of Information Theory", McGraw-Hill, New York.
[5] Kapur, J.N. (1995): "Measures of Information and Their Applications", Wiley Eastern, New York.
[6] Kraft, L. G. (1949): "A Device for Quantizing Grouping and Coding Amplitude Modulated Pulses", M.S. Thesis, Electrical Engineering Department, MIT.
[7] Leung-Yan-Cheong, S. K. and Cover, T. M. (1978): "Some equivalences between Shannon entropy and Kolmogorov complexity", IEEE Trans. Inform. Theory, 24, 331-338.
[8] Renyi, A. (1961): "On measures of entropy and information", Proceedings 4th Berkeley Symposium on Mathematical Statistics and Probability, 1, 547-561.
[9] Rissanen, J. (1982): "Tight lower bounds for optimum code length", IEEE Trans. Inform. Theory, 28(2), 348-349.
[10] Savari, S. A. and Naheta, A. (2004): "Bounds on the expected cost of one-to-one codes", Proc. IEEE Int. Symp. Information Theory, 94.
[11] Shannon, C. E. (1948): "A mathematical theory of communication", Bell System Tech. J., 27, 379-423.