Acta Scientific Computer Sciences

Review Article | Volume 3, Issue 10

A Profile of the Generalized Information Measures in Information Theory

D S Hooda1* and M S Barak2

1Honorary Professor in Mathematics, GJ University of Science and Technology, India
2Department of Mathematics, IG University, Meerpur, Rewari, India

*Corresponding Author: D S Hooda, Honorary Professor in Mathematics, GJ University of Science and Technology, India.

Received: August 11, 2021; Published: September 29, 2021

Abstract

Information theory deals mainly with entropy (the measure of information), communication and cryptography. A review of information measures, together with their historical development, is therefore an important contribution to information theory. This article presents the concept of Shannon's entropy and its properties, and describes the application of entropy to coding with an example. Various generalizations of entropy proposed by different authors are enumerated, and the 'useful' information measure is defined together with its generalizations. The measures of directed divergence and J-divergence are discussed, and the 'useful' relative information and J-divergence measures are also described. A conclusion and discussion are given at the end.

Keywords: Entropy; Information Measure; 'Useful' Information Measure; Directed Divergence; J-divergence
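Because the abstract refers to Shannon's entropy, the Belis-Guiasu 'useful' information measure, directed divergence and J-divergence, a minimal computational sketch of these classical quantities is given below. It assumes only the standard textbook definitions H(P) = -Σ p_i log p_i, H(P;U) = -Σ u_i p_i log p_i, D(P‖Q) = Σ p_i log(p_i/q_i) and J(P,Q) = D(P‖Q) + D(Q‖P); the function names and the example distributions are illustrative choices, not taken from the article.

import math

def shannon_entropy(p, base=2):
    # Shannon entropy H(P) = -sum p_i log p_i, with 0 log 0 taken as 0.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def useful_entropy(p, u, base=2):
    # Belis-Guiasu 'useful' entropy H(P;U) = -sum u_i p_i log p_i,
    # where u_i > 0 is the utility attached to the i-th outcome.
    return -sum(ui * pi * math.log(pi, base) for pi, ui in zip(p, u) if pi > 0)

def directed_divergence(p, q, base=2):
    # Kullback-Leibler directed divergence D(P||Q) = sum p_i log(p_i / q_i);
    # assumes q_i > 0 wherever p_i > 0.
    return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q, base=2):
    # Symmetric J-divergence J(P, Q) = D(P||Q) + D(Q||P).
    return directed_divergence(p, q, base) + directed_divergence(q, p, base)

# Example (hypothetical data): a three-outcome source, a uniform reference
# distribution and arbitrary utilities.
P = [0.5, 0.25, 0.25]
Q = [1/3, 1/3, 1/3]
U = [3.0, 1.0, 1.0]
print(shannon_entropy(P))        # 1.5 bits
print(useful_entropy(P, U))
print(directed_divergence(P, Q))
print(j_divergence(P, Q))

With base 2 the values are in bits; Shannon's noiseless coding theorem guarantees that no uniquely decodable code for P can have average codeword length below H(P), which is the sense in which entropy enters the coding example mentioned in the abstract.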


Citation

Citation: D S Hooda and M S Barak. "A Profile of the Generalized Information Measures in Information Theory". Acta Scientific Computer Sciences 3.10 (2021).

Copyright

Copyright: © 2021 D S Hooda and M S Barak. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.



