A Profile of the Generalized Information Measures in Information Theory
D S Hooda1* and M S Barak2
1Honorary Professor in Mathematics, GJ University of Science and Technology, India
2Department of Mathematics, IG University, Meerpur, Rewari, India
*Corresponding Author: D S Hooda, Honorary Professor in Mathematics, GJ University of Science and Technology, India.
Received: August 11, 2021; Published: September 29, 2021
Information theory deals mainly with entropy (the information measure), communication, and cryptography, and a review of information measures together with their historical development is an important input to the field. The concept of Shannon's entropy and its properties are presented, and the application of entropy to coding is illustrated with an example. Generalizations of entropy proposed by various authors are enumerated, and the 'useful' information measure is defined along with its generalizations. The measures of directed divergence and J-divergence are discussed, and the 'useful' relative information and J-divergence measures are also described. Conclusions and a discussion are given at the end.
Keywords: Entropy; Information Measure; 'Useful' Information Measure; Directed Divergence; J-Divergence.
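The central quantities surveyed in this review (Shannon's entropy, the Kullback-Leibler directed divergence, and the symmetric J-divergence) can be illustrated with a minimal numerical sketch; the function names below are our own illustrative choices, not notation from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(P||Q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Symmetric J-divergence: J(P, Q) = D(P||Q) + D(Q||P)."""
    return directed_divergence(p, q) + directed_divergence(q, p)

p = [0.5, 0.25, 0.25]        # a sample probability distribution
q = [1 / 3, 1 / 3, 1 / 3]    # the uniform distribution on 3 outcomes

print(shannon_entropy(p))    # 1.5 bits
print(shannon_entropy(q))    # log2(3) bits: the uniform case maximizes entropy
print(j_divergence(p, q))    # strictly positive since p != q
```

Note that entropy is maximized by the uniform distribution, and the directed divergence (hence the J-divergence) vanishes exactly when the two distributions coincide, two of the basic properties reviewed in the text.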
- Aczel J and Nath P. “Axiomatic characterizations of some measures of divergence in information”. Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete 3 (1972): 215-224.
- Aggarwal N L and Picard C F. “Functional equations and information measures with preference”. Kybernetika 3 (1978): 174-181.
- Arimoto S. “Information-theoretical considerations on estimation problems”. Information and Control 19.3 (1971): 181-194.
- Behara M and Chawla J S. “Generalized gamma-entropy”. Selecta Statistica Canadiana 2 (1974): 15-38.
- Belis M and Guiasu S. “A quantitative-qualitative measure of information in cybernetic systems”. IEEE Transactions on Information Theory 14.4 (1968): 593-594.
- Bhaker U and Hooda D S. “Mean value characterization of ‘useful’ information measure”. Tamkang Journal of Mathematics 4 (1993): 383-394.
- Boekee D E and Van der Lubbe J C. “The R-norm information measure”. Information and Control 45.2 (1980): 136-155.
- Burbea J and Rao C R. “Entropy differential metric, distance and divergence measures in probability spaces: A unified approach”. Journal of Multivariate Analysis 12.4 (1982): 575-596.
- De Luca A and Termini S. “A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory”. Information and Control 20.4 (1972): 301-312.
- Emptoz H. “Beta-type information integrating a concept of utility” (1976).
- Guiasu S. “Information Theory with Applications”. McGraw-Hill, Mass (1976).
- Guiasu S and Picard C F. “Borne inférieure de la longueur de certains codes” [Lower bound on the length of certain codes]. C.R. Académie des Sciences Paris 273 (1971): 248-251.
- Hartley R V. “Transmission of information”. Bell System Technical Journal 7.3 (1928): 535-563.
- Havrda J and Charvat F. “Quantification method of classification processes: Concept of structural α-entropy”. Kybernetika 3 (1967): 30-35.
- Hooda D S and Kumar P. “Generalised measures of useful directed divergence and information improvement with applications”. Defence Science Journal 2 (2004): 125.
- Hooda D S and Ram A. “Characterization of generalized R-norm entropy”. Caribbean Journal of Mathematical and Computer Science 8 (1998): 18-31.
- Hooda D S and Sharma D K. “Generalized R-norm information measures”. Journal of Applied Mathematics, Statistics and Informatics 4 (2008): 153-168.
- Hooda D S and Singh U. “On useful information generating functions”. Statistica 46 (1986): 528-535.
- Hooda D S and Tuteja R K. “Two generalized measures of “useful” information”. Information Sciences 1 (1981): 11-24.
- Hooda D S and Tuteja R K. “On Characterization of non-additive Measures of Relative Information and Inaccuracy”. Bulletin of the Calcutta Mathematical Society 77 (1985): 363-369.
- Hooda D S and Kumar A. “The mean and the variance of an MLE of generalized “useful” information”. The Mathematics Student 49 (1981): 263-269.
- Jain P and Tuteja R K. “An Axiomatic Characterization of Relative ‘Useful’ Information”. Journal of Information and Optimization Sciences 1 (1986): 49-57.
- Jaynes E T. “Information theory and statistical mechanics”. Physical Review 106.4 (1957): 620-630.
- Kapur J N. “Generalized entropy of order α and type β”. The Mathematics Seminar 4 (1967): 78-82.
- Kapur J N. “On some applications of dynamic programming to information theory”. In Proceedings of the Indian Academy of Sciences-Section A 67.1 (1968): 1-11.
- Kapur J N. “Four families of measures of entropy”. Indian Journal of Pure and Applied Mathematics 17 (1986a): 429-449.
- Kapur J N. “New generalized measure of entropy and directed divergence”. Research Report, Indian Institute of Technology, Kanpur 341 (1986b).
- Kullback S. “Information Theory and Statistics”. Courier Corporation, USA (1959).
- Kullback S and Leibler R A. “On information and sufficiency”. The Annals of Mathematical Statistics 22.1 (1951): 79-86.
- Kumar P and Hooda D S. “On generalized measures of entropy and dependence”. Mathematica Slovaca (2008).
- Kumar S., et al. “On ‘Useful’ R-norm Relative Information and J-Divergence Measures”. International Journal of Pure and Applied Mathematics 77 (2012): 3349-358.
- Kvålseth T O. “The relative useful information measure: Some comments”. Information Sciences 56.1-3 (1991): 35-38.
- Longo G. “Quantitative-qualitative measure of information”. New York, NY, USA: Springer (1972).
- Nyquist H. “Certain factors affecting telegraph speed”. Transactions of the American Institute of Electrical Engineers 43 (1924): 412-422.
- Pal N R and Pal S K. “A review on image segmentation techniques”. Pattern Recognition 26.9 (1993): 1277-1294.
- Patni G C and Jain K C. “On axiomatic characterization of some non-additive measures of information”. Metrika 1 (1977): 23-34.
- Prakash O. “Characterization of information measure and information transmission”. Ph.D. Thesis, G.N.D. University, Amritsar, India (1988).
- Rathie P N and Kannappan P. “A directed-divergence function of type β”. Information and Control 20.1 (1972): 38-45.
- Rathie P N and Sheng L T. “The J-divergence of order α”. Journal of Combinatorics, Information and System Sciences 6 (1981): 197-205.
- Rényi A. “On measures of entropy and information”. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics (1961): 547-561.
- Shannon C E. “A mathematical theory of communication”. The Bell System Technical Journal 27.3 (1948): 379-423.
- Sharma B D and Autar R. “Relative information functions and their type (α, β) generalizations”. Metrika 1 (1974): 41-50.
- Sharma B D and Mittal D P. “New non-additive measures of relative information”. Journal of Combinatorics Information and System Sciences 4 (1977): 122-132.
- Sharma B D., et al. “On measures of “useful” information”. Information and Control 39.3 (1978): 323-336.
- Sharma B D and Taneja I J. “Three generalized-additive measures of entropy”. Elektronische Informationsverarbeitung und Kybernetik 7/8 (1977): 419-433.
- Taneja H C and Tuteja R K. “Characterization of a quantitative-qualitative measure of relative information”. Information Sciences 3 (1984): 217-222.
- Taneja I J. “On generalized information measures and their applications”. In Advances in Electronics and Electron Physics 76 (1989): 327-413.
- Theil H. “Economics and Information Theory”. North-Holland, Amsterdam (1967).
- Theil H and Uribe P. “The information approach to the aggregation of input-output tables”. The Review of Economics and Statistics (1967): 451-462.
- Zadeh L A. “Electrical engineering at the crossroads”. IEEE Transactions on Education 2 (1965): 30-33.