New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application


Authors

K. C. Jain - Department of Mathematics, Malaviya National Institute of Technology, Jaipur (Rajasthan), India
P. Chhabra - Department of Mathematics, Malaviya National Institute of Technology, Jaipur (Rajasthan), India


Abstract

In this work, we first introduce a new information divergence measure, characterize it, and obtain its mathematical relations with other divergence measures. We then establish new information inequalities for the new generalized f-divergence measure in terms of the well-known one-parametric generalized divergence. As an application of these inequalities, we obtain bounds on the new divergence and on the Relative J-divergence by using the Logarithmic power mean and the Identric mean, together with numerical verification for two discrete probability distributions: Binomial and Poisson. Finally, approximate relations of the new divergence and of the Relative J-divergence with the Chi-square divergence are obtained.
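The paper's new divergence measure is not reproduced in this abstract, so the following is only an illustrative sketch of the kind of numerical verification described: it evaluates two of the classical divergences mentioned, the Chi-square divergence χ²(P,Q) = Σᵢ (pᵢ − qᵢ)²/qᵢ and one common form of the Relative J-divergence, Σᵢ (pᵢ − qᵢ) ln((pᵢ + qᵢ)/(2qᵢ)), between a Binomial and a Poisson distribution truncated to a common finite support. The distribution parameters and function names here are assumptions chosen for illustration, not values taken from the paper.

```python
import math


def binomial_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)


def poisson_pmf(lam, k):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)


def chi_square_divergence(P, Q):
    """Chi-square divergence: sum_i (p_i - q_i)^2 / q_i."""
    return sum((p - q) ** 2 / q for p, q in zip(P, Q))


def relative_j_divergence(P, Q):
    """Relative J-divergence (standard literature form):
    sum_i (p_i - q_i) * ln((p_i + q_i) / (2 * q_i))."""
    return sum((p - q) * math.log((p + q) / (2 * q)) for p, q in zip(P, Q))


# Illustrative choice (assumption): Binomial(10, 0.3) vs Poisson(3),
# both restricted to the support {0, ..., 10} and renormalized so that
# each is a proper discrete probability distribution.
n, p, lam = 10, 0.3, 3.0
support = range(n + 1)
P = [binomial_pmf(n, p, k) for k in support]
Q = [poisson_pmf(lam, k) for k in support]
p_total, q_total = sum(P), sum(Q)
P = [x / p_total for x in P]
Q = [x / q_total for x in Q]

print("Chi-square divergence :", chi_square_divergence(P, Q))
print("Relative J-divergence :", relative_j_divergence(P, Q))
```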


Share and Cite

ISRP Style

K. C. Jain, P. Chhabra, New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application, Journal of Mathematics and Computer Science, 15 (2015), no. 1, 1-22

AMA Style

Jain K. C., Chhabra P., New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application. J Math Comput SCI-JM. (2015); 15(1):1-22

Chicago/Turabian Style

Jain, K. C., Chhabra, P. "New Information Inequalities in Terms of One Parametric Generalized Divergence Measure and Application." Journal of Mathematics and Computer Science, 15, no. 1 (2015): 1-22

