Two New Measures of Fuzzy Divergence and Their Properties

Several measures of directed divergence and their corresponding measures of fuzzy divergence are available in the existing literature. In this paper, two new measures of fuzzy divergence are developed and their desirable properties are discussed.


Introduction
The concept of distance has proved to be very important for its applications to science and engineering. Naturally, attempts were made to extend the concept of distance to problems in economics, sociology, psychology, linguistics, genetics, biology, etc. It was, however, soon realized that the concept of distance would need modification for such applications. To explain the need for a new concept of distance in the social, economic, statistical, business and management sciences, we consider the following typical problems in these sciences:
1. Find the distance between the intellectual attainments of two students whose proportions of grades A, B, C, D, E are $(p_1, p_2, p_3, p_4, p_5)$ and $(q_1, q_2, q_3, q_4, q_5)$, respectively.
2. Find a measure for the variation in the intellectual attainments of a class of m students when the proportions of grades of the ith student are given by $(q_{i1}, q_{i2}, \ldots, q_{in})$.
3. Find the distance between the income distributions of two countries whose proportions of persons in n income groups are $(p_1, p_2, \ldots, p_n)$ and $(q_1, q_2, \ldots, q_n)$.
4. Find the distance between the balance sheets of two companies, or between the balance sheets of the same company in two different years.
5. Find a measure for the improvement in income distribution when the income distribution $(q_1, q_2, \ldots, q_n)$ is changed to $(r_1, r_2, \ldots, r_n)$ by some government measures, the ideal distribution being assumed to be $(p_1, p_2, \ldots, p_n)$.
6. Find a measure for comparing the distribution of industry or poverty in different regions of a country.
7. Find a measure of genetic distance between two breeds of animals or between two species of plants.
A measure D(P:Q) of divergence, distance, cross-entropy or directed divergence, which has found deep applications in many disciplines, can be defined as the discrepancy of a probability distribution P from another probability distribution Q. In some sense, it measures the distance of P from Q. The most useful measure of directed divergence is due to Kullback and Leibler (1951). It is given by
$$D(P:Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}.$$
Distance measure is a term that describes the difference between fuzzy sets. Liu (1992) gave an axiomatic definition of distance measure and discussed the relationships between distance measure and fuzzy entropy. Distance measure can be considered as a dual concept of similarity measure. Many researchers, such as Yager (1979), Kosko (1992) and Kaufmann (1975), have used distance measures to define fuzzy entropy. Liu (1992) extended Yager's formula to give a general relationship between distance measure (or similarity measure) and fuzzy entropy and obtained some important conclusions. Zadeh (1968) introduced the concept of fuzzy sets, in which imprecise knowledge can be used to define an event. Using the concept of fuzzy message conditioning, a fuzzy distance measure between two fuzzy sets has been suggested by Bhandari and Pal (1993).
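The Kullback-Leibler directed divergence above can be sketched in a few lines of Python (a minimal sketch; the function name and the example distributions are ours, not from the original):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(P:Q) = sum_i p_i * log(p_i / q_i).

    Terms with p_i = 0 contribute 0 by the convention 0 * log 0 = 0;
    every q_i is assumed to be positive.
    """
    assert len(p) == len(q)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(P:Q) is non-negative and vanishes if and only if P = Q.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # strictly positive, since P differs from Q
print(kl_divergence(p, p))  # 0.0
```

Note that D(P:Q) is not symmetric in P and Q, which is why symmetric variants such as D(P:Q) + D(Q:P) are also studied.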
Bhandari and Pal (1993) introduced a measure corresponding to Kullback and Leibler's (1951) probabilistic directed divergence, given by
$$D(A:B) = \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \frac{\mu_A(x_i)}{\mu_B(x_i)} + \left(1 - \mu_A(x_i)\right) \log \frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \right],$$
where $\mu_A(x_i)$ gives the degree of belongingness of the element $x_i$ to the set A. The symmetric fuzzy divergence between two fuzzy sets A and B is given by
$$E(A, B) = D(A:B) + D(B:A).$$
Since there are non-exponential models, innovation diffusion models, epidemic models, and a variety of models in economics, the social sciences, biology and even the physical sciences, we need a variety of information measures for each field to extend the scope of their applications. Hence the development of new generalized parametric and non-parametric measures of divergence is important for fuzzy distributions. In this paper, we develop such measures of fuzzy directed divergence.
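The Bhandari-Pal fuzzy directed divergence and its symmetric form can be illustrated as follows (a minimal sketch; the function names and the example membership vectors are ours, and memberships of B are assumed to lie strictly inside (0, 1) so that no logarithm is undefined):

```python
import math

def fuzzy_divergence(mu_a, mu_b):
    # Bhandari-Pal fuzzy directed divergence D(A:B); mu_a[i] and mu_b[i]
    # are the membership grades of element x_i in fuzzy sets A and B.
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        if a > 0:
            total += a * math.log(a / b)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - b))
    return total

def symmetric_fuzzy_divergence(mu_a, mu_b):
    # E(A, B) = D(A:B) + D(B:A)
    return fuzzy_divergence(mu_a, mu_b) + fuzzy_divergence(mu_b, mu_a)

mu_a = [0.2, 0.7, 0.5]
mu_b = [0.3, 0.6, 0.5]
print(fuzzy_divergence(mu_a, mu_b))           # D(A:B) > 0 since A differs from B
print(symmetric_fuzzy_divergence(mu_a, mu_b)) # E(A, B) = E(B, A)
```

When every membership grade is 0 or 1 the fuzzy sets reduce to crisp sets and D(A:B) reduces to a sum of probabilistic Kullback-Leibler terms over the two-point distributions $(\mu_A(x_i), 1-\mu_A(x_i))$ and $(\mu_B(x_i), 1-\mu_B(x_i))$.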
In Section 2, we propose two new measures of fuzzy directed divergence and prove their validity. Some desirable properties of these measures are also studied. In Section 3, the proposed measures are generalized.

Conclusion
It has been observed that there are redundancies and overlaps in similar situations which, if removed, can increase the efficiency of the process. The development of new parametric and non-parametric measures of information will reduce uncertainty and consequently make the process more efficient. Thus the development of new generalized probabilistic and fuzzy measures of divergence is necessary to gain as much insight as possible into various physical situations. Many such fuzzy divergence measures can be generated for different situations.
