This nonparametric estimator allows the designer to choose any entropy order α
and any kernel function K. For the special choice of quadratic entropy (α = 2) and
Gaussian kernels, equation (2.5) reduces to the estimator defined by Principe et al.
[Pri00], except for a change in kernel size.
$$H_2(X) = -\log V_2(X)$$

$$V_2(X) = \frac{1}{N^2} \sum_{j=1}^{N} \sum_{i=1}^{N} G_{\sigma\sqrt{2}}\left(p_j - p_i\right) \qquad (2.6)$$
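To make the computation concrete, the following Python sketch evaluates the estimator in (2.6) for a one-dimensional sample; the function names are hypothetical, and the code is an illustration of the formula rather than an implementation from [Pri00].

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """One-dimensional Gaussian kernel G_sigma(u)."""
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def quadratic_renyi_entropy(samples, sigma):
    """Estimate H_2(X) = -log V_2(X) from 1-D samples, as in (2.6).

    V_2(X) is the mean of all N^2 pairwise kernel evaluations; the
    kernel size sigma*sqrt(2) reflects the change in kernel size
    noted in the text.
    """
    p = np.asarray(samples, dtype=float)
    diffs = p[:, None] - p[None, :]          # N x N matrix of p_j - p_i
    v2 = gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()
    return -np.log(v2)

# Example usage on a standard normal sample
rng = np.random.default_rng(0)
print(quadratic_renyi_entropy(rng.standard_normal(500), sigma=0.5))
```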
It is interesting to point out that, due to the mathematical properties of the Gaussian
kernel, the expression in (2.6) is obtained without any of the sample approximations
required in (2.5). It must also be noted here that, in contrast to Shannon's entropy,
Renyi's entropy is much easier to estimate directly from the data samples.
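The property in question can be made explicit. Writing the Parzen density estimate as $\hat{f}_X(x) = \frac{1}{N}\sum_{i=1}^{N} G_{\sigma}(x - p_i)$ (notation introduced here for illustration), the quadratic information potential evaluates exactly, because the convolution of two Gaussians of size $\sigma$ is a Gaussian of size $\sigma\sqrt{2}$:

$$V_2(X) = \int_{-\infty}^{\infty} \hat{f}_X^2(x)\, dx = \frac{1}{N^2} \sum_{j=1}^{N} \sum_{i=1}^{N} \int_{-\infty}^{\infty} G_{\sigma}(x - p_j)\, G_{\sigma}(x - p_i)\, dx = \frac{1}{N^2} \sum_{j=1}^{N} \sum_{i=1}^{N} G_{\sigma\sqrt{2}}(p_j - p_i).$$

No sample-mean approximation is needed at any step, which is why (2.6) holds exactly and why the kernel size changes from $\sigma$ to $\sigma\sqrt{2}$.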
2.1.2 Information Particles
The definition of Renyi's quadratic entropy given in (2.6) has a very interesting
physical interpretation. For simplicity, let us assume that we are dealing with
one-dimensional random variables, the extension to the multivariate case being trivial. If the
data samples $p_i$ are regarded as physical particles, each emanating a potential field, then
they can be called information particles (IPCs) [Pri00]. Then $V_2(X)$ in (2.6) can be
viewed as the total potential energy due to the pairwise interactions of the IPCs.
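To illustrate this interpretation, a brief sketch (hypothetical names, same assumptions as the snippet above) decomposes $V_2(X)$ into per-particle contributions, each being the average Gaussian interaction of one particle with all particles:

```python
import numpy as np

def particle_potentials(samples, sigma):
    """Average Gaussian interaction of each particle p_j with all
    particles p_i (self-interaction included, as in (2.6))."""
    p = np.asarray(samples, dtype=float)
    d = p[:, None] - p[None, :]              # pairwise differences p_j - p_i
    # Gaussian of size sigma*sqrt(2), written out explicitly
    v = np.exp(-d**2 / (4.0 * sigma**2)) / (2.0 * np.sqrt(np.pi) * sigma)
    return v.mean(axis=1)                    # one potential per particle

# The total potential V_2(X) is the mean of the per-particle potentials:
# V2 = particle_potentials(samples, sigma).mean()
```

Averaging these per-particle potentials recovers the total potential energy $V_2(X)$ of the sample.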
If the potential field generated by each IPC is $v(\xi)$, then this function is
required to be continuous and differentiable (except possibly at the origin), and to satisfy
the even-symmetry condition $v(\xi) = v(-\xi)$. With these definitions, it can be observed that
the potential energy, or information potential (IP), of particle $p_j$ due to particle $p_i$ is
$V(p_j, p_i) = v(p_j - p_i)$. The total potential energy of $p_j$ due to all other particles is then given by