%DKL2 Kullback-Leibler divergence
%
%   D = DKL2(DATA,PROTO)
%
% INPUT
%   DATA   a dataset
%   PROTO  the dataset with prototypes (representation set)
%
% OUTPUT
%   D      dataset with a distance matrix
%
% DESCRIPTION
% The Kullback-Leibler divergence (p. 151 in Pattern Recognition,
% Theodoridis; p. 176 in the 2nd edition). Each row of DATA and PROTO is
% treated as a histogram and normalized to a discrete probability
% distribution; a row p of DATA and a row q of PROTO are compared by the
% symmetric form
%
%   d(p,q) = sum_k (p_k - q_k) * log(p_k / q_k),
%
% where bins with a non-finite log ratio contribute zero.
%
% $Id: dkl2.m,v 1.2 2005/03/02 13:39:45 serguei Exp $

function d = dkl2(data,proto)

fprintf(' dkl2:');
if size(data,2) ~= size(proto,2)
   error('Both datasets must have the same feature size');
end

sc = size(data,1);    % number of objects in DATA
pc = size(proto,1);   % number of prototypes in PROTO
d  = zeros(sc,pc);

labproto = getlab(proto);

% keep the original dataset to restore the PRTools structure later;
% +DATA extracts the raw data matrix
data2 = data;
data  = +data;
proto = +proto;

% normalization: we consider a histogram to be a set of probability
% estimates, so every row is scaled to sum to one
data  = data ./repmat(sum(data,2), [1,size(data,2)]);
proto = proto./repmat(sum(proto,2),[1,size(proto,2)]);

step = max(1,round(pc/10));   % progress indicator interval
for i = 1:pc
   t = repmat(proto(i,:),[sc,1]);

   % suppress log-of-zero warnings; bins with a non-finite log ratio
   % (zero counts in either histogram) are removed from the sum
   w = warning;
   warning('off');
   l = log(t./data);
   warning(w);
   l(~isfinite(l)) = 0;

   % symmetric Kullback-Leibler divergence of each object to prototype i
   d(:,i) = sum((t-data).*l,2);
   if mod(i,step) == 0, fprintf('.'); end
end

d = setdat(data2,d);
d = setfeatlab(d,labproto);

return
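
% EXAMPLE
% A minimal usage sketch, assuming PRTools is on the path; the datasets
% A and R below are hypothetical random histograms (in PRTools 5 the
% constructor is PRDATASET; older versions use DATASET):
%
%   a = prdataset(rand(20,16));   % 20 histograms with 16 bins each
%   r = prdataset(rand(5,16));    % 5 prototype histograms
%   d = dkl2(a,r);                % 20 x 5 dataset of divergences
%
% Because the measure is symmetrised, d(p,q) = d(q,p) for raw
% probability vectors:
%
%   p = [0.2 0.3 0.5]; q = [0.1 0.6 0.3];
%   sum((p-q).*log(p./q))         % equals sum((q-p).*log(q./p))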