Inheritance diagram for nipy.neurospin.clustering.gmm:
Gaussian Mixture Model class: contains the basic fields and methods of GMMs. The high-level functions are/should be bound in C.
Author: Bertrand Thirion, 2006-2009
Bases: nipy.neurospin.clustering.gmm.GMM
This class implements Bayesian diagonal GMMs (prec_type = 1). Besides the standard fields of GMMs, this class contains the following fields:
- prior_centers : array of shape (k, dim), the prior on the components' means
- prior_precision : array of shape (k, dim), the prior on the components' precisions
- prior_dof : array of shape (k), the prior on the dof (should be at least equal to dim)
- prior_mean_scale : array of shape (k), scaling factor of the prior precision on the mean
- prior_weights : array of shape (k), the prior on the components' weights
- mean_scale : array of shape (k), scaling factor of the posterior precision on the mean
- dof : array of shape (k), the posterior dofs
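As a minimal sketch of how prior arrays with the shapes documented above might be assembled, the snippet below builds them with numpy. The helper name make_default_priors and its default values are hypothetical and not part of the nipy API; only the array shapes follow the field descriptions.

    import numpy as np

    def make_default_priors(k, dim, data):
        # data: (n, dim) feature array used only to set rough prior scales
        prior_centers = np.repeat(data.mean(0)[np.newaxis, :], k, axis=0)        # (k, dim)
        prior_precision = np.repeat(1.0 / data.var(0)[np.newaxis, :], k, axis=0)  # (k, dim)
        prior_dof = np.full(k, dim + 1.0)       # (k,), at least dim as stated above
        prior_mean_scale = np.ones(k)           # (k,)
        prior_weights = np.ones(k) / k          # (k,)
        return (prior_centers, prior_precision, prior_dof,
                prior_mean_scale, prior_weights)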
This is the basic GMM class.
- GMM.k is the number of components in the mixture
- GMM.dim is the dimension of the data
- GMM.centers is an array that contains all the centers of the components, shape (GMM.k, GMM.dim)
- GMM.prec_type is the type of the precision matrix (see the sketch below):
  - 0 : full covariance matrix, one for each component; shape = (GMM.k, GMM.dim**2)
  - 1 : diagonal covariance matrix, one for each component; shape = (GMM.k, GMM.dim)
  - 2 : diagonal covariance matrix, the same for all components; shape = (1, GMM.dim)
- GMM.weights contains the weights of the components in the mixture
- GMM.estimated is a binary variable that indicates whether the model has been instantiated or not
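The following sketch only illustrates the documented storage shapes for each prec_type value; the variable names and the example values of k and dim are hypothetical.

    import numpy as np

    k, dim = 3, 4  # hypothetical component count and data dimension

    # prec_type == 0: full covariance, one per component, stored flattened
    precisions_full = np.zeros((k, dim ** 2))

    # prec_type == 1: diagonal covariance, one per component
    precisions_diag = np.zeros((k, dim))

    # prec_type == 2: diagonal covariance shared by all components
    precisions_shared = np.zeros((1, dim))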
Evaluating the GMM on some new data.
INPUT
data : (n, p) feature array, n = number of items, p = feature dimension
OUTPUT
LL : (n) array of type 'd', the log-likelihood of the data
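For reference, a per-sample log-likelihood under a diagonal GMM (prec_type = 1) can be computed as sketched below. This is an illustrative reimplementation matching the documented output shape, not the library's own code; the function name and argument layout are assumptions.

    import numpy as np
    from scipy.special import logsumexp

    def log_likelihood(data, centers, precisions, weights):
        # data: (n, dim); centers, precisions: (k, dim); weights: (k,)
        n, dim = data.shape
        diff = data[:, np.newaxis, :] - centers[np.newaxis, :, :]                  # (n, k, dim)
        log_norm = 0.5 * (np.log(precisions).sum(1) - dim * np.log(2 * np.pi))     # (k,)
        log_comp = log_norm - 0.5 * (diff ** 2 * precisions).sum(2)                # (n, k)
        # mix over components in log space to get one value per sample
        return logsumexp(log_comp + np.log(weights), axis=1)                       # (n,)

The returned array has shape (n,), one log-likelihood value per data item, as described above.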