Module: neurospin.clustering.GGMixture

Inheritance diagram for nipy.neurospin.clustering.GGMixture (diagram not reproduced here).

One-dimensional Gamma-Gaussian mixture density classes: given a set of points, the algorithm provides approximate maximum-likelihood estimates of the mixture distribution, using an EM algorithm.

Authors: Bertrand Thirion and Merlin Keller, 2005-2008

Classes

GGGM

class nipy.neurospin.clustering.GGMixture.GGGM(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))

This is the basic one-dimensional Gamma-Gaussian-Gamma mixture estimation class, where the first gamma has a negative sign and the second one a positive sign.
7 parameters are used:
- shape_n: negative gamma shape
- scale_n: negative gamma scale
- mean: gaussian mean
- var: gaussian variance
- shape_p: positive gamma shape
- scale_p: positive gamma scale
- mixt: array of mixture parameters (weights of the negative gamma, the gaussian, and the positive gamma)

__init__(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))
Estep(x)
Update the probabilistic memberships of the three components.
x: input data, shape (nbitems,); z: probabilistic memberships, shape (nbitems, 3)
Mstep(x, z)
Update the parameters of the three components through an ML approach.
x: input data, shape (nbitems,); z: probabilistic memberships, shape (nbitems, 3)
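Example: a hand-rolled sketch of the EM loop that these two steps implement (it assumes, as the docstrings suggest, that Estep returns the membership matrix z; the toy data are illustrative, and estimate() below runs this loop for you with a convergence test):

    import numpy as np
    from nipy.neurospin.clustering.GGMixture import GGGM

    x = np.random.randn(1000)      # toy data; in practice, e.g. test statistics
    model = GGGM()
    model.init(x)                  # standard initialization (see init below)
    for it in range(100):
        z = model.Estep(x)         # z: (nbitems, 3) probabilistic memberships
        model.Mstep(x, z)          # ML update of the 7 mixture parameters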
ROC(x)

ROC curve for separating the positive gamma distribution from the two other modes, as predicted by the current parameter values.
x: vector of observations

Output: P. P[0]: false positive rates; P[1]: true positive rates.
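Example: a plotting sketch for the returned rates (the simulated data are illustrative; matplotlib is assumed to be available):

    import numpy as np
    import matplotlib.pyplot as plt
    from nipy.neurospin.clustering.GGMixture import GGGM

    x = np.concatenate((np.random.randn(1000), np.random.gamma(3., 1., 100)))
    model = GGGM()
    model.estimate(x)
    P = model.ROC(x)               # P[0]: false positive, P[1]: true positive rates
    plt.plot(P[0], P[1])
    plt.xlabel('false positive rate')
    plt.ylabel('true positive rate')
    plt.show()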

SAEM(x, burnin=100, niter=200, verbose=False)
SAEM estimation procedure.
Input:
- x: vector of observations
- burnin: number of burn-in SEM iterations (discarded values)
- niter: maximum number of SAEM iterations for convergence to a local maximum
- verbose (0/1): verbosity level
The mean of the gaussian distribution is fixed to zero.
Output:
- LL: successive log-likelihood values
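Example: a minimal sketch (the simulated data are illustrative; since the gaussian mean is fixed at zero, the null bulk of the data should be centered there):

    import numpy as np
    from nipy.neurospin.clustering.GGMixture import GGGM

    x = np.concatenate((np.random.randn(1000), np.random.gamma(3., 1., 100)))
    model = GGGM()
    LL = model.SAEM(x, burnin=100, niter=200)   # successive log-likelihood values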
check(x)
component_likelihood(x)
ng, y, pg = self.component_likelihood(x)
Compute the likelihood of the data x under the three components: negative gamma, gaussian, positive gamma.
estimate(x, niter=100, delta=0.0001, bias=0, verbose=0)
Whole EM estimation procedure:
x: input data; should be an array of size nbitems
niter = 100: maximum number of iterations
delta = 1.e-4: increment in LL at which convergence is declared
bias = 0: possible hard constraint on the gaussian variance (to avoid shrinkage); when bias=0, no constraint is applied
verbose = 0: verbosity level
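Example: a minimal end-to-end sketch (the simulated data and the 0.5 posterior threshold are illustrative, not part of the module):

    import numpy as np
    from nipy.neurospin.clustering.GGMixture import GGGM

    # null gaussian bulk plus positive and negative 'activation' tails
    x = np.concatenate((np.random.randn(1000),
                        np.random.gamma(3., 1., 100),
                        -np.random.gamma(3., 1., 100)))
    model = GGGM()
    model.estimate(x, niter=100, delta=1.e-4)
    ng, y, pg = model.posterior(x)     # posteriors of the three components
    active = x[pg > 0.5]               # points most likely from the positive gamma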
init(x, mixt=array([1., 1., 1.]))
Initialization of the different parameters.
mixt = np.ones(3, 'd') gives the prior mixture parameters; the other parameters are initialized using standard approaches.
init_fdr(x, dof=-1)
Initialization of the class based on an fdr heuristic: the probability of being in the positive component is proportional to the 'positive fdr' of the data, and likewise for the negative part. The point is that the gamma parts should model nothing more than the tails of the distribution.
parameters()
posterior(x)
ng, y, pg = self.posterior(x)
Compute the posterior probability of observing the data x under the three components: negative gamma, gaussian, positive gamma.
show(x, figname=None)
Visualization of the mixture model, based on the empirical histogram of x.

GGM

class nipy.neurospin.clustering.GGMixture.GGM(shape=1, scale=1, mean=0, var=1, mixt=0.5)

This is the basic one-dimensional Gaussian-Gamma mixture estimation class. Note that it can work with positive or negative values, as long as there is at least one positive value; NB: the gamma distribution is defined only on positive values.
5 parameters are used:
- mean: gaussian mean
- var: gaussian variance
- shape: gamma shape
- scale: gamma scale
- mixt: mixture parameter (weight of the gamma)

__init__(shape=1, scale=1, mean=0, var=1, mixt=0.5)
Estep(x)
Estimation of the likelihood of the data under the two components.
x: input data, shape (nbitems,); z: likelihoods, shape (nbitems, 2)
Mstep(x, z)
Estimation of the parameters of the model.
x: the data
z: probabilistic membership matrix
check(x)
estimate(x, niter=100, delta=0.0001, verbose=False)
Complete EM estimation procedure.
x: data, shape (nbitems,)
niter = 100: maximum number of iterations
delta = 1.e-4: criterion for convergence
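Example: a minimal sketch (the simulated data are illustrative):

    import numpy as np
    from nipy.neurospin.clustering.GGMixture import GGM

    # null gaussian bulk plus a positive gamma tail
    x = np.concatenate((np.random.randn(1000), np.random.gamma(3., 1., 100)))
    ggm = GGM()
    ggm.estimate(x, niter=100, delta=1.e-4)   # EM fit of the 2-component mixture
    ggm.show(x)                               # plot the fit over the histogram of x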
parameters()
posterior(x)
Compute the posterior probability of observing the data x under each mixture component (gaussian and positive gamma).
show(x)
Visualization of the mixture model, based on the empirical histogram of x.

Gamma

class nipy.neurospin.clustering.GGMixture.Gamma(shape=1, scale=1)

This is the basic one-dimensional Gamma density parameter estimation class. NB: the gamma distribution is defined only on positive values.
2 parameters are used:
- shape: gamma shape
- scale: gamma scale

__init__(shape=1, scale=1)
check(x)
estimate(x, eps=1.e-7)
ML estimation of the Gamma parameters
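For reference, the ML equations solved here: the shape c satisfies psi(c) - log(c) = mean(log x) - log(mean x), and the scale is then mean(x)/c. A standalone sketch with SciPy (brentq stands in for the module's psi_solve, described under Functions below; the simulated data are illustrative):

    import numpy as np
    from scipy.special import psi      # digamma function
    from scipy.optimize import brentq

    x = np.random.gamma(3., 2., 10000)               # shape 3, scale 2
    y = np.log(x).mean() - np.log(x.mean())          # <= 0 by Jensen's inequality
    c = brentq(lambda c: psi(c) - np.log(c) - y, 1.e-6, 1.e6)
    scale = x.mean() / c
    print(c, scale)                                  # close to 3. and 2.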
parameters()

Functions

nipy.neurospin.clustering.GGMixture.compute_c(x, z, eps=1.e-5)
This function returns the ML estimate of the shape parameter of a 1D gamma density.
nipy.neurospin.clustering.GGMixture.dichopsi_log(u, v, y, eps=1.e-5)
This function implements the dichotomic part of the solution of the equation psi(c) - log(c) = y.
nipy.neurospin.clustering.GGMixture.psi_solve(y, eps=1.e-5)
This function solves psi(c) - log(c) = y by dichotomy.
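A minimal standalone sketch of such a dichotomic solve (the bracketing strategy is an assumption for illustration, not a transcription of the nipy code; scipy.special.psi provides the digamma function):

    import numpy as np
    from scipy.special import psi

    def solve_psi_minus_log(y, eps=1.e-5):
        # psi(c) - log(c) increases from -inf (c -> 0) to 0 (c -> inf),
        # so a unique root exists for any y < 0: bracket it, then bisect.
        lo = hi = 1.
        while psi(lo) - np.log(lo) > y:
            lo /= 2.
        while psi(hi) - np.log(hi) < y:
            hi *= 2.
        while hi - lo > eps:
            c = 0.5 * (lo + hi)
            if psi(c) - np.log(c) < y:
                lo = c
            else:
                hi = c
        return 0.5 * (lo + hi)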
nipy.neurospin.clustering.GGMixture.test_GGGM()
nipy.neurospin.clustering.GGMixture.test_GGM()
nipy.neurospin.clustering.GGMixture.test_Gamma_parameters()