PretrainedGaussianPosterior#
- class torchbayesian.bnn.PretrainedGaussianPosterior(param: Tensor)#
Bases:
VariationalPosterior

This class is a diagonal Gaussian variational posterior whose mean parameter is initialized from an existing tensor.
Samples tensors via the reparametrization trick.
This posterior is useful when converting a pretrained model into a Bayesian neural network (BNN) via Bayes by Backprop (BBB) variational inference (VI), as it initializes the Gaussian variational distribution centered on the pretrained weights.
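The idea can be sketched in plain Python (an illustrative stand-in, not the torchbayesian implementation; the class name, `rho_init`, and scalar weight are assumptions for the sake of the example): the posterior keeps a mean `mu` initialized from the pretrained value and a free parameter `rho`, and draws weights via the reparametrization trick as `mu + softplus(rho) * eps` with `eps ~ N(0, 1)`.

```python
import math
import random

def softplus(x):
    # softplus(x) = log(1 + exp(x)) maps any real rho to a positive sigma
    return math.log1p(math.exp(x))

class DiagonalGaussianPosterior:
    """Illustrative stand-in for PretrainedGaussianPosterior (one scalar weight)."""

    def __init__(self, param, rho_init=-5.0):
        self.mu = param          # mean initialized from the pretrained value
        self.rho = rho_init      # sigma = softplus(rho); small sigma initially

    @property
    def sigma(self):
        return softplus(self.rho)

    def sample(self, eps=None):
        # Reparametrization trick: w = mu + sigma * eps, eps ~ N(0, 1)
        if eps is None:
            eps = random.gauss(0.0, 1.0)
        return self.mu + self.sigma * eps

posterior = DiagonalGaussianPosterior(param=0.37)  # 0.37 plays the pretrained weight
w = posterior.sample()
```

Because `mu` starts at the pretrained value and `sigma` starts small, early samples stay close to the pretrained network, which is what makes this posterior a convenient starting point for BBB.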
Parameters#
- paramTensor
The tensor (e.g. a pretrained parameter) on which the posterior distribution is initially centered.
Attributes#
- muParameter
The variational parameter of the mean of the distribution.
- rhoParameter
The variational parameter that parametrizes the standard deviation of the distribution via softplus.
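Parametrizing the standard deviation through softplus lets the optimizer update ‘rho’ without any positivity constraint, since softplus maps every real number to a strictly positive value. A quick numeric check (plain Python, stdlib only):

```python
import math

def softplus(rho):
    # sigma = softplus(rho) = log(1 + exp(rho)) is positive for every real rho
    return math.log1p(math.exp(rho))

# Large negative rho gives a near-zero sigma; large positive rho is near-linear.
print(softplus(-10.0))  # tiny but strictly positive (~4.54e-5)
print(softplus(0.0))    # log 2 ≈ 0.6931
print(softplus(10.0))   # ≈ 10.0000454
```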
- classmethod from_param(param: Tensor, **kwargs) PretrainedGaussianPosterior#
Alternate constructor used by ‘get_posterior’ inside ‘bnn.BayesianModule’.
Overrides the default ‘from_param’ constructor of ‘VariationalPosterior’ to pass along ‘param’.
Parameters#
- paramTensor
The tensor (parameter or buffer) being replaced by a variational posterior.
- **kwargs
Additional keyword arguments passed along to the variational posterior constructor.
Returns#
- posterior_instancePretrainedGaussianPosterior
An instance of the class.
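The alternate-constructor pattern that ‘from_param’ implements can be sketched in plain Python (illustrative names, not torchbayesian code): the classmethod forwards the tensor being replaced, together with any extra keyword arguments, to the class constructor, whereas a base-class default might discard the tensor.

```python
class PretrainedGaussianPosteriorSketch:
    """Illustrative stand-in showing the from_param constructor pattern."""

    def __init__(self, param, **kwargs):
        self.param = param      # the tensor the posterior is centered on
        self.options = kwargs   # extra constructor options

    @classmethod
    def from_param(cls, param, **kwargs):
        # Unlike a default that ignores 'param', pass it through to __init__
        return cls(param, **kwargs)

posterior = PretrainedGaussianPosteriorSketch.from_param([0.1, -0.2], rho_init=-5.0)
```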
- reset_parameters() None#
Initializes the variational parameter ‘rho’ of the Gaussian posterior N(mu, sigma), where sigma = softplus(rho).
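One common way to initialize ‘rho’ for a desired starting standard deviation (an assumption for illustration, not the library's documented scheme) is to invert the softplus: rho = log(exp(sigma) - 1), so that softplus(rho) recovers the target sigma exactly.

```python
import math

def inv_softplus(sigma):
    # Inverse of softplus: the rho such that log(1 + exp(rho)) == sigma
    return math.log(math.expm1(sigma))

target_sigma = 0.05                       # hypothetical initial std. deviation
rho = inv_softplus(target_sigma)
recovered = math.log1p(math.exp(rho))     # softplus(rho), should equal target
```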
- property sigma: Tensor#
Returns the standard deviation parameter of the Gaussian distribution.
Returns#
- sigmaTensor
The standard deviation parameter of the Gaussian distribution.