v0.2.0

🛠︎ Interface change / 🐛 Bug fix / 🆕 New feature
Distribution API
- 🛠︎ Changed the argument order of __init__ in the exponential distribution families and made the distribution parameters explicit #90
- 🛠︎ Removed the sampling option from DistributionBase.set_dist (relaxed distribution families still have a sampling option) #108
- 🛠︎ TransformedDistribution is now the joint probability distribution of two variables, the input and output of the flow, and the return_all option of the previous sample method has been removed. #115
- 🛠︎ Renamed the return_all option of InverseTransformedDistribution.sample to return_hidden #115
- 🛠︎ Added a return_all option to TransformedDistribution.sample (and InverseTransformedDistribution.sample) that also returns random variables not involved in the transformation. #115
- 🛠︎ Renamed the var argument of TransformedDistribution.__init__ to flow_output_var
- 🆕 Added Distribution.has_reparam property #93
- 🆕 Added a return_all option to MixtureModel.sample #115
- 🐛 Fixed a bug where the parameters of a basic Distribution were unintentionally overwritten by torch.load #113
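To make the return_all semantics above concrete, here is a minimal stdlib-only sketch (not pixyz's actual implementation; the function and variable names are hypothetical): sampling either returns every sampled variable or only the declared output variables.

```python
# Illustrative sketch of the return_all semantics described above.
# NOT pixyz's implementation; sample(), sampled, and var are hypothetical names.

def sample(sampled, var, return_all=True):
    """Return all sampled values, or only the entries listed in `var`."""
    if return_all:
        return dict(sampled)
    return {name: value for name, value in sampled.items() if name in var}

# A flow maps x -> z; "x" is not in the output variable set {"z"}.
sampled = {"x": 0.5, "z": 1.25}
assert sample(sampled, var=["z"], return_all=True) == {"x": 0.5, "z": 1.25}
assert sample(sampled, var=["z"], return_all=False) == {"z": 1.25}
```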
Loss API
- 🛠︎ Removed StochasticReconstructionLoss (this loss now needs to be configured explicitly) #103
- 🛠︎ Changed the base class of Loss to torch.nn.Module #100
- 🛠︎ Renamed Loss.train to Loss.loss_train and Loss.test to Loss.loss_test #100
- 🛠︎ Renamed Loss._get_eval to Loss.forward #95
- 🛠︎ Added an Entropy method whose option switches entropy estimation between AnalyticalEntropy and MonteCarlo (deprecated the Entropy class) #89
- 🛠︎ Added a CrossEntropy method that can be switched in the same way (deprecated the CrossEntropy class) #89
- 🛠︎ Added a KullbackLeibler method that can be switched in the same way (deprecated the KullbackLeibler class) #89
- 🛠︎ Deprecated the ELBO class and replaced it with an ELBO method that returns a Loss instance #89
- 🆕 Support for Loss.detach method #93
- 🆕 Support for MinLoss, MaxLoss #95
- 🆕 Support for the REINFORCE alternative loss to derive a policy gradient #93
- 🆕 Loss support for DataParallel #100
- 🆕 Separated some features of the Loss class into the Divergence class #95
- 🆕 Added a return_all option to Loss.eval (that is, when return_dict = True, you can also choose whether to return unrelated random variables) #115
- 🐛 Fixed a bug in IterativeLoss where, at each step, the past value was conditioned on as the future value. #115
- 🐛 Placed the parameter tensor of ValueLoss on nn.Module's device. #100
- 🐛 Fixed incorrect argument checking in WassersteinDistance and MMDLoss. #103
- 🐛 Fixed a bug in checking variables in Loss initialization #107
- 🐛 Fixed a bug in IterativeLoss #107
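The "deprecated class, added method" pattern used for Entropy, CrossEntropy, KullbackLeibler, and ELBO can be sketched as a factory function that picks an estimator from an option and returns a loss instance. This is a stdlib-only illustration with hypothetical class names, not pixyz's actual code.

```python
# Sketch of the deprecation pattern: a method (factory function) replaces the
# old class and returns a loss instance, with an option switching the
# estimation strategy. Hypothetical names; not pixyz's implementation.

class AnalyticalEntropy:
    def eval(self):
        return "analytical"

class MonteCarloEntropy:
    def eval(self):
        return "monte_carlo"

def Entropy(analytical=True):
    """Return a loss instance; the option selects the estimator."""
    return AnalyticalEntropy() if analytical else MonteCarloEntropy()

assert Entropy().eval() == "analytical"
assert Entropy(analytical=False).eval() == "monte_carlo"
```

The same shape applies to the other deprecated classes: callers keep writing `Entropy(...)`, but now receive an instance of an internal Loss subclass instead of constructing the class directly.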
Other
- 🆕 Added the utils.lru_cache_for_sample_dict decorator, which enables memoization for a function that takes a dictionary of random variables and their realized values #109
- 🆕 Renamed examples/vae_model to vae_with_vae_class
- 🆕 Added some exception messages #103
- 🐛 Fixed jacobian calculation in flow/Preprocess #107
- 🐛 Fixed a bug where some browsers did not display the formulas in the README.
- 🐛 Alternative text is now displayed when a README formula cannot be rendered. #117
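The memoization decorator for sample dictionaries can be sketched in the following spirit (a stdlib-only illustration, not pixyz's actual utils.lru_cache_for_sample_dict): plain dicts are unhashable, so the wrapper keys the cache on a frozen, order-independent view of the dict.

```python
# Sketch of memoizing a function whose argument is a dict of random variables
# and their realized values. Hypothetical helper; not pixyz's implementation.
import functools

def lru_cache_for_dict(maxsize=128):
    def decorator(func):
        @functools.lru_cache(maxsize=maxsize)
        def cached(frozen):
            # Rebuild the dict from the hashable key before calling through.
            return func(dict(frozen))

        @functools.wraps(func)
        def wrapper(sample_dict):
            # frozenset of items is hashable and insensitive to key order.
            return wrapper_key(sample_dict)
        def wrapper_key(sample_dict):
            return cached(frozenset(sample_dict.items()))
        return wrapper
    return decorator

calls = []

@lru_cache_for_dict()
def expensive(sample_dict):
    calls.append(1)  # record each real (non-cached) evaluation
    return sum(sample_dict.values())

assert expensive({"x": 1.0, "z": 2.0}) == 3.0
assert expensive({"z": 2.0, "x": 1.0}) == 3.0  # cache hit: same frozen key
assert len(calls) == 1
```

This only works when the dict values are themselves hashable (as floats are here); tensor-valued dicts would need an identity- or content-based key instead.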