Changelog History

v3.7.rc1
May 22, 2019 
v3.6 Changes
December 21, 2018

This is a major new release from 3.5 with many new features and important bugfixes. The highlight is certainly our completely revamped website: https://docs.pymc.io/

Note also that this release will be the last to be compatible with Python 2. Thanks to all contributors!
New features
- Track the model log-likelihood as a sampler stat for NUTS and HMC samplers (accessible as `trace.get_sampler_stats('model_logp')`) (#3134)
- Add Incomplete Beta function `incomplete_beta(a, b, value)`
- Add log CDF functions to continuous distributions: `Beta`, `Cauchy`, `ExGaussian`, `Exponential`, `Flat`, `Gumbel`, `HalfCauchy`, `HalfFlat`, `HalfNormal`, `Laplace`, `Logistic`, `Lognormal`, `Normal`, `Pareto`, `StudentT`, `Triangular`, `Uniform`, `Wald`, `Weibull`.
- Behavior of `sample_posterior_predictive` is now to produce posterior predictive samples, in order, from all values of the trace. Previously, by default it would produce one chain's worth of samples, using a random selection from the trace (#3212)
- Show diagnostics for initial energy errors in HMC and NUTS.
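As a quick illustration of why dedicated log CDF functions matter numerically, here is a sketch (plain stdlib Python, not PyMC3's actual implementation) of a `Normal` log CDF that stays finite deep in the lower tail, where the CDF itself underflows to `0.0`; the Mills-ratio expansion and the `-10` cutoff are choices made for this sketch:

```python
import math

def normal_logcdf(x, mu=0.0, sigma=1.0):
    """Log CDF of a normal distribution, stable in the lower tail."""
    z = (x - mu) / sigma
    if z > -10.0:
        # Safe region: erfc does not underflow here.
        return math.log(0.5 * math.erfc(-z / math.sqrt(2.0)))
    # Deep lower tail: asymptotic (Mills ratio) expansion
    # Phi(z) ~ phi(z)/|z| * (1 - 1/z^2 + 3/z^4 - ...)
    return (-0.5 * z * z
            - math.log(-z)
            - 0.5 * math.log(2.0 * math.pi)
            + math.log1p(-1.0 / (z * z) + 3.0 / z**4))
```

For example, `normal_logcdf(-40.0)` is roughly `-804.6`, while the naive `cdf` would already have underflowed to zero.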
- PR #3273 has added the `distributions.distribution._DrawValuesContext` context manager. This is used to store the values already drawn in nested `random` and `draw_values` calls, enabling `draw_values` to draw samples from the joint probability distribution of RVs and not the marginals. Custom distributions that must call `draw_values` several times in their `random` method, or that invoke many calls to other distributions' `random` methods (e.g. mixtures), must do all of these calls under the same `_DrawValuesContext` context manager instance. If they do not, the conditional relations between the distribution's parameters could be broken, and `random` could return values drawn from an incorrect distribution.
- The `Rice` distribution is now defined with either the noncentrality parameter or the shape parameter (#3287).
Maintenance
- Big rewrite of documentation (#3275)
- Fixed `Triangular` distribution `c` attribute handling in `random` and updated sample codes for consistency (#3225)
- Refactor SMC and properly compute marginal likelihood (#3124)
- Removed use of deprecated `ymin` keyword in matplotlib's `Axes.set_ylim` (#3279)
- Fix for #3210. Now `distribution.draw_values(params)` will draw the `params` values from their joint probability distribution and not from combinations of their marginals (refer to PR #3273).
- Removed dependence on pandas-datareader for retrieving Yahoo Finance data in examples (#3262)
- Rewrote `Multinomial._random` method to better handle shape broadcasting (#3271)
- Fixed `Rice` distribution, which inconsistently mixed two parametrizations (#3286). `Rice` distribution now accepts multiple parameters and observations and is usable with NUTS (#3289).
- `sample_posterior_predictive` no longer calls `draw_values` to initialize the shape of the ppc trace. This call could lead to `ValueError`s when sampling the ppc from a model with `Flat` or `HalfFlat` prior distributions (fixes issue #3294).
Deprecations

- Renamed `sample_ppc()` and `sample_ppc_w()` to `sample_posterior_predictive()` and `sample_posterior_predictive_w()`, respectively.

v3.6.rc1
December 19, 2018 
v3.5 Changes
July 21, 2018

New features
- Add documentation section on survival analysis and censored data models
- Add `check_test_point` method to `pm.Model`
- Add `Ordered` transformation and `OrderedLogistic` distribution
- Add `Chain` transformation
- Improve error message `Mass matrix contains zeros on the diagonal. Some derivatives might always be zero` during tuning of `pm.sample`
- Improve error message `NaN occurred in optimization.` during ADVI
- Save and load traces without `pickle` using `pm.save_trace` and `pm.load_trace`
- Add `Kumaraswamy` distribution
- Add `TruncatedNormal` distribution
- Rewrite parallel sampling of multiple chains on py3. This resolves long-standing issues when transferring large traces to the main process, avoids pickling issues on UNIX, and allows us to show a progress bar for all chains. If parallel sampling is interrupted, we now return partial results.
- Add `sample_prior_predictive`, which allows for efficient sampling from the unconditioned model.
- SMC: remove experimental warning, allow sampling using `sample`, reduce autocorrelation from final trace.
- Add `model_to_graphviz` (which uses the optional dependency `graphviz`) to plot a directed graph of a PyMC3 model using plate notation.
- Add beta-ELBO variational inference as in the beta-VAE model (Christopher P. Burgess et al., NIPS 2017)
- Add `__dir__` to `SingleGroupApproximation` to improve autocompletion in interactive environments
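For reference, the `Kumaraswamy(a, b)` density on (0, 1) is f(x) = a·b·x^(a-1)·(1 - x^a)^(b-1); a minimal log-density sketch in plain Python (illustrative only, not PyMC3's implementation):

```python
import math

def kumaraswamy_logp(x, a, b):
    """Log density of Kumaraswamy(a, b) for x in (0, 1)."""
    if not (0.0 < x < 1.0):
        return -math.inf
    # log f(x) = log a + log b + (a-1) log x + (b-1) log(1 - x^a)
    return (math.log(a) + math.log(b)
            + (a - 1.0) * math.log(x)
            + (b - 1.0) * math.log1p(-x**a))

# Sanity check: the density should integrate to ~1 (midpoint rule).
n = 20000
total = sum(math.exp(kumaraswamy_logp((i + 0.5) / n, 2.0, 3.0))
            for i in range(n)) / n
```

Using `log1p(-x**a)` instead of `log(1 - x**a)` keeps the tail near x = 1 accurate.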
Fixes
- Fixed grammar in divergence warning; previously `There were 1 divergences ...` could be raised.
- Fixed `KeyError` raised when only a subset of variables are specified to be recorded in the trace.
- Removed unused `repeat=None` arguments from all `random()` methods in distributions.
- Deprecated the `sigma` argument in `MarginalSparse.marginal_likelihood` in favor of `noise`
- Fixed unexpected behavior in `random`. Now the `random` functionality is more robust and will work better for `sample_prior` when that is implemented.
- Fixed `scale_cost_to_minibatch` behaviour; previously this was not working and was always `False`

v3.5.rc1
July 15, 2018 
v3.4.1 Changes
April 19, 2018

There was no 3.4 release due to a naming issue on PyPI.
New features
- Add `logit_p` keyword to `pm.Bernoulli`, so that users can specify the logit of the success probability. This is faster and more stable than using `p=tt.nnet.sigmoid(logit_p)`.
- Add `random` keyword to `pm.DensityDist`, thus enabling users to pass a custom random method, which in turn makes sampling from a `DensityDist` possible.
- Effective sample size computation is updated. The estimation uses Geyer's initial positive sequence, which no longer truncates the autocorrelation series inaccurately. `pm.diagnostics.effective_n` can now report N_eff > N.
- Added `KroneckerNormal` distribution and a corresponding `MarginalKron` Gaussian Process implementation for efficient inference, along with lower-level functions such as `cartesian` and `kronecker` products.
- Added `Coregion` covariance function.
- Add new `pairplot` function for plotting scatter or hexbin matrices of sampled parameters. Optionally it can plot divergences.
- Plots of discrete distributions in the docstrings
- Add logitnormal distribution
- Densityplot: add support for discrete variables
- Fix the Binomial likelihood in `.glm.families.Binomial`, with the flexibility of specifying the `n`.
- Add `offset` kwarg to `.glm`.
- Changed the `compare` function to accept a dictionary of model-trace pairs instead of two separate lists of models and traces.
- Add test and support for creating multivariate mixture and mixture of mixtures
- `distribution.draw_values` is now also able to draw values from conditionally dependent RVs, such as auto-transformed RVs (refer to PR #2902).
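Geyer's initial positive sequence, mentioned in the effective sample size bullet above, sums autocorrelations in adjacent pairs and stops at the first negative pair. A toy pure-Python sketch of that pairing idea (not `pm.diagnostics.effective_n` itself):

```python
import random

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def ess_geyer(x):
    """Effective sample size via Geyer's initial positive sequence:
    accumulate paired autocorrelations rho[2m] + rho[2m+1] until a
    pair goes negative, instead of truncating the series arbitrarily."""
    n = len(x)
    tau = -1.0  # becomes 1 + 2 * sum(rho[t], t >= 1) after pairing
    m = 0
    while 2 * m + 1 < n:
        pair = autocorr(x, 2 * m) + autocorr(x, 2 * m + 1)
        if pair < 0.0:
            break
        tau += 2.0 * pair
        m += 1
    return n / max(tau, 1e-12)

random.seed(1)
iid = [random.gauss(0.0, 1.0) for _ in range(2000)]
ar = [0.0]
for _ in range(1999):
    # Strongly autocorrelated AR(1) chain: ESS should be far below N.
    ar.append(0.9 * ar[-1] + random.gauss(0.0, 1.0))
```

For an i.i.d. chain `ess_geyer(iid)` is close to N (and can slightly exceed it, matching the N_eff > N note above), while the AR(1) chain yields a much smaller ESS.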
Fixes
- `VonMises` does not overflow for large values of kappa. `i0` and `i1` have been removed and we now use `log_i0` to compute the logp.
- The bandwidth for KDE plots is computed using a modified version of Scott's rule. The new version uses entropy instead of standard deviation. This works better for multimodal distributions. Functions using KDE plots have a new argument `bw` controlling the bandwidth.
- Fix: a PyMC3 variable was not replaced if provided in `more_replacements` (#2890)
- Fix for issue #2900. For many situations, named node inputs do not have a `random` method, while some intermediate node may have it. This meant that if the named node input at the leaf of the graph did not have a fixed value, `theano` would try to compile it and fail to find inputs, raising a `theano.gof.fg.MissingInputError`. This was fixed by going through the theano variable's owner inputs graph, trying to get intermediate named-node values if the leaves had failed.
- In `distribution.draw_values`, some named nodes could be `theano.tensor.TensorConstant`s or `theano.tensor.sharedvar.SharedVariable`s. Nevertheless, in `distribution._draw_value`, these would be passed to `distribution._compile_theano_function` as if they were `theano.tensor.TensorVariable`s. This could lead to the exceptions `TypeError: ('Constants not allowed in param list', ...)` or `TypeError: Cannot use a shared variable (...)`. The fix was to not add `theano.tensor.TensorConstant` or `theano.tensor.sharedvar.SharedVariable` named nodes into the `givens` dict that could be used in `distribution._compile_theano_function`.
- `Exponential` support changed to include zero values.
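Scott-style rules set the KDE bandwidth as a spread estimate times `n**(-1/5)`. One way to sketch the entropy-based variant described above (the histogram estimator, the `exp(H)/sqrt(2*pi*e)` spread, and the 1.06 constant are choices made for this sketch, not the exact PyMC3 code) is to take the standard deviation of a Gaussian having the sample's estimated entropy, which is inflated less by widely separated modes than the plain standard deviation:

```python
import math
import random

def entropy_bandwidth(x, bins=100):
    """Scott-style KDE bandwidth with an entropy-based spread estimate."""
    n = len(x)
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    # Differential entropy estimate: H ~ -sum p log p + log(bin width)
    h = sum(-c / n * math.log(c / n) for c in counts if c) + math.log(width)
    # Std of a Gaussian with entropy h: sigma = exp(h) / sqrt(2*pi*e)
    sigma_ent = math.exp(h) / math.sqrt(2.0 * math.pi * math.e)
    return 1.06 * sigma_ent * n ** (-0.2)

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
bw = entropy_bandwidth(sample)
```

For a unimodal Gaussian sample the entropy-based spread is close to the standard deviation, so the result stays near the classic rule-of-thumb value.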
Deprecations
- DIC and BPIC calculations have been removed
- `df_summary` has been removed; use `summary` instead
- `njobs` and `nchains` kwargs are deprecated in favor of `cores` and `chains` for `sample`
- `lag` kwarg in `pm.stats.autocorr` and `pm.stats.autocov` is deprecated.
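Keyword deprecations like `njobs` -> `cores` are typically implemented as a shim that maps the old name onto the new one and warns. A minimal sketch of that pattern (hypothetical simplified signature, not PyMC3's actual `sample`):

```python
import warnings

def sample(draws=500, cores=1, chains=2, **deprecated_kwargs):
    """Toy sampler entry point showing a kwarg-deprecation shim."""
    if "njobs" in deprecated_kwargs:
        warnings.warn("'njobs' is deprecated; use 'cores'", DeprecationWarning)
        cores = deprecated_kwargs.pop("njobs")
    if "nchains" in deprecated_kwargs:
        warnings.warn("'nchains' is deprecated; use 'chains'", DeprecationWarning)
        chains = deprecated_kwargs.pop("nchains")
    if deprecated_kwargs:
        raise TypeError("unexpected kwargs: %s" % sorted(deprecated_kwargs))
    # A real implementation would run the sampler here.
    return {"draws": draws, "cores": cores, "chains": chains}
```

Callers using the old names keep working for a release or two while seeing a `DeprecationWarning`.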

v3.4.rc2
April 13, 2018 
v3.4.rc1
April 09, 2018 
v3.2 Changes
October 10, 2017

New features
This version includes two major contributions from our Google Summer of Code 2017 students:
- Maxim Kochurov extended and refactored the variational inference module. This primarily adds two important classes, representing operator variational inference (`OPVI`) objects and `Approximation` objects. These make it easier to extend existing `variational` classes, and to derive inference from `variational` optimizations, respectively. The `variational` module now also includes normalizing flows (`NFVI`).
- Bill Engels added an extensive new Gaussian processes (`gp`) module. Standard GPs can be specified using either `Latent` or `Marginal` classes, depending on the nature of the underlying function. A Student-T process `TP` has been added. In order to accommodate larger datasets, approximate marginal Gaussian processes (`MarginalSparse`) have been added.
- Documentation has been improved as the result of the project's monthly "docathons".
- An experimental stochastic gradient Fisher scoring (`SGFS`) sampling step method has been added.
- The API for `find_MAP` was enhanced.
- SMC now estimates the marginal likelihood.
- Added `Logistic` and `HalfFlat` distributions to the set of continuous distributions.
- Bayesian fraction of missing information (`bfmi`) function added to `stats`.
- Enhancements to `compareplot` added.
- QuadPotential adaptation has been implemented.
- Script added to build and deploy documentation.
- MAP estimates now available for transformed and non-transformed variables.
- The `Constant` variable class has been deprecated, and will be removed in 3.3.
- DIC and BPIC calculations have been sped up.
- Arrays are now accepted as arguments for the `Bound` class.
- `random` method was added to the `Wishart` and `LKJCorr` distributions.
- Progress bars have been added to LOO and WAIC calculations.
- All example notebooks updated to reflect changes in API since 3.1.
- Parts of the test suite have been refactored.
Fixes

- Fixed sampler stats error in NUTS for non-RAM backends
- Matplotlib is no longer a hard dependency, making it easier to use in settings where installing Matplotlib is problematic. PyMC will only complain if plotting is attempted.
- Several bugs in the Gaussian process covariance were fixed.
- All chains are now used to calculate WAIC and LOO.
- AR(1) log-likelihood function has been fixed.
- Slice sampler fixed to sample from 1D conditionals.
- Several docstring fixes.
Contributors
The following people contributed to this release (ordered by number of commits):
Maxim Kochurov [email protected] Bill Engels [email protected] Chris Fonnesbeck [email protected] Junpeng Lao [email protected] Adrian Seyboldt [email protected] Austin Rochford [email protected] Osvaldo Martin [email protected] Colin Carroll [email protected] Hannes Vasyura-Bathke [email protected] Thomas Wiecki [email protected] michaelosthege [email protected] Marco De Nadai [email protected] Kyle Beauchamp [email protected] Massimo [email protected] ctm22396 [email protected] Max Horn [email protected] Hennadii Madan [email protected] Hassan Naseri [email protected] Peadar Coyle [email protected] Saurav R. Tuladhar [email protected] Shashank Shekhar [email protected] Eric Ma [email protected] Ed Herbst [email protected] tsdlovell [email protected] zaxtax [email protected] Dan Nichol [email protected] Benjamin Yetton [email protected] jackhansom [email protected] Jack Tsai [email protected] Andrés Asensio Ramos [email protected]

v3.1 Changes
June 23, 2017

New features
- New user forum at http://discourse.pymc.io
- Much improved variational inference support:
  - Add Operator Variational Inference (experimental).
  - Add Stein Variational Gradient Descent as well as Amortized SVGD (experimental).
  - Add `pm.Minibatch()` to easily specify minibatches.
  - Added various optimizers including ADAM.
  - Stopping criterion implemented via callbacks.
- `sample()` defaults changed: tuning is enabled for the first 500 samples, which are then discarded from the trace as burn-in.
- `MvNormal` supports Cholesky decomposition now for increased speed and numerical stability.
- Many optimizations and speedups.
- NUTS implementation now matches current Stan implementation.
- Add higher-order integrators for HMC.
- ADVI stopping criterion implemented.
- Improved support for theano's floatX setting to enable GPU computations (work in progress).
- Added support for multidimensional minibatches
- Added `Approximation` class and the ability to convert a sampled trace into an approximation via its `Empirical` subclass.
- `Model` can now be inherited from and act as a base class for user-specified models (see `pymc3.models.linear`).
- Add `MvGaussianRandomWalk` and `MvStudentTRandomWalk` distributions.
- GLM models do not need a left-hand variable anymore.
- Refactored HMC and NUTS for better readability.
- Add support for Python 3.6.
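The Cholesky parametrization of `MvNormal` mentioned above is faster and more stable because it avoids explicitly inverting the covariance matrix. A minimal 2-D sketch in plain Python (illustrative, not PyMC3's implementation):

```python
import math

def mvnormal_logp_chol(x, mu, cov):
    """Log density of a 2-D multivariate normal via Cholesky factors.

    With cov = L L^T (L lower triangular), a triangular solve replaces
    the explicit matrix inverse and gives the log-determinant for free.
    """
    # Cholesky factor of the 2x2 covariance
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 * l21)
    # Forward-solve L z = (x - mu)
    d0, d1 = x[0] - mu[0], x[1] - mu[1]
    z0 = d0 / l11
    z1 = (d1 - l21 * z0) / l22
    # log|cov| = 2 * (log l11 + log l22); quadratic form = z . z
    return -(math.log(2.0 * math.pi)
             + math.log(l11) + math.log(l22)
             + 0.5 * (z0 * z0 + z1 * z1))
```

The diagonal of `L` directly supplies the log-determinant, and the triangular solve is O(k^2) instead of the O(k^3) inverse, which is where the speed and stability gains come from.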
Fixes

- `Bound` now works for discrete distributions as well.
- Random sampling now returns the correct shape even for higher-dimensional RVs.
- Use theano `Psi` and `GammaLn` functions to enable GPU support for them.