Changelog History

v3.11.2 Changes
March 14, 2021

New Features
- `pm.math.cartesian` can now handle inputs that are themselves >1D (see #4482).
- Statistics and plotting functions that were removed in 3.11.0 were brought back, albeit with deprecation warnings if an old naming scheme is used (see #4536). To future-proof your code, rename these function calls:
  - `pm.traceplot` → `pm.plot_trace`
  - `pm.compareplot` → `pm.plot_compare` (here you might need to rename some columns in the input according to the `arviz.plot_compare` documentation)
  - `pm.autocorrplot` → `pm.plot_autocorr`
  - `pm.forestplot` → `pm.plot_forest`
  - `pm.kdeplot` → `pm.plot_kde`
  - `pm.energyplot` → `pm.plot_energy`
  - `pm.densityplot` → `pm.plot_density`
  - `pm.pairplot` → `pm.plot_pair`
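Since the table above is a mechanical old-name → new-name mapping, the migration can be scripted. A purely illustrative helper (not part of the PyMC3 API) might look like this:

```python
# Mapping of deprecated PyMC3 plotting names to their ArviZ-style
# replacements, exactly as listed above. Illustrative only -- this
# dict and helper are not part of PyMC3 itself.
RENAMES = {
    "traceplot": "plot_trace",
    "compareplot": "plot_compare",
    "autocorrplot": "plot_autocorr",
    "forestplot": "plot_forest",
    "kdeplot": "plot_kde",
    "energyplot": "plot_energy",
    "densityplot": "plot_density",
    "pairplot": "plot_pair",
}

def modernize(source: str) -> str:
    """Rewrite old pm.<name>( calls to the new pm.plot_* names."""
    for old, new in RENAMES.items():
        source = source.replace(f"pm.{old}(", f"pm.{new}(")
    return source

print(modernize("pm.traceplot(trace); pm.kdeplot(x)"))
# pm.plot_trace(trace); pm.plot_kde(x)
```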
Maintenance
- Our memoization mechanism wasn't robust against hash collisions (#4506), sometimes resulting in incorrect values in, for example, posterior predictives. The `pymc3.memoize` module was removed and replaced with `cachetools`. The `hashable` function and `WithMemoization` class were moved to `pymc3.util` (see #4525).
- `pm.make_shared_replacements` now retains broadcasting information, which fixes issues with Metropolis samplers (see #4492).
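To see why caching on hashes alone is fragile, here is a deliberately contrived illustration (not PyMC3 code) of a hash collision poisoning a naive cache, and how keying on the object itself, the way a plain dict or `cachetools` does, resolves collisions through equality checks:

```python
# Hypothetical illustration of the hash-collision hazard that motivated
# replacing the old memoization module with cachetools.

class Colliding:
    """Two instances compare unequal but share the same hash."""
    def __init__(self, value):
        self.value = value
    def __hash__(self):
        return 42  # deliberate collision

naive_cache = {}

def naive_memo(key):
    # BUG: keyed on the hash only, so colliding keys alias each other.
    h = hash(key)
    if h not in naive_cache:
        naive_cache[h] = key.value * 2
    return naive_cache[h]

a, b = Colliding(1), Colliding(10)
print(naive_memo(a), naive_memo(b))  # 2 2  <- b silently gets a's result

safe_cache = {}

def safe_memo(key):
    # Keying on the object itself uses __hash__ *and* equality, so
    # colliding-but-distinct keys get distinct cache entries.
    if key not in safe_cache:
        safe_cache[key] = key.value * 2
    return safe_cache[key]

print(safe_memo(a), safe_memo(b))  # 2 20
```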
Release manager for 3.11.2: Michael Osthege (@michaelosthege)

v3.11.1 Changes
February 12, 2021

New Features
- Automatic imputations now also work with `ndarray` data, not just `pd.Series` or `pd.DataFrame` (see #4439).
- `pymc3.sampling_jax.sample_numpyro_nuts` now returns samples from transformed random variables, rather than from the unconstrained representation (see #4427).
Maintenance
- We upgraded to Theano-PyMC v1.1.2, which includes bugfixes for:
  - a problem with `tt.switch` that affected the behavior of several distributions, including at least the following special cases (see #4448):
    - `Bernoulli` when all the observed values were the same (e.g., `[0, 0, 0, 0, 0]`).
    - `TruncatedNormal` when `sigma` was constant and `mu` was being automatically broadcasted to match the shape of observations.
  - warning floods and compiledir locking (see #4444)
- `math.log1mexp_numpy` no longer raises a RuntimeWarning when given very small inputs. These were commonly observed during NUTS sampling (see #4428).
- `ScalarSharedVariable` can now be used as an input to other RVs directly (see #4445).
- `pm.sample` and `pm.find_MAP` no longer change the `start` argument (see #4458).
- Fixed the `Dirichlet.logp` method to work with unit batch or event shapes (see #4454).
- Bugfix in the logp and logcdf methods of the `Triangular` distribution (see #4470).
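For context, `log1mexp` computes log(1 - exp(-x)), which loses precision when evaluated naively for very small x. A minimal sketch of the standard stable evaluation (following Mächler's well-known note; this is an illustration, not the actual `pymc3.math` implementation):

```python
import math

def log1mexp(x: float) -> float:
    """Numerically stable log(1 - exp(-x)) for x > 0.

    Sketch of the standard two-branch approach; not the actual
    pymc3.math.log1mexp_numpy implementation.
    """
    if x < math.log(2.0):
        # small x: 1 - exp(-x) is tiny, so use expm1 to keep precision
        return math.log(-math.expm1(-x))
    # large x: exp(-x) is tiny, so log1p keeps precision
    return math.log1p(-math.exp(-x))

# A naive log(1 - exp(-x)) underflows to log(0.0) for x this small;
# the stable version stays finite:
print(log1mexp(1e-20))
```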
Release manager for 3.11.1: Michael Osthege (@michaelosthege)

v3.11.0 Changes
January 21, 2021

This release breaks some APIs w.r.t. 3.10.0. It also brings some dreadfully awaited fixes, so be sure to go through the (breaking) changes below.

Breaking Changes
- Many plotting and diagnostic functions that were just aliasing ArviZ functions were removed (see #4397). This includes `pm.summary`, `pm.traceplot`, `pm.ess` and many more!
- We now depend on Theano-PyMC version 1.1.0 exactly (see #4405). Major refactorings were done in Theano-PyMC 1.1.0. If you implement custom `Op`s or interact with Theano in any way yourself, make sure to read the Theano-PyMC 1.1.0 release notes.
- Python 3.6 support was dropped (by no longer testing) and Python 3.9 was added (see #4332).
- Changed shape behavior: no longer collapse length-1 vector shape into scalars (see #4206 and #4214).
  - Applies to random variables and also the `.random(size=...)` kwarg!
  - To create scalar variables you must now use `shape=None` or `shape=()`.
  - `shape=(1,)` and `shape=1` now become vectors. Previously they were collapsed into scalars.
  - 0-length dimensions are now ruled illegal for random variables and raise a `ValueError`.
- In `sample_prior_predictive` the `vars` kwarg was removed in favor of `var_names` (see #4327).
- Removed `theanof.set_theano_config` because it illegally changed Theano's internal state (see #4329).
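The new scalar-vs-vector rule mirrors NumPy's own distinction between 0-d arrays and length-1 vectors. An illustrative analogy (NumPy arrays, not PyMC3 random variables):

```python
import numpy as np

# Under the 3.11.0 rule, shape=() gives a true scalar draw, while
# shape=(1,) keeps the length-1 dimension instead of collapsing it.
# The same distinction in NumPy:
scalar = np.zeros(())     # like shape=() / shape=None -> 0-dimensional
vector = np.zeros((1,))   # like shape=(1,) / shape=1 -> a length-1 vector

print(scalar.shape, scalar.ndim)  # () 0
print(vector.shape, vector.ndim)  # (1,) 1
```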
New Features
- Option to set `check_bounds=False` when instantiating `pymc3.Model()`. This turns off bounds checks that ensure that input parameters of distributions are valid. For correctly specified models, this is unnecessary as all parameters get automatically transformed so that all values are valid. Turning this off should lead to faster sampling (see #4377).
- `OrderedProbit` distribution added (see #4232).
- `plot_posterior_predictive_glm` now works with `arviz.InferenceData` as well (see #4234).
- Added a `logcdf` method to all univariate discrete distributions (see #4387).
- Added a `random` method to `MvGaussianRandomWalk` (see #4388).
- `AsymmetricLaplace` distribution added (see #4392).
- `DirichletMultinomial` distribution added (see #4373).
- Added a new `predict` method to `BART` to compute out-of-sample predictions (see #4310).
Maintenance
- Fixed a bug whereby partial traces returned after a keyboard interrupt during parallel sampling had fewer draws than would've been available (#4318).
- Make `sample_shape` the same across all contexts in `draw_values` (see #4305).
- The notebook gallery has been moved to https://github.com/pymc-devs/pymc-examples (see #4348).
- `math.logsumexp` now matches `scipy.special.logsumexp` when arrays contain infinite values (see #4360).
- Fixed the mathematical formulation in the `MvStudentT` random method (see #4359).
- Fix an issue in the `logp` method of `HyperGeometric`: it now returns `-inf` for invalid parameters (see #4367).
- Fixed the `MatrixNormal` random method to work with parameters as random variables (see #4368).
- Update the `logcdf` method of several continuous distributions to return `-inf` for invalid parameters and values, and raise an informative error when multiple values cannot be evaluated in a single call (see #4393 and #4421).
- Improve numerical stability in the `logp` and `logcdf` methods of `ExGaussian` (see #4407).
- Issue a UserWarning when doing prior or posterior predictive sampling with models containing `Potential` factors (see #4419).
- The Dirichlet distribution's `random` method is now optimized and gives outputs in the correct shape (see #4416).
- Attempting to sample a named model with SMC will now raise a `NotImplementedError` (see #4365).
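For intuition, the behavior being matched (`scipy.special.logsumexp` with `-inf` entries) can be sketched in pure Python using the usual max trick; this handles only the `-inf` case described above and is not PyMC3's implementation:

```python
import math

def logsumexp(xs):
    """log(sum(exp(x) for x in xs)), stabilized via the max trick.

    Sketch of the -inf-aware behavior described above; entries of
    -inf contribute exp(-inf) == 0 to the sum, and an all--inf input
    yields -inf rather than a nan from the naive formula.
    """
    m = max(xs)
    if math.isinf(m) and m < 0:
        # every term is exp(-inf) == 0, so the log-sum is -inf
        return -math.inf
    return m + math.log(sum(math.exp(x - m) for x in xs))

print(logsumexp([0.0, -math.inf]))  # log(1 + 0) == 0.0
```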
Release manager for 3.11.0: Eelke Spaak (@Spaak)

v3.10.0 Changes
December 07, 2020

This is a major release with many exciting new features. The biggest change is that we now rely on our own fork, Theano-PyMC. This is in line with our big announcement about our commitment to PyMC3 and Theano.

When upgrading, make sure that `Theano-PyMC` and not `Theano` is installed (the imports remain unchanged, however). If not, you can uninstall `Theano`:

    conda remove theano

And to install:

    conda install -c conda-forge theano-pymc

Or, if you are using pip (not recommended):

    pip uninstall theano

And to install:

    pip install theano-pymc

This new version of `Theano-PyMC` comes with an experimental JAX backend which, when combined with the new and experimental JAX samplers in PyMC3, can greatly speed up sampling in your model. As this is still very new, please do not use it in production yet, but do test it out and let us know if anything breaks and what results you are seeing, especially speed-wise.

New features
- New experimental JAX samplers in `pymc3.sample_jax` (see notebook and #4247). Requires JAX and either TFP or numpyro.
- Add MLDA, a new stepper for multilevel sampling. MLDA can be used when a hierarchy of approximate posteriors of varying accuracy is available, offering improved sampling efficiency especially in high-dimensional problems and/or where gradients are not available (see #3926).
- Add Bayesian Additive Regression Trees (BARTs) (see #4183).
- Added the `pymc3.gp.cov.Circular` kernel for Gaussian Processes on circular domains, e.g. the unit circle (see #4082).
- Added a new `MixtureSameFamily` distribution to handle mixtures of arbitrary dimensions in vectorized form for improved speed (see #4185).
- `sample_posterior_predictive_w` can now feed on `xarray.Dataset`, e.g. from `InferenceData.posterior` (see #4042).
- Change the SMC metropolis kernel to an independent metropolis kernel (see #4115).
- Add an alternative parametrization to the NegativeBinomial distribution in terms of n and p (see #4126).
- Added semantically meaningful `str` representations to PyMC3 objects for console, notebook, and GraphViz use (see #4076, #4065, #4159, #4217, #4243, and #4260).
- Add the Discrete HyperGeometric distribution (see #4249).
Maintenance
- Switch the dependency of Theano to our own fork, Theano-PyMC.
- Removed non-NDArray (Text, SQLite, HDF5) backends and associated tests.
- Use dill to serialize user-defined logp functions in `DensityDist`. The previous serialization code failed if it was used in notebooks on Windows and Mac. `dill` is now a required dependency (see #3844).
- Fixed numerical instability in ExGaussian's logp by preventing `logpow` from returning `-inf` (see #4050).
- Numerically improved the stick-breaking transformation, e.g. for the `Dirichlet` distribution (see #4129).
- Enabled the `Multinomial` distribution to handle batch sizes that have more than 2 dimensions (see #4169).
- Test the model logp before starting any MCMC chains (see #4211).
- Fix a bug in `model.check_test_point` that caused the `test_point` argument to be ignored (see PR #4211).
- Refactored the MvNormal.random method with better handling of sample, batch and event shapes (see #4207).
- The `InverseGamma` distribution now implements a `logcdf` (see #3944).
- Make the starting jitter methods for NUTS sampling more robust by resampling values that lead to non-finite probabilities. A new optional argument `jitter_max_retries` can be passed to `pm.sample()` and `pm.init_nuts()` to control the maximum number of retries per chain (see #4298).
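The stick-breaking construction mentioned above maps K-1 fractions in (0, 1) to a point on the K-simplex by repeatedly breaking off pieces of a unit-length stick. A minimal illustration of the construction (not PyMC3's actual transform, which operates on unconstrained reals and includes a normalizing term):

```python
import math

def stick_breaking(fractions):
    """Map K-1 fractions in (0, 1) to a point on the K-simplex.

    Minimal sketch of the stick-breaking idea behind the transform
    discussed above; each fraction breaks off a piece of whatever
    stick remains, and the leftover becomes the last weight.
    """
    probs, remaining = [], 1.0
    for f in fractions:
        probs.append(remaining * f)
        remaining *= (1.0 - f)
    probs.append(remaining)
    return probs

p = stick_breaking([0.5, 0.5])
print(p)  # [0.5, 0.25, 0.25]
print(math.isclose(sum(p), 1.0))  # True
```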
Documentation
- Added a new notebook demonstrating how to incorporate sampling from a conjugate Dirichlet-multinomial posterior density in conjunction with other step methods (see #4199).
- Mentioned the way to do any random walk with `theano.tensor.cumsum()` in the `GaussianRandomWalk` docstrings (see #4048).
Release manager for 3.10.0: Eelke Spaak (@Spaak)

v3.9.3 Changes
August 11, 2020

New features
- Introduce optional arguments to `pm.sample`: `mp_ctx` to control how the processes for parallel sampling are started, and `pickle_backend` to specify which library is used to pickle models in parallel sampling when the multiprocessing context is not of type `fork` (see #3991).
- Add sampler stats `process_time_diff`, `perf_counter_diff` and `perf_counter_start`, which record wall and CPU times for each NUTS and HMC sample (see #3986).
- Extend `keep_size` argument handling for `sample_posterior_predictive` and `fast_sample_posterior_predictive`, to work on ArviZ `InferenceData` and xarray `Dataset` input values (see PR #4006 and issue #4004).
- SMC-ABC: add the Wasserstein and energy distance functions. Refactor the API: the distance, sum_stats and epsilon arguments are now passed to `pm.Simulator` instead of `pm.sample_smc`. Add a random method to `pm.Simulator`. Add an option to save the simulated data. Improved LaTeX representation (see #3996).
- SMC-ABC: allow use of potentials by adding them to the prior term (see #4016).
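The `mp_ctx` option described above is about multiprocessing start methods, which the standard library exposes directly. An illustrative stdlib-only sketch (not a `pm.sample` call): "fork" copies the parent process, while "spawn" launches a fresh interpreter and therefore needs a pickleable model, which is where the `pickle_backend` option comes in.

```python
import multiprocessing

# Which start methods does this platform offer? Typically some subset
# of 'fork', 'spawn' and 'forkserver'.
print(multiprocessing.get_all_start_methods())

# A context object of the kind mp_ctx expects; workers created from it
# start as fresh interpreters rather than forked copies of the parent.
ctx = multiprocessing.get_context("spawn")
```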
Maintenance
- Fix an error on Windows and Mac where the error message from unpickling models did not show up in the notebook, or where sampling froze when a worker process crashed (see #3991).
- Require Theano >= 1.0.5 (see #4032).
Documentation
- The notebook on multilevel modeling has been rewritten to showcase ArviZ and xarray usage for inference result analysis (see #3963).
NB: The `docs/*` folder is still removed from the tarball due to an upload size limit on PyPI.

Release manager for 3.9.3: Kyle Beauchamp (@kyleabeauchamp)

v3.9.2 Changes
June 24, 2020

Maintenance
- Warning added in the GP module when `input_dim` is lower than the number of columns in `X` to compute the covariance function (see #3974).
- Pass the `tune` argument from `sample` when using `advi+adapt_diag_grad` (see issue #3965, fixed by #3979).
- Add a simple test case for the new coords and dims feature in `pm.Model` (see #3977).
- Require ArviZ >= 0.9.0 (see #3977).
- Fixed issue #3962 by making a change in the `_random()` method of the `GaussianRandomWalk` class (see PR #3985). Further testing revealed a new issue which is being tracked by #4010.

NB: The `docs/*` folder is still removed from the tarball due to an upload size limit on PyPI.

Release manager for 3.9.2: Alex Andorra (@AlexAndorra)

v3.9.1 Changes
June 16, 2020

The `v3.9.0` upload to PyPI didn't include a tarball, which is fixed in this release. Though we had to temporarily remove the `docs/*` folder from the tarball due to a size limit.

Release manager for 3.9.1: Michael Osthege (@michaelosthege)

v3.9.0 Changes
June 16, 2020

New features
- Use fastprogress instead of tqdm (see #3693).
- `DEMetropolis` can now tune both `lambda` and `scaling` parameters, but by default neither of them is tuned. See #3743 for more info.
- `DEMetropolisZ`, an improved variant of `DEMetropolis`, brings better parallelization and higher efficiency with fewer chains, at the cost of slower initial convergence. This implementation is experimental. See #3784 for more info.
- Notebooks that give insight into `DEMetropolis`, `DEMetropolisZ` and the `DifferentialEquation` interface are now located in the Tutorials/Deep Dive section.
- Add `fast_sample_posterior_predictive`, a vectorized alternative to `sample_posterior_predictive`. This alternative is substantially faster for large models.
- GP covariance functions can now be exponentiated by a scalar. See PR #3852.
- `sample_posterior_predictive` can now feed on `xarray.Dataset`, e.g. from `InferenceData.posterior` (see #3846).
- `SamplerReport` (`MultiTrace.report`) now has properties `n_tune`, `n_draws`, `t_sampling` for increased convenience (see #3827).
- `pm.sample(..., return_inferencedata=True)` can now directly return the trace as `arviz.InferenceData` (see #3911).
- `pm.sample` now has support for adapting a dense mass matrix using `QuadPotentialFullAdapt` (see #3596, #3705, #3858, and #3893). Use `init="adapt_full"` or `init="jitter+adapt_full"` to enable it.
- `Moyal` distribution added (see #3870).
- `pm.LKJCholeskyCov` now automatically computes and returns the unpacked Cholesky decomposition, the correlations and the standard deviations of the covariance matrix (see #3881).
- The `pm.Data` container can now be used for index variables, i.e. with integer data and not only floats (issue #3813, fixed by #3925).
- The `pm.Data` container can now be used as input for other random variables (issue #3842, fixed by #3925).
- Allow users to specify coordinates and dimension names instead of numerical shapes when specifying a model. This makes interoperability with ArviZ easier (see #3551).
- The Plots and Stats API sections now link to the ArviZ documentation (see #3927).
- Add `SamplerReport` with properties `n_draws`, `t_sampling` and `n_tune` to SMC. `n_tune` is always 0 (see #3931).
- SMC-ABC: add the option to define summary statistics, allow sampling from more complex models, remove redundant distances (see #3940).
Maintenance
- Tuning results no longer leak into sequentially sampled `Metropolis` chains (see #3733 and #3796).
- We'll deprecate the `Text` and `SQLite` backends and the `save_trace`/`load_trace` functions, since this is now done with ArviZ (see #3902).
- ArviZ `v0.8.3` is now the minimum required version.
- In named models, `pm.Data` objects now get model-relative names (see #3843).
- `pm.sample` now takes 1000 draws and 1000 tuning samples by default, instead of 500 previously (see #3855).
- Moved the argument division out of `NegativeBinomial`'s `random` method. Fixes #3864 in the style of #3509.
- The Dirichlet distribution now raises a ValueError when it's initialized with <= 0 values (see #3853).
- Dtype bugfix in `MvNormal` and `MvStudentT` (see #3836).
- The end-of-sampling report now uses `arviz.InferenceData` internally and avoids storing the pointwise log likelihood (see #3883).
- The multiprocessing start method on MacOS is now set to "forkserver", to avoid crashes (see issue #3849, solved by #3919).
- The AR1 logp now uses the precision of the whole AR1 process instead of just the innovation precision (see issue #3892, fixed by #3899).
- Forced the `Beta` distribution's `random` method to generate samples that are in the open interval $(0, 1)$, i.e. no value can be equal to zero or equal to one (issue #3898, fixed by #3924).
- Fixed an issue that happened on Windows and was introduced by the clipped beta distribution rvs function (#3924). Windows does not support the `float128` dtype, but we had assumed that it had to be available. The solution was to only support `float128` on Linux and Darwin systems (see issue #3929, fixed by #3930).
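The "open interval" fix above amounts to nudging boundary draws of 0.0 or 1.0 to the nearest representable interior float. A sketch of that idea using `math.nextafter` (available from Python 3.9; PyMC3's actual fix clips with a dtype-dependent epsilon, so this is only an illustration):

```python
import math

def clip_to_open_unit_interval(x: float) -> float:
    """Nudge boundary draws into the open interval (0, 1).

    Illustrative sketch of the clipping idea described above, not
    PyMC3's implementation. Assumes Python >= 3.9 for math.nextafter.
    """
    lo = math.nextafter(0.0, 1.0)   # smallest float strictly above 0
    hi = math.nextafter(1.0, 0.0)   # largest float strictly below 1
    return min(max(x, lo), hi)

print(0.0 < clip_to_open_unit_interval(0.0) < 1.0)  # True
print(0.0 < clip_to_open_unit_interval(1.0) < 1.0)  # True
```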
Deprecations
- Remove `sample_ppc` and `sample_ppc_w` that were deprecated in 3.6.
- Deprecated `sd` has been replaced by `sigma` (already in version 3.7) in continuous, mixed and timeseries distributions, and a `DeprecationWarning` is now raised when `sd` is used (see #3837 and #3688).
- Dropped some deprecated kwargs and functions (see #3906).
- Dropped the outdated 'nuts' initialization method for `pm.sample` (see #3863).
Release manager for 3.9.0: Michael Osthege (@michaelosthege)

v3.8 Changes
November 29, 2019

New features
- Implemented a robust U-turn check in NUTS (similar to stan-dev/stan#2800). See PR #3605.
- Add capabilities to do inference on parameters in a differential equation with `DifferentialEquation`. See #3590 and #3634.
- Distinguish between `Data` and `Deterministic` variables when graphing models with graphviz. PR #3491.
- The Sequential Monte Carlo - Approximate Bayesian Computation step method is now available. The implementation is in an experimental stage and will be further improved.
- Added the `Matern12` covariance function for Gaussian processes. This is the Matern kernel with nu=1/2.
- The progressbar reports the number of divergences in real time, when available (#3547).
- Sampling from the variational approximation now allows for alternative trace backends (#3550).
- The infix `@` operator now works with random variables and deterministics (#3619).
- ArviZ is now a requirement, and handles plotting, diagnostics, and statistical checks.
- Can use GaussianRandomWalk in sample_prior_predictive and sample_posterior_predictive (#3682).
- Now 11 years of S&P returns in the data set (#3682).
Maintenance
- Moved math operations out of the `Rice`, `TruncatedNormal`, `Triangular` and `ZeroInflatedNegativeBinomial` `random` methods. Math operations on values returned by `draw_values` might not broadcast well, and all the `size`-aware broadcasting is left to `generate_samples`. Fixes #3481 and #3508.
- Parallelization of population steppers (`DEMetropolis`) is now set via the `cores` argument (#3559).
- Fixed a bug in `Categorical.logp`. In the case of multidimensional `p`'s, the indexing was done wrong, leading to incorrectly shaped tensors that consumed `O(n**2)` memory instead of `O(n)`. This fixes issue #3535.
- Fixed a defect in `OrderedLogistic.__init__` that unnecessarily increased the dimensionality of the underlying `p`. Related to issue #3535 but was not the true cause of it.
- SMC: stabilize the covariance matrix (#3573).
- SMC is no longer a step method of `pm.sample`; it should now be called using `pm.sample_smc` (#3579).
- SMC: improve computation of the proposal scaling factor (#3594 and #3625).
- SMC: reduce the number of logp evaluations (#3600).
- SMC: remove the `scaling` and `tune_scaling` arguments, as it is a better idea to always allow SMC to automatically compute the scaling factor (#3625).
- Now uses `multiprocessing` rather than `psutil` to count CPUs, which results in reliable core counts on Chromebooks.
- `sample_posterior_predictive` now preallocates the memory required for its output to improve memory usage. Addresses problems raised in this discourse thread.
- Wrapped `DensityDist.rand` with `generate_samples` to make it aware of the distribution's shape. Added control flow attributes to still be able to behave as in earlier versions, and to control how to interpret the `size` parameter in the `random` callable signature. Fixes #3553.
- Added `theano.gof.graph.Constant` to type checks done in `_draw_value` (fixes issue #3595).
- `HalfNormal` did not used to work properly in `draw_values`, `sample_prior_predictive`, or `sample_posterior_predictive` (fixes issue #3686).
- Random variable transforms were inadvertently left out of the API documentation. Added them (see PR #3690).
- Refactored `pymc3.model.get_named_nodes_and_relations` to use the ancestors and descendants, in a way that is consistent with `theano`'s naming convention.
- Changed the way in which `pymc3.model.get_named_nodes_and_relations` computes nodes without ancestors, to make it robust to changes in var_name orderings (issue #3643).

v3.7 Changes
May 29, 2019

New features
- Add a data container class (`Data`) that wraps the theano `SharedVariable` class and lets the model be aware of its inputs and outputs.
- Add a function `set_data` to update variables defined as `Data`.
- `Mixture` now supports mixtures of multidimensional probability distributions, not just lists of 1D distributions.
- `GLM.from_formula` and `LinearComponent.from_formula` can extract variables from the calling scope. Customizable via the new `eval_env` argument. Fixes #3382.
- Added the `distributions.shape_utils` module with functions used to help broadcast samples drawn from distributions using the `size` keyword argument.
- Used `numpy.vectorize` in `distributions.distribution._compile_theano_function`. This enables `sample_prior_predictive` and `sample_posterior_predictive` to ask for tuples of samples instead of just integers. This fixes issue #3422.
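The `numpy.vectorize` mechanism mentioned in the last item is what lets a scalar function accept tuple-shaped sample requests: it broadcasts the scalar function elementwise over array inputs. An illustrative stand-alone example (plain NumPy, not the PyMC3 internals):

```python
import numpy as np

# np.vectorize wraps a scalar function so it applies elementwise over
# arrays of any shape -- the ingredient that allows tuple-shaped sample
# requests instead of a single integer count.
shift = np.vectorize(lambda mu: mu + 1.0)

out = shift(np.zeros((2, 3)))  # a (2, 3)-shaped request, applied elementwise
print(out.shape)  # (2, 3)
```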
Maintenance
- All occurrences of `sd` as a parameter name have been renamed to `sigma`. `sd` will continue to function for backwards compatibility.
- `HamiltonianMC` was ignoring certain arguments like `target_accept`, and not using the custom step size jitter function with expectation 1.
- Made `BrokenPipeError` for parallel sampling more verbose on Windows.
- Added the `broadcast_distribution_samples` function that helps broadcasting arrays of drawn samples, taking into account the requested `size` and the inferred distribution shape. This sometimes is needed by distributions that call several `rvs` separately within their `random` method, such as the `ZeroInflatedPoisson` (fixes issue #3310).
- The `Wald`, `Kumaraswamy`, `LogNormal`, `Pareto`, `Cauchy`, `HalfCauchy`, `Weibull` and `ExGaussian` distributions' `random` method used a hidden `_random` function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this (similar to issue #3310).
- Added a fix to allow the imputation of single missing values of observed data, which previously would fail (fixes issue #3122).
- The `draw_values` function was too permissive with what could be grabbed from inside `point`, which led to an error when sampling posterior predictives of variables that depended on shared variables that had changed their shape after `pm.sample()` had been called (fixes issue #3346).
- `draw_values` now adds the theano graph descendants of `TensorConstant` or `SharedVariables` to the named relationship nodes stack, only if these descendants are `ObservedRV` or `MultiObservedRV` instances (fixes issue #3354).
- Fixed a bug in `broadcast_distribution_samples`, which did not correctly handle cases in which some samples did not have the size tuple prepended.
- Changed `MvNormal.random`'s usage of `tensordot` for Cholesky-encoded covariances. This led to wrong axis broadcasting and seemed to be the cause of issue #3343.
- Fixed a defect in `Mixture.random` when multidimensional mixtures were involved. The mixture component was not preserved across all the elements of the dimensions of the mixture. This meant that the correlations across elements within a given draw of the mixture were partly broken.
- Restructured `Mixture.random` to allow better use of vectorized calls to `comp_dists.random`.
- Added tests for mixtures of multidimensional distributions to the test suite.
- Fixed incorrect usage of `broadcast_distribution_samples` in `DiscreteWeibull`.
- `Mixture`'s default dtype is now determined by `theano.config.floatX`.
- `dist_math.random_choice` now handles ndarrays of category probabilities, and also handles sizes that are not `None`. Also removed the unused `k` kwarg from `dist_math.random_choice`.
- Changed `Categorical.mode` to preserve all the dimensions of `p` except the last one, which encodes each category's probability.
- Changed initialization of `Categorical.p`. `p` is now normalized to sum to 1 inside `logp` and `random`, but not during initialization. This could hide negative values supplied to `p`, as mentioned in #2082.
- `Categorical` now accepts elements of `p` equal to 0. `logp` will return `-inf` if there are `values` that index to the zero-probability categories.
- Add `sigma`, `tau`, and `sd` to the signature of `NormalMixture`.
- Set default lower and upper values of -inf and inf for `pm.distributions.continuous.TruncatedNormal`. This avoids errors caused by their previous values of None (fixes issue #3248).
- Converted all calls to `pm.distributions.bound._ContinuousBounded` and `pm.distributions.bound._DiscreteBounded` to use only and all positional arguments (fixes issue #3399).
- Restructured `distributions.distribution.generate_samples` to use the `shape_utils` module. This solves issues #3421 and #3147 by using the `size`-aware broadcasting functions in `shape_utils`.
- Fixed the `Multinomial.random` and `Multinomial.random_` methods to make them compatible with the new `generate_samples` function. In the process, a bug in the `Multinomial.random_` shape handling was discovered and fixed.
- Fixed a defect found in `Bound.random` where the `point` dictionary was passed to `generate_samples` as an `arg` instead of in `not_broadcast_kwargs`.
- Fixed a defect found in `Bound.random_` where `total_size` could end up as a `float64` instead of being an integer if given `size=tuple()`.
- Fixed an issue in `model_graph` that caused construction of the graph of the model for rendering to hang: replaced a search over the powerset of the nodes with a breadth-first search over the nodes. Fix for #3458.
- Removed variable annotations from `model_graph` but left type hints (fix for #3465). This means that we support `python>=3.5.4`.
- Default `target_accept` for `HamiltonianMC` is now 0.65, as suggested in Beskos et al. 2010 and Neal 2001.
- Fixed a bug in `draw_values` that led to intermittent errors in Python 3.5. This happened with some deterministic nodes that were drawn but not added to `givens`.
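The `Categorical.p` change above (normalize inside `logp` and `random`, but reject rather than hide negative weights) can be sketched in a few lines of plain Python. This is an illustration of the described behavior, not the `Categorical` implementation itself:

```python
def normalize_probs(p):
    """Normalize category weights to sum to 1, rejecting negatives.

    Sketch of the behavior described above: normalization happens at
    evaluation time, and negative weights raise instead of being
    silently absorbed.
    """
    if any(w < 0 for w in p):
        raise ValueError("category probabilities must be non-negative")
    total = sum(p)
    return [w / total for w in p]

print(normalize_probs([2, 1, 1]))  # [0.5, 0.25, 0.25]
```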
Deprecations
- `nuts_kwargs` and `step_kwargs` have been deprecated in favor of using the standard `kwargs` to pass optional step method arguments.
- `SGFS` and `CSG` have been removed (fix for #3353). They have been moved to pymc3-experimental.
- References to `live_plot` and corresponding notebooks have been removed.
- Function `approx_hessian` was removed, due to `numdifftools` becoming incompatible with current `scipy`. The function was already optional, only available to a user who installed `numdifftools` separately, and not hit on any common codepaths (#3485).
- Deprecated the `vars` parameter of `sample_posterior_predictive` in favor of `var_names`.
- Deprecated the `vars` parameters of `sample_posterior_predictive` and `sample_prior_predictive` in favor of `var_names`. At least for the latter, this is more accurate, since the `vars` parameter actually took names.
Contributors sorted by number of commits
45 Luciano Paz 38 Thomas Wiecki 23 Colin Carroll 19 Junpeng Lao 15 Chris Fonnesbeck 13 Juan Martín Loyola 13 Ravin Kumar 8 Robert P. Goldman 5 Tim Blazina 4 chang111 4 adamboche 3 Eric Ma 3 Osvaldo Martin 3 Sanmitra Ghosh 3 Saurav Shekhar 3 chartl 3 fredcallaway 3 Demetri 2 Daisuke Kondo 2 David Brochart 2 George Ho 2 Vaibhav Sinha 1 rpgoldman 1 Adel Tomilova 1 Adriaan van der Graaf 1 Bas Nijholt 1 Benjamin Wild 1 Brigitta Sipocz 1 Daniel Emaasit 1 Hari 1 Jeroen 1 Joseph Willard 1 Juan Martin Loyola 1 Katrin Leinweber 1 Lisa Martin 1 M. Domenzain 1 Matt Pitkin 1 Peadar Coyle 1 Rupal Sharma 1 Tom Gilliss 1 changjiangeng 1 michaelosthege 1 monsta 1 579397