Changelog History

  • v3.7.rc1

    May 22, 2019
  • v3.6 Changes

    December 21, 2018

    This is a major new release from 3.5 with many new features and important bugfixes. The highlight is certainly our completely revamped website: https://docs.pymc.io/

    Note also that this release will be the last to be compatible with Python 2. Thanks to all contributors!

    New features

    • Track the model log-likelihood as a sampler stat for NUTS and HMC samplers
      (accessible as trace.get_sampler_stats('model_logp')) (#3134)
    • Add Incomplete Beta function incomplete_beta(a, b, value)
    • Add log CDF functions to continuous distributions: Beta, Cauchy, ExGaussian, Exponential, Flat, Gumbel, HalfCauchy, HalfFlat, HalfNormal, Laplace, Logistic, Lognormal, Normal, Pareto, StudentT, Triangular, Uniform, Wald, Weibull.
    • sample_posterior_predictive now produces posterior predictive samples, in order, from all values of the trace. Previously, by default it would produce one chain's worth of samples, using a random selection from the trace (#3212)
    • Show diagnostics for initial energy errors in HMC and NUTS.
    • PR #3273 has added the distributions.distribution._DrawValuesContext context
      manager. It stores the values already drawn in nested random
      and draw_values calls, enabling draw_values to draw samples from the
      joint probability distribution of RVs rather than from their marginals. Custom
      distributions that must call draw_values several times in their random
      method, or that invoke many calls to other distributions' random methods
      (e.g. mixtures), must make all of these calls under the same _DrawValuesContext
      instance. If they do not, the conditional relations between
      the distribution's parameters can be broken, and random can return
      values drawn from an incorrect distribution.
    • The Rice distribution can now be defined with either the noncentrality parameter or the shape parameter (#3287).
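Log CDF functions matter numerically: evaluating log(CDF(x)) directly underflows deep in a distribution's tail. A minimal standalone sketch for the Normal case (illustrative only; PyMC3's implementations operate on Theano tensors):

```python
import math

def normal_logcdf_naive(x):
    # log of the CDF computed directly; the CDF itself underflows
    # to 0.0 far in the left tail, losing all information
    p = 0.5 * math.erfc(-x / math.sqrt(2.0))
    return math.log(p) if p > 0.0 else float("-inf")

def normal_logcdf(x):
    # stable variant: switch to the leading term of the asymptotic
    # expansion log Phi(x) ~ -x^2/2 - log(-x) - log(2*pi)/2 for x << 0
    if x < -10.0:
        return -0.5 * x * x - math.log(-x) - 0.5 * math.log(2.0 * math.pi)
    return normal_logcdf_naive(x)
```

At x = -40 the naive version returns -inf, while the stable one returns roughly -804.6.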

    Maintenance

    • Big rewrite of documentation (#3275)
    • Fixed Triangular distribution c attribute handling in random and updated sample codes for consistency (#3225)
    • Refactor SMC and properly compute marginal likelihood (#3124)
    • Removed use of the deprecated ymin keyword in matplotlib's Axes.set_ylim (#3279)
    • Fix for #3210: distribution.draw_values(params) now draws the params values from their joint probability distribution and not from combinations of their marginals (refer to PR #3273).
    • Removed dependence on pandas-datareader for retrieving Yahoo Finance data in examples (#3262)
    • Rewrote the Multinomial._random method to better handle shape broadcasting (#3271)
    • Fixed the Rice distribution, which inconsistently mixed two parametrizations (#3286).
    • The Rice distribution now accepts multiple parameters and observations and is usable with NUTS (#3289).
    • sample_posterior_predictive no longer calls draw_values to initialize the shape of the ppc trace. That call could lead to ValueErrors when sampling the ppc from a model with Flat or HalfFlat prior distributions (fixes issue #3294).
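The joint-versus-marginals distinction that _DrawValuesContext addresses can be seen with a toy generator (plain Python, not the PyMC3 API): if b depends on a, drawing a and b in separate passes destroys the dependence.

```python
import random

def draw_joint(rng):
    # one pass: b is drawn conditional on the a from the same pass
    a = rng.gauss(0.0, 1.0)
    b = rng.gauss(a, 0.1)
    return a, b

def draw_marginals(rng):
    # two independent passes: a and b come from different draws,
    # so their coupling is broken (the bug the context manager fixes)
    a, _ = draw_joint(rng)
    _, b = draw_joint(rng)
    return a, b

def corr(pairs):
    # Pearson correlation of a list of (a, b) samples
    n = len(pairs)
    ma = sum(a for a, _ in pairs) / n
    mb = sum(b for _, b in pairs) / n
    cov = sum((a - ma) * (b - mb) for a, b in pairs) / n
    va = sum((a - ma) ** 2 for a, _ in pairs) / n
    vb = sum((b - mb) ** 2 for _, b in pairs) / n
    return cov / (va * vb) ** 0.5

rng = random.Random(42)
joint = [draw_joint(rng) for _ in range(4000)]
marginal = [draw_marginals(rng) for _ in range(4000)]
# corr(joint) is close to 1; corr(marginal) is close to 0
```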

    Deprecations

    • Renamed sample_ppc() and sample_ppc_w() to sample_posterior_predictive() and sample_posterior_predictive_w(), respectively.
  • v3.6.rc1

    December 19, 2018
  • v3.5 Changes

    July 21, 2018

    New features

    • Add documentation section on survival analysis and censored data models
    • Add check_test_point method to pm.Model
    • Add Ordered Transformation and OrderedLogistic distribution
    • Add Chain transformation
    • Improve the error message "Mass matrix contains zeros on the diagonal. Some derivatives might always be zero" during tuning of pm.sample
    • Improve the error message "NaN occurred in optimization." during ADVI
    • Save and load traces without pickle using pm.save_trace and pm.load_trace
    • Add Kumaraswamy distribution
    • Add TruncatedNormal distribution
    • Rewrite parallel sampling of multiple chains on py3. This resolves
      long standing issues when transferring large traces to the main process,
      avoids pickling issues on UNIX, and allows us to show a progress bar
      for all chains. If parallel sampling is interrupted, we now return
      partial results.
    • Add sample_prior_predictive which allows for efficient sampling from
      the unconditioned model.
    • SMC: remove the experimental warning, allow sampling using sample, reduce autocorrelation from the
      final trace.
    • Add model_to_graphviz (which uses the optional dependency graphviz) to
      plot a directed graph of a PyMC3 model using plate notation.
    • Add beta-ELBO variational inference as in the beta-VAE model (Christopher P. Burgess et al., NIPS 2017)
    • Add __dir__ to SingleGroupApproximation to improve autocompletion in interactive environments
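The Ordered transformation maps an unconstrained vector to a strictly increasing one so samplers can work in free space. A sketch of the log-exp-cumsum construction commonly used for ordered transforms (illustrative; PyMC3's transform operates on Theano tensors and also supplies the log-Jacobian term):

```python
import math

def to_increasing(x):
    # unconstrained -> strictly increasing:
    # y[0] = x[0], y[i] = y[i-1] + exp(x[i]) for i > 0
    y = [x[0]]
    for xi in x[1:]:
        y.append(y[-1] + math.exp(xi))
    return y

def to_unconstrained(y):
    # inverse: x[0] = y[0], x[i] = log(y[i] - y[i-1])
    x = [y[0]]
    for prev, cur in zip(y, y[1:]):
        x.append(math.log(cur - prev))
    return x
```

Round-tripping any vector recovers it, and the forward map always yields a sorted result, regardless of the input's order.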

    Fixes

    • Fixed grammar in the divergence warning: previously messages like "There were 1 divergences ..." could be raised.
    • Fixed a KeyError raised when only a subset of variables was specified to be recorded in the trace.
    • Removed unused repeat=None arguments from all random() methods in distributions.
    • Deprecated the sigma argument in MarginalSparse.marginal_likelihood in favor of noise
    • Fixed unexpected behavior in random. The random functionality is now more robust and will work better for sample_prior when that is implemented.
    • Fixed scale_cost_to_minibatch behaviour; previously this was not working and was always False
  • v3.5.rc1

    July 15, 2018
  • v3.4.1 Changes

    April 19, 2018

    There was no 3.4 release due to a naming issue on PyPI.

    New features

    • Add logit_p keyword to pm.Bernoulli, so that users can specify the logit of the success probability. This is faster and more stable than using p=tt.nnet.sigmoid(logit_p).
    • Add random keyword to pm.DensityDist, thus enabling users to pass a custom random method which in turn makes sampling from a DensityDist possible.
    • Effective sample size computation is updated. The estimation uses Geyer's initial positive sequence, which no longer truncates the autocorrelation series inaccurately. pm.diagnostics.effective_n can now report N_eff > N.
    • Added KroneckerNormal distribution and a corresponding MarginalKron
      Gaussian Process implementation for efficient inference, along with
      lower-level functions such as cartesian and kronecker products.
    • Added Coregion covariance function.
    • Add new 'pairplot' function for plotting scatter or hexbin matrices of sampled parameters.
      Optionally it can plot divergences.
    • Plots of discrete distributions in the docstrings
    • Add logitnormal distribution
    • Densityplot: add support for discrete variables
    • Fix the Binomial likelihood in .glm.families.Binomial, with the flexibility of specifying the n.
    • Add offset kwarg to .glm.
    • Changed the compare function to accept a dictionary of model-trace pairs instead of two separate lists of models and traces.
    • Add test and support for creating multivariate mixtures and mixtures of mixtures
    • distribution.draw_values is now also able to draw values from conditionally dependent RVs, such as autotransformed RVs (refer to PR #2902).
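The logit_p keyword avoids the round trip through sigmoid: the Bernoulli log-probability can be written directly in terms of the logit via softplus, which never underflows. A plain-Python sketch of why this is more stable (helper names are hypothetical; PyMC3's version is written in Theano):

```python
import math

def softplus(x):
    # numerically stable log(1 + exp(x))
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def bernoulli_logp_from_p(value, p):
    # naive route: for extreme p, 1 - p rounds to 0 and log fails
    return math.log(p) if value else math.log(1.0 - p)

def bernoulli_logp_from_logit(value, logit_p):
    # stable route: log sigmoid(l) = -softplus(-l),
    #               log(1 - sigmoid(l)) = -softplus(l)
    return -softplus(-logit_p) if value else -softplus(logit_p)
```

With logit_p = 40, sigmoid(40) rounds to exactly 1.0 in double precision, so the naive log(1 - p) blows up, while the logit form returns the correct value of about -40.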

    Fixes

    • VonMises no longer overflows for large values of kappa. i0 and i1 have been removed and we now use log_i0 to compute the logp.
    • The bandwidth for KDE plots is computed using a modified version of Scott's rule. The new version uses entropy instead of standard deviation. This works better for multimodal distributions. Functions using KDE plots have a new argument bw controlling the bandwidth.
    • Fix: a PyMC3 variable is now replaced if provided in more_replacements (#2890)
    • Fix for issue #2900. In many situations, named node-inputs do not have a random method, while some intermediate node may have it. This meant that if the named node-input at the leaf of the graph did not have a fixed value, Theano would try to compile it and fail to find inputs, raising a theano.gof.fg.MissingInputError. This was fixed by going through the theano variable's owner inputs graph, trying to get intermediate named-node values if the leaves had failed.
    • In distribution.draw_values, some named nodes could be theano.tensor.TensorConstants or theano.tensor.sharedvar.SharedVariables. Nevertheless, in distribution._draw_value, these would be passed to distribution._compile_theano_function as if they were theano.tensor.TensorVariables. This could lead to the exceptions TypeError: ('Constants not allowed in param list', ...) or TypeError: Cannot use a shared variable (...). The fix was to not add theano.tensor.TensorConstant or theano.tensor.sharedvar.SharedVariable named nodes into the givens dict that could be used in distribution._compile_theano_function.
    • Exponential support changed to include zero values.

    Deprecations

    • DIC and BPIC calculations have been removed
    • df_summary has been removed; use summary instead
    • The njobs and nchains kwargs are deprecated in favor of cores and chains for sample
    • The lag kwarg in pm.stats.autocorr and pm.stats.autocov is deprecated.
  • v3.4.rc2

    April 13, 2018
  • v3.4.rc1

    April 09, 2018
  • v3.3 Changes

    January 09, 2018

    New features

    • Improve NUTS initialization advi+adapt_diag_grad and add jitter+adapt_diag_grad (#2643)
    • Added MatrixNormal class for representing vectors of multivariate normal variables
    • Implemented HalfStudentT distribution
    • New benchmark suite added (see http://pandas.pydata.org/speed/pymc/)
    • Generalized random seed types
    • Update loo, new improved algorithm (#2730)
    • New CSG (Constant Stochastic Gradient) approximate posterior sampling algorithm (#2544)
    • Michael Osthege added support for population samplers and implemented differential evolution metropolis (DEMetropolis). For models with correlated dimensions that cannot use gradient-based samplers, the DEMetropolis sampler can give higher effective sampling rates. (See also PR #2735.)
    • Forestplot supports multiple traces (#2736)
    • Add new plot, densityplot (#2741)
    • DIC and BPIC calculations have been deprecated
    • Refactor HMC and implement a new warning system (#2677, #2808)
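The core of differential evolution metropolis is its proposal: a chain moves along the difference between two other randomly chosen chains, which automatically aligns proposals with correlated directions in the population. A sketch of just the proposal step (the scaling gamma = 2.38/sqrt(2d) follows ter Braak's DE-MC paper; PyMC3's sampler adds tuning and the accept/reject step):

```python
import math
import random

def de_proposal(chains, i, rng, gamma=None, eps_scale=1e-4):
    # propose a new point for chain i from the population of chains
    d = len(chains[i])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2.0 * d)
    # pick two distinct other chains and move along their difference,
    # plus a small jitter so the proposal is never exactly degenerate
    j, k = rng.sample([c for c in range(len(chains)) if c != i], 2)
    return [chains[i][t] + gamma * (chains[j][t] - chains[k][t])
            + rng.gauss(0.0, eps_scale)
            for t in range(d)]
```

When all chains sit at the same point, the difference vector vanishes and the proposal reduces to a tiny random jitter.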

    Fixes

    • Fixed compareplot to use loo output.
    • Improved posteriorplot to scale fonts
    • sample_ppc_w now broadcasts
    • df_summary function renamed to summary
    • Add test for model.logp_array and model.bijection (#2724)
    • Fixed sample_ppc and sample_ppc_w to iterate all chains (#2633, #2748)
    • Add Bayesian R2 score (for GLMs) stats.r2_score (#2696) and test (#2729).
    • SMC works with transformed variables (#2755)
    • Speedup OPVI (#2759)
    • Multiple minor fixes and improvements in the docs (#2775, #2786, #2787, #2789, #2790, #2794, #2799, #2809)
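stats.r2_score computes a Bayesian R2 for GLMs over posterior samples; the classical statistic it builds on is simple to state. A plain-Python sketch of the per-prediction-vector computation (not the library's API):

```python
def r2_score_single(y_true, y_pred):
    # classical coefficient of determination, 1 - SS_res / SS_tot,
    # for a single vector of predictions
    mean_y = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1.0 - ss_res / ss_tot
```

Perfect predictions give 1.0; predicting the mean everywhere gives 0.0.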

    Deprecations

    • Old (minibatch-)advi is removed (#2781)
  • v3.2 Changes

    October 10, 2017

    New features

    This version includes two major contributions from our Google Summer of Code 2017 students:

    • Maxim Kochurov extended and refactored the variational inference module. This primarily adds two important classes, representing operator variational inference (OPVI) objects and Approximation objects. These make it easier to extend existing variational classes, and to derive inference from variational optimizations, respectively. The variational module now also includes normalizing flows (NFVI).
    • Bill Engels added an extensive new Gaussian processes (gp) module. Standard GPs can be specified using either Latent or Marginal classes, depending on the nature of the underlying function. A Student-T process TP has been added. In order to accommodate larger datasets, approximate marginal Gaussian processes (MarginalSparse) have been added.

    • Documentation has been improved as the result of the project's monthly "docathons".
    • An experimental stochastic gradient Fisher scoring (SGFS) sampling step method has been added.
    • The API for find_MAP was enhanced.
    • SMC now estimates the marginal likelihood.
    • Added Logistic and HalfFlat distributions to the set of continuous distributions.
    • Bayesian fraction of missing information (bfmi) function added to stats.
    • Enhancements to compareplot added.
    • QuadPotential adaptation has been implemented.
    • Script added to build and deploy documentation.
    • MAP estimates now available for transformed and non-transformed variables.
    • The Constant variable class has been deprecated, and will be removed in 3.3.
    • DIC and BPIC calculations have been sped up.
    • Arrays are now accepted as arguments for the Bound class.
    • random method was added to the Wishart and LKJCorr distributions.
    • Progress bars have been added to LOO and WAIC calculations.
    • All example notebooks updated to reflect changes in API since 3.1.
    • Parts of the test suite have been refactored.

    Fixes

    • Fixed sampler stats error in NUTS for non-RAM backends
    • Matplotlib is no longer a hard dependency, making it easier to use in settings where installing Matplotlib is problematic. PyMC3 will only complain if plotting is attempted.
    • Several bugs in the Gaussian process covariance were fixed.
    • All chains are now used to calculate WAIC and LOO.
    • AR(1) log-likelihood function has been fixed.
    • Slice sampler fixed to sample from 1D conditionals.
    • Several docstring fixes.

    Contributors

    The following people contributed to this release (ordered by number of commits):

    Maxim Kochurov, Bill Engels, Chris Fonnesbeck, Junpeng Lao, Adrian Seyboldt, AustinRochford, Osvaldo Martin, Colin Carroll, Hannes Vasyura-Bathke, Thomas Wiecki, michaelosthege, Marco De Nadai, Kyle Beauchamp, Massimo, ctm22396, Max Horn, Hennadii Madan, Hassan Naseri, Peadar Coyle, Saurav R. Tuladhar, Shashank Shekhar, Eric Ma, Ed Herbst, tsdlovell, zaxtax, Dan Nichol, Benjamin Yetton, jackhansom, Jack Tsai, Andrés Asensio Ramos