TensorFlow Probability 0.15.0
Release notes
This is the 0.15 release of TensorFlow Probability. It is tested and stable against TensorFlow version 2.7.0.
Change notes
- Distributions
  - Add `tfd.StudentTProcessRegressionModel`.
  - Distributions' statistics now all have batch shape matching the Distribution itself.
  - `JointDistributionCoroutine` no longer requires `Root` when `sample_shape == ()` (see the sketch after this list).
  - Support `sample_distributions` from autobatched joint distributions.
  - Expose a `mask` argument to support missing observations in HMM log probs.
  - `BetaBinomial.log_prob` is more accurate when all trials succeed.
  - Support broadcast batch shapes in `MixtureSameFamily`.
  - Add a `cholesky_fn` argument to `GaussianProcess`, `GaussianProcessRegressionModel`, and `SchurComplement`.
  - Add a staticmethod for precomputing GPRM for more efficient inference in TensorFlow.
  - Add `GaussianProcess.posterior_predictive`.
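A minimal sketch of the relaxed `Root` requirement: with an empty `sample_shape`, parentless distributions in a `JointDistributionCoroutine` no longer need the `Root` wrapper (the model below is illustrative, not from the release):

```python
import tensorflow_probability as tfp

tfd = tfp.distributions

@tfd.JointDistributionCoroutine
def model():
  # Before 0.15 this parentless node had to be wrapped as
  # tfd.JointDistributionCoroutine.Root(tfd.HalfNormal(1.)).
  scale = yield tfd.HalfNormal(1., name='scale')
  yield tfd.Normal(0., scale, name='x')

draws = model.sample()          # sample_shape == (), so Root is optional.
log_prob = model.log_prob(draws)
```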
- Bijectors
  - Bijectors parameterized by distinct `tf.Variable`s no longer register as `==`.
  - BREAKING CHANGE: Remove the deprecated `AffineScalar` bijector. Please use `tfb.Shift(shift)(tfb.Scale(scale))` instead (see the migration sketch after this list).
  - BREAKING CHANGE: Remove the deprecated `Affine` and `AffineLinearOperator` bijectors.
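A migration sketch for code that used the removed `AffineScalar`, using the composition recommended above (variable names here are illustrative):

```python
import tensorflow_probability as tfp

tfb = tfp.bijectors

shift, scale = 2.0, 3.0
# Replacement for the removed tfb.AffineScalar(shift=shift, scale=scale):
affine = tfb.Shift(shift)(tfb.Scale(scale))  # y = scale * x + shift
affine.forward(1.0)   # => 5.0
affine.inverse(5.0)   # => 1.0
```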
- PSD kernels
  - Add `tfp.math.psd_kernels.ChangePoint`.
  - Add slicing support for `PositiveSemidefiniteKernel`.
  - Add an `inverse_length_scale` parameter to kernels (see the sketch after this list).
  - Add `parameter_properties` to PSDKernel, along with automated batch shape inference.
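A sketch of the new `inverse_length_scale` parameterization; it is assumed here to be an alternative to `length_scale`, so pass one or the other:

```python
import tensorflow_probability as tfp

psd_kernels = tfp.math.psd_kernels

# Parameterize the kernel by the reciprocal of the length scale
# (equivalent to length_scale=0.5 under the assumption above).
kernel = psd_kernels.ExponentiatedQuadratic(
    amplitude=1.0, inverse_length_scale=2.0)
```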
- VI
  - Add support for importance-weighted variational objectives.
  - Support arbitrary distribution types in `tfp.experimental.vi.build_factored_surrogate_posterior` (see the sketch after this list).
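A minimal sketch of building a factored surrogate posterior; it sticks to the default factor distributions, since these notes don't name the argument that selects other distribution types:

```python
import tensorflow_probability as tfp

# Mean-field surrogate over two latent variables: a scalar and a length-3 vector.
surrogate = tfp.experimental.vi.build_factored_surrogate_posterior(
    event_shape=[[], [3]])
samples = surrogate.sample(5)
```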
- STS
  - Support the `+` syntax for summing `StructuralTimeSeries` models (see the sketch after this list).
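A sketch of the new `+` syntax, assuming it composes components the same way as wrapping them in `tfp.sts.Sum` (the toy data and component choices are illustrative):

```python
import numpy as np
import tensorflow_probability as tfp

sts = tfp.sts

observed = np.random.randn(100).astype(np.float32)
trend = sts.LocalLinearTrend(observed_time_series=observed, name='trend')
seasonal = sts.Seasonal(num_seasons=7,
                        observed_time_series=observed,
                        name='day_of_week')

# New in 0.15: sum components directly instead of
# sts.Sum([trend, seasonal], observed_time_series=observed).
model = trend + seasonal
```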
- Math
  - Enable JAX/NumPy backends for `tfp.math.ode`.
  - Allow returning auxiliary information from `tfp.math.value_and_gradient` (see the sketch after this list).
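For context, a sketch of the base `value_and_gradient` API; the new auxiliary-output support is not shown because these notes don't name the option that enables it:

```python
import tensorflow as tf
import tensorflow_probability as tfp

def loss(x):
  return tf.reduce_sum(x ** 2)

value, grad = tfp.math.value_and_gradient(loss, tf.constant([1., 2., 3.]))
# value => 14.0, grad => [2., 4., 6.]
```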
- Experimental
  - Speedup to `experimental.mcmc` windowed samplers (see the sketch after this list).
  - Support unbiased gradients through particle filtering via stop-gradient resampling.
  - `ensemble_kalman_filter_log_marginal_likelihood` (log evidence) computation added to `tfe.sequential`.
  - Add an experimental joint-distribution layers library.
  - Delete `tfp.experimental.distributions.JointDensityCoroutine`.
  - Add experimental special functions for high-precision computation on a TPU.
  - Add a custom log-prob ratio for `IncrementLogProb`.
  - Use `foldl` in `no_pivot_ldl` instead of `while_loop`.
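A heavily hedged sketch of calling one of the windowed samplers; the model is illustrative, and the argument names and return structure are assumptions rather than something these notes specify:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

@tfd.JointDistributionCoroutineAutoBatched
def model():
  loc = yield tfd.Normal(0., 1., name='loc')
  yield tfd.Normal(loc * tf.ones([10]), 1., name='obs')

# Pin the observed variable by name; by default the sampler returns
# posterior draws plus a trace of sampler diagnostics (assumed here).
draws, trace = tfp.experimental.mcmc.windowed_adaptive_nuts(
    200, model, num_adaptation_steps=100, obs=tf.zeros([10]))
```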
- Other
  - TFP should now support NumPy 1.20+.
  - BREAKING CHANGE: Stop unpacking seeds when splitting in JAX.
Huge thanks to all the contributors to this release!
- 8bitmp3
- adriencorenflos
- Alexey Radul
- Allen Lavoie
- Ben Lee
- Billy Lamberta
- Brian Patton
- Christopher Suter
- Colin Carroll
- Dave Moore
- Du Phan
- Emily Fertig
- Faizan Muhammad
- George Necula
- George Tucker
- Grace Luo
- Ian Langmore
- Jacob Burnim
- Jake VanderPlas
- Jeremiah Liu
- Junpeng Lao
- Kaan
- Luke Wood
- Max Jiang
- Mihai Maruseac
- Neil Girdhar
- Paul Chiang
- Pavel Izmailov
- Pavel Sountsov
- Peter Hawkins
- Rebecca Chen
- Richard Song
- Rif A. Saurous
- Ron Shapiro
- Roy Frostig
- Sharad Vikram
- Srinivas Vasudevan
- Tomohiro Endo
- Urs Köster
- William C Grisaitis
- Yilei Yang