diff --git a/discussion/examples/Forecasting_With_AutoBNN.ipynb b/discussion/examples/Forecasting_With_AutoBNN.ipynb index 9a14bb16db..33cec1171a 100644 --- a/discussion/examples/Forecasting_With_AutoBNN.ipynb +++ b/discussion/examples/Forecasting_With_AutoBNN.ipynb @@ -286,7 +286,7 @@ "\n", "Manually specifying a kernel, like we did above, can be useful in many situations. The true power of the AutoBNN framework, though, lies in its ability to discover structure directly from the data. Below we run the same MAP estimator with the `sum_of_products` model, which is one of the model families distributed with the AutoBNN packages.\n", "\n", - "This kernel uses a continuous relaxation over periodic, linear, exponentiated quadratic, second-degree polynomial and identity leaves as the basic structure. It then combines four such subtrees by first adding them in pairs, and multiplying the final two subtrees. This is a structure that we empirically found to be suitable to model a wide variety of time series. It can serve as a useful starting point for further exploration. Other model families supporting structure discovery can be found in the [models.py](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/experimental/autobnn/models.py) file." + "This kernel uses a continuous relaxation over periodic, linear, exponentiated quadratic, second-degree polynomial, and identity leaves as the basic structure. It then combines four such subtrees by first adding them in pairs and then multiplying the two resulting sums. We have empirically found this structure suitable for modeling a wide variety of time series, and it can serve as a useful starting point for further exploration. Other model families supporting structure discovery can be found in the [models.py](https://github.com/tensorflow/probability/blob/main/spinoffs/autobnn/autobnn/models.py) file." ] }, {