Opening this as a FAQ.

The `pipeline.predict` interface accepts either a 1D/2D tensor or a list of tensors. If you want to run inference on a large dataset, you can either:

1. Send batches of shape `[batch_size, context_length]` to `predict` in a loop over your dataset. Note: if the series don't all have the same length, you would need to left-pad them with `torch.nan`.
2. (Easier) Send lists of `batch_size` tensors to `predict` in a loop over your dataset. No padding is needed here; it is done internally.

If you run out of memory (OOM), decrease `batch_size`.
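A minimal sketch of the second (easier) option, assuming the standard Chronos setup from the project README. The checkpoint name, `prediction_length`, `batch_size`, and the synthetic `series` list are placeholders; adapt them to your own data and hardware.

```python
import torch
from chronos import ChronosPipeline

# Any Chronos checkpoint works here; "cuda" / bfloat16 are optional.
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cuda",
    torch_dtype=torch.bfloat16,
)

# Toy data: 1,000 series of varying lengths (replace with your own 1D tensors).
series = [torch.randn(int(torch.randint(50, 200, (1,)))) for _ in range(1000)]

batch_size = 32          # decrease this if you hit OOM
prediction_length = 24   # forecast horizon

all_forecasts = []
for start in range(0, len(series), batch_size):
    batch = series[start : start + batch_size]   # a list of 1D tensors
    # Option 2: pass the list directly; padding is handled internally.
    samples = pipeline.predict(batch, prediction_length)
    all_forecasts.append(samples)  # shape: [len(batch), num_samples, prediction_length]

all_forecasts = torch.cat(all_forecasts, dim=0)

# Option 1 (manual equivalent): left-pad with torch.nan and send a 2D tensor.
# max_len = max(t.shape[-1] for t in batch)
# padded = torch.stack(
#     [torch.cat([torch.full((max_len - t.shape[-1],), torch.nan), t]) for t in batch]
# )
# samples = pipeline.predict(padded, prediction_length)
```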