Test loading/inference on all supported Python and TF versions #83
Learning from qurator-spk/eynollah#87, I think we should test loading/inference on all supported Python and TF versions. The worst that can happen is that we learn what isn't working.

I'd do this after #61.

Comments

The best that can happen: an understanding of how to save a model so it can be loaded under all (supported) circumstances.

Indeed. And as a precaution, to make this work more robustly, we should try to get Calamari to support the SavedModel format. In fact, the problem has already surfaced.

@bertsky I agree (tentatively, as I have not yet made myself familiar with it, but it sure looks like the best solution).

We already do this (testing Python 3.6 to 3.10), I just forgot about it :-)

Python 3.11 does not work currently, and I don't think I'll investigate further until we can update to Calamari 2. (→ #77)

(Moved comment to #72)

Closing, as this is already done; for the remaining work we have #72.
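A minimal sketch of how such a version matrix could be enumerated for the test suite. All version lists and the known-bad pair below are hypothetical placeholders, not taken from the project's actual metadata:

```python
from itertools import product

# Hypothetical support ranges (illustrative only, not the project's real
# compatibility data).
SUPPORTED_PYTHON = ["3.6", "3.7", "3.8", "3.9", "3.10"]
SUPPORTED_TF = ["2.4", "2.5", "2.6"]

# Pairs known not to work, e.g. no TF wheel published for that Python
# version (hypothetical example).
KNOWN_BAD = {("3.6", "2.6")}

def build_matrix():
    """Return every (python, tf) pair that should get a load/inference test."""
    return [
        (py, tf)
        for py, tf in product(SUPPORTED_PYTHON, SUPPORTED_TF)
        if (py, tf) not in KNOWN_BAD
    ]

if __name__ == "__main__":
    for py, tf in build_matrix():
        print(f"test model loading/inference on Python {py} / TF {tf}")
```

In CI this would typically be expressed as a job matrix instead, but enumerating the pairs in one place makes it easy to record which combinations are expected to fail.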