Unable to serialize lambdas or classes on Databricks #343
I am having the same issue.
I have this issue as well; any updates on this?
I'm dealing with the same issue. Any updates?
Sorry for the slow reply. A function (including a lambda) has a reference to the global namespace, so function serialization requires some pickling of the global namespace. The setting The easiest thing to do, however, is to use

I'm going to close this as answered, but if you feel it's not... then please do reopen and continue the conversation.
Actually, I'm going to leave this open a bit more. Is the issue just with lambdas and classes, or is it also with functions, or any object that refers to the global dict...?
The following code:

fails on a Databricks notebook with the following error:

Obviously the Pickler is trying to serialize the Spark context, but I don't refer to the Spark context anywhere here. I also tried removing spark from globals, but it didn't work. Do you have any ideas what else I can do to prevent it from serializing the Spark context?
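One generic workaround (a stdlib sketch, not dill's own API; `sc` and `add_one` are hypothetical names) is to rebuild the function over a pruned globals dict so an unpicklable object in the notebook namespace is no longer reachable from the function at all:

```python
import types

sc = object()  # hypothetical stand-in for an unpicklable SparkContext

def add_one(x):
    return x + 1

# Serializers that pickle a function along with its __globals__ can choke
# on `sc` even though add_one never touches it.  Rebuild the function over
# a globals dict containing only the names its code object actually uses:
needed = {name: add_one.__globals__[name]
          for name in add_one.__code__.co_names
          if name in add_one.__globals__}
slim = types.FunctionType(add_one.__code__, needed, add_one.__name__)

print(slim(41))                  # behaves the same as add_one
print('sc' in slim.__globals__)  # False: the Spark context is no longer reachable
```

This is blunt (it drops builtins and module references the function might need in less trivial cases), so it is only a sketch of the idea rather than a drop-in fix.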