Plan on adaptation to Spark 2.x #82
Any update on this? Regarding the scope issue, there has been some discussion: "[Spark Namespace]: Expanding Spark ML under Different Namespace?" Lots of people run into this issue when extending Spark with their own functionality. One participant said: "What I tend to do is keep my own code in its own package and try to do as thin a bridge over to it from the [private] scope. It's also important to name things obviously, say, org.apache.spark.microsoft, so stack traces in bug reports can be dealt with more easily." Thus, we can use org.apache.spark.sql.simba as well. |
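The "thin bridge" trick above relies on Scala's qualified-private visibility: a member marked `private[spark]` is accessible from any code compiled under the `org.apache.spark` package hierarchy, including a sub-package such as `org.apache.spark.simba`. A minimal sketch of the mechanism, using plain Scala with no Spark dependency (all names here are illustrative stand-ins, not actual Spark or Simba APIs):

```scala
package org.apache.spark {
  // Stand-in for a Spark-internal helper: private[spark] makes it
  // visible only to code inside the org.apache.spark hierarchy.
  private[spark] object InternalUtil {
    def normalize(s: String): String = s.trim.toLowerCase
  }
}

package org.apache.spark.simba {
  // A thin bridge: the single place that touches the private[spark]
  // API, so the rest of Simba depends only on this object.
  object Bridge {
    def normalizeName(s: String): String =
      org.apache.spark.InternalUtil.normalize(s)
  }
}
```

Keeping the bridge narrow, as the quoted advice suggests, limits how much of the code breaks when Spark's internals change between releases.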
Can you forward me the link to this discussion? |
Finished the first migration to Spark 2.1 on a branch. |
@Skyprophet I would like to test it, but I have some trouble getting started with Simba, as outlined in #84. |
Is there a guide that describes how to install Simba on Spark 2.1? |
On the standalone branch, simply build the package. Then, you can import the package into Spark. |
I suggest a total rewrite for this particular task: open a new empty branch and add things back into the structure. |
See https://issues.apache.org/jira/browse/SPARK-14155. Maybe `Encoder` is the correct direction to go? Spark 2.x introduces the `Dataset` abstraction, so we should tailor our current design to this new abstraction. Will keep this ticket updated. |
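To make the `Encoder` direction concrete: in Spark 2.x, an `Encoder[T]` converts JVM objects of type `T` to and from Spark's internal row format, which is what lets `Dataset[T]` expose a typed API while the engine operates on untyped rows underneath. The following is a toy analogue of that typeclass pattern, a conceptual sketch only (the names `ToyEncoder`, `PointRecord`, etc. are hypothetical, not Spark or Simba APIs):

```scala
// Hypothetical record type a spatial index might store.
case class PointRecord(x: Double, y: Double)

// Toy stand-in for Spark's Encoder[T]: serialize to and from a
// row-like representation (Spark uses a binary row format instead).
trait ToyEncoder[T] {
  def toRow(value: T): Seq[Any]
  def fromRow(row: Seq[Any]): T
}

object ToyEncoders {
  // Implicit instance, mirroring how Dataset operations pick up
  // an Encoder[T] from implicit scope.
  implicit val pointEncoder: ToyEncoder[PointRecord] =
    new ToyEncoder[PointRecord] {
      def toRow(p: PointRecord): Seq[Any] = Seq(p.x, p.y)
      def fromRow(row: Seq[Any]): PointRecord =
        PointRecord(row(0).asInstanceOf[Double], row(1).asInstanceOf[Double])
    }

  // Round-trip through the row format, as a Dataset does when
  // serializing and deserializing records.
  def roundTrip[T](v: T)(implicit enc: ToyEncoder[T]): T =
    enc.fromRow(enc.toRow(v))
}
```

Under this design, Simba's spatial operators could accept a typed `Dataset[T]` and rely on the encoder to reach the row-level data they index, rather than depending directly on Spark's internal row classes.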