Example Extension Package needed #5

Open
nicornk opened this issue Jul 5, 2016 · 3 comments

Comments

@nicornk

nicornk commented Jul 5, 2016

Hi,

I am trying to understand the concepts behind the extensions for sparklyr.

An example extension would be very helpful for me.

Thanks!

@jjallaire
Member

We'll post a sample soon!


@jwijffels

I tried this out and created an R package called spark.sas7bdat, which imports SAS data into Spark in parallel.
More information: https://github.com/bnosac/spark.sas7bdat
Thanks for the documentation on using the sparkapi package. It worked nicely!
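
In case it helps others, the basic sparkapi calling pattern looks roughly like this (a minimal sketch; count_lines is just a toy wrapper for illustration, not part of spark.sas7bdat, and the functions are shown via sparklyr, which exports them):

```r
library(sparklyr)

# Toy wrapper: count the lines of a text file by calling
# SparkContext.textFile() and RDD.count() through the JVM bridge.
count_lines <- function(sc, path) {
  spark_context(sc) %>%
    invoke("textFile", path, 1L) %>%
    invoke("count")
}

# Usage:
# sc <- spark_connect(master = "local")
# count_lines(sc, "/tmp/some_file.txt")
```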

One question: is it desirable to include Spark packages from https://spark-packages.org/ inside the java folder of the R package and reference them from there, or is it advised to use the packages argument of the spark_dependency function?

@jjallaire
Member

That's fantastic!!!!

Yes, you should definitely use the packages argument of the spark_dependency function (as you have done in your example) in preference to embedding within the java folder of the R package.
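
For reference, a minimal sketch of what that looks like in an extension package (assuming the spark_dependencies()/register_extension() convention from the extensions documentation; the spark-packages coordinate and version below are illustrative, not necessarily what spark.sas7bdat ships):

```r
# R/dependencies.R in the extension package

# Declare the Spark package from https://spark-packages.org/ via the
# `packages` argument instead of shipping a jar in the package's java/
# folder. The coordinate and version here are illustrative only.
spark_dependencies <- function(spark_version, scala_version, ...) {
  sparklyr::spark_dependency(
    packages = sprintf("saurfang:spark-sas7bdat:1.1.4-s_%s", scala_version)
  )
}

# Register the extension when the R package loads so that sparklyr adds
# the declared dependencies the next time spark_connect() is called.
.onLoad <- function(libname, pkgname) {
  sparklyr::register_extension(pkgname)
}
```

With this in place, users only need to load the extension package before calling spark_connect() and the Spark package should be resolved automatically.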

