Bring Local Sf to Spark

R bindings for 'GeoSpark' <http://geospark.datasyslab.org/> that extend the 'sparklyr' <https://spark.rstudio.com/> R package to make distributed geocomputing easier. 'sf' is a package that provides simple features <https://en.wikipedia.org/wiki/Simple_Features> access for R and is a leading geospatial data processing tool. The 'geospark' R package brings the same simple-features-style access as sf, but running on a Spark distributed system.
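A minimal sketch of the workflow this implies: connect with sparklyr, register GeoSpark's spatial SQL functions via geospark's register_gis(), and then use dplyr verbs whose spatial functions (e.g. st_geomfromwkt(), st_contains()) are passed through to Spark SQL by dbplyr. The data, column names, and local Spark master below are illustrative assumptions, not part of this page.

library(sparklyr)
library(dplyr)
library(geospark)

# Connect to a local Spark instance and register GeoSpark's spatial SQL functions.
sc <- spark_connect(master = "local")
register_gis(sc)

# Two tiny example tables holding WKT geometries (illustrative data).
polygons <- data.frame(area = "box",
                       wkt  = "POLYGON ((0 0, 0 2, 2 2, 2 0, 0 0))")
points   <- data.frame(city = "somewhere",
                       wkt  = "POINT (1 1)")

polygons_tbl <- copy_to(sc, polygons, overwrite = TRUE)
points_tbl   <- copy_to(sc, points,   overwrite = TRUE)

# Parse WKT into geometries on the Spark side; function names not known to
# dbplyr are passed through to Spark SQL, where GeoSpark supplies them.
polygons_geo <- mutate(polygons_tbl, y = st_geomfromwkt(wkt))
points_geo   <- mutate(points_tbl,   x = st_geomfromwkt(wkt))

# Spatial join: count points falling inside each polygon.
inner_join(polygons_geo, points_geo,
           sql_on = dbplyr::sql("st_contains(y, x)")) %>%
  count(area, city)

spark_disconnect(sc)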


News

Reference manual


install.packages("geospark")

0.2.1 by Harry Zhu, 2 months ago


Report a bug at https://github.com/harryprince/geospark/issues


Browse source code at https://github.com/cran/geospark


Authors: Harry Zhu [aut, cre], Javier Luraschi [ctb]


Documentation: PDF Manual


License: Apache License (>= 2.0)


Imports sparklyr, dplyr, dbplyr

Suggests testthat, knitr, utils


See at CRAN