Deploy 'TensorFlow' Models

Tools to deploy 'TensorFlow' models across multiple services. Currently, the package provides a local server for testing 'cloudml'-compatible services.


While TensorFlow models are typically defined and trained using R or Python code, they can be deployed in a wide variety of environments without any runtime dependency on R or Python:

  • TensorFlow Serving is an open-source software library for serving TensorFlow models using a gRPC interface.

  • CloudML is a managed cloud service that serves TensorFlow models using a REST interface.

  • RStudio Connect provides support for serving models using the same REST API as CloudML, but on a server within your own organization.

TensorFlow models can also be deployed to mobile and embedded devices, including iOS and Android phones and Raspberry Pi computers. The tfdeploy package includes a variety of tools designed to make exporting and serving TensorFlow models straightforward. For documentation on using tfdeploy, see the package website.
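As a rough sketch of the local-testing workflow described above, the snippet below serves an exported SavedModel over tfdeploy's CloudML-compatible REST interface and queries it. The `serve_savedmodel()` and `predict_savedmodel()` functions come from tfdeploy; the `"savedmodel"` directory and the 784-element input record are hypothetical placeholders for a model you have already exported.

```r
library(tfdeploy)

# Serve a previously exported SavedModel on a local REST endpoint
# compatible with the CloudML prediction API. This call blocks the
# current R session while the server runs.
# serve_savedmodel("savedmodel", browse = FALSE)

# From another R session, send a prediction request to the exported
# model. 'instances' is a list of input records whose names and shapes
# must match the model's serving signature (here, a hypothetical
# 28x28 image flattened to 784 values).
# predict_savedmodel(
#   list(list(images = rep(0, 784))),
#   model = "savedmodel"
# )
```

Because `serve_savedmodel()` blocks, predictions are typically issued from a second session or from a daemonized server; the calls are shown commented out since they require an exported model on disk.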


0.6.0 by Javier Luraschi, 3 months ago

Browse source code at

Authors: Javier Luraschi [aut, cre], RStudio [cph]

Documentation: PDF Manual

License: Apache License 2.0

Imports httpuv, httr, jsonlite, magrittr, reticulate, swagger, tensorflow

Suggests cloudml, knitr, pixels, processx, testthat, yaml

See at CRAN