Predict using a SavedModel
Runs a prediction over a saved model file, a locally served model, or a CloudML model.
predict_savedmodel(instances, model, ...)
Arguments
instances | A list of prediction instances to be passed as input tensors to the service. Even for single predictions, a list with one entry is expected. |
model | The model as a local path, a REST URL, a CloudML model name or a graph object. A local path can be exported using export_savedmodel(), a REST URL can be created using serve_savedmodel() and a graph object loaded using load_savedmodel(); see the sketch after this arguments list for the graph-object form. Notice that predicting over a CloudML model requires a version parameter. |
... | Additional arguments passed on to the implementation-specific prediction methods; see the Implementations section. |
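The graph-object form can be illustrated with a short sketch (not part of the original page); it assumes load_savedmodel() accepts a TensorFlow 1.x-style session followed by the model directory, and uses the MNIST model bundled with the package:
# Sketch: load the exported SavedModel into a TensorFlow session and
# predict against the resulting graph object (TF 1.x-style session assumed).
sess <- tensorflow::tf$Session()
graph <- tfdeploy::load_savedmodel(
  sess,
  system.file("models/tensorflow-mnist", package = "tfdeploy")
)
tfdeploy::predict_savedmodel(list(rep(9, 784)), graph)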
See also
export_savedmodel(), serve_savedmodel(), load_savedmodel()
Examples
# NOT RUN {
# perform prediction based on an existing model
tfdeploy::predict_savedmodel(
  list(rep(9, 784)),  # a single instance: 784 pixel values (a flattened 28x28 image)
  system.file("models/tensorflow-mnist", package = "tfdeploy")
)
# }
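A related sketch, also not from the original page: serving the same exported model over HTTP with serve_savedmodel() and predicting against the resulting REST endpoint. The daemonized argument, port and endpoint path are assumptions about the serving interface; adjust them to match your installed version.
# NOT RUN {
model_dir <- system.file("models/tensorflow-mnist", package = "tfdeploy")
# Assumed: serve the model in the background on port 8089.
tfdeploy::serve_savedmodel(model_dir, port = 8089, daemonized = TRUE)
# Assumed endpoint path: "/<signature_name>/predict".
tfdeploy::predict_savedmodel(
  list(rep(9, 784)),
  "http://127.0.0.1:8089/serving_default/predict"
)
# }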