
Private Detector on TensorFlow Serving

The TensorFlow Serving image can be pulled via:

docker pull vazhega/private-detector:0.2.0

Run the container:

docker run -p 8501:8501 -e MODEL_NAME=private_detector -t vazhega/private-detector:0.2.0

Ports exposed:

  • REST API: 8501
  • gRPC: 8502 (a client sketch follows below)
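
A call over the gRPC port could look roughly like the sketch below (the container must also publish that port, e.g. -p 8502:8502). The input tensor name "image_bytes" and the signature name are assumptions, not taken from this repo; inspect the SavedModel with saved_model_cli to find the real names.

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Plain (insecure) channel to the gRPC port exposed above
channel = grpc.insecure_channel("localhost:8502")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "private_detector"
request.model_spec.signature_name = "serving_default"  # assumed signature name

with open("./test.jpeg", "rb") as f:
    image_bytes = f.read()

# "image_bytes" is a hypothetical input name used for illustration only
request.inputs["image_bytes"].CopyFrom(
    tf.make_tensor_proto([image_bytes], dtype=tf.string)
)

result = stub.Predict(request, 10.0)  # 10-second timeout
print(result.outputs)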

Calling the model over the REST API:

import base64
import json

import requests

# Read the image and base64-encode it, as the REST API expects
with open("./test.jpeg", "rb") as f:
    image_string = f.read()
jpeg_bytes = base64.b64encode(image_string).decode("utf-8")

# TensorFlow Serving's REST API takes base64 data wrapped in a {"b64": ...} object
endpoint = "http://localhost:8501/v1/models/private_detector:predict"
predict_request = {"instances": [{"b64": jpeg_bytes}]}

response = requests.post(endpoint, data=json.dumps(predict_request))
print(response.json())

>>> {'predictions': [[0.0535223149, 0.946477652]]}