The service is based on FastAPI and uses the tartuNLP/EstBERT model to return text embeddings. The model is downloaded automatically upon startup.
The API provides the following endpoints:
- `POST /estnltk/tagger/bert` - the main endpoint for obtaining BERT embeddings
- `GET /estnltk/tagger/bert/about` - returns information about the webservice
- `GET /estnltk/tagger/bert/status` - returns the status of the webservice
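A minimal client sketch for the embeddings endpoint, using only the standard library. The base URL, the JSON field name (`"text"`), and the response shape are assumptions, not part of the service description above; check the service's interactive OpenAPI docs for the actual request schema.

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # assumed local deployment on the default port

def embed_request(text: str) -> request.Request:
    """Build the POST request for the BERT embeddings endpoint."""
    # The payload field name "text" is an assumption about the request schema.
    payload = json.dumps({"text": text}).encode("utf-8")
    return request.Request(
        BASE_URL + "/estnltk/tagger/bert",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = embed_request("Tere, maailm!")
print(req.full_url)      # http://localhost:8000/estnltk/tagger/bert
print(req.get_method())  # POST
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) returns the embeddings as JSON, assuming the service is running locally.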
The service should be run as a Docker container using the included Dockerfile. The API is exposed on port 8000. The following environment variables can be used to change webservice behavior:
- `BERT_MODEL` - the BERT model used; the value should be a valid HuggingFace model ID (`tartuNLP/EstBERT` by default).
- `MAX_CONTENT_LENGHT` - maximum length of the POST request body in characters.
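A sketch of building and running the container with these environment variables; the image tag `bert-embeddings` is illustrative, not prescribed by the service.

```shell
# Build the image from the included Dockerfile (tag name is an assumption).
docker build -t bert-embeddings .

# Run the service on the exposed port 8000, overriding the model and
# request-size limit via the environment variables described above.
docker run -p 8000:8000 \
    -e BERT_MODEL=tartuNLP/EstBERT \
    -e MAX_CONTENT_LENGHT=10000 \
    bert-embeddings
```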
The container uses uvicorn as the ASGI server. The entrypoint of the container is `["uvicorn", "app:app", "--host", "0.0.0.0", "--proxy-headers"]`. Any additional uvicorn parameters can be passed to the container at runtime as CMD arguments.
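For example, extra uvicorn flags appended after the image name become CMD arguments and are passed through to the entrypoint. The image tag and the chosen flag values below are illustrative.

```shell
# Append uvicorn CLI flags as CMD arguments: here, change the listening
# port and run two worker processes (both are standard uvicorn options).
docker run -p 8080:8080 bert-embeddings --port 8080 --workers 2
```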