coreym/fastapi-retriever

FastAPI-based Retriever/Indexer service over Pinecone, optimized for deployment on render.com
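At its core, the retriever embeds an incoming query with the same model used at indexing time and returns the nearest Pinecone matches. A rough sketch of that flow, with the `Match` shape, `retrieve` helper, and result-dict layout all illustrative rather than taken from this repository:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Match:
    """One retrieved chunk (hypothetical shape)."""
    id: str
    score: float
    text: str

def retrieve(query: str,
             embed: Callable[[str], List[float]],
             index,  # duck-typed, Pinecone-like: exposes .query(...)
             top_k: int = 3) -> List[Match]:
    """Embed the query and return the top-k nearest matches."""
    vector = embed(query)
    results = index.query(vector=vector, top_k=top_k, include_metadata=True)
    return [
        Match(id=m["id"], score=m["score"], text=m["metadata"].get("text", ""))
        for m in results["matches"]
    ]
```

In the deployed service this function would sit behind a FastAPI endpoint, with `embed` calling OpenAI and `index` being a real Pinecone index; keeping them as parameters makes the core logic easy to test with stubs.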

Manual Steps

  1. Index your data in Pinecone with the OpenAI text-embedding-3-small model (sample notebook coming)

  2. Create a new Web Service on Render.

  3. Populate the OPENAI_KEY and PINECONE_KEY environment variables with your API keys.

  4. Specify the URL of this repository.

  5. Render will automatically detect that you are deploying a Python service and use pip to install the dependencies.

  6. Specify the following as the Start Command:

    uvicorn main:app --host 0.0.0.0 --port $PORT
  7. Click Create Web Service.
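Step 1 above can be sketched in Python. This is only an illustration, not code from this repository: the chunk sizes, index name, and `index_documents` helper are hypothetical, and the OpenAI/Pinecone calls assume the current Python clients (`pip install openai pinecone`) with the keys from step 3 set in the environment.

```python
import os

def chunk(text: str, size: int = 800, overlap: int = 100) -> list:
    """Split text into overlapping character chunks suitable for embedding."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def index_documents(docs: dict, index_name: str = "my-index") -> None:
    """Embed each chunk with text-embedding-3-small and upsert it into Pinecone.

    `docs` maps a document id to its raw text. Index name is illustrative.
    """
    from openai import OpenAI      # deferred import: only needed when indexing
    from pinecone import Pinecone

    client = OpenAI(api_key=os.environ["OPENAI_KEY"])
    index = Pinecone(api_key=os.environ["PINECONE_KEY"]).Index(index_name)

    for doc_id, text in docs.items():
        for i, piece in enumerate(chunk(text)):
            emb = client.embeddings.create(
                model="text-embedding-3-small", input=piece
            )
            index.upsert(vectors=[{
                "id": f"{doc_id}-{i}",
                "values": emb.data[0].embedding,
                "metadata": {"text": piece},
            }])
```

Storing the chunk text in the vector's metadata lets the retriever return readable passages, not just ids, at query time.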

Or simply click:

Deploy to Render

Thanks

Thanks to Harish for the inspiration to create a FastAPI quickstart for Render and for some sample code!

About

FastAPI-based retriever service for custom GPTs


Generated from render-examples/fastapi