This project implements a FastAPI backend designed to automate Sales Development Representative (SDR) tasks for Stahla, including real-time price quoting and providing an operational dashboard backend.
Manual handling of inbound calls, emails, and forms leads to missed context, slow responses, inconsistent lead routing, and delays in providing price quotes. This erodes customer trust and results in lost revenue.
The goal is to create a reliable, scalable AI-driven intake and quoting flow that captures complete information, classifies opportunities accurately, integrates seamlessly with HubSpot, generates quotes rapidly (<500ms P95), enables quick human follow-up, and provides operational visibility.
- <15 sec median first response time (SDR interaction).
- <500ms P95 quote generation latency (`/webhook/pricing/quote`).
- ≥95% data-field completeness in HubSpot.
- ≥90% routing accuracy.
- +20% increase in qualified-lead-to-quote conversion.
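Targets like the P95 latency figure above imply tracking per-request durations. As an illustration only (the project's dashboard derives its metrics from Redis-backed request logs, not this helper), a P95 can be computed from recorded millisecond samples with the nearest-rank method:

```python
# Nearest-rank percentile over recorded request latencies (milliseconds).
# Illustrative sketch; sample values below are invented.
import math

def percentile(samples: list[float], pct: float) -> float:
    """Return the pct-th percentile (nearest-rank) of samples."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

latencies_ms = [120, 180, 210, 250, 300, 320, 350, 400, 450, 900]
print(percentile(latencies_ms, 95))  # → 900 (rank = ceil(0.95 * 10) = 10)
```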
- AI Intake Agent: Uses voice (Bland.ai), email parsing, and web form follow-ups to greet prospects, ask dynamic questions, and populate HubSpot.
- Classification & Routing: Determines the appropriate business unit (Services, Logistics, Leads, or Disqualify) based on lead data and assigns the deal in HubSpot.
- Real-time Pricing Agent (Integrated): Provides instant price quotes via a secure webhook, using dynamically synced pricing rules from Google Sheets and cached Google Maps distance calculations.
- Human Handoff: Provides reps with summaries, context, and quotes for quick follow-up or disqualification.
- Operational Dashboard Backend: Exposes API endpoints for monitoring system status, cache performance, sync status, errors, recent requests, and limited cache/sync management.
- Extensible Framework: Built for future agent additions.
- Integration Layer: Uses n8n for managing specific webhook workflows (e.g., lead processing trigger).
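The round-robin owner assignment mentioned under Classification & Routing can be sketched as follows. Owner IDs and unit names here are invented placeholders; the real logic lives in the classify services, and a multi-worker deployment would likely use a shared counter (e.g., Redis `INCR`) rather than this in-process one.

```python
# Round-robin owner assignment per business unit (in-process sketch).
from itertools import count

OWNERS_BY_UNIT = {  # hypothetical HubSpot owner IDs per business unit
    "Services": ["owner-101", "owner-102", "owner-103"],
    "Logistics": ["owner-201", "owner-202"],
}

# One monotonically increasing counter per unit.
_counters = {unit: count() for unit in OWNERS_BY_UNIT}

def next_owner(unit: str) -> str:
    """Pick the next owner for a unit in round-robin order."""
    owners = OWNERS_BY_UNIT[unit]
    return owners[next(_counters[unit]) % len(owners)]

print([next_owner("Services") for _ in range(4)])
# → ['owner-101', 'owner-102', 'owner-103', 'owner-101']
```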
- Backend Framework: FastAPI
- CRM: HubSpot
- Voice AI: Bland.ai
- Language Model (Optional): Marvin AI (or others like OpenAI, Anthropic, Gemini)
- Workflow Automation: n8n
- Data Validation: Pydantic
- Caching: Redis
- Geo-Services: Google Maps Distance Matrix API
- Data Source (Pricing): Google Sheets API
- Logging: Logfire
- Containerization: Docker, Docker Compose
- Language: Python 3.11+
- Voice AI Intake Agent (Bland.ai): Answers inbound calls and initiates callbacks for incomplete web forms within 1 minute.
- Web Form & Email Intake: Processes submissions/emails via webhooks, using dynamic questioning and LLM parsing for emails.
- Automated Follow-up: Initiates Bland.ai calls for missing web form data and sends auto-reply emails for incomplete email leads.
- HubSpot Data Enrichment: Creates/updates Contacts & Deals with high completeness. Writes call summaries/recordings to HubSpot.
- Classification & Routing Engine: Classifies leads (Services/Logistics/Leads/Disqualify) and routes deals to the correct HubSpot pipeline with round-robin owner assignment.
- Real-time Pricing Agent:
  - `/webhook/pricing/quote` endpoint for instant quote generation (secured by API key).
  - `/webhook/pricing/location_lookup` endpoint for asynchronous distance calculation/caching.
  - Dynamic sync of pricing rules, config, and branches from Google Sheets to Redis cache.
  - Calculates quotes based on trailer type, duration, usage, extras, delivery distance (nearest branch), and seasonal multipliers.
- Operational Dashboard Backend API:
  - Endpoints (`/dashboard/...`) for monitoring status (requests, errors, cache, sync) and recent activity.
  - Endpoints for managing cache (view/clear specific keys, clear pricing/maps cache) and triggering manual sheet sync.
- Human-in-the-Loop Handoff: Sends email notifications to reps with summaries, checklists, and action links.
- Configuration & Monitoring: Via `.env`, Pydantic settings, health check endpoints, and background logging to Redis for the dashboard.
- Logging: Structured logging via Logfire.
- Workflow Integration: Connects with n8n for specific automation tasks.
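To make the pricing agent's quote formula concrete, here is a back-of-the-envelope sketch: base rate by trailer type and duration, plus extras, delivery cost from the nearest branch, and a seasonal multiplier. All field names and rates below are invented placeholders; real values come from the Google Sheets sync (see `app/services/quote/`).

```python
# Illustrative quote calculation; rates are placeholders, not Stahla pricing.
from dataclasses import dataclass

@dataclass
class QuoteInput:
    trailer_type: str
    rental_days: int
    extras: list[str]
    distance_miles: float       # to nearest branch, from the cached Maps lookup
    seasonal_multiplier: float  # e.g. 1.15 in peak season

DAILY_RATE = {"standard": 150.0, "luxury": 300.0}     # placeholder rates
EXTRA_COST = {"generator": 75.0, "attendant": 200.0}  # placeholder rates
PER_MILE_DELIVERY = 3.0                               # placeholder rate

def quote_total(q: QuoteInput) -> float:
    base = DAILY_RATE[q.trailer_type] * q.rental_days
    extras = sum(EXTRA_COST[e] for e in q.extras)
    delivery = PER_MILE_DELIVERY * q.distance_miles
    return round((base + extras + delivery) * q.seasonal_multiplier, 2)

print(quote_total(QuoteInput("standard", 3, ["generator"], 40.0, 1.0)))
# → 645.0  (150*3 + 75 + 3*40)
```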
(See docs/features.md for more details)
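The "background logging to Redis for the dashboard" feature amounts to counting request outcomes under well-known keys that the dashboard endpoints later read. The key layout below is an assumption for illustration, and a tiny in-memory stand-in replaces a real Redis client so the sketch is self-contained:

```python
# Sketch of dashboard request logging; key names are assumed, not the real schema.
import datetime

class InMemoryRedis:
    """Tiny stand-in for a Redis client (only what this sketch needs)."""
    def __init__(self):
        self.store: dict[str, int] = {}
    def incr(self, key: str) -> int:
        self.store[key] = self.store.get(key, 0) + 1
        return self.store[key]

def log_request(client, endpoint: str, ok: bool) -> None:
    """Increment a daily success/error counter for an endpoint."""
    day = datetime.date.today().isoformat()
    outcome = "success" if ok else "error"
    client.incr(f"dash:requests:{endpoint}:{outcome}:{day}")  # assumed key layout

r = InMemoryRedis()
log_request(r, "quote", ok=True)
log_request(r, "quote", ok=True)
log_request(r, "quote", ok=False)
print(r.store)
```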
```
.
├── app/
│   ├── api/
│   │   └── v1/
│   │       ├── api.py
│   │       └── endpoints/
│   │           ├── classify.py
│   │           ├── health.py
│   │           ├── hubspot.py
│   │           ├── documentation.py
│   │           ├── dash/
│   │           │   └── dashboard.py
│   │           └── webhooks/
│   │               ├── form.py
│   │               ├── helpers.py
│   │               ├── hubspot.py
│   │               ├── voice.py
│   │               └── pricing.py
│   ├── assets/
│   │   ├── call.json
│   │   ├── data.json
│   │   ├── edges.json
│   │   ├── knowledge.json
│   │   ├── location.json
│   │   ├── quote.json
│   │   └── script.md
│   ├── core/
│   │   ├── __init__.py
│   │   ├── config.py
│   │   ├── dependencies.py
│   │   ├── middleware.py
│   │   ├── security.py
│   │   └── templating.py
│   ├── models/
│   │   ├── __init__.py
│   │   ├── bland.py
│   │   ├── blandlog.py
│   │   ├── classification.py
│   │   ├── common.py
│   │   ├── email.py
│   │   ├── error.py
│   │   ├── hubspot.py
│   │   ├── location.py
│   │   ├── pricing.py
│   │   ├── quote.py
│   │   ├── user.py
│   │   ├── webhook.py
│   │   └── dash/
│   │       ├── __init__.py
│   │       └── dashboard.py   # Assuming dashboard.py based on pattern; verify if other files exist
│   ├── services/
│   │   ├── __init__.py
│   │   ├── auth/              # (contents of auth/ if known)
│   │   ├── classify/
│   │   │   ├── __init__.py
│   │   │   ├── classification.py
│   │   │   ├── marvin.py
│   │   │   └── rules.py
│   │   ├── dash/
│   │   │   ├── __init__.py
│   │   │   ├── background.py
│   │   │   └── dashboard.py
│   │   ├── location/
│   │   │   ├── __init__.py
│   │   │   └── location.py
│   │   ├── mongo/             # (contents of mongo/ if known)
│   │   ├── quote/
│   │   │   ├── __init__.py
│   │   │   ├── quote.py
│   │   │   └── sync.py
│   │   ├── redis/
│   │   │   ├── __init__.py
│   │   │   └── redis.py
│   │   ├── bland.py
│   │   ├── email.py
│   │   ├── hubspot.py
│   │   └── n8n.py
│   ├── static/
│   │   ├── css/
│   │   ├── img/
│   │   └── js/
│   ├── templates/
│   │   └── home.html
│   ├── utils/
│   │   ├── __init__.py
│   │   ├── enhanced.py
│   │   └── location.py
│   ├── __init__.py
│   ├── gcp.json
│   └── main.py
├── docs/
│   ├── api.md
│   ├── faq.md
│   ├── features.md
│   ├── hubspot.md
│   ├── marvin.md
│   ├── services.md
│   └── webhooks.md
├── info/
│   ├── 01 Pricing Agent Analysis & Implementation Proposal.md
│   ├── drop.txt
│   ├── hubspot.md
│   ├── properties.csv
│   └── Unified AI Call Assistant(v2).md
├── rest/
│   ├── auth.http
│   ├── bland.http
│   ├── classify.http
│   ├── classify_fixed.http
│   ├── dash.http
│   ├── documentation.http
│   ├── error.http
│   ├── form.http
│   ├── health.http
│   ├── location.http
│   ├── quote.http
│   ├── test.http
│   └── webhooks.http
├── sheets/
│   ├── Stahla - config.csv
│   ├── Stahla - generators.csv
│   ├── Stahla - locations.csv
│   └── Stahla - products.csv
├── tests/                     # (Placeholder)
├── .env
├── .env.example
├── .gitignore
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── README.md                  # This file
```
- Clone the repository.
- Create and configure `.env`:
  - Copy `.env.example` to `.env`.
  - Fill in API keys: `HUBSPOT_API_KEY`, `BLAND_API_KEY`, `LOGFIRE_TOKEN`, `GOOGLE_MAPS_API_KEY`, `PRICING_WEBHOOK_API_KEY`, and your chosen `LLM_PROVIDER`'s key.
  - Configure `GOOGLE_SHEET_ID` and the `GOOGLE_SHEET_*_RANGE` variables for the products, generators, branches, and config tabs/ranges.
  - Set up `GOOGLE_APPLICATION_CREDENTIALS` if using Google Service Account auth.
  - Configure `REDIS_URL`.
  - Configure `APP_BASE_URL`.
  - Configure n8n settings if `N8N_ENABLED=true`.
  - Adjust other settings as needed.
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Run locally using Uvicorn:

  ```bash
  uvicorn app.main:app --reload --port 8000
  ```

- Run using Docker Compose:

  ```bash
  docker-compose up --build
  ```
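With the server running, the secured pricing webhook can be exercised from a small client script. The payload field names and the `X-API-Key` header name below are assumptions based on the feature list; check `rest/quote.http` and `docs/api.md` for the authoritative request schema.

```python
# Sketch of a client-side call to the pricing webhook (field names assumed).
import json
import urllib.request

API_KEY = "your-pricing-webhook-api-key"  # value of PRICING_WEBHOOK_API_KEY
BASE_URL = "http://localhost:8000"

payload = {
    "delivery_location": "123 Main St, Omaha, NE",  # hypothetical field name
    "trailer_type": "standard",                     # hypothetical field name
    "rental_days": 7,
    "usage_type": "event",
    "extras": ["generator"],
}

def build_request() -> urllib.request.Request:
    """Build the POST request without sending it."""
    return urllib.request.Request(
        f"{BASE_URL}/webhook/pricing/quote",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "X-API-Key": API_KEY},  # header name is an assumption
        method="POST",
    )

if __name__ == "__main__":
    req = build_request()
    print(req.full_url)
    # Sending requires a running server; uncomment to try it:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp))
```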
Comprehensive documentation is available to understand the Stahla AI SDR application's architecture, features, and API.
For in-depth information on specific aspects, please refer to the following documents in the docs/ directory:
- `api.md`: Detailed specifications for all API endpoints.
- `features.md`: A comprehensive list and description of core application features.
- `webhooks.md`: In-depth explanation of webhook functionality, including models, logic, and examples for the Form, HubSpot, Voice (Bland.ai), and Pricing webhooks.
- `hubspot.md`: HubSpot integration details, including custom Contact and Deal properties.
- `services.md`: Overview of core services such as Bland.ai, Google Sheets, and Redis.
- `marvin.md`: Integration with Marvin AI for classification and data extraction.
- `faq.md`: Frequently Asked Questions (to be populated).
Once the application is running:
- Interactive API documentation (Swagger UI) is available at `/docs` on the application server.
- Alternative API documentation (ReDoc) is available at `/redoc`.
- The markdown files from the `docs/` directory (e.g., `features.md`, `webhooks.md`) are also served as rendered HTML under `/api/v1/docs/` (e.g., `/api/v1/docs/webhooks.md`).
- SMS intake channel (e.g., via Twilio).
- Frontend UI for the Operational Dashboard.
- Integration with external monitoring/alerting systems for advanced metrics (latency P95, cache hit ratios, historical trends) and alerts.
- Refinement of HubSpot dynamic ID fetching and call data persistence.
- Dedicated Integration & Orchestration Layer (e.g., self-hosted n8n).
This application can be deployed to Fly.io with the provided Docker Compose setup. Follow these steps to deploy:
- Install the Fly.io CLI (`flyctl`):

  ```bash
  # On macOS
  brew install flyctl

  # On Linux
  curl -L https://fly.io/install.sh | sh

  # On Windows (using PowerShell)
  iwr https://fly.io/install.ps1 -useb | iex
  ```

- Sign up and log in to Fly.io:

  ```bash
  # If you don't have an account, sign up
  # (this opens a browser window for authentication)
  flyctl auth signup
  # OR
  flyctl auth login
  ```

- Ensure you have a valid `.env` file with all required environment variables.
We've created a deployment script to simplify the process:
- Make the script executable (if not already):

  ```bash
  chmod +x deploy-to-fly.sh
  ```

- Run the deployment script:

  ```bash
  ./deploy-to-fly.sh
  ```
This script will:
- Check if you're logged in to Fly.io
- Create a new Fly.io application if it doesn't exist
- Set up your environment variables from `.env` as Fly.io secrets
- Create volumes for MongoDB and Redis if needed
- Deploy the application using your Docker Compose configuration
If you prefer to deploy manually:
- Create a new Fly.io application:

  ```bash
  flyctl apps create stahla
  ```

- Set environment variables from your `.env` file:

  ```bash
  # Example for setting individual variables
  flyctl secrets set MONGO_DB_NAME=stahla_dashboard
  ```

- Create volumes for MongoDB and Redis:

  ```bash
  flyctl volumes create mongo_data --size 1
  flyctl volumes create redis_data --size 1
  ```

- Deploy the application:

  ```bash
  flyctl deploy
  ```
- View deployment status: `flyctl status -a stahla`
- Check logs: `flyctl logs -a stahla`
- Open the application in a browser: `flyctl open -a stahla`
To scale your application on Fly.io:
```bash
# Scale to multiple instances
flyctl scale count 3
```
For more information, refer to the Fly.io documentation.
Additional Fly.io deployment notes are in README-fly.md.
This application uses a cloud MongoDB service (like MongoDB Atlas) for data storage.
- Create a MongoDB Atlas account (or use another cloud MongoDB service)
- Create a MongoDB cluster
- Create a database user with read/write permissions
- Allow network access from anywhere (or, better, restrict it to the Fly.io IP range)
- Initialize the database before deploying:

  ```bash
  # Run the cloud MongoDB initialization script (connects to your cloud MongoDB)
  ./initialize-mongodb.sh
  ```

  This script will:
- Connect to your cloud MongoDB instance
- Create necessary collections
- Set up required indexes
This needs to be done from your local machine, because the Alpine Linux container on Fly.io doesn't include the MongoDB Shell (`mongosh`) needed for initialization.
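For orientation, the collection/index setup that `initialize-mongodb.sh` performs can be expressed in Python roughly as below. The collection and index names here are illustrative assumptions (the script is the source of truth), and `init_db()` accepts any object with a pymongo-like `db[name].create_index(...)` interface, so the sketch runs against a tiny in-memory fake:

```python
# Sketch of MongoDB initialization; collection/index names are assumptions.
COLLECTIONS = {  # hypothetical dashboard collections and their indexes
    "reports": [("timestamp", -1)],
    "error_logs": [("timestamp", -1), ("error_type", 1)],
}

def init_db(db) -> None:
    """Create each collection's indexes (idempotent with a real client)."""
    for name, indexes in COLLECTIONS.items():
        coll = db[name]
        for field, direction in indexes:
            coll.create_index([(field, direction)])

class FakeCollection:
    """Records create_index calls instead of talking to a server."""
    def __init__(self):
        self.indexes = []
    def create_index(self, keys):
        self.indexes.append(keys)
        return "_".join(f"{f}_{d}" for f, d in keys)

class FakeDB:
    def __init__(self):
        self.colls = {}
    def __getitem__(self, name):
        return self.colls.setdefault(name, FakeCollection())

db = FakeDB()
init_db(db)
print({name: c.indexes for name, c in db.colls.items()})
```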