This document describes how to run tests for the DeeperSensor API.
The test suite includes:
- Unit tests: Small, focused tests for individual functions (validation, utilities)
- Integration tests: End-to-end tests of HTTP endpoints with a real database
- Build verification: Clippy lints and cargo checks via CI/CD
No special setup required. Unit tests run in isolation:

```bash
cargo test --lib
```

Integration tests require a PostgreSQL database. You have two options:
Start a test database with Docker:

```bash
# Start PostgreSQL (detached)
docker compose up -d postgres

# Wait for PostgreSQL to be ready
sleep 3

# Set environment variable for tests
export TEST_DATABASE_URL="postgresql://deepersensor:devpassword@localhost:5432/deepersensor"

# Run integration tests
cargo test --test integration_tests -- --test-threads=1
```

If you have PostgreSQL running locally:
```bash
# Create a test database
createdb deepersensor_test

# Export the connection URL
export TEST_DATABASE_URL="postgresql://username:password@localhost/deepersensor_test"

# Run integration tests
cargo test --test integration_tests -- --test-threads=1
```

Note: Integration tests use `--test-threads=1` to avoid database conflicts between parallel tests.
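`--test-threads=1` serializes every test, including ones that never touch the database. A narrower alternative (the pattern the `serial_test` crate packages up) is to serialize only the database-touching tests behind a shared lock. This is a dependency-free sketch; the `db_lock` helper is an assumption, not part of this codebase:

```rust
use std::sync::{Mutex, MutexGuard, OnceLock};

/// Global lock: tests that touch the shared database take it first,
/// so they serialize even when cargo runs tests on many threads.
fn db_lock() -> MutexGuard<'static, ()> {
    static LOCK: OnceLock<Mutex<()>> = OnceLock::new();
    LOCK.get_or_init(|| Mutex::new(()))
        .lock()
        // Recover the guard even if a previous test panicked while
        // holding the lock (mutex poisoning).
        .unwrap_or_else(|poisoned| poisoned.into_inner())
}

fn main() {
    // Two "tests" on separate threads; the lock forces them to run one at a time.
    let handles: Vec<_> = (0..2)
        .map(|i| {
            std::thread::spawn(move || {
                let _guard = db_lock();
                println!("test {i} has exclusive database access");
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
}
```

With this in place, only the integration tests need the guard and unit tests keep running in parallel.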
```bash
cargo test --workspace --lib
```

This runs all unit tests across all workspace crates without needing external dependencies.
```bash
# Start services
docker compose up -d postgres

# Set test database URL
export TEST_DATABASE_URL="postgresql://deepersensor:devpassword@localhost:5432/deepersensor"

# Run all tests
cargo test --workspace -- --test-threads=1
```

Located within each module using `#[cfg(test)]`:
- `crates/api/src/validation.rs`: Email, password, model name, message content validators
- `crates/auth/src/lib.rs`: Password hashing, JWT generation/verification
- `crates/core/src/error.rs`: Error handling and serialization
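As a shape reference, here is a minimal, self-contained sketch of the kind of check such a validator unit test exercises. The function body and the 4096-byte limit are illustrative assumptions, not the real `crates/api/src/validation.rs` code:

```rust
/// Illustrative stand-in for the message-content validator
/// (the limit and error strings are assumptions).
fn validate_message_content(content: &str) -> Result<(), String> {
    const MAX_LEN: usize = 4096; // assumed limit
    if content.trim().is_empty() {
        return Err("message content must not be empty".into());
    }
    if content.len() > MAX_LEN {
        return Err(format!("message content exceeds {MAX_LEN} bytes"));
    }
    Ok(())
}

fn main() {
    // The unit tests assert both the accept and reject paths.
    assert!(validate_message_content("hello").is_ok());
    assert!(validate_message_content("   ").is_err());
    assert!(validate_message_content(&"x".repeat(5000)).is_err());
    println!("validator sketch behaves as expected");
}
```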
Run unit tests for a specific crate:
```bash
cargo test -p api --lib
cargo test -p ds-auth --lib
cargo test -p ds-core --lib
```

Located in `crates/api/tests/`:
- `integration_tests.rs`: HTTP endpoint tests (signup, login, health, metrics)
Run integration tests:
```bash
export TEST_DATABASE_URL="postgresql://deepersensor:devpassword@localhost:5432/deepersensor"
cargo test --test integration_tests -- --test-threads=1
```

Test coverage includes:
- ✅ Health endpoint returns 200 OK with dependency status
- ✅ Readiness endpoint returns 200 OK
- ✅ Signup with valid credentials succeeds
- ✅ Signup with duplicate email returns 422 UNPROCESSABLE_ENTITY
- ✅ Signup with weak password returns 422 UNPROCESSABLE_ENTITY
- ✅ Login with correct credentials succeeds
- ✅ Login with wrong password returns 401 UNAUTHORIZED
- ✅ Metrics endpoint returns Prometheus format
GitHub Actions automatically runs tests on every push and pull request:
```yaml
# Automated testing includes:
- cargo fmt -- --check        # Code formatting
- cargo clippy -- -D warnings # Linting
- cargo test --workspace      # All tests
- cargo audit                 # Security vulnerabilities
```

View CI results: https://github.com/your-org/api.deepersensor/actions
Integration tests automatically clean up test data using:
```rust
async fn cleanup_test_db(pool: &sqlx::PgPool) -> Result<()> {
    sqlx::query("TRUNCATE TABLE users CASCADE")
        .execute(pool)
        .await?;
    Ok(())
}
```

Each test calls `cleanup_test_db()` at the end to ensure isolation.
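One caveat with end-of-test cleanup: it never runs if an assertion panics first. A dependency-free sketch of an alternative is a `Drop` guard; the real cleanup is async, so calling `cleanup_test_db` from `drop` would additionally need a runtime handle. Treat this as the shape of the idea, not a drop-in:

```rust
use std::panic;
use std::sync::atomic::{AtomicBool, Ordering};

/// Hypothetical guard: runs a cleanup closure when dropped,
/// even if the test body panics partway through.
struct CleanupGuard<F: FnMut()>(F);

impl<F: FnMut()> Drop for CleanupGuard<F> {
    fn drop(&mut self) {
        (self.0)();
    }
}

fn main() {
    static CLEANED: AtomicBool = AtomicBool::new(false);

    let result = panic::catch_unwind(|| {
        let _guard = CleanupGuard(|| CLEANED.store(true, Ordering::SeqCst));
        panic!("simulated failing assertion");
    });

    assert!(result.is_err());
    // The guard's Drop ran during unwinding, so cleanup still happened.
    assert!(CLEANED.load(Ordering::SeqCst));
    println!("cleanup ran despite the panic");
}
```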
Add to the appropriate module's `#[cfg(test)]` section:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_your_function() {
        let result = your_function("input");
        assert_eq!(result, expected_value);
    }
}
```

Add to `crates/api/tests/integration_tests.rs`:
```rust
#[tokio::test]
async fn test_new_endpoint() -> Result<()> {
    let (_cfg, state, router) = setup_test_app().await?;

    let response = router
        .with_state(state.clone())
        .oneshot(
            Request::builder()
                .uri("/your/endpoint")
                .body(axum::body::Body::empty())
                .unwrap(),
        )
        .await?;

    assert_eq!(response.status(), StatusCode::OK);

    cleanup_test_db(&state.db).await?;
    Ok(())
}
```

Current coverage focus areas:
- ✅ Input validation (email, password, model names, messages)
- ✅ Authentication flow (signup, login)
- ✅ Health monitoring endpoints
- ⚠️ JWT-protected endpoints (TODO: add after applying middleware)
- ⚠️ Rate limiting (TODO: add integration tests)
- ⚠️ Chat streaming (TODO: requires Ollama mock)
```bash
# Show all test output (not just failures)
cargo test -- --nocapture

# Run a specific test
cargo test test_signup_success -- --nocapture

# Show test backtraces
RUST_BACKTRACE=1 cargo test
```

If integration tests fail with database errors:
```bash
# Check PostgreSQL is running
docker compose ps postgres

# View PostgreSQL logs
docker compose logs postgres

# Reset the database
docker compose down -v
docker compose up -d postgres
```

If migrations fail in integration tests:
```bash
# Manually run migrations
sqlx migrate run --database-url "postgresql://deepersensor:devpassword@localhost:5432/deepersensor"

# Check migration status
sqlx migrate info --database-url "postgresql://deepersensor:devpassword@localhost:5432/deepersensor"
```

For performance and load testing, see the "Performance Testing" section in DEPLOYMENT.md.
Integration tests focus on correctness, not performance.
Run a security audit as part of testing:

```bash
cargo audit
```

This checks for known vulnerabilities in dependencies (automated in CI).
Tests respect these environment variables:
| Variable | Default | Description |
|---|---|---|
| `TEST_DATABASE_URL` | (required) | PostgreSQL connection string for integration tests |
| `RUST_LOG` | `info` | Log level for test output |
| `RUST_BACKTRACE` | `0` | Set to `1` or `full` for detailed error traces |
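A sketch of how a test harness might resolve `TEST_DATABASE_URL` and fail with a clear message when it is unset; `resolve_test_db_url` is a hypothetical helper, not part of the current test code:

```rust
use std::env;

/// Hypothetical helper: validate the integration-test database URL,
/// returning an error message that names the missing variable.
fn resolve_test_db_url(raw: Option<String>) -> Result<String, String> {
    match raw {
        Some(url) if url.starts_with("postgresql://") => Ok(url),
        Some(other) => Err(format!("TEST_DATABASE_URL is not a postgresql:// URL: {other}")),
        None => Err("TEST_DATABASE_URL must be set for integration tests".to_string()),
    }
}

fn main() {
    match resolve_test_db_url(env::var("TEST_DATABASE_URL").ok()) {
        Ok(url) => println!("running integration tests against {url}"),
        Err(msg) => eprintln!("skipping integration tests: {msg}"),
    }
}
```

Keeping the parsing in a pure function like this makes the missing-variable path unit-testable without touching the process environment.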
Example test run with full logging:
```bash
RUST_LOG=debug \
RUST_BACKTRACE=1 \
TEST_DATABASE_URL="postgresql://deepersensor:devpassword@localhost:5432/deepersensor" \
cargo test --test integration_tests -- --nocapture --test-threads=1
```

After the core API is stable:
- Add JWT-protected route tests: Apply `auth_middleware::require_auth` to chat endpoints, test with valid/invalid tokens
- Add rate limiting tests: Verify rate limits trigger correctly
- Add chat streaming tests: Mock Ollama responses or use a test instance
- Add load tests: Use `criterion` for benchmarking critical paths
- Add property-based tests: Use `proptest` for validation fuzzing
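To make the last item concrete: `proptest` generates many random inputs and checks that an invariant holds for all of them. This dependency-free sketch fakes the generator with a tiny linear congruential generator and uses a stand-in validator; both the validator body and the chosen invariant are illustrative assumptions:

```rust
/// Illustrative stand-in for the email validator in crates/api
/// (the real rules are assumptions here).
fn is_valid_email(s: &str) -> bool {
    let mut parts = s.splitn(2, '@');
    match (parts.next(), parts.next()) {
        (Some(local), Some(domain)) => {
            !local.is_empty()
                && domain.contains('.')
                && !domain.starts_with('.')
                && !domain.ends_with('.')
        }
        _ => false,
    }
}

/// Tiny deterministic LCG so the sketch stays dependency-free;
/// proptest's strategies would replace this.
struct Lcg(u64);
impl Lcg {
    fn next(&mut self) -> u64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.0
    }
}

fn main() {
    let mut rng = Lcg(42);
    for _ in 0..1000 {
        // Property: a string with no '@' must never validate.
        let len = (rng.next() % 16) as usize;
        let s: String = (0..len)
            .map(|_| (b'a' + (rng.next() % 26) as u8) as char)
            .collect();
        assert!(!is_valid_email(&s), "no '@' should never validate: {s}");
    }
    println!("property held for 1000 random inputs");
}
```

With the real crate, the loop and LCG collapse into a `proptest!` block over a string strategy, and failures are automatically shrunk to a minimal counterexample.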