
Feature/huggingface space continuous deployment #7

Open
whats2000 wants to merge 4 commits into ai-twinkle:main from whats2000:feature/huggingface-space-continuous-deployment

Conversation

@whats2000
Member

✨ Add Hugging Face Spaces continuous deployment (Docker)

This PR adds a Hugging Face Spaces (Docker) deployment setup for eval-analyzer, plus a GitHub Actions workflow that auto-syncs main to the Space repo using an HF_TOKEN secret.


What changed

  • Docker-based Space runtime

    • Added Dockerfile that runs Streamlit (app.py) on port 7860 (required by HF Spaces), with a healthcheck.
    • Added .dockerignore to keep the image small / avoid copying local env + git metadata.
  • Continuous deployment workflow

    • Added .github/workflows/huggingface.yml to force-push main to the Space on every push to main (also supports manual dispatch).
    • Target Space repo is set to: https://huggingface.co/spaces/twinkle-ai/tw-eval-analyzer
  • Space metadata

    • Added Hugging Face Spaces YAML front-matter to README.md (title/emoji/colors + sdk: docker, app_file: app.py).
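The Docker-based runtime described above can be sketched roughly as follows. This is an illustrative minimal example, not the exact Dockerfile in this PR; the base image tag, the healthcheck endpoint, and the presence of `curl` in the image are assumptions:

```dockerfile
# Minimal HF Spaces Streamlit image (illustrative sketch)
FROM python:3.13-slim

WORKDIR /app

# Install dependencies first so this layer is cached
# unless requirements.txt itself changes
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

COPY . .

# HF Spaces expects the app to listen on port 7860
EXPOSE 7860

# Streamlit exposes a built-in health endpoint at /_stcore/health
# (assumes curl is available in the image)
HEALTHCHECK CMD curl --fail http://localhost:7860/_stcore/health || exit 1

CMD ["streamlit", "run", "app.py", "--server.port=7860", "--server.address=0.0.0.0"]
```

Binding to `0.0.0.0` matters here: the Space's reverse proxy connects from outside the container, so a server listening only on localhost would never pass the healthcheck.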

How deployment works

On every push to main, GitHub Actions runs a job that:

  1. Checks out the repo (with full history + LFS enabled)
  2. Pushes the repo to the HF Space git remote using HF_TOKEN
  3. Hugging Face Space rebuilds automatically from the updated repo
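The three steps above map onto a workflow roughly like the following. This is a sketch, not necessarily the exact contents of `.github/workflows/huggingface.yml`; the step names and the username embedded in the push URL are assumptions:

```yaml
name: Sync to Hugging Face hub

on:
  push:
    branches: [main]
  workflow_dispatch:   # allow manual runs from the Actions tab

jobs:
  sync-to-hub:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history, needed to push the branch
          lfs: true        # fetch LFS objects so they sync to the Space

      - name: Push to hub
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        run: |
          git push --force \
            https://whats2000:$HF_TOKEN@huggingface.co/spaces/twinkle-ai/tw-eval-analyzer \
            main
```

Step 3 needs no workflow code at all: Hugging Face detects the push to the Space's git repo and triggers a Docker rebuild on its own.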

Setup instructions (Hugging Face Space + HF_TOKEN)

1) Create the Space under the org

  1. Open: https://huggingface.co/new-space?owner=twinkle-ai
  2. Set Owner = twinkle-ai
  3. Set Space name = tw-eval-analyzer (must match the workflow push target)
  4. Choose SDK = Docker
  5. Create the Space

The workflow is currently hardcoded to push to twinkle-ai/tw-eval-analyzer. If you use a different Space name, update the push URL in .github/workflows/huggingface.yml.

2) Create a Hugging Face token

  • Create a Hugging Face access token with write permission (so it can push to the Space repo).

3) Add HF_TOKEN to GitHub repo secrets

In the GitHub repo, go to Settings → Secrets and variables → Actions → New repository secret:

  • Name: HF_TOKEN
  • Value: your Hugging Face token

The workflow uses secrets.HF_TOKEN as an env var when pushing to the Space.
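If you prefer the terminal over the web UI, the same secret can be added with the GitHub CLI (assumes `gh` is installed and authenticated against this repo):

```
# Prompts for the secret value interactively
gh secret set HF_TOKEN

# Or read it from an environment variable without echoing it
gh secret set HF_TOKEN --body "$MY_HF_TOKEN"
```

Either way, the value never appears in workflow logs; GitHub masks secrets automatically.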


Verification / expected result

  • Merge this PR
  • Ensure the Space twinkle-ai/tw-eval-analyzer exists
  • Add HF_TOKEN secret to GitHub
  • Push/merge anything into main
  • You should see the “Sync to Hugging Face hub” workflow run, then the Space rebuild and serve the Streamlit app

Notes / caveats

  • The workflow uses git push --force to sync main to the Space repo. This is intentional for “mirror main to Space” behavior, but it will overwrite the Space’s git history.
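Concretely, the mirror behavior corresponds to roughly this command (the `<hf-username>` placeholder is illustrative; the Space URL matches the one named earlier):

```
git push --force \
  https://<hf-username>:$HF_TOKEN@huggingface.co/spaces/twinkle-ai/tw-eval-analyzer \
  main
```

A practical consequence of the overwrite: any commits made directly on the Space, for example via the Hugging Face web editor, will be lost on the next sync. All changes must go through this GitHub repo's `main` branch.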

Copilot AI review requested due to automatic review settings January 21, 2026 07:28

Copilot AI left a comment


Pull request overview

This PR adds Hugging Face Spaces deployment infrastructure with continuous deployment from the main branch. It includes a Docker-based setup configured for HF Spaces requirements (port 7860) and a GitHub Actions workflow that automatically syncs the repository to the Space on every push to main.

Changes:

  • Added Hugging Face Spaces YAML metadata to README.md for Space configuration
  • Created Dockerfile with Streamlit server configured for port 7860 (HF Spaces requirement)
  • Added GitHub Actions workflow for automatic deployment to HF Space via git push
  • Added .dockerignore to optimize Docker image build

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| README.md | Added HF Spaces YAML front-matter with SDK configuration and app metadata |
| Dockerfile | Configured Python 3.13 environment with Streamlit running on port 7860 |
| .github/workflows/huggingface.yml | Automated deployment workflow that syncs main branch to HF Space repository |
| .dockerignore | Excluded development files and directories from Docker image |


Comment on lines +12 to +15

```dockerfile
COPY . .

RUN pip3 install -r requirements.txt
```

Copilot AI Jan 21, 2026


The order of COPY operations is inefficient for Docker layer caching. Currently, requirements.txt is copied first but then the entire directory (including requirements.txt again) is copied immediately after. This negates the benefit of separate layer caching. Consider restructuring to copy requirements.txt, run pip install, and then copy the rest of the application files. This way, the dependencies layer will only rebuild when requirements.txt changes, not when any application file changes.

Suggested change

```diff
-COPY . .
-RUN pip3 install -r requirements.txt
+RUN pip3 install -r requirements.txt
+COPY . .
```
