A web application for searching and visualizing protein domains using AlphaFold predictions.
- Docker image, built using `.gitlab-ci.yml` and pushed to the GitLab Container Registry at `registry.gitlab.ics.muni.cz:443/445526/widgets-app-starterpack:latest`
- Kubernetes deployment using an appropriate kubeconfig file:

```bash
kubectl --kubeconfig=/path/to/your/kubeconfig.yml apply -f kubernetes/deployment.yaml -n fi-lmi-ns
```

- Data storage in a persistent volume claim (PVC), mounted at `/app/data` in the container.
- To copy data to the PVC:

```bash
# 1) Download kubectl and make it executable: chmod +x kubectl
# 2) Configure your kubeconfig.yml
# 3) Run the command:
kubectl --kubeconfig=/path/to/your/kubeconfig.yml cp -n fi-lmi-ns <path_to_data> <pod_with_pvc_mounted>:/app/data
# Example:
kubectl --kubeconfig=/path/to/your/kubeconfig.yml cp -n fi-lmi-ns proteins/proteins-disordered-regions/analyses/search_results_P10911.json fi-lmi-ns/protein-search-dashboard-7c46dd6c75-m25nc:/app/data/structures/.
```
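When many files need to go to the PVC, the copy step above can be scripted. A minimal sketch that builds the `kubectl cp` argument list for one local file; the helper name is illustrative, and the kubeconfig, pod, and paths in the usage example are placeholders you would substitute for your own deployment:

```python
from pathlib import Path

def build_cp_command(kubeconfig: str, namespace: str, pod: str,
                     local_path: str, remote_dir: str = "/app/data") -> list[str]:
    """Build the argument list for `kubectl cp` into a pod-mounted PVC.

    Hypothetical helper: parameter values are illustrative placeholders.
    """
    return [
        "kubectl", f"--kubeconfig={kubeconfig}", "cp",
        "-n", namespace,
        local_path,
        # kubectl cp accepts a <namespace>/<pod>:<path> destination spec
        f"{namespace}/{pod}:{remote_dir}/{Path(local_path).name}",
    ]

cmd = build_cp_command(
    "kubeconfig.yml", "fi-lmi-ns", "protein-search-dashboard-7c46dd6c75-m25nc",
    "analyses/search_results_P10911.json", "/app/data/structures",
)
print(" ".join(cmd))
```

The returned list can be passed directly to `subprocess.run` without shell quoting concerns.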
- Get storage info:

```bash
curl https://protein-search-dashboard.dyn.cloud.e-infra.cz/storage-info
```

- Access a specific protein:

```
https://protein-search-dashboard.dyn.cloud.e-infra.cz/protein/P10911
```
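The two endpoints above can be addressed programmatically. A minimal sketch of URL builders for scripting against the deployed service (the function names are illustrative, not part of the app's API):

```python
BASE_URL = "https://protein-search-dashboard.dyn.cloud.e-infra.cz"

def protein_url(accession: str) -> str:
    """Return the dashboard URL for one UniProt accession (e.g. P10911)."""
    return f"{BASE_URL}/protein/{accession}"

def storage_info_url() -> str:
    """Return the storage-info endpoint URL."""
    return f"{BASE_URL}/storage-info"

print(protein_url("P10911"))
```

These can be combined with any HTTP client (e.g. `requests.get`) to poll the service from a script.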
To populate the protein database:
- Prepare your data directories:

```bash
mkdir -p protein_db metadata
# Copy your PDB files to protein_db/
# Copy your JSON files to metadata/
```

- Copy to the PVC:
```bash
# Get pod name
POD=$(kubectl --kubeconfig=kubeconfig.yml get pods -n fi-lmi-ns -l app=protein-search-dashboard -o jsonpath='{.items[0].metadata.name}')
# Copy protein database
kubectl --kubeconfig=kubeconfig.yml cp -n fi-lmi-ns ./protein_db/. $POD:/app/data/protein_db/
# Copy metadata
kubectl --kubeconfig=kubeconfig.yml cp -n fi-lmi-ns ./metadata/. $POD:/app/data/metadata/
```

- Verify the data:
```bash
curl https://protein-search-dashboard.dyn.cloud.e-infra.cz/storage-info
```

Copy the example environment file and configure your settings:
```bash
cp env.example .env
# Edit .env with your actual values
```

Required environment variables:
- `ALPHAFOLD_API_KEY`: your AlphaFold API key (optional, but recommended for full functionality)
If you need to deploy to Kubernetes, ensure you have:
- A valid `kubeconfig.yml` file (not included in this repository for security)
- Proper permissions to access the target cluster
- Ensure Conda is installed:

```bash
conda --version
```

- Create and activate a Conda environment:

```bash
# Create environment with Python 3.9
conda create -n protein-viz python=3.9
# Activate environment
conda activate protein-viz
```

Alternatively, you can create the environment from the provided yml file:

```bash
conda env create -f conda-environment.yml
conda activate protein-viz
```

- Install development dependencies:
```bash
# Install core dependencies
conda install -c conda-forge fastapi uvicorn py3dmol pandas numpy requests
# Install additional dependencies
pip install python-multipart typing-extensions
# Install development tools (optional)
conda install -c conda-forge pytest black flake8
```

- Set up test data:
```bash
mkdir -p test
# Copy your test files
cp path/to/AF-P10911-F1-model_v4.pdb test/
cp path/to/search_results_P10911.json test/
```

- Run the test server:

```bash
python test_local.py
```

- Visit http://localhost:8000/protein/P10911 in your browser
The test environment uses a local directory structure that mirrors the production setup.
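The test data pairs each AlphaFold structure with its search-results JSON. A quick consistency check that every PDB file has a matching JSON can be sketched as below; the filename patterns (`AF-<ACC>-F1-model_v4.pdb`, `search_results_<ACC>.json`) come from the examples in this README, while the helper itself is hypothetical:

```python
import re
from pathlib import Path

# Filename patterns taken from the examples in this README:
#   AF-P10911-F1-model_v4.pdb  pairs with  search_results_P10911.json
PDB_RE = re.compile(r"AF-(?P<acc>[A-Z0-9]+)-F\d+-model_v\d+\.pdb$")

def find_unmatched(structures_dir: Path, metadata_dir: Path) -> list[str]:
    """List accessions whose PDB file has no matching search_results JSON."""
    missing = []
    for pdb in structures_dir.glob("*.pdb"):
        match = PDB_RE.match(pdb.name)
        if match is None:
            continue  # skip files that do not follow the AlphaFold naming
        acc = match.group("acc")
        if not (metadata_dir / f"search_results_{acc}.json").exists():
            missing.append(acc)
    return sorted(missing)
```

Running it with both arguments set to `Path("test")` mirrors the flat test layout above; in production the structures and metadata directories would differ.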
- If you get SSL errors with Conda:

```bash
conda config --set ssl_verify False
```

- If you need to remove and recreate the environment:

```bash
conda deactivate
conda env remove -n protein-viz
conda env create -f conda-environment.yml
```

- If py3dmol is not available in conda-forge:

```bash
conda activate protein-viz
pip install py3dmol
```

- To update the environment after changes to conda-environment.yml:
```bash
conda env update -f conda-environment.yml --prune
```

Project structure:

```
app/
├── __init__.py
├── main.py
├── config.py
├── models/
│   ├── __init__.py
│   └── protein.py
├── services/
│   ├── __init__.py
│   ├── protein_service.py
│   └── storage_service.py
├── templates/
│   ├── __init__.py
│   └── protein_view.py
└── utils/
    ├── __init__.py
    └── logging.py
```