
Commit 4b63ea6

Merge pull request #5 from SkyflowFoundry/IE-498
IE-498: added support to migrate pipelines
2 parents ceba728 + 9ee4646 commit 4b63ea6

5 files changed: 849 additions & 1 deletion

.github/workflows/ci.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -19,7 +19,7 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
-          pip install pytest pytest-cov requests
+          pip install pytest pytest-cov requests python-dotenv

       - name: Run tests
         run: pytest -q
```
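
The only CI change is the new `python-dotenv` test dependency. A minimal sketch of how it might be used, assuming credentials live in a local `.env` file (the file layout is an assumption, not part of this commit; the variable names mirror the workflow's env block below):

```python
# Hypothetical usage of the new python-dotenv dependency (not part of this
# commit): load local credentials from a .env file before running the
# migration script or its tests.
import os

from dotenv import load_dotenv

load_dotenv()  # merges key=value pairs from ./.env into os.environ

source_token = os.environ.get("SOURCE_ACCOUNT_AUTH", "")
target_token = os.environ.get("TARGET_ACCOUNT_AUTH", "")
assert target_token, "TARGET_ACCOUNT_AUTH must be set (the workflow marks it required)"
```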
Lines changed: 119 additions & 0 deletions

@@ -0,0 +1,119 @@

```yaml
name: pipelines_migration

on:
  workflow_dispatch:
    inputs:
      env_url:
        description: "Select source and target environments"
        type: choice
        default: "Source: SANDBOX, Target: PRODUCTION"
        options:
          - "Source: SANDBOX, Target: PRODUCTION"
          - "Source: SANDBOX, Target: SANDBOX"
          - "Source: PRODUCTION, Target: PRODUCTION"
          - "Source: PRODUCTION, Target: SANDBOX"
      source_datastore_config:
        description: "Source datastore JSON. Provide either an ftpServer or s3Bucket object."
        required: true
        default: |
          {
            "ftpServer": {
              "transferProtocol": "FTPS",
              "plainText": {
                "hostname": "",
                "port": "",
                "username": "",
                "password": ""
              },
              "skyflowHosted": false
            }
          }
      target_datastore_config:
        description: "Destination datastore JSON. Provide either an ftpServer or s3Bucket object."
        required: true
        default: |
          {
            "s3Bucket": {
              "name": "",
              "region": "",
              "assumedRoleARN": ""
            }
          }
      source_vault_id:
        description: "Source vault ID."
        required: false
      pipeline_id:
        description: "Pipeline ID to be migrated."
        required: false
        default: ""
      target_vault_id:
        description: "Target vault ID."
        required: true
      source_account_access_token:
        description: "Access token of the source account. (Not required if a config file is selected.)"
        required: false
      target_account_access_token:
        description: "Access token of the target account."
        required: true
      source_account_id:
        description: "Source account ID. If not provided, the repository variable is used."
        required: false
      target_account_id:
        description: "Target account ID. If not provided, the repository variable is used."
        required: false

jobs:
  execute-pipelines-migration-script:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.x"

      - name: Install dependencies
        run: pip install requests

      - name: Parse and map environment URLs
        id: map_envs
        shell: bash
        run: |
          input="${{ github.event.inputs.env_url }}"

          source_name=$(echo "$input" | sed -n 's/Source: \([^,]*\),.*/\1/p' | xargs)
          target_name=$(echo "$input" | sed -n 's/.*Target: \(.*\)/\1/p' | xargs)

          get_env_url() {
            case "$1" in
              SANDBOX) echo "https://manage.skyflowapis-preview.com" ;;
              PRODUCTION) echo "https://manage.skyflowapis.com" ;;
              *) echo "Invalid environment: $1" >&2; exit 1 ;;
            esac
          }

          # Resolve URLs
          source_url=$(get_env_url "$source_name")
          target_url=$(get_env_url "$target_name")

          echo "source_url=$source_url" >> "$GITHUB_OUTPUT"
          echo "target_url=$target_url" >> "$GITHUB_OUTPUT"

      - name: Run Python script
        env:
          PIPELINE_ID: ${{ github.event.inputs.pipeline_id }}
          SOURCE_DATASTORE_CONFIG: ${{ github.event.inputs.source_datastore_config }}
          TARGET_DATASTORE_CONFIG: ${{ github.event.inputs.target_datastore_config }}
          SOURCE_VAULT_ID: ${{ github.event.inputs.source_vault_id }}
          TARGET_VAULT_ID: ${{ github.event.inputs.target_vault_id }}
          SOURCE_ACCOUNT_AUTH: ${{ github.event.inputs.source_account_access_token }}
          TARGET_ACCOUNT_AUTH: ${{ github.event.inputs.target_account_access_token }}
          SOURCE_ACCOUNT_ID: ${{ github.event.inputs.source_account_id != '' && github.event.inputs.source_account_id || vars.SOURCE_ACCOUNT_ID }}
          TARGET_ACCOUNT_ID: ${{ github.event.inputs.target_account_id != '' && github.event.inputs.target_account_id || vars.TARGET_ACCOUNT_ID }}
          SOURCE_ENV_URL: ${{ steps.map_envs.outputs.source_url }}
          TARGET_ENV_URL: ${{ steps.map_envs.outputs.target_url }}
        run: python3 migrate_pipelines.py
```
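
For running `migrate_pipelines.py` outside Actions, the "Parse and map environment URLs" step is easy to mirror in Python. A sketch under the assumption that the same choice strings and URL table apply (the function name `map_envs` is hypothetical):

```python
# Hypothetical local equivalent of the workflow's "Parse and map environment
# URLs" step; the URL table and choice format come straight from the YAML above.
import re

ENV_URLS = {
    "SANDBOX": "https://manage.skyflowapis-preview.com",
    "PRODUCTION": "https://manage.skyflowapis.com",
}

def map_envs(choice: str) -> tuple[str, str]:
    """Turn e.g. 'Source: SANDBOX, Target: PRODUCTION' into (source_url, target_url)."""
    match = re.fullmatch(r"Source:\s*(\w+),\s*Target:\s*(\w+)", choice.strip())
    if not match:
        raise ValueError(f"Unrecognized choice string: {choice!r}")
    urls = []
    for name in match.groups():
        if name not in ENV_URLS:
            raise ValueError(f"Invalid environment: {name}")  # mirrors the bash exit 1
        urls.append(ENV_URLS[name])
    return urls[0], urls[1]

# map_envs("Source: SANDBOX, Target: PRODUCTION")
# -> ("https://manage.skyflowapis-preview.com", "https://manage.skyflowapis.com")
```

Note the `inputs.x != '' && inputs.x || vars.X` expressions in the last step: this is the usual Actions idiom for "use the input if given, otherwise fall back to the repository variable."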

README.md

Lines changed: 43 additions & 0 deletions
@@ -115,6 +115,49 @@ Note: Please note that if all values are provided `config_file` will take the pr

- The script doesn't migrate service accounts related to connections; this has to be done from Studio.
- Migration of connections associated with functions is not supported.

### Pipelines Migration

Migrates a pipeline definition from the source vault to the target vault.

##### Parameters:
- **`source_and_target_env`**: Source and target environments.
- **`pipeline_id`**: Pipeline ID to migrate. Get the pipeline ID from Studio.
- **`source_datastore_config`**: JSON object that replaces the source datastore configuration. Provide either an `ftpServer` or `s3Bucket` object with the required credentials.
- **`target_datastore_config`**: JSON object that replaces the destination datastore configuration. Provide either an `ftpServer` or `s3Bucket` object with the required credentials.
- **`source_account_access_token`**: Access token of the source account.
- **`target_account_access_token`**: Access token of the target account.

##### Notes:
- Datastore overrides accept exactly one of `ftpServer` or `s3Bucket`. FTP datastores require `transferProtocol` plus either `plainText` or `encrypted` credentials. S3 datastores must include `name`, `region`, and `assumedRoleARN`. (A minimal validation sketch follows this list.)
- The script checks for incompatible overrides (for example, replacing an S3 datastore with an FTP one) and rejects them.
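
Illustrative only, not part of the diff: a minimal sketch of the validation the notes describe, using just the field names from the samples below (the function name is hypothetical):

```python
# Hypothetical validation of a datastore override, following the rules in the
# Notes above; not taken from migrate_pipelines.py itself.
import json

FTP_REQUIRED = {"transferProtocol"}
S3_REQUIRED = {"name", "region", "assumedRoleARN"}

def validate_datastore_config(raw: str) -> dict:
    """Accept exactly one of ftpServer/s3Bucket and check the documented fields."""
    config = json.loads(raw)
    kinds = {"ftpServer", "s3Bucket"} & set(config)
    if len(kinds) != 1:
        raise ValueError("Provide exactly one of 'ftpServer' or 's3Bucket'")
    kind = kinds.pop()
    body = config[kind]
    if kind == "ftpServer":
        if FTP_REQUIRED - set(body) or not ({"plainText", "encrypted"} & set(body)):
            raise ValueError(
                "ftpServer requires transferProtocol plus plainText or encrypted credentials"
            )
    elif missing := S3_REQUIRED - set(body):
        raise ValueError(f"s3Bucket is missing required fields: {sorted(missing)}")
    return config
```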
##### Sample datastore configurations:

```jsonc
{
  "ftpServer": {
    "transferProtocol": "SFTP",
    "plainText": {
      "hostname": "sftp.example.com",
      "port": "22",
      "username": "pipeline-user",
      "password": "secret"
    },
    "skyflowHosted": false
  }
}
```

```jsonc
{
  "s3Bucket": {
    "name": "pipeline-export-bucket",
    "region": "us-west-2",
    "assumedRoleARN": "arn:aws:iam::123456789012:role/pipeline-export-role"
  }
}
```
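
As a usage illustration (again, not part of the commit), the second sample could be fed to the script locally through the same environment variables the workflow's last step sets:

```python
# Hypothetical local run: pass the sample s3Bucket override to
# migrate_pipelines.py via the env vars defined in the workflow above.
import json
import os
import subprocess

target_override = {
    "s3Bucket": {
        "name": "pipeline-export-bucket",
        "region": "us-west-2",
        "assumedRoleARN": "arn:aws:iam::123456789012:role/pipeline-export-role",
    }
}

env = {
    **os.environ,  # assumes tokens, vault IDs, and account IDs are already exported
    "TARGET_DATASTORE_CONFIG": json.dumps(target_override),
}
subprocess.run(["python3", "migrate_pipelines.py"], env=env, check=True)
```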
## Steps to run the workflows

### Prerequisites
