output.txt
=== Deploy bundle 1 with terraform
>>> DATABRICKS_BUNDLE_ENGINE=terraform [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/snapshot-test-1-[UNIQUE_NAME]/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
=== Run migrate on bundle 1
>>> [CLI] bundle deployment migrate
Note: Migration should be done after a full deploy. Running plan now to verify that deployment was done:
Plan: 0 to add, 0 to change, 0 to delete, 2 unchanged
Success! Migrated 2 resources to direct engine state file: [TEST_TMP_DIR]/bundle1/.databricks/bundle/default/resources.json
Validate the migration by running "databricks bundle plan", there should be no actions planned.
The state file is not synchronized to the workspace yet. To do that and finalize the migration, run "bundle deploy".
To undo the migration, remove [TEST_TMP_DIR]/bundle1/.databricks/bundle/default/resources.json and rename [TEST_TMP_DIR]/bundle1/.databricks/bundle/default/terraform/terraform.tfstate.backup to [TEST_TMP_DIR]/bundle1/.databricks/bundle/default/terraform/terraform.tfstate
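The undo instructions above amount to two file operations inside the bundle's state directory: delete the direct-engine state file and restore the terraform state from its backup. A minimal sketch, run against a stand-in directory (the real paths live under the bundle's [TEST_TMP_DIR], which is test-specific; `STATE_DIR` here is a hypothetical fixture):

```shell
#!/bin/sh
set -eu
# Stand-in for [TEST_TMP_DIR]/bundle1/.databricks/bundle/default
STATE_DIR="bundle1-state/.databricks/bundle/default"

# Fixture: lay out the files the migration would have left behind.
mkdir -p "$STATE_DIR/terraform"
echo '{}' > "$STATE_DIR/resources.json"
echo 'prior state' > "$STATE_DIR/terraform/terraform.tfstate.backup"

# Undo the migration: remove the direct-engine state file and
# restore the terraform state from its backup.
rm "$STATE_DIR/resources.json"
mv "$STATE_DIR/terraform/terraform.tfstate.backup" \
   "$STATE_DIR/terraform/terraform.tfstate"
```

After this, the bundle is back on the terraform engine and a subsequent `bundle deploy` proceeds as before the migration.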
=== Deploy bundle 2 with terraform and YamlSync
>>> DATABRICKS_BUNDLE_ENABLE_EXPERIMENTAL_YAML_SYNC=true DATABRICKS_BUNDLE_ENGINE=terraform [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/snapshot-test-2-[UNIQUE_NAME]/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
=== Normalize bundle names for comparison
>>> cp bundle1/.databricks/bundle/default/resources.json tmp.bundle1.json
>>> cp bundle2/.databricks/bundle/default/resources-config-sync-snapshot.json tmp.bundle2.json
>>> update_file.py tmp.bundle1.json snapshot-test-1- snapshot-test-NORMALIZED-
>>> update_file.py tmp.bundle2.json snapshot-test-2- snapshot-test-NORMALIZED-
=== Compare normalized snapshots
>>> diff.py tmp.bundle1.json tmp.bundle2.json
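The normalization step above boils down to a textual prefix substitution: each bundle's unique name prefix is rewritten to a shared placeholder so the two state snapshots can be compared directly. A minimal Python sketch of that idea, using hypothetical snapshot strings in place of the real `resources.json` files (`update_file.py` and `diff.py` are test helpers, not CLI commands):

```python
import json


def normalize(text: str, old_prefix: str, new_prefix: str) -> str:
    """Rewrite a bundle-specific name prefix to a shared placeholder."""
    return text.replace(old_prefix, new_prefix)


# Hypothetical snapshots from two bundles that differ only in their name prefix.
snapshot1 = '{"job_path": "snapshot-test-1-abc/test_job"}'
snapshot2 = '{"job_path": "snapshot-test-2-abc/test_job"}'

normalized1 = normalize(snapshot1, "snapshot-test-1-", "snapshot-test-NORMALIZED-")
normalized2 = normalize(snapshot2, "snapshot-test-2-", "snapshot-test-NORMALIZED-")

# After normalization the two snapshots should be semantically identical.
assert json.loads(normalized1) == json.loads(normalized2)
```

An empty diff after this substitution is what the test asserts: the terraform engine and the direct engine with YamlSync produced equivalent state, modulo the bundle name.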
=== Cleanup bundle 1
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.test_job
delete resources.pipelines.test_pipeline
This action will result in the deletion or recreation of the following Lakeflow Spark Declarative Pipelines along with the Streaming Tables (STs) and Materialized Views (MVs) managed by them:
delete resources.pipelines.test_pipeline
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/snapshot-test-1-[UNIQUE_NAME]/default
Deleting files...
Destroy complete!
=== Cleanup bundle 2
>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete resources.jobs.test_job
delete resources.pipelines.test_pipeline
This action will result in the deletion or recreation of the following Lakeflow Spark Declarative Pipelines along with the Streaming Tables (STs) and Materialized Views (MVs) managed by them:
delete resources.pipelines.test_pipeline
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/snapshot-test-2-[UNIQUE_NAME]/default
Deleting files...
Destroy complete!