
The 'default_python' project was generated by using the default-python template.

For documentation on the Databricks Asset Bundles format used for this project,
and for CI/CD configuration, see https://docs.databricks.com/aws/en/dev-tools/bundles.

## Getting started

Choose how you want to work on this project:

(a) Directly in your Databricks workspace, see
    https://docs.databricks.com/dev-tools/bundles/workspace.

(b) Locally with an IDE like Cursor or VS Code, see
    https://docs.databricks.com/vscode-ext.

(c) With command line tools, see
    https://docs.databricks.com/dev-tools/cli/databricks-cli.html.

Dependencies for this project should be installed using uv:

* Make sure you have the UV package manager installed.
  It's an alternative to tools like pip: https://docs.astral.sh/uv/getting-started/installation/.
* Run `uv sync --dev` to install the project's dependencies.
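
Under the hood, `uv sync --dev` reads the project's `pyproject.toml` and installs both the main dependencies and the `dev` dependency group. As a rough, illustrative fragment (hypothetical contents; the template generates its own file):

```toml
# pyproject.toml (illustrative fragment, not the generated file)
[project]
name = "default_python"
version = "0.1.0"
requires-python = ">=3.10"

# Dev-only tools installed by `uv sync --dev`
[dependency-groups]
dev = ["pytest"]
```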

## Using this project with the CLI

The Databricks workspace and IDE extensions provide a graphical interface for working
with this project. It's also possible to interact with it directly using the CLI:

1. Authenticate to your Databricks workspace, if you have not done so already:
   ```
   $ databricks configure
   ```

2. To deploy a development copy of this project, type:
   ```
   $ databricks bundle deploy --target dev
   ```

   This deploys everything that's defined for this project.
   For example, the default template would deploy a job called
   `[dev yourname] default_python_job` to your workspace.
   You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.
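
   The job name above comes from a job resource declared in the bundle. A minimal sketch of such a declaration (illustrative only: the field names follow the bundle resource schema, but the file path and task shown here are hypothetical, and the template generates its own resource files):

   ```yaml
   # resources/default_python_job.yml (hypothetical sketch)
   resources:
     jobs:
       default_python_job:
         name: default_python_job
         tasks:
           - task_key: main_task
             spark_python_task:
               python_file: ../src/main.py
   ```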

3. Similarly, to deploy a production copy, type:
   ```
   $ databricks bundle deploy --target prod
   ```

   Note that the job's schedule is paused when deploying in development mode (see
   https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).
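
   The `dev` and `prod` targets used above are declared in the bundle's `databricks.yml`. As a rough, illustrative sketch (the generated file contains more detail, such as workspace hosts):

   ```yaml
   # databricks.yml (illustrative fragment)
   bundle:
     name: default_python

   targets:
     dev:
       # Development mode prefixes resource names with "[dev yourname]"
       # and pauses job schedules.
       mode: development
       default: true
     prod:
       mode: production
   ```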

4. To run a job or pipeline, use the "run" command:
   ```
   $ databricks bundle run
   ```

5. Finally, to run tests locally, use `pytest`:
   ```
   $ uv run pytest
   ```
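
   As a sketch of what such a local test looks like (the function and file name here are hypothetical; the template ships its own tests under `tests/`):

   ```python
   # tests/test_sample.py -- minimal, self-contained pytest example
   # (toy function standing in for project code; not part of the template)

   def add_numbers(a: int, b: int) -> int:
       """Add two integers."""
       return a + b


   def test_add_numbers():
       # pytest discovers functions named test_* and runs their assertions
       assert add_numbers(2, 3) == 5
   ```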