Once described, a pipeline can be [started](PipelineExecution.md#runPipeline) (on a [PipelineExecution](PipelineExecution.md)).

## Demo: spark-pipelines CLI

Create a new Python project with `uv`:

```bash
uv init hello-spark-pipelines
```

Change into the project directory:

```bash
cd hello-spark-pipelines
```

The new project's virtual environment starts out with `pip` only:

```console
❯ uv pip list
Using Python 3.12.11 environment at: /Users/jacek/.local/share/uv/python/cpython-3.12.11-macos-aarch64-none
Package Version
------- -------
pip     24.3.1
```

Point `SPARK_HOME` at your local Apache Spark source directory:

```bash
export SPARK_HOME=/Users/jacek/oss/spark
```

Add the PySpark Connect client (`pyspark-client`) from the Spark sources as a project dependency:

```bash
uv add $SPARK_HOME/python/packaging/client
```

`pyspark-client` is now installed, along with its transitive dependencies:

```console
❯ uv pip list
Package                  Version
------------------------ -----------
googleapis-common-protos 1.70.0
grpcio                   1.74.0
grpcio-status            1.74.0
numpy                    2.3.2
pandas                   2.3.1
protobuf                 6.31.1
pyarrow                  21.0.0
pyspark-client           4.1.0.dev0
python-dateutil          2.9.0.post0
pytz                     2025.2
pyyaml                   6.0.2
six                      1.17.0
tzdata                   2025.2
```

Activate the project's virtual environment:

```bash
source .venv/bin/activate
```

`spark-pipelines` is a command-line tool with three subcommands:

```console
$ $SPARK_HOME/bin/spark-pipelines --help
usage: cli.py [-h] {run,dry-run,init} ...

Pipelines CLI

positional arguments:
  {run,dry-run,init}
    run       Run a pipeline. If no refresh options specified, a
              default incremental update is performed.
    dry-run   Launch a run that just validates the graph and checks
              for errors.
    init      Generate a sample pipeline project, including a spec
              file and example definitions.

options:
  -h, --help  show this help message and exit
```

With no `pipeline.yaml` or `pipeline.yml` spec file in the project (and none given in the arguments), `dry-run` fails with a `PIPELINE_SPEC_FILE_NOT_FOUND` error:

```console
❯ $SPARK_HOME/bin/spark-pipelines dry-run
Traceback (most recent call last):
  File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 382, in <module>
    main()
  File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 358, in main
    spec_path = find_pipeline_spec(Path.cwd())
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 101, in find_pipeline_spec
    raise PySparkException(
pyspark.errors.exceptions.base.PySparkException: [PIPELINE_SPEC_FILE_NOT_FOUND] No pipeline.yaml or pipeline.yml file provided in arguments or found in directory `/` or readable ancestor directories.
```
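
The `init` subcommand (per the help output above) generates a sample pipeline project, including a spec file and example definitions, which is the reliable way to get a valid spec. As a sketch only, a minimal `pipeline.yml` might look like the following; the field names (`name`, `definitions`, `glob`, `include`) are assumptions, not a confirmed schema, so prefer the file that `init` generates:

```yaml
# Hypothetical minimal pipeline spec (field names are assumptions);
# run `spark-pipelines init` to generate an authoritative one.
name: hello-spark-pipelines
definitions:
  - glob:
      include: transformations/**/*.py
```

With a spec file in place (or one provided in the arguments), `dry-run` can validate the dataflow graph and report errors without running the pipeline.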