Activate (_source_) the virtual environment (that `uv` helped us create).
It brings in all the necessary PySpark modules that have not been released yet and are only available in source form.

```bash
source .venv/bin/activate
```
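As a quick, optional sanity check (assuming the environment was created from the Spark source tree as above), the `pyspark` that resolves here should report an unreleased, in-development version:

```bash
# Expect an in-development version string (e.g. one ending in .dev0),
# not a released PyPI version.
python -c "import pyspark; print(pyspark.__version__)"
```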
```console
❯ $SPARK_HOME/bin/spark-pipelines --help
usage: cli.py [-h] {run,dry-run,init} ...

Pipelines CLI

positional arguments:
  {run,dry-run,init}
    run       Run a pipeline. If no refresh options specified, a
              default incremental update is performed.
    dry-run   Launch a run that just validates the graph and checks
              for errors.
    init      Generate a sample pipeline project, including a spec
              file and example definitions.

options:
  -h, --help  show this help message and exit
```

```bash
$SPARK_HOME/bin/spark-pipelines dry-run
```

??? note "Output"
    ```console
    Traceback (most recent call last):
      File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 382, in <module>
        main()
      File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 358, in main
        spec_path = find_pipeline_spec(Path.cwd())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/jacek/oss/spark/python/pyspark/pipelines/cli.py", line 101, in find_pipeline_spec
        raise PySparkException(
    pyspark.errors.exceptions.base.PySparkException: [PIPELINE_SPEC_FILE_NOT_FOUND] No pipeline.yaml or pipeline.yml file provided in arguments or found in directory `/` or readable ancestor directories.
    ```

Create a demo `hello-spark-pipelines` pipeline project with a sample `pipeline.yml` and sample transformations (in Python and in SQL).
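One way to bootstrap it is the `init` subcommand shown in the help above. A minimal sketch, assuming `init` accepts a `--name` option and generates the project under a directory of that name (verify with `$SPARK_HOME/bin/spark-pipelines init --help`):

```bash
# Generate a sample pipeline project: a pipeline.yml spec file plus example
# transformations in Python and SQL.
# Assumption: init accepts --name and creates a directory with that name.
$SPARK_HOME/bin/spark-pipelines init --name hello-spark-pipelines

# Run the validation again from inside the generated project.
cd hello-spark-pipelines
$SPARK_HOME/bin/spark-pipelines dry-run
```

With a `pipeline.yml` now in the current directory (or one of its ancestors), `dry-run` should no longer fail with `PIPELINE_SPEC_FILE_NOT_FOUND`.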