I have written a bash script that uses the gcloud command to activate a service account and the gsutil command to move files to a GCS bucket. When I run the script with the bash command, it executes fine. But when I run the same script from crontab, it fails with "command not found":
==> NOTE: You are uploading one or more large file(s), which would run
significantly faster if you enable parallel composite uploads. This
feature can be enabled by editing the
"parallel_composite_upload_threshold" value in your .boto
configuration file. However, note that if you do this large files will
be uploaded as composite objects
(https://cloud.google.com/storage/docs/composite-objects), which
means that any user who downloads such objects will need to have a
compiled crcmod installed (see "gsutil help crcmod"). This is because
without a compiled crcmod, computing checksums on composite objects is
so slow that gsutil disables downloads of composite objects.
Removing file:///file.csv
Operation completed over 1 objects/2.7 GiB.
/script.sh: line 691: gcloud: command not found
/script.sh: line 693: gsutil: command not found
What is the mistake here, and how do I correct it?
The issue is with the PATH. Thanks to the answer at "gcloud command not found - while installing Google Cloud SDK", it worked after explicitly exporting the gcloud path within the script:
export PATH="~/google-cloud-sdk/bin:$PATH"
Gordon Davisson
Don't put the ~/ inside quotes; it causes problems depending on exactly how the command is invoked. Use either export PATH=~/"google-cloud-sdk/bin:$PATH" or export PATH="$HOME/google-cloud-sdk/bin:$PATH". See "tilde expansion when evaluating $PATH".
Cron runs jobs with a minimal PATH containing only /bin and /usr/bin, so if gcloud and gsutil are anywhere else (and you don't add the relevant directory to PATH), they won't be found. See "crontab and binaries in /usr/local/bin", and if that doesn't solve it, "CronJob not running" has more troubleshooting recommendations.
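An alternative to editing the script is to set PATH at the top of the crontab itself, so every job inherits it. A minimal sketch; the SDK directory and the schedule are assumptions, and /script.sh is the script from the question:

```
PATH=/usr/bin:/bin:/home/user/google-cloud-sdk/bin
# run the upload script nightly at 02:00 (example schedule)
0 2 * * * /script.sh >> /tmp/script.log 2>&1
```

Redirecting stdout and stderr to a log file also makes the next "command not found" visible instead of silently mailed or discarded.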