Cloud Dataproc V1 API - Class Google::Cloud::Dataproc::V1::Batch (v1.6.0)

Reference documentation and code samples for the Cloud Dataproc V1 API class Google::Cloud::Dataproc::V1::Batch.

A representation of a batch workload in the service.

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#create_time

def create_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. The time when the batch was created.

#creator

def creator() -> ::String
Returns
  • (::String) — Output only. The email address of the user who created the batch.

#environment_config

def environment_config() -> ::Google::Cloud::Dataproc::V1::EnvironmentConfig
Returns
  • (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.

#environment_config=

def environment_config=(value) -> ::Google::Cloud::Dataproc::V1::EnvironmentConfig
Parameter
  • value (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.
Returns
  • (::Google::Cloud::Dataproc::V1::EnvironmentConfig) — Optional. Environment configuration for the batch execution.

#labels

def labels() -> ::Google::Protobuf::Map{::String => ::String}
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.

#labels=

def labels=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
  • value (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035. Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035. No more than 32 labels can be associated with a batch.
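The label constraints above can be checked client-side before a request is sent. The following is an illustrative sketch in plain Ruby; the `RFC1035_LABEL` pattern and `valid_labels?` helper are not part of this gem:

```ruby
# Illustrative validator for the documented label constraints:
# keys and non-empty values must be 1 to 63 characters and conform
# to RFC 1035; at most 32 labels per batch. Not part of the gem.
RFC1035_LABEL = /\A[a-z]([-a-z0-9]{0,61}[a-z0-9])?\z/

def valid_labels?(labels)
  return false if labels.size > 32
  labels.all? do |key, value|
    key.match?(RFC1035_LABEL) &&
      (value.empty? || value.match?(RFC1035_LABEL))
  end
end
```

For example, `valid_labels?("env" => "prod", "team" => "")` passes, while a key starting with a digit or a map with 33 entries does not.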

#name

def name() -> ::String
Returns
  • (::String) — Output only. The resource name of the batch.

#operation

def operation() -> ::String
Returns
  • (::String) — Output only. The resource name of the operation associated with this batch.

#pyspark_batch

def pyspark_batch() -> ::Google::Cloud::Dataproc::V1::PySparkBatch
Returns
  • (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#pyspark_batch=

def pyspark_batch=(value) -> ::Google::Cloud::Dataproc::V1::PySparkBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::PySparkBatch) — Optional. PySpark batch config.

    Note: The following fields are mutually exclusive: pyspark_batch, spark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.
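The mutually exclusive behavior described in the note (a protobuf oneof) can be sketched with a plain Ruby class. This is illustrative only, not the generated Google::Cloud::Dataproc::V1::Batch class:

```ruby
# Illustrative sketch of the documented oneof semantics: the four
# batch config fields are mutually exclusive, so assigning any one
# of them clears the other three. Not the generated protobuf class.
class BatchConfigSketch
  ONEOF_FIELDS = %i[pyspark_batch spark_batch spark_r_batch spark_sql_batch].freeze
  attr_reader(*ONEOF_FIELDS)

  ONEOF_FIELDS.each do |field|
    define_method("#{field}=") do |value|
      ONEOF_FIELDS.each { |f| instance_variable_set("@#{f}", nil) }
      instance_variable_set("@#{field}", value)
    end
  end
end
```

After assigning `spark_batch=` on an instance that already has `pyspark_batch` set, reading `pyspark_batch` returns nil, mirroring the automatic clearing described above.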

#runtime_config

def runtime_config() -> ::Google::Cloud::Dataproc::V1::RuntimeConfig
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.

#runtime_config=

def runtime_config=(value) -> ::Google::Cloud::Dataproc::V1::RuntimeConfig
Parameter
  • value (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeConfig) — Optional. Runtime configuration for the batch execution.

#runtime_info

def runtime_info() -> ::Google::Cloud::Dataproc::V1::RuntimeInfo
Returns
  • (::Google::Cloud::Dataproc::V1::RuntimeInfo) — Output only. Runtime information about batch execution.

#spark_batch

def spark_batch() -> ::Google::Cloud::Dataproc::V1::SparkBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_batch=

def spark_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkBatch) — Optional. Spark batch config.

    Note: The following fields are mutually exclusive: spark_batch, pyspark_batch, spark_r_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_r_batch

def spark_r_batch() -> ::Google::Cloud::Dataproc::V1::SparkRBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_r_batch=

def spark_r_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkRBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkRBatch) — Optional. SparkR batch config.

    Note: The following fields are mutually exclusive: spark_r_batch, pyspark_batch, spark_batch, spark_sql_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_sql_batch

def spark_sql_batch() -> ::Google::Cloud::Dataproc::V1::SparkSqlBatch
Returns
  • (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#spark_sql_batch=

def spark_sql_batch=(value) -> ::Google::Cloud::Dataproc::V1::SparkSqlBatch
Parameter
  • value (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::Google::Cloud::Dataproc::V1::SparkSqlBatch) — Optional. SparkSql batch config.

    Note: The following fields are mutually exclusive: spark_sql_batch, pyspark_batch, spark_batch, spark_r_batch. If a field in that set is populated, all other fields in the set will automatically be cleared.

#state

def state() -> ::Google::Cloud::Dataproc::V1::Batch::State
Returns
  • (::Google::Cloud::Dataproc::V1::Batch::State) — Output only. The state of the batch.

#state_history

def state_history() -> ::Array<::Google::Cloud::Dataproc::V1::Batch::StateHistory>
Returns
  • (::Array<::Google::Cloud::Dataproc::V1::Batch::StateHistory>) — Output only. Historical state information for the batch.

#state_message

def state_message() -> ::String
Returns
  • (::String) — Output only. Batch state details, such as a failure description if the state is FAILED.

#state_time

def state_time() -> ::Google::Protobuf::Timestamp
Returns
  • (::Google::Protobuf::Timestamp) — Output only. The time when the batch entered a current state.

#uuid

def uuid() -> ::String
Returns
  • (::String) — Output only. A batch UUID (Unique Universal Identifier). The service generates this value when it creates the batch.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-10-30 UTC.