Cloud Dataproc V1 API - Class Google::Cloud::Dataproc::V1::SparkJob (v1.6.0)

Reference documentation and code samples for the Cloud Dataproc V1 API class Google::Cloud::Dataproc::V1::SparkJob.

A Dataproc job for running Apache Spark applications on YARN.
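A minimal construction sketch (the gs:// URIs, jar names, and property values below are placeholders, not part of the API); protobuf message constructors accept keyword arguments for the fields documented on this page:

```ruby
require "google/cloud/dataproc/v1"

# Repeated fields accept arrays and map fields accept hashes in the constructor.
spark_job = Google::Cloud::Dataproc::V1::SparkJob.new(
  main_jar_file_uri: "gs://my-bucket/jars/my-spark-app.jar",
  args:              ["--input", "gs://my-bucket/input/"],
  jar_file_uris:     ["gs://my-bucket/jars/extra-dependency.jar"],
  properties:        { "spark.executor.memory" => "4g" }
)
```

A SparkJob built this way is typically assigned to the spark_job field of a Google::Cloud::Dataproc::V1::Job before the job is submitted.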

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#archive_uris

def archive_uris() -> ::Array<::String>
Returns
  • (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

#archive_uris=

def archive_uris=(value) -> ::Array<::String>
Parameter
  • value (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
Returns
  • (::Array<::String>) — Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
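A short sketch of populating this field (the bucket and archive names are placeholders); the setter accepts a plain Ruby array, per the signature above:

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Each archive is extracted into the working directory of every executor.
spark_job.archive_uris = [
  "gs://my-bucket/deps/python-env.tar.gz",
  "gs://my-bucket/deps/site-config.zip"
]
```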

#args

def args() -> ::Array<::String>
Returns
  • (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

#args=

def args=(value) -> ::Array<::String>
Parameter
  • value (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
Returns
  • (::Array<::String>) — Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
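A sketch of passing driver arguments (the flags shown are application-specific placeholders):

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Application arguments only; Spark options such as --conf belong in
# #properties rather than here.
spark_job.args = ["--input", "gs://my-bucket/input/", "--iterations", "10"]
```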

#file_uris

def file_uris() -> ::Array<::String>
Returns
  • (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.

#file_uris=

def file_uris=(value) -> ::Array<::String>
Parameter
  • value (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
Returns
  • (::Array<::String>) — Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
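A sketch of distributing plain files to executors (the URI is a placeholder):

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Files are placed in each executor's working directory as-is (not extracted).
spark_job.file_uris = ["gs://my-bucket/data/lookup-table.csv"]
```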

#jar_file_uris

def jar_file_uris() -> ::Array<::String>
Returns
  • (::Array<::String>) — Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.

#jar_file_uris=

def jar_file_uris=(value) -> ::Array<::String>
Parameter
  • value (::Array<::String>) — Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.
Returns
  • (::Array<::String>) — Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.
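A sketch of adding dependency jars (the URI is a placeholder):

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Added to the CLASSPATHs of the Spark driver and of every task.
spark_job.jar_file_uris = ["gs://my-bucket/jars/extra-dependency.jar"]
```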

#logging_config

def logging_config() -> ::Google::Cloud::Dataproc::V1::LoggingConfig
Returns
  • (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.

#logging_config=

def logging_config=(value) -> ::Google::Cloud::Dataproc::V1::LoggingConfig
Parameter
  • value (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.
Returns
  • (::Google::Cloud::Dataproc::V1::LoggingConfig) — Optional. The runtime log config for job execution.
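A sketch of attaching a log config, assuming the driver_log_levels map on Google::Cloud::Dataproc::V1::LoggingConfig (the package names are placeholders):

```ruby
require "google/cloud/dataproc/v1"

# "root" sets the default driver log level; other keys are package names.
logging = Google::Cloud::Dataproc::V1::LoggingConfig.new(
  driver_log_levels: { "root" => :INFO, "org.apache.spark" => :DEBUG }
)

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
spark_job.logging_config = logging
```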

#main_class

def main_class() -> ::String
Returns
  • (::String) — The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in SparkJob.jar_file_uris.

    Note: The following fields are mutually exclusive: main_class, main_jar_file_uri. If a field in that set is populated, all other fields in the set will automatically be cleared.

#main_class=

def main_class=(value) -> ::String
Parameter
  • value (::String) — The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in SparkJob.jar_file_uris.

    Note: The following fields are mutually exclusive: main_class, main_jar_file_uri. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::String) — The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in SparkJob.jar_file_uris.

    Note: The following fields are mutually exclusive: main_class, main_jar_file_uri. If a field in that set is populated, all other fields in the set will automatically be cleared.
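A sketch illustrating the class-based entry point and the mutual-exclusion behavior described above (the class and jar names are placeholders):

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new(
  main_jar_file_uri: "gs://my-bucket/jars/my-spark-app.jar"
)

# The jar containing the class must be on the default CLASSPATH or listed
# in #jar_file_uris.
spark_job.jar_file_uris = ["gs://my-bucket/jars/my-spark-app.jar"]
spark_job.main_class    = "com.example.SparkWordCount"

# main_class and main_jar_file_uri are mutually exclusive, so the earlier
# main_jar_file_uri value is automatically cleared.
spark_job.main_jar_file_uri #=> ""
```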

#main_jar_file_uri

def main_jar_file_uri() -> ::String
Returns
  • (::String) — The HCFS URI of the jar file that contains the main class.

    Note: The following fields are mutually exclusive: main_jar_file_uri, main_class. If a field in that set is populated, all other fields in the set will automatically be cleared.

#main_jar_file_uri=

def main_jar_file_uri=(value) -> ::String
Parameter
  • value (::String) — The HCFS URI of the jar file that contains the main class.

    Note: The following fields are mutually exclusive: main_jar_file_uri, main_class. If a field in that set is populated, all other fields in the set will automatically be cleared.

Returns
  • (::String) — The HCFS URI of the jar file that contains the main class.

    Note: The following fields are mutually exclusive: main_jar_file_uri, main_class. If a field in that set is populated, all other fields in the set will automatically be cleared.
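A sketch of the jar-based entry point (the URI is a placeholder):

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Point directly at the jar that contains the main class; because the two
# fields are mutually exclusive, this clears any previously set main_class.
spark_job.main_jar_file_uri = "gs://my-bucket/jars/my-spark-app.jar"
```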

#properties

def properties() -> ::Google::Protobuf::Map{::String => ::String}
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.

#properties=

def properties=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
  • value (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
Returns
  • (::Google::Protobuf::Map{::String => ::String}) — Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
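A sketch of setting Spark properties in place (the property values are placeholders); the map field is always present on the message, so entries can be added directly:

```ruby
require "google/cloud/dataproc/v1"

spark_job = Google::Cloud::Dataproc::V1::SparkJob.new
# Standard spark-defaults.conf keys; values that conflict with settings made
# by the Dataproc API might be overwritten, as noted above.
spark_job.properties["spark.executor.memory"] = "4g"
spark_job.properties["spark.driver.cores"]    = "2"
```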
