Class SparkJob (1.1.3)

A Dataproc job for running `Apache Spark <http://spark.apache.org/>`__ applications on YARN.

``main_jar_file_uri``: The HCFS URI of the jar file that contains the main class.

``args``: Optional. The arguments to pass to the driver. Do not include arguments, such as ``--conf``, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

``file_uris``: Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.

``properties``: Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in ``/etc/spark/conf/spark-defaults.conf`` and classes in user code.
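The fields above can be combined into a job payload. Below is a minimal sketch of a ``SparkJob`` expressed as a plain dict in the JSON form the Dataproc API accepts; the bucket, jar path, and cluster name are hypothetical placeholders, not values from this document.

```python
# Hypothetical SparkJob payload; all URIs and names are placeholders.
spark_job = {
    "main_jar_file_uri": "gs://my-bucket/jars/my-app.jar",  # jar containing the main class
    "args": ["--input", "gs://my-bucket/data/"],            # driver args; no --conf flags here
    "file_uris": ["gs://my-bucket/lookup.csv"],             # staged in each executor's working dir
    "properties": {"spark.executor.memory": "4g"},          # Spark configuration properties
}

# A full job request also names the cluster to run on.
job = {
    "placement": {"cluster_name": "my-cluster"},
    "spark_job": spark_job,
}

print(sorted(job["spark_job"]))
# prints ['args', 'file_uris', 'main_jar_file_uri', 'properties']
```

A dict in this shape can be passed as the ``job`` argument when submitting through the client library's ``JobControllerClient.submit_job`` method, which also takes the project ID and region.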

Classes

PropertiesEntry

API documentation for dataproc_v1.types.SparkJob.PropertiesEntry class.