Required. Job information, including how, when, and where to
run the job.
Job is a Hadoop job.
Job is a PySpark job.
Job is a Pig job.
Job is a SparkSql job.
Output only. The previous job status.
Output only. The email address of the user submitting the job.
For jobs submitted on the cluster, the address is
username@hostname.
Output only. If present, the location of miscellaneous control
files which may be used as part of job setup and handling. If
not present, control files may be placed in the same location
as driver_output_uri.
Optional. Job scheduling configuration.
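Taken together, the fields above describe a single Job resource: exactly one job-type field (Hadoop, PySpark, Pig, SparkSql, etc.), plus placement, labels, and optional scheduling. As a minimal sketch, a PySpark job can be expressed as a plain dict in the shape of the v1beta2 REST representation, which the client libraries also accept in place of a proto message; the cluster name and gs:// URIs below are hypothetical:

```python
# Sketch of a Dataproc v1beta2 Job resource as a plain dict.
# Cluster name and gs:// URIs are hypothetical placeholders.
job = {
    "placement": {"cluster_name": "example-cluster"},  # where to run the job
    "pyspark_job": {  # the job-type field; exactly one may be set
        "main_python_file_uri": "gs://example-bucket/wordcount.py",
        "args": ["gs://example-bucket/input.txt"],
    },
    "labels": {"env": "dev"},  # free-form key/value labels
    "scheduling": {"max_failures_per_hour": 1},  # Optional scheduling config
}

# Sanity check: exactly one job-type field should be present.
job_type_fields = {"hadoop_job", "spark_job", "pyspark_job",
                   "pig_job", "hive_job", "spark_sql_job"}
assert len(job_type_fields & job.keys()) == 1
```

A dict in this shape could then be passed to a client's submit call wherever a Job message is expected; output-only fields such as the submitting user's email and the control-files location are populated by the service and are not set by the caller.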
Classes
LabelsEntry
API documentation for dataproc_v1beta2.types.Job.LabelsEntry class.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-11-01 UTC."],[],[]]