Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a regional endpoint. Using `projects.jobs.update` is not recommended, because you can only update the state of jobs that are running in `us-central1`.
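For example, the most common use of this method is to request a state transition for a running job. The following is a minimal sketch, assuming a hypothetical project (`my-project`), region (`us-central1`), and job ID (`my-job-id`), that asks Dataflow to drain the job:

```yaml
# Minimal sketch: ask Dataflow to drain a running job.
# my-project, us-central1, and my-job-id are hypothetical placeholders.
- drainJob:
    call: googleapis.dataflow.v1b3.projects.locations.jobs.update
    args:
      projectId: my-project
      location: us-central1
      jobId: my-job-id
      body:
        requestedState: JOB_STATE_DRAINED
    result: drainResult
```

Draining (`JOB_STATE_DRAINED`) and cancelling (`JOB_STATE_CANCELLED`) are the state transitions this method is typically used to request.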
This method waits until the operation is complete, fails, or times out; workflow execution is paused in the meantime. The default timeout is 1800 seconds (30 minutes) and can be increased up to a maximum of 31536000 seconds (one year) for long-running operations using the `connector_params` field. See the Connectors reference.
The connector uses polling to monitor the long-running operation, which might generate additional billable steps. For more information about retries and long-running operations, refer to Understand connectors.
You can configure the polling policy for the long-running operation. To set the connector-specific parameters (`connector_params`), refer to Invoke a connector call.
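For example, here is a sketch of overriding the default timeout and polling cadence through `connector_params`; the field names follow the Connectors reference, and all values shown are illustrative:

```yaml
# Sketch: raise the LRO timeout to two hours and slow the polling cadence.
# Field names follow the Connectors reference; all values are illustrative.
- update:
    call: googleapis.dataflow.v1b3.projects.locations.jobs.update
    args:
      jobId: ...
      location: ...
      projectId: ...
      body:
        requestedState: JOB_STATE_CANCELLED
      connector_params:
        timeout: 7200          # seconds; default 1800, maximum 31536000
        polling_policy:
          initial_delay: 10    # seconds before the first poll
          multiplier: 1.5      # backoff factor between polls
          max_delay: 120       # upper bound on the delay between polls
    result: updateResult
```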
Arguments

| Parameter | Description |
|---|---|
| `jobId` | Required. The job ID. |
| `location` | Required. The regional endpoint that contains this job. |
| `projectId` | Required. The ID of the Cloud Platform project that the job belongs to. |
| `body` | Required. |
Raised exceptions

| Exception | Description |
|---|---|
| `ConnectionError` | In case of a network problem (such as a DNS failure or a refused connection). |
| `HttpError` | If the response status is >= 400 (excluding 429 and 503). |
| `TimeoutError` | If a long-running operation takes longer to finish than the specified timeout limit. |
| `TypeError` | If an operation or function receives an argument of the wrong type. |
| `ValueError` | If an operation or function receives an argument of the right type but an inappropriate value. For example, a negative timeout. |
| `OperationError` | If the long-running operation finished unsuccessfully. |
| `ResponseTypeError` | If the long-running operation returned a response of the wrong type. |
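These exceptions surface as runtime errors in the workflow, so they can be handled with a try/except block. A sketch, using hypothetical step names and placeholder argument values:

```yaml
# Sketch: catch connector errors raised by the update call.
# Step names and argument values are hypothetical placeholders.
- tryUpdate:
    try:
      call: googleapis.dataflow.v1b3.projects.locations.jobs.update
      args:
        jobId: my-job-id
        location: us-central1
        projectId: my-project
        body:
          requestedState: JOB_STATE_CANCELLED
      result: updateResult
    except:
      as: e
      steps:
        - logFailure:
            call: sys.log
            args:
              severity: ERROR
              text: ${"Dataflow job update failed - " + json.encode_to_string(e)}
        - reraise:
            raise: ${e}
```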
Response
If successful, the response contains an instance of `Job`.
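For example, a subsequent step can read fields of the returned `Job` through the step's result variable; this sketch assumes the `updateResult` name used in the snippet below:

```yaml
# Sketch: branch on the state of the Job returned by the update call.
# Assumes updateResult holds the connector response (see snippet below).
- checkState:
    switch:
      - condition: ${updateResult.currentState == "JOB_STATE_CANCELLED"}
        return: ${"Cancelled job " + updateResult.id}
- returnState:
    return: ${updateResult.currentState}
```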
Subworkflow snippet
Some of the fields shown in the snippet might be optional. To identify the required fields, refer to the API documentation.
YAML
```yaml
- update:
    call: googleapis.dataflow.v1b3.projects.locations.jobs.update
    args:
      jobId: ...
      location: ...
      projectId: ...
      body:
        clientRequestId: ...
        createTime: ...
        createdFromSnapshotId: ...
        currentState: ...
        currentStateTime: ...
        environment:
          clusterManagerApiService: ...
          dataset: ...
          debugOptions:
            enableHotKeyLogging: ...
          experiments: ...
          flexResourceSchedulingGoal: ...
          internalExperiments: ...
          sdkPipelineOptions: ...
          serviceAccountEmail: ...
          serviceKmsKeyName: ...
          serviceOptions: ...
          tempStoragePrefix: ...
          userAgent: ...
          version: ...
          workerPools: ...
          workerRegion: ...
          workerZone: ...
        executionInfo:
          stages: ...
        id: ...
        jobMetadata:
          bigTableDetails: ...
          bigqueryDetails: ...
          datastoreDetails: ...
          fileDetails: ...
          pubsubDetails: ...
          sdkVersion:
            sdkSupportStatus: ...
            version: ...
            versionDisplayName: ...
          spannerDetails: ...
        labels: ...
        location: ...
        name: ...
        pipelineDescription:
          displayData: ...
          executionPipelineStage: ...
          originalPipelineTransform: ...
        projectId: ...
        replaceJobId: ...
        replacedByJobId: ...
        requestedState: ...
        satisfiesPzs: ...
        stageStates: ...
        startTime: ...
        steps: ...
        stepsLocation: ...
        tempFiles: ...
        transformNameMapping: ...
        type: ...
    result: updateResult
```
JSON
[ { "update": { "call": "googleapis.dataflow.v1b3.projects.locations.jobs.update", "args": { "jobId": "...", "location": "...", "projectId": "...", "body": { "clientRequestId": "...", "createTime": "...", "createdFromSnapshotId": "...", "currentState": "...", "currentStateTime": "...", "environment": { "clusterManagerApiService": "...", "dataset": "...", "debugOptions": { "enableHotKeyLogging": "..." }, "experiments": "...", "flexResourceSchedulingGoal": "...", "internalExperiments": "...", "sdkPipelineOptions": "...", "serviceAccountEmail": "...", "serviceKmsKeyName": "...", "serviceOptions": "...", "tempStoragePrefix": "...", "userAgent": "...", "version": "...", "workerPools": "...", "workerRegion": "...", "workerZone": "..." }, "executionInfo": { "stages": "..." }, "id": "...", "jobMetadata": { "bigTableDetails": "...", "bigqueryDetails": "...", "datastoreDetails": "...", "fileDetails": "...", "pubsubDetails": "...", "sdkVersion": { "sdkSupportStatus": "...", "version": "...", "versionDisplayName": "..." }, "spannerDetails": "..." }, "labels": "...", "location": "...", "name": "...", "pipelineDescription": { "displayData": "...", "executionPipelineStage": "...", "originalPipelineTransform": "..." }, "projectId": "...", "replaceJobId": "...", "replacedByJobId": "...", "requestedState": "...", "satisfiesPzs": "...", "stageStates": "...", "startTime": "...", "steps": "...", "stepsLocation": "...", "tempFiles": "...", "transformNameMapping": "...", "type": "..." } }, "result": "updateResult" } } ]