Processing Landsat satellite images with GPUs


This tutorial shows you how to use GPUs on Dataflow to process Landsat 8 satellite images and render them as JPEG files. The tutorial is based on the example Processing Landsat satellite images with GPUs.

Objectives

  • Build a Docker image for Dataflow that has TensorFlow with GPU support.
  • Run a Dataflow job with GPUs.

Costs

This tutorial uses billable components of Google Cloud, including:

  • Cloud Storage
  • Dataflow
  • Artifact Registry

Use the pricing calculator to generate a cost estimate based on your projected usage.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.
  3. To initialize the gcloud CLI, run the following command:

    gcloud init
  4. Create or select a Google Cloud project.

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID

      Replace PROJECT_ID with your Google Cloud project name.

  5. Make sure that billing is enabled for your Google Cloud project.

  6. Enable the Dataflow, Cloud Build, and Artifact Registry APIs:

    gcloud services enable dataflow.googleapis.com cloudbuild.googleapis.com artifactregistry.googleapis.com
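
    To confirm that the APIs are enabled before you continue, you can list the enabled services and filter for the three APIs. This is an optional check, not part of the required steps:

    gcloud services list --enabled | grep -E 'dataflow|cloudbuild|artifactregistry'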
  7. If you're using a local shell, then create local authentication credentials for your user account:

    gcloud auth application-default login

    You don't need to do this if you're using Cloud Shell.

  8. Grant roles to your user account. Run the following command once for each of the following IAM roles: roles/iam.serviceAccountUser

    gcloud projects add-iam-policy-binding PROJECT_ID --member="user:USER_IDENTIFIER" --role=ROLE
    • Replace PROJECT_ID with your project ID.
    • Replace USER_IDENTIFIER with the identifier for your user account. For example, user:myemail@example.com.

    • Replace ROLE with each individual role.
  9. Grant roles to your Compute Engine default service account. Run the following command once for each of the following IAM roles: roles/dataflow.admin, roles/dataflow.worker, roles/bigquery.dataEditor, roles/pubsub.editor, roles/storage.objectAdmin, and roles/artifactregistry.reader.

    gcloud projects add-iam-policy-binding PROJECT_ID --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" --role=SERVICE_ACCOUNT_ROLE
    • Replace PROJECT_ID with your project ID.
    • Replace PROJECT_NUMBER with your project number. To find your project number, see Identify projects.
    • Replace SERVICE_ACCOUNT_ROLE with each individual role.
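
    If you prefer, you can apply all six roles in one shell loop instead of running the command once per role. This is an optional convenience that uses the same binding command shown above:

    for SERVICE_ACCOUNT_ROLE in roles/dataflow.admin roles/dataflow.worker roles/bigquery.dataEditor roles/pubsub.editor roles/storage.objectAdmin roles/artifactregistry.reader; do
      gcloud projects add-iam-policy-binding PROJECT_ID \
          --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
          --role=$SERVICE_ACCOUNT_ROLE
    done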
  10. To store the output JPEG image files from this tutorial, create a Cloud Storage bucket:
    1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    2. Click Create bucket.
    3. On the Create a bucket page, enter your bucket information. To go to the next step, click Continue.
      • For Name your bucket, enter a unique bucket name. Don't include sensitive information in the bucket name, because the bucket namespace is global and publicly visible.
      • For Choose where to store your data, do the following:
        • Select a Location type option.
        • Select a Location option.
      • For Choose a default storage class for your data, select Standard.
      • For Choose how to control access to objects, select an Access control option.
      • For Advanced settings (optional), specify an encryption method, a retention policy, or bucket labels.
    4. Click Create.
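
    Alternatively, you can create the bucket with the gcloud CLI instead of the console. The following is a minimal sketch that assumes the Standard storage class, a US multi-region location, and uniform bucket-level access; adjust the values to match the choices you would otherwise make in the console:

    gcloud storage buckets create gs://BUCKET_NAME \
        --location=US \
        --default-storage-class=STANDARD \
        --uniform-bucket-level-access

    Replace BUCKET_NAME with a unique bucket name.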

Prepare your working environment

Download the starter files, and then create your Artifact Registry repository.

Download the starter files

Download the starter files and then change directories.

  1. Clone the python-docs-samples repository.

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
    
  2. Navigate to the sample code directory.

    cd python-docs-samples/dataflow/gpu-examples/tensorflow-landsat
    

Configure Artifact Registry

Create an Artifact Registry repository so that you can upload artifacts. Each repository can contain artifacts for a single supported format.

All repository content is encrypted using either Google-owned and Google-managed keys or customer-managed encryption keys. Artifact Registry uses Google-owned and Google-managed keys by default and no configuration is required for this option.

You must have at least Artifact Registry Writer access to the repository.

Run the following command to create a new repository. The command uses the --async flag and returns immediately, without waiting for the operation in progress to complete.

gcloud artifacts repositories create REPOSITORY \
    --repository-format=docker \
    --location=LOCATION \
    --async

Replace REPOSITORY with a name for your repository and LOCATION with a region for the repository, such as us-central1. For each repository location in a project, repository names must be unique.
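
Because the command returns before the operation completes, you can verify that the repository exists before you continue:

gcloud artifacts repositories describe REPOSITORY --location=LOCATION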

Before you can push or pull images, configure Docker to authenticate requests for Artifact Registry. To set up authentication to Docker repositories, run the following command:

gcloud auth configure-docker LOCATION-docker.pkg.dev

Replace LOCATION with the region of your repository. The command updates your Docker configuration so that you can push images to Artifact Registry in your Google Cloud project.

Build the Docker image

Cloud Build allows you to build a Docker image using a Dockerfile and save it into Artifact Registry, where the image is accessible to other Google Cloud products.

Build the container image by using the build.yaml config file.

gcloud builds submit --config build.yaml
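
If you want to build without Cloud Build, an equivalent local Docker build and push looks like the following. This is a minimal sketch: the tensorflow-landsat image name and latest tag are illustrative assumptions, and the image URI referenced by run.yaml is the source of truth.

docker build -t LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/tensorflow-landsat:latest .
docker push LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/tensorflow-landsat:latest

Replace LOCATION, PROJECT_ID, and REPOSITORY with the values that you used when you configured Artifact Registry.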

Run the Dataflow job with GPUs

Launch the Dataflow pipeline with GPUs by submitting the run.yaml config file to Cloud Build:

export PROJECT=PROJECT_NAME
export BUCKET=BUCKET_NAME

export JOB_NAME="satellite-images-$(date +%Y%m%d-%H%M%S)"
export OUTPUT_PATH="gs://$BUCKET/samples/dataflow/landsat/output-images/"
export REGION="us-central1"
export GPU_TYPE="nvidia-tesla-t4"

gcloud builds submit \
    --config run.yaml \
    --substitutions _JOB_NAME=$JOB_NAME,_OUTPUT_PATH=$OUTPUT_PATH,_REGION=$REGION,_GPU_TYPE=$GPU_TYPE \
    --no-source

Replace the following:

  • PROJECT_NAME: the Google Cloud project name
  • BUCKET_NAME: the Cloud Storage bucket name (without the gs:// prefix)

After you run this pipeline, wait for the command to finish. If you exit your shell, you might lose the environment variables that you've set.

To avoid sharing the GPU between multiple worker processes, this sample uses a machine type with 1 vCPU. To meet the pipeline's memory requirements, the machine type uses 13 GB of extended memory. For more information, read GPUs and worker parallelism.
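
The run.yaml config is what actually launches the pipeline, and the exact flags live in that file. As a sketch of what such a GPU launch typically looks like, the following command uses standard Dataflow pipeline options; the flag values are illustrative assumptions rather than a copy of run.yaml, and the sample's own output flag is omitted:

# Illustrative only; the authoritative launch command is defined in run.yaml.
python main.py \
    --runner=DataflowRunner \
    --project=$PROJECT \
    --region=$REGION \
    --temp_location="gs://$BUCKET/tmp/" \
    --job_name="$JOB_NAME" \
    --sdk_container_image=IMAGE_URI \
    --worker_machine_type=custom-1-13312-ext \
    --dataflow_service_options="worker_accelerator=type:$GPU_TYPE;count:1;install-nvidia-driver"

Here custom-1-13312-ext is a custom machine type with 1 vCPU and 13 GB of extended memory, and the worker_accelerator service option attaches one GPU of the requested type to each worker and installs the NVIDIA driver.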

View your results

The pipeline in tensorflow-landsat/main.py processes Landsat 8 satellite images and renders them as JPEG files. Use the following steps to view these files.

  1. List the output JPEG files with details by using the Google Cloud CLI.

    gcloud storage ls "gs://$BUCKET/samples/dataflow/landsat/" --long --readable-sizes
    
  2. Copy the files into your local directory.

    mkdir outputs
    gcloud storage cp "gs://$BUCKET/samples/dataflow/landsat/*" outputs/
    
  3. Open these image files with the image viewer of your choice.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Google Cloud console, go to the Manage resources page.

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.
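
Alternatively, you can delete the project with the gcloud CLI:

gcloud projects delete PROJECT_ID

Replace PROJECT_ID with your project ID.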

What's next