Cloud Data Loss Prevention (DLP) V2 API - Class Google::Cloud::Dlp::V2::DataProfileAction::Export (v1.8.0)

Reference documentation and code samples for the Cloud Data Loss Prevention (DLP) V2 API class Google::Cloud::Dlp::V2::DataProfileAction::Export.

If set, the detailed data profiles will be persisted to the location of your choice whenever they are updated.

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#profile_table

def profile_table() -> ::Google::Cloud::Dlp::V2::BigQueryTable
Returns
  • (::Google::Cloud::Dlp::V2::BigQueryTable) —

    Store all profiles to BigQuery.

    • The system will create a new dataset and table for you if none are provided. The dataset will be named sensitive_data_protection_discovery and the table will be named discovery_profiles. The table will be placed in the same project as the container project running the scan. After the first profile is generated and the dataset and table are created, the discovery scan configuration will be updated with the dataset and table names.
    • See Analyze data profiles stored in BigQuery.
    • See Sample queries for your BigQuery table.
    • Data is inserted using streaming inserts, so it may remain in the buffer for a period of time after the profile has finished.
      • The Pub/Sub notification is sent before the streaming buffer is guaranteed to be written, so data may not be instantly visible to queries by the time your topic receives the Pub/Sub notification.
      • The best practice is to use the same table for an entire organization so that you can take advantage of the provided Looker reports. If you use VPC Service Controls to define security perimeters, you must use a separate table for each boundary.
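For illustration, the sketch below builds an Export action that writes profiles to an explicitly specified BigQuery table. The project, dataset, and table IDs are placeholders, not values from this reference; omitting profile_table entirely lets the system create the default dataset and table described above.

  require "google/cloud/dlp/v2"

  # Placeholder IDs; substitute your own project, dataset, and table.
  table = Google::Cloud::Dlp::V2::BigQueryTable.new(
    project_id: "my-project",
    dataset_id: "sensitive_data_protection_discovery",
    table_id:   "discovery_profiles"
  )

  # Attach the destination table to the export action at construction time.
  export = Google::Cloud::Dlp::V2::DataProfileAction::Export.new(
    profile_table: table
  )

  export.profile_table.dataset_id # => "sensitive_data_protection_discovery"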

#profile_table=

def profile_table=(value) -> ::Google::Cloud::Dlp::V2::BigQueryTable
Parameter
  • value (::Google::Cloud::Dlp::V2::BigQueryTable) —

    Store all profiles to BigQuery.

    • The system will create a new dataset and table for you if none are provided. The dataset will be named sensitive_data_protection_discovery and the table will be named discovery_profiles. The table will be placed in the same project as the container project running the scan. After the first profile is generated and the dataset and table are created, the discovery scan configuration will be updated with the dataset and table names.
    • See Analyze data profiles stored in BigQuery.
    • See Sample queries for your BigQuery table.
    • Data is inserted using streaming inserts, so it may remain in the buffer for a period of time after the profile has finished.
      • The Pub/Sub notification is sent before the streaming buffer is guaranteed to be written, so data may not be instantly visible to queries by the time your topic receives the Pub/Sub notification.
      • The best practice is to use the same table for an entire organization so that you can take advantage of the provided Looker reports. If you use VPC Service Controls to define security perimeters, you must use a separate table for each boundary.
Returns
  • (::Google::Cloud::Dlp::V2::BigQueryTable) —

    Store all profiles to BigQuery.

    • The system will create a new dataset and table for you if none are provided. The dataset will be named sensitive_data_protection_discovery and the table will be named discovery_profiles. The table will be placed in the same project as the container project running the scan. After the first profile is generated and the dataset and table are created, the discovery scan configuration will be updated with the dataset and table names.
    • See Analyze data profiles stored in BigQuery.
    • See Sample queries for your BigQuery table.
    • Data is inserted using streaming inserts, so it may remain in the buffer for a period of time after the profile has finished.
      • The Pub/Sub notification is sent before the streaming buffer is guaranteed to be written, so data may not be instantly visible to queries by the time your topic receives the Pub/Sub notification.
      • The best practice is to use the same table for an entire organization so that you can take advantage of the provided Looker reports. If you use VPC Service Controls to define security perimeters, you must use a separate table for each boundary.
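A minimal sketch of the setter, again with placeholder project, dataset, and table IDs; assigning a new BigQueryTable replaces any previously configured destination.

  require "google/cloud/dlp/v2"

  export = Google::Cloud::Dlp::V2::DataProfileAction::Export.new

  # Point the action at a placeholder organization-wide profile table.
  export.profile_table = Google::Cloud::Dlp::V2::BigQueryTable.new(
    project_id: "my-project",
    dataset_id: "sensitive_data_protection_discovery",
    table_id:   "discovery_profiles"
  )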