GCP Cloud Account Onboarding Prerequisites

Discovery onboarding prerequisites for GCP
This section outlines the prerequisites for onboarding a GCP cloud account in Strata Cloud Manager (SCM).
On this page, you'll:
  • Enable VPC Flow Logs
  • Enable security data access audit logs for AI Models
  • Create a Cloud Storage Bucket
  • Set up a Log Router to direct log entries
  • Create a sink and sink destinations
  • Grant the required IAM permissions to the user
Where Can I Use This?
  • AI Runtime Security in GCP
What Do I Need?

Enable the VPC Flow Logs

  1. Go to the Google Cloud Console and select the project you want to onboard for discovery.
  2. Navigate to VPC Networks.
  3. Select the VPC with the workloads (VMs/Containers) to protect.
    SCM will discover only the running VM workloads and containers in the VPC.
  4. Click the SUBNETS tab and select all the subnets where your workloads are present.
  5. Click the FLOW LOGS drop-down.
  6. Select Configure.
  7. In Configure VPC Flow Logs, set the Aggregation Interval to 5 sec, enable the Metadata annotations, and set the Sample rate to 100%.
  8. Click SAVE.
  9. To view the logs, click FLOW LOGS and select View flow logs of selected subnets.
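
If you manage many subnets, you can apply the same settings from the command line. Below is a minimal gcloud sketch, assuming placeholder values <SUBNET_1> and <REGION> for your subnet name and its region; repeat the command for each subnet:

  # Enable flow logs on one subnet with a 5-second aggregation interval,
  # metadata annotations, and a 100% sample rate.
  # <SUBNET_1> and <REGION> are placeholders for your environment.
  gcloud compute networks subnets update <SUBNET_1> \
      --region=<REGION> \
      --enable-flow-logs \
      --logging-aggregation-interval=interval-5-sec \
      --logging-flow-sampling=1.0 \
      --logging-metadata=include-all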

Enable Data Access Audit Logs

Before you create a Cloud Storage bucket, enable the Data Access audit logs in IAM for the project where the AI models are present; these logs capture unprotected AI model traffic.
  1. Go to the Google Cloud Console and select your project.
  2. In the search bar at the top, type Audit Logs and select it.
  3. Search for and click Vertex AI API in the list of available audit logs.
  4. Enable the Data Read log under PERMISSION TYPE.
  5. Click SAVE.
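
You can also enable this log with gcloud by editing the project's IAM policy, which carries the audit log configuration. A sketch, where <GCP_PROJECT_ID> is a placeholder for your project ID:

  # Export the current IAM policy, add an audit config that enables
  # DATA_READ logs for Vertex AI, then write the policy back.
  gcloud projects get-iam-policy <GCP_PROJECT_ID> --format=yaml > policy.yaml
  # Add (or extend) the following auditConfigs section in policy.yaml:
  #   auditConfigs:
  #   - service: aiplatform.googleapis.com
  #     auditLogConfigs:
  #     - logType: DATA_READ
  gcloud projects set-iam-policy <GCP_PROJECT_ID> policy.yaml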

Create a Cloud Storage Bucket

Create a cloud storage bucket to securely store the VPC flow logs and audit logs. The bucket acts as a central repository for the data collected from your GCP environment and is used for traffic analysis. A gcloud sketch of the full procedure follows the steps below.
  1. Go to Cloud Storage and click CREATE:
    1. Enter a globally unique name for the bucket and click CONTINUE.
    2. Choose Multi-region for high availability and click CONTINUE.
      The Multi-region selection incurs higher costs than other options.
    3. Choose the Standard option for the storage class and click CONTINUE.
    4. For access control, select the Uniform configuration and click CONTINUE.
      Making this bucket publicly accessible is optional.
    5. Use the default settings for data protection.
    6. Click CREATE.
  2. In the Google Cloud Console, search for Log Router:
    1. Select Create sink.
    2. Enter a Sink name and, optionally, a Sink description. Click Next.
    3. In Sink destination, choose Cloud Storage bucket as the sink service and specify the Cloud Storage bucket name.
    4. In the next section, provide a filter that matches all of the following:
      1. VPC flow logs generated by the workloads.
      2. Audit logs for GCP Vertex AI model API calls.
      Below is a recommended filter:
      (logName =~ "logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.methodName:("google.cloud.aiplatform.")) OR ((logName="projects/<GCP_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows") AND (resource.labels.subnetwork_name="<SUBNET_1>" OR resource.labels.subnetwork_name="<SUBNET_2>"))
      • <GCP_PROJECT_ID>: Replace it with your GCP project ID.
      • <SUBNET_1>, <SUBNET_2>: Replace these with your subnet names.
      Consider using regular expressions if you have a large number of subnets to protect.
    5. Click Preview logs and run the query to verify the filter settings and ensure the logs are correctly routed.
    6. Click Create sink.
      Logs may take up to one hour to appear in the bucket, so cloud asset discovery in SCM may be delayed.
  3. (Optional) If the GCP AI models accessed by your workloads are in a different GCP project, forward those logs to your bucket from that other project.
    1. In the other GCP project, repeat the log router setup, using the same bucket as the destination and the following filter:
      (logName =~ "logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.methodName:("google.cloud.aiplatform."))
    2. Click the three dots (...) and select View sink details.
    3. Copy the sink writer identity email from the sink details.
    4. Navigate to the bucket you created and select the PERMISSIONS tab.
    5. Click GRANT ACCESS.
    6. In New principals, enter the writer identity email ID.
    7. Assign the Storage Object Creator role.
    8. Click Save.
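
If you prefer the command line, the following sketch mirrors the bucket creation, sink creation, and cross-project grant above with gcloud. The bucket name, sink name, project IDs, and writer identity email are placeholders; the filter is the recommended one from step 2:

  # Create a multi-region (US) Standard bucket with uniform
  # bucket-level access. <BUCKET_NAME> is a placeholder.
  gcloud storage buckets create gs://<BUCKET_NAME> \
      --location=us \
      --default-storage-class=STANDARD \
      --uniform-bucket-level-access

  # Create a project-level sink that routes matching logs to the bucket.
  gcloud logging sinks create <SINK_NAME> \
      storage.googleapis.com/<BUCKET_NAME> \
      --log-filter='(logName =~ "logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.methodName:("google.cloud.aiplatform.")) OR ((logName="projects/<GCP_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows") AND (resource.labels.subnetwork_name="<SUBNET_1>" OR resource.labels.subnetwork_name="<SUBNET_2>"))'

  # For the optional cross-project case: look up the writer identity of
  # the sink in the other project, then grant it object-creation rights
  # on your bucket.
  gcloud logging sinks describe <SINK_NAME> --project=<OTHER_PROJECT_ID>
  gcloud storage buckets add-iam-policy-binding gs://<BUCKET_NAME> \
      --member="serviceAccount:<WRITER_IDENTITY_EMAIL>" \
      --role="roles/storage.objectCreator"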

IAM Permissions

  1. Assign the following permissions to the user deploying Terraform in the cloud environment:
    cloudasset.assets.listResource
    cloudasset.assets.listAccessPolicy
    cloudasset.feeds.get
    cloudasset.feeds.list
    compute.machineTypes.list
    compute.networks.list
    compute.subnetworks.list
    container.clusters.list
    pubsub.subscriptions.consume
    pubsub.topics.attachSubscription
    storage.buckets.list
    aiplatform.models.list
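
One way to grant these is through a custom role. A gcloud sketch, assuming a hypothetical role ID scmDiscovery and placeholder project and user values:

  # Bundle the permissions into a custom role and bind it to the user.
  # scmDiscovery, <GCP_PROJECT_ID>, and <USER_EMAIL> are placeholders.
  gcloud iam roles create scmDiscovery \
      --project=<GCP_PROJECT_ID> \
      --title="SCM Discovery Onboarding" \
      --permissions=cloudasset.assets.listResource,cloudasset.assets.listAccessPolicy,cloudasset.feeds.get,cloudasset.feeds.list,compute.machineTypes.list,compute.networks.list,compute.subnetworks.list,container.clusters.list,pubsub.subscriptions.consume,pubsub.topics.attachSubscription,storage.buckets.list,aiplatform.models.list
  gcloud projects add-iam-policy-binding <GCP_PROJECT_ID> \
      --member=user:<USER_EMAIL> \
      --role=projects/<GCP_PROJECT_ID>/roles/scmDiscovery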
