GCP Cloud Account Onboarding Prerequisites

This section outlines the prerequisites for onboarding a GCP cloud account in Strata Cloud Manager.
Where Can I Use This?
  • Prisma AIRS AI Runtime Security in GCP
On this page, you will enable VPC Flow Logs, enable Data Access audit logs, create a Cloud Storage bucket, assign IAM permissions, and create a GCP service identity.

Enable the VPC Flow Logs

Enable VPC flow logs to capture information about network traffic sent and received by your VM instances in GCP. This data is essential for Prisma AIRS AI Runtime: Network intercept to monitor network behavior, discover cloud assets, and detect potential threats.
  1. Go to Google Cloud Console and select the project you want to onboard for discovery.
  2. Navigate to VPC Networks.
  3. Select the VPC with the workloads (VMs/Containers) to protect.
    Strata Cloud Manager will discover only the running VM workloads and containers in the VPC.
  4. Click the SUBNETS tab and select all the subnets where your workloads are present.
  5. Click FLOW LOGS.
  6. Select Configure.
  7. In Configure VPC Flow Logs, set the Aggregation Interval to 5 Sec, enable the Metadata annotations, and use a Sample rate of 100%. (The same settings are sketched as a gcloud command after this list.)
  8. Click SAVE.
  9. To view the logs, click FLOW LOGS and select View flow logs of selected subnets.
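If you prefer to script these settings, the following gcloud sketch applies the same configuration (5-second aggregation, metadata annotations, 100% sampling) to one subnet; <SUBNET_NAME> and <REGION> are placeholders for your environment:

  # Enable VPC Flow Logs on a subnet with the settings from step 7.
  gcloud compute networks subnets update <SUBNET_NAME> \
      --region=<REGION> \
      --enable-flow-logs \
      --logging-aggregation-interval=interval-5-sec \
      --logging-flow-sampling=1.0 \
      --logging-metadata=include-all

Repeat the command for each subnet you selected in step 4.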

Enable Data Access Audit Logs

Before creating the Cloud Storage bucket, you must enable Data Access audit logs. In the IAM settings for the project where your AI models are deployed, turn on Data Access logging for the Vertex AI API so that unprotected AI model traffic is recorded. This gives you visibility into how the AI models are being accessed, which is critical for audit and security monitoring. Follow the console steps below, or use the CLI sketch after them.
  1. Go to the Google Cloud Console and select your project.
  2. In the search bar at the top, type Audit Logs and select it.
  3. Search for and click Vertex AI API from the list of available audit logs.
  4. Enable the Data Read log under PERMISSION TYPE.
  5. Click SAVE.
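The same configuration can be applied from the CLI by editing the auditConfigs section of the project IAM policy; this is a minimal sketch, assuming you have permission to set the project IAM policy:

  # Export the current IAM policy.
  gcloud projects get-iam-policy <GCP_PROJECT_ID> --format=yaml > policy.yaml

  # Merge the following into the auditConfigs section of policy.yaml to
  # enable Data Read logs for the Vertex AI API:
  #   auditConfigs:
  #   - service: aiplatform.googleapis.com
  #     auditLogConfigs:
  #     - logType: DATA_READ

  # Re-apply the edited policy.
  gcloud projects set-iam-policy <GCP_PROJECT_ID> policy.yaml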

Create a Cloud Storage Bucket

Use a Cloud Storage bucket as a secure, centralized location to store VPC Flow Logs and audit logs. This repository supports traffic analysis and enables consistent monitoring across your GCP environment. The console steps below can also be scripted; a gcloud sketch follows the list.
  1. Go to Cloud Storage and click CREATE:
    1. Enter a globally unique name for the bucket and click CONTINUE.
    2. Choose Multi-region for high availability and click CONTINUE.
      The Multi-region selection will incur higher costs than other options.
    3. Choose the Standard option for the storage class and click CONTINUE.
    4. For access control, select the Uniform configuration and click CONTINUE.
      You don't need to make this bucket publicly accessible.
    5. Use default settings for data protection.
    6. Click CREATE.
  2. In the Google Cloud Console, search for Log Router and open it:
    1. Select Create sink.
    2. Enter a Sink name and optionally enter a Sink description.
    3. Click Next.
    4. In the Sink destination, choose Cloud Storage Bucket for the sink service.
    5. Enter the Cloud Storage bucket name.
    6. In the next section, provide a filter that matches all the:
      1. VPC flow logs generated by the workloads.
      2. Audit logs for GCP Vertex-AI models API calls.
      Below is a recommended filter:
      (logName =~ "logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.methodName:("google.cloud.aiplatform.")) OR
      ((logName="projects/<GCP_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows") AND
      (resource.labels.subnetwork_name="<SUBNET_1>" OR resource.labels.subnetwork_name="<SUBNET_2>"))
      • <GCP_PROJECT_ID>: Replace with your GCP project ID.
      • <SUBNET_1>, <SUBNET_2>: Replace with the names of your subnets.
      Consider using a regular expression if you have a large number of subnets to protect.
    7. Click Preview logs and run the query to verify the filter settings and ensure the logs are correctly routed.
    8. Click Create sink.
      Logs can take up to an hour to populate in the bucket, which may result in a lag in asset discovery and log correlation in Strata Cloud Manager during initial onboarding.
  3. (Optional) If the GCP AI models accessed by your workloads are in a different GCP project, forward those logs to your bucket from that other project.
    1. In the other GCP project, repeat the log router setup using the same bucket and filter:
      (logName =~ "logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.methodName:("google.cloud.aiplatform."))
    2. Click the three dots (...) next to the sink and select View sink details.
    3. Copy the sink Writer identity email from the sink details.
    4. Navigate to the bucket you created and select the PERMISSIONS tab.
    5. Click GRANT ACCESS.
    6. In New principals, enter the Writer identity email you copied from the sink details.
    7. Assign the Storage Object Creator role.
    8. Click Save.
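For reference, steps 1 and 2 above, plus the permission grant from step 3, can be scripted with gcloud. This is a sketch, assuming a multi-region US bucket and the recommended filter saved in a local file named filter.txt (all bracketed names are placeholders):

  # Create the bucket: multi-region "us", Standard class, uniform access.
  gcloud storage buckets create gs://<BUCKET_NAME> \
      --location=us \
      --default-storage-class=STANDARD \
      --uniform-bucket-level-access

  # Create the log sink that routes matching logs to the bucket.
  gcloud logging sinks create <SINK_NAME> \
      storage.googleapis.com/<BUCKET_NAME> \
      --log-filter="$(cat filter.txt)"

  # Grant the sink's writer identity the Storage Object Creator role on the
  # bucket (required when gcloud creates the sink, and for cross-project
  # sinks as in step 3).
  WRITER=$(gcloud logging sinks describe <SINK_NAME> --format='value(writerIdentity)')
  gcloud storage buckets add-iam-policy-binding gs://<BUCKET_NAME> \
      --member="$WRITER" \
      --role=roles/storage.objectCreator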

IAM Permissions

  1. Assign the following permissions to the user deploying Terraform in the cloud environment:
    cloudasset.assets.listResource
    cloudasset.assets.listAccessPolicy
    cloudasset.feeds.get
    cloudasset.feeds.list
    compute.machineTypes.list
    compute.networks.list
    compute.subnetworks.list
    container.clusters.list
    pubsub.subscriptions.consume
    pubsub.topics.attachSubscription
    storage.buckets.list
    aiplatform.models.list
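One way to grant exactly this set from the CLI is a project-level custom role; a sketch, assuming the role ID prismaAirsOnboarding and a user principal (both hypothetical):

  # Create a custom role containing the permissions listed above.
  gcloud iam roles create prismaAirsOnboarding \
      --project=<GCP_PROJECT_ID> \
      --title="Prisma AIRS Onboarding" \
      --permissions=cloudasset.assets.listResource,cloudasset.assets.listAccessPolicy,cloudasset.feeds.get,cloudasset.feeds.list,compute.machineTypes.list,compute.networks.list,compute.subnetworks.list,container.clusters.list,pubsub.subscriptions.consume,pubsub.topics.attachSubscription,storage.buckets.list,aiplatform.models.list

  # Bind the role to the user who runs Terraform.
  gcloud projects add-iam-policy-binding <GCP_PROJECT_ID> \
      --member="user:<TERRAFORM_USER_EMAIL>" \
      --role="projects/<GCP_PROJECT_ID>/roles/prismaAirsOnboarding"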

Create a GCP Service Identity

  1. Execute the following command in the gcloud CLI to create the necessary service identity for your project. This step is required to successfully deploy the Prisma AIRS AI Runtime: Network intercept Terraform template.
    gcloud beta services identity create --service=cloudasset.googleapis.com --project=<your_gcp_project_id>
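    On success, gcloud reports the Cloud Asset service agent for the project; the email follows the standard service-agent pattern (project number shown as a placeholder):

    # Example output:
    # Service identity created: service-<PROJECT_NUMBER>@gcp-sa-cloudasset.iam.gserviceaccount.com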