Enable Flow Logs for GCP Project

Enable flow logs for your GCP project.
With VPC flow logs, Prisma Cloud helps you visualize flow information for resources deployed in your GCP projects. VPC flow logs on GCP provide flow-level network information for packets going to and from the network interfaces that are part of a VPC, including a record of the source and destination ports and the number of distinct peers connecting to an endpoint IP address and port, so that you can monitor your applications from the perspective of your network. On the Investigate page, you can view the traffic flow between virtual machines in different service projects and/or host projects that use a shared VPC network and firewall rules.
VPC flow logs are supported on VPC networks only, and are not available for legacy networks on GCP.
To analyze these logs on Prisma Cloud you must enable VPC flow logs for each VPC subnet and export the logs to a sink that holds a copy of each log entry. Prisma Cloud requires you to export the flow logs to a single Cloud Storage bucket, which functions as the sink destination that holds all VPC flow logs in your environment. When you then configure Prisma Cloud to ingest these logs, the service can analyze this data and provide visibility into your network traffic and detect potential network threats such as crypto mining, data exfiltration, and host compromises.
Prisma Cloud automates VPC flow log compression using the Google Cloud Dataflow service and saves the compressed logs to your storage bucket for ingestion. Consider enabling log compression through the Google Cloud Dataflow service because transferring raw GCP flow logs from your storage bucket to Prisma Cloud adds to your data costs. See Flow Logs Compression on GCP to make sure that you have the permissions to create and run pipelines for a Cloud Dataflow job.
Enabling flow logs will incur high network egress costs. Prisma Cloud strongly recommends that you enable Flow Logs Compression on GCP to significantly reduce the network egress costs associated with sending uncompressed GCP logs to the Prisma Cloud infrastructure.
  1. Enable flow logs for your VPC networks on GCP.
    To analyze your network traffic, you must enable flow logs for each project you want Prisma Cloud to monitor.
    1. Log in to the GCP console and select your project.
    2. Select Navigation menu > VPC network > VPC networks.
    3. Select the VPC network and click EDIT.
    4. Set Flow logs to On to enable flow logs.
    5. Select Include metadata.
      This setting ensures that the log entries include the metadata required to analyze traffic.
    6. Set the Aggregation Interval to 15 min.
    7. Set the Sample rate to 100%.
      Setting the aggregation interval and the sample rate as recommended above generates alerts faster on Prisma Cloud and reduces the network costs you incur.
    8. Save your changes.
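    Alternatively, you can enable flow logs from the command line; flow logs are configured per subnet. A minimal gcloud sketch with the recommended settings; the subnet name, region, and project ID are placeholders for your own values.
      # Enable flow logs on a subnet with all metadata, 15-minute
      # aggregation, and 100% sampling.
      gcloud compute networks subnets update my-subnet \
        --project=my-project-id \
        --region=us-central1 \
        --enable-flow-logs \
        --logging-metadata=include-all \
        --logging-aggregation-interval=interval-15-min \
        --logging-flow-sampling=1.0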
  2. (Required if you are not using the Terraform template to add your cloud account) Add additional permissions to the bucket that is collecting VPC flow logs.
    You must grant the Prisma Cloud service principal permissions to list objects in the storage bucket and to read object data and metadata stored within the bucket. The required permissions are storage.objects.list and storage.objects.get. The Terraform template that Prisma Cloud provides for onboarding includes these permissions in the role named Prisma Cloud Flow Logs Viewer, and this role is assigned to the service account on the bucket name that you provide in Flow Log Storage Bucket in the onboarding flow. If you want to add these permissions manually, refer to the Google Cloud Storage documentation for instructions (https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-add) or see the command-line sketch below.
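    One way to grant both permissions manually is to bind the predefined roles/storage.objectViewer role, which contains storage.objects.get and storage.objects.list, to the Prisma Cloud service account on the bucket. A minimal gcloud sketch; the bucket name and service account email are placeholders for your own values.
      # Grant read and list access on the flow-log bucket to the service account.
      gcloud storage buckets add-iam-policy-binding gs://my-flow-log-bucket \
        --member="serviceAccount:prisma-cloud-sa@my-project-id.iam.gserviceaccount.com" \
        --role="roles/storage.objectViewer"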
  3. Create a Sink to export flow logs.
    You must create a sink and specify a Cloud Storage bucket as the export destination for VPC flow logs. Configure a sink for every project that you want Prisma Cloud to monitor, and configure a single Cloud Storage bucket as the sink destination for all projects. When you Onboard Your GCP Project, you must provide the Cloud Storage bucket from which the service can ingest VPC flow logs. As a cost-reduction best practice, set a lifecycle rule to delete logs from your Cloud Storage bucket.
    1. While authenticated into GCP, switch to the new Logs Explorer by selecting Navigation menu > Logging > Legacy Logs Viewer > Upgrade > Upgrade to the new Logs Explorer.
    2. Create a sink.
      Select Logging > Logs Router > Create Sink.
    3. Enter the sink details.
      • Sink name—An identifier for the sink.
      • (Optional) Sink description—The use case for the sink.
      • Click Next.
    4. Enter the sink destination.
      Select the service type and the destination for logs to route to.
      • Select sink service—Select the service that you want the logs to route to. To create a sink for flow logs, select Cloud Storage Bucket.
      • Sink destination—Click Browse and select the bucket you want to use.
      • Click Next.
    5. Choose logs to include in the sink.
      Create an inclusion filter to determine the logs you want to include in the routing sink. In Build inclusion filter, do the following:
      • Enter a filter expression—Add a filter that matches the log entries you want to include. For example, a filter that routes all VPC flow logs for a specific subnetwork to the bucket looks like the following:
        logName:("projects/text-project-351281/logs/compute.googleapis.com%2Fvpc_flows") AND resource.labels.subnetwork_id:(5271207857644187590)
      • Verify your filter—To verify that you entered the correct filter, select Preview logs; the Logs Explorer opens in a new tab with the filter pre-populated.
      • Click Next.
      • Click Create Sink.
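      The same sink can be created from the command line. A minimal gcloud sketch; the sink name, bucket, and project ID are placeholders, and the subnetwork ID is the sample value from the filter above. gcloud prints a writer identity for the new sink, which must be granted permission to write objects to the bucket (for example, roles/storage.objectCreator).
        # Create a sink that routes VPC flow logs for one subnetwork to the bucket.
        gcloud logging sinks create my-flow-log-sink \
          storage.googleapis.com/my-flow-log-bucket \
          --project=my-project-id \
          --log-filter='logName:("projects/my-project-id/logs/compute.googleapis.com%2Fvpc_flows") AND resource.labels.subnetwork_id:(5271207857644187590)'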
    6. Add a lifecycle rule to limit the number of days you store flow logs in the Cloud Storage bucket.
      By default, logs are never deleted. To manage cost, specify the threshold (in number of days) for which you want to store logs.
      1. Select Navigation Menu > Cloud Storage > Browser.
      2. Select the Lifecycle link for the storage bucket you want to modify.
      3. Click Add rule, select object conditions to set Age to 30 days, and select Delete as the action.
        Logs stored in your Cloud Storage bucket are deleted after 30 days.
      4. Select Continue and Save your changes.
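      The same rule can also be applied from the command line. A minimal sketch, assuming a lifecycle.json file in the current directory and a placeholder bucket name:
        # lifecycle.json: delete objects older than 30 days.
        {
          "rule": [
            {"action": {"type": "Delete"}, "condition": {"age": 30}}
          ]
        }
        # Apply the rule to the bucket.
        gsutil lifecycle set lifecycle.json gs://my-flow-log-bucket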
  4. Add the name of the Cloud Storage bucket you referenced above in Flow Logs Storage Bucket when you Onboard Your GCP Project.
  5. (Optional) Verify that your Cloud Storage bucket is being ingested.
    You can review the status and take the necessary actions to resolve any issues encountered during the onboarding process on the Cloud Accounts page. To verify that the flow log data from your Cloud Storage bucket has been analyzed, run a network query on the Investigate page.
    1. Authenticate into Prisma Cloud and verify that your storage bucket is being ingested.
      Select Settings > Cloud Accounts and filter for GCP cloud accounts. Click the Edit icon under the Actions column to view the results.
    2. Navigate to Investigate and enter the following network query, replacing {{cloud account name}} with your GCP cloud account name:
      network from vpc.flow_record where cloud.account = '{{cloud account name}}' AND source.publicnetwork IN ('Internet IPs', 'Suspicious IPs') AND bytes > 0
      This query lists all network traffic from the Internet or from suspicious IP addresses with more than 0 bytes of data transferred to a network interface on any resource in your cloud environment.
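    You can also confirm on the GCP side that the sink is exporting logs by listing the bucket contents. A sketch with a placeholder bucket name; exported flow-log objects appear in dated folders named after the log.
      # Recursively list the bucket; new flow-log objects should appear
      # shortly after the sink is created.
      gsutil ls -r gs://my-flow-log-bucket/ | tail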
