Enable Flow Logs for GCP Project
With VPC flow logs, Prisma Cloud helps you visualize flow information for resources deployed in your GCP projects. VPC flow logs on GCP provide flow-level network information for packets going to and from network interfaces that are part of a VPC, including a record of packets flowing between source and destination ports and the number of distinct peers connecting to an endpoint IP address and port, so that you can monitor your applications from the perspective of your network. On the Investigate page, you can view the traffic flow between virtual machines in different service projects and/or host projects that use a shared VPC network and firewall rules. VPC flow logs are supported on VPC networks only, and are not available for legacy networks on GCP.
To analyze these logs on Prisma Cloud you must enable VPC flow logs for each VPC subnet and export the logs to a sink that holds a copy of each log entry. Prisma Cloud requires you to export the flow logs to a single Cloud Storage bucket, which functions as the sink destination that holds all VPC flow logs in your environment. When you then configure Prisma Cloud to ingest these logs, the service can analyze this data and provide visibility into your network traffic and detect potential network threats such as crypto mining, data exfiltration, and host compromises.
Prisma Cloud automates VPC flow log compression using the Google Cloud Dataflow service and saves the compressed logs to your Storage bucket for ingestion. Consider enabling the Google Cloud Dataflow service and log compression, because transferring raw GCP flow logs from your storage bucket to Prisma Cloud can add to your data cost. See Flow Logs Compression on GCP to make sure that you have the permissions to create and run pipelines for a Cloud Dataflow job.
Enabling flow logs will incur high network egress costs. Prisma Cloud strongly recommends that you enable Flow Logs Compression on GCP to significantly reduce the network egress costs associated with sending uncompressed GCP logs to the Prisma Cloud infrastructure.
- Enable flow logs for your VPC networks on GCP. To analyze your network traffic, you must enable flow logs for each project you want Prisma Cloud to monitor.
- Log in to GCP console and select your project.
- Select Navigation menu > VPC network > VPC networks.
- Select a VPC network and click EDIT.
- Set Flow logs to On to enable flow logs.
- Select Include metadata. This setting ensures that the log entries include metadata that is required to analyze traffic.
- Set the Aggregation Interval to 15 min.
- Set the Sample rate to 100%. Setting the aggregation interval and sample rate as recommended above generates alerts faster on Prisma Cloud and reduces the network costs you incur.
- Save your changes.
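If you prefer the CLI, the console steps above can be approximated with a single gcloud command. This is a sketch: the subnet name and region below are placeholder values you must replace with your own.

```shell
# Enable VPC flow logs on one subnet with the recommended settings:
# 15-minute aggregation interval, 100% sampling, and all metadata included.
# "my-subnet" and "us-central1" are placeholders for illustration.
gcloud compute networks subnets update my-subnet \
    --region=us-central1 \
    --enable-flow-logs \
    --logging-aggregation-interval=interval-15-min \
    --logging-flow-sampling=1.0 \
    --logging-metadata=include-all
```

Repeat the command for each subnet in each project that you want Prisma Cloud to monitor.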
- (Required, if you are not using the Terraform template for adding your cloud account) Add additional permissions to the bucket that is collecting VPC flow logs. You must grant the Prisma Cloud service principal permissions to list objects in the storage bucket, and to read object data and metadata stored within the bucket. The permissions required are storage.objects.list and storage.objects.get. The Terraform template that Prisma Cloud provides for onboarding includes these permissions in the role named Prisma Cloud Flow Logs Viewer, and this role is assigned to the service account on the bucket name that you provide in the Flow Log Storage Bucket field in the onboarding flow. If you want to manually add these permissions, refer to the Google Cloud Storage documentation for instructions: https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-add.
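One way to grant both permissions manually is to bind the predefined roles/storage.objectViewer role, which includes storage.objects.get and storage.objects.list, on the bucket. The service account email and bucket name below are placeholders; substitute the Prisma Cloud service account and bucket from your onboarding flow.

```shell
# Grant the Prisma Cloud service account read access to the flow-log bucket.
# roles/storage.objectViewer includes storage.objects.get and storage.objects.list.
# The service-account email and bucket name are placeholder values.
gsutil iam ch \
    serviceAccount:prisma-cloud@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
    gs://my-flow-log-bucket
```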
- Create a sink to export flow logs. You must create a sink and specify a Cloud Storage bucket as the export destination for VPC flow logs. Configure a sink for every project that you want Prisma Cloud to monitor, and configure a single Cloud Storage bucket as the sink destination for all projects. When you Onboard Your GCP Project, you must provide the Cloud Storage bucket from which the service can ingest VPC flow logs. As a cost reduction best practice, set a lifecycle rule to delete logs from your Cloud Storage bucket.
- While authenticated into GCP, switch to the new Logs Explorer by selecting Navigation menu > Logging > Legacy Logs Viewer, and then Upgrade > Upgrade to the new Logs Explorer.
- Create a sink. Select Logging > Logs Router > Create Sink.
- Enter the sink details.
- Sink name—An identifier for the sink.
- (Optional) Sink description—The use case for the sink.
- ClickNext.
- Enter the sink destination. Select the service type and the destination for logs to route to.
- Select sink service—Select the service that you want the logs to route to. To create a sink for flow logs, select Cloud Storage Bucket.
- Sink destination—Click Browse and select the bucket you want to use.
- ClickNext.
- Choose logs to include in the sink. Create an inclusion filter to determine the logs you want to include in the routing sink. In Build inclusion filter, do the following:
- Enter a filter expression—Add a filter that matches the log entries you want to include. For example, a filter that routes the VPC flow logs for a specific subnetwork to the bucket looks like the following: logName:("projects/text-project-351281/logs/compute.googleapis.com%2Fvpc_flows") AND resource.labels.subnetwork_id:(5271207857644187590)
- Verify your filter—To verify that you entered the correct filter, select Preview logs; the Logs Explorer opens in a new tab with the filter pre-populated.
- ClickNext.
- ClickCreate Sink.
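The same sink can be created from the CLI. This is a sketch under placeholder names: the sink name, bucket, and project ID below are illustrative, and the filter mirrors the VPC flow log example shown earlier.

```shell
# Create a log sink that routes VPC flow logs to a Cloud Storage bucket.
# "prisma-flow-log-sink", "my-flow-log-bucket", and "my-project" are placeholders.
gcloud logging sinks create prisma-flow-log-sink \
    storage.googleapis.com/my-flow-log-bucket \
    --log-filter='logName:("projects/my-project/logs/compute.googleapis.com%2Fvpc_flows")'
```

The command prints the sink's writer identity (a service account); grant that identity roles/storage.objectCreator on the destination bucket so the sink can write log entries to it.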
- Add a lifecycle rule to limit the number of days you store flow logs on the Cloud Storage bucket. By default, logs are never deleted. To manage cost, specify the threshold (in number of days) for which you want to store logs.
- Select Navigation menu > Cloud Storage > Browser.
- Select theLifecyclelink for the storage bucket you want to modify.
- Click Add rule, set the object condition Age to 30 days, and select Delete as the action. Logs stored in your Cloud Storage bucket are deleted after 30 days.
- Select Continue and Save your changes.
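The same 30-day deletion rule can be applied as a lifecycle configuration file with gsutil. The bucket name below is a placeholder.

```shell
# Write a lifecycle configuration that deletes objects older than 30 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 30}
    }
  ]
}
EOF

# Apply the rule to the flow-log bucket ("my-flow-log-bucket" is a placeholder).
gsutil lifecycle set lifecycle.json gs://my-flow-log-bucket
```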
- Add the name of the Cloud Storage bucket you referenced above in Flow Logs Storage Bucket when you Onboard Your GCP Project.
- (Optional) Verify that your cloud storage bucket is being ingested. You can review the status and take necessary actions to resolve any issues encountered during the onboarding process on the Cloud Accounts page. To verify that the flow log data from your cloud storage buckets has been analyzed, you can run a network query on the Investigate page.
- Authenticate into Prisma Cloud and verify that your storage bucket is being ingested. Select Settings > Cloud Accounts and filter for GCP cloud accounts. Click the Edit icon under the Actions column to view the results.
- Navigate to Investigate, replace {{cloud account name}} with your GCP cloud account name, and enter the following network query: network from vpc.flow_record where cloud.account = '{{cloud account name}}' AND source.publicnetwork IN ('Internet IPs', 'Suspicious IPs') AND bytes > 0. This query lists all network traffic from the Internet or from Suspicious IP addresses with more than 0 bytes of data transferred to a network interface on any resource in any cloud environment.