Flow Log Compression on GCP
Prisma Cloud enables you to automate the compression
of flow logs using the Google Cloud Dataflow service.
This additional automation on Prisma Cloud addresses the
lack of native compression support in the flow log sink setup on GCP,
and helps reduce the egress costs associated with transferring large
volumes of logs to the Prisma Cloud infrastructure. Therefore, Prisma
Cloud recommends that you enable flow log compression.
When you enable Dataflow compression on Prisma Cloud, the Dataflow
pipeline resources are created in the same GCP project that contains
the Google Cloud Storage bucket to which your VPC flow logs are sent,
and the compressed logs are also saved to that Cloud Storage bucket.
Therefore, whether you are onboarding a new GCP Organization with
Dataflow compression enabled, or enabling Dataflow compression for an
existing GCP Organization that has already been added to Prisma Cloud,
make sure that the Dataflow-enabled project ID is the project that
contains the Google Cloud Storage bucket to which you send your VPC flow logs.
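As a quick sanity check, you can confirm that the bucket receiving your VPC flow logs is owned by the Dataflow-enabled project before you enable compression. The following is a minimal sketch using the google-cloud-storage and google-api-python-client libraries; the project ID and bucket name are placeholders, and this check is illustrative rather than part of the Prisma Cloud workflow.

```python
# Sketch: confirm the flow-log bucket belongs to the Dataflow-enabled project.
# Assumes Application Default Credentials; names below are hypothetical.
from google.cloud import storage
from googleapiclient import discovery

DATAFLOW_PROJECT_ID = "my-dataflow-project"   # hypothetical project ID
FLOW_LOG_BUCKET = "my-vpc-flow-log-bucket"    # hypothetical bucket name

# Resolve the project ID to its project number (buckets expose only the number).
crm = discovery.build("cloudresourcemanager", "v1")
project_number = crm.projects().get(
    projectId=DATAFLOW_PROJECT_ID).execute()["projectNumber"]

# Fetch the bucket metadata and compare the owning project numbers.
bucket = storage.Client(project=DATAFLOW_PROJECT_ID).get_bucket(FLOW_LOG_BUCKET)
if str(bucket.project_number) == str(project_number):
    print(f"OK: {FLOW_LOG_BUCKET} is owned by {DATAFLOW_PROJECT_ID} "
          f"(location: {bucket.location})")
else:
    print("Mismatch: enable Dataflow compression in the project that owns the bucket.")
```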
To launch the Dataflow job and to create and stage the
compressed files, the following permissions are required:
- Enable the Dataflow APIs. The API is dataflow.googleapis.com.
- Grant the service account permissions to:
  - Run and examine jobs—Dataflow Admin role
  - Create a network, subnetwork, and firewall rules within your VPC—compute.networks.create, compute.subnetworks.create, compute.firewalls.create, compute.networks.updatePolicy
To enable connectivity between the Dataflow pipeline resources and the compute instances that perform log compression within your VPC, Prisma Cloud creates a network, subnetwork, and firewall rules in your VPC. You can view the compute instances that are spun up with the RQL config where api.name='gcloud-compute-instances-list' AND json.rule = name starts with "prisma-compress". A sketch for verifying these permissions follows the note below.
For details on enabling the APIs, see Permissions and Roles for GCP Account on Prisma Cloud.
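As referenced above, one way to verify that the service account holds the listed permissions is to call the Resource Manager testIamPermissions method while authenticated as that service account. The sketch below is illustrative; the Dataflow permissions included are an assumed subset of the Dataflow Admin role, and the project ID is a placeholder.

```python
# Sketch: check that the active credentials hold the permissions listed above
# on the Dataflow-enabled project. Run it while authenticated as the service
# account you granted; this is not an official Prisma Cloud check.
from googleapiclient import discovery

DATAFLOW_PROJECT_ID = "my-dataflow-project"  # hypothetical project ID

REQUIRED_PERMISSIONS = [
    "compute.networks.create",
    "compute.subnetworks.create",
    "compute.firewalls.create",
    "compute.networks.updatePolicy",
    "dataflow.jobs.create",   # assumed subset of the Dataflow Admin role
    "dataflow.jobs.get",
]

crm = discovery.build("cloudresourcemanager", "v1")
granted = crm.projects().testIamPermissions(
    resource=DATAFLOW_PROJECT_ID,
    body={"permissions": REQUIRED_PERMISSIONS},
).execute().get("permissions", [])

missing = sorted(set(REQUIRED_PERMISSIONS) - set(granted))
print("Missing permissions:", missing or "none")
```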
The Cloud Dataflow service spins up short-lived compute instances
to handle the compression jobs, and you may incur associated costs
for the service. Palo Alto Networks recommends keeping your Cloud
Storage bucket in the same project in which you have enabled the
Dataflow service. Based on the location of your Cloud Storage bucket,
Prisma Cloud launches the Cloud Dataflow jobs in the following regions:
Storage Bucket Region | Region Where the Dataflow Job Is Launched |
---|---|
us-central1, us-east1, us-west1, europe-west1, europe-west4, asia-east1, asia-northeast1 | The same region as the storage bucket |
eur4, eu | europe-west4 |
asia | asia-east1 |
us | us-central1, us-east1 |
Any other region | us-central1 |
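If you want to anticipate where the Dataflow workers (and their associated costs) will appear, the mapping in the table above can be expressed as a small lookup. The helper below is an illustrative sketch; the function name and the location normalization are assumptions.

```python
# Sketch: resolve the expected Dataflow launch region(s) from a Cloud Storage
# bucket location, following the mapping table above.
SAME_REGION = {
    "us-central1", "us-east1", "us-west1",
    "europe-west1", "europe-west4",
    "asia-east1", "asia-northeast1",
}

MULTI_REGION = {
    "eur4": ["europe-west4"],
    "eu": ["europe-west4"],
    "asia": ["asia-east1"],
    "us": ["us-central1", "us-east1"],
}

def dataflow_launch_regions(bucket_location: str) -> list[str]:
    """Return the region(s) where the Dataflow job is expected to launch."""
    loc = bucket_location.lower()  # bucket locations are reported in upper case
    if loc in SAME_REGION:
        return [loc]
    # Any other region falls back to us-central1, per the table above.
    return MULTI_REGION.get(loc, ["us-central1"])

print(dataflow_launch_regions("EU"))           # ['europe-west4']
print(dataflow_launch_regions("asia-south1"))  # ['us-central1']
```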