Remediate Alerts for IAM Security
The IAM security module provides two options for remediating alerts so that you can enforce the principle of least privilege across your AWS, Azure, and GCP environments. You can manually remediate alerts by copying the AWS, Azure, or GCP CLI commands and running them in your cloud environment, or you can configure a custom Python script to automate the remediation steps.
IAM automatic remediation is different from Prisma Cloud automatic remediation. The IAM module does not support the option to enable automatic remediation in an alert rule (see Create an Alert Rule for Run-Time Checks). Instead, follow the instructions in Configure and Run AWS IAM Remediation Script or Configure and Run Azure IAM Remediation Script to configure a custom Python script that manages automatic remediation for IAM alert rules using the message queuing service on the respective CSP.
- Manually Remediate IAM Security Alerts—Copy and paste the CLI commands for your AWS, Azure, or GCP environments and then execute them to manually remove excess permissions.
- Custom python scripts—Copy, paste, and configure the custom python scripts so that you can automate the steps of executing the CLI commands to remediate excess permissions in your AWS, Azure, or GCP environments.
- Automatic remediation for GCP IAM alerts is not supported.
Manually Remediate IAM Security Alerts
The following steps show how to manually remediate alerts in AWS. You can follow similar steps in Azure and GCP.
- View the existing alerts. To view all of the policies that triggered an alert, select Alerts > Overview.
- Click Add Filter and select Policy Type > IAM.
- Select the violating policy that you want to remediate. On Prisma Cloud, policies that you can remediate are indicated by the remediation icon.
- Investigate the policy violations. In this example, we see all the violating resources for this Okta user. Prisma Cloud provides hints on why the alert was generated; in this case, an Okta user is able to create another IAM user, which could open a potential backdoor because even if the original Okta user is deleted, they can still log in through the second user. Prisma Cloud also provides recommended steps on how to resolve the policy violation. After a policy violation is triggered, it is sent to the SQS queue. In this example, the SQS queue shows 1 message, which is the alert that was triggered.
- Get the remediation steps. Under the OPTIONS column, click Remediate.
- Copy the CLI commands. After you click Remediate, the CLI commands appear in a popup window.
- Run the CLI commands on your AWS account. For a GCP account, see Set up Remediation for GCP IAM Alerts before you run the CLI commands. After you execute the CLI commands, the remediation process is complete and the excess privileges are revoked. The SQS queue now shows 0 messages.
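The step above notes that the SQS queue drains to 0 messages once remediation completes. If you want to confirm this from code rather than the AWS console, the following is a minimal boto3 sketch; the queue name Queue2_Policy_UUID is a placeholder for whatever queue you configured in the Amazon SQS integration.

    import boto3

    # Placeholder queue name; replace with the queue used in your Prisma Cloud SQS integration.
    QUEUE_NAME = 'Queue2_Policy_UUID'

    sqs = boto3.resource('sqs')
    queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)

    # ApproximateNumberOfMessages drops back to 0 after the alert has been remediated and deleted.
    print(queue.attributes.get('ApproximateNumberOfMessages'))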
Set up Automatic Remediation for AWS IAM Alerts
Automate the remediation steps for your AWS IAM alerts with a custom Python script that receives an alert from the AWS SQS queue, extracts the alert ID, uses it to call the IAM remediation API, and runs the CLI commands provided by the API response.
Review Prerequisites for AWS Remediation Script
Complete the following prerequisites so that you can set up everything you need to successfully run the python script. This includes the Prisma Cloud integrations, APIs, and python libraries.
- Integrate Prisma Cloud with Amazon SQS—This is an AWS service that allows you to send, store, and receive messages between AWS and Prisma Cloud. Follow the steps to integrate Prisma Cloud with SQS.
- Create alert rules and set up alert notifications to Amazon SQS. All alerts triggered for the IAM policy you selected will be sent to the SQS queue (a short queue-creation sketch follows this list).
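The SQS integration above is normally set up in the AWS console, but if you prefer to create the queue programmatically, here is a minimal boto3 sketch; the queue name and region are placeholders, and you still need to point the Prisma Cloud Amazon SQS integration at the resulting queue URL.

    import boto3

    # Placeholder name and region; adjust to match your environment.
    sqs = boto3.client('sqs', region_name='us-east-1')
    response = sqs.create_queue(QueueName='Queue2_Policy_UUID')

    # Use this URL when you configure the Prisma Cloud Amazon SQS integration.
    print(response['QueueUrl'])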
Configure and Run AWS IAM Remediation Script
Install 3rd party libraries to create HTTP requests to your API endpoints, and edit the custom python script to include the values for the environment variables so that you can automatically remediate alerts.
- Copy/paste the script into a text editor or integrated development environment (IDE).

    import json
    import os
    import subprocess

    import boto3
    import requests


    def log(s):
        if os.environ['DEBUG']:
            print(s)


    # Mapping of account number to AWS CLI profile. This is used to run each remediation with the appropriate profile
    account_number_to_profile = {
    }

    sqs = boto3.resource('sqs')
    queue = sqs.get_queue_by_name(QueueName=os.environ['SQS_QUEUE_NAME'])

    # Read all queue messages
    all_messages = []
    message_batch = queue.receive_messages(MaxNumberOfMessages=10)
    while len(message_batch) > 0:
        all_messages.extend(message_batch)
        message_batch = queue.receive_messages(MaxNumberOfMessages=10)

    for message in all_messages:
        try:
            alert_info = json.loads(message.body)
            log(f'processing alert: {alert_info}')
        except json.JSONDecodeError as e:
            print(f'Can\'t parse queue message: {e.msg}')
            continue

        alert_id = alert_info['alertId']
        account_id = alert_info['account']['id']
        log(f'alert id: {alert_id}, account id: {account_id}')

        if 'remediable' in alert_info['metadata'] and alert_info['metadata']['remediable'] is False:
            log(f'Remediation is not supported for the alert: {alert_id}')
            continue

        try:
            log(f'getting remediation steps for the alert')
            r = requests.post(
                verify=False,
                url=f'{os.environ["API_ENDPOINT"]}/api/v1/permission/alert/remediation',
                data=json.dumps({"alerts": [alert_id]}),
                headers={
                    'x-redlock-auth': os.environ['AUTH_KEY'],
                    'Content-Type': 'application/json'
                }
            )
        except requests.exceptions.RequestException as e:
            print(f'Can\'t make request to the remediation api: {e.strerror}')
            continue

        if r.status_code != 200:
            print(f'Error from the remediation API for the alert id: {alert_id}')
            continue

        cli_commands = r.json()['alertIdVsCliScript'][alert_id]
        log(f'cli commands: {cli_commands}')

        try:
            log(f'running the CLI commands')
            aws_cli = subprocess.Popen(
                cli_commands,
                env=dict(os.environ, AWS_PROFILE=account_number_to_profile.get(account_id)),
                shell=True
            )
        except OSError as e:
            print(f'Can\'t run cli commands: {e.strerror}')
            continue

        aws_cli.communicate()
        if aws_cli.returncode != 0:
            print(f'Can\'t run cli commands: {cli_commands}')
            continue

        log("Deleting message")
        message.delete()

Install the 3rd party libraries. This script uses a total of five Python libraries. Three of the libraries (json, os, and subprocess) are part of the Python standard library, which means you can import them into your programs as soon as Python is installed. The other two libraries, boto3 and requests, are 3rd party libraries that you have to install before running the script. Python ships with a default package installer called pip, which can install 3rd party libraries and frameworks from the command line.
- Install boto3. From the command line (Windows) or terminal (Linux/macOS), type the following command: pip install boto3. boto3 is the AWS SDK for Python that allows you to create, configure, and manage AWS services such as SQS.
- Install requests. From the command line (Windows) or terminal (Linux/macOS), type the following command: pip install requests. requests is a 3rd party library for making simple HTTP requests.
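As an optional sanity check that both installs succeeded, you can print the library versions from Python; this is just a verification step and not part of the remediation script.

    import boto3
    import requests

    # If either import fails, re-run the corresponding pip install command.
    print('boto3', boto3.__version__)
    print('requests', requests.__version__)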
Edit the environment variables. These are mandatory variables that the Python script reads in order to run the commands provided by the API response and to customize its settings.
Optional (macOS/Linux only): Use the export command to set your environment variables. If you're not familiar with Python and don't want to edit the script, you can use the export command to set the environment variables. Here's the syntax for doing so:
- % export API_ENDPOINT=api_tenant
- % export SQS_QUEUE_NAME=your_sqs_queue_name
- % export YOUR_ACCOUNT_NUMBER=123456789
- % export AUTH_KEY=your_jwt_token
- % python script.py
The following instructions can be executed on any operating system that has Python installed, such as Windows, macOS, or Linux.
Environment Variable: Value
- SQS_QUEUE_NAME: A string that represents the name of the SQS queue that you created in step 1. For example, Queue2_Policy_UUID.
- API_ENDPOINT: Your Prisma Cloud API subdomain. For example, if your tenant is https://api.prismacloud.io, then the API_ENDPOINT will be api.
- DEBUG: Displays the debug logs for your script; enabled by default.
- YOUR_ACCOUNT_NUMBER: The 12-digit number, such as 123456789012, that uniquely identifies an AWS account. A user could have multiple account numbers.
- AUTH_KEY: Your JWT authentication token string (x-redlock-auth). See the API reference for more details.
A sketch that sets these variables from Python instead of the shell follows this list.
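If you would rather set the variables inside Python than export them in the shell, a minimal sketch follows; the values shown are placeholders and must be replaced with your own queue name, tenant subdomain, and JWT token. Note that the script builds the request URL directly from API_ENDPOINT, so include the scheme and host if you set it this way. Set these values before the script creates the queue and issues requests, for example at the top of the script or in a small wrapper.

    import os

    # Placeholder values; replace with your own before running the remediation script.
    os.environ['SQS_QUEUE_NAME'] = 'Queue2_Policy_UUID'
    os.environ['API_ENDPOINT'] = 'https://api.prismacloud.io'
    os.environ['AUTH_KEY'] = 'your_jwt_token'
    os.environ['DEBUG'] = 'True'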
- Edit DEBUG. DEBUG is enabled (set to True) by default. One simple way to disable debug logging is to set DEBUG to an empty string, because the script only logs when the check if os.environ['DEBUG']: is truthy and an empty string evaluates to false.
- Edit YOUR_ACCOUNT_NUMBER. Replace YOUR_ACCOUNT_NUMBER with the 12-digit account ID. The portion of the script to modify is: account_number_to_profile = { 'YOUR_ACCOUNT_NUMBER_1': 'YOUR_ACCOUNT_NAME_1', 'YOUR_ACCOUNT_NUMBER_2': 'YOUR_ACCOUNT_NAME_2' }. An example of valid values: account_number_to_profile = {'123456789123': 'default', '512478725627': 'user1'}.
- Edit API_ENDPOINT. Replace API_ENDPOINT with the Prisma Cloud tenant subdomain that you're using. The portion of the script that uses it is: url=f'{os.environ["API_ENDPOINT"]}/api/v1/permission/alert/remediation'. For example, replace API_ENDPOINT with app, app2, app3, or app.gov.
- Edit SQS_QUEUE_NAME. This stores the value of your queue name. The portion of the script that uses it is: queue = sqs.get_queue_by_name(QueueName=os.environ['SQS_QUEUE_NAME']). Set SQS_QUEUE_NAME to the name of your actual queue; for example, if Queue2_Policy_UUID is the name of your queue, set SQS_QUEUE_NAME=Queue2_Policy_UUID so that the lookup resolves to that queue.
- Edit AUTH_KEY. Generate a JWT token and set it as the value of AUTH_KEY. The portion of the script that uses it is: 'x-redlock-auth': os.environ['AUTH_KEY']. Replace AUTH_KEY with the JWT token that you generated.
View the remediation results. After you have configured the Python script with your environment variables, run the script to view the remediation results (a standalone sketch of the underlying API call follows this step).
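For reference, the call the script makes to fetch remediation commands can also be issued on its own. Below is a minimal sketch of that request, assuming a hypothetical alert ID, tenant URL, and token; the endpoint path and the alertIdVsCliScript response field are taken from the script above, and the full API reference remains the authoritative source.

    import json
    import requests

    # Placeholder values for illustration only.
    api_endpoint = 'https://api.prismacloud.io'
    auth_key = 'your_jwt_token'
    alert_id = 'I-1234567'

    r = requests.post(
        url=f'{api_endpoint}/api/v1/permission/alert/remediation',
        data=json.dumps({'alerts': [alert_id]}),
        headers={'x-redlock-auth': auth_key, 'Content-Type': 'application/json'},
    )
    r.raise_for_status()

    # The response maps each alert ID to the CLI commands that remediate it.
    print(r.json()['alertIdVsCliScript'][alert_id])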
- Run the script. Open a command prompt (Windows) or terminal (Linux/macOS) and type the following command: python script.py. Replace script.py with the name of your actual script.
- View the results. After executing the Python script, details related to the remediation are displayed in the output.

    processing alert: {'alertStatus': 'open', 'reason': 'SCHEDULED', 'metadata': {'remediable': True}, 'alertRuleName': 'auto-remediation-test', 'resource': {'resourceId': 'ABCDEFGHIJKLMN', 'resourceTs': '1234567890', 'resourceName': 'test-resource'}, 'firstSeen': '1605104944614', 'lastSeen': '1617799423260', 'service': 'Prisma Cloud', 'alertTs': '1234567890123', 'alertId': 'I-1234567', 'region': 'global', 'account': {'cloudType': 'aws', 'name': 'test-account', 'id': '1234567890'}, 'policy': {'severity': 'medium', 'policyType': 'iam', 'name': 'AWS entities with risky permissions', 'policyTs': '123456789012', 'description': "This policy identifies AWS IAM permissions that are risky. Ensure that the AWS entities provisioned in your AWS account don't have a risky set of permissions to minimize security risks.", 'recommendation': "Remediation for a user: \n1. Log in to the AWS console \n2. Navigate to the IAM service \n3. Click on Users \n4. Choose the relevant user \n5. Under 'Permissions policies', find the relevant policy according to the alert details and remove the risky actions \n----------------------------------------\n Remediation for a Compute instance/Okta user that assumes a role: \n1. Log in to the AWS console \n2. Navigate to the compute service (For example, AWS EC2, AWS Lambda or AWS ECS) or login to the Okta console \n3. Find the role used by the compute instance/Okta user \n4. Navigate to the IAM service \n5. Click on Roles \n6. Choose the relevant role \n7. Under 'Permissions policies', find the relevant policy according to the alert details and remove the risky actions \n----------------------------------------\n Remediation for a Resource-based Policy: \n1. Log in to the AWS console \n2. Navigate to the relevant service (For example, AWS S3) \n3. Find resource-based policy of the resource \n4. Remove the risky actions according to the alert details", 'id': 'abcdefg9-1abc-47fc-c876-j123f4567', 'labels': '[]'}, 'alertRuleId': '1234abc-abc0-1234-ab1c-abc1234567'}
    alert id: I-1234567, account id: 1234567890
    getting remediation steps for the alert
    cli commands: aws iam create-policy --policy-name 'test-resource-prisma-restrictions-I-1234567-1' --policy-document '{"Version":"2012-10-17","Statement":[{"Resource":["arn:aws:iam::1234567890123:user/test-resource"],"Action":["iam:CreateAccessKey"],"Effect":"Deny"}]}' and aws iam attach-user-policy --user-name 'test-resource' --policy-arn 'arn:aws:iam::123456789012:policy/test-resource-prisma-restrictions-I-1234567-1'
    running the CLI commands
    {
        "Policy": {
            "PolicyName": "test-resource-prisma-I-1234567-1",
            "PolicyId": "ABCDEFGHIJKLMNO",
            "Arn": "arn:aws:iam::1234567890:policy/test-resource-prisma-restrictions-I-1234567-1",
            "Path": "/",
            "DefaultVersionId": "v1",
            "AttachmentCount": 0,
            "PermissionsBoundaryUsageCount": 0,
            "IsAttachable": true,
            "CreateDate": "2021-04-08T09:03:47+00:00",
            "UpdateDate": "2021-04-08T09:03:47+00:00"
        }
    }
    Deleting message

The output shows that we are processing an alert for a resource named test-resource, which should now be gone when we view Alerts. The CLI commands for executing the remediation steps are shown in the output; these commands are automatically executed on your behalf by the Python script. A new policy is created in AWS that removes the excess permissions of the user.
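If you want to confirm from code that the deny policy produced by the remediation is now attached to the user, the following is a minimal boto3 sketch; the user name test-resource matches the example output above and is a placeholder for your own resource.

    import boto3

    iam = boto3.client('iam')

    # Placeholder user name from the example output; replace with your own.
    response = iam.list_attached_user_policies(UserName='test-resource')

    # The Prisma Cloud remediation policy should appear in this list after the script runs.
    for policy in response['AttachedPolicies']:
        print(policy['PolicyName'], policy['PolicyArn'])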
Set up Automatic Remediation for Azure IAM Alerts
Automate the remediation steps for your Azure IAM alerts with the help of a custom Python script. The script reads the Azure Service Bus queue, collects alerts, and then executes the CLI remediation steps in Azure.
Review Prerequisites for Azure Remediation Script
Complete the following prerequisites so that you can set up everything you need to successfully run the Python script. This includes the Prisma Cloud integrations, APIs, and Python libraries.
- Integrate Prisma Cloud with Azure Service Bus—This is an Azure service that allows you to send, store, and receive messages between Azure and Prisma Cloud. Follow the steps to Integrate Prisma Cloud with Azure Service Bus.
- Create alert rules and set up alert notifications to Azure Service Bus. All alerts triggered for the IAM policy you selected will be sent to the Service Bus queue.
Configure and Run Azure IAM Remediation Script
Install 3rd party libraries to create HTTP requests to your API endpoints, and edit the custom Python script to include the values for the environment variables so that you can automatically remediate alerts.
- Copy/paste the script into a text editor or integrated development environment (IDE).

    import subprocess
    import logging
    import json
    import os

    import requests
    from azure.servicebus import ServiceBusService, Message, Topic, Rule, DEFAULT_RULE_NAME

    logging.basicConfig(level=os.environ.get("LOGLEVEL", "INFO"))

    account_number_to_profile = {
    }


    def execute_command(command):
        """
        Execute the CLI command
        :param command:
        :return: Returns output on success and False on Failure
        """
        logging.info("Executing CLI command :- " + str(command))
        try:
            output = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT)
            logging.info("Command execution passed with following output : {}".format(output))
            return output
        except subprocess.CalledProcessError as e:
            logging.error("Command [{}] have failed with return code : {}".format(command, e.returncode))
            logging.error("Error Output : {}".format(e.output))
            return False


    def run_azure_cli_commands(cli_commands, account_id):
        logging.info(f'Start run_azure_cli_commands cli commands: {cli_commands}')
        try:
            # Run the az CLI commands returned by the remediation API.
            azure_cli = subprocess.Popen(
                cli_commands,
                env=dict(os.environ, AWS_PROFILE=account_number_to_profile.get(account_id)),
                shell=True
            )
        except OSError as e:
            logging.error(f'Can\'t run cli commands: {e.strerror}')
            return
        azure_cli.communicate()
        if azure_cli.returncode != 0:
            logging.error(f'return code:{azure_cli.returncode}, Can\'t run cli commands,: {cli_commands}')
            return
        logging.info(f'Finished run_azure_cli_commands cli commands: {cli_commands}')


    def login_azure():
        logging.info("Start login_azure")
        execute_command("az login")
        logging.info("Finished login_azure")


    def logout_azure():
        logging.info("Start logout_azure")
        execute_command("az logout")
        logging.info("Finished logout_azure")


    def get_messages_from_queue():
        logging.info("Start get_messages_from_queue")
        queue_name = os.environ['SB_QUEUE_NAME']
        logging.info(f'Using Azure alerts queue: {queue_name}')
        sb_key = os.environ['SB_QUEUE_KEY']
        sb_key_name = os.environ['SB_QUEUE_KEY_NAME']
        service_bus_name_space = os.environ['SB_QUEUE_NAME_SPACE']
        bus_service = ServiceBusService(service_bus_name_space,
                                        shared_access_key_name=sb_key_name,
                                        shared_access_key_value=sb_key)
        queue = bus_service.get_queue(queue_name)
        logging.info(f'queue.message_count: {queue.message_count}')
        max_number_of_messages = 10
        all_messages = []
        messages_batch_index = 0
        while messages_batch_index

Install the 3rd party libraries. This script uses a total of five Python libraries. Three of the libraries (subprocess, logging, and json) are part of the Python standard library, which means you can import them into your programs as soon as Python is installed. The other two libraries, requests and azure.servicebus, are 3rd party libraries that you have to install before running the script. Python ships with a default package installer called pip, which can install 3rd party libraries and frameworks through the command line.
- Install requests. From the command line (Windows) or terminal (Linux/macOS), type the following command: pip install requests. requests is a 3rd party library for making simple HTTP requests.
- Install azure.servicebus. From the command line (Windows) or terminal (Linux/macOS), type the following command: pip install azure.servicebus. azure.servicebus is a client library for Python to communicate between applications and services and implement asynchronous messaging patterns.
Edit the environment variables. These are mandatory variables that the Python script reads in order to run the commands provided by the API response and to customize its settings.
Optional (macOS/Linux only): Use the export command to set your environment variables. If you're not familiar with Python and don't want to edit the script, you can use the export command to set the environment variables. Here's the syntax for doing so:
- % export SB_QUEUE_KEY=your_sb_queue_key
- % export SB_QUEUE_KEY_NAME=your_sb_queue_key_name
- % export SB_QUEUE_NAME_SPACE=your_sb_queue_name_space
- % export SB_QUEUE_NAME=your_sb_queue_name
- % export API_ENDPOINT=api_tenant
- % export AUTH_KEY=your_jwt_token
The following instructions can be executed on any operating system that has Python installed, such as Windows, macOS, or Linux.
Environment Variable: Value
- SB_QUEUE_KEY: A string that represents the Service Bus queue key value.
- SB_QUEUE_KEY_NAME: A string that represents your Service Bus key name.
- SB_QUEUE_NAME_SPACE: A string that represents your Service Bus namespace.
- SB_QUEUE_NAME: A string that represents the name of your Service Bus queue (read by the script as os.environ['SB_QUEUE_NAME']).
- API_ENDPOINT: Your Prisma Cloud API subdomain. For example, if your tenant is https://api.prismacloud.io, then the API_ENDPOINT will be api.
- AUTH_KEY: Your JWT authentication token string (x-redlock-auth). See the API reference for more details.
A short connectivity check that uses these values follows this list.
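Before running the full script, you can confirm that the Service Bus values are correct with a minimal check that uses the same legacy azure.servicebus calls the script itself makes (ServiceBusService and get_queue); the environment variable names match the list above and must already be set.

    import os

    from azure.servicebus import ServiceBusService

    # Reads the same environment variables the remediation script expects.
    bus_service = ServiceBusService(
        os.environ['SB_QUEUE_NAME_SPACE'],
        shared_access_key_name=os.environ['SB_QUEUE_KEY_NAME'],
        shared_access_key_value=os.environ['SB_QUEUE_KEY']
    )

    queue = bus_service.get_queue(os.environ['SB_QUEUE_NAME'])

    # A non-zero count means alert notifications are waiting to be remediated.
    print(queue.message_count)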
View the remediation results. After you have configured the Python script with your environment variables, run the script to view the remediation results.
- Run the script. Open a command prompt (Windows) or terminal (Linux/macOS) and type the following command: python script.py. Replace script.py with the name of your actual script.
- View the results. After executing the Python script, details related to the remediation are displayed in the output.
Set up Remediation for GCP IAM Alerts
Prisma Cloud leverages the Deny Policies feature on GCP to remediate risky permissions and to ensure a safe rollout in case you decide to revert the remediated risky permissions. Make sure you have completed all the necessary configurations in your GCP environment to use Deny Policies.
- The GCP Deny Policies feature does not yet support all permissions, so some alerts can be partially remediable or not remediable. The list of permissions in Prisma Cloud IAM security will be updated as they become available in GCP.
- Deny Policies is a public Beta release on GCP, so remediation is also a Beta release on Prisma Cloud.
- Click Add Filter and select Cloud Type > GCP and Policy Type > IAM.
- Select the violating policy that you want to remediate.
- Investigate the policy violations.
- Get the remediation steps. Under the OPTIONS column, click Remediate.
- Copy the CLI commands. After you click Remediate, the CLI commands appear in a popup window.
- Run the CLI commands on your GCP account. Before you run the CLI commands, see Deny Policies. After you execute the CLI commands, the remediation process is complete and the excess privileges are revoked.