Remediate Alerts for IAM Security

Manually remediate your IAM security misconfigurations by running CLI commands, or automatically remediate overly permissive users with a custom python script.
The IAM security module provides two options for remediating alerts so that you can enforce the principle of least privilege across your AWS, Azure, and GCP environments. You can manually remediate alerts by copying the AWS, Azure, or GCP CLI commands and running them in your cloud environment, or you can configure a custom python script to automate the remediation steps.
IAM automatic remediation is different from Prisma Cloud automatic remediation. The IAM module does not support the option to enable automatic remediation. Instead, Create an Alert Rule for Run-Time Checks, and then follow the instructions for configuring a custom python script on AWS or Azure to manage automatic remediation for IAM alert rules using the message queuing service on the respective CSP.

Manually Remediate IAM Security Alerts

The following steps show how to manually remediate alerts in AWS. You can follow similar steps in Azure and GCP.
  1. View the existing alerts.
    To view all of the policies that triggered an alert, select
    Alerts
    Overview.
  2. Click
    Add Filter
    and select
    Policy Type
    IAM
    .
  3. Select the violating policy that you want to remediate.
    On Prisma Cloud, policies that you can remediate are indicated by the icon.
  4. Investigate the policy violations.
    In this example, we see all the violating resources for this Okta user. Prisma Cloud provides hints on why the alert was generated; in this case, the alert was triggered because an Okta user is able to create another IAM user. This could lead to a potential backdoor, because even if the original Okta user is deleted, they can still log in through the second user. Prisma Cloud also provides recommended steps on how to resolve the policy violation.
    After a policy violation is triggered, it is sent to the SQS queue.
    In this example the SQS queue shows 1 message which is the alert that was triggered.
  5. Get the remediation steps.
    Under the
    OPTIONS
    column, click
    Remediate
    .
    1. Copy the CLI commands.
      After you click
      Remediate
      the CLI commands appear in a popup window.
    2. Run the CLI commands on your AWS account. For a GCP account, see GCP-Deny Policies before you run the CLI commands.
      After you execute the CLI commands, the remediation process is complete and the excess privileges are revoked. The SQS queue will now show 0 messages.

Set up Automatic Remediation for AWS IAM Alerts

Automate the remediation steps for your AWS IAM alerts with the help of a custom python script that receives an alert from the AWS SQS queue, extracts the alert ID and uses it to call the IAM remediation API, and runs the commands provided by the API response.

Review Prerequisites for AWS Remediation Script

Complete the following prerequisites so that you can set up everything you need to successfully run the python script. This includes the Prisma Cloud integrations, APIs, and python libraries.
  • Integrate Prisma Cloud with Amazon SQS—This is an AWS service that allows you to send, store, and receive messages between AWS and Prisma Cloud. Follow the steps to integrate Prisma Cloud with SQS.
  • Create alert rules and set up alert notifications to Amazon SQS. All alerts triggered for the IAM policy you selected will be sent to the SQS queue.
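Each notification that lands on the queue is a JSON document. As a rough sketch, the fields that the remediation flow needs can be pulled out of a raw message body like this (field names taken from the sample alert payload shown later in this section; the trimmed-down sample body is illustrative only):

```python
import json

def parse_alert(message_body):
    """Extract the fields the remediation flow needs from a raw
    SQS message body (a JSON-encoded Prisma Cloud alert)."""
    alert = json.loads(message_body)
    return {
        "alert_id": alert["alertId"],
        "account_id": alert["account"]["id"],
        # Alerts without an explicit flag are treated as remediable,
        # matching the check used in the full remediation script.
        "remediable": alert.get("metadata", {}).get("remediable", True),
    }

# Trimmed-down sample body, based on the example output later in this section.
sample = '{"alertId": "I-1234567", "account": {"id": "1234567890"}, "metadata": {"remediable": true}}'
print(parse_alert(sample))
```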

Configure and Run AWS IAM Remediation Script

Install 3rd party libraries to create HTTP requests to your API endpoints, and edit the custom python script to include the values for the environment variables so that you can automatically remediate alerts.
  1. Copy/paste the script into a text editor or integrated development environment (IDE).
    import json
    import os
    import subprocess

    import boto3
    import requests


    def log(s):
        if os.environ['DEBUG']:
            print(s)


    # Mapping of account number to AWS CLI profile. This is used to run each
    # remediation with the appropriate profile
    account_number_to_profile = {
    }

    sqs = boto3.resource('sqs')
    queue = sqs.get_queue_by_name(QueueName=os.environ['SQS_QUEUE_NAME'])

    # Read all queue messages
    all_messages = []
    message_batch = queue.receive_messages(MaxNumberOfMessages=10)
    while len(message_batch) > 0:
        all_messages.extend(message_batch)
        message_batch = queue.receive_messages(MaxNumberOfMessages=10)

    for message in all_messages:
        try:
            alert_info = json.loads(message.body)
            log(f'processing alert: {alert_info}')
        except json.JSONDecodeError as e:
            print(f'Can\'t parse queue message: {e.msg}')
            continue

        alert_id = alert_info['alertId']
        account_id = alert_info['account']['id']
        log(f'alert id: {alert_id}, account id: {account_id}')

        if 'remediable' in alert_info['metadata'] and alert_info['metadata']['remediable'] is False:
            log(f'Remediation is not supported for the alert: {alert_id}')
            continue

        try:
            log('getting remediation steps for the alert')
            r = requests.post(
                verify=False,
                url=f'{os.environ["API_ENDPOINT"]}/api/v1/permission/alert/remediation',
                data=json.dumps({"alerts": [alert_id]}),
                headers={
                    'x-redlock-auth': os.environ['AUTH_KEY'],
                    'Content-Type': 'application/json'
                }
            )
        except requests.exceptions.RequestException as e:
            print(f'Can\'t make request to the remediation api: {e.strerror}')
            continue

        if r.status_code != 200:
            print(f'Error from the remediation API for the alert id: {alert_id}')
            continue

        cli_commands = r.json()['alertIdVsCliScript'][alert_id]
        log(f'cli commands: {cli_commands}')

        try:
            log('running the CLI commands')
            aws_cli = subprocess.Popen(
                cli_commands,
                env=dict(os.environ, AWS_PROFILE=account_number_to_profile.get(account_id)),
                shell=True
            )
        except OSError as e:
            print(f'Can\'t run cli commands: {e.strerror}')
            continue

        aws_cli.communicate()
        if aws_cli.returncode != 0:
            print(f'Can\'t run cli commands: {cli_commands}')
            continue

        log("Deleting message")
        message.delete()
  2. Install the 3rd party libraries.
    This script uses a total of five python libraries. Three of the libraries:
    json
    ,
    os
    , and
    subprocess
    are part of the python standard library, so they are available as soon as python is installed. The other two libraries are
    boto3
    and
    requests
    which are 3rd party libraries—or—libraries that you have to install before running the script. Python has a default package downloader called
    pip
    , which can install 3rd party libraries and frameworks via the command line.
    1. Install
      boto3
      .
      From the command line (Windows) or terminal (Linux/MacOS) type the following command:
      pip install boto3
      This is the AWS SDK for python that allows you to create, configure, and manage AWS services such as SQS.
    2. Install
      requests
      .
      From the command line (Windows) or terminal (Linux/MacOS) type the following command:
      pip install requests
      requests is a 3rd party library for making simple HTTP requests.
  3. Edit the environment variables.
    These are mandatory variables to specify in the python script to run the commands provided by the API response and to customize the settings.
    Optional (mac/linux only)
    —Use the export command to set your environment variables.
    If you’re not familiar with python and don’t want to edit the script then you can use the
    export
    command to set the environment variables. Here’s the syntax for doing so:
    • % export API_ENDPOINT=api_tenant
    • % export SQS_QUEUE_NAME=your_sqs_queue_name
    • % export YOUR_ACCOUNT_NUMBER=123456789012
    • % export AUTH_KEY=your_jwt_token
    • % python script.py
    The following instructions can be executed on any operating system that has python installed, such as Windows, macOS, or Linux.
    Environment Variable
    Value
    SQS_QUEUE_NAME
    A string that represents the name of the SQS queue that you created in step 1. For example,
    Queue2_Policy_UUID
    .
    API_ENDPOINT
    Your Prisma Cloud API subdomain. For example, if your tenant is
    https://api.prismacloud.io
    , then the
    API_ENDPOINT
    will be
    api
    .
    DEBUG
    Displays the debug logs for your script. Debug logging is enabled by default.
    YOUR_ACCOUNT_NUMBER
    The 12-digit number, such as
    123456789012
    , that uniquely identifies an AWS account. A user could have multiple account numbers.
    AUTH_KEY
    Your JWT authentication token string (x-redlock-auth). See the API reference for more details.
    1. Edit
      DEBUG
      .
      DEBUG
      is enabled or set to
      True
      by default. The script only checks whether the
      DEBUG
      environment variable is a non-empty string, so setting it to the literal string
      False
      still enables logging. To disable logs, set the variable to an empty string before running the script:
      % export DEBUG=
    2. Edit YOUR_ACCOUNT_NUMBER.
      Replace
      YOUR_ACCOUNT_NUMBER
      with the 12-digit account ID. The portion of the script to modify is:
      account_number_to_profile = { 'YOUR_ACCOUNT_NUMBER_1': 'YOUR_ACCOUNT_NAME_1', 'YOUR_ACCOUNT_NUMBER_2': 'YOUR_ACCOUNT_NAME_2'}
      An example of valid values:
      account_number_to_profile = {'123456789123': 'default','512478725627': 'user1'}
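As a rough illustration of how this mapping is consumed, the script looks up the AWS CLI profile for the account that raised the alert. A small sketch (the account numbers below are hypothetical, mirroring the example values above) of a lookup that falls back to the default profile for unmapped accounts:

```python
# Hypothetical mapping, mirroring the example values above.
account_number_to_profile = {'123456789123': 'default', '512478725627': 'user1'}

def profile_for(account_id):
    # dict.get() returns None for unmapped accounts; falling back to the
    # 'default' AWS CLI profile avoids passing AWS_PROFILE=None into the
    # subprocess environment.
    return account_number_to_profile.get(account_id) or 'default'

print(profile_for('512478725627'))
print(profile_for('000000000000'))
```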
    3. Edit API_ENDPOINT.
      Replace
      API_ENDPOINT
      with the Prisma Cloud tenant sub domain that you’re using. The portion of the script to modify is:
      url=f'{os.environ["API_ENDPOINT"]}/api/v1/permission/alert/remediation'
      For example, replace
      API_ENDPOINT
      with
      api
      ,
      api2
      ,
      api3
      , or
      api.gov
      .
    4. Edit the
      SQS_QUEUE_NAME
      .
      This stores the value of your queue name. The portion of the script to modify is:
      queue = sqs.get_queue_by_name(QueueName=os.environ['SQS_QUEUE_NAME'])
      Replace
      SQS_QUEUE_NAME
      with the name of your actual queue—for example, if
      Queue2_Policy_UUID
      is the name of your queue, set the environment variable before running the script:
      % export SQS_QUEUE_NAME=Queue2_Policy_UUID
      Alternatively, hard-code the name in the snippet:
      queue = sqs.get_queue_by_name(QueueName='Queue2_Policy_UUID')
    5. Edit the AUTH_KEY.
      Generate a JWT token and replace the value in
      AUTH_KEY
      of the python script. The portion of the script to modify is as follows:
      'x-redlock-auth': os.environ['AUTH_KEY']
      Replace
      AUTH_KEY
      with the JWT token that you generated.
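Before running the script, it can help to confirm that the mandatory variables are actually set. A minimal sketch (variable names taken from the table above) that fails fast instead of raising a KeyError mid-run:

```python
import os

REQUIRED = ['SQS_QUEUE_NAME', 'API_ENDPOINT', 'AUTH_KEY']

def missing_vars(environ=None):
    """Return the names of required variables that are unset or empty."""
    environ = os.environ if environ is None else environ
    return [name for name in REQUIRED if not environ.get(name)]

# Example with a deliberately incomplete environment:
print(missing_vars({'SQS_QUEUE_NAME': 'Queue2_Policy_UUID'}))
```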
  4. View the remediation results.
    After you have configured the python script with your environment variables, run the script to view the remediation results.
    1. Run the script.
      Open up command prompt (Windows) or terminal (Linux/MacOS) and type in the following command:
      python script.py
      Replace script.py with the name of your actual script.
    2. View the results.
      After executing the python script, details related to the remediation will display in the output.
      processing alert: {'alertStatus': 'open', 'reason': 'SCHEDULED', 'metadata': {'remediable': True}, 'alertRuleName': 'auto-remediation-test', 'resource': {'resourceId': 'ABCDEFGHIJKLMN', 'resourceTs': '1234567890', 'resourceName': 'test-resource'}, 'firstSeen': '1605104944614', 'lastSeen': '1617799423260', 'service': 'Prisma Cloud', 'alertTs': '1234567890123', 'alertId': 'I-1234567', 'region': 'global', 'account': {'cloudType': 'aws', 'name': 'test-account', 'id': '1234567890'}, 'policy': {'severity': 'medium', 'policyType': 'iam', 'name': 'AWS entities with risky permissions', 'policyTs': '123456789012', 'description': "This policy identifies AWS IAM permissions that are risky. Ensure that the AWS entities provisioned in your AWS account don't have a risky set of permissions to minimize security risks.", 'recommendation': "Remediation for a user: \n1. Log in to the AWS console \n2. Navigate to the IAM service \n3. Click on Users \n4. Choose the relevant user \n5. Under 'Permissions policies', find the relevant policy according to the alert details and remove the risky actions \n----------------------------------------\n Remediation for a Compute instance/Okta user that assumes a role: \n1. Log in to the AWS console \n2. Navigate to the compute service (For example, AWS EC2, AWS Lambda or AWS ECS) or login to the Okta console \n3. Find the role used by the compute instance/Okta user \n4. Navigate to the IAM service \n5. Click on Roles \n6. Choose the relevant role \n7. Under 'Permissions policies', find the relevant policy according to the alert details and remove the risky actions \n----------------------------------------\n Remediation for a Resource-based Policy: \n1. Log in to the AWS console \n2. Navigate to the relevant service (For example, AWS S3) \n3. Find resource-based policy of the resource \n4. Remove the risky actions according to the alert details", 'id': 'abcdefg9-1abc-47fc-c876-j123f4567', 'labels': '[]'}, 'alertRuleId': '1234abc-abc0-1234-ab1c-abc1234567'}
      alert id: I-1234567, account id: 1234567890
      getting remediation steps for the alert
      cli commands: aws iam create-policy --policy-name 'test-resource-prisma-restrictions-I-1234567-1' --policy-document '{"Version":"2012-10-17","Statement":[{"Resource":["arn:aws:iam::1234567890123:user/test-resource"],"Action":["iam:CreateAccessKey"],"Effect":"Deny"}]}' and aws iam attach-user-policy --user-name 'test-resource' --policy-arn 'arn:aws:iam::123456789012:policy/test-resource-prisma-restrictions-I-1234567-1'
      running the CLI commands
      {
          "Policy": {
              "PolicyName": "test-resource-prisma-I-1234567-1",
              "PolicyId": "ABCDEFGHIJKLMNO",
              "Arn": "arn:aws:iam::1234567890:policy/test-resource-prisma-restrictions-I-1234567-1",
              "Path": "/",
              "DefaultVersionId": "v1",
              "AttachmentCount": 0,
              "PermissionsBoundaryUsageCount": 0,
              "IsAttachable": true,
              "CreateDate": "2021-04-08T09:03:47+00:00",
              "UpdateDate": "2021-04-08T09:03:47+00:00"
          }
      }
      Deleting message
      The output shows that we’re processing an alert for a resource named
      test-resource
      which should now be gone when we view
      Alerts
      . The CLI commands for executing the remediation steps are shown in the output; these commands are automatically executed on your behalf by the python script. A new policy will be created in AWS that removes the excess permissions of the user.
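Note that the remediation does not edit the user's existing policies; it attaches a new Deny policy. The policy document from the sample remediation output can be inspected to confirm exactly which actions are being denied:

```python
import json

# Policy document copied from the sample remediation output above.
policy_json = ('{"Version":"2012-10-17","Statement":[{"Resource":'
               '["arn:aws:iam::1234567890123:user/test-resource"],'
               '"Action":["iam:CreateAccessKey"],"Effect":"Deny"}]}')

policy = json.loads(policy_json)
# Collect every action that the new policy explicitly denies.
denied = [action
          for statement in policy['Statement'] if statement['Effect'] == 'Deny'
          for action in statement['Action']]
print(denied)
```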

Set up Automatic Remediation for Azure IAM Alerts

Automate the remediation steps for your Azure IAM alerts with the help of a custom python script—the script reads the Azure Service Bus queue, collects alerts, and then executes the CLI remediation steps in Azure.

Review Prerequisites for Azure Remediation Script

Complete the following prerequisites so that you can set up everything you need to successfully run the python script. This includes the Prisma Cloud integrations, APIs, and python libraries.
  • Integrate Prisma Cloud with Azure Service Bus—This is an Azure service that allows you to send, store, and receive messages between Azure and Prisma Cloud. Follow the steps to Integrate Prisma Cloud with Azure Service Bus.
  • Create alert rules and set up alert notifications to Azure Service Bus. All alerts triggered for the IAM policy you selected will be sent to the Service Bus queue.

Configure and Run Azure IAM Remediation Script

Install 3rd party libraries to create HTTP requests to your API endpoints, and edit the custom python script to include the values for the environment variables so that you can automatically remediate alerts.
  1. Copy/paste the script into a text editor or integrated development environment (IDE).
    import subprocess
    import logging
    import json
    import os

    import requests
    from azure.servicebus import ServiceBusService, Message, Topic, Rule, DEFAULT_RULE_NAME

    logging.basicConfig(level=os.environ.get("LOGLEVEL", "INFO"))

    account_number_to_profile = {
    }


    def execute_command(command):
        """
        Execute the CLI command
        :param command:
        :return: Returns output on success and False on Failure
        """
        logging.info("Executing CLI command :- " + str(command))
        try:
            output = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT)
            logging.info("Command execution passed with following output : {}".format(output))
            return output
        except subprocess.CalledProcessError as e:
            logging.error("Command [{}] has failed with return code : {}".format(command, e.returncode))
            logging.error("Error Output : {}".format(e.output))
            return False


    def run_azure_cli_commands(cli_commands, account_id):
        logging.info(f'Start run_azure_cli_commands cli commands: {cli_commands}')
        try:
            azure_cli = subprocess.Popen(
                cli_commands,
                env=dict(os.environ, AWS_PROFILE=account_number_to_profile.get(account_id)),
                shell=True
            )
        except OSError as e:
            logging.error(f'Can\'t run cli commands: {e.strerror}')
            return
        azure_cli.communicate()
        if azure_cli.returncode != 0:
            logging.error(f'return code: {azure_cli.returncode}, Can\'t run cli commands: {cli_commands}')
            return
        logging.info(f'Finished run_azure_cli_commands cli commands: {cli_commands}')


    def login_azure():
        logging.info("Start login_azure")
        execute_command("az login")
        logging.info("Finished login_azure")


    def logout_azure():
        logging.info("Start logout_azure")
        execute_command("az logout")
        logging.info("Finished logout_azure")


    def get_messages_from_queue():
        logging.info("Start get_messages_from_queue")
        queue_name = os.environ['SB_QUEUE_NAME']
        logging.info(f'Using Azure alerts queue: {queue_name}')
        sb_key = os.environ['SB_QUEUE_KEY']
        sb_key_name = os.environ['SB_QUEUE_KEY_NAME']
        service_bus_name_space = os.environ['SB_QUEUE_NAME_SPACE']
        bus_service = ServiceBusService(service_bus_name_space, shared_access_key_name=sb_key_name, shared_access_key_value=sb_key)
        queue = bus_service.get_queue(queue_name)
        logging.info(f'queue.message_count: {queue.message_count}')
        max_number_of_messages = 10
        all_messages = []
        messages_batch_index = 0
        while messages_batch_index
  2. Install the 3rd party libraries.
    This script uses a total of five python libraries. Three of the libraries:
    subprocess
    ,
    logging
    , and
    json
    are part of the python standard library, so they are available as soon as python is installed. The other two libraries are
    requests
    and
    azure.servicebus
    which are 3rd party libraries—or—libraries that you have to install before running the script. Python has a default package downloader called
    pip
    , which can install 3rd party libraries and frameworks through the command line.
    1. Install requests.
      From the command line (Windows) or terminal (Linux/MacOS) type the following command:
      pip install requests
      requests is a 3rd party library for making simple HTTP requests.
    2. Install azure.servicebus.
      From the command line (Windows) or terminal (Linux/MacOS) type the following command:
      pip install azure.servicebus
      azure.servicebus
      is a client library for python to communicate between applications and services and implement asynchronous messaging patterns.
  3. Edit the environment variables.
    These are mandatory variables to specify in the python script to run the commands provided by the API response and to customize the settings.
    Optional (mac/linux only)
    —Use the export command to set your environment variables.
    If you’re not familiar with python and don’t want to edit the script then you can use the
    export
    command to set the environment variables. Here’s the syntax for doing so:
    • % export SB_QUEUE_KEY=your_sb_queue_key
    • % export SB_QUEUE_KEY_NAME=your_sb_queue_key_name
    • % export SB_QUEUE_NAME_SPACE=your_sb_queue_name_space
    • % export API_ENDPOINT=api_tenant
    • % export AUTH_KEY=your_jwt_token
    The following instructions can be executed on any operating system that has python installed, such as Windows, macOS, or Linux.
    ENVIRONMENT VARIABLE
    VALUE
    SB_QUEUE_KEY
    A string that represents the Service Bus queue key value.
    SB_QUEUE_KEY_NAME
    A string that represents your Service Bus key name.
    SB_QUEUE_NAME_SPACE
    A string that represents your Service Bus namespace.
    API_ENDPOINT
    Your Prisma Cloud API subdomain. For example, if your tenant is
    https://api.prismacloud.io
    , then the
    API_ENDPOINT
    will be api.
    AUTH_KEY
    Your JWT authentication token string (x-redlock-auth). See the API reference for more details.
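The Azure script calls the same IAM remediation API as the AWS version. As a sketch, the request it assembles from these variables looks like the following (the endpoint path matches the one used in the AWS script earlier; the base URL and token value here are placeholders, and the endpoint is written as a full base URL because it is interpolated directly into the request URL):

```python
import json

def build_remediation_request(api_endpoint, alert_id, auth_key):
    """Assemble the URL, headers, and body for a POST to the IAM
    remediation API (path as used in the AWS script earlier)."""
    return {
        'url': f'{api_endpoint}/api/v1/permission/alert/remediation',
        'headers': {'x-redlock-auth': auth_key, 'Content-Type': 'application/json'},
        'body': json.dumps({'alerts': [alert_id]}),
    }

req = build_remediation_request('https://api.prismacloud.io', 'I-1234567', 'placeholder_jwt')
print(req['url'])
```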
  4. View the remediation results.
    After you have configured the python script with your environment variables, run the script to view the remediation results.
    1. Run the script.
      Open up command prompt (Windows) or terminal (Linux/MacOS) and type in the following command:
      python script.py
      Replace script.py with the name of your actual script.
    2. View the results.
      After executing the python script, details related to the remediation will display in the output.

Set up Remediation for GCP IAM Alerts

Prisma Cloud leverages the Deny Policies feature on GCP to remediate risky permissions, which ensures a safe rollout because you can revert the remediated permissions if needed. Make sure you have completed all the necessary configurations in your GCP environment to use
Deny Policies
.
  • GCP
    Deny Policies
    feature does not yet support all permissions, so some alerts can be only partially remediable or not remediable. The list of permissions in Prisma Cloud IAM security will be updated as they become available in GCP.
  • Deny Policies
    is a public Beta release on GCP, so
    remediation
    will also be a Beta release on Prisma Cloud.
  1. Click
    Add Filter
    and select
    Policy Type
    IAM
    and
    Cloud Type
    GCP
    .
  2. Select the violating policy that you want to remediate.
  3. Investigate the policy violations.
  4. Get the remediation steps.
    Under the
    OPTIONS
    column, click
    Remediate
    .
    1. Copy the CLI commands.
      After you click
      Remediate
      the CLI commands appear in a popup window.
    2. Run the CLI commands on your GCP account. Before you run the CLI command, see Deny Policies.
      After you execute the CLI commands, the remediation process is complete and the excess privileges will be revoked.
