Use the Prisma Cloud Extension for AWS DevOps

Summary of the Prisma Cloud extension for AWS DevOps
With a Prisma Cloud Enterprise Edition license, you can integrate compliance and vulnerability checks into your AWS continuous integration/continuous delivery (CI/CD) and build environments. This extension enables you to scan Infrastructure-as-Code (IaC) templates such as AWS CFTs, Terraform templates, and Kubernetes deployment files against Prisma Cloud security policies. It also enables you to use Prisma Cloud Compute to scan container images for vulnerabilities.
The sections below show how to integrate the Prisma Cloud extension with your AWS CodePipeline pipelines and AWS CodeBuild projects.

Set Up IaC Scanning with AWS CodePipeline

You can customize your AWS CodePipeline to check Infrastructure-as-Code (IaC) templates. The following examples show you how to integrate IaC scanning into your CodePipeline.
You have two options for scanning your IaC templates against Prisma Cloud security policies: an AWS Lambda function with Python scripting, or a custom action with a Bash shell script.
Regardless of whether you use an AWS Lambda function or a custom action with a Bash shell script, the prerequisites for IaC scan integration are as follows:
  • You have a valid Prisma Cloud Enterprise Edition license
  • You have a valid AWS CodePipeline service role to give AWS CodePipeline access to other resources in your account.
  • You have configured a two-stage pipeline in AWS CodePipeline.
  • If your customization uses any AWS commands, then you have installed and configured the AWS command line interface.

Use an AWS Lambda Function with Python Scripting

The following table describes the variables you need to set for your Lambda function, whether you are using the AWS CLI or the AWS console to configure your Lambda function. If you use the AWS console, these variables are environment variables. If you use a script that invokes the AWS CLI, you may specify the variables directly in the script.
  • Prisma_Cloud_API_URL: Your Prisma Cloud API URL (e.g. https://api.prismacloud.io). The exact URL depends on the Prisma Cloud region and cluster of your tenant.
  • Access_Key: Your Prisma Cloud access key for API access. If you do not have an access key, you must Create and Manage Access Keys.
  • Secret_Key: The secret key that corresponds to your Prisma Cloud access key.
  • Asset_Name: Identifies the repository you want to scan.
  • Tags: Organizes the templates that are scanned with this service connection, for visibility on Prisma Cloud.
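To make the first three variables concrete, the sketch below builds the authentication request that the extension sends to Prisma Cloud to obtain an API token (the same `<API URL>/login` endpoint the poll.sh job worker later in this document calls). This is an illustrative sketch only; the helper name `build_login_request` is hypothetical and not part of the extension.

```python
import json

def build_login_request(api_url, access_key, secret_key):
    """Assemble the Prisma Cloud login request used to obtain an auth token.

    The /login endpoint and the username/password payload mirror what the
    poll.sh job worker does; this helper itself is illustrative.
    """
    url = api_url.rstrip("/") + "/login"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"username": access_key, "password": secret_key})
    return url, headers, body

# Placeholder credentials; substitute your own values.
url, headers, body = build_login_request(
    "https://api.prismacloud.io", "<accesskey>", "<secretkey>")
```

The token returned by this endpoint is then passed in the `x-redlock-auth` header on subsequent scan requests, as the poll.sh script below shows.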
  1. Create a Lambda function.
    This example shows you how to use the AWS command line interface to create a Lambda function that scans the IaC templates in your AWS CodePipeline against Prisma Cloud security policies.
    1. Create the .prismaCloud/config.yml file and add it to the root directory of your repository branch. The file is required, and it must include the template type, version, and the template specific parameters and tags you use in your environment.
    2. Download PrismaCloudIaCScan.zip to an accessible location.
      You can find the file at https://github.com/PaloAltoNetworks/Prisma-Cloud-DevOps-Security/blob/aws-codepipeline/aws-codepipeline/PrismaCloudIacScan/Lambda/PrismaCloudIaCScan.zip
    3. Run the following command to create your Lambda function.
      Note that you will need to set the variables directly in your script.
      export AWS_PROFILE=prisma-scan
      export AWS_DEFAULT_REGION=us-west-1
      export AWS_LAMBDA_FUNCTION=AWSCodePipeline-gn
      export AWS_ROLE=iam::/CustomPipelineWithLambda

      aws --profile ${AWS_PROFILE} --region ${AWS_DEFAULT_REGION} \
        lambda create-function \
        --role ${AWS_ROLE} \
        --function-name ${AWS_LAMBDA_FUNCTION} \
        --runtime python3.6 \
        --handler PrismaCloudIaCScan.lambda_handler \
        --environment Variables="{Prisma_Cloud_API_URL=<hostname>,Access_Key=<accesskey>,Secret_Key=<secretkey>,Asset_Name=<assetname>,Tags=<tags>}" \
        --zip-file fileb://PrismaCloudIaCScan.zip
    If you prefer to use the AWS console instead of the AWS CLI to create your Lambda function, you can use the steps below.
    1. Create the .prismaCloud/config.yml file and add it to the root directory of your repository branch. The file is required, and it must include the template type, version, and the template specific parameters and tags you use in your environment.
    2. In the AWS console, set the environment variables listed at the beginning of Use an AWS Lambda Function with Python Scripting.
    3. Open the AWS Lambda console and navigate to the Create function page.
    4. Provide a function name (e.g. LambdaFunctionForAWSCodePipeLine).
    5. Choose a runtime of either Python 3.6 or Python 3.7.
    6. Either create a new execution role or choose an existing role that has the proper permissions.
      The proper permissions are:
      • Write permission for AWS Code Pipeline
      • List, Read, and Write permissions for AWS Cloudwatch Logs
      • Read permission for your S3 bucket if it is your data source
    7. Select Create function.
    8. Set the handler to PrismaCloudIaCScan.lambda_handler.
      The handler is defined in Basic Settings.
    9. Choose the Execution role (optional).
    10. Set a timeout.
    11. Select Save.
  2. Add the Lambda function to your pipeline.
    1. In the AWS console, navigate to Services > Developer Tools > CodePipeline. Choose your pipeline and select Edit.
    2. Between any two stages, select + Add Stage and provide a stage name of your choice.
    3. Select + Add action group. In Edit action, provide the information required to define a custom action.
      The table below identifies the fields that have values specific to Prisma Cloud. The value for User parameters is in JSON format and specifies the conditions under which the pipeline job will fail. For the example in the table, the job fails if the extension finds one high-severity violation, two medium-severity violations, or five low-severity violations.
      • Action provider: AWS Lambda
      • Function name: The function name you used when you created the Lambda function (e.g. PrismaCloudIaCScan)
      • User parameters: Example:
        {"FailureCriteria": {"High": 1,"Medium": 2,"Low": 5,"Operator": "or"}}
        Example with Tags:
        {"FailureCriteria": {"High": 1,"Medium": 2,"Low": 5,"Operator": "or"}, "Tags": ["team: devOps", "env: test"]}
        Valid values for "Operator" are "or" and "and".
    4. Select Done.
    5. Review the results after you've executed your pipeline.
      To start a pipeline manually through the console, select Release change on the pipeline details page.
      Select the execution details link to view the latest CloudWatch logs and any security violations that Prisma Cloud identified.
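The failure decision encoded in the User parameters can be sketched as follows. This is a minimal illustration of the "count meets or exceeds a threshold, combined with the or/and Operator" behavior described above; `pipeline_fails` is a hypothetical helper, not code from the extension.

```python
def pipeline_fails(user_params, high, medium, low):
    """Return True if the scan's severity counts meet or exceed the
    FailureCriteria thresholds, combined with the "or"/"and" Operator.
    Sketch only; mirrors the documented User parameters semantics."""
    fc = user_params["FailureCriteria"]
    checks = [high >= fc["High"], medium >= fc["Medium"], low >= fc["Low"]]
    if fc.get("Operator", "or") == "or":
        return any(checks)
    return all(checks)

# The example from the table: fail on 1 high OR 2 medium OR 5 low violations.
criteria = {"FailureCriteria": {"High": 1, "Medium": 2, "Low": 5, "Operator": "or"}}
```

With `"Operator": "and"`, all three thresholds must be met before the job fails; with `"or"`, any single one is enough.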

Use a Custom Action with Bash

  1. Create a custom action.
    The following example shows how to use an AWS custom action with a Bash shell script to check your IaC templates against Prisma Cloud security policies.
    It is assumed that your source is in a GitHub repository.
    1. If it's not already installed, install jq, version 1.6 or higher, on the EC2 instance or system where your job worker will run.
      jq is available at https://stedolan.github.io/jq/.
      The job worker script also invokes yq to read the config.yml file, so ensure yq is installed as well.
    2. If your job worker runs in an EC2 instance, ensure your EC2 instance user has permission to run CodePipeline.
    3. Ensure the AWS CLI is available where your job worker runs.
      The job worker uses the following:
      • aws codebuild
      • aws codepipeline
    4. Create a file CustomAction.json in a working location, such as your EC2 instance, and copy the following content to that file.
      {
        "category": "Test",
        "provider": "Prisma-Cloud-IaC-Scan",
        "version": "1",
        "settings": {
          "entityUrlTemplate": "https://s3.console.aws.amazon.com/s3/buckets/{Config:S3BucketName}/?region={Config:S3BucketRegion}&tab=overview",
          "executionUrlTemplate": "https://s3.console.aws.amazon.com/s3/buckets/{Config:S3BucketName}/?region={Config:S3BucketRegion}&tab=overview"
        },
        "configurationProperties": [
          { "name": "S3BucketName", "required": true, "key": true, "secret": false, "queryable": false, "description": "The S3 bucket name. The results with the vulnerabilities will be stored in this bucket.", "type": "String" },
          { "name": "S3BucketRegion", "required": true, "key": true, "secret": false, "queryable": false, "description": "The S3 bucket region.", "type": "String" },
          { "name": "Prisma_Cloud_API_URL", "required": true, "key": true, "secret": false, "queryable": false, "description": "Prisma Cloud server URL", "type": "String" },
          { "name": "Access_Key", "required": true, "key": true, "secret": false, "queryable": false, "description": "Prisma Cloud access key", "type": "String" },
          { "name": "Secret_Key", "required": true, "key": true, "secret": true, "queryable": false, "description": "Prisma Cloud secret key", "type": "String" },
          { "name": "Asset_Name", "required": true, "key": true, "secret": false, "queryable": false, "description": "Provide the asset name for the pipeline", "type": "String" },
          { "name": "Failure_Criteria", "required": true, "key": true, "secret": false, "queryable": false, "description": "Provide failure threshold for high, medium and low severity security issues along with the operator. Ex. high:5, medium:0, low:2, op:or", "type": "String" },
          { "name": "Tags", "required": false, "key": true, "secret": false, "queryable": false, "description": "Provide the tags for the IaC Scan task.", "type": "String" }
        ],
        "inputArtifactDetails": { "maximumCount": 1, "minimumCount": 0 },
        "outputArtifactDetails": { "maximumCount": 1, "minimumCount": 0 }
      }
      Optionally, you can edit the provider and version fields, but do not modify the configurationProperties field.
    5. Execute the following AWS command to create the custom action.
      aws codepipeline create-custom-action-type --cli-input-json \
        file://CustomAction.json
    6. Create the required IAM policies.
      • Navigate to IAM > Policies, and create a policy to transfer files to and from the S3 bucket.
        This policy enables the worker to pull build artifacts from the S3 bucket for scanning and to publish logs to the bucket.
      • Create the scan job worker for custom actions.
        • Execute aws configure and set the default output format to JSON.
        • Copy the job worker shell script poll.sh to your local machine or the EC2 instance, depending on where the job worker runs.
          Make sure your EC2 instance user has permission to run your pipeline and the job worker has permission to access CodePipeline.
        • Execute the job worker shell script with the following command:
          ./poll.sh "category=Test,owner=Custom,version=1,provider=Prisma-Cloud-IaC-Scan"
        The job worker is now configured to listen for requests from CodePipeline.
  2. Add the custom action to your pipeline.
    1. In the AWS console, navigate to Services > Developer Tools > CodePipeline > Create/Edit Pipeline to add your scan custom action to your pipeline.
      To add your custom action to your pipeline as a test step, navigate to Test > Test provider, and select Custom Action.
    2. Configure the values for the pipeline.
      • Input artifacts: The output artifact from the previous step
      • Prisma_Cloud_API_URL: Your Prisma Cloud API URL (e.g. https://api.prismacloud.io). The exact URL depends on the region and cluster of your Prisma Cloud tenant
      • Access_Key: Your Prisma Cloud access key for API access
      • Secret_Key: Your Prisma Cloud secret key
      • Asset_Name: Identifies the repository you want to scan
      • Tags: Organizes templates that are scanned, for visibility on Prisma Cloud. Example: env:test,team:devOps
      • Failure_Criteria: Failure criteria for high, medium, and low severity issues. Example: high:0, med:0, low:0, operator:or
      • S3BucketName: Although this field is not specific to Prisma Cloud, a valid S3 bucket name is required for this custom action
      • S3BucketRegion: Although this field is not specific to Prisma Cloud, a valid S3 bucket region is required for this custom action
    3. Save the pipeline changes.
  3. Test your pipeline.
    You can use the AWS console to release the pipeline manually. After your stage completes, you can view the results of the checks against Prisma Cloud security policies in the log report in S3 by selecting the Details link.
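The Failure_Criteria string configured above (for example, high:0, med:0, low:0, operator:or) is read by the poll.sh job worker positionally, comma part by comma part, with the text before each colon ignored. The sketch below mirrors that parsing; `parse_failure_criteria` is a hypothetical helper, not code shipped with the extension.

```python
def parse_failure_criteria(value):
    """Parse a Failure_Criteria string such as "high:0, med:0, low:0, operator:or".

    Like the poll.sh job worker, this reads the comma-separated parts
    positionally (high, medium, low, operator) and ignores the key names.
    Missing input falls back to the worker's defaults: 0, 0, 0, "or".
    Sketch only.
    """
    if not value:
        return {"high": 0, "medium": 0, "low": 0, "operator": "or"}
    parts = [p.split(":", 1)[1].strip() for p in value.split(",")]
    return {"high": int(parts[0]), "medium": int(parts[1]),
            "low": int(parts[2]), "operator": parts[3]}
```

Because parsing is positional, the parts must always appear in the order high, medium, low, operator, regardless of the labels used.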

Set Up Container Image Scanning with AWS CodeBuild

Set up AWS CodeBuild to run Prisma Cloud Compute scans
You can enable container image scanning with Prisma Cloud Compute. Add the following steps to your normal AWS CodeBuild build project set-up steps to add container scans to your build project.
  1. On the Prisma Cloud Compute Console, add a vulnerability scan rule.
    1. Select Compute > Defender > Vulnerabilities > Images > CI.
    2. Select Add Rule and enter a rule name.
    3. Specify the Alert and Failure thresholds.
      You can set the vulnerability scan to fail on critical, high, medium, or low severity. The failure threshold must be greater than the alert threshold.
    4. Specify the Grace period.
      The grace period is the number of days you want to allow for fixing a vulnerability before it counts against your failure threshold.
    For more information about these settings, see the Prisma Cloud Compute Guide.
  2. Use the following example as your AWS buildspec file, buildspec.yml, which is in the root directory of your source.
    This file runs the twistcli command to scan the specified container image for vulnerabilities.
    The following example splits some of the lines of code for documentation formatting. If you choose to copy this example directly, ensure the commands are not split into multiple lines in your code.
    version: 0.2
    # In this example, we're using environment variables to store the username and
    # password of our Prisma Cloud Compute CI user account and the URL to our console.
    # PC_COMPUTE_USER: The Prisma Cloud Compute user with the CI User role
    # PC_COMPUTE_PASS: The password for this user account
    # PC_COMPUTE_CONSOLE_URL: The base URL for the console -- http://console.<my_company>.com:8083 -- without a trailing /
    phases:
      install:
        runtime-versions:
          docker: 18
      build:
        commands:
          - echo Build started on `date`
          - echo Building the Docker image..$IMAGE_TAG
          - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
      post_build:
        commands:
          - echo Build completed on `date`
          - curl -k -u $PC_COMPUTE_USER:$PC_COMPUTE_PASS --output ./twistcli $PC_COMPUTE_CONSOLE_URL/api/v1/util/twistcli
          - chmod +x ./twistcli
          - echo Scanning with twistcli
          # Run the scan with twistcli, providing detailed results in CodeBuild and
          # pushing the results to the Prisma Cloud Compute console.
          # --details returns all vulnerabilities and compliance issues rather than just summaries.
          # --address points to our Prisma Cloud Compute console.
          # -u and -p provide credentials for the console. These creds only need the CI User role.
          # Finally, we provide the name of the image we built with 'docker build', above.
          - ./twistcli images scan --details --address $PC_COMPUTE_CONSOLE_URL -u $PC_COMPUTE_USER -p $PC_COMPUTE_PASS $IMAGE_REPO_NAME:$IMAGE_TAG
          # See twistcli documentation for more details.
  3. In your AWS CodeBuild build project, set the following environment variables, which the sample buildspec.yml file will use.
    • PC_COMPUTE_USER: Prisma Cloud Compute user with the CI User role
    • PC_COMPUTE_PASS: Password for this user account
    • PC_COMPUTE_CONSOLE_URL: Base URL for the Prisma Cloud Compute console (e.g. http://console.<example>.com:8083)
    • IMAGE_REPO_NAME: Docker repository for the image to be scanned for vulnerabilities
    • IMAGE_TAG: Docker tag for the image to be scanned for vulnerabilities
  4. View the results of the container image scan.
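The twistcli invocation in the buildspec can be assembled programmatically from the same environment variables, which is useful if you wrap the scan in a script of your own. This is a sketch that mirrors the post_build step above; the helper name `build_twistcli_scan_command` is hypothetical.

```python
def build_twistcli_scan_command(env):
    """Assemble the twistcli scan command line the buildspec runs,
    from the same environment variables (sketch; mirrors the
    post_build step, not an official API)."""
    image = "{}:{}".format(env["IMAGE_REPO_NAME"], env["IMAGE_TAG"])
    return ["./twistcli", "images", "scan", "--details",
            "--address", env["PC_COMPUTE_CONSOLE_URL"],
            "-u", env["PC_COMPUTE_USER"],
            "-p", env["PC_COMPUTE_PASS"],
            image]
```

In a build step you could pass the returned list to subprocess.run and fail the build on a non-zero exit code, which is how twistcli signals that the failure threshold was met.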

poll.sh

#!/bin/bash
set -u
set -e
trap "echo ERR; exit" ERR
# exec&> >(while read line; do echo "$(date +'%h %d %H:%M:%S') $line" >> cmds.log; done;)
#set -x

if [[ -z "${1:-}" ]]; then
  echo "Usage: ./poll.sh <action type id>" >&2
  echo -e "Example:\n ./poll.sh \"category=Test,owner=Custom,version=1,provider=Prisma-Cloud-IaC-Scan\"" >&2
  exit 1
fi

echo_ts() {
  echo -e "\n" >> Prisma_Cloud_IaC_Scan.log
  echo "$1" >> Prisma_Cloud_IaC_Scan.log
}

run() {
  local action_type_id="$1"
  echo_ts "actiontypeid: $action_type_id"
  local job_json="$(fetch_job "$action_type_id")"
  if [[ "$job_json" != "null" && "$job_json" != "None" && "$job_json" != "" ]]; then
    local job_id="$(echo "$job_json" | jq -r '.id')"
    echo "job_id: $job_id"
    mkdir "$job_id"
    chmod +x "$job_id"
    cd "$job_id" || update_job_status "$job_json" "job id not found"
    acknowledge_job "$job_json"
    echo "job_json: $job_json"
    local build_json=$(create_build "$job_json")
  else
    sleep 10
  fi
}

acknowledge_job() {
  local job_json="$1"
  local job_id="$(echo "$job_json" | jq -r '.id')"
  local nonce="$(echo "$job_json" | jq -r '.nonce')"
  echo_ts "Acknowledging CodePipeline job (id: $job_id nonce: $nonce)" >&2
  aws codepipeline acknowledge-job --job-id "$job_id" --nonce "$nonce" > /dev/null 2>&1
}

fetch_job() {
  local action_type_id="$1"
  aws codepipeline poll-for-jobs --max-batch-size 1 \
    --action-type-id "$action_type_id" \
    --query 'jobs[0]'
}

action_configuration_value() {
  local job_json="$1"
  local configuration_key="$2"
  echo "$job_json" | jq -r ".data.actionConfiguration.configuration | .[\"$configuration_key\"]"
}

update_job_status() {
  local job_json="$1"
  local build_state="$2"
  local job_id="$(echo "$job_json" | jq -r '.id')"
  echo_ts "Updating CodePipeline job with '$build_state' and job_id '$job_id' result" >&2
  if [[ "$build_state" == *"succeeded"* ]]; then
    aws codepipeline put-job-success-result \
      --job-id "$job_id" \
      --execution-details "summary=$build_state,externalExecutionId=$job_id,percentComplete=100"
  else
    aws codepipeline put-job-failure-result \
      --job-id "$job_id" \
      --failure-details "type=JobFailed,message=Build $build_state,externalExecutionId=$job_id"
  fi
}

decide_job_status() {
  local job_json="$1"
  local stats="$2"
  local in_high="$3"
  local in_med="$4"
  local in_low="$5"
  local in_oper="$6"
  local resp_high="$(echo "$stats" | jq -r '.high')"
  local resp_med="$(echo "$stats" | jq -r '.medium')"
  local resp_low="$(echo "$stats" | jq -r '.low')"
  if [[ $in_oper == null ]]; then in_oper="or"; fi
  if [[ $in_high == null ]]; then in_high=0; fi
  if [[ $in_med == null ]]; then in_med=0; fi
  if [[ $in_low == null ]]; then in_low=0; fi
  if [[ $stats != null ]]; then
    if [[ "$in_oper" == "or" && ( "$resp_high" -ge "$in_high" || "$resp_med" -ge "$in_med" || "$resp_low" -ge "$in_low" ) ]]; then
      local failure_message="Prisma Cloud IaC scan failed with issues as security issues count (high: $resp_high, medium: $resp_med, low: $resp_low) meets or exceeds failure criteria (high: $in_high, medium: $in_med, low: $in_low)"
      echo_ts "$failure_message"
      update_job_status "$job_json" "failed: Prisma Cloud IaC scan failed with issues."
    elif [[ "$in_oper" == "and" && ( "$resp_high" -ge "$in_high" && "$resp_med" -ge "$in_med" && "$resp_low" -ge "$in_low" ) ]]; then
      local failure_message="Prisma Cloud IaC scan failed with issues as security issues count (high: $resp_high, medium: $resp_med, low: $resp_low) meets or exceeds failure criteria (high: $in_high, medium: $in_med, low: $in_low)"
      echo_ts "$failure_message"
      update_job_status "$job_json" "failed: Prisma Cloud IaC scan failed with issues."
    else
      local partial_success="Prisma Cloud IaC scan succeeded with issues as security issues count (high: $resp_high, medium: $resp_med, low: $resp_low) does not exceed failure criteria (high: $in_high, medium: $in_med, low: $in_low)"
      echo_ts "$partial_success"
      update_job_status "$job_json" "succeeded: Prisma Cloud IaC scan succeeded with issues."
    fi
  else
    update_job_status "$job_json" "success"
  fi
}

create_build() {
  local job_json="$1"
  local job_id="$(echo "$job_json" | jq -r '.id')"
  local pipelineName="$(echo "$job_json" | jq -r '.data.pipelineContext.pipelineName')"
  echo "pipelineName: $pipelineName"
  local display=""
  local s3_bucket=$(action_configuration_value "$job_json" "S3BucketName")
  local bucketName="$(echo "$job_json" | jq -r '.data.inputArtifacts[0].location.s3Location.bucketName')"
  local object_key="$(echo "$job_json" | jq -r '.data.inputArtifacts[0].location.s3Location.objectKey')"
  local output_object="$(echo "$job_json" | jq -r '.data.outputArtifacts[0].location.s3Location.objectKey')"
  local console_url="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Prisma_Cloud_API_URL')"
  local access_key="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Access_Key')"
  local secret_key="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Secret_Key')"

  aws codepipeline get-pipeline --name "$pipelineName" > pipelineDetails.json
  jq '.pipeline.stages[] | select(.name == "Source")' pipelineDetails.json > source.json
  local user_id="$(jq -r '.actions[].configuration.Owner' source.json)"
  local project_name="$(jq -r '.actions[].configuration.Repo' source.json)"

  if [ -z "$console_url" ]; then
    echo_ts "Please enter a valid Prisma Cloud API URL in the plugin input parameters. For details, refer to the plugin documentation."
    update_job_status "$job_json" "Please enter a valid Prisma Cloud API URL in the plugin input parameters."
    exit 1
  fi

  local err_400="Invalid credentials: please verify that the API URL, Access Key and Secret Key in the Prisma Cloud plugin settings are valid. For details refer to https://docs.paloaltonetworks.com/prisma/prisma-cloud/prisma-cloud-admin/prisma-cloud-devops-security/use-the-prisma-cloud-extension-for-aws-codepipeline.html"
  local err_500="Oops! Something went wrong, please try again or refer to documentation on https://docs.paloaltonetworks.com/prisma/prisma-cloud/prisma-cloud-admin/prisma-cloud-devops-security/use-the-prisma-cloud-extension-for-aws-codepipeline.html"

  echo "executing login api"
  local login_url="${console_url}/login"
  local req_cmd=$(curl -k -i -X POST "$login_url" -H "Content-Type: application/json" --user-agent "AWS-CodePipeline-CustomAction/2.0.0" -d "{\"username\":\"${access_key}\",\"password\":\"${secret_key}\"}") || update_job_status "$job_json" "$err_500"

  http_status=$(echo "$req_cmd" | grep HTTP | awk '{print $2}')
  echo "http status: $http_status"
  if [[ -z "$http_status" ]]; then
    echo_ts "$err_500" >&2
    update_job_status "$job_json" "error"
    exit 1
  fi
  if [[ "$http_status" == 400 || "$http_status" == 401 ]]; then
    echo_ts "$err_400" >&2
    update_job_status "$job_json" "error"
    exit 1
  fi
  if [[ "$http_status" -ge 500 ]]; then
    echo "http_status: $http_status"
    echo_ts "$err_500" >&2
    update_job_status "$job_json" "error"
    exit 1
  fi

  output_response=$(echo "$req_cmd" | grep token)
  local token="$(echo "$output_response" | jq .token | tr -d '"')"

  local scan_location="$bucketName/$object_key"
  aws s3 cp "s3://$scan_location" . || update_job_status "$job_json" "Copy Object from S3 bucket failed"
  local file=( *.zip )
  mv "$file" artifact.zip
  file_size="$(wc -c artifact.zip | awk '{print $1}')"
  file_size_limit=1000000
  if [[ "$file_size" -gt "$file_size_limit" ]]; then
    printf "\nArtifact size of %s exceeds the supported limit of %s bytes.\n" "$project_name" "$file_size_limit"
    exit 1
  fi

  mkdir .prismaCloud
  unzip -p artifact.zip .prismaCloud/config.yml > .prismaCloud/config.yml
  if [[ ! -f .prismaCloud/config.yml ]]; then
    echo "Cannot find config.yml under the .prismaCloud folder in repo $project_name. Please make sure the file is present in the correct format (refer: https://docs.paloaltonetworks.com/prisma/prisma-cloud/prisma-cloud-admin/prisma-cloud-devops-security/use-the-prisma-cloud-extension-for-aws-devops.html) at the root of your repo under the .prismaCloud folder."
    exit 1
  fi

  ##################################################################
  # Generate the url and the headers
  ##################################################################
  headers=""
  url=""
  fileContents=$(yq read -j .prismaCloud/config.yml)
  templateType="$(echo "$fileContents" | jq -r '.template_type')"
  if [[ ! -z "$templateType" && ( "$templateType" == "TF" || "$templateType" == "tf" ) ]]; then
    url="$console_url/iac/tf/v1/scan"
    terraformVersion="$(echo "$fileContents" | jq -r '.terraform_version')"
    if [[ ! -z "$terraformVersion" && ( "$terraformVersion" == 0.12 || "$terraformVersion" > 0.12 ) ]]; then
      headers+=" -H terraform-version:0.12"
      isTerraform12ParamsPresent="$(echo "$fileContents" | jq -r '.terraform_012_parameters')"
      if [[ "$isTerraform12ParamsPresent" != null ]]; then
        terraformContents="$(echo "$fileContents" | jq -r '.terraform_012_parameters[] |= with_entries( .key |= gsub("root_module"; "root-module") )' | jq -r '.terraform_012_parameters[] |= with_entries( .key |= gsub("variable_files"; "variable-files") )' )"
        terraform012Parameters="$(echo "$terraformContents" | jq -r '.terraform_012_parameters' | tr -d '\n\t' | tr -d '[:blank:]')"
        if [[ "$terraform012Parameters" != null ]]; then
          headers+=" -H terraform-012-parameters:$terraform012Parameters"
        fi
      fi
    else
      headers+=" -H terraform-version:0.11"
      # Read terraform 0.11 parameters.
      variableFiles="$(echo "$fileContents" | jq -r '.terraform_011_parameters.variable_files' | tr -d '\n\t' | tr -d '[:blank:]')"
      variableValues="$(echo "$fileContents" | jq -r '.terraform_011_parameters.variable_values' | tr -d '\n\t' | tr -d '[:blank:]')"
      if [[ "$variableFiles" != null ]]; then
        headers+=" -H rl-variable-file-names:$variableFiles"
      fi
      if [[ "$variableValues" != null ]]; then
        headers+=" -H rl-parameters:$variableValues"
      fi
    fi
  elif [[ ! -z "$templateType" && ( "$templateType" == "CFT" || "$templateType" == "cft" ) ]]; then
    url="$console_url/iac/cft/v1/scan"
    variableValues="$(echo "$fileContents" | jq -r '.cft_parameters.variable_values' | tr -d '\n\t' | tr -d '[:blank:]')"
    if [[ "$variableValues" != null ]]; then
      headers+=" -H rl-parameters:$variableValues"
    fi
  elif [[ ! -z "$templateType" && ( "$templateType" == "K8S" || "$templateType" == "k8s" || "$templateType" == "K8s" ) ]]; then
    url="$console_url/iac/k8s/v1/scan"
  else
    echo "No valid template-type found in the config.yml file in repo $project_name. Please specify one of these values: TF, CFT or K8s as the template_type variable in the config.yml."
    exit 1
  fi

  ##################################################################
  # Metadata
  ##################################################################
  # Tags
  task_tags="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Tags')"
  repo_tags="$(echo "$fileContents" | jq -r '.tags' | tr -d '\n\t' | tr -d '[:blank:]')"
  prisma_tags=""
  if [[ ! -z "$repo_tags" ]]; then
    prisma_tags+="\"repo_tags\":$repo_tags"
  fi
  if [[ ! -z "$task_tags" ]]; then
    temp="\"$(sed 's/,/","/g' <<< "$task_tags")\""
    if [[ "$prisma_tags" == "" ]]; then
      prisma_tags+="\"task_tags\":[$temp]"
    else
      prisma_tags+=", \"task_tags\":[$temp]"
    fi
  fi

  local stage_name="$(echo "$job_json" | jq -r '.data.pipelineContext.stage.name')"
  local action_name="$(echo "$job_json" | jq -r '.data.pipelineContext.action.name')"
  local asset_name="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Asset_Name')"

  ##################################################################
  # Check failure criteria exists; if not, default to 0,0,0,or
  ##################################################################
  local failure_criteria="$(echo "$job_json" | jq -r '.data.actionConfiguration.configuration.Failure_Criteria')"
  if [[ -z "$failure_criteria" ]]; then
    failure_criteria_high_severity=0
    failure_criteria_medium_severity=0
    failure_criteria_low_severity=0
    failure_criteria_operator="or"
  else
    echo "failure criteria:" $failure_criteria
    failure_criteria_removed_spaces=$(printf '%s' $failure_criteria)
    delimiter=,
    s=$failure_criteria_removed_spaces$delimiter
    array=()
    while [[ $s ]]; do
      array+=( "${s%%"$delimiter"*}" )
      s=${s#*"$delimiter"}
    done
    failure_criteria_high_severity=$(awk -F':' '{print $2}' <<< "${array[0]}")
    failure_criteria_medium_severity=$(awk -F':' '{print $2}' <<< "${array[1]}")
    failure_criteria_low_severity=$(awk -F':' '{print $2}' <<< "${array[2]}")
    failure_criteria_operator=$(awk -F':' '{print $2}' <<< "${array[3]}")
  fi

  # Metadata
  metadata_json={"asset-name":"$asset_name","asset-type":"AWS-CodePipeline","user-id":"${user_id}","prisma_tags":{"$prisma_tags"},"scan-attributes":{"project-name":"${project_name}","pipeline-details":{"pipeline-name":"$pipelineName","job-id":"$job_id","stage-name":"$stage_name","action-name":"$action_name"}},"failure-criteria":{"high":"$failure_criteria_high_severity","medium":"$failure_criteria_medium_severity","low":"$failure_criteria_low_severity","operator":"$failure_criteria_operator"}}
  echo metadata "$metadata_json"

  ##################################################################
  # IaC Scan Execution
  ##################################################################
  echo "Executing the scan api"
  local response="$(curl -k -X POST "$url" -H "x-redlock-auth:${token}" --user-agent "AWS-CodePipeline-CustomAction/2.0.0" $headers -H "x-redlock-iac-metadata:${metadata_json}" -F templateFile=@artifact.zip)" || update_job_status "$job_json" "Call from API failed"
  local result="$(echo "$response" | jq -r '.result.is_successful')"
  if [[ "$result" == "true" ]]; then
    local partial_failure="$(echo "$response" | jq -r '.result.partial_failure')"
    local matched="$(echo "$response" | jq -r '.result.rules_matched')"
    if [[ $matched != null ]]; then
      local stats="$(echo "$response" | jq -r '.result.severity_stats')"
      decide_job_status "$job_json" "$stats" "$failure_criteria_high_severity" "$failure_criteria_medium_severity" "$failure_criteria_low_severity" "$failure_criteria_operator"
      display="$(echo "$matched" | jq -r 'sort_by(.severity) | (["SEVERITY","NAME","FILES"] | (., map(length*"-"))), (.[] | [.severity, .name, .files[0]]) | join(",")' | column -t -s ",")" || update_job_status "$job_json" "Unknown Error"
      if [[ ! -z "$partial_failure" ]]; then
        display+="\n$partial_failure"
      fi
    else
      echo_ts "Good job! Prisma Cloud did not detect any issues."
    fi
  else
    local error_message="$(echo "$response" | jq -r '.result.error_details')"
    echo_ts "$error_message"
    update_job_status "$job_json" "$error_message"
    exit 1
  fi
  echo_ts "$display" >&2
  aws s3 cp Prisma_Cloud_IaC_Scan.log "s3://$s3_bucket/Prisma_Cloud_IaC_Scan_$job_id.log" || update_job_status "$job_json" "upload results to S3 bucket failed"
  rm -fr "$job_id"
}

run "$1"
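The tail end of create_build in poll.sh reduces to a small decision: report an error when the scan itself failed, success when no rules matched, and otherwise compare severity_stats against the failure thresholds. The sketch below summarizes that flow in one hypothetical function; it mirrors the script's "meets or exceeds" semantics but is not part of the shipped worker.

```python
def job_status(response, thresholds, operator="or"):
    """Summarize the poll.sh decision flow for an IaC scan response (sketch).

    thresholds is a dict like {"high": 1, "medium": 2, "low": 5}.
    A count that meets or exceeds its threshold counts as a hit,
    and hits are combined with "or"/"and", as in decide_job_status.
    """
    result = response.get("result", {})
    if not result.get("is_successful"):
        return "error"
    if not result.get("rules_matched"):
        return "success"
    stats = result.get("severity_stats", {})
    checks = [stats.get("high", 0) >= thresholds["high"],
              stats.get("medium", 0) >= thresholds["medium"],
              stats.get("low", 0) >= thresholds["low"]]
    met = any(checks) if operator == "or" else all(checks)
    return "failed" if met else "succeeded with issues"
```

Note that, as in the shell script, the worker's default thresholds of 0 mean any matched rule reaches the "meets or exceeds" condition, so set explicit thresholds if you want a more permissive pipeline.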
