WAAS bot protection provides visibility into bots and other automation frameworks accessing protected web applications and APIs.
Bots are classified into the following categories:
- Search Engine Crawlers - Bots that systematically crawl the world wide web to index pages for online search. Also known as spider bots or web crawlers.
- Business Analytics Bots - Bots that crawl, extract, and index business-related information.
- Educational Bots - Bots that crawl, extract, and index information for educational purposes, such as academic search engines.
- News Bots - Bots that crawl, extract, and index the latest news articles, usually for news aggregation services.
- Financial Bots - Bots that crawl, extract, and index financial data.
- Content Feed Clients - Automated tools, services, or end-user clients that fetch web content for feed readers.
- Archiving Bots - Bots that crawl, extract, and archive website information.
- Career Search Bots - Automated tools or online services that extract and index job-related postings.
- Media Search Bots - Bots that crawl, extract, and index media content for search engines.
The following categories cover bots and other automation frameworks that cannot be classified by their activity or origin:
- Generic Bots - Clients with attributes that indicate an automated bot.
- Web Automation Tools - Scriptable headless web browsers and similar web automation tools.
- Web Scrapers - Automated tools or services that scrape website content.
- API Libraries - Software code libraries for web API communications.
- HTTP Libraries - Software code libraries for HTTP transactions.
- Request Anomalies - HTTP requests with anomalies that are not expected from common web browsers.
- Bot Impersonators - Bots and automation tools impersonating known good bots to evade rate limiting and other restrictions.
- Browser Impersonators - Automated tools or services that impersonate common web browser software.
Users can create custom signatures based on HTTP headers and source IPs. User-defined signatures are useful for tracking customer-specific bots, self-developed automation clients, and traffic that appears suspicious.
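To illustrate how such a signature could be evaluated, here is a minimal sketch in Python. The signature schema and field names below are hypothetical, not the actual WAAS format; it assumes a signature combines a header pattern (with wildcards) and a set of source CIDRs, as described above.

```python
import fnmatch
import ipaddress

# Hypothetical user-defined signature; the field names are illustrative,
# not the real WAAS schema.
SIGNATURE = {
    "header_name": "User-Agent",
    "header_values": ["MyCompanyBot/*", "internal-monitor/*"],  # wildcards allowed
    "source_cidrs": ["10.0.0.0/8"],
}

def matches_signature(headers: dict, source_ip: str, sig: dict = SIGNATURE) -> bool:
    """Return True if the request matches the user-defined bot signature."""
    # Match the named header against any of the wildcard patterns.
    value = headers.get(sig["header_name"], "")
    header_hit = any(fnmatch.fnmatch(value, pat) for pat in sig["header_values"])
    # Match the source IP against any of the configured CIDR ranges.
    ip = ipaddress.ip_address(source_ip)
    ip_hit = any(ip in ipaddress.ip_network(cidr) for cidr in sig["source_cidrs"])
    return header_hit and ip_hit
```

For example, `matches_signature({"User-Agent": "MyCompanyBot/1.2"}, "10.1.2.3")` matches, while the same client arriving from an address outside `10.0.0.0/8` does not.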
WAAS uses static and active methods to detect bots.
Static detection examines each incoming HTTP request and analyzes it to determine whether it was sent by a bot.
Prisma Session Cookies set by WAAS are encrypted and signed to prevent cookie tampering. In addition, the cookies include advanced protections against cookie replay attacks, in which cookies are harvested and reused by other clients.
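The general idea behind signed, replay-resistant session cookies can be sketched as follows. This is a simplified illustration of the technique, not WAAS's actual implementation: it signs the payload with an HMAC to detect tampering, and binds the cookie to a client fingerprint and an issue time to hinder replay from other clients. The secret key and fingerprint scheme are assumptions for the example.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"server-side-secret"  # illustrative only; WAAS manages its own keys

def issue_cookie(client_fingerprint: str) -> str:
    """Issue a cookie bound to a client fingerprint and timestamp."""
    payload = base64.urlsafe_b64encode(
        json.dumps({"fp": client_fingerprint, "ts": int(time.time())}).encode()
    )
    # Sign the payload so any modification is detectable.
    sig = base64.urlsafe_b64encode(hmac.new(SECRET, payload, hashlib.sha256).digest())
    return (payload + b"." + sig).decode()

def validate_cookie(cookie: str, client_fingerprint: str, max_age: int = 3600) -> bool:
    """Reject tampered cookies, replayed cookies, and expired cookies."""
    try:
        payload_b64, sig_b64 = cookie.encode().split(b".")
    except ValueError:
        return False
    expected = base64.urlsafe_b64encode(hmac.new(SECRET, payload_b64, hashlib.sha256).digest())
    if not hmac.compare_digest(sig_b64, expected):
        return False  # tampered
    data = json.loads(base64.urlsafe_b64decode(payload_b64))
    # A cookie presented by a different client, or long after issue, is rejected.
    return data["fp"] == client_fingerprint and time.time() - data["ts"] < max_age
```

A cookie harvested from one client fails validation when presented with another client's fingerprint, and any byte-level modification invalidates the signature.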
Deploying Bot Protection
- If request anomalies are enabled, choose a sensitivity threshold:
- Strict enforcement - high sensitivity (a few anomalies suffice to classify the client as a bot).
- Moderate enforcement - medium sensitivity.
- Lax enforcement - low sensitivity.
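The sensitivity levels above can be thought of as thresholds on an anomaly count, as in this sketch. The specific anomaly checks and threshold values are illustrative assumptions; WAAS's actual heuristics are internal to the product.

```python
# Illustrative thresholds: strict flags a client after fewer anomalies.
THRESHOLDS = {"strict": 1, "moderate": 3, "lax": 5}

def count_anomalies(headers: dict) -> int:
    """Count request attributes not expected from common web browsers (examples only)."""
    anomalies = 0
    if "User-Agent" not in headers:
        anomalies += 1  # browsers always send a User-Agent
    if "Accept-Language" not in headers:
        anomalies += 1  # browsers send language preferences
    if headers.get("Connection", "").lower() == "close":
        anomalies += 1  # browsers keep connections alive
    return anomalies

def is_bot(headers: dict, sensitivity: str = "moderate") -> bool:
    """Classify the request as bot traffic once the anomaly count reaches the threshold."""
    return count_anomalies(headers) >= THRESHOLDS[sensitivity]
```

Under strict enforcement a single missing browser header is enough to classify the request as a bot, while lax enforcement tolerates several anomalies.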
- Create a bot signature using a combination of the following fields:
- HTTP Header Name - the HTTP header name to include in the signature.
- Header Values - a comma-separated list of values to match in the HTTP header. Wildcards are allowed.
Enabling active detection
- Choose actions to apply.
- Session Validation - the action to apply when WAAS cannot validate the session, whether due to cookie tampering or cookie replay.
- reCAPTCHA v2 Integration - enable Google's reCAPTCHA v2 integration by specifying the site key, secret key, and challenge type. For more details, refer to the reCAPTCHA documentation.
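For context on what the secret key is used for: Google's reCAPTCHA v2 defines a server-side verification endpoint (`siteverify`) that takes the secret key and the client's response token and returns a JSON result. The sketch below shows that documented flow; how WAAS performs this call internally is not specified here.

```python
import json
import urllib.parse
import urllib.request

# Google's documented reCAPTCHA v2 verification endpoint.
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_verify_request(secret_key: str, response_token: str) -> urllib.request.Request:
    """Build the server-side verification call: POST secret + response token."""
    data = urllib.parse.urlencode({
        "secret": secret_key,       # the secret key configured in WAAS
        "response": response_token, # the token submitted by the client
    }).encode()
    return urllib.request.Request(VERIFY_URL, data=data, method="POST")

def challenge_passed(raw_json: bytes) -> bool:
    """Parse the siteverify JSON response; 'success' is the documented result field."""
    return json.loads(raw_json).get("success", False)
```

Sending the request (e.g. with `urllib.request.urlopen`) and passing the response body to `challenge_passed` tells the server whether the client solved the challenge.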