Bot Protection

WAAS bot protection provides visibility into bots and other automation frameworks accessing protected web applications and APIs.

Bot Categories

WAAS detects known good bots as well as other bots, headless browsers, and automation frameworks. WAAS can also fend off cookie-dropping clients and other primitive clients by requiring the use of cookies and JavaScript before a client can reach the protected origin.
Bots are grouped into the following categories:
  • Search Engine Crawlers
    - Bots that systematically crawl the world wide web to index pages for online search. Also known as spiders or web crawlers.
  • Business Analytics Bots
    - Bots that crawl, extract, and index business-related information.
  • Educational Bots
    - Bots that crawl, extract and index information for educational purposes, such as academic search engines.
  • News Bots
    - Bots that crawl, extract and index the latest news articles, usually for news aggregation services.
  • Financial Bots
    - Bots that crawl, extract, and index financial data.
  • Content Feed Clients
    - Automated tools, services, or end-user clients that fetch web content for feed readers.
  • Archiving Bots
    - Bots that crawl, extract, and archive website information.
  • Career Search Bots
    - Automated tools or online services that extract and index job-related postings.
  • Media Search Bots
    - Bots that crawl, extract, and index media content for search engines.
The following categories cover bots and other automation frameworks that cannot be classified by their activity or origin:
  • Generic Bots
    - Clients with attributes that indicate an automated bot.
  • Web Automation Tools
    - Scriptable headless web browsers and similar web automation tools.
  • Web Scrapers
    - Automated tools or services that scrape website content.
  • API Libraries
    - Software code libraries for Web API communications.
  • HTTP Libraries
    - Software code libraries for HTTP transactions.
  • Request Anomalies
    - HTTP requests with anomalies that are not expected from common web browsers.
  • Bot Impersonators
    - Bots and automation tools that impersonate known good bots to evade rate limiting and other restrictions.
  • Browser Impersonators
    - Automated tools or services that impersonate common web browser software.
Users can create custom bot signatures based on HTTP headers and source IPs. User-defined signatures are useful for tracking customer-specific bots, self-developed automation clients, and traffic that appears suspicious.
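As a rough illustration, matching a request against a user-defined signature amounts to comparing its headers and source IP with the signature's criteria. The signature format, field names, and values below are hypothetical, not WAAS's actual configuration schema:

```python
import ipaddress

# Hypothetical user-defined signature: a header substring plus source networks.
CUSTOM_SIGNATURES = [
    {
        "name": "internal-monitoring-bot",
        "header": ("User-Agent", "acme-healthcheck"),   # header name, required substring
        "networks": ["10.0.0.0/8"],                     # allowed source CIDRs
    },
]

def match_custom_signature(headers: dict, source_ip: str):
    """Return the name of the first user-defined signature the request matches, or None."""
    ip = ipaddress.ip_address(source_ip)
    for sig in CUSTOM_SIGNATURES:
        header_name, needle = sig["header"]
        if needle not in headers.get(header_name, ""):
            continue
        if any(ip in ipaddress.ip_network(net) for net in sig["networks"]):
            return sig["name"]
    return None
```

A matching request would then be tracked (or rate-limited, banned, and so on) under the signature's name rather than a generic bot category.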

Detection Methods

WAAS uses static and active methods for detecting bots.
Static detection analyzes each incoming HTTP request to determine whether it was sent by a bot.
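Static checks of this kind can be sketched as simple per-request heuristics that flag attributes common browsers would not produce. The specific tokens and headers below are illustrative assumptions, not WAAS's actual detection rules:

```python
def static_bot_check(headers: dict) -> list:
    """Flag request attributes uncommon in real browsers (illustrative heuristics only)."""
    anomalies = []
    ua = headers.get("User-Agent", "").lower()
    if not ua:
        anomalies.append("missing User-Agent")
    # Well-known automation tokens in the User-Agent string.
    if any(tok in ua for tok in ("bot", "crawler", "spider", "curl", "python-requests")):
        anomalies.append("automation token in User-Agent")
    # Browsers always send an Accept header; many minimal HTTP clients do not.
    if "Accept" not in headers:
        anomalies.append("missing Accept header")
    return anomalies
```

In a real system the anomaly list would feed into the categories above (for example, an automation token maps to Generic Bots, a missing Accept header to Request Anomalies).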
Active detection uses JavaScript and Prisma Session Cookies to detect and classify bots.
Prisma Session Cookies set by WAAS are encrypted and signed to prevent cookie tampering. In addition, the cookies include advanced protections against cookie replay attacks, in which cookies are harvested and reused by other clients.
Prisma sessions are intended to address the problem of "cookie droppers" by validating a client's support for cookies and JavaScript before allowing it to reach the origin server. Once enabled, WAAS serves an interstitial page for any request that does not include a valid Prisma Session Cookie. The interstitial page sets a cookie and redirects the client to the requested page using JavaScript. A client that does not support cookies and JavaScript keeps receiving the interstitial page. Browsers easily proceed to the requested page, and once they hold a valid cookie they no longer encounter the interstitial page.
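The interstitial flow can be sketched as a small HTML response that sets a cookie and redirects with JavaScript; a client lacking either capability loops on this page forever. The cookie name `prisma_session` and the page markup are assumptions for illustration:

```python
def interstitial_response(requested_path: str, cookie_value: str) -> str:
    """Render an interstitial page that sets a session cookie and redirects via JavaScript.

    Clients that do not execute JavaScript or persist cookies never complete the
    redirect, so they never reach the protected origin.
    """
    return f"""<!DOCTYPE html>
<html>
<head>
<script>
  // Set the session cookie, then navigate to the originally requested page.
  document.cookie = "prisma_session={cookie_value}; path=/";
  window.location.replace("{requested_path}");
</script>
</head>
<body>Checking your browser...</body>
</html>"""
```

On the next request the browser presents the cookie, the server validates it, and the interstitial is skipped.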
When enabled, JavaScript is injected periodically into server responses to collect browser attributes and flag anomalies typical of various bot frameworks. JavaScript fingerprint results are received and processed asynchronously and are used to classify the session for future requests.
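On the server side, the reported attributes can feed a simple classifier that maps fingerprint anomalies to the bot categories above. The attribute names and thresholds here are illustrative assumptions about what such a fingerprint might contain:

```python
def classify_fingerprint(attrs: dict) -> str:
    """Map collected browser attributes to a bot category (illustrative rules only)."""
    # navigator.webdriver is set to true by WebDriver-based automation tools.
    if attrs.get("webdriver"):
        return "Web Automation Tools"
    # Headless or spoofed browser profiles often report no plugins or languages.
    if attrs.get("plugins", 0) == 0 and attrs.get("languages", 0) == 0:
        return "Browser Impersonators"
    return "browser"
```

Because fingerprints arrive asynchronously, the classification applies to subsequent requests in the same session rather than the request that delivered the fingerprint.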

Detection Workflow