About Queries
Use queries to identify the exact log records you want
Cortex Data Lake to retrieve.
Queries are Boolean expressions that identify the log records Cortex Data Lake will retrieve for
the specified log record type. You use them in addition to the log record type and
time range information that you are always required to provide. Use queries to narrow
the retrieval set to the exact records you want.
Query Syntax
Specify queries using match statements. These statements can be either an equality or a
pattern matching expression. You can optionally combine these statements using the
Boolean operators AND or OR:

<match_statement> [<boolean> <match_statement>] ...
For example:
source_user LIKE 'paloalto%' AND action.value = 'deny'
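A single match statement is itself a valid query, and you can chain more than two match statements with additional Boolean operators. For example, using fields that also appear in the operator table below:

action.value = 'deny'

source_user LIKE 'paloalto%' AND action.value = 'deny' AND bytes_total > 270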
A query can be at most 4096 characters long. The actual field names that you use in
your filters are not always identical to the names shown in the column headers. Also, the
data displayed in the log table might not always be the identical value you want to
use in your queries. For example, the BYTES field shows values rounded to the nearest
byte or kilobyte. To obtain the exact bytes_total value, use the add-to-search feature
provided by the query builder.

The filter evaluates queries according to the standard order of precedence for logical
operators. However, you can change the order of operations by grouping terms in
parentheses.
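For example, because AND is evaluated before OR under the standard order of precedence, the following two queries are evaluated differently; the parentheses in the second query force the OR to be evaluated first:

source_user LIKE 'paloalto%' AND action.value = 'deny' OR bytes_total > 270

source_user LIKE 'paloalto%' AND (action.value = 'deny' OR bytes_total > 270)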
It is an error to create a query with identical start and end times.
| Expression Type | Definition |
|---|---|
| Numeric comparison | Equality operators are described below. |
| String comparison | |
| Pattern matching | Pattern matching is supported only for fields that contain strings or IP addresses. For strings and IP addresses, % may be provided as a wildcard character at any location in the value. A pattern matching expression that does not provide a wildcard returns the identical log lines as an equality comparison. |
You must use single quotes with your string values: '<value>'. Double quotes are illegal: "<value>".
Supported Operators
When building a query, you can choose from a set of operators. The following table
describes when to use each operator and lists its compatible values.
| Operator | When to Use It | Possible Values |
|---|---|---|
| = | Find logs that contain an exact value. | bytes_total = 270<br>action.value = 'allow'<br>Full: src_ip.value = '192.1.1.10/32'<br>Subnet range: src_ip.value = '192.1.1.10/24'<br>time_generated = '2022-03-29 12:57:14' |
| != or <> | Find logs that do not contain an exact value. | bytes_total != 270<br>bytes_total <> 270<br>action.value != 'allow'<br>action.value <> 'allow'<br>Full: src_ip.value != '192.1.1.10/32'<br>Subnet range: src_ip.value <> '192.1.1.10/24'<br>time_generated != '2022-03-29 12:57:14'<br>time_generated <> '2022-03-29 12:57:14' |
| < | Find logs with data less than a value. | bytes_total < 270<br>time_generated < '2022-03-29 12:57:14' |
| <= | Find logs with data less than or equal to a value. | bytes_total <= 270<br>time_generated <= '2022-03-29 12:57:14' |
| > | Find logs with data greater than a value. | bytes_total > 270<br>time_generated > '2022-03-29 12:57:14' |
| >= | Find logs with data greater than or equal to a value. | bytes_total >= 270<br>time_generated >= '2022-03-29 12:57:14' |
| LIKE | Find logs with data that matches a string pattern. LIKE is not supported for fields such as action, tunnel, or proto that have limited possible values. | source_user_info.name LIKE 'usern_me'<br>You can use either _ or % as wildcard characters. |
| AND | Find logs that satisfy multiple search terms at once. | bytes_total = 270 AND source_user_info.name LIKE 'usern_me' AND src_ip.value != '192.1.1.10/24' |
| OR | Find logs that satisfy at least one of multiple search terms. | bytes_total = 270 OR source_user_info.name LIKE 'usern_me' OR src_ip.value != '192.1.1.10/24' |
| () | Specify the priority in which search terms are evaluated. | bytes_total = 270 AND (source_user_info.name LIKE 'usern_me' OR src_ip.value != '192.1.1.10/24') |
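Putting these operators together, a query that combines comparison, pattern matching, and grouping, built only from the fields shown in the table above, might look like this:

time_generated >= '2022-03-29 12:57:14' AND (action.value = 'allow' OR source_user_info.name LIKE 'usern_me') AND bytes_total > 270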
About Field Names
The field names that
you use in your queries are sometimes, but not always, identical
to the names shown in the log record column headers. The field name
that you must use is the log record field name as it is stored in
Cortex Data Lake. There are two ways to obtain this field name:
- Click into the user interface query field to see a drop-down list of available field names for the selected log type. On the right-hand side of this drop-down list is the corresponding column name.
- The Explore Schema Reference guide provides a mapping of the log column name, as shown in the user interface, to the corresponding log record field name.
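For example, the BYTES column discussed earlier displays rounded values, but the stored field name is bytes_total, so a filter on that column is written as:

bytes_total >= 270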