Synthetics for an Application
The widgets that display under Synthetics show you the application
metrics across the organization for the selected app. If you have the Zoom QSS
integration with ADEM enabled, you can see the Zoom performance data in the
Zoom Data tab. For details, see Zoom Performance Analysis for an Application.
Experience Score Cards
Select the Mobile User Experience card to see only Mobile
User data in the widgets underneath. Likewise, select the
Remote Site Experience card to see data specific to
Remote Sites in the widgets that follow.

The number enclosed in the rectangle is the Experience Score, which is a
weighted average of end-to-end application performance metrics for all monitored
applications across all users or remote sites. A fair or poor experience score lets
you know right away that there are performance issues impacting a large number of
your users or remote sites. However, because the experience score is weighted, it
may not uncover performance issues in monitored apps or locations that have a
smaller number of users.
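To see why a weighted, organization-wide score can mask problems in apps with few users, the sketch below averages per-app scores weighted by the number of monitored users. Weighting by user count is an assumption for illustration only, not the documented ADEM formula.

    # Hypothetical illustration: a Poor app with few users barely moves
    # an organization score that is weighted by user count.
    def org_experience_score(apps):
        """apps: list of (app_score, monitored_users) tuples."""
        total_users = sum(users for _, users in apps)
        return sum(score * users for score, users in apps) / total_users

    # A Poor app (score 20) used by 10 of 1,010 users leaves the
    # organization-wide score in the Good range (>= 70).
    print(org_experience_score([(90, 1000), (20, 10)]))  # ~89.3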
The experience score also gives you an indication of the overall digital
experience for the user. For each application that is monitored per mobile user,
ADEM calculates a score based on five critical metrics: application availability,
DNS resolution time, TCP connect time, SSL connect time, and HTTP latency. If
the application fails the availability test (the application is unavailable), the
experience score is 0. Only if the application is reachable are the remaining four
metrics calculated. Each of these metrics (other than application availability) has
a different weight and baselined lower and upper thresholds, and their combined
weight equals 100. The sum of the individual metric scores gives the application
experience score for a test sample, and the average of all the test sample results
for each application determines that application's experience score for the user.
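The specific weights, thresholds, and scoring curve are baselined by ADEM and aren't given here, so the following sketch only illustrates the logic described above: availability acts as a gate, each of the four timing metrics is scored against lower and upper thresholds, the weights sum to 100, and per-sample scores are averaged. All numeric values and the linear scoring between thresholds are illustrative assumptions.

    # Illustrative sketch of the per-app, per-user scoring described above.
    # Weights and thresholds are assumptions, not ADEM's baselined values.
    METRICS = {
        # metric: (weight, lower_threshold_ms, upper_threshold_ms)
        "dns_resolution_time": (25, 20, 200),
        "tcp_connect_time":    (25, 20, 200),
        "ssl_connect_time":    (25, 50, 400),
        "http_latency":        (25, 100, 800),
    }

    def metric_score(value, lower, upper):
        """Full credit at or below the lower threshold, none above the upper."""
        if value <= lower:
            return 1.0
        if value >= upper:
            return 0.0
        return (upper - value) / (upper - lower)

    def sample_score(available, measurements):
        if not available:
            return 0  # availability test failed: experience score is 0
        return sum(
            weight * metric_score(measurements[name], lo, hi)
            for name, (weight, lo, hi) in METRICS.items()
        )

    def app_experience_score(samples):
        """Average the per-sample scores for one application and one user."""
        return sum(sample_score(a, m) for a, m in samples) / len(samples)

    # Example: one healthy sample and one failed availability test.
    samples = [
        (True, {"dns_resolution_time": 30, "tcp_connect_time": 40,
                "ssl_connect_time": 120, "http_latency": 250}),
        (False, {}),
    ]
    print(round(app_experience_score(samples), 1))  # ~42.7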
The Monitored Applications graph is color coded to show you
the number of applications that have Good (green), Fair (yellow), and Poor (red)
experience scores. An application experience score of 70 or above is Good, 30-69 is
Fair, and below 30 is Poor.
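A minimal sketch of the score-to-category mapping stated above; the thresholds come from the text, while the function itself is only for illustration.

    # Map an application experience score to its Good/Fair/Poor category.
    def experience_category(score):
        if score >= 70:
            return "Good"
        if score >= 30:
            return "Fair"
        return "Poor"

    assert experience_category(85) == "Good"
    assert experience_category(50) == "Fair"
    assert experience_category(10) == "Poor"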
Application Experience Score Trend
This widget shows the digital experience trend across the network for a specific app
and allows you to pinpoint when the digital experience began to degrade. You can
visually see how this app is performing compared to the rest of the apps in the
organization.
It displays a graphical representation of the application experience scores with a
trend line for the selected Time Range. The y-axis is color
coded to show you what category your experience score fell into at any given time in
the selected Time Range. Hover your mouse cursor over the
trend line to see the experience score at the point in time where your cursor is
placed.

Experience Score Across Network
This widget gives you a sense of the distribution of app performance across all
monitored apps, users, and remote sites and lets you drill down into specific apps
or sites that are performing poorly.
You can see what segment of the network might be causing issues within your
organization, from the endpoints and Prisma SD-WAN remote sites all the way to the
application. You can also see which issues, such as an ISP or compute location
outage or a SaaS app outage, are impacting digital experience within your organization.

Global Distribution of Application Experience Scores
The map view in this widget shows you the application experience for Prisma Access
locations, based on the total number of Mobile Users or Remote Sites and the number
of applications monitored at each Prisma Access location on the map.
The Prisma Access locations are marked with circles that are color coded to represent
the status of application segment scores of all monitored mobile users and remote
sites connected to the specific Prisma Access Location where the circle appears.
Hover your mouse cursor over a circle to see the experience scores for the location,
as well as the total number of Mobile User Devices or Remote Sites monitored and the
total number of apps that are monitored on them.
Click the number next to Total Mobile User Devices or
Total Remote Sites to open the Monitored
Mobile Users dashboard or the Monitored Remote
Sites dashboard, respectively. From these dashboards, you can drill
down into an individual mobile user or remote site to view its data.
Clicking the number next to Total Apps opens the
Applications dashboard, where you can see detailed
metrics for the apps being monitored in the region.
