
I’m hooked to streaming media: Tips for Video and Audio Analytics

Lab Overview

What show are you binge watching? Which podcast has you hooked? The world has never been more obsessed with video and audio content, and your customers are no different. Learn how to implement Adobe Analytics for video and audio to discover more about what makes your customers go Ooga-chaka Ooga-Ooga!

Key Takeaways

  • Discover new Analytics for Video and Audio capabilities
  • Learn how to leverage Launch by Adobe to quickly configure the Media SDK for mobile devices
  • Understand how to use Media Collection API for Downloaded Content Tracking
  • Learn how to validate Video and Audio implementations using reporting data within Analytics or Debug tool


Prerequisites

  • Basic understanding of streaming tracking and how a media player works
  • Basic understanding of Adobe Launch
  • Basic understanding of Analytics reporting - Analysis Workspace

Lesson 1 - Warm up: The Secrets of the Heartbeat Solution


Lesson Objectives

  1. Understand how to set up the configuration parameters for different use cases
  2. Why and how to use Custom Metadata support
  3. When and why to use the Media SDK vs the Media Collection API tracking solutions

Lesson Context

In this lesson you will review the basics of our tracking solution, a few tips about tracking configuration, and the most important differences between the SDK and the API.

Media solution overview

Key capabilities:

  • Video & Audio stream types support
  • Media SDK and Media Collection API tracking capabilities
  • 10-second granularity for the main content and 1-second granularity for ads

Figure 1: Solution Overview

Data processing timeline

Key concepts:

  • Media Start is the only metric set when playback starts; all other metrics are computed on the Heartbeats backend and sent to Analytics on the close call
  • The same logic applies to ads: only the Ad Start metric is set when an ad is initiated, and all other ad metrics are sent to Analytics on the ad close call from the Heartbeats backend
  • Depending on the events received, a session is closed right away or 10 minutes after the last heartbeat event received by the Heartbeats backend
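To make the granularity and timeline rules concrete, here is a minimal Python sketch. The helper names are hypothetical (they are not part of any Adobe SDK); the sketch only illustrates that pings fire every 10 seconds for main content and every second for ads, while everything except Media Start is computed backend-side.

```python
# Hypothetical sketch of the ping cadence: 10-second granularity for main
# content, 1-second granularity for ads. Only Media Start is set client-side
# at playback start; all other metrics come from the Heartbeats backend.
def ping_interval(in_ad: bool) -> int:
    """Return the ping interval in seconds for the current playback state."""
    return 1 if in_ad else 10

def ping_playheads(duration_s: int, in_ad: bool) -> list:
    """Playhead positions (seconds) at which pings would fire for a segment."""
    step = ping_interval(in_ad)
    return list(range(step, duration_s + 1, step))

print(ping_playheads(35, in_ad=False))  # main content: [10, 20, 30]
print(ping_playheads(5, in_ad=True))    # ad break: [1, 2, 3, 4, 5]
```

A 35-second piece of main content thus produces only three pings, while a 5-second ad produces five, which is why ad metrics can be reported at a much finer resolution.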

Figure 2: Playback Timeline

Pro Tip! Always send COMPLETE or END events to reduce data processing latency.

Video & Audio reporting properties

Our tracking solution provides over 100 out-of-the-box properties split into 4 categories: content (including standard metadata for video & audio), ad, chapter, and quality. In addition, we provide over 20 Segments and Calculated Metrics that might come in handy.

Module Dimensions Metrics
Content 32 22
Ad 14 3
Chapter 5 3
Quality of Experience 11 15

All properties are available in Analytics and can be consumed in any reporting tool (Analysis Workspace, legacy reporting, Report Builder) or exported via the Web Services APIs, Data Feeds, or Data Warehouse exports. You can also use all media properties to build your own Segments, Calculated Metrics, Alerts, or Virtual Report Suites.

Pro Tip! Make sure that you do NOT change the classification names, because doing so will break further updates of the report suites already enabled for media tracking.

Audio vs Video

Most of the available tracking properties are common to all types of media content, but a few specific ones need to be set in order to get the expected reporting capabilities for each content type (audio and/or video). You can enable a report suite for audio and video tracking at the same time and have the data in a single place.

  • Stream Type - Audio: audio; Video: video
  • Content Type - Audio: AOD, Podcast, Audiobook, Linear, Live; Video: VOD, Linear, Live
  • Standard Metadata - Audio: Station, Artist, Album, Label, Author; Video: Network, MVPD, Authorized

Pro Tip! Our recommendation is to treat Linear content as a playlist of On-Demand content.

Custom metadata support

Custom metadata is the feature that allows customers to add any info they need on top of our standard properties, for both stream types (audio and video). Custom properties can be attached only when the main content, an ad, or a chapter starts; this data can't be changed during playback.

Pro Tip! Namespace your custom properties with your own prefix (e.g. company.mycustom). Reusing the out-of-the-box video and audio property names will inflate the reporting numbers.
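As an illustration of the prefix recommendation, here is a small Python sketch. The function name and the "a.media." reserved-prefix check are assumptions made for the example, not part of any Adobe API; the point is simply to namespace everything and refuse key names that collide with the out-of-the-box properties.

```python
# Hypothetical sketch: attach prefixed custom metadata at content/ad/chapter
# start. Keys that look like out-of-the-box media properties (assumed here to
# start with "a.media.") are rejected to avoid inflating the standard reports.
def build_custom_metadata(prefix: str, fields: dict) -> dict:
    meta = {}
    for key, value in fields.items():
        if key.startswith("a.media."):
            raise ValueError(f"reserved property name: {key}")
        meta[f"{prefix}.{key}"] = str(value)  # context data values are strings
    return meta

# Sent once with the start call; immutable for the rest of the playback.
print(build_custom_metadata("company", {"affiliate": "west", "plan": "free"}))
```

The same dictionary would be passed as the custom metadata argument of whatever start-tracking call your player integration uses.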

Media SDK vs Media Collection API

Media SDK:
  • Platforms supported: JavaScript, Android, iOS, Roku BrightScript
  • Certified devices: Desktop, iOS devices (mobile, tablet, tvOS), Android devices (mobile, tablet, Android TV, Fire TV), Roku
  • Benefits: contains logic that keeps the tracking implementation correct in different scenarios, handles the 10-second granularity, and provides out-of-the-box integration with the Visitor Service
  • Downsides: multiple SDKs to integrate (one for each platform)

Media Collection API:
  • Platforms supported: any client platform or server-side integration
  • Certified devices: N/A
  • Benefits: support for all platforms or server side; no 3rd-party SDK needed
  • Downsides: the integration code must be maintained at application level, pings must be sent every 10 seconds, and the Visitor Service integration must be handled by the customer

Pro Tip! Plan for upgrades, regardless if you are using the SDK or the Collection API.

Configuration Settings

We have noticed some less-than-perfect configurations in our logs. The most important configuration suggestions are:

  • Organization ID / Publisher - make sure it looks like 3CE342C75100435B0A490D4C@AdobeOrg
  • Tracking Servers - don't mix them; each one has a specific role with a specific data format
    • Analytics: <your_organization_domain>.sc.omtrdc.net or your custom endpoint
    • Heartbeat: <your_organization_domain>.hb.omtrdc.net
    • Collection API Stage: <your_organization_domain>-stage.hb-api.omtrdc.net - use this one during development and testing for each app release
    • Collection API Production: <your_organization_domain>.hb-api.omtrdc.net - by default all traffic is pushed to Stage; you will need to request provisioning for the Production environment
  • Always change default values like "sample" and "test", or better, remove them if you don't need them
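The checks above are easy to automate. Here is a hedged Python sketch (the function names are hypothetical, and the Org ID pattern is a simplification based on the example shown above) that sanity-checks an Organization ID and classifies a tracking server by its suffix so the roles don't get mixed:

```python
import re

# Hypothetical configuration sanity checks for the values listed above.
def valid_org_id(org_id: str) -> bool:
    """An Experience Cloud Org ID is a hex-like token ending in @AdobeOrg."""
    return re.fullmatch(r"[0-9A-F]+@AdobeOrg", org_id) is not None

def endpoint_role(host: str) -> str:
    """Classify a tracking server by its domain suffix."""
    if host.endswith("-stage.hb-api.omtrdc.net"):
        return "collection-api-stage"
    if host.endswith(".hb-api.omtrdc.net"):
        return "collection-api-production"
    if host.endswith(".hb.omtrdc.net"):
        return "heartbeat"
    if host.endswith(".sc.omtrdc.net"):
        return "analytics"
    return "unknown"

print(valid_org_id("3CE342C75100435B0A490D4C@AdobeOrg"))  # True
print(endpoint_role("example.hb.omtrdc.net"))             # heartbeat
```

Note that the "-stage" suffix is tested before the plain hb-api suffix; otherwise every Stage endpoint would be misclassified as Production.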

Pro Tip! No data is better than bad data!

Lesson 2 - Make it simple: Media SDK Extension for Launch by Adobe


Lesson Objectives

  1. Understand how to configure the Media extension through Adobe Launch
  2. Learn publishing workflow for Adobe Launch mobile extensions
  3. Learn how your application implements Media Extension published through Adobe Launch

Lesson Context

We will go through some of the key concepts needed to implement the Media SDK in applications that consume media. In this lesson we will also introduce key Adobe Launch concepts like mobile properties, the extension catalog, and publishing environments. We'll look at how to configure a mobile property and use the Launch AppId to install the Media SDK extension in your application.

Exercise 2.1 - Create a mobile property in Adobe Launch

  1. Use the presented credentials to access Adobe Launch, or use the direct link: https://launch.adobe.com.

    Your computer is assigned a number between 1 and 100. Find out your number and use it to login:

    Heads Up! Each lab machine gives access to a personalized Adobe Launch setup, thus always use your own machine credentials when logging into Analytics.

  2. Once logged in, click on "New Property" button at top right corner to create a new launch mobile property.

    Figure 3: Launch Dashboard

  3. After clicking the New Property button, fill in the property name (please include your computer number), select the Mobile platform, and check "Configure for extension development" under Advanced Options. Save the property.

    Figure 4: Mobile Property Configuration

Exercise 2.2 - Configure mobile extensions through the catalog

For this lab, we will be installing the "Adobe Media Analytics" and "Adobe Analytics" extensions. Media Analytics has a hard dependency on Analytics, hence we need to install and configure both of these extensions to get the media extension working.

Note: By default, every mobile extension depends on the Mobile Core and Profile extensions, which are installed and configured by default in your Launch mobile property.

  1. From the list of existing properties, open the property created earlier. You can find your property by typing its name into the "Search" box in the top left corner.

  2. On the next page, click the "Extensions" tab to see all installed mobile extensions. To install the media extension, click "Catalog" to see the list of extensions available for your mobile property. In the Search box, type "Analytics" to filter out all other extensions.

    Figure 5: App Catalog

  3. Click the "Install" button on the Adobe Analytics extension to start the configuration process. Once you see the configuration UI, fill in the form with the following settings:

    • Report Suites: summit2019l756.lab
    • Tracking Server: summit2019l756.sc.omtrdc.net

    You can leave the rest of the configuration as-is unless you are specifically interested in features like offline tracking or Audience Manager forwarding.

    Click on the blue "Save" button on top right corner to save the configuration and install extension.

    Figure 6: Analytics Configuration UI

  4. After saving your Analytics configuration you will be returned to the Extensions menu with the installed extensions. Select the "Catalog" option again and search for "Analytics" to repeat similar steps for installing the Media Analytics extension.

  5. Click the "Install" button on the Adobe Media Analytics extension to start the configuration process. Once you see the configuration UI, fill in the form with the following settings:

    • Tracking Server: summit2019l756.hb.omtrdc.net
    • Channel: <your_media_channel>
    • Online Video Provider: <your_ovp>
    • Player Name: <your_content_player_name_value>
    • Application Version: <your_application_version>

    Pro Tip! During development phase, you might want to check on Debug Logging box.

  6. Click on the blue "Save" button on top right corner to save the configuration and install extension.

    Figure 7: Media Configuration UI

Exercise 2.3 - Publishing the changes to development environment

  1. After you have configured the Adobe Analytics and Media Analytics solutions of the Adobe Experience Cloud, it's time to learn the publishing workflow. Go to the "Publishing" tab in your Launch UI.

  2. To start adding your configured extensions, you need to create a new library in your development environment. Click the "Add New Library" button or the small "+" icon to create a new library.

    Figure 8: Media Configuration UI

  3. Enter the name for your library and select "Development" for the Environment.

    Pro Tip! Always create a library to test with Development environment first then publish Staging and Production as testing progresses.

  4. Now you need to select all the resources that need to be published to the Development environment. You can either use the "Add a Resource" option to add every changed resource one by one, or use the "Add All Changed Resources" option to add all changed resources automatically. For this exercise, let's select "Add All Changed Resources". Since this is a newly created property, you will see all the changes for the extensions we configured in the previous exercises, as seen below.

  5. Once you see all the changed resources added, click the "Save and Build for Development" button to publish this mobile property, with its installed extensions, to the Development environment.

    Figure 9: Development Configuration

  6. Once you save and build your library, you should see it published in the Development environment, as shown in the figure below.

    Figure 10: Save and Build

Exercise 2.4 - Validate library and implement media extension tracking APIs

After publishing your library with all the required mobile extensions configured in Adobe Launch, validating the library and implementing the SDK APIs are the next crucial steps in getting your mobile app enabled for media tracking. Go through the following steps to complete this exercise.

  1. Let's first validate that our library was published and that a development AppId was generated. Click the "Environments" tab in your Adobe Launch UI, then click the "Install" icon on the right side.

    Figure 11: Validate Library

  2. At this point, you should see a pop-up UI showing the Environment ID (aka the AppId for mobile) and implementation steps for Android and iOS. Since this lab focuses on a hands-on Android application, make sure you are on the Android tab for the generated code snippets.

    We will keep this web page open and move on to setting up the Android project. Once the Android project is set up, we will come back to this page for the implementation code snippets.

    Figure 12: Mobile Install Instructions

  3. To implement the mobile SDK in an Android application, we have already created a template Android project that includes a basic video player implementation. To open it, launch the Android Studio application located on your dock. Then click the SummitSampleApp tab on the left side OR click "Open an existing Android Studio project", browse to Desktop/SummitSampleApp, and click "Open".

    Figure 13: Android Studio Icon

  4. Once done, Android Studio should start loading the Android project and you should see gradle building the project.

  5. Once gradle builds successfully, we can start using the mobile installation steps from the Launch configuration. Open the build.gradle (Module: app) file located under "Gradle Scripts", copy the contents of the "Add the dependency to build.gradle:" section of the Launch UI (1), and paste them inside the "dependencies" section of the gradle file in your project.

    Tip: You can also click the copy icon to precisely copy the contents from the Launch UI. Refer to Figure 12 above for the highlighted copy button.

    Figure 14: Copy build.gradle Dependencies

  6. After modifying the build.gradle file as shown in the figure above, you should see a "Sync Now" link in the top right corner, as highlighted. To fetch the latest mobile extension libraries configured in Launch, click "Sync Now" and wait for the gradle build to finish.

  7. Open the SummitSampleApp.java file located under apps/java/com.adobe.media.sample in your Android Studio project. To get all the mobile extensions installed, execute the following steps:

    • (2) From the Adobe Launch mobile install instructions UI, copy the import statements and paste them into SummitSampleApp.java above the existing import statements. This makes sure all the required SDK classes are imported in your application file.
    • (3) Similarly, copy the MobileCore API calls from the Launch UI and paste them right after the "super.onCreate()" call in SummitSampleApp.java.
    • (4) Continue the setup by copying the try-catch block next and pasting it below the code pasted in the previous step.

    In the end, after copy-pasting all the code from the Launch UI into SummitSampleApp.java, your application code should look like the following figure.

    Figure 15: Copy build.gradle Dependencies

  8. At this point, your application is configured to use the Adobe Launch extensions for Analytics tracking. Your template project already implements a video player that plays sample video content and executes the Media tracking APIs. If you are interested in the Media SDK extension API implementation, refer to the "MediaTrackerHelper.java" and "VideoAnalyticsProvider.java" classes under the "analytics" package. For detailed documentation on the Media APIs, please follow the URL: https://aep-sdks.gitbook.io/docs/

  9. The last step is to run the application on the Android emulator and check the network traffic for the tracked video content. To do so, simply click the "play" icon at the top of Android Studio and select the Android emulator device that is already preconfigured in your Android Studio. The PIN code is 0000.

    Figure 16: Run SummitSampleApp

  10. Once the app launches, hit the play button to play the video content. At this point, the Media SDK will log an extended set of data about the content tracking process in the Logcat window.

    Figure 17: Android Logcat info

The Logcat info is useful for understanding what happens inside the Media SDK, but it is hard to validate the implementation from a data perspective. The next lesson presents a data validation flow using Debug.

Lesson 3 - Do it yourself: Self Validation using Adobe Debug


Lesson Objectives

  1. Use Adobe Debug to inspect your implementation during development or after production releases
  2. Use Self Validation functionality to assess your video and audio implementations

Lesson Context

This lesson focuses on a few new features introduced in Debug that give you out-of-the-box validation at any moment: during implementation, during testing, or after releases.

Debug is a tool that helps Video & Audio customers validate Media SDK and Media Collection API implementations across all devices (desktop, mobile, tablet, Android TV, Apple TV, Fire TV, Roku, smart TVs, Xbox, PlayStation, or any other connected device). The tool is not limited to video and audio tracking validation; it supports other solutions as well.

If you don't have access to Debug, please ask your organization admin to grant you access via the Admin Console - https://adminconsole.adobe.com. If your organization is not provisioned for Debug, you will need to request provisioning; ask your consultant to help you out.

To help you validate video and audio implementations faster, we added a series of pre-defined scenarios that can be used to check your media tracking.

Exercise 3.1

  1. Open your Chrome browser and log in to Debug at https://debug.adobe.com

    To speed things up, we granted access to Debug for all of you, using the same credentials as for Launch.

  2. After login, the generic dashboard will be loaded. On the top right corner, you will notice the Proxy is Off.

    Figure 18: Debug Dashboard View

  3. Start the Adobe Debug Proxy that is already installed on your machine. Please use the machine credentials if they are required.

    Figure 19: Debug New Session

  4. Once the Proxy is ON, click the "New Report" button to start a new Debug session. For this exercise you will need to select the Adobe Analytics and Video Heartbeats plugins, then click "Start Debug Report".

    Figure 20: Debug New Session

  5. Open the Android emulator and press the play button. Let the content play to completion; it is only 1 minute long. You can switch back to the Debug tab and see how the data is collected in real time.


  6. Once the playback is complete, click the "Stop Report" button in the session view to stop recording any further data. At that point 2 new tabs will be available: "Audit Report" and "Media Summary". On the "Calls" tab you can inspect any captured call and check for potential warnings or errors.

    Figure 22: Debug Session View

  7. On the "Media Summary" tab you will find a list of metrics that were captured during playback and should be available for reporting in Analytics. The data is grouped by Content ID and then by playback session ID. For each playback session you can run any of the pre-defined validation scenarios; in our case, select the "Complete Playback" scenario and click "Validate". Each scenario checks a number of data points and, based on those, the tool provides a list of passed and failed steps. Each step provides hints about what is checked, so you know what has to be fixed to pass every step and, finally, the scenario itself. Note that there is no point in trying to pass multiple scenarios with a single playback; run a separate playback for each scenario you want to validate.

    Figure 23: Debug Video Summary

Lesson 4 - Get dirty: Media Collection API for Downloaded Content Tracking


Lesson Objectives

  1. Understand Media Collection API concepts
  2. Understand Downloaded Content tracking capabilities

Lesson Context

The goal of this lesson is to give a high-level presentation of our Media Collection API solution and of how it can be used for Downloaded Content Tracking.

Media Collection API considerations

The Media Collection API is the alternative solution for platforms where we don't have an SDK, or for customers that don't want to include 3rd-party SDKs in their own apps. The general concepts are:

  • the developer has to integrate the Media Collection API at player level or server side, across all player states and fragment types
  • once a session is initiated, the Media Collection backend provides a Session ID that must be used for all subsequent calls during the same playback session
  • the developer needs to send granular pings from the player: every second during ads and every 10 seconds for the main content
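The session flow above can be sketched in a few lines of Python. This is a hedged illustration, not the real API: the StubBackend class and the "session-123" ID stand in for the Media Collection backend (which in reality issues the Session ID in response to a sessionStart POST against the hb-api sessions endpoint), so only the control flow is shown.

```python
# Hypothetical sketch of the Media Collection API flow: sessionStart yields a
# backend-issued Session ID, and every subsequent event must carry that ID.
# Real integrations POST JSON to <domain>.hb-api.omtrdc.net/api/v1/sessions.

class StubBackend:
    """Stands in for the Media Collection backend (no real HTTP)."""
    def create_session(self, event: dict) -> str:
        assert event["eventType"] == "sessionStart"
        return "session-123"  # backend-issued Session ID

    def send_event(self, session_id: str, event: dict) -> None:
        assert session_id  # follow-up calls must reference the Session ID

def track_playback(backend, events: list) -> str:
    session_id = None
    for event in events:
        if event["eventType"] == "sessionStart":
            session_id = backend.create_session(event)
        else:
            backend.send_event(session_id, event)
    return session_id

events = [
    {"eventType": "sessionStart", "playerTime": {"playhead": 0}},
    {"eventType": "play", "playerTime": {"playhead": 0}},
    {"eventType": "ping", "playerTime": {"playhead": 10}},
    {"eventType": "sessionComplete", "playerTime": {"playhead": 12}},
]
print(track_playback(StubBackend(), events))  # session-123
```

The key design point is that the Session ID is owned by the backend, so a client cannot send play, ping, or close events before the sessionStart response arrives.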

Figure 24: Debug New Session

Pro Tip! Make sure to use the same user identification values (MID, VID) across all Experience Cloud tracking solutions.

Exercise 4.1

  1. Switch to Debug in your browser and start a new Debug session. This time, please select only the Media Collection API plugin.

    Figure 25: Debug New Session

  2. Open a new browser tab at http://localhost/api and press the play button. Let the content play to completion; it is only 1 minute long. You can switch back to the Debug tab and see how the data is collected in real time.

    Figure 26: Debug New Session

  3. Once the playback is complete, click the "Stop Report" button in the session view to stop recording any further data. At that point 2 new tabs will be available: "Audit Report" and "Media Summary". On the "Calls" tab you can inspect any captured call and check for potential warnings or errors.

    Figure 27: Debug Session View

  4. On the "Media Summary" tab you will find a list of metrics that were captured during playback for the API tracking solution and should be available for reporting in Analytics.

    Figure 28: Debug Video Summary

Downloaded content tracking scenario

The scenario that surfaced this use case is the following:

  • A user downloads and installs an app on his/her mobile device;
  • After installation, the user downloads content from the app into the device's local storage;
  • The user plays the content from the device's storage regardless of the device's connectivity (e.g. on an airplane or during a commute);
  • During playback, the tracking data is stored on the device regardless of connectivity;
  • After the user finishes the play session (abandon / complete) and comes back online, all the information gathered throughout the session is sent as a single payload for processing & reporting.

Figure 29: Debug New Session

The API call for Downloaded Content Tracking should contain the entire tracking data, as below:

Endpoint: http://custom.hb-api.omtrdc.net/api/v1/sessions

Payload:

{"data": [
  {"eventType": "sessionStart", "playerTime": {"playhead": 0, "ts": 1529997923478}, "params": {}, "customMetadata": {}, "qoeData": {}},
  {"eventType": "play", "playerTime": {"playhead": 0, "ts": 1529997928174}},
  {"eventType": "ping", "playerTime": {"playhead": 10, "ts": 1529997937503}},
  {"eventType": "ping", "playerTime": {"playhead": 20, "ts": 1529997947533}},
  {"eventType": "ping", "playerTime": {"playhead": 30, "ts": 1529997957545}},
  {"eventType": "sessionComplete", "playerTime": {"playhead": 35, "ts": 1529997960559}}
]}
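The buffering behavior behind that single payload can be sketched in Python. This is a hypothetical client-side sketch (the OfflineTracker class is not an Adobe API): events are appended to local storage while the device is offline and serialized into one payload, of the shape shown above, when connectivity returns.

```python
import json

# Hypothetical sketch of downloaded-content tracking: events are buffered
# locally while offline and flushed as ONE payload once back online. A real
# integration would POST the payload to the hb-api sessions endpoint.
class OfflineTracker:
    def __init__(self):
        self.events = []

    def track(self, event_type: str, playhead: int, ts: int, **extra):
        event = {"eventType": event_type,
                 "playerTime": {"playhead": playhead, "ts": ts}}
        event.update(extra)          # e.g. params / customMetadata / qoeData
        self.events.append(event)    # stored on-device, no network needed

    def flush_payload(self) -> str:
        """Serialize everything gathered during the session as one payload."""
        return json.dumps({"data": self.events})

tracker = OfflineTracker()
tracker.track("sessionStart", 0, 1529997923478,
              params={}, customMetadata={}, qoeData={})
tracker.track("play", 0, 1529997928174)
tracker.track("ping", 10, 1529997937503)
tracker.track("sessionComplete", 35, 1529997960559)
print(json.loads(tracker.flush_payload())["data"][0]["eventType"])
```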

Downloaded content tracking tips

  • For now, this functionality is available only via the Media Collection API
  • Make sure that the "media.downloaded" flag is configured and set to "true"
  • You will need to set your report suite's timestamp setting to at least "Timestamp Optional"

Figure 30: Timestamp Configuration on Analytics Admin

Lesson 5 - Master the data: Using Video and Audio reporting data


Lesson Objectives

  1. How to use Analysis Workspace templates for Video and Audio reporting
  2. Understand how to validate the tracking implementations using reporting data

Lesson Context

In this lesson, we will use Analysis Workspace to understand the video and audio tracking data. The goal is to understand how to look at the data and learn how to apply an alert on relevant media metrics.

Exercise 5.1

  1. Open a browser tab at https://experiencecloud.adobe.com and log in using the provided credentials. Then click the menu icon in the top right corner and select Analytics from the menu.

    Figure 31: Experience Cloud View

  2. Once you are in the Analytics view, select Workspace from the top left corner. That will open a list view of all the Analysis Workspace projects available for your account.

    Figure 32: Analysis Workspace Projects List

  3. Click "Create New Project", select the "Audio Consumption" template, then click "Create".

    Figure 33: Analysis Workspace New Project

  4. The new project will provide a set of trends based on the data available in that report suite. To see the entire media data set (video and audio) you need to remove the "Media Stream Type: Audio" segment, or you can use the "Media Stream Type: Video" segment instead. The first 2 panels show top numbers and trends for the last completed week and month, along with the top 5 contents by average consumption time.

    Figure 34: Analysis Workspace project for Media - top trends

  5. The last panel contains a list of important metrics and trends for the current week. Also, there are a few breakdowns that should help you to have an overview of the usage across multiple segments.

    Figure 35: Analysis Workspace project for Media - week trends

  6. Finally, on the bottom of the page, there is a breakdown of playback experience by the Device type. By default, this table contains the number of streams started and impacted by different metrics (like buffer, bitrate change and errors).

    Figure 36: Analysis Workspace project for Media - QoE Metrics

    Right from the start we could notice that:

    • The Content Completes for Set-top Box devices are way higher than the 95% Progress Marker
    • The number of streams impacted by errors is really high; there are about 0.75 errors per stream

    Pro Tip! Analysis Workspace offers a lot of flexibility, like multiple views and breakdowns in the same project, which is useful when you want to understand whether there are issues with the data or at the player level.

  7. To investigate the issues above, we will remove Buffer Impacted Streams, Bitrate Change Impacted Streams, and Error Impacted Streams from the Playback Experience by Device Type table and add new metrics.

    We already set up this report suite to store additional properties that are sent to Analytics but are not set up by default:

    • Media Close Ratio - a calculated metric computed as (Content Starts + Drops Before Start) / Media Starts
    • Stall Time (HH:MM:SS) - a calculated metric that converts the Total Stalling Duration into a time format; to get the Total Stalling Duration you need to create a processing rule that stores the a.media.qoe.stallTime value into a counter event

    So, let's add the following metrics to our working table: Content Completion Rate, Media Close Ratio, Error Events, Content Time Spent, and Stall Time (HH:MM:SS)
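    To make the two custom calculated metrics concrete, here is a small Python sketch. The helper names are hypothetical; they only reproduce the arithmetic described above (the close-ratio formula and the seconds-to-HH:MM:SS conversion), not anything Analytics exposes.

```python
# Hypothetical re-creation of the two calculated metrics described above.
def media_close_ratio(content_starts: int, drops_before_start: int,
                      media_starts: int) -> float:
    """(Content Starts + Drops Before Start) / Media Starts."""
    return (content_starts + drops_before_start) / media_starts

def stall_time_hms(total_stall_seconds: int) -> str:
    """Format a Total Stalling Duration, in seconds, as HH:MM:SS."""
    hours, rem = divmod(total_stall_seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

print(media_close_ratio(90, 8, 100))  # 0.98
print(stall_time_hms(3725))           # 01:02:05
```

    A healthy implementation keeps the close ratio near 1.0: almost every session that starts is also explicitly started as content (or recorded as a drop) and eventually closed.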

    Figure 37: Analysis Workspace project for Media - QoE Metrics

    It's obvious that:

    • The completion rate for STB is way higher than the average across all devices
    • In a perfect implementation there shouldn't be any stall time, but in this example about 10% of the playback time is lost as stall time
  8. To get a better understanding of the potential issues, we will break down the data by a few dimensions:

    • sdkVersion - a custom eVar that collects the a.media.sdkVersion value via a processing rule; the value is set at player level as the build/version of your app
    • vhlVersion - a custom eVar that collects the a.media.vhlVersion value via a processing rule; the value is set by our Media SDK
    • Player SDK Error IDs - a solution (multi-value) variable that contains all error IDs collected during a playback; it is set up for each report suite once it is enabled for media tracking

    Figure 38: Analysis Workspace project for Media - data investigation

    Now we have a starting point for any further investigation: there is clear information about the apps that are potentially generating "bad" data. As next steps, you could use more breakdowns to determine whether a specific device, OS version, or browser type has a negative impact on your data.

    Pro Tip! You might consider storing additional media tracking properties that could help you debug and validate.

  9. Another useful thing to do is to create alerts on the metrics that matter to you, so that you don't miss important data. To do that, right-click the metric you want an alert on and select "Create alert from selection".

    Figure 39: Setup alerts

  10. In this case, the alert is set to run every day and to send notifications when the close ratio is under 95% or over 105%. Basically, you want to make sure that the number of sessions started is almost the same as the number of sessions closed.

    Figure 40: Alert editor view
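    The alert condition can be approximated in Python. This is a simplified, hypothetical sketch: the close ratio is taken here as sessions closed over sessions started, and the function only decides whether the daily value falls outside the 95%-105% band configured above.

```python
# Hypothetical sketch of the daily close-ratio alert: fire when the ratio
# drifts outside the 95%-105% band around the expected value of 1.0.
def close_ratio_alert(sessions_started: int, sessions_closed: int,
                      low: float = 0.95, high: float = 1.05) -> bool:
    """Return True when the alert should fire (ratio outside the band)."""
    ratio = sessions_closed / sessions_started
    return ratio < low or ratio > high

print(close_ratio_alert(1000, 990))  # False: within the band, no alert
print(close_ratio_alert(1000, 700))  # True: too many unclosed sessions
```

    A ratio well below the band usually means sessions are started but never closed (missing COMPLETE/END events), while a ratio above it suggests duplicate or orphaned close calls.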

    Pro Tip! You might consider adding alerts on your top metrics so you can react as soon as possible.