What show are you binge watching? Which podcast has you hooked? The world has never been more obsessed with video and audio content, and your customers are no different. Learn how to implement Adobe Analytics for video and audio to discover more about what makes your customers go Ooga-chaka Ooga-Ooga!
In this lesson you will review the basics of our tracking solution, a few tips about tracking configuration, and the most important differences between the SDK and the API.
Key capabilities:
Key concepts:
Pro Tip! Always set COMPLETE or END events to reduce the data processing latency.
Our tracking solution provides over 100 out-of-the-box properties split into 4 categories: content (including standard metadata for video & audio), ad, chapter, and quality. In addition, we provide over 20 Segments and Calculated Metrics that might come in handy.
Module | Dimensions | Metrics |
---|---|---|
Content | 32 | 22 |
Ad | 14 | 3 |
Chapter | 5 | 3 |
Quality of Experience | 11 | 15 |
All properties are available in Analytics and can be consumed within any reporting tool (Analysis Workspace, legacy reporting, Report Builder) or exported via the Web Services APIs, Data Feeds, or Data Warehouse exports. You can also use all media properties to build your own Segments, Calculated Metrics, Alerts, or Virtual Report Suites.
Pro Tip! Make sure that you do NOT change the classification names; doing so will break any further updates of the report suites already enabled for media tracking.
Most of the available tracking properties are common to all types of media content, but a few specific ones need to be set in order to get the expected reporting capabilities for each component (audio and/or video). You can enable a report suite for audio and video tracking at the same time and have all the data in a single place.
Configuration | Audio | Video |
---|---|---|
Stream Type | audio | video |
Content Type | AOD, Podcast, Audiobook, Linear, Live | VOD, Linear, Live |
Standard Metadata | Station, Artist, Album, Label, Author | Network, MVPD, Authorized |
Pro Tip! Our recommendation is to treat Linear as a playlist of On Demand content.
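As a quick sanity check, the configuration table above can be mirrored in a small lookup. This is an illustrative sketch only; the key names below are placeholders, not the exact SDK property names.

```python
# Illustrative lookup mirroring the Stream Type / Content Type table above.
# "audio" and "video" are the two stream types; the content types come from the table.
ALLOWED_CONTENT_TYPES = {
    "audio": {"AOD", "Podcast", "Audiobook", "Linear", "Live"},
    "video": {"VOD", "Linear", "Live"},
}

def validate_stream_config(stream_type: str, content_type: str) -> bool:
    """Return True when the content type is valid for the given stream type."""
    return content_type in ALLOWED_CONTENT_TYPES.get(stream_type, set())

print(validate_stream_config("audio", "Podcast"))  # True
print(validate_stream_config("video", "Podcast"))  # False
```

A check like this can catch configuration mismatches (e.g., tagging a podcast as a video stream) before bad data reaches reporting.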
Custom metadata is the feature that allows customers to add any information they need on top of our standard properties, for both stream types (audio and video). Custom properties can be added only when the main content, an ad, or a chapter starts; this data cannot be changed during playback.
Pro Tip! Make sure to namespace your custom properties with your own prefix (e.g., company.mycustom). Reusing out-of-the-box video and audio properties will inflate the reporting numbers.
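To make the custom-metadata rules concrete, here is a hedged Python sketch: it builds a Media Collection API-style sessionStart event (the payload shape follows the example later in this document) and enforces a namespace prefix. The `company.` prefix and the helper name are illustrative, not part of the API.

```python
# Sketch: attach custom metadata when the main content starts.
# Payload shape follows the Media Collection API sessionStart example;
# the "company." namespace check is our own illustrative guard.
def session_start(playhead, ts, custom_metadata):
    for key in custom_metadata:
        # Custom keys should carry your own prefix so they never
        # collide with (and inflate) out-of-the-box properties.
        if not key.startswith("company."):
            raise ValueError(f"custom key without namespace prefix: {key}")
    return {
        "eventType": "sessionStart",
        "playerTime": {"playhead": playhead, "ts": ts},
        "params": {},
        "customMetadata": dict(custom_metadata),  # frozen copy: no changes during playback
        "qoeData": {},
    }

event = session_start(0, 1529997923478, {"company.mycustom": "value"})
```

Because the metadata is fixed at start time, anything you want reported must be known before playback begins.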
 | Media SDK | Media Collection API |
---|---|---|
Platforms supported | JavaScript, Android, iOS, Roku BrightScript | Any client platform or server-side integration |
Certified devices | Desktop, iOS devices (mobile, tablet, tvOS), Android devices (mobile, tablet, Android TV, Fire TV), Roku | N/A |
Benefits | Contains logic that ensures correct tracking in different scenarios, handles the 10-second granularity, out-of-the-box integration with the Visitor Service | Supports any platform or server-side integration; no 3rd-party SDK needed |
Downsides | Multiple SDKs (one for each platform) | Integration code must be maintained at the application level, 10-second pings must be sent manually, Visitor Service integration must be handled by the customer |
Pro Tip! Plan for upgrades, regardless of whether you are using the SDK or the Collection API.
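The 10-second ping requirement mentioned in the table can be sketched as follows. The Media SDK handles this cadence internally; with the raw Collection API, your application must emit a ping event at least every 10 seconds between play and sessionComplete. The helper below is illustrative, not part of the API.

```python
# Sketch of the 10-second ping cadence a Collection API client must maintain.
def ping_playheads(start: int, end: int, interval: int = 10):
    """Playhead values (in seconds) at which a ping event is due."""
    return list(range(start + interval, end + 1, interval))

print(ping_playheads(0, 35))  # [10, 20, 30]
```

For a 35-second playback this yields pings at 10, 20, and 30 seconds, matching the downloaded-content payload example later in this document.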
We have noticed some less-than-perfect configurations in our logs. Some of the most important configuration suggestions are:
Pro Tip! No data is better than bad data!
We will go through some of the key concepts needed to understand how to implement a Media SDK in your media-consuming applications. In this lesson we will also introduce key Adobe Launch concepts like mobile properties, extension catalogs, and publishing environments. We'll look at how to configure a mobile property and use the Launch AppId to install the Media SDK extension in your application.
Use the presented credentials to access Adobe Launch, or use the direct link: https://launch.adobe.com.
Your computer is assigned a number between 1 and 100. Find out your number and use it to login:
Heads Up! Each lab machine gives access to a personalized Adobe Launch setup, so always use your own machine's credentials when logging into Analytics.
Once logged in, click the "New Property" button in the top-right corner to create a new Launch mobile property.
After clicking the New Property button, fill in the property name (please include your computer number), select the Mobile platform, and check "Configure for extension development" under Advanced Options. Save the property.
For this lab, we will be installing the "Adobe Media Analytics" and "Adobe Analytics" extensions. Media Analytics has a hard dependency on Analytics, hence we need to install and configure both extensions to get the media extension working.
Note: By default, every mobile extension depends on the Mobile Core and Profile extensions, which are installed and configured by default in your Launch mobile property.
From the list of existing properties, open the property created earlier. You can search for your property by name using the "Search" box in the top-left corner.
On the next page, click the "Extensions" tab to see all the installed mobile extensions. To install the media extension, click "Catalog" to see a list of available extensions for your mobile property. In the Search box, type "Analytics" to filter out all other extensions.
Click the "Install" button on the Adobe Analytics extension to start the configuration process. Once you see the configuration UI, fill in the form with the following settings:
You can leave the rest of the configuration as-is unless you are specifically interested in features like offline tracking or Audience Manager forwarding.
Click the blue "Save" button in the top-right corner to save the configuration and install the extension.
After saving your Analytics configuration, you will be returned to the Extensions menu, which lists the installed extensions. Select the "Catalog" option again and search for "Analytics" to repeat similar steps for installing the Media Analytics extension.
Click the "Install" button on the Adobe Media Analytics extension to start the configuration process. Once you see the configuration UI, fill in the form with the following settings:
Pro Tip! During the development phase, you might want to check the Debug Logging box.
Click the blue "Save" button in the top-right corner to save the configuration and install the extension.
After you have configured the Adobe Analytics and Media Analytics solutions of the Adobe Experience Cloud, it's time to learn the publishing workflow. To do this, go to the "Publishing" tab in your Launch UI.
To start adding your configured extensions, you need to create a new library in your development environment. Click the "Add New Library" button or the small "+" icon to create a new library.
Enter the name for your library and select "Development" for the Environment.
Pro Tip! Always create a library and test it in the Development environment first, then publish to Staging and Production as testing progresses.
Now you need to select all the resources that need to be published in the Development environment. You can either choose "Add a Resource" and add every changed resource one by one, or choose "Add All Changed Resources" to automatically add all the changed resources. For this exercise, let's select "Add All Changed Resources". Since this is a newly created property, you will see all the changes for the extensions we configured in the previous exercises, as shown below.
Once you see all the changed resources added, click the "Save and Build for Development" button to publish this mobile property, with its installed extensions, to the development environment.
Once you save and build your library, you should see it published in the development environment, as shown in the figure below.
After publishing your library with all required mobile extensions configured in Adobe Launch, the next crucial step in enabling your mobile apps for media tracking is validating the library and implementing the SDK APIs. Go through the following steps to complete this exercise.
Let us first validate that our library was published and that a development AppId was generated. To do this, click the "Environments" tab in your Adobe Launch UI and then click the "Install" icon on the right side.
At this point, you should see a pop-up UI showing the Environment ID (aka AppId for mobile) and implementation steps for Android and iOS. Since this lab is a hands-on Android exercise, make sure you are on the Android tab for the generated code snippets.
We will keep this web page open and move on to setting up the Android project. Once the Android project is set up, we will come back to this web page for the implementation code snippets.
To implement the Mobile SDK in an Android application, we have already created a template Android project that includes a basic video player implementation. To open it, launch the Android Studio application located on your dock. Then click the SummitSampleApp tab on the left side, OR click "Open an existing Android Studio project", browse to Desktop/SummitSampleApp, and click "Open".
Once done, Android Studio should start loading the Android project and you should see Gradle building the project.
Once Gradle builds successfully, we can start using the mobile installation steps from the Launch configuration. Open the build.gradle (Module: app) file located under "Gradle Scripts", copy the contents from the "Add the dependency to build.gradle:" section of the Launch UI (1), and paste them inside the "dependencies" section of the Gradle file in your project.
Tip: You can also click the copy icon to precisely copy the contents from the Launch UI. Refer to figure 18 above for the highlighted copy button.
After modifying the build.gradle file as shown in the figure above, you should see a "Sync Now" link in the top-right corner, as highlighted above. To fetch the latest mobile extension libraries configured in Launch, click "Sync Now" and wait for the Gradle build to finish.
Open the SummitSampleApp.java file located under the apps/java/com.adobe.media.sample directory of your Android Studio project. To register all the installed mobile extensions, execute the following steps:
In the end, after copy-pasting all the code from the Launch UI into SummitSampleApp.java, your application code should look like the following figure.
At this point, your application is configured to use Adobe Launch extensions for Analytics tracking. Your template project already includes a video player implementation that plays sample video content and executes the Media tracking APIs. If you are interested in the Media SDK extension API implementation, refer to the "MediaTrackerHelper.java" and "VideoAnalyticsProvider.java" classes under the "analytics" package. For detailed documentation on the Media APIs, please follow this URL: https://aep-sdks.gitbook.io/docs/
The last step is to run the application on the Android emulator and check the network traffic for video content tracking. To do so, simply click the "play" icon at the top of Android Studio and select the Android emulator device that is already preconfigured in your Android Studio. The PIN code is 0000.
Once the app launches, tap the play button to play the video content. At this point, the Media SDK will log an extended set of data about the content tracking process in the Logcat window.
The Logcat info is useful for understanding what happens inside the Media SDK, but it is hard to validate the implementation from a data perspective. The next lesson will present a data validation flow using Debug.
This lesson focuses on a few new features introduced in Debug that should give you out-of-the-box validation at any moment during implementation, testing, or after a release.
Debug is a tool that helps Video & Audio customers validate Media SDK and Media Collection API implementations across all devices (desktop, mobile, tablet, Android TV, Apple TV, Fire TV, Roku, smart TVs, Xbox, PlayStation, or any other connected device). The tool is not limited to video and audio tracking validation; it supports other solutions as well.
If you don't have access to Debug, please ask your organization admin to grant you access via the Admin Console - https://adminconsole.adobe.com. If your organization is not provisioned for Debug, you will need to request provisioning. Ask your consultant to help you out.
To help you validate video and audio implementations faster, we added a series of pre-defined scenarios that can be used to check your media tracking.
Open your Chrome browser and log in to Debug at https://debug.adobe.com
To speed things up, we granted all of you access to Debug using the same credentials as for Launch.
After login, the generic dashboard will be loaded. On the top right corner, you will notice the Proxy is Off.
Start the Adobe Debug Proxy that is already installed on your machine. Use the machine credentials if prompted.
Once the Proxy is ON, click the "New Report" button to start a new Debug session. For this exercise, select the Adobe Analytics and Video Heartbeats plugins and then click "Start Debug Report".
Open the Android emulator and press the play button. Let the content play to completion; it is only 1 minute long. You can switch back to the Debug tab and watch the data being collected in real time.
Once the playback is complete, click the "Stop Report" button in the session view to stop recording any further data. At that point, two new tabs will become available: "Audit Report" and "Media Summary". On the "Calls" tab you can inspect any captured call and check for potential warnings or errors.
On the "Media Summary" tab you will find a list of metrics captured during playback that should be available for reporting in Analytics. The data is grouped by Content ID and then by playback session ID. For each playback session you can run any of the pre-defined validation scenarios. In our case, please select the "Complete Playback" scenario and click "Validate". For each scenario, several data points are checked, and based on those the tool provides a list of all passed and failed steps. Each step provides hints about what is checked, so you know what has to be fixed to pass all steps and, finally, the scenario itself. Note that there is no point in trying to pass multiple scenarios with a single playback; run a separate playback for each scenario you want to validate.
The goal of this lesson is to give a high-level presentation of our Media Collection API solution and how it can be used for Downloaded Content Tracking.
The Media Collection API is the alternative solution for platforms where we don't have an SDK, or for customers that don't want to include 3rd-party SDKs in their apps. The general concepts are:
Pro Tip! Make sure that you will use the same user identification values (MID, VID) across all Experience Cloud tracking solutions.
Switch to the Debug tab in your browser and start a new Debug session. This time, please select only the Media Collection API plugin.
Open a new browser tab at http://localhost/api and press the play button. Let the content play to completion; it is only 1 minute long. You can switch back to the Debug tab and watch the data being collected in real time.
Once the playback is complete, click the "Stop Report" button in the session view to stop recording any further data. At that point, two new tabs will become available: "Audit Report" and "Media Summary". On the "Calls" tab you can inspect any captured call and check for potential warnings or errors.
On the "Media Summary" tab you will find a list of metrics captured during playback that should be available for reporting in Analytics for the API tracking solution.
The scenario that has surfaced this use case is the following:
The API call for Download Content Tracking should contain the entire tracking data as below:
_endpoint: http://custom.hb-api.omtrdc.net/api/v1/sessions

```
payload: {
  data: [
    {eventType: "sessionStart",    playerTime: {playhead: 0,  ts: 1529997923478}, params: {}, customMetadata: {}, qoeData: {}},
    {eventType: "play",            playerTime: {playhead: 0,  ts: 1529997928174}},
    {eventType: "ping",            playerTime: {playhead: 10, ts: 1529997937503}},
    {eventType: "ping",            playerTime: {playhead: 20, ts: 1529997947533}},
    {eventType: "ping",            playerTime: {playhead: 30, ts: 1529997957545}},
    {eventType: "sessionComplete", playerTime: {playhead: 35, ts: 1529997960559}}
  ]
}
```
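The same payload can be reconstructed and sanity-checked in Python before posting it to the sessions endpoint. The `validate_session` helper below is our own illustrative check, not part of the API:

```python
# Re-creating the downloaded-content event sequence shown above and
# sanity-checking it before sending it to the sessions endpoint.
events = [
    {"eventType": "sessionStart",    "playerTime": {"playhead": 0,  "ts": 1529997923478},
     "params": {}, "customMetadata": {}, "qoeData": {}},
    {"eventType": "play",            "playerTime": {"playhead": 0,  "ts": 1529997928174}},
    {"eventType": "ping",            "playerTime": {"playhead": 10, "ts": 1529997937503}},
    {"eventType": "ping",            "playerTime": {"playhead": 20, "ts": 1529997947533}},
    {"eventType": "ping",            "playerTime": {"playhead": 30, "ts": 1529997957545}},
    {"eventType": "sessionComplete", "playerTime": {"playhead": 35, "ts": 1529997960559}},
]

def validate_session(events):
    """Illustrative pre-flight checks: session must start and end correctly,
    and timestamps must be monotonically increasing."""
    assert events[0]["eventType"] == "sessionStart"
    assert events[-1]["eventType"] in ("sessionComplete", "sessionEnd")
    ts = [e["playerTime"]["ts"] for e in events]
    assert ts == sorted(ts), "timestamps must be monotonically increasing"
    return {"data": events}

payload = validate_session(events)
```

For downloaded content, the whole sequence is captured offline and sent in one batch, which is why the entire tracking data arrives in a single call.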
In this lesson, we will use Analysis Workspace to understand the video and audio tracking data. The goal is to understand how to look at the data and learn how to apply an alert on relevant media metrics.
Open a browser tab at https://experiencecloud.adobe.com and log in using the provided credentials. Then click the menu icon in the top-right corner and select Analytics from the menu.
Once you are in the Analytics view, select Workspace from the top-left corner. That will open a list view of all the Analysis Workspace projects available for your account.
Click "Create New Project", select the "Audio Consumption" template, then click "Create".
The new project provides a set of trends based on the data available in that report suite. To see the entire media data set (video and audio), remove the "Media Stream Type: Audio" segment, or use the "Media Stream Type: Video" segment instead. The first two panels show top numbers and trends for the last completed week and month, along with the top 5 content items by average consumption time.
The last panel contains a list of important metrics and trends for the current week. There are also a few breakdowns that should give you an overview of usage across multiple segments.
Finally, at the bottom of the page, there is a breakdown of playback experience by device type. By default, this table contains the number of streams started and the numbers impacted by different issues (like buffering, bitrate changes, and errors).
Right from the start, we can notice that:
Pro Tip! Analysis Workspace offers a lot of flexibility, such as multiple views and breakdowns in the same project, which is useful when you want to determine whether there are issues with the data or at the player level.
To investigate the issues above, we will remove Buffer Impacted Streams, Bitrate Change Impacted Streams, and Error Impacted Streams from the Playback Experience by Device Type table and add new metrics.
We have already set up this report suite to store additional properties that are sent to Analytics but not set up by default:
So, let's add the following metrics to our working table: Content Completion Rate, Media Close Ratio, Error Events, Content Time Spent, and Stall Time (HH:MM:SS).
It's obvious that:
To get a better understanding of the potential issues, we will break the data down by a few dimensions:
Now we have a starting point for any further investigation: clear information about which apps are generating "bad" data. As next steps, you could use more breakdowns to determine whether a specific device, OS version, or browser type has a negative impact on your data.
Pro Tip! You might consider storing additional media tracking properties that could help you debug and validate your implementation.
Another useful thing to do is to create alerts on the metrics that matter to you, so you don't miss important data. To do that, right-click the metric you want to alert on and select "Create alert from selection".
In this case, the alert is set to run every day and to send notifications when the close ratio is under 95% or over 105%. Basically, you want to make sure that the number of sessions started is almost the same as the number of sessions closed.
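The alert logic boils down to a simple ratio check. Here is a minimal sketch, with thresholds taken from the 95%/105% band described above; the function names are our own, not Analytics API calls.

```python
# Sketch of the Media Close Ratio check behind a daily alert:
# sessions closed vs. sessions started, alerting outside the 95%-105% band.
def close_ratio(sessions_started: int, sessions_closed: int) -> float:
    return sessions_closed / sessions_started

def should_alert(ratio: float, low: float = 0.95, high: float = 1.05) -> bool:
    return ratio < low or ratio > high

assert should_alert(close_ratio(1000, 900)) is True    # only 90% closed -> alert
assert should_alert(close_ratio(1000, 980)) is False   # 98% closed -> healthy
```

A ratio persistently below the band usually means sessions are started but never closed (e.g., missing COMPLETE/END events), which also increases data processing latency.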
Pro Tip! You might consider adding alerts on your top metrics to be able to act as soon as possible.