L778 - Adobe Experience Platform: Developer and Architect Deep Dive

Lab Overview

This workshop is intended to cover the following topics:

  • Adobe Experience Platform overview and use cases
  • Getting set up on and using Adobe Experience Platform
  • Experience Data Model (XDM) & Schema Registry APIs
  • Data ingestion APIs
  • Data lookup and processing APIs
  • Unified Profile Service, Segmentation APIs
  • Experience Platform Launch and streaming data into Experience Platform

In the course of completing this lab, you will learn how to access Adobe Experience Platform and perform different functions using our robust APIs. Experience Platform was built API-first, so the full functionality is at your fingertips. We will be looking at this process through the eyes of Luma, a fictitious yoga-based lifestyle site and online store. Adobe Experience Platform allows Luma to build the basis for its customer experience and loyalty program on the Unified Profile Service.

Adobe Experience Platform enables Luma to understand its users via Unified Profile, by bringing in existing customer profiles through batch ingestion, creating profiles and adding real-time behavioral data through streaming ingestion, and managing those Unified Profiles using the segmentation engine. All of this data helps to hydrate the Unified Profile, giving Luma a full real-time view of their customers.

Key Takeaways

After completion of this workshop, you should be able to:

  • Define and select an XDM Schema
  • Ingest sample data in XDM format into Experience Platform
  • Understand what can be done with XDM data and Unified Profiles
  • Use Experience Platform Launch to stream data from a webpage into Platform
  • Segment profiles created in the Unified Profile Service based on ingested data

Prerequisites

Lab Instructors

Lesson 0 - Authentication: Login and Accessing with Postman

Objective

  1. Set up an Adobe I/O Integration to call Adobe APIs
  2. Set up Postman to do API calls to Adobe Experience Platform
  3. Create your local web server to host a Luma demo site for data streaming (Lesson 3)

Lesson Resources

Lesson Context

The Adobe Experience Platform APIs give you a suite of abilities: viewing schema, extending and creating custom schema, and more. Before you can use them, there are a few steps to get up and running with the Experience Platform API environment. These include creating an integration to link your account to Adobe's APIs, and testing that link in Postman. Finally, we will quickly create a local web server to host Luma's website for use in Lesson 3.

Exercise 0.1 - Create an integration

  1. An integration can be subscribed to one or more services. In many cases, you will use the same client credentials to access multiple Adobe services. In addition to APIs, you may also subscribe your integration to I/O Events so that your applications can access content and services in real-time.

  2. Create Certificate

    Open terminal and execute below command:

    openssl req -x509 -sha256 -nodes -days 365 -newkey rsa:2048 -keyout private.key -out certificate_pub.crt
    

    You will get a response similar to the following which prompts you to enter some information about yourself:

    Generating a 2048 bit RSA private key
    .................+++
    .......................................+++
    writing new private key to 'private.key'
    -----
    You are about to be asked to enter information that will be incorporated
    into your certificate request.
    What you are about to enter is what is called a Distinguished Name or a DN.
    There are quite a few fields but you can leave some blank
    For some fields there will be a default value,
    If you enter '.', the field will be left blank.
    -----
    Country Name (2 letter code) []:
    State or Province Name (full name) []:
    Locality Name (eg, city) []:
    Organization Name (eg, company) []:
    Organizational Unit Name (eg, section) []:
    Common Name (eg, fully qualified host name) []:
    Email Address []:
    

    After entering the information, two files will be generated: certificate_pub.crt and private.key. These files can be found in the same directory that you ran the openssl command from.

    Note certificate_pub.crt will expire in 365 days. You can make the period longer by changing the value of days in the openssl command above, but rotating credentials periodically is a good security practice.

    The certificate_pub.crt certificate will later be uploaded to the Adobe I/O Console when you create an API key for access to any Adobe I/O API.

    Your private key file, private.key, will be used later to sign your JWT.

    Note: Don't close this terminal window as you will need it later.

  3. Navigate to the Adobe I/O Console and sign in with your Adobe ID.

  4. From this page we want to create a New Integration.

    New integration

  5. You will then be prompted to Access an API or to Receive near-real-time events. We will be accessing APIs so select Access an API and then Continue.

    New integration 1

  6. The drop-down menu on the top right of the screen is where you would switch your organization if your account is tied to multiple organizations. We are selecting Summit Platform Lab 03 and Experience Platform API under Adobe Experience Platform.

    New integration 2

  7. Fill in your Integration Details. Afterwards, click Select a File to upload the certificate_pub.crt file we generated in the previous section. Then click AEP Summit 2019 to select this product profile. Finally, click Create Integration to finish the process.

    New integration 4

  8. After creating your integration, you will be able to view its details. After clicking Retrieve Client Secret, your screen should look similar to this.

    Access values

    Copy down the values for {API KEY}, {IMS ORG} which is the Organization ID, and {CLIENT SECRET} as these will be used in the next step.

Exercise 0.2 - Authenticate via Postman

  1. Start Postman

  2. Click the Import button on the top left.

    Select the L778.postman_collection.json collection file from this repository.

  3. Next we need to import our environment. Click on the settings logo to bring up the Manage Environments dialog.

  4. Then click on Import

  5. Select the L778.postman_environment.json file to import the environment.

  6. Now click on the newly imported L778 - Environment.

  7. Fill out the values for:

    • clientID
    • clientSecret
    • OrgID
    • TechAcctID

    that you generated when you created your new integration.

    Fill these out in both the "Initial Values" and the "Current Values" fields.

    Your Account ID is assigned by your computer's label and should appear as L778+###. Fill out the ldap field in Postman with your Account ID (L778+###) so you'll be able to uniquely identify the datasets you create.

  8. Copy the contents of private.key and use it as the value for secret.

    From the same terminal you ran openssl, execute the following command:

    pbcopy < private.key
    

    Paste the contents of the clipboard into the secret field, including the -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY----- lines up to the last -. When you paste the contents of the clipboard into Postman you may end up with an extra newline character.

    Make sure you delete this character, as it is not required and will cause problems with authentication.

  9. Click Update and close the Manage Environments dialog.

  10. Now make sure you select the L778 - Environment from the environments drop-down at the top right of Postman.

  11. After all this setup you are now ready to generate a JWT and bearer token to interact with Adobe I/O. To make this process easier we'll be using an Adobe I/O Runtime action.

    From our newly imported L778 collection, open Lesson 0 and click on Adobe I/O Runtime: Generate Auth. Then click on the Body tab:

    All of that work you did to set up the environment has been put to good use. Each Postman call will take advantage of these values.

  12. Now click Send and scroll down to the response section:

    That JSON response includes an access_token, which is the Bearer token used to authenticate with Adobe I/O. The Postman call will save this value in an environment variable for future use. (If you're curious what the Runtime action is doing for you, see the curl sketch below.)
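
Behind the scenes, the Runtime action performs Adobe's standard JWT-to-bearer-token exchange against IMS. Purely for reference, here is a minimal curl sketch of that exchange; $API_KEY and $CLIENT_SECRET stand in for your integration values, and $JWT is assumed to already hold a JWT signed with private.key (which is exactly the part the Runtime action handles for you).

    # Exchange a signed JWT for a bearer token (what the Runtime action does for us).
    # $JWT is assumed to be a JWT already built and signed with private.key.
    curl -X POST 'https://ims-na1.adobelogin.com/ims/exchange/jwt' \
      -H 'Content-Type: application/x-www-form-urlencoded' \
      --data-urlencode "client_id=$API_KEY" \
      --data-urlencode "client_secret=$CLIENT_SECRET" \
      --data-urlencode "jwt_token=$JWT"
    # The JSON response contains "access_token" -- the Bearer token Postman stores.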

Exercise 0.3 - Set up Fenix Web Server

  1. Download the Luma website.

  2. Unzip the Luma-local.zip file and make note of which directory you extract the content to.

  3. Start the Fenix web server.

  4. Select the Web Servers menu and click New

  5. Use Luma as the descriptive name.

  6. Click on the folder in the Directory input field. Then navigate the file dialog to the directory where you unzipped the Luma-local folder (on the lab machines it should be Desktop/resources/Luma-local), highlight it and click Select.

  7. Click Create.

  8. Click on the Play button to start the server.

  9. Now your Luma site should be ready to be browsed.

  10. Navigate to http://luma.com to test the web server. You should see the following:

Lesson 0 - Wrap

Whew! We are finally ready to start calling the Adobe Experience Platform APIs for real. We've run through creating an integration and getting authenticated.

Lesson 0 - Additional Resources

Lesson 1 - Schema: Explore and Define XDM Schema

Objective

  1. Understand the structure of XDM Schema
  2. Learn to list, look up, create, and extend schema via the API

Lesson Resources

Lesson Context

When setting up an Experience Platform environment as Luma, we will want to bring in data through batch ingestion or stream in data from the Luma website. This first requires us to define the schema we want to use for our data. A quick rundown on XDM schemas: these are the defining schemas that all datasets on Experience Platform use. They begin with a base class of either person-related profile data or time-based event data, and Platform users can add "mixins" to each class to extend their schema with the data fields they require. There are several global mixins that come packaged with Experience Platform, and users can also define their own mixins to fit their specific use cases.

Using the Experience Platform APIs, a suite of abilities to view, extend and create schema is available to you. To start, we often use either the Profile or ExperienceEvent schema class as our base and add "mixins" to extend the schema to fit the data we're looking to ingest. Whether extending a class or defining a custom schema, it is sometimes easier to do so with a JSON representation that we can submit as a POST call rather than defining each field in the Experience Platform UI.

Exercise 1.1 - Schema Lookup

  1. Now that you've been authenticated by following the steps in the previous lesson, you are ready to make API calls.

  2. Start by expanding the Lesson 1 & List Schemas folder in Postman under L778.

  3. Select the XDM: Get All Classes action and click Send.

    In the response pane you can scroll down and see the classes that are available in XDM. The ExperienceEvent class is used as a base for time-based data, while Profile is used for person-based data.

  4. Let's drill down a bit and view the Profile class. If you select XDM: Get Profile Class and click Send, you should see something similar to:

  5. There are not many classes in XDM, but they are used as the base for all tenant schemas. If you select XDM: Get All Tenant Schemas and click Send, you should see something similar to:

    This will list all of the schemas that have already been created for this tenant.

  6. Now select XDM: Get All Mixins and click Send; you should see something like the following:

    This will return all of the mixins that ship with Experience Platform. Scroll down until you see Profile Personal Details, which we are going to look up next.

  7. Now select XDM: Get Profile Mixin and click Send; you will see the following:

    This is a mixin you will add to your schema when you need common personal details like home address, home phone, etc. (To see roughly what these Postman actions send over the wire, take a look at the curl sketch below.)
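
For reference, here is roughly what these Postman actions send over the wire - a minimal curl sketch of the tenant-schema listing, assuming $ACCESS_TOKEN, $API_KEY and $IMS_ORG hold the values from your Postman environment. The Accept header asks the Schema Registry for short-form (ID and title) results.

    # List all schemas created for this tenant (the "XDM: Get All Tenant Schemas" call).
    curl 'https://platform.adobe.io/data/foundation/schemaregistry/tenant/schemas' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Accept: application/vnd.adobe.xed-id+json'

Swapping tenant/schemas for global/classes or tenant/mixins reproduces the class and mixin lookups in the same way.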

Exercise 1.2 - Schema Creation

  1. Now that we've seen the API calls to query schema, let's move on to creating a custom schema of our own. Expand the Create Schema folder under Lesson 1 in Postman, select XDM: Get Tenant ID and click Send.

    This API call will determine which tenant you are currently logged in as and save the value as an environment variable in Postman for future API calls.

  2. Now select XDM: Create New Schema.

    As you can see, our simple schema extends "context/profile" and that's it. At its most basic, a schema need only extend a class, which captures the minimum viable schema. This schema will be named L778+[AccountID]_Customers_Schema, as it is the schema we will use for user profile data ingestion in Lesson 2.

    Click Send

    In the response section you will see the ID of your newly created schema, which we will use to create a dataset in Lesson 2. (A raw curl sketch of this call appears at the end of this exercise.)

  3. Now select XDM: Get Custom Profile Schema and click Send.

    Get custom profile schema

    The response section should return the details of the schema, which look an awful lot like the Profile class because that is the base of our schema. In the meta:extends section you can see that our schema extends the Profile class as well as the four entities (record, auditable, identitymap and extensible) that the Profile class extends.

  4. Now that the schema has been created, we can add mixins to it. The mixins available for use depend on the class selected. Mixins provide fields for any schema that extends the same class. This allows a concept, like "person details", to be predefined so that it can be reused in any schema that needs to capture that information. Let's add the "profile person details", "profile personal details" and "profile preferences details" mixins to our simple schema so that we can capture the typical details of a person, such as full name, birthday, etc. as part of this profile schema.

    Select XDM: Add Person Details Mixin

    Click Send

    You will notice there are new entries in the meta:extends section for profile-person-details, profile-personal-details and profile-preferences-details.

  5. We would also like our schema to capture information related to a loyalty program. There is no out-of-the-box loyalty mixin, so we will create a new mixin of our own. The Loyalty mixin defines a loyalty field that can be reused in any Profile-based schema. Select XDM: Create Loyalty Mixin and take a look at the Body tab to see the JSON for our Loyalty mixin:

    Click Send.

    In the response section you will see the ID of your newly created mixin. The ID will be stored in an environment variable in Postman for use in our next step.

  6. Select XDM: Add Loyalty Mixin to Custom Profile Schema. This API call will update the simple profile schema to include the Loyalty mixin. The body of the request includes the "MIXIN_ID" of the newly created Loyalty mixin.

  7. Now select XDM: Get Custom Profile Schema so we can lookup the schema and see the changes that have been made to the schema definition.

    You'll notice there is a new mixin in the meta:extends section.

  8. Now we need to make the "address" field within the personalEmail mixin an identity so that values for that field are added to the identity graph. To do this we need to create an identity descriptor that references the "personalEmail.address" field and defines the details of the identity that the email address represents. Identity descriptors can be added to multiple fields to define key fields that identify schema data for Unified Profile. In our case, we are only adding one, and making sure to mark it as the primary identifier. Select XDM: Create Identify Descriptor and look at the structure of the descriptor on the Body tab:

    Click Send.

  9. When creating a schema, it's also possible to enable it for Unified Profile, one of Experience Platform's most useful features. In fact, for a dataset to be ingested into the Unified Profile, its underlying schema must be enabled for Unified Profile as well. To enable this we need to add "union" to the "meta:immutableTags" property of the schema. Select XDM: Enable Schema for Unified Profile.

    Click Send.

    Your completed profile-based schema, L778+[AccountID]_Customers_Schema, with Unified Profile enabled is now ready for batch data ingestion of user profile data.
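
As promised, here is a raw curl sketch of the schema-creation call from step 2. It is illustrative rather than definitive: the account number in the title is a placeholder, and $ACCESS_TOKEN, $API_KEY and $IMS_ORG stand in for your Postman environment values. The only structurally required piece is the allOf reference to the Profile class.

    # Create a minimal profile-based schema by extending the Profile class.
    curl -X POST 'https://platform.adobe.io/data/foundation/schemaregistry/tenant/schemas' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/json' \
      -d '{
            "title": "L778+000_Customers_Schema",
            "description": "Profile schema for Luma customer data",
            "type": "object",
            "allOf": [
              { "$ref": "https://ns.adobe.com/xdm/context/profile" }
            ]
          }'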

Lesson 1 - Wrap

With the creation of our custom schema we are ready to move on to the next step of creating datasets and data ingestion.

Lesson 1 - Additional Resources

Lesson 2 - Data: Ingesting Data via Batch Ingestion

Objective

  • Learn how to ingest XDM-formatted data files
  • Learn the API interface for data ingestion

Lesson Resources

Lesson Context

Luma's engineers and developers are likely to also utilize the Adobe Experience Platform APIs to ingest data. This includes not just batch data of files, but also point-to-point connectors that could be transferring or streaming data from user registration forms on Luma's site, sign-in details and CRM data, as well as behavioral transactions from logged-in users using the Luma site. We will implement these different data ingestion and lookup functions using the API, and demonstrate the tools available for integration of Experience Platform, starting with batch ingestion.

For Lesson 2, we will first start with basic dataset creation, and the requirements that go into that - selecting a schema to use, creating a dataset, enabling it for Unified Profile, and bringing in batch data via files.

Exercise 2.1 - Dataset Lookup

  1. Start by expanding the Lesson 2 & List Datasets folder in Postman under Adobe Experience Platform.

  2. Skip over Catalog: Get Datasets and instead select the Catalog: Get Datasets Limit 5 action and click Send.

    In the response pane you can scroll down and see the top five datasets. We are limiting the number of datasets returned to five so the query returns quickly and doesn't process too much data. (See the curl sketch after these steps for the raw form of these Catalog calls.)

  3. Next let's run the Catalog: Get Batches action by selecting it and clicking Send.

    Datasets can consist of multiple batches.

  4. Finally let's run the Catalog: Get Dataset Files action by selecting it and clicking Send.

    In this case we'll get a list of files in the dataset, and the metadata will include which batch each file is from. Now that we've learned how to query datasets, let's get busy creating one.
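
The Catalog actions above are plain GETs with query parameters. As a hedged sketch, assuming the usual Platform gateway headers from your Postman environment:

    # Top five datasets (the "Catalog: Get Datasets Limit 5" call).
    curl 'https://platform.adobe.io/data/foundation/catalog/dataSets?limit=5' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG"
    # Swap dataSets for batches or dataSetFiles to reproduce the other two actions.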

Exercise 2.2 - Dataset Creation

  1. Expand the Create Dataset folder in Postman under Lesson 2, select the Catalog: Create CRM Dataset action and click Send.

    The call will create a new dataset named L778+[AccountID]_CRM_Dataset and return an ID we can use in future calls. The "unifiedProfile": ["enabled:true"] flag within the body ensures that this dataset is automatically included in the Unified Profile Service. (A curl sketch of this call appears after these steps.)

    Remember that the Dataset is based on the schema you select, which is based on a schema class like Profile.

    Once created it will conceptually look like this:

  2. Next we'll call Catalog: Get CRM Dataset to inspect the dataset:

    In the response area of the Postman call you can view the dataset's metadata. In our next exercise we'll populate this dataset.
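
Here is a curl sketch of the dataset-creation call. The schemaRef id and contentType below follow the general Catalog convention and should be read as an approximation of the collection's actual body; {TENANT_ID} and {SCHEMA_ID} are placeholders for the values saved in Lesson 1.

    # Create a dataset bound to our custom schema and opted into Unified Profile.
    curl -X POST 'https://platform.adobe.io/data/foundation/catalog/dataSets' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/json' \
      -d '{
            "name": "L778+000_CRM_Dataset",
            "schemaRef": {
              "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{SCHEMA_ID}",
              "contentType": "application/vnd.adobe.xed+json;version=1"
            },
            "tags": { "unifiedProfile": ["enabled:true"] }
          }'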

Exercise 2.3 - Batch Data Ingestion

  1. Now that we've successfully created a dataset we'll create a batch so we can start ingesting files.

    Expand the Create Batch folder, select Ingest: Create Batch and click Send:

  2. Next we'll upload a file to the batch.

    Select Ingest: Upload File then click on the Body tab.

  3. From there you'll need to select a file to upload by clicking on Choose Files and selecting Profile_CRM_Data_Sample.parquet.

    Then click Send to upload the file. If the upload succeeds you won't see anything in the response section other than a 200 OK.

  4. Since batches may contain multiple files we have to make an additional call to close off the batch to indicate that the batch is now ready for processing.

    So select the Ingest: Close Batch action and click Send.

    Once again you won't see anything in the response section of Postman other than a 200 OK.

  5. If we've done everything right up until this point, the file and batch should be successfully ingested. To check, open the Batch Status folder, select the Batch by ID action and click Send.

    In the response section pay close attention to the value of status. We want it to say success, but if it says processing it just means we have to wait a bit and make this Postman request again.

    In the meantime, this is what the response would look like if the ingested file brings up a failed status - likely due to not being in the correct format for that dataset:

  6. Once we've seen that the file in the batch has been successfully ingested, we can check to see how many files are in the batch. Select the Files in Batch action and click Send.

    The response pane will show you how many files are in this batch. In our case there should only be one file.

  7. Let's walk back up the containers and check on the status of our dataset. Expand the Dataset Status folder, select the Catalog: Get Dataset action and click Send.

    Scroll down in the response pane until you find the lastBatchStatus property, which should read success. (The whole batch lifecycle is sketched as raw curl calls below.)
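
For reference, the whole batch lifecycle can be sketched as three raw calls against the Batch Ingestion API. $DATASET_ID and $BATCH_ID are assumed to hold the IDs returned by Exercise 2.2 and step 1 above; the gateway headers come from your Postman environment.

    # 1. Create a batch against the dataset.
    curl -X POST 'https://platform.adobe.io/data/foundation/import/batches' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/json' \
      -d "{ \"datasetId\": \"$DATASET_ID\", \"inputFormat\": { \"format\": \"parquet\" } }"

    # 2. Upload the sample file into the batch.
    curl -X PUT "https://platform.adobe.io/data/foundation/import/batches/$BATCH_ID/datasets/$DATASET_ID/files/Profile_CRM_Data_Sample.parquet" \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/octet-stream' \
      --data-binary '@Profile_CRM_Data_Sample.parquet'

    # 3. Close the batch so Platform starts processing it.
    curl -X POST "https://platform.adobe.io/data/foundation/import/batches/$BATCH_ID?action=COMPLETE" \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG"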

Exercise 2.4 - Dataset Inspection and Preview

  1. Things are looking good for our dataset, but what if we want more information about the files stored in it? Open the Data Access folder, select the Data Export: Get File Details action and hit Send.

    This response contains important metadata like the file name and file size.

  2. Next, let's take a look at the file stored in Experience Platform.

    Select the Data Export: Get File action and hit Send.

    In this case the response is not that useful to us, as .parquet files are binary and difficult to read. However, it should make customers happy to know that whatever data they import into Adobe Experience Platform can be exported as well.

  3. Finally, let's use the preview API to get a human-readable version of the dataset we've just created. Select the Data Export: Get Dataset Preview action and hit Send.

    This response returns JSON, a textual format that can be read without any additional formatting tools. Note: preview will only show you the results of the latest batch ingested into the dataset. (The raw call is sketched below.)
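
The preview call itself is a single GET against the Data Access API - a minimal sketch, assuming $DATASET_ID holds the dataset ID from Exercise 2.2:

    # Human-readable preview of the latest batch in the dataset.
    curl "https://platform.adobe.io/data/foundation/export/datasets/$DATASET_ID/preview" \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG"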

Lesson 2 - Wrap

Well, that was a lot of work to ingest the data! Let's move on to the next step: collecting real-time data from user interactions in the form of Experience Events.

Lesson 2 - Additional Resources

Catalog Service RESTful API Resource

Lesson 3 - Streaming: Stream in Data using Launch

Objective

  • Learn how to integrate Adobe Experience Platform Launch onto a webpage
  • Create Launch rules to stream data to Experience Platform

Lesson Context

Once Luma has its customer data system set up on Experience Platform by defining schema to use, datasets to ingest data, and ingesting existing customer data, we now want to make sure that new Profile data (from sign-ups, updates, etc) and behavioral ExperienceEvent data (browsing habits, actionable triggers, etc) are recorded and updated in near real-time. We can do this by utilizing Adobe Experience Platform Launch on our website. For this exercise, we will see how we can bring in behavioral and profile data from website interactions into Adobe Experience Platform. We can flex our experience in creating new schema and new datasets from Lessons 1 & 2, and apply it to creating streaming datasets to use with Experience Platform Launch.

Lesson Resources

Launch and Luma

Some elements of Launch are pre-packaged for the sake of time. Rules, Data Elements and Extensions are already populated with a few settings that we will add to or update during this lab to demonstrate how Launch works.

Make sure that the Fenix web server is running with luma.com. Navigate to http://luma.com to verify that Luma is running properly.

Exercise 3.1 - Creating Behavioral Schema, and Datasets for Streaming

  1. Since we will be streaming in data for this lab, we will need to create new datasets that are streaming-enabled, as well as a new schema for behavioral events with the ExperienceEvent class, which we'll name L778+[AccountID]_WebBehavior_Schema. Since we covered schema creation in Lesson 1 and dataset creation in detail in Lesson 2, we've put the correct POST calls into the Postman collection for you to run. First, open the Create Streaming Datasets folder in Lesson 3, and send the Create Web Behavior Schema POST call. You should receive a response showing the structure of the ExperienceEvent, including a few mixins we've defined. This schema has also been enabled for Unified Profile.

  2. Next, we need to define the schema identity descriptor to tell Experience Platform which field is the main identifier for data in this schema; here that will be the ECID, or Experience Cloud ID, that a user is auto-assigned within their browser session. Click Define Identity Descriptor and send the POST call.

  3. Send the POST call for Create Web Behavior Event Dataset, which includes the tag "streamingIngestionEnabled": "true" within it to enable data streaming. You should then get a Dataset ID in response:

  4. Do the same with the next call, Create User Signups Dataset, to create a new Profile-based dataset with "streamingIngestionEnabled": "true" in it as well.

  5. Then send a GET call to fetch the ExperienceEvent dataset details. Your WebBehavior ExperienceEvent dataset is now created:

  6. You can also GET the details of the Profile-based User Signups Dataset that you just created for Profile data streaming:

  7. Examples of the two datasets after creation in the Experience Platform UI are below. (A curl sketch of the streaming dataset call follows.)
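
As referenced above, here is a hedged curl sketch of what a streaming-enabled dataset creation looks like. The exact tag shape is defined by the collection's POST body; treat this as an approximation, with {TENANT_ID} and {WEB_BEHAVIOR_SCHEMA_ID} as placeholders.

    # Create an ExperienceEvent dataset enabled for streaming and Unified Profile.
    curl -X POST 'https://platform.adobe.io/data/foundation/catalog/dataSets' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/json' \
      -d '{
            "name": "L778+000_WebBehavior_Dataset",
            "schemaRef": {
              "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{WEB_BEHAVIOR_SCHEMA_ID}",
              "contentType": "application/vnd.adobe.xed+json;version=1"
            },
            "tags": {
              "unifiedProfile": ["enabled:true"],
              "streamingIngestionEnabled": ["true"]
            }
          }'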

Exercise 3.2 - Setting up a Launch Property

  1. Open Adobe Experience Platform Launch at https://launch-demo.adobe.com

  2. Login with your Adobe ID, and ensure in the top right Organization selector that you are in Summit 2019 Platform Lab 03

  3. Find the property associated with your user number (L778+### Property) and open it. This property points to the URL "luma.com", which should be set up on your local machine prior to this lab. The property will open up and take you to the Launch property home screen.

  4. First, we need to set up a Launch library in the development environment. Go to the Publishing tab and click Add New Library.

  5. Next, name your library web publish and select Development as the environment.

  6. Launch requires us to select which elements in our Property that it should publish. This property already has several Data Elements and Rules set (we'll learn about this in a bit), so we need to click Add All Changed Resources at the bottom of the page to add every existing element to this library. Then, click Save & Build for Development.

  7. The last step is to select our new library as the working environment. In the top right drop down for Working Library, select our new web publish library. This will now update the library whenever we make any changes in the next steps.

Exercise 3.3 - Putting Launch into a website

  1. Now we need to take a snippet of HTML and insert it into your Luma site. This snippet is how Launch sends data to Experience Platform. Click the Environments tab from the home screen, then click the small box icon to the right of the Development row.

  2. Click the "copy" square button to the right of the script textbox to copy this snippet to the clipboard.

  3. Open up the index.html and signup.html pages of your Luma site in your text/code editor, and paste this code into the <head> tags of each file. This will enable Launch to run its rules on these two pages. Remember to paste this snippet in BOTH files.

Exercise 3.4 - Setting up Launch Rules and Elements

  1. Go to the Data Elements tab. This is where we will define elements from our Luma site to pull in values - names, emails, IDs, descriptors, etc. There are already a few populated in here that we will be using when we send data to Experience Platform.

  2. Let's take a look at the extensions we will need for our Luma site. Click the Extensions tab. You'll see that the Core extension is there by default, and that we already have the following extensions installed:

    1. Adobe Experience Platform - This is the extension needed to send data specifically to our Experience Platform instance
    2. Experience Cloud ID Service - This is the extension that allows us to identify the unique Experience Cloud ID attribute across all Adobe products

  3. We're going to need to configure the Adobe Experience Platform extension, so click Config on that box. Click on Create a streaming endpoint, and name it Launch Streaming Endpoint [AccountID], with Source ID = Launch_[AccountID], and then click Save.

  4. Select your streaming endpoint - in the text box, type your [AccountID] and your endpoint should pop up. Select it and click Save to Library and Build.

Exercise 3.5 - Sending ExperienceEvent Data (behavioral data)

  1. Now we have our streaming endpoint from the Luma site into Experience Platform defined and ready to go. The next step is to create triggers that will make actions happen on our Luma site. Go to the Rules tab. Take a look at the existing rules - we have two defined here. Look into Page Visit.

  2. You can see that the Event, Core - Page Bottom, is triggered when the browser hits the bottom of the page. This stands in as a de facto "page view", as the bottom of the page is automatically hit when the page renders. At the bottom, the Action flow shows two actions - one for the Experience Cloud ID Service, and one for Adobe Experience Platform.

  3. Click on the Experience Cloud ID Service action. You can see that it is setting the ECID to share across Adobe solutions with the ECID being sent to it from our "ECID" Data Element. Click Cancel and Discard changes (don't worry, no changes were made) to go back to the Rule editor.

  4. Now let's define data that we send to Experience Platform. Click on Adobe Experience Platform - Send Beacon. This action is sending an ExperienceEvent. You'll see that some of the variables and data elements populating those variables are already set - sending the ECID as an identity for the event, the name of the webpage currently viewed, and a timestamp of that action.

  5. There are two things here that we need to change. First, the dataset we're sending data to needs to be changed. Click the dropdown and find your corresponding L778+[AccountID]_WebBehavior_Dataset dataset from earlier. You will get a warning to make sure the fields match, which you can assume is true here.

  6. Now let's add another variable to send in our beacon - the Language data element. Click on Add Another and create the following:

    1. variable: use the "target" icon to select the Environment - browserDetails - acceptLanguage schema field
    2. value: %Language% - (Select this from the list of data elements by clicking the round stack icon next to the value field. It can also be typed in directly)

  7. Let's save our changes. Click on Keep Changes to save our Action, and then Save to Library and Build. Now the rule for "Page View" has been set.

Exercise 3.6 - Sending Profile Data (attribute data)

  1. We just set up our first rule to send an ExperienceEvent to Adobe Experience Platform! Now let's do the same with Profile data by taking a look at the Sign Up rule. Clicking on Core - Click, you can see the composition of the event that we're looking for - a click on the Submit button during sign-up, with a specific CSS selector for that button.

  2. Under the Conditions field, you can see there is a requirement that we be on the signup.html page for this rule to fire.

  3. Just like we did with the ExperienceEvent dataset, we need to select the correct dataset. Since this is a Profile-updating action, we want to send our data beacon to our L778+[AccountID]_Signups_Dataset dataset, so select that from the dropdown.

  4. Explore the actions for this rule, as several are already pre-populated. You'll notice that we are sending many variables, including identityMap, which defines who to attribute this beacon data to, sending the ECID and the user's email (which is marked as our primary identifier):

    1. %ECID% => ECID
    2. %Email Address% => Email, Primary
  5. You'll also see that the personalEmail.address value reads DELETE and needs to be corrected. This value sets one of the identities that Unified Profile uses to match XDM data together. Clear this text box and then select the Email Address data element using the round stack icon next to the text box.

  6. Click Keep Changes on the action, and then Save to Library and Build on the rule. Click on the Publishing tab to watch the status of our build and see it succeed.

Exercise 3.7 - Streaming a Launch data beacon into Adobe Experience Platform from Luma

Now we've got a Launch property set up, and a Luma instance running with the Launch streaming endpoint included. Let's watch it stream data into Experience Platform.

  1. Go to http://www.luma.com. This should be pointing to your Fenix web server instance of Luma. Right-click and open Inspect to see the page inspector.

  2. Go to the Network tab, and type dcs into the filter box. This will filter for network calls being sent to the Launch streaming URL. Refresh the page.

  3. Select the endpoint under Name and scroll down in the right pane to Request Payload. You should see an expandable JSON payload starting with {header:. Expand all of these fields and you will be able to see the XDM data that we defined in our Launch rule for "Page View" being sent from the page to Launch.

  4. Go back to your Luma screen. Now let's see our Sign-Up rule in action by going to Sign Up in the top right of the page. Scroll down to the sign-up form and fill in some dummy details with your name and email. These do not have to be real or accurate.

  5. Click Sign Up. Nothing will happen visually; our instance of Luma is stripped-down and does not contain any functionality past this page. Go back to your inspector and you can now observe two Launch beacons being sent!

  6. One beacon contains the Profile XDM data we filled into the form. You can see all the fields we defined in the "Sign Up" rule on Launch here.

  7. The other beacon is an ExperienceEvent XDM fired by the "Page View" rule, the same as when we were at the Luma Homepage. You'll notice the data for this page is basically identical, except for the webPageDetails and timestamp fields.

  8. We can also look at all of our ExperienceEvent data within Experience Platform using an API call through Postman. Within the beacon data, find the ecid: {id: "##" field and select your ECID. This ID is specific to your browser session and can be used to look up ExperienceEvent and Unified Profile data. Copy this field; we'll use it in the first step of Lesson 4. (A hedged sketch of what such a beacon looks like on the wire follows these steps.)
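
To make the beacon concrete, here is a hedged sketch of what a streamed ExperienceEvent might look like on the wire. It is an illustration only: {DATA_INLET_ID} stands for the streaming endpoint created in Exercise 3.4, the page name and timestamp are placeholder values, and the exact envelope is defined by the Launch extension rather than by this workbook.

    # Approximate shape of the JSON Launch streams to the collection endpoint.
    curl -X POST "https://dcs.adobedc.net/collection/{DATA_INLET_ID}" \
      -H 'Content-Type: application/json' \
      -d '{
            "header": {
              "schemaRef": {
                "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{WEB_BEHAVIOR_SCHEMA_ID}",
                "contentType": "application/vnd.adobe.xed-full+json;version=1"
              },
              "imsOrgId": "{IMS_ORG}",
              "source": { "name": "Launch_000" }
            },
            "body": {
              "xdmEntity": {
                "identityMap": { "ECID": [ { "id": "{YOUR_ECID}" } ] },
                "web": { "webPageDetails": { "name": "luma:home" } },
                "timestamp": "2019-03-25T12:00:00Z"
              }
            }
          }'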

Lesson 3 Wrap

Congratulations! You've got all the basics you need for your Adobe Experience Platform setup to build out not just a basis for customer loyalty programs, but also for data collection, storage, and analysis of your incoming customer data from batch ingestion and streaming data.

The streaming capabilities of data into Experience Platform are not limited to websites - mobile apps, POS devices, and other points of data capture can all stream behavioral and other types of data into Experience Platform using methods like Launch.

Lesson 3 - Additional Resources

Getting Started with Adobe Launch

Lesson 4 - Unified Profile: Querying the Unified Profile

Objective

  • Learn how to query Profile and Experience Events from the Unified Profile

Lesson Context

Much of the value of having all our data integrated with the Adobe Experience Platform is to be able to utilize the Unified Profile. This will give us a real-time customer profile that's built on a user's interactions with various channels (web, physical beacon, email) and can help segment customers better as well as do further analytics using other tools on the Experience Platform. Here we will show how to do a lookup of a user's Unified Profile - namely, the user info we just entered in Lesson 3 - using both an email address as well as an ECID.

Lesson Resources

  • Make sure to have the ECID copied from the end of Lesson 3

Exercise 4.1 - Querying Unified Profiles via the API

  1. Go to Postman, and under the Lesson 4 collection find the Behavioral Data Point Lookup call. Find the entityId field and paste the ECID you copied. Make sure that the entityIdNS field is set to "ecid". Send the GET call.

  2. You should get back a list of all ExperienceEvents associated with your ECID - browsing the homepage and the signup page. Take a look, your streaming data is now in Experience Platform!

  3. This data is now in Experience Platform! In addition, it has been ingested into the Unified Profile, since we enabled our streaming datasets for Unified Profile when we created them in Lesson 3. To confirm this in the API while we're still in Postman, take the ECID you used in the last step and make a similar call with the GET call for Unified Profile Lookup. You will then see the full Unified Profile associated with your ECID - your sign-up data should now be available as a profile view. (Both lookups are sketched as curl calls below.)

    NOTE: It may take up to 5 minutes for the Unified Profile data to refresh and show your profile data
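
Both lookups hit the same Profile Access endpoint; only the query parameters differ. A minimal curl sketch, assuming $ECID holds the ID copied from Lesson 3 and the gateway headers come from your Postman environment:

    # ExperienceEvents related to the profile identified by your ECID.
    curl -G 'https://platform.adobe.io/data/core/ups/access/entities' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      --data-urlencode 'schema.name=_xdm.context.experienceevent' \
      --data-urlencode 'relatedSchema.name=_xdm.context.profile' \
      --data-urlencode "relatedEntityId=$ECID" \
      --data-urlencode 'relatedEntityIdNS=ecid'

    # The Unified Profile itself, keyed on the same ECID.
    curl -G 'https://platform.adobe.io/data/core/ups/access/entities' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      --data-urlencode 'schema.name=_xdm.context.profile' \
      --data-urlencode "entityId=$ECID" \
      --data-urlencode 'entityIdNS=ecid'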

Lesson 4 Wrap

Well that was quick and easy - and brings us to the end of our journey for the regular portion of our lab! You were able to observe and perform the most critical and necessary functions of the Adobe Experience Platform completely from a developer or architect's point of view - using APIs and external sources to help complement the Experience Platform UI or to help you integrate Experience Platform's features into your own custom applications.

Lesson 4 Additional Resources

Unified Profile Overview

[BONUS] Lesson 5 - Segmentation: Segment the Unified Profile

NOTE: This section is bonus material and can be completed within this lab, if you have extra time.

Objective

  • Learn how to segment an audience of profiles using the API
  • Learn how to post segment jobs and export them to a new dataset
  • Define a segment via API in JSON
  • Create a segment job and export it to a new dataset

Lesson Context

Now that Luma has a system with Unified Profile populated from both batch customer data and streaming data, and demonstrated the ability to look up Unified Profile data, we can utilize Adobe Experience Platform to take further action on these profiles. Using the segmentation engine, we can specify filters to segment our database of Luma customers. These segments can then be delivered directly to other platforms either through Experience Platform's Segment Builder integrations with other Adobe solutions, or by creating new datasets of segmented customers to export to external solutions.

This will help support several aspects of Luma's customer loyalty program, including targeted upgrade promotions, re-engagement or re-activation of customers, and garnering insights from loyalty program statistics. Because all data for a customer, including and beyond customer loyalty, will be tied to one Unified Profile, we can bring in more information to further segment customers, or use segments for a broader scope of actions.

Lesson Resources

Exercise 5.1 - Segmentation via the API

  1. Now that we've ingested some data in Lesson 2 into our Unified Profile Service, we're ready to begin segmenting the data in order to gain insights.

  2. Start by expanding the Lesson 5 folder in Postman under L778.

  3. Select the Profile: Create A Segment Definition action. Note that in the body section of the action we are creating a simple segment that will include all females who have interacted with the system in the past 90 days. (A curl sketch of this call follows these steps.)

  4. Then click Send to create the segment and view the results in the response section of Postman.

  5. Now that we've been able to define a segment the next step is to start a job to run the segment query for us. So select POST Segment Job and hit Send.

  6. A segment job may take a little while to run, so let's check on its status by selecting GET Segment Job status and hitting Send.

    If the status property says PROCESSING wait a little longer and Send the query again until you see SUCCEEDED.

  7. Let's create a new dataset to hold the results of our segment job.

  8. Now that the job has succeeded and we have a dataset to hold our segmentation data it's time to export the data. Select POST Export Job and hit Send.

  9. We are getting close to being able to view our segment data. Select the Get Export Job By Id action and hit Send.

    If the status property says PROCESSING wait a little longer and Send the query again until you see SUCCEEDED.

  10. Now the easiest way to view your segmented dataset is to log in to https://platform.adobe.com, click on the Data tab, then click on {{ldap}}_ProfileSegmentDataset and finally the Preview button.

    As you can see all of the people in the dataset are female.
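
As mentioned in step 3, here is a hedged curl sketch of the segment-definition call. The PQL expression is deliberately simplified to the gender predicate; the lab's actual definition also restricts the audience to people who interacted in the past 90 days, and the collection's body remains the authoritative version.

    # Define a segment of female profiles (simplified PQL).
    curl -X POST 'https://platform.adobe.io/data/core/ups/segment/definitions' \
      -H "Authorization: Bearer $ACCESS_TOKEN" \
      -H "x-api-key: $API_KEY" \
      -H "x-gw-ims-org-id: $IMS_ORG" \
      -H 'Content-Type: application/json' \
      -d '{
            "name": "L778+000_Female_Recent",
            "schema": { "name": "_xdm.context.profile" },
            "expression": {
              "type": "PQL",
              "format": "pql/text",
              "value": "person.gender = \"female\""
            }
          }'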

Additional Resources

Unified Profile, Identity, & Segmentation

Next Steps

Now that we've seen how to use Adobe Experience Platform's APIs to empower Luma, the opportunities to understand and utilize the real-time customer profile that Unified Profile offers are numerous. As an API-first platform, Adobe Experience Platform allows developers to take full control of its features and integrate them into their own solutions as well.

We hope you enjoyed this workshop and that this workbook can serve as a useful reference for you going forward. Please don't hesitate to reach out to one of the Platform team members if you have any questions, and visit our booth at the pavilion to see the many other faces of Adobe Experience Platform!

Additional Resources

Adobe Experience Platform API Documentation