This workshop is intended to cover the following topics:
In the course of completing this lab, you will learn how to access Adobe Experience Platform and perform different functions using our robust APIs. Experience Platform was built API-first, so the full functionality is at your fingertips. We will be walking through this process through the eyes of Luma, a fictitious yoga-based lifestyle site and online store. Adobe Experience Platform allows Luma to build the basis for its customer experience and loyalty program on the Unified Profile Service.
Adobe Experience Platform enables Luma to understand its users via Unified Profile, by bringing in existing customer profiles through batch ingestion, creating profiles and adding real-time behavioral data through streaming ingestion, and managing those Unified Profiles using the segmentation engine. All of this data helps to hydrate the Unified Profile, giving Luma a full real-time view of their customers.
After completing this workshop, you should be able to:
The Adobe Experience Platform APIs give you a suite of abilities to view schemas, extend and create custom schemas, and more. Before we can use them, there are a few steps to get up and running with the Experience Platform API environment: creating an integration to link your account to Adobe's APIs, and testing this link in Postman. Finally, we will quickly create a local web server to host Luma's website for use in Lesson 3.
An integration can be subscribed to one or more services. In many cases, you will use the same client credentials to access multiple Adobe services. In addition to APIs, you may also subscribe your integration to I/O Events so that your applications can access content and services in real-time.
Create Certificate
Open a terminal and execute the command below:
openssl req -x509 -sha256 -nodes -days 365 -newkey rsa:2048 -keyout private.key -out certificate_pub.crt
You will get a response similar to the following which prompts you to enter some information about yourself:
Generating a 2048 bit RSA private key
.................+++
.......................................+++
writing new private key to 'private.key'
-----
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) []:
State or Province Name (full name) []:
Locality Name (eg, city) []:
Organization Name (eg, company) []:
Organizational Unit Name (eg, section) []:
Common Name (eg, fully qualified host name) []:
Email Address []:
After entering the information, two files will be generated: certificate_pub.crt and private.key. These files can be found in the same directory from which you ran the openssl command.
Note: certificate_pub.crt will expire in 365 days. You can make the period longer by changing the value of days in the openssl command above, but rotating credentials periodically is a good security practice.
The certificate_pub.crt certificate will later be uploaded to the Adobe I/O Console when you create an API key for access to any Adobe I/O API.
Your private key file, private.key, will be used later to sign your JWT token.
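For reference, the claim set inside that JWT can be sketched in a few lines of Python. This is a hedged illustration: the org, account, and key values are placeholders, and the metascope claim URL is an assumption based on the Data Services SDK scope - your integration's details from the Adobe I/O Console are what actually belong here. Signing the claims with private.key (RS256) would be done by a JWT library such as PyJWT.

```python
import time

# Hypothetical placeholder values -- substitute your own integration details.
IMS_ORG = "YOUR_IMS_ORG@AdobeOrg"
TECH_ACCOUNT_ID = "YOUR_TECH_ACCOUNT@techacct.adobe.com"
API_KEY = "YOUR_API_KEY"
IMS_HOST = "ims-na1.adobelogin.com"

def build_jwt_claims(ims_org, tech_account_id, api_key, ims_host, ttl_seconds=300):
    """Assemble the claim set for an Adobe I/O service-account JWT.

    The signed JWT is later exchanged for a bearer token. The exact
    metascope claim depends on which services your integration is
    subscribed to; the one below is an assumed example.
    """
    return {
        "exp": int(time.time()) + ttl_seconds,          # short-lived expiry
        "iss": ims_org,                                  # your IMS Org ID
        "sub": tech_account_id,                          # technical account ID
        "aud": "https://%s/c/%s" % (ims_host, api_key),  # audience ties the JWT to your API key
        # Metascope claim granting access to the subscribed service (assumed):
        "https://%s/s/ent_dataservices_sdk" % ims_host: True,
    }

claims = build_jwt_claims(IMS_ORG, TECH_ACCOUNT_ID, API_KEY, IMS_HOST)
```

In this lab the Adobe I/O Runtime action in Lesson 0 performs this assembly and signing for you, so this sketch is only to show what that helper is doing on your behalf.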
Note: Don't close this terminal window as you will need it later.
Navigate to the Adobe I/O Console and sign in with your Adobe ID.
From this page we want to create a New Integration.
You will then be prompted to Access an API or to Receive near-real-time events. We will be accessing APIs so select Access an API and then Continue.
The drop-down menu at the top right of the screen is where you would switch your organization if your account is tied to multiple organizations. We are selecting Summit Platform Lab 03, and Experience Platform API under Adobe Experience Platform.
Fill in your Integration Details. Afterwards, click Select a File to upload the certificate_pub.crt file we generated in the previous section. Then click AEP Summit 2019 to select this product profile. Finally, click Create Integration to finish the process.
After creating your integration, you will be able to view its details. After clicking Retrieve Client Secret, your screen should look similar to this.
Copy down the values for {API KEY}, {IMS ORG} which is the Organization ID, and {CLIENT SECRET} as these will be used in the next step.
Start Postman.
Click the Import button at the top left.
Select the L778.postman_collection.json collection file from this repository.
Next we need to import our environment. Click on the settings icon to bring up the Manage Environments dialog.
Then click Import and select the L778.postman_environment.json file to import the environment.
Now click on the newly imported L778 - Environment.
Fill out the values for {API KEY}, {IMS ORG}, and {CLIENT SECRET} that you copied down when you created your new integration. Fill these out in both the "Initial Value" and "Current Value" fields.
Your AccountID is assigned by your computer's label and should appear as L778+###. Fill out the ldap field in Postman with your Account ID (L778+###) so you'll be able to uniquely identify the datasets you create.
Copy the contents of private.key and use it as the value for secret.
From the same terminal where you ran openssl, execute the following command:
pbcopy < private.key
Paste the contents of the clipboard into the secret field, including the -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY----- lines up to the last -. When you paste the contents of the clipboard into Postman you may end up with an extra newline character. Make sure you delete this character, as it is not required and will cause problems with authentication.
Click Update and close the Manage Environments dialog.
Now make sure you select the L778 - Environment from the environments drop-down at the top right of Postman.
After all this setup you are now ready to generate a JWT and bearer token to interact with Adobe I/O. To make this process easier, we'll be using an Adobe I/O Runtime action.
From our newly imported L778 collection, open Lesson 0 and click on Adobe I/O Runtime: Generate Auth. Then click on the Body tab:
All of that work you did to set up the environment has been put to good use. Each Postman call will take advantage of these values.
Now click Send and scroll down to the response section:
That JSON response includes an access_token, which is the Bearer token used to authenticate with Adobe I/O. The Postman call will save this value in an environment variable for future use.
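To make the role of that token concrete, here is a minimal Python sketch of how the access_token would be combined with the {API KEY} and {IMS ORG} values into request headers on subsequent calls. The x-api-key and x-gw-ims-org-id header names are the ones commonly used by Platform APIs, though the Postman collection manages all of this for you via environment variables; the token and IDs below are placeholders.

```python
import json

# Example shape of the auth response (values are illustrative, not real tokens).
auth_response = json.loads(
    '{"access_token": "eyJhbGciOi...", "token_type": "bearer", "expires_in": 86399}'
)

def platform_headers(access_token, api_key, ims_org):
    """Build the headers a Platform API call would carry."""
    return {
        "Authorization": "Bearer " + access_token,  # the saved access_token
        "x-api-key": api_key,                        # {API KEY} from your integration
        "x-gw-ims-org-id": ims_org,                  # {IMS ORG} organization ID
    }

headers = platform_headers(auth_response["access_token"], "YOUR_API_KEY", "YOUR_ORG@AdobeOrg")
```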
Download the Luma website.
Unzip the Luma-local.zip file and make note of the directory you extract the contents to.
Start the Fenix web server.
Select the Web Servers menu and click New.
Use Luma as the descriptive name.
Click on the folder icon in the Directory input field. Then navigate the file dialog to the directory where you unzipped the Luma-local folder (on the lab machines it should be Desktop/resources/Luma-local), highlight it, and click Select.
Click Create.
Click the Play button to start the server.
Now your Luma site should be ready to be browsed.
Navigate to http://luma.com to test the web server. You should see the following:
Whew! We are finally ready to start calling the Adobe Experience Platform APIs for real. We've run through creating an integration and getting authenticated.
When setting up an Experience Platform environment as Luma, we will want to bring in data through batch ingestion or stream in data from the Luma website. This first requires us to define the schema that we want to use for our data. A quick rundown on XDM schemas: these are the defining schemas that all datasets on Experience Platform use. They begin with a starting class of either person-related profile data or time-based event data, and Platform users can add "mixins" to each class to extend their schema with the data fields they require. There are several global mixins that come packaged with Experience Platform, and users are also able to define their own mixins to fit their specific use cases.
Using the Experience Platform APIs, a suite of abilities to view, extend and create schema are available to you within the API. In order to start, we often utilize either the Profile or ExperienceEvent schema class as our base, and add "mixins" to extend our schema to fit the use case of what data we're looking to ingest. If we want to do that, or define a custom schema, it is sometimes easier to do so with a JSON representation that we can submit as a POST call rather than defining each field in the Experience Platform UI.
Now that you've been successfully authenticated by following the steps in the previous lesson, you are ready to make API calls.
Start by expanding the Lesson 1 > List Schemas folder in Postman under L778.
Select the XDM: Get All Classes action and click Send.
In the response pane you can scroll down and see the classes that are available in XDM. The ExperienceEvent class is used as a base for time-based data, while Profile is used for person-based data.
Let's drill down a bit and view the Profile class. If you select XDM: Get Profile Class and click Send, you should see something similar to:
There are not many classes in XDM, but they are used as the base for all tenant schemas. If you select XDM: Get All Tenant Schemas and click Send, you should see something similar to:
This will list all of the schemas that have already been created for this tenant.
Now select XDM: Get All Mixins and click Send; you should see something like the following:
This will return all of the mixins that ship with Experience Platform. Scroll down until you see Profile Personal Details, which we are going to look up next.
Now select XDM: Get Profile Mixin and click Send; you will see the following:
This is a mixin you will add to your schema when you need common personal details like home address, home phone, etc.
Now that we've seen the API calls to query schemas, let's move on to creating a custom schema of our own. Expand the Create Schema folder under Lesson 1 in Postman, select XDM: Get Tenant ID, and click Send.
This API call will determine which tenant you are currently logged in as and save the value as an environment variable in Postman for future API calls.
Now select XDM: Create New Schema.
As you can see, our simple schema extends "context/profile" and that's it. At its most basic, a schema need only extend a class, which captures the minimum viable schema. This schema will be named L778+[AccountID]_Customers_Schema, as it is the schema we will use for user profile data ingestion in Lesson 2.
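As a rough illustration only (the authoritative request body is on this call's Body tab in Postman), a minimal profile-based schema body could be assembled like this in Python. The $ref/meta:extends URIs follow the "context/profile" pattern just described; the title format matches the naming convention above.

```python
import json

# Illustrative body for a minimal profile-based schema. At its most basic,
# a schema need only extend a class -- here, the XDM Profile class.
account_id = "123"  # your L778+### number
schema_body = {
    "title": "L778+%s_Customers_Schema" % account_id,
    "description": "Customer profile schema for Luma batch ingestion",
    # The class this schema extends:
    "meta:extends": ["https://ns.adobe.com/xdm/context/profile"],
    "allOf": [
        {"$ref": "https://ns.adobe.com/xdm/context/profile"},
    ],
}
payload = json.dumps(schema_body)  # what would be POSTed to the Schema Registry
```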
Click Send. In the response section you will see the ID of your newly created schema, which we will use to create a dataset in Lesson 2.
Now select XDM: Get Custom Profile Schema and click Send.
The response section should return the details of the schema, which look an awful lot like the Profile class because that is the base of our schema. In the meta:extends section you can see that our schema extends the Profile class, as well as the four entities (record, auditable, identitymap, and extensible) that the Profile class extended.
Now that the schema has been created, we can add mixins to it. The mixins available for use depend on the class selected. Mixins provide fields for any schema that extends the same class. This allows a concept, like "person details", to be predefined so that it can be reused in any schema that needs to capture that information. Let's add the "profile person details", "profile personal details", and "profile preferences details" mixins to our simple schema so that we can capture the typical details of a person, such as full name, birthday, etc., as part of this profile schema.
Select XDM: Add Person Details Mixin and click Send. You will notice there are new entries in the meta:extends section for profile-person-details, profile-personal-details, and profile-preferences-details.
We would also like our schema to capture information related to a loyalty program. There is no out-of-the-box loyalty mixin, so we will create a new mixin of our own. The Loyalty mixin defines a loyalty field that can then be reused in any Profile-based schema. Select XDM: Create Loyalty Mixin and take a look at the Body tab to see the JSON for our Loyalty mixin:
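If you'd like a feel for the shape of such a mixin before opening the Body tab, here is an illustrative Python sketch. The specific loyalty field names (loyaltyId, loyaltyLevel, points) are invented for this example, and the Postman request remains the authoritative JSON; the key structural ideas are meta:intendedToExtend (which class-based schemas may use this mixin) and a definitions block holding the new fields.

```python
import json

# Sketch of a custom mixin body; field names under "loyalty" are illustrative.
loyalty_mixin = {
    "title": "Loyalty Details",
    "description": "Loyalty program fields for Luma customers",
    "type": "object",
    # A mixin declares which class it is intended to extend:
    "meta:intendedToExtend": ["https://ns.adobe.com/xdm/context/profile"],
    "definitions": {
        "loyalty": {
            "properties": {
                "loyalty": {
                    "type": "object",
                    "properties": {
                        "loyaltyId": {"type": "string"},
                        "loyaltyLevel": {"type": "string"},
                        "points": {"type": "integer"},
                    },
                }
            }
        }
    },
    "allOf": [{"$ref": "#/definitions/loyalty"}],
}
body = json.dumps(loyalty_mixin)
```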
Click Send. In the response section you will see the ID of your newly created mixin. The mixin ID will be stored in an environment variable in Postman, which will be used in our next step.
Select XDM: Add Loyalty Mixin to Custom Profile Schema. This API call updates the simple profile schema to include the Loyalty mixin. The body of the request includes the "MIXIN_ID" of the newly created Loyalty mixin.
Now select XDM: Get Custom Profile Schema so we can look up the schema and see the changes that have been made to its definition.
You'll notice there is a new mixin in the meta:extends section.
Now we need to make the "address" field within the personalEmail mixin an identity, so that values for that field are added to the identity graph. To do this, we need to create an identity descriptor that references the "personalEmail.address" field and defines the details of the identity that the email address data represents. Identity descriptors can be added to multiple fields to define key fields that identify schema data for Unified Profile. In our case, we are only adding one, and making sure to mark it as the primary identifier. Select XDM: Create Identity Descriptor and look at the structure of the descriptor on the Body tab:
Click Send.
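As a hedged sketch of what such a descriptor body looks like (the Body tab in Postman is authoritative), the key parts are the source schema, the source property path, the identity namespace, and the primary flag. The schema URI below is left generic on purpose.

```python
import json

# Illustrative identity-descriptor body marking personalEmail.address as the
# primary identity. URIs and namespace naming are assumptions based on the
# descriptor pattern described above.
tenant_schema_id = "https://ns.adobe.com/<tenant>/schemas/<schema-id>"  # generic placeholder
descriptor = {
    "@type": "xdm:descriptorIdentity",
    "xdm:sourceSchema": tenant_schema_id,
    "xdm:sourceProperty": "/personalEmail/address",  # the field becoming an identity
    "xdm:namespace": "Email",                        # identity namespace for email addresses
    "xdm:isPrimary": True,                           # mark as the primary identifier
}
body = json.dumps(descriptor)
```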
When creating a schema, it's also possible to enable it for Unified Profile, one of Experience Platform's most useful features. In fact, for a dataset to be ingested into the Unified Profile, its underlying schema must be enabled for Unified Profile as well. To enable this, we need to add "union" to the "meta:immutableTags" property of the schema. Select XDM: Enable Schema for Unified Profile.
Click Send.
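One plausible shape for this request, shown here only as a sketch (check the Body tab in Postman for the real one), is a JSON Patch operation that adds "union" to the schema's meta:immutableTags:

```python
import json

# Sketch: a JSON Patch adding the "union" tag that opts the schema into
# Unified Profile. The exact body is in the Postman request.
patch = [
    {"op": "add", "path": "/meta:immutableTags", "value": ["union"]},
]
body = json.dumps(patch)
```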
Your completed profile-based schema, L778+[AccountID]_Customers_Schema, with Unified Profile enabled, is now ready for batch ingestion of user profile data.
With the creation of our custom schema we are ready to move on to the next step of creating datasets and data ingestion.
Luma's engineers and developers are likely to also utilize the Adobe Experience Platform APIs to ingest data. This includes not just batch data of files, but also point-to-point connectors that could be transferring or streaming data from user registration forms on Luma's site, sign-in details and CRM data, as well as behavioral transactions from logged-in users using the Luma site. We will implement these different data ingestion and lookup functions using the API, and demonstrate the tools available for integration of Experience Platform, starting with batch ingestion.
For Lesson 2, we will first start with basic dataset creation, and the requirements that go into that - selecting a schema to use, creating a dataset, enabling it for Unified Profile, and bringing in batch data via files.
Start by expanding the Lesson 2 > List Datasets folder in Postman under Adobe Experience Platform.
Skip over Catalog: Get Datasets and instead select the Catalog: Get Datasets Limit 5 action and click Send.
In the response pane you can scroll down and see the top five datasets. We are limiting the number of datasets returned to five in order to have the query return quickly and not process too much data.
Next let's run the Catalog: Get Batches action by selecting it and clicking Send.
Datasets can consist of multiple batches.
Finally let's run the Catalog: Get Dataset Files action by selecting it and clicking Send.
In this case we'll get a list of files in the dataset and the metadata will include which batch it is from. Now that we've learned how to query datasets let's get busy creating one.
Expand the Create Dataset folder in Postman under Lesson 2, select the Catalog: Create CRM Dataset action, and click Send.
The call will create a new dataset named L778+[AccountID]_CRM_Dataset and return an ID we can use in future calls. The "unifiedProfile": ["enabled:true"] flag within the body ensures that this dataset is automatically included in the Unified Profile Service.
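To illustrate how that flag sits in the request (an approximation only; the Body tab of the Postman call is the real reference), a Catalog dataset body could be assembled like this, with the schemaRef left generic:

```python
import json

# Sketch of a Catalog dataset-creation body tying the dataset to our schema
# and tagging it for Unified Profile. Field placement is illustrative.
account_id = "123"  # your L778+### number
dataset_body = {
    "name": "L778+%s_CRM_Dataset" % account_id,
    "schemaRef": {
        "id": "https://ns.adobe.com/<tenant>/schemas/<schema-id>",  # schema ID from Lesson 1
        "contentType": "application/vnd.adobe.xed-full+json;version=1",
    },
    "tags": {
        # This tag opts the dataset into the Unified Profile Service:
        "unifiedProfile": ["enabled:true"],
    },
}
body = json.dumps(dataset_body)
```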
Remember that the Dataset is based on the schema you select, which is based on a schema class like Profile.
Once created it will conceptually look like this:
Next we'll call Catalog: Get CRM Dataset to inspect the dataset:
In the response area of the Postman call you can view the dataset's metadata. In our next exercise we'll populate this dataset.
Now that we've successfully created a dataset we'll create a batch so we can start ingesting files.
Expand the Create Batch folder, select Ingest: Create Batch, and click Send:
Next we'll upload a file to the batch.
Select Ingest: Upload File, then click on the Body tab.
From there you'll need to select a file to upload by clicking Choose Files and selecting Profile_CRM_Data_Sample.parquet.
Then click Send to upload the file. If the upload succeeds, you won't see anything in the response section other than a 200 OK.
Since batches may contain multiple files we have to make an additional call to close off the batch to indicate that the batch is now ready for processing.
So select the Ingest: Close Batch action and click Send.
Once again, you won't see anything in the response section of Postman other than a 200 OK.
If we've done everything right up to this point, the file and batch should be successfully ingested. To check on that, open the Batch Status folder, select the Batch by ID action, and click Send.
In the response section, pay close attention to the value of status. We want it to say success, but if it says processing it just means we have to wait a bit and make this Postman request again.
In the meantime, this is what the response would look like if the ingested file comes back with a failed status, likely due to not being in the correct format for that dataset:
Once we've seen that the file in the batch has been successfully ingested, we can check how many files are in the batch. Select the Files in Batch action and click Send.
The response pane will show you how many files are in this batch. In our case there should only be one file.
Let's walk back up the containers and check on the status of our dataset. Expand the Dataset Status folder, select the Catalog: Get Dataset action, and click Send.
Scroll down in the response pane until you find the lastBatchStatus property, which should read success.
Things are looking good for our dataset, but what if we want more information about the files stored in the dataset? Open the Data Access folder, select the Data Export: Get File Details action, and click Send.
This response contains important metadata like the file name and file size.
Next, let's take a look at the file stored in Experience Platform.
Select the Data Export: Get File action and click Send.
In this case the response is not that useful to us, as .parquet files are binary and therefore difficult to read. However, it should make customers happy to know that whatever data they import into Adobe Experience Platform can be exported as well.
Finally, let's use the preview API to get a human-readable version of the dataset we've just created. Select the Data Export: Get Dataset Preview action and click Send.
This response returns JSON data, which is at least a textual format that can be read without any additional formatting tools. Note: preview will only show you the results of the latest batch ingested into that dataset.
Well, that was a lot of work to ingest the data! Let's move on to the next step: collecting real-time data from user interactions in the form of ExperienceEvents.
Catalog Service RESTful API Resource
Once Luma has its customer data system set up on Experience Platform by defining schema to use, datasets to ingest data, and ingesting existing customer data, we now want to make sure that new Profile data (from sign-ups, updates, etc) and behavioral ExperienceEvent data (browsing habits, actionable triggers, etc) are recorded and updated in near real-time. We can do this by utilizing Adobe Experience Platform Launch on our website. For this exercise, we will see how we can bring in behavioral and profile data from website interactions into Adobe Experience Platform. We can flex our experience in creating new schema and new datasets from Lessons 1 & 2, and apply it to creating streaming datasets to use with Experience Platform Launch.
Some elements of Launch are pre-packaged for sake of time. Rules, Data Elements and Extensions are already populated with a few settings that we will add or update during this lab to demonstrate how Launch works.
Make sure that Fenix webserver is running with luma.com. Navigate to http://luma.com to make sure that you have Luma running properly.
Since we will be streaming in data for this lab, we will need to create new datasets that are streaming-enabled, as well as a new schema for behavioral events with the ExperienceEvent class, which we'll name L778+[AccountID]_WebBehavior_Schema. Since we covered schema creation in Lesson 1 and dataset creation in detail in Lesson 2, we've put the correct POST calls into the Postman collection for you to run. First, open the Create Streaming Datasets folder in Lesson 3, and send the Create Web Behavior Schema POST call. You should receive a response showing the structure of the ExperienceEvent, including a few mixins we've defined. This schema has also been enabled for Unified Profile.
Next, we need to define the schema identity descriptor to tell Experience Platform what the main identifying field for data is in this schema; here it will be the ECID, or Experience Cloud ID, that a user is auto-assigned within their browser session. Click Define Identity Descriptor and send the POST call.
Send the POST call for Create Web Behavior Event Dataset, which includes the tag "streamingIngestionEnabled": "true" to enable data streaming. You should then get a dataset ID in response:
Do the same with the next call, Create User Signups Dataset, to create a new Profile-based dataset with "streamingIngestionEnabled": "true" in it as well.
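The key difference from Lesson 2's batch dataset is that one extra tag. As a sketch (the exact placement of the tag within the body follows the Postman request, which is authoritative):

```python
import json

# Sketch only: the extra tag that makes a dataset streaming-enabled,
# alongside the Unified Profile tag from Lesson 2.
streaming_tags = {
    "unifiedProfile": ["enabled:true"],   # still feeds Unified Profile
    "streamingIngestionEnabled": "true",  # allows Launch to stream into the dataset
}
body = json.dumps({"tags": streaming_tags})
```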
Then send a GET call to fetch the ExperienceEvent dataset details. Now you will have your WebBehavior Dataset ExperienceEvent dataset created:
You can also GET the details of the User Signups Dataset, the Profile-based dataset that you just created for Profile data streaming:
Examples of the two datasets after creation in the Experience Platform UI are below:
Open Adobe Experience Platform Launch at https://launch-demo.adobe.com
Log in with your Adobe ID, and ensure in the top-right Organization selector that you are in Summit 2019 Platform Lab 03.
Find the property associated with your user number (L778+### Property) and open it. This property points to the URL "luma.com", which should be set up on your local machine prior to this lab. The property will open up and take you to the Launch property home screen.
First, we need to set up a Launch library in the development environment. Go to the Publishing tab and click Add New Library.
Next, name your library web publish and select Development as the environment.
Launch requires us to select which elements in our property it should publish. This property already has several Data Elements and Rules set (we'll learn about these in a bit), so we need to click Add All Changed Resources at the bottom of the page to add every existing element to this library. Then, click Save & Build for Development.
The last step is to select our new library as the working environment. In the top-right drop-down for Working Library, select our new web publish library. The library will now be updated whenever we make changes in the next steps.
Now, we need to take a snippet of HTML to insert into your Luma site. This snippet is how Launch sends data to Experience Platform. Click the Environments tab from the home screen, and then click the small box icon to the right of the Development row.
Click the "copy" square button to the right of the script textbox to copy this snippet to the clipboard.
Open the index.html and signup.html pages of your Luma site in your text/code editor, and paste this code into the <head> tags of each file. This will enable Launch to run its rules on these two pages. Remember to paste this snippet in BOTH files.
Go to the Data Elements tab. This is where we define elements from our Luma site to pull in values: names, emails, IDs, descriptors, etc. There are already a few populated here that we will use when we send data to Experience Platform.
Let's take a look at the extensions we will need for our Luma site. Click the Extensions tab. You'll see that the Core extension is there by default, and that the following extensions are already installed:
We're going to need to configure the Adobe Experience Platform extension, so click Config on that box. Click Create a streaming endpoint, name it Launch Streaming Endpoint [AccountID] with Source ID = Launch_[AccountID], and then click Save.
Select your streaming endpoint: in the text box, type your [AccountID] and your endpoint should pop up. Select it and click Save to Library and Build.
Now we have our streaming endpoint from our Luma site into Experience Platform defined and ready to go. The next step is to create triggers that make actions happen on our Luma site. Go to the Rules tab. Take a look at the existing rules; we have two defined here. Look into Page Visit.
You can see that the event, Core - Page Bottom, is triggered when the browser hits the bottom of the page. This stands in as a de facto "page view", as the bottom of the page is automatically hit when the page renders. At the bottom, the Action flow shows two actions: one for the Experience Cloud ID Service, and one for Adobe Experience Platform.
Click on the Experience Cloud ID Service action. You can see that it is setting the ECID to share across Adobe solutions, with the ECID being sent to it from our "ECID" Data Element. Click Cancel and Discard changes (don't worry, no changes were made) to go back to the Rule editor.
Now let's define the data that we send to Experience Platform. Click on Adobe Experience Platform - Send Beacon. This action sends an ExperienceEvent. You'll see that some of the variables, and the data elements populating those variables, are already set: sending the ECID as an identity for the event, the name of the webpage currently viewed, and a timestamp of that action.
There are two things here that we need to change. First, the dataset that we're sending data to needs to be changed. Click the dropdown and find your corresponding L778+[AccountID]_WebBehavior_Dataset dataset from earlier. You will get a warning to make sure the fields match, which you can assume is true here.
Now let's add another variable to send in our beacon: the Language data element. Click on Add Another and create the following:
- Schema field: Environment > browserDetails > acceptLanguage
- Value: %Language% (select this from the list of data elements by clicking the round stack icon next to the value field; it can also be typed in directly)
Let's save our changes. Click Keep Changes to save our action, and then Save to Library and Build. Now the rule for "Page View" has been set.
We just set up our first rule to send an ExperienceEvent to Adobe Experience Platform! Now let's do the same with Profile data by taking a look at the Sign Up rule. Clicking on Core - Click, you can see the composition of the event that we're looking for: a click on the Submit button during sign-up, with a specific CSS selector for that button.
Under the Conditions field, you can see there is a requirement that we be on the signup.html page for this rule to fire.
Just like we did with the ExperienceEvent dataset, we need to select the correct dataset. Since this is a Profile-updating action, we want to send our data beacon to our L778+[AccountID]_Signups_Dataset dataset, so select that from the dropdown.
Explore the actions for this rule, as several are already pre-populated. You'll notice that we are sending many variables, including identityMap, which defines who to attribute this beacon data to, sending the ECID and the user's email (which is marked as our primary identifier):
You'll also see that the personalEmail.address value is DELETE and needs to be corrected. This value sets one of the identities that Unified Profile uses to match XDM data together. Clear this text box and then select the Email Address data element using the round stack icon next to the text box.
Click Keep Changes on the action, and then Save to Library and Build on the rule. Click on the Publishing tab to watch the status of our build and see it succeed.
Now we've got a Launch property set up, and a Luma instance running with the Launch streaming endpoint included. Let's watch it stream data into Experience Platform.
Go to http://www.luma.com. This should point to your Fenix webserver instance of Luma. Right-click and open Inspect to see the page inspector.
Go to the Network tab, and type dcs into the filter box. This will filter for network calls being sent to the Launch streaming URL. Refresh the page.
Select the endpoint under Name and scroll down in the right pane to Request Payload. You should see an expandable JSON payload starting with {header:. Expand all these fields and you will be able to see the XDM data that we defined in our Launch rule for "Page View" being sent from the page to Launch.
Go back to your Luma screen. Now let's see our Sign-Up rule in action by going to Sign Up in the top right of the page. Scroll down to the sign-up form and fill in some dummy details with your name and email. These do not have to be real or accurate.
Click Sign Up. Nothing will happen visually; our instance of Luma is stripped down and does not contain any functionality past this page. Go back to your inspector and you can now observe two Launch beacons being sent!
One beacon data contains the Profile XDM data we filled into the form. You can see all the fields we defined in the "Sign Up" rule on Launch here.
The other beacon is an ExperienceEvent XDM fired by the "Page View" rule, the same as when we were on the Luma homepage. You'll notice the data for this page is basically identical, except for the webPageDetails and timestamp fields.
We can also look at all of our ExperienceEvent data within Experience Platform using an API call through Postman. Within the beacon data, find the ecid: {id: "##" field and select your ECID. This ID is specific to your browser session and can be used to look up ExperienceEvent and Unified Profile data. Copy this field; we'll use it in the first step of Lesson 4.
Congratulations! You've got all the basics you need for your Adobe Experience Platform setup to build out not just a basis for customer loyalty programs, but also for data collection, storage, and analysis of your incoming customer data from batch ingestion and streaming data.
The streaming capabilities of data into Experience Platform are not limited to websites - mobile apps, POS devices, and other points of data capture can all stream behavioral and other types of data into Experience Platform using methods like Launch.
Getting Started with Adobe Launch
Much of the value of having all our data integrated with the Adobe Experience Platform is to be able to utilize the Unified Profile. This will give us a real-time customer profile that's built on a user's interactions with various channels (web, physical beacon, email) and can help segment customers better as well as do further analytics using other tools on the Experience Platform. Here we will show how to do a lookup of a user's Unified Profile - namely, the user info we just entered in Lesson 3 - using both an email address as well as an ECID.
Go to Postman and, under the `Lesson 4` collection, find the `Behavioral Data Point Lookup` call. Find the `entityId` field and paste the ECID you copied. Make sure that the `entityIdNS` field is set to `ecid`. Send the `GET` call.
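Under the hood, the Postman call above is just an HTTP GET against the Profile Access service with the identity passed as query parameters. Here is a minimal sketch in Python of how that URL is assembled; the host, path, and schema name are assumptions based on common Platform API conventions, and your Postman environment supplies the real values and auth headers:

```python
from urllib.parse import urlencode

# Assumed base endpoint for Unified Profile entity access; confirm against
# your Postman environment before using.
BASE = "https://platform.adobe.io/data/core/ups/access/entities"

def build_lookup_url(entity_id, entity_id_ns="ecid", base=BASE):
    """Build the GET URL for an entity lookup by identity."""
    params = {
        "schema.name": "_xdm.context.experienceevent",  # assumed schema name
        "entityId": entity_id,        # your copied ECID goes here
        "entityIdNS": entity_id_ns,   # the identity namespace, "ecid"
    }
    return base + "?" + urlencode(params)

print(build_lookup_url("<your-ecid>"))
```

The actual call also needs the same authorization, API-key, and organization headers your Postman environment injects; this sketch only shows how the identity and namespace map onto query parameters.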
You should get back a list of all ExperienceEvents associated with your ECID: browsing the homepage and the signup page. Take a look; your streaming data is now in Experience Platform!
In addition, this data has been ingested into the Unified Profile, since we tagged our dataset for Unified Profile back in Lesson 2. To confirm this via the API while we're still in Postman, take the ECID you used in the last step and make a similar `GET` call using the `Unified Profile Lookup` request. You will then see the full Unified Profile associated with your ECID: your sign-up Profile data should now be available as a profile view.
NOTE: It may take up to 5 minutes for the Unified Profile data to refresh and show your profile data.
Well, that was quick and easy, and it brings us to the end of the regular portion of our lab! You were able to observe and perform the most critical functions of Adobe Experience Platform entirely from a developer's or architect's point of view: using APIs and external sources to complement the Experience Platform UI, or to integrate Experience Platform's features into your own custom applications.
NOTE: This section is bonus material and can be completed within this lab, if you have extra time.
Now that Luma has a system with Unified Profile populated from both batch customer data and streaming data, and has demonstrated the ability to look up Unified Profile data, we can use Adobe Experience Platform to take further action on these profiles. Using the segmentation engine, we can specify filters to segment our database of Luma customers. These segments can then be delivered directly to other platforms, either through Experience Platform's Segment Builder integrations with other Adobe solutions, or by creating new datasets of segmented customers to export to external solutions.
This powers several aspects of Luma's customer loyalty program, including targeted upgrade promotions, re-engagement or re-activation of customers, and garnering insights from loyalty program statistics. Because all data for a customer, including and beyond customer loyalty, is tied to one Unified Profile, we can bring in more information to further segment customers, or use segments for a broader scope of actions.
Now that we've ingested some data in Lesson 2 into our Unified Profile Service, we're ready to begin segmenting the data in order to gain insights.
Start by expanding the `Lesson 5` folder in Postman under `L778`.
Select the `Profile: Create A Segment Definition` action. Note that in the body section of the action we are creating a simple segment that includes all females who have interacted with the system in the past 90 days. Then click `Send` to create the segment and view the results in the response section of Postman.
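For reference, here is an illustrative sketch of what a segment-definition request body like the one in that Postman action can look like. The field names follow the Segmentation API's segment-definition shape, and the PQL expression is a hedged guess at the "females who interacted in the past 90 days" predicate, not necessarily the exact text the lab's collection uses:

```python
import json

# Illustrative segment-definition body; the PQL `value` below is an
# assumption standing in for the lab's actual expression.
segment_definition = {
    "name": "Females active in the last 90 days",
    "schema": {"name": "_xdm.context.profile"},
    "expression": {
        "type": "PQL",
        "format": "pql/text",
        "value": 'person.gender = "female" and '
                 'select e from xEvent where e.timestamp occurs <= 90 days before now',
    },
}

body = json.dumps(segment_definition, indent=2)
print(body)
```

The key idea is that the segment is declared once as a definition (name, target schema, and a PQL predicate); jobs then evaluate that definition against the profile store, as we do in the next step.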
Now that we've defined a segment, the next step is to start a job to run the segment query for us. Select `POST Segment Job` and hit `Send`.
A segment job may take a little while to run, so let's check on its status by selecting `GET Segment Job status` and hitting `Send`.
If the `status` property says `PROCESSING`, wait a little longer and `Send` the query again until you see `SUCCEEDED`.
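The check-and-resend loop you just did by hand is a classic polling pattern, and it is how you would automate this step in your own application. A small sketch, with a simulated status sequence standing in for the live `GET Segment Job status` call:

```python
import time

def wait_for_job(fetch_status, poll_seconds=5.0, timeout_seconds=300.0):
    """Poll a job-status callable until it reports SUCCEEDED.

    `fetch_status` stands in for the GET Segment Job status call in
    Postman: it should return the job's `status` string each time.
    """
    deadline = time.monotonic() + timeout_seconds
    while True:
        status = fetch_status()
        if status == "SUCCEEDED":
            return status
        if status == "FAILED":
            raise RuntimeError("segment job failed")
        if time.monotonic() >= deadline:
            raise TimeoutError("segment job did not finish in time")
        time.sleep(poll_seconds)

# Simulated status sequence standing in for the live API:
responses = iter(["PROCESSING", "PROCESSING", "SUCCEEDED"])
print(wait_for_job(lambda: next(responses), poll_seconds=0))
# → SUCCEEDED
```

The same helper works unchanged for the export-job status check later in this lesson; only the callable that fetches the status differs.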
Let's create a new dataset to hold the results of our segment job.
Now that the job has succeeded and we have a dataset to hold our segmentation data, it's time to export the data. Select `POST Export Job` and hit `Send`.
We are getting close to being able to view our segment data. Select the `Get Export Job By Id` action and hit `Send`.
If the `status` property says `PROCESSING`, wait a little longer and `Send` the query again until you see `SUCCEEDED`.
Now the easiest way to view your segmented dataset is to log in to https://platform.adobe.com, click on the Data tab, then click on `{{ldap}}_ProfileSegmentDataset`, and finally hit the `Preview` button.
As you can see, all of the people in the dataset are `female`.
Now that we've seen how Adobe Experience Platform's APIs empower Luma, the opportunities to understand and utilize the real-time customer profile that Unified Profile offers are numerous. As a platform built API-first, Adobe Experience Platform lets developers take full control of its features and integrate them into their own solutions as well.
We hope you enjoyed this workshop and that this workbook can serve as a useful reference for you going forward. Please don't hesitate to reach out to one of the Platform team members if you have any questions, and visit our booth at the pavilion to see the many other faces of Adobe Experience Platform!