Interpreting Email Performance Report

Balkar_Singh
Level 9 - Community Advisor + Adobe Champion

It's July 2023, and at the forefront of the AI landscape is ChatGPT's Code Interpreter, the most advanced AI tooling we've seen at this point. Hardly a week goes by without some major organization releasing a new update, shaking things up and showing us something new that AI can achieve.

 

A lot of people, including myself, are excited to see these updates, to test out the new features, and to share all about what they can do. But here's where I think we might be missing the point. Knowing what AI can do is all well and good, but it's when we figure out how to apply it to our own work that it really starts to shine.

 

So here I am to share my perspective on ChatGPT Code Interpreter and its application to our work. In this post, let's pick the most basic data we usually have: the Email Performance Report.

 

The Email Performance Report is one of the most useful reports in Adobe Marketo Engage. It offers a snapshot of multiple metrics, enabling a comprehensive review of your campaign's performance. Typically, we keep one for the email campaigns within each program, so it's handy whenever we review performance.

 

[Screenshot: Email Performance Report within a program]


In practice, though, marketers get busy running more campaigns, racing against time. These reports get sidelined by the constant urgency of managing and launching new campaigns, and key insights that could guide future strategies go unnoticed. The reports are valuable, but they often don't get the attention they deserve simply for lack of time. This is where AI helps.

 

This blog post gives you ideas for analyzing your email campaign performance using Code Interpreter; it also helps you discover new ideas you can execute with or without it. How do you consistently evaluate and refine your strategies based on your campaign data? How do you understand and interpret these reports? How do you make the process of extracting useful insights from them simpler, faster and more accurate?

 

How to Fetch the Email Performance Reports?

 

There are two ways you can fetch these reports. The most direct way is to navigate to Analytics, and access the Email Performance Report in the global analytics view.

 

[Screenshot: navigating to the Analytics section]


This is where you get the Global Email Performance Report.

 

[Screenshot: Global Email Performance Report]

 

A key benefit of this report is that it covers all emails in the system. You can change the time frame or restrict it by other criteria, but as a raw report it comes with "All Emails" preselected.

 

[Screenshot: report setup with "All Emails" preselected]

 

Another approach is to create a custom Email Performance Report for specific emails. This can be done within Marketing Activities, inside the Programs you create, or you can set it up within Analytics. Unlike the global view where all emails are selected by default, a custom report requires selecting emails manually by expanding the options in the Email Filter view. To access this view, simply double-click the "All Emails" option (snapshot above).

 

[Screenshot: Email Filter view for selecting specific emails]

 

Understanding Key Email Performance Metrics

 

In this report, you will have the following key metrics listed as columns.

 

Sent

Delivered

% Delivered

Hard Bounced

Soft Bounced

Opened

% Opened

Clicked

% Clicked

Clicked to Open

Unsubscribed

% Unsubscribed

First Activity

Last Activity

   

 

Adobe Marketo Engage works at a Person Record's level to a significant extent. It is a marketing automation platform, not an email blast app. That is one reason interpreting this report can lead to inaccuracy if you assume the everyday definition of each metric. The difference is subtle, but it is valuable to know while analyzing your performance. For example, you may think "we sent 10K emails this month," meaning one email went out 10K times; the system, however, means that a delivery of that email was attempted for 10K people.

 

| Metric | General Understanding | System Definition | Difference |
| --- | --- | --- | --- |
| Sent | Total number of emails sent. | Number of people to whom at least one email delivery attempt was made. | The general understanding counts emails; the system counts people. |
| Delivered | Total number of emails that arrived in recipients' inboxes. | People who successfully received at least one message. | As with Sent, the general definition counts emails, while the system counts people. |
| % Delivered | Percentage of emails sent that were successfully delivered. | Ratio of Delivered to Sent. | Both definitions are largely similar. |
| Hard Bounced | Emails not delivered due to permanent reasons. | Recipients whose emails were permanently rejected. | The general understanding refers to emails; the system refers to recipients. |
| Soft Bounced | Emails not delivered due to temporary reasons. | Recipients whose emails were temporarily rejected. | As with Hard Bounced, the difference lies in emails versus recipients. |
| Opened | Total number of emails opened by recipients. | People who opened the email at least once. | Again, the difference is a count of emails versus a count of people. |
| % Opened | Percentage of delivered emails that were opened. | Ratio of Opened to Delivered. | Both definitions are largely similar. |
| Clicked | Number of recipients who clicked on any link within the emails. | Count of people who clicked at least one link in the email. | Both definitions are largely similar. Note that in Adobe Marketo Engage, clicks on unsubscribe pages are not counted in Clicked. |
| % Clicked | Percentage of Clicked over Delivered; some read it as the percentage of opened emails where a recipient clicked a link. | Ratio of Clicked to Delivered. | The common understanding is Clicked/Delivered, though an alternative reading (clicks as a share of opens) also exists; the system uses Clicked/Delivered. |
| Clicked to Open | Percentage of recipients who clicked a link after opening the email. | Ratio of Clicked to Opened. | Both definitions are largely similar. |
| Unsubscribed | Number of recipients who unsubscribed from your emails. | Recipients who unsubscribed from this email. | Both definitions are largely similar. |
| % Unsubscribed | Percentage of delivered emails that led to an unsubscribe. | Ratio of Unsubscribed to Delivered. | Both definitions are largely similar. |
| First Activity | Date and time of the first engagement activity with the email. | First recorded action taken by a recipient on the email. | Both definitions are largely similar. |
| Last Activity | Date and time of the most recent engagement activity with the email. | Most recent action taken by a recipient on the email. | Both definitions are largely similar. |
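To make these ratio definitions concrete, here is a minimal Python sketch of how you could recompute the percentage columns yourself from an exported report. It assumes a CSV export with the raw count columns named as above, plus an "Email Name" column; adjust the names to match your actual export.

```python
import pandas as pd

# Assumed export of the Email Performance Report with raw count columns
# named "Sent", "Delivered", "Opened", "Clicked", "Unsubscribed".
df = pd.read_csv("email_performance.csv")

# Recompute the percentage metrics from the counts (people, not messages).
df["% Delivered"] = df["Delivered"] / df["Sent"] * 100
df["% Opened"] = df["Opened"] / df["Delivered"] * 100
df["% Clicked"] = df["Clicked"] / df["Delivered"] * 100
df["Clicked to Open"] = df["Clicked"] / df["Opened"] * 100
df["% Unsubscribed"] = df["Unsubscribed"] / df["Delivered"] * 100

print(df[["Email Name", "% Delivered", "% Opened", "% Clicked"]].head())
```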

 

Analyzing the Email Performance Report


There are certain steps you can take to unlock insights. As a first step, export this report (preferably in CSV format, which works best here). Then we start by plotting how many people we reached over time.

 

There are at least three ways you could capture the timing.

 

  • Make separate Email Performance Reports for each time period by changing the Sent Date in the report settings. This is the most accurate method to obtain the various report versions. Afterwards, you can put all these reports into a zip file, or combine them into one spreadsheet using separate tabs.

 

[Screenshot: Sent Date setting in the report setup]

 

  • Use the First Activity date to get an idea of when the email was sent. This method isn't as precise as the first one, but it's quicker and provides a reasonable estimate, even if it's not 100% accurate.

 

[Screenshot: First Activity column in the report]

 

  • Use Email Names to extract information about campaigns, assuming you follow some naming convention. This is a great way to analyze email performance, but it is only as reliable as your diligence in adhering to that convention. An example of a naming convention: [Abbreviation of Program Type] [YYYY]-[MM]-[Optional DD] [Brief Description] (see the parsing sketch after this list).
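As a sketch of how the third approach might work, here is a small Python example that parses email names following the hypothetical convention above. The regular expression and the sample name are illustrative assumptions, not a Marketo standard, so adapt them to your own convention.

```python
import re
import pandas as pd

# Hypothetical convention: "[Program Type Abbrev] [YYYY]-[MM]-[Optional DD] [Brief Description]"
# e.g. "WBN 2023-06-15 Product Launch Webinar"
PATTERN = re.compile(
    r"^(?P<program_type>\S+)\s+(?P<year>\d{4})-(?P<month>\d{2})(?:-(?P<day>\d{2}))?\s+(?P<description>.+)$"
)

df = pd.read_csv("email_performance.csv")        # assumed export with an "Email Name" column
parsed = df["Email Name"].str.extract(PATTERN)   # one column per named group; NaN where names don't match
df = pd.concat([df, parsed], axis=1)

print(df[["Email Name", "program_type", "year", "month", "description"]].head())
```

Names that don't follow the convention simply come out as NaN, which is also a quick way to audit how consistently the convention is being applied.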


Line Plot

 

Let's use First Activity and limit this to the performance between April and June 2023. Download this report and load the file into Code Interpreter. Once you upload the file, type the prompt to create the chart. You'll want to request a line plot over time. You can use the following example* to get a similar line plot as shown. Feel free to change the details. 

 

Prompt 1

 

Create a line plot that displays the trend of the total number of people reached (Delivered) over time. The time should be represented by the month and year extracted from the 'First Activity (PDT)' column, in the format 'Month Year' (e.g., 'June 2023').

 

[Chart: line plot of people reached (Delivered) per month, April to June 2023]
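If you'd like to reproduce this outside Code Interpreter, here is a minimal Python sketch along the lines of what it generates for this prompt. It assumes the export has a 'First Activity (PDT)' timestamp column and a numeric 'Delivered' column; the file name is illustrative.

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("email_performance.csv")

# Derive a month period from the First Activity timestamp.
df["First Activity (PDT)"] = pd.to_datetime(df["First Activity (PDT)"], errors="coerce")
df["Month"] = df["First Activity (PDT)"].dt.to_period("M")

# Total people reached (Delivered) per month, in chronological order.
reach = df.groupby("Month")["Delivered"].sum().sort_index()
reach.index = reach.index.strftime("%B %Y")  # e.g. "June 2023"

reach.plot(kind="line", marker="o", title="People Reached (Delivered) Over Time")
plt.xlabel("Month")
plt.ylabel("People Reached")
plt.tight_layout()
plt.show()
```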

When you have this, you can reflect on it, discuss it to establish common ground, and use it to shape future strategies. For example:

  • What is the overall trend in the number of people reached over time? Look for general increases or decreases in this metric.
  • Are there any seasonal patterns in email reach? For example, you might reach more people during certain times of the year, such as holidays, and fewer people during other times.
  • Are there any anomalies in the reach data? Sometimes, you might see a sudden spike or drop in reach that doesn't seem to be explained by any of the factors above. Investigating these anomalies could lead to interesting insights.

 

Heatmaps

 

Another useful visual is a heatmap of the correlations between metrics. It tells you how each metric moves relative to the others. Find the positive and negative correlations, and identify what you can capitalize on. A positive correlation means that when one variable goes up, the other tends to go up as well. If the correlation between % Delivered and Sent is negative, an increase in emails sent may be hampering deliverability.

 

To add context, let's say the strongest positive correlation % Delivered has is with % Opened or the Clicked to Open ratio. It's natural that as % Delivered increases, so does the scope to engage. However, a modest positive correlation means that while the metrics move together, they do not move together strongly.

 

In plain English, higher deliverability does tend to lift % Opened, but not strongly. That makes sense, as engagement is also a function of subject line and content, not just whether an email was delivered.

 

Prompt 2

 

Create a heatmap to see correlation of metrics. Do not use white cells. Use white background. Display correlations in both the lower-left and upper-right halves, so we can see the correlation between any pair of variables.

 

[Chart: correlation heatmap of email performance metrics]
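A rough equivalent of what Code Interpreter produces for this prompt, sketched in Python with seaborn. The metric list is an assumption based on the columns described earlier, and it presumes the percentage columns are numeric in your export (strip any '%' signs first if they are not).

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("email_performance.csv")

# Numeric performance metrics to correlate (adjust to your export's column names).
metrics = ["Sent", "Delivered", "% Delivered", "Opened", "% Opened",
           "Clicked", "% Clicked", "Clicked to Open", "Unsubscribed", "% Unsubscribed"]
corr = df[metrics].corr()

# Full (unmasked) heatmap so every pair of metrics appears in both halves.
sns.heatmap(corr, annot=True, fmt=".2f", cmap="coolwarm", vmin=-1, vmax=1)
plt.title("Correlation Between Email Performance Metrics")
plt.tight_layout()
plt.show()
```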

 

Another great heatmap you can create is an activity heatmap. These heatmaps provide insights into the times when users are most likely to first interact with emails. You can use this information to optimize the timing of your email campaigns, aiming to send emails when users are most likely to interact with them.

 

Prompt 3

 

Hello, I have a dataset that includes timestamps of when users interact with our Emails. I am interested in understanding when users are most active so that we can better tailor our engagement strategies.

 

Could you please help me create a heatmap to visualize user activity based on the hour of the day and the day of the week? Please plot the days of the week on the X axis and the hours of the day on the Y axis. I'm interested in both when users first interact with the emails and when they last interact with it.

 

Once the heatmap is created, could you also provide an interpretation of the results? Please try to keep your explanation simple and easy to understand, focusing on the overall trends and any particularly noteworthy observations. Thank you

 

[Chart: heatmap of email interactions by day of week and hour of day]
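One caveat: the aggregated Email Performance Report carries only one First Activity and one Last Activity timestamp per email, so a person-level activity export gives a much richer picture. As a sketch under that assumption (one interaction timestamp per row, in a 'First Activity (PDT)' column), the heatmap could be built like this in Python:

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("email_activity.csv")  # assumed: one row per interaction
df["First Activity (PDT)"] = pd.to_datetime(df["First Activity (PDT)"], errors="coerce")

# Count interactions by hour of day (rows) and day of week (columns).
df["Day"] = df["First Activity (PDT)"].dt.day_name()
df["Hour"] = df["First Activity (PDT)"].dt.hour
day_order = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"]
activity = pd.crosstab(df["Hour"], df["Day"]).reindex(columns=day_order)

sns.heatmap(activity, cmap="YlGnBu")
plt.title("First Email Interactions by Hour and Day of Week")
plt.xlabel("Day of Week")
plt.ylabel("Hour of Day")
plt.tight_layout()
plt.show()
```

Repeating the same steps on the Last Activity column gives the second heatmap the prompt asks for.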

 

 

Line graphs to view email performance metrics over time

 

How does email reach correlate with other metrics like open rate, click-through rate, and conversion rate? An increase in reach is good, but we need to ensure that people are also engaging with the emails and converting. A handy graph showing how measures such as the percentage of emails opened and clicked change over time gives us a deeper understanding.

 

This graph lets us see at a glance how these measures evolve over time. We can spot things like the click-to-open ratio improving markedly in June 2023 (in this example), along with an increase in the email open rate. It suggests that once people opened the campaign emails, they found the content a lot more engaging than usual.

Spotting these trends can help us find hidden chances to make the most of what's working well. 

 

Prompt 4

 

Create a line graph to see how % Opened, % Clicked, % Clicked to Opened, % Unsubscribed & % Bounced change over time. Refer to First Activity to refer to time. Plot months on X Axis in format “Month Year”, e.g. May 2023. Plot percentage on Y Axis.

 

[Chart: line graph of % Opened, % Clicked, % Clicked to Opened, % Unsubscribed and % Bounced over time]
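A minimal Python sketch for this prompt. Note that it recomputes each rate from the monthly totals rather than averaging per-email percentages, which keeps large and small sends correctly weighted; % Bounced is assumed here to be (Hard Bounced + Soft Bounced) / Sent, since the report does not expose it directly.

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("email_performance.csv")
df["First Activity (PDT)"] = pd.to_datetime(df["First Activity (PDT)"], errors="coerce")
df["Month"] = df["First Activity (PDT)"].dt.to_period("M")

# Monthly totals of the raw counts (assumed column names from the report).
counts = ["Sent", "Delivered", "Opened", "Clicked", "Unsubscribed", "Hard Bounced", "Soft Bounced"]
monthly = df.groupby("Month")[counts].sum().sort_index()

rates = pd.DataFrame({
    "% Opened": monthly["Opened"] / monthly["Delivered"] * 100,
    "% Clicked": monthly["Clicked"] / monthly["Delivered"] * 100,
    "% Clicked to Opened": monthly["Clicked"] / monthly["Opened"] * 100,
    "% Unsubscribed": monthly["Unsubscribed"] / monthly["Delivered"] * 100,
    "% Bounced": (monthly["Hard Bounced"] + monthly["Soft Bounced"]) / monthly["Sent"] * 100,
})
rates.index = rates.index.strftime("%B %Y")  # e.g. "May 2023"

rates.plot(kind="line", marker="o")
plt.xlabel("Month")
plt.ylabel("Percentage")
plt.title("Email Engagement Rates Over Time")
plt.tight_layout()
plt.show()
```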

 

Categorization: Enhancing your Analysis

 

Grouping data is an indispensable technique in data analysis, offering invaluable insights that drive strategic decision-making. By segmenting data into distinct categories, we can uncover nuanced patterns, trends, and relationships that are often masked in an aggregated view. This ability to 'zoom in' allows you to personalize offerings, optimize operations, and effectively allocate resources by targeting specific regions, personas, or program types. 

 

Moreover, it facilitates the identification of what works and what doesn't in marketing campaigns, enabling continuous learning and improvement. Therefore, the power of grouping data lies in its capacity to transform a vast amount of information into actionable intelligence, ultimately leading to more informed decisions, improved customer engagement, and increased business performance.

 

Again, there are two key ways to fetch category-wise versions for Email Performance Reports. Create different versions using smart lists, or refer to naming conventions. In this case, let us consider naming conventions. You want to see how metrics differ for different program types, and for different regions. 

 

Program Type - Webinar, Newsletter, Nurture

Region - APAC, ANZ, EMEA, NA

 

Stacked Bar Graph

 

A stacked bar graph is a lucid way to see how much each category contributes to the total emails sent. Here, I used entirely synthetic mock data, just for creating this graph. You can uncover useful insights about what the trend for each category looks like.

 

Prompt 5


Email Name follows the following naming convention

<write the naming convention>

The following is what it means

<explain what parts represent what>

For example

<take example>

 

Create a stacked bar graph, to represent the volume of people reached each month. Group the stacked bar graph by region as fetched from the Email Name as per Naming Convention explained above.

Use time from <Date in naming convention or First Activity> and on X Axis mention months in the format "Month Year" (Example, June 2023)

 

[Chart: stacked bar graph of people reached per month, grouped by region]
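Here is a Python sketch of the same idea, assuming the region appears as a token in the Email Name (the convention and the sample name are illustrative). It groups Delivered by month and region and draws the stacked bars.

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("email_performance.csv")

# Hypothetical convention: the region code appears somewhere in the Email Name,
# e.g. "WBN APAC 2023-06 Product Webinar".
REGIONS = ["APAC", "ANZ", "EMEA", "NA"]

def extract_region(name: str) -> str:
    for region in REGIONS:
        if f" {region} " in f" {name} ":
            return region
    return "Unknown"

df["Region"] = df["Email Name"].apply(extract_region)
df["First Activity (PDT)"] = pd.to_datetime(df["First Activity (PDT)"], errors="coerce")
df["Month"] = df["First Activity (PDT)"].dt.to_period("M")

# People reached per month, stacked by region.
reach = df.pivot_table(index="Month", columns="Region", values="Delivered", aggfunc="sum").sort_index()
reach.index = reach.index.strftime("%B %Y")

reach.plot(kind="bar", stacked=True)
plt.xlabel("Month")
plt.ylabel("People Reached (Delivered)")
plt.title("People Reached per Month by Region")
plt.tight_layout()
plt.show()
```

Swapping the final plotting call to reach.plot(kind="line", marker="o") turns the same table into the per-region trend lines shown in the next section.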

 

Line graphs to view volumes in different categories


Stacked bars are not the only way to see this trend: you could also use trend lines, plotting each category (each region, for example) as its own line.

 

[Chart: per-region line plot of people reached per month]

 

Translating Analysis into Action

 

Consider potential observations from the analysis, e.g. low deliverability rates or high bounce rates. Identify the course of action to take, drawing from industry best practices and the wealth of wisdom in this community, and then address those areas. This is where you work with your marketing automation experts to improve these metrics.


Conclusion

 

We've looked at this from the perspective of the Email Performance Report. I used a variety of data, mixed with synthetic mock data, to run these experiments, so every analysis will be different. Keep in mind that the data you upload is an aggregated view of email performance.

 

*Keep in mind that Code Interpreter is a beta feature and its behaviour can vary; the use cases above may not render identically with the same prompts. The real value lies in putting thought into your prompts, coming up with new observations, and then tweaking your campaigns for improvement. Adobe Marketo Engage gives you the capabilities to incorporate those findings into your campaigns.

 

With some basic data analysis, these new tools in town let you uncover areas of improvement with speed and accuracy. Depending on your observations, you may want to capitalize on what's working or improve what isn't. Many of these topics have been discussed extensively in this community. Want to address a deliverability issue? Search the community. Want to address data governance? Search the community.

 

Theory will take you only so far. What have you found useful in your experiments?

4 Comments
Vinay_Kumar
Level 10 - Community Advisor

Great learning! It emphasizes the importance of turning analysis into action. Thanks for the practical insights!

Zoe_Forman
Level 9 - Community Advisor + Adobe Champion

This is a great way to explain email deliverability to new users in Marketo and also to our Marcoms team - thanks for supplementing my onboarding documents

Balkar_Singh
Level 9 - Community Advisor + Adobe Champion

@Zoe_Forman Glad you liked it, and it is helpful for you!

pflovie67
Level 2 - Champion

Outstanding work @Balkar_Singh