
    Overlap in A/B Test Reporting of Nurture Emails

      Hi,

       

      We're having some issues with the A/B test reporting for our nurture emails. We've set up A/B tests for many of our nurture emails, ranging from subject line to whole email testing.

       

      We've also run several email blasts that reference these same emails. However, when those blasts run, they don't actually run the test, since they're batch campaigns. Now our A/B test results for those nurture emails are inflated on one side: the email was referenced in the blasts, but only one version of it actually went out, so it looks as though that version got more opens and clicks. I tried creating a separate email performance report to separate the nurture program's test results from our email blasts, but couldn't find a way. Has anyone else had similar issues, or does anyone know how to set up a report that shows A/B test results just within the nurture? Moving forward, do we need to create new local emails to reference in our blasts?

       

      Thanks,

      Jasmine

        • Re: Overlap in A/B Test Reporting of Nurture Emails
          Justin Norris

          Jasmine Foruzani

           

          I think you are talking about Champion/Challenger functionality.

           

          This only works in triggered email sends and engagement programs. It's designed to gauge improvement over time as leads flow through.

           

          Add an Email Champion/Challenger - Marketo Docs - Product Docs

           

           For batch sends, you would use a random sample in the flow step choices, or use the A/B testing in Email Programs, which conversely only works for batch sends.

           

           To see the data just on the Champion/Challenger sends, use the "View Test Summary" option when you are selecting the email (there's a screenshot in the doc below):

           

          Champion/Challenger: Declare a Champion - Marketo Docs - Product Docs

           

           It appears View Test Summary still includes the batch sends, though!

           

           Either way, though, if you are looking at ratios (e.g., open rate, click rate), the raw volume of sends shouldn't matter, aside from the fact that the higher volume will give you greater confidence in the results for one of the emails.
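 
           As a rough illustration (just a sketch with made-up counts, not anything pulled from Marketo), you could sanity-check whether an open-rate gap between the two variants is bigger than noise with a quick two-proportion z-test:

           ```python
           from math import sqrt

           def open_rate_z_test(sends_a, opens_a, sends_b, opens_b):
               """Two-proportion z-test: is the open-rate difference bigger than noise?"""
               rate_a = opens_a / sends_a
               rate_b = opens_b / sends_b
               pooled = (opens_a + opens_b) / (sends_a + sends_b)
               se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
               z = (rate_a - rate_b) / se
               return rate_a, rate_b, z

           # Made-up counts: variant A's sends are inflated by the batch blasts,
           # variant B only got the triggered/engagement-program sends.
           rate_a, rate_b, z = open_rate_z_test(sends_a=5000, opens_a=900,
                                                sends_b=800, opens_b=160)
           print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}")  # |z| > ~1.96 => significant at ~95%
           ```

           The larger send volume on one side shrinks its sampling error, which is why the extra batch volume buys you confidence on that variant without necessarily changing the rate itself.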

            • Re: Overlap in A/B Test Reporting of Nurture Emails

              Thanks Justin Norris - Yes, although the issue we're having is that one version of the email is generating opens/clicks from the blasts, which inflates the open/click rates because the other test version isn't sent. We're aware the functionality only works for triggers and engagement programs, but we reference our nurture emails (which have the tests built into them) in the blasts so that if individuals who received the blasts later enter our nurture stream, they won't receive that email again.

               

              Basically, the data on the Champion/Challenger sends within the "View Test Summary" also includes our email blast opens and clicks. Do you know of a way to separate the reporting out for the open/click rates if we're not using separate local emails for our blasts?

                • Re: Overlap in A/B Test Reporting of Nurture Emails
                  Justin Norris

                  Jasmine Foruzani

                   

                  Ah interesting. I didn't know the batch sends would show up in the View Test Summary!

                   

                  The other approach here is using smart lists with test variant constraints, but it seems like the batch sends may still count towards variant A (champion) in this case based on what you've seen.
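 
                   If the built-in summary won't split them out, one manual workaround is to export the raw send/open/click activities and recompute the rates using only the engagement-program campaigns. This is only a sketch; the export source, file, and column names below are placeholders for whatever your own export contains:

                   ```python
                   import pandas as pd

                   # Hypothetical export of raw email activities (send/open/click rows),
                   # each tagged with the campaign that generated it.
                   activities = pd.read_csv("email_activities.csv")

                   # Drop everything generated by the batch blast campaigns so only the
                   # engagement-program sends count toward the test.
                   nurture_only = activities[~activities["campaign_name"].isin([
                       "Batch Blast 1", "Batch Blast 2",   # placeholder campaign names
                   ])]

                   counts = (
                       nurture_only
                       .groupby(["email_name", "activity_type"])  # e.g. Send / Open / Click
                       .size()
                       .unstack(fill_value=0)
                   )
                   counts["open_rate"] = counts["Open Email"] / counts["Send Email"]
                   counts["click_rate"] = counts["Click Email"] / counts["Send Email"]
                   print(counts[["open_rate", "click_rate"]])
                   ```

                   It's more work than a canned report, but it lets you compare the variants on engagement-program traffic only.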

                   

                   I think the better approach on the whole is to take a step back and look at your engagement program setup. It would be better to use programs inside your engagement programs instead of individual emails. Edward Unthank devised a good solution for this, where you set leads to an "Exclusion" status in the child program if you send them the same content or if they download it organically outside the engagement program. Because they then have a status in that program, the stream skips them.

                   

                  This ensures you do not send the same people the same stuff twice and also preserves your test data.

              • Re: Overlap in A/B Test Reporting of Nurture Emails

                Is anyone else bothered by the fact that you can't do Champion/Challenger A/B testing on batch smart campaigns? That seems like A) a huge limitation and B) something Marketo should warn anyone using A/B testing about! I just erroneously ran an A/B test batch campaign for a client, and now we look like we don't know what we're doing. I'm certified in Marketo, and nowhere in the certification tests was this limitation mentioned. Does anyone else find it odd that they don't warn you about this?