Ah great - thanks for catching this Conor!
Conor Fitzpatrick (and others who have participated in this thread), we had to remove this filter from our campaigns. We were finding that, in many instances, Marketo wasn't logging the "visited web page" activity until upwards of 5 minutes after the "clicked link" activity, so the trigger campaigns weren't firing with this filter in place. Thought we had this one solved.
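For anyone who wants to verify that lag on their own instance, here's a rough way to pull the two activity streams over the REST API and compare timestamps. This is just a sketch: the instance URL and token handling are placeholders, and the activity type IDs (11 = Click Email, 1 = Visit Webpage) are assumptions - check GET /rest/v1/activities/types.json for your own instance's IDs.

```python
# Sketch: measure the lag between "Click Email" and "Visit Webpage" activities
# for the same lead via the Marketo REST API. BASE, TOKEN, and the activity
# type IDs below are assumptions to adapt to your instance.
import requests
from datetime import datetime

BASE = "https://<munchkin-id>.mktorest.com"  # hypothetical instance URL
TOKEN = "<access-token>"                     # obtained from /identity/oauth/token

def get_activities(type_ids, since="2016-06-01T00:00:00Z"):
    """Page through all activities of the given types since a date."""
    page = requests.get(f"{BASE}/rest/v1/activities/pagingtoken.json",
                        params={"sinceDatetime": since,
                                "access_token": TOKEN}).json()["nextPageToken"]
    out = []
    while True:
        resp = requests.get(f"{BASE}/rest/v1/activities.json",
                            params={"nextPageToken": page,
                                    "activityTypeIds": ",".join(map(str, type_ids)),
                                    "access_token": TOKEN}).json()
        out += resp.get("result", [])
        if not resp.get("moreResult"):
            return out
        page = resp["nextPageToken"]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ")

clicks = get_activities([11])  # assumed: Click Email
visits = get_activities([1])   # assumed: Visit Webpage

# For each click, find the first later web visit by the same lead and report the lag.
for c in clicks:
    later = [parse(v["activityDate"]) for v in visits
             if v["leadId"] == c["leadId"]
             and parse(v["activityDate"]) >= parse(c["activityDate"])]
    if later:
        print(c["leadId"], "visit logged", min(later) - parse(c["activityDate"]), "after the click")
    else:
        print(c["leadId"], "no visit logged after the click")
```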
Well that's a huge bummer. I wonder if it has to do with the Munchkin 2.0 updates? I haven't tested it post-release, but I will now to see whether we get a different result.
The thing is, Munchkin v2 hasn't been rolled out to anyone yet. There was a minor change made today, but I doubt that's what's affecting this.
Hmmm... Do you think it has anything to do with your visit volume? Our web traffic volume is quite low right now as we're in the process of growing our content strategy.
I stumbled on a slightly different version of this solution last week: a click trigger, then a wait step, then a remove-from-flow step based on whether the lead visited the page.
The only problem, though, is that today I've been looking through the leads that were removed from the flow, and some look like legitimate users - they clicked the link several minutes after the email was delivered and registered an open, but never logged a web page visit.
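For anyone curious, the shape of that flow outside Marketo's UI is roughly the sketch below. The five-minute wait, the event fields, and the activity_log stand-in are all assumptions for illustration, not anything Marketo exposes this way:

```python
# Minimal sketch of the flow described above: on a click, wait, then check
# whether a page visit ever showed up for that lead; if not, treat the click
# as suspect. WAIT and the record shapes are hypothetical stand-ins.
import time
from datetime import timedelta

WAIT = timedelta(minutes=5)  # assumed grace period for Munchkin to log the visit

def is_bot_click(click, activity_log):
    """True if no web visit was logged for this lead at or after the click."""
    return not any(a["lead_id"] == click["lead_id"]
                   and a["type"] == "visited_web_page"
                   and a["time"] >= click["time"]
                   for a in activity_log)

def handle_click(click, activity_log):
    time.sleep(WAIT.total_seconds())          # mirrors the "wait" flow step
    if is_bot_click(click, activity_log):
        print(f"lead {click['lead_id']}: click with no page visit -> remove from flow")
    else:
        print(f"lead {click['lead_id']}: real visit logged -> keep in flow")
```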
I don't think the Munchkin tracking code works 100% of the time. It would be best to just not count clicks that come either before delivery or within seconds of delivery.
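To make that concrete, the timing check could look something like this sketch; the threshold and record fields are assumptions, and (as pointed out below) any fixed cutoff will misjudge some fast humans:

```python
# Sketch of the timing heuristic above: discard clicks logged before delivery
# or within a few seconds of it. The threshold is an assumption; pick a number
# that fits your own audience.
from datetime import timedelta

MIN_HUMAN_DELAY = timedelta(seconds=10)  # assumed: faster than a human plausibly clicks

def is_countable_click(click_time, delivered_time):
    """Ignore clicks logged before delivery or suspiciously soon after it."""
    if click_time < delivered_time:                      # "clicked" before delivery: scanner
        return False
    if click_time - delivered_time < MIN_HUMAN_DELAY:    # clicked within seconds: likely scanner
        return False
    return True
```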
Edit: Just set up an idea for this, if anyone wants to up-vote it:
The solution is as I described above and elsewhere. Anything that involves assumptions like "no one ever opens emails 30 seconds after they get them" isn't gonna work.
Venus - we (meaning Becky Miner, our Marketo Queen, and myself - a humble court jester) have found that, many times, the suspected "bot" clicks hit the first link in the entire email. So we've put a one-pixel link at the very front of the email, before any other links... we've caught a few flies in the trap so far.
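For anyone wanting to replicate the trap, here's roughly what the markup looks like; the trap URL is made up, and you'd point it at a tracked page no human would ever reach, then trigger your "bot" smart campaign on activity against it:

```python
# Sketch of the honeypot trick described above: a 1x1 invisible link placed
# before any real link, so scanners that fetch the first link in the message
# hit it. TRAP_URL is hypothetical; email clients vary in how they render
# inline styles, so test in your own templates.
TRAP_URL = "https://pages.example.com/honeypot.html"  # assumed tracked page

honeypot = (
    f'<a href="{TRAP_URL}" style="display:block;width:1px;height:1px;'
    f'overflow:hidden;font-size:1px;color:#ffffff;">&nbsp;</a>'
)

def add_trap(email_html):
    """Insert the trap link at the very top of the email body.

    Assumes a bare <body> tag; adjust if your template's tag carries attributes.
    """
    return email_html.replace("<body>", "<body>" + honeypot, 1)
```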
We just ran into this same issue today. After reading through Matt's response, not exactly sure what our next step should be. Lots of good info though!
Matt gives a lot of great advice here, but I did want to add a footnote/call to action for anyone else annoyed by the current situation: I've been talking to a few different filtering companies about adding a unique, filter-service-only string to their user agent when they check links (they normally spoof as IE on Windows 7) so that human and machine clicks can be told apart reliably. The problem affects more than marketing automation platforms; I find myself continually explaining it to transactional email provider users, for instance.
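If the vendors ever did add such a token, the receiving side would be trivial; this sketch uses invented token strings, since today these services deliberately send a stock IE-on-Win7 UA:

```python
# Hypothetical sketch: if security scanners identified themselves with a
# distinctive user-agent token, click handlers could separate machine clicks
# from human ones in one line. The tokens below are invented for illustration;
# no vendor actually sends them as of this writing.
SCANNER_TOKENS = ("CloudmarkScanner", "SymantecLinkCheck", "BarracudaLinkProtect")

def is_scanner(user_agent: str) -> bool:
    """True if the UA carries one of the (wished-for) scanner tokens."""
    return any(token in user_agent for token in SCANNER_TOKENS)

# Example: a spoofed UA today gives no signal at all.
print(is_scanner("Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko"))  # False
```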
I'd highly encourage people to go bother Cloudmark/Symantec/Barracuda as well so I don't seem like a lone weird geek on this point.