Now that our site has been switched to mobile-first Google indexing, speed has become even more important. In analyzing the load times of different page elements, the biggest (slowest) offenders are always Munchkin code. Obviously Munchkin is resource-intensive because of all the work it's doing, but we're wondering if anyone else is seeing a similar issue, and whether anyone has recommendations to mitigate it.
Let's not be alarmist now. It adds to FUD when an automated tool has merely reported "this network request is the slowest on your page". Something must always be the slowest, by definition.
Automated tools are not capable of determining whether the relative performance actually matters. The same goes for preposterous automated security checks.
On the other hand, it's not really true that Munchkin is doing a lot of "work". The load time refers to the time to (asynchronously) load the Munchkin bootstrapper and then (again asynchronously) load the current Munchkin library. Neither of these loads has any user-facing impact at all, unless some other essential JS of yours waits on an old-school onload event (as opposed to DOMContentLoaded), which is itself a bad practice.
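For anyone following along, this is the shape of the standard async embed, lightly restructured as a function for readability ('XXX-XXX-XXX' is a placeholder Munchkin ID). The point is that neither script load blocks rendering, and Munchkin.init only runs once the library has arrived:

```javascript
// Sketch of the standard asynchronous Munchkin embed.
// 'XXX-XXX-XXX' is a placeholder Munchkin account ID.
function loadMunchkinAsync(doc, id) {
  var didInit = false;
  function initMunchkin() {
    // Guard so init runs at most once, and only if the library loaded.
    if (!didInit && typeof Munchkin !== 'undefined') {
      didInit = true;
      Munchkin.init(id);
    }
  }
  var s = doc.createElement('script');
  s.type = 'text/javascript';
  s.async = true; // the browser keeps parsing while the library downloads
  s.src = '//munchkin.marketo.net/munchkin.js';
  s.onload = initMunchkin;
  doc.getElementsByTagName('head')[0].appendChild(s);
  return s;
}

if (typeof document !== 'undefined') {
  loadMunchkinAsync(document, 'XXX-XXX-XXX');
}
```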
A vital way to reduce the user-facing performance impact of Munchkin is to turn off click tracking (on all links, or at least on most links). But you must be acutely aware of the consequences of doing this.
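For reference, Munchkin's documented apiOnly init setting is the bluntest version of this: it suppresses the automatic Visit Web Page and Click Link activities entirely, and you then log activity explicitly yourself. A minimal sketch ('000-AAA-000' is a placeholder Munchkin ID):

```javascript
// Sketch: initialize Munchkin in API-only mode using the documented
// apiOnly flag. Munchkin then records nothing automatically (neither
// page visits nor link clicks) until you explicitly log activity.
// '000-AAA-000' is a placeholder Munchkin account ID.
function initMunchkinApiOnly(munchkin, id) {
  munchkin.init(id, { apiOnly: true });
}

if (typeof Munchkin !== 'undefined') {
  initMunchkinApiOnly(Munchkin, '000-AAA-000');
}
```

Note this goes further than disabling click tracking alone, so heed the warning above: you lose the automatic page-visit hit too.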
Hi Sanford, thanks for your response. I apologize if my question was unclear: I'm not talking about the user-facing performance impact, but rather the asynchronous load time that affects the overall speed of our site as a search ranking factor. In testing we performed of all the elements that load on a page, the top 9 longest-loading offenders were all Munchkin (in an uncached browser, representative of a first-time user), each taking between 7 and 12 seconds to load completely. From a user's perspective everything looks normal, but it hurts our site performance, especially on mobile, which as you know is extremely important to Google's algorithm.
Just curious if anyone else is having similar issues and how they've addressed them.
Hey Natalie, Sanford,
My team is also in the process of measuring the performance of our website, and we've come to a similar conclusion regarding Munchkin's performance. Understandably, some of the functionality that slows the site down is unavoidable for the script, such as the synchronous XHR requests that happen on link clicks. However, simply switching off link tracking isn't really a solution to this problem, in my opinion.
I think the best possible way to reduce the front-end performance impact of these scripts is to do more of the work in the back end rather than in the front end. For example, I don't need to track navigation with JS: it can be tracked in the back end when a GET request is received for a page, as long as I can communicate with a Marketo API to pass on the necessary info. Does such an API exist, and is there documentation for it? Currently, the synchronous link tracking slows down every navigation on our website by up to 70%, which is huge, so being able to perform this outside the client's machine would be incredibly powerful.
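To make the idea concrete, here's a rough sketch of the capture half of that (Node-style, framework-agnostic). I'm not aware of any officially documented Marketo endpoint for replaying these hits, so this only shows what you'd need to record per request, including Munchkin's _mkto_trk cookie, which is what ties a hit to a known lead:

```javascript
// Sketch: capture a page view server-side from an incoming GET request.
// How you forward this record to Marketo is the open question (no
// official server-side Munchkin API is documented); this only shows
// what to capture.
function buildVisitRecord(req) {
  var cookies = (req.headers.cookie || '').split(/;\s*/);
  return {
    url: req.url,                        // page the visitor requested
    referrer: req.headers.referer || '', // where they came from
    // Munchkin's tracking cookie, which identifies the visitor
    trkCookie: cookies.find(function (c) {
      return c.indexOf('_mkto_trk=') === 0;
    }) || ''
  };
}
```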
Hi Elizabeth, we have not found a solution for this yet but I like the direction Alex O'Regan is headed in his comment above.
Since this is an issue that would likely impact any site using Munchkin, I'm surprised Marketo has not addressed it within the product or with an API (at least not that we've been able to find). #product idea
If you want to avoid the impact of synchronous tracking (which is mandatory if tracking is done from the front end, since without it click tracking is worthless), then it's not as if it's difficult to send those requests from your back end. The "line protocol", if you will, is easily reverse-engineered.
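As a sketch of what "reverse-engineered" means here: the endpoint and _mch* parameter names below are what the hits look like in browser DevTools. None of this is documented and it could change, so capture and verify your own traffic before relying on it.

```javascript
// Sketch: rebuild a Munchkin "visit web page" hit URL server-side.
// CAUTION: endpoint and _mch* parameter names are as observed in
// browser network traffic; they are undocumented assumptions here.
function buildVisitHitUrl(munchkinId, trkToken, host, relativeUrl) {
  var params = [
    '_mchId=' + encodeURIComponent(munchkinId),  // Munchkin account ID
    '_mchTk=' + encodeURIComponent(trkToken),    // token from the _mkto_trk cookie
    '_mchHo=' + encodeURIComponent(host),        // page host
    '_mchRu=' + encodeURIComponent(relativeUrl), // relative URL visited
    '_mchNc=' + Date.now()                       // cache-buster
  ];
  return 'https://' + munchkinId.toLowerCase() +
    '.mktoresp.com/webevents/visitWebPage?' + params.join('&');
}
```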
You can also switch click tracking to use beacons where they're supported; I posted code for this years ago, if you search for it. But not all browsers support beacons.
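For anyone unfamiliar with the approach, here's the general shape (a fresh sketch, not the code referenced above): navigator.sendBeacon queues the request and keeps it alive across page unload, so navigation is never delayed. The tracking URL here is a placeholder, not a documented Marketo endpoint.

```javascript
// Sketch: fire-and-forget click tracking via the Beacon API.
// trackUrl is a placeholder endpoint, not a documented Marketo URL.
function trackClick(nav, trackUrl, payload) {
  if (nav && typeof nav.sendBeacon === 'function') {
    // The browser queues the request and sends it even as the page
    // unloads, so the click handler never has to delay navigation.
    return nav.sendBeacon(trackUrl, JSON.stringify(payload));
  }
  return false; // no beacon support: fall back to the delayed-click path
}

if (typeof document !== 'undefined' && typeof navigator !== 'undefined') {
  document.addEventListener('click', function (e) {
    var link = e.target.closest && e.target.closest('a');
    if (link) trackClick(navigator, '/track', { href: link.href });
  });
}
```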
How about setting disableClickDelay to true? Since Munchkin then doesn't wait 350 ms to track clicks, that should certainly help a bit.
If you don't wait, then there's no guarantee that you're logging the hit.
Bottom line: in browsers with beacons you can eliminate all delays; in other browsers you have to play it safe.
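Putting the last few replies together, the decision rule looks roughly like this. Note that pairing the documented disableClickDelay flag with a beacon-based sender is my assumption about a sensible setup, not stock Munchkin behavior, and '000-AAA-000' is a placeholder Munchkin ID:

```javascript
// Sketch of the "play it safe" rule: only skip Munchkin's 350 ms
// click delay when the Beacon API exists to carry the hit reliably.
// Assumes you also send the hit via a beacon yourself; Munchkin does
// not do that on its own.
function chooseMunchkinOptions(nav) {
  var hasBeacon = !!(nav && typeof nav.sendBeacon === 'function');
  return { disableClickDelay: hasBeacon };
}

if (typeof Munchkin !== 'undefined' && typeof navigator !== 'undefined') {
  // '000-AAA-000' is a placeholder Munchkin account ID.
  Munchkin.init('000-AAA-000', chooseMunchkinOptions(navigator));
}
```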