Let's not be alarmist now. It adds to FUD when an automated tool merely reports "this network request is the slowest on your page." Something must always be the slowest, by definition.
Automated tools are not capable of determining whether the relative performance actually matters. The same goes for preposterous automated security checks.
On the other hand, it's not really true that Munchkin is doing a lot of "work." The load time refers to the time to (asynchronously) load the Munchkin bootstrapper and then (again asynchronously) load the current Munchkin library. Neither of these loads has any user-facing impact at all, unless some other essential JS on your page waits on the old-school `load` event (as opposed to `DOMContentLoaded`), which is itself a bad practice.
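For context, the standard Munchkin embed injects the bootstrapper with `async = true`, so neither fetch blocks rendering. A minimal sketch of that pattern (the Munchkin ID and script URL are the usual placeholders; the init-once guard is pulled out as a plain function here purely for illustration):

```javascript
// Init-once guard: the embed wires the same callback to both onload and
// onreadystatechange, so a flag prevents Munchkin.init from running twice.
function makeMunchkinLoader(initFn) {
  var didInit = false;
  return function initOnce() {
    if (!didInit) {
      didInit = true;
      initFn();
    }
  };
}

// In a browser, the bootstrapper is then loaded asynchronously, roughly:
//
//   var s = document.createElement('script');
//   s.async = true;                          // key point: never blocks rendering
//   s.src = '//munchkin.marketo.net/munchkin.js';
//   s.onload = makeMunchkinLoader(function () {
//     Munchkin.init('XXX-XXX-XXX');          // placeholder Munchkin ID
//   });
//   document.getElementsByTagName('head')[0].appendChild(s);
```

Because the script tag is appended with `async = true`, the browser fetches and executes it off the critical rendering path, which is why these loads don't delay what the user sees.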
A vital way to reduce the user-facing performance impact of Munchkin is to turn off click tracking (on all links, or at least on most links), but you must be acutely aware of the consequences of doing so.
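As a sketch of what that looks like in practice, assuming the click-related options in Marketo's Munchkin configuration docs (`disableClickDelay` and `clickTime`; verify the option names against current documentation before relying on them):

```javascript
// Hedged sketch: Munchkin init options affecting click tracking.
// 'XXX-XXX-XXX' is a placeholder Munchkin ID.
Munchkin.init('XXX-XXX-XXX', {
  // Remove the short pause Munchkin inserts on link clicks so the click
  // beacon can be sent (keeps tracking, drops the user-facing delay):
  disableClickDelay: true

  // ...or alternatively keep the delay but shorten it (milliseconds):
  // clickTime: 150
});
```

The trade-off to be aware of: without the delay, a click beacon may not complete before the browser navigates away, so some clicks can go unrecorded.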
Hi Sanford, thanks for your response. I apologize if my question was unclear: I'm not talking about the user-facing performance impact, but rather the asynchronous load time that affects our site's overall speed as a search ranking factor. In some testing we performed of all the elements that load on a page, the top 9 longest-load-time offenders were all Munchkin (in an uncached browser, representative of a first-time user), each taking between 7 and 12 seconds to completely load. From a user's perspective, everything would look normal, but it hurts our site performance scores, especially on mobile, which, as you know, is extremely important to Google's algorithm.
Just curious if anyone else is having similar issues and how they've addressed them.