We have a ~4 Million record database, and we routinely run into challenges with the daily 50k API limit, specifically with VidYard and their native Marketo integration...
This is effectively limiting us from using the VidYard integration (we shut it down because it was snarling all of our operations each day as we surpassed our API limit) and prevents us from considering ANY other LaunchPoint vendor that relies on API integration. Honestly, it diminishes the overall net value Marketo can provide.
Is anyone else having problems with this limit? Or suggestions for working around it? Has anyone successfully gotten Marketo to raise their API limit without having to pay extra? We already pay for a nearly FOUR MILLION RECORD DATABASE, and it just seems like the API capacity to support a database that size should be included in the base product, especially since Marketo is trying to court more enterprise clients...
If you think it's bad now, you should've tried it when the standard allocation was 10,000 calls! (Should also check with your account rep b/c you should be able to get 100,000 these days given your database size.)
Really, Vidyard should've known better than to lean on the REST API (let alone to use the API without hard-capping their own daily consumption and being explicit about the dangers). The API has never been suited for one-by-one calls in response to end user activity, and video tracking is particularly gnarly in this regard... if you're logging plays, pauses, and finishes you've got crazy multipliers, and it can get out of hand even using the (de facto unlimited) Munchkin API!
Of course we know what they were thinking -- "Enterprise product = enterprise capacity" -- but shipping without testing against real-world limits isn't right. We vet products by asking, "How do you deal with the API limitations?" If they're dumbfounded, or if they don't have an answer that's already baked into the product settings, that's a bad sign; if the answer is, "You set a daily limit in this dialog box," that's at least a better sign.
The silver lining, and take this with the requisite irony, is that if you were allowed 1MM calls/day like you probably have with SFDC, the back end couldn't handle the traffic. I think that's where the main problem is and that some reengineering is necessary (and, one hopes, pending). While you can buy more API calls if you have the $ (you probably can't wrangle your rep above 100K bundled) that doesn't mean performance will be up to your expectations, because the whole resource-governor-time-slice stuff you get with SFDC isn't there yet.
Another thing to think about is what you might call the "API privatization" movement -- some major players have realized that supporting 3rd-party integrations isn't as profitable as, well, owning the integrations (at which point they're not really integrations!). Witness Salesforce swallowing MuleSoft and in turn dataloader.io, etc. Marketo's data transfer hub (I think that's what it's called), which I wholly support, is another example, but it's only suitable for bulk extracts, not inbound activities. Perhaps there will be a similar offering that, while not "open" in the sense of creating a 3rd-party ecosystem, allows for this kind of scale.
Sanford - have you had any experience with the number of API calls Marketo can handle in a day? On a call with Marketo, they mentioned that they have accounts that use 1 million+ API calls in a day.
Totally depends on which endpoints -- some take much, much longer than others to complete.
I would not recommend paying for more than 200K extra (i.e., 250K total) calls per day as it's extraordinarily unlikely that you will be able to use that many due to resource constraints.
Perhaps they are updating 1 million+ records, which is quite feasible in a few ways. For example, some endpoints will update 150 or 300 records at once.
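To illustrate how batching stretches the daily allocation, here's a rough sketch (my own illustration, not Marketo's or any vendor's actual client code -- the instance URL is a placeholder and the helper names are mine), assuming an endpoint that accepts up to 300 records per call:

```python
import json
import urllib.request

BASE_URL = "https://example-instance.mktorest.com"  # placeholder for your instance URL


def chunk(records, size=300):
    # Split records into batches of at most `size`
    # (the per-call cap mentioned above).
    for i in range(0, len(records), size):
        yield records[i:i + size]


def sync_leads(records, access_token):
    # 1,000,000 records / 300 per call is roughly 3,334 API calls,
    # versus 1,000,000 calls if you sync one record at a time.
    for batch in chunk(records):
        body = json.dumps({"action": "createOrUpdate", "input": batch}).encode()
        req = urllib.request.Request(
            f"{BASE_URL}/rest/v1/leads.json?access_token={access_token}",
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
            if not result.get("success"):
                raise RuntimeError(result.get("errors"))
```

That's how "1 million+ records in a day" fits under even a modest call limit: the multiplier comes from the batch size, not from raw call volume.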
The main limitations to work around are:
- 50k API calls per day (Marketo can increase this for a fee)
- 10 concurrent connections (system wide)
- 100 API calls w/in 20 seconds (system wide)
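A client-side throttle is one way to stay under that 100-calls-per-20-seconds cap instead of eating 606 errors. Here's a minimal single-threaded sliding-window sketch (my own illustration, not an official SDK):

```python
import time
from collections import deque


class MarketoThrottle:
    """Keep at most `max_calls` requests inside any rolling `window`
    seconds (100 calls / 20 s by default, per the limits above).
    Single-threaded sketch -- call wait() before each API request."""

    def __init__(self, max_calls=100, window=20.0):
        self.max_calls = max_calls
        self.window = window
        self.calls = deque()  # timestamps of recent calls

    def wait(self):
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.window - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

This doesn't help with the 10-concurrent-connections cap (that's a matter of limiting your worker pool), but it keeps a single busy integration from tripping the rate limit.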
If you need larger data sets there are bulk import/export options. The limits to work around are:
- Max Concurrent Export Jobs: 2
- Max Queued Export Jobs (inclusive of currently exporting jobs): 10
- File Retention Period: 10 days
- Default Daily Export Allocation: 500MB (Increases available for purchase)
- Max Time Span for Date Range Filter (createdAt or updatedAt): 31 days
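Given that 31-day cap on the date-range filter, one practical move is to slice a long extraction into compliant windows before creating export jobs. A minimal sketch (the helper name is mine):

```python
from datetime import date, timedelta


def export_windows(start, end, max_days=31):
    """Split the inclusive range [start, end] into consecutive
    date windows of at most `max_days` days each, suitable for
    createdAt/updatedAt filters on bulk export jobs."""
    windows = []
    cursor = start
    while cursor <= end:
        stop = min(cursor + timedelta(days=max_days - 1), end)
        windows.append((cursor, stop))
        cursor = stop + timedelta(days=1)
    return windows
```

You'd then create one export job per window, keeping in mind the 2-concurrent / 10-queued job caps above when you enqueue them.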
For some reference, my team has hit the 500MB limit a few times when extracting email activity for a single day in which 750k-1M emails were sent. We're pulling email Sent/Delivered/Opened/Clicked/Bounced etc., so the dataset grows a bit larger than the total audience. We've since upgraded to a 1GB daily limit; the next tier, I believe, is 3GB.
Hope it helps!