We’re looking for suggestions regarding getting our data into Marketo and keeping it up to date. If you have experience with either of these approaches, please let me know how it went, why you made the decision you did, and whether you would change anything now.
We have a large custom CRM with lots of data in it. Our BI department connects to our main databases and, applying its own business logic (outside of the CRM), extracts data into its own tables. Through its ETL process it keeps a mostly up-to-date marketing list for us to use.
From there, our process keeps a cached version of this table to compare against. If it finds new or updated data, it sends those leads in batches to Marketo.
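To make the diff-and-batch step concrete, here is a minimal sketch in Python. The function and field names are hypothetical, and the rows are plain dicts standing in for the BI table and its cached copy; the batch size of 300 reflects the per-request limit of Marketo's bulk lead endpoint, but check the limit for the endpoint you actually use.

```python
def diff_leads(cached, current):
    """Return leads that are new or changed since the last sync.

    `cached` and `current` map a lead id to its field dict; in the real
    process these rows would come from the BI table and our cached copy.
    """
    changed = []
    for lead_id, fields in current.items():
        if cached.get(lead_id) != fields:
            changed.append({"id": lead_id, **fields})
    return changed


def batch(leads, size=300):
    """Split the changed leads into batches for Marketo's bulk API
    (300 records per request is the documented limit; adjust as needed)."""
    return [leads[i:i + size] for i in range(0, len(leads), size)]
```

Each batch would then be POSTed to Marketo; on success, the cached table is refreshed so the next run only sees new deltas.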
Our webhooks then take any data received from Marketo for use in our CRM, such as triggering workflows. We also pull changed records and make them available to our CRM in the same fashion.
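The webhook side might look something like the following sketch. Note that Marketo webhook payloads are whatever you define in the webhook template, so the `leadId`/`event` shape below is an assumption, and `workflows` is a hypothetical registry mapping event names to CRM workflow handlers.

```python
import json


def handle_marketo_webhook(raw_body, workflows):
    """Parse an incoming Marketo webhook payload and dispatch the
    matching CRM workflow.

    `raw_body` is the raw request body; `workflows` maps an event name
    to a callable. The payload shape is defined by our webhook template,
    assumed here to be {"leadId": ..., "event": ...}.
    """
    payload = json.loads(raw_body)
    handler = workflows.get(payload["event"])
    if handler is None:
        # Unknown event: in the real service we would log and return 200
        # so Marketo does not retry indefinitely.
        return None
    return handler(payload["leadId"], payload)
```

In production this would sit behind an HTTP endpoint; the dispatch logic itself is the part that stays the same in both approaches.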
We could then define critical triggers within the CRM that push updates or new leads to Marketo without sitting in our rudimentary queue.
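The critical-trigger routing could be as simple as the sketch below. `CRITICAL_FIELDS` is a hypothetical set we would tune per campaign; anything touching one of those fields pushes to Marketo immediately, and everything else waits for the next batch run.

```python
# Hypothetical set of fields whose changes bypass the queue.
CRITICAL_FIELDS = {"email", "unsubscribed"}


def route_change(change, push_now, enqueue):
    """Send changes that touch critical fields straight to Marketo;
    defer everything else to the batch sync.

    `change` is assumed to carry a "changed_fields" list; `push_now`
    and `enqueue` are callables supplied by the sync process.
    """
    if CRITICAL_FIELDS & set(change["changed_fields"]):
        push_now(change)
    else:
        enqueue(change)
```

The maintenance concern mentioned below is exactly this set: every new "critical" field grows `CRITICAL_FIELDS` and the trigger definitions behind it.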
While we’d be able to get this up and running relatively quickly, the main disadvantage of this approach is that the data may be N minutes stale, which could reduce how effective our use of Marketo is. Additionally, the set of critical data that needs immediate updates may grow quite a bit, which would add maintenance overhead.
Another approach we considered would be to set up a real-time ETL process, outside of our BI department, covering the whole of our CRM.
As changes happen in our system, we would send them through RabbitMQ to handle internal processes such as audit tracking. Using RabbitMQ, our process could push out to Marketo without having to compare our BI table against a cached version.
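The fan-out described above can be sketched without a broker. The class below is a toy stand-in for a RabbitMQ fanout exchange (in production this would be `pika` publishes against a real exchange, with audit tracking and the Marketo push as separate bound queues); the point is that every consumer sees every change, so no cached-table diff is needed.

```python
class ChangeBus:
    """Toy stand-in for a RabbitMQ fanout exchange: every CRM change
    event is delivered to each bound consumer (audit tracking, the
    Marketo push process, and so on)."""

    def __init__(self):
        self.consumers = []

    def bind(self, consumer):
        """Register a consumer callable, like binding a queue to the exchange."""
        self.consumers.append(consumer)

    def publish(self, change):
        """Deliver one change event to every bound consumer."""
        for consumer in self.consumers:
            consumer(change)
```

With a real broker the Marketo consumer also gets durability and retry for free, which the rudimentary queue in the first approach lacks.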
Pulling changes and receiving webhooks would remain largely the same.
The biggest disadvantage here is that it would significantly increase our development time for the Marketo integration. However, it would keep our data in sync nearly instantly.