Hi community!
I have a Google Sheets document with data from different sources that will enhance my database. However, when I tried using Zapier, it breaks if there are more than 10,000 rows. Their support literally told me: "It’s likely that much data in a GSheet will cause issues/errors in the Zaps."
So, does anyone know of a solution for this scenario?
-Raul
AFAICT, Google Sheets isn't optimized for large volumes of data in the first place, and on top of that, reading that volume of data from a GSheet via Zapier would likely cause errors. IMHO you could try decreasing the batch size by running the Zap more often (i.e., increasing the frequency; if you're running this batch job 1x a day, maybe run it 2-3 times a day). That can keep you in a safe zone, but as you can imagine, it's no guarantee you'd never end up with 100k rows to process in one run. Alternatively, you could use the Google Sheets API (via the custom API request action) to loop through smaller slices of the data until you've processed the entire file.
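As a rough sketch of that batched-read idea (assuming a hypothetical spreadsheet ID, an OAuth access token, and a tab named Sheet1 with the data in column A; these are placeholders, not your actual setup), something like this against the Sheets API v4 values.get endpoint would pull the column a few thousand rows at a time:

```python
import requests

SPREADSHEET_ID = "your-spreadsheet-id"    # placeholder
ACCESS_TOKEN = "your-oauth-access-token"  # placeholder
SHEET_NAME = "Sheet1"                     # assumed tab name
BATCH_SIZE = 5000                         # rows per request

def fetch_batch(start_row, end_row):
    """Read one slice of column A via the Sheets API values.get endpoint."""
    rng = f"{SHEET_NAME}!A{start_row}:A{end_row}"
    url = f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}/values/{rng}"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    # The API omits "values" entirely when the range is empty.
    return resp.json().get("values", [])

def fetch_all():
    start = 2  # skip the header row
    while True:
        rows = fetch_batch(start, start + BATCH_SIZE - 1)
        if not rows:
            break
        yield from rows
        if len(rows) < BATCH_SIZE:
            break  # last (partial) batch reached
        start += BATCH_SIZE

for row in fetch_all():
    pass  # push each value into your database here
```

The same looping logic works from a Zapier custom API request action: just track the start row between runs and request one slice per call instead of the whole sheet.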
Thanks @Darshil_Shah1, I'll try the Google Sheets API. These files will always be a single column with 75k to 100k records, loaded on a daily basis.