Using the Salesforce Data Loader is pretty much mandatory when you are working with large numbers of records and need to perform a mass upload or update. For optimal results, Salesforce recommends the Data Loader when you are working with 50,000 to 5 million records. If you have fewer, the built-in import wizards work perfectly, and if you are working with more, third-party data apps are the best way to go.
When running Data Loader, you can schedule these big loads into batches and let automation process them in the background. There is a catch though: the larger the batch size, the greater the chance of running into issues, like Apex CPU time limit errors or SOQL query limits in orgs with lots of automation. Basically, errors that are the result of too much data being processed all at once. Considering how essential Data Loader can be when working with a large volume of records, how can you avoid hitting these errors and still process your uploads?
One solution is to manually drop the Data Loader batch size from the default 200 all the way down to 1. Whoa, that’s a big leap, right? By dropping your batch size, each record is processed in its own transaction, so you easily avoid the errors above that come from cramming too much data into one batch. However, this solution does have a few drawbacks to consider.
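In the Data Loader UI, this setting lives under Settings > Batch Size. If you run Data Loader in command-line (batch) mode instead, the same setting is controlled by the `sfdc.loadBatchSize` property in your process-conf.xml. Here is a minimal sketch of what that might look like; the process name, object, and file names are placeholders for illustration, and your real file will also need your endpoint, credentials, and mapping file:

```xml
<!-- Hypothetical process-conf.xml bean; names and files are examples only -->
<bean id="smallBatchUpsert"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
  <description>Upsert Accounts one record at a time to avoid automation limits.</description>
  <property name="name" value="smallBatchUpsert"/>
  <property name="configOverrideMap">
    <map>
      <entry key="sfdc.entity" value="Account"/>
      <entry key="process.operation" value="upsert"/>
      <!-- Drop from the default 200 down to 1 -->
      <entry key="sfdc.loadBatchSize" value="1"/>
      <entry key="dataAccess.type" value="csvRead"/>
      <entry key="dataAccess.name" value="accounts.csv"/>
    </map>
  </property>
</bean>
```

Settings changed in the UI only apply to UI runs, so if you schedule loads in batch mode, make sure the batch size is set in the config file as well.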
First, there are daily batch limits, and for larger datasets, you may not be able to process all records without reaching that limit. Second, smaller batch sizes mean your total upload will take longer. How much longer varies depending on your automation and the number of records, but it generally won’t be a crazy large time increase. In the end, if you have a lot of processes running every day, lowering your batch size is a handy trick to help you avoid overloading the system while still being able to process all of your data.
Read the Data Loader Guide from Salesforce
-Ryan and the CloudMyBiz Team