
Our current design starts from a trigger that checks the data against a set of rules the customer configures. Records that pass are handed to another job, which exports them to an external system through a SOAP call.

The problem is that we hit governor limits when a user updates too much data at one time (in particular, we are looking at a large import from an external system or through Data Loader).

Since we are targeting a large amount of data (~200,000+ accounts), we were planning on using a batchable class. The problem I'm seeing with that is that only five batch jobs can be queued or active at a time, which could be a huge problem if our customer has multiple workflows (which they do).

I've seen posts on here about a singleton solution, but I'm not clear on how to get that to work with workflows. Is there a way to delay execution until after the workflows have completed? Do I have to settle for putting the sync job on a schedule, or add a large delay to give it time to collect all the instances from the workflows?

Thanks for the help.

J. Larson

2 Answers


I might go with:

  • Start a schedulable at Time(0)
  • The schedulable checks whether the batchable is already running; if so, it aborts
  • Otherwise, the schedulable launches the batchable, which queries for work
  • In the batchable's finish(), start the schedulable again if it isn't already running

Don't tie any of this to triggers; the triggers merely leave the database in a state such that the batchable can find work to do.
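
A minimal sketch of that loop, assuming a hypothetical ExportScheduler/ExportBatch pair and a Needs_Export__c flag that the triggers set (none of these names come from the answer, and a production version would also guard against double-scheduling in finish()):

    // Hypothetical names throughout; the triggers only set Needs_Export__c.
    public class ExportScheduler implements Schedulable {
        public void execute(SchedulableContext sc) {
            // Singleton guard: do nothing if an ExportBatch is already queued or running.
            Integer active = [
                SELECT COUNT() FROM AsyncApexJob
                WHERE JobType = 'BatchApex'
                  AND ApexClass.Name = 'ExportBatch'
                  AND Status IN ('Holding', 'Queued', 'Preparing', 'Processing')
            ];
            if (active == 0) {
                Database.executeBatch(new ExportBatch());
            }
        }
    }

    public class ExportBatch implements Database.Batchable<SObject>, Database.AllowsCallouts {
        public Database.QueryLocator start(Database.BatchableContext bc) {
            // The batch finds its own work: records the triggers flagged.
            return Database.getQueryLocator(
                'SELECT Id FROM Account WHERE Needs_Export__c = true');
        }

        public void execute(Database.BatchableContext bc, List<Account> scope) {
            // Evaluate the customer's rules, make the SOAP callout for records
            // that pass, then clear Needs_Export__c on the processed records.
        }

        public void finish(Database.BatchableContext bc) {
            // Re-arm the scheduler about a minute out so new work keeps draining.
            Datetime next = Datetime.now().addMinutes(1);
            System.schedule('ExportScheduler-' + next.getTime(),
                next.format('s m H d M \'?\' yyyy'), new ExportScheduler());
        }
    }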

cropredy
  • I'm not entirely following. I'm doing a performance refactor on a system that is already utilizing triggers. It sounds like what you are describing would require a much larger effort than my team can afford to make. – J. Larson Aug 21 '20 at 05:25
  • I was trying to illustrate a singleton-type pattern – cropredy Aug 21 '20 at 18:13

Queueable is the only way to reasonably handle large volumes of data from a trigger. Batchable is meant for updating a large number of records in bulk, but it is too limited for frequent use in triggers (for example, only five batch jobs can be queued or active at once). You can chain Queueable jobs much as a Batchable class processes chunks, but without those stricter limitations.
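
For illustration, a minimal sketch of that hand-off, with hypothetical names (ExportQueueable, the AccountExport trigger); the rule evaluation and SOAP callout are elided:

    // Hypothetical sketch: the trigger collects the affected Ids and hands
    // them to a Queueable, which is allowed to make callouts.
    public class ExportQueueable implements Queueable, Database.AllowsCallouts {
        private List<Id> recordIds;

        public ExportQueueable(List<Id> recordIds) {
            this.recordIds = recordIds;
        }

        public void execute(QueueableContext qc) {
            // Evaluate the customer's rules against these records and make
            // the SOAP callout for the ones that pass.
        }
    }

    trigger AccountExport on Account (after insert, after update) {
        System.enqueueJob(new ExportQueueable(new List<Id>(Trigger.newMap.keySet())));
    }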

sfdcfox
  • Is Queueable able to handle the large volume we need? (over 200,000 records) – J. Larson Aug 19 '20 at 18:31
  • @J.Larson Generally speaking, yes, it's possible, but it depends on the specific circumstances. You can chain a queueable into another queueable to process a large amount of data. It's a different design from batchable, but better suited for triggers. – sfdcfox Aug 19 '20 at 18:38
  • @J.Larson - chained queueables will have an SFDC-imposed backoff time delay after the 3rd or 4th link, asymptoting to 60 secs between links – cropredy Aug 20 '20 at 01:22
  • @cropredy In theory. I've never actually seen it happen, though (see this). – sfdcfox Aug 20 '20 at 03:03
  • Well I certainly have seen this but it is worth me revisiting this with v49 – cropredy Aug 20 '20 at 06:05
  • Thank you both. I'll bring this up in my next meeting with my team. I was originally leaning towards batchable because it is better for large volumes: we know our customer is going to import ~200,000 accounts, and they have a number of workflows that will update related objects as well, so I'm estimating having to handle 700k-1m+ objects. – J. Larson Aug 21 '20 at 05:30
  • Even if the chained jobs take a long time (a few hours), that should be fine in my use case. – J. Larson Aug 21 '20 at 05:35
  • @sfdcfox what would the logic be for recursively chaining the queueable for large volume? Just process a chunk and send the remaining data set to the next one in the chain? Or just process all the records in one queueable execution? I feel like I'm missing a common design pattern with queueables. – J. Larson Aug 25 '20 at 00:13
  • @J.Larson Yes, just process a chunk of data, then chain to another Queueable, repeat as necessary. – sfdcfox Aug 25 '20 at 01:25
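
For what that looks like in practice, a minimal chunk-and-chain sketch (ChunkedExportQueueable and the chunk size of 200 are illustrative, not from the comments):

    // Hypothetical sketch of the chunk-and-chain pattern described above.
    public class ChunkedExportQueueable implements Queueable, Database.AllowsCallouts {
        private static final Integer CHUNK_SIZE = 200; // illustrative size
        private List<Id> pending; // the full backlog of record Ids

        public ChunkedExportQueueable(List<Id> pending) {
            this.pending = pending;
        }

        public void execute(QueueableContext qc) {
            // Split the backlog: the first chunk is processed now, the rest later.
            List<Id> chunk = new List<Id>();
            List<Id> rest = new List<Id>();
            for (Integer i = 0; i < pending.size(); i++) {
                if (i < CHUNK_SIZE) {
                    chunk.add(pending[i]);
                } else {
                    rest.add(pending[i]);
                }
            }

            // Evaluate rules and make the SOAP callout for `chunk` here.

            // Chain: hand the remainder to the next link. (Chaining is not
            // allowed in tests, hence the Test.isRunningTest() guard.)
            if (!rest.isEmpty() && !Test.isRunningTest()) {
                System.enqueueJob(new ChunkedExportQueueable(rest));
            }
        }
    }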