5

What is the maximum number of records a trigger needs to be able to process in a single transaction/batch?

Is it limited to 200, or do triggers need to be able to handle more?

Robs

4 Answers

7

When more than 200 records need to be processed, Salesforce runs the trigger in chunks of 200.

So, if 1,000 records are updated, Salesforce runs the trigger 5 times, on 200 records each time.

This means that whatever you do in that trigger gets hit 5 times, e.g. if you do 1 SOQL query, then the 1,000-record update will use 5 queries.

For example, suppose we have this useless trigger on Account:

trigger AccountTrigger on Account (after insert) {
    // One SOQL query per trigger invocation, i.e. per 200-record chunk
    List<Account> accounts = [SELECT Id FROM Account];

    System.debug('Number of SOQL queries used: ' + Limits.getQueries());
}

Then, the following test passes:

@IsTest
private class TriggerChunkingTest {

    @IsTest
    static void fiveChunks() {
        Integer nAccounts = 1000;
        List<Account> accounts = new List<Account>();

        for(Integer i=0; i < nAccounts; i++) {
            accounts.add(new Account(Name = String.valueOf(i)));
        }

        Test.startTest();
        insert accounts;
        System.assertEquals(5, Limits.getQueries());
        Test.stopTest();
    }
}

With the following debug output:

11:50:15.253 (1264272078)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 1
11:50:17.429 (3437153880)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 2
11:50:18.693 (4705911499)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 3
11:50:20.586 (6601999951)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 4
11:50:22.141 (8156881669)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 5

As you can imagine, even with efficient triggers, this can blow up pretty fast. 5000 records? Your trigger is going to run 25 times.

As a rule of thumb, I like things to work OK on 1000 records.

If my system can cause updates to more records than that at once, I tend to use some asynchronous method to split that update up.

For example, Campaign Members tend to blow up in crazy ways. Suppose I have a trigger on Campaign which needs to update the corresponding Campaign Members (CMs), and those CMs have their own triggers. I would update the CMs using a Queueable which only updates 1,000 or so at a time. With Campaigns, you can easily end up with more than 10,000 members, so you literally have no option but to split that update yourself or you'll hit the DML row limit.
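A minimal sketch of that Queueable pattern (the class name, the chunk size of 1,000, and the field logic are illustrative assumptions, not part of the original answer):

```apex
// Hypothetical sketch: update Campaign Members 1,000 at a time,
// re-enqueueing until the full set is processed. Each Queueable
// execution is its own transaction, so governor limits reset between chunks.
public class CampaignMemberChunkQueueable implements Queueable {
    private static final Integer CHUNK_SIZE = 1000;
    private List<Id> memberIds;

    public CampaignMemberChunkQueueable(List<Id> memberIds) {
        this.memberIds = memberIds;
    }

    public void execute(QueueableContext context) {
        List<Id> thisChunk = new List<Id>();
        List<Id> remaining = new List<Id>();
        for (Integer i = 0; i < memberIds.size(); i++) {
            if (i < CHUNK_SIZE) {
                thisChunk.add(memberIds[i]);
            } else {
                remaining.add(memberIds[i]);
            }
        }

        List<CampaignMember> members = [
            SELECT Id FROM CampaignMember WHERE Id IN :thisChunk
        ];
        // ... set whatever fields need updating here ...
        // Note: a 1,000-record update still fires the CM triggers
        // in 5 chunks of 200 within this one transaction.
        update members;

        if (!remaining.isEmpty()) {
            System.enqueueJob(new CampaignMemberChunkQueueable(remaining));
        }
    }
}
```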

Aidan
  • Thanks for the extensive answer. Surely, if the trigger breaks everything into chunks of 200, you don't need it to work on 1000 records, just 200 records? Don't the limits get reset between each 200-record chunk? – Robs Jul 09 '19 at 08:00
  • 1
    I mean, I want my trigger(s) to be able to run 5 times without exceeding any limits. So, this means I have to stay within 1/5 of the Salesforce limits i.e. less than 2s of CPU (overall limit is 10), less than 20 SOQL queries, etc. – Aidan Jul 09 '19 at 08:39
  • But doesn't the trigger reset the limits for each chunk? – Robs Jul 09 '19 at 09:20
  • 2
    Nope, that would be nice, but that's not how it works. The chunks all happen in one transaction. See my addition to the answer with code example. It does mean that most SF implementations will struggle with very large updates. But, for things like Data Loader and Batch Apex, you can choose the batch size (not chunk size) and each batch has its own transaction context. – Aidan Jul 09 '19 at 10:53
  • Does this same thing apply for process builder? i.e. a 1000 records are broken into chunks of 200 and are all processed within a single transaction and therefore DML limit? – Robs Jul 09 '19 at 11:01
  • It's probably safest to treat Process Builder as incompatible with anything beyond single-record updates. Early versions of PB run on individual records i.e. updating 15 records would run the PB 15 times. More recently, that has been improved but, depending what's in the PB, it can still fall back to running once per record. As far as I'm aware, there's no documentation on where PB performance falls apart. – Aidan Jul 09 '19 at 11:06
  • My tests show that Process Builder uses chunks of 200 and the DML limits persist between chunks because they are all part of the same transaction. – Robs Jul 09 '19 at 11:56
3

As per the documentation's Implementation Considerations: the maximum chunk size is 200 (for newer API versions).

This mentions the Bulk API, but my understanding (and experience) is that it applies to any "bulk operation" (including a DML operation in Apex).

As also covered in @cropredy's answer, platform event subscribers receive chunks of up to 2,000 records, though this is now configurable.

Phil W
  • Updated the answer accordingly. – Phil W Jul 09 '19 at 08:15
  • Thanks Phil W, I read that in the documentation, but I wasn't 100% certain it was what I needed because it mentions Bulk API and I am not using the Bulk API – Robs Jul 09 '19 at 08:17
2

In addition to the other answers, Platform Event triggers can be presented with up to 2,000 records and are not chunked into 200.

There are ways to work around this with Summer 19, see Platform Events Guide Smaller Batches
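One way that guide describes to process fewer events per invocation is to checkpoint your progress so the remainder is redelivered; a hedged sketch (the event name `My_Event__e`, the handler logic, and the cap of 200 are all made-up illustrations):

```apex
// Hypothetical sketch: handle at most 200 events per trigger invocation
// and let the platform redeliver the rest in a fresh invocation.
trigger MyEventTrigger on My_Event__e (after insert) {
    Integer processed = 0;
    for (My_Event__e event : Trigger.new) {
        if (processed == 200) {
            break; // unprocessed events are redelivered next time
        }
        // ... handle the event ...

        // Record this event as the last one successfully processed;
        // redelivery resumes after this replay ID.
        EventBus.TriggerContext.currentContext().setResumeCheckpoint(event.ReplayId);
        processed++;
    }
}
```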

Update 2023-10-23 (Winter 24)

I discovered an unexpected use case besides Platform Events where the trigger size was > 200 ...

  • If you have a trigger on the Event SObject
  • You create a recurring event that recurs daily without limit
  • There will be two trigger executions in separate transactions
    • Execution 1 - a single record reflecting the initial event (Day 0)
    • Execution 2 - 379 records reflecting the next 379 days. This execution is NOT broken into batches of 200.
cropredy
  • 1
    Link added, hope it helps – cropredy Jul 09 '19 at 11:40
  • Any idea how many records will be passed to a Process Builder subscribing to a Platform Events? is it also a maximum of 2000? – Robs Jul 09 '19 at 11:46
  • Great question, I have never tried this. Easy enough experiment to run. I suspect 2000 – cropredy Jul 09 '19 at 11:49
  • It's tricky to test... getting this error Error returned: LIMIT_EXCEEDED - The number of platform event messages published from an Apex test context exceeded the limit of 500 – Robs Jul 09 '19 at 12:27
  • Before it errored, the log showed it was processing in batches of 400. But this should be how sandbox orgs work, and production orgs could be different. – Robs Jul 09 '19 at 12:36
  • Use 201 recs and sandbox works same as prod. You can test 2000 w anon apex – cropredy Jul 09 '19 at 12:50
0

The same applies within batch jobs: Salesforce does not reset limits between automatic trigger chunks. That is, if your batch job updates 1,000 records of the same object in one batch, the trigger will be fired 5 times, and Salesforce does not reset governor limits between those chunks.
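To illustrate (a sketch, not production code; the class name and the decision to run a bare `update` are assumptions):

```apex
// Hypothetical sketch: with a scope size of 1,000, each execute() call is a
// single transaction, so an Account trigger fires 5 times (chunks of 200)
// against one shared set of governor limits.
public class AccountBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }
    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Fires the Account trigger once per 200-record chunk,
        // all within this one transaction.
        update scope;
    }
    public void finish(Database.BatchableContext bc) {}
}
// Usage: Database.executeBatch(new AccountBatch(), 1000);
```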

sherry peng