
This question is a follow-up to another question, where the accepted answer states that for each Update Records action in a process, the following limits are consumed:

"As you can therefore conclude, it is 1 query and 1 DML statement per chunked transaction."

I ran some benchmark tests with Process Builder in a brand-new dev org that has no other automations. The process itself is super simple, firing on every Account insert/update and updating 1 field:

[screenshot: process definition]

I then inserted a list of 200 accounts:

List<Account> accounts = new List<Account>();
for(Integer i = 0; i < 200; i++){
    Account a = new Account(Name='Account '+i);
    accounts.add(a);
}
insert accounts;

The logs show the following:

[screenshot: debug log limit usage]

1 DML statement and 200 DML rows are used by the initial insert; the rest is caused by the process. 1 SOQL query and 200 SOQL rows are in line with what's expected. However, that leaves 2 DML statements and 200 DML rows unaccounted for.

The execution log shows the following:

[screenshot: execution log]

At the very end, a DML statement is consumed with no additional rows.

I then ran the same test, but inserting 199 accounts. Now the process consumes 1 DML statement and 199 DML rows. Testing with 201 accounts, I get 3 DML statements and 201 DML rows. I get similar results around the 400 - 600 - ... thresholds.
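
For reference, a minimal Execute Anonymous sketch of these variant tests (the variable name recordCount is mine; only its value changes between runs, and the Limits calls report what the whole transaction, insert plus process, has consumed):

Integer recordCount = 199; // also run with 200 and 201
List<Account> accounts = new List<Account>();
for (Integer i = 0; i < recordCount; i++) {
    accounts.add(new Account(Name = 'Account ' + i));
}
insert accounts; // the process fires within this same transaction

// Limits now include both the insert and the process's consumption
System.debug(Limits.getDmlStatements() + ' DML statements, ' + Limits.getDmlRows() + ' DML rows');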

To me, it looks like upon reaching the 200-record mark, Salesforce already enqueues the next chunk of records, which consumes a DML statement, without checking whether there really are more records to process. If there are, that costs one more DML statement. Can anyone shed some light on this?

Robin De Bondt
  • Does your PB run only when a record is created, or on both create and update? I would assume the DML count of 3 to be correct here, as the first would have been from the trigger and the other two from the process if it runs in the created-and-updated scenario. – Jayant Das Feb 22 '19 at 16:00
  • @JayantDas That does not have an impact; I get the same results if I set the process to fire on create only. Also, I would have gotten the same number of DML statements with 199 and 200 records if that were the case. – Robin De Bondt Feb 25 '19 at 07:36
  • Not sure; could it be savepoints added by Process Builder for each chunk in order to support partial DML? – Pranay Jaiswal Feb 25 '19 at 18:10
  • Just a thought - the DML insert happening in Apex code is triggering the Process builder in the same transaction. Process builder "doesn't know" of the existing accounts (rather, that data is not passed directly into the process builder), and so Process Builder has to pull up the account data again for itself. Hence double the DML Rows. – Brian Miller Feb 25 '19 at 18:14
  • @pranayJaiswal Savepoints are an interesting thought! Might be related to this: https://releasenotes.docs.salesforce.com/en-us/summer18/release-notes/rn_forcecom_flow_run_partialsave.htm However, that critical update is not enabled in my org... – Robin De Bondt Feb 26 '19 at 07:40
  • @BrianMiller that should result in an additional SOQL query, not DML? Also, why would there then be a difference between 199 and 200 records? – Robin De Bondt Feb 26 '19 at 07:46
  • @RobinDeBondt Ah, I think I have a better handle on what's happening. It could be that Process Builder actually reuses the original SOQL, but there are two different 200-row DML actions happening: the first 200 from your insert accounts; line of code, and then another 200 DML rows when Process Builder runs, because technically 200 records are being updated at the PB level. Any time a database record is inserted, updated, or deleted, a DML row is counted. – Brian Miller Feb 26 '19 at 14:18
  • @BrianMiller I don't think you read the question correctly. The first DML is my insert, the second is the update by the PB. Those are clear and in line with the documentation. Where is the third one coming from, which consumes 0 rows? – Robin De Bondt Feb 27 '19 at 16:15
  • @RobinDeBondt Got it, really strange. Not sure what it could be – Brian Miller Feb 28 '19 at 15:07

1 Answer


Your assumption is correct. Salesforce works with batches of 200 records. There are a few other situations you should pay attention to:

  1. If you push 201 records, it will issue two DML statements.
  2. If you push, for example, a List<sObject> containing multiple kinds of objects (like Accounts and Contacts) at the same time, a new batch starts each time a different object type is reached in the list. So remember to sort the list and take that into account when counting limits. For example, a list of 100 records with Accounts and Contacts alternating (List: Acc1, Con1, Acc2, Con2, ...) will hit limits as well, because each change of sObject type starts a new batch (see the sketch after this list).
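
To illustrate point 2, here is a minimal, self-contained sketch of grouping a mixed list by sObject type before the DML, so that each type forms a single chunk (the record values are made up for the example):

List<SObject> mixed = new List<SObject>{
    new Account(Name = 'Acc1'), new Contact(LastName = 'Con1'),
    new Account(Name = 'Acc2'), new Contact(LastName = 'Con2')
};

// Group the records by their sObject type
Map<Schema.SObjectType, List<SObject>> byType = new Map<Schema.SObjectType, List<SObject>>();
for (SObject record : mixed) {
    Schema.SObjectType t = record.getSObjectType();
    if (!byType.containsKey(t)) {
        byType.put(t, new List<SObject>());
    }
    byType.get(t).add(record);
}

// Rebuild the list so records of the same type sit next to each other
List<SObject> sorted = new List<SObject>();
for (List<SObject> records : byType.values()) {
    sorted.addAll(records);
}
insert sorted; // Acc1, Acc2, Con1, Con2 -> 2 chunks instead of 4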

https://help.salesforce.com/articleView?id=process_limits.htm&type=5 <- here we can read that each Update Records action in a process takes 1 DML statement and 1 SOQL query.

Sorry for missing a good point in my previous comment. To follow it step by step:

  1. You insert all 200 accounts (DML rows should be 200 and DML statements 1).
  2. The batch then triggers your process for each of them and updates them (DML rows should be 400 and DML statements 2).
  3. My guess is that the process triggers again on its own update. It performs the DML, but since there are no changes, it just uses the DML statement and that is all. Could you please add a condition so the process will not trigger if the field you update with it has already been changed?
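
Process Builder criteria are configured declaratively rather than in code, but the idea behind that last suggestion is the classic recursion guard. A rough Apex analogue, with hypothetical class and variable names:

public class ProcessGuard {
    // Flipped to true once the update has run, so a re-fire can be skipped.
    // Statics reset per transaction, which matches the guard's intended scope.
    public static Boolean alreadyRan = false;
}

// Inside the automation that performs the field update:
if (!ProcessGuard.alreadyRan) {
    ProcessGuard.alreadyRan = true;
    // perform the field update here
}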

Marcin Trofiniak