
I understand the rationale for EIP-170's MAX_CODE_SIZE as stated here https://github.com/ethereum/EIPs/blame/master/EIPS/eip-170.md#L25 and here https://github.com/ethereum/EIPs/issues/170#issuecomment-259933671

What I want to know is how MAX_CODE_SIZE of 24,576 bytes (0x6000) was derived: by theoretical prediction only, or is it backed by a benchmark somewhere?

1 Answer


It is explained in the initial comment of the second link.

The solution is to put a hard cap on the size of an object that can be saved to the blockchain, and do so non-disruptively by setting the cap at a value slightly higher than what is feasible with current gas limits (a pathological worst-case contract can be created with ~23200 bytes using 4.7 million gas, and a normally created contract can go up to ~18 kb).

At the time of the proposal the maximum possible contract length was about 23,200 bytes, so the 24,576-byte (0x6000) cap sits slightly above what was already attainable and therefore did not disrupt existing usage.
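As a rough sanity check, those figures are consistent with the Yellow Paper's code-deposit charge of 200 gas per byte of deployed code: at the ~4.7 million block gas limit of the time, the deposit charge alone bounds deployable code at about 23,500 bytes, and init-code execution plus the transaction's intrinsic gas bring the practical ceiling down to roughly the ~23,200 bytes quoted above. A minimal arithmetic sketch, assuming those constants:

```python
# Back-of-the-envelope check of the ~23,200-byte worst case quoted above.
# Assumes the Yellow Paper's G_codedeposit = 200 gas per byte of deployed code
# and the ~4.7M block gas limit mentioned in the EIP-170 discussion.
CODE_DEPOSIT_GAS_PER_BYTE = 200
BLOCK_GAS_LIMIT = 4_700_000
MAX_CODE_SIZE = 0x6000  # 24,576 bytes, the EIP-170 cap

# Upper bound from the deposit charge alone; init-code execution and the
# transaction's intrinsic gas eat into this, hence the lower ~23,200 figure.
deposit_bound = BLOCK_GAS_LIMIT // CODE_DEPOSIT_GAS_PER_BYTE
print(deposit_bound)                   # 23500
print(MAX_CODE_SIZE)                   # 24576 -> cap sits slightly above it
print(deposit_bound < MAX_CODE_SIZE)   # True
```

EIP-170 itself then simply enforces this cap at contract-creation time: if the code returned by the init code is longer than MAX_CODE_SIZE, the creation fails with an out-of-gas error.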

Ismael
  • Thank you for your answer. It made me realize that my question still lacks context. I want to know how "disruptive" it is. Assuming disruptive means disruptive to load time, how does load time grow in proportion to code size? How did they know where the sweet spot in the code-size vs. load-time trade-off is if they never measured it? What does disruptive mean to begin with? – Kyrielight Feb 16 '22 at 17:17
  • At the time the goal was to protect against a hypothetical vulnerability. There don't seem to be plans to increase the code size limit. The delegatecall opcode makes increasing it not that important, since logic can be split across several contracts. An increase there would also impact the blockchain size, which is already quite large today. – Ismael Feb 16 '22 at 21:35
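For readers who want to see where a given contract stands relative to the cap, here is a small sketch (assuming a recent web3.py; the RPC endpoint and contract address are placeholders to substitute) that fetches a contract's deployed bytecode and compares its length to MAX_CODE_SIZE:

```python
from web3 import Web3

MAX_CODE_SIZE = 0x6000  # 24,576 bytes, the EIP-170 cap

# Placeholders: point these at your own node and at a real contract address.
w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))
contract_address = "0x0000000000000000000000000000000000000000"  # replace me

# Runtime (deployed) bytecode as stored on-chain; empty for non-contract accounts.
code = w3.eth.get_code(contract_address)
print(f"deployed code size: {len(code)} bytes")
print(f"within EIP-170 cap: {len(code) <= MAX_CODE_SIZE}")
```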