I am trying to figure out the best way of benchmarking Solidity contracts. As with conventional benchmarking, I presume one would basically want to measure two things, computational effort and storage overhead:
- Gas cost can be used as an easy indicator of computational complexity. Mist and Mix both estimate this before sending a transaction, though the estimate usually differs slightly from the real cost when run (correct?). So you would want to measure both: the estimate can presumably be obtained by importing one of the core platform libraries, and the real cost can be tracked via the contract globals (e.g. `msg.gas`).
- Storage overhead would depend on a few things: the compiled bytecode size and the size of the contract's storage/memory space. The former is again estimated by the tools; the latter I am not sure how to quantify without reading through the details of how the EVM allocates storage and writing some low-level tool to parse the AST or source and compute the details.
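On the first bullet, part of the estimate-vs-actual gap is easy to pin down: every transaction pays a fixed intrinsic cost before any EVM code runs. A minimal sketch of that floor, using the Yellow Paper constants in force at the time of writing (21000 base, 4 gas per zero calldata byte, 68 per non-zero byte; the helper name is my own):

```python
# Intrinsic (base) transaction gas per the Ethereum Yellow Paper.
# Constants are the pre-EIP-2028 values current in 2016.
G_TRANSACTION = 21000   # paid by every transaction
G_TXDATA_ZERO = 4       # per zero byte of calldata
G_TXDATA_NONZERO = 68   # per non-zero byte of calldata

def intrinsic_gas(calldata: bytes) -> int:
    """Gas charged before a single opcode executes."""
    zeros = calldata.count(0)
    nonzeros = len(calldata) - zeros
    return G_TRANSACTION + zeros * G_TXDATA_ZERO + nonzeros * G_TXDATA_NONZERO

# A bare 4-byte function selector (all non-zero bytes):
print(intrinsic_gas(bytes.fromhex("cfae3217")))  # 21000 + 4*68 = 21272
```

Anything `eth_estimateGas` reports beyond this floor is execution cost, which is where estimate and reality can diverge (state-dependent branches, refunds, etc.).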
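On the second bullet, the storage footprint of the state variables at least is mechanical: Solidity lays them out in declaration order, packing adjacent variables into shared 32-byte slots when they fit. A sketch of that packing rule (variable sizes in bytes, e.g. `uint128` = 16, `bool` = 1, `address` = 20; dynamic types like mappings need separate handling and are out of scope here):

```python
def storage_slots(var_sizes):
    """Count 32-byte storage slots used by value-type state variables,
    packing a variable into the current slot when it fits and starting
    a new slot otherwise (Solidity's declaration-order layout rule)."""
    slots = 0
    offset = 32  # force a new slot for the first variable
    for size in var_sizes:
        if size > 32:
            raise ValueError("value types are at most 32 bytes")
        if offset + size > 32:  # doesn't fit: open a new slot
            slots += 1
            offset = 0
        offset += size
    return slots

# uint128, uint128, uint256, bool, bytes31 -> 3 slots:
# the two uint128s share slot 0, uint256 fills slot 1,
# bool and bytes31 pack into slot 2.
print(storage_slots([16, 16, 32, 1, 31]))  # 3
```

So a rough static storage estimate only needs the declared types from the AST, not a full EVM memory model. (Runtime memory is a separate, transient cost, charged via gas.)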
Look forward to seeing if anyone has any pointers on this (:
The `debug.traceTransaction` API - see http://ethereum.stackexchange.com/questions/4282/how-to-check-the-vm-trace-using-geth/4289#4289. – BokkyPooBah Jun 13 '16 at 09:20
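Building on that comment: geth's `debug.traceTransaction` returns a per-opcode log (`structLogs`, each entry with an `op` and `gasCost` field), which you can aggregate into a gas profile. A sketch, using a hypothetical trimmed trace in place of a real RPC response:

```python
from collections import Counter

# Hypothetical, heavily trimmed stand-in for a real
# debug.traceTransaction result (structLogs shape only).
trace = {
    "structLogs": [
        {"op": "PUSH1", "gasCost": 3},
        {"op": "SSTORE", "gasCost": 20000},
        {"op": "PUSH1", "gasCost": 3},
    ]
}

def gas_by_opcode(trace):
    """Sum the reported gasCost per opcode across the trace."""
    totals = Counter()
    for step in trace["structLogs"]:
        totals[step["op"]] += step["gasCost"]
    return totals

print(gas_by_opcode(trace))  # SSTORE dominates, as expected
```

This gives a benchmark-style breakdown of where a transaction's gas actually went, rather than just the single `gasUsed` total from the receipt.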