
Several bugs in debug_traceTransaction #3214

Open

robmcl4 opened this issue Jun 11, 2022 · 5 comments

@robmcl4
Contributor

robmcl4 commented Jun 11, 2022

There are several bugs causing strange behavior of debug_traceTransaction.

I'll briefly enumerate the issues below, then dig into the root cause. Here is a gist with a script that demonstrates the issues: https://gist.github.com/robmcl4/164d02de04c8708817c5c32a9c325843

Summary

  1. The gas value returned does not include the gas refund, so it overstates the true amount of gas spent. (As an aside, I would like structLogs to also contain a refund property with the current gas refund -- this is the behavior of the current geth release, 1.10.18.)
  2. In a scenario where a contract calls itself recursively, SSTORE operations in the deeper call don't update the storage property of the structLogs when the outer call resumes (after the inner call completes). This incorrectly leads the web3 user to believe the outer call sees stale storage.
  3. In a scenario where one calls debug_traceTransaction on a transaction that occurred before the fork block, and a contract attempts to set a storage slot to 0x0 when its value prior to the transaction was nonzero (i.e., the slot is being deleted), the delete is not persisted and the EVM continues to execute, performing stale reads on that slot.

Deeper discussion

Issue (1)

This one is simple. At blockchain.ts:1366, runTx returns a RunTxResult; the actual gas consumed is found in its gasUsed property.
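To illustrate the intent, here is a minimal sketch. The RunTxResultLike interface is my assumption, not Ganache's or @ethereumjs/vm's actual types (the real RunTxResult shape has varied across versions, e.g. BN vs. bigint):

```ts
// Minimal sketch of the fix (assumed shape, not the real @ethereumjs/vm
// types). The point: report the refund-adjusted gasUsed rather than the
// raw pre-refund total.
interface RunTxResultLike {
  gasUsed: bigint;    // total gas spent, net of the refund
  gasRefund?: bigint; // accumulated refund counter (useful for a structLogs `refund` property)
}

function reportedGas(result: RunTxResultLike): bigint {
  return result.gasUsed;
}
```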

Issue (2)

This happens because at blockchain.ts:1287, only storageStack[eventDepth] is modified. However, stale copies of the storage may exist at other depths, in frames further up the call stack. This can be fixed by iterating over storageStack and updating the storage wherever the same contract appears elsewhere in the stack, as in the sketch below.
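This sketch is illustrative only; the element type of storageStack (address -> slot -> value) is my assumption, not Ganache's actual shape:

```ts
// Illustrative sketch, not Ganache's actual types: each storageStack entry is
// assumed to map a contract address to its traced storage (slot -> value).
type StorageSnapshot = Map<string, Map<string, string>>;

function applySstore(
  storageStack: StorageSnapshot[],
  address: string,
  slot: string,
  value: string
) {
  // Update *every* frame that holds a copy of this contract's storage, not
  // just storageStack[eventDepth]; otherwise the outer frame of a recursive
  // self-call resumes with a stale snapshot.
  for (const frame of storageStack) {
    const storage = frame.get(address);
    if (storage !== undefined) {
      storage.set(slot, value);
    }
  }
}
```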

Issue (3)

The core issue is the conditional statements at trie.ts:167 and trie.ts:275.

Consider tracing a pre-fork transaction in which a contract attempts to delete storage (set it to 0x0). Eventually, we call trie.del(..). The conditional on line 167 evaluates to false, and the else block on line 177 is taken. Critically, this means the storage is never marked as deleted. Subsequent reads then notice that the slot is not available locally, query the fallback, and the EVM observes a stale read.
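Reduced to a standalone sketch (hypothetical names throughout, not Ganache's actual trie API), the shape of the problem and fix looks like this:

```ts
// Hypothetical sketch (not Ganache's trie API): a forked store must record an
// explicit tombstone on delete, or later reads fall through to the fallback
// (the archive node) and observe the pre-transaction value.
class ForkedStorage {
  // null acts as a tombstone marking a locally-deleted slot
  private local = new Map<string, string | null>();

  constructor(private fallbackGet: (slot: string) => Promise<string | null>) {}

  async get(slot: string): Promise<string | null> {
    if (this.local.has(slot)) {
      return this.local.get(slot)!; // tombstones short-circuit here
    }
    // Without the tombstone, this is where the stale read happens.
    return this.fallbackGet(slot);
  }

  del(slot: string): void {
    // The buggy path effectively skips this write when the key exists only
    // in the fallback; the fix is to always record the deletion locally.
    this.local.set(slot, null);
  }
}
```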

Demonstration

I run this on Ubuntu 20.04, Python 3.8.10, Node 16.15.1, npm 8.11.0, and geth 1.10.18. I have geth running as a full archive node, listening on ws://127.0.0.1:8546. Before running, I install web3: python3 -m pip install web3==5.29.2

I invoke ganache using:

node \
    src/packages/ganache/dist/node/cli.js \
        --fork.url ws://localhost:8546 \
        --server.port 34451 \
        --chain.chainId 1 \
        --chain.hardfork arrowGlacier

Then, I run the linked script:

-> % python3 bug.py          
querying archive node
done querying archive node, querying ganache
done querying ganache
Gas mismatch!
653656
819869

!!!!!!!!!!!!!!!!!!!!!! BUG !!!!!!!!!!!!!!!!!!!!!!!!!
expected storage 0xb47e3cd837dDF8e4c57F05d70Ab865de6e193BBB slot b5f5b1f5c4654b9ad9ba55eab198a3e6fc295c570faf1562b80df25b9119f014 to be 0000000000000000000000000000000000000000000000000000000000000000 but got 0000000000000000000000000000000000000000000000000000000000000001
6017 SWAP1 slot None
6018 SHA3 slot None
6019 SWAP9 slot None
6020 MLOAD slot None
6021 DUP10 slot None
6022* SLOAD slot 0000000000000000000000000000000000000000000000000000000000000001
6023 PUSH1 slot 0000000000000000000000000000000000000000000000000000000000000001
6025 NOT slot 0000000000000000000000000000000000000000000000000000000000000001
6026 AND slot 0000000000000000000000000000000000000000000000000000000000000001
6027 SWAP1 slot 0000000000000000000000000000000000000000000000000000000000000001

The script traces transaction 0xf514259f185019704004b241cb4c6e31ed5e6e474a998b99c4b89209e304f417, both on geth and on ganache. This immediately demonstrates bugs (1) (mismatched gas) and (3) (a stale read of 0x1 after slot ...f014 was deleted). Bug (2) appears after issue (3) is addressed. It's also important to point out that (1) cannot be addressed without (3): the gas usage of SSTORE is computed incorrectly because of the stale reads.
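For reference, a rough TypeScript equivalent of the gist's gas comparison (the gist itself is Python/web3.py; the endpoints below are assumptions -- geth serving JSON-RPC over HTTP on 8545, and ganache on 34451 as invoked above):

```ts
// Rough equivalent of the gist's gas check. Assumes geth serves JSON-RPC over
// HTTP on 8545 and ganache on 34451 (as started by the command above).
const TXN = "0xf514259f185019704004b241cb4c6e31ed5e6e474a998b99c4b89209e304f417";

async function traceGas(url: string): Promise<number> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "debug_traceTransaction",
      params: [TXN],
    }),
  });
  const { result } = await res.json();
  return result.gas; // geth-style trace result: { gas, failed, returnValue, structLogs }
}

async function main() {
  const [geth, ganache] = await Promise.all([
    traceGas("http://127.0.0.1:8545"),
    traceGas("http://127.0.0.1:34451"),
  ]);
  if (geth !== ganache) {
    console.log("Gas mismatch!", geth, ganache);
  }
}

main().catch(console.error);
```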

If you need a json dump of geth's trace, I can supply that as well.

Work-Arounds

My current fixes are on my fork's branch robmcl4/severalBugFixes.

With the patches applied, the script outputs:

-> % python3 bug.py                                                                                                                               
querying archive node
done querying archive node, querying ganache
done querying ganache
no bug
@robmcl4
Contributor Author

robmcl4 commented Jun 13, 2022

It's probably important to note that the work-around for (2) is not a complete fix: if the deeper call reverts, then the outer call's state must not be modified.
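One way to handle that, sketched under my own assumptions (continuing the illustrative StorageSnapshot type from the issue (2) sketch above, not Ganache's actual structure): snapshot the caller's view when a call begins, and restore it if the call reverts.

```ts
// Sketch of revert-safe propagation (hypothetical structure).
type StorageSnapshot = Map<string, Map<string, string>>;

const snapshots: StorageSnapshot[] = [];

function deepCopy(frame: StorageSnapshot): StorageSnapshot {
  return new Map(
    [...frame].map(
      ([addr, storage]) => [addr, new Map(storage)] as [string, Map<string, string>]
    )
  );
}

// On CALL: remember what the caller's frame looked like.
function onCallEnter(callerFrame: StorageSnapshot): void {
  snapshots.push(deepCopy(callerFrame));
}

// On return: if the inner call reverted, undo any writes that were
// propagated into the caller's frame while it ran.
function onCallExit(callerFrame: StorageSnapshot, reverted: boolean): void {
  const saved = snapshots.pop()!;
  if (reverted) {
    callerFrame.clear();
    for (const [addr, storage] of saved) {
      callerFrame.set(addr, storage);
    }
  }
}
```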

@antazoey

antazoey commented Dec 9, 2022

I am seeing an issue where a top-level CALL will show as reverted when it shouldn't (comparing to Hardhat's and geth's structLogs output for the same txn). I wonder if bug (2) above describes the same issue.

@antazoey

antazoey commented Dec 9, 2022

Another issue I have: the trace cuts off. It suddenly ends in the middle, whereas I expect a lot more to come after it.

@davidmurdoch
Member

@unparalleled-js can you provide a reproduction?

@antazoey

antazoey commented Dec 9, 2022

> @unparalleled-js can you provide a reproduction?

I am using mainnet-fork and historical transaction 0xb7d7f1d5ce7743e821d3026647df486f517946ef1342a1ae93c96e4a8016eab7

When you compare the struct logs with Hardhat's, the results differ in a few ways: there are more than twice as many struct logs in the Hardhat response as in Ganache's, and the top-level log looks quite different as well.
