Advanced Block Explorer

Allows users to time-travel and evaluate custom EVM code at any arbitrary context of a transaction.

The problem Advanced Block Explorer solves

There are plenty of existing blockchain explorers out there, so what makes Advanced Block Explorer different? Powered by Parsiq, it not only lets you browse historical data, including transactions and blocks, but also execute arbitrary EVM code at any point in any historical transaction. Among other things, this can be used to execute code in the context of failed transactions, which is very useful for debugging them.
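To make the "time travel" idea concrete, here is a minimal sketch of the standard building block: a JSON-RPC eth_call pinned to a historical block number. Plain Ethereum nodes support this at block granularity; evaluating code at an arbitrary point inside a transaction, as the explorer does via Parsiq, requires richer trace data, so the contract address and calldata below are purely illustrative.

```python
import json

def eth_call_payload(to, data, block, request_id=1):
    """Build a JSON-RPC payload for eth_call pinned to a historical block.

    Standard nodes accept a block number as the second param, which gives
    block-granularity time travel; mid-transaction evaluation needs trace
    data on top of this. The addresses/selectors used below are examples.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "eth_call",
        "params": [{"to": to, "data": data}, hex(block)],
    }

payload = eth_call_payload(
    to="0x6b175474e89094c44da98b954eedeac495271d0f",  # example contract address
    data="0x18160ddd",  # selector of totalSupply()
    block=12_000_000,
)
print(json.dumps(payload))
```

The block argument is what turns an ordinary read-only call into a historical one: the node evaluates the call against the state as of that block.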

Future work

Beyond the MVP, we are planning to add a Solidity code compiler, so users no longer have to write bytecode by hand.

We will also implement smart contract deployment/redeployment/destruction history, which is another feature that the Parsiq API makes possible. It will also be possible to upload and validate a contract's ABI, which will allow decoding calls involving that contract in the transaction trace.
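To illustrate what an uploaded ABI enables, here is a hypothetical decoder for one well-known case: an ERC-20 transfer(address,uint256) call. Per the ABI encoding rules, the first 4 bytes of calldata are the function selector (the first 4 bytes of the keccak256 of the signature, hardcoded here for illustration) and each argument occupies a 32-byte word.

```python
def decode_transfer_call(calldata_hex):
    """Decode an ERC-20 transfer(address,uint256) call from raw calldata.

    Illustrative helper only: with a full ABI uploaded, the same word-wise
    decoding generalizes to any function in the contract. 0xa9059cbb is
    the well-known selector of transfer(address,uint256).
    """
    raw = bytes.fromhex(calldata_hex.removeprefix("0x"))
    selector, args = raw[:4], raw[4:]
    if selector != bytes.fromhex("a9059cbb"):
        raise ValueError("not a transfer(address,uint256) call")
    # An address is the last 20 bytes of its left-padded 32-byte word.
    recipient = "0x" + args[0:32][-20:].hex()
    # A uint256 is one big-endian 32-byte word.
    amount = int.from_bytes(args[32:64], "big")
    return recipient, amount
```

In the explorer, this kind of decoding would turn opaque hex blobs in the trace into readable function names and arguments.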

Links

Live application
Transaction with many nested operations
Source code
Video demo

Challenges we ran into

This challenge was chosen since both team members are experienced in building web applications and were keen on learning more about Ethereum. We faced a number of challenges.

Domain knowledge

As both team members are quite new to the world of Ethereum, understanding how it works and what the data means took some time. We relied on online resources, as well as consultations with domain experts.

Understanding the API

Parsiq's API is still new and not yet documented, and it took us a while to understand how to use the data it provides. In particular, some edge cases had to be figured out, for example: determining the correct value to show as the initial gas, building a tree of transaction items, and figuring out which operation to display for each entry in the transaction list.
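The tree-building step above can be sketched as follows. We assume the trace arrives as a flat, pre-order list of items annotated with a call depth (the field names here are our assumption, not the documented Parsiq shape):

```python
def build_call_tree(items):
    """Nest a flat, depth-annotated trace into a call tree.

    `items` is assumed to be a pre-order list of dicts with a "depth" key
    (0 for top-level calls). A stack tracks the current ancestor chain:
    each item is attached to the nearest earlier item of smaller depth.
    """
    root = {"depth": -1, "children": []}
    stack = [root]
    for item in items:
        node = {**item, "children": []}
        # Pop until the top of the stack is a proper ancestor.
        while stack[-1]["depth"] >= node["depth"]:
            stack.pop()
        stack[-1]["children"].append(node)
        stack.append(node)
    return root["children"]
```

A single pass with a stack keeps this linear in the number of trace items, which matters given how large the traces can get.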

Dealing with the size of the data

API response sizes are very large. To avoid overloading the API, we introduced a Redis cache layer that caches all API responses, except for fetching the most recent blocks. Even with the data cached, retrieving many large responses from Redis was still not very efficient, so we cache the results of our own calculations as well.
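The caching strategy described above is a classic cache-aside pattern. A minimal sketch, with an in-memory dict standing in for the Redis client (the endpoint names and the "skip" list are assumptions for illustration):

```python
import hashlib
import json

class ResponseCache:
    """Cache-aside layer for API responses.

    A plain dict stands in for Redis here; a real Redis client exposes
    the same get/set shape. Endpoints listed in `skip` (e.g. the most
    recent blocks) are always fetched fresh so they never go stale.
    """
    def __init__(self, fetch, skip=("latest_blocks",)):
        self.fetch = fetch      # function: endpoint -> response (a dict)
        self.skip = set(skip)   # endpoints that must stay uncached
        self.store = {}         # stands in for Redis key/value storage

    def get(self, endpoint):
        if endpoint in self.skip:
            return self.fetch(endpoint)
        key = hashlib.sha256(endpoint.encode()).hexdigest()
        if key not in self.store:
            # Cache miss: hit the upstream API once, store serialized.
            self.store[key] = json.dumps(self.fetch(endpoint))
        return json.loads(self.store[key])
```

The same wrapper works for caching derived calculations: point `fetch` at the expensive computation instead of the HTTP call, so large raw responses only need to be deserialized once.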
