Token Flow has always focused on delivering the highest quality data for the Ethereum ecosystem, moving beyond calls- and events-based analytics to decoded state and storage. The goal was, and is, to find the source of truth for data and add the necessary context, thereby creating powerful and insightful analytics.
While decoding state and storage remains the most reliable way to do blockchain analytics, the landscape has changed over the last few years. We’re no longer an ecosystem of a few L2s; we’re an ecosystem of hundreds, and soon thousands, of L2s and L3s.
To deliver on their promise, these rollups should focus on building their use cases and user bases, rather than having to manage data infrastructure.
Analytics team as a service
Chains are the new dApps.
The winning ones will bring valuable new applications to users instead of reinventing the infrastructure wheel.

Ken Deeter
Electric Capital
All players in the ecosystem need visibility to make informed decisions. Established chains benefit from existing infrastructure and communities. For new ones, the barrier to entry is much higher. They not only need to build the right infrastructure to support their data load, they also need to build the analytics on top of it and a platform for their community to do its own analysis.
We can argue that blockchain data is a public good, but the analytics certainly aren’t. Community analytics can be insightful and helpful, but also subjective and incorrect. Internal teams need size and structure to deliver the best results, which means precious focus spent on something that is not growing their use cases and user base.
Token Flow provides emerging teams with fully hosted data pipelines and an analytics team that works with and for them. It delivers more than just a product: it answers questions.
Super fast indexing
Before any analytics can happen, data needs to be indexed and processed. And in the current space, it needs to be indexed fast. Token Flow ingests, decodes, and makes new chains available within days of first contact, so the first analytics are always ready on day 1.
Contextualised datasets
The blockchain ecosystem generates an enormous amount of data every second. Combing through it is extremely difficult unless the data is decoded and then curated. Token Flow data comes with higher-level business tables, such as tokens, transfers, vaults, and contracts, that make analysis easier.
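For a sense of what a curated table buys an analyst, here is a minimal sketch of a business question answered in a few lines of SQL. The table and column names (transfers, token_address, amount, block_timestamp) are illustrative assumptions, not Token Flow’s actual schema.

```sql
-- Illustrative sketch: daily transfer volume for a single token,
-- assuming a curated `transfers` table with these columns.
SELECT
    DATE_TRUNC('day', block_timestamp) AS day,
    COUNT(*)                           AS transfer_count,
    SUM(amount)                        AS total_amount
FROM transfers
WHERE token_address = LOWER('0x...')   -- token of interest (placeholder)
GROUP BY 1
ORDER BY 1;
```

Without the curated table, the same question would require decoding raw logs and normalizing amounts by hand before any aggregation could start.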
Decoded by default
Blockchain data in its raw form is incredibly difficult to work with, and decoding in-query makes an analyst’s work cumbersome. Thanks to our extensive semantics and signature library, Token Flow automatically decodes the contents of the chain from genesis, making it much easier to work with and removing the wait for a specific contract to be decoded before it can be analyzed. Even contracts without verified code or a published ABI can often still be decoded.
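To illustrate the difference, the sketch below contrasts filtering raw logs with querying pre-decoded events. The decoded layout (an events table with event_name and parameter columns) is an assumption for illustration, not a documented schema.

```sql
-- Raw form: filter logs by a 32-byte topic hash, then decode the data field by hand.
-- SELECT * FROM logs
-- WHERE topic0 = keccak256('Transfer(address,address,uint256)');

-- Decoded form (illustrative schema): the event name and parameters
-- are already first-class columns, so no in-query decoding is needed.
SELECT from_address, to_address, value
FROM events
WHERE event_name = 'Transfer'
  AND contract_address = LOWER('0x...');  -- contract of interest (placeholder)
```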
Built for advanced analytics
Token Flow’s primary innovation is decoding state and storage. Storage is a smart contract’s own internal ledger; once it is accessible and queryable, more reliable analysis can be conducted, for example reading token balances by holder at any point in time without needing to aggregate numerous calls and events.
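As a sketch of how storage-based analysis might look, the query below reads one holder’s balance at a given block by taking the latest write to the relevant storage slot at or before that block. The table and column names (storage_diffs, variable_name, key, value, block_number) are assumptions for illustration.

```sql
-- Illustrative sketch: an ERC-20 balance for one holder as of block 18,000,000,
-- read from decoded storage rather than reconstructed from calls and events.
-- Assumes a `storage_diffs`-style table where each row is a storage write
-- already mapped to a named variable and key (here: balanceOf[holder]).
SELECT value AS balance
FROM storage_diffs
WHERE contract_address = LOWER('0x...')   -- token contract (placeholder)
  AND variable_name    = 'balanceOf'      -- decoded storage variable (assumed column)
  AND key              = LOWER('0x...')   -- holder address (placeholder)
  AND block_number    <= 18000000
ORDER BY block_number DESC
LIMIT 1;
```

The point of the sketch is that the balance is read directly from the contract’s ledger at a chosen height, rather than replayed from every transfer that ever touched the address.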
Available in Studio
Just as chains require an ecosystem of apps to fully show their potential, Token Flow’s data requires a visual outlet. Token Flow Studio, or Studio for short, is a new SQL (Structured Query Language) editor service, offering traditional and Web3-native teams and analysts simple and direct data analysis for all on-chain activity and smart contracts. Through Studio, developers will have a service at hand to explore blockchain data, create visualizations, query via standard SQL, and publish dashboards.
Studio supports all of Token Flow’s best features: fully decoded chain data, state and storage, multi-chain view and advanced analytical dashboards.
It currently includes data from the Ethereum, Optimism, Starknet, Mode, Zora, Mint, Base, Blast, Linea, and Frame ecosystems, with tables such as storage_diffs, events, calls, transactions, blocks, and more.
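As a taste of what a Studio query could look like, the sketch below joins the transactions and blocks tables listed above to chart daily transaction counts. The join keys and column names (hash, block_number, timestamp) are assumptions for illustration.

```sql
-- Illustrative Studio-style query: daily transaction counts for one chain,
-- joining `transactions` to `blocks` on block number.
SELECT
    DATE_TRUNC('day', b.timestamp) AS day,
    COUNT(t.hash)                  AS tx_count
FROM transactions t
JOIN blocks b
  ON t.block_number = b.block_number
GROUP BY 1
ORDER BY 1;
```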
Integrations with the Arbitrum, Polygon, and zkSync chains are also underway, as are integrations with non-EVM chains and DA layers.
Looking for an analytics partner?
Talk to us about decoded data, visibility and community analytics
WE’RE HERE TO HELP