dobby
One Dashboard. Every Tool. Zero Subscriptions.
Created on 17th January 2026
The problem dobby solves
The Crypto Experience is Broken
Let me tell you a story. It's 3 AM. You're watching Solana pump. You've got 15 browser tabs open: CoinGecko for prices, DeFiLlama for TVL, Birdeye for new tokens, Twitter for alpha, GoPlus for security scans, and a Discord where everyone's screaming about the next 100x. You're drowning in noise, desperately trying to connect dots that exist across a dozen fragmented interfaces.
This is insane.
Every single day, billions of dollars move through crypto markets. Yet the tools we use look like they were designed in 2015. Static dashboards. Manual refreshes. Copy-pasting contract addresses between 10 different scanners. Alert fatigue from bots that cry wolf. And when opportunity strikes? You're too slow. The information was there, scattered across the internet like pieces of a puzzle no human could assemble in time.
The average crypto trader toggles between 12+ applications daily. Not because they want to. Because they have to. The ecosystem is fragmented by design. Each protocol, each chain, each data provider operates in its own silo. There's no unified view. No intelligent layer connecting the dots.
Until now.
Dobby: Your Intelligent Crypto Operating System
We didn't build another dashboard. Dashboards are dead.
We built Dobby: an agentic operating system that understands crypto the way you do. It doesn't just display data. It thinks about data. It connects whale movements to funding rates. It correlates token unlocks with price action. It spots the patterns you'd miss because you're human and you need sleep.
Here's what makes Dobby different:
Traditional dashboards show you information. Dobby gives you understanding. When you link our AI agent to your market widgets, it doesn't just read numbers—it synthesizes them. Ask it: "Should I be worried about this token?" It will check the GoPlus security scan, cross-reference DeFi hack patterns from DeFiLlama, analyze the liquidity depth, and give you a verdict in seconds. What would take you 20 minutes across 8 tabs takes Dobby three seconds.
We've integrated 40+ real-time data sources: from Solana TPS to Bitcoin mempool stats, from NFT floor sweeps to perpetual funding rates. All streaming live. All connected to an AI brain powered by OnDemand's parallel processing architecture.
And for the first time, we've made crypto analysis multimodal. Drag a screenshot of any chart into Dobby's Vision Agent. It will analyze the patterns, identify support levels, and tell you what it sees. Because sometimes the alpha is in an image someone posted, and until now, no AI could read it.
The Future We're Building
We believe the next generation of financial tools won't be dashboards—they'll be intelligent partners. Tools that work with you, not just for you. Tools that learn your risk tolerance, understand your portfolio, and proactively surface the opportunities and threats that matter.
Dobby is that future.
We're not just solving the tab problem. We're solving the intelligence problem. The crypto market moves at the speed of light. Human attention is finite. The only way to win is to have an AI co-pilot that never sleeps, never misses a beat, and always has your back.
Fifteen tabs? That's yesterday.
Welcome to Dobby.
The tools we build shape the way we think. Build smarter tools, and you'll think smarter.
Challenges we ran into
Building Dobby was technically ambitious and came with significant challenges. The first major hurdle was managing and orchestrating multiple AI agents on OnDemand. Creating six distinct agent personas (The Auditor, Yield Hunter, Strategist, Sniper, Curator, and Visionary) required careful prompt engineering to ensure each agent had the right context and tools registered. I had to design a system where each agent could access specific widgets while maintaining consistent behavior across different user queries. The OnDemand Actions API was powerful but required extensive testing to get the plugin registration working correctly, so that the AI could dynamically invoke the right tools based on natural language input.
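The exact OnDemand registration format is specific to that platform, but the underlying pattern is a per-agent tool registry that dispatches the model's tool calls by name. A minimal sketch in TypeScript, with all names and signatures illustrative rather than the real API:

```typescript
// Per-agent tool registry sketch (names are illustrative, not the actual
// OnDemand Actions API). Each persona owns its own tool map so the Auditor
// can't accidentally invoke the Sniper's tools.
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

interface AgentPersona {
  name: string;
  systemPrompt: string;
  tools: Map<string, ToolHandler>;
}

function createAgent(name: string, systemPrompt: string): AgentPersona {
  return { name, systemPrompt, tools: new Map() };
}

function registerTool(
  agent: AgentPersona,
  toolName: string,
  handler: ToolHandler,
): void {
  agent.tools.set(toolName, handler);
}

// Dispatch a tool call the model requested by name; unknown tools fail
// loudly so prompt/registration mismatches surface during testing.
async function invokeTool(
  agent: AgentPersona,
  toolName: string,
  args: Record<string, unknown>,
): Promise<unknown> {
  const handler = agent.tools.get(toolName);
  if (!handler) {
    throw new Error(`Agent "${agent.name}" has no tool "${toolName}"`);
  }
  return handler(args);
}
```

Failing loudly on an unregistered tool name is deliberate: it turns a silent prompt-engineering mismatch into an error you catch during testing.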
API integration was another constant challenge. Many of the free-tier APIs I wanted to use had inconsistent documentation or unexpected rate limits. For example, some endpoints would work in testing but fail in production due to CORS issues or authentication problems. The GoPlus security API occasionally returned malformed responses that broke my parsing logic, requiring robust error handling. DeFiLlama's API structure changed mid-development, forcing me to refactor several components. I spent considerable time building fallback mechanisms and caching strategies with Redis to handle API failures gracefully without breaking the user experience.
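The fallback-plus-cache pattern boils down to: serve a fresh cached value when you have one, hit the live API otherwise, and fall back to the last good value when the API misbehaves. A simplified sketch with an in-memory Map standing in for Redis (keys, TTL, and the fetcher are illustrative):

```typescript
// Cache-aside with stale fallback: serve live data when the upstream API
// works, fall back to the last good value when it doesn't. An in-memory
// Map stands in for Redis here to keep the sketch self-contained.
type Fetcher<T> = () => Promise<T>;

const cache = new Map<string, { value: unknown; storedAt: number }>();
const TTL_MS = 60_000; // treat cached entries as fresh for one minute

async function fetchWithFallback<T>(key: string, fetcher: Fetcher<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit && Date.now() - hit.storedAt < TTL_MS) {
    return hit.value as T; // fresh hit: skip the upstream call entirely
  }

  try {
    const value = await fetcher();
    cache.set(key, { value, storedAt: Date.now() });
    return value;
  } catch (err) {
    // Upstream failed (rate limit, CORS, malformed response): serve stale
    // data if we have any rather than breaking the widget.
    if (hit) return hit.value as T;
    throw err;
  }
}
```

The same wrapper also absorbs rate limits as a side effect, since fresh cache hits never touch the upstream API.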
Deployment to Vercel presented its own complexities. Next.js 15 with the App Router had some quirks with server-side rendering and client-side state management that required careful component architecture. Managing environment variables across development and production while keeping API keys secure was tricky. The initial build failed multiple times due to dependency conflicts between different packages, particularly with the drag-and-drop library and the charting components. I had to optimize bundle sizes because Vercel has deployment size limits, which meant carefully analyzing which libraries were truly necessary and finding lighter alternatives where possible.
The modular dashboard architecture itself was conceptually challenging. Building a system where 62 widgets could be dynamically loaded, rearranged, and connected to AI agents required thinking through state management very carefully. I used Zustand to create isolated stores for each widget, but coordinating data flow between widgets, the AI layer, and the backend APIs required a lot of debugging. Ensuring that widget layouts persisted across sessions while keeping the code maintainable was harder than expected.
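To show the isolation idea without pulling in the library, here is a stripped-down stand-in for what a per-widget Zustand store provides: each widget instance owns its own state container with subscribe/notify, so one widget's updates never leak into another's (the widget shapes below are hypothetical):

```typescript
// Simplified stand-in for per-widget Zustand stores: each widget gets an
// isolated state container, so updates to one never touch another.
type Listener<S> = (state: S) => void;

interface WidgetStore<S> {
  getState: () => S;
  setState: (partial: Partial<S>) => void;
  subscribe: (fn: Listener<S>) => () => void; // returns an unsubscribe fn
}

function createWidgetStore<S extends object>(initial: S): WidgetStore<S> {
  let state = initial;
  const listeners = new Set<Listener<S>>();
  return {
    getState: () => state,
    setState: (partial) => {
      state = { ...state, ...partial }; // shallow merge, like Zustand's set()
      listeners.forEach((fn) => fn(state));
    },
    subscribe: (fn) => {
      listeners.add(fn);
      return () => {
        listeners.delete(fn);
      };
    },
  };
}

// Each widget instance owns a separate store; layouts persist across
// sessions by serializing getState() per widget id. Shapes are illustrative.
const priceWidget = createWidgetStore({ symbol: "SOL", price: 0 });
const tvlWidget = createWidgetStore({ protocol: "Jito", tvl: 0 });
```

The AI layer then subscribes to exactly the stores its agent has widgets registered for, which keeps the data flow between widgets and agents explicit.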
Performance optimization was critical since Dobby aggregates data from 10+ external APIs simultaneously. I implemented parallel API calls, intelligent caching with Redis, and lazy loading for widgets to keep the dashboard responsive. However, balancing real-time data updates with API rate limits meant building a sophisticated queuing system that prioritizes critical data while batching less urgent requests. WebSocket connections for live price feeds added another layer of complexity in managing connection stability and handling reconnection logic when users switch networks or lose connectivity.
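The prioritize-and-batch idea can be sketched as a two-tier queue: critical jobs (live prices) fire immediately, while low-priority jobs (floor-price refreshes, unlock schedules) accumulate and drain in fixed-size batches to stay under rate limits. Tier names and the batch size are assumptions for illustration:

```typescript
// Two-tier request queue sketch: critical jobs run immediately;
// low-priority jobs are batched and drained in parallel later.
type Job = () => Promise<void>;

const lowPriority: Job[] = [];
const BATCH_SIZE = 5; // tune against the upstream API's rate limit

function enqueue(job: Job, critical: boolean): void {
  if (critical) {
    void job(); // fire immediately, don't wait on the batch cycle
  } else {
    lowPriority.push(job);
  }
}

async function drainBatch(): Promise<number> {
  const batch = lowPriority.splice(0, BATCH_SIZE);
  // allSettled so one failed call can't sink the rest of the batch
  await Promise.allSettled(batch.map((job) => job()));
  return batch.length;
}
```

In practice `drainBatch` would run on an interval timer sized to the strictest upstream rate limit, but the splitting logic is the core of it.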
Cross-chain data consistency was surprisingly difficult. Different blockchains return data in different formats, with varying block times and finality guarantees. Normalizing Ethereum transaction data to match Solana's structure while maintaining accuracy required building adapter layers for each chain. Gas calculations across Layer 2s like Base, Arbitrum, and Optimism each have unique quirks that needed special handling.
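The adapter layers amount to one normalizer per chain mapping that chain's raw shape into a single internal format. A sketch of the pattern, where the field names are illustrative rather than the exact Ethereum/Solana RPC schemas:

```typescript
// Adapter-layer sketch: normalize per-chain transaction shapes into one
// internal format. Field names are illustrative, not exact RPC schemas.
interface NormalizedTx {
  chain: string;
  hash: string;
  fee: bigint; // in the chain's smallest native unit (wei, lamports)
  confirmed: boolean;
}

type Adapter = (raw: any) => NormalizedTx;

const adapters: Record<string, Adapter> = {
  ethereum: (raw) => ({
    chain: "ethereum",
    hash: raw.transactionHash,
    fee: BigInt(raw.gasUsed) * BigInt(raw.effectiveGasPrice),
    confirmed: raw.blockNumber != null, // plus a finality depth in practice
  }),
  solana: (raw) => ({
    chain: "solana",
    hash: raw.signature,
    fee: BigInt(raw.meta.fee),
    confirmed: raw.confirmationStatus === "finalized",
  }),
};

function normalize(chain: string, raw: any): NormalizedTx {
  const adapter = adapters[chain];
  if (!adapter) throw new Error(`No adapter for chain "${chain}"`);
  return adapter(raw);
}
```

L2s like Base, Arbitrum, and Optimism would each get their own adapter entry, which is where their gas-calculation quirks get contained instead of leaking into widget code.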
Tracks Applied (4)
Open Innovation
OnDemand Track
Airev
Best UI/UX
Eleven Studios
Vultr
Major League Hacking

