Created on 1st March 2025
You have so many great ideas, but you fail to take action to make them a reality. Whether it's fear of failure, a lack of confidence, or simply being too busy, without a bias for action your ideas are forgotten and become missed opportunities you regret.
Users can connect the Action Item App to our open source, voice-activated notetaker. The app transcribes their conversations into “memories” that include overviews, action items, and events. All conversations are displayed in the Action Item App, and users can choose to send a memory to a contact, save it privately, or burn the data after reading.
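To make the memory concept concrete, here is a minimal sketch of how a memory might be modeled in TypeScript (the stack we build on); the field and type names are illustrative assumptions, not the app's actual schema.

```typescript
// A minimal sketch of a "memory"; field names are illustrative assumptions.
type MemoryVisibility = "sent" | "private" | "burned";

interface ActionItem {
  description: string;
  completed: boolean;
}

interface CalendarEvent {
  title: string;
  startsAt: string; // ISO 8601 timestamp
}

interface Memory {
  id: string;
  overview: string;             // short summary of the conversation
  transcript: string;           // full transcript from the wearable
  actionItems: ActionItem[];
  events: CalendarEvent[];
  visibility: MemoryVisibility; // send to a contact, keep private, or burn
  createdAt: string;
}
```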
The Action Item App connects to the real-time streaming API and the memories webhook API provided by the open source Omi project. Omi is a small, necklace-style wearable microphone that connects to your phone over Bluetooth using an open source mobile app. The audio recording is streamed live to a transcription service (Deepgram), and the resulting transcript is sent back to the phone as well as stored in a vector database (Pinecone). The data in the vector database is then used for retrieval-augmented generation (RAG) with an OpenAI inference model. Rather than rely on that OpenAI model and the cloud-based Pinecone service as-is, we chose to develop our own application that allows for better privacy and consent features. We send the transcript to the Action Item server to store the memories. Users can then choose to burn a memory, store it, or send it to their new contacts. The Action Item App breaks those memories down into different projects with different action items.
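As a rough sketch of this flow, the Action Item server could receive a memory from the Omi memories webhook in a Next.js route handler like the one below. The payload shape and the saveMemory helper are assumptions for illustration, not the documented Omi schema.

```typescript
// app/api/omi/memories/route.ts
// Sketch of receiving the Omi memories webhook in a Next.js route handler.
// The payload fields below are assumptions based on the description above.

import { NextResponse } from "next/server";

interface OmiMemoryPayload {
  id: string;
  transcript: string;
  created_at: string;
}

export async function POST(request: Request) {
  const payload = (await request.json()) as OmiMemoryPayload;

  // Store the raw transcript on the Action Item server so the user can
  // later decide to send, save, or burn it.
  await saveMemory({
    omiId: payload.id,
    transcript: payload.transcript,
    createdAt: payload.created_at,
  });

  return NextResponse.json({ ok: true });
}

// Hypothetical persistence helper (e.g. an INSERT into Postgres).
async function saveMemory(memory: {
  omiId: string;
  transcript: string;
  createdAt: string;
}): Promise<void> {
  // INSERT INTO memories (omi_id, transcript, created_at) VALUES (...)
}
```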
We use Farcaster for posting memories to social. We use GitHub for storing projects and issues. We use Claude for AI inference and creating project issues. We use Pinecone for a vector database. We also use OpenAI in some cases to generate text to post to Farcaster. We use Postgres for the database on our server. We're using Next.js as our primary JavaScript framework.
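To show how Claude and GitHub fit together in this stack, here is a hedged sketch of turning a memory transcript into GitHub issues. The repository name, model id, and prompt are illustrative assumptions, and the sketch assumes Claude replies with plain JSON.

```typescript
// Sketch: extract action items from a transcript with Claude, then create
// one GitHub issue per item. Names and prompt are illustrative assumptions.

import Anthropic from "@anthropic-ai/sdk";
import { Octokit } from "@octokit/rest";

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

export async function memoryToIssues(transcript: string): Promise<void> {
  // Ask Claude to pull discrete action items out of the conversation.
  const response = await anthropic.messages.create({
    model: "claude-3-5-sonnet-latest", // illustrative model id
    max_tokens: 1024,
    messages: [
      {
        role: "user",
        content:
          "Extract action items from this conversation as a JSON array of " +
          `{"title": string, "body": string}. Reply with JSON only.\n\n${transcript}`,
      },
    ],
  });

  // Assumes the first content block is plain-text JSON.
  const first = response.content[0];
  const text = first?.type === "text" ? first.text : "[]";
  const items: { title: string; body: string }[] = JSON.parse(text);

  // File one GitHub issue per action item in a hypothetical repo.
  for (const item of items) {
    await octokit.rest.issues.create({
      owner: "action-item-app", // hypothetical org
      repo: "hackathon-project", // hypothetical repo
      title: item.title,
      body: item.body,
    });
  }
}
```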
We are using the only open source AI wearable and AI app on the App Store as the tool that collects the detailed transcripts and provides them to our API. Using this device, which is brand new to the market, gives us a big advantage, and because it's open source it's ideally suited as a platform for building and hacking.
We've improved upon privacy and consent by giving people the ability to delete or forget memories. We've built a product specifically suited for a DevRel working at a hackathon: collecting ideas and projects and making suggestions as to how hackers can go about building them. This is meant to be the perfect tool for hackathons.
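A minimal sketch of the burn/forget path, assuming a Postgres memories table and a Pinecone index keyed by memory id (both names are assumptions): deleting the row and the embedding means the memory can no longer be retrieved or used for RAG.

```typescript
// Sketch of "burn / forget": remove the memory from both the Postgres store
// and the Pinecone index. Table, index, and column names are assumptions.

import { Pool } from "pg";
import { Pinecone } from "@pinecone-database/pinecone";

const pool = new Pool(); // reads PG* environment variables
const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });

export async function burnMemory(memoryId: string): Promise<void> {
  // Remove the transcript and derived action items from the relational store.
  await pool.query("DELETE FROM memories WHERE id = $1", [memoryId]);

  // Remove the embedding so the memory is also forgotten by retrieval.
  await pinecone.index("memories").deleteOne(memoryId);
}
```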
We are using Deepgram for transcription, OpenAI and Claude for inference, and Pinecone as a vector database for speed of development. Vitalik told us, “this is why we need edge compute.” We consider these centralized cloud services to be reasonable trade-offs in order to use the open source Omi project. We can improve our solution by using a local LLM or a decentralized cloud like Akash.
None of this existed before the build-athon; we started Sunday night or Monday morning. All we had before the hackathon was the concept of a hacker starter kit for building with AI, with the Omi device as one feature. We now realize that hackers need more than just a starter kit to clone: they need an entire project full of action items and fully formed ideas based on their conversations and brainstorming.
Tracks Applied (4)
Flow
Hedera