Cashew 🥜

Memory infrastructure for AI agents

"The graph stays dumb. The agent gets smarter."

About – Raj Kripal Danday

Microsoft – Azure SQL team. Worked on the storage engine for Azure SQL Hyperscale. 2 patents.

Meta – Infrastructure (blob storage). Built Dagger: an AI task execution system, 200+ autonomous tasks, top 8% of Claude Code users at Meta.

Education – Georgia Institute of Technology (MS) · IIT Hyderabad (BTech)

"I built storage engines for databases. Now I'm building the storage engine for AI memory."

The Problem

AI agents are goldfish: no memory between sessions

Vector DBs

Recall degrades after 10K memories. Similarity search can't capture complex relationships.

Flat Files

O(n) token cost, no filtering. Every memory retrieved costs tokens, whether relevant or not.

Memory Frameworks

Middleware, not architecture. Band-aids on fundamentally broken storage models.

Use Cases

🧠 Personal AI Memory

3,200+ nodes across 6 months of daily use. Decisions, preferences, patterns, retrieved in ~50ms. The AI that actually knows you.

💻 Codebase Understanding

Beyond search: why code exists, how components relate, what decisions led here. The graph remembers what documentation never captured.

🤖 Multi-Agent Memory

Multiple agents, separated responsibilities, shared context. Each agent sees its own domain while the graph gives them all the big picture. One memory layer, many agents.

๐Ÿข Institutional Knowledge

When senior employees leave, their context walks out the door. Cashew captures reasoning patterns, decisions, and domain expertise. The graph stays.

💼 Investor Pattern Memory

Thousands of conversations, hundreds of deals. Pattern recognition across them is where alpha lives. "This resembles your thesis on X, which failed because Y."

The Insight

"As models improve, a smart engine becomes a ceiling. A dumb graph gets better for free."
[Cashew Architecture Diagram: Extract → Connect → Decay → Protect → Synthesize]

How It Works

Cashew Retrieval Process: Local embeddings → Vector seeds → Graph walk → LLM reasoning

Retrieval runs locally in ~50ms; only the reasoning step needs an LLM.
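The pipeline above can be sketched in a few lines of Python. Everything here is illustrative: the node ids, the tiny hand-rolled `GRAPH`, and the `retrieve` signature are assumptions, not Cashew's actual API. A real system would use a local embedding model plus an approximate-nearest-neighbor index rather than brute-force cosine similarity.

```python
import math

# Hypothetical in-memory graph: node id -> (embedding, neighbor ids).
# Structure and names are illustrative, not Cashew's real schema.
GRAPH = {
    "pref:dark-mode": ([0.9, 0.1], ["decision:theme"]),
    "decision:theme": ([0.8, 0.2], ["pref:dark-mode", "note:css"]),
    "note:css":       ([0.1, 0.9], ["decision:theme"]),
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=1, hops=1):
    """Stages 1-3: embedding similarity picks seeds, then a bounded graph walk expands them."""
    # Vector seeds: top-k nodes by cosine similarity to the query embedding.
    seeds = sorted(GRAPH, key=lambda n: -cosine(query_vec, GRAPH[n][0]))[:k]
    # Graph walk: expand each seed by `hops` hops to pull in related context.
    frontier, seen = set(seeds), set(seeds)
    for _ in range(hops):
        frontier = {nb for n in frontier for nb in GRAPH[n][1]} - seen
        seen |= frontier
    return seen  # Stage 4 would hand this context set to an LLM for reasoning.

print(sorted(retrieve([1.0, 0.0])))  # → ['decision:theme', 'pref:dark-mode']
```

Note that everything up to the `return` is local arithmetic and set operations, which is why retrieval itself needs no model call.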

What Makes It Different

Autonomous Insight Generation

When sleep cycles run, they create nodes that didn't exist before, synthesized from graph structure rather than from any single input. The system generates new knowledge instead of merely consolidating old memories.
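One way a sleep cycle could derive a node from structure alone is to look for distinct memories whose neighborhoods overlap. The sketch below works under that assumption; `sleep_cycle`, the node ids, and the overlap threshold are all hypothetical, not Cashew's implementation.

```python
from itertools import combinations

# Hypothetical adjacency: node id -> set of neighbor ids. Illustrative only.
graph = {
    "mtg:standup":    {"topic:latency", "topic:caching"},
    "doc:postmortem": {"topic:latency", "topic:caching"},
    "note:lunch":     {"topic:food"},
}

def sleep_cycle(graph, overlap=2):
    """Mint a synthesis node for each pair of memories with heavily overlapping neighborhoods."""
    new_nodes = {}
    for a, b in combinations(sorted(graph), 2):
        shared = graph[a] & graph[b]
        if len(shared) >= overlap:
            # This node existed in no single input; it is derived
            # purely from the shape of the graph.
            new_nodes[f"insight:{a}+{b}"] = shared
    return new_nodes

print(sleep_cycle(graph))
```

Here the standup and the postmortem were never stored together, but their shared neighbors are enough for the cycle to mint an insight node linking them.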

Organic Decay

Low-value nodes fade naturally. Unlike traditional databases, Cashew forgets what isn't important, keeping the graph clean.
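A common way to implement this kind of decay is an exponential half-life on each node's value, with a periodic sweep that prunes whatever has faded below a threshold. The half-life, threshold, and node shape below are assumed for illustration, not Cashew's actual parameters.

```python
import time

HALF_LIFE_DAYS = 30.0  # assumed tuning knob, not a documented Cashew default
PRUNE_BELOW = 0.1      # assumed prune threshold

def decay_score(base_value, last_access_ts, now=None):
    """Exponential decay: a node's effective value halves every HALF_LIFE_DAYS."""
    now = now if now is not None else time.time()
    age_days = (now - last_access_ts) / 86400
    return base_value * 0.5 ** (age_days / HALF_LIFE_DAYS)

def sweep(nodes, now=None):
    """Keep only nodes whose decayed value is still above the prune threshold."""
    return {nid: n for nid, n in nodes.items()
            if decay_score(n["value"], n["last_access"], now) >= PRUNE_BELOW}

now = 1_700_000_000
nodes = {
    "fresh": {"value": 1.0, "last_access": now - 86400},        # 1 day old
    "stale": {"value": 1.0, "last_access": now - 86400 * 120},  # 120 days old
}
print(sorted(sweep(nodes, now)))  # → ['fresh']
```

The 120-day-old node has halved four times (down to ~0.06), so the sweep drops it; the fresh node survives almost untouched.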

Cross-Domain Synthesis

A graph walk finds connections that vector search can't. Related concepts emerge through graph structure, not just similarity.
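A breadth-first walk makes the difference concrete: two memories from unrelated domains can be textually dissimilar, so vector search would never pair them, yet sit only two hops apart in the graph. The edges and `path` helper below are a minimal illustration, not Cashew's traversal code.

```python
from collections import deque

# Hypothetical edges linking two domains through a shared concept.
edges = {
    "recipe:fermentation":     ["concept:controlled-decay"],
    "concept:controlled-decay": ["recipe:fermentation", "infra:cache-eviction"],
    "infra:cache-eviction":    ["concept:controlled-decay"],
}

def path(graph, start, goal):
    """BFS shortest path: surfaces a chain of hops between textually unrelated nodes."""
    queue, seen = deque([[start]]), {start}
    while queue:
        p = queue.popleft()
        if p[-1] == goal:
            return p
        for nb in graph.get(p[-1], []):
            if nb not in seen:
                seen.add(nb)
                queue.append(p + [nb])
    return None  # no connection in the graph

print(path(edges, "recipe:fermentation", "infra:cache-eviction"))
# → ['recipe:fermentation', 'concept:controlled-decay', 'infra:cache-eviction']
```

No embedding of "fermentation" lands near "cache eviction", but the shared intermediate concept makes the link a two-hop walk.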

Traction

Open Source (MIT) – github.com/rajkripal/cashew
3,200+ nodes – ~6 months of decisions, conversations, and reasoning in daily production, retrieved in ~50ms locally
Blog launched – April 11, 2026
Zero dependencies – Python + SQLite only
2 Microsoft patents – storage engine expertise

Vision

Memory infrastructure for the AI agent era

Domain-agnostic: personal today, institutional tomorrow

As models improve, Cashew improves: no ceiling

Contact