cognee

Reliable LLM Memory for AI Applications and AI Agents
Memory for AI Agents
Ontology definition
Entity resolution
Chatbot memory
Open Source
Free
Modular: Cognee is modular by design; work is organized as tasks grouped into pipelines.
Local Setup: By default, Cognee runs locally, using LanceDB for vector storage, NetworkX for graph storage, and OpenAI as the LLM provider.
Vector Stores: Cognee supports LanceDB, Qdrant, PGVector, and Weaviate for vector storage.
Language Models (LLMs): In addition to OpenAI, you can use Anyscale or Ollama as your LLM provider.
Graph Stores: In addition to NetworkX, Neo4j is also supported for graph storage.
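The "tasks grouped into pipelines" idea above can be sketched in plain Python. This is a minimal illustration of the pattern, not cognee's actual API: the task names (`chunk_text`, `normalize`) and the `run_pipeline` helper are hypothetical stand-ins for whatever tasks a real ingestion pipeline would chain.

```python
from typing import Any, Callable, Iterable

# A task is just a callable; a pipeline applies tasks in order.
# (Illustrative sketch of the pipeline pattern, not cognee's real API.)
Task = Callable[[Any], Any]

def run_pipeline(tasks: Iterable[Task], data: Any) -> Any:
    """Feed `data` through each task in sequence."""
    for task in tasks:
        data = task(data)
    return data

# Hypothetical tasks an ingestion pipeline might chain together.
def chunk_text(text: str) -> list[str]:
    # Split the input into fixed-size 20-character chunks.
    return [text[i:i + 20] for i in range(0, len(text), 20)]

def normalize(chunks: list[str]) -> list[str]:
    # Trim whitespace and lowercase each chunk.
    return [c.strip().lower() for c in chunks]

pipeline = [chunk_text, normalize]
result = run_pipeline(pipeline, "Cognee builds memory for AI agents")
```

Because tasks share a simple call contract, swapping a task (say, a different chunking strategy) changes one list entry rather than the whole flow, which is the benefit the modular design aims for.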