llama-index-storage-chat-store-superlocalmemory

LlamaIndex chat store backed by SuperLocalMemory V2 — 100% local, zero cloud

Installation

In a virtualenv:

pip3 install llama-index-storage-chat-store-superlocalmemory

Releases

Version  Released    Bullseye (Python 3.9)  Bookworm (Python 3.11)  Trixie (Python 3.13)  Files
0.1.0    2026-02-13

Page last updated 2026-02-28 08:18:48 UTC