Convex Chat Memory
For longer-term persistence across chat sessions, you can swap out the default in-memory `chatHistory` that backs chat memory classes like `BufferMemory` for Convex.
Setup
Create project
Get a working Convex project set up, for example by using:
```bash
npm create convex@latest
```
Add database accessors
Add query and mutation helpers to `convex/langchain/db.ts`:
convex/langchain/db.ts
```typescript
export * from "@langchain/community/utils/convex";
```
Configure your schema
Set up your schema, including an index on the session id so messages can be fetched efficiently per session:
convex/schema.ts
```typescript
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  messages: defineTable({
    sessionId: v.string(),
    message: v.object({
      type: v.string(),
      data: v.object({
        content: v.string(),
        role: v.optional(v.string()),
        name: v.optional(v.string()),
        additional_kwargs: v.optional(v.any()),
      }),
    }),
  }).index("bySessionId", ["sessionId"]),
});
```
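To make the schema concrete, here is a sketch of a plain object matching the document shape the `messages` table accepts. The field values (a hypothetical session id and message) are illustrative assumptions; the `message` field mirrors the serialized form LangChain uses for chat messages.

```typescript
// Hypothetical document shape for the `messages` table defined above.
// `type` distinguishes human/AI messages; `data` holds the serialized payload.
const sampleDoc = {
  sessionId: "user-123", // illustrative id, not required by the API
  message: {
    type: "human",
    data: {
      content: "Hi! I'm Jim.",
      additional_kwargs: {},
    },
  },
};
```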
Usage
Each chat history session stored in Convex must have a unique session id.
Install the required packages with your package manager of choice:

```bash
# npm
npm install @langchain/openai @langchain/community
# Yarn
yarn add @langchain/openai @langchain/community
# pnpm
pnpm add @langchain/openai @langchain/community
```
convex/myActions.ts
```typescript
"use node";
import { v } from "convex/values";
import { BufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { ConvexChatMessageHistory } from "@langchain/community/stores/message/convex";
import { action } from "./_generated/server.js";

export const ask = action({
  args: { sessionId: v.string() },
  handler: async (ctx, args) => {
    // Pass in a sessionId string
    const { sessionId } = args;

    const memory = new BufferMemory({
      chatHistory: new ConvexChatMessageHistory({ sessionId, ctx }),
    });

    const model = new ChatOpenAI({
      model: "gpt-3.5-turbo",
      temperature: 0,
    });

    const chain = new ConversationChain({ llm: model, memory });

    const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
    console.log({ res1 });
    /*
      {
        res1: {
          text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
        }
      }
    */

    const res2 = await chain.invoke({
      input: "What did I just say my name was?",
    });
    console.log({ res2 });
    /*
      {
        res2: {
          text: "You said your name was Jim."
        }
      }
    */

    // See the chat history in the Convex database
    console.log(await memory.chatHistory.getMessages());

    // Clear chat history
    await memory.chatHistory.clear();
  },
});
```
API Reference:
- `BufferMemory` from `langchain/memory`
- `ChatOpenAI` from `@langchain/openai`
- `ConversationChain` from `langchain/chains`
- `ConvexChatMessageHistory` from `@langchain/community/stores/message/convex`