VectorStoreToolkit
This will help you get started with the VectorStoreToolkit. For detailed documentation of all VectorStoreToolkit features and configurations, head to the API reference.
The VectorStoreToolkit takes in a vector store and converts it into a tool that can be invoked directly, passed to LLMs and agents, and more.
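At a high level, the flow covered on this page looks like the sketch below. It is only a preview, assuming you already have a vector store and a chat model (vectorStore and llm are placeholders); the runnable steps follow in the sections after this.
import { VectorStoreToolkit, VectorStoreInfo } from "langchain/agents/toolkits";
// Describe the vector store so the LLM knows when to query it.
const info: VectorStoreInfo = {
  name: "my_docs",
  description: "internal documentation",
  vectorStore, // any LangChain vector store instance
};
// Wrap the store in a toolkit, then pull out the tools to hand to an agent.
const toolkit = new VectorStoreToolkit(info, llm);
const tools = toolkit.getTools();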
Setup
If you want to get automated tracing from runs of individual tools, you can also set your LangSmith API key by uncommenting the lines below:
// process.env.LANGCHAIN_TRACING_V2 = "true";
// process.env.LANGCHAIN_API_KEY = "your-api-key";
Installation
This toolkit lives in the langchain package:
- npm
- yarn
- pnpm
npm i langchain
yarn add langchain
pnpm add langchain
Instantiation
Now we can instantiate our toolkit. First, we need to define the LLM we’ll use in the toolkit.
Pick your chat model:
- OpenAI
- Anthropic
- FireworksAI
- MistralAI
- Groq
- VertexAI
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/openai 
yarn add @langchain/openai 
pnpm add @langchain/openai 
Add environment variables
OPENAI_API_KEY=your-api-key
Instantiate the model
import { ChatOpenAI } from "@langchain/openai";
const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/anthropic 
yarn add @langchain/anthropic 
pnpm add @langchain/anthropic 
Add environment variables
ANTHROPIC_API_KEY=your-api-key
Instantiate the model
import { ChatAnthropic } from "@langchain/anthropic";
const llm = new ChatAnthropic({
  model: "claude-3-5-sonnet-20240620",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/community 
yarn add @langchain/community 
pnpm add @langchain/community 
Add environment variables
FIREWORKS_API_KEY=your-api-key
Instantiate the model
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";
const llm = new ChatFireworks({
  model: "accounts/fireworks/models/llama-v3p1-70b-instruct",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/mistralai 
yarn add @langchain/mistralai 
pnpm add @langchain/mistralai 
Add environment variables
MISTRAL_API_KEY=your-api-key
Instantiate the model
import { ChatMistralAI } from "@langchain/mistralai";
const llm = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/groq 
yarn add @langchain/groq 
pnpm add @langchain/groq 
Add environment variables
GROQ_API_KEY=your-api-key
Instantiate the model
import { ChatGroq } from "@langchain/groq";
const llm = new ChatGroq({
  model: "mixtral-8x7b-32768",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/google-vertexai 
yarn add @langchain/google-vertexai 
pnpm add @langchain/google-vertexai 
Add environment variables
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model
import { ChatVertexAI } from "@langchain/google-vertexai";
const llm = new ChatVertexAI({
  model: "gemini-1.5-flash",
  temperature: 0
});
import { VectorStoreToolkit, VectorStoreInfo } from "langchain/agents/toolkits";
import { OpenAIEmbeddings } from "@langchain/openai";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { RecursiveCharacterTextSplitter } from "@langchain/textsplitters";
import fs from "fs";
// Load a text file to use as our data source.
const text = fs.readFileSync(
  "../../../../../examples/state_of_the_union.txt",
  "utf8"
);
// Split the text into chunks before inserting them into our store
const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const docs = await textSplitter.createDocuments([text]);
const vectorStore = await MemoryVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings()
);
const vectorStoreInfo: VectorStoreInfo = {
  name: "state_of_union_address",
  description: "the most recent state of the Union address",
  vectorStore,
};
const toolkit = new VectorStoreToolkit(vectorStoreInfo, llm);
Tools
Here, we can see that the toolkit converts our vector store into a tool:
const tools = toolkit.getTools();
console.log(
  tools.map((tool) => ({
    name: tool.name,
    description: tool.description,
  }))
);
[
  {
    name: 'state_of_union_address',
    description: 'Useful for when you need to answer questions about state_of_union_address. Whenever you need information about the most recent state of the Union address you should ALWAYS use this. Input should be a fully formed question.'
  }
]
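The generated tool can also be invoked directly, outside of an agent. A minimal sketch, assuming the tools array from above (the question string is illustrative):
const qaTool = tools[0];
// The tool expects a fully formed question as its input.
const answer = await qaTool.invoke(
  "What did the president say about Ketanji Brown Jackson?"
);
console.log(answer);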
Use within an agent
First, ensure you have LangGraph installed:
- npm
- yarn
- pnpm
npm i @langchain/langgraph
yarn add @langchain/langgraph
pnpm add @langchain/langgraph
Then, instantiate the agent:
import { createReactAgent } from "@langchain/langgraph/prebuilt";
const agentExecutor = createReactAgent({ llm, tools });
const exampleQuery =
  "What did Biden say about Ketanji Brown Jackson in the state of the union address?";
const events = await agentExecutor.stream(
  { messages: [["user", exampleQuery]] },
  { streamMode: "values" }
);
for await (const event of events) {
  const lastMsg = event.messages[event.messages.length - 1];
  if (lastMsg.tool_calls?.length) {
    console.dir(lastMsg.tool_calls, { depth: null });
  } else if (lastMsg.content) {
    console.log(lastMsg.content);
  }
}
[
  {
    name: 'state_of_union_address',
    args: {
      input: 'What did Biden say about Ketanji Brown Jackson in the State of the Union address?'
    },
    type: 'tool_call',
    id: 'call_glJSWLNrftKHa92A6j8x4jhd'
  }
]
In the State of the Union address, Biden mentioned that he nominated Circuit Court of Appeals Judge Ketanji Brown Jackson, describing her as one of the nation’s top legal minds who will continue Justice Breyer’s legacy of excellence. He highlighted her background as a former top litigator in private practice, a former federal public defender, and noted that she comes from a family of public school educators and police officers. He also pointed out that she has received a broad range of support since her nomination.
In the State of the Union address, President Biden spoke about Ketanji Brown Jackson, stating that he nominated her as one of the nation’s top legal minds who will continue Justice Breyer’s legacy of excellence. He highlighted her experience as a former top litigator in private practice and a federal public defender, as well as her background coming from a family of public school educators and police officers. Biden also noted that she has received a broad range of support since her nomination.
API reference
For detailed documentation of all VectorStoreToolkit features and configurations, head to the API reference.