ChatVertexAI
Google Vertex AI is a service that
exposes all foundation models available in Google Cloud, such as
gemini-1.5-pro and gemini-1.5-flash.
This page will help you get started with ChatVertexAI chat
models. For detailed documentation of all
ChatVertexAI features and configurations, head to the API
reference.
Overview
Integration details
| Class | Package | Local | Serializable | PY support |
|---|---|---|---|---|
| ChatVertexAI | @langchain/google-vertexai | ❌ | ✅ | ✅ |
Model features
See the links in the table headers below for guides on how to use specific features.
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs | 
|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | 
Setup
LangChain.js supports two different authentication methods based on whether you're running in a Node.js environment or a web environment.
To access ChatVertexAI models you'll need to set up Google Vertex AI in
your Google Cloud Platform (GCP) account, save the credentials file, and
install the @langchain/google-vertexai integration package.
Credentials
Head to your GCP account and
generate a credentials file. Once you've done this, set the
GOOGLE_APPLICATION_CREDENTIALS environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="path/to/your/credentials.json"
If running in a web environment, you should set the
GOOGLE_VERTEX_AI_WEB_CREDENTIALS environment variable as a JSON
stringified object, and install the @langchain/google-vertexai-web
package:
GOOGLE_VERTEX_AI_WEB_CREDENTIALS={"type":"service_account","project_id":"YOUR_PROJECT-12345",...}
If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
# export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_API_KEY="your-api-key"
Installation
The LangChain ChatVertexAI integration lives in the
@langchain/google-vertexai package:
npm i @langchain/google-vertexai
yarn add @langchain/google-vertexai
pnpm add @langchain/google-vertexai
Or, if you're using it in a web environment such as a Vercel Edge Function:
npm i @langchain/google-vertexai-web
yarn add @langchain/google-vertexai-web
pnpm add @langchain/google-vertexai-web
Instantiation
Now we can instantiate our model object and generate chat completions:
import { ChatVertexAI } from "@langchain/google-vertexai";
// Uncomment the following line if you're running in a web environment:
// import { ChatVertexAI } from "@langchain/google-vertexai-web"
const llm = new ChatVertexAI({
  model: "gemini-1.5-pro",
  temperature: 0,
  maxRetries: 2,
  // For web environments, pass your credentials via authOptions:
  // authOptions: { credentials: ... },
  // other params...
});
Invocation
const aiMsg = await llm.invoke([
  [
    "system",
    "You are a helpful assistant that translates English to French. Translate the user sentence.",
  ],
  ["human", "I love programming."],
]);
aiMsg;
AIMessageChunk {
  "content": "J'adore programmer. \n",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_calls": [],
  "tool_call_chunks": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 20,
    "output_tokens": 7,
    "total_tokens": 27
  }
}
console.log(aiMsg.content);
J'adore programmer.
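ChatVertexAI also supports token-level streaming through the standard .stream() method available on all LangChain chat models. Here is a minimal sketch, reusing the llm instance from above (the prompt is just an example):

const stream = await llm.stream("Why is the sky blue?");

for await (const chunk of stream) {
  // Each chunk is an AIMessageChunk containing a partial piece of the response.
  console.log(chunk.content);
}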
Chaining
We can chain our model with a prompt template like so:
import { ChatPromptTemplate } from "@langchain/core/prompts";
const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ],
  ["human", "{input}"],
]);
const chain = prompt.pipe(llm);
await chain.invoke({
  input_language: "English",
  output_language: "German",
  input: "I love programming.",
});
AIMessageChunk {
  "content": "Ich liebe das Programmieren. \n",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_calls": [],
  "tool_call_chunks": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 15,
    "output_tokens": 9,
    "total_tokens": 24
  }
}
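Since ChatVertexAI supports structured output, you can also bind a schema to the model with the standard withStructuredOutput() method. The zod schema and prompt below are purely illustrative and are not part of the original example:

import { z } from "zod";

// Hypothetical schema for illustration purposes.
const translationSchema = z.object({
  translation: z.string().describe("The translated sentence"),
  language: z.string().describe("The target language"),
});

const structuredLlm = llm.withStructuredOutput(translationSchema);

// Returns a plain object matching the schema, e.g. { translation: "...", language: "..." }
await structuredLlm.invoke("Translate 'I love programming.' into German.");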
API reference
For detailed documentation of all ChatVertexAI features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_google_vertexai.ChatVertexAI.html
Related
- Chat model conceptual guide
- Chat model how-to guides