LangChain Provider

The LangChain Provider transforms Composio tools into a format compatible with LangChain's function calling capabilities.

Setup

Python

pip install composio_langchain==0.8.0 langchain

TypeScript

npm install @composio/langchain
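
Both SDKs read credentials from the environment: the TypeScript example below reads COMPOSIO_API_KEY, and LangChain's ChatOpenAI expects OPENAI_API_KEY (the Python client also picks up COMPOSIO_API_KEY when no key is passed explicitly). Export both before running; the values here are placeholders:

export COMPOSIO_API_KEY="your-composio-api-key"
export OPENAI_API_KEY="your-openai-api-key"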

Usage

from composio import Composio
from composio_langchain import LangchainProvider
from langchain import hub  # type: ignore
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain_openai import ChatOpenAI

# Pull the prompt for an OpenAI functions agent.
prompt = hub.pull("hwchase17/openai-functions-agent")

# Initialize the chat model.
openai_client = ChatOpenAI(model="gpt-5")

composio = Composio(provider=LangchainProvider())

# Get all the tools from the GitHub toolkit
tools = composio.tools.get(user_id="default", toolkits=["GITHUB"])

# Define task
task = "Star a repo composiohq/composio on GitHub"

# Define agent
agent = create_openai_functions_agent(openai_client, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Execute using agent_executor
agent_executor.invoke({"input": task})
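
The TypeScript SDK follows the same pattern. The example below fetches a single Hacker News tool by its slug and wires it into a small LangGraph agent loop: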
import { ChatOpenAI } from '@langchain/openai';
import { HumanMessage, AIMessage } from '@langchain/core/messages';
import { ToolNode } from '@langchain/langgraph/prebuilt';
import { StateGraph, MessagesAnnotation } from '@langchain/langgraph';
import { Composio } from '@composio/core';
import { LangchainProvider } from '@composio/langchain';

// Initiate Composio
const composio = new Composio({
  apiKey: process.env.COMPOSIO_API_KEY,
  provider: new LangchainProvider(),
});

// Fetch the tool
console.log(`🔄 Fetching the tool...`);
const tools = await composio.tools.get('default', 'HACKERNEWS_GET_USER');

// Define the tools for the agent to use
const toolNode = new ToolNode(tools);

// Create a model and give it access to the tools
const model = new ChatOpenAI({
  model: 'gpt-4o-mini',
  temperature: 0,
}).bindTools(tools);

// Define the function that determines whether to continue or not
function shouldContinue({ messages }: typeof MessagesAnnotation.State) {
  const lastMessage = messages[messages.length - 1] as AIMessage;

  // If the LLM makes a tool call, then we route to the "tools" node
  if (lastMessage.tool_calls?.length) {
    return 'tools';
  }
  // Otherwise, we stop (reply to the user) using the special "__end__" node
  return '__end__';
}

// Define the function that calls the model
async function callModel(state: typeof MessagesAnnotation.State) {
  console.log(`🔄 Calling the model...`);
  const response = await model.invoke(state.messages);

  // We return a list, because this will get added to the existing list
  return { messages: [response] };
}

// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
  .addNode('agent', callModel)
  .addEdge('__start__', 'agent') // __start__ is a special name for the entrypoint
  .addNode('tools', toolNode)
  .addEdge('tools', 'agent')
  .addConditionalEdges('agent', shouldContinue);

// Finally, we compile it into a LangChain Runnable.
const app = workflow.compile();
    messages?: Messages | undefined;
}, ... 6 more ..., unknown>
compile
();
// Use the agent const
const finalState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
finalState
= await
const app: CompiledStateGraph<{
    messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[];
}, {
    messages?: Messages | undefined;
}, "__start__" | "tools" | "agent", {
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}, ... 4 more ..., unknown>
app
.
Pregel<Record<"__start__" | "tools" | "agent", PregelNode<{ messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]; }, { ...; }>>, ... 8 more ..., any>.invoke(input: UpdateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}> | CommandInstance<unknown, {
    messages?: Messages | undefined;
}, "__start__" | "tools" | "agent"> | null, options?: Partial<...> | undefined): Promise<...>
Run the graph with a single input and config.
@paraminput The input to the graph.@paramoptions The configuration to use for the run.
invoke
({
messages?: Messages | undefinedmessages: [new new HumanMessage<MessageStructure<MessageToolSet>>(fields: string | (ContentBlock | ContentBlock.Text)[] | HumanMessageFields<MessageStructure<MessageToolSet>>): HumanMessage<MessageStructure<MessageToolSet>>
Represents a human message in a conversation.
HumanMessage
('Find the details of the user `pg` on HackerNews')],
}); var console: Console
The `console` module provides a simple debugging console that is similar to the JavaScript console mechanism provided by web browsers. The module exports two specific components: * A `Console` class with methods such as `console.log()`, `console.error()` and `console.warn()` that can be used to write to any Node.js stream. * A global `console` instance configured to write to [`process.stdout`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstdout) and [`process.stderr`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstderr). The global `console` can be used without importing the `node:console` module. _**Warning**_: The global console object's methods are neither consistently synchronous like the browser APIs they resemble, nor are they consistently asynchronous like all other Node.js streams. See the [`note on process I/O`](https://nodejs.org/docs/latest-v24.x/api/process.html#a-note-on-process-io) for more information. Example using the global `console`: ```js console.log('hello world'); // Prints: hello world, to stdout console.log('hello %s', 'world'); // Prints: hello world, to stdout console.error(new Error('Whoops, something bad happened')); // Prints error message and stack trace to stderr: // Error: Whoops, something bad happened // at [eval]:5:15 // at Script.runInThisContext (node:vm:132:18) // at Object.runInThisContext (node:vm:309:38) // at node:internal/process/execution:77:19 // at [eval]-wrapper:6:22 // at evalScript (node:internal/process/execution:76:60) // at node:internal/main/eval_string:23:3 const name = 'Will Robinson'; console.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to stderr ``` Example using the `Console` class: ```js const out = getStreamSomehow(); const err = getStreamSomehow(); const myConsole = new console.Console(out, err); myConsole.log('hello world'); // Prints: hello world, to out myConsole.log('hello %s', 'world'); // Prints: hello world, to out myConsole.error(new Error('Whoops, something bad happened')); // Prints: [Error: Whoops, something bad happened], to err const name = 'Will Robinson'; myConsole.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to err ```
@see[source](https://github.com/nodejs/node/blob/v24.x/lib/console.js)
console
.Console.log(message?: any, ...optionalParams: any[]): void (+1 overload)
Prints to `stdout` with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args)). ```js const count = 5; console.log('count: %d', count); // Prints: count: 5, to stdout console.log('count:', count); // Prints: count: 5, to stdout ``` See [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args) for more information.
@sincev0.1.100
log
(`✅ Message recieved from the model`);
var console: Console
The `console` module provides a simple debugging console that is similar to the JavaScript console mechanism provided by web browsers. The module exports two specific components: * A `Console` class with methods such as `console.log()`, `console.error()` and `console.warn()` that can be used to write to any Node.js stream. * A global `console` instance configured to write to [`process.stdout`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstdout) and [`process.stderr`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstderr). The global `console` can be used without importing the `node:console` module. _**Warning**_: The global console object's methods are neither consistently synchronous like the browser APIs they resemble, nor are they consistently asynchronous like all other Node.js streams. See the [`note on process I/O`](https://nodejs.org/docs/latest-v24.x/api/process.html#a-note-on-process-io) for more information. Example using the global `console`: ```js console.log('hello world'); // Prints: hello world, to stdout console.log('hello %s', 'world'); // Prints: hello world, to stdout console.error(new Error('Whoops, something bad happened')); // Prints error message and stack trace to stderr: // Error: Whoops, something bad happened // at [eval]:5:15 // at Script.runInThisContext (node:vm:132:18) // at Object.runInThisContext (node:vm:309:38) // at node:internal/process/execution:77:19 // at [eval]-wrapper:6:22 // at evalScript (node:internal/process/execution:76:60) // at node:internal/main/eval_string:23:3 const name = 'Will Robinson'; console.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to stderr ``` Example using the `Console` class: ```js const out = getStreamSomehow(); const err = getStreamSomehow(); const myConsole = new console.Console(out, err); myConsole.log('hello world'); // Prints: hello world, to out myConsole.log('hello %s', 'world'); // Prints: hello world, to out myConsole.error(new Error('Whoops, something bad happened')); // Prints: [Error: Whoops, something bad happened], to err const name = 'Will Robinson'; myConsole.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to err ```
@see[source](https://github.com/nodejs/node/blob/v24.x/lib/console.js)
console
.Console.log(message?: any, ...optionalParams: any[]): void (+1 overload)
Prints to `stdout` with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args)). ```js const count = 5; console.log('count: %d', count); // Prints: count: 5, to stdout console.log('count:', count); // Prints: count: 5, to stdout ``` See [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args) for more information.
@sincev0.1.100
log
(
const finalState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
finalState
.messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]messages[
const finalState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
finalState
.messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]messages.Array<T>.length: number
Gets or sets the length of the array. This is a number one higher than the highest index in the array.
length
- 1].BaseMessage<MessageStructure<MessageToolSet>, MessageType>.content: string | (ContentBlock | ContentBlock.Text)[]
Array of content blocks that make up the message content
content
);
const
const nextState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
nextState
= await
const app: CompiledStateGraph<{
    messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[];
}, {
    messages?: Messages | undefined;
}, "__start__" | "tools" | "agent", {
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}, ... 4 more ..., unknown>
app
.
Pregel<Record<"__start__" | "tools" | "agent", PregelNode<{ messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]; }, { ...; }>>, ... 8 more ..., any>.invoke(input: UpdateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}> | CommandInstance<unknown, {
    messages?: Messages | undefined;
}, "__start__" | "tools" | "agent"> | null, options?: Partial<...> | undefined): Promise<...>
Run the graph with a single input and config.
@paraminput The input to the graph.@paramoptions The configuration to use for the run.
invoke
({
// Including the messages from the previous run gives the LLM context. // This way it knows we're asking about the weather in NY messages?: Messages | undefinedmessages: [...
const finalState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
finalState
.messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]messages, new new HumanMessage<MessageStructure<MessageToolSet>>(fields: string | (ContentBlock | ContentBlock.Text)[] | HumanMessageFields<MessageStructure<MessageToolSet>>): HumanMessage<MessageStructure<MessageToolSet>>
Represents a human message in a conversation.
HumanMessage
('what about haxzie')],
}); var console: Console
The `console` module provides a simple debugging console that is similar to the JavaScript console mechanism provided by web browsers. The module exports two specific components: * A `Console` class with methods such as `console.log()`, `console.error()` and `console.warn()` that can be used to write to any Node.js stream. * A global `console` instance configured to write to [`process.stdout`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstdout) and [`process.stderr`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstderr). The global `console` can be used without importing the `node:console` module. _**Warning**_: The global console object's methods are neither consistently synchronous like the browser APIs they resemble, nor are they consistently asynchronous like all other Node.js streams. See the [`note on process I/O`](https://nodejs.org/docs/latest-v24.x/api/process.html#a-note-on-process-io) for more information. Example using the global `console`: ```js console.log('hello world'); // Prints: hello world, to stdout console.log('hello %s', 'world'); // Prints: hello world, to stdout console.error(new Error('Whoops, something bad happened')); // Prints error message and stack trace to stderr: // Error: Whoops, something bad happened // at [eval]:5:15 // at Script.runInThisContext (node:vm:132:18) // at Object.runInThisContext (node:vm:309:38) // at node:internal/process/execution:77:19 // at [eval]-wrapper:6:22 // at evalScript (node:internal/process/execution:76:60) // at node:internal/main/eval_string:23:3 const name = 'Will Robinson'; console.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to stderr ``` Example using the `Console` class: ```js const out = getStreamSomehow(); const err = getStreamSomehow(); const myConsole = new console.Console(out, err); myConsole.log('hello world'); // Prints: hello world, to out myConsole.log('hello %s', 'world'); // Prints: hello world, to out myConsole.error(new Error('Whoops, something bad happened')); // Prints: [Error: Whoops, something bad happened], to err const name = 'Will Robinson'; myConsole.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to err ```
@see[source](https://github.com/nodejs/node/blob/v24.x/lib/console.js)
console
.Console.log(message?: any, ...optionalParams: any[]): void (+1 overload)
Prints to `stdout` with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args)). ```js const count = 5; console.log('count: %d', count); // Prints: count: 5, to stdout console.log('count:', count); // Prints: count: 5, to stdout ``` See [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args) for more information.
@sincev0.1.100
log
(`✅ Message recieved from the model`);
var console: Console
The `console` module provides a simple debugging console that is similar to the JavaScript console mechanism provided by web browsers. The module exports two specific components: * A `Console` class with methods such as `console.log()`, `console.error()` and `console.warn()` that can be used to write to any Node.js stream. * A global `console` instance configured to write to [`process.stdout`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstdout) and [`process.stderr`](https://nodejs.org/docs/latest-v24.x/api/process.html#processstderr). The global `console` can be used without importing the `node:console` module. _**Warning**_: The global console object's methods are neither consistently synchronous like the browser APIs they resemble, nor are they consistently asynchronous like all other Node.js streams. See the [`note on process I/O`](https://nodejs.org/docs/latest-v24.x/api/process.html#a-note-on-process-io) for more information. Example using the global `console`: ```js console.log('hello world'); // Prints: hello world, to stdout console.log('hello %s', 'world'); // Prints: hello world, to stdout console.error(new Error('Whoops, something bad happened')); // Prints error message and stack trace to stderr: // Error: Whoops, something bad happened // at [eval]:5:15 // at Script.runInThisContext (node:vm:132:18) // at Object.runInThisContext (node:vm:309:38) // at node:internal/process/execution:77:19 // at [eval]-wrapper:6:22 // at evalScript (node:internal/process/execution:76:60) // at node:internal/main/eval_string:23:3 const name = 'Will Robinson'; console.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to stderr ``` Example using the `Console` class: ```js const out = getStreamSomehow(); const err = getStreamSomehow(); const myConsole = new console.Console(out, err); myConsole.log('hello world'); // Prints: hello world, to out myConsole.log('hello %s', 'world'); // Prints: hello world, to out myConsole.error(new Error('Whoops, something bad happened')); // Prints: [Error: Whoops, something bad happened], to err const name = 'Will Robinson'; myConsole.warn(`Danger ${name}! Danger!`); // Prints: Danger Will Robinson! Danger!, to err ```
@see[source](https://github.com/nodejs/node/blob/v24.x/lib/console.js)
console
.Console.log(message?: any, ...optionalParams: any[]): void (+1 overload)
Prints to `stdout` with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args)). ```js const count = 5; console.log('count: %d', count); // Prints: count: 5, to stdout console.log('count:', count); // Prints: count: 5, to stdout ``` See [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args) for more information.
@sincev0.1.100
log
(
const nextState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
nextState
.messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]messages[
const nextState: StateType<{
    messages: BinaryOperatorAggregate<BaseMessage<MessageStructure<MessageToolSet>, MessageType>[], Messages>;
}>
nextState
.messages: BaseMessage<MessageStructure<MessageToolSet>, MessageType>[]messages.Array<T>.length: number
Gets or sets the length of the array. This is a number one higher than the highest index in the array.
length
- 1].BaseMessage<MessageStructure<MessageToolSet>, MessageType>.content: string | (ContentBlock | ContentBlock.Text)[]
Array of content blocks that make up the message content
content
);
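The two `invoke` calls above share conversation history by manually spreading `finalState.messages` into the second request. As an alternative sketch (not part of the original example): `compile` also accepts a `checkpointer`, letting LangGraph persist each thread's messages for you. This assumes `workflow`, `HumanMessage`, and the rest of the example are in scope; `MemorySaver` is LangGraph's in-memory checkpointer, and the `thread_id` below is an arbitrary illustrative value.

import { MemorySaver } from '@langchain/langgraph';

// Compiling with a checkpointer makes the graph stateful per thread.
const statefulApp = workflow.compile({ checkpointer: new MemorySaver() });

// Both calls use the same thread_id, so the second run automatically sees
// the first run's messages; no manual spreading of previous messages needed.
const config = { configurable: { thread_id: 'hackernews-demo' } };
await statefulApp.invoke(
  { messages: [new HumanMessage('Find the details of the user `pg` on HackerNews')] },
  config,
);
const followUp = await statefulApp.invoke(
  { messages: [new HumanMessage('what about haxzie')] },
  config,
);
console.log(followUp.messages[followUp.messages.length - 1].content);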
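Tool calls can take a while, so you may also want to surface intermediate steps as they happen rather than waiting for the final state. A minimal streaming sketch, assuming the `app` and `HumanMessage` from the example above; `streamMode: 'values'` emits the full graph state after each step:

const stream = await app.stream(
  { messages: [new HumanMessage('Find the details of the user `pg` on HackerNews')] },
  { streamMode: 'values' },
);
for await (const state of stream) {
  // Logs each intermediate message in turn: the model's tool call,
  // the tool's result, and finally the model's answer.
  const last = state.messages[state.messages.length - 1];
  console.log(last.content);
}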
