Tavily web search using a LangGraph agent

Hi! In this post, we will see how to perform a Tavily web search using a LangGraph agent.


1. Run the commands below to create a new directory and initialize a new project.


mkdir langgraph-agent

cd langgraph-agent

npm init -y



2. Run the commands below to install the dependencies.

# runtime
npm i @langchain/core @langchain/langgraph @langchain/openai @langchain/community dotenv

# dev tooling
npm i -D typescript tsx @types/node



3. Create a tsconfig.json file in the root of the project.

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true
  }
}



4. Update the package.json file with a dev script so the project can be run with npm run dev.
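
The post doesn't show the exact package.json contents; a minimal sketch, assuming the dev script simply runs agent.mts with the tsx tool installed earlier, might look like:

```json
{
  "name": "langgraph-agent",
  "version": "1.0.0",
  "scripts": {
    "dev": "tsx agent.mts"
  }
}
```

Adjust the name and version fields to match whatever npm init generated for you; only the dev script is required for the run step later.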



5. Create a .env file in the root of the repository and add your OpenAI API key and Tavily API key.
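
OPENAI_API_KEY and TAVILY_API_KEY are the default environment variable names read by ChatOpenAI and TavilySearchResults respectively, so the .env file should look something like this (with your own keys in place of the placeholders):

```
OPENAI_API_KEY=your-openai-api-key
TAVILY_API_KEY=your-tavily-api-key
```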



6. In the root of the repository, create a new file named agent.mts and paste the code below.

// agent.mts
import * as dotenv from 'dotenv';
dotenv.config();

import { TavilySearchResults } from "@langchain/community/tools/tavily_search";
import { ChatOpenAI } from "@langchain/openai";
import { MemorySaver } from "@langchain/langgraph";
import { HumanMessage } from "@langchain/core/messages";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { writeFileSync } from "node:fs";

// 1) Tools (web search via Tavily)
const agentTools = [new TavilySearchResults({ maxResults: 3 })];

// 2) LLM (explicit model is safest)
const agentModel = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });

// 3) Memory/checkpointing (lets you keep conversation state by thread_id)
const agentCheckpointer = new MemorySaver();

// 4) Assemble the agent
const agent = createReactAgent({
  llm: agentModel,
  tools: agentTools,
  checkpointSaver: agentCheckpointer,
});

// 5) Use it!
const thread = { configurable: { thread_id: "42" } };

const run1 = await agent.invoke(
  { messages: [new HumanMessage("what is the current weather in sf")] },
  thread
);
console.log("\nSF answer:\n", run1.messages.at(-1)?.content);

const run2 = await agent.invoke(
  { messages: [new HumanMessage("what about ny")] },
  thread
);
console.log("\nNY answer:\n", run2.messages.at(-1)?.content);


const graph = agent.getGraph();
const image = await graph.drawMermaidPng();
const buf = Buffer.from(await image.arrayBuffer());
writeFileSync("./graph.png", buf);
console.log("Graph saved to graph.png");

Run the project using the npm run dev command in your terminal.



After the run, you should be able to see a graph.png file in your project folder, which shows the graph of the agent.



Thank you for reading!




## Clone the repository using

git clone https://github.com/BharathanBtech/langraph-tavily-search.git

## Install the dependencies using

npm install

## Run the project using

npm run dev
