An LCEL Runnable. The Runnable's input must include `input`, and if there is chat history it should be passed as `chat_history`. The Runnable's output is a list of Documents.
```typescript
// yarn add langchain @langchain/openai
import type { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { pull } from "langchain/hub";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";

// Prompt that rephrases a follow-up question into a standalone search query.
const rephrasePrompt = await pull<ChatPromptTemplate>(
  "langchain-ai/chat-langchain-rephrase"
);
const llm = new ChatOpenAI({});
const retriever = ...; // your retriever, e.g. built from a vector store
const chain = await createHistoryAwareRetriever({
  llm,
  retriever,
  rephrasePrompt,
});
const result = await chain.invoke({ input: "...", chat_history: [] });
```
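Once the chain is built, it can be invoked with prior messages so that a follow-up question is rephrased into a standalone query before retrieval. The messages and question below are purely illustrative:

```typescript
import { AIMessage, HumanMessage } from "@langchain/core/messages";

// A follow-up question that only makes sense given the earlier exchange.
const docs = await chain.invoke({
  input: "How do I compose two of them together?",
  chat_history: [
    new HumanMessage("What is a Runnable in LangChain?"),
    new AIMessage(
      "A Runnable is a unit of work that can be invoked, batched, and streamed."
    ),
  ],
});
// `docs` is the list of Documents returned by the retriever.
```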
Create a chain that takes conversation history and returns documents. If there is no `chat_history`, the `input` is passed directly to the retriever. If there is `chat_history`, the prompt and LLM are used to generate a search query, and that search query is then passed to the retriever.
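Conceptually, the chain branches on whether `chat_history` is empty. The sketch below shows roughly equivalent wiring with `RunnableBranch`; it reuses `llm`, `retriever`, and `rephrasePrompt` from the example above and illustrates the behavior described here, not the library's exact implementation:

```typescript
import { RunnableBranch, RunnableSequence } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";

const historyAwareSketch = RunnableBranch.from([
  [
    // No chat history: forward the raw input string straight to the retriever.
    (input: { input: string; chat_history?: unknown[] }) =>
      !input.chat_history || input.chat_history.length === 0,
    RunnableSequence.from([(input: { input: string }) => input.input, retriever]),
  ],
  // Otherwise: rephrase with the prompt and LLM, parse the result to a plain
  // string, and pass that search query to the retriever.
  RunnableSequence.from([rephrasePrompt, llm, new StringOutputParser(), retriever]),
]);
```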