diff --git a/core-jdk8/src/site/markdown/how-tos/stream-tokens.md b/core-jdk8/src/site/markdown/how-tos/stream-tokens.md
index e7df0fe..7b9f16e 100644
--- a/core-jdk8/src/site/markdown/how-tos/stream-tokens.md
+++ b/core-jdk8/src/site/markdown/how-tos/stream-tokens.md
@@ -5,16 +5,6 @@ agent. We will use a ReAct agent as an example.
 The tl;dr is to use
 [streamEvents](https://js.langchain.com/v0.2/docs/how_to/chat_streaming/#stream-events)
 ([API Ref](https://api.js.langchain.com/classes/langchain_core_runnables.Runnable.html#streamEvents)).
-
-
 This how-to guide closely follows the others in this directory, showing how to
 incorporate the functionality into a prototypical agent in LangGraph.
@@ -23,19 +13,9 @@
 This works for and all its subclasses, such as
 [MessageGraph](/langgraphjs/reference/classes/langgraph.MessageGraph.html).
-
+In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the `createReactAgent({ llm, tools })` [API doc](/langgraphjs/reference/functions/langgraph_prebuilt.createReactAgent.html) constructor. This may be more appropriate if you are used to LangChain's [AgentExecutor](https://js.langchain.com/v0.2/docs/how_to/agent_executor) class.

 ## Setup
@@ -121,12 +101,7 @@ Now load the [chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-mode
 [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),
 meaning it can return function arguments in its response.
-
+These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.

 ```typescript