Path: blob/main/examples/reference/chat/PanelCallbackHandler.ipynb
The LangChain PanelCallbackHandler is useful for rendering and streaming the chain of thought from LangChain objects like Tools, Agents, and Chains. It inherits from LangChain's BaseCallbackHandler.
Check out the panel-chat-examples docs to see more examples of how to use PanelCallbackHandler. If you have an example to demo, we'd love to add it to the panel-chat-examples gallery!
Parameters:
Core
instance (ChatFeed | ChatInterface): The ChatFeed or ChatInterface instance to render or stream to.
user (str): Name of the user who sent the message.
avatar (str | BinaryIO): The avatar to use for the user. Can be a single character of text, an emoji, or anything supported by pn.pane.Image. If not set, uses the first character of the name.
Basics
To get started:
1. Pass the instance of a ChatFeed or ChatInterface to PanelCallbackHandler.
2. Pass the callback_handler as a list into callbacks when constructing or using LangChain objects.
This example shows the response from the LLM only. An LLM by itself does not show any chain of thought. Later we will build an agent that uses tools, which will show the chain of thought.
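A minimal sketch of these two steps, assuming LangChain's OpenAI LLM and an OPENAI_API_KEY set in the environment (the model and prompt handling are illustrative only):

```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    # 1. Pass the ChatInterface instance to PanelCallbackHandler
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    # 2. Pass the callback_handler as a list into callbacks
    llm = OpenAI(temperature=0, callbacks=[callback_handler])
    # The handler renders the LLM's response to the chat instance,
    # so no explicit return is needed
    llm.predict(contents)

pn.chat.ChatInterface(callback=callback).servable()
```

Save this as a script and serve it with panel serve to try it out.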
Async
An async callback can also be used. This will make your app more responsive and enable it to support more concurrent users.
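For example, a sketch of the same app with an async callback (same OpenAI assumptions as above; apredict is LangChain's async counterpart of predict):

```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    llm = OpenAI(temperature=0, callbacks=[callback_handler])
    # Await the async prediction so the event loop stays responsive
    await llm.apredict(contents)

pn.chat.ChatInterface(callback=callback).servable()
```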
Streaming
To stream tokens from the LLM, simply set streaming=True when constructing the LLM. Note that async is not required to stream, but it is more efficient.
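A sketch of the streaming variant, under the same assumptions as above; the handler updates the message as tokens arrive:

```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    # streaming=True emits tokens one by one to the callback handler
    llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
    await llm.apredict(contents)

pn.chat.ChatInterface(callback=callback).servable()
```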
Agents with Tools
Tools can also be used, and their chain of thought will be rendered and streamed by passing the callback_handler to callbacks.
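As an illustration, here is a sketch of an agent with a search tool. It assumes the duckduckgo-search package is installed for the ddg-search tool, plus the OpenAI setup from the earlier examples:

```python
import panel as pn
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    llm = OpenAI(temperature=0)
    tools = load_tools(["ddg-search"], llm=llm)
    # Passing the handler to the agent streams each tool call and thought
    agent = initialize_agent(
        tools,
        llm,
        agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
        callbacks=[callback_handler],
    )
    agent.run(contents)

pn.chat.ChatInterface(callback=callback).servable()
```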
Chains with Retrievers
Again, async is not required, but it is more efficient. Please note that we use a workaround because retrievers currently do not call any CallbackHandlers; see LangChain issue #7290.
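A sketch of a retrieval chain, assuming OpenAIEmbeddings, a Chroma vector store built from an illustrative two-document corpus, and the chromadb package installed. Because the retriever does not trigger the handler, only the LLM portion of the chain is rendered and streamed:

```python
import panel as pn
from langchain.chains import RetrievalQA
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.vectorstores import Chroma

pn.extension()

# Illustrative corpus; replace with your own documents
texts = [
    "Panel is a powerful data app framework for Python.",
    "LangChain composes LLMs, tools, and retrievers into chains.",
]
db = Chroma.from_texts(texts, OpenAIEmbeddings())
retriever = db.as_retriever()

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
    chain = RetrievalQA.from_chain_type(
        llm, retriever=retriever, callbacks=[callback_handler]
    )
    # The retriever itself emits no callbacks (LangChain issue #7290),
    # so only the LLM's answer is rendered by the handler
    await chain.arun(contents)

pn.chat.ChatInterface(callback=callback).servable()
```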