forked from langchain-ai/langchain
Showing 16 changed files with 2,224 additions and 58 deletions.
Original file line number | Diff line number | Diff line change |
---|---|---|
@@ -0,0 +1,203 @@ | ||
{ | ||
"cells": [ | ||
{ | ||
"cell_type": "markdown", | ||
"id": "e389175d-8a65-4f0d-891c-dbdfabb3c3ef", | ||
"metadata": {}, | ||
"source": [ | ||
"# How to filter messages\n", | ||
"\n", | ||
"In more complex chains and agents we might track state with a list of messages. This list can start to accumulate messages from multiple different models, speakers, sub-chains, etc., and we may only want to pass subsets of this full list of messages to each model call in the chain/agent.\n", | ||
"\n", | ||
"The `filter_messages` utility makes it easy to filter messages by type, id, or name.\n", | ||
"\n", | ||
"## Basic usage" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 1, | ||
"id": "f4ad2fd3-3cab-40d4-a989-972115865b8b", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"[HumanMessage(content='example input', name='example_user', id='2'),\n", | ||
" HumanMessage(content='real input', name='bob', id='4')]" | ||
] | ||
}, | ||
"execution_count": 1, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"from langchain_core.messages import (\n", | ||
" AIMessage,\n", | ||
" HumanMessage,\n", | ||
" SystemMessage,\n", | ||
" filter_messages,\n", | ||
")\n", | ||
"\n", | ||
"messages = [\n", | ||
" SystemMessage(\"you are a good assistant\", id=\"1\"),\n", | ||
" HumanMessage(\"example input\", id=\"2\", name=\"example_user\"),\n", | ||
" AIMessage(\"example output\", id=\"3\", name=\"example_assistant\"),\n", | ||
" HumanMessage(\"real input\", id=\"4\", name=\"bob\"),\n", | ||
" AIMessage(\"real output\", id=\"5\", name=\"alice\"),\n", | ||
"]\n", | ||
"\n", | ||
"filter_messages(messages, include_types=\"human\")" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 2, | ||
"id": "7b663a1e-a8ae-453e-a072-8dd75dfab460", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"[SystemMessage(content='you are a good assistant', id='1'),\n", | ||
" HumanMessage(content='real input', name='bob', id='4'),\n", | ||
" AIMessage(content='real output', name='alice', id='5')]" | ||
] | ||
}, | ||
"execution_count": 2, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"filter_messages(messages, exclude_names=[\"example_user\", \"example_assistant\"])" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 3, | ||
"id": "db170e46-03f8-4710-b967-23c70c3ac054", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"[HumanMessage(content='example input', name='example_user', id='2'),\n", | ||
" HumanMessage(content='real input', name='bob', id='4'),\n", | ||
" AIMessage(content='real output', name='alice', id='5')]" | ||
] | ||
}, | ||
"execution_count": 3, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"filter_messages(messages, include_types=[HumanMessage, AIMessage], exclude_ids=[\"3\"])" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "b7c4e5ad-d1b4-4c18-b250-864adde8f0dd", | ||
"metadata": {}, | ||
"source": [ | ||
"## Chaining\n", | ||
"\n", | ||
"`filter_messages` can be used imperatively (like above) or declaratively, making it easy to compose with other components in a chain:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 4, | ||
"id": "675f8f79-db39-401c-a582-1df2478cba30", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"AIMessage(content=[], response_metadata={'id': 'msg_01Wz7gBHahAwkZ1KCBNtXmwA', 'model': 'claude-3-sonnet-20240229', 'stop_reason': 'end_turn', 'stop_sequence': None, 'usage': {'input_tokens': 16, 'output_tokens': 3}}, id='run-b5d8a3fe-004f-4502-a071-a6c025031827-0', usage_metadata={'input_tokens': 16, 'output_tokens': 3, 'total_tokens': 19})" | ||
] | ||
}, | ||
"execution_count": 4, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"# pip install -U langchain-anthropic\n", | ||
"from langchain_anthropic import ChatAnthropic\n", | ||
"\n", | ||
"llm = ChatAnthropic(model=\"claude-3-sonnet-20240229\", temperature=0)\n", | ||
"# Notice we don't pass in messages. This creates\n", | ||
"# a RunnableLambda that takes messages as input\n", | ||
"filter_ = filter_messages(exclude_names=[\"example_user\", \"example_assistant\"])\n", | ||
"chain = filter_ | llm\n", | ||
"chain.invoke(messages)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "4133ab28-f49c-480f-be92-b51eb6559153", | ||
"metadata": {}, | ||
"source": [ | ||
"Looking at the LangSmith trace, we can see that the messages are filtered before they are passed to the model: https://smith.langchain.com/public/f808a724-e072-438e-9991-657cc9e7e253/r\n", | ||
"\n", | ||
"Looking at `filter_` on its own, we can see that it's a Runnable that can be invoked like any other Runnable:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 6, | ||
"id": "c090116a-1fef-43f6-a178-7265dff9db00", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"[HumanMessage(content='real input', name='bob', id='4'),\n", | ||
" AIMessage(content='real output', name='alice', id='5')]" | ||
] | ||
}, | ||
"execution_count": 6, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"filter_.invoke(messages)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "ff339066-d424-4042-8cca-cd4b007c1a8e", | ||
"metadata": {}, | ||
"source": [ | ||
"## API reference\n", | ||
"\n", | ||
"For a complete description of all arguments head to the API reference: https://api.python.langchain.com/en/latest/messages/langchain_core.messages.utils.filter_messages.html" | ||
] | ||
} | ||
], | ||
"metadata": { | ||
"kernelspec": { | ||
"display_name": "poetry-venv-2", | ||
"language": "python", | ||
"name": "poetry-venv-2" | ||
}, | ||
"language_info": { | ||
"codemirror_mode": { | ||
"name": "ipython", | ||
"version": 3 | ||
}, | ||
"file_extension": ".py", | ||
"mimetype": "text/x-python", | ||
"name": "python", | ||
"nbconvert_exporter": "python", | ||
"pygments_lexer": "ipython3", | ||
"version": "3.9.1" | ||
} | ||
}, | ||
"nbformat": 4, | ||
"nbformat_minor": 5 | ||
} |
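To make the filtering behavior in the notebook above concrete without installing LangChain, here is a hypothetical, simplified sketch of the same logic. The `Message` dataclass and this `filter_messages` function are illustrative stand-ins and are NOT `langchain_core`'s actual implementation; the real utility accepts more parameters (e.g. `include_names`, `exclude_types`, `include_ids`) and also supports class-based type filters.

```python
# Hypothetical, simplified sketch of message filtering -- NOT
# langchain_core's actual source. Messages are kept or dropped
# by type, name, or id, mirroring the notebook's examples.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Message:
    type: str                     # e.g. "system", "human", "ai"
    content: str
    id: Optional[str] = None
    name: Optional[str] = None


def filter_messages(messages, *, include_types=None,
                    exclude_names=None, exclude_ids=None):
    """Keep messages whose type is included, then drop excluded names/ids."""
    out = []
    for m in messages:
        if include_types is not None and m.type not in include_types:
            continue
        if exclude_names is not None and m.name in exclude_names:
            continue
        if exclude_ids is not None and m.id in exclude_ids:
            continue
        out.append(m)
    return out


messages = [
    Message("system", "you are a good assistant", id="1"),
    Message("human", "example input", id="2", name="example_user"),
    Message("ai", "example output", id="3", name="example_assistant"),
    Message("human", "real input", id="4", name="bob"),
    Message("ai", "real output", id="5", name="alice"),
]

print([m.id for m in filter_messages(messages, include_types=["human"])])
# -> ['2', '4'], matching the notebook's first output
```

The key design point the sketch illustrates: include-filters narrow the set first, then exclude-filters remove individual messages from whatever survived.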
@@ -0,0 +1,170 @@ | ||
{ | ||
"cells": [ | ||
{ | ||
"cell_type": "markdown", | ||
"id": "ac47bfab-0f4f-42ce-8bb6-898ef22a0338", | ||
"metadata": {}, | ||
"source": [ | ||
"# How to merge consecutive messages of the same type\n", | ||
"\n", | ||
"Certain models do not support passing in consecutive messages of the same type (a.k.a. \"runs\" of the same message type).\n", | ||
"\n", | ||
"The `merge_message_runs` utility makes it easy to merge consecutive messages of the same type.\n", | ||
"\n", | ||
"## Basic usage" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 1, | ||
"id": "1a215bbb-c05c-40b0-a6fd-d94884d517df", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"name": "stdout", | ||
"output_type": "stream", | ||
"text": [ | ||
"SystemMessage(content=\"you're a good assistant.\\nyou always respond with a joke.\")\n", | ||
"\n", | ||
"HumanMessage(content=[{'type': 'text', 'text': \"i wonder why it's called langchain\"}, 'and who is harrison chasing anyways'])\n", | ||
"\n", | ||
"AIMessage(content='Well, I guess they thought \"WordRope\" and \"SentenceString\" just didn\\'t have the same ring to it!\\nWhy, he\\'s probably chasing after the last cup of coffee in the office!')\n" | ||
] | ||
} | ||
], | ||
"source": [ | ||
"from langchain_core.messages import (\n", | ||
" AIMessage,\n", | ||
" HumanMessage,\n", | ||
" SystemMessage,\n", | ||
" merge_message_runs,\n", | ||
")\n", | ||
"\n", | ||
"messages = [\n", | ||
" SystemMessage(\"you're a good assistant.\"),\n", | ||
" SystemMessage(\"you always respond with a joke.\"),\n", | ||
" HumanMessage([{\"type\": \"text\", \"text\": \"i wonder why it's called langchain\"}]),\n", | ||
" HumanMessage(\"and who is harrison chasing anyways\"),\n", | ||
" AIMessage(\n", | ||
" 'Well, I guess they thought \"WordRope\" and \"SentenceString\" just didn\\'t have the same ring to it!'\n", | ||
" ),\n", | ||
" AIMessage(\"Why, he's probably chasing after the last cup of coffee in the office!\"),\n", | ||
"]\n", | ||
"\n", | ||
"merged = merge_message_runs(messages)\n", | ||
"print(\"\\n\\n\".join([repr(x) for x in merged]))" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "0544c811-7112-4b76-8877-cc897407c738", | ||
"metadata": {}, | ||
"source": [ | ||
"Notice that if the content of one of the messages being merged is a list of content blocks, the merged message will have a list of content blocks. If both messages being merged have string contents, those are concatenated with a newline character." | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "1b2eee74-71c8-4168-b968-bca580c25d18", | ||
"metadata": {}, | ||
"source": [ | ||
"## Chaining\n", | ||
"\n", | ||
"`merge_message_runs` can be used imperatively (like above) or declaratively, making it easy to compose with other components in a chain:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 3, | ||
"id": "6d5a0283-11f8-435b-b27b-7b18f7693592", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"AIMessage(content=[], response_metadata={'id': 'msg_01D6R8Naum57q8qBau9vLBUX', 'model': 'claude-3-sonnet-20240229', 'stop_reason': 'end_turn', 'stop_sequence': None, 'usage': {'input_tokens': 84, 'output_tokens': 3}}, id='run-ac0c465b-b54f-4b8b-9295-e5951250d653-0', usage_metadata={'input_tokens': 84, 'output_tokens': 3, 'total_tokens': 87})" | ||
] | ||
}, | ||
"execution_count": 3, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"# pip install -U langchain-anthropic\n", | ||
"from langchain_anthropic import ChatAnthropic\n", | ||
"\n", | ||
"llm = ChatAnthropic(model=\"claude-3-sonnet-20240229\", temperature=0)\n", | ||
"# Notice we don't pass in messages. This creates\n", | ||
"# a RunnableLambda that takes messages as input\n", | ||
"merger = merge_message_runs()\n", | ||
"chain = merger | llm\n", | ||
"chain.invoke(messages)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "72e90dce-693c-4842-9526-ce6460fe956b", | ||
"metadata": {}, | ||
"source": [ | ||
"Looking at the LangSmith trace, we can see that the messages are merged before they are passed to the model: https://smith.langchain.com/public/ab558677-cac9-4c59-9066-1ecce5bcd87c/r\n", | ||
"\n", | ||
"Looking at `merger` on its own, we can see that it's a Runnable that can be invoked like any other Runnable:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 4, | ||
"id": "460817a6-c327-429d-958e-181a8c46059c", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"[SystemMessage(content=\"you're a good assistant.\\nyou always respond with a joke.\"),\n", | ||
" HumanMessage(content=[{'type': 'text', 'text': \"i wonder why it's called langchain\"}, 'and who is harrison chasing anyways']),\n", | ||
" AIMessage(content='Well, I guess they thought \"WordRope\" and \"SentenceString\" just didn\\'t have the same ring to it!\\nWhy, he\\'s probably chasing after the last cup of coffee in the office!')]" | ||
] | ||
}, | ||
"execution_count": 4, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"merger.invoke(messages)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "4548d916-ce21-4dc6-8f19-eedb8003ace6", | ||
"metadata": {}, | ||
"source": [ | ||
"## API reference\n", | ||
"\n", | ||
"For a complete description of all arguments head to the API reference: https://api.python.langchain.com/en/latest/messages/langchain_core.messages.utils.merge_message_runs.html" | ||
] | ||
} | ||
], | ||
"metadata": { | ||
"kernelspec": { | ||
"display_name": "poetry-venv-2", | ||
"language": "python", | ||
"name": "poetry-venv-2" | ||
}, | ||
"language_info": { | ||
"codemirror_mode": { | ||
"name": "ipython", | ||
"version": 3 | ||
}, | ||
"file_extension": ".py", | ||
"mimetype": "text/x-python", | ||
"name": "python", | ||
"nbconvert_exporter": "python", | ||
"pygments_lexer": "ipython3", | ||
"version": "3.9.1" | ||
} | ||
}, | ||
"nbformat": 4, | ||
"nbformat_minor": 5 | ||
} |
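The merging rules described in the notebook above can be sketched in plain Python. This is a hypothetical, simplified reimplementation for illustration, NOT `langchain_core`'s actual source: consecutive messages of the same type are collapsed, string contents are joined with a newline, and if either side holds a list of content blocks the merged content becomes a single list.

```python
# Hypothetical, simplified sketch of merging runs of same-type
# messages -- NOT langchain_core's actual source.
from dataclasses import dataclass
from typing import List, Union


@dataclass
class Message:
    type: str                       # e.g. "system", "human", "ai"
    content: Union[str, List]       # plain text or content blocks


def merge_message_runs(messages):
    """Collapse consecutive messages of the same type into one message."""
    merged = []
    for m in messages:
        if merged and merged[-1].type == m.type:
            prev = merged[-1]
            if isinstance(prev.content, str) and isinstance(m.content, str):
                # two strings: concatenate with a newline
                prev.content = prev.content + "\n" + m.content
            else:
                # at least one side is a list of blocks: merge into one list
                left = prev.content if isinstance(prev.content, list) else [prev.content]
                right = m.content if isinstance(m.content, list) else [m.content]
                prev.content = left + right
        else:
            # new run starts: append a copy so inputs are not mutated
            merged.append(Message(m.type, m.content))
    return merged


msgs = [
    Message("system", "you're a good assistant."),
    Message("system", "you always respond with a joke."),
    Message("human", [{"type": "text", "text": "i wonder why it's called langchain"}]),
    Message("human", "and who is harrison chasing anyways"),
]

for m in merge_message_runs(msgs):
    print(m)
```

Run on the four messages above, the sketch yields two messages: a single `system` message with newline-joined text, and a single `human` message whose content is a list combining the text block and the plain string, mirroring the notebook's output.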