How to store 'artifacts' with tool responses? #2034

Open
@thisisarko

Description

Question

I understand tool call results are sent as ToolReturnPart objects.

In the docs, under Function Tool Output:

Some models (e.g. Gemini) natively support semi-structured return values, while some expect text (OpenAI) but seem to be just as good at extracting meaning from the data. If a Python object is returned and the model expects a string, the value will be serialized to JSON.
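To make the quoted behavior concrete, here is a minimal sketch of what "serialized to JSON" means for a tool that returns a Python object (the `WeatherResult` type is hypothetical, just an illustration):

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class WeatherResult:
    """Hypothetical structured tool result."""
    city: str
    temp_c: float


# If the model expects text, an object returned from a tool
# ends up as a JSON string in the model request:
result = WeatherResult(city="Paris", temp_c=21.0)
payload = json.dumps(asdict(result))
```

The point of the question is that once this serialization happens, only the string survives in the request; the structured object itself is not retained anywhere the application can reference it.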

However, I want to attach some metadata to the tool response that I can reference later. An example use case is sending a str-formatted tool response back to the model while retaining the structured output in the message history.

LangChain handles this by letting a tool return a (content, artifact) tuple (via response_format="content_and_artifact"): the content is what gets sent to the model, while the artifact is retained on the resulting ToolMessage.
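The pattern can be mimicked in a few lines of plain Python. This is not LangChain's actual implementation, only a toy stand-in for its content-and-artifact convention, with all names hypothetical:

```python
from functools import wraps


def content_and_artifact(fn):
    """Toy stand-in for LangChain's response_format="content_and_artifact":
    the wrapped tool returns a (content, artifact) tuple, and the wrapper
    packages them so only `content` is forwarded to the model while
    `artifact` stays available to the application."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        content, artifact = fn(*args, **kwargs)
        return {"content": content, "artifact": artifact}
    return wrapper


@content_and_artifact
def lookup(city: str):
    data = {"city": city, "temp_c": 21}  # structured artifact to retain
    return f"It is {data['temp_c']}°C in {city}.", data


result = lookup("Paris")
# result["content"] is the model-facing string;
# result["artifact"] is the structured payload kept in the messages.
```

The question is whether pydantic-ai's ToolReturnPart offers an equivalent place to stash the artifact.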

Is there a way to achieve this currently?

Additional Context

No response

Labels

Feature request, question
