Allow OutputGuardrail to accept ChatRequestParameters #1463

Open

andreadimaio opened this issue May 7, 2025 · 6 comments

Comments

@andreadimaio
Collaborator

It might be useful to be able to specify different request parameters when something is wrong in the LLM response.

@geoand
Collaborator

geoand commented May 7, 2025

Can you give an example of what you have in mind?

@andreadimaio
Collaborator Author

For example, suppose I want to execute a reprompt, but with a different temperature. We could use ChatRequestParameters to override the default parameters.
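
A minimal sketch of what this might look like. The three-argument `reprompt` overload is the hypothetical part of the proposal; `OutputGuardrail`, `success()`, and the two-argument `reprompt` are the existing quarkus-langchain4j API, and `ChatRequestParameters` (with its builder) is the existing LangChain4j type:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.chat.request.ChatRequestParameters;
import io.quarkiverse.langchain4j.guardrails.OutputGuardrail;
import io.quarkiverse.langchain4j.guardrails.OutputGuardrailResult;

public class JsonGuardrail implements OutputGuardrail {

    @Override
    public OutputGuardrailResult validate(AiMessage responseFromLLM) {
        if (looksLikeJson(responseFromLLM.text())) {
            return success();
        }
        // Hypothetical overload: the extra ChatRequestParameters argument would
        // make the reprompted call run with a lower temperature than the default.
        ChatRequestParameters stricter = ChatRequestParameters.builder()
                .temperature(0.1)
                .build();
        return reprompt("Invalid JSON", "Respond with valid JSON only.", stricter);
    }

    private boolean looksLikeJson(String text) {
        // Real validation would parse the text; kept trivial for the sketch.
        return text != null && text.trim().startsWith("{");
    }
}
```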

@andreadimaio
Collaborator Author

ChatRequestParameters could also be added to the retry method.
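
Sketched as hypothetical additions to the interface (signatures only, mirroring the existing retry/reprompt helpers):

```java
import dev.langchain4j.model.chat.request.ChatRequestParameters;
import io.quarkiverse.langchain4j.guardrails.OutputGuardrailResult;

// Hypothetical overloads for OutputGuardrail: the ChatRequestParameters
// argument would override the AI service's default parameters for the
// retried or reprompted call only.
public interface OutputGuardrailProposal {

    OutputGuardrailResult retry(String message, ChatRequestParameters parameters);

    OutputGuardrailResult reprompt(String message, String reprompt, ChatRequestParameters parameters);
}
```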

@geoand
Collaborator

geoand commented May 7, 2025

Seems reasonable.

cc @cescoffier @mariofusco

@mariofusco
Contributor

Yes, probably doable, even though I'm not sure how many people would use this feature. I will give it a try.

@edeandrea
Collaborator

Remember too that the guardrail functionality is moving upstream to LangChain4j (see langchain4j/langchain4j#2571). Maybe it would be better to hold off on this capability until that has been merged and #1284 is complete?
