📦 Deployment environment
Docker
📦 Deployment mode
Server-side mode (lobe-chat-database image)
📌 Software version
v1.81.4
💻 System environment
Windows
🌐 Browser
Chrome
🐛 Problem description
When using the o4-mini model through the OpenAI provider, every request fails with the error below (a minimal reproduction sketch follows the payload):
{
  "error": {
    "code": "unsupported_parameter",
    "type": "invalid_request_error",
    "param": "top_p",
    "message": "Provider API error: Unsupported parameter: 'top_p' is not supported with this model. (request id: 202504192113153382)"
  },
  "endpoint": "****/v1",
  "provider": "openai"
}
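
For context, a hedged minimal reproduction against an OpenAI-compatible endpoint; the base URL and API key below are placeholders, not the actual deployment values from this report. Sending top_p with o4-mini returns the error above, while omitting it succeeds.

// Minimal reproduction sketch; BASE_URL and API_KEY are hypothetical placeholders.
const BASE_URL = 'https://example-proxy.invalid/v1'; // hypothetical OpenAI-compatible proxy
const API_KEY = '<your-api-key>';

async function chat(body: Record<string, unknown>) {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  });
  return res.json();
}

async function main() {
  // Including top_p is rejected by the upstream Azure reasoning model with
  // "unsupported_parameter: 'top_p' is not supported with this model".
  console.log(await chat({ model: 'o4-mini', top_p: 1, messages: [{ role: 'user', content: 'hi' }] }));

  // Omitting the sampling parameter lets the same request go through.
  console.log(await chat({ model: 'o4-mini', messages: [{ role: 'user', content: 'hi' }] }));
}

main();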
📷 Steps to reproduce
I am using a third-party API that proxies requests to Azure OpenAI models. According to the Azure documentation linked below, reasoning models do not support sampling parameters such as top_p (see the sketch after the link).
https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/reasoning?tabs=python-secure%2Cpy#not-supported
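
For reference, a hedged sketch of how a client could strip these unsupported sampling parameters for reasoning models before forwarding a request. The prefix list, helper name, and parameter set are assumptions based on the linked documentation, not LobeChat's actual implementation.

// Sketch of a possible client-side workaround (not LobeChat's actual code):
// drop sampling parameters that Azure reasoning models reject before forwarding the payload.
// The prefix list is an assumption based on the linked Azure documentation.
const REASONING_MODEL_PREFIXES = ['o1', 'o3', 'o4'];

interface ChatPayload {
  model: string;
  top_p?: number;
  temperature?: number;
  presence_penalty?: number;
  frequency_penalty?: number;
  [key: string]: unknown;
}

function sanitizeForReasoningModels(payload: ChatPayload): ChatPayload {
  const isReasoning = REASONING_MODEL_PREFIXES.some((prefix) => payload.model.startsWith(prefix));
  if (!isReasoning) return payload;

  // Strip the parameters listed as unsupported for reasoning models.
  const { top_p, temperature, presence_penalty, frequency_penalty, ...rest } = payload;
  return rest;
}

// Example: top_p is removed for o4-mini but kept for gpt-4o.
console.log(sanitizeForReasoningModels({ model: 'o4-mini', top_p: 1 })); // { model: 'o4-mini' }
console.log(sanitizeForReasoningModels({ model: 'gpt-4o', top_p: 1 })); // { model: 'gpt-4o', top_p: 1 }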
🚦 Expected result
No response
📝 Additional information
No response