[Q&A] Response streaming fails on self-hosted deployment #531
-
🥰 Feature Description
I would like the response from ChatGPT to be streamed, so that I don't have to wait for the answer to be generated completely.

🧐 Proposed Solution
The OpenAI API supports response streaming.

📝 Additional Information
No response
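For context, OpenAI's streaming mode delivers the answer as server-sent events (SSE): each chunk arrives as a `data: {...}` line and the stream ends with the sentinel `data: [DONE]`. A minimal sketch of parsing such a stream follows; the sample payload is illustrative, not a captured API response:

```python
import json

def parse_sse_stream(lines):
    """Yield content deltas from OpenAI-style SSE lines.

    Each event is a line of the form 'data: <json>'; the stream
    is terminated by the sentinel 'data: [DONE]'.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta is not None:
            yield delta

# Illustrative sample of the wire format:
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(parse_sse_stream(sample)))  # -> Hello
```

Because each delta is rendered as it arrives, anything in between (such as a buffering reverse proxy) that holds the whole body back defeats streaming, which is what the rest of this thread diagnoses.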
Replies: 21 comments 10 replies
-
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
-
Please provide more info about your deployment. By default, LobeChat supports streaming.
-
Right now, it is not working by default for me. Do I need to enable it somewhere in the settings?
-
No. It should work by default. Are you using a custom API proxy?
-
No, I am using the OpenAI API with the gpt-3.5-turbo model.
-
Screen.Recording.2023-11-30.at.14.30.43.mov
-
Are you using chat-preview.lobehub.com?
-
No, I deployed my own instance with Docker.
-
Same problem here. I noticed that with a local deployment (localhost), streaming works fine whether or not I go through a transparent proxy for a direct local connection. But once it is on a server, it stops working regardless of whether I use fastAPI or a transparent proxy. I suspect the issue may be caused by the Nginx reverse-proxy configuration?
-
If you are deploying with Nginx, please check your conf. Try with:
or
for example:
Or maybe you can try searching for "gpt api nginx stream" to find some solutions?
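The snippets in the reply above did not survive extraction. The directives usually suggested for streaming through Nginx disable response buffering; the following is a sketch under that assumption, not a verified excerpt of the original reply (the upstream address is taken from the full config shared later in this thread; adjust it to your deployment):

```nginx
location / {
    proxy_pass http://127.0.0.1:3210;  # assumed upstream; change as needed
    proxy_http_version 1.1;
    proxy_buffering off;               # pass chunks through as they arrive
    proxy_cache off;                   # don't cache the streamed response
    chunked_transfer_encoding on;      # allow chunked transfer to the client
}
```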
-
I tried it and it doesn't work. @arvinxx
-
Then I suggest:
-
Try this configuration:

```nginx
proxy_cache off;                # disable caching
proxy_buffering off;            # disable proxy buffering
chunked_transfer_encoding on;   # enable chunked transfer encoding
tcp_nopush on;                  # enable TCP_NOPUSH (send full packets)
tcp_nodelay on;                 # enable TCP_NODELAY (disable Nagle's algorithm)
keepalive_timeout 300;          # keep-alive timeout of 300 seconds
```
-
This works for me, thank you! Full configuration below:

```nginx
location ^~ / {
    proxy_pass http://127.0.0.1:3210;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header REMOTE-HOST $remote_addr;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_http_version 1.1;
    add_header Cache-Control no-cache;
    proxy_cache off;                # disable caching
    proxy_buffering off;            # disable proxy buffering
    chunked_transfer_encoding on;   # enable chunked transfer encoding
    tcp_nopush on;                  # enable TCP_NOPUSH (send full packets)
    tcp_nodelay on;                 # enable TCP_NODELAY (disable Nagle's algorithm)
    keepalive_timeout 300;          # keep-alive timeout of 300 seconds
}
```
-
Where do I find the reverse-proxy settings for Docker in the BT panel? These settings aren't in the Nginx app's configuration file, which is different from the reverse-proxy settings in Docker.
-
Tried this answer myself and it works!
-