Add streaming #61
Conversation
Force-pushed from 7118f64 to 9af3296
@pamelafox this is ready and working, but I want to experiment with their approach here
https://github.com/microsoft/ai-chat-protocol/blob/main/samples/frontend/js/react/src/Chat.tsx
instead of setTimeout. If I have time today I will look into it.
I separated the functions into run and run_stream, as I can't yield and return something from the same function. I had to do a lot of refactoring to avoid repeating code between the two functions.
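For context, a minimal sketch of why the split is needed (the class and helper names here are illustrative, not the actual ones in this PR): in Python, any function containing yield becomes a generator, so it can't also return a plain response object, which forces the non-streaming and streaming paths into separate methods that share refactored helpers.

```python
from collections.abc import AsyncGenerator


class ChatApproach:  # hypothetical class name, for illustration only
    async def _build_prompt(self, question: str) -> str:
        # Shared helper so run() and run_stream() don't duplicate logic.
        return f"Answer the question: {question}"

    async def run(self, question: str) -> dict:
        # Non-streaming path: returns one complete response object.
        prompt = await self._build_prompt(question)
        return {"answer": f"(full answer for: {prompt})"}

    async def run_stream(self, question: str) -> AsyncGenerator[dict, None]:
        # Streaming path: the presence of `yield` makes this an async
        # generator, so it cannot also `return` a value the way run() does.
        prompt = await self._build_prompt(question)
        for chunk in ["(partial ", "answer for: ", prompt, ")"]:
            yield {"delta": chunk}
```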
@john0isaac I believe we need setTimeout for the desired effect; without setTimeout, React flushes too much at once. I've recommended that Gerardo (the author of that sample) show setTimeout for streaming, but let me know if you see otherwise; that was my experience.
Force-pushed from aa43d90 to 7bd37b9
Co-authored-by: Pamela Fox <pamela.fox@gmail.com>
Force-pushed from b14e86c to f5a2733
It works without setTimeout, but in the frontend it looks like the text is being dumped chunk by chunk rather than streamed. I think the delay in updating the answers makes it more readable and smoother, so I won't change it. setTimeout is technically not needed, but I will leave it in for aesthetics.
Done. What's left is whether you have any suggestions for the SQLAlchemy error; as I said, it's fixed, but I'm open to alternative approaches.
@john0isaac I did some digging and found this thread:
fastapi/fastapi#11321
I refactored so that there's a "prepare_context" method as well as "answer"/"answer_stream" methods, so we call "prepare_context" first and then pass "answer_stream" to StreamingResponse.
I think that ends up being a pretty nice interface, actually.
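For reference, a minimal sketch of that interface, assuming a FastAPI route and an approach class with the method names mentioned above (the class name, signatures, and payload shapes here are assumptions; the actual code in the repo may differ):

```python
import json
from collections.abc import AsyncGenerator

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


class ChatApproach:  # hypothetical class name, for illustration only
    async def prepare_context(self, messages: list[dict]) -> dict:
        # Do retrieval / prompt building eagerly, inside the request scope
        # (e.g. while a SQLAlchemy session is still open), before streaming.
        return {"messages": messages}

    async def answer_stream(self, context: dict) -> AsyncGenerator[str, None]:
        # Yield newline-delimited JSON chunks as the model produces tokens.
        for token in ["Hello", ", ", "world"]:  # placeholder for the LLM stream
            yield json.dumps({"delta": {"content": token}}) + "\n"


@app.post("/chat/stream")
async def chat_stream(body: dict):
    approach = ChatApproach()
    # prepare_context runs to completion inside the route handler...
    context = await approach.prepare_context(body.get("messages", []))
    # ...and only the generator is handed off to StreamingResponse.
    return StreamingResponse(
        approach.answer_stream(context), media_type="application/x-ndjson"
    )
```

The point of the pattern is that work needing request-scoped resources (such as a database session) happens before the response is returned, and StreamingResponse only consumes the part that yields chunks.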
LGTM
Purpose
fix #27
Does this introduce a breaking change?
When developers merge from main and run the server, azd up, or azd deploy, will this produce an error?
If you're not sure, try it out on an old environment.
Type of change
Code quality checklist
See CONTRIBUTING.md for more details.
Tests pass (python -m pytest).
Ran python -m pytest --cov to verify 100% coverage of added lines.
Ran python -m mypy to check for type errors.
Ran ruff manually on my code.