Agent output streams #1116
-
I just saw that AutoGen is planning to add an output streams feature beyond console output.
microsoft/autogen#1290 (comment)
It would be great if crewAI provided a similar feature.
-
Interesting. I'm curious to hear more about the use cases so we can build something that works well for them. Does anything come to mind?
-
One use case would be to stream agent results to a web frontend.
Another would be truly distributed agents working together.
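For the frontend case, a minimal sketch of the kind of relay I have in mind (not a crewAI API; the producer coroutine just stands in for whatever hook eventually exposes streamed crew output, and it assumes a recent version of the websockets package):

```python
# Minimal sketch, not a crewAI API: relay text chunks to web clients over a
# WebSocket. fake_crew_producer stands in for whatever hook eventually
# exposes streamed crew output. Requires: pip install websockets (>= 10.1).
import asyncio
import websockets


async def fake_crew_producer(queue: asyncio.Queue) -> None:
    """Stand-in for a crew run that emits output incrementally."""
    for chunk in ["Researching topic...", "Drafting answer...", "Final answer: 42"]:
        await queue.put(chunk)
        await asyncio.sleep(0.5)
    await queue.put(None)  # sentinel: no more output


async def handler(websocket) -> None:
    """Forward each produced chunk to the connected client as it arrives."""
    queue: asyncio.Queue = asyncio.Queue()
    producer = asyncio.create_task(fake_crew_producer(queue))
    while (chunk := await queue.get()) is not None:
        await websocket.send(chunk)
    await producer


async def main() -> None:
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.Future()  # serve until cancelled


if __name__ == "__main__":
    asyncio.run(main())
```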
-
I agree; I'd like to send answers via WebSockets to a web client.
-
I also want to be able to stream live output to a frontend.
Seems like it is also related to #146
-
I currently have a workaround: run crew.kickoff() in a subprocess, capture the terminal output, and stream the stdout. It's ugly and picks up unsupported characters, but at least it streams.
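Roughly this shape (a sketch from memory rather than my actual code; run_crew.py is a placeholder for a script that just builds the crew and calls crew.kickoff()):

```python
# Sketch of the subprocess workaround, not an official crewAI feature.
# run_crew.py is a placeholder script that builds the crew and calls
# crew.kickoff() with verbose output enabled.
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, "-u", "run_crew.py"],  # -u: unbuffered, so lines arrive as they are printed
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,  # fold any stderr logging into the same stream
    encoding="utf-8",
    errors="replace",  # avoid crashes on the "unsupported characters" mentioned above
)

# Stream the crew's console output line by line; in a real app, push each
# line to a WebSocket / SSE connection instead of printing it.
for line in proc.stdout:
    print(line, end="")

proc.wait()
```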
-
> I currently have a workaround: run crew.kickoff() in a subprocess, capture the terminal output, and stream the stdout.

Please share the code. I'm currently doing it like this with verbose=2 and it sucks:

```python
from io import StringIO  # Python 3
import sys

# Create the in-memory "file"
temp_out = StringIO()

# Replace default stdout (terminal) with our stream
sys.stdout = temp_out

print("This is going into the memory stream")
```

Sometimes I see the agents' questions to one another in the console; I'd really like to capture those too. They write pretty high-quality prompts to each other, and gathering these would be 100% useful.
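What I'd really like is something that streams lines as they are printed instead of buffering everything in memory. Roughly this direction (sketch only; the crew run is faked with plain print calls):

```python
# Sketch only: stream whatever the crew prints, line by line, instead of
# buffering it all in a StringIO. The crew run is faked with plain print
# calls; a real crew.kickoff() inside run_crew() would be captured the same way.
import queue
import sys
import threading


class QueueWriter:
    """File-like object that pushes completed lines onto a queue."""

    def __init__(self, line_queue: queue.Queue) -> None:
        self._queue = line_queue
        self._buffer = ""

    def write(self, text: str) -> int:
        self._buffer += text
        while "\n" in self._buffer:
            line, self._buffer = self._buffer.split("\n", 1)
            self._queue.put(line)
        return len(text)

    def flush(self) -> None:  # called by print(); nothing to buffer here
        pass


def run_crew() -> None:
    # Placeholder for crew.kickoff(); its console output would be captured too.
    print("Agent 1: drafting a question for Agent 2...")
    print("Agent 2: answering...")


lines: queue.Queue = queue.Queue()
original_stdout = sys.stdout
sys.stdout = QueueWriter(lines)

worker = threading.Thread(target=run_crew)
worker.start()

# Consume lines as they arrive; in a real app, forward them to a WebSocket.
while worker.is_alive() or not lines.empty():
    try:
        original_stdout.write(lines.get(timeout=0.1) + "\n")
    except queue.Empty:
        pass

sys.stdout = original_stdout
worker.join()
```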
-
Capturing stdout would be a dream: capturing the chain-of-thought reasoning for audit logs and such.
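Even a standard-library capture of everything the crew prints would go a long way for audit logs. A rough sketch (not a crewAI feature; crew.kickoff() is represented by a print call here):

```python
# Sketch only: keep a timestamped audit log of everything the crew prints,
# using only the standard library.
import contextlib
from datetime import datetime
from pathlib import Path

log_path = Path(f"crew_audit_{datetime.now():%Y%m%d_%H%M%S}.log")

with log_path.open("w", encoding="utf-8") as log_file:
    with contextlib.redirect_stdout(log_file):
        # crew.kickoff() would go here; print() stands in for its console output
        print("Agent reasoning and tool calls end up in the audit log.")

print(f"Audit log written to {log_path}")
```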
-
Any news on this?
-
If CrewAI could add streaming outputs, it would be a breakthrough.
-
I genuinely don’t want to use this:
```python
import asyncio
from typing import AsyncGenerator


async def stream(text: str, chunk_size: int = 4, sleep: float = 0.03) -> AsyncGenerator[str, None]:
    """Generator function to yield chunks of the final answer."""
    for i in range(0, len(text), chunk_size):
        await asyncio.sleep(sleep)  # non-blocking pause between chunks
        yield text[i : i + chunk_size]


async def main() -> None:
    text = (
        "This is a test message. This is a piece of text that will be streamed. "
        "This is the final message."
    )
    async for chunk in stream(text):
        print(chunk, end="", flush=True)


asyncio.run(main())
```
-
+1 on this feature.
-
Any updates?
-
Yeah, I switched to PydanticAI.
-
+1