async model stream interface #306
Conversation
Let's make sure this does not go out today if we're releasing. I'd like to bug bash on this a bit before it ships.
Hey, I tried using this WIP PR, but found I couldn't run agents concurrently. Maybe it's because the underlying Bedrock client is still synchronous?
@pgrayy this might be something to explore/file an issue for. We've seen something similar before: strands-agents/tools#91 (comment). Ultimately, this might accelerate the usage of the experimental client.
Yes, this is correct. Bedrock does have an async client in the works though (source). We will likely work out a process to give customers an option to opt in to its use while we wait for a 1.0 release. In the meantime, we will update our docs to identify current limitations with async. I will note though that for Strands 1.0, some model providers will be converted to full async. For example, the OpenAIModel provider will be updated to use AsyncOpenAI.
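As a rough sketch of what an AsyncOpenAI-based provider stream could look like (the class and method names here are illustrative assumptions, not the actual Strands implementation):

```python
# Hedged sketch: streaming text deltas with the AsyncOpenAI client.
# ExampleOpenAIModel and its stream method are hypothetical names, not
# the real Strands provider.
from openai import AsyncOpenAI


class ExampleOpenAIModel:
    def __init__(self, model_id: str = "gpt-4o") -> None:
        self.client = AsyncOpenAI()
        self.model_id = model_id

    async def stream(self, messages: list[dict]):
        # Request a streaming chat completion and yield text deltas as
        # they arrive, instead of blocking on the full response.
        response = await self.client.chat.completions.create(
            model=self.model_id,
            messages=messages,
            stream=True,
        )
        async for chunk in response:
            delta = chunk.choices[0].delta.content
            if delta:
                yield delta
```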
Ah okay thanks, didn't know about the experimental async client. Providing that option would be great! For context, my team uses Strands heavily and needs offline concurrent utilities for evaluation:
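A minimal, hypothetical sketch of that kind of utility (not the commenter's original snippet), assuming the agent exposes an async streaming method such as a `stream_async` that yields event dicts:

```python
# Hypothetical sketch of an offline concurrent evaluation utility.
# The agent.stream_async call and its event shape are assumptions about
# the async interface this PR works toward.
import asyncio


async def evaluate_prompt(agent, prompt: str) -> str:
    # Consume the async event stream and accumulate text deltas.
    text = ""
    async for event in agent.stream_async(prompt):
        text += event.get("data", "")
    return text


async def evaluate_all(agent, prompts: list[str]) -> list[str]:
    # Fan out all prompts concurrently instead of looping sequentially.
    return await asyncio.gather(*(evaluate_prompt(agent, p) for p in prompts))
```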
Description
We are currently working on support for an iterative async stream method on the agent class (#83). As part of this work, we need to convert any component that yields model events into an async generator.
NOTE: There is still follow-up work to transition to using async versions of the underlying clients for each model provider. For example, the OpenAIModel provider will transition to use the AsyncOpenAI client.
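For illustration, the sync-to-async generator conversion described above looks roughly like this (class, method, and event names are placeholders, not the actual Strands interfaces):

```python
# Illustrative only: a simplified provider showing a model event stream
# exposed as an async generator. Names are placeholders.
import asyncio
from typing import Any, AsyncIterator, Iterator


class ExampleModel:
    def _client_stream(self, request: Any) -> Iterator[dict]:
        # Stand-in for the (still synchronous) underlying client call.
        yield {"chunk": "hello "}
        yield {"chunk": "world"}

    async def stream(self, request: Any) -> AsyncIterator[dict]:
        # Async generator: callers consume model events with `async for`.
        # The underlying client stays synchronous here, matching the note
        # above; swapping in async clients (e.g. AsyncOpenAI) is follow-up.
        for event in self._client_stream(request):
            yield event


async def main() -> None:
    async for event in ExampleModel().stream({"prompt": "hi"}):
        print(event)


if __name__ == "__main__":
    asyncio.run(main())
```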
Related Issues
#83
Type of Change
Testing
How have you tested the change?
- Ran `hatch run prepare`
- Verified that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli
Checklist
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.