feat: add Llama API as model provider #11


Merged
merged 6 commits on May 16, 2025

Conversation

yanxi0830
Collaborator

@yanxi0830 yanxi0830 commented May 16, 2025

Description

  • add Llama API as a model provider
  • add llama-api-client as dependency
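For anyone trying the feature before a release, the new dependency can be installed directly. This is a setup sketch; the credential environment variable name is an assumption, so check the llama-api-client documentation:

```shell
# Install the client library this PR adds as a dependency
pip install llama-api-client

# Credential variable name is an assumption; see the llama-api-client docs
export LLAMA_API_KEY="..."
```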

Related Issues

  • updated README to include Llama API as a model provider

Type of Change

  • New feature

Testing

  • added tests
  • tested with the following script
```python
from rich.pretty import pprint

from strands import Agent, tool
from strands.models import LlamaAPIModel


def main():
    # Plain chat through the Llama API provider
    llama_model = LlamaAPIModel(
        model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
    )
    agent = Agent(model=llama_model)
    response = agent("Tell me about Agentic AI")

    print()
    pprint(response)


def main_with_get_weather_tool():
    @tool
    def get_weather(location: str) -> str:
        """Get the weather for a location.

        :param location: The city, state, or country for which to fetch the temperature.
        """
        return f"The weather in {location} is sunny."

    llama_model = LlamaAPIModel(
        model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
    )
    agent = Agent(model=llama_model, tools=[get_weather])
    response = agent("What is the weather in San Francisco?")

    print()
    pprint(response)


def main_with_word_count_tool():
    @tool
    def word_count(text: str) -> int:
        """Count the number of words in a text.

        :param text: The text to count the words of.
        """
        return len(text.split())

    llama_model = LlamaAPIModel(
        model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
    )
    agent = Agent(model=llama_model, tools=[word_count])
    response = agent("How many words are in this sentence?")

    print()
    pprint(response)


if __name__ == "__main__":
    # main()
    main_with_word_count_tool()
    # main_with_get_weather_tool()
```
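Since the `word_count` tool is pure Python, its logic can be sanity-checked without hitting the Llama API at all (a standalone sketch with no strands imports):

```python
def word_count(text: str) -> int:
    """Count whitespace-separated words in a text."""
    return len(text.split())


# The prompt used in the test script contains seven words
print(word_count("How many words are in this sentence?"))  # 7
```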
  • hatch fmt --linter
  • hatch fmt --formatter
  • hatch test --all
  • Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli

Checklist

  • I have read the CONTRIBUTING document
  • I have added tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@yanxi0830 yanxi0830 requested a review from a team as a code owner May 16, 2025 15:59
pgrayy
pgrayy previously approved these changes May 16, 2025
@yonib05 yonib05 enabled auto-merge (squash) May 16, 2025 16:58
auto-merge was automatically disabled May 16, 2025 17:13

Head branch was pushed to by a user without write access

@yanxi0830 yanxi0830 requested a review from pgrayy May 16, 2025 17:13
@yonib05 yonib05 enabled auto-merge (squash) May 16, 2025 17:24
@yonib05 yonib05 merged commit a99107f into strands-agents:main May 16, 2025
11 checks passed
satsumas added a commit to Stability-AI/strands-sdk-python that referenced this pull request Jun 23, 2025
…ementation-complete

feat(stability): response images can be returned as bytes