models: Add support for DeepSeek models #428
Conversation
## Features

- **New DeepSeek Model Provider**: Full implementation with streaming and structured output support
- **OpenAI-compatible API**: Uses the OpenAI client for seamless integration with DeepSeek endpoints
- **Reasoning Model Support**: Handles DeepSeek-specific features like reasoning content
- **Beta Features**: Support for the beta endpoint and advanced DeepSeek capabilities

## Changes

- `src/strands/models/deepseek.py`: New DeepSeek model provider implementation
- `src/strands/models/__init__.py`: Export the `DeepSeekModel` class
- `tests_integ/models/test_model_deepseek.py`: Comprehensive integration tests (7 test cases)
- `tests_integ/models/providers.py`: Add DeepSeek to the provider configuration
- `README.md`: Update documentation with DeepSeek examples and the provider list

## Usage

```python
from strands.models.deepseek import DeepSeekModel

model = DeepSeekModel(api_key="your-key", model_id="deepseek-chat")
```

## Testing

- ✅ All 7 integration tests passing
- ✅ Basic conversation, structured output, streaming, tool usage
- ✅ Configuration updates and async operations
Hi @veeragoni,

@pgrayy Processing of the agent `OpenAIModel`:

- OpenAI chat create API reference
- DeepSeek chat create API reference

Solution

However, I have found that many people need to call DeepSeek, so I think it would be better to add a DeepSeek provider.
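DeepSeek's API is OpenAI-compatible, which is why the provider can reuse the OpenAI client by pointing it at DeepSeek's base URL. A minimal sketch of what such a chat-completion request looks like at the wire level (the payload shape follows the OpenAI chat completions format; the request is built but not sent):

```python
import json

# Sketch: a DeepSeek chat-completion request at the HTTP level. DeepSeek
# exposes an OpenAI-compatible endpoint, so this is the same body shape
# the OpenAI client would POST to {BASE_URL}/chat/completions.
BASE_URL = "https://api.deepseek.com"

payload = {
    "model": "deepseek-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```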
Hi @veeragoni! That said, we absolutely want to support your integration. We'd recommend you publish the model provider implementation as a standalone package on PyPI. That way, customers can easily install it and use it with our SDK. Here's a great example of how this works: https://pypi.org/project/strands-nvidia-nim/

We'd be happy to feature your implementation on our documentation page as a supported model provider, giving you visibility with our community. We're excited to see a DeepSeek model provider driven by the community!
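For the standalone-package route suggested above, a hedged sketch of what the packaging metadata might look like (the package name `strands-deepseek` and the dependency list are illustrative, not from this PR or an existing package):

```toml
# Hypothetical pyproject.toml for shipping the provider as its own PyPI
# package; name and dependencies are assumptions for illustration only.
[project]
name = "strands-deepseek"
version = "0.1.0"
description = "DeepSeek model provider for the Strands Agents SDK"
requires-python = ">=3.10"
dependencies = [
    "strands-agents",
    "openai",
]
```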



## Features

## Changes

- `src/strands/models/deepseek.py`: New DeepSeek model provider implementation with proper tool use streaming
- `src/strands/models/__init__.py`: Export the `DeepSeekModel` class
- `tests_integ/models/test_model_deepseek.py`: Comprehensive integration tests (8 test cases)
- `tests_integ/models/ds_test.py`: Demonstration script showing tool use capabilities
- `README.md`: Update documentation with DeepSeek examples and the provider list

## Usage
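A minimal usage sketch, mirroring the snippet from the earlier revision of this description (the `api_key` value is a placeholder):

```python
from strands.models.deepseek import DeepSeekModel

# "deepseek-chat" is DeepSeek's general-purpose chat model id.
model = DeepSeekModel(api_key="your-key", model_id="deepseek-chat")
```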
## Testing

- ✅ All 8 integration tests passing
- ✅ Basic conversation, structured output, streaming, tool usage
- ✅ Multi-tool workflows (calculator, file_read, shell)
- ✅ Configuration updates and async operations
- ✅ Tool use streaming matches the OpenAI/Ollama implementations
- ✅ All mypy type checks passing
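"Tool use streaming matches the OpenAI/Ollama implementations" refers to how OpenAI-style streams deliver a tool call as incremental deltas that the client must accumulate: a first delta carries the call id and function name, and later deltas each append a fragment of the JSON arguments string. A stdlib-only sketch of that accumulation (the sample chunks are illustrative, not real stream output):

```python
import json

# Sketch: accumulating OpenAI-style streamed tool-call deltas into one
# complete call. Sample deltas below are illustrative.
chunks = [
    {"index": 0, "id": "call_1", "function": {"name": "calculator", "arguments": ""}},
    {"index": 0, "function": {"arguments": '{"expression":'}},
    {"index": 0, "function": {"arguments": ' "2 + 2"}'}},
]

calls: dict[int, dict] = {}
for delta in chunks:
    call = calls.setdefault(delta["index"], {"id": None, "name": None, "arguments": ""})
    if delta.get("id"):
        call["id"] = delta["id"]
    fn = delta.get("function", {})
    if fn.get("name"):
        call["name"] = fn["name"]
    call["arguments"] += fn.get("arguments", "")

# Only once the stream ends is the arguments string complete JSON.
args = json.loads(calls[0]["arguments"])
print(calls[0]["name"], args)  # calculator {'expression': '2 + 2'}
```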
## Documentation PR

strands-agents/docs#135

## Type of Change

New feature
## How have you tested the change?

Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli.

I ran `hatch run prepare`.

## Checklist
- [x] I have read the CONTRIBUTING document
- [x] I have added any necessary tests that prove my fix is effective or my feature works
- [x] I have updated the documentation accordingly
- [x] I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
- [x] My changes generate no new warnings
- [x] Any dependent changes have been merged and published
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.