Is it possible to use a local model? #1130
-
Hi, is it possible to run https://github.com/deepseek-ai/DeepSeek-R1 or Ollama with this plugin?
Answered by Jax-Tsai-zero on Jan 30, 2025
-
Yes, I ran it with LM Studio locally and it works.
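
For anyone trying this route: a minimal sketch of how to talk to a model served locally by LM Studio, assuming its local server is running on the default OpenAI-compatible endpoint (`http://localhost:1234/v1`). The model name below is hypothetical; use whichever model you have loaded in LM Studio.

```python
# Minimal sketch: query a model served locally by LM Studio.
# Assumes LM Studio's local server is running at its default address
# (http://localhost:1234/v1) and exposes the OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default local server
    api_key="lm-studio",  # any non-empty string; the local server ignores it
)

response = client.chat.completions.create(
    model="deepseek-r1-distill-qwen-7b",  # hypothetical name: use the model you loaded
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```

A plugin that lets you override the API base URL can usually be pointed at the same local address.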
-
Yes, Ollama is supported; an example is available here.
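
The same pattern works against a local Ollama server, which also exposes an OpenAI-compatible endpoint. A sketch, assuming `ollama serve` is running and you have pulled a model (for example `ollama pull deepseek-r1`; tag availability may vary):

```python
# Minimal sketch: the same chat call against a local Ollama server.
# Assumes Ollama is running at its default address and the model
# has already been pulled (e.g. `ollama pull deepseek-r1`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # placeholder; Ollama does not check the key
)

response = client.chat.completions.create(
    model="deepseek-r1",  # use whichever model tag you pulled
    messages=[{"role": "user", "content": "Hello from Ollama!"}],
)
print(response.choices[0].message.content)
```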
Answer selected by fiqryq