local DeepSeek instance with Ollama
You can get two versions at the moment, a ~1gb download and a ~4gb one. The 4gb model is smarter but slower, and vice versa.
To install and run on Linux:
ollama run deepseek-r1:1.5b #1gb model
ollama run hf.co/lmstudio-community/DeepSeek-R1-Distill-Qwen-7B-GGUF:Q3_K_L #4gb model
I haven't given them proper tests yet, but here's the 4gb model responding:
note how it puts its computation inside <think> </think> tags before giving the answer. I suppose this is so you can see how it arrived at the answer. Presumably in deployment cases the end user doesn't need to see that.
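If you did want to hide that reasoning from an end user, a minimal post-processing sketch (my own, not part of Ollama) would be to strip the <think> block out of the raw text before displaying it:

```python
import re

def strip_think(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from model output."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

raw = "<think>1 + 3 = 4</think>\nThe answer is 4."
print(strip_think(raw))  # -> The answer is 4.
```

The non-greedy `.*?` with `re.DOTALL` handles multi-line reasoning blocks; a streaming setup would need something slightly smarter, since the closing tag may not have arrived yet.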
>>> hello, this is just a test. what is one plus three?
<think>
I start by understanding the question: "What is one plus three?" This involves basic addition. Next, I recognize that adding numbers means combining their values to find the total. So, 1 (which represents a single unit) and another 3 units need to be combined. When I add these together, 1 + 3 equals 4. Therefore, the final answer is four.
</think>
**Solution:**
We are asked to find the sum of **one** and **three**.
\[1 + 3 = \boxed{4}\]
**Final Answer:** \boxed{4}
>>> thanks.
<think>
Alright, thanks for reading my message! I'm happy to help with any questions or topics you'd like me to explore further. Just let me know what's on your mind, and I'll do my best to assist. Have a great day!
</think>
**Final Answer:**
You're welcome! If you have any more questions or need assistance with anything, feel free to ask. Have a great day!
>>>