WorldPeas•16h ago
I see you're cross-posting your reddit post, so I'll reply here. In my benchmarks gpt-oss-120b is not very smart. Its tool use is quite lacking, and as you might guess, it has no access to the wider internet out of the box or from post-training. The sentiment is right for your needs, though — you'd probably be better off with Qwen or Kimi K2, if you can run them.
jandll•15h ago
Yeah, I run gpt-oss 20b. I also have DeepSeek-r1:14b-qwen, but it doesn't work well for what I need it for. Then again, I probably haven't set it up optimally. Any advice on how to get the most out of it?
WorldPeas•15h ago
You'll probably get more bang for your buck with MoE or fp4 models. Other than that, I've got little advice, save for adding some good search MCP tools like Perplexity or Brave Search to improve factuality (you'll need to alter the system prompt so the model is more inclined to make tool calls).
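For reference, hooking up Brave Search as an MCP server in a typical MCP client config looks roughly like this (a sketch — the package name assumes the reference `@modelcontextprotocol/server-brave-search` server, and the API key is a placeholder; check your client's docs for the exact format):

```json
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
```

For the system-prompt side, a line like "When a question involves current events or facts you are unsure of, call the search tool before answering" tends to nudge smaller models toward actually making the tool calls.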