Just released v1.5.0 with IBM watsonx integration.
```python
# Works with any provider now
client = LLMClient(provider="watsonx")
client = LLMClient(provider="anthropic")
client = LLMClient(provider="openai")
response = client.query("Analyze this data")
```

New in v1.5.0:
- IBM watsonx provider with Granite models
- Same API across all 5 providers
- Auto-fallback still works
- Enterprise authentication
Also supports 100+ local Ollama models including OpenAI's new GPT-OSS.
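
For anyone curious how provider auto-fallback typically behaves, here's a minimal sketch of the pattern: try each provider in order and return the first successful response. The names (`query_with_fallback`, `ProviderError`, the toy provider functions) are illustrative, not the library's actual API.

```python
# Sketch of provider auto-fallback (hypothetical names, not the library's API).

class ProviderError(Exception):
    """Raised when a provider fails to answer."""

def query_with_fallback(providers, prompt):
    """Try each (name, fn) provider in order; return the first success."""
    last_err = None
    for name, fn in providers:
        try:
            return name, fn(prompt)
        except ProviderError as err:
            last_err = err  # remember the failure, try the next provider
    raise last_err  # every provider failed

# Toy providers: the first always fails, the second succeeds.
def flaky(prompt):
    raise ProviderError("watsonx unavailable")

def stable(prompt):
    return f"analysis of: {prompt}"

used, resp = query_with_fallback(
    [("watsonx", flaky), ("anthropic", stable)], "sales data"
)
# falls through to the second provider, so used == "anthropic"
```

The real client presumably wraps this loop around its provider backends, so callers keep the single `client.query(...)` interface shown above.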
sreenathmenon•2h ago