- Send input
- Wait for a pattern
- Branch on the match
This is essentially the classic Unix expect model, but applied to LLM conversations.
So I built expectllm — a minimal pattern-matching conversation flow library (365 lines of code).
Example:

```python
from expectllm import Conversation

c = Conversation()
c.send("Review this code for security issues")
c.expect(r"found (\d+) issues")
if int(c.match.group(1)) > 0:
    c.send("Fix the top 3")
```
No chains, no schema definitions, no output parsers.

Features:
- expect_json(), expect_number(), expect_yesno()
- Regex → auto format instructions
- Full conversation history for multi-turn flows
- Auto-detects OpenAI / Anthropic via environment variables
The idea: treat LLM conversations as explicit state machines, where each expect() is a state transition.
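To make the state-machine framing concrete, here is a minimal self-contained sketch of the expect-and-branch loop. This is not expectllm's actual implementation; `ScriptedConversation` and its canned replies are invented for illustration, with a scripted responder standing in for a real LLM so the flow runs offline:

```python
import re

# Hypothetical sketch of expect() as a state transition (not expectllm's
# real internals). A scripted responder stands in for a live model.
class ScriptedConversation:
    def __init__(self, responses):
        self.responses = iter(responses)  # canned "model" replies
        self.history = []                 # full multi-turn transcript
        self.match = None                 # last successful regex match

    def send(self, prompt):
        self.history.append(("user", prompt))
        reply = next(self.responses)
        self.history.append(("assistant", reply))
        return reply

    def expect(self, pattern):
        # State transition: advance only if the latest reply matches.
        last_reply = self.history[-1][1]
        self.match = re.search(pattern, last_reply)
        if self.match is None:
            raise ValueError(f"reply did not match {pattern!r}: {last_reply!r}")
        return self.match

c = ScriptedConversation(["I found 2 issues in the code.", "Done."])
c.send("Review this code for security issues")
c.expect(r"found (\d+) issues")
if int(c.match.group(1)) > 0:
    c.send("Fix the top 3")
```

Each `expect()` either moves the flow to the next state or fails loudly, which is exactly the classic Unix `expect` contract applied to model output.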
Repo: https://github.com/entropyvector/expectllm
PyPI: pip install expectllm
Would love feedback — especially on where this abstraction breaks down.