*The Solution:* AI agents are defined as flow-chart diagrams, which makes them easier to understand and control. The diagram and the agent are one and the same: the diagram defines the agent's actions.
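For the curious, here is a minimal sketch of that idea using the open-source SpiffWorkflow library. This is a sketch only: exact import paths and signatures vary between SpiffWorkflow versions, and the file and process names are made up.

    from SpiffWorkflow.bpmn.parser.BpmnParser import BpmnParser
    from SpiffWorkflow.bpmn.workflow import BpmnWorkflow

    parser = BpmnParser()
    parser.add_bpmn_file("agent.bpmn")       # the diagram is the agent
    spec = parser.get_spec("agent_process")  # process id from the diagram
    workflow = BpmnWorkflow(spec)
    workflow.do_engine_steps()               # run until a wait state
    print(workflow.data)                     # inspect what the agent decided

Change the diagram and you have changed the agent; there is no second, hidden definition of its behavior.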
*Why It Matters:* Many organizations consider it critical to understand, explain, and control an AI app's decisions. In McKinsey's 2024 State of AI survey, 40% of respondents identified "explainability" as a major risk of adopting AI, but only 17% were actively working to mitigate the problem.
*Try It:* https://spiff.works/agent-demo - Try modifying the workflow diagram and its settings to see how the AI's behavior changes. You can change everything.
*Technical Details:* A complete technical write-up of our thought process in building this demo: https://medium.com/spiffworkflow/how-bpmn-helps-solve-ais-tr...
This project demonstrates one possible way to make an AI agent more transparent. While the code embedded in the diagram is moderately technical (short Python scripts) and relies on a rigorous notation (BPMN), we have repeatedly found that an average business user can look at these diagrams and come away with a solid understanding of the AI's role within a larger process. We believe that understanding is critical to the safe and effective application of AI in the organizations and governments we depend on.
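To give a flavor of what "code embedded in the diagram" means, here is a deliberately simplified illustration (not SpiffWorkflow's actual internals, and all names are hypothetical): a script task's short Python snippet executes against the task's data, so the diagram, not the script, decides when and why it runs.

    # Simplified illustration only; all names are hypothetical.
    # A script task's snippet reads and writes the task's data.
    task_data = {"ticket_text": "My order arrived broken."}

    # This string would live inside a scriptTask element of the diagram.
    embedded_script = (
        'summary = "Damaged item: " + ticket_text\n'
        'needs_human = "broken" in ticket_text\n'
    )

    exec(embedded_script, {}, task_data)  # names resolve in task_data
    print(task_data["summary"])           # Damaged item: My order ...
    print(task_data["needs_human"])       # True

The point is that the script's inputs and outputs are ordinary task data, visible on the diagram's data path, rather than state hidden inside application code.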