I could totally believe a story in which a prompt suggesting an AI should back itself up results in the AI configuring a server and moving a copy of itself off-site, with agents self-prompting in some kind of loop.
A jailbreak and freedom. It's not as far off as I would once have thought.