Why is this interesting? Run the study again, but replace “AI chatbot” with “marketer trying to gain your confidence to sell you something” or “serial killer trying to learn your location”. What do you think is going to happen? “OMG, people have empathy for serial killers as long as they think they’re a single mom working two jobs”. Yeah, no shit. “People are tricked when you trick them” isn’t an interesting observation, and the conclusions being drawn here aren’t useful. This has nothing to do with AI chatbots, and everything to do with people being lied to.
latexr•3h ago