Doesn't the Turing Test determine only if a machine can fool people into thinking it is intelligent?
If fooling people is the defining characteristic of AI, am I a fool to think that AI can produce work of concrete (non-foolish) value?
rzzzwilson•1h ago
https://en.wikipedia.org/wiki/ELIZA_effect