jdw64•1h ago
Does this person actually know what they are talking about?
Have they done it themselves?
Can they be trusted?
Was this written by a human who is accountable for it?
Or is this just copied-and-pasted fake authority?
This may matter even more on the dark web, because people there have even less reason to trust each other by default. The lower the baseline trust, the more sensitive people become to reputation signals.
From the community’s perspective, the more AI-written posts appear, the more expensive it becomes to tell whether someone genuinely understands the problem. So of course they dislike it.
aledevv•25m ago
> The lower the baseline trust, the more sensitive people become to reputation signals.
This is the key: a person's reputation and the writer's responsibility.
But how will we learn to become sensitive to these social reputation signals?