But then I thought about it some more, and I realized that the people I initially wanted to dismiss as pearl-clutching ninnies whinnying about “privacy” (when of course I had no intention of “spying on their emails”) were actually right, in a way. Not about me, but about an aspect of the world that I really hadn't thought much about, or experienced at all.
Because there are enough shady companies, governments, intelligence operatives, and scammers out there who do spy, lie, harass, stalk, and steal, and who try to violate people six ways to Sunday, with impunity. Some big corporations even state in their terms of service that they do this. I haven't had any experience of this world; for me it's not even a thing. I'm just focused on making something that works and gives a good experience (which, btw, really takes a lot, even with AI). But I get that some people have faced this, and it is an aspect of reality. It's not the whole of reality, but it is an aspect that exists. So it's understandable that people have a sense about it.
I just didn't think about it enough before launching. I should have said things like “Your data remains your own” (even though, of course, it does; to me it's so obvious, why should I even have to say it?). But I realized that the way this industry has worked, it has mined people's data, stolen it, ripped it off, and abused people's expectations and boundaries. People are sick of it, wise to it, and primed to look for it. Even when it's not there, they assume it could be, and that assumption is right: it's the caution they've learned. If I'm going to launch a product that overlaps with that territory, then I have to respect that people will feel that way. After all, wouldn't I want the explicit assurance that a service which could potentially steal my ideas will not, and will explicitly commit to not doing so? Yes, I would, actually. I just didn't think people would expect that of me, because I'm just like them: I don't want my data ripped off, so of course I wouldn't rip off theirs.
But I was thinking more like a private user who already trusts what I use by default (because I built it), not like a world-weary, wizened netizen who’s faced or read about countless scams and abuses. But I think you have to factor that in, because especially now, ideas aren't cheap anymore. They're valuable. Why? Because AI has made the cost of execution so low.
So ideas matter. First-mover advantage matters. Marketing matters. Privacy matters. Brand matters. Reputation matters. Execution is distributed. Ideas are valuable again, contrary to the ol' PG essay about not signing NDAs because ideas are cheap. Ideas are no longer cheap. AI has made them valuable.
So privacy matters. That said, I don't think you have to go full NSA and have homomorphic, E2E, post-quantum security for everything, but you do have to be cognizant of the environment in which we now operate. And I realized, for all their whinnying, the pearl-clutchers were correct: I didn't think about that aspect enough before I launched.
al_borland•1h ago
As for the rest, I think this line is key:
> But I was thinking more like a private user who already trusts what I use by default (because I built it)
Of course you trust what you wrote, but do you trust what everyone else writes? Put yourself in the shoes of your potential customers. Most people don’t know you, your values, your intent… all they have to go by is what you tell them. Also remember, people lie. So don’t just tell them, prove it.
And to your point about AI: it has never been easier for someone to build something that seems made with care, designed to delight, just so they can steal data. Your takeaway was that execution is cheap, but maybe the takeaway should be that data harvesting is now cheap, and that data is more valuable than ever, so people are right to be wary of anything that accesses their data.