This is a good question to ask people like your aging parents (asking here on HN may not be very useful if your potential customers are not HN readers). If you talk to enough people you'll get a sense of who really wants it.
That said, my experience with LLMs is that they tend to lie/misrepresent user input and intention, especially when translating text. That doesn't sound like a problem if you're just ordering pizza or scheduling a haircut, but in healthcare it might become one. Furthermore, there are quite a few regulations around healthcare services that you might need to double-check, just to make sure you aren't unknowingly on the hook for difficult and expensive compliance. That's not really an issue for a tool built for your parents, but it is once you're marketing it to the public, especially to vulnerable groups like people who don't speak the language of the country they're in.
Also, does this bot announce that it's a bot?
Also, how will you prevent scam callcenters from ruining your bot's reputation? Is there some kind of abuse detection in place? Because if you just have a service that will call people and tell them what you instruct it to tell them, I can guarantee that malicious people will flock to it.
Because why would you want to make phone calls in the first place, and not just send an email or an SMS?
Because of spam filters, and because people don't read their emails immediately; we get so many. But now we just get the same problem with phone calls.
It was already bad enough with fake Microsoft support.
ranabasheer•6mo ago
Note: due to a lack of scaling resources, only messaging works for the general public. If you would like to test voice, I am happy to add you to the whitelisted phone numbers.
averageRoyalty•5mo ago
There are many non-native speakers in many countries, so this could very quickly become a great global service. Be careful with privacy and hosted data, especially medical data.
Well done to you!
thrown-0825•5mo ago
Before you open this up to the public, you should prepare to be used as a spam vector and put some rate limits in place.
Assuming you are using something like Twilio behind the scenes, it can be very difficult to get yourself off a blacklist once you wind up on one.
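For what it's worth, even a crude per-account token bucket goes a long way here. A minimal single-process sketch (all names hypothetical; a real deployment would back this with Redis or similar so limits survive restarts and scale across workers):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-account token bucket: each account may burst up to
    `capacity` calls, refilling at `rate` tokens per second."""

    def __init__(self, capacity=5, rate=5 / 3600.0):
        self.capacity = capacity
        self.rate = rate
        # account_id -> (tokens remaining, last refill timestamp)
        self.buckets = defaultdict(lambda: (capacity, time.monotonic()))

    def allow(self, account_id):
        tokens, last = self.buckets[account_id]
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens < 1:
            self.buckets[account_id] = (tokens, now)
            return False
        self.buckets[account_id] = (tokens - 1, now)
        return True
```

You'd also want a separate bucket keyed by destination number, so one account can't hammer a single callee even while staying under its own quota.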
probably_wrong•5mo ago
First, this service is breaking enough European data privacy rules that you should seriously consider blocking European visitors altogether (who's your GDPR Data Protection Officer? How do I get in touch with them?). In that vein:
> We use enterprise-grade encryption for all data and follow strict privacy protocols. Your information is never sold or shared
No information on what those privacy protocols are, though. And unless you're self-hosting the entire stack, are you really sure you're not sharing my private information with, say, OpenAI?
At a more general level, speech recognition and LLM performance outside English ranges from "it's okay" to "bad". If you're offering a service in a language that you don't speak (and forgive me for doubting your ability to speak Korean, Vietnamese, Russian, Hindi, Telugu, Bengali, and several more), be prepared for things to go wrong in ways you cannot understand. And speaking of which, how big is your "support team"? You wouldn't write "team" to mean just a single person, right?
> During your conversation with the officer, Maya stays on the line taking detailed notes about next steps, required documents, deadlines, and contact information so nothing gets lost.
I hope you're checking that you're in a one-party consent state. I also hope you've accounted for the person saying "I do not consent to be recorded".
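At a minimum, the transcript pipeline should watch for consent withdrawal and stop recording the moment it's detected. A naive keyword sketch (phrases are hypothetical; a real system would need localized variants and should escalate to a human on any match rather than trust the filter):

```python
import re

# Illustrative English-only patterns; any real list would need
# per-language variants vetted by counsel, not just an engineer.
REVOCATION_PATTERNS = [
    r"\bdo not consent\b",
    r"\bdon'?t record\b",
    r"\bstop recording\b",
]

def consent_revoked(transcript_line: str) -> bool:
    """Return True if the callee appears to withdraw recording consent."""
    line = transcript_line.lower()
    return any(re.search(p, line) for p in REVOCATION_PATTERNS)
```

Keyword matching will miss paraphrases, which is exactly why the comment above matters: the safe default is to announce recording up front and fail closed when in doubt.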
At an even more general level: the problem with this idea is that it's aimed at the "average" person, but everyone has different pain points, and your app will probably break on contact with them. I can imagine it works well for your family because you know them, but are you sure it will work with mine? And you're aiming it at a sector of the population that, by definition, is bad with technology. That's a tough sell.
Anything related to healthcare can be a minefield. I wouldn't walk in there unprepared.
NDxTreme•5mo ago
So, the other end of the spectrum: doctors' offices, insurance agents, etc.
These people already have to make reminder calls/emails etc., and they can get the consents needed.
That said, there's probably a gap between what you're doing now and what they require, which you'd need a case study to fill in.