A telephony agent for my parents. Should I turn it into a full-fledged service? (sutrasphere.com)
32 points by ranabasheer 4 days ago | 14 comments
ranabasheer 4 days ago [-]
I developed a telephony agent service to help my aging parents, who aren't fluent in English, navigate their healthcare needs. My parents simply talk to the agent, which I call Maya, in their native language. Maya then understands their requests and handles the outgoing calls to various offices and providers in English. So far, it's been an invaluable resource for them, helping with three key tasks:

- Scheduling doctor's appointments across multiple insurance plans.
- Checking pharmacy readiness for medicine pickups.
- Arranging transportation (both medical and non-medical) through services like CalOptima and OC ACCESS.

The system uses a straightforward tech stack on AWS, combining separate ASR, LLM, and TTS models. It's a pragmatic solution I built because I couldn't find an existing service that fit my specific requirements. It's been so useful for my family that I'm starting to wonder if this could help others in similar situations.

What do you think, Hacker News? Is there a broader need for a service like this? I'm looking for your honest and brutal feedback on whether I should pursue this full-time.
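(For illustration only: a minimal sketch of what one turn of such an ASR -> LLM -> TTS pipeline could look like on AWS. It assumes Amazon Transcribe, the Bedrock Converse API, and Polly via boto3; the model ID and prompts are placeholders, and the actual Maya stack may differ.)

    # Illustrative only: one turn of an ASR -> LLM -> TTS loop on AWS.
    # Service choices (Transcribe, Bedrock, Polly) and the model ID are
    # assumptions for this sketch, not a description of the real system.
    import json
    import time
    import urllib.request
    import uuid

    import boto3

    transcribe = boto3.client("transcribe")
    bedrock = boto3.client("bedrock-runtime")
    polly = boto3.client("polly")

    def speech_to_text(audio_s3_uri: str) -> str:
        """Transcribe caller audio already uploaded to S3 (language auto-detected)."""
        job = f"maya-{uuid.uuid4()}"
        transcribe.start_transcription_job(
            TranscriptionJobName=job,
            Media={"MediaFileUri": audio_s3_uri},
            MediaFormat="wav",
            IdentifyLanguage=True,
        )
        while True:  # poll the batch job until it finishes (FAILED handling omitted)
            status = transcribe.get_transcription_job(TranscriptionJobName=job)
            if status["TranscriptionJob"]["TranscriptionJobStatus"] in ("COMPLETED", "FAILED"):
                break
            time.sleep(2)
        uri = status["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]
        with urllib.request.urlopen(uri) as resp:
            return json.load(resp)["results"]["transcripts"][0]["transcript"]

    def plan_english_utterance(transcript: str) -> str:
        """Ask the LLM to turn the caller's request into English for the outgoing call."""
        resp = bedrock.converse(
            modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model
            system=[{"text": "Rephrase the caller's request as a polite English "
                             "sentence to say to a clinic's scheduling desk."}],
            messages=[{"role": "user", "content": [{"text": transcript}]}],
        )
        return resp["output"]["message"]["content"][0]["text"]

    def text_to_speech(text: str) -> bytes:
        """Synthesize the English reply as raw audio to play on the outgoing call."""
        out = polly.synthesize_speech(Text=text, OutputFormat="pcm", VoiceId="Joanna")
        return out["AudioStream"].read()

In practice the ASR leg would more likely be a streaming transcription tied to the telephony provider's media stream; the batch form above just keeps the sketch short.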

Note: due to a lack of scaling resources, only messaging works for the general public. If you would like to test voice, I am happy to add you to the whitelisted phone numbers.

probably_wrong 1 hour ago [-]
Here's some honest and brutal feedback:

First, this service is breaking enough European data privacy rules that you should seriously consider blocking European visitors altogether (who is your GDPR Data Protection Officer, and how do I get in touch with them?). In that vein:

> We use enterprise-grade encryption for all data and follow strict privacy protocols. Your information is never sold or shared

No information on what those privacy protocols are, though. And unless you're self-hosting the entire stack, are you really sure you're not sharing my private information with, say, OpenAI?

At a more general level, speech recognition and LLM performance outside English ranges from "it's okay" to "bad". If you're offering a service in a language that you don't speak (and forgive me for doubting your ability to speak Korean, Vietnamese, Russian, Hindi, Telugu, Bengali, and several more), be prepared for things to go wrong in ways you cannot understand. And speaking of which, how big is your "support team"? You wouldn't write "team" to mean just a single person, right?

> During your conversation with the officer, Maya stays on the line taking detailed notes about next steps, required documents, deadlines, and contact information so nothing gets lost.

I hope you're checking that you're in a one-party consent state. I also hope you've accounted for the person saying "I do not consent to be recorded".

At an even more general level: the problem with this idea is that it's aimed at the "average" person, but everyone has different pain points and your app will probably break on contact with them. I can imagine it works well for your family because you know them, but are you sure it will work with mine? And you're aiming it at a sector of the population that, by definition, is bad with technology. That's a tough sell.

Anything related to healthcare can be a minefield. I wouldn't walk in there unprepared.

averageRoyalty 2 days ago [-]
This obviously hasn't taken off on HN, but you should absolutely pursue it. This is a fantastic use of AI, and you've already done the legwork of making it functional and building a concise "what is it" site that is very clear.

There are many non-native speakers in many countries; this could very quickly become a great global service. Be careful with privacy and hosted data, especially medical data.

Well done to you!

thrown-0825 3 hours ago [-]
Great idea and good luck.

Before you open this up to the public, you should prepare for it to be abused as a spam vector and put some rate limits in place.
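(As a concrete illustration, here is a minimal per-caller token-bucket limit; the function name, limits, and in-memory storage are all invented for the sketch, and a real deployment would more likely persist counters in something like Redis or DynamoDB.)

    # Illustrative sketch: allow each caller at most a small burst of outbound
    # calls, refilling slowly over time (token bucket). Values are arbitrary.
    import time
    from collections import defaultdict

    REFILL_PER_SEC = 3 / 3600.0      # refill rate: 3 calls per hour
    BURST = 3                        # at most 3 calls back-to-back

    _buckets: dict[str, tuple[float, float]] = defaultdict(
        lambda: (float(BURST), time.monotonic())
    )

    def allow_outbound_call(caller_id: str) -> bool:
        """Return True if this caller may place another outbound call right now."""
        tokens, last = _buckets[caller_id]
        now = time.monotonic()
        tokens = min(BURST, tokens + (now - last) * REFILL_PER_SEC)  # refill since last check
        if tokens < 1.0:
            _buckets[caller_id] = (tokens, now)
            return False
        _buckets[caller_id] = (tokens - 1.0, now)
        return True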

Assuming you are using something like Twilio behind the scenes, it can be very difficult to get yourself off a blacklist once you wind up on one.

Animats 3 hours ago [-]
Twilio will be a problem. Twilio no longer comprehends that there are non-advertising SMS applications. I used to use Twilio for something that just responded to incoming SMS messages. Then they changed their rules so that I had to register as an "ad campaign" and pay extra fees, even though I didn't do anything outbound.
thrown-0825 2 hours ago [-]
Yeah, I have had similar issues with them.
awwx 2 hours ago [-]
> Is there a broader need for a service like this?

This is a good question to ask of people who are like your aging parents (asking here on HN may not be very useful if your potential customers are not HN readers). If you talk to enough people, you'll get a sense of who really wants it.

avhception 3 hours ago [-]
This website already looks like a full-fledged service / sales pitch? It sounds great as a service, but you seem to have already made up your mind, and asking whether you should feels a little disingenuous.
luke-stanley 2 hours ago [-]
Exactly, and by the way, this is why I did not upvote it! Others probably have the same feeling.
villgax 60 minutes ago [-]
trying to growth hack but rather poorly
villgax 1 hour ago [-]
exactly my point too
jeroenhd 2 hours ago [-]
It's not a new idea, seeing as Google pretty much announced this years ago. I don't know how much uptake it got (it isn't available in my country) but it's definitely one of the main real-world use cases of LLMs and related AI that I can see surviving the bubble. If you can get enough money out of it to pay for your service costs, that seems like an excellent tool.

That said, my experience with LLMs is that they tend to lie about or misrepresent user input and intention, especially when translating text. That doesn't sound like a problem if you're just ordering pizza or scheduling a haircut, but when it comes to healthcare it might become problematic. Furthermore, there are quite a few regulations around healthcare services that you might need to double-check, if only to make sure you're not forced to comply with difficult and expensive rules. Not really an issue for a tool built for your parents, but it becomes one when you're marketing it to the public (especially to vulnerable groups, like people who don't speak the language of the country they're in).

Also, does this bot announce that it's a bot?

Also, how will you prevent scam callcenters from ruining your bot's reputation? Is there some kind of abuse detection in place? Because if you just have a service that will call people and tell them what you instruct it to tell them, I can guarantee that malicious people will flock to it.

nelgaard 24 minutes ago [-]
That is the problem with this kind of stuff. If it were only used by aging parents, it might be OK. But it will be used by everyone. Even if OP manages to somehow prevent abuse, others will just build it themselves.

Why would you want to make phone calls in the first place, rather than just send an email or an SMS?

Because of spam filters, and because people do not read their emails immediately since we get so many. But now we will just get the same problem with phone calls.

It was already bad enough with fake Microsoft support.

villgax 1 hour ago [-]
This doesn't seem like a genuine post. You went out of your way to make a landing page. Since you asked for brutal feedback: you should have shared some actual footage of people using this instead of making a landing page that already acts like a ready-to-use product. Otherwise, what is the point of the phony framing in your comment?