I built Mallo, an iOS AI companion that remembers you and proactively checks in with you every day.
### What it is
Mallo is a chat-based companion app for iOS. You talk to it like you would to a friend: about your day, your worries, your goals, or even trivial things. Under the hood, it:
- extracts long-term memories from your conversations (people, events, preferences, recurring topics),
- stores them in a structured way, and
- uses them in future chats so it can say things like “How did that interview go?” or “You mentioned you’ve been feeling burned out recently—how are you now?”
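To make “stores them in a structured way” concrete, here is a minimal sketch of what an extracted memory record could look like. The schema, field names, and `needs_follow_up` helper are illustrative assumptions, not Mallo’s actual data model:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Memory:
    """One extracted long-term memory (illustrative schema, not Mallo's real one)."""
    kind: str        # e.g. "person", "event", "preference", "topic"
    summary: str     # short natural-language fact
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    follow_up: bool = False  # should a future chat circle back to this?

def needs_follow_up(memories: list[Memory]) -> list[Memory]:
    """Pick the memories a companion should proactively ask about."""
    return [m for m in memories if m.follow_up]

interview = Memory(kind="event", summary="Has a job interview on Friday", follow_up=True)
burnout = Memory(kind="topic", summary="Has been feeling burned out recently", follow_up=True)
color = Memory(kind="preference", summary="Favorite color is teal")

print([m.summary for m in needs_follow_up([interview, burnout, color])])
# → ['Has a job interview on Friday', 'Has been feeling burned out recently']
```

A flag like `follow_up` is one simple way to separate facts worth raising unprompted (“How did that interview go?”) from background preferences that only shape tone.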
The second piece is proactive behavior. Instead of waiting for you to open the app, Mallo:
- schedules daily greetings and check-ins,
- sends “streak rescue” nudges if you’re about to break your streak,
- can follow up after emotionally significant days.
So it’s not just a generic “ask me anything” chatbot—it’s trying to be a lightweight, long-term companion.
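The streak-rescue timing above comes down to a small decision rule: nudge only when a calendar day is about to pass with no chat. A hedged sketch, assuming a fixed rescue window before local midnight (the function name and window size are my own, not from the app):

```python
from datetime import datetime, timedelta

def should_send_streak_rescue(last_chat: datetime, now: datetime,
                              rescue_window_hours: int = 3) -> bool:
    """True if the user is about to break a daily streak.

    The streak breaks when a calendar day ends with no chat, so we nudge
    only when the user hasn't chatted today AND local midnight is less
    than `rescue_window_hours` away.
    """
    chatted_today = last_chat.date() == now.date()
    midnight = datetime.combine(now.date() + timedelta(days=1), datetime.min.time())
    hours_left = (midnight - now).total_seconds() / 3600
    return (not chatted_today) and hours_left <= rescue_window_hours

now = datetime(2024, 6, 1, 21, 30)   # 9:30 pm, 2.5 h before midnight
print(should_send_streak_rescue(datetime(2024, 5, 31, 19, 0), now))  # → True
print(should_send_streak_rescue(datetime(2024, 6, 1, 8, 0), now))    # → False
```

In practice this kind of check would run on a recurring worker or be pre-scheduled as a local notification that gets cancelled once the user chats.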
### Why I built it
I’ve been working remotely for a long time and building products mostly by myself. I noticed a recurring pattern: I’d have days where I just wanted some light emotional support or a feeling of “someone is here with me,” but:
- traditional social apps felt noisy and performative,
- therapy (while hugely valuable) has a higher barrier to entry and a different cadence,
- and generic LLM chat apps felt like tools, not companions: every conversation started from zero, and they never reached out first.
I wanted to explore whether an AI, with proper boundaries, could provide _consistent_ low-friction emotional support—checking in, remembering my context, and being there every day—without pretending to be a therapist or a human friend.
### How it works (high-level)
- *Stack*: Flutter app, Gemini-based backend, local database (for conversations, memories, streaks), RevenueCat for subscriptions, Firebase Analytics for metrics.
- *Memory*: After conversations, a memory extraction service summarizes and stores key facts and ongoing threads. These are then used to personalize future prompts.
- *Proactive engagement*: A proactive engagement worker plus local notifications schedule daily greetings, emotion follow-ups, and streak rescue messages, taking your subscription tier and usage history into account.
- *Behavior layer*: Streaks, a usage dashboard, and small mini-games are layered on top so that “maintaining contact” doesn’t always mean having deep conversations—some days you can just play a quick thing or do a 10-second check-in.
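The “used to personalize future prompts” step can be as simple as appending recent memories to the system prompt before each chat. A minimal sketch, assuming plain string memories; the persona wording and function name are illustrative, not Mallo’s actual prompt:

```python
def build_system_prompt(base_persona: str, memories: list[str],
                        max_memories: int = 5) -> str:
    """Compose a personalized system prompt from stored memories.

    Only the most recent few memories are included to keep the prompt
    short; the model is nudged to reference them naturally rather than
    starting every conversation from zero.
    """
    if not memories:
        return base_persona
    context = "\n".join(f"- {m}" for m in memories[-max_memories:])
    return (
        f"{base_persona}\n\n"
        f"Things you remember about the user:\n{context}\n"
        "Weave these in naturally; ask a gentle follow-up if something is unresolved."
    )

prompt = build_system_prompt(
    "You are Mallo, a warm, low-key companion.",
    ["Has a job interview on Friday", "Has been feeling burned out recently"],
)
print(prompt)
```

Capping the number of injected memories is one way to keep per-message token costs bounded as the memory store grows; a fancier version would rank by relevance instead of recency.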
### Privacy and data
- Conversations are stored so that Mallo can remember you over time.
- Data is not sold to advertisers.
- Tracking for ads (Meta/TikTok) only happens if you explicitly allow it on iOS via ATT; otherwise those SDKs stay effectively off.
A longer version, with more detail about storage, retention, and analytics, is on the website.
### Pricing
Right now there is a free tier with limits, and a paid subscription that unlocks more usage and features. I chose subscription instead of a one-time purchase because:
- the app has ongoing API and infrastructure costs (LLM inference, notifications, analytics), and
- I want to be able to keep iterating quickly based on feedback.
I’m very open to feedback on whether the current pricing structure feels fair.
### Links
- Website / details / privacy FAQ: https://mallopal.com/?utm_source=hn&utm_medium=show_hn&utm_c... - App Store (iOS): https://apps.apple.com/us/app/mallo-ai-companion-remembers/i...
Thanks for reading! I’d love feedback from this community—on both the product idea and the technical approach. I’ll be in the comments all day answering questions and taking notes.
tmproduction•2h ago
*Q: Is this therapy?* A: No. Mallo is not a therapist and is not a replacement for professional mental health care. It’s designed for everyday emotional check-ins and companionship, with clear in-app language about its limitations.
*Q: What happens to my data and conversations?* A: Conversations are stored so that Mallo can remember your context over time. They are not sold to advertisers. Analytics are used to understand usage (e.g., retention, funnels). For specific details on storage and retention, see the privacy section on https://mallopal.com/privacy/.
*Q: Are you training your own models on my data?* A: No, conversations are not used to train new models. They are used only to personalize your own experience.
*Q: Does this send data to Meta/TikTok/etc.?* A: If you deny tracking on iOS (ATT prompt), their SDKs won’t receive personal identifiers. I do use anonymized analytics to understand high-level behavior, but you can opt out of any personalized advertising tracking via iOS system settings.
*Q: Why a subscription instead of one-time purchase?* A: The app has ongoing costs (LLM inference, notifications, analytics) and I want to keep iterating based on feedback. Subscriptions let me do that sustainably. I’m open to adjusting pricing and tiers based on what feels fair.
*Q: Any plans for Android or web?* A: I’d like to add them if there is enough interest and once I’m confident in the core experience. Flutter should make Android more straightforward; web is trickier for some of the notification/engagement features.