What database?
This page is clearly written for developers who are already familiar with it.
From this I can already predict this project is going nowhere.
The local database used by Signal to organize every message, every contact, every profile photo, every attachment, every group, basically every dynamic piece of data you interact with in the app.
Signal is basically a UI layer for a database. The in-transit encryption is genuinely good enough to be textbook study material for cryptographers, but the at-rest encryption became a joke the moment they stopped using your PIN to encrypt the local DB and stopped requiring it to open the app.
As someone who's been enthusiastic about Signal since it was TextSecure and RedPhone, the changes made over the years to broaden the userbase have been really exciting from an adoption perspective, and really depressing from a security perspective.
TL;DR of Molly is that it fixes/improves several of those security regressions (and adds new security features, like wiping RAM on db lock) while maintaining transparent compatibility with the official servers, and accordingly, other people using the regular Signal client.
But if my phone gets taken and an exploit is used to get root access on it, I don't want the messages to be readable and there's nothing I can do about it. It's not like I can just use a different storage backend.
It's also a very simple solution - just let me set an encryption password. It's not an open-ended problem like protecting from malware running on the device when you're using it.
Which is to say, this is an incoherent security boundary: you're not encrypting your phone's storage in a meaningful way, but you plan to rely on entering a PIN every time you launch Signal to secure it? (Which in turn is also not secure, because a PIN is not secure without hardware able to enforce lockouts and tamper resistance... which in this scenario you just indicated has been bypassed.)
A passphrase can be long, not just a short numeric PIN. It can be different from the phone unlock one. It could even be different for different chats.
This is less true for fully patched GrapheneOS devices than it is for fully patched iOS and other Android devices, but this space is basically a constantly evolving cat and mouse game. We don't get a press release when GrayKey or Cellebrite develop a new zero day, so defense in depth can be helpful even for hardened platforms like GOS.
I think, if you were actually willing to do that, it would probably be about as convenient and at least as effective to leave the device powered off and rely on the device full disk encryption and hardware security to protect the data at rest, only powering it on occasionally to check or send messages, then immediately powering back off.
Their justification here https://source.android.com/docs/security/features/encryption is that
> Upon boot, the user must provide their credentials before any part of the disk is accessible.
> While this is great for security, it means that most of the core functionality of the phone is not immediately available when users reboot their device. Because access to their data is protected behind their single user credential, features like alarms could not operate, accessibility services were unavailable, and phones could not receive calls.
I'm sure they could have found a better approach than file-based encryption, but it must have been nice to reduce engineering overhead while, at the same time, giving three-letter agencies something that simplifies their work.
That said, Molly definitely isn't designed for the average person's threat model, that's totally true, but it's also worth noting that just because someone isn't aware of a certain risk in their threat model, that doesn't mean they will never benefit from taking steps to proactively protect themselves from that risk.
IMO, security and privacy are best conceptualized not as binary properties where you either have it or you don't, but rather as journeys, where every step in the right direction is a good one.
I'd always encourage everyone to question their own assumptions about security and never stop learning, it's good for your brain even if you ultimately decide that you don't want to accept the tradeoffs of an approach like the one Molly takes towards at-rest encryption.
I find that unconvincing. If your phone is hacked, your phone is hacked. I think it's bad to assume that an attacker can compromise your phone but not log keystrokes. I'm not super familiar with the state of the art of phone malware and countermeasures, but I think anything trying to be secure in the face of a compromised platform is like trying to get toothpaste back in the tube.
> it's also worth noting that just because someone isn't aware of a certain risk in their threat model, that doesn't mean they will never benefit from taking steps to proactively protect themselves from that risk.
Threat models are just as much about ensuring you have all your bases covered as ensuring you don't spend effort in counterproductive ways.
> IMO, security and privacy are best conceptualized not as binary properties where you either have it or you don't
I agree. I think security is relative to the threat you are trying to defend against. There are no absolutes.
> but rather as journeys, where every step in the right direction is a good one.
Here is where I disagree. Just because you take a step does not mean you are walking forward.
A poorly thought out security measure can have negative impacts on overall system security.
As always, it depends on your threat model.
I use Signal because I value my privacy and don't trust Facebook, not because I'm an activist. So I'm in the target group for Signal's new behavior and I welcome it (especially since, to share personal information that I don't want Facebook or advertisers to get, I need my parents and in-laws to use it as well, so it must be user-friendly enough).
I wish they'd continue moving forward in that direction, by the way, and allow shared pictures to be stored directly in the phone's main storage (or at least add an opt-in setting for that), because the security I get from them not being stored there is zero and the usability suffers significantly.
I'm a really big fan of the airport bathroom analogy. When you use the restroom in the airport, you close the stall door behind you.
You're not doing anything wrong, you have nothing to hide, and everyone knows what you're doing. But you take actions to preserve your privacy anyway, and that's good.
Everyone deserves privacy, and the psychological comfort that comes with it. Dance like nobody's watching, encrypt like everyone is :)
This is a much better way of saying what I wanted, thank you.
What "proprietary blobs" does Signal have?
I'll also just add: it's probably not a good idea to use any modifications to an E2EE messenger unless you are comfortable with those privacy/security guarantees possibly being violated by the 3rd party code.
The only exception to this would be if I really trusted the goals of the 3rd party, like Graphene.
There are actually two builds of Molly: Molly and Molly-FOSS. IIRC Molly uses regular Firebase, which can be faster and more reliable but comes with the above tradeoffs, while Molly-FOSS uses UnifiedPush.
Your point about exercising caution with forks of encrypted messaging apps is a great rule of thumb, and in general, social proof should NOT substitute for competent software security specialists reading and evaluating source code, but given you seem to trust GrapheneOS, it's worth noting that they've formally endorsed Molly: https://xcancel.com/GrapheneOS/status/1769277147569443309
Also a great point :) And thank you for the reference.
As they say in the GitHub readme, FCM and Google Maps.
FCM doesn't technically require a blob — it's just that Google wants you to think it does. I reverse engineered their library and it turned out to be a criminally over-engineered wrapper around two broadcast receivers. So, the Mastodon app is proudly the first app ever to both support FCM push notifications, and be 100% open-source.
It's Apache 2.
Here's how you request a push token:
Intent intent = new Intent("com.google.iid.TOKEN_REQUEST");
intent.setPackage("com.google.android.gms");
intent.putExtra("app", PendingIntent.getBroadcast(context, 0, new Intent(), PendingIntent.FLAG_IMMUTABLE));
intent.putExtra("sender", FCM_SENDER_ID);
intent.putExtra("subtype", FCM_SENDER_ID);
intent.putExtra("scope", "*");
intent.putExtra("kid", "|ID|1|");
context.sendBroadcast(intent);
Here are the two receivers:
<receiver android:name=".PushNotificationReceiver" android:exported="true" android:permission="com.google.android.c2dm.permission.SEND">
    <intent-filter>
        <action android:name="com.google.android.c2dm.intent.RECEIVE" />
    </intent-filter>
</receiver>
<receiver android:name=".api.PushSubscriptionManager$RegistrationReceiver" android:exported="true" android:permission="com.google.android.c2dm.permission.SEND">
    <intent-filter>
        <action android:name="com.google.android.c2dm.intent.REGISTRATION"/>
    </intent-filter>
</receiver>
The first one is where you get notifications. The parameters you sent from the server will simply be your intent extras. The second one is where you get push tokens. There will be a "registration_id" extra string, which is your token. It may start with "|ID|1|" (the "kid" parameter from the request; not quite sure what it does), in which case you need to remove that part.
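For illustration, that prefix-stripping step could look something like this (the "registration_id" extra and "|ID|1|" prefix are as described in the comment above; the helper and its name are mine):

```java
// Hypothetical helper for normalizing the token received in the
// REGISTRATION broadcast's "registration_id" extra. The "|ID|1|"
// prefix is the "kid" value that was sent in the original request.
final class FcmTokenUtil {
    static final String KID_PREFIX = "|ID|1|";

    static String normalizeToken(String registrationId) {
        if (registrationId != null && registrationId.startsWith(KID_PREFIX)) {
            // Strip the echoed "kid" value, leaving the bare token.
            return registrationId.substring(KID_PREFIX.length());
        }
        return registrationId;
    }
}
```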
You want to refresh your push token every time your app gets updated and also just periodically if you haven't done it in a while. I do it every 30 days.
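A minimal sketch of that refresh policy (the 30-day interval and the refresh-on-update rule come from the comment above; the class and method names are mine, not any Google API):

```java
// Hypothetical refresh check: request a new push token when the app
// version has changed since the last registration, or when the last
// refresh is older than 30 days.
final class PushRefreshPolicy {
    static final long REFRESH_INTERVAL_MS = 30L * 24 * 60 * 60 * 1000;

    static boolean shouldRefresh(long lastRefreshMs, long nowMs,
                                 int lastAppVersion, int currentAppVersion) {
        if (currentAppVersion != lastAppVersion) {
            return true; // app was updated: always refresh
        }
        return nowMs - lastRefreshMs >= REFRESH_INTERVAL_MS;
    }
}
```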
But if someone knows better I would appreciate any correction. Legal matters are seldom clear or logical. Your jurisdiction may vary, etc etc.
It's why black-box clones where you look at an application and just try to make one with the same externally-observable behavior without looking at the code is legal (as long as you don't recycle copyrighted assets like images or icons) but can be infringing if you reuse any of the actual source code.
This was an issue that got settled early on and got covered in my SWE ethics class in college, but then more recently was re-tried in Oracle v Google in the case of Google cloning the Java standard library for the Android SDK.
I have no idea how copyright applies here. StackOverflow has a rule in their terms of use that all the user-generated content there is redistributable under some kind of creative commons license that makes it easy to reuse. Perhaps HN has a similar rule? Not that I'm aware of, though.
But then I'm not sure how much code is enough to be considered copyrightable. Is "2*2" copyrightable? Clearly not, because it's too trivial. Where is the line?
Thanks, Grishka.
But seriously, write this as a blog post, more people need to know about this, and I'd like to have something "authoritative" to send references to.
Which is documented in this gist: https://gist.github.com/mar-v-in/2a054e3a4c0a508656549fc7d0a...
Thanks, I didn't notice that. Reading this, I'm kind of surprised that Signal doesn't offer an OpenStreetMap build, as it seems like it'd be more in line with their philosophy.
The protocol itself was easy, but my problem was that Google Play Services has a special permission to exempt itself from power management and, more importantly, to grant that permission temporarily to individual apps when they have a notification. I don't think I ever found out how to work around this.
Ltt.rs has support for both UnifiedPush and FCM and is fully open source. The code difference between UP and FCM is very very minimal since - as I said - both are just WebPush endpoints.
I miss the times IM software respected, or at least didn't fight hard to defeat, the end-user's freedom to computing on their own device, which includes viewing and sending messages through whatever interface they see fit, including indirectly as part of a script/automation. But that was all before E2EE era, hell, before mobile dominance.
> might as well use Whatsapp.
- still scrapes metadata
- run by company who's entire objective is to profile you
Stop being so ridiculous. You can criticize Signal (and there's plenty to critique) but that's just silly. What, should we also just use Telegram, where E2EE is off by default? You know Signal is open source, right? That's why Molly exists. They can run their own servers too.
Now I wish you could do both. Talk in both signal and the decentralized molly servers. I wish signal had a mesh like feature since it's way harder to snoop on conversations if you have to be physically near. I even wish Signal made the signal sticker site accessible from inside the app. There's tons of things they should do but let's not pretend that just because they're not perfect that we should use apps from a company whose motto might as well be "be evil".
There are plenty of others, all with their pros and cons.
Ultimately, the network effect is usually the hardest parameter to overcome.
Ironically, the only person who mentioned wanting to use Signal instead of WhatsApp in my network circle is my 71-year-old mother.
edit: lol, i assumed you were the OP. ignore me
> - run by company who's entire objective is to profile you
And? Pick your poison. Being profiled by Meta isn't high enough on my threat board to be worth switching to E2EE as a countermeasure; in fact, I only use E2EE because Meta forced it on me with Whatsapp (new network effects) and by enabling it in Messenger (old network effects).
But that's beside the point. The point is, I did not expect such an alignment of outcomes between user-hostile corporations and grassroots OSS developers, as both fight to saturate the IM space with network-effects-driven apps that disenfranchise end users "because security".
I imagine Signal is also more than happy about remote attestation and upcoming Android developer verification? All more ways to protect the integrity of the network and ensure the user isn't accidentally stripping E2EE by doing something silly like perusing their messages in ways not prescribed by the developer?
> What, should we also just use telegram where E2EE is off by default?
I don't like it because it made other choices that led to their larger network being infested with scammers and all kinds of shady types, but at least the client itself doesn't suck :).
> I imagine Signal is also more than happy about remote attestation and upcoming Android developer verification?
No, why would you think so? They were very against the Europe encryption issue[0]. You can also go check Meredith's Twitter[1] or Moxie's before her. Their stance on things has been consistent and clear. But does Signal comply with government warrants? Yes, absolutely. But they don't get many requests, because they designed their system to assume Signal itself is an adversary to the user[2].
> I don't like it because it made other choices that led to their larger network being infested with scammers and all kinds of shady types, but at least the client itself doesn't suck :).
Does it? Or have you made an assumption? I get a spam message on Signal maybe twice a year. Over SMS, about 10 times a week (though a bit more than half are auto-blocked); over the phone, at least 5 calls a week, and sometimes I get that in a day; over WhatsApp, a few a month. So I don't really buy your argument, but I'm only a sample size of one. But I also know there's no technical reason for your argument either. I'm also pretty sure we've had this conversation in the past, so what's up? You're one of the most active HN users[3]; you're easy to remember.
[0] https://signal.org/blog/uk-online-safety-bill/
[2] https://signal.org/bigbrother/
[3] For anyone interested https://news.ycombinator.com/leaders
> No, why would you think so? They were very against the Europe encryption issue[0]. You can also go check Meredith's Twitter[1] or Moxie before her. Their stance on things have been consistent and clear.
Never was a fan of Moxie, and I'm not going to read Twitter backlog right this moment to confirm what Meredith thinks, but my reasoning here is pretty simple: remote attestation and developer verification are tools that enable what Signal seems to want, which is to be its own walled garden with only one app from one official source - this is necessary for them to deliver on the promises of privacy and security they make.
>>> What, should we also just use telegram where E2EE is off by default?
>> I don't like it because it made other choices that led to their larger network being infested with scammers and all kinds of shady types, but at least the client itself doesn't suck :).
> Does it? Or have you made an assumption? I get a spam messenger on Signal maybe twice a year. (...)
Here I meant Telegram, not Signal. I don't get any spam on Signal at all, but then hardly anyone in my circles uses it. WhatsApp, maybe few times a year. Telegram, all the time, and I only keep it because my local Hackerspace moved over to it from IRC, + it's actually useful for some automation here and there.
People have different preferences; not everyone sees privacy from powers that be as the only requirement. I for one care more about my freedom of computing on my own device; I'm not that worried about my shitposting remaining private. I don't like the push for E2EE messaging in its current form, because I see incentives of both megacorps and OSS devs align against my own. I speak up, because messaging is inherently network-effects based, and I don't want to end up in a situation where all communication goes through end-user-opaque black boxes, regardless of whether they're corporate or community-made.
> I'm also pretty sure we've had this conversation in the past, so what's up?
I know your username and remember us having some back-and-forth on various topics, but I don't recall this particular one.
Edit: Found it! "Careless Whisper: Exploiting Silent Delivery Receipts to Monitor Users on Mobile Instant Messengers" https://arxiv.org/abs/2411.11194
i don't use any of the enhancements, but it does receive notifications over the websocket it keeps open in the background vs only waking up on an FCM push notification like the regular app
i wonder if the supply chain risk of having a second entity (that signs the apks!) involved is really worth it to anyone... hope signal can be published on Accrescent or similar someday :p
3rd party repositories ship whatever.
For apps i do install from f-droid repos (official or otherwise) i prefer https://github.com/Droid-ify/client
FWIW you can actually do the FOSS version of this now with UnifiedPush support (rolled out in Molly a while back).
It's a massive saver on battery life but it does require that you have a server set up to forward notifications to your unifiedpush distributor.
Happy user for many years now, thanks for the support!
In fact, that's not a build by the Guardian Project, but (when I tried) a redistribution of Signal's https://github.com/signalapp/Signal-Android/releases builds.
I'm not sure why they're doing it; anyhow, I'd at least avoid doing the initial installation through that repo, since you're trusting an additional party for no gain that I can think of (updates are OK because the signature needs to match that of the installed version).
> Surprised how many people don't seem to know about it.
I'm pretty sure people just want to be angry. I mean, look at how many people are arguing that updating is... bad. I cannot and will not take those people seriously. It's just such a laughable position.
There are a few reasons for that.
1. The link to APK cannot be found on the official site[0], so it needs to be looked up in a search engine.
2. Even when downloading from the site, they try to scare you away with a warning [1]. The reason for the warning could be avoided by hosting their own F-Droid repo, but they refused, claiming you can just download the APK and not listening to reason[2].
Though people using F-Droid can still get Signal through the Guardian repository [3].
The thing about the Signal APK and the Guardian one is that they still have the so-called "crap" in the final APK; it just runs a background service when the required Google services are not detected, causing battery drain for many[4].
The drain could also be avoided by supporting UnifiedPush (it can fall back to FCM when it's detected), but they don't want to do that either[5].
[0] https://signal.org/download/
[1] https://signal.org/android/apk/
[2] https://community.signalusers.org/t/how-to-get-signal-apks-o...
[3] https://guardianproject.info/fdroid/
[4] https://github.com/signalapp/Signal-Android/issues/9729
[5] https://community.signalusers.org/t/use-gcm-fcm-alternatives...
I think they've relaxed this quite a bit more recently
Also, as others pointed out, Moxie isn’t part of Signal any longer and hasn’t been for a while.
I'm pretty unconvinced that this is a sane or useful thing to do.
Can someone explain, is this different from adding (up to 5) devices to your Signal account? Are these devices all "primary" or something?
I would ideally want to not have one device being the master and the rest linked to it (e.g. Element can do that for Matrix), but that might be too big a change. And as far as I know, Molly does not try to solve that.
I used both desktop and android with no issues.
I have multiple android devices and I can only log into signal on one. I can have as many desktop slaves as I want tho.
But sadly the competitors are as bad, just in different ways. Why has nobody yet managed to build a good IM client? It does not seem like we have come far from what we had back in the Pidgin days.
Whatsapp mentions don’t work (just show the name of the mention to the other users), and polls or albums don’t work.
Messenger disconnects every couple of days at this point.
Pasting links won’t always expand.
Attachments are always hit or miss.
So many small other things. Still love it.
This is par for the course with chat backups, though.
Messenger - Bad - No way to save chat responses of people you have talked to. This means you only ever have one side of a conversation, making it meaningless.
Twitter DMs - Bad - See Messenger.
Jami - Ehhhh - Saves a git local repository of messages. The only problem is message syncing is effing abysmal.
Dino (XMPP) - Bad - Does not allow backing anything up, this is "intentional". Depending on which protocol you use, as soon as you move to another device all the messages you _had_ are retroactively converted to Cannot Decrypt. They're my effing messages!
Discord - Good - Discord History Tracker (tedious to use but slurps everything up into a sqlite3 database that is itself, an official archival format)
WhatsApp - Good - Dumps a text record + files/images/etc. onto the phone's filesystem. This is reasonably easy to archive.
Signal - Mediocre - If you have an old Signal backup from 2018? That you could only transfer off your phone by deleting old messages? lmao you're effed. Load up a version from ten years ago, gradually update it and then maybe, MAYBE you can extract the sqlite3 archive? These days you have a .signalbackup or whatever which is an encrypted archive, and I assume that there's a tool to decrypt it, but uhhhhhh. Last I tried to use it it required way more RAM than I had accessible.
I'm not sure why anyone would trust Telegram.
Maybe you don't believe Durov's statement[0] about it. But is there any actual evidence anywhere that they've ever violated the secrecy of non-e2e private groups or messages for anyone? I've yet to find any.
But besides this, there is really a strong need for a web client, just like Telegram or WhatsApp. If only the protocol can be extended in such a way that it allows for integrating into a web app, that would be incredibly great.
I have always assumed no Signal web client was a choice made to improve security.
All our communication is over signal, so it is a nice record to have.
Signal and other messaging apps offer a 'search' bar across all sessions & history, so I doubt I'm the only one.
It's hard for me to imagine being so present-focused such a history wouldn't be personally useful.
Or, so worried about "someone [using] it against [me] in court" that I'd need more than the occasional auto-expiration, and specifically my messenger "protecting" me with intermittently-enforced loss-of-histories (on just theft/loss/hard-failure of primary device).
I had no issues at all since it's called Signal. I have no idea what people do with it to cause problems at all.
It would still be interesting to find out.
I've been on Android only btw.
The Android app is stable enough, but the UX of having to look at the phone while typing a reply on a normal keyboard is annoying. This is why I prefer Telegram every time.
I'm not sure what's going on for you, but it seems really abnormal.
There is now at least a reminder on the phone app that will prompt you a few days before one of your desktop apps is about to get unlinked.
Your portrayal of my comment is not even close.
Anecdotes that sometimes those problems don't occur are nearly worthless. Of course that's true - the original anecdotal complaint already implicitly relies on, & grants, the idea that there's some default, "hoped for" ideal from which their experience has fallen short.
To chime in, "never had your problems" thus adds no info. Yes, people lucky enough not to hit those Signal limits that cause others to lose data exist, of course. But how does that testimony help those with problems? Should their frustration be considered less important or credible, because of your luck?
The as-if portrayal is one way your anecdote will be perceived, even if that wasn't your intent.
Most apps on the market are E2E by default these days, and that introduces a whole host of complications. It's the wrong tradeoff for 95+ percent of users. If you can only afford 1 device and only switch to a new one when the old device breaks, E2E is a disaster in the making. For the overwhelming majority of users, making sure that they have access to their messages when they switch devices is far more important than being protected from the NSA. This is something most signal advocates are completely unwilling to talk about.
I didn't actually know Wire was FOSS.
That's (almost) incompatible with E2E encryption. You could do it via "social proof" or server-side secure enclaves and pins, but that's about it.
Compare with Signal, where there is only one allowed server entity and hardly anyone verifies identities, making man-in-the-middle attacks trivial.
https://www.ndss-symposium.org/wp-content/uploads/2018/03/09...
This adds some detail about how Signal can do MITM attacks:
https://sequoia-pgp.org/blog/2021/06/28/202106-hey-signal-gr...
Some of the details might have changed since publication. My current understanding is that Signal doesn't even bring up the idea of identity verification if a user has not previously done it. So if anything, things have gotten worse.
AFAIK Signal only blocks due to security patches, which is on a much longer timeframe than a few weeks.
Most of the time there is zero explanation for the update. They are just training their users to auto accept updates with no thought about why, which in itself is a security risk.
If signal really is pushing these updates for "security" then it must be one of the most insecure apps ever built. I legitimately can't think of another app or program that updates more frequently... Maybe youtube-dl?
> It sounds good in theory but signal updates are beyond excessive
Those are two different arguments. Updating too frequently is not equivalent to "doesn't need to be updated." I can agree that they update a bit too frequently, but that's nowhere near the argument about never updating.
A program cannot be secure if it does not update. Full stop.
> Most of the time there is zero explanation for the update
There's always a changelog. If you, unlike most people, are interested, it is all open source:
https://github.com/signalapp
https://github.com/signalapp/libsignal/releases
https://github.com/signalapp/Signal-Android/releases
https://github.com/signalapp/Signal-iOS/releases
https://github.com/signalapp/Signal-Desktop/releases
I would suggest looking at the actual commits and not just the release notes. Libsignal usually has more info about the security changes.
> legitimately can't think of another app or program that updates more frequently
Probably because they do so silently.
>> I would suggest looking at the actual commits and not just the release notes
> they want an app that never NEEDS to be updated
That requires the programmer to be omniscient and clairvoyant. You can get pretty close if you're in a static environment like a machine that never connects to the internet and the hardware never changes and no other software on the machine changes, but neither a phone nor a communication platform allows for that.
Frequent updates have the downside of more frequent breakage and of course extra bandwidth usage. Let users make the trade off between those downsides and the risk of zero days.
You're putting everyone who you've talked to at risk. I don't know about you, but I prefer not having to worry about whether I'm communicating with someone whose installation can easily be pwned by any halfway incompetent attacker.
> a update that I not personally security reviewed
Great, can you give me a summary of the updates for the Linux kernel, the Android kernel, the iOS kernel, libssl, and all the drivers that updated this week on my Arch machine?
> Sorry, thats not a argument.
Neither is pretending you're reviewing hundreds of thousands of lines of code a week. This is Hacker News, man; some of us actually understand how computers work.
> update or not shouldn't be taken away from users.
So turn off auto-update? You can do this everywhere except iOS.
> Let users make the trade off between those downsides and the risk of zero days.
Those trade-offs are that if your version is too old (the protocol has been updated several times and you are out of the lifetime), then you can no longer communicate with those who have updated, as you will make their communications insecure. If you don't want to update, that's fine. But your preference for not updating doesn't get to override my preference for secure communication. It is literally the whole point of Signal... if you don't want security and privacy then don't use Signal; that's your choice and no one is forcing you to use the app.
I had to live without a phone for about a year. First my phone broke and I couldn't repair it or buy a new one, then I lost my phone number due to unpaid fees. I kept using the Linux Electron app, updating it as often as possible.
I saw this message on the Linux app after a while:
> Open Signal on your phone to keep your account active
I couldn't open Signal on my phone or install a new Android Signal app even on an Android VM because I wouldn't be able to get the new app verified without access to the phone number I registered with.
I wrote an email to the support team and got this reply:
> Using Signal for iOS or Android as your primary device in order to link and use Signal for Desktop was always a requirement as a QR code must be scanned to link a device. The primary device must remain active during this usage. There is no way around this.
> For more information and recovery steps please see our faq page here: https://support.signal.org/hc/articles/8997185514138-Re-conn...
> Otherwise your account will be deactivated, and you will need to reinstall and register for Signal using an up-to-date version of the application.
And as to when that deactivation would happen, they replied:
> We're unable to provide a specific timeline. We recommend registering for a Signal account on a smartphone and linking your Desktop to that smartphone within the next few weeks.
From their link it seems like there's an actual technical reason behind this. I'm not sure if it's true, but it feels a bit suspect.
So, after a couple of months of seeing this message in the Linux app, I woke up with a deactivated Signal account. I asked some of my Signal contacts to use Matrix until I could get a new phone number. It seems much better in this regard: it's not mobile-first, and it doesn't require ongoing access to a phone number. The basic features are all there, even if there are a few minor annoyances and bugs in the clients here and there.
[0] They also use it as a means to help with the social graph. Building a social graph is pretty difficult and you don't want to do it completely from scratch. This is the same reason social media wants you to import your phone contacts and email contacts. The difference is that the "side benefit" to that is that they get data harvesting rather than security.
> We're unable to provide a specific timeline.
> I'm not sure if it's true, but it feels a bit suspect.
It's because Signal doesn't track metadata. The reason they can't tell you a specific date is that they don't know how to associate your real-world identity with your Signal account. The information is unavailable to them! Which is the whole point of Signal.

Honestly, the best solution would have been to buy a cheap phone or something like a VoIP number. I don't know your situation, but it seems it was not easy for you to go a year without a phone number. I definitely think Signal should do better here, but I don't think the result is unreasonable. It brings up an edge case they probably didn't consider, but having a phone number "abandoned" for a year sounds like a very low-probability situation. Being reliant on phone numbers, they also have to garbage-collect them, right? Because a phone number is not a unique lifelong identifier for a person. So while I agree your situation sucks and is very frustrating, I hope you can recognize that it is (at my best guess) a very unlikely one: the phone number is being sat on but unused, and the squatting is being done by a legitimate person rather than a scammer.
They can do better, for sure, but I don't think I'd judge a platform harshly by the results of an extremely odd outlier situation.
> update thats infected by some government trojan?
Or even just a hacker!

Unfortunately, you don't. But this is true for ANY app.
Fortunately, Signal is open source, so you can go read the code. Unfortunately, that's a lot of work. But fortunately, if you believe a certain checkpoint is secure (your current install), you only need to review what's new. You can also build from source if you don't trust the app store.
Fortunately with open source you also get the benefit of others. Maybe you don't look through everything, but there's definitely other people looking through some things. And with something like Signal, you can be pretty certain that there will be a big uproar if something devious is pushed.
You always need trust, unfortunately. But with closed source you have to trust one entity and get no way to verify. With open source you have to trust very few and can even verify yourself.
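The "verify yourself" step usually comes down to a reproducible build: compile the published source yourself and check that the result matches the distributed binary. A minimal sketch of the final comparison step in Python (file paths are illustrative; in practice signing blocks differ, so real tooling such as Signal-Android's documented reproducible-build process compares APK contents minus signatures rather than raw file hashes):

```python
import hashlib

def sha256sum(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(distributed_apk: str, local_apk: str) -> bool:
    """True if the store-distributed artifact and your local build are byte-identical."""
    return sha256sum(distributed_apk) == sha256sum(local_apk)
```

If the hashes match, the binary you installed is exactly what the public source produces, and "trust the app store" reduces to "trust the source you can read."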
With presage, Whisperfish has a high-level Rust library that wraps the official libsignal libraries (which are also written in Rust) and makes it much easier to write clients; the official Signal repos only provide Java/TypeScript/Swift wrappers. As presage is rather obscure, I thought some HN readers might appreciate the link.
But not sure if even the upstream Signal client has this.
or to set up a timed SOS signal if you don't disarm it within a given deadline.
It's not permanent; you just enable it for a few minutes until you find each other.
Edit: Found it! "Careless Whisper: Exploiting Silent Delivery Receipts to Monitor Users on Mobile Instant Messengers" https://arxiv.org/abs/2411.11194
Signal still requires a phone number for registration.
The fact that this "improved" version does not show a single screenshot of the UI on their own website, signals to me (pun intended) that this app will address none of my wishes.
It really is weird not to show a single screenshot when the 4th listed feature is design ("Material You | Extra theme that follows your device palette").
/s
- they’re open source and people like me regularly read parts of their code and in some cases use their code elsewhere. Also several undergraduates and PhD’s have written research papers on the signal protocol. It’s also the subject of a lot of security research (there was a good talk at defcon this year that found some minor privacy issues with signal notifications)
- no one has built a decentralized e2ee messaging app that's actually secure and has privacy anywhere near the bar Signal sets. Matrix is getting close; they've recently made some encouraging changes, but it will take some time to verify.
- Moxie the founder of Signal gave a talk about the challenges of building something like signal in a decentralized environment - https://youtu.be/1W5fuqySBnE
- Signal is a nonprofit. They have stated repeatedly that they will shut down the app in regions or countries that mandate backdoors by law.
* Do you know of anything better?
* Do you not trust the Signal Organisation? They aren't able to subvert the encryption on their servers, and they have publicly stated that they will leave a region before integrating client-side scanning. I for one believe them, since it's their raison d'être.
But these alternatives are all niches compared to Signal. Which is saying something, considering that Signal itself is a niche compared to WhatsApp.
If not, I'll stick with Matrix. It's got privacy at least as good as signal, with Discord's UX.
I use Signal on Linux, and Molly on my tablet. The issue I ran into was in trying to use Signal on both.
And given that Molly could fix it in this case, it can't have been that hard to fix.
"If you wish to use the same phone number for both Molly and Signal, you must register Molly as a linked device. Registering the same number independently on both apps will result in only the most recently registered app staying active, while the other will go offline."
ThatPlayer•2mo ago
Specifically I'm using Signal as the main device, with Molly as the linked device on 2nd phone.