Forbes recently published a report about court documents in a New York gun-running case that seem to imply the FBI can access encrypted Signal chats.
Court documents obtained by Forbes not only attest to that desire, but indicate the FBI has a way of accessing Signal texts even if they’re behind the lockscreen of an iPhone.
The clues came via Seamus Hughes at the Program on Extremism at the George Washington University, in court documents containing screenshots of Signal messages between men accused, in 2020, of running a gun trafficking operation in New York. (The suspects have not yet entered a plea and remain innocent until proven guilty.) In the Signal chats obtained from one of their phones, they discuss not just weapons trades but attempted murder too, according to documents filed by the Justice Department. There’s also some metadata in the screenshots, which indicates not only that Signal had been decrypted on the phone, but that the extraction was done in “partial AFU.” That latter acronym stands for “after first unlock” and describes an iPhone in a particular state: locked, but unlocked at least once since it was last powered on. An iPhone in this state is more susceptible to data extraction because encryption keys remain in memory. Hackers or hacking devices with the right iPhone vulnerabilities can then piece together those keys and start decrypting private data on the device.
But this seems to be less about exploiting Signal than about exploiting vulnerabilities in the device itself to gain access to Signal (and once you have access to the device, gaining access to messages is not going to be difficult).
Signal’s Moxie Marlinspike made this point on Twitter, responding to a more inflammatory version of the story from Zero Hedge.
The Freedom of the Press Foundation has an excellent article from earlier this year, Locking down Signal, that outlines best practices for using encrypted text apps like Signal while avoiding side-channel attacks, in which attackers use malware or physically hack a device (as the FBI apparently did) to get at the messages. As the FPF nicely summarizes it,
The weak points in end-to-end encrypted conversations are the “ends”— the physical devices where the messages arrive in human-readable text.
Let’s Encrypt recently announced that it had issued its billionth certificate on February 27, 2020.
We issued our billionth certificate on February 27, 2020. We’re going to use this big round number as an opportunity to reflect on what has changed for us, and for the Internet, leading up to this event. In particular, we want to talk about what has happened since the last time we talked about a big round number of certificates – one hundred million.
One thing that’s different now is that the Web is much more encrypted than it was. In June of 2017 approximately 58% of page loads used HTTPS globally, 64% in the United States. Today 81% of page loads use HTTPS globally, and we’re at 91% in the United States! This is an incredible achievement. That’s a lot more privacy and security for everybody.
Another thing that’s different is that our organization has grown a bit, but not by much! In June of 2017 we were serving approximately 46M websites, and we did so with 11 full time staff and an annual budget of $2.61M. Today we serve nearly 192M websites with 13 full time staff and an annual budget of approximately $3.35M. This means we’re serving more than 4x the websites with only two additional staff and a 28% increase in budget. The additional staff and budget did more than just improve our ability to scale though – we’ve made improvements across the board to provide even more secure and reliable service.
Session is “an end-to-end encrypted messenger that removes sensitive metadata collection, and is designed for people who want privacy and freedom from any forms of surveillance.”
Available for Android, iOS, Mac, Windows, and Linux, Session aims to up the ante on secure messaging apps by eliminating the collection of metadata such as phone numbers from users. When signing up for a Session account, users are assigned a Session ID, which is a public key, but that Session ID is not connected to any personal information about the user.
What will Session do if compelled by a court to reveal user identities?
As Session is a project of the Loki Foundation, court orders in situations such as this would be targeted at the Foundation.
The Loki Foundation would comply with lawful orders. However, the Loki Foundation could not reveal user identities simply because the Foundation does not have access to the data required to do so. Session account creation does not use or require email addresses or phone numbers. Session IDs (which are public keys) are recorded, but there is no link between a public key and a person’s real identity, and due to Session’s decentralised network, there’s also no way to link a Session ID to a specific IP address.
The most the Loki Foundation could provide, if compelled to do so, would be tangential information such as access logs for the getsession.org website or statistics collected by the Apple App Store or Google Play Store.
The folks behind Session have a technical white paper explaining how the system works, Session: A Model for End-To-End Encrypted Conversations With Minimal Metadata Leakage.
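To make the “a Session ID is just a public key” idea concrete, here is a minimal Python sketch of identity-free account creation. It is an illustration only: real Session derives the ID from an X25519 keypair, whereas here a random hex string (with a leading 05 mimicking the key-type prefix Session IDs carry) stands in for the public key. The point is what the account record does not contain: no phone number, no email, no name.

```python
import secrets

def create_account():
    """Sketch of identity-free account creation.

    A random 66-character hex string stands in for the real
    X25519 public key that Session would derive. Nothing in the
    returned record links the ID to a real-world identity.
    """
    session_id = "05" + secrets.token_hex(32)
    return {"session_id": session_id}

account = create_account()
print(account["session_id"])  # 66 hex characters, tied to nothing personal
```

Because the only stored identifier is this key-like string, there is simply nothing identity-revealing for a court order to extract, which is the point the FAQ above is making.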
Mozilla created a bit of controversy today by enabling DNS over HTTPS by default in the United States.
DoH will encrypt DNS traffic from clients (browsers) to resolvers through HTTPS so that users’ web browsing can’t be intercepted or tampered with by someone spying on the network. The resolvers we’ve chosen to work with so far – Cloudflare and NextDNS – have agreed to be part of our Trusted Recursive Resolver program. The program places strong policy requirements on the resolvers and how they handle data. This includes placing strict limits on data retention so providers – including internet service providers – can no longer tap into an unprotected stream of a user’s browsing history to build a profile that can be sold, or otherwise used in ways that people have not meaningfully consented to. We hope to bring more partners into the TRR program.
I agree with Bruce Schneier that this “is a great idea, and long overdue.”
A lot of the criticism of DNS over HTTPS is reminiscent of the criticism over TLS 1.3. Enterprises took advantage of poor security in DNS and TLS 1.2 to manage their networks, which is understandable. But we shouldn’t kneecap the security of the 3.2 billion people worldwide who use the Internet in favor of special interests.
A lot of that criticism also involves “experts” talking out of both sides of their mouths. For example, Catalin Cimpanu offers contradictory complaints in ZDNet that, on the one hand, DoH doesn’t prevent ISPs or other network providers from tracking users.
But, in the same article, Cimpanu argues that DoH bypasses enterprise policies because it makes it impossible for those enterprises to track users.
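Mechanically, DoH is straightforward: the client packs an ordinary DNS question into wire format and ships it inside an HTTPS request to the resolver (RFC 8484). Here is a minimal Python sketch of building the GET-style request URL, using Mozilla’s Cloudflare TRR endpoint; an on-path observer sees only a TLS connection to the resolver, not the name being looked up.

```python
import base64
import struct

def doh_query_url(hostname, resolver="https://mozilla.cloudflare-dns.com/dns-query"):
    """Build an RFC 8484 DoH GET URL for an A-record lookup.

    The DNS question is encoded in standard wire format, then
    base64url-encoded (with padding stripped, per the RFC) into
    the `dns` query parameter.
    """
    # Header: ID=0 (RFC 8484 suggests this for GET so responses cache),
    # flags=0x0100 (recursion desired), one question, no other records.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed, terminated by a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode() for label in hostname.split(".")) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    dns_param = base64.urlsafe_b64encode(header + question).rstrip(b"=").decode()
    return f"{resolver}?dns={dns_param}"

print(doh_query_url("example.com"))
```

Issuing the actual HTTPS GET against that URL (with an `Accept: application/dns-message` header) returns a standard DNS response in the body; the only parties who see the query are the client and the resolver.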
Back in 2018, Google announced that beginning with Android 9, it would prevent apps from using unencrypted connections by default. As of December 2019, Google notes that 80 percent of all apps in the Google Play store use TLS, and that rises to 90 percent of all apps targeting Android 9 and higher.
Android 7 (API level 24) introduced the Network Security Configuration in 2016, allowing app developers to configure the network security policy for their app through a declarative configuration file. To ensure apps are safe, apps targeting Android 9 (API level 28) or higher automatically have a policy set by default that prevents unencrypted traffic for every domain.
Today, we’re happy to announce that 80% of Android apps are encrypting traffic by default. The percentage is even greater for apps targeting Android 9 and higher, with 90% of them encrypting traffic by default.
Since November 1, 2019, all app updates, as well as all new apps on Google Play, must target at least Android 9. As a result, we expect these numbers to continue improving. Network traffic from these apps is secure by default and any use of unencrypted connections is the result of an explicit choice by the developer.
That last sentence is a bit concerning. If app developers want to explicitly make their apps communicate through unencrypted connections, that’s fine, but as far as I can tell there is no way that consumers are made aware of this.
Just as modern browsers warn me when a website I’m visiting doesn’t use encryption, Google should inform users when an app doesn’t either. I’d be happy with a notice on the Google Play store page for such apps, something like: “This app sends network traffic over unencrypted channels.”
(Yes, users could set up a packet analysis tool to look at the data their phone is sending, but they shouldn’t have to do so).
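For reference, the declarative Network Security Configuration file Google describes is plain XML. This is a minimal sketch; `legacy.example.com` is a hypothetical domain a developer might explicitly opt back into cleartext:

```xml
<?xml version="1.0" encoding="utf-8"?>
<network-security-config>
    <!-- Default for apps targeting Android 9+: cleartext blocked everywhere -->
    <base-config cleartextTrafficPermitted="false" />
    <!-- The "explicit choice": opting a single domain back into cleartext -->
    <domain-config cleartextTrafficPermitted="true">
        <domain includeSubdomains="true">legacy.example.com</domain>
    </domain-config>
</network-security-config>
```

The file is referenced from the app’s manifest via `android:networkSecurityConfig="@xml/network_security_config"`, so the developer’s “explicit choice” does leave a trace in the app’s own configuration, just not one surfaced to users on the Play Store page.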
NoSnoop is a Windows-based tool that will let users know if their SSL/TLS connections are being subjected to a man-in-the-middle attack.
NoSnoop is a standalone, browser-independent application that will perform SSL/TLS handshakes with a list of 250 popular websites and examine the certificate chains received from each server. It will alert on any unexpected certificates.
NoSnoop will check for obvious cases (such as interception by a local proxy, your employer’s SSL inspection gateways, or a malware infection), as well as more advanced attacks (for instance, if the root cert is valid but issued by an unexpected organization or country).
An entire scan typically takes less than 30 seconds.
This is currently in beta, so “bugs and/or false positives detections should be expected.”
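NoSnoop’s internals aren’t published, but the underlying idea can be sketched in Python: fetch the certificate a server actually presents and compare its SHA-256 fingerprint against a known-good pin. (NoSnoop examines full chains and issuer details across 250 sites; this sketch pins only the leaf certificate, and any host and pin values you feed it are your own.)

```python
import hashlib
import socket
import ssl

def cert_fingerprint(der_bytes):
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_bytes).hexdigest()

def fetch_leaf_cert(host, port=443):
    """Perform a TLS handshake and return the server's leaf cert (DER)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert(binary_form=True)

def looks_intercepted(host, known_good_fingerprint):
    """True if the presented cert differs from the known-good pin.

    An SSL-inspection proxy's certificate often validates locally,
    because its root CA is installed on the machine -- which is exactly
    why a fingerprint comparison, not just chain validation, is needed.
    """
    return cert_fingerprint(fetch_leaf_cert(host)) != known_good_fingerprint
```

The design point is that ordinary certificate validation is not enough to detect interception: a corporate gateway or local proxy installs its own root CA, so its forged certificates validate cleanly, and only comparing against an out-of-band known-good value reveals the substitution.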