I asked my Twitter followers what I should talk about in this issue, and those trolls picked PGP and security vulnerability reporting, so here goes nothing.

As you probably know, the school of modern cryptography thinking I subscribe to says that tools and protocols should be small, simple, and focused on a specific use case. Only then can you make opinionated choices that are safe by default, make the tool impossible to use wrong, and design with a single well-oiled joint, avoiding all the issues that come from protocol negotiation, downgrades, and misuse.

This means that replacing PGP is a painstaking effort of finding and breaking down the use cases of this rusty old Swiss Army knife, and finding simple dedicated solutions for each of them. (It's also why people who say "age can't replace gpg, it doesn't do enough things" are missing the point by a few nautical miles.)

Latacora has a good blog post going through the alternatives for use cases like talking to people (a Signal protocol implementation), encrypting email (seriously, don't), signing releases (signify/minisign, or more recently OpenSSH, which I'll talk about in a future issue), backups (Tarsnap, to which I'd add restic), application usage (libsodium, to which I'd add the Go crypto libraries), encrypting files (age)...

The one recommendation they make that makes no sense to me is to use Signal for vulnerability and bug bounty reports. As the security coordinator of the Go project: hell no. Vulnerability reports need to go to a group, not an individual, and while I know Signal Desktop exists, its UX is too tied to mobile (rightfully so) for this use case. So golang.org/security still has a PGP key on it.[1]

However, I think this use case is actually pretty easy to replace!

First, a quick word about just using plain email. Email encryption in transit is opportunistic at best, and there is no real way to know whether a message will be transmitted in plaintext before you send it. I'm skeptical anyone will run STARTTLS downgrades to get your 0-days, but reasonable people disagree. The good news is that the core flaw of email, the fact that it asymptotically approaches compromise because it's a forever-searchable archive that only has to be breached once to expose the whole history, does not apply here: reports stop being valuable once the vulnerability is fixed.
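
If you want to see how opportunistic it is, here's a rough sketch in Go that asks a mail server whether it even advertises STARTTLS. Treat it as an illustration, not a test suite: the MX hostname is just an example, an active attacker can strip the advertisement anyway, and many networks block outbound port 25.

    package main

    import (
        "fmt"
        "log"
        "net/smtp"
    )

    func main() {
        // aspmx.l.google.com is one of Gmail's MX hosts; swap in the MX
        // of the domain you care about.
        c, err := smtp.Dial("aspmx.l.google.com:25")
        if err != nil {
            log.Fatal(err)
        }
        defer c.Close()

        // Extension sends EHLO if needed and reports whether the server
        // advertises the extension. Even a "yes" is only opportunistic:
        // an active attacker can strip the advertisement, and the sender
        // will happily fall back to plaintext.
        ok, _ := c.Extension("STARTTLS")
        fmt.Println("STARTTLS advertised:", ok)
    }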

My spicy take is that if you are concerned about email, instead of trying to be cool you should skip PGP and just put up a Google Form.

My assumption is that you use either email or a ticketing system within your team to communicate about reports, and some sort of code review system to prepare the fix. Anything that implies compromise of one of those is out of the threat model: why would an attacker who can listen in on your discussion of how to fix a vulnerability care whether the report came in encrypted with PGP?

For a concrete example, when you email an encrypted report to security@golang.org, we decrypt it and send it unencrypted to the internal list that triages vulnerabilities. That's fine because if you popped the Google mailbox of a Go security officer, we'd have much, much larger problems, and anyway that's where the review requests from the internal Gerrit instance are going to land in plaintext.

Given that assumption, setting up a secure vulnerability reporting channel is just an exercise in finding a secure way to get a report into the communication and coordination channels you'll use to fix it.

In 2020, the most secure information submission channel is an HTTPS form. This is not only true in absolute terms (and you're not really allowed to have an opinion on the CA ecosystem in 2020 unless you know what m.d.s.p. is[2]), but the security of your coordination and resolution channels already relies on HTTPS anyway, so the reporting channel might as well.

If you are skeptical, we can show that the security of PGP reduces to that of HTTPS anyway: the security@golang.org PGP key has no signatures, and everyone trusts it because they fetched it from https://golang.org/security. We might as well skip the PGP step and just put a form there.

The specific implementation of that form really depends on your circumstances; I am not trying to sell you Google Forms specifically.[3] If you talk to each other over Gmail or G Suite, go ahead and make a Google Form with email notifications and file upload. GitHub lets you create private advisories and add external users to them, so you can coordinate that with a reporter over email (and I expect they will add the ability for the public to file reports sooner or later). If you want a dedicated service, sign up for HackerOne.[4] You can even make it a ZenDesk form: if attackers control that, the XSS report is the least of your problems.

What about responding? Some of these services are two-way (GitHub, HackerOne, ZenDesk), but if you are using a Google Form you can just ask for an email address and optionally a PGP key. In the unlikely circumstance that you need to send sensitive details in the response and the reporter cares about not using plain email (this has never happened since I've been the Go security coordinator), you can set up a throwaway PGP key then. This is much less annoying than maintaining a long-term key because 1) you can't be forced to use PGP before seeing the report's severity, 2) it will be exceedingly rare, so you can avoid setting up anything until and unless you need it, and 3) long-term keys are one of the major problems of PGP.
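
For what it's worth, a throwaway key is a couple of commands with modern GnuPG (2.1+). Something like this, where the user ID is a placeholder and the 30d expiration keeps the key honestly short-lived:

    gpg --quick-generate-key "Example Security <security@example.com>" default default 30d
    gpg --armor --export security@example.com

Send the exported key back over the same channel the report came from, and let it expire when the fix ships.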

To recap, my point is that you are probably fine with plain email for vulnerability reports, but if you want to go the extra mile, an HTTPS form will probably take you the rest of the way.

Of course, there's the Debian exception that proves all PGP rules. Debian is the only community I'm aware of that actually built a web of trust in active use, and I bet they use it for security vulnerability reports, too, as well as to discuss and develop and ship the fix. You're not Debian.

Follow-ups

I'm really enjoying the conversations that stem from the newsletter. Here are a couple of follow-ups to the last two issues.

X25519 is more associative than I thought

The answer to "is X25519 associative" is still "sometimes", but more "most of the times" than "half of the times".

Steve dropped a solution on Twitter to the pair of scalars I said could not be combined. How come? I had forgotten that X25519 only operates on the x coordinate of the curve. For each x coordinate, there are two curve points: one with positive y, and one with negative y. This is why compressed point representations reserve a single bit for the sign of y.

X25519 only outputs the x coordinate (it makes the Montgomery ladder faster, since recovering y requires extra steps; I really recommend Costello if you want to learn more), so scalar multiplication by s and by -s produces the same output, doubling the valid solutions to the associativity problem.
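
Here's a quick sketch of that fact using filippo.io/edwards25519, which exposes the group one level below the X25519 API. The scalar is random and unclamped, so this demonstrates the group-level property rather than the exact clamped computation:

    package main

    import (
        "bytes"
        "crypto/rand"
        "fmt"

        "filippo.io/edwards25519"
    )

    func main() {
        // A random scalar s, reduced modulo the group order. No clamping:
        // we are working one level below the X25519 API here.
        b := make([]byte, 64)
        if _, err := rand.Read(b); err != nil {
            panic(err)
        }
        s, _ := edwards25519.NewScalar().SetUniformBytes(b)
        negS := edwards25519.NewScalar().Negate(s)

        // Compute s*B and (-s)*B on the Edwards curve, then map both to
        // the Montgomery u coordinate, which is all X25519 ever outputs.
        u1 := edwards25519.NewIdentityPoint().ScalarBaseMult(s).BytesMontgomery()
        u2 := edwards25519.NewIdentityPoint().ScalarBaseMult(negS).BytesMontgomery()

        // s*B and -s*B differ only in the sign we threw away, so an
        // x-only function can't tell them apart.
        fmt.Println(bytes.Equal(u1, u2)) // true
    }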

In other words, to find s3 such that

X25519(s3, P) == X25519(s2, X25519(s1, P))

we can solve either

clamp(s3) == clamp(s1) * clamp(s2)         mod q

or

clamp(s3) == clamp(s1) * clamp(s2) * -1    mod q

We can still show that X25519 is not associative by applying the pigeonhole principle: there are 2^251 representable scalar values (256 bits of input, minus 5 that are fixed by clamping) but just above 2^251 x coordinate outputs in the prime-order subgroup (the order of the subgroup is 2^252 + 27742317777372353535851937790883648493, and points that share an x coordinate come in pairs, so there are 2^251 + 13871158888686176767925968895441824247 x coordinate values), so we can't find a representable s3 for every arbitrary s1 * s2 value.
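
If you want to double-check those numbers, here's the arithmetic with math/big, using the constants from the paragraph above:

    package main

    import (
        "fmt"
        "math/big"
    )

    func main() {
        // q = 2^252 + 27742317777372353535851937790883648493, the order
        // of the prime-order subgroup.
        q := new(big.Int).Lsh(big.NewInt(1), 252)
        c, _ := new(big.Int).SetString("27742317777372353535851937790883648493", 10)
        q.Add(q, c)

        // Points sharing an x coordinate come in pairs {P, -P}, except
        // the identity, so there are (q+1)/2 distinct x coordinates.
        xs := new(big.Int).Add(q, big.NewInt(1))
        xs.Rsh(xs, 1)

        // Clamping fixes 5 of the 256 input bits, leaving 2^251 scalars.
        scalars := new(big.Int).Lsh(big.NewInt(1), 251)

        fmt.Println(xs.Cmp(scalars) > 0)           // true: more outputs than scalars
        fmt.Println(new(big.Int).Sub(xs, scalars)) // 13871158888686176767925968895441824247
    }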

Don't let cryptographers have so much fun on Twitter, use ristretto255.

GitHub is dropping support for DSA

Turns out you already can't add new DSA SSH keys, and they are considering a full deprecation (which, to be fair, is extremely hard). Yay progress!

The BRs don't quite ban DSA yet

I said DSA was banned by the CA/Browser Forum Baseline Requirements. I was wrong. It's banned by the Mozilla Root Store Policy, to similar effect.

This difference shouldn't matter much, but the CAs are currently trying to fight a CABF ballot that would align the BRs with the browser root store policies. That fight makes no sense once you understand that the BRs don't exist to make the ecosystem a democracy, but to align the various browser policies so that CAs don't have to get a Mozilla audit, a Microsoft audit, an Apple audit, and so on.

A picture

Here's me a few meters underwater in Tenerife last year, back when travel was a thing.[5]


  1. Anecdotally, all interesting reports come in unencrypted, and at one point the key stayed expired for eight months before anyone noticed. ¯\_(ツ)_/¯ ↩︎

  2. I am going to get so. much. hate. for this newsletter from both the decentralization nuts and from the four teams in the world that do all their development through PGP mail. ↩︎

  3. Still, a Google Form going to a Gmail mailbox enrolled in the Advanced Protection Program is probably the most secure reporting channel against real-world threats. PGP will get misused and anything else phished before anyone gets into an APP'd Google Account. ↩︎

  4. If you use HackerOne, please don't fall for the temptation to make the program confidential or invite-only or any crap like that. First, you're not that cool. Second, any friction in a vulnerability reporting channel is an invitation to email fulldisclosure@seclists.org instead. I will definitely go full disclosure rather than accept an NDA; why wouldn't I? ↩︎

  5. At the end of that dive, a friend and I presented the dive guide with a rubber cake, complete with a candle, for his birthday, while doing the safety stop at 5m of depth. We made him blow on the candle, of course. Good times. I miss diving. ↩︎