
NSA, NIST, and post-quantum crypto: my second lawsuit against the US government

971 points · 6 days ago · blog.cr.yp.to
jcranmer6 days ago

If anyone is curious, the courtlistener link for the lawsuit is here: https://www.courtlistener.com/docket/64872195/bernstein-v-na...

(And somebody has already kindly uploaded the documents to RECAP, so it costs you nothing to access.)

Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.

Natsu5 days ago

> Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.

I just want to second that and thank you for the link. Most reporting is horribly bad at covering legal stuff, because the stuff that makes clickable headlines is mostly nonsense.

AndyMcConachie5 days ago

And a big thank you to the wonderful people at the Free Law Project for giving us the ability to find and link to this stuff. They're a non-profit and they accept donations. (hint hint)

tptacek5 days ago

It's just a vanilla FOIA lawsuit, of the kind hundreds of people file every month when public bodies fuck up FOIA.

If NIST puts up any kind of fight (I don't know why they would), it'll be fun to watch Matt and Wayne, you know, win a FOIA case. There's a lot of nerd utility in knowing more about how FOIA works!

But you're not going to get the secrets of the Kennedy assassination by reading this thing.

chasil5 days ago

I will draw to your attention two interesting facts.

First, OpenSSH has disregarded the winning (CRYSTALS) variants and implemented a hybrid Streamlined NTRU Prime key exchange. The Bernstein blog post discusses hybrid designs.

"Use the hybrid Streamlined NTRU Prime + x25519 key exchange method by default ("sntrup761x25519-sha512@openssh.com"). The NTRU algorithm is believed to resist attacks enabled by future quantum computers and is paired with the X25519 ECDH key exchange (the previous default) as a backstop against any weaknesses in NTRU Prime that may be discovered in the future. The combination ensures that the hybrid exchange offers at least as good security as the status quo."

https://www.openssh.com/releasenotes.html
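The idea behind the hybrid exchange quoted above can be sketched in a few lines: hash the post-quantum shared secret together with the classical ECDH secret, so an attacker must break both to recover the session key. This is only an illustration of the concept; OpenSSH's real sntrup761x25519-sha512 KDF hashes a specific protocol transcript, and the function name here is made up:

```python
import hashlib

def hybrid_session_key(pq_secret: bytes, ecdh_secret: bytes) -> bytes:
    # Hash both shared secrets together: the result stays unpredictable
    # as long as EITHER input remains secret, so a break of the new PQ
    # scheme alone does not regress below the classical status quo.
    return hashlib.sha512(pq_secret + ecdh_secret).digest()

# Both peers compute the same key from the same pair of secrets:
k1 = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
k2 = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
assert k1 == k2 and len(k1) == 64
```

Either secret changing changes the output, which is the "at least as good security as the status quo" property the release notes describe.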

Second, Daniel Bernstein has filed a public complaint against the NIST process, and the FOIA stonewalling adds further concern and doubt about whether the current results are fair.

https://www.google.com/url?q=https://groups.google.com/a/lis...

What are the aims of the lawsuit? Can the NIST decision on crystals be overturned by the court, and is that the goal?

djmdjm5 days ago

We (OpenSSH) haven't "disregarded" the winning variants, we added NTRU before the standardisation process was finished and we'll almost certainly add the NIST finalists fairly soon.

chasil5 days ago

I will eagerly await the new kex and keytypes, and will be sure to sysupgrade.

I will be very curious if the default kex shifts away from NTRU-Prime.

I might also point out that crystals-kyber was coequal to NTRU-Prime at the time that you set your new default kex.

I trust that the changelog will have a detailed explanation of all the changes that you will make, and why.

I will "ssh-rotate" whatever you decide.

https://www.linuxjournal.com/content/ssh-key-rotation-posix-...

tptacek5 days ago

What are the aims of the lawsuit? NIST fucked up a FOIA response. The thing you do when a public body gives you an unsatisfactory FOIA response is that you sue them. I've been involved in similar suits. I'd be surprised if NIST doesn't just cough up the documents to make this go away.

"Can NIST's decisions on crystals be overturned by the court?" Let me help you out with that: no, you can't use a FOIA suit to "overturn" a NIST contest.

OpenSSH implemented NTRU-Prime? What's your point? That we should just do whatever the OpenSSH team decides to do? I almost agree! But then, if that's the case, none of this matters.

LinuxBender5 days ago

It's not the first time either, and it won't be the last. NIST chose Rijndael over Serpent for the AES standard even though Serpent won. I vaguely recall they gave some smarmy answer. I don't think anyone submitted a FOIA request, not that it would matter. I've been through that bloated semi-pseudo process and saw how easy it was to stall people and never answer a simple question.

Thorrez5 days ago

>What are the aims of the lawsuit? Can the NIST decision on crystals be overturned by the court, and is that the goal?

It sounds to me like the goal is to find out if there's any evidence of the NSA adding weaknesses into any of the algorithms. That information would allow people to avoid using those algorithms.

tptacek6 days ago

I may believe almost all of this is overblown and silly, as like a matter of cryptographic research, but I'll say that Matt Topic and Merrick Wayne are the real deal, legit the lawyers you want working on something like this, and if they're involved, presumably some good will come out of the whole thing.

Matt Topic is probably best known as the FOIA attorney who got the Laquan McDonald videos released in Chicago; I've been peripherally involved in some work he and Merrick Wayne did for a friend, in a pretty technical case that got fierce resistance from CPD, and those two were on point. Whatever else you'd say about Bernstein here, he knows how to pick a FOIA lawyer.

A maybe more useful way to say the same thing is: if Matt Topic and Merrick Wayne are filing this complaint, you should probably put your money on them having NIST dead-to-rights with the FOIA process stuff.

daneel_w5 days ago

> "I may believe almost all of this is overblown and silly, as like a matter of cryptographic research ..."

Am I misunderstanding you, or are you saying that you believe almost all of DJB's statements claiming that NIST/NSA is doctoring cryptography is overblown and silly? If that's the case, would you mind elaborating?

tptacek5 days ago

I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

I believe that NIST is obligated to be responsive to FOIA requests, even if the motivation behind those requests is risible.

jeffparsons5 days ago

> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

Is that even a claim here? I'm on mobile right now so it's a bit hard for me to trawl through the DJB/NIST dialogue, but I thought his main complaint is that NIST didn't appear to have a proper and clear process for choosing the algorithms they did, when arguably better algorithms were available.

So the suggestion wouldn't necessarily be that one of the respected contestants was bribed or otherwise compromised, but rather that NIST may have been tapped on the shoulder by NSA (again) with the suggestion that they pick a specific algorithm, and that NSA would make that suggestion because its own cryptographers ("true believers" on the NSA payroll) have discovered flaws in the suggested algorithms that they believe NSA can exploit but adversaries hopefully cannot.

There's no need for any novel conspiracies or corruption; merely an exact repeat of previous NSA/NIST behaviour consistent with NSA policy positions.

It's simultaneously about as banal as it gets, and deeply troubling because of that.

ckastner5 days ago

> I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.

Could you elaborate on this? I didn't get this from the article at all. There's no researcher(s) being implicated as far as I can tell.

What I read is the accusation of NIST's decision-making process possibly being influenced by the NSA, something that we know has happened before.

Say N teams of stellar researchers submit proposals, and they review their peers. For the sake of argument, let's say that no flaw is found in any proposal; every single one is considered perfect.

NIST then picks algorithm X.

It is critical to understand the decision making process behind the picking of X, crucially so when the decision-making body has a history of collusion.

Because even if all N proposals are considered perfect by all possible researchers, if the NSA did influence NIST in the process, history would suggest that X would be the least trustable of all proposals.

And that's the main argument I got from the article.

Yes, stone-walling a FOIA request may be common, but in the case of NIST, there is ample precedent for malfeasance.

jmprspret5 days ago

I believe you have a very naive and trusting view of these US governmental bodies. I don't intend that as an insult, but by now I think the verdict is in: these agencies cannot be trusted (the NSA even less so than NIST).

Semaphor5 days ago

> risible

just in case someone else never heard this word before:

> arousing or provoking laughter

api6 days ago

I don't think it's a bad thing to push back and demand transparency. At the very least the pressure helps keep NIST honest. Keep reminding them over and over and over again about dual-EC and they're less likely to try stupid stuff like that again.

xt005 days ago

Speaking of dual-EC -- two questions seem to be debated often, though it can't be neglected that some of the vocal debaters may be NSA shills:

1. does the use of standards actually help people, or make it easier for the NSA to determine which encryption method was used?

2. are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

It seems these questions always have piles of people ready to jump in saying "oh, don't roll your own encryption, ooh scary... fear, uncertainty, doubt... and whatever you do, don't encrypt something 3X, that will probably make it easier to decrypt!!" It would be great if some neutral 3rd party could basically say: ok, here is an algorithm that is ridiculously hard to break, you can crank the number of bits up to a super crazy number, and you can also run the encryption N times, so that just not knowing the number of times it was encrypted would dramatically increase the complexity of decryption. But how many minutes before somebody jumps in saying -- yeah, don't do that, make sure you encrypt with a well-known algorithm exactly once... "trust me"...

tptacek5 days ago

1. Formal, centralized crypto standards, be they NIST or IETF, are a force for evil.

2. All else equal, fewer dependencies on randomness are better. But all else is not equal, and you can easily lose security by adding determinism to designs willy-nilly in an effort to minimize randomness dependencies.

Nothing is, any time in the conceivable future, going to change to make a broken RNG not game-over. So the important thing remains ensuring that there's a sound design for your RNG.

None of our problems have anything to do with how "much" you encrypt something, or with "cranking up the number of bits". That should be good news for you; generally, you can run ChaPoly or AES-CTR and trust that a direct attack on the cipher isn't going to be an issue for you. Most of our problems are in the joinery, not the beams themselves.
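The "joinery, not the beams" point can be made concrete with a toy counter-mode construction: even with a perfect keystream generator, reusing a key+nonce pair leaks the XOR of the two plaintexts. A minimal sketch, where the hash-based keystream is a stand-in for AES-CTR/ChaCha, not a real cipher:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy CTR-style keystream derived from a hash; this stands in for
    # the "beam" (the cipher), which is not where the flaw is.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key: bytes, nonce: bytes, pt: bytes) -> bytes:
    ks = keystream(key, nonce, len(pt))
    return bytes(a ^ b for a, b in zip(pt, ks))

key, nonce = b"k" * 32, b"n" * 12
c1 = encrypt(key, nonce, b"attack at dawn!!")
c2 = encrypt(key, nonce, b"retreat at noon!")  # same key+nonce: the bug

# The keystreams cancel: c1 XOR c2 == p1 XOR p2, leaking plaintext
# structure no matter how strong the underlying cipher is.
xor = bytes(a ^ b for a, b in zip(c1, c2))
assert xor == bytes(a ^ b for a, b in zip(b"attack at dawn!!", b"retreat at noon!"))
```

The cipher is never attacked directly here; the failure is entirely in how it was used.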

Thorrez5 days ago

>2. are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

I think all block ciphers (e.g. AES) meet that definition. For AES, for a specific key, there's a 1-to-1 mapping of plaintexts to ciphertexts. It's impossible that running a plaintext through AES produces a ciphertext with less entropy, because if the ciphertext had less entropy, it would be impossible to decrypt to get back the plaintext, but AES always allows decryption.
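The bijection argument can be checked exhaustively on a toy cipher. The 8-bit "cipher" below is purely illustrative (trivially breakable, unlike AES), but it makes the point concrete: each step is invertible, so the whole map is a permutation, and composing it with itself is still a permutation:

```python
def toy_cipher(block: int, key: int) -> int:
    # XOR with the key, then an affine step. Multiplying by 5 (odd) mod
    # 256 is invertible, so the whole map permutes the values 0..255.
    return ((block ^ key) * 5 + 113) % 256

# Encrypting every possible plaintext yields every possible ciphertext:
once = {toy_cipher(b, 0x5A) for b in range(256)}
assert len(once) == 256  # bijective: no two plaintexts collide

# Applying the cipher twice (or N times) is still a permutation, so
# repeated encryption cannot shrink the output space or lose entropy.
twice = {toy_cipher(toy_cipher(b, 0x5A), 0x5A) for b in range(256)}
assert len(twice) == 256
```

If either set had fewer than 256 elements, two plaintexts would share a ciphertext and decryption would be impossible, which is exactly the argument above.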

logifail5 days ago

> some neutral 3rd party

Unfortunately, this would appear to be the bit we've not yet solved, nor are we likely to.

MauranKilom5 days ago

> are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?

Unless you can prove that all e.g. 2^256 possible 256 bit inputs map to 2^256 different 256 bit outputs (for every key, in the case of encryption), then chances are you lose strength with every application because multiple inputs map to the same output (and consequently some outputs are not reachable).

comex5 days ago

For encryption, as opposed to hashing, you can’t have multiple inputs map to the same output, because then you wouldn’t be able to decrypt the output.

adgjlsfhk15 days ago

it's very easy to prove that all encryption functions are 1 to 1. Otherwise, you couldn't decrypt the data.

tptacek6 days ago

Transparency is good, and, as Bernstein's attorneys will ably establish, not optional.

ddingus5 days ago

It's as optional as the people can be convinced to not worry about it.

encryptluks25 days ago

I have no doubt that they are great at their job, but when it comes to lawsuits the judge(s) are equally important. You could get everything right, but a judge has extreme power to interpret the law, or even ignore it in select cases.

NolF5 days ago

I wouldn't say they ignore the law, but legislation like FOIA leaves a lot of discretion to balance competing interests, and that's where a judge makes the most difference, despite all the great articulations of the most brilliant lawyers.

tptacek5 days ago

There are very few public bodies that do a solid, to-the-letter job of complying with their open records requirements. Almost all FOIA failings are due to the fact that it isn't staffed adequately; FOIA officers, clerks, and records attorneys are all overworked. When you do a bunch of FOIA stuff, you get a feel for what's going on with the other side, and you build a lot of empathy (which is helpful in getting your data over the long run).

And then other times you run into bloody-mindedness, or worse.

I don't think NIST has many excuses here. It looks like they botched this straightforwardly.

It's a straightforward case. My bet is that they'll lose it. The documents will get delivered. That'll be the end of it.

sigil5 days ago

Near the end of the post – after 50 years of axe grinding – djb does eventually get to the point wrt pqcrypto. I find the below excerpt particularly damning. Why not wrap nascent pqcrypto in classical crypto? Suspect!

--

The general view today is that of course post-quantum cryptography should be an extra layer on top of well-established pre-quantum cryptography. As the French government cybersecurity agency (Agence nationale de la sécurité des systèmes d'information, ANSSI) put it at the end of 2021:

Acknowledging the immaturity of PQC is important: ANSSI will not endorse any direct drop-in replacement of currently used algorithms in the short/medium term. However, this immaturity should not serve as an argument for postponing the first deployments. ANSSI encourages all industries to progress towards an initiation of a gradual overlap transition in order to progressively increase trust on the post-quantum algorithms and their implementations while ensuring no security regression as far as classical (pre-quantum) security is concerned. ...

Given that most post-quantum algorithms involve message sizes much larger than the current pre-quantum schemes, the extra performance cost of an hybrid scheme remains low in comparison with the cost of the underlying post-quantum scheme. ANSSI believes that this is a reasonable price to pay for guaranteeing an additional pre-quantum security at least equivalent to the one provided by current pre-quantum standardized algorithms.

But NSA has a different position: it says that it "does not expect to approve" hybrids. Publicly, NSA justifies this by

- pointing to a fringe case where a careless effort to add an extra security layer damaged security, and

- expressing "confidence in the NIST PQC process".

Does that mean the original NISTPQC process, or the current NISTPQC process in which NIST, evidently surprised by attacks, announced plans to call for new submissions?

Of course, if NSA/IDA have secretly developed an attack that works for a particular type of post-quantum cryptosystem, then it makes sense that they'd want people to start using that type of cryptosystem and turn off the existing pre-quantum cryptosystem.

tptacek5 days ago

This is the least compelling argument Bernstein makes in the whole post, because it's simply not the job of the NIST PQC program to design or recommend hybrid classical/PQC schemes. Is it fucky and weird if NSA later decides to recommend against people using hybrid key establishment? Yes. Nobody should listen to NSA about that, or anything else. But NIST ran a PQC KEM and signature contest, not a secure transport standardization. Sir, this is a Wendy's.

sigil5 days ago

It’s compelling in context. If the NSA influenced NIST standards 3x in the past — DES, DSA, Dual EC — then shouldn’t we be on high alert this 4th time around?

That NSA is already recommending against hybrid, instead of waiting for the contest results, might signal they’ve once again managed to game the standardization process itself.

At the very least — given the exhaustive history in this post — you’d like to know what interactions NSA and NIST have had this time around. Thus, djb’s FOIA. And thus the lawsuit when the FOIA went unanswered. It all seems very reasonable to me.

What’s that old saying, “fool me thrice…”?

tptacek5 days ago

Everybody is on high alert. Being on high alert doesn't make Bernstein right.

I don't even support the premise of NIST crypto standardization, let alone trust them to do it.

xiphias26 days ago

An interesting thing happening on the Bitcoin mailing list: although it would be quite easy to add Lamport signatures as an extra safety feature for high-value transactions, they would be quite expensive and easy to misuse (they can be used only once, which is a problem if money is sent to the same address twice), so the current consensus among developers is to "just wait for NSA/NIST to be ready with the algorithm". I haven't seen any discussion of the possibility that they will never be ready, on purpose, because of sabotage.
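For context, a Lamport one-time signature is short enough to sketch from scratch. This is an illustrative toy using SHA-256, not a vetted implementation; in particular, a key pair must sign exactly one message, which is the misuse risk mentioned above:

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret per message-hash bit. A second signature with
    # the same key reveals enough secrets to forge: strictly one-time.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, msg_bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"send 1 BTC to alice")
assert verify(pk, b"send 1 BTC to alice", sig)
assert not verify(pk, b"send 9 BTC to mallory", sig)
```

The roughly 8 KB signatures and 16 KB keys also illustrate why the comment calls them expensive for on-chain use.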

potatototoo995 days ago

Why not start that discussion yourself?

jack_pp5 days ago

Indeed as potato said, link this article in the ML for them to see that NIST can not be fully trusted

lizardactivist5 days ago

An expert, a prominent figure, someone the whole cryptography community listens to, and he calls out the lies, crimes, and blatant hypocrisy of his own government.

I genuinely fear that he will be suicided one of these days.

ok_dad5 days ago

I think the United States is more about charging people with crimes and ruining their lives that way rather than disappearing people. Russia might kill you with Polonium and make sure everyone knows it, but America will straight up “legally“ torture you in prison via several means and then argue successfully that those methods were legal and convince the world you weren’t tortured. Anyone who’s a target for that treatment, though, knows that’s a lie.

dmix5 days ago

The FBI will just interview you over whatever and then charge you for lying to a federal agent or dig up some other unrelated dirt. While the original investigation gets mysteriously dropped a year later.

danuker5 days ago

McAfee and Epstein pop to mind. Maybe also Aaron Swartz.

discordance5 days ago

Assange too.

oittaa5 days ago

It seems silly to me how so many people immediately dismiss anyone even suggesting that something fishy was going on with those cases, when we already know about MKUltra, the Tuskegee experiment, etc.

josh26005 days ago

I just want to say, the problem here is worldwide standards bodies for encryption need to be trustworthy. It is incredibly hard to know what encryption is actually real without a deep mathematics background and even then, a choir of peers must be able to present algorithms, and audits of those algorithms with a straight face.

Presenting broken-by-design encryption undermines public confidence in what should be one of our most sacrosanct institutions: the National Institute of Standards and Technology (NIST). Many enterprises do not possess the capability to audit these standards and will simply use whatever NIST recommends. The danger is that we could be engineering embedded systems which will be in use for decades which are not only viewable by the NSA (which you might be ok with depending on your political allegiance) but also likely viewable by any capable organization on earth (which you are probably not ok with irrespective of your political allegiance).

In short, we must have trustworthy cryptography standards. If we do not, bedlam will follow.

Please recall, the last lawsuit that DJB filed was the one that resulted in essentially "Code is speech" in our world (https://en.wikipedia.org/wiki/Bernstein_v._United_States).

tptacek5 days ago

There's an easier problem here, which is that our reliance on formal standards bodies for the selection of cryptography constructions is bad, and, not hardly just at NIST, has been over the last 20 years mostly a force for evil. One of the most important "standards" in cryptography, the Noise Protocol Framework, will probably never be a formal standard. But on the flip side, no formal standards body is going to crud it up with nonsense.

So, no, I'd say that bedlam will not follow from a lack of trustworthy cryptography standards. We've trusted standards too much as it is.

javajosh5 days ago

Believing both "Don't roll your own crypto" and "Don't trust the standards" would seem to leave the average developer in something of a quandary, no?

tptacek5 days ago

No. I don't think we should rely on formal standards, like FIPS, NIST, and the IETF. Like Bernstein himself, I do think we should rely on peer-reviewed expert cryptography. I use Chapoly, not a stream cipher I concocted myself, or some bizarro cipher cascade posted to HN. This is what I'm talking about when I mentioned the Noise Protocol Framework.

If IETF standards happen to end up with good cryptography because they too adopt things like Noise or Ed25519, that's great. I don't distrust the IETF's ability to standardize something like HTTP/3. I do deeply distrust the process they use to arrive at cryptographic architectures. It's gotten markedly better, but there's every reason to believe it'll backslide a generation from now.

(There are very excellent people who contribute to things like CFRG and I wouldn't want to be read as disparaging any of them. It's the process I have an issue with, not anything happening there currently.)

bananapub5 days ago

how could NIST possibly be "one of our most sacrosanct institutions" after the NSA already fucked them with Dual_EC_DRBG?

whoever wants to recommend standards at any point since 2015 needs to be someone else

https://en.wikipedia.org/wiki/NIST_SP_800-90A for those who have forgotten.

josh26005 days ago

Look, my point is that there are lots of companies around the world who can’t afford highly skilled mathematicians and cryptographers on staff. These institutions rely on NIST to help them determine what encryption systems may make sense. If NIST is truly adversarial, the public has a right to know and determine how to engage going forward.

tptacek5 days ago

They don't have to (and shouldn't) retain highly skilled mathematicians. Nobody is suggesting that everyone design their own ciphers, authenticated key exchanges, signature schemes, and secure transports. Peer review is good; vital; an absolute requirement. Committee-based selection processes are what's problematic.

jacooper5 days ago

Flippo valrosida and Matthey green aren't too happy.

https://twitter.com/matthew_d_green/status/15556838562625208...

jeffparsons5 days ago

I think this is a sloppy take. If you read the full back-and-forth on the FOI request between D.J. Bernstein and NIST, it becomes readily apparent that there is _something_ rotten in the state of NIST.

Now of course that doesn't necessarily mean that NIST's work is completely compromised by the NSA (even though it has been in the past), but there are other problems that are similarly serious. For example, if NIST is unable to explain how certain key decisions were made along the way to standardisation, and those decisions appear to go against what would be considered by prominent experts in the field as "good practice", then NIST has a serious process problem. This is important work. It affects everyone in the world. And certain key parts of NIST's decision making process seem to be explained with not much more than a shrug. That's a problem.

tptacek5 days ago

All you're saying here is that NIST failed to comply with FOIA. That's not unusual. No public body does a reliably good job of complying with FOIA, and many public bodies seem to have a bad habit of pre-judging the "merits" of FOIA requests, when no merit threshold exists for their open records requirements.

NIST failing to comply with FOIA makes them an intransigent public body, like all the rest of them, from your local water reclamation board to the Department of Energy.

It emphatically does not lend support to any of this litigant's concerns about the PQC process. I don't know enough (really, anything) about the PQC "contest" to judge claims about its validity, but I do know enough --- like, the small amount of background information needed --- to say that it's risible to suggest that any of the participating teams were compromised by intelligence agencies; that claim having been made in this post saps its credibility.

So, two things I think a reasonable person would want to establish here: first, that NIST's behavior with respect to the FOIA request is hardly any kind of smoking gun, and second that the narrative being presented in this post about the PQC contest seems somewhere between "hand-wavy" and "embarrassing".

jeffparsons5 days ago

> It emphatically does not lend support to any of this litigant's concerns about the PQC process.

I agree with most of what you're saying except for this. In my view, unlike some of the other organisations you mentioned, the _only value_ of NIST is in the quality and transparency of its processes. My reading of the DJB/NIST FOI dialogue is that there is reason to believe NIST has serious process problems that go far beyond simply mishandling an FOI request. From their own responses, it reads as if they aren't able to articulate why they would choose one contestant's algorithm over another's. That kind of undermines the entire point of having an open contest.

silisili5 days ago

What's with the infighting here? Nothing about the post comes across as conspiracy theory level or reputation ruining. It makes me question the motives of those implying he's crazy, to be honest.

tptacek5 days ago

Post-quantum cryptography is essentially a full-employment program for elite academic public key cryptographers, which is largely what the "winning" PQC teams consist of. So, yeah, suggesting that one of those teams was compromised by an intelligence agency is "conspiracy theory level".

Nobody is denying the legitimacy of the suit itself. NIST is obligated to follow public records law, and public records law is important. Filippo's message, which we're all commenting on here, says that directly.

73737373735 days ago

Has the general notion of "conspiracy theory" ever carried any positive value? It only seems to exist to discredit doubters of the majority consensus without engaging the substance. But I guess words like "crank" wouldn't exist if there weren't many people fitting them, so it carries some definitional value.

Because these labels show total disregard for someone's opinion (more formally: "unlike you/them, I completely agree with the apparent majority consensus", which they also imply), they probably don't belong in a serious discussion.

throwaway6543295 days ago

Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.

There is ample evidence to document malfeasance by the involved parties, and it’s reasonable to ask NIST to follow public law.

detaro5 days ago

> Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.

They are not dismissing the lawsuit.

throwaway6543295 days ago

One says he’s doing it wrong. The other says he hopes that he wins, of course!

Meanwhile they go on to attack Bernstein, mischaracterize his writing, completely dismiss his historical analysis, mock him with memes as a conspiracy theorist, and, to top it off, question his internal motivations (which they somehow know), painting him as some kind of sore loser, which is demonstrably false.

The plot twist for the last point: he is still in the running in round four, and his former PhD students did win major parts of round three.

svnpenn5 days ago

Filippo Valsorda seems to be happy to ignore the fact that NIST already let an NSA backdoor in, as recently as 2014:

https://wikipedia.org/wiki/Dual_EC_DRBG

is he really just going to ignore something from 8 years ago?

throwaway6543295 days ago

Yes, he appears to be unreasonably dismissive of the blindly obvious history and the current situation.

As an aside, this tracks with his choice of employers, at least one of which was a known and documented NSA collaborator (as well as a victim, irony of ironies) before he took the job with them.

As Upton Sinclair remarked: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

Joining Google after Snowden revealed PRISM and BULLRUN, as well as MUSCULAR, is almost too rich to believe. Meanwhile he turns around and dismisses Bernstein as a conspiracy theorist. It's a classic bad-faith ad hominem coincidence theory.

tptacek5 days ago

First, last I checked, Filippo does not in fact work at Google.

Second: the guidelines on this site forbid you to write comments like this; in fact, this pattern of comments is literally the most frequent source of moderator admonitions on HN.

Filippo hardly needs me to defend his reputation, but, as a service to HN and to you in particular, I'd want to raise your awareness of the risk of beclowning yourself by suggesting that he, of all people, is somehow compromised.

ghoward5 days ago

Thanks for letting me know. I think I'll consider both of them compromised.

jacooper5 days ago

Man, mobile typos suck.

er4hn5 days ago

> The same people tend to have trouble grasping that most of the vulnerabilities exploited and encouraged by NSA are also exploitable by the Chinese government. These people start with the assumption that Americans are the best at everything; ergo, we're also the best at espionage. If the Chinese government stole millions of personnel records from the U.S. government, records easily usable as a springboard for further attacks, this can't possibly be because the U.S. government made a policy decision to keep our computer systems "weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques".

I'm not sure if I understand this part. I was under the impression that the OPM hack was a result of poor authn and authz controls, unrelated to cryptography. Was there a cryptography component sourced somewhere?

danielheath5 days ago

If, rather than hoarding offensive tools & spying, the NSA had interpreted its mission as being to harden the security of government infrastructure (surely even more firmly within the remit of national security) and spent its considerable budget in that direction, would authn and authz controls have been used at the OPM?

woodruffw5 days ago

This is my understanding as well. I asked this very same question less than a week ago[1], and now it's the first Google result when you search "OPM Dual_EC_DRBG."

The response to my comment covers some circumstantial evidence. But I'm not personally convinced; human factors are a much more parsimonious explanation.

[1]: https://news.ycombinator.com/item?id=32286528

bsaul6 days ago

holy crap, i wondered why the post didn't mention work by dj bernstein outing flaws in curves submitted by nsa...

Well, didn't expect the post to actually be written by him.

efitz5 days ago

Why don’t we invert FOIA?

Why don’t we require that all internal communications and records be public, available within 24 hours on the web, and provide a very painful mechanism involving significant personal effort of high level employees for every single communication or document that is to be redacted in some way? The key is requiring manual, personal (non-delegatable) effort on the part of senior bureaucrats, and to allow a private cause of action for citizens and waiver of immunity for bureaucrats.

We could carve out (or maybe not) specific things like allowing automatic redaction of employee PII and PII of citizens receiving government benefits.

After many decades, it’s clear that the current approach to FOIA and sunshine laws just isn’t working.

[ed] fixed autocorrect error

chaps5 days ago

The carve-out you mention is a decent idea on paper, but in practice is a difficult process. There's really no way to do it to any significant degree without basically bringing all of government to a complete halt. Consider that government is not staffed with technical people, nor necessarily with the critically minded people needed to implement these systems.

There are ways to push for FOIA improvements that don't require this sort of drastic approach. Problem is, it takes a lot of effort on the parts of FOIA requesters, through litigation and change in the laws. Things get surprisingly nuanced when you really get down into what a "record" is, specifically for digital information. I definitely wouldn't want to have "data" open by default in this manner, because it would lead to privacy hell.

Another component of this all is to consider contractors and subcontractors. Would they fall under this? If so, to what degree? If not, how do we prevent laundering of information through contractors/subcontractors?

To a large degree, a lot of "positive" transparency movements like the one you suggest can ironically lead to reduced transparency in some of the more critical sides of transparency. A good example of that is "open data", which gives an appearance of providing complete data, but without the legal requirements to enforce it. Makes gov look good but it de-incentivizes transparency pushback and there's little way to identify whether all relevant information is truly exposed. I would imagine similar would happen here.

efitz4 days ago

A private right of action and waiver of immunity solves most of the “bad actor” problems.

The big issue is how to preserve what actually needs to be secret (in the interest of the USA, not the interests of the bureaucracy) while forcing everything else to be public.

A lot of things are secret that don’t need to be secret; that’s a side effect of mandatory data classification and normal bureaucratic incentives- you won’t get in trouble for over-classifying, and classified information is a source of bureaucratic power. So you have to introduce a really strong personal incentive to offset that or nothing will ever change.

Personally, I don’t think that information should be classified if it came from public sources. Or maybe only allow such information to be classified for a short period of time, eg one year.

The longer and/or higher the classification level, the more effort should be involved, to create disincentives to over-classification.

chaps2 days ago

I'm sorry, but very little of what you're saying makes sense in practice. I suggest submitting some FOIA requests to your local government to get some context and understanding of the difficulties.

gorgoiler5 days ago

The old Abe rhetoric was powerful but it always felt like it was only hitting home on two of the three points. Obviously government, by definition really, is of the people. The much better parts were for the people and by the people.

chmod7755 days ago

Qualifiers such as evil aren't really useful when there hasn't been a country acting honorably on that stage for a long time, if ever.

Here's a phrasing that might be more appropriate:

"Since we're backstabbers and scoundrels, we should exercise caution around each other."

vasco5 days ago

Do you think it's tough for those regimes to pay someone to do FOIA requests for them? Or to get jobs at government agencies?

efitz5 days ago

We should rethink the concept of a “secret”. If it’s really a secret, it will still be worth the effort to protect.

acover5 days ago

They are erring on the side of caution because people have derived secret information from public information, like the yield of a nuclear bomb (classified) from the blast radius (public).

Another example: they want to protect their means and methods. But those means and methods are how they know most information, and it's often easy to work backwards from "they know x" to "therefore y is compromised".

It's a hard problem similar to how to release anonymized data. See K-anonymity attacks and caveats.

https://en.wikipedia.org/wiki/K-anonymity
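To make the k-anonymity point concrete, here's a toy sketch (the dataset and quasi-identifiers are invented for illustration): k is the size of the smallest group of records sharing the same quasi-identifiers, so a single distinctive record drags the whole dataset down to k = 1.

```python
from collections import Counter

# Toy "anonymized" dataset: quasi-identifiers (zip prefix, age band).
records = [
    ("021*", "20-30"), ("021*", "20-30"), ("021*", "20-30"),
    ("946*", "40-50"), ("946*", "40-50"),
    ("100*", "60-70"),  # a group of one: re-identifiable
]

def k_anonymity(rows):
    """k = size of the smallest group sharing the same quasi-identifiers."""
    return min(Counter(rows).values())

print(k_anonymity(records))  # → 1: the lone ("100*", "60-70") record breaks anonymity
```

Dropping that lone record raises k to 2, which is the intuition behind the generalization and suppression steps real anonymization schemes apply.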

denton-scratch5 days ago

Surely "keeping things a little more hidden" depends on reliable cryptography.

nix235 days ago

Not sure the US, with its torture base at Guantanamo and torture safe houses around the world, really has the right to call someone else "evil". I don't mean that as whataboutism, but human lives are not worth more in the US than in mainland China.

bsaul5 days ago

side question :

I've only recently started to dig a bit deeper into crypto algorithms (looking into various types of curves etc.), and it gave me the uneasy feeling that the whole industry is relying on the expertise of only a handful of guys to actually ensure that the crypto schemes used today really work.

Am I wrong? Are there actually thousands and thousands of people with the expertise to actually prove that the algorithms used today are really safe?

kibibyte5 days ago

I don’t know if that’s easily quantifiable, but I had a cryptography professor (fairly well-known nowadays) several years ago tell us that she only trusted 7 people (or some other absurdly low number), one of them being djb, to be able to evaluate the security of cryptographic schemes.

Perhaps thousands of people in the world can show you proofs of security, but very few of them may be able to take into account all practical considerations like side channels and the like.

benlivengood5 days ago

There may be thousands of people in the entire world who understand cryptanalysis well enough to accurately judge the security of modern ciphers. Most aren't living or working in the U.S.

It's very difficult to do better. The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography. The best we can achieve is heuristic judgements about what the best possible attacks are, and P?=NP is an open question.

Tainnor5 days ago

> The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography.

No unconditional proofs (except for the OTP ofc), but there are quite a few conditional proofs. For example, it's possible to show that CBC is secure if the underlying block cipher is.

aumerle5 days ago

Proof! The entire field of cryptography can prove absolutely nothing other than that a single use of a one-time pad is secure. The rest is all hand-waving that boils down to: no one I know knows how to break this, and I can't do it myself, so I believe it's secure.

So the best we have in cryptography is trusting "human instincts/judgements" about various algorithms. Which then further reduces to trusting humans.
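That one provable case is at least easy to demonstrate. A minimal one-time pad sketch in Python, secure only under its strict conditions (a truly random key, as long as the message, never reused):

```python
import secrets

def otp_encrypt(msg: bytes) -> tuple[bytes, bytes]:
    """XOR the message with a fresh random key of equal length."""
    key = secrets.token_bytes(len(msg))
    ct = bytes(m ^ k for m, k in zip(msg, key))
    return ct, key

def otp_decrypt(ct: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ct, key))

msg = b"attack at dawn"
ct, key = otp_encrypt(msg)
assert otp_decrypt(ct, key) == msg
```

Given only the ciphertext, every plaintext of the same length is equally likely, which is the information-theoretic argument; reuse the key once and that guarantee evaporates.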

chasil5 days ago

This "monoculture" post raised this point several years ago.

https://lwn.net/Articles/681616/

NavinF5 days ago

Most programmers don't need to prove crypto algorithms. There are many situations where you can just use TLS 1.3 and let it choose the ciphers. If you really need to build a custom protocol or file format, you can still use libsodium's secretbox, crypto_box, and crypto_kx functions which use the right algorithms.
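As an illustration of "let the library choose": Python's standard-library ssl module gives the same kind of safe default. A sketch (the exact cipher list depends on the linked OpenSSL build):

```python
import ssl

# create_default_context() picks secure settings for you:
# certificate verification and hostname checking are on by default,
# and cipher/protocol selection is left to the library.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED  # peer certs are verified
assert ctx.check_hostname                    # hostnames are checked

# Inspect (rather than hand-pick) the ciphers the library chose.
names = [c["name"] for c in ctx.get_ciphers()]
print(len(names), "ciphers enabled by default")
```

The point is that the application never names an algorithm; it states its intent (a verified client connection) and leaves the cryptographic choices to people who maintain them.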

ziddoap5 days ago

This is completely unrelated to the question being asked by the parent. They aren't asking about the average programmer. They are asking how many people in the world can truly 'prove' (to some reasonable degree) that the cryptography in use and the algorithms that are implementing that cryptography are 'secure' (to some reasonable degree).

Put another way, they are asking how many people in the world could verify that the algorithms used by libsodium, crypto_box, etc. are secure.

NavinF4 days ago

My point was that you don't need "thousands and thousands of people with the expertise to actually prove that the algorithms used today are really safe".

If the demand existed, there would be a lot more of those people.

ziddoap4 days ago

Again, the parent poster didn't say there was a need for thousands. They were asking how many there are. One? Ten? A hundred? That's the question being asked.

l33t23285 days ago

The grandparent post is asking about the people who need to know enough to program TLS to

> let it choose

gred6 days ago

This guy is the best kind of curmudgeon. I love it.

dataflow5 days ago

Tangential question: while some FOIA requests do get stonewalled, I continue to be fascinated that they're honored in other cases. What exactly prevents the government from stonewalling practically every request that it doesn't like, until and unless it's ordered by a court to comply? Is there any sort of penalty for their noncompliance?

Tangential to the tangent: is there any reason to believe FOIA won't be on the chopping block in a future Congress? Do the majority of voters even know (let alone care enough) about it to hold their representatives accountable if they try to repeal it?

linuxandrew5 days ago

I know someone who works in gov (Australia, not US) who told me all about a FOI request that he was stonewalling. From memory, the request was open ended and would have revealed more than the requester probably intended, including some proprietary trade secrets from a third-party contractor. That said, it was probably a case that would attract some public interest.

The biggest factors preventing governments from stonewalling every FOI case are generally time and money. Fighting FOI cases is time consuming and expensive and it's simply easier to hand over the information.

dhx5 days ago

At least in Australia I gather it is somewhat common for FOI offices to work with an FOI applicant and ask them to narrow the request if it is so broad that it would cost too much or take too long to process, or is likely to just be returned as hundreds of blacked-out pages.

Previous FOI responses show more savvy FOI applicants in the past have also (when they don't get the outcome they desired):

1. Formally requested review of decisions to withhold information from release. This almost always leads to more information being released.

2. Waited and tried requesting the same or similar information again in a later year when different people are involved.

3. Sent a follow up FOIA request for correspondence relating to how a previous (or unanswered) request was or is being processed by the FOI office and other parties responding to the request. This has previously shown somewhat humorous interactions with FOI offices such as "We're not going to provide that information because {lame excuse}" vs FOI office "You have to. CC:Executives" vs "No" vs Executives "It's not your information" etc etc.

4. Sent a follow up FOIA request for documentation, policies, training material and the likes for how FOI requests are assessed as well as how and by whom decisions are made to release or withhold information.

5. Sent a follow up FOIA request for documentation, policies, staffing levels, budgets, training material and the likes for how a typical event that the original FOIA request referred to would be handled (if details of a specific event are not being provided).

Responses to (2), (3) and (4) are probably more interesting to applicants than responses to (1), (2) and original requests, particularly when it is clear the applicant currently or previously has knowledge of what they're requesting.

dataflow5 days ago

Interesting, thanks for the anecdote!

> The biggest factors preventing governments from stonewalling every FOI case are generally time and money.

Is there any backpressure in the system to make the employee(s) responsible for responding/signing off on the disclosure actually care about how expensive it is to fight a case? I would've thought they would think, "Well, the litigation cost doesn't affect me, I just approve/deny requests based on their merits."

Panzer045 days ago

Presumably most government employees are acting in good faith - why wouldn’t they fulfil a reasonable FOIA request?

This is likely the result of some actors not acting in good faith, who then have no choice but to stonewall lest their intransigence be revealed.

lupire5 days ago

All execs have to do is not staff the FOIA department, and requests get ignored. People generally prefer free time to doing paperwork, if boss allows.

rnhmjoj5 days ago

If there's a suspicion that NIST's interests aren't aligned with the public's (at least wrt cryptography; I hope they're at least honest with the physical constants), why do we still allow them to dictate the standards?

I mean, there's plenty of standards bodies and experts in the cryptography community around the world that could probably do a better job. At this point NIST should be treated as a compromised certificate authority: just ignore them and move along.

xenophonf6 days ago

Good god, this guy is a bad communicator. Bottom line up front:

> NIST has produced zero records in response to this [March 2022] FOIA request [to determine whether/how NSA may have influenced NIST's Post-Quantum Cryptography Standardization Project]. Civil-rights firm Loevy & Loevy has now filed suit on my behalf in federal court, the United States District Court for the District of Columbia, to force NIST to comply with the law.

Edit: Yes, I know who DJB is.

kube-system6 days ago

Well, he is an expert in cryptic communication

jcranmer6 days ago

That is truly burying the lede...

I spent most of the post asking myself "okay, I'm guessing this is something about post-quantum crypto, but what are you actually suing about?"

benreesman5 days ago

djb has got to be the single biggest pain in the ass for the NSA and I love it.

eointierney6 days ago

Yippee! DJB for the win for the rest of us!

londons_explore4 days ago

So... Common pattern: NSA, its representatives, or affiliates make claims that longer key lengths are unnecessary or have too much of a performance cost.

So... I make the claim again. Let's multiply all key lengths by 10, i.e. 2048-bit RSA becomes 20480-bit RSA.

Who here thinks that's a bad idea? Previously on HN such ideas have been downvoted and comments have been made against them. I wonder, who has it been doing that, and what were their motives?

tomgs5 days ago

My background is in normal, enterprise-saas-style software development projects, and the whole notion of post-quantum crypto kind of baffles me.

Funnily enough, this post coincides with the release of a newsletter issue[0] by a friend of mine - unzip.dev - about lattice-based cryptography.

A bit of a shameless plug, but it really is a great bit of intro for noobs in the area like myself.

[0] https://unzip.dev/0x00a-lattice-based-cryptography/

lawrenceyan6 days ago

Here's an interesting question. Even if post-quantum cryptography is securely implemented, doesn't the advent of neurotechnology (BCIs, etc.) make that method of security obsolete?

With read and write capability to the brain, assuming this comes to fruition at some point, encryption as we know it won't work anymore. But I don't know, maybe this isn't something we have to worry about just quite yet.

Banana6996 days ago

The thing you're missing is that BCIs and friends are, themselves, computers, and thus securable with post-quantum cryptography, or any cryptography for that matter, or any means of securing a computer. And thus, for somebody to read-write to your computers, they need to read-write to your brain(s), but to read-write to your brain(s), they need to read-write to the computers implanted in your brain(s). It's a security cycle whose overall power is determined by the least-secure element in the chain.

Any sane person will also not touch BCIs and similar technology with a 100 lightyear pole unless the designing company reveals every single fucking silicon atom in the hardware design and every single fucking bit in the software stack at every level of abstraction, and ships the device with several redundant watchdogs and deadmen timers around it that can safely kill or faraday-cage the implant on user-defined events or manually.

Alas, humans are very rarely sane, and I come to the era of bio hacking (in all senses of the word) with low expectations.

xenophonf6 days ago

Cryptographic secrets stored in human brains are already vulnerable to an attack mechanism that requires $5 worth of interface hardware that can be procured and operated with very little training. Physical security controls do a decent job of preventing malicious actors from connecting said hardware to vulnerable brains. I assume the same would be true with the invention of BCIs more sophisticated than a crescent wrench.

yjftsjthsd-h6 days ago

The encryption is fine, that's just a way to avoid it. Much like how tire-iron attacks don't break passwords so much as bypass them.

lawrenceyan6 days ago

Ok that's actually a great point. To make the comparison:

Tire-irons require physical proximity. And torture generally doesn't work, at least in the case of getting a private key.

Reading/writing to the brain, on the other hand, requires no physical proximity if wireless. And the person(s) won't even know it's happening.

These seem like totally different paradigms to me.

ziddoap6 days ago

I think we are a long way away from being able to wirelessly read a few specific bytes of data from the brain of an unknowing person. Far enough away that I'm not sure it's productive to begin thinking of how to design encryption systems around it.

aaaaaaaaata5 days ago

> And torture generally doesn't work, at least in the case of getting a private key.

This seems incorrect.

PaulDavisThe1st6 days ago

> torture generally doesn't work, at least in the case of getting a private key.

Why not?

fudgefactorfive5 days ago

You can beat me to a pulp; it doesn't make me suddenly remember a specific N-byte string any faster.

Passwords are to be remembered, private keys are to be stored. I suppose I'll tell you where it's stored, but often even that doesn't help. (E.g. It's on a USB key I didn't label and lost, or this is totally the admin pin to my smartcard, ok you got me these 3 are the real pins, uh oh it's physically wiped itself? Sad face for you)

lysergia6 days ago

Yeah I’ve even had very personal dreams where my Linux root password was spoken in the dream. I’m glad I don’t talk in my sleep. There’s also truth serums that can be weaponized in war scenarios to extract secrets from the enemy without resorting to torture.

londons_explore4 days ago

So this lawsuit is to try and force the release of some documents that might be embarrassing.

However, even if the lawsuit is won, I would think it very unlikely the documents aren't redacted on "national security" grounds before release.

So nothing will be learned either way.

dmix5 days ago

This is one hell of a well written argument.

sylware5 days ago

I have a feeling that governments around the world are getting sued more and more over serious digital matters. Here, once the heat wave is finally over, I will see my lawyer again about the interoperability of government sites with noscript/basic (X)HTML browsers.

ehzy5 days ago

Ironically, when I visit the site Chrome says my connection is not secured by TLS.

kzrdude5 days ago

I was hoping for chacha20+Poly1305

ziddoap5 days ago

You can see for yourself if you visit the HTTPS version.

>Connection Encrypted (TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, 256 bit keys, TLS 1.2)

encryptluks25 days ago

Are you logging into the site?

thorwayham5 days ago

dig @1.1.1.1 blog.cr.yp.to is failing for me, but 8.8.8.8 works. Annoying!

frogperson5 days ago

Seems odd to me a crypto blog isn't using https these days.

ris5 days ago

There are ways of writing that make one look less like a paranoid conspiracy theorist.

sgt1015 days ago

yeah, but where do all these big primes come from?

graderjs5 days ago

So the TLDR is… you do roll your own crypto? I mean, you probably need to know how to create an RNG that passes PractRand and SMHasher first, and also a hash function that does the same. But cool.

thrway33444446 days ago

Why is the link in the URL http: not https: ? Irony?

sam0x175 days ago

Well https uses the NIST standards so.... ;)

creatonez5 days ago

This is just due to the way the OP posted it, not how it was originally published. The website supports HTTPS using the ChaCha20-Poly1305 cipher suite.

cosmiccatnap5 days ago

If you spend all day making bagels do you go home and make bagels for dinner?

It's a static text blog, not a bank

pessimizer5 days ago

> It's a static text blog, not a bank

I want those delivered by https most, because http leaks the exact page I've visited, rather than just the domain.

effie5 days ago

If you care about preventing those kinds of leaks, do not use mainstream browsers (they are likely to leak even your https URLs to the browser company), and do not access those pages directly using your home connection (there may be mitms between you and the page).

creatonez5 days ago

See: "Here's Why Your Static Website Needs HTTPS" by Troy Hunt

https://www.troyhunt.com/heres-why-your-static-website-needs...

theandrewbailey5 days ago

The NSA has recorded your receipt of this message.

mort966 days ago

Weirdly, any time I've suggested that maaaybe being too trusting of a known bad actor which has repeatedly published intentionally weak cryptography is a bad idea, I've received a whole lot of push-back and downvotes here on this site.

throwaway6543296 days ago

Indeed. Have my upvote stranger.

The related "just ignore NIST" crowd is intentionally or unintentionally dismissing serious issues of governance. Anyone who deploys this argument is questionable in my mind, essentially a bad-faith actor, especially when the topic is the problems brought to the table by NIST and NSA.

It is a telling sign that those people actively ignore the areas where you have no choice and must have your data processed by a party required to deploy FIPS-certified software or hardware.

morpheuskafka5 days ago

I'm working on a project that involves a customized version of some unclassified, non-intelligence software for a defense customer at my job (not my ideal choice of market, but it wasn't weapons so okay with it). Some of the people on the project come from the deeper end of that industry, with several TS/SCI contract and IC jobs on their resumes.

We were looking over some errors in the sshd log, and it was saying it couldn't find the id_ed25519 server cert. I remarked that that line must have stayed even though the system was put in FIPS mode, which probably only allows the NIST-approved ECC curves, and related this story: how everyone else has moved over to ed25519 and the government is the only one left using its broken algorithm.

One of the IC-background guys (who is a very nice person, nothing against them) basically said, yeah, the NSA used to do all sorts of stuff that was a bad idea, mentioning the Clipper chip, etc. What blew my mind is that they seemed to have totally reasonable beliefs about government surveillance and powers, but then when it comes to someone like Snowden, they think he is a traitor who should have used internal channels instead of leaking. I just don't understand how they think the same people who run the NSA would have cared one bit, or didn't already know about it. I guess I always assumed the people who worked in the IC would just think all this stuff was OK to begin with.

I don't know what the takeaway is from that, it just seems like a huge cognitive dissonance.

sneak5 days ago

I think the term "doublethink" was invented specifically for government functionaries like the IC guy you describe.

Being consistently and perfectly dogmatic requires holding two contradictory beliefs in your head at once. It's a skill.

l33t23285 days ago

It’s not doublethink to say the programs should have been exposed and that Snowden was a traitor for exposing them in a manner that otherwise hurt our country.

He could have done things properly, instead he dumped thousands of files unrelated to illegal surveillance to the media.

2OEH8eoCRo05 days ago

While I am skeptical of US domestic surveillance, Snowden leaked this information in the worst possible way.

Try internal whistleblower channels first. Not being heard? Write to members of Congress? Contact the media?

Instead he fled to an adversary with classified material. That's not good faith behavior imo. Traitor

zingplex5 days ago

Regarding trying internal channels, Snowden says he tried this

> despite the fact that I could not legally go to the official channels that direct NSA employees have available to them, I still made tremendous efforts to report these programs to co-workers, supervisors, and anyone with the proper clearance who would listen. The reactions of those I told about the scale of the constitutional violations ranged from deeply concerned to appalled, but no one was willing to risk their jobs, families, and possibly even freedom

The fleeing to a foreign adversary part would have been completely avoidable if the US had stronger whistleblower protections. It's perfectly reasonable to see what happened to Chelsea Manning and Julian Assange and not want to suffer a similar fate.

glitchc6 days ago

Many government or government-affiliated organizations are required, by regulation or for interoperability, to comply with NIST-approved algorithms. If NIST cannot be trusted as a reputable source, it leaves those organizations in limbo. They are not equipped to roll their own crypto, and even if they were, it would be a disaster.

icodestuff5 days ago

"Other people have no choice but to trust NIST" is not a good argument for trusting NIST. Somehow I don't imagine the NSA is concerned about -- and is probably actively in favor of -- those organizations having backdoors.

wmf5 days ago

It's an argument for fixing NIST so that it is trustworthy again.

throwaway6543295 days ago

This.

One wonders if NIST can be fixed or if it should simply be abolished with all archives opened in the interest of restoring faith in the government. The damage done by NSA and NIST is much larger than either of those organizations.

zamadatix5 days ago

"Roll your own crypto" typically refers to making your own algorithm or implementation of an algorithm not choosing the algorithm.

lazide5 days ago

Would you really want every random corporation having some random person pick from the list of open-source cipher packages? Which, last I checked, still included things like 3DES, MD5, etc.

You might as well hand a drunk monkey a loaded sub machine gun.

zamadatix5 days ago

Every random corporation having some random person pick from a list of open-source cipher packages isn't the only alternative to strictly requiring NIST-approved algorithms. It may be the worst possible alternative one could conceive, though, and one that would probably take more work than something more reasonable anyway.

l33t23285 days ago

What’s wrong with 3DES?

616c5 days ago

Another upvote from someone with many friends and colleagues in NIST. I hope transparency prevails and NISTers side with that urge as well (I suspect many do).

throwaway6543295 days ago

They could and should leak more documents if they have evidence of malfeasance.

There are both legally safe avenues via the IG process and legally risky ones via the many journalists who are willing to work for major change. Sadly, legal doesn't mean safe in modern America, and some whistleblowers have suffered massive retribution even when they play by "the rules" laid out in public law.

As Ellsberg said: Courage is contagious!

throwaway6543296 days ago

The history in this blog post is excellently researched on the topic of NSA and NIST cryptographic sabotage. It presents some hard won truths that many are uncomfortable to discuss, let alone to actively resist.

The author of the blog post is also well known for designing and releasing many cryptographic systems as free software. There is a good chance that your TLS connections are secured by some of these designs.

One of his previous lawsuits was critical to practically protecting free speech during the First Crypto War: https://en.m.wikipedia.org/wiki/Bernstein_v._United_States

I hope he wins.

aliqot6 days ago

Given his track record, and the actual meat of this suit, I think he has a good chance.

- He is an expert in the domain

- He made a lawful request

- He believes he's experiencing an obstruction of his rights

I don't see anything egregious here. Being critical of your government is a protected right in the USA. Everyone gets a moment to state their case if they'd like to make an accusation.

Suing sounds offensive, but that is the official process for submitting an issue that a government can understand and address. I'm seeing some comments here that seem aghast at the audacity to accuse the government at your own peril, and it shows an ignorance of history.

maerF0x06 days ago

I'd add

* and it's been 20 years since the 9/11 attacks, which precipitated a lot of the more recent dragnets

kevin_thibedeau5 days ago

The dragnets existed before 9/11. That just gave justification for even more funding.

feet6 days ago

I'll also add

Which have not prevented anything, and are instead used for parallel construction to go after Americans

trasz5 days ago

>Being critical of your government is a protected right for USA. Everyone gets a moment to state their case if they'd like to make an accusation.

Unless a kangaroo “FISA court” says you can’t - in which case you’re screwed, and can’t even tell anyone about the “sentence” if it included a gag order. Still better than getting droned I suppose.

newsclues5 days ago

Trump Card: National Security

CaliforniaKarl5 days ago

That's a valid reason (specifically, 1.4(g) listed at https://www.archives.gov/declassification/iscap/redaction-co...). And while NIST returning such a response is possible, it goes against their commitment to transparency.

But still, that requires a response, and there hasn't been one.

Kubuxu5 days ago

"National Security" response implies cooperation with NSA and destroys NIST's credibility.

nimbius6 days ago

The author was also part of the Linux kernel Speck cipher discussions, which broke down in 2018 due to the NSA's stonewalling and hand-waving when asked for technical data and explanations.

NSA's Speck was never adopted, and was ultimately removed from the kernel.

https://en.m.wikipedia.org/wiki/Speck_(cipher)

ddingus5 days ago

Interesting read!

fossuser6 days ago

I remember reading about this in Steven Levy's Crypto and elsewhere; there was a lot of internal arguing about this stuff at the time, and people had different opinions. I remember that some of the changes the NSA suggested to IBM actually made DES stronger against a cryptanalytic attack that was not yet publicly known (though at the time people suspected the suggestions were made because the result was weaker; the attack only became publicly known later). I tried to find the specific info about this, but can't remember the details well enough. Edit: I think it was this: https://en.wikipedia.org/wiki/Differential_cryptanalysis

They also did intentionally weaken a standard separately from that, and there was all the arguing about 'munitions export' rules intentionally requiring weak keys etc. - all the 90s cryptowar stuff that mostly ended after the Clipper chip failure. They also worked with IBM on DES, but some people internally at NSA were upset after the fact that this had been shared. The history is a lot more mixed, with a lot of people arguing about what the right thing to do was and no general consensus on a lot of this stuff.

throwaway6543295 days ago

You are not accurately reflecting the history that is presented in the very blog post we are discussing.

NSA made DES weaker for everyone by reducing the key size. IBM happily went along. The history of IBM is dark. The NSA-credited tweaks to DES can be understood as ensuring that a weakened DES stayed deployed longer, which was to their advantage. They clearly explain this in the history quoted by the author:

“Narrowing the encryption problem to a single, influential algorithm might drive out competitors, and that would reduce the field that NSA had to be concerned about. Could a public encryption standard be made secure enough to protect against everything but a massive brute force attack, but weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques?”

They’re not internally conflicted. They’re strategic saboteurs.

bragr5 days ago

>IBM happily went along. The history of IBM is dark.

Then, as now, I'm confused why people expect these kinds of problems to be solved by corporations "doing the right thing" rather than demanding some kind of real legislative reform.

throwaway6543295 days ago

Agreed. It can be both, but historically companies generally do the sabotage upon request, if not preemptively. This hasn't changed much at all in favor of protecting regular users, except maybe with the expansion of HTTPS and a few other exceptions.

revscat5 days ago

Libertarian and capitalist propaganda. The answer is always a variation of “if you don’t like it, don’t buy it/let the market decide.” Even if the “market” heads towards apocalypse.

mensetmanusman5 days ago

Is there a third option beyond government and corporations?

fossuser5 days ago

> "NSA credited tweaks to DES can be understood as ensuring that a weakened DES stayed deployed longer which was to their advantage. They clearly explain this in the history quoted by the author"

I'm not sure I buy that this follows; wouldn't the weakened key size also make people not want to deploy it, given that known weakness? To me it reads more like some people wanted a weak key so the NSA could still break it, while other people wanted it to be stronger against differential cryptanalysis attacks, and that the two aren't really related. It also came across that way in Levy's book, where they were arguing about whether they should engage with IBM at all.

api6 days ago

> I remember that some of the suggested changes from NSA shared with IBM were actually stronger against a cryptanalysis attack on DES that was not yet publicly known

So we have that and other examples of NSA apparently strengthening crypto, then we have the dual-EC debacle and some of the info in the Snowden leaks showing that they've tried to weaken it.

I feel like any talk about NSA influence on NIST PQ or other current algorithm development is just speculation unless someone can turn up actual evidence one way or another. I can think of reasons the NSA would try to strengthen it and reasons they might try to weaken it, and they've done both in the past. You can drive yourself nuts constructing infinitely recursive what-if theories.

kmeisthax5 days ago

The NSA wants "NOBUS" (NObody-But-US) backdoors. It is in their interest to make a good show of fixing easily-detected vulnerabilities while keeping their own intentional ones a secret. The fantasy they are trying to sell to politicians is that people can keep secrets from other people but not from the government; that they can make uncrackable safes that still open when presented with a court warrant.

This isn't speculation either; Dual_EC_DRBG and its role as a NOBUS backdoor was part of the Snowden document dump.

throwaway6543295 days ago

NSA doesn’t want NOBUS, they’re not a person.

NSA leadership has policies to propose and promote the NOBUS dream. Even with Dual_EC_DRBG, the claims of NOBUS were incredibly arrogant. Just ask Juniper and OPM how that NOBUS business worked out. The NSA leadership wants privileged access and data at nearly any cost. The leadership additionally want you to believe that they want NOBUS for special, even exceptional cases. In reality they want bulk data, and they want it even if the NOBUS promises can fail open.

Don’t believe the hype, security is hard enough, NOBUS relies on so many assumptions that it’s a comedy. We know about Snowden because he went public, does anyone think we, the public, would learn if important keys were compromised to their backdoors? It seems extremely doubtful that even the IG would learn, even if NSA themselves could discover it in all cases.

fossuser6 days ago

I think it's just both. It's a giant organization of people arguing in favor of different things at different times over its history, and I'd guess there's disagreement internally. Some argue it's critical to have secure encryption (I agree with this camp); others want to be able to break it for offensive reasons, despite the problems that causes.

Since we only see the occasional stuff that's unclassified we don't really know the details and those who do can't share them.

throwaway6543295 days ago

There are plenty of leaked classified documents from NSA (and others) that have been verified as legitimate. Many people working in public know stuff that hasn’t been published in full.

Here is one example with documents: https://www.spiegel.de/international/world/the-nsa-uses-powe...

Here is another: https://www.spiegel.de/international/germany/inside-the-nsa-...

Please read each and every classified document published alongside those two stories. I think you may revise your comments afterwards.

matthewmcg5 days ago

Right, I came here to make the same point. The first lawsuit alluded to in the blog post title resulted in an important holding that source code can be protected free expression.

taliesinb6 days ago

Why is the submission URL using http instead of https? That just seems... bizarre.

sdwr5 days ago

Cryptography experts know when to care about security. Cryptography enthusiasts try to slap encryption on everything.

effie5 days ago

Why? HTTP is simpler, less fragile, and not dependent on the good will of third parties; the content is public, and proving the authenticity of text on the Internet is always hard, even when it's served via the https scheme. I bet Bernstein thinks there is little point in forcing people to use https to read his page.

oittaa5 days ago

That's just wrong on so many levels. Troy Hunt has an excellent explanation: https://www.troyhunt.com/heres-why-your-static-website-needs...

effie5 days ago

Troy Hunt points out that HTTP traffic is sometimes MITMed in a way that clients and servers do not like, and HTTPS sometimes prevents that. I never said otherwise. I am saying that for certain kinds of pages it's not a major concern. Like for djb's website.

Why not use HTTPS for everything? Because it also has costs, not just benefits.

oittaa3 days ago

> Because it also has costs, not just benefits.

That's not really true. Certificates have been free for a long time, and every CPU made within the last 10 years has AES acceleration. You can google white papers from companies like Cloudflare and Google, which actually show speedups with HTTP/2 or HTTP/3.

z9znz5 days ago

MITM could change what the client receives, right?

effie5 days ago

Yes. But if you worry about being a target for MITM attacks, https alone does not fix that problem. You need some reliable verification mechanism that is hard to fool. The current CA system or "trust on first use" are only partial, imperfect mechanisms.

msk205 days ago

Just FYI, on my Firefox it's saying "Connection Secure (upgraded to HTTPS)"; it's actually using ECDHE / CHACHA20 / SHA256.

Note: I have "Enable HTTPS-Only Mode in all windows" on by default.

dang5 days ago

We ban accounts that post like this, so please don't.

https://news.ycombinator.com/newsguidelines.html

pvg5 days ago

This isn't the sort of shit you can start here, take a look at

https://news.ycombinator.com/newsguidelines.html

mramadany5 days ago

If you think I went around looking to dig up dirt, I didn't. I just searched djb's name on Twitter to find more discussions about the subject, as post-quantum cryptography is an area I'm curious about.

Regarding asking for a disclosure, I thought that was widely accepted around here. If the CEO of some company criticised a competitor's product, we would generally expect them to disclose that fact upfront. I thought that was appropriate here given the dismissive tone of GP.

pvg5 days ago

If you think I went around looking to dig up dirt

It doesn't matter, you can't toss stuff like that at people here, never mind characterize it the way you did, as a form of argument. It's in the guidelines, there are lots of moderator comments about it, don't be doing it.

I thought that was appropriate here given the dismissive tone of GP.

It's not, no matter how 'dismissive' you think a comment is.

andrewflnr5 days ago

> asking for a disclosure

Bullshit, before you even finish the sentence. You didn't ask, you accused. Did you read the context of the tweets you linked?

dang5 days ago

We detached this subthread from https://news.ycombinator.com/item?id=32363982.

lapinot5 days ago

Not sure about the disclosure, having a grudge with djb is not particularly a minority thing.

tptacek5 days ago

Whatever "grudge" I have with Bernstein is, to say the least, grudging.

tptacek5 days ago

You could not have less of an idea of what you're talking about here.

crabbygrabby6 days ago

Seems like a baaad idea lol.

zitterbewegung6 days ago

He won a case against the government representing himself, so I think he would be on good footing. He is a professor where I graduated, and even the faculty told me he was interesting to deal with. Post-quantum crypto is his main focus right now, and he also published Curve25519.

matthewdgreen6 days ago

He was represented by the EFF during the first, successful case. They declined to represent him in the second case, which ended in a stalemate.

throwaway6543296 days ago

The full story is interesting and well documented: https://cr.yp.to/export.html

Personally my favorite part of the history is on the “Dishonest behavior by government lawyers” page: https://cr.yp.to/export/dishonesty.html - the disclaimer at the top is hilarious: “This is, sad to say, not a complete list.” Indeed!

Are you implying that he didn’t contribute to the first win before or during EFF involvement?

Are you further implying that a stalemate against the U.S. government is somehow bad for self representation after the EFF wasn’t involved?

In my view it's a little disingenuous to call it a stalemate, implying everything was equal save for EFF involvement, when the government changed the rules.

He challenged the new rules alone because the EFF apparently decided one win was enough.

When the judge dismissed the case, the judge said that he should come back when the government had made a "concrete threat" - his self-representation wasn't the issue. Do you have reason to believe otherwise?

To quote his press release at the time: ``If and when there is a concrete threat of enforcement against Bernstein for a specific activity, Bernstein may return for judicial resolution of that dispute,'' Patel wrote, after citing Coppolino's ``repeated assurances that Bernstein is not prohibited from engaging in his activities.'' - https://cr.yp.to/export/2003/10.15-bernstein.txt

gruturo6 days ago

Yeah, terrible idea, except this is Daniel Bernstein, who already had an equally terrible idea years ago, and won. That victory was hugely important, it pretty much enabled much of what we use today (to be developed, exported, used without restrictions, etc etc etc)

yieldcrv6 days ago

seems like they just need a judge to force the NSA to comply with a Freedom of Information Act request; it's just part of the process

I'm stonewalled on an equivalent Public Record Act request w/ a state, and am kind of annoyed that I have to use the state's court system

Doesn't feel super impartial, and a couple of law journals have written about how it's not impartial at all in this state and should be improved by the legislature

throwaway6543296 days ago

This is part of a class division where we cannot practically exercise our rights which are clearly enumerated in public law. Only people with money or connections can even attempt to get many kinds of records.

It’s wrong and government employees involved should be fired, and perhaps seriously punished. If people at NIST had faced real public scrutiny and sanction for their last round of sabotage, perhaps we wouldn’t see delay and dismissal by NIST.

Delay of responding to these requests is yet another kind of sabotage of the public NIST standardization processes. Delay in standardization is delay in deployment. Delay means mass surveillance adversaries have more ciphertext that they can attack with a quantum computer. This isn’t a coincidence, though I am sure the coincidence theorists will come out in full force.

NIST should be responsive in a timely manner and they should be trustworthy, we rely on their standards for all kinds of mandatory data processing. It’s pathetic that Americans don’t have several IG investigations in parallel covering NIST and NSA behavior. Rather we have to rely on a professor to file lawsuits for the public (and cryptographers involved in the standardization process) to have even a glimpse of what is happening. Unbelievable but good that someone is doing it. He deserves our support.

yieldcrv5 days ago

> This is part of a class division where we cannot practically exercise our rights which are clearly enumerated in public law. Only people with money or connections can even attempt to get many kinds of records.

As someone with those resources, I'm still kind of annoyed because I think this state agency is playing chess accurately too. My request was anonymous through my lawyer and nobody would know that I have these documents, while if I went through the court - even if it was anonymous with the ACLU being the filer - there would still be a public record in the court system that someone was looking for those specific documents, so that's annoying

throwaway6543295 days ago

That’s a thoughtful and hard won insight, thank you.

PaulDavisThe1st6 days ago

Even though I broadly agree with what you've written here ... the situation in question isn't really about NIST/NSA response to FOIA requests at all.

It's about whether the US government has deliberately acted to foist weak encryption on the public (US and otherwise), presumably out of desire/belief that it has the right/need to always decrypt.

Whether and how those agencies respond to FOIA requests is a bit of a side-show, or maybe we could call it a prequel.

denton-scratch5 days ago

> the situation in question isn't really about NIST/NSA response to FOIA requests at all.

I disagree. To my mind, the issue is that a national standards agency with form for certifying standards they knew were broken still isn't being transparent about its processes. NIST's reputation has been mud since the Dual_EC_DRBG debacle.

People are not at liberty to ignore NIST recommendations, and use schemes that are attested by the likes of DJB, because NIST recommendations get built into operating systems and hardware. It damages everyone (including the part of NSA that is concerned with national security) that (a) NIST has a reputation for untrustworthiness, and (b) they aren't showing the commitment to transparency that would be needed to make them trustworthy again.

throwaway6543296 days ago

We are probably pretty much in agreement. It looks like they’ve got something to hide and they’re hiding it with delay tactics, among others.

They aren’t alone in failing to uphold FOIA laws, but they’re important in a key way: once the standard is forged, hardware will be built, certified, deployed, and required for certain activities. Delay is an attack that is especially pernicious in this exact FOIA case given the NIST standardization process timeline.

As a side note, the NIST FOIA people seem incompetent for reasons other than delay.

pyuser5835 days ago

Please include links with https://

oittaa5 days ago

NSA employees downvoted this?

pyuser5835 days ago

Seriously! Tons of people ranting about crypto visiting a non-TLS website!

ForHackernews5 days ago

Maybe this is too much tinfoil hattery, but are we sure DJB isn't a government asset? He'd be the perfect deep-cover agent.

throwaway6543295 days ago

Please don’t do the JTRIG thing. Dan is a national treasure and we would be lucky to have more people like him fighting for all of us.

Between the two, material evidence shows that NIST is the deep-cover agent sabotaging our cryptography.

rethinkpad5 days ago

Though 99% of the time I would agree with you, the public has to have faith in people who claim to be fighting (with previously noted successes in Bernstein v. US) in our best interests.

elif5 days ago

Perhaps the best way to build trust in a cryptographic algorithm is to have it devised by a certifiably neutral, general-purpose mathematical neural net.

It could even generate an algorithm so complicated it would be close to impossible for a human mind to comprehend the depth of it.

creatonez5 days ago

> It could even generate an algorithm so complicated it would be close to impossible for a human mind to comprehend the depth of it.

Okay... then some nefarious actor's above-human-intelligence neural network instantly decodes the algorithm deemed too complicated for human understanding?

I don't see how opaque neural nets are suddenly going to make security-through-obscurity work.

tooltower5 days ago

"Certifiably neutral"

So, by a process that hasn't been designed yet. Especially when one considers how opaque most neural nets are to human scrutiny.

elif5 days ago

I mean, if the source, training data, and query interface are public, it would be insanely difficult to hide a backdoor

There, I "designed" your impossible criterion in just a few obvious steps you could have inferred

tooltower5 days ago

There are many, many papers that show how you can make innocuous changes to inputs to make neural nets produce the wrong result. You might be overestimating the difficulty of this process.

elif4 days ago

Could be worse. At least I don't dismiss entire classes of problems simply because they sound hard.

politelemon6 days ago

So, question then: isn't one of the differences between this selection and previous ones that some of the algorithms are open source, with their code available?

For example, Kyber, one of the finalists, is here: https://github.com/pq-crystals/kyber

And even where it's not open source, I believe everyone included reference implementations in the first-round submissions.

Does the code being available make it easy to verify whether there are some shady/shenanigans going on, even without NIST's cooperation?

lostcolony6 days ago

Not really. For the same reason that "here's your github login" doesn't equate to you suddenly being able to be effective in a new company. You might be able to look things up in the code and understand how things are being done, but you don't know -why- things are being done that way.

A lot of the instances in the post even show the NSA giving a why. It's not a particularly convincing why, but it was enough to sow doubt. The reason to make all discussions public is so that there isn't an after-the-fact "wait, why is that obviously odd choice being done?" but instead a before-the-fact "I think we should make a change". The burden of evidence is different for that. A "I think we should reduce the key length for performance" is a much harder sell when the spec already prescribes a longer key length, than an after-the-fact "the spec's key length seems too short" "Nah, it's good enough, and we need it that way for performance". The status quo always has inertia.

politelemon5 days ago

Thanks for the response, that's making sense. I've also tried following the PQC Google Groups but a lot of the language is beyond my grasp.

Also... I don't understand why I've been downvoted for asking a question. I'm trying to learn, but HN can certainly be unwelcoming to the 'curious' (which is why I thought we were here)

aaaaaaaaaaab6 days ago

What? :D

Who cares about a particular piece of source code? Cryptanalysis is about the mathematical structure of the ciphers. When we say the NSA backdoored an algorithm, we don't mean that they included hidden printf statements in "the source code". It means that mathematicians at the NSA have knowledge of weaknesses in the construction, that are not known publicly.

politelemon5 days ago

Well, that was why I asked the question. I didn't think asking a question deserved downvotes and ridicule.

gnabgib6 days ago

Worth noting that DJB (the article author) was on two teams that competed with (and lost to) Kyber[0] in Round 3, and has an open submission in Round 4 (still in progress). That's going to slightly complicate any FOIA until after the fact, or it should. Not that there's no merit in the request.

[0]: https://csrc.nist.gov/Projects/post-quantum-cryptography/pos...

greyface-5 days ago

> the Supreme Court has observed that a FOIA requester's identity generally "has no bearing on the merits of his or her FOIA request."

https://www.justice.gov/archives/oip/foia-guide-2004-edition...

throwaway6543295 days ago

It is wrong to imply he is unreasonable here. NIST has been dismissive and unprofessional towards him and others in this process. They look terrible because they’re not doing their jobs.

Several of his former students' proposals won the most recent round. He still has work in the next round. NIST should have answered in a timely manner.

On what basis do you think any of these matters can or may complicate the FOIA process?

bumper_crop6 days ago

This definitely has the sting of bitterness in it; I doubt djb would have filed this suit if NTRU Prime had won the NIST PQC contest. It's hard to evaluate this objectively when there are strong emotions involved.

pixl976 days ago

When it comes to the number of times DJB is right versus the number of times DJB is wrong, I'll fully back DJB. Simply put, the NSA/NIST cannot and should not be trusted in this case.

bumper_crop5 days ago

You misread. I'm saying his reasons for filing are in question. NIST probably was being dishonest. That's not the reason there is a lawsuit though.

throwaway6543295 days ago

They’re not in question for many people carefully tracking this process. He filed his FOIA before the round three results were announced.

The lawsuit is because they refused to answer his reasonable and important FOIA in a timely manner. This is not unlike how they also delayed the round three announcement.

cosmiccatnap5 days ago

It's funny how often the bitterness of a post is used as an excuse to dismiss the long and well documented case being made.

bumper_crop5 days ago

If NTRU Prime had been declared the winner, would this suit have been filed? It's the same contest, same people, same suspicious behavior from NIST. I don't think this suit would have come up. djb is filing this suit because of alleged bad behavior, but I have doubts that it's the real reason.

throwaway6543295 days ago

Yes, I think so. His former PhD students were among the winners in round three and he has other work that has also made it to round four. I believe he would have sued if he won every single area in every round. This is the Bernstein way.

The behavior in question by NIST isn’t just alleged - look at the FOIA ( https://www.muckrock.com/foi/united-states-of-america-10/nsa... ). They’re not responding in a reasonable or timely manner.

Does that seem like reasonable behavior by NIST to you?

To my eyes, it is completely unacceptable behavior by NIST, especially given the time-sensitive nature of the standardization process. They don't even understand the fee structure correctly; it's a comedy of incompetence at NIST.

His FOIA predates the round three announcement. His lawsuit was filed in a timely manner, and it appears that he filed it fairly quickly. Many requesters wait much longer before filing suit.

dt3ft5 days ago

Perhaps the old advice (“never roll your own crypto”) should be reevaluated? If you're creative enough, you could combine and apply existing algorithms in such a way that the result would be very difficult to decrypt. Think 500 programmatic combinations (steps) of encryption applying different algorithms. Content encrypted in this way would require knowledge of the encryption sequence in order to execute the required steps in reverse. No amount of brute force could help here…

TobTobXX5 days ago

> Would require knowledge of the encryption sequence...

This is security by obscurity. Reputable encryptions work under the assumption that you have full knowledge about the encryption/decryption process.

You could, however, argue that the sequence then becomes part of the key. But this key [i.e. the sequence of encryptions] adds less than it appears to: a haphazard cascade isn't guaranteed to be any stronger than its strongest component, which kind of defeats the purpose.
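For what it's worth, there is a principled version of "combining": hybrid schemes like OpenSSH's sntrup761x25519 (quoted at the top of the thread) don't chain ciphers at all, they hash together two independently established shared secrets, so an attacker has to break both exchanges. A minimal sketch of the idea (the function name and toy inputs are mine, not OpenSSH's actual construction):

```python
import hashlib

def hybrid_kdf(pq_shared: bytes, ecdh_shared: bytes) -> bytes:
    """Derive one session key from two independently established secrets.

    Recovering the output requires knowing BOTH inputs, so the combined
    key is at least as strong as the stronger of the two exchanges.
    """
    return hashlib.sha512(pq_shared + ecdh_shared).digest()[:32]

# Toy stand-ins for the NTRU Prime and X25519 shared secrets.
session_key = hybrid_kdf(b"\x01" * 32, b"\x02" * 32)
assert len(session_key) == 32  # one 32-byte symmetric session key
```

That's a very different animal from stacking 500 ad-hoc encryption steps: the combiner is a single, well-studied primitive with an argument for why the whole is no weaker than its parts.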

Tainnor5 days ago

No, an important property of a secure cryptographic cipher is that it should be as close to a random permutation of the input as possible.

A "randomly assembled" cipher that just chains together different primitives without much thought is very unlikely to have that, which will mean that it will probably have "interesting" statistical properties that can be observed given enough plaintext/ciphertext pairs, and those can then be exploited in order to break it.

anfilt5 days ago

No, not at all - that advice is still good. It's even more important if you are talking about modifying algorithms. You're going to want proofs of resistance or immunity to certain classes of attacks. A subtle change can easily make a strong primitive useless.