Post-Snowden, technologists have rushed a variety of “liberation tech” projects to market, making boastful claims about their cryptographic capabilities to protect their customers’ privacy. These goals are noble, but the results have sometimes been embarrassing.
We’re building a new crypto product ourselves: a high-level secure-by-default framework developers can use to build end-to-end cryptographic applications without writing crypto.
Here’s what we required:
- Be open source, so it can be independently verified
- Have a spec
- Have a threat model
- Have clear, well-documented code
- Be audited by security professionals with a crypto background
In this post I’ll share how we’re going about #5. We’re committed to development in the open, including security review.
The first audit we could schedule was with three researchers from the Least Authority team. We chose them in part because they have deep experience building verifiable storage systems. For anyone in that market, Tahoe-LAFS is a must-read.
Auditing is both expensive and hard to schedule, with leading organizations booked months in advance. The best teams are not limited by their ability to sell their services but rather by their ability to hire and fulfill that work. Consequently there’s very little downward pressure on their rates.
To get the most from a security audit, it’s best to go in with the cleanest code possible. It’s like brushing your teeth before you visit the dentist: it’s impolite and ineffective to ask someone to puzzle over the subtleties of code you haven’t clarified.
We focused this first audit narrowly on a bare-bones, single-user (no collaboration or multi-user sharing) demo application built with the Crypton framework. Our goal was good coverage of the framework’s core: account creation, authentication, and single-user data storage.
Unfortunately, by the time the audit began, there were three issues the Crypton team knew about but hadn’t had a chance to fix or even document. The auditors independently discovered two of those three issues, and a lead to the third (less severe) issue is tagged [UNRESOLVED] in their report. Additionally, they found three other serious issues unknown to the team. Overall, some of the best money we’ve ever spent!
Since the purpose of this post is to set clear expectations, I think it’s important to share real numbers, so I cleared doing so with Least Authority.
Zooko explained, “We gave SpiderOak a small discount on our normal price, and moreover we pushed back our other projects in order to get the work done for you first. We did these two things because we wanted to form a relationship with SpiderOak since you provide end-to-end-encrypted storage, and we wanted to support Crypton because it is end-to-end-encrypted and is fully Free and Open-Source Software.”
Our bill was $30,000, or about $5k/researcher per week.
We have a second audit with the nice folks at Leviathan Security, covering the multi-user features of Crypton, and we’ll share that report when it’s complete. In the meantime, here’s the report (rst, pdf) from the first audit by Least Authority.
The resolution for Issue A involves a switch to SRP-based authentication. This was already on the longer-term roadmap because it provides several additional benefits, but it proved to be a nontrivial undertaking, and that effort is still ongoing. The next audit, by Leviathan Security, gives some attention to this implementation.
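To illustrate what SRP buys you, here is a toy sketch of the SRP-6a math. Variable names follow the convention in RFC 5054; the tiny prime and the string-based hashing below are illustrative assumptions to keep the sketch short, not Crypton’s actual implementation, and real deployments use the large standardized groups and careful byte-level encoding.

```python
# Toy sketch of SRP-6a (illustration only -- tiny prime, simplified hashing).
import hashlib
import secrets

N = 1283   # toy safe prime ((N - 1) / 2 = 641 is prime); NOT secure
g = 2

def H(*parts):
    """Hash a sequence of values to an integer (simplified for the demo)."""
    h = hashlib.sha256()
    for p in parts:
        h.update(str(p).encode())
    return int.from_bytes(h.digest(), "big")

# Registration: the client derives a verifier from the password.
# The server stores only (salt, v) -- never the password itself.
salt = secrets.randbits(32)
password = "correct horse battery staple"
x = H(salt, password)
v = pow(g, x, N)

# Login: each side contributes an ephemeral value.
a = secrets.randbits(64)
A = pow(g, a, N)                          # client -> server
k = H(N, g)
b = secrets.randbits(64)
B = (k * v + pow(g, b, N)) % N            # server -> client
u = H(A, B)                               # scrambling parameter

# Both sides derive the same shared secret; the password never
# crosses the wire, and the server never learns it.
client_S = pow((B - k * v) % N, a + u * x, N)
server_S = pow(A * pow(v, u, N) % N, b, N)
assert client_S == server_S
session_key = hashlib.sha256(str(client_S).encode()).digest()
```

The appeal for a product like Crypton is that a database breach yields only verifiers, and authentication doubles as key agreement; the cost is exactly the kind of fiddly multi-round protocol work that made this a nontrivial undertaking.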
Update: Zooko at Least Authority just published an article discussing their motivation for accepting the project.
Update 2: The originally published version of this post erroneously linked to a non-final draft of the report from Least Authority. That link has been corrected; the final audit report should say “Version 1, 2013-12-20” at the top.
Zooko shared a story about an experiment conducted by Ping Yee in 2007. Its results illustrate the challenges of auditing.
In short, several very skilled security auditors examined a small Python program — about 100 lines of code — into which the authors had inserted three bugs: an “easy,” a “medium,” and a “hard” backdoor. There were three or four teams of auditors.
1. One auditor found the “easy” and the “medium” ones in about 70 minutes, and then spent the rest of the day failing to find any other bugs.
2. One team of two auditors found the “easy” bug in about five hours, and spent the rest of the day failing to find any other bugs.
3. One auditor found the “easy” bug in about four hours, and then stopped.
4. One auditor either found no bugs or else was on a team with the third auditor — the report is unclear.
See Chapter 7 of Yee’s report for these details.
I should emphasize that I personally consider these people to be extremely skilled. One possible conclusion to draw from this experience is that a skilled backdoor-writer can defeat skilled auditors. This hypothesis holds that auditors can reliably detect only accidental bugs, not deliberately hidden ones.
Anyway, as far as I understand, the bugs you folks left in were accidental bugs that you then deliberately left unfixed, rather than bugs that you intentionally made hard to spot.