Conversations about life & privacy in the digital age

Privacy VS. Security in a PRISM: The Important Difference

The events of the last several days certainly raise awareness around the integrity of our data and the companies we entrust with it. Many of the articles and posts have pored over the impacts: the good, the bad, the necessity, the importance, the invasive, the threat, the martyr and so on. Given this wealth of commentary, I would like to spend some time writing about a finally emerging concept – privacy. And further – how privacy is substantially differentiated from security.

To begin, let’s review the definitions of these two words (according to Google):

Security – The state of being free from danger or threat

Privacy – The state or condition of being free from being observed or disturbed by other people

Of all the conversations and dialogue about PRISM, none have concentrated on the security measures in place at companies like Google, Facebook, Amazon, Apple, Verizon, and others. Why, you might ask? Because this was not a breach of security. No one hacked into their systems. No one confiscated passwords. Rather – according to reports – these companies willingly complied. [Note: It would be appropriate to draw attention to the NSA's own security breach in light of Edward Snowden's ability to access and confiscate these documents.]

If the world were oriented around privacy, a 3rd party provider of web-based services (such as Google or Facebook or Dropbox or SpiderOak) would have no ability to access the plaintext data. In other words, privacy removes the ability to access the data in any meaningful way, such that it cannot be supplied to government agencies or stolen by hackers.

We are not suggesting, nor have we ever suggested, that there isn’t a need for security; in fact, security is absolutely critical. And for many implementations of various services, privacy is not applicable. However – in the world of conversation and creation of personally owned content, from photos to chat to calls to spreadsheets to documents – privacy is absolutely a critical component that can be achieved.

My hope is that we – as a society – will now start asking the question: Why? Why do companies have access to my photos and documents and chat conversations? Is it a necessary part of the service they are offering? A convenience for me? If yes, what are these companies doing to keep my data private? And are there alternatives if I do want real privacy? From the NSA? From the company? From anyone?

This dialogue is critical and I am very glad to see the word ‘privacy’ start to weave its way into conversations. Further, that the public is being educated on the important difference between privacy and security and – hopefully – we all can start making choices accordingly.

For more information on this topic, please visit and/or watch the explainers below on Privacy VS. Security and the important role of the Privacy Policy.

AMA: Interview with Cryptographer, Computer Security Expert Jon Callas

Jon worked on Apple’s Whole Disk Encryption, PGP (Pretty Good Privacy) Universal Server, co-founded the PGP Corporation, is former CTO of Entrust, and current co-founder and CTO at Silent Circle (Global Encrypted Communications). As an inventor and cryptographer, his designs of security products have won major innovation awards from The Wall Street Journal and others.

Last week, you submitted your questions for Jon Callas, one of the world’s most respected and brilliant minds when it comes to software security and privacy. We chose five of them, which we sent to Jon. These are his answers.

1. How did you become a security expert / cryptographer?

A long time ago, I worked at the best computer science grad school there was — VMS development at Digital Equipment Corporation. One of the great things there was that I got to work on a wide variety of things, from graphics to schedulers to memory management to operating system security. A lot of the problems we had to deal with at the time are still relevant issues. I did a random password generator among other things, and I still use that for my own passwords.

When DEC fell apart, like many people, I started a startup with a number of friends. We built a system that let you do meetings as well as play games, socialize, and collaborate. It got rave reviews. The venture capital people said to us, “This is amazing! I want to invest in this in ten years!” That was when I started getting into cryptography. People didn’t want to do collaboration on the then very-early Internet without encryption. There was no SSL at the time, either.

So I went to the second ever RSA conference to learn to do enough cryptography to protect our network. I ended up sitting next to a guy named Bruce who had just written a book called “Applied Cryptography” and he had a bunch of them in a duffel bag with him, so I bought one. I may have bought the very first copy; I know I was the first person at RSA who bought one. I asked him to autograph it, and he said, “I can’t deface a book!” I replied that it’s not defacement if you’re the author.

After we got tired of throwing our money into our startup, I went to work for Apple in the Advanced Technologies Group and worked for Gursharan Sidhu, who was the inventor of AppleTalk, and shipped the very first crypto built into an OS, called PowerTalk. It failed for being far too early, as well. One of its pieces, though, was this password manager called The Keychain, and I claimed that it was a great thing. While it was hardly perfect, it encouraged good password use, and that was better than anything else. So Bruce Gaya and I hacked The Keychain so that you could run it without the rest of PowerTalk, and thus rescued it from oblivion. The present Keychain on Apple products is completely and utterly rewritten, but I’m proud of saving it. I also built a random number manager for Macs that’s now lost to the mists of time.

That was the worst time to be working for Apple, the year before Steve Jobs came back. I named all my computers for things in The Hitchhiker’s Guide to the Galaxy, because as I said, having been through DEC’s collapse I felt like the bowl of petunias (“Oh, no, not again”). When SJ came back, we heard a lot about what his plans were, as he and Sidhu were old friends. We knew that he was planning to get rid of all of ATG, so we wondered what to do. Sidhu wanted to start a startup, but none of us had any ideas we really liked. I could have easily gone into the OS group. A friend of a friend said that Phil Zimmermann’s PGP Inc was looking for server architects, and I interviewed there and got an offer. I thought it was a great way to do fun things and change the world for the better, so I went there. That was a great place to really become an expert.

2.  Are there any localities where it is illegal to encrypt calls, text messages, or emails?

Maybe. That’s not a good answer, is it?

In civilized countries, the answer is no. I might even go so far as to say that the places where it’s not only legal but expected are pretty tightly correlated with civilized countries. Repressive governments often try to restrict crypto. I’m sure Syria’s got its opinions, but I’m not an expert on Syrian law.

There are places where there are restrictions, but they are also so filled with exceptions that it’s hard to give a definitive answer. For example, China has import restrictions on cryptography. But there are exemptions for non-Chinese doing business there or Chinese people who are doing business with other countries. I am also nothing like an expert on Chinese law.

My rule is that I worry about the laws of countries that I want to operate in. I need to know about them, there. Other places I just ignore.

Most often, even in repressive countries, they aren’t worried about the crypto as such, they’re worried about what the people are using the crypto for.

3. What are you working on right now that has you the most excited?

On a large scale, it’s Silent Circle. The biggest problem we’ve always had with crypto is that it’s hard to use. Usability is key because if it’s hard to use, then people use insecure systems. They don’t stop talking, they stop being secure. So your security has to fade into the background. It has to be ignorable. But it also has to be there, as well. That’s a paradox.

We also have learned one of the best ways to make security workable is to have it run by an expert staff. So the question is how to have an expert staff running the security and privacy for people who need it and yet the staff can’t undetectably compromise the people using the system. We have a lot of innovative things we’re doing to make the security fade into the background and yet be there.

On a small scale, I’m taking my old password generator from VMS and making it into an iPhone app. I was doing a lot of work on it before Silent Circle as a hobby, and I really ought to finish.

4. As an expert on encryption do you see a natural relationship between encryption and the law? What’s your stance on how encrypted data should be treated when there’s no idea what it may contain? In some countries there are what I consider very severe key disclosure laws and I wonder if there will ever be a duress scheme or method of deniable encryption that could be so perfect as to make the laws moot.

I think it’s an unnatural relationship between encryption and the law. All technologies can be used for good or ill. It’s true for fire. It’s true for just about anything. Encryption, interestingly, is rarely *directly* used for ill. Yes, there are data ransom schemes that use encryption for ill, but that’s not what people are concerned about.

It’s part of our belief in human rights that we believe in the right to be left alone. Yet many people lose their nerve when it comes to privacy technologies on computers and networks. I think that’s an artifact of the fact that we’re comfortable with door locks or window curtains, but every time someone thinks about encryption, the James Bond theme starts playing in their head. That’s an artifact of the relationship between encryption and disagreements between nation-states. With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.

“With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.”

My stance on encrypted data per se is that it’s data. Everyone has reasons that they want something to be private. Everyone has things that *must* be private, like their own records or someone else’s records, which usually *must* be protected. This might have been an interesting debate way back in the 1900s, but it isn’t any more.

I don’t know what to say about key or data disclosure laws. In the US, there’s movement in the courts towards protecting encrypted data in some way or other. It’s all revolved around passwords in specific, but the real issue is a Fifth Amendment issue. Relatively few countries have equivalents of the Fifth Amendment.

The UK, for example, doesn’t have protections against self-incrimination. As a matter of fact, we have one in the US *because* they don’t have one there. They have a disclosure law, RIPA. I think its application has been pretty embarrassing; I can’t think of a case where it did much more than make the defendant more sympathetic.

I am not a fan of deniable encryption and personally, I think it’s impossible. Deniable encryption seems to me to be predicated on the idea that your attacker is either a nice person or stupid. Stupid in the sense that you are managing to hide the fact that you’re using deniable encryption. That predicates that either you’re using something completely custom, or they don’t realize that the deniable encryption is there. That’s what I mean by stupid — you’re pulling a fast one on them and they don’t know it. By being nice, they know you have deniable encryption and yet they’ll say, “Well, I guess if we can’t *prove* you have something encrypted there, I guess you don’t!”

A couple of years ago, I was chatting with the customs agency of a civilized country. I asked them about TrueCrypt and its deniable disk volume. They said, “Oh, we know *all* about TrueCrypt!” One of the guys I talked to added, “If we see you’re using TrueCrypt, we just ask you for the second password.” I asked what happens if someone doesn’t have a second volume and they replied, “Why would someone do *that*? I mean, that’s the whole point of TrueCrypt, to have a second volume. What kind of idiot would install TrueCrypt and not have a second volume?” We chatted some more and one of them said, “Look, we don’t look in someone’s laptop for no reason. We have better things to do. If we’re asking for your computer, it’s not because you had a piece of fruit in your bag. If we find special encryption, we know we’re on to something.” I asked again about someone who *doesn’t* have a hidden volume, and they said that you’d have to sit in a room for a while, until you convince them you don’t.

This is the real issue, I think. If you’re in a nice jurisdiction — one where you can say, “Look, I’m an honest person and I have encryption, and no I’m not going to tell you my password” then deniable encryption might work. But if you’re in a jurisdiction where they aren’t nice, then you’re actually more at risk using something that makes you look like you’re up to something.

Ironically, this is an effect of the fact that we’ve succeeded in making encryption normal.

5. What is your favorite movie?

There are relatively few movies that I’m willing to watch more than once. I’m apathetic about special effects, but a sucker for great dialog.

One of the very few movies I can watch over and over is The Princess Bride. One of my favorite lines to live by is, “Nonsense. You’re only saying that because no one ever has.”

Thanks Jon! If you are interested in learning cryptography, we recommend reading his PDF, An Introduction to Cryptography. Otherwise, be sure to follow or like Silent Circle to stay in stride with their efforts and support their work in encrypted communications.

Security, Privacy & Encryption 101 Roundup

As you know, privacy and security are not things we take lightly. In our efforts to help educate our fellow humans on their importance and the role they play in our lives on and offline, we’ve compiled the below list of recent news, resources, and tips.

[For the past few weeks we've focused on encryption. If you missed them: Just Because It's Encrypted Doesn't Mean It's Private and Encryption 101.]

If you would like to share links or resources we’ve missed, we encourage you to do so below.

  • Don’t send sensitive information over the Internet before checking a website’s security
  • Pay attention to the URL of a website. Malicious websites may look identical to a legitimate site, but the URL may use a variation in spelling or a different domain (e.g., .com vs. .net)
  • Install and maintain anti-virus software, firewalls, and email filters to reduce suspicious traffic
  • Don’t use passwords that are based on personal information that can be easily accessed or guessed
  • Use both lowercase and capital letters in your passwords
  • Use different passwords on different systems
  • Do business with credible companies
  • Do not use your primary email address in online submissions
  • Devote one credit card to online purchases
  • Encrypting data is a good way to protect sensitive information. It ensures that the data can only be read by the person who is authorized to have access to it
  • Use two-factor authentication if available (coming soon to SpiderOak)
  • Back up all of your data on a regular basis
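Several of the password tips above (avoid personal information, mix lowercase and capital letters, use different passwords on different systems) can be satisfied at once by generating passwords randomly. A minimal sketch using Python’s standard `secrets` module, which draws from the operating system’s cryptographic random source; the length and character set here are illustrative choices, not a recommendation from any particular standard:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing lowercase, uppercase, and digits."""
    alphabet = string.ascii_lowercase + string.ascii_uppercase + string.digits
    # secrets.choice() uses the OS's CSPRNG, unlike random.choice()
    return ''.join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())      # a fresh password each run
print(generate_password(24))    # longer is stronger
```

Because each password is independently random, reusing this for every account naturally gives you a different password on every system.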

Just Because It’s Encrypted Doesn’t Mean It’s Private

Now that you’ve got a handle on what encryption is and what it can do, it’s important to understand what it can’t do.

Encryption is a tool, and like any tool, it can be used improperly or ineffectively. It may sound a bit strange for us at SpiderOak to disclaim the benefits of encryption, but I hope to show that while encryption is necessary for privacy, it’s not always sufficient.

One prime example of the utility of encryption is HTTPS. By wrapping encryption around regular HTTP, engineers have created a powerful tool for securing content both delivered to you and provided by you. But HTTPS only protects content while it’s in transit. HTTPS will protect your credit card numbers as they travel over the internet to a merchant, but once they arrive on the other end, they’re no longer encrypted and it’s up to the merchant and credit card providers to protect them. Credit card providers and banks have developed PCI DSS regulations to tightly specify the security of credit card processing, but as the frequency of credit card breaches demonstrates, these regulations aren’t sufficient to guarantee privacy.

Another great cryptographic tool is Full Disk Encryption. Whether built-in to your computer hardware or provided by software like TrueCrypt, FDE protects the contents of your hard drive by encrypting every last bit. Anyone who steals your hard drive will find it completely unreadable. But while you’re using the drive, it is readable. While you have your computer on and the drive unlocked, any malicious piece of software running on your computer will find all of your data fully readable. FDE is a valuable tool, but it can only guarantee privacy while the disk isn’t in use.

Privacy is a complex problem that requires attention to many details, one of which is encryption. We’ve tried our best to provide you with the best privacy possible for your important data. If you’re interested in more details about how we protect your privacy, please read our Engineering page. And feel free to ask us about it; we’re always willing to brag!

Drink Your Ovaltine: Encryption 101

When it comes to cryptography, no one is ever finished learning. It is a constantly evolving field, and even if you started studying today, you might notice something in the code, or find a better approach, that lifelong cryptographers have missed.

The first thing that comes to mind when I think of encryption is the scene in A Christmas Story when Ralphie gets a decoder ring and decrypts a disappointing (advertising) message.

But at its basic level, this describes encryption. You probably even had similar games you made up as a kid. In the computer world, this means converting plaintext data (ordinary info) into ciphertext, or unintelligible text.
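Ralphie’s decoder ring maps directly onto the oldest cipher there is, a simple letter shift (a Caesar cipher). A toy Python sketch, purely for illustration – a shift cipher is trivially breakable and nothing like modern encryption:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` places, decoder-ring style."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            # wrap around the alphabet with modulo 26
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return ''.join(out)

secret = caesar('Be sure to drink your Ovaltine', 3)  # plaintext -> ciphertext
print(secret)
print(caesar(secret, -3))  # shifting back recovers the plaintext
```

The ciphertext is unintelligible until you know the shift – the “key” – which is exactly the plaintext/ciphertext distinction described above.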


OpenPGP (PGP = Pretty Good Privacy) is thought to be the most widely used encryption standard in the world. Broadly, there are two types of encryption methods: symmetric and asymmetric.

1) Symmetric Password-Based Encryption

This is the simplest encryption system. It’s called “symmetric” because the same key is used to encrypt and decrypt the file. If Alice wants to share data privately with Bob, she must first create an encryption key. This can be done by sampling a sufficiently random source, or by deriving it from a password. Alice must securely give this key to Bob. Now Alice can encrypt her data with that key, hand the encrypted data to Bob, and Bob can use the key to decrypt it. This method is useful to encrypt sensitive information for yourself, for family, or for a few trusted friends or coworkers. AES is a popular symmetric cipher.
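The Alice-and-Bob flow above can be sketched in a few lines of standard-library Python. This is a toy: the hash-based keystream stands in for a real cipher like the AES mentioned above, and the password and salt are made-up values – in practice you would use a vetted library, never roll your own:

```python
import hashlib

def derive_key(password: bytes, salt: bytes) -> bytes:
    """Derive a 32-byte key from a password, as Alice might."""
    return hashlib.pbkdf2_hmac('sha256', password, salt, 100_000)

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: hash the key with a counter (NOT a vetted cipher)."""
    out, counter = b'', 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts --
    # the "symmetric" property: one shared key for both directions.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = derive_key(b'correct horse battery staple', b'salt1234')
ciphertext = xor_crypt(key, b'Meet at noon')       # Alice encrypts
assert xor_crypt(key, ciphertext) == b'Meet at noon'  # Bob decrypts
```

Note that the hard part is outside the code: Alice still has to get `key` to Bob over some secure channel, which is exactly the problem asymmetric encryption solves next.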

2) Asymmetric Public/Private Key-Based Encryption

Asymmetric encryption involves the use of two different keys, one which is private and not shared, and one which is public. The public key encrypts data, and the private key decrypts data. With this scheme, Alice and Bob each have their own private/public key pairs. Alice now uses Bob’s public key to encrypt the data she wants to send to him. Because only Bob has his private key, only he can decrypt the data Alice sends him. Asymmetric encryption takes more computer power than symmetric key encryption, so it is often used to set up secure communications to exchange symmetric keys. RSA is a popular asymmetric cipher.
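The public/private split can be sketched with textbook RSA using deliberately tiny primes. This is strictly illustrative: real RSA keys are thousands of bits, and real implementations add padding (such as OAEP) that this toy omits:

```python
# Toy RSA key generation for "Bob" -- tiny primes, for illustration only
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse (Python 3.8+)

def encrypt(m: int, e: int, n: int) -> int:
    """Anyone holding Bob's public key (e, n) can encrypt for him."""
    return pow(m, e, n)

def decrypt(c: int, d: int, n: int) -> int:
    """Only Bob, holding the private exponent d, can decrypt."""
    return pow(c, d, n)

message = 42                              # must be smaller than n
ciphertext = encrypt(message, e, n)       # Alice uses Bob's PUBLIC key
assert decrypt(ciphertext, d, n) == message  # Bob uses his PRIVATE key
```

The asymmetry is visible in the exponents: `e` is published, `d` never leaves Bob, and knowing one does not reveal the other without factoring `n`.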

As for SpiderOak, our old clients used a combination of 2048-bit RSA and 256-bit AES. New clients now use 3072-bit RSA combined with 256-bit AES to meet industry recommendations. We use this mixture of techniques where each is best suited: asymmetric encryption for communications channel setup and key exchange, and symmetric encryption for internal data structures and improved client performance.
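The hybrid pattern described here – asymmetric crypto to exchange a key, symmetric crypto for the bulk data – can be sketched by combining the two toys above. Again, the tiny RSA parameters and hash-based keystream are stand-ins for real 3072-bit RSA and AES, not an implementation of it:

```python
import hashlib
import secrets

# Toy asymmetric half (textbook RSA with tiny primes; illustration only)
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Toy symmetric half: hash-based XOR keystream standing in for AES."""
    ks, counter = b'', 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, ks[:len(data)]))

# Sender: pick a random session key, wrap it asymmetrically,
# then encrypt the (arbitrarily large) payload symmetrically -- cheap.
session_key = secrets.randbelow(n - 2) + 2           # kept below toy modulus
wrapped_key = pow(session_key, e, n)                  # RSA: key exchange
ciphertext = xor_crypt(session_key.to_bytes(2, 'big'), b'bulk user data')

# Receiver: unwrap the session key with the private exponent, then decrypt.
recovered = pow(wrapped_key, d, n)
plaintext = xor_crypt(recovered.to_bytes(2, 'big'), ciphertext)
assert plaintext == b'bulk user data'
```

Only the short session key pays the cost of the slow asymmetric operation; everything else runs at symmetric-cipher speed, which is why this split is standard practice.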

Not only are your files encrypted with SpiderOak, but so are the filenames and paths. Our Engineering Matters page does a good job of explaining in detail how we encrypt your data after the initial scan, and our servers have zero-knowledge of what they are storing. Next week our system administrator will talk about why we went this direction, as well as why encryption doesn’t necessarily mean privacy or safety.

Jon Callas is one of the world’s most respected and brilliant minds when it comes to software security and privacy. He worked on Apple’s Whole Disk Encryption, PGP Universal Server, co-founded the PGP Corporation, is former CTO of Entrust, and current co-founder and CTO of our friends, Silent Circle (Global Encrypted Communications). As an inventor and cryptographer, his designs of security products have won major innovation awards from The Wall Street Journal and others. If you are interested in learning cryptography, we recommend reading his PDF, An Introduction to Cryptography.

(Teaser: Our community gets the opportunity to interview Jon, so we will make a call out for your questions later this week – be thinking of what you’d want to ask him!)

What else would you say about encryption? How did you learn? Why do you think it is important?

It is a Monday…

Greetings SpiderOak Users,

This morning we excitedly sent out an email to you all with news of our latest version – SpiderOak 5.0. Within 5.0 are many new and exciting features including Hive and Windows OS integration (Mac & Linux coming soon) in addition to a completely revamped mobile effort with our 2.0 iOS app currently in the app store (2.0 Android shipping on the 16th of May).

In our efforts to alert you of these wonderful changes, we have received comments that the email was addressed to ‘First Name’ as opposed to the actual name as it appears in our database. Now – while this may appear as a mistake, it is actually us taking privacy one step further and converting all of your ‘first names’ to read ‘First Name’. As we constantly strive to push ‘Zero-Knowledge’ further and further, this is just another step in that process.

NOTE: Of course we are making light of a mistake we made in our email campaign this morning whereby we did not properly include the mail merge in the final deployment. This is my mistake and I do sincerely apologize for any problems this may have caused. Rest assured, the emails are indeed from us at SpiderOak and you can always feel free to download 5.0 from our website here: download.

Given this topic, we would also like to take a moment to mention the domain name present in our email communications to users – we have designated that some of our email correspondence come from this domain, so please do not be alarmed or worried that an email coming from it represents a spam or phishing message. That said, we are currently taking steps to consolidate all communications to limit confusion moving forward.

Please don’t hesitate to send further thoughts and/or questions anytime and we thank you in advance for both your understanding and continued patronage with SpiderOak.

Have a wonderful week ahead.

Sending very best wishes,

Ethan Oberman
SpiderOak, Inc

Privacy Roundup #5 of 2013

Time marches on and it is hard to believe the first four months of the year have now come to a close. The month of April has been a big one for SpiderOak as we have released our long-awaited 5.0 client, including our newest feature ‘Hive’, as well as Explorer integration for Windows and our newly redesigned 2.0 iOS application (with Android currently in Beta).

In world news it can be noted that the Crypto Coin craze is still going strong with Bitcoins (btc) hovering above $100 and companies such as Butterfly Labs and Avalon shipping more and more advanced equipment for mining cryptographic currencies (more on this and the privacy and security implications of crypto currencies in a future post). The world has seen the rise and possible fall of CISPA once more, and the debate on surveillance drones rages on.

For this Privacy Roundup we have as usual hand picked some interesting tidbits from the news, so stay vigilant and check out some of what we felt was important in the last few weeks:

Well, that about sums it up. This week we sign off with a quote from Sean Parker’s character in the movie “The Social Network”: “We lived on farms, then we lived in cities, and now we’re going to live on the internet!”

As always, we hope you have a productive and private month ahead!

The Fine Print of Privacy

Ever wonder what happens to your data when you accept a “terms of service” agreement? Or, do you always comb through the fine print?

This second explainer, recently promoted by Cory Doctorow on Boing Boing, breaks down the details of a privacy policy and answers the following questions:

  1. What’s the point of a privacy policy?
  2. Are companies required to have one?
  3. How do privacy policies vary between companies?
  4. And how might they change in the future?
Want to help spread the message of privacy? Share this explainer and get people talking about The Fine Print of Privacy.

Fighting for the Right to Know

In February, a bill was introduced in the State of California by Assemblymember Bonnie Lowenthal called AB 1291, or the ‘Right To Know’ Act.

The central theme of the ‘Right to Know’ Act is transparency – creating a way for the user of a service to request, and thus understand, all the information a company has about them: what was given, what was shared, as well as what may have been inferred. And despite various company spokespeople publicly speaking up in favor of the principles behind AB 1291 (e.g., increased transparency), behind closed doors many of these same companies are working to defeat the bill via industry groups and coalitions.

It is tough to argue against transparency. And it has always been our belief that a more transparent ecosystem gives rise to a better-educated consumer, which ultimately means a better business environment. After all, we would hate to think that companies were somehow taking advantage of us, and the ‘Right to Know’ Act is a way to address this potential fear.

For these reasons and more, we felt compelled to come out in favor of AB 1291. Below is the letter we sent to Assemblymember Lowenthal in show of our support:


March 21, 2013


The Honorable Bonnie Lowenthal

State Capitol, Room 3152

Sacramento, CA 95814


Re:  Support for AB 1291 (Lowenthal), The Right to Know Act

Dear Assemblymember Lowenthal:

On behalf of SpiderOak, Inc, we are writing to express our support for AB 1291, the Right to Know Act.

The Right to Know Act would modernize California’s Shine the Light Act (Civ. Code 1798.83), which was intended to provide Californians with the right to know when businesses are sharing their personal information. In the years since the passage of the Shine the Light Act, its definitions and mechanisms have been outpaced by rapid changes in technology, data collection, and business practices. The Right to Know Act would update this important measure.

We stand in support of the Right to Know Act for two reasons. First, SpiderOak strongly supports the principle that individuals deserve the right to know how their personal information has been collected and shared. Protecting user privacy needs to be an essential part of how we as a society address the growth of online activity. Therefore, we support efforts to provide individuals with consistent transparency across all of the companies who handle personal information.

Second, SpiderOak believes that transparency and greater understanding will help all businesses in the modern data ecosystem thrive – including SpiderOak. Businesses that handle personal information rely on user trust – trust that the business is handling information with the utmost care and concern. As the ‘cloud’ medium grows, information collection has increased exponentially. By increasing transparency, we believe that the Right to Know Act will promote good data stewardship across the board and thus increase overall trust in and usage of data-driven services, promoting innovation and business growth.

California residents and companies both deserve an online world where users can truly understand how their personal information is collected and shared. Transparency is a necessary step in building that world. For that reason, we are proud to join you in supporting the Right to Know Act.


Ethan Oberman

SpiderOak, Inc


We are curious to hear your thoughts on the ‘Right to Know’ Act and where you stand. Feel free to leave your comments here; we look forward to the dialogue!

Increasing Transparency Alongside Privacy – 2013 Report

As we stated in our Transparency Report in 2012, privacy continues to be at the root of all we do at SpiderOak. Every new product and feature is designed to fit tightly alongside our ‘Zero-Knowledge’ privacy commitment. And we continue to deepen our understanding of the role transparency plays in overall privacy.

In our ongoing efforts to stay on top and aware of this ever-changing landscape, our work with the Electronic Frontier Foundation (EFF) continues to keep us better informed and aware of what we can do when fighting for the rights of our users.

Given all this, we have produced a Transparency Report that covers all activity over the last calendar year – from April 2012 to April 2013. The report is as follows:

SpiderOak Transparency Report

We are proud to stand behind our commitment to keeping our users informed of any and all activities involving their data and the constant protection of their privacy. Our relationships with the EFF and other organizations will continue to improve our outreach and understanding so that you – our user – will benefit from a fully transparent and open environment. As always, we greatly value your thoughts and feedback, so please don’t hesitate to send further thoughts or questions anytime.