Conversations about life & privacy in the digital age

Privacy VS. Security in a PRISM: The Important Difference

The events of these last many days certainly raise awareness around the integrity of data and the companies we entrust with it. Many of the articles and posts have pored over the impacts: the good, the bad, the necessity, the importance, the invasive, the threat, the martyr and so on. Given this wealth of commentary, I would like to spend some time writing about a finally emerging concept – privacy. And further – how privacy is substantially differentiated from security.

To begin, let’s review the definitions of these two words (according to Google):

Security – The state of being free from danger or threat

Privacy – The state or condition of being free from being observed or disturbed by other people

Of all the conversations and dialogue about PRISM, none have concentrated on the security measures in place at companies like Google, Facebook, Amazon, Apple, Verizon, and others. Why, you might ask? Because this was not a breach of security. No one hacked into their systems. No one confiscated passwords. Rather – according to reports – these companies willingly complied. [Note: It would be appropriate to draw attention to the NSA’s own security breach in light of Edward Snowden’s ability to access and confiscate these documents.]

If the world were oriented around privacy, a 3rd party provider of web-based services (such as Google or Facebook or Dropbox or SpiderOak) would have no ability to access the plaintext data. In other words, privacy removes the ability to access the data in any meaningful way, such that it cannot be supplied to government agencies or stolen by hackers.

We have never suggested that there isn’t a need for security; in fact, security is absolutely critical. And for many implementations of various services, privacy is not applicable. However – in the world of conversation and the creation of personally owned content, from photos to chat to calls to spreadsheets to documents – privacy is absolutely a critical component that can be achieved.

My hope is that we – as a society – will now start asking the question: Why? Why do companies have access to my photos and documents and chat conversations? Is it a necessary part of the service they are offering? A convenience for me? If yes, what are these companies doing to keep my data private? And are there alternatives if I do want real privacy? From the NSA? From the company? From anyone?

This dialogue is critical and I am very glad to see the word ‘privacy’ start to weave its way into conversations. Further, that the public is being educated on the important difference between privacy and security and – hopefully – we all can start making choices accordingly.

For more information on this topic, please visit ZeroKnowledgePrivacy.org and/or watch the explainers below on Privacy VS. Security and the important role of the Privacy Policy.

Privacy Roundup: PRISM Special Edition

May has rolled into June and summer is fast approaching. Originally I had planned for this privacy update to be another collection of somewhat random links regarding the world of security and privacy. And then… We had Thursday. And then PRISM. And it seemed only right to gather as much information, opinion and material as possible around PRISM and make it available to our readers.

But what is PRISM?

Thus far, all anyone can tell for sure is that PRISM is the name of a data collection model and technology solution that improves the speed and simplicity with which the NSA, and possibly other US agencies, can access user data from a large number of the world’s most popular online services (including Google, Skype, Microsoft, Facebook, etc.).

It seems the program in itself does not introduce any new laws, or even break any current ones. What it does, however, is enable a more effective way for the NSA to request and receive private user data. And of course, this makes it ripe for speculation as to what this ‘new’ streamlined procurement process is being used for, and how.

One of the most informative posts as to the model, use, and participants ironically enough comes from the NSA themselves (via Washington Post) and can be found here:

NSA slides explain the PRISM data-collection program

If you desire to dig a bit deeper into PRISM, what people are saying / thinking, and what companies may or may not have been directly involved, here are a collection of what we found to be the most informative links on the subject from the last several days:

Though we will be elaborating on the PRISM program in relation to SpiderOak in a separate blog post, I can say definitively that our users’ data is encrypted client-side, uploaded, and stored in its fully encrypted state, which means we are never able to view plaintext user content under any circumstances. In short, PRISM would be wholly and entirely useless in the SpiderOak context.

Of note: we also have yet to be contacted by any agency regarding the program – surely a result of our ‘Zero-Knowledge’ privacy environment. After all, encrypted data is rather useless for conducting data mining activity.
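To make the ‘Zero-Knowledge’ idea concrete, here is a deliberately simplified Python sketch of the client-side flow. This is a toy for illustration only – it is not SpiderOak’s actual implementation, and the XOR keystream below is not a real cipher (real clients use vetted ciphers such as AES). The point it shows is the flow: the key is derived on the client, and the server only ever sees ciphertext.

```python
import hashlib
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Key derivation happens client-side; the passphrase never leaves the device.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: hash key || counter to build a keystream, then XOR.
    # Do NOT use for real data -- illustration of the principle only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Client side: encrypt before upload.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
plaintext = b"my private document"
ciphertext = keystream_xor(key, plaintext)

# Server side: stores only salt + ciphertext. Without the passphrase it cannot
# recover the plaintext, so there is nothing meaningful to hand over.
server_record = {"salt": salt, "blob": ciphertext}

# Back on the client: decryption is the same XOR with the same keystream.
recovered = keystream_xor(derive_key(b"correct horse battery staple", salt),
                          server_record["blob"])
assert recovered == plaintext
```

Because the key derivation inputs exist only on the client, a subpoena to the server yields nothing but random-looking bytes.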

In light of recent news and the topic for this special roundup I think it’s only fitting we sign off with this quote of the week:

“He who controls the past controls the future. He who controls the present controls the past.” – George Orwell, 1984

 

AMA: Interview with Cryptographer, Computer Security Expert Jon Callas

Jon worked on Apple’s Whole Disk Encryption, PGP (Pretty Good Privacy) Universal Server, co-founded the PGP Corporation, is former CTO of Entrust, and current co-founder and CTO at Silent Circle (Global Encrypted Communications). As an inventor and cryptographer, his designs of security products have won major innovation awards from The Wall Street Journal and others.

Last week, you submitted your questions for Jon Callas, one of  the world’s most respected and brilliant minds when it comes to software security and privacy. We chose five of them, which we sent to Jon. These are his answers.

1. How did you become a security expert / cryptographer?

A long time ago, I worked at the best computer science grad school there was — VMS development at Digital Equipment Corporation. One of the great things there was that I got to work on a wide variety of things, from graphics to schedulers to memory management to operating system security. A lot of the problems we had to deal with at the time are still relevant issues. I did a random password generator among other things, and I still use that for my own passwords.

When DEC fell apart, like many people, I started a startup with a number of friends. We built a system that let you do meetings as well as play games, socialize, and collaborate. It got rave reviews. The venture capital people said to us, “This is amazing! I want to invest in this in ten years!” That was when I started getting into cryptography. People didn’t want to do collaboration on the then very-early Internet without encryption. There was no SSL at the time, either.

So I went to the second ever RSA conference to learn to do enough cryptography to protect our network. I ended up sitting next to a guy named Bruce who had just written a book called “Applied Cryptography” and he had a bunch of them in a duffel bag with him, so I bought one. I may have bought the very first copy; I know I was the first person at RSA who bought one. I asked him to autograph it, and he said, “I can’t deface a book!” I replied that it’s not defacement if you’re the author.

After we got tired of throwing our money into our startup, I went to work for Apple in the Advanced Technologies Group and worked for Gurshuran Sidhu, who was the inventor of AppleTalk, and shipped the very first crypto built into an OS, called PowerTalk. It failed for being far too early, as well. One of its pieces, though, was this password manager called The Keychain, and I claimed that it was a great thing. While it was hardly perfect, it encouraged good password use, and that was better than anything else. So Bruce Gaya and I hacked The Keychain so that you could run it without the rest of PowerTalk, and thus rescued it from oblivion. The present Keychain on Apple products is completely and utterly rewritten, but I’m proud of saving it. I also built a random number manager for Macs that’s now lost to the mists of time.

That was the worst time to be working for Apple, the year before Steve Jobs came back. I named all my computers for things in The Hitchhiker’s Guide to the Galaxy, because as I said, having been through DEC’s collapse I felt a bowl of petunias (“Oh, no, not again”). When SJ came back, we heard a lot about what his plans were, as he and Sidhu were old friends. We knew that he was planning to get rid of all of ATG, so we wondered what to do. Sidhu wanted to start a startup, but none of us had any ideas we really liked. I could have easily gone into the OS group. A friend of a friend said that Phil Zimmermann’s PGP Inc was looking for server architects, and I interviewed there and got an offer. I thought it was a great way to do fun things and change the world for the better, so I went there. That was a great place to really become an expert.

2.  Are there any localities where it is illegal to encrypt calls, text messages, or emails?

Maybe. That’s not a good answer, is it?

In civilized countries, the answer is no. I might even go so far as to say that the places where it’s legal or even expected are pretty tightly correlated with civilized countries. Repressive governments often try to restrict crypto. I’m sure Syria’s got its opinions, but I’m not an expert on Syrian law.

There are places where there are restrictions, but they are also so filled with exceptions that it’s hard to give a definitive answer. For example, China has import restrictions on cryptography. But there are exemptions for non-Chinese doing business there or Chinese people who are doing business with other countries. I am also nothing like an expert on Chinese law.

My rule is that I worry about the laws of countries that I want to operate in. I need to know about them, there. Other places I just ignore.

Most often, even in repressive countries, they aren’t worried about the crypto as such, they’re worried about what the people are using the crypto for.

3. What are you working on right now that has you the most excited?

On a large scale, it’s Silent Circle. The biggest problem we’ve always had with crypto is that it’s hard to use. Usability is key because if it’s hard to use, then people use insecure systems. They don’t stop talking, they stop being secure. So your security has to fade into the background. It has to be ignorable. But it also has to be there, as well. That’s a paradox.

We also have learned one of the best ways to make security workable is to have it run by an expert staff. So the question is how to have an expert staff running the security and privacy for people who need it and yet the staff can’t undetectably compromise the people using the system. We have a lot of innovative things we’re doing to make the security fade into the background and yet be there.

On a small scale, I’m taking my old password generator from VMS and making it into an iPhone app. I was doing a lot of work on it before Silent Circle as a hobby, and I really ought to finish.

4. As an expert on encryption do you see a natural relationship between encryption and the law? What’s your stance on how encrypted data should be treated when there’s no idea what it may contain? In some countries there are what I consider very severe key disclosure laws and I wonder if there will ever be a duress scheme or method of deniable encryption that could be so perfect as to make the laws moot.

I think it’s an unnatural relationship between encryption and the law. All technologies can be used for good or ill. It’s true for fire. It’s true for just about anything. Encryption, interestingly, is rarely *directly* used for ill. Yes, there are data ransom schemes that use encryption for ill, but that’s not what people are concerned about.

It’s part of our belief in human rights that we believe in the right to be left alone. Yet many people lose their nerve when it comes to privacy technologies on computers and networks. I think that’s an artifact of the fact that we’re comfortable with door locks or window curtains, but every time someone thinks about encryption, the James Bond theme starts playing in their head. That’s an artifact of the relationship between encryption and disagreements between nation-states. With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.

“With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.”

My stance on encrypted data per se is that it’s data. Everyone has reasons that they want something to be private. Everyone has things that *must* be private, like their own records or someone else’s records, which usually *must* be protected. This might have been an interesting debate way back in the 1900s, but it isn’t any more.

I don’t know what to say about key or data disclosure laws. In the US, there’s movement in the courts towards protecting encrypted data in some way or other. It’s all revolved around passwords in specific, but the real issue is a Fifth Amendment issue. Relatively few countries have equivalents of the Fifth Amendment.

But the UK, for example, they don’t have protections against self-incrimination. As a matter of fact, we have one in the US *because* they don’t have one there. They have a disclosure law, RIPA. I think its application has been pretty embarrassing, as I can’t think of a place where it has been used that didn’t do much more than make the defendant more sympathetic.

I am not a fan of deniable encryption and personally, I think it’s impossible. Deniable encryption seems to me to be predicated on the idea that your attacker is either a nice person or stupid. Stupid in the sense that you are managing to hide the fact that you’re using deniable encryption. That predicates that either you’re using something completely custom, or they don’t realize that the deniable encryption is there. That’s what I mean by stupid — you’re pulling a fast one on them and they don’t know it. By being nice, they know you have deniable encryption and yet they’ll say, “Well, I guess if we can’t *prove* you have something encrypted there, I guess you don’t!”

A couple of years ago, I was chatting with the customs agency of a civilized country. I asked them about TrueCrypt and its deniable disk volume. They said, “Oh, we know *all* about TrueCrypt!” One of the guys I talked to added, “If we see you’re using TrueCrypt, we just ask you for the second password.” I asked what happens if someone doesn’t have a second volume and they replied, “Why would someone do *that*? I mean, that’s the whole point of TrueCrypt, to have a second volume. What kind of idiot would install TrueCrypt and not have a second volume?” We chatted some more and one of them said, “Look, we don’t look in someone’s laptop for no reason. We have better things to do. If we’re asking for your computer, it’s not because you had a piece of fruit in your bag. If we find special encryption, we know we’re on to something.” I asked again about someone who *doesn’t* have a hidden volume, and they said that you’d have to sit in a room for a while, until you convince them you don’t.

This is the real issue, I think. If you’re in a nice jurisdiction — one where you can say, “Look, I’m an honest person and I have encryption, and no I’m not going to tell you my password” then deniable encryption might work. But if you’re in a jurisdiction where they aren’t nice, then you’re actually more at risk using something that makes you look like you’re up to something.

Ironically, this is an effect of the fact that we’ve succeeded in making encryption normal.

5. What is your favorite movie?

There are relatively few movies that I’m willing to watch more than once. I’m apathetic about special effects, but a sucker for great dialog.

One of the very few movies I can watch over and over is The Princess Bride. One of my favorite lines to live by is, “Nonsense. You’re only saying that because no one ever has.”

Thanks Jon! If you are interested in learning cryptography, we recommend reading his PDF, An Introduction to Cryptography. Otherwise, be sure to follow or like Silent Circle to stay in stride with their efforts and support their work in encrypted communications.

Security, Privacy & Encryption 101 Roundup

As you know, privacy and security are not things we take lightly. In our efforts to help educate our fellow humans on their importance and the role they play in our lives on and offline, we’ve compiled the below list of recent news, resources and tips.

[For the past few weeks we've focused on encryption. If you missed them: Just Because It's Encrypted Doesn't Mean It's Private and Encryption 101.]

If you would like to share links or resources we’ve missed, we encourage you to do so below.

May Highlight

Education

News & Information

Breaches

Tools

Interesting Reads

Comics

Tips

  • Don’t send sensitive information over the Internet before checking a website’s security
  • Pay attention to the URL of a website. Malicious websites may look identical to a legitimate site, but the URL may use a variation in spelling or a different domain (e.g., .com vs. .net)
  • Install and maintain anti-virus software, firewalls, and email filters to reduce suspicious traffic
  • Don’t use passwords that are based on personal information that can be easily accessed or guessed
  • Use both lowercase and capital letters in your passwords
  • Use different passwords on different systems
  • Do business with credible companies
  • Do not use your primary email address in online submissions
  • Devote one credit card to online purchases
  • Encrypting data is a good way to protect sensitive information. It ensures that the data can only be read by the person who is authorized to have access to it
  • Use two-factor authentication if available (coming soon to SpiderOak)
  • Back up all of your data on a regular basis
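The password tips above can be put into practice with a small generator. Here is one possible sketch using Python’s standard-library `secrets` module (the character set and length are our own illustrative choices, not a prescribed policy):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing lower/upper case, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Keep only candidates that satisfy the mixed-character tips above.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)):
            return pw

print(generate_password())  # different output every run
```

Using `secrets` rather than `random` matters here: `secrets` draws from the operating system’s cryptographically secure randomness source, so the output is not predictable from previous outputs.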

Just Because It’s Encrypted Doesn’t Mean It’s Private

Now that you’ve got a handle on what encryption is and what it can do, it’s important to understand what it can’t do.

Encryption is a tool, and like any tool, it can be used improperly or ineffectively. It may sound a bit strange for us at SpiderOak to disclaim the benefits of encryption, but I hope to show that while encryption is necessary for privacy, it’s not always sufficient.

One prime example of the utility of encryption is HTTPS. By wrapping encryption around regular HTTP, engineers have created a powerful tool for securing content both delivered to you and provided by you. But HTTPS only protects content while it’s in transit. HTTPS will protect your credit card numbers as they travel over the internet to a merchant, but once they arrive on the other end, they’re no longer encrypted and it’s up to the merchant and credit card providers to protect them. Credit card providers and banks have developed PCI DSS regulations to tightly specify the security of credit card processing, but as the frequency of credit card breaches demonstrates, these regulations aren’t sufficient to guarantee privacy.
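As a small illustration of the “in transit” half, here is how a Python client might insist on verified HTTPS – a generic sketch, not any merchant’s actual code:

```python
import ssl

# A default client context: certificate verification and hostname checking are
# both enabled. This protects data while it travels; once it reaches the
# server, protection of the stored plaintext is entirely up to the operator.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED  # peer must present a valid cert
assert ctx.check_hostname                    # cert must match the hostname

# Hypothetical usage (requires network access, shown for illustration only):
# import urllib.request
# urllib.request.urlopen("https://example.com/", context=ctx)
```

The asserts make the limits of the guarantee explicit: a verified TLS channel tells you who you are talking to and that nobody in between can read the traffic – and nothing more.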

Another great cryptographic tool is Full Disk Encryption. Whether built-in to your computer hardware or provided by software like TrueCrypt, FDE protects the contents of your hard drive by encrypting every last bit. Anyone who steals your hard drive will find it completely unreadable. But while you’re using the drive, it is readable. While you have your computer on and the drive unlocked, any malicious piece of software running on your computer will find all of your data fully readable. FDE is a valuable tool, but it can only guarantee privacy while the disk isn’t in use.

Privacy is a complex problem that requires attention to many details, one of which is encryption. We’ve tried our best to provide you with the best privacy possible for your important data. If you’re interested in more details about how we protect your privacy, please read our Engineering page. And feel free to ask us about it, we’re always willing to brag!

Privacy Roundup #5 of 2013

Time marches on and it is hard to believe the first four months of the year have now come to a close. April has been a big month for SpiderOak: we released our long-awaited 5.0 client, including our newest feature ‘Hive’ and Explorer integration for Windows, as well as our newly redesigned 2.0 iOS application (with Android currently in beta).

In world news it can be noted that the crypto coin craze is still going strong, with Bitcoin (BTC) hovering above $100 and companies such as Butterfly Labs and Avalon shipping more and more advanced equipment for mining cryptographic currencies (more on this, and the privacy and security implications of crypto currencies, in a future post). The world has seen the rise and possibly fall of CISPA once more, and the debate on surveillance drones rages on.

For this Privacy Roundup we have as usual hand picked some interesting tidbits from the news, so stay vigilant and check out some of what we felt was important in the last few weeks:

Well that about sums it up. This week we sign off with a quote from Sean Parker’s character from the movie “The Social Network”: “We lived on farms, then we lived in cities, and now we’re going to live on the internet!”.

As always, we hope you have a productive and private month ahead!

Millennials Care About Privacy After All

SpiderOak is passionate about privacy. If you are reading this, chances are you are passionate about privacy. But what about the Millennials? Millennials – also known as Generation Y – are generally those born from the late 1970s or early 1980s to the early 2000s.

It’s been said that the Millennial Generation does not care about privacy. According to a recent survey by USC Annenberg, however, Millennials do care – they simply think about privacy differently, and are smarter about the decisions they make around it.

This infographic explains:

  • 70% of Millennials would rather not give others access to their personal data or information.
  • Millennials use social media to keep in contact with friends and family.
  • 25% of Millennials do not mind sharing some information in exchange for relevant advertising.
  • 48% of Millennials use social media several times per day – more often than those 35 and older.
  • 56% of Millennials don’t mind sharing location with companies in order to receive coupons or deals for nearby businesses.
  • 51% of Millennials don’t mind sharing information with companies as long as they get something in return.

However, compared to those age 35 and older, a good percentage of Millennials are willing to give up some of their privacy – but only if they benefit from sharing their personal information.

To read their full study, you can click here.

Are you a Millennial? What do you think about the survey? We’d love to hear your thoughts.

Fighting for the Right to Know

In February a bill was introduced in the State of California by Assembly Member Lowenthal called AB 1291 or the ‘Right To Know’ Act.

The central theme of the ‘Right to Know’ Act is transparency – creating a way for the user of a service to request, and thus understand, all the information a company has about them: what was given, what was shared, as well as what may have been inferred. And despite various company spokesmen speaking up in favor of the principles behind AB 1291 (e.g. increased transparency), behind closed doors many of these same companies are working to defeat the bill via industry groups and coalitions.

It is tough to argue against transparency. And it has always been our belief that a more transparent ecosystem would give rise to a better educated consumer, which ultimately means a better business environment. After all, we would hate to think that companies were somehow taking advantage of us, and the ‘Right to Know’ Act is a way through this potential fear.

For these reasons and more, we felt compelled to come out in favor of AB 1291. Below is the letter we sent to Assemblymember Lowenthal in show of our support:

***

March 21, 2013

 

The Honorable Bonnie Lowenthal

State Capitol, Room 3152

Sacramento, CA 95814

 

Re:  Support for AB 1291 (Lowenthal), The Right to Know Act

Dear Assemblymember Lowenthal:

On behalf of SpiderOak, Inc, we are writing to express our support for AB 1291, the Right to Know Act.

The Right to Know Act would modernize California’s Shine the Light Act (Civ. Code 1798.83), which was intended to provide Californians with the right to know when businesses are sharing their personal information. In the years since the passage of the Shine the Light Act, its definitions and mechanisms have been outpaced by rapid changes in technology, data collection, and business practices. The Right to Know Act would update this important measure.

We stand in support of the Right to Know Act for two reasons. First, SpiderOak strongly supports the principle that individuals deserve the right to know how their personal information has been collected and shared. Protecting user privacy needs to be an essential part of how we as a society address the growth of online activity. Therefore, we support efforts to provide individuals with consistent transparency across all of the companies who handle personal information.

Second, SpiderOak believes that transparency and greater understanding will help all businesses in the modern data ecosystem thrive – including SpiderOak. Businesses that handle personal information rely on user trust – that the business is handling information with the utmost care and concern. As the ‘cloud’ medium grows, information collection and gathering has increased exponentially. By increasing transparency, we believe that the Right to Know Act will promote good data stewardship across the board and thus increase overall trust in and usage of data-driven services, promoting innovation and business growth.

California residents and companies both deserve an online world where users can truly understand how their personal information is collected and shared. Transparency is a necessary step in building that world. For that reason, we are proud to join you in supporting the Right to Know Act.

Sincerely,

Ethan Oberman

CEO
SpiderOak, Inc

***

We are curious to hear your thoughts on the ‘Right to Know’ Act and where you stand. Feel free to leave your comments here – we look forward to the dialogue!

Increasing Transparency Alongside Privacy – 2013 Report

As we stated in our Transparency Report in 2012, privacy continues to be at the root of all we do at SpiderOak. Every new product and feature is designed to fit tightly alongside our ‘Zero-Knowledge’ privacy commitment. And we continue to explore the role transparency plays in overall privacy.

In our ongoing efforts to stay on top and aware of this ever-changing landscape, our work with the Electronic Frontier Foundation (EFF) continues to keep us better informed and aware of what we can do when fighting for the rights of our users.

Given all this, we have produced a Transparency Report that covers all activity over the last calendar year – from April 2012 to April 2013. The report is as follows:

SpiderOak Transparency Report

We are proud to stand behind our commitment to keeping our users informed of any and all activities involving their data and the constant protection of their privacy. Our relationship with the EFF and other organizations will continue to improve our outreach and understanding so that you – our user – will benefit from a fully transparent and open environment. As always, we greatly value your thoughts and feedback, so please don’t hesitate to send further thoughts or questions anytime.

Why Privacy Matters

Why does privacy matter?

To begin breaking down the subject of privacy, we created an explainer, recently published in IT Briefcase, to answer the following questions:

  • What is the difference between privacy and security?
  • Should you care about privacy if you have nothing to hide?
  • What does privacy mean in the digital age?

Want to help spread the message of privacy? Share this explainer and get people thinking about Why Privacy Matters.