Conversations about life & privacy in the digital age

SpiderOak University & Interview with a Cybersecurity Expert

This week we opened the doors to SpiderOak University. Anyone can participate and earn extra GBs.

We were honored to talk to Richard F. Forno, Ph.D., who has more than 20 years of experience in the cybersecurity field. Dr. Forno helped build the first formal cybersecurity program for the U.S. House of Representatives, served as the first Chief Security Officer at Network Solutions (operator of the InterNIC), and is considered one of the early thought leaders on the subject of “information warfare.” Today, he is the Assistant Director of the Center for Cybersecurity at UMBC, an honors university in Maryland, as well as the director of its cybersecurity graduate program. Dr. Forno is also a SpiderOak fan.

1. How have you seen cybersecurity evolve since you’ve been in the field, and how would you describe where it is right now?

RF: Cybersecurity these days means much more than just people at computers guarding data and network resources. Yes, that’s where it started decades ago, when it was known as ‘computer security,’ existed as a small function of the IT department, and was treated as an administrative overhead budget item — but with technology, data, and networking permeating nearly every aspect of society, it’s taken on a much broader meaning and become a critical corporate function. Now, ‘cybersecurity’ can refer to nearly anything related to ensuring the security, availability, integrity, and resilience of the many systems and sources of data that form the foundation of modern existence. From protecting company (or national) secrets to personal health care and financial records, from the systems controlling water and power distribution in our cities to the widgets in our televisions, toasters, and electronic devices: they all require some degree of security, assurance, and resilience, since our lives and much of society depend on them. That said, I still believe cybersecurity — and by extension, privacy — is a state of mind and very much dependent on the context of any given situation to be effective.

2. Are you seeing more students that care about privacy and cybersecurity, or is it harder to attract people to your program?

RF: The former, absolutely. There remains a sizable global interest in cybersecurity education, from high schools and community colleges all the way through 4-year and postgraduate study. Recurring news reports of data breaches, website defacements, and denial of service attacks certainly help generate interest in the subject both personally and professionally.

That said, given the strong interest in cybersecurity, it’s important to set and manage student (or prospective student) expectations appropriately. Despite glorified portrayals of cybersecurity in the media, one can’t simply “wave a magic wand” and become a “cyber warrior” through a single college degree or certification exam alone; it’s a combination of fundamental and applied technical knowledge, social acumen, and the ability to understand the ‘big picture’ while exercising common sense that makes for an effective cybersecurity professional. Cybersecurity in 2013 is far more than just working with the bits and bytes. By contrast, you can work in some areas of cybersecurity and not necessarily need a deep technical background to be successful or make a difference.

3. Are there any trends in cybersecurity or privacy that you are excited about or think are the future?

RF: I think the ongoing revelations from Edward Snowden are giving people and organisations around the world a useful opportunity to reassess how much they share online and/or what third-party services they use to store information and communicate, which naturally includes both privacy and cybersecurity considerations. That public discussion, in my view, is long overdue: normally folks rush to embrace new technologies first and only figure out if or how they’re dangerous later, usually after something bad has happened. So in terms of privacy I am quietly optimistic that the pendulum may begin shifting towards people doing ‘less sharing’ – or, perhaps more accurately, leaving ‘fewer footprints’ around the Internet. At least they might start doing their homework and determining what level of exposure (and to whom) they’re willing to live with, and under what circumstances.

The last time I saw such heated public discussion about government intrusion into online privacy was back in the 1990s — first when the US government tried (and failed) to criminalise the distribution of PGP encryption software and then when the Communications Assistance for Law Enforcement Act (CALEA) was enacted by Congress to provide US law enforcement wiretapping capabilities on Internet devices — which was a faint foreshadowing of things-to-come under the ‘Patriot’ Act of 2001 and subsequent legislative proposals.

However, I’m encouraged to see security and privacy capabilities being brought to market and/or incorporated into software and devices.  To many users, security and privacy technologies are hard to understand and implement — so I am pleased that more user-friendly products and services are making it easier for people to understand and manage their privacy and security exposure if they choose to do so.  But by contrast, I worry about our obsession with creating the ‘Internet of Things’ — do we really need to have our home appliances, air conditioners, baby monitors, and automobiles constantly connected to the Internet? While convenient and perhaps fun or useful at times, what risks do they present to our security and privacy?

4. Tell us about how you came to your current role at UMBC, and what this graduate program is about?

RF: At UMBC I wear many hats. My primary role is directing our graduate programs in cybersecurity, which are now entering their third successful year of educating cybersecurity professionals to assume more senior leadership positions in the technology and cybersecurity industry. I’m also the assistant director of our Center for Cybersecurity, which serves as the University’s central coordination and outreach entity for cybersecurity education, research, and related activities, allowing us to better interact with our many partners, prospective collaborators, and the public. And, through UMBC, I am co-founder of the annual Maryland Cyber Challenge — our state’s official cyber-competition.

As to how I got here? My cybersecurity career began in the early 1990s, before the Dot Com Boom. Over the next 20 years I worked for a variety of government, military, and private organisations, and thus not only was an ‘eyewitness to history’ in terms of cybersecurity and the Internet Revolution but worked for some of the entities that helped shape it. Along the way, I remained interested in Internet policy, cyberculture, and how Internet technology influences modern society — which obviously includes many cybersecurity and privacy issues.

After a while, my interests turned toward “giving back” to the professional community and sharing my lessons learned with the next generation of cybersecurity practitioners to help them improve the future and perhaps learn from our collective past.  And thus I landed at UMBC in 2010 — certainly the right place at the right time to be working on this very timely global topic!

5. How long have you been a SpiderOak user?

RF: I learned about SpiderOak in early 2012 from a fellow academic down in Australia and signed up for the free personal account out of curiosity.  Now, with the SpiderOak Hive capability, I expect to increase my account size and replace another popular realtime sync service I’ve used for years with one that places great emphasis on addressing modern privacy concerns for its users in a meaningful way.

We’re grateful to Dr. Forno for sharing his time and expertise with us.

Be sure to check out SpiderOak University so you can participate and earn extra GBs for your account.

AMA: Interview with Cryptographer, Computer Security Expert Jon Callas

Jon worked on Apple’s Whole Disk Encryption and the PGP (Pretty Good Privacy) Universal Server, co-founded PGP Corporation, served as CTO of Entrust, and is now co-founder and CTO at Silent Circle (Global Encrypted Communications). An inventor and cryptographer, he has designed security products that have won major innovation awards from The Wall Street Journal and others.

Last week, you submitted your questions for Jon Callas, one of the world’s most respected and brilliant minds when it comes to software security and privacy. We chose five of them, which we sent to Jon. These are his answers.

1. How did you become a security expert / cryptographer?

A long time ago, I worked at the best computer science grad school there was — VMS development at Digital Equipment Corporation. One of the great things there was that I got to work on a wide variety of things, from graphics to schedulers to memory management to operating system security. A lot of the problems we had to deal with at the time are still relevant issues. I did a random password generator among other things, and I still use that for my own passwords.
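Jon’s VMS-era password generator isn’t public, but the core idea (drawing each character uniformly from a cryptographically secure random source) can be sketched in a few lines of modern Python. This is a hypothetical illustration using the standard library’s `secrets` module, not Jon’s actual tool:

```python
import secrets
import string

# Character set for generated passwords: letters, digits, and punctuation.
DEFAULT_ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 16, alphabet: str = DEFAULT_ALPHABET) -> str:
    """Build a password by picking each character with a CSPRNG.

    secrets.choice() draws from the OS entropy source, making the
    result suitable for security use (unlike random.choice()).
    """
    if length < 1:
        raise ValueError("length must be at least 1")
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password(20))
```

With this 94-character alphabet, each position contributes roughly 6.5 bits of entropy, so a 20-character password carries on the order of 130 bits.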

When DEC fell apart, like many people, I started a startup with a number of friends. We built a system that let you do meetings as well as play games, socialize, and collaborate. It got rave reviews. The venture capital people said to us, “This is amazing! I want to invest in this in ten years!” That was when I started getting into cryptography. People didn’t want to do collaboration on the then very-early Internet without encryption. There was no SSL at the time, either.

So I went to the second ever RSA conference to learn to do enough cryptography to protect our network. I ended up sitting next to a guy named Bruce who had just written a book called “Applied Cryptography” and he had a bunch of them in a duffel bag with him, so I bought one. I may have bought the very first copy; I know I was the first person at RSA who bought one. I asked him to autograph it, and he said, “I can’t deface a book!” I replied that it’s not defacement if you’re the author.

After we got tired of throwing our money into our startup, I went to work for Apple in the Advanced Technologies Group and worked for Gurshuran Sidhu, who was the inventor of AppleTalk, and shipped the very first crypto built into an OS, called PowerTalk. It failed for being far too early, as well. One of its pieces, though, was this password manager called The Keychain, and I claimed that it was a great thing. While it was hardly perfect, it encouraged good password use, and that was better than anything else. So Bruce Gaya and I hacked The Keychain so that you could run it without the rest of PowerTalk, and thus rescued it from oblivion. The present Keychain on Apple products is completely and utterly rewritten, but I’m proud of saving it. I also built a random number manager for Macs that’s now lost to the mists of time.

That was the worst time to be working for Apple, the year before Steve Jobs came back. I named all my computers for things in The Hitchhiker’s Guide to the Galaxy, because as I said, having been through DEC’s collapse I felt like a bowl of petunias (“Oh, no, not again”). When SJ came back, we heard a lot about what his plans were, as he and Sidhu were old friends. We knew that he was planning to get rid of all of ATG, so we wondered what to do. Sidhu wanted to start a startup, but none of us had any ideas we really liked. I could have easily gone into the OS group. A friend of a friend said that Phil Zimmermann’s PGP Inc was looking for server architects, and I interviewed there and got an offer. I thought it was a great way to do fun things and change the world for the better, so I went there. That was a great place to really become an expert.

2.  Are there any localities where it is illegal to encrypt calls, text messages, or emails?

Maybe. That’s not a good answer, is it?

In civilized countries, the answer is no. I might even go so far as to say that the places where it’s not only legal but even expected are pretty tightly correlated with civilized countries. Repressive governments often try to restrict crypto. I’m sure Syria’s got its opinions, but I’m not an expert on Syrian law.

There are places where there are restrictions, but they are also so filled with exceptions that it’s hard to give a definitive answer. For example, China has import restrictions on cryptography. But there are exemptions for non-Chinese doing business there or Chinese people who are doing business with other countries. I am also nothing like an expert on Chinese law.

My rule is that I worry about the laws of countries that I want to operate in. I need to know about them, there. Other places I just ignore.

Most often, even in repressive countries, they aren’t worried about the crypto as such, they’re worried about what the people are using the crypto for.

3. What are you working on right now that has you the most excited?

On a large scale, it’s Silent Circle. The biggest problem we’ve always had with crypto is that it’s hard to use. Usability is key because if it’s hard to use, then people use insecure systems. They don’t stop talking, they stop being secure. So your security has to fade into the background. It has to be ignorable. But it also has to be there, as well. That’s a paradox.

We also have learned one of the best ways to make security workable is to have it run by an expert staff. So the question is how to have an expert staff running the security and privacy for people who need it and yet the staff can’t undetectably compromise the people using the system. We have a lot of innovative things we’re doing to make the security fade into the background and yet be there.

On a small scale, I’m taking my old password generator from VMS and making it into an iPhone app. I was doing a lot of work on it before Silent Circle as a hobby, and I really ought to finish.

4. As an expert on encryption do you see a natural relationship between encryption and the law? What’s your stance on how encrypted data should be treated when there’s no idea what it may contain? In some countries there are what I consider very severe key disclosure laws and I wonder if there will ever be a duress scheme or method of deniable encryption that could be so perfect as to make the laws moot.

I think it’s an unnatural relationship between encryption and the law. All technologies can be used for good or ill. It’s true for fire. It’s true for just about anything. Encryption, interestingly, is rarely *directly* used for ill. Yes, there are data ransom schemes that use encryption for ill, but that’s not what people are concerned about.

It’s part of our belief in human rights that we believe in the right to be left alone. Yet many people lose their nerve when it comes to privacy technologies on computers and networks. I think that’s an artifact of the fact that we’re comfortable with door locks or window curtains, but every time someone thinks about encryption, the James Bond theme starts playing in their head. That’s an artifact of the relationship between encryption and disagreements between nation-states. With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.


My stance on encrypted data per se is that it’s data. Everyone has reasons that they want something to be private. Everyone has things that *must* be private, like their own records or someone else’s records, which usually *must* be protected. This might have been an interesting debate way back in the 1900s, but it isn’t any more.

I don’t know what to say about key or data disclosure laws. In the US, there’s movement in the courts towards protecting encrypted data in some way or other. It’s all revolved around passwords in specific, but the real issue is a Fifth Amendment issue. Relatively few countries have equivalents of the Fifth Amendment.

But in the UK, for example, they don’t have protections against self-incrimination. As a matter of fact, we have one in the US *because* they don’t have one there. They have a disclosure law, RIPA. I think its application has been pretty embarrassing, as I can’t think of a case where its use did much more than make the defendant more sympathetic.

I am not a fan of deniable encryption and personally, I think it’s impossible. Deniable encryption seems to me to be predicated on the idea that your attacker is either a nice person or stupid. Stupid in the sense that you are managing to hide the fact that you’re using deniable encryption. That predicates that either you’re using something completely custom, or they don’t realize that the deniable encryption is there. That’s what I mean by stupid — you’re pulling a fast one on them and they don’t know it. By being nice, they know you have deniable encryption and yet they’ll say, “Well, I guess if we can’t *prove* you have something encrypted there, I guess you don’t!”

A couple of years ago, I was chatting with the customs agency of a civilized country. I asked them about TrueCrypt and its deniable disk volume. They said, “Oh, we know *all* about TrueCrypt!” One of the guys I talked to added, “If we see you’re using TrueCrypt, we just ask you for the second password.” I asked what happens if someone doesn’t have a second volume and they replied, “Why would someone do *that*? I mean, that’s the whole point of TrueCrypt, to have a second volume. What kind of idiot would install TrueCrypt and not have a second volume?” We chatted some more and one of them said, “Look, we don’t look in someone’s laptop for no reason. We have better things to do. If we’re asking for your computer, it’s not because you had a piece of fruit in your bag. If we find special encryption, we know we’re on to something.” I asked again about someone who *doesn’t* have a hidden volume, and they said that you’d have to sit in a room for a while, until you convince them you don’t.

This is the real issue, I think. If you’re in a nice jurisdiction — one where you can say, “Look, I’m an honest person and I have encryption, and no I’m not going to tell you my password” then deniable encryption might work. But if you’re in a jurisdiction where they aren’t nice, then you’re actually more at risk using something that makes you look like you’re up to something.

Ironically, this is an effect of the fact that we’ve succeeded in making encryption normal.

5. What is your favorite movie?

There are relatively few movies that I’m willing to watch more than once. I’m apathetic about special effects, but a sucker for great dialog.

One of the very few movies I can watch over and over is The Princess Bride. One of my favorite lines to live by is, “Nonsense. You’re only saying that because no one ever has.”

Thanks Jon! If you are interested in learning cryptography, we recommend reading his PDF, An Introduction to Cryptography. Otherwise, be sure to follow or like Silent Circle to stay in stride with their efforts and support their work in encrypted communications.

Ask American computer security expert, Jon Callas

You know that crazy interview question, “If you could have dinner with any famous person, living or dead, who would it be?”

Well, someone the other night answered, Jon Callas. Perhaps there are several of you out there with an interest in the world of cryptography and information security who would also enjoy the opportunity to ask him some questions.

While we can’t set up a dinner meeting between you and Jon, we can pass along your questions. Jon has graciously granted us the opportunity to send over our readers’ most burning questions for him to answer.

Take the weekend to submit your questions in the comment section and the folks over here at SpiderOak will pick the best 10 to be submitted to Jon.

Who knows, maybe you’ll get a dinner out of this after all…