Why one of cybersecurity’s thought leaders uses a pager instead of a smartphone
By Andrea Peterson
Dan Geer puts on his glasses at the beginning of his keynote at Black Hat USA 2014.
(Image courtesy of Black Hat)
In the computer and network security industry, few people are as well known as Dan Geer. A longtime researcher widely regarded as one of the field's foremost thinkers, Geer is currently the Chief Information Security Officer at In-Q-Tel — a non-profit venture capital firm that invests in technology to support the Central Intelligence Agency.
Speaking on his own behalf as the keynote at Black Hat USA last week, Geer laid out an ambitious plan to help secure the Internet and define privacy in the digital age, including mandating security breach disclosure, having the U.S. government buy and disclose all the zero-day vulnerabilities it can find, and supporting an even stronger "right to be forgotten" than is currently being tried out by the European Union. His full keynote is available to watch on YouTube — or to read via Black Hat's Web site.
The Switch spoke with him after his keynote to dig into a different topic that he touched on: His distrust of increasing data collection and how he tries to stay off the digital grid in his own life. This interview has been lightly edited for length and clarity.
One of the things I was very interested in from your talk was your personal approach to technology now — as one of the sort of elders of the cybersecurity community you really seem to try to stay off the network as much as possible. Is that accurate?
I don't carry a cellphone. Honestly, it's a nuisance not to — one would be very helpful because, as you know, things aren't about planning these days, they're about coordination. "Oh, did you see this? Get over here."
But on the other hand, I testified actually twice — once at the FCC, once in a congressional committee — that if you required location tracking, I was going to give one up. And to an extent, it's only putting my money where my mouth is. I said I would give it up, and went ahead and did it. So you say, "Well, you're cutting off your nose to spite your face, you're just being stubborn." But no, I meant it.
You no doubt have written about data retention laws and the like... The whole bit about data retention laws bothers me in many ways. On the other hand, if you're an optimist, or you're in a position to control how data is used, you'll be much more comfortable about having it. Does the name Alessandro Acquisti mean anything to you?
It does — although probably more to you...
He's a professor at Carnegie Mellon. I think he's about as good a designer of experiments in privacy — in particular, people's real opinions on privacy — as anybody. He's really good at experimental design, which, speaking as someone who was once trained as a statistician, appeals to me. He runs very clever experiments, and those clever experiments include getting past the institutional review committee, which is not exactly a walk in the park...
But he's done a bunch of things, and shown that if you give people fine-grained control over what of their information is in public, people reveal more. You might think that if you give people a lot of control, they'd reveal less — but it doesn't work that way. People will reveal more if they have more control, so to a certain extent what he's verifying is sort of my own feeling: If I don't have control, I don't want to reveal it.
Part of my personal opinion about all this is that I don't trust a situation where I have not only no control over how data is used, but no visibility into whether it is being used. Take electronic health records. We're obviously going towards them in a big way. But I ask you, who owns the electronic health records?
That's a good question.
I worked in Harvard's teaching hospitals for 10 years after getting out of college. And in 1974, I'm fairly certain that was the year — this is by memory, but I'm fairly certain that was the year — but in Massachusetts that's when who owned the medical record changed from being the individual to the institution. Before that, when I went to the window I could say "give me my record" and you would have to produce a stack of paper and when I took it and walked out, I had my records. There wasn't another.
That was changed, ostensibly, to combat insurance fraud — people were taking records, removing parts of it and going to another institution to get more medicine or more whatever. Insurance fraud, okay? But the point was there was a record and you knew where it was — because if I have it, you don't, and if you have it, I don't. But now electronic health records, where is that going to go? There are people who argue that in the world of electronic health records, it's natural for it to revert to the patients. I think that's probably true — but let's think about this.
I have a practicing-lawyer friend who argues that in a world where malpractice suits are so ordinary, common, and frequent, it might not be the case. If you are a practitioner and it's 100 percent electronic records and you're worried about being sued, will you or will you not want a copy of that record in your files as well as wherever else it might be? Or are you willing to say, "I looked at Dan's records in this cloud at this time and it told me I should give you the transfusion," versus "I've got a copy of the record and this is what I used to make my decision, and you know that my copy and this copy are not the same, so someone has modified it"?
So that's going back to what this guy is actually talking about doing: founding a company that provides time-stamped delivery of medical record fragments, so that someone can say, "No, this is what I had and I can prove it — this third party over here can say, no, that is what I transmitted to Dan's doctor on this date. We don't know what's in it because it was encrypted, but we can say it was the same bits because we stamped it in a certain way."
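The attestation scheme described here can be sketched with an ordinary cryptographic hash. This is a hypothetical illustration of the idea, not the actual company's design: the third party records only a digest of the encrypted bytes plus a timestamp, so it can later attest "these are the same bits" without ever seeing the plaintext.

```python
import hashlib
from datetime import datetime, timezone

def stamp(encrypted_record: bytes) -> dict:
    """Third party records a digest of the ciphertext, never the plaintext."""
    return {
        "digest": hashlib.sha256(encrypted_record).hexdigest(),
        "stamped_at": datetime.now(timezone.utc).isoformat(),
    }

def verify(encrypted_record: bytes, receipt: dict) -> bool:
    """Anyone can later check that these are the same bits that were stamped."""
    return hashlib.sha256(encrypted_record).hexdigest() == receipt["digest"]

# An opaque encrypted record fragment; the stamping service cannot read it.
ciphertext = b"\x93\x1f\x07opaque-encrypted-fragment"
receipt = stamp(ciphertext)

assert verify(ciphertext, receipt)             # unmodified: integrity holds
assert not verify(ciphertext + b"x", receipt)  # any change is detectable
```

In a real deployment the receipt itself would need to be signed by the third party (as in RFC 3161-style trusted timestamping) so the stamp cannot be forged, but the integrity-without-confidentiality property is exactly this: the verifier learns whether the bits changed, not what they say.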
And I think he's right about that: integrity in electronic health records becomes perhaps more important than confidentiality. It may well be that we are at a moment in time when, under the pressure to provide observability, confidentiality for better or worse goes away. I'm using electronic health records as the example, but it could just as easily be cars, or the smart grid, or anything else. But that leaves the question of integrity.
I'm sure you've seen this, but the so-called CIA rule — confidentiality, integrity and availability — is the traditional triad of computer security concerns. Availability is not as big a concern, but it has to do with "if I go looking for Dan's records, will I actually get them?" Integrity is "has anybody mucked with them?" And confidentiality is "has anyone who never had any reason to know been able to see them?" I think it's honest to say we may lose a certain amount of confidentiality control. It would be most unfortunate if we lost integrity control at the same time.
So what do we do when there's lots of fragments of my medical records and every practitioner I deal with wants their piece of it, or maybe the whole thing? Integrity actually is the big deal then, I would argue.
Arguably the same points could be made about tracking cars for insurance purposes. . .
And I understand why you would say you want to record everything with a car — where it has been and so on. I understand that. One of the things Tim O'Reilly suggested in his work on algorithmic regulation was, well, you know, you could make obeying the speed limit built into the car, but you could also make the speed limit dependent on how crowded the roads are — so you could drive faster in the off hours. In rush hour, the car would drive slower. Yeah, we'd probably do what they do in London and adjust for congestion. But his point is that instead of regulating the prior conditions — "you can't go faster than this, or you can't do that" — regulate it on the run with an algorithm. Of course, that way lies wonderful things and terrible things. It's a question of what you think is probable. As for myself...
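The run-time regulation idea can be sketched as a simple function of congestion; the function name and every number here are invented for illustration, not drawn from O'Reilly's work:

```python
# Hypothetical sketch of "algorithmic regulation": instead of a fixed prior
# rule, the speed limit is computed on the run from current road occupancy.

def dynamic_speed_limit(occupancy: float, free_flow_mph: int = 70,
                        rush_hour_mph: int = 45) -> int:
    """Scale the limit down linearly as road occupancy (0.0-1.0) rises."""
    occupancy = min(max(occupancy, 0.0), 1.0)  # clamp sensor noise
    return round(free_flow_mph - (free_flow_mph - rush_hour_mph) * occupancy)

print(dynamic_speed_limit(0.0))  # empty road, off hours: 70
print(dynamic_speed_limit(1.0))  # full rush hour: 45
```

The wonderful-and-terrible duality Geer notes lives in who sets the parameters and who can audit the function, not in the arithmetic itself.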
I'm not a Luddite. Luddites smash machines. But I am getting older, and it's easier to say "why do I care?" To continue with the cellphone conversation, one would be especially useful: A member of my family is mentally ill, so I've been carrying a paging device, but the pager companies are slowly going out of business because of this [points towards cellphone being used to record interview], for obvious reasons. But it's important for some people to be able to reach me for certain situations that occur, as you might guess. Maybe I'll give up and do this. GPS built into cars, or OnStar that you can't turn off: do you care about that? As I'm sure you know, the most common reaction is "I live a good life, I have nothing to hide." Daniel Solove has a book about this in which he dismembers that argument, showing that just because you have nothing to hide doesn't mean that you want everything recorded.
This comes down to what you expect as defaults. For me, the default I find easier to expect is "data doesn't exist" rather than "data exists but we handle it properly." Look, nobody who contributed to the 1.2 billion passwords [reportedly in the hands of Russian cybercriminals] expected that. That's presumably a rare event — reasonably rare — but it really comes down to what default you are willing to accept, what default feels natural to you. For me, "the data doesn't exist" seems more natural than trusting everyone not to abuse it.
So you don't trust a world where data creation and collection is the default.
Trust? What's the definition of trust? You know, I have a sort of personal definition of privacy and a personal definition of security. For me, trust is the availability of effective recourse. I don't guard myself if I have effective recourse, so I trust family members, because of course you always have some recourse if it's family — one way or another you do; maybe not today, maybe not with your grandmother, but you do. There's always something. But there are lots of situations now where I would have no effective recourse, so I don't trust them. And if I don't trust them, what should be my default? The answer is probably that the creation of data is something I should avoid if I can. That's an awfully long-winded answer to your initial question, but nuances matter.
Do you think that the new default as surveillance has become more ubiquitous is that everything is public to a certain extent?
Man, what is public these days? If I can read your newspaper from orbit, what is public? If I can tell where you are in your house by imaging through the wall, what is public? On and on and on. We're not there yet, but I figure we're within a few years of being able to figure out if you're in a room by sniffing out your DNA. Is that public? Or putting it differently, as that sphere enlarges, what remains private? Do you have to own a house to have privacy in it or not? If your landlord owns a house...
And mine does.
So does he have the right to examine the records of a smart meter and see if you're running the toaster and the washer and the air conditioning at full blast? On the farm that we have, several people live and work there because, with horses, if you have an emergency, like god forbid a fire, you have a very short time; there's no time to call people and ask them to get their pants on to come help. It's now or never. So we have a certain number of people, six: two trainers, three groomers, and a vet in training. I did, in fact, put in a water meter of sorts — a flow rate meter — because one of our wells kept running dry and I wanted to know if there was a leak in the pipes and it was just seeping down into the ground, or if someone was taking six-hour showers, or what. You might say that's a little invasive. Turned out, one of the tenants had a bad leak in the bathroom and thought nothing of it. "You could have told me" was my reaction.
But the point is that after I ascertained that water was indeed not going back into the ground, I knew it had to be going somewhere. And I don't go visit other people's apartments at random — I could but I don't. But sure, I put a meter on. And certainly your average electrical engineering student could create a device to determine if you're on your cell phone. Maybe that doesn't matter, but they could tell when you're on the phone. So they wait until you're on the phone, run up to the porch and steal your newspaper.
I'm making this up, but if it's observable, does that mean it's public? That's sort of your question, and my question too. Just because it's observable without crossing the boundaries of your property, does that mean it's public? I think if we don't do something, that's where it's going. What was it, the 1920s [Olmstead] through the 1960s [Katz], where wiretaps went from not requiring warrants to, of course they do? It was very plausible — this wire leaves your property, why wouldn't you expect it to be listened to on someone else's property? The decision that overruled that was "no, you have a reasonable expectation of privacy." But that phrase, reasonable expectation, is open to interpretation. What's a reasonable expectation? As you know, it doesn't take much of a parabolic antenna to listen to you in an open room.
Yep, someone else could be recording this interview right now.
Absolutely. Or there could be a ghost in your machine.
But it's observable, because it's in public. I'd like to think that we stopped there for a little bit. We can always let go later. We can always say, "nope, your copyright is invalid because it was published three times" and it's now in the public domain. That happened once to the poem "Desiderata"—"go placidly amid the noise and haste" — it was published repeatedly in church newsletters and the courts said it was in the public domain. What is the public domain? That's really the question. Technology is changing what is public by changing what is observable, and that's what I'm getting at. And I don't know the answer, but I do know that if we don't answer it, things will continue.
So one of the points you made earlier was that it is actually very inconvenient for you not to have a smartphone. Clearly, I'm recording this interview on my smartphone — actually I have two smartphones on me, and a laptop, and all other kinds of gadgetry...
But it seems like the lifestyle choice you've made would be very difficult for a lot of people without your technical understanding or resources. . .
Yes, and maybe without my gray hair. I'm not asking how old you are, but young people such as yourself in a way can't do without social media. If you're a high school student, for example, and you don't play that game, you will not be part of any circle of friends — or probably not; maybe if you're going to a forestry high school or something. But you know what I'm saying: Generally speaking, it so changes what is possible on the human scale that you almost have no choice, and I understand that. Just to be clear, I'm not belittling that at all.
Your question was about lifestyle choice, and I said it in the talk: There's an old engineering rule about fast, cheap, and reliable — choose two. If you're at NASA and you're sending something to the moon, you need it to be fast and reliable, but you can throw away cheap. Throwaway medical instruments in an operating room need a different trade-off: they don't have to work for long, and since you're going to throw them away it would be nice if they're cheap. So you make your trade-offs.
That, as a rule of thumb, is mostly what engineering is about. You can have most things, but not everything. I think security engineering is about tolerable failure modes — about what the tolerable levels of failure are. Determine what failure modes are tolerable and what are not, and I can design around not having the intolerable ones. But the cost of it will be some others, because you can't have them all. So when I say not "fast, cheap, and reliable" but "freedom, security, and convenience — choose two," it's in that spirit, as an engineer.
I was once trained as an electrical engineer, so that rings true to me, maybe for reasons of indoctrination, you could say. But I would say really everything in life is a trade-off. There's an economic argument that the cost of everything is the forgoing of an alternative: if you buy a hundred-dollar this, you can't buy a hundred-dollar something else. In this case, the forgone alternative is on the risk accumulation side. But it was a choice between "what do I need" and "what do I want."
I live in a world of old machinery — with hat number two [as a farmer] on. And old machinery has an interesting characteristic compared to new machinery. New machinery doesn't break very often, but when it does you cannot fix it. The old machinery breaks all the damn time, but anybody with a few wrenches, a hammer, and a willingness to get dirty can fix it. One of my guys set an old tractor on fire — burnt out the wiring harness. I have no instructions, but it's so straightforward. It was a freakin' mess, but it's fixable. Maybe your newspaper has covered the right to repair.
Yes, we've talked to the iFixit folks about how consumers' ability to repair items in their own lives has really changed.
Right. I'm, in a sense, flying in formation with those folks, though out of regular contact. As I said, I live with old machinery, which breaks often but which any idiot can fix. Who fixes their own Prius? I haven't heard of anybody — there's one guy I know who could and might well, but he also spent an entire summer working in a Prius shop because he wanted to know how it works.
Which is not a luxury everyone else has.
It's not a luxury everyone else has. In fact, when I said from the podium that one way for a supplier to avoid liability would be to give consumers the right to recompile, well, I was talking to someone a few days ago who said, "Well, nobody wants to recompile, nobody knows how — who do you think you are?" It was a good point, and I'm not denying that at all.
But suppose the choice is: "Here are the means to change it or repair it or whatever. You don't have to use them, and if you do use them it will work. But if you would rather not take that on, the following rules apply: you must bring it to the dealership, you must bring it on schedule, and if you have a collision we need to know about it." I'm making this all up, but lithium batteries don't take shocks very well, so if you do have a collision with a bunch of lithium batteries in the back of your car, you probably ought to look at it.
You know, Jeff [Moss, the founder of Black Hat also known as The Dark Tangent] was talking in his opening remarks about "radical simplicity" — I'm not quite sure what that means in plain English. Is that a movement or a term of art or something?
I don't know, quote, what it means, but let me guess: You can actually draw a line around something and say, "All of the moving parts in the system are inside this box — I don't have to know about a cloud in Singapore, I don't have to know." After all, how did Target get taken over? Through their air conditioning contractor — who probably knows nothing about computers and shouldn't have to. If you go to the big banks in New York (I wish I could say which ones, but I probably shouldn't), most of the ones I know, and that is a subset, are really bearing down on what they call counterparty risk: If you have access to my data through some relationship, then an invasion of you is an invasion of me, therefore I'm going to hold you to standards that are relevant to me. Even if they aren't relevant to you, if you want to do business with me you're going to have to do this.
And the banks are really enforcing this. If you're a trade clearance firm, what are you doing? The answer is making lists and comparing them and looking for good matches — but no, there are all sorts of other requirements, because you won't be able to do business with your clients unless they can make sure your air conditioning contractor can't get into you the way Target's got into them. That's the complexity, and maybe that's what this radical simplicity addresses: I should be able to ascertain what the moving parts are.
What is it that Leslie Lamport says? A distributed system is one where the failure of a machine you've never heard of stops you from being able to do your job.
Yes, I've had that problem many times. . .
I went to check out and they told me I couldn't because their computer wasn't working, and I was like, "Wait a minute, do you know what audience you have here?" I didn't say anything, but you know, when you can't give money to the front desk at this conference, it just seems highly coincidental.
Andrea Peterson covers technology policy for The Washington Post, with an emphasis on cybersecurity, consumer privacy, transparency, surveillance and open government.