Everybody count off. Good, thanks. If there's an empty seat
next to you, raise your hand. It looks pretty full. Anyone know
if this place has a fire code or not? Don't start fires. If you
start a fire, you have to share. People who have their hands up,
have a seat next to them so the smart money goes that way. The
smart money is not in Vegas. All right. Hi there. I'm Bruce
Schneier. Are there any questions? It's good. That was
the easiest talk I ever did. If there are any questions, I'm
happy to take them. There are actually mics here. There's one
there, there's one I can't see behind the camera guy there. >>
Do they work, though? >> Yes, they work. That one works. >>
Thanks for coming out again. The question is, it's an old
question and I'm wondering if maybe you have any new insight
into an answer on this. With cryptography becoming more in
the collective consciousness, especially with people who are
less technically savvy, there has been an argument for a long
time trying to explain to people that encryption is not security.
It's very common for people to say we'll just encrypt the shit
and we're secure which obviously is total bullshit. Do you have
any insight on how to better explain to those people why
that's fundamentally flawed? >> I think you're right. A lot of
people think of crypto as a panacea where in fact it is just
a tool. And a very powerful tool, actually, for a bunch
of reasons, but it doesn't automatically make secure any
data that has to be used. One of the things the
Snowden documents have brought forward is metadata. This
cell phone is a great surveillance device, because the
metadata of where this phone is has to be in the clear; otherwise it
can't ring. Thinking about that, I should turn the ringer off. So
there's a lot of things encryption can't do. Encryption
can protect data at rest, but if the data has to be used --
take Target Corporation: you have a database of numbers that you're
using, and if it's encrypted, the key has to be there. I talk about
encryption as a tool, not as security, just like your door
lock is a really important tool but doesn't magically make your
house secure. What encryption does -- and I think this is really
important -- take NSA surveillance: it forces the
listeners to target -- what we know about the NSA is that they
might have a bigger budget than anyone else on the planet, but
they're not made of magic. They are limited by the same laws of
physics and math and economics as everybody else. And if data
is unencrypted, if they can tap a transatlantic Internet cable,
they can get everything. But if stuff is encrypted, they have to
target who they're listening to. If the NSA wants into your
computer, they are in. If they're not, it's because, one, it's illegal
and they're following the law, and two, you're not high enough on
the priorities list. So what encryption does is it forces
them to go through their priorities list. They can hack
into your computer, that's no problem. They can't hack into
everybody's computer. So encryption is just a tool, but
it's actually a really powerful tool, because it denies a lot of
the bulk access and forces the listeners to do targeted access.
And there's a lot of security benefit in that.
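To make the metadata point above concrete, here is a minimal sketch, assuming Python and the `cryptography` package; the envelope fields are hypothetical illustration, not any real phone protocol:

```python
# Even with strong encryption, the routing information has to stay
# in the clear or the message can't be delivered. A sketch assuming
# the `cryptography` package (pip install cryptography); every field
# name here is invented for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared symmetric key
f = Fernet(key)

envelope = {
    # Metadata: must be plaintext so the network can route the message.
    "from": "+1-555-0100",
    "to": "+1-555-0199",
    "timestamp": "2015-08-08T10:00:00Z",
    # Body: the only part encryption can hide.
    "body": f.encrypt(b"the actual content of the message"),
}

# An eavesdropper who can't read `body` still learns who talked to
# whom, and when -- which is exactly what bulk collection gathers.
print({k: v for k, v in envelope.items() if k != "body"})
```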
>> I wanted to get your opinion on the back door that Obama wants.
(Laughter.) I'm not sure Obama personally has an opinion here.
It's interesting. This is -- this is the same back door that
the FBI has been wanting since the mid '90s. We called it the
crypto war, and now we call it the first crypto war. So at number
three, I'm done; it's up to you guys. I only do two crypto wars per
lifetime. (Laughter.) It's interesting. >> You're referring
to PTP? >> In general. The FBI director gave an interesting
talk at the Aspen Security Forum. I recommend listening to
these talks; they're very high-level, mostly government
discussions about security, cybersecurity, national
security -- really interesting stuff. He was interviewed by, I
think Wolf Blitzer who actually asked a great question. Why
don't you like the term lone wolf terrorist? That was kind of
funny. He was talking about the going dark problem and the need
for back doors and this is the scenario he is worried about.
And he's very explicit. It is an ISIS scenario. ISIS is a new
kind of adversary in the government's eyes, because of
the way it uses social media. Unlike al Qaeda, ISIS does it
with Twitter. And this freaks the government out. So the
story -- and they swear up and down this happens -- is that
ISIS is really good at social media, at Twitter
and YouTube and various other websites. They get people to
talk to them who are in the U.S., like you guys, except you
know, a little less socially adept and maybe kind of a little
crazier. But they find these marginal people and they talk to
them, and the FBI can monitor this, and go FBI, rah rah. Then they
say, go use this secure app, and this radicalized American does;
they talk more securely and the FBI can't listen. And then, you
know, dot dot dot, explosion. So this is the scenario that the
FBI is worried about. Very explicitly. And they've used the
story again and again. And they say this is real, this is
happening. Now -- >> It's not going to work, though. >> It's
sort of interesting. If this is true -- let's take it as
true. Another phrase they use: they talk about the time from
flash to bang. Flash is when they find the guy, bang is when
the explosion happens. And that time is decreasing. So the FBI
has to be able to monitor. So they are pissed off that things
like iMessage and other apps cannot be monitored, even if
they get a warrant. And this really bugs them. I have a
warrant, damn it; why can't I listen? I can get the metadata,
why can't I listen? So if you take that scenario as
true, it is not a scenario that any kind of
mandatory back door solves. Because the problem isn't that
the main security apps are encrypted. The problem is there
exists one security app that is encrypted. Because the ISIS
handler can say, go download Signal, go download this random
encryption app I just uploaded ten minutes ago. So the
problem is not what he thinks it is. The problem is general-purpose
computers. The problem is an international market in
software. So I think the back door is a really bad idea for a
whole bunch of reasons. I've written papers about this. But
what I've come to realize in the past few weeks is it's not going
to solve the problem the FBI claims it has and I think we
need to start talking about that. Because otherwise we're
going to get some really bad policy. (Applause.) >> Good
morning. So this will probably go less in the direction of, for
instance, crypto. My question is somewhat twofold, but I'm going
to focus more on the first one. >> Do one at a time. >> In the
course of day-to-day interactions, both with security
people and with less security-minded folks, I've come
to the conclusion that operational security is very
difficult to instill. From your experience, is there an easier
approach to getting the understanding through? >> I
think opsec is pretty much impossible. If the director of
the CIA can't get it right, we're all done. We saw that in
the hackers that hacked Sony. We see people screwing it up again
and again. I'm not sure there is a solution, because good opsec
is really and truly annoying. It means not using e-mail for
certain things. I've come to the belief that we're not going to
be able to train people in good opsec. That the best security
is going to be pretty good security that's ubiquitous. And
we saw this in some more of the Snowden documents. How many
people read the recent article that came out of Germany? Okay
article, really great documents, on how the NSA's flexible
database for monitoring the Internet works. You can read their
documents and they talk about how they can find people using
encryption and roll up the networks. They can't read the
traffic, but they know who is talking to whom. Metadata --
encryption doesn't solve everything. And I'm reading
this, and it's clear you can do this with PGP as well. If you want
to find out who's using encryption, it's easy if you
monitor enough of the Internet. And what that tells me is that
someone would be better off not using the great encryption
program they wrote or the really powerful one they just
downloaded, but the average one that everyone else is using.
That you actually are better off using an iPhone with iMessage,
even though I'm pretty sure the FBI can get at it individually.
But you can hide using it because we're all using it. You
don't stand out. So I think there is a lot of power in that.
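To put toy numbers on the hide-in-the-crowd point, a sketch; every figure below is invented purely for illustration:

```python
# Toy numbers for the hide-in-the-crowd point: a watcher who flags
# every user of a given tool learns almost nothing from a flag on a
# huge user base, and everything from a flag on a user base of one.
tools = {
    "iMessage (on by default)": 500_000_000,
    "PGP (experts only)": 50_000,
    "encryption app you wrote yourself": 1,
}
for tool, n_users in tools.items():
    print(f"{tool}: anonymity set of {n_users:,}")
# Strong crypto in a set of one still tells the watcher exactly
# whom to target; average crypto in a set of millions does not.
```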
You had a second part. Make it quick. >> You answered it. It
was about making the use of opsec less obvious. >> Make
it invisible. Think of SSL, good security works if people don't
even know it's there. Encryption from the handset to the base
station -- it would be great if it were better, but it's working
because nobody knows it's there. >> So in thinking maybe not as
thoroughly or deeply as I should about like cyber terrorist
threats and bad actors that want to do corporations or
infrastructure harm, these sorts of things, or just the public,
right, it seems like all the ingredients are there for people
to do really bad things and there are a lot of holes and
security flaws. What keeps there from being enough motivated bad
actors and people -- what keeps them at bay? >> I think
fundamentally people are good. Right? Society works because
most of us are honest. You kind of look at me funny, but none of
you have jumped up and attacked the person sitting next to you.
(Laughter.) You laugh, but if this was a room of chimpanzees
that would have happened. A room full of a lot of strangers
sitting quietly listening to me. This sounds weird but I think a
lot of what keeps the really bad things from happening is most
people don't want to do really bad things. If that wasn't true,
society wouldn't work. So I think you're right, that all the
pieces, a lot of the pieces are there. There are a couple of
things. Terrorism is harder than you think. Yes, technically it
can be easy, but the whole operation is actually harder.
Which is why you don't see a lot of terrorist attacks. What you
do see is these lone wolves who wake up one morning and say, I'm
going to do something bad. There's no conspiracy to detect,
there are no mistakes they can make over the course of the
planning; that flash-to-bang time is so short. So I really do
think that's why. Something interesting I would mention as
long as we're on this topic: a new tactic we're seeing
more of is the notion of organizational doxing. You can go
into a company, take all of their stuff, and publish it. It's
freaking people out. This happened to Sony, and to Hacking
Team. The guy who did that is in this room, thank you very much.
(Applause.) It's what might have happened to Ashley Madison,
which is a little more awkward for some people. (Laughter.) And
if you remember a few years ago, it was HBGary Federal. And I
think this is a really interesting tactic because it
empowers individuals against very powerful organizations. And
it is the first, I think the first real counterargument I've
heard to the increasing practice of hiring a sociopathic CEO. If
you are worried about everything your CEO says becoming public in
three to five years, you might not want to hire a jerk. But I
expect to see more of this. People are noticing that
WikiLeaks is publishing Saudi diplomatic cables; someone
hacked in and is dumping all this stuff. So that's an
interesting example of a new bad thing that's being enabled by
technology, and it's happening more and more. But in general, I
do worry about the bad things happening, but I think it's less
common than we think, because most people don't do them. It's
the systemic stuff that bothers me. The Internet of things,
being able to hack cars and planes and heart monitors and
other stuff. And the interconnection of those. I
think we're going to see more unanticipated vulnerabilities.
Because remember, complexity is the worst enemy of security. It
is nonlinear, tightly coupled complexity, and that's really
what the net gives us, so we've got to be real careful there. >>
I had occasion to look at it recently, and at that time you
guys had assessed that our ability to analyze hash
functions was a good ten to 20 years behind our ability to
analyze the other primitives. I wonder if you think that gap has
closed. >> I think we're much better at understanding hash
functions right now. We're still implementing bad ones but that's
more legacy. It's a very hard -- mathematically it is hard
because your assumptions are squirrelly. I think I revised
that in the revised edition of that book, which is Cryptography
Engineering. But I do think we understand crypto -- encryption
primitives better than hash primitives. Even though you can
make one from the other. Which is why -- I don't know if people
remember the hash function contest that just ran, five
years ago -- I built a hash function on top of a symmetric
algorithm, because I felt I understood that better. I'm not
convinced the NSA understood hash functions well. They weren't
using them in military applications much until recently, so
they didn't have the rich history with them that they had with
encryption.
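For readers who want to see how you can make a hash out of a block cipher, here is a toy sketch of the classic Davies-Meyer construction -- not Skein's actual construction, which builds on Threefish and is considerably more elaborate -- assuming the pycryptodome package:

```python
# Davies-Meyer: H_i = E(m_i, H_{i-1}) XOR H_{i-1}, with each message
# block used as the cipher key. A toy sketch of the general idea,
# assuming pycryptodome (pip install pycryptodome); the padding here
# is simplified and not suitable for real use.
from Crypto.Cipher import AES

BLOCK = 16  # AES block size in bytes; chaining value is one block

def davies_meyer(message: bytes, iv: bytes = b"\x00" * BLOCK) -> bytes:
    # Pad to whole 32-byte blocks (32 bytes = an AES-256 key).
    pad = (-len(message) - 1) % 32
    message += b"\x80" + b"\x00" * pad
    h = iv
    for i in range(0, len(message), 32):
        block_as_key = message[i:i + 32]        # message block keys the cipher
        e = AES.new(block_as_key, AES.MODE_ECB).encrypt(h)
        h = bytes(a ^ b for a, b in zip(e, h))  # feed-forward XOR
    return h

print(davies_meyer(b"hello world").hex())
```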
>> Thanks. >> The code in Rijndael is not the
code that's in -- I trust you more than I trust the feds. Do
you think that AES is a trustworthy cipher? >> I think
AES is. I trust AES. It is Rijndael. Everyone is happy with
them. It is weird because you can actually describe the
algorithm in a linear -- in an equation that fits on one page.
It's kind of small type but it fits on one page which kind of
freaks people out a little bit. But I do trust it. I think it is
secure. I do not think there is a back door or anything snuck
in. I truly don't. NIST did a great job with the AES process.
I really do -- NIST unfortunately got tarred with
the Dual_EC random number generator, and they're trying to rebuild
their trust, but they've done a fantastic job with crypto
primitives by and large. I like AES. Thanks for using Twofish.
I like it too. I use AES without reservation. >> As disturbing as
the current crypto war is, something that scares me are
Lavabit or some of the other companies getting national security
letters. I'm wondering what we can do to defend against
governments secretly ordering companies to put back doors into
their products. >> The thing that should freak us out the
most -- and to me this is the biggest revelation of
Snowden and all the stories around it -- is this: it's not that we
believed encryption was perfect and nobody could break it.
But we did believe that the tech rose and fell on its own merits.
And the idea that the government can go into a company and say
you have to break your encryption and then lie to your
customers about it is terrifying. The law can subvert
technology. And we cannot, as a community, as a society, truly
believe that anything is secure, as long as that's true. I just
talked about, you know, iMessage, and we don't know. And I
blogged about this a couple of days ago. It didn't get enough
play -- it's kind of the last part of a post -- but there's a
persistent rumor going around right now that Apple is in court
fighting an order to backdoor iMessage and FaceTime. And
Nicholas Weaver has written about how they could do that.
How they can modify their protocols to make that happen.
And we don't know. That is fundamentally terrifying. And I
don't know how to fix that. We have to fix that through the
legal -- there's no tech fix. I mean, the kind of things you can
do, I think that if we thought about it, we could rewrite the
Apple protocols such that if they did have to put a back door
in, we would notice. We would say, why did you make that change;
they would give a bullshit answer; we would know something
was up. So maybe do something making your algorithms not
backdoor-proof, but backdoor-evident. So maybe think more
about that. But this is a hard one. And I don't have a good
answer. And it is one that I think really should disturb us.
More open source is going to be good here, more sunlight, harder
to subvert. But as long as the government can issue secret
orders in secret courts based on secret laws, we have a problem.
And it is not a tech problem. It's a legal problem.
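One way to read the backdoor-evident idea is key pinning: if clients remember key fingerprints, a silently swapped wiretap key becomes visible. A hypothetical sketch, standard library only, and emphatically not Apple's actual protocol:

```python
# Trust-on-first-use key pinning: remember the fingerprint of the
# first key you saw for each peer, and scream if it ever changes.
# A provider compelled to swap in a wiretap key can't do it silently.
# Purely illustrative; all names are invented for this sketch.
import hashlib
import json
import os

PIN_FILE = "known_keys.json"

def fingerprint(public_key: bytes) -> str:
    return hashlib.sha256(public_key).hexdigest()

def check_key(peer: str, public_key: bytes) -> None:
    pins = {}
    if os.path.exists(PIN_FILE):
        with open(PIN_FILE) as fh:
            pins = json.load(fh)
    fp = fingerprint(public_key)
    if peer not in pins:
        pins[peer] = fp  # first contact: pin it
        with open(PIN_FILE, "w") as fh:
            json.dump(pins, fh)
    elif pins[peer] != fp:
        # Key changed: maybe a reinstall -- or maybe a court order.
        raise RuntimeError(f"key for {peer} changed: {pins[peer][:12]} -> {fp[:12]}")

check_key("alice@example.com", b"alice's long-term public key bytes")
```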
>> Hi. We seem to be in a situation where the software industry can
release software that runs on billions of devices and is
completely insecure and badly written and there's no
consequence whatsoever to those companies for the problems that
they create. Just recently, what comes to mind is the hack on
Android. Can you discuss generally what you think about
this and from a legal perspective and software
companies being held liable, accountable for the bad software
that they write? >> I've always been a fan of liabilities. I
wrote my first thing about it in maybe '02 or something, maybe
even before. So here's the basic argument. Right now, as you
say there are no costs to writing bad software. You read a
software license, it says if this software maims your
children and we told you it would do that, we're not liable.
Even security software says no claims about security are made,
even though they are. So liability changes that. It adds
a cost to not designing software properly. It adds a cost to
insecurity. It adds a cost to nonreliability. And that has
real value. Think about it: we are already paying these costs.
We're paying in losses, we're paying in aftermarket security
devices, we're paying for the entire industry that has
sprung up around dealing with the fact that the software
sucks. With liability we would pay anyway -- the cost would
be passed on to us, of course -- but at least we'd be getting
more secure software out of it. I see a collective action
problem: the market is not rewarding good security. The
cost of insecurity is too low; the value of insecurity is high.
And liability changes that. It is a lever we can use to
rebalance this cost-benefit ratio. And I think it's a
powerful one. It is not a panacea. Lots of ways
liabilities go wrong. But liabilities do a lot. They
provide value. And I think they would help a hundred percent here
in software. We know why the Android vulnerability isn't
being fixed. The way they designed their system, Google produced
the patch, but it won't go down to the phones. The phone
manufacturers don't care very much. They don't have that tight
connection like you have in the iOS world. So the patch doesn't
go downstream. If suddenly the phone manufacturers were liable,
I assure you that the patch mechanism would work better,
right? And that's a lever we have as society, and we should
use it. I think it's a better one than regulation here,
because it's one that's dynamic and tends to seek its own level.
But that's why. That's why you use it, and I'm a big fan of it.
Actually, thinking about this, hang on. Everybody, smile.
There's more of you than fits on this screen. That's not going to
work. Hang on. People at the edges, you don't have to smile.
Thanks. Who's next? >> Bruce, it seems like surveys show that
Americans are less and less concerned about the privacy of
their information. Often you hear things like, I'm not hiding
anything, so I'm not worried. And it seems like people my age
and younger don't have much of an understanding of Edward
Snowden or the relevance of what he released. What would you say
to those perspectives? >> I don't know if people know, I had
a book published in March called Data and Goliath. I spent a
whole chapter on that question, on privacy and why it's important.
And it's not true that people don't care about privacy --
especially not true that young people don't. All surveys show that they do.
They're very concerned about it. And you know this is true. You
remember being a teenager. You don't care about the government,
because who cares, but you're concerned about the privacy in
your world. And you know, kids, teenagers who are fluent in
the net are fluent in how to maintain their
privacy. They might not do a good job but they try a lot. I
argue that it's fundamental to individuality, to who we are.
Without privacy we become conformists. We don't speak out.
And I think it's a really interesting argument in social
change. We're in a year where gay marriage is legal in all 50
states. (Applause.) That issue went from impossible to
inevitable with no intervening middle ground. It's amazing. But
what it means is -- and you can take legalization of pot, you
can take a lot of issues this way -- at the start,
something is illegal and, let's say, immoral. It goes from
illegal and immoral to, you know, some cool kids are doing
it, to illegal and we're not sure. And then suddenly it
becomes legal. But in order to get from here to there, you've
got to pass through a point where the thing is illegal and people do
it anyway. Because you've got to do it and say, you know what?
That gay sex wasn't that bad. (Laughter.) That was kind of
okay. You know? I tried pot and the world didn't end. And it
might take 40 years and a couple of generations, but then you get
to the point where it is legal. Interracial marriage. Any of
these issues. But if you have surveillance there -- if you can
stop people from trying the thing and saying, you know, it's
not that bad, maybe we're wrong -- you never get to the point where
the majority of us believe we're wrong. I think surveillance,
broad government surveillance, will really have a stifling
influence on social progress, because it won't let experiments
happen. Now, that's not an argument you can make to anybody, but I think
it is probably the most important one. But really,
anyone who says I have nothing to hide, you know they're lying,
right? I mean, there aren't cameras in Scott McNealy's
house because he has nothing to hide. I think you have to point
out that those arguments aren't true and that privacy isn't
about something to hide. It is about maintaining your sense of
self in a public world. I get to determine what I tell you people
and what I don't tell you people. And that is empowering.
And if I lose that, I am fundamentally a prisoner of
society. So attaching privacy to something to hide, to secrets,
is just wrong. It's about human dignity and it's about liberty.
I spent a whole chapter on that and I do it better in the
chapter. So I offer that up. >> Thank you. >> Yes? >> Most
people seem to be more worried about back doors and forced
government back doors, but I'm sort of more worried about
sneakier things. What is your opinion on quantum computing and
current encryption, and also quantum encryption as a
rebuttal to quantum computing? >> Quantum encryption has
nothing to do with quantum computing. Quantum computing is
going to become real. Probably not in our lifetime. I think we
can factor like 24 now. But it will get better. It has
potential to change crypto, but not destroy it. It will break
all of the common algorithms -- RSA and those. It will break
those in polynomial time, and be very nasty. But we do have public key
algorithms that do work. There are early ones where the work factor
is a square instead of exponential. They're less
efficient, but they still work. In theory, the best quantum
cryptanalysis does is halve your key length. It reduces your
brute-force search by a square root. So double your
key length and you're done. NIST is actually hosting conferences
on post-quantum cryptography: how can we build cryptography
resistant to a theoretical world of quantum computing? Quantum crypto is really
quantum key exchange. It is a clever way to exchange keys
using quantum properties. Really neat, great science, great
physics, something I as a cryptographer have no need for.
So I think it is kind of pointless from a practical point
of view. Great science, but I would never buy such a device
because I would use one of the math systems and they work just
fine. So that's sort of my quick quantum primer. But it's great
science and I love the research. And eventually, yes. We'll be
able to factor numbers very quickly, which will be cool.
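The double-your-key-length arithmetic from above, spelled out as a sketch: Grover's algorithm turns a brute-force search over 2^n keys into roughly 2^(n/2) quantum operations.

```python
# Back-of-the-envelope version of the "double your key length" rule.
# Grover's algorithm reduces exhaustive search over 2**n keys to
# about 2**(n/2) operations -- the square-root speedup described
# above. A sketch of the arithmetic, not a simulation.
def effective_security_bits(key_bits: int, quantum: bool) -> float:
    """Effective exhaustive-search cost, measured in bits of work."""
    return key_bits / 2 if quantum else key_bits

for n in (128, 256):
    print(f"AES-{n}: classical {effective_security_bits(n, False):.0f} bits, "
          f"quantum {effective_security_bits(n, True):.0f} bits")
# AES-128 drops to ~64 bits of work under Grover; AES-256 keeps
# ~128 bits. Hence: double your key length and you're done.
```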
Yes? >> I was wondering if you caught the previous talk by
Eijah, on Demonsaw? >> I definitely want things -- this is the
same answer as for the opsec question. The more you make it
invisible, the more you make it transparent, easy to use, no
work. Even sacrificing some security, I think we do better.
I'm really liking -- on my iPhone. It's a great program. It
has a really clean interface. It works. All the
key exchange happens in the background; it's well designed.
I can confirm there's no man in the middle. I don't have to, but
the fact that I can is enough of a deterrent to people trying it.
So I really like simple stuff. Because I want everyone to use
it. There's value in it being ubiquitous. Expert-only
encryption has much less effectiveness. One last comment
to the quantum guy: one of the things we know from the NSA
documents is that they have a budget line for quantum computing, but it's not very large. >>
First of all, Bruce, you're my security crush and do you mind
if I take a picture with you after the show? >> I don't but
you guys all have weird pie plates on your chest. I'm just
saying. You look like some embarrassing cult. >> It's
Flavor Flav. With the explosion of software-defined networking,
do you have specific concerns around the security of such
leading-edge technology, and this virtualization of routers,
switches, firewalls, et cetera -- do you have thoughts on that? >> I
don't have any specific concerns, just general ones of more
complexity, more things to be insecure, and another layer of
organization. Those are my concerns. I mean, there's huge
value in this. I'm a big fan of security outsourcing for
organizations. It's very hard to do right and the more you can
consolidate the expertise, I think the better you'll do. But
there are legal risks. We've been seeing some court cases
where the FBI can serve a warrant on Facebook for your stuff,
bypassing you. They can do that. And that does cause problems.
But in general I think the value of outsourcing is great.
There are security risks, but on balance I tend to like
that technology. >> So it's a balance, no major concerns over
shared control planes? >> That's it. You've got to know all that.
Those are things to be concerned about. But are they major
concerns? They're like regular-sized concerns. >> All
right, thanks. >> First of all, thank you for everything you do.
>> With the pie plate also, I'm saying. >> My question is, even
if they wanted to, would policy makers be able to stay current
with the pace of technology? >> It's interesting. I think I've
come to the belief that the United States, and probably other
countries, are becoming fundamentally ungovernable.
One of the reasons is that technology is moving so fast
that the people who understand it can run rings around policy
makers. Whether it's writing laws where, five years later in
retrospect, you realize -- whoa, they understood that and put
that sentence in, and we didn't understand it -- this is hard. I
like seeing laws that are invariant. Write laws that don't
have to keep up. Laws about assault and murder don't really
care about the weapon. I could write a law about privacy for
communications that doesn't care if it's e-mail or voice or voice
over IP. I can do that. I think that's better. I'm not sure it
will happen. There's so much co-option of the legal and
policy process by people who stand to make and lose a lot of
money. Right now the cybersecurity bill, which is probably
going to get signed, has got all sorts of amendments and riders
and what it actually does isn't what they say it does. And
that's an easy one. You start doing something like health care
or climate change, forget it. So I'm not optimistic about
lawmakers staying current because of technology. I think
we're going to have to go through some bad times before we
realize how to create law in a society where tech moves so
fast. There's an argument to be made that the modern
constitutional democracy is the best form of government mid-18th-century
technology could invent. Travel and communications were
hard, so we pick one of us to go all the way over there
to make laws in our name. It made sense in 1782. There's a
lot of ways that our systems that were designed when nation
states started becoming a thing are sort of breaking now because
things are different. Things are moving too fast. The
communication is different. It's all different. And I think we
have to redesign democracy. This of course is ridiculous and will
never happen. But I think we kind of need to. That wasn't an
optimistic answer, was it? >> A few months ago there was news
about Chris Roberts being detained at the airport after he
posted a tweet -- >> Is he here? Okay. >> And then your blog said
that maybe the FBI knew that Chris Roberts works in the field of
avionics and that's why he was detained. And there was news
that they had posted a warning about it, even though they claim
it was an IT issue. So what do you think -- is the emphasis on
security -- the issue was similar. >> So we didn't know
that. Chris Roberts says that he actually was being watched by
the FBI, that he had talked to them before. The Chris Roberts
case is very complicated, and I stopped commenting when I
realized it was way more complicated than I understood.
This is the case of him being on a plane and saying something
about going into the avionics bus via the USB port in his
seat. Which would be crazy if you could, but it wouldn't
surprise me if Airbus forgot about that. It really seems
every time you take physical security people and give them an
IT security problem, they completely forget they should
talk to an IT security person. Anyone follow the hack on the
Brinks safe? It's completely embarrassing. They never even
opened an IT security book. Oh yeah, we can do this, no
problem. I don't know how much proactive -- it does seem like
the FBI is monitoring more groups and more individuals. And
we see them monitoring the Occupy movement or Black Lives
Matter -- I mean, real social change movements that might not
be as, I don't know, as mainstream as they could be. So
there's a lot of that going on. How much -- in those cases I
don't know. The Wall Street case, I have no idea. Certainly
there's always a lot of bragging that might not be true. >> But
then they posted it the day before. >> Yeah, I don't know. I
don't know the details. And it's hard to speculate. I think there
is more monitoring than we all think. This is the point of
fusion centers, this is the point of information sharing
between the NSA and others and I think a lot of it is going on.
>> So do you trust elliptic curves? >> I've been a skeptic
for a bunch of years. Most do trust them. My feeling is that
there's a lot we don't know about elliptic curve
mathematics. The NSA uses elliptic curve crypto. So I can
tell you that. So in some instances it is secure. But I
also know that they have, in some cases, tried to influence
curve selection. Now, for good or for bad, I can't tell you. So
I worry about elliptic curves where the curve selection
process isn't transparent. Now, if you want to use elliptic
curves in your system, Dan Bernstein, a guy we all trust,
ran a public process, and his curves are available, and I would use
those without reservation. If the NSA said, here are some great
curves, I would say, you know, huh? (Laughter.) So that's my
feeling on that. I think the math is sound, but I think there
are things we don't fully understand.
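As a concrete example of the Bernstein curves mentioned above, here is a minimal Curve25519 key-exchange sketch, assuming the PyNaCl package (a wrapper around libsodium):

```python
# Curve25519 key exchange via PyNaCl (pip install pynacl). A minimal
# sketch: each side combines its own secret key with the other's
# public key and derives the same shared secret.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

alice_box = Box(alice_sk, bob_sk.public_key)  # Alice's view
bob_box = Box(bob_sk, alice_sk.public_key)    # Bob's view

ct = alice_box.encrypt(b"curves with a transparent pedigree")
print(bob_box.decrypt(ct))  # b"curves with a transparent pedigree"
```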
I'm getting the get-off-stage-soon signal, so I'm going to take -- do you have a
yes or no question? >> No. (Laughter.) >> All right. Short
answer. Go. >> So you got me into security, I've been in this
industry ten years. Thank you so much. So my question is, now
that I find myself in a position, hypothetically, where
I'm working with a government agency, where they use the same
shitty software as everyone else, they've got the same
problems as everyone else, and I'm convinced that the actions
of stockpiling zero-days, weakening crypto, all that stuff,
is harmful. >> Quick, quick. >> What can I do to convince or
show these people that actually these other arms of the
government are doing things that hurt us. >> This is hard and a
lot of us are trying to do this. Keep saying it. There's no
tech fix for this. I'm doing a book signing at 4:00. Come by and say
hi. Not all of you at once. I'm going to go outside right now
assuming there is space to breathe. Thank you very much for
coming. Thanks for sitting. Have a good weekend. (Applause.)