Hi everyone, my name is Dennys, I'm a director of InternetLab
and I'm here with Beatriz Kira
and we're going to do more interview sessions
made by InternetLab
with international experts who are in Brazil,
and today we're very pleased to have Amie Stepanovich.
Amie works at Access Now and
we're going to talk about a few questions
that deal with privacy and surveillance,
especially issues related to cryptography,
its technical functioning,
issues related to the future
and other problems present in these
discussions, so we'll do the
interview in English but everyone can
follow with the subtitles
So Amie, thank you very much for being with us and for
talking to us about all these issues
that we want to cover in this interview
so I want to start this conversation
talking about encryption, as I'm sure
you've been following there were court
decisions in Brazil that demanded
the suspension of WhatsApp
in the entire country because of the
company's refusal to grant law
enforcement access to data.
Sometimes we know that the data was the
content of communications; sometimes,
possibly, what law enforcement actually
wanted was a way to wiretap or get
access to future communications.
That's something that
made the debate really interesting, and
encryption became somewhat of a hot topic in
Brazil. So I wanted you first to briefly
explain how encryption works and why
it is important to users who care about
their communications and their privacy.
Sure! So, encryption is really at
the heart of
security in the digital world, so as
people move more and more of their
information online, encryption provides a
lot of the protections that we're used
to in the analog, in the real world.
Things like keeping documents private
making sure that people who you don't
want to see your communications aren't
opening your letters, for example -- your
emails being the equivalent. And so
encryption is that protection when
you start talking about digital space.
It's really important to provide
companies with a lot of incentives to
develop and implement really strong
encryption, because the problem is
that there are a lot of vulnerabilities
in digital products and services and
people, when they use those services,
have insecurities naturally built in.
I have yet to see any product that's 100%
secure. But you want to talk to companies
and really make them see the benefit of
using encryption -- the fact that it's good
for their users and it's inherently
connected to human rights. Users really
can only exercise their right to
freedom of expression or freedom of
the press, privacy, if they have access to
encryption. The problem is that
actions like what we saw in Brazil with
the court shutting down WhatsApp
provide the wrong incentives to
companies. Encryption's already... It can
be expensive to develop, it can be
difficult to implement, especially really
strong encryption. Companies are
trying to put in place whole new models
and rethink security and that's a really
great process and we want them to do
that. But if they think that they are
going to be subject to shutdowns or
fines or imprisonment for doing that
it's going to make them think twice
about it, which harms human rights, harms
digital security, makes a lot of people
more vulnerable to crime, which is
exactly the opposite goal of what the
government wants to accomplish. You know,
they want to help solve crimes, but a lack
of encryption leaves people's data
vulnerable, more easily compromised --
and that could be by people trying to take
advantage of them financially and stealing
their credit card information, or
blackmailing them, or stealing other
personal data to get access to anything.
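To make the idea concrete, here is a toy sketch of symmetric encryption in Python. It uses a one-time pad, which is illustrative only -- real messengers use vetted protocols such as the Signal Protocol, not this -- but it shows the core point: without the key, the data an attacker steals is unreadable.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR each byte with the key (same call encrypts and decrypts)."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

card_number = b"4111 1111 1111 1111"          # the kind of data thieves want
key = secrets.token_bytes(len(card_number))   # random key, as long as the message

stolen = xor_cipher(card_number, key)         # what an attacker sees without the key
recovered = xor_cipher(stolen, key)           # the rightful key holder gets it back

print(recovered == card_number)   # True: only the key holder can read the data
```

The design point is the one made above: encryption moves the protection from the device or the wire (which can be stolen or tapped) to possession of the key.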
And so that increase in crime is what
we're trying to prevent. Encryption also
makes people more protected against
street crime,
because what the numbers show is that
people are less likely to steal digital
devices
you know, iPhones are a huge target for
criminals because they have a high
resale value, but if those devices are
encrypted they are of less use, so you
don't get those stolen as often. Which is
just another benefit of deploying the
strong encryption. It's really
interesting that you mentioned that,
because very commonly the debate is
framed around the "privacy versus
security" way, like it's framed
in this way. And one thing that I
wanted to ask is that usually in this
narrative, law enforcement has been
claiming that encryption may be an
obstacle to investigations and to
fighting crime. And sometimes there are
even law enforcement experts that
have been saying
that encryption is not a measure that
prevents them from having access to the
content or the information that they
want -- that it is allegedly just an
argument by the companies trying to
prevent them from having access to this
information. One thing, or one
strategy, that they may be
referring to is the so-called
man-in-the-middle attack, but
there are other ways of circumventing
encryption and obviously you can always
force companies to build in security
holes in encryption. So I wanted you to
explain to us a little bit more how
the man-in-the-middle attack works,
whether those claims -- that encryption is
actually just an argument by the company
rather than a real technical barrier --
are true, and, in the case of these
alternatives of forcing companies to
build security holes into the product,
how that can be hurtful for
users.
Sure. So there's a lot to unpack there. I think
the first thing that you need to know is
that encryption is not a panacea,
it does not solve all problems. Even
if you use the strongest encryption
available, there are always ways to break
into it. Now that said, encryption is the
best defense against mass surveillance:
it makes surveillance a lot more expensive. You
can't collect a lot of information off
the wire if that information is all
encrypted, which means it pushes
governments toward more targeted
individualized surveillance, which we
think is a good thing. Now, in those
targeted scenarios there are still a lot of
ways for governments to get
access to encrypted data which means it
requires users and companies not only to
implement encryption but also to
implement other practices -- good digital
security hygiene, as we call it -- to
continue to protect that data, to add
those extra levels that you need. So for example,
man-in-the-middle attacks are
when people can make you feel like -- the
very simplified explanation -- people can
make you feel like you're communicating
with another person, and you might
actually be communicating with that
person, but somebody is coming in and
is able to see that communication as it
happens or spoofing the party on the
other end and so you're actually
communicating with this person who is
not who you think you're talking to. Now
companies like Signal, designed by
Open Whisper Systems, have tried
to solve that by indicating to users
when their friends' keys change, and so
they will say, you know, the person
on the other end has a new key and they
encourage you to verify that is
still that person, so to reach out, either
call them and say "has your key changed?"
or send them a message on a different
channel and that is to try to verify, to
cut off these man-in-the-middle attacks.
Now, not everybody does that, which means
they could be susceptible and those are
the practices that people need to get
accustomed to if they do want to be
secure and protect themselves. Other
things are not clicking on random links
in email or downloading strange
software, because these can also
compromise the endpoints, the devices
that you're communicating on, which is
another vulnerability in encryption. Weak
encryption can be compromised or brute
forced often. Improperly implemented
encryption, which we see often, can have
vulnerabilities that people can push
through. There are a lot of different
ways, which is why we say that companies
should be incentivized, again
to put all of these resources that they
can into making their devices secure.
Because even in that scenario there are
a lot of weaknesses that make people
insecure, and not only insecure to
governments but insecure to bad actors.
And so if you start talking about
requiring companies to take resources
away from security and design a product --
I think I've heard somebody explain it
as encryption that works sometimes but
is designed to fail and you can't decide
when those failures are going to happen
and that's just a bad model overall.
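The key-change warning described above can be sketched roughly as follows. This is a simplified illustration, not Signal's actual safety-number algorithm, and the key material is hypothetical; the point is that a short fingerprint of a contact's public key lets two people compare keys out of band.

```python
import hashlib

def fingerprint(public_key: bytes) -> str:
    """A short, human-comparable digest of a contact's public key."""
    return hashlib.sha256(public_key).hexdigest()[:16]

# The key the app stored for a contact vs. the key the network now presents.
stored = fingerprint(b"alice-public-key-v1")     # hypothetical key material
presented = fingerprint(b"alice-public-key-v2")  # the key has changed

if stored != presented:
    # Could be an innocent reinstall -- or a man-in-the-middle. Verify the new
    # fingerprint with the contact on a different channel before trusting it.
    print("warning: your contact's key has changed; verify before trusting")
```

Comparing fingerprints over a separate channel (a phone call, in person) is what cuts off the man-in-the-middle: an attacker who swapped in their own key cannot make its fingerprint match the one the real contact reads out.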
It's interesting that you mentioned Signal; we want
to take the opportunity to discuss
a little bit about open-source code
in encryption software. We've recently
seen stories in the news about a
vulnerability in the WhatsApp
encryption.
There was a claim made by an expert at
Berkeley that really made some people
question their security when using the
app. WhatsApp has officially responded to
those claims, saying that its encryption
is secure and that the behavior was
actually a feature of the app. I wanted to use this story to
ask you about the importance of
open-source code in the encryption debate. Is that
something that we need to encourage and
could that benefit hackers and law
enforcement authorities, in the sense that
they could manipulate the code or find
security holes more easily?
So, open source technology is something that
Access Now promotes because you can see
what the software is built upon. So if
there is a vulnerability, yes you might
have law enforcement able to see that, to
take advantage of it, but you also have
the rest of the world able to see it, so
it is a significantly higher likelihood
that it will be discovered and able to
be patched if it's open source, just
because of the number of eyes on it. When
you have a piece of closed-source
technology, it means you, by nature, have
a very limited number of people
reviewing that software. Companies
that are closed
source, like Apple, often have lots of
audits, lots of high-level security
engineers but at the end of the day they
have fewer eyes looking over their code
and that's really one of the big
benefits of open source software. I do
want to touch on a piece that you said
about WhatsApp and about "the
features" versus "the bugs" because I think
this is really interesting.
We encourage companies, again, to
develop as strong encryption as possible,
but there are reasons to not have the
strongest encryption in every single
service because encryption is tied to
keys and so if you have a service that
you by nature want to be able to access
from lots of different devices, it
doesn't make sense
to have a single device with the key on
it -- it makes it a lot harder to access
that data if you want to be able to
retrieve it, if you want it backed up.
And this is why a lot of people on the
iPhone, even though the iPhone hard drive
is encrypted, back up their data to
iCloud, which gives that access back to
Apple, it's because they want to know that
if their phone falls into an ocean that
they can get their data back and so
there are legitimate reasons for users
not having the strongest encryption in
some products and in some services or
having practices that weaken
the protection that they have. It's
really important in those cases that you
still have some form of encryption and
that you're still protecting the data,
that you're very honest with users. We
think it's actually much worse to
provide users with a false sense that
they have more security than they
actually have, than to just not provide
them the security to begin with. Because
you're going to give them this idea that
they can do things and that they're
protected and they might take risks that
they would not otherwise take and
that puts them in a worse off position
than they were before.
And it's really important for companies
to hear that message that they need to
be honest and they need to be open with
their users about what they're doing.
Thank you.
Another issue that we wanted to discuss
with you is access to metadata. So under
Brazilian law there's no specific
requirement or there are
no specific rules determining when
metadata can be accessed by law
enforcement and under which circumstances.
So I wanted to ask you in the U.S. how
does that work and what do you think
would be the ideal circumstances under
which metadata could be obtained by law
enforcement and which safeguards should
be implemented so that users are
protected, their privacy is protected
in those circumstances.
So we think that metadata should have the exact same
protections as content because metadata
is often as revealing as, if not more
revealing than, content. What's
interesting about metadata is that it can't
lie, necessarily. I can write an email and it
can be all untrue, every single word of
it, but the information about where I
sent that email, who I sent it to, what
time I sent it, that is factual
information that can reveal a whole lot
about that communication. So we think
that the protections need to be the same.
The problem is... there are many problems.
First of all, in the United States we
have a doctrine called "the third party
doctrine" and it dates back to kind of
before the Internet, before the modern
Internet, and it talks about how when you
give your information voluntarily to a
third party, that you lose your privacy
interest in that information.
And metadata by the nature of what it is,
is always given to a third party. Your phone
company needs to be able to route your
calls, so they need to know who you're
calling, and your cell phone company needs to
know where you are because they need to
be able to send you service and to
connect you to a tower -- which results in
lower levels of protection
for users. In the US we have tried to
overcome those hurdles by passing
specific laws that provide higher levels
of protection that get around the third-party
doctrine. So, for example, there's a
law in Congress right now that has been proposed
to protect location information and
to ensure that you have to show some
suspicion -- that you have to
go to a court and get a warrant -- to
access location data, because of how
sensitive location data is. We think that
is the right approach, we also think
that the third-party doctrine is far
outdated because of how much data we
turn over to third parties, we think that
is no longer tenable in the digital
world, the way that it might have been many
many years ago, decades ago. So that's
important. It's also important to note
that metadata can't necessarily be
encrypted -- or at least we haven't figured
out a way to do it yet. And there are
reasons for that, but at the end of the
day people need to realize that, even if
their communications are encrypted, that
data is out there in the world
and it is not as protected through digital
protections, and so it's more easily
accessible, also to law enforcement, just
by virtue of that lack of
encryption.
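As a rough illustration of why metadata stays exposed even when content is protected -- this is a hypothetical message format with a toy cipher, not any real protocol -- the envelope the network needs for routing has to remain readable:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption.
    return bytes(d ^ k for d, k in zip(data, key))

body = b"the actual words I wrote"
key = secrets.token_bytes(len(body))

message = {
    # Metadata must stay plaintext so the network can route the message:
    "from": "alice@example.com",
    "to": "bob@example.com",
    "sent_at": "2017-02-01T14:03:00Z",
    # Only the content is protected by encryption:
    "body": xor_cipher(body, key),
}

# An observer on the wire still learns who talked to whom, and when,
# even though the body itself is unreadable without the key.
print(message["from"], "->", message["to"], "at", message["sent_at"])
```

This is the point made above: routing information is handed to third parties by design, so it needs legal safeguards, not just technical ones.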
We've been talking about the legal frameworks to have access to
this data, but I wanted to add to the
discussion the possibility of
governments hacking into users' devices.
Recent news stories reported that the
Brazilian authorities have had contacts
with companies that provide these
surveillance solutions for government
such as the Italian Hacking Team. Other
stories suggest that Brazilian
Authorities have pushed telecom companies
to adopt and use malware infiltration
and to obtain information stored in
cell phones. What are the risks for human
rights associated with government
hacking, and in what cases, if any,
could this be legitimate?
Also, if you could comment on the recent
amendment of Rule 41 of the Federal
Rules of Criminal Procedure in the US,
I think that would add something
interesting to the discussion.
So, Access Now, last year, we published a report
called "Government Hacking and Human
Rights" -- "A Human Rights Approach to
Government Hacking". And we tried to look
at specifically what impact hacking has
on human rights protections for users.
And so we looked at the different types
of government hacking and really the
different motivations that governments
may have: for example, to conduct
surveillance, which is a big one, to get
access to user data. Governments have also
hacked into devices for what we
call "messaging control": to dictate a
certain message and to ensure that
a message is either promoted or kind of
tamped down. And then a third one is to
do some sort of damage. You can hack into
devices, for example, to make them explode
or to, you know, make them overheat in
a way, and so to do physical-world damage.
You often hear in the US these
members of Congress talking about
cyber war, and they're going to hack
into the electric grid and shut down the
electric grid. That is the causing-damage
scenario. And what we've
determined is that the second and
third -- the messaging control and the causing
damage -- are just absolutely inconsistent
with human rights protections, at least
with what they are currently capable
of doing. Surveillance-motivated
hacking might be
consistent with human rights, but we said
that we do not condone government
hacking, we actually think that
government hacking is bad for users for
a whole range of reasons. It is very
different from different types of
surveillance. We are very clear that this
is not something that we think should be
blessed, but we are also clear that we
know governments are doing it, we know
it's not going to stop anytime soon, and
that we are in this
very real-world of governments all over
the world trying to hack into devices.
And so what we try to say practically is
if you're going to do that for
surveillance, you need to have a legal
framework in place. You cannot simply
contract with a company like Hacking
Team or use existing surveillance
authority that was designed for less
invasive activity than hacking, to cram
in the use of these very invasive tools.
I mean, we set out 10 safeguards and we
say these are what you need to have in
law. Things like greater transparency and
assurance that you're not going to cause
damage and that you're going to try to
remove the malware from the device after
the hacking operation. We think that this
is necessary. Now that said, Rule 41 in
the Federal Rules of Criminal Procedure
in the United States is the rule that
governs where magistrate judges
can authorize searches. So it
essentially said, with a few exceptions,
that a magistrate judge can only
authorize a search in the jurisdiction
where the device is to be searched. And it
was this practical limit on government
hacking, because a lot of times they were
hacking into devices because they didn't
know where they were located. And what
several courts have said is that you
cannot authorize a warrant for something
where you don't know where it is because
of Rule 41, in this requirement that the
object be present in your jurisdiction.
The recent amendments were passed
by a federal committee, approved by the
Supreme Court, and then went to
Congress. And all Congress had to do was
nothing, and they went into effect.
You know, normally the rule is that
Congress passes the law and the law
changes; here, just by inaction,
the rule change went into effect. And the
rule change added a new exception
saying that, in certain scenarios --
basically government hacking
scenarios -- a magistrate judge can
issue a warrant for government hacking.
Our opinion on that change was that it
was putting the cart before the horse
because we don't have the legal
framework that I said is necessary for
government hacking. So we were removing
these procedural barriers to make it
easier to hack into objects, into devices
without having thought through the
substantive rules that need to be in
place as well. So we think that that is a
very negative thing, we're now in a
scenario where we're not sure how the
government is using these authorities,
but we know they are, after that rule
change went into effect and we still
don't have the proper substantive rules
in place for it and we think that all
countries should really be considering a
legal framework. And we're seeing it, we
see it in the Netherlands -- it has a law on
government hacking, Italy just proposed a
law on government hacking that they want
to pass. Noting again that we
don't think they should be hacking at all, it is
actually quite good and quite protective, for a
law. Australia has a framework. We would
like to see more countries move into
that world of having a legal framework.
So I want to talk a little bit about the
mutual legal assistance treaties now. We
know that very often requests for user
data involve companies whose servers
are outside of the country -- in case
of Brazil, are outside of Brazil -- or
which do not even have offices
in Brazil, this adds another layer of
complexity to these issues, particularly
because they start to depend
on international cooperation frameworks
to be operable.
The MLATs are kind of at the heart of
these frameworks, but law enforcement
authorities have been very vocal about
the weaknesses of this model of working
with the MLATs, they're usually very
slow, bureaucratic, inefficient
but they are still out there.
So I wanted to ask, in your view, what
kind of reforms do you think are
important to make the MLATs work
better as an alternative for law
enforcement to have access to data?
So, I think there are two steps to
this, and at Access Now my colleague Drew Mitnick is about to put forward our
proposal, actually, imminently, on how we
think this should change. And the
first thing is to change the MLATs
themselves. We think the MLAT system, by
and large, is a human rights protective
system; it works to protect users,
especially users in countries where there
are fewer human rights protections.
The problem is that it is slow and
bureaucratic. You take a very hyper-local
crime -- you know, a crime that
happens in Brazil, with a Brazilian
criminal and a Brazilian victim, and
everything is here,
and all of a sudden there's a vital
piece of data for the investigation and
it's located on the server in the United
States, and you have to go through this
years-long process to get access to
that data. That's really unfortunate, it's
very frustrating. I know it's frustrating
in Brazil and in other countries, as well.
I think we need to be looking at
jurisdiction and jurisdictional issues
and make sure that we are exerting
jurisdiction in the right places, we need
to be providing more funding for MLATs
we need to be providing more training for
people to go through the MLAT process
these are just a few of the
things that you need to fix in the basic
MLAT model, while still protecting
human rights. You can't sacrifice human
rights at the altar of efficiency.
Now that said, one of the things that is
also being proposed is the ability for
countries to enter into agreements, where
they could bypass the MLAT system and
go directly to the companies, in certain
jurisdictions. There is right now a legal
proposal to change the law in the United
States to allow for these types of
agreements, because they couldn't happen
under current law. The problems are
many with the current proposal.
We think that the idea of this might be
very positive, it might alleviate some of
the pressure on the system, so that some
countries that protect human rights can
get more direct access, which means that
other countries that maybe don't qualify
would have a more efficient process
because some of that backlog would be
relieved. But, A, it does not actually
prevent countries from implementing laws
that are bad for human rights -- things like
mandatory data localization,
which is bad for human rights. The
proposal to bypass the MLAT
system doesn't prevent that from
being in place, which means it's not
solving some of the underlying problems.
B, it doesn't include MLAT reform, so
you're not, again, solving that underlying
problem by providing for greater
efficiency in the process. And, C, it doesn't
adequately protect human rights: one of
the countries being looked at for an
initial agreement is the United Kingdom.
The UK just passed one of the most
invasive surveillance laws in the world
last year, the Investigatory Powers Act,
that allows for huge amounts of
surveillance. And if this proposal
would allow the UK to get access to
US companies, you can imagine that it
really is not adequately protecting
rights. So we think that we need to
increase the standards -- that if
governments want easier access, if they
want to bypass this, they actually should
show that they have greater protections
for individuals. And the other thing that
I think is at the heart of this that
isn't being discussed is that the
proposal would provide protections for
Americans, and UK law, by and large, has
some protections for UK citizens, but
this is not a proposal limited to just
the two countries in the agreement. So if
the UK wanted to go directly to a US
company and get information about
Brazilians, there are very few
protections in place, which means it
would undermine the human rights of
users in every other country around the
world. Brazil, Germany, Australia, Tunisia.
Every user would have fewer rights
because the UK would then be able to get
greater access to their data.
And I think that that is a huge hole in
the system that needs to be plugged well.
We wanted to end our conversation with
your views for the future, especially
regarding the Brazilian scenario.
The disagreements about the legality
of the blockades -- the WhatsApp blockades -- have
become a constitutional issue in Brazil.
There are two constitutional complaints
challenging the constitutionality of such
measures before the Brazilian Federal
Supreme Court. Within the Legislative
Power, eleven draft bills have so far been
presented to the National Congress about
website and application blocking, and they
deal with the question in different ways,
either by prohibiting blockades in all
circumstances or by regulating them in
specific cases. All of them share
the presumption that a single
law will be able to deal with the
complexity of the blocking
issue, even though the
circumstances of each case, their motives,
and legal grounds are quite different.
Considering Brazil's pioneering role in
addressing these issues, what are the
impacts that the unfolding of the
discussions might have in other Latin
American countries? How do you see the
future of these tensions and are there any
ways we can move forward?
I think the first thing to consider is that
shutdowns of websites and services
violate human rights. And one of the
things that we need to think about is
that maybe shutdowns aren't the way to
deal with some of the issues that need
to be dealt with, that there are other
paths to do this. Shutdowns tend to be
fairly easy, but they also tend to be
very broad, and affect
a lot of users and a lot of
legitimate speech. And so if you pass a
law on this issue, other than a
prohibition, what you're doing, even if
it provides some standards, is
blessing the practice across
the board. And that can really
be a slippery slope toward undermining
human rights and allowing for a lot of
legitimate content to be shut down.
I do think Brazil has shown a lot of
leadership in the world of digital
rights, globally the Marco Civil was
revolutionary in applying traditional
human rights in the digital world. And so
I think there's a lot of space here for
Brazil to continue that leadership and to
prohibit this type of blocking.
Right now, one of the
greatest offenders -- one of the countries that
is most notorious for blocking -- is Turkey,
which has shut down Twitter on several
occasions and prevented that speech from
taking place. I would argue that
Brazil, with its shutdowns of WhatsApp,
is competing and trying to elbow in on
the sheer number of times a single
service can be shut down. I think that
that is bad for users across the board
as well, and so maybe it is not
something that we should be blessing
with a law, as much as clarifying that it
is prohibited under current law. There is
an argument that the Constitution does
not allow for full services to be shut
down, only pieces of them, under
the article 12 provision, and I think that is
a legally merited argument and I
think moving in that direction might be
a positive way to go.
Thank you.
So, Amie thank you so much for talking to us about all these important issues, this
was really interesting and I'm glad we
could hear your thoughts in all of this.
Thank you.