Back to index

Ep 72: The GDPR | The Seen and the Unseen


#
Did you know that Parsis in Mumbai, instead of being left at the Tower of Silence after
#
they die, are now cremated?
#
And why?
#
Because a cow fell sick in the early 1990s?
#
Did you know that the smog in Delhi is caused by something that farmers in Punjab do, and
#
that there's no way to stop them?
#
Did you know that there wasn't one gas tragedy in Bhopal, but three?
#
One of them was seen, but two were unseen.
#
Did you know that many well-intentioned government policies hurt the people they're supposed
#
to help?
#
Why was demonetization a bad idea?
#
How should GST have been implemented?
#
Why are all our politicians so corrupt when not all of them are bad people?
#
I'm Amit Varma, and in my weekly podcast, The Seen and the Unseen, I take a shot at
#
answering all these questions and many more.
#
I aim to go beyond the seen and show you the unseen effects of public policy and private
#
action.
#
I speak to experts on economics, political philosophy, cognitive neuroscience, and constitutional
#
law so that their insights can blow not only my mind, but also yours.
#
The Seen and the Unseen releases every Monday, so do check out the archives and follow the
#
show at seenunseen.in.
#
You can also subscribe to The Seen and the Unseen on whatever podcast app you happen
#
to prefer.
#
And now, let's move on to the show.
#
It's quite common these days for me to be at a restaurant or a retail outlet and the
#
cashier asks me for my phone number.
#
I always say no.
#
I value my privacy, and I'm not giving my phone number to anyone.
#
This is all very well in the offline world, or meat space, as we used to call it once
#
upon a time.
#
But what do you do online?
#
Apps on your phone these days have permissions to your photo library and your SMSs and your
#
microphone and whichever folder in your brain contains your innermost thoughts and feelings.
#
Everything we do is constantly being tracked online.
#
We're usually too exhausted to do anything about this, so we just give in thinking, hey,
#
what difference does it make?
#
At the most, they'll give me targeted ads.
#
But actually, your data is worth a lot more than that, and it can be misused in more ways
#
than just the serving of targeted ads.
#
Your data is who you are, and someone else may be in control.
#
Welcome to The Seen and the Unseen.
#
Our weekly podcast on economics, politics, and behavioral science.
#
Please welcome your host, Amit Varma.
#
Welcome to The Seen and the Unseen.
#
My topic for today's show is the GDPR, or the General Data Protection Regulation, a
#
regulation that has just come in place in Europe to protect the data and the privacy
#
of its citizens.
#
It has come about with the very best of intentions and has both intended seen effects and unintended
#
unseen effects.
#
My guest on the show today is Manasa Venkatraman, a legal expert and a policy analyst who has
#
a special interest in privacy and data protection.
#
She works at the Takshashila Institution in Bangalore.
#
But before I begin my conversation with her, a quick commercial break.
#
If this happens to be the only podcast you listen to, well, you need to listen to some
#
more.
#
Check out the ones from IVM Podcasts who co-produced the show with me.
#
Go to ivmpodcasts.com or download the IVM app, and you'll find a host of great Indian podcasts
#
that cover every subject you could think of.
#
From the magazine I edit, Pragati, at thinkpragati.com, there is the Pragati podcast hosted by Hamsini Hariharan and Pavan Srinath.
There is a brilliant Hindi podcast, Puliyabaazi, hosted by Pranay Kotasthane and Saurabh Chandra.
#
And apart from these policy podcasts, IVM has shows that cover music, films, finance,
#
sports, sci-fi, tech and the LGBT community, all under one roof, or rather, all in one
#
app.
#
So download the IVM Podcasts app today.
#
I am a lawyer, and right after law school, I started working in corporate law in Bombay.
#
Then I decided to sort of redeem my soul, do some good deeds for a change, and I think
#
I always knew I wanted to be part of the lawmaking process rather than interpreting the law and navigating an existing space, which is why I moved to Takshashila.
#
And over my time there, I've developed an interest in technology law and technology
#
policy.
#
And here we are.
#
And here we are.
#
And were you always interested in law or it just happens to be something that you did
#
and you fell in love with it later?
#
You know, I always wanted to be a journalist.
#
And it's funny because I went to Xavier's, submitted my BA application, and I was like, Government Law College is right here.
#
It's just after one signal, let me just go put in my application there.
#
And when I got through to GLC, I was like, let's do it, let's see where it takes us.
#
And then eventually I ended up sort of liking, dissecting law and stuff like that.
#
And what are the areas of the law which you found most fascinating, like who are the writers
#
on law who you would enjoy reading the most and how did you sort of discover yourself
#
as someone who thinks about the law and in general legal philosophy?
#
So the great thing about the college I studied at is that it has a rich history of alumni.
#
So Ambedkar was once the principal of the college, various freedom fighters were alumni
#
of the college.
#
But all of that aside, I think that space allowed me to participate in a lot of competitions.
#
Law students have this fun competition, which is called a mock trial.
#
And I did a lot of that.
#
And I just liked going up there and sort of being like, no, clause B of section eight
#
does not allow you to do this, my Lord.
#
And a lot of milords later, I sort of figured that I liked it.
There are two particular judges that I really loved reading through law school.
#
One was Lord Denning, who was a sort of judge in England a really long time ago, sometime
#
in the 1700s, 1800s.
#
And one was MC Chagla.
#
And apart from law, what else interests you?
#
What do you do?
#
I like to sing.
#
But you know, when I sound like a, I don't want to say I sound like a creepy night watchman
#
on a podcast.
#
You already said it, you're not editing this out.
#
I like to sing, I mean, I do a little bit of yoga on the side, but that's about it.
#
Mostly I just watch Netflix.
#
And if there's one book or movie that has really influenced you and made you a different
#
person, is there something you'd be able to name?
#
A movie that's made me a different person.
#
Or a book or, you know, it doesn't have to be just one.
#
Well, there are a couple of books that sort of cemented my, the fact that I wanted to
#
do public policy.
#
There's this particular book called Banishing Bureaucracy.
#
And I read that when I was studying public policy at Takshashila.
#
And that was just fascinating how much you can do within the government and outside the
#
government to sort of ramp up how the system works.
#
And there's this other book called Public Policy Making in India, it was a textbook.
#
But I think just the way in which, just the kind of things that you can do to be part
#
of the process of making your state a better state was kind of, it shaped the way I looked
#
at the...
#
So in a sense, you actually want to make the world a better place.
#
That's why you're in this.
#
That's the grand idea.
#
Yes.
#
Yeah.
#
That's the grand idea. For the benefit of my listeners here, Takshashila is, of course, the institution which publishes Pragati, the magazine I edit at thinkpragati.com.
But what else is it?
Tell us.
#
Certainly.
#
Takshashila is a think tank and it's a school of public policy.
#
So these two things are, you know, one can think of them as very parallel tangents, but
#
we kind of managed to marry them both.
#
On the one side, we do research on international relations, geopolitics, public finance and
#
technology.
#
And on the other side, we teach the basics of public policy to students.
#
We also have started teaching geostrategy to students.
So we tell them how to apply the learnings that you have from it, and other such things.
#
Takshashila to me is, it's my place of comfort where I can learn, where I can make mistakes
#
and where I have the space that I need to grow.
#
And I think that's really what you look for in your workplace when you're starting out
#
in a career.
#
So it's a fantastic bunch of people.
#
It's also increment time there.
#
I can attest to a fantastic bunch of people.
#
So let's move on to the subject of the day.
#
But before we talk about GDPR specifically, let's talk a little bit more about privacy.
#
Like why is privacy a more important issue today?
#
And would you say it's a more important issue today than it was say 30 years ago?
#
Definitely.
#
You know, that privacy is an important issue today can be attested to by the fact that the statement
"we're in a global village today" is now a cliche, right?
#
And it is a cliche because you hear it every other day.
#
Because it's happening every minute, we're becoming more and more interconnected.
#
And that's the thing about technology is that while technological developments multiply
#
from year to year, the law can only catch up or what the law can do is just create a
#
safe playground for technology to develop.
#
And in India, we didn't have that safe playground for privacy, which is why last year the Supreme
#
Court had to come in and say that no, guys, there is a fundamental right to privacy.
#
Privacy is important because of how simple it is for us to be connected to one another
#
from one end of the globe to another end.
#
And we all know that in the process, our data is trading several hands.
#
We don't know who has access to our photos.
#
We don't know how many apps know our phone number.
#
And we don't know how many apps know our home address and what we look like from the window
#
of our bedroom.
#
Privacy is more important now because there are more technologically advanced ways of
#
knowing about a person than, say, 50 years ago.
#
There was probably the telegram 50 years ago, the telephone, the radio, but today you have
#
you can look at a person's face when they're sitting in America and you're sitting in India.
#
While technological developments have opened up a lot of doors in this sense, we don't
#
know if our private lives have also walked out of these doors.
#
And that is why it's important to discuss the subject of privacy and how we can still
#
maintain that while allowing technology into our lives.
#
And now an argument I've heard from some people is that, listen, okay, our data is out there,
#
but the only thing that really happens is we get the spookily targeted ads.
#
So if I do a search for, say, shoes, I'll get ads for Reebok and Nike served to me the
#
next time I go to Facebook, and that's okay.
#
That's the use of their private data that some people see, and that's the only interface,
#
so to say, with their data coming back at them that people are aware of.
#
What are the dangers of your data being out there unfettered, unprotected?
#
There are two examples that I want to talk about.
#
One is from the USA and one is from China, the other end of the world.
#
We all know what happened between Facebook and Cambridge Analytica, and the whole question of democracy being taken for granted, with voters being targeted the way they were targeted for ads.
#
Assuming someone has heard about this for the first time, can you sum it up?
#
So what happened in the US is two years ago when they held federal elections for the president,
#
it came to light later on that this analytics firm called Cambridge Analytica was mining
#
data of US voters from their Facebook profiles, and it managed to target voters in a way that
#
no other campaign could before.
#
And while this seems fairly innocent when you hear it, it's problematic because none
#
of the voters knew that they were being approached for their votes.
#
This wasn't made clear.
#
So in a way, it was manipulative.
#
It was deceptive that the voter who signed on to Facebook did not know his data was going
#
to be used like that.
#
Exactly.
#
So that's one example of how your data can actually be used to influence democratic processes.
#
The second example is all the way across the globe in China, they're doing something called
#
the social credit system.
#
So apparently China doesn't have a rich history of banks lending money to people and banks
#
generally give out loans when they know that someone has the ability to pay back.
#
But now China wants to move towards a more institutional credit-giving system, which is to allow banks to do it.
#
But banks don't know if they give money to the next person if that guy is going to pay
#
back or no.
#
So what China is doing is something, it's collecting data about every person within
#
China through their behavior on apps like WeChat, Alibaba, this and that.
#
And it's putting all of this data together, creating a very personalized profile of this
#
person.
#
And on the basis of what he looks like, on the basis of what his data tells the government
#
about him, loans will be given out.
#
Now, the problem with this is that it can be easily used by the state to discriminate.
#
And I think they saw a couple of examples of this in the US when black people were not
#
given loans.
#
Basically, black people were discriminated against because of the areas that they lived
#
in and because they generally did not have the kind of information that the computer
#
was able to think was believable or reliable.
#
Right and what you protest here, is it the intended use of what appears to be a really
#
efficient credit rating agency where it gets much more information than credit rating agencies
#
otherwise would?
#
Or is what you're protesting the fact that it's a state which has this data at its fingertips
#
and states are of course always subverted, there is always regulatory capture and it's
#
not good for democracy, which is moot in the example of China.
#
But in general, it's not good for democracy if the state has all this information.
#
Absolutely, I'm protesting both.
#
On the first point, yes, we don't know the intended usage today may be to give out loans
#
to people who can pay back.
#
But you don't know if five years down the line, that will still be the purpose.
#
And by then, they would have been empowered with all the data they need to do what they
#
can with it.
#
And the second point is, you're absolutely right. All the data, let's take Aadhaar data, for example, that rests with the ruling government today.
#
And that's one of the dangers of giving all your information to the ruling government,
#
that they can use that and manipulate that to stay in power.
#
Democratic processes are at a very dangerous place if this is going to happen.
#
And the thing to remember is, and I'm actually old enough to remember this, that in the 1984
riots, the rioters, if we can call them that, were actually given electoral rolls.
So they could identify which houses Sikh families lived in.
#
So it was targeted to that extent.
#
And that's just a very small subset of the data that is actually out there.
#
If you can target better than that, and if you can do more insidious things without necessarily going to their houses and burning them, then the scope for the state harassing a citizen just
#
goes up exponentially.
#
There's no limit to it.
#
Absolutely, just like tech developing exponentially, the power of the state to do whatever it wants
#
also increases that much.
#
Right.
#
So the question here is that in general, does the citizen need protection from the state
#
as well?
#
And if so, how does that come about, given that any laws and any regulations that would
#
happen would have to come from the state itself?
#
I think that's where the GDPR comes in.
#
Any law that seeks to protect a citizen has to obviously have their rights at its center.
#
So for example, if I have 60 apps on my phone, that's 60 terms and conditions that I haven't
#
read, 60 privacy policies that can do whatever they want because I haven't read them.
#
If the state wants to protect my rights and to protect my privacy, the law should say
#
that irrespective of what is contained in these terms and conditions, the person collecting
#
them is still accountable for it.
#
And there's X, Y, Z things you cannot do.
#
Exactly.
#
Yes.
#
And that's where the state comes in.
#
Of course, like you're saying, the state finally holds the pen to the paper and in that sense,
#
the power is a little lopsided, right?
#
Because we want them to do one thing, but they have the ultimate word on the matter,
#
which is why what we're doing in India is really great.
#
There are public consultations that are being held.
#
There is an independent committee that is proposing what the law should look like.
#
And then there's a Supreme Court that has this matter in its mind and that is thinking
#
correctly about this topic.
#
So I think we're at a comfortable place in India, a place that we've never been before,
#
but this moment is also temporary and we need to cash in on it as fast as we can.
#
Right.
#
And we've done a brainstorm on Pragati, which you led, where a bunch of the people who are
stakeholders speaking on behalf of civil society actually wrote long pieces about their vision
of what data protection should look like, from Rahul Matthan to Nikhil Pahwa and Malvika.
#
Before we come back to India, which I want to do at the end of the show in any case,
#
let's talk a little bit about GDPR now, like what was the evolution of the GDPR at around
#
what time did it begin to take shape and what were the different kinds of demands and pressures
#
on it?
#
Europe was thinking about privacy and data protection long before we were, long before
#
even the rest of the world was.
#
And by Europe, you mean EU or do you in general mean different countries within Europe, the
#
EU?
#
Yes, the EU as one collective body.
#
So in 1995, the EU had a directive on data protection.
#
And before that, in the 80s, they had another law talking about privacy.
#
So EU has actually been thinking about this subject for a long time now, 20-25 years.
#
So very proactively and long before the internet reached anywhere near its current extent.
#
And if anything, they've kind of walked alongside the internet, walked alongside technology
#
and tried to make robust laws in that sense, which is how the GDPR was born two years ago.
#
And while it came into effect last Friday, 25th May, it was actually passed two years
#
ago and they gave like a two year window for all companies everywhere in the world to comply
#
with their law.
#
The GDPR is a great framework from the perspective of the kind of rights that an individual has.
#
Right from, you know, asking a company what kind of information they have about her to
#
the right to be forgotten.
#
Everything is there in that law.
#
So in that sense, it's a very ambitious project, and it's great that they've seen it
#
to fruition.
#
But the GDPR also has its drawbacks because it puts a lot of pressure on businesses.
#
So it's like a seesaw where the user was once at the lower end.
#
The user, you know, has gone up now considerably, but the businesses have taken all the burden.
#
Be that as it may, to my mind, the GDPR is a great precedent for us to work off of because
#
of how comprehensive it is.
#
So like I was saying, it talks not only about the rights of every individual.
#
It also says that, you know, if you don't comply with the GDPR, your punishment will be up to 4% of your annual turnover.
For a big company, that might, you know, just be like a pinch in their pockets.
#
But for startups working in Europe, it's death.
#
Even for a big company, 4% of turnover is a hell of a lot because it's not 4% of profits,
#
it's 4% of turnover.
#
Yes.
#
Yes.
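[To make that arithmetic concrete, here is a minimal Python sketch, not from the episode. It assumes the regulation's stated ceiling for the most serious infringements, the higher of EUR 20 million or 4% of worldwide annual turnover, and the company figures are made up purely for illustration.]

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Ceiling for the most serious GDPR infringements:
    the higher of EUR 20 million or 4% of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical figures, purely for illustration.
companies = {
    "big company": {"turnover": 50_000_000_000, "profit": 5_000_000_000},
    "startup":     {"turnover": 2_000_000,      "profit": 200_000},
}

for name, c in companies.items():
    fine = max_gdpr_fine(c["turnover"])
    print(f"{name}: max fine EUR {fine:,.0f}, "
          f"which is {fine / c['profit']:.0%} of annual profit")
# The big company's 4%-of-turnover cap already equals 40% of its profit;
# for the startup, the flat EUR 20 million floor is a large multiple of its profit.
```

[The point of the sketch is simply that a turnover-based cap scales with size while the fixed floor does not, which is why the same rule reads as a pinch for one firm and as death for another.]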
#
So before we proceed, did the GDPR come about more because of pressure from civil society
#
activists as we see in India, for example, or was it a proactive bureaucracy, which in
#
any case was seized with the need to do something about the space, or was it even a lucky consequence
#
of every bureaucracy's tendency to try and regulate everything?
#
That's a very interesting way to put it.
#
While I'm not sure how proactive the civil society was in pushing this law forward, there
#
definitely seems to be a sort of bureaucratic action that was taken over time, which led
#
us to the GDPR.
#
To your third point, yes, I think a lot of it was also serendipity, in a sense, because
#
when Brexit happened, it turns out that Cambridge Analytica did some, you know, a little bit
#
of games over there as well.
#
And over the years, there have been enough data leaks, enough data breaches, enough phishing
#
incidents for governments to sit up and take notice.
#
Not so much for us in India, maybe, but, you know, several incidents do come to mind in
#
the Western world, which might have prompted this to happen.
#
Right, and governments do take threats to their power seriously, which, you know, all
#
of this is.
#
Yes.
#
So what do you feel about the GDPR personally?
#
I mean, I can see that you are glad that it takes the rights of the individual seriously
#
and puts them, you know, firmly at the center of what the law is all about.
#
But on the whole, what are your feelings about the law?
#
My only sort of worry with the GDPR is that it might not be an enduring law.
#
And that's the danger that every law that deals with technology faces.
#
Like I said earlier, because tech sort of leaps into the future, whereas the law sort
#
of crawls behind it.
#
It's very easy for a law to become redundant.
#
The Information Technology Act in India is now, you know, it's there in the background
#
somewhere.
#
What's my worry with the GDPR?
#
So for example, it says that the user has all of these rights and that's great.
#
But it still says that every company has to communicate the terms and conditions to the
#
user and they have to communicate it in simple terms.
#
They can even use pictures.
#
That's a great idea that I will come to later.
#
But the fact is that I as a user, I'm going to get tired of reading that same terms and
#
conditions, reading the same language.
#
As an example, over the last week, we've all received a lot of spam whenever we logged
#
into the internet saying we've changed our privacy policy.
#
How many of us actually read them?
#
So this is the first failing of the GDPR.
#
We are recording this on June 1, by the way, it will release a while later, week and a
#
half later.
#
Okay, great.
#
I hope there'll be more of the spam.
#
The weird thing is, I know that there's been all the spam, but I haven't noticed a single
#
mail which has come to me asking me to reread anything, which is really nice, they don't
#
care about me.
#
So that's one thing and that's the problem because the crux of the issue is not lack
#
of consent.
#
Yes, there is a lack of consent, in the sense that when I am accepting terms and conditions, I don't do it knowingly, because
#
I don't read them.
#
So the problem is consent is just a symptom of the larger problem of fatigue.
#
Even if you use pictures, even if you use a video as your terms and conditions, at some
#
point I will just stop caring because I will be downloading so many apps, I'll be going
#
on so many websites that I'll be sort of bombarded with these things all around and the fatigue
#
will catch up.
#
The term you used for this was consent fatigue.
#
Yes, consent fatigue and GDPR does not solve for consent fatigue.
#
That's the first problem with the law.
#
How could it solve for consent fatigue?
#
By saying that, by shifting the onus from the user to the person collecting the data.
#
So every data collector would then have a list of A, B, C, D, these are the things you need to do, and so it's like a standard: if you exist, you automatically have to fulfill those requirements.
#
Something like an auditor going to a company once a year to check their books of accounts.
#
If you do the same kind of thing with data, if you have a bunch of people who are qualified to be data auditors, then you shift the burden from every user to the company, because the company has to show year on year that it's complying with the law.
#
It's complying with the A, B, C, D things that it can and cannot do.
#
So that's one way to solve for consent fatigue, there are many, but that's the first that
#
comes to mind.
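[As a purely hypothetical sketch of what shifting the onus from the user to the data collector could look like, the Python below models a yearly audit against a fixed baseline of obligations. The obligation names and the DataCollector structure are invented for illustration; they are not taken from the GDPR, the Srikrishna committee, or the episode.]

```python
from dataclasses import dataclass, field

# Invented baseline obligations that a data protection law might impose on
# every data collector, checked by an auditor instead of relying on each
# user reading terms and conditions.
BASELINE_OBLIGATIONS = [
    "collects_only_data_needed_for_stated_purpose",
    "deletes_data_on_user_request",
    "reports_breaches_within_deadline",
    "names_an_accountable_officer",
]

@dataclass
class DataCollector:
    name: str
    attestations: dict = field(default_factory=dict)  # obligation -> bool

def annual_audit(collector: DataCollector) -> list:
    """Return the obligations the collector has failed to demonstrate this year."""
    return [ob for ob in BASELINE_OBLIGATIONS
            if not collector.attestations.get(ob, False)]

# Example: a hypothetical app that never named an accountable officer.
app = DataCollector("ExampleApp", {
    "collects_only_data_needed_for_stated_purpose": True,
    "deletes_data_on_user_request": True,
    "reports_breaches_within_deadline": True,
})
print(annual_audit(app))  # ['names_an_accountable_officer']
```

[The design point mirrors the books-of-accounts analogy in the conversation: the burden of proof sits with the company every year, so no individual user has to parse yet another consent screen for these baseline protections.]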
#
To come back to your earlier question actually, the GDPR is a mixed bag because it's the first,
#
you know, in history, when you trace back to how things started, you find that the first
#
act that started an entire momentum had many flaws, but it kind of gave the push that the
#
rest of the world needed.
#
I have a feeling that GDPR will be something like that.
#
That eventually we will get to a very good mean that solves all these problems, but we'll have to work our way to that situation.
#
So I want to delve more deeply into the unseen effects of the GDPR as it now stands.
#
But before we do that, let's take a quick commercial break.
#
So it's been another great week on IVM and we're hoping that you enjoy all of the podcasts
#
that we're being able to get out to you.
#
As always, if you're not following us, please do follow us on IVM podcasts on all the social
#
media platforms.
#
This week on Keeping It Queer, Naveen spoke to Ankit Das Gupta, the social media content
#
manager at Mirror Now.
#
On Who's Your Mommy, Veda discusses mom-bots and the toll a pregnancy can take on women.
#
On Varta Lab, Akash and Naveen exchange stories with boys from the Bombay Hemp Company.
#
On Pragati, Pavan and Hamsini are joined by Dr. Shambhavi Naik to discuss the Nipah virus and the nitty-gritties of this new disease.
#
On Simplified, Naren and Chuck break down the difference between schizophrenia and split
#
personality on a shorty.
#
It's been a really, really great week and I hope that you're going to listen to all
#
of these shows or at least some of them.
#
In the meantime, let me get you on to this one.
#
Welcome back from the commercial break.
#
While you were away, we have collected all your data and hey, good looking.
#
So Manasa, coming back to the GDPR, let's talk about, I mean, the show is called The Seen and the Unseen.
#
So let's talk about some of the unseen effects or rather the unintended consequences of the
#
GDPR as it stands now.
#
Well, the first unintended consequence is what we spoke about before the break.
Consent fatigue.
#
People won't read it anyway.
#
Yes, yes.
#
And, you know, I'm actually interested to know if there are any behavioral economics
#
papers or sort of trivia that prove consent fatigue to be real.
#
And if you do have any examples, do comment below.
#
Do comment below.
#
We don't have comments enabled.
#
I'm sure your colleague, Nidhi, who's an expert in behavioral economics would be able to conduct
#
experiments with those kinds of things.
#
Yes, yes.
#
Nidhi, if you're listening.
#
Well, the second unintended consequence of the GDPR is that I don't know if many startups
#
are going to start up in the EU region anymore because of the high costs of doing it.
#
And while, you know, this feels like a very careless argument to make, why should a business
#
thrive if it does so at the cost of my privacy?
#
And I'm not saying that's the way for it to do, for it to thrive.
#
But the GDPR tilts everything against a startup.
#
It puts unbearable pressure on the startup to comply and eventually just kills that kind of ecosystem over there.
#
The next kind of unintended consequence of the GDPR is how long it will actually last.
So we're already seeing examples of US media outlets stopping serving people in the EU region because of the high costs of complying with the GDPR.
#
I want to know how much further this will deteriorate.
#
And my sense is that in a couple of years, we'll see all of these unintended consequences
#
come to life.
#
And also there's also the broader philosophical question that ultimately every transaction
#
in the marketplace is between two consenting parties.
#
And here, of course, because it's impossible for most users to really consent or even understand what is being proposed, it makes sense to have the GDPR to protect their rights and so on.
#
But if the GDPR goes too far and over regulates, then it actually creates a negative sum game
#
because it's harder for the business to exist and it's harder for the consumer to get the
#
benefit that he would otherwise have gotten out of that business existing.
#
And just in the last week since GDPR has been implemented and we're recording this on June
#
1, a lot of international websites because they cannot comply or don't want to comply
#
have been blocking EU users entirely.
#
For example, you know, there are newspapers like the Chicago Tribune and the Los Angeles
#
Times, there is Unroll.me, Instapaper, which have all blocked EU users, so suddenly that
#
entire group of users doesn't have access to them or they get stripped down versions
#
of the service, which is especially hurtful in the case of NPR, National Public Radio, of which the EU is now getting a stripped down version, not the full version, because they simply can't comply.
#
And a bunch of companies, such as Klout, and various online video games that were operating in the EU have now stopped operating there because of this.
#
All this is an unseen cost.
#
All the value that these companies and services were bringing to users is an unseen cost.
#
But how would you mitigate for that?
#
Like when, you know, you're regulating what is essentially virgin territory, no one's
#
really done that.
#
No one knows the consequences.
#
Where have they gone wrong?
#
So this is something that I read in Justice Srikrishna's white paper, which is our own Indian way of arriving at a data protection law: there are three kinds of regulation.
#
One is self-regulation, which is when you let the market sort of develop its own practices.
#
The second kind is a co-regulatory model, which is you handhold companies, you handhold
#
private players as much as you can.
#
And the law steps in and says, you know, broadly, you can't do these three, four things, but
#
everything else you guys figure out how you want to develop best practices.
#
And the third kind of regulation is full on state.
#
It's called command and control regulation.
#
And the GDPR resembles command and control because there is almost no room for companies
#
and for industry bodies to come together and to develop their own best practices to protect
#
privacy.
#
And that I think is the ultimate, if I had to give like a TLDR of the GDPR and why it
#
won't last, that'll probably be it because it doesn't engage with all of its stakeholders.
#
It doesn't give enough room for them to wiggle and sort of figure out what is most comfortable.
#
It just says, you guys have to do this.
#
These are the rights that every user has.
#
These are your punishments if you go wrong.
#
And I don't care what happens, comply, however big or small you are, comply right now.
#
And you're right in that, that might not be feasible for companies of all sizes.
#
That might not even be feasible for companies like, I don't know if Pokemon Go would still
#
be able to survive in the EU.
#
So I feel like a co-regulatory approach, engaging with the stakeholders before creating a law.
#
Which is the second approach that Justice Srikrishna proposed?
#
Yes, yes.
#
You know, I hope that it's the approach that we take in India, because there is a lot of
#
opinions floating around about what we can do for privacy.
#
And that's the best approach when you're talking technology law, right?
#
Because the government might not know everything there is to know about technology.
#
Businesses might know 30%.
#
Me as a consumer might know 30% and everyone else might know 5%.
#
So you find a way of bringing everyone to the table.
#
So people who favor the command and control approach could say here that, look, you know,
#
people like your Facebooks and your Googles or whatever, and the Ubers, you know, are
#
the people you should be worried about.
#
Why bring them to the table?
#
I mean, their interest is obviously in having as lax regulation as possible, because after
#
all data is where they make their money from.
#
They want to harvest as much data as they can.
#
But these companies are also reliant on people believing in their products.
#
And take Google, for example, there is no sort of motivation for Google to up its privacy
#
policy game in India.
#
But it does a fantastic job of it.
#
Whenever they update their privacy policy, if it's a big enough update, then you as a
#
user of Gmail or whatever, are taken through it in great detail.
#
And it's sort of broken down to you beautifully.
#
That's because Google has a vested interest in simplifying it, right?
#
So that actually then seems to favor the first approach that Justice Srikrishna mentioned, because if the market itself is the best regulator, then it's going to do the best job.
#
Like, even with what happened with when, you know, when the Cambridge Analytica scandal
#
broke, a lot of people reacted with that whole hashtag uninstall Facebook and, you know,
#
so on, which was itself the market sort of striking back.
#
And do you feel that that's not effective on its own and the government does need to
#
step in and there needs to be the second approach?
#
It does just largely for the reason that there is way too much happening right now.
#
And if each player, if each company is given its own leeway to develop it over time,
#
that's great.
#
That's a very democratic way of doing it.
#
But the fact is that we kind of need some level of uniformity at some point, which is
#
why what the law needs to do is it needs to just have a few principles on which it will
#
survive.
#
So for example, let's say accountability is a principle.
#
Then on the basis of that principle, Google can do whatever it wants to make its product
#
better.
#
Uber can do whatever it wants, Facebook and the government can do whatever it wants.
#
There are these minimum benchmarks it has to meet, which are agreed upon by all stakeholders
#
and come through that consultative process.
#
Yeah.
#
So what the law should look to do is just sort of have everyone agree on these
#
frameworks and on these principles and tell them that you can do three, four things.
#
You can collect data for the limited purpose of your app, and you can't, say, collect data of their grandmother and their great-grandmother who passed away before we got freedom.
Before we achieved independence. We never got freedom, we don't, we're all slaves.
What the law should do is just set out these basic principles and the sort of scope of what a company can and cannot do, within which the economy can thrive as it does, within which they can get, like, consent for micro things here and there, you know, additional forms of...
So let's quickly go through the unseen effects of the GDPR once again, and tell me if I missed anything.
#
The first one, of course, is consent fatigue where consent is required.
#
But if somebody has 60 apps on their phone, they're having to give consent 60 times and
#
eventually they just get tired and that consent becomes meaningless per se.
#
The second is that the cost imposed on businesses goes up drastically, and that might stop businesses from existing in the first place where they might otherwise have been, and the unseen
#
effect of that is all the value which would have been brought to the lives of users and
#
citizens through those businesses, which now no longer exists.
#
So what would have been a positive sum game effectively is completely reversed, both parties
#
lose and no one really gains.
#
And the third is because technology is always advancing by leaps and bounds ahead of the
#
law, the law simply can't keep up and therefore it will prove to be inadequate.
#
Is that an accurate summary?
#
Have we left out any unseen effects?
#
That's broadly the accurate summary of the GDPR on its own.
#
But when we look at it from the context of what it means for India, everyone in India
#
today knows about the GDPR or everyone with a smartphone and enough apps on it knows about
#
GDPR.
#
And so, you know, it's in our conscious mind.
#
What is important for us is to distinguish ourselves from this law.
#
It's a great law because it has all these fantastic qualities to it.
#
But it's also a very controlling law in that sense.
#
And we are at a...
#
Give me an example of that if you can.
#
I mean, what do you mean by a controlling law?
#
Like I said, it doesn't allow companies and industries to come up with their own regulatory
#
guidelines or best practices to sort of accommodate to the larger principles of the GDPR.
#
It's also a controlling law in the amount of penalty that looms over every business's head.
#
So in that sense, the GDPR is kind of stringent.
#
And in India, we're at a great point because we don't have to clean up any mess.
#
This is the first time we're going to be writing a data protection law.
#
We don't have a history of it that went wrong and we have to clean that up.
#
So we should be careful to not be too influenced by the GDPR.
#
We should adopt a co-regulatory approach so that we avoid the unintended consequence of
#
over-controlling and becoming the new license Raj.
#
We should also try and solve for consent fatigue and sort of stay away from the approach that
#
the GDPR used.
#
Yes, user consent is absolutely important.
#
And I need to know what they're taking from me when I'm getting the benefit of an app.
#
There is no arguing that.
#
But the idea is to reduce the information gap between the data collector and the data
#
user.
#
And there is more than one way of doing this.
#
The terms and conditions are just one instance that the GDPR looks at.
#
So we need to sort of develop our own flavors of what a data protection law should have
#
to kind of circumvent the unseen effects.
#
And how receptive have governments been to this argument about a, needing a data protection
#
law and the right to privacy being important, and b, to the importance and optimality, if
#
that's a word, of this consultative process where all the stakeholders together can arrive
#
at a framework for the law?
#
So some government wings like the TRAI have shown a lot of interest in this process.
#
And there have been a couple of bills in the parliament over the last two years on a data
#
protection law.
#
While I don't know enough to comment about, you know, whether as a collective body where
#
the government stands, it's very evident that, you know, individually, people representing
#
the government are thinking about these things because they are engaging with their voter
#
bases on social media, and they are also benefiting from these services just as much as we are.
#
So I feel like we're all on about the same page, we're probably in the same chapter.
#
And that's a really good place to be.
#
I mean, it sounds like a miracle for a country that tends to overregulate that you actually
#
have, you know, civil society voices, which are so strong and actually playing a part
#
in drafting these laws.
#
Yeah, and in a very constructive way.
#
I feel like all the ideas that I've heard over the last year of working in this subject
#
have almost all sort of contributed to some larger relevant point.
#
And that just goes to show that we all have a lot to learn from each other.
#
And this is the beauty of the lawmaking process, that you just take everything that benefits
#
you.
#
So Manasa, it was really enlightening talking to you, and no doubt the government will hear
this episode before the listeners of The Seen and the Unseen actually get to do so, in which
#
case I hope they actually get to do so.
#
But I'd like to end with, you know, a couple of last questions I always ask all my guests
#
in the context of whatever subject they're talking about.
#
What makes you despair?
#
And what makes you hopeful about the future of data protection and privacy in India?
#
What makes me hopeful is that everyone is thinking about it.
#
It's relevant for everybody I've met.
#
And there is so much to...
#
There is probably a little bit of selection bias.
#
My paanwala is not thinking about the GDPR, but no, I'm kidding, I get your point.
#
No, I actually disagree.
#
A data protection law is relevant for him as well.
#
It affects everyone, but not everyone is really thinking about it in that sense.
#
But the good part is that even the PLU who are thinking about it are much more vocal
#
than their numbers indicate.
#
So we are making a lot of noise about it.
#
Yes.
#
And, you know, when we're teaching public policy to students, we talk about this beautiful
#
thing called the window of opportunity.
#
When politics and solutions and problems align, they open the window of opportunity.
#
And that's the time that you have to make a law or a policy that will bring about change.
#
I think we're at the point where the window is creaking and we're just opening it up.
#
And it's good that we're all at it.
#
What makes me despair is that our history is not great with making laws for an unknown beast.
#
So I hope we don't fall into the same trap of overregulating.
#
And I hope we sort of give enough room for all stakeholders to wiggle and find their
#
most comfortable way of complying with the law.
#
So that's something that I'm kind of wary about, but I'm largely hopeful, I think.
#
Excellent.
#
On that note of hope that we shall all be allowed to continue to wiggle within the window
#
of opportunity.
#
I can hardly picture it in my head.
#
Thanks a lot for coming on the show.
#
Thank you so much.
#
If you enjoyed listening to this episode, hop on over to seenunseen.in for the archives of The Seen and the Unseen.
#
I have recorded an episode in the past with Manasa on prostitution.
#
Do check that out.
#
And you can follow Manasa on Twitter at nasac, N-A-S-A-C.
#
You can follow me at amitvarma, A-M-I-T-V-A-R-M-A.
#
Thank you for listening.
#
There she stands, a podcast addict, outside the bank, having travelled several miles to
#
get in with other poor souls like her.
#
The journey, though daunting for this youngling, will have some comfort because she has downloaded
#
her favourite podcast.
#
You can see more of her species on ivmpodcasts.com, your one stop destination where you can check
#
out the coolest Indian podcasts.
#
Happy listening.