Privacy Within the Digital Ecosystem - An Interview with Pam Dixon of World Privacy Forum


Interview with Pam Dixon of World Privacy Forum:

Cyber Security Dispatch: Season 2, Episode 05

Show Notes:
On today’s episode, we are joined by Pam Dixon, Executive Director of World Privacy Forum. As privacy is one of the most current issues, even outside of the cyber security industry, we are very excited to have Pam with us to discuss her work in the field. We chat about what the privacy landscape looks like at present, which of course leads into the topic of GDPR, which our guest unpacks for us a bit. Pam views privacy as an issue that falls under the banner of more general human rights, and she explains that absolute privacy for its own sake should not be the goal. We then go on to talk about the role of companies and organizations in determining the development of the policies that we shall see, both created and then implemented. Our guest suggests that we should all be approaching the issue of privacy in an affirmative and constructive manner in order to build the future we really desire.

Key Points From This Episode:

  • The current privacy landscape and an introduction to GDPR.
  • Unpacking GDPR and what it will mean.
  • The future of terms, conditions and consent forms.
  • Locating the issue of privacy within a larger context of human rights.
  • The privacy issue and the distance it has to go to catch up with other social concerns.
  • The role of industry in the progress of the privacy issue.
  • Imagining an affirmative, multifaceted approach towards privacy.
  • Privacy’s relationship to identity and data.
  • The evolution of the rules of the privacy game.
  • The important decision we all have to make with regards to privacy.
  • And much more!

Links Mentioned in Today’s Episode:
World Privacy Forum —
Pam Dixon —
Pam Dixon on Twitter —
General Data Protection Regulation —
Facebook —
Cambridge Analytica —
Aadhaar —
A Failure To Do No Harm —
Estonian ID —
Acxiom —
DuckDuckGo —
Instagram —
Snapchat —
The Scoring of America Report —
Apple Pay —

Welcome to another edition of Cyber Security Dispatch, this is your host Andy Anderson. In this episode, Privacy Within the Digital Ecosystem, we talk with Pam Dixon, Executive Director of the World Privacy Forum, a public interest research group devoted to privacy. She shares her thoughts on the upcoming implementation of the General Data Protection Regulation (GDPR), and what the future of privacy looks like.

[0:00:30.5] Pam Dixon: My name is Pam Dixon. I’m the Executive Director of the World Privacy Forum. We’re a nonprofit, public interest research group, we focus on privacy and related issues.
[0:00:40.5] Andy Anderson: Great. Privacy has been so in the news in the last couple of weeks, particularly with the Facebook and Cambridge Analytica issue coming to light. What is top of mind when you think of that issue?
[0:00:53.8] PD: So many things. We are living in an incredibly important time. It’s a watershed moment. All of us who are here right now as adults, really thinking about these issues - or not thinking, just participating - all of us who are in the digital ecosystem right now are experiencing the growing pains of that digital ecosystem.
Right now, especially in the US, we’re in an environment where a lot of the rules around privacy don’t actually give us privacy. They kind of give us some aspects of certain things but it’s not necessarily the ability to control our digital exhaust. When I talk to people, they really want that among other things.
We can get hurt by a lot of data that’s out there about us that we really don’t have control over. For example, whenever we use our credit or debit card to make purchases, the purchase history is sold, and then some data broker somewhere decides what kind of consumer we are and whether or not we deserve a certain type of offer - maybe to a good college or not - or maybe a health plan decides, “Maybe we should charge them more because they’re buying a certain type of food,” or whatnot. This really happens. Our transactional life and the data crumbs that we leave have a genuine marketplace impact on our lives.
It’s not disputable anymore - I think we all know that this is happening. What does that mean in today’s world and for all of us who are living in it? Well, what that means right now is we just don’t have the tools and the rights that we need, right now, to effectuate privacy or autonomy on our own behalf - not quite yet.
I do believe, though, that in Europe, Europeans have a lot more rights, and I’m wondering if the European law that’s going into effect in May of 2018 - the GDPR, the General Data Protection Regulation - will change the landscape to some degree. It’s going to be interesting.
[0:02:51.5] AA: Yeah, walk me through - I mean, you know, I’ve talked about GDPR with a lot of our guests because I think it’s probably the biggest iceberg that’s approaching for many companies when they think about compliance, when they think about security, when they think about data overall - they are sort of like three legs of a stool.
We have really focused on the right to be forgotten and then the other piece which is the sort of data breach notification. It sounds like there are other pieces that are sort of more your focus and sort of thinking about in GDPR. So walk us through those.
[0:03:23.3] PD: GDPR - those are important aspects, but I actually am very torn about the right to be forgotten. I worry about it, I have to tell you. I think it was helpful for privacy, but it’s really bad for the internet. I do have questions about it because it worries me. I don’t like history getting scrubbed - I’m all for keeping the history - but it’s a very controversial thing. All right, I won’t go there.
GDPR - there are elements of GDPR that are incredibly important. One: let’s talk about the sensitive information categories. In GDPR, certain information is considered to be sensitive, and it gets extra processing attention - you have to get different and more robust consent to process that information. Biometric information is part of that. That is so important. Now, of course there are exemptions, but if someone is going to collect your biometric, guess what? You are going to know about it, and I think that’s a good thing. We are moving into what I call a strong identity world, where the requirements to prove our identity will be heightened, so biometrics are very appealing to, for example, governments for biometric IDs and whatnot. I lived in India for a year and studied India’s Aadhaar biometric ID system. I wrote a really intensive scholarly journal article on it; it was published by Springer Nature last year. The title of the article is ‘A Failure To Do No Harm - India’s Aadhaar System’, and that should tell you something.
In fact, I’m sure I was the only western NGO (Non-Governmental Organization) in India in 2010 when Aadhaar started, and I went in 2010, 2011, 2012, and then 2014 to study Aadhaar. It was absolutely amazing - it taught me so much about biometrics and identity and rights. You know, biometrics deserve to be a sensitive information category, and I think Europe did the right thing - I think that’s going to change and help Europeans.
Another really important thing the Europeans did was make it impermissible - actually, outright illegal - to do automated profiling on a person without consent and knowledge, so data brokers will be out of business in Europe. For example, Acxiom could get a list of people who subscribed to a diabetic magazine, infer that they had diabetes, and then sell that list to a third party unknown to you - that won’t be happening in Europe.
[0:05:48.5] AA: I mean, you know, as I think about that, there are the companies that we don’t think about, maybe Acxiom or the other data brokers, but essentially Google or Facebook or Twitter or any of the major social media - they’re incredible data profiling companies. Would they also not be able to do that?
I mean, I assume that they’ll just add it to their terms and conditions that as a user, you’re agreeing to that, right?
[0:06:11.1] PD: Actually, I think it’s more robust than that. I don’t view those companies as data brokers but I do think that absolutely, everyone is going to have to get consent for things. I mean, for example, for our website as an NGO, we’re going to have to put up a new consent for cookies.
We’re racing to get that done. I mean, look, you’re not going to be able to do things in an unconsented way, and the consent has specific requirements - it’s not going to be buried in a terms of service at all; it’s going to be in-your-face consent, and I actually don’t think that’s terrible.
Watch, everything is going to be consented.
[0:06:46.6] AA: Yeah, I’m curious. When you think of it, it’s just such an asymmetrical balance of power between an individual, or even groups of individuals, and one of these incredibly large corporations which have data on entire populations of individuals, right?
How do you think about balancing that scale, and about privacy? Because with algorithms and massive amounts of data, they can essentially wash over your ability to hide yourself, to remain private.
[0:07:15.8] PD: Right. The way I conceptualize privacy: privacy is a subset of the broader human rights - of autonomy and human freedoms and freedom of thought and freedom of expression and fairness - I view it as a subset. When I talk about privacy, I’m really referring to the broader issues, especially human autonomy and human dignity.
These kinds of things - we should be able to determine our path in life. Now, isn’t that American? Look, if there’s any American dream, that’s it. Privacy is not really the right to be kind of hidden away - that is a subset of the definition of privacy - but today, in the digital ecosystem, it’s really about, “Okay, who is holding my data? What rights do I have to see it? To correct it, to delete it, to manage it, to take it with me? What rights do I have to prevent its sale to third parties? Or what rights do I have to prevent its use in meaningful marketplace decisions about me that could impact my opportunities or finances in the future?”
That to me is what it’s all about. You're absolutely right - right now, there is tremendous asymmetry. Because we have been in this growing digital ecosystem, there has been such a data grab, and there has been very ineffective resistance against that.
The GDPR, I think, was the first big hammer that said, “Okay, we’re going to find a little bit of balance here.” The pendulum has to swing a little further the other way. I think that is a first step, and as time goes on, we’ll find that there’s going to have to be a middle ground. It can’t be that we get everything we want, but it can’t be that companies get everything they want either.
I mean, if you could imagine the HIPAA fight. Health information privacy was only passed in the 90s. Can you imagine if your doctor had the right to sell your medical record to anyone who wanted to buy it? I mean, we’re lucky we got that through when we did.
We’re really facing the same thing now. We need new protections and I think that the protections have to be smart and have to be born out of a new way of thinking. I don’t think we can just simply apply all these old laws and say, “Look, let’s just think the way we’ve always thought, and try to create new rules out of old thoughts.”
I think we need to have fresh ideas. I think industry needs to come to the center. I think we need to have much more temperate discussions.
[0:09:52.4] AA: Yeah, you know, unfortunately, given the congressional and legal environment that we’re in right now, the likelihood of Congress passing anything meaningful that’s not controlled by special interests is low - it’s sad that we have to look to Europe to do it, right?
[0:10:08.9] PD: It is.
[0:10:10.0] AA: I don’t see that coming out of Congress, and I am not a crazy liberal person - I’m more of a pragmatist. It’s just that the complexity of these issues is such that I don’t think your everyday voter is able to influence them - it’s so much easier for special interests to make sure that the changes that happen are ones that benefit the companies and the groups that make billions of dollars. Plus, you have issues around national security and whatnot. You have multiple interests winding up on the side of reducing privacy, and maybe your organization, the ACLU, and a few others on the other side.
[0:10:49.0] PD: I think you’re right. I have to tell you, I am pleased that Congress probably won’t do anything, because I would be concerned about what it would look like right now. I don’t think that there has been appropriate compromise and discussion. I mean that in a good way, not in the we’re-making-sausage kind of way. Let me explain what I mean.
In the environmental movement, there has been real progress made in a lot of ways. Now, I do think environmentalists are a bit on their heels right now but just apart from that. I do think that there has been a very broad understanding that: look, we have environmental concerns, we need to address environmental concerns. I think that there is a broad consensus about that.
I do think you find industries such as carpet manufacturers and what not coming to a very good consensus place where they’re still having their businesses but they’re moving to a zero carbon footprint, et cetera.
We don’t have that quite yet in privacy. With the exception of a few companies that are working really hard to be trustworthy and –
[0:11:49.8] AA: Who would you put in that category –
[0:11:51.6] PD: I’ll give you a great example: DuckDuckGo. DuckDuckGo contacted World Privacy Forum about, let’s see, a year and a half ago now. In that conversation they said, “Look, we want to be a trusted company and we want to do everything we can to be a trusted company, because we want the digital ecosystem to be vibrant - we want to assist with that.”

[The internet privacy movement has] come to that point - we are at that same point that the environmental movement got to where it is like you can’t just war with people you disagree with. You’ve got to find a way to work together and it can’t be disingenuous.
— Pam Dixon

They supported us and funded us to write a new guide for HIPAA and a parents’ guide to privacy, which we’re working on. That approach - we want to become a trusted company - I thought was really interesting. I think that’s the right approach, and we have to start having these conversations, and they have to be meaningful.
One of the things that we really have to do is envision a thought experiment where we don’t have regulation. How do you protect privacy in an environment where there’s no regulation? The answer is, you have to have companies who care; so will good industry please raise their hands now, and let’s work together, right? I think that’s going to have to happen.
We have to learn to talk to each other. We’ve come to that point - we are at that same point that the environmental movement got to where it is like you can’t just war with people you disagree with. You’ve got to find a way to work together and it can’t be disingenuous.
[0:13:13.5] AA: Yeah and I mean I think clearly Facebook is getting particularly tarred right now because they’ve been probably one of the worst actors there, right? And I think –
[0:13:23.6] PD: They deserve a lot of criticism right now.
[0:13:24.9] AA: Yeah, I would agree, right? And I am hopeful, because you can already see it in Facebook’s numbers - you saw the stock market this week, they got hammered. And I think if Facebook didn’t have Instagram - that’s the other thing - if you really want to leave Facebook, you’ve got to leave Instagram as well, which I think is, at least for my generation, a lot harder.
But I have not seen Snapchat or the other competing products really step up and say, “Hey, we have actually thought about privacy in a much more fundamental way. We have taken these ideas to our core,” because I think they are afraid. I think they are afraid of the hit on revenue.
[0:14:03.1] PD: You know, I think you are right, but I would also submit this idea to you. There has been a focus on privacy that is not quite the right idea of privacy. It is almost like the conversation is over way off here, in the woods, but really, we need a different conversation. Instead of saying what we hate and what we don’t like, why can’t we have an affirmative vision for what we do want?
So here is my affirmative vision for what I want. I want companies to have a general understanding of data governance and knowledge governance and information governance that establishes rules of fairness - rules of generally accepted use of data - that really establishes meaningful boundaries for when and how you shouldn’t use data. This to me is where real privacy happens - it’s in the trenches. I’ve been all over the world doing this privacy work, and I’ve learned that there are good companies out there - there are really good people out there trying to do the right thing. It is always the people in the trenches who come up with the ideas of how to actually protect privacy - really clever ideas about, “Okay, here is how we can do this and here is how we could do that.” One of the projects I am working on now is to figure out: what are these people doing? When you boil down all that they’re doing, what is it? We can pass all the laws in the world, but if people don’t listen to the laws, and if the laws don’t actually solve the problem, then we are no better off.

One of the examples that just drives me batty: we wrote a report in 2014 called “The Scoring of America,” and we spent seven years researching that report, intensively in the last year and a half. One of the reasons it took us so long is that we were doing the report on predictive analytics - how people are categorized and what that does to them in their real lives. We found a lot of health scores and whatnot, but finally we were finding enough research - it took time for the market to mature - and in our documentation, we found that a major health plan in the United States hired a major analytics company to do a 1,200-factor analysis of purchase patterns and other data that they procured from a variety of sources, we’ll just put it that way.
They did the analysis and found the top 25 most predictive factors. One was smoking - no surprise, right? Of course there was obesity; there were all the ones you would expect. But then there was the new analysis. Another very predictive factor in the top 25: how much you spend on online clothing purchases - super predictive of poor health. It wasn’t good for health, but it predicted your health.
Then another one was how much camping gear you’ve bought, which was predictive of good health. So tell everyone, “Go buy some camping gear, but make sure it’s on your credit card. If there is a loyalty card, use it.” You could gamify this system a little bit. And I thought to myself, this is just exactly what we are trying to prevent. When we go shopping - personally, I don’t know a lot of people who pay cash anymore. I really don’t; for all sorts of reasons it is very helpful to keep track of things - Apple Pay is very helpful, and others. Anyhow, very few people are going to whip out $500 for a $500 purchase. So since that is not happening, our purchases can be used to help decide how good or bad we are, or profitable or not profitable, or healthy or not healthy, and you know what? I am not so sure about that. So what rules would touch that, in all the different permutations, in all of the different settings? I don’t know that there are any.
So we have to find solutions that are going to work in very nuanced ways and - you know I like to say that the command and control regulatory era is over and I also like to say that there is not one single, giant, perfect solution that is going to fix privacy.
I think we need a lot of multifactorial solutions that attack the problem from all sorts of different angles and we need to implement all of them and really work at that instead of just going, “Okay we hate this.” Let’s figure out what works. Let’s do that.
[0:18:09.4] AA: Yeah, and I think it’s funny. Privacy, as we have talked about, is sort of about keeping things hidden or concealed, right? Like your personal, individual information and other factors as well. But I think in some ways it’s about transparency: we want less transparency, necessarily, into what an individual is doing, but we want more transparency on what organizations are doing - because that’s where I think it’s fundamentally problematic.
It’s that you’re not understanding what they are doing with that data, or you don’t have a full understanding of how broadly they are using it or where it’s going, right?

You know, people don’t quite see that risk, but now, with biometrics and identity - here is what I will say: I think identity is absolutely the linchpin to privacy - and by privacy I mean privacy as a subset of broader issues like human autonomy, freedom of thought, and whatnot - so identity is a complete linchpin, and it’s not going to be possible in the future world for us to simply hide our identity. I know that there’s all sorts of discussion of hyperledgers and blockchain identity, but at the end of the day, this world that we live in, with its risks and whatnot, is going to require a lot of identity. So that means we have to have agency - agency with our identity and our data. What does that look like in the digital ecosystem? That’s what we have to figure out.
— Pam Dixon

[0:18:43.7] PD: Yeah, you are exactly right. You articulated that beautifully.
You know, for me, the linchpin of understanding identity came with the identity theft crisis in the 1990s, when the first hearings were being held about identity theft. I thought, “You know what? We’ve got a two-edged sword here.” Here was identity being used for fraud, which meant we were going to have to give more identity, and I kept warning all the privacy advocates to be careful. Be careful here - make sure you put in protections around asking for identity.
And it was a little early. You know, people don’t quite see that risk, but now, with biometrics and identity - here is what I will say: I think identity is absolutely the linchpin to privacy - and by privacy I mean privacy as a subset of broader issues like human autonomy, freedom of thought, and whatnot - so identity is a complete linchpin, and it’s not going to be possible in the future world for us to simply hide our identity. I know that there’s all sorts of discussion of hyperledgers and blockchain identity, but at the end of the day, this world that we live in, with its risks and whatnot, is going to require a lot of identity. So that means we have to have agency - agency with our identity and our data. What does that look like in the digital ecosystem? That’s what we have to figure out.
[0:20:02.7] AA: Yeah, you reminded me that I talked with one of the leaders of the digital infrastructure in Estonia, right? I don’t know - do you know Tamir?
[0:20:13.0] PD: I don’t know him personally.
[0:20:13.8] AA: But it was interesting. I was sort of calling it a postcard from the future, because basically, since the late 90s and early 2000s, they have brought themselves to a fully digital economy, and they have one of these incredibly interesting things where you have essentially reversed Big Brother, right? You can see all of the uses of your data - where the government, at least, has data and how they are using it - and I wonder, do you see something like that coming out of GDPR or other kind of –
Maybe it is not a legislative kind of solution but basically individuals start saying, “I want to understand the way that you are using it or I am no longer going to involve you and no longer use you as a company or involving you”?
[0:20:57.9] PD: I think that day is coming, and companies need to be prepared for it. I do see that coming. Estonia is an incredible use case. I actually wrote about Estonia and India’s Aadhaar biometric system and compared them. Estonia has a remarkable system - you and I could both sign up for an Estonian ID; it is a global digital ID, and it is absolutely fascinating. You know, they are part of Europe, so they fall under European privacy rules. They really did it right.
It is so important to study, so I actually did a comparison. That was hard to write, but it was good research. So I believe in this issue of: who is using my data - and, oh, by the way, who did you sell it to, and how do they use it? If we knew that, and if companies disclosed that, we would have a different world, and I think it would be a better world.
[0:21:44.4] AA: Yeah, as we think about the right to be forgotten and GDPR, I think it is really interesting. One of the nightmare scenarios for companies is that they’ll get essentially a “right to be forgotten storm,” where a ton of individuals will basically start asking to be forgotten from their databases, right?
It will create a huge amount of work for these companies, but it will also fundamentally mess with the value of their databases.
[0:22:12.8] PD: So this is where the value of being a trusted company comes in because if you can show users that you are not selling their data to third parties, I think that would make a big difference.
[0:22:22.0] AA: Yeah, I am with you, although - maybe it is the pragmatist in me - there is the carrot, but companies sadly often aren’t going to move unless they’re forced to. How do you create a sense of a stick for them too?
[0:22:36.5] PD: I think that Europe has the stick. The United States doesn’t have a stick right now. I am very interested in what happens with Facebook - I don’t know what’s going to happen with Facebook in the US. In Europe, I think it will be very complex and lengthy. I know that the FTC (Federal Trade Commission) has opened a Facebook investigation.
But what does this all mean for us in the long term? I think it is going to be very important to write that story and to see how it turns out. But we need to influence the ending of that story and the outcome.
[0:23:03.3] AA: From the people that I talk to - CISOs (Chief Information Security Officers), CIOs (Chief Information Officers), etcetera - they effectively see GDPR as the rules of the game, right? Because it is virtually impossible to know whether an individual is a European citizen or not, and the potential penalties are so large, you essentially have to play by GDPR rules.
Now, there are some other places - I would like to get your thoughts on this - where essentially GDPR and US law or other legislative frameworks are in direct conflict, right?
So you either follow GDPR or you violate US law, and essentially we are waiting for legislation or case law to resolve those conflicts. How do you see it - do you predict that GDPR will run the table as the rules of the game, or what’s the game that you sort of-?
[0:23:52.3] PD: I don’t know. It is hard to tell. I was just at a Berkeley Law privacy forum conference, and a couple of people got up and said, “You know, GDPR has been overhyped.” I thought that was an interesting take on it, and I do think that there are going to be compliance issues. We have to see what happens in May and what actions the data protection authorities take, but here’s the complicating thing about GDPR.
GDPR allows for multiple member state-level regulations, so we may see very strong state-level regulations coming out in certain places.
Let me put it this way: GDPR is the starting gun, and I don’t know where it will lead. I don’t, but I think we are approaching a new era, and the Facebook data debacle could not have come at a worse time for Facebook and for data use in general. It’s provoked so much discussion.
[0:24:46.5] AA: Well this was great. Thank you so much.
[0:24:48.6] PD: Thank you.
[0:24:49.3] AA: Such a great, wide-ranging discussion. Any last parting thoughts before you go?
[0:24:55.0] PD: Yeah. A lot of the people who call World Privacy Forum feel like they need to give up, like they just can’t bother with privacy anymore, because they feel discouraged about it. We have to decide what we are going to affirmatively get done. What do you want to have protected, and how do you fight for it? That’s what we have to do.
[0:25:14.3] AA: This was awesome, thank you so much.
[0:25:15.6] PD: Thank you.