
Alan Chappell
Welcome to the Monopoly Report. The Monopoly Report is dedicated to chronicling and analyzing the impact of antitrust and other regulations on the global advertising economy. If you are new to the Monopoly Report, you can subscribe to our weekly newsletter at Monopoly Market, and you can check out all the Monopoly Report podcasts at monopolyreportpod.com. I'm Alan Chappell. This week my guests are Jules Polonetsky, who has been CEO of the Future of Privacy Forum since 2008. It's a Washington, D.C. based think tank that seeks to advance responsible data practices in the digital age. And that bio really undersells, I think, Jules' influence within the privacy community. He is one of the rare, I would go so far as to say non-political, people who is just out to find practical solutions to some very thorny challenges. Also joining me today is Doug Miller. Doug has been in the privacy space as long as anybody. He was AOL's first full-time privacy pro back in 1998, and he spent years running AOL's and Yahoo's privacy programs. In addition to being a senior fellow at the Future of Privacy Forum, Doug now has his own privacy career coaching practice. So we're going to be talking about data clean rooms today, and I couldn't be more excited, because these guys are really sharp on these types of things. Welcome to the pod, Jules and Doug. It's great to have you both on.
Doug Miller
Thanks for having me.
Jules Polonetsky
Great to be here.
Alan Chappell
So not many people know this, but Jules and I used to work together at DoubleClick way back in the day, and I didn't have a lot of direct interactions with Jules. Jules was then the chief privacy officer and I was doing sales. But there's a story where we had an indirect interaction, when I was trying to close a big deal. And I remember my sales manager taking me aside and saying, hey, if those privacy people get in the way of you closing your deal, just let me know and I'll figure it out. Now, that was maybe nine months after the DoubleClick Abacus thing had sort of blown up. But it kind of struck me as I was thinking about this discussion, because that sort of encapsulates one of the key challenges that I think privacy folks in the ad space are still dealing with to this day. You know, guys, we're seeing a lot of pressure on the ad space to shift to more privacy-compliant solutions. One of those is, you know, the data clean room, and you guys at the Future of Privacy Forum have put together an overview. And so I want to just throw this out there: give us a working definition. What is a data clean room, and what specific problem or problems do they solve?
Jules Polonetsky
Well, I'll let Doug jump in, but let me just step back with some framing. You know, one of the challenges that ad tech has had is we love putting labels on things, and then everyone marches into that thing, and then the rest of the world is like, what exactly is going on here? When what may be going on simply is we're measuring what we're doing, we're targeting. But every nuance and every business model kind of becomes a product and a label. And you know, when I started out with you back at DoubleClick, I don't think anyone was using the word clean room unless they, you know, meant maybe making an Intel chip or something. Right? But people were sharing data or avoiding sharing data, but combining, you know, data to use it across customers or clients while being worried that I don't want to give my client my entire database. Or maybe I said I don't share personal information, and so I do want to match up and bump two data sets and learn something, but I don't have the rights to go buy it and sell it and swap it the way I might in some of my business models. And we didn't call it a clean room, right? We said, I need to make sure that I don't share personal information with this third party, maybe because of confidentiality and I just don't want them having all my data, or security, or maybe it might violate a law or a privacy policy. So there were all sorts of techniques, you know, that you did. Maybe you did it in house, maybe there was a vendor in the middle, maybe one of the data companies, you know, who was perhaps managing your data for you, or maybe even monetizing and marketing your data for you, facilitated this. Hey, I'm holding your data, I'm holding these other folks' data. Maybe you want to do some business together, and I'll charge you a little bit of a premium there, and you'll both be happy because you'll learn something more about each other's customers, right?
So I feel like we suddenly threw this big title over it now because there's been bigger demand and more regulatory pressure and so forth, but as a result, maybe lost what actually is the big question, right? How can I learn from data when there might be restrictions on simply combining those two data sets? Maybe I hold the data, but it's sensitive and I have some restrictions. Maybe I can't move it around the world. Maybe it's my partners' and clients'. And we really should be focusing, and maybe the nomenclature should focus, on what are you doing, right? And researchers outside of the ad tech world, widespread throughout government, throughout research, throughout every sector, are doing this today. And they're not calling it, you know, some bell and whistle, right? They're working to, in a privacy-preserving manner, join or analyze or learn things from data sets. So that's what a clean room is. It's a big fancy marketing term for a business model built around a set of solutions and techniques, but one that has perhaps lost, therefore, the actual transparency and understanding of what are you doing, why are you doing it, and what are the legal consequences of what you're doing. Doug, would you add anything to that?
Doug Miller
Well, I second that emotion. I think debating how to define clean rooms is probably not a very good use of our time. I would imagine people are interested in that only in the vain hope that they'll be able to define themselves out of scope. But I think you're right that we should look at the actual behaviors and do the thing that privacy professionals do, and think of all the compliance measures that you would put in place to do anything. Now, last September, FPF came out with a nifty primer on clean rooms. It was authored by Aaron Massey. It was called Data Clean Rooms: A Taxonomy and Technical Primer, and you can find it at fpf.org on the ad tech page. And there we defined it as a collaboration environment where two or more companies, or their partners, can perform data analysis on collective data sets and choose what they reveal to one another, which is a fancy way of saying largely what Jules just said. And we pointed out, too, that it's used for medical research and academic research and for government uses and for ads. And the ad uses are usually to answer the Wanamaker question of, can you figure out how effective your ad campaigns have been? And so I think that's what's been going on for a long time. And I'll just point out, we always talk about, for privacy teams, the challenge of explaining your value proposition to the powers that be where you work, and how important it can be to align privacy objectives with what the company actually cares about. Clean rooms are right in that sweet spot, it seems to me, where you're going to use clean rooms to do things like figure out how your ad offerings are actually working and also protect your proprietary data. But it aligns perfectly with exactly what the Federal Trade Commission is saying in their blog: that, you know, there are certain constraints that you've got to put on here, so think carefully about what they are. It's a nice issue, I think, for attracting attention for privacy professionals.
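To make that definition concrete, the simplest clean-room-style computation is an overlap, or match-rate, measurement: each side contributes only pseudonymized identifiers, and the only thing revealed is an aggregate. Here is a toy Python sketch of that idea. Everything in it (the function names, the salted-hash approach, the sample addresses) is invented for illustration, not anything described in the episode, and as discussed later in the conversation, salted hashing alone is a weak protection, since any party holding the salt can brute-force common identifiers. Real clean rooms layer on much stronger techniques.

```python
import hashlib

def pseudonymize(emails, salt):
    # Salted SHA-256 so neither side hands over raw identifiers.
    # Illustrative only: a shared salt is brute-forceable, so real
    # deployments use stronger PETs than plain hashing.
    return {hashlib.sha256((salt + e.strip().lower()).encode()).hexdigest()
            for e in emails}

def overlap_rate(advertiser_emails, partner_emails, salt):
    """Reveal only an aggregate: what fraction of the advertiser's
    customers also appear in the partner's data set."""
    a = pseudonymize(advertiser_emails, salt)
    b = pseudonymize(partner_emails, salt)
    return len(a & b) / len(a) if a else 0.0

print(overlap_rate(["a@example.com", "b@example.com"],
                   ["B@example.com", "c@example.com"],
                   salt="demo"))  # prints 0.5
```

The point of the sketch is the shape of the exchange: raw customer lists never change hands, and the output is a single ratio rather than row-level matches.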
Alan Chappell
This is all great, but really concisely, what is the problem that they're seeking to solve with these things? Because my sense is that, depending on who you ask within the ads community, you're probably going to get a very different answer. I mean, is this a targeting solution? Is this something to help with attribution and measurement? Is it all three? Is it a secret room, you know, a place where the rules of privacy don't really exist anymore? Like, what is it?
Doug Miller
Well, measurement and attribution, of course, because you'll have an advertiser with their data, you find somebody who has purchase data, and you can figure out how the ads are going. So that's great. You can identify audiences for an ad campaign, so there's targeting uses as well. You can simply learn more about your customers by doing market research. So there's all kinds of uses that can help your business objectives. I think a real basic, fundamental step in thinking about this for companies: sit down and actually think about what your marketing objectives are, your business objectives, so that you're planning your privacy strategy accordingly.
Jules Polonetsky
But again, I'd argue that it's the reverse, and it's flipping that question on its head that maybe has created some of the confusion. Any sophisticated business has a lot of use cases where different data sets need to be analyzed together. And in many cases, no big deal. That's a data set here, that's a data set there. I hold it. Nobody tells me I can't link it together. Well, guess what: data has become, in the last 20, 30 years, a highly regulated sector. What did you tell the user when you collected data set A, right? Even quote, unquote, first-party data, zero-party data, I don't know what kind of label you want to put on it. Right? I collected data for one purpose here, and maybe I now collect data for another purpose there. Can I necessarily mash them up together and go do some analysis? Maybe, maybe not, right? Maybe I need some sort of technical step in between, because maybe I'm in Europe and I specified only a certain purpose, and now I have a different purpose. Well, can I now go ahead and do that secondary purpose, or use that data that was restricted for whatever set of reasons? Maybe this is a health data set under one of the new state health laws, even though it's not medical data, but maybe it's data that can infer health conditions, this and this and that, but you want to do some sort of analysis on it. Let's take Maryland, the new state law that has just kicked in, where there's this very strict data minimization. Like, you can't do anything other than what the user reasonably expected you were going to do when they gave it to you, and nothing more. Right? Like, you can't even ask them for more permission. Well, maybe you do want to do something a bit more, but Maryland law, assuming they're a Maryland consumer, has this very severe restriction. So perhaps if you're de-identifying the data in your process, in your joinder, in your analysis across data sets, well, now maybe you're not in that law.
So I'd say there are hundreds and hundreds of interesting use cases, and some, again, may be low risk. No problem. Just do the math. And some of them, maybe this is actually a third party's data. Holy cow, we can't do it. Is there a technical solution that lets me, maybe not get the maximum benefit of just mashing these things up, but do an analysis in some way that is going to ensure that I haven't sold the data, given that so many of the state laws, you know, have restrictions around sale of data? Or maybe it needs to be consented, or, you know, all of the other issues.
Alan Chappell
Well, and that raises a whole other set of interesting questions, Jules. Like, is the data going into a clean room, is that a sale of data under, say, California law? Is it processing data under EU data protection law? And does it depend on the use case?
Jules Polonetsky
Yeah. So our goal, if we are going to be able to do this, and we're looking at sort of building out a little bit of a best practices, is: don't say the words clean room. Here's what legal challenge you have; here are the techniques that are suitable for that particular legal challenge. Right? And it may be that the big challenge in the US is dealing with sale. Maybe it's not. Right? A lot of companies have already put up their little links and they say, do not sell my data. Here's the odd part of the California law, which gets the sponsor of it steaming when I lay this out. I say, years ago, almost every major Fortune 500 company had a privacy policy that said, we do not sell your personal information. Now, yeah, they had some ad tech, and maybe there were, you know, targeted ads and this and that, but your actual personal information? Most of them thought, hey, people aren't going to like that, so we're going to say that, and that's the norm. Because the California law was drafted with such awkward nuance, it's, you know, almost irresponsible now to say I don't sell your personal information, because you're going to end up getting caught because you've got ad tech partners. Almost every major company now says, at the bottom of almost every website, we sell or share your personal information, go click here, blah, blah, blah, if you want to tell us not to. Now, that doesn't mean it's now a free-for-all, but boy, privacy leads like me, you know, Doug and I worked together for many, many years at AOL and so forth, we were able to block all kinds of stuff, or say, no, you've got to really do a lot of technical stuff here if you're going to link that data up, because we have said we do not sell or share your personal information. Now they do. And so, wow: California and the states that followed it perversely have flipped the default, and industry now promises that they sell your data.
Now, from the business side, this is still a complication because there may be opt outs that you now have to process, there may be clients that didn't say that. So you still have to jump through a lot of hoops. But I think that's the path forward.
Alan Chappell
So there's a whole bunch in there, Jules.
But there's one thing where maybe I feel like I buried the lede a little bit. I think you referenced FPF perhaps getting more involved in helping companies to ascertain whether data going into or out of a clean room would be a sale or a share, or even processing of data. Am I understanding correctly that you guys are starting to head in that direction?
Jules Polonetsky
Yeah. So here's what we're looking at, and we've had a lot of conversations with regulators around the world to sort of understand what they think about when they think about de-identification. Right? Because this is probably, in large part, a good deal of the question here. And you know, de-identification ends up being one of these just really challenging issues that ends up being looked at as black or white. You know, many years ago we did work trying to illustrate the spectrum of de-identification. Right? And, you know, you can have maximum de-identification. I can add up, you know, everything and just say, the number is 99, and I don't learn a lot from it. Or I can try to preserve, right, more utility, and maybe there's a little bit of risk, and how is that risk properly managed? Europeans haven't really given us closure on that yet. We'll see, you know, EDPB guidance, as soon as a few final court cases are over, about whether or not it is sort of an absolute thing or a relative thing. And I think in the US we don't fully know the answer. But assuming we've got a largely US audience: in our conversations with folks at the FTC and the AGs, they think about it a little bit differently than folks in industry. They talk to de-identification experts, who are often sort of security-minded people who think of things like the attacker, right? And it was very interesting in some of our conversations with the FTC, when we talked about things like clean rooms, and we said, well, you know, would you assume that this is de-identified, because the clean room works like this and this and that? And they said, well, could the attacker... and we said, wait, wait, what do you mean, attacker? There are no attackers here. Well, the other party that's holding the data. I'm like, yeah, but that's my partner, there's a contract. They're like, well, that's not necessarily a guarantee. And I said, well, it really depends.
And so I would say the critics, the FTC, the AGs, when they look at whether or not data is, you know, properly de-identified, so that your analysis and, you know, your joinder, your learning across data sets is not a sale, or is not going to create risk of harm, or whatever the concern is, let's take a look at some of the cases, right? So let's take the obvious ones. People put some subset of their data out for clients to go test and use, right? You saw the FTC taking action against some of the location data companies. Or, oh, I have dwelling and overnight data, you know, where the device was, but we don't use that in some way that actually, you know, identifies that the user is in that house. But, well, it's there in the data set for anybody who wanted it, and you've got no guarantees or protections that any of the customers who took that data, you know, weren't going to do that. Right? So there's a heavy assumption that, when there are businesses that perhaps sometimes do monetize data, and just because the deal is "here's what we're looking to do," the FTC and the AGs are going to look at that party as a bit of an attacker, right? Can you trust their employees? Can you trust that they'll do what you said? And whether a contract is enough might be a bit iffy, right? We certainly know that they've taken action against players who've sat at an exchange agreeing to the terms, oh, we won't do this with the data unless we buy it, but then maybe ignored that and took and kept the data. Right? So they assume that there's a little bit of bad faith in the world and that simply having a contract may not be enough. So then the question is, what more do you have? Do you have maybe some sort of audit system? Okay, better. Do you actually have some technical guarantee?
Well, we're using this particular homomorphic encryption, this sort of interesting system in the clean room that actually prevents our partner from doing the dastardly deed that I don't really think they're doing, but the FTC wants to know that you can actually give a reasonable guarantee. And that might need to be a big guarantee if the data is sensitive, or maybe there's lots of parties involved. Here's another interesting example. There was a major case against one of the major TV companies that was, you know, learning what was on the screen, what's the term, screen recognition technology. And they were providing information to advertisers, who obviously wanted some ROI information, but they obviously didn't want to dump their entire data set on those advertisers, and the advertisers didn't want to dump their entire data set. This was the Vizio case. And so what Vizio did is it shared IP address, and what ad showed up in what context, with the advertiser, so that the advertiser could run an analysis against the advertiser's database and then aggregate the data and not keep any of the individual information. And they had a contract saying, hey, we're going to give you this data. And Vizio had with the advertisers, quote, unquote, well, maybe something like a clean room, right? But the data was clearly provided to them, or a vendor working on their behalf. And the FTC said: no, personal information, IP plus what you viewed, was shared with these advertisers, even if they then de-identified it and only used it in some aggregated way. No, no. Bam. Right? Nailed them on that. So we've got sort of an emerging line of cases where it's fair to say that the FTC is going to assume, you know, a very competitive market with people who monetize data. And again, our fault. I kind of say ours, as someone who worked in, you know, the ad tech world for a while: we've got business models where it's never clear sometimes what we're doing, right?
Sometimes I'm a vendor, but then I also have a data business, and I have a business where I'm not clear whether I'm your processor or I'm working for the advertiser. And maybe I'm wearing both hats at the same time, right? So we've created this kaleidoscope, and regulators are like, I don't know what the hell's going on; you have to have a very clear explanation. Oh, and by the way, what we also heard from the FTC is: we're looking at exactly what you've said in your privacy policy, and then we're looking at exactly whether or not what's happening in this process leaves the other party with more information about an individual user than they had before. If so, that's a sale. You've sold something about this particular user. Now, if at the end of the day you've got some aggregation process and, you know, the result is "your campaign performed in this manner," that's, you know, one thing. But learning something you did not know about an individual user, they clearly believe that that is a transfer, a sale. And that might not be in sync with industry's assumptions. And you might think that's no big deal, right? Like, well, I just learned that you're in the other data set, right? What's the problem? That's actually a huge deal, right? That means the de-identification was not effective. And obviously, in some cases, you might be sharing very, you know, sensitive information with some party that's not allowed to have it.
Doug Miller
Right.
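One way to read the FTC position Jules describes (learning anything new about an individual is a transfer) is as a requirement that clean-room outputs be aggregate-only, with small groups suppressed so a result can't point back at a person. The sketch below is a hypothetical illustration of that rule, not anything from a regulation or from the episode; the threshold value and all names are invented for the example.

```python
MIN_COHORT = 50  # hypothetical reporting threshold, not a regulatory number

def campaign_report(matched_conversions, impressions):
    """Emit only campaign-level aggregates; suppress results when the
    matched group is small enough to single out individuals."""
    n = len(matched_conversions)
    if 0 < n < MIN_COHORT:
        raise ValueError("cohort below reporting threshold; result suppressed")
    return {
        "impressions": impressions,
        "conversions": n,
        "conversion_rate": n / impressions if impressions else 0.0,
    }

report = campaign_report(
    matched_conversions=[f"user{i}" for i in range(60)],
    impressions=1000)
print(report["conversion_rate"])  # prints 0.06
```

The design choice worth noticing: the function never returns which users matched, only counts and rates, which is exactly the distinction Jules draws between "your campaign performed in this manner" and revealing that a particular user is in the other data set.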
Alan Chappell
And you know, the Vizio case is a really interesting one, because I have friends in the CTV ad space who are really just jumping up and down and saying the absolute future of that format is going to be the use of clean rooms in order to do some flavor of targeting. And so if there isn't an effective path forward for them, I think some of those guys are in some trouble.
Jules Polonetsky
You know, it's this kaleidoscope of identifiers and models and transfers and, you know, not very good systems and transparency for kind of cooperation across all of those vendors. Right? With some better clarity, right? Hey, you're giving me a data set. Is it consented, actually? Right, all these states. You know, maybe I'm advertising pharma, right? Huge TV advertiser. Oh, don't worry, it's consented. Wait, where? What?
Doug Miller
How?
Jules Polonetsky
Oh, I bought it from someone who said it was consented. Right. I can't tell you who, because then you'll just go directly to them. Right. So we've got, you know, challenges with sort of transparency of the value chain and what rights you have. Do I need to even de-identify this data? But I think it's doable. It's just that we are kind of this jumble of an industry that kind of, you know, makes life hard for ourselves because of the complexity and the lack of sort of very clear, specific standards.
Alan Chappell
Completely agree. But one thing that you've talked about a bunch here, which I find really intriguing: you use the term de-identification a bunch. And in my view, we once upon a time had a fairly compelling definition of de-identification, which, again in my view, has been blown up by a lot of the state privacy laws, where, like, I don't know, any given data set could be both de-identified and still be personal information. It's sort of this Schrödinger's data point problem. And I'm just curious, because you've mentioned the term a bunch: do you see it that way, or do you think there's a path forward to use clean rooms as really a viable way to de-identify?
Jules Polonetsky
So again, it depends on what you're trying to do. I do think there are people who want de-identification to do things that it will not do. Right? There was a world, you know, again, when you started out and I started out, where we said, well, it's cookies and it's all fine. And we had a lot of controls way back even in those early DoubleClick days. We had contracts and we had audits and we had real efforts to ensure that the partners behaved. And we knew the partners, right? There weren't exchanges and third parties and multiple parties. Like, we had some ability to say, I see you, you see me. I'm looking at your privacy policy. I'm looking at what you're doing. You're representing things, you're putting language in your policy that, you know, promises this and this and that. And you could at least argue that you kind of had a little bit of a leash around how the data is used. Right? But today there are so many parties, so many people who have linked data to these identifiers, that making kind of the explosion of parties and data and multiplicity of ways data moves all quote, unquote anonymous, so that we can all just do everything we were doing? Not possible. Are there chunks of this, between certain parties or groups of parties, where you can leash the data well enough that you can actually make some sorts of promises? I think yes, because the alternative is one of two directions, right? Everyone agrees everything's personal, we come up with these new identifiers, and we try to promise, you know, that we're going to respect all the access rights. And again, my argument to regulators is: when you make it so hard and so impossible, you've just forced everyone to live in a world where, hey, we're selling you data, and now we'll just cooperate with what we need to do, which is, you know, not the best outcome.
The other alternative, and this is going to open up one of your favorite issues, right, is parties who have control working one on one, right? Browser side, device side, with the trade-offs that come from that, whether it's competition, whether it's, is it really privacy? You know, another project that we worked on, similar to some of the work you've done, is we said: what are all the things that someone says is adding privacy, right, Sandbox or this or that, and then what are all the different consequences? Because right now everybody is arguing sort of at cross-purposes with each other, as opposed to: you do these things, it affects these things. And maybe you don't care, because that's not your problem or your business model, or you just don't care about who pays for it, but you've got to understand that there is an ecosystem, and when you squeeze somewhere, it shows up somewhere else. So I do think that there is a clear role for more transparent use of clean rooms for a range of processes, but not necessarily what some parties want, which is: let's just do everything we're doing, with a lot of data going to a lot of places, and somehow promise that it's de-identified. That just is too many attackers, too many, you know, ways that you aren't going to be able to guarantee what you need to guarantee.
Alan Chappell
Yep, and this is a great point. And I guess my question, and I'm going to throw this out to Doug, is: how do we transition from the current status, which is still pretty close to a free-for-all, to something that is a little bit more uniform, or at least where there's just some general sense of, you know, the direction that we're all going? Like, you know, if you want to work with one of those clean room vendors, what are some practical things that you should be thinking about as you're exploring those options in that light?
Doug Miller
Alan, I was happy to see the FTC's blog about this. I'm always looking for the silver lining, that's me. And what I thought was, it's great that they have highlighted this issue, and they have essentially said that it's not just business interests that you're protecting here, but you've got to put the consumer first. I think that's an important guideline. And then I think it points to, you know, sort of what Jules is talking about there: that you can find a way of dealing with de-identification or whatever, I think, if we think about it holistically and systematically. So for me, when I think about how we deal with clean rooms, it's in three chunks. It's the technical chunk of how all these privacy-enhancing technologies work, and that's going to be the engineers and all that. It's going to be the legal chunk of how you sort out whether something is a sale or a share, or what you put in the contracts, or whatever. But then there's what I'm going to call the organizational chunk, because I can't think of a better word for it yet. But that's the privacy team. And those are the people who are facilitating the kinds of conversations that I think the FTC wants to see happening, where the privacy team talks to the sales and marketing team, and you get clear about what your business objectives are, and you make sure that you're doing something that actually is serving those objectives and no more. And then those two groups are talking to the technologists about how we actually implement this and solve this problem, so that everybody's actually talking. Because I guarantee you there are engineers out there who wonder what all the hullabaloo is about, because in their minds they're keeping the data safe, but they're not familiar with the legal terms and the legal standards that are at play here.
And then, I think, you know, you start with security, you get clear about the business goals, you start thinking about the technical controls that you can put in place, because contractual promises are simply not going to be enough, and you make sure that your disclosures are exactly right. Now, then I start to think about, well, what do FPF members, or your audience, Alan, actually need here? Do we need to be figuring out guidelines, best practices, configuration standards? You know, I'd love to hear feedback, and I'm asking people every time I'm in a conversation about this: what would be useful for companies as we're figuring out all this stuff? Final point. Another discussion draft that FPF launched last spring, also available on the ad tech page, was this nifty document about sort of novel approaches as companies are adapting to changes in the marketplace. But it has this cool appendix that we call the Risk Utility Framework. And it is, I think, a really good roadmap for thinking about privacy issues like clean rooms. So you figure out what you want to do and what problem you're solving, and then you get into dealing with the lawfulness and the fairness and the transparency and the purpose limitation and data minimization and making the opt-outs work and thinking about sensitive data. It's a nice list and roadmap of the things you need to be thinking about, and we'll continue to refine that moving forward.
Jules Polonetsky
But I think there's something the clean room providers could do that would be helpful. And I get why they don't want to, because they're not your lawyer, but they're providing kind of a technical answer to a legal problem, right? So when I've talked to some of the clean room people, I've asked them, okay, what's the legal effect of this thing here? Oh, well, that's for the company to decide. I'm like, what? But wait a second, I thought you're offering this very specifically to achieve these sorts of purposes. So I think, to the extent that we're able to pull together a number of the most common use cases and then lay out some of the factors that you need to be asking about, right? Because we've got a lot of ad tech members, but our typical member is the chief privacy officer of a major retailer, a major bank, right? Yes, we also have the big tech companies; they're in the weeds on this stuff, right? But the typical person is fairly sophisticated, and they don't live and breathe ad tech. And so they're getting a proposal from their business unit, who's working with their ad tech people, who are working with the vendor, who's working with the other people's vendor. And so they're having a hard time. They say, well, what are the questions we should be asking? So what I think we're able to do is to say, look, here's the questions you need to ask. If what you're simply doing is looking to learn, you know, how your customer set matches up with whatever data set is being considered here, because you want to learn overlap, ROI, whatever the learning from common use cases is, right? Here's the questions you need to ask, right? And the representations you need, right? Are you going to have any information related to this user, you know, in that data set afterwards, blah, blah, blah. Can we set forward the due diligence questions?
And if you are able to get satisfactory answers to those, well then, presumably, you can feel comfortable. And I think the vendors are answering those now, quietly, under pressure, if you're smart enough to ask the right question and they pull in the right legal people; some of our former staff are those legal people, now working at some of these companies. But we've got to twist arms to get those answers, and they need to be out there a little bit more. When we were living in the world of global data transfers, all the big companies started publishing their transfer impact assessments, because company after company was being asked. And right now, with all the laws against data going to China, all of a sudden you've got to ask your vendor about their vendor, about their vendor. So I think companies are getting a little more comfortable saying, hey, I'd better put that transparency about what I do out there, or at least make it available on demand to my partners, because the friction is slowing down the deals. So we did a lot of this.
Alan Chappell
Back when I was working with BlueKai in, what, '08 or '09, the idea of a data exchange was brand new. And so we spent a heck of a lot of time out there educating individual advertisers and publishers. We weren't providing legal advice. But in a world where people are really just trying to understand, how should I be thinking about this, I think the clean room vendors could collectively do a better job of that.
Jules Polonetsky
And Omar was very sophisticated and saw some of this early on at BlueKai, and was actually building some noise, some de-identification techniques, into some of the reporting. But it was early; everything has some effect on utility, and the market demand wasn't there. The regulators could be more helpful here, right? I mean, if the state legislators understood this better, they would draw the lines a little more effectively, and it would make a big difference. And maybe the new FTC. Not that they're going to love ad tech; ad tech has unfortunately been put in a doghouse that is hard to get out of. But presumably they'll be a little bit friendlier. They won't call it surveillance advertising or surveillance capitalism, defining an entire economic sector as evil by default, which I think has been the case with the current FTC leadership, right?
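The noise-based reporting Jules alludes to is, in spirit, what differential privacy formalizes today. Here is a minimal illustrative sketch, not BlueKai's actual method, of adding Laplace noise to an aggregate count so that a reported total masks any single person's contribution:

```python
import math
import random

def noisy_count(true_count, epsilon=1.0, seed=None):
    """Return an aggregate count perturbed with Laplace noise of scale
    1/epsilon, the standard differential-privacy mechanism: smaller
    epsilon means more noise and stronger privacy."""
    rng = random.Random(seed)
    u = rng.random() - 0.5                      # uniform on [-0.5, 0.5)
    scale = 1.0 / epsilon
    # Inverse-CDF sample of Laplace(0, scale): -b * sgn(u) * ln(1 - 2|u|)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return round(true_count + noise)

# A report of 1,000 matched users comes back slightly perturbed, so the
# recipient learns the approximate aggregate but not exact membership.
print(noisy_count(1000, epsilon=1.0, seed=42))
```

With epsilon around 1 the noise is small relative to campaign-scale counts, which is the utility trade-off Jules mentions: more noise protects individuals but degrades measurement precision.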
Alan Chappell
Yeah, I think that's been a bit of a problem. I feel like they've almost put on an advocate's hat, and I don't think that's a good thing; a regulator has a defined role. It should not be too business friendly, and it probably should not be too advocate friendly either.
Jules Polonetsky
So my hope, though, is that instead of the kind of blog post they did, which was basically, clean rooms, boo, how about: clean rooms, here's what we're looking for? So we'll be urging them, because they voted against a lot of the companies on all of these cases. It's not like they are now cool with sloppy or risky practices. The chair voted for these things; his rhetoric is different. So what we'll be hoping to encourage them to do is provide the kind of guidance that would eliminate some of the friction and allow business to proceed. That doesn't mean don't enforce, and don't do blog posts that touch on hot-button issues. But if that blog post were being written in the Ferguson era, maybe you would have: here are the standards the FTC is looking for, based on our enforcement actions, when we assess whether or not data is being shared properly.
Doug Miller
Right.
Jules Polonetsky
And that would be, I'd argue, very effective. People would look at that and say, oh, that's what I'm going to ask for. That's what I'm putting in my contract.
Doug Miller
Looking for the silver lining again, or maybe the pewter lining: after that FTC blog post, companies can't hide behind the idea that this isn't an important issue, or that maybe their activities won't get noticed. In fact, they have to worry that their competitors are actually getting better about compliance, and they might have to start keeping up. I think the conversation is starting to change, and conversations like this are really important. People need to start talking about this stuff more. I don't think there are enough people across companies who know what clean rooms can do for the company, let alone the privacy considerations. I think every privacy professional who has anything to do with clean rooms should make a New Year's resolution in January to go talk to their marketing people about how they want to use clean rooms, and their technical people about how they're working with them, and start getting smarter about it.
Jules Polonetsky
We have the CPOs of, I'd say, a majority of the Fortune 500, and it matters if they ask the right questions. I learned this at AOL, where occasionally I wanted to get something and our business people were like, oh, we'll have to pay for that if we ask the vendor to do this and this. I'm like, could you just ask them how they deal with this and how they deal with that? And then the salespeople run back home: hey, we're being asked about this, we're being asked about that. And all of a sudden it's, oh, maybe this is a feature people want. We sit around at these roundtables, very often with chief privacy officers, and they're complaining: I can't get this. And I'm like, did you actually ask for these specific things in your due diligence, in your contract? If you're not asking the smart questions, you're not going to get it. So if we're able to help identify, here are the ten things you should ask, request, and get an answer to, and we can get those questions being asked, I think it'll drive change.
Alan Chappell
That's a great point. And I think we're going to leave it there. Doug, Jules, thank you so much for being on the pod. Really appreciate it and great to connect.
Jules Polonetsky
Good to be with you.
Doug Miller
Thanks for having me.
Alan Chappell
That was a great conversation. We've got a bunch of other fantastic guests coming up on the Monopoly Report podcast over the next few weeks, including Omar Tawakol, who we referenced in today's pod, founder of Rembrand and former founder and CEO of BlueKai. So we've got a lot happening in 2025, and thank you so much for all of your support over this last part of 2024. Please subscribe to the show at monopolyreportpod.com or on Spotify, Apple, YouTube, or wherever you listen to your podcasts. Happy New Year.
Release Date: January 8, 2025
Podcast: The Monopoly Report
Host: Alan Chappell
Guests: Jules Polonetsky, Doug Miller
In Episode 12 of The Monopoly Report, host Alan Chappell delves into the intricate world of data clean rooms with two esteemed guests: Jules Polonetsky, CEO of the Future of Privacy Forum, and Doug Miller, a seasoned privacy professional with extensive experience at AOL and Yahoo. The discussion aims to demystify data clean rooms, exploring their definitions, use cases, regulatory challenges, and the path toward standardized practices in the advertising technology (AdTech) landscape.
Alan initiates the conversation by seeking a clear working definition of data clean rooms. Jules responds by contextualizing the term within the broader landscape of data sharing and privacy:
Jules Polonetsky [02:45]: "It's a big fancy marketing term for a business model built around a set of solutions and techniques."
Doug complements this by referring to the FPF's primer on clean rooms:
Doug Miller [05:54]: "It's a collaboration environment where two or more companies or their partners can perform data analysis on collective data sets and choose what they reveal to one another..."
The duo emphasizes that data clean rooms are not novel concepts but rather an evolution of existing practices aimed at enabling data collaboration while safeguarding privacy.
The conversation transitions to the practical applications of data clean rooms. Doug outlines several key use cases:
Doug Miller [08:29]: "You can figure out how effective your ad campaigns have been... identify audiences for an ad campaign."
Jules adds depth by highlighting the regulatory complexities associated with data usage:
Jules Polonetsky [09:08]: "Data has become in the last 20, 30 years a highly regulated sector. What did you tell the user when you collected data?"
These insights underscore that data clean rooms serve multiple functions—from enhancing ad targeting and measurement to enabling sophisticated market research—while navigating stringent privacy regulations.
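The overlap use case Doug describes can be illustrated with a toy sketch. This is a simplified, hypothetical illustration only: both parties hash normalized identifiers with a shared salt and compare digests, so raw emails never leave either side and only the aggregate match count is revealed. Production clean rooms rely on stronger techniques, such as private set intersection or computation inside a controlled environment.

```python
import hashlib

def hashed_audience(emails, salt):
    """Normalize and hash identifiers so raw emails never leave either party."""
    return {
        hashlib.sha256((salt + e.strip().lower()).encode("utf-8")).hexdigest()
        for e in emails
    }

# The salt would be agreed out of band for one analysis, then discarded,
# so the hashes cannot be reused to re-identify people elsewhere.
SALT = "campaign-1234-salt"  # hypothetical value

advertiser_crm = hashed_audience(["a@example.com", "B@example.com", "c@example.com"], SALT)
publisher_logs = hashed_audience(["b@example.com", "c@example.com", "d@example.com"], SALT)

# Only the aggregate overlap count leaves the environment, not who matched.
overlap = len(advertiser_crm & publisher_logs)
print(f"Matched users: {overlap}")  # -> Matched users: 2
```

The design choice here mirrors the due-diligence question Jules raises in the episode: after the analysis runs, neither party should retain user-level information about the other's data set, only the aggregate learning.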
A significant portion of the discussion centers on the legal implications of data clean rooms. Alan raises pertinent questions about data sales and processing under various jurisdictions:
Alan Chappell [11:41]: "Is the data going into a clean room? Is that a sale of data under say California law? Is it processing data under EU data protection law?"
Jules elaborates on the challenges posed by evolving state laws, particularly California's stringent regulations:
Jules Polonetsky [14:11]: "Almost every major Fortune 500 company had a privacy policy that said we do not sell your personal information... now, almost every company now says at the bottom of their website we sell or share your personal information..."
Doug discusses specific cases, such as the FTC's action against Vizio, highlighting the regulatory scrutiny faced by companies utilizing data clean rooms:
Doug Miller [21:48]: "In the Vizio case, the FTC said no, personal information, IP plus what you viewed, was shared with advertisers... Bam. Nailed them on that."
These anecdotes illustrate the precarious balance between leveraging data for business objectives and adhering to evolving privacy laws.
Alan probes into the transition from the current fragmented practices to more uniform standards. Doug outlines a three-pronged approach:
Doug Miller [27:21]: "It's the technical chunk... the legal chunk... and the organizational chunk."
Jules emphasizes the need for transparency and due diligence:
Jules Polonetsky [30:31]: "Here's the questions you need to ask... What do you have to de-identify this data? Can you set forward the due diligence questions?"
The guests advocate for a collaborative effort between privacy teams, legal experts, and technologists to establish best practices and standardized protocols for data clean rooms.
Looking ahead, Jules and Doug offer recommendations for organizations navigating the complexities of data clean rooms:
Doug Miller [33:16]: "Every privacy professional who has anything to do with clean rooms should make a New Year's resolution... start getting smarter about it."
Jules Polonetsky [33:42]: "If you're able to help identify, here are the 10 things you should ask... it’ll drive change."
They stress the importance of proactive engagement, continuous education, and the development of comprehensive guidelines to ensure that data clean rooms are utilized responsibly and in compliance with legal standards.
Episode 12 of The Monopoly Report provides an in-depth exploration of data clean rooms, shedding light on their functionalities, benefits, and the intricate web of legal considerations that accompany their use in the AdTech sector. Jules Polonetsky and Doug Miller underscore the necessity for clear definitions, standardized practices, and robust collaboration between various stakeholders to navigate the evolving landscape of data privacy.
As the episode wraps up, listeners are encouraged to consider the multifaceted nature of data clean rooms and the critical role of privacy professionals in shaping their implementation. The discussion serves as a call to action for organizations to prioritize privacy in their data strategies, ensuring that business objectives are met without compromising consumer trust.
This comprehensive summary encapsulates the essence of Episode 12 of The Monopoly Report, offering listeners a detailed roadmap to understanding and implementing data clean rooms within the realms of privacy and AdTech.