Bay: Is this just a Facebook problem, or is it more?
Malik: It’s across the board. It’s a social media problem, a digital media problem. Anyone who says it’s just a Facebook problem should see the scripts running on their own website. All the newspapers are such hypocrites. The New York Times has 21 tracking scripts running on their website. If they were really that much holier, they wouldn’t be around long.
Bay: Whatever happened to Ghostery? Is that still around? I highly recommend it. It’s a service you sign up for, and it shows who’s tracking you on whatever site. It’s a little horrifying to see that on your feed.
Malik: Just to be clear, traditional media has not been good at this. Facebook and Google are very good at advertising. The old media online, they’re just not as good at doing this – at targeting, at selling ads effectively, at getting more money per ad. The big media companies are still stuck in the past.
Steinberg: I’ll take the other side. I do think it’s just a Facebook problem. Not that I really disagree with anything Om is saying, but the executives at Facebook, the people at Facebook, are so much more arrogant than the people at any other company I deal with. I have found that when you sense the tenor of the people at a company, you have a good indication of what’s going on there.
It’s so clear that it’s all spin. Even after the testimony, even after the Cambridge Analytica thing, on all their internal message boards, as reported by the New York Times, they were basically saying, “This is so unfair. Why is everyone coming after us?” All these things. That lack of perceptiveness makes them far more dangerous.
Bay: Chris, you were a founder of Facebook. You haven’t worked there in a long time, but you also referenced this, when you talked about the collective exhale. Do you agree with John?
Hughes: I think the problem is much bigger than Facebook, but I also agree with John. There is a culture at Facebook, at Google, at the biggest companies that’s a result of the concentration of power they have. If what Om and John are saying is true, that Procter and Gamble and other huge consumer goods companies don’t have anywhere else to go, that should make anyone who believes in the free market and the virtue of competition very uncomfortable. That’s essentially saying that so much power has been concentrated at Facebook and Google that the biggest, most talented advertisers and marketers have next to no leverage. That should be scary to anyone who cares about that competition.
I do, though, think that we need to be very specific about what problems we’re talking about. I put those problems in three buckets. The first is data privacy and protection. That’s Cambridge Analytica and the larger question around who owns my data. Is it my property? Is it my labor? How do we think about that? The second category is around democracy and the news. How do we make sure we don’t build networks that only reward the most extreme voices? Let alone make sure that foreign powers aren’t able to hack elections. And then there’s the third, which is what Tristan was talking about, the conversation around attention and the way that these companies, through our apps and our devices, very much decide where to direct our attention, and increasingly own more and more of it.
These problems are each so big, it’s hard to talk about all of them. They’ve just been boiling under the surface for more than 15 years now, and they’re all spilling out at once. That’s why I think this is such a critical moment to dig into this.
Bay: Let’s talk a bit about what makes up the social contract in the digital age, what that looks like in mobile and social media. Tristan, could you share with us some of your thinking about how we need to re-orient or redefine that contract?

Above: Mark Zuckerberg testifies in front of the Senate Judiciary and Commerce Committees.
Harris: Partially, as Chris said—in Silicon Valley, all of these products emerged out of a libertarian philosophy. We create a product. If people like it they use it. If they don’t like it they won’t. If you don’t like the 2-billion-person social network on Facebook, just switch to another one. But of course, I just came back from the University of Chicago and their conference on antitrust. The network effects that these companies have created have made them virtually impenetrable. The traditional sources of leverage—who owns the dollars, and can you redirect them through shareholder activism or customer activism—don’t apply here; even the customers, the advertisers, don’t have a lot of options.
The question, then, is what kind of relationship do we really have with these platforms? Right now it’s just, sign on the dotted line, get consent, and they can change their policies at any time. Theoretically they should notify you, but they don’t. They can do what they want. They can choose new business models. You, the user, are responsible, because you hit the OK button.
What I’d love to introduce to you—there’s a different kind of relationship that describes these products, and it’s a fiduciary one. If you think about the asymmetric power an attorney has over a client, they know way more about the law. They can manipulate and exploit their client. They have lots of privileged information about the client. If they want to, they can screw them over. The asymmetry of power is enormous. Same with psychiatrists or priests in a confessional.
If you stack that up, how much asymmetric power an attorney has over the client, how much asymmetric power a psychiatrist has over the intimate details and private thoughts of their patient—now, next to that, add to that how much power around the intimate details of your mind and your communication, and even what you don’t know about how your brain works—how much of that does Facebook have? On the grounds of that asymmetric power alone, we should reclassify Facebook as having a fiduciary responsibility or a fiduciary relationship.
That instantly changes other things. It instantly makes clear why Facebook could never run an advertising-based business model. Imagine a psychotherapist who knew every detail of your life, and also listened to every one of your conversations, and everyone’s conversations with each other, and his entire business model was to sell access to that information to someone else. Or a priest in a confessional whose entire business model, the only way he makes money, is selling access to everything he’s learned in that confessional to someone else. When you frame it that way, it’s clear that this is a dangerous business model.
When I was a kid I was a magician. Throughout my background I’ve always had this sensitivity to the fact that people’s minds can be manipulated. Instead of seeing choices as an authoritative thing, I see human beings as living inside 24/7 magic tricks built out of the cognitive biases in their minds. And then in college at Stanford I was part of something called the Persuasive Technology Lab, which teaches engineering students how to manipulate people’s psychology and get them to engage with products. Some of my friends in those classes were the founders of Instagram.
They taught that if you want people to use products, you turn them into slot machines. You give them the juicy rewards. You give them out sometimes and hold them back at other times. You make it addictive. The human animal is very easily manipulated. From that perspective, you have 2 billion people in one environment with a business model that manipulates all their deepest vulnerabilities. Including, and I think this is one thing we don’t talk about enough—on top of all this you add AI.
If you check out an article in The Intercept, there’s something called FBLearner Flow, where Facebook can predict what you’re going to be vulnerable to in the future. They can predict when you’ll have low self-esteem. They can predict when you’re about to change your opinions on certain topics. That psychotherapist isn’t just listening to you and your conversations, but to 2 billion people’s conversations. We’ve never had an AI that could learn from 2 billion minds, including which color of button lights them up.
When you think about it that way, this is a dangerous situation, to have all that power completely unaccountable to the public interest, to democracy, to truth. They can claim that they care about users, but they only care about them insofar as they need them to be jacked into this environment they’ve created.
Bay: Now that you’ve terrified us, let me turn to Chris. I’ve noticed you smiling, and I couldn’t tell whether that was appreciation or disagreement, so feel free to share. Also, you come at this from a different contractual perspective: the contract between us, our data, and the companies that have access to it.
Hughes: Well, first, I agree with everything Tristan just said. His analysis—I give him credit for seeing this world and the direction it’s been heading in well before just about anyone else. I was nodding in general agreement.
The key thing that stands out to me, that you can’t overstate, is how much the design choices matter and encourage people into behaviors that may feel good in the short term, but are often an illusion for the long term. When it comes to data and ownership, this is what I’m thinking about the most these days, because this is where it’s not just a Facebook or a Google problem.
Your phones know where you’ve been geographically at every moment of the day. If you have one of those Nest thermostats that helps you be energy-efficient, it knows the temperature in your home. Your Alexa listens to everything you say, not to mention all your email. The amount of data we create is enormous, and in many cases that’s very good. Big data analytics has real benefits: the analysis of Tesla driving patterns, for instance, means future cars might be safer. But the issue is, all these people, all of us, create all this data, and we just hit “Agree.” We give up all legal rights. We get no compensation for that.
What’s happening now is historic profit. The margins of Facebook and Google are through the roof. The CEOs just say, “Well, you’re using our service for free.” In reality, all of that data is not only valuable now, but with the coming of artificial intelligence, it’ll be even more valuable in the future. That’s why more people should be talking about some kind of data dividend, some kind of sovereign wealth fund that’s capitalized from companies that make enormous historic profits off consumer data. Maybe it’s a five percent royalty on revenues that goes into that fund, and the fund cuts a check to each American to make sure everyone shares in the upside of this a bit.
There’s precedent for this up in Alaska with oil. Every Alaskan gets a check paid for by something that was their common resource. Data is the common wealth of the future century that we’re all creating. We should tie our economic outcomes to it, so that it’s not just a few people who are getting extremely lucky, like myself and many others in the tech world.

Above: John Steinberg of Cheddar at the Milken Global Conference.
Malik: The way I think about the problem we have right now, it’s very much like the tobacco problem. Tobacco has been around forever, but it wasn’t as addictive until Philip Morris and others modified it and everything changed. Behavior modification in media has always been around, but Facebook took it to the next level.
In the tobacco industry, just putting warning labels on boxes was a big step forward. We need to have similar approaches to data and privacy. Instead of terms of service, companies big and small should be forced to write terms of trust: what will they not do with our data, rather than what they will do with our data. That’s not particularly definable in technology terms. But when you look at what Facebook is paid to do, it should be to protect our data from leaking through some third-party app. That’s their job as a platform. There need to be terms of trust that lay that out.
The other thing we need to do is figure out a non-industry group that regulates data. I don’t think people from the technology industry should be allowed to do this. We need the people and their representatives coming up with rules and regulations around how data needs to be protected.
Bay: Give me three things you want to see in those terms of trust.
Malik: Number one is that the user’s data is not leaked to a third party, that it’s protected. How about that for a starter? My data isn’t going to get stolen. Number two, it won’t be sold to third parties. Facebook is giving me value and they expect to make money in return, but they can’t make money selling me to analyzers, to the likes of Cambridge Analytica.
Having been in technology for most of two decades, there was a time when Microsoft looked unbeatable. Sun Microsystems looked unbeatable. They all come and they all go. We’re in the fifth generation now. I’m assuming there is something else out there.