Bay: It’s interesting that big bad old media suddenly looks valorous in all of this. But in all seriousness, John, are there codes of ethics, codes of conduct from the old media business that are applicable here?
Steinberg: I’m glad you asked that question, because what I wanted to comment on is where I believe the social contract exists between media and the distributors of media. Imagine how bad the relationship has gotten between Facebook and the media, that somebody like Andy Lack, who’s hardly a bomb-thrower, called Facebook “Fakebook” in public. The relationship has gotten to the point that it’s worthless, if he’s willing to say something like that. My old CEO Jonah Peretti is now making comments about Facebook not living up to its bargain and compensating publishers. Rupert Murdoch, I guess he says whatever he wants to say, but he’s doing full-page letters now saying Facebook has to pay publishers money.
Facebook is a lot like Donald Trump in that the first time Trump does something crazy, everyone is shocked. Then, over time, it becomes normalized. Human beings can adapt to basically anything. If Comcast or Charter continually said to media companies, "Today you're here, but tomorrow your channel disappears, sorry about that," constantly moved people around the dial, some days you get access to audience data and other days you don't, people would be up in arms. It would be intolerable. But what Facebook did is, they changed things up so often and behaved so routinely in untrustworthy ways that people just became accustomed to it. There was nothing you could say anymore.
The only bright side to it is, you see someone like Tim Cook on MSNBC saying, “We never would have gotten ourselves into a situation like this.” Historically, Apple has always treated media companies far better. There’s been a set of rules. You signed a contract. You got on the system. The App Store didn’t change radically overnight and make your app suddenly disappear.
The traditional cable companies have typically behaved with much more negotiation and trust and agreement. When I look at our relationship now with Hulu and YouTube and Sling, we did a contract. We negotiated. We’re on the system. There are rules around what we put on the system. There are rules around how they compensate us and how the ad split works. There’s not only a social contract, but an actual contract in place.
Facebook has no friends left now. No one is rooting for their success. Every media company wants them to fail. That’s five, 10 years of bad behavior.
Malik: I have no sympathy for the media at all. This is a mess of their own making. They’ve shot themselves in the foot. I was the first person who went to my editors and said, “We should start a website.” They said no, and then they fired me. Big media, let them rot in hell. That’s all. [laughter]
But the problem we’re talking about here is Facebook. How do we think about the next chapter of Facebook? That’s more important. What’s happened in the past isn’t going to change. How are we going to control the beast?
Bay: Let’s talk about that. Where do we go from here? Is regulation in the U.S. inevitable?

Above: Willow Bay of USC and Om Malik of True Ventures.
Steinberg: I don’t think anything is inevitable, politically, in this climate.
Harris: There are different things happening. The Honest Ads bill may pass. But obviously everyone knows that the U.S. regulatory climate is not very functional at the moment. The GDPR is about to go into effect in Europe, if you weren’t aware of that. That’s coming up in May. That’s going to set data rules. But now the U.S. is comparatively unprotected in terms of privacy. That’s one thing to consider. I know Klobuchar and Kennedy and Blumenthal are putting up a bill that’s essentially universal data protection for U.S. citizens, hopefully mapping to the protection in Europe.
We need to examine much deeper questions, though, around what it means to have something so powerful. How do you make it accountable to something other than its own profit? Facebook is affecting 2 billion people's minds, and we haven't even gotten to the issue that it's a machine designed to throw thoughts into people's minds based on whatever thoughts got clicked and liked the most. In languages the engineers don't even speak, in Sri Lanka, in Burma, you have genocides being amplified by the fact that these are countries that came online only in the last two or three years. You have automated systems pushing ideas into people's minds that literally cause them to kill each other. The U.N. has called out Facebook, in the case of Burma, as one of the principal amplifiers of that conflict.
I say this in a sense of, when the New York Times asked nine experts, "What would you do to fix Facebook?"—Tim Wu, who wrote a great book called The Attention Merchants about this, said, "I would turn it into a global public benefit corporation." It might sound incredibly naïve to say something like that, but I don't think anything with so much power should be anything but that. There's a question of whether you take the existing system there, or you constrain the existing system and make way for new competition. Both of those are important to consider.
Malik: I think it’s a naïve way of thinking about the future, the idea of making it a non-profit, a global benefit corporation. Where we need to be thinking is not about the past, about what they’ve been able to build, but putting more regulation around facial recognition, visual data they’re collecting, video data they’re collecting. AI can create much more effective fake personas. That can have much more damaging impact on society. That’s the baseline we need to start with. What’s happened in the past is probably difficult to monitor right now. We should have very hard rules established around collecting visual data.
Bay: Hard rules established by who? Do we use the same levers we’ve traditionally used – government regulation, industry self-policing, consumers voting with their wallets, advertisers voting with their wallets? All of the above.
Malik: Yes, plus whatever Chris suggests.
Hughes: Plus the data dividend?
Bay: Yeah, a data protection agency, or a data dividend.
Hughes: Well, I think both. I think we can have a data protection agency modeled on the Consumer Financial Protection Bureau, or if you don't like that because you're on the right, you can choose other regulatory agencies. Even in a period when there's a lot of distrust of government, we have to have public policy. That's the role of public policy, to stand up for citizens who need that protection.
We live in a time where, whenever you start talking about public policy, people just tune out. “We’ll never get anything done.” I share that cynicism, but we have to beat back against it. This is a perfect example of a place where we can make headway.
I wanted to comment on the three things you just outlined. Consumers voting with their wallets is not feasible in this moment, in the same way it might be in others. When you think about competition among these platforms, Facebook owns Messenger, Instagram, and WhatsApp. By some counts that’s 80 percent of the social traffic on the web, all going through Facebook services. This idea of “delete Facebook” as a movement—there aren’t viable alternatives out there.
I can’t leave Gmail, and I don’t think it’s right to ask me to leave, with all the services that are locked in there. By the way, Google knows a lot more about me than Facebook does. We need to have a more nuanced view of what consumer power is here. Some people are calling for data agents, which would enable people to choose an agent, almost like a union, to lobby on people’s behalf. It may seem like a naïve idea, but there are some very smart people thinking ahead about how this might work in companies that are starting up. I don’t know what direction this is going to take, but we have to change the power dynamics in this landscape.
Steinberg: Why is it that all the smart, decent people used to work at Facebook, and are now on the outside? I mean this with all respect. You, Sean Parker, Roger McNamee, the 20 other people on the outside—doesn’t Mark say, “Hey, all the good people with good ideas and decency left. Sheryl’s still in hiding. What’s going on?”
Hughes: This is where we might disagree a little bit. I don’t think the leadership there is motivated by any malice. I have not seen the side of them that’s as brusque or—there are certainly lots of pockets in the company of people who have very different views than I do, so I’m not setting up to defend them. I do think, though, that it’s too easy to say that they’re all living in a different world. They’re reading the same news and having the same conversations we’re having. But the public pressure is only now just beginning. A year ago Mark Zuckerberg was running for president. There was certainly not this tone of close criticism of the company.
Again, not just at Facebook, but across the board at these companies, they’re very aware they’re under the microscope now. I think that’s a good thing for them and for democracy in the long term. They’ve been unaccountable for so long.
Malik: A bigger problem in Silicon Valley that no one wants to touch is that we all have blood on our hands. Every single person in the Valley has become beholden to an idea of unfettered, endless growth at any cost. It's not that people are bad. The whole incentive structure is based on growing fast and making a lot of money.
Think about Uber, which was just an idea in 2008. A few years later it's a $50 billion company. That kind of growth cannot happen without taking some shortcuts. YouTube became this massive platform by infringing on IP. It's not just Facebook. If there is a cultural moment right now, it's time for Silicon Valley to take a step back and say, "We're chasing unfettered growth. But behind every data point there is a person." As long as we can internalize that as an industry, we'll make better decisions.
This has to be across the board. Not just at Facebook or at Google. Investors, journalists, entrepreneurs, everyone has to be asking these questions. Are we doing the right thing? How are we protecting people’s data and privacy? How are we protecting the future? Facebook is the most visible platform, but there are others. People don’t talk about that.

Above: (Left to right) Tristan Harris of Center for Humane Technology, Chris Hughes of Economic Security Project, and Willow Bay of the USC Annenberg School for Communication and Journalism.
Bay: If there is a call to action here, if we’re going to take this as a serious moment of cultural reckoning, what is the call to action? For the business community, for the tech community, for consumers, for policymakers?
Harris: I will say that public pressure is working. A year ago, would you have believed that Mark Zuckerberg would be testifying before Congress and talking about regulating social media? We were nowhere near that conversation. Public pressure may seem naïve, but it’s having an impact. The Delete Uber campaign, as an example, that’s not going to change their revenue or meaningfully drop their user numbers, but it does change the culture, especially for the employees.
I know there are many employees—Jan Koum, the founder of WhatsApp, just left Facebook. Many more people are in that same position. Imagine we don’t have any anti-trust law, or any advertisers who are willing to pull out their money. The one thing this company is built on is people. If those people don’t feel good about the practices or the business model or the responses of the company, they’re going to leave. That’s what happened at Uber, what forced Travis to leave and what created that cultural change. There’s a model for this that’s working, and we’ll see more of it.
Malik: In Uber’s case, the middle management wanted change. At Facebook the middle management wants no change. Jan Koum left, but he was not a regular employee. He was on the board. He started a company that was sold to Facebook. He’s an exceptional case. Just one person publicly quit over Facebook’s policies. There’s virtually no one there that thinks what they’re doing is wrong. No one is leaving a cushy million-dollar-a-year job and a Club Med lifestyle over this.
Steinberg: It goes back to what Chris was saying. Chris has been involved in public service, and you can have apathy about public service, or you can get involved. Watching that hearing, they’re so old. They are so very old. There’s nothing wrong with being old, but we can’t have every member of Congress at 75 years of age and older. The world changes fast. You need diversity throughout all organizations and cultures, and part of diversity is age diversity.
Whichever senator it was who asked Mark Zuckerberg, "If you don't charge subscription fees, how do you make money?"—and he had to answer, without offending him, "Senator, we sell ads"—that senator was either too confused or couldn't be bothered to read an article. We need regulation. Maybe we need to make bad regulations and fix them to make better regulations, but we need to try. We need to have a political climate in this country where good people can run for office.
Bay: You’re saying, good people who are tech-savvy, who understand these issues that are front and center.