
Above: Amazon Echo Loop
GamesBeat: I think we’ve agreed to say “tech giants.”
White: Basically, if nothing else, understanding your contract with them is going to improve your relationship, in the sense of saying, for social media, they're not going to be able to use my personal data willy-nilly, but I am possibly willing to sell it to you. Something of that nature. Really having a greater understanding of the relationship. It's not about totally shutting them out. That's not one of the goals of the decentralized web.
Radburn: As far as centralization and decentralization, we’ve always been on a constant kick of iteration, iteration, iteration. We were centralized in mainframes, and then we were decentralized to the edge. The time frames are getting shorter and shorter.
But where we are now is, it gets hard to tell whether we’re centralized or decentralized. When we actually look at the cloud, the cloud says, hey, everything’s going out there. Now we’re moving to 5G, and your data has to be co-located with a 5G antenna, because otherwise you have that latency in accessing your data remotely. Now I have to have copies of all my data from that central source out at the edge so that I can access it immediately and get the response times I need. Is that centralizing my data, or decentralized? I haven’t quite gotten my head around that one yet.
Rehbock: The bigger issue is really ownership of the data, rather than centralized versus decentralized. One of the big things that we do at Shadow is that every user has 100 percent control. Shadow has no access to your copy of Windows or to any of the games you have. It’s your little virtual machine. When you’re logged in, that gets mounted to a GPU and a CPU, but when it’s shut down it’s encrypted. Nobody, not even customer support, has access to it.
It’s a very weird thing, because for Shadow users, when they need customer support, you have to use Windows Remote Desktop to get there to help them. But that’s important. Even though the computing is centralized, the ownership of the Windows instance and the data is sacred. That’s what it comes down to. At AWS, you have your AWS instance, and it’s your machine. Jeff Bezos doesn’t have access to any of it.
MacLean: And we have encryption tools to make sure that our customers always retain control over their data.
Rehbock: Exactly. I think that’s where the mindset is, which puts it in a pretty good place.
MacLean: And I think it has to be. If you’re a cloud infrastructure provider, understanding that the customer’s data is always the customer’s data is absolutely important to your business. It becomes a more interesting case — “interesting” as in we need to talk about it a lot more as a society — when we look at what Bebo mentioned earlier about how people are using our personal data. Not every company is ethical about how they’re using your personal data. They’re certainly not transparent about how they’re using your personal data. Even with the advent of things like GDPR — it still means that your personal data is being used in ways that you’re not aware of and that you may object to, or that may not come with what you see as fair and reasonable compensation.

Above: Blade promises quality Shadow cloud gaming on any device.
Question: It’s a futurist panel, and you’ve been painting a rosy picture of the future so far. When you give this kind of power to the masses, with that comes great responsibility, and there are a lot of irresponsible people in the world. What do you foresee as the most negative things that can come out of this kind of power, and what do you think we should do to try to overcome that?
MacLean: One of the big things I worry about is how we’re empowering very bad behavior in multiplayer games. You can be a terrible gamer in a multiplayer game online and face no consequences. As a woman in games, I’ve seen what that means outside of games. No one should ever experience death threats or rape threats because they make video games, and yet we see that happening to game developers. That’s not acceptable.
As an industry, we need to take a much harder stance on what we think is acceptable, because fundamentally, what you allow in your community, what you allow in your game, sends an implicit message about what’s allowed in your contact with other human beings.
Radburn: From my perspective it’s a case of checks and balances. You mentioned earlier about introducing technology as fast as we possibly can. I think due diligence needs to be done along the way, because we don’t know what we don’t know. The onus is on us to find out what we don’t know, to ensure that there’s no longer-term effect.
A case in point would be with VR. We don’t market to anybody under the age of 13. When account managers come to me saying, “Hey, we have this great educational thing for K-12,” as long as K is involved, we say no. We’re hitting that 13-year-old and up. We all need, as an industry, to be responsible for that. There’s the potential for psychological harm. We already know that VR can affect the mind, because we use it as a sedative, to treat PTSD, to work with autism, whatever. It could also have negative effects as well.
We as an industry need to police that. We need to make sure that everybody upholds that same standard, and make sure that we don’t give ourselves problems in the future because of our reckless abandon to make a buck.
Shih: I think a lot of problems could be prevented with better positioning and better policy. We talk about the possibility of negative uses of AI or machine learning, and if you look around, there’s really no policy or standards. It’s the role of government to have these debates and create better policy, because asking a lot of companies to just self-police is a little optimistic.
White: In terms of bad behavior, to give another example, there are cases where we’re not talking about technology. I’m not sure that people want to have computer scientists develop a code of ethics for them, something of that nature. Now, I have trust in the community to self-regulate. In terms of governing behavior, the way you suggest, unfortunately I see that as being one more step toward censorship.
I think people can make their choices, and I can give an example of that quickly. A number of you are probably familiar with the Internet Archive, the goal of which is to archive all the content on the web. But they have consciously chosen not to archive pornography, not to archive hate sites or anything of that nature. They have taken it on their own initiative to do that. I think, once again, it’s got to be a decision that cannot actually have, through its consequences, the potential to support arguments for censorship or anything of that nature. I believe in government regulations, but it’s a slippery slope, and it doesn’t have a technical solution.
One other thing that I think is quite possible, and a strength of the decentralized web, is provenance. Being able to establish the validity of something, being able to establish the goal of something — if this is coming from a particular site, you understand the probability of it being true. The expansion of those sorts of ideas is something we can do a lot more with.
Rehbock: If I were Disney, and I made a theme park, and I built a ride that encouraged people to sling racial epithets at each other, or misogynistic comments, all across the ride, society would not allow that. Yet we have that, unfortunately, in the digital playgrounds that many developers have set up. That’s a bit scary. Would you consider it censorship? I don’t know. But we’re on a slippery slope there as well.
MacLean: We not only allow it in those digital playgrounds, but we allow it in their surrounding community spaces. We allow it on major social media platforms. I don’t know if anyone else in this room has experienced harassment on Twitter. Did you report it? What happened? It went into the ether. Even when there are policies in place, if the platform doesn’t have the will to enforce it — which is often a business decision — those policies aren’t relevant.
GamesBeat: I subscribe to this notion that the road to hell is paved with good intentions. Who would have thought that the internet would have led to Facebook and Cambridge Analytica? I like these notes of caution, like Kurt Vonnegut’s line about how we are who we pretend to be, so we must be careful who we pretend to be. Technology has a way of being very unpredictable as far as its consequences.
White: I have no doubt that Facebook had great intentions to start off. Somehow they got lost along the way.
Rehbock: Oh, I don’t think they had that great of intentions. Mark Zuckerberg wanted to find a date for the weekend.
White: In any case, the chickens have come home to roost. Bringing people together is not bad. Facebook is not bad. If you’re 75 or 80 years old, that might be an important part of your social interaction. The concept is not bad. But it’s in how it’s allowed itself to be used.

Above: Unity’s game engine can be used to create more than just games.
GamesBeat: Facebook’s concept of using real names seemed like a great idea to hide all the problems of anonymity, but then that allows us to have our privacy invaded.
Radburn: Every major technology along the line has been used in some form or another that wasn’t what was originally intended.
Question: You hit the nail on the head in more ways than you realize, I think, when you talked about accessibility. Each panelist represents an element of accessibility. The cloud, the client, the tools, the platform. What do you think needs to happen for this next computing era to arrive?
White: Certainly one thing that happened in the early days of the web is that it was basically the wild west, without leadership and without standards, and certainly that has led to a lot of its problems. Combine that with the fact that it was pretty naive. People had great aspirations about how it could be used. Nobody had ever dreamt about things like cyberbullying on the web, or even security issues on the web.
It’s a case in point. Even though it seems like a good idea, and there’s the whole issue around the fear of missing out, we have to tread quite carefully in terms of development. Otherwise, the web is a great example of how something can get totally out of hand.
MacLean: There are people who would have thought about cyberbullying on the web. Any woman who’s ever been whistled at as she walks down the street probably could have told you what would happen in chat rooms. Any person who’s had a racial epithet yelled at them could have told you what would happen in a chat room. One of the fundamental flaws in our industry and our community is that we’re not diverse. We don’t have people who’ve had that diversity of experiences at the table and making a point, particularly in the development of new technologies like AI. We know what’s going to happen, because we’ve lived it. That’s something we all need to do better, because otherwise we’ll repeat the same mistakes.
GamesBeat: There’s a good point to be made there about how you can’t see what’s around you when you’re inside a bubble. We have this all the time in Silicon Valley. We have all these great bubbles. We have the esports bubble right now, and it seems like it’s going to be the next great thing and we’ll all make billions of dollars from it. But when that bubble pops, you realize in hindsight what was really happening.
White: I will say that I think the web has a bright future. It’s going through a mid-life crisis now. But I think a lot of people are committed to it. I’m hopeful that it’s going to continue to be a defining force.
I have a 14-year-old kid, and she’s what’s referred to as a digital native. They haven’t known a world without this. Certainly in the case of whatever future that we have, we have to be cognizant of the fact that we can’t necessarily just talk about the next generation. We’re talking about the generation that’s already in it, and how we can change their experience.
Radburn: I think we’re realizing the ramifications of being free and easy with our data. We’re thinking about how to get that genie back into the bottle around things like decentralization and all the rest, so that you can control who has it, who uses it, who monetizes it.
MacLean: I talked about using technology to build emotional connections. I think we do need to do that. I have more than a thousand friends on Facebook, probably too many, but many of them are people who live on the other side of the planet, or who I haven’t seen physically in 20 years, and I feel like we’re still friends. I know about my friend Leslie’s dog Moose and how he’s this great old retired police dog. I know about my friend Emily being named to the New Zealand Game Developers Association.
It gets back to, are we using technology to build those emotional connections and make the world a better place? And often, maybe not often enough, but often, the answer is yes. I’m grateful to live in a world where that’s possible.