The highlights and memorable moments of GDC 2017

Palestinian game maker Rasheed Abueideh drew a standing ovation.
Image Credit: Dean Takahashi

Each year, developers and aspiring game creators flock to the Game Developers Conference in San Francisco like they are going to the well. It’s inspiring, and it widens their horizons. It has the same effect on me, as a game journalist. Each year, there are some unforgettable moments. And here are some that made an impression on me during the past week.

#1ReasonToBe

This session was, sadly, the only panel that I attended in full during the show. (For me, GDC has become more about press events, game demo appointments, and interviews, which is a little more corporate and less serendipitous.) But I chose wisely in attending this panel about diversity. Organized by Rami Ismail of Vlambeer, the session focused on geographic diversity.

Rasheed Abueideh started the session by saying he is from a place that doesn’t show up on Google Maps. What is it like to come from a place where you do not want to put your name in a game’s credits? Abueideh said he is from Palestine, which he spelled P@L!$T!NE.

“It has been suffering from occupation for 69 years,” he said. As he said that, I thought he was going to go into a political rant about the Israeli-Palestinian conflict. But he didn’t do that. He showed a sense of humor and humanity in his presentation, and that made it very moving.

Above: An image from Palestine.

Image Credit: Rasheed Abueideh

“The Palestinian people love playing games,” he said. Then he showed a picture of a giant, real slingshot, saying, “This is how we play Angry Birds.” He showed himself taking a selfie at a checkpoint where people were queued like cattle. He said, “This is Papers, Please,” referring to the award-winning game about a border control agent. But he added, “They don’t say please.”

He said he wanted to make games from early on, but the conditions were very difficult. He said the country doesn’t have access to 3G networks because “we don’t have permission.” There is no access to funding. He started a game division at his company, but it shut down for lack of money. No one from the outside world comes to visit and court developers.

“We are isolated,” he said. “No one comes to us.”

When the Gaza war broke out in 2014, Abueideh said he imagined what would happen if his kids were killed in that war. He said there were many such deaths in real life. His team banded together and made a game, Liyla & the Shadows of War.

“How do we make games if we are living as survivors?” he said.

Above: Liyla & The Shadows of War.

Image Credit: Rasheed Abueideh

They submitted it to Apple’s App Store, and it was rejected because it was based on a “political statement.” Apple is skittish about publishing anything controversial. There were also false reports that the game was made by the Islamic State.

“By the way, I am not ISIS,” Abueideh said.

He said this community, this industry, was amazing, and he got support from around the world. The game was eventually approved, and it got 700,000 downloads on both platforms. The title wasn’t particularly polished, but it made a point about children in war.

When he told his father what he was doing, his father replied, “My son, you will end up in jail.” He said his wife was terrified as well, and she advised him not to put his name on the game, even after he had worked on it for two years.

“I am honored to be here, but I am not happy,” he said.

Abueideh said his No. 1 reason to be in the game industry was to make games that could stop wars and stop more children from being killed. The game did go out with his name in the credits. He got a standing ovation.

Politics at the game awards

Above: Nina Freeman at the IGF Awards.

Image Credit: Dean Takahashi

In some ways, I was surprised that Abueideh was allowed to travel to the U.S. for the talk. President Donald Trump’s travel ban isn’t as sweeping as you might expect. You could say that politics has no place at an event like GDC, which celebrates games. And that was the attitude I found prevailing at the DICE Summit and DICE Awards.

But at the Game Developers Choice Awards, Trump’s travel ban was fair game. Nina Freeman, a game developer and host of the Independent Games Festival Awards, drew her loudest applause of the night on Wednesday as she expressed sympathy for those “affected by the Muslim travel ban.”

A number of game developers said they could no longer travel to the U.S. for the Game Developers Conference in San Francisco because of the travel ban. Thousands of people in the audience applauded. Behind me, one table stood up and shouted, “Fuck Trump,” over and over.

Her outspoken comment was a contrast to the DICE Awards, where none of the recipients mentioned Trump or the travel ban.

“I urge you to donate to the ACLU,” Freeman said.

Were all of the developers against Trump? No. I had a two-hour conversation with a CEO who felt that the left was going too far in comparing Trump to Hitler, who was a terrorist of the first order before he took office.

GDC Ambassador Award

After Nina Freeman spoke, Mark DeLoura, the former White House adviser on digital media, received the Ambassador Award for his work making government and the wider population more aware of games and the good impact that can come from being a gamer or game developer.

Above: Mark DeLoura, former White House adviser on digital media.

Image Credit: Dean Takahashi

DeLoura advocated for a better understanding of the game industry within the federal government, and at the awards show, he made it clear that he believes games can be a force for social good.

DeLoura said, “People at the White House knew me as the guy with the big fat Pikachu at my desk.” Among his accomplishments: He organized the first game jam at the White House.

In his acceptance speech, DeLoura praised the diversity of skills and creative inspiration that drives game developers to create original works of art. He also lauded how games help give voices to everyone, including “the voices of women, gay and transgender individuals, underrepresented minorities, veterans, the elderly, the homeless, and, of course, immigrants and refugees.”

He admired games like 1979 Revolution, That Dragon, Cancer, We Are Chicago, Never Alone, and Paolo Pedercini’s great web games such as Unmanned.

“Yet so many of us are building our next shooter, our next 4X, our next match-three…. And those are great games, we love those games, billions of people love those games. Bigly. Nobody loves them more than I do. It’s true. But if we think that games can be more, do more, have a broader impact – and we’ve seen great examples of that – isn’t it incumbent upon all of us, in this room, to drive that?

“If we don’t stand up for the rights of others, we’ll never hear their voices. If we don’t make games easier to build, we’ll never see the amazing innovation of people using them in new ways. If we don’t start by using our own voice… how can we possibly show others the way? By putting our own hearts out there. Our own passion out there. Taking risks.

“These are difficult times for many. We have unique skills that we can use to help. How are you using your unique abilities — your superpowers, making games — to share with the world your perspective? How are you taking risks to use games in new ways? How are you helping raise the voices of others? If you do that, you will change things. You will change the world. Through games. So please. We need your expertise. We need your voice. Start today.”

Qualcomm’s VR headset prototype

Above: Dean Takahashi tries out the Qualcomm VR headset at GDC 2017.

Image Credit: Dean Takahashi

Yes, GDC wasn’t all about politics. It had cool technology as well. I saw Qualcomm’s prototype virtual reality headset with stand-alone wireless functionality. The headset, which Qualcomm vice president Tim Leland showed me and which is intended for developers, had no wires connecting it to a PC.

Rather, it had Wi-Fi built into a Snapdragon 820 processor. (The production version coming this year will use the newer Snapdragon 835.) That made it smaller and less bulky than existing headsets like the HTC Vive. And to reiterate, you don’t hook the headset to a PC. All of the processing power and wireless connectivity is built into what will be an ARM-based Snapdragon processor in the headset itself.

It was battery-powered, but I didn’t get any details on the type of battery or how long it lasts. It also had a somewhat large bar in front of the eyes with a couple of sensors, as part of its inside-out sensing design. (Inside-out means the sensors are on the headset, not on devices set up around the room.)

Above: Tim Leland of Qualcomm shows off the wireless VR headset.

Image Credit: Dean Takahashi

Those sensors were able to detect my hands and the movements of my 10 fingers. It used controller-free hand tracking from Leap Motion. That meant I didn’t have to hold a touch controller in my hands. The headset ran Leap Motion’s demo software. With it, I was able to pick up and create blocks. I could toss them around and look at my fingers in virtual reality. The resolution was good, though the tracking wasn’t absolutely precise. It was still a little hard to pick up objects, but mainly because I had no force feedback, or sense of touch, to tell me when I had touched a block.

The Leap Motion software has been integrated into Qualcomm’s reference design kit, which other companies can take and put their brand names on, if they wish. Leland said hand tracking is optional for other companies to include, but the sensor for it isn’t that expensive. It adds very little weight, so including it is kind of a no-brainer for hardware companies.
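
For readers curious about what controller-free hand tracking looks like from the developer’s side, below is a minimal sketch that reads palm and fingertip positions with the classic desktop Leap Motion C++ SDK. It only illustrates the kind of per-frame data the tracking exposes; it is not the embedded pipeline Leap Motion and Qualcomm built for the Snapdragon reference design, and the grab-strength threshold mentioned in the comments is my own assumption.

```cpp
// Minimal sketch: polling hand and fingertip data with the desktop Leap Motion
// C++ SDK (v2.x). Illustrative only; not Qualcomm's embedded integration.
#include <iostream>
#include "Leap.h"

int main() {
    Leap::Controller controller;  // connects to the Leap Motion tracking service

    // Poll the latest tracking frame. A real application would register a
    // Leap::Listener and react to onFrame() callbacks instead of spinning.
    while (true) {
        const Leap::Frame frame = controller.frame();
        for (Leap::Hand hand : frame.hands()) {
            const Leap::Vector palm = hand.palmPosition();  // millimeters, sensor-relative
            std::cout << (hand.isLeft() ? "Left" : "Right")
                      << " palm at (" << palm.x << ", " << palm.y << ", " << palm.z
                      << "), grab strength " << hand.grabStrength() << "\n";

            // grabStrength() runs from 0.0 (open hand) to 1.0 (fist). A demo like
            // the block-stacking one could treat a value above ~0.8 as a grab
            // (that threshold is an assumption, not taken from the demo).
            for (Leap::Finger finger : hand.fingers()) {
                const Leap::Vector tip = finger.tipPosition();
                std::cout << "  fingertip at (" << tip.x << ", " << tip.y
                          << ", " << tip.z << ")\n";
            }
        }
    }
    return 0;
}
```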

The Qualcomm headset was smaller than Intel’s own prototype, dubbed Project Alloy. That device has the capability of a high-end laptop, with the ability to run VR apps at 90 frames per second. The Snapdragon 835 is also powerful, but I don’t know what kind of performance it will deliver. Qualcomm’s device was also much lighter and sat less heavily on the head.

The developer headset had a four-megapixel (2560×1440) WQHD AMOLED display (two megapixels per eye) and six degrees of freedom (6DoF) motion tracking. It had two monochrome, one-megapixel (1280×800) global-shutter cameras, four gigabytes of LPDDR4 DRAM, and 64 gigabytes of UFS flash storage. For connectivity, it had Wi-Fi, Bluetooth, and USB 3.1 Type-C (for power). For audio, it had Qualcomm’s Aqstic audio codec (WCD9335), and for input it also had a trackpad on the right side of the headset.

The best thing about this prototype is that it told me that the second generation of VR hardware isn’t so far away. That’s a good thing, because the first generation hasn’t gotten enough traction to sustain a large number of VR startups and game companies.

Epic’s real-time reskinning of a race car

Above: The Mill’s Blackbird is a car that can be reskinned with game animations.

Image Credit: Epic Games

Epic Games, visual creation company The Mill, and carmaker Chevrolet showed a new tool for filmmakers that blends computer animation and the real world. It was one more step toward a mixed reality where we can’t tell the difference between what is animated and what is real.

The demo showed how Chevy and The Mill used the Unreal game engine to blend an animated car in real time into a video that will be part of a television advertising campaign.

The film makes use of The Mill’s special adjustable car, dubbed the Blackbird. The race car has QR codes and markers that allow it to be used as a template for digital animations. It also has a 360-degree camera. Filmmakers shot video of a real car and the Blackbird racing on a mountain road. Then, in real time, the animators covered the Blackbird with digital animations that transformed it into another sleek race car. As they shot the Blackbird, the filmmakers could see through a viewfinder what the finished animated car would look like in the scene.

Epic Games CEO Tim Sweeney (who will be a speaker at our GamesBeat Summit 2017 event in May) said in an interview with GamesBeat that the creators used an advanced implementation of Epic’s Unreal Engine with The Mill’s proprietary virtual production toolkit, Mill Cyclops, to create a film that merges real-time visual effects and live-action storytelling.

“This is a way to bring down the costs of animation,” Sweeney said. “Brands can’t always have a car on the set. Sometimes that’s due to security reasons. We had to find a way to visualize a car that wasn’t there. So we created a virtual car using the Blackbird. We used the Blackbird in shoots and then skinned it with a computer-generated car.”

The ad campaign, “The Human Race,” is about a real-life driver’s battle with an artificial intelligence that drives a car without human control. The film features the 2017 Chevrolet Camaro ZL1 in a heated race with the Chevrolet FNR autonomous concept car.

“This sounded crazy, but we figured out how to do this in real time,” said Kim Libreri, chief technology officer of Epic Games, in an interview. “It’s real-time visual effects. We generate cars, composite them, and put them in a video. This looks like it went through months of post-production processing. In reality, it is rendered in real time.”

Sweeney said, “The question here is can your game engine produce pixels that match the real world.”

The only physical vehicle filmed for “The Human Race” was the Mill Blackbird, a fully adjustable rig that enables filmmakers to insert any car model into any filmed environment. Until now, computer-animated cars were added by visual effects artists in post-production, requiring days of rendering to produce high-quality imagery. During this shoot, however, live video feeds as well as positional data from the tracking system were fed directly into Unreal Engine. The Camaro was then rendered and composited seamlessly into the footage in real-time AR, allowing the directors to instantly see the final look and composition of each shot.
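
To make that pipeline a little more concrete, here is a schematic sketch of a real-time compositing loop of the kind the paragraph describes: grab a live frame, read the tracking data, render the CG car from the matching viewpoint, and alpha-composite it over the plate before the next frame arrives. Every type and function in it is invented for illustration; this is not Epic’s Unreal Engine API or The Mill’s Mill Cyclops toolkit.

```cpp
// Schematic sketch of a real-time virtual-production compositing loop.
// All names here are invented stand-ins, not Unreal Engine or Mill Cyclops APIs.
#include <cstdint>
#include <vector>

struct Pose  { float position[3]; float rotation[4]; };  // tracked Blackbird pose
struct Frame { int width; int height; std::vector<uint8_t> rgba; };

// Stand-ins for the live inputs and the renderer.
Frame capture_live_plate() {                      // video feed from the set camera
    return Frame{1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4, 0)};
}
Pose read_tracking_data() { return Pose{}; }      // positional data from the rig's markers
Frame render_cg_car(const Pose & /*carPose*/) {   // engine renders the CG car with alpha
    return Frame{1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4, 0)};
}

// Alpha-composite the rendered car over the live plate so the director sees a
// finished-looking shot while the Blackbird is still in front of the camera.
void composite_over(Frame &plate, const Frame &cg) {
    for (size_t i = 0; i + 3 < plate.rgba.size(); i += 4) {
        const float a = cg.rgba[i + 3] / 255.0f;  // CG alpha, 0..1
        for (int c = 0; c < 3; ++c)
            plate.rgba[i + c] = static_cast<uint8_t>(
                a * cg.rgba[i + c] + (1.0f - a) * plate.rgba[i + c]);
    }
}

int main() {
    for (int frameIndex = 0; frameIndex < 300; ++frameIndex) {  // ~10 seconds at 30 fps
        Frame plate = capture_live_plate();
        Pose carPose = read_tracking_data();
        Frame cg = render_cg_car(carPose);        // must finish within the frame budget
        composite_over(plate, cg);
        // "plate" now holds the final-pixel image shown live to the directors.
    }
    return 0;
}
```

The point of the sketch is simply that rendering and compositing must fit inside a single frame’s time budget, which is what makes the “final pixels on set” workflow possible.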

Above: These cars have been reskinned in real time.

Image Credit: Epic Games

The same real-time technology was used to create, alter and produce the short film, blurring the lines between production and post. The ability to create ‘final pixels’ in real time will ultimately change the way filmmakers create content and make critical decisions, Sweeney said.

Alistair Thompson, executive vice president at The Mill, said at Epic’s event that his company created the Blackbird because it couldn’t get access to a car that it needed to shoot a commercial. Chevy’s commercial creators, however, needed to be able to see the animated car through a viewfinder as they shot a scene. On stage, Epic, Chevy, and The Mill showed how they could present an animated car onscreen even as the Blackbird was being filmed in front of the crowd.

Sweeney said, “This technology is changing the economics of entire industries.”

Sweeney also showed how you could use virtual reality to design a game, from inside the game. The Unreal Engine VR editor is a full editing tool that lets you build an entire level of a game from inside virtual reality. It’s a bit like building a full game scene by snapping together Lego bricks and then twisting them into the right shape with your hands, the way a real artist does.