
Intel self-driving cars: Does anyone want an autonomous Ferrari?

Kathy Winter, vice president and general manager of autonomous driving at Intel.
Image Credit: Dean Takahashi



For the past 18 months, Kathy Winter has been vice president and general manager of autonomous driving at Intel. In car time, that seems like a long while, especially as self-driving cars were one of the biggest themes at CES 2018, the big tech trade show in Las Vegas last week.

I attended Brian Krzanich’s keynote speech at the opening of CES, and then I spoke with Winter the next day. Krzanich said that Intel is building a fleet of 100 autonomous cars to begin testing self-driving car software in partnership with BMW, Volkswagen, and Nissan. Mobileye, which Intel bought for $15.3 billion, rolled out a self-driving car with 12 cameras on stage. Intel even teamed up with Ferrari — not for self-driving cars, but to capture immersive video of races.

I talked with Winter about competition in the self-driving car business and how quickly autonomous cars are going to enter our lives.

Here’s an edited transcript of our interview.




Above: Mobileye-powered car on stage at Intel keynote at CES 2018.

Image Credit: Intel

VentureBeat: It was a strong keynote last night.

Kathy Winter: You’ve seen a few over the years.

VB: It didn’t seem like any one subject got quite enough time, though.

Winter: Yeah, there are so many things going on. I heard that from a lot of people. “It seemed long, but it could have been longer.” You could spend a lot of time on each of those topics. We put a lot of the messaging in the booth, as opposed to on the stage last night, beyond just the big announcements. And then we had a press conference this morning that reinforced a lot of what was talked about. There’s a lot going on, between the Mobileye booth and the Intel booth.

Above: Panasonic concept cockpit for a self-driving car at CES 2018.

Image Credit: Dean Takahashi

VB: I saw Panasonic’s booth. They had three or four different concepts for driver cockpits.

Winter: On the infotainment side?

VB: Yeah, concept cars, what they’re going to look like.

Winter: People are focusing very much on sensors today — getting the data out, getting consumers to trust it, getting them into the vehicle. You probably saw the displays we had, where you can get in. Have you been in an automated vehicle?

Above: Intel self-driving car video

Image Credit: Intel

VB: I’ve been in one from Phantom Auto, and the Delphi car.

Winter: Right, so you’ve done it. But most people don’t have that access. We actually have a big thing set up in the booth so people can get in there and see what it’s like, see what the sensors see, experience what you did in the real vehicle. It’s the whole HMI and getting consumers to trust that it’s safer. Even though we know technically that it’s safer, getting people to feel comfortable is something else.

VB: The Phantom Auto thing was interesting. They think that the AI is going to get 99 percent there, and then it won’t be good enough to cover that final one percent. So their thinking is, why not have a human driver overseeing and watching remotely to see if the car needs to be taken over at some point? There’s a backup remote driver who has video game equipment, essentially — a steering wheel and pedals — to take over the car in real time.

Winter: I’ve heard some of the fleet operators talk about that, more than consumer vehicles. From a fleet perspective, if you’re running a fleet — we think some of those will be the biggest use cases for automated vehicles early on, and it’s attractive to them. But when I think about consumer vehicles and this evolution from what we have today along the way to fully automated, a lot of what Amnon’s been talking about with this new RSS model — have you seen that? The Responsibility Sensitive Safety model. Basically, it’s a safety model that lets us validate that the car won’t cause an accident, without requiring the driver to re-engage. Can you make it foolproof and have it drop back to that safe state every time?
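For readers curious about the math, here is a minimal sketch of the longitudinal safe-distance rule at the heart of RSS, following Mobileye’s published paper on the model. The parameter values are illustrative assumptions, not Mobileye’s calibrated numbers.

```python
# A minimal sketch of the RSS longitudinal safe-distance rule, following
# Mobileye's published paper ("On a Formal Model of Safe and Scalable
# Self-driving Cars"). All parameter values are illustrative assumptions.

def rss_safe_longitudinal_distance(
    v_rear: float,              # rear (ego) vehicle speed, m/s
    v_front: float,             # front vehicle speed, m/s
    rho: float = 0.5,           # assumed response time, s
    a_accel_max: float = 2.0,   # worst-case acceleration during response, m/s^2
    a_brake_min: float = 4.0,   # minimum braking the ego guarantees, m/s^2
    a_brake_max: float = 8.0,   # maximum braking the front car may apply, m/s^2
) -> float:
    """Minimum gap so the rear car cannot cause a rear-end collision."""
    # Worst case: the ego accelerates for the whole response time,
    # then brakes at its guaranteed minimum, while the car ahead
    # brakes as hard as physics allows.
    v_after_response = v_rear + rho * a_accel_max
    gap = (
        v_rear * rho
        + 0.5 * a_accel_max * rho ** 2
        + v_after_response ** 2 / (2 * a_brake_min)  # ego stopping distance
        - v_front ** 2 / (2 * a_brake_max)           # front stopping distance
    )
    return max(0.0, gap)

# Both cars at highway speed (30 m/s, about 67 mph):
print(f"{rss_safe_longitudinal_distance(30.0, 30.0):.1f} m")  # ~79.1 m
```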

VB: The theory was that the driver wouldn’t be able to do that, so the remote person is tasked with paying attention.

Winter: If the logic in the car is good enough, though, and we get to the point where the driving policy and decision-making is strong enough, the car should be able to make those decisions without intervention from a human. But in the meantime, especially with some of those early fleets, it’s an interesting thought that some of these companies are throwing out to watch their fleets.

Most consumers don’t like the idea of somebody coming in to control their vehicle for them, right? If the vehicle’s part of a fleet, fine, but there’s a question of whether the typical consumer will stand for that.

Above: Shai Magzimof, CEO and cofounder of Phantom Auto, shows the company’s remote-driving station.

Image Credit: Dean Takahashi

VB: One thing they did was restrict the human takeover to 25-mile-per-hour situations. Driving in San Francisco would be hard for a self-driving car, but it’s slow enough that maybe the remote human could take it over. It was a very specialized situation.
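To make the idea concrete, here is a minimal sketch of the gating logic such a remote-fallback scheme implies: autonomy by default, remote takeover only at low speed with a healthy link, and a safe stop otherwise. Phantom Auto has not published its protocol, so every name and threshold here is a hypothetical assumption.

```python
# Illustrative sketch of a remote-takeover gate like the one described
# above: the AV drives itself, and a remote operator may take over only
# in low-speed, low-confidence situations. All names and thresholds are
# hypothetical; this is not Phantom Auto's actual protocol.

from dataclasses import dataclass

MPH_TO_MS = 0.44704
MAX_TAKEOVER_SPEED = 25 * MPH_TO_MS  # remote driving capped at 25 mph
MIN_PLANNER_CONFIDENCE = 0.99        # below this, ask for human help
MAX_LINK_LATENCY_MS = 100.0          # stale video makes teleoperation unsafe

@dataclass
class VehicleState:
    speed: float               # m/s
    planner_confidence: float  # 0..1, self-reported by the driving stack
    link_latency_ms: float     # round-trip time to the operator station

def control_authority(state: VehicleState) -> str:
    """Decide who should be driving right now."""
    if state.planner_confidence >= MIN_PLANNER_CONFIDENCE:
        return "autonomy"  # the 99 percent case
    if (state.speed <= MAX_TAKEOVER_SPEED
            and state.link_latency_ms <= MAX_LINK_LATENCY_MS):
        return "remote_operator"  # slow enough to hand to the remote driver
    return "minimal_risk_maneuver"  # otherwise stop or pull over safely

# A slow, confused car with a good link goes to the remote operator:
print(control_authority(VehicleState(8.0, 0.42, 60.0)))
```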

Winter: It’s interesting. If you look at level one, level two, level three, level four, you’re really talking about those shades where, at the earlier levels, there’s more and more automation coming in. A lot of what Amnon talked about today, bringing in REM and HD mapping to improve even those level two vehicles out there today — things like Highway Pilot and Lane Departure. You could improve features like that even though you still have a driver in the seat. They can relax a bit but still be there to take over.

And then when you get to level three, you have to leave some time for re-engagement. By definition, there we’ve said that we’re going to trust the car in certain scenarios — on the highway, maybe in an urban area that’s easier to navigate, but not necessarily in every situation. I think it’s consistent with that whole thought process.

VB: What do you think of the whole arms race between Intel and Nvidia on the chip side?

Winter: In general, I would say that now that we’ve combined with Mobileye and brought in their EyeQ technology, combining it with what we have from our Atom-based Denverton platform, the result is a really strong competitive solution. We’ve been able to bring in Mobileye’s strengths in computer vision and acceleration and use our Atom processors, or eventually even Xeon processors, which are top-notch in the CPU space. The combination is really powerful.

We’ve been able to bring together the strengths of both of these companies and still have some flexibility. We can move workloads around. What we’re finding with the OEMs is that some like to do things architected one way, and some like another. It’s given us this great flexibility to have some choice about where to put those things and how to best optimize across those platforms that we didn’t really have before.

VB: Does it make sense that, say, one’s going from the CPU side and one’s going from the GPU side? Is the best thing ultimately going to be someone who does ground-up AI processing?

Winter: We’re getting ready to put our proof points out in a few hours in the newsroom. It includes some performance comparisons. We’ve been very heavily trying to look at the combination of — if you look here, this is Nvidia’s Xavier. When you talk about AI and deep learning tasks, that’s one measurement, and then there’s the total compute capacity. The other thing that’s important is power consumption. If you want a lot of deep learning operations, that comes at the cost of a ton of wattage. In a lot of cases, we’re trying to put this in an electric vehicle. They don’t have watts to spare. They have a budget. Heat is another thing. If this needs water cooling and the solution has to be this big to keep it cool, where do you put it?
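As a back-of-envelope illustration of the tradeoff Winter describes, here is the TOPS-per-watt arithmetic using the figures the two vendors cited publicly around CES 2018 (roughly 30 TOPS at 30 W for Xavier, 24 TOPS at 10 W for EyeQ5). Treat the numbers as assumptions for illustration, not measured benchmarks.

```python
# Back-of-envelope version of the efficiency comparison above. Figures are
# the vendors' publicly cited numbers circa CES 2018, used here only as
# illustrative assumptions, not as measured benchmarks.

chips = {
    "Nvidia Xavier":  {"tops": 30.0, "watts": 30.0},
    "Mobileye EyeQ5": {"tops": 24.0, "watts": 10.0},
}

POWER_BUDGET_W = 60.0  # hypothetical compute power budget in an EV

for name, c in chips.items():
    efficiency = c["tops"] / c["watts"]           # TOPS per watt
    tops_in_budget = efficiency * POWER_BUDGET_W  # deliverable within budget
    print(f"{name}: {efficiency:.1f} TOPS/W, "
          f"{tops_in_budget:.0f} TOPS in a {POWER_BUDGET_W:.0f} W budget")
```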

Above: Intel has 12 Mobileye cameras on its self-driving cars.

Image Credit: Intel

VB: An analyst mentioned to me that this might be one Nvidia chip against three Intel chips. The cost might be something to consider, as well.

Winter: We’re not really talking price yet. One thing we’ve spent a lot of time benchmarking recently is understanding how important the deep learning really is — I would tell you, from an Intel perspective and a Mobileye perspective, not everything you do in the vehicle requires that kind of deep learning and AI compute. Some things run really well on a CPU. With the combination of the two, you can take advantage of the stronger CPU where it makes sense.

We can do different combinations. Our solution is really two Atoms on a Denverton platform, combined with the EyeQ. It gives you the flexibility to optimize. That’s the architecture we’re going for versus, in the case of Xavier, you just have to keep adding Xaviers that all do the same thing. You don’t get that opportunity to balance and take advantage of some tasks in the vehicle that really benefit from high performance at low wattage. If you look at the wattage on these two, this one is hot. That’s going to be a challenge.

So that’s how we’re approaching it. Low wattage and optimizing across what needs to be true AI and what can be taking advantage of CPU capabilities, which we know really well. And the deep learning piece, which of course is very strong.

VB: Nvidia has been saying GPU, GPU, GPU, and then they’ve learned that this might be more AI-specific. Do you think you’re learning the same thing?

Winter: I think we’re saying “combination.” Both the CPU and the combination with the accelerator, having that flexibility to move workloads across. Take sensor fusion, for example. Things like fusion and driving policy fit really well on the accelerator. But when it comes to doing all the transactions that communicate with the actuation systems, for example, sending out all of those commands, the CPU can do that really fast.
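A minimal sketch of the workload split Winter describes, with deep-learning-heavy perception and policy placed on the vision accelerator and transactional work kept on the CPU. The task names and placement rules are illustrative assumptions, not Intel’s actual scheduler.

```python
# Sketch of the heterogeneous workload placement described above:
# deep-learning-heavy tasks go to the vision accelerator, while
# latency-sensitive transactional tasks stay on the CPU. Task names
# and the split are illustrative assumptions, not Intel's scheduler.

ACCELERATOR_TASKS = {"camera_perception", "sensor_fusion", "driving_policy"}

def place(task: str) -> str:
    if task in ACCELERATOR_TASKS:
        return "eyeq_accelerator"  # benefits from deep-learning acceleration
    return "atom_cpu"              # actuation commands, bus I/O, monitoring

for task in ("sensor_fusion", "driving_policy", "actuation_commands"):
    print(f"{task} -> {place(task)}")
```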

Above: Inside a self-driving car with Mobileye technology.

Image Credit: Intel

VB: The bridge silicon you guys are doing, like with AMD — the semi-custom AMD chip with the Intel CPU that’s going into some of the laptops — is that similar work?

Winter: I would say it’s not quite the same thing. These are very specifically designed for automotive, because we’re putting these Atom chips through all the additional, incremental functional safety work that has to happen with an automotive-grade processor. We’re very selective. We’re not just picking up random chips from the Intel family, because we want to be very targeted. We want ones that can meet those automotive standards.

VB: Where do you end up being optimistic about what can be technically achieved?

Winter: We’re super optimistic in that we’re building this fleet of 100 vehicles — I’m sure you heard about that — to test out a couple of things. We want to get a total solution package on the road. We have partners we’re working with, but with our own fleet we can start doing our own validation on the driving policy. Also, we can start looking at the industry-wide safety standards and validation to ensure that all vehicles are safe — not just ours, but across the industry.

I’m really optimistic about getting our own fleet, in addition to our partners’ fleets, so we can accelerate that learning and start focusing on a commercially viable solution — something that builds on what’s out there today, like level two, like adding all that camera data in to help crowdsource your maps. Technologies that have traditionally been talked about around full autonomous driving can start being laid down now to help the more middle-class cars or platforms — the more economical ones, as opposed to the high end — take advantage of some of these lower-cost safety technologies without having to wait all the way until level four or level five. The opportunity is huge, once you get these cars connected and get all that camera data coming in.
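The crowdsourcing idea behind REM-style mapping, where many production cars report sparse camera landmark sightings that a backend fuses into a map, can be sketched in a few lines. The data model below is purely hypothetical and is not Mobileye’s actual REM format.

```python
# Hypothetical sketch of crowdsourced map building: many cars report
# sparse landmark sightings, and the backend averages them into a map
# tile. The data model is invented for illustration; it is not REM.

from collections import defaultdict

# (landmark_id, x, y) sightings reported by different level two cars
reports = [
    ("sign_17", 104.2, 55.1),
    ("sign_17", 104.5, 54.8),
    ("sign_17", 103.9, 55.3),
    ("lane_edge_3", 88.0, 12.1),
    ("lane_edge_3", 88.4, 11.9),
]

sums = defaultdict(lambda: [0.0, 0.0, 0])
for landmark, x, y in reports:
    s = sums[landmark]
    s[0] += x
    s[1] += y
    s[2] += 1

# Consensus position is the mean of all sightings; more cars reporting
# means a tighter estimate, which is the point of crowdsourcing.
map_tile = {lid: (sx / n, sy / n) for lid, (sx, sy, n) in sums.items()}
print(map_tile)
```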

Above: Intel is working with Ferrari to capture race car performance.

Image Credit: Intel

VB: I talked to another company called Vayyar that has a 3D imaging radio sensor technology. It does some of the same things cameras do. Are there different kinds of sensors that might be useful for you?

Winter: I’m not familiar with that. I think the one thing I like about this camera-first approach — when I look at Mobileye, for example, they’re on their fifth generation. They have this down. You’re not taking an entirely new technology and putting it at the heart of something that’s already a very tricky problem. We’re still architecting in radar and lidar, for example, as safety checkpoints, but the nice thing about camera-first — one, it’s economical. You can have lots of them. And we’re on the fifth generation now. That’s an important piece as far as trust.

In the same way, you’re going to see lidar going from — for example, it’s mechanical today. You have all these people in pursuit of solid state, higher quality, higher longevity, at a reasonable cost. Every time I see one of these new sensors, I have that same thought. It needs to go through the same process radar did. Radar started out big and super expensive. Now you can go over to the Aptiv booth and see these tiny little things. You can put them all around the car now at a reasonable cost.

Lidar is going to come down that same curve, and in general, as new sensors come out, we’re going to — again, part of having our own fleet is we’re going to keep mixing the sensors and trying new things out. I can’t speak to that one in particular, but in general we’ll keep looking at new things. That’s part of why we want to build our own fleet, so we can change things out as new technologies come in, because there will be good ones. There have to be, the way things are going.

VB: It sounds like Ferrari wasn’t part of this AI stuff? I guess nobody really wants a self-driving Ferrari.

Winter: I will say that — it’s funny. I blog about automated driving constantly. It goes back and forth. No matter what I write, there are always the folks in the crowd writing in and saying, “I love my car. I love my sports car. I will never give it up.” We have to keep promising them, “Don’t worry. You won’t have to.” They must have better cars than I do.

The other thing that’s very polarizing is getting people to trust it. We still have a ton of people who say, “I don’t trust it.” I’m blogging about, “Hey, my son’s driving now. My parents probably shouldn’t be driving. I completely trust it.” But I always get people voicing their opinions on the blogs who need to get over the trust factor.