The energy needed to power the onboard computers of a worldwide fleet of autonomous vehicles could one day generate as much greenhouse gas emissions as all the data centers in operation today.
That was one of the main conclusions of a recent MIT study that looked at the possible energy use and associated carbon emissions if driverless vehicles were extensively used.
According to the International Energy Agency, data centers, which house the physical computing infrastructure needed to run applications, currently produce about 0.3 percent of the world’s greenhouse gas emissions, or about as much carbon as the country of Argentina does every year.
The MIT researchers built a statistical model to investigate the question after realizing that the potential footprint of driverless vehicles had received little attention. They determined that 1 billion autonomous vehicles, each powered by a computer consuming 840 watts, would use enough energy to generate about the same amount of emissions as data centers do today.
The researchers also found that, in over 90 percent of modeled scenarios, each autonomous vehicle would need to use less than 1.2 kilowatts of power for computing to keep emissions from surpassing current data center emissions, which would require more efficient hardware.
In one scenario, where 95 percent of the world’s vehicle fleet is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate, hardware efficiency would need to double faster than every 1.1 years to keep emissions below those levels. In other words, efficiency gains would have to outpace both the growth of the autonomous fleet and the growth in per-vehicle workload, even with the grid getting cleaner at today’s pace.
“If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start,” says first author Soumya Sudhakar, a graduate student in aeronautics and astronautics.
Sudhakar wrote the paper with her co-advisors Vivienne Sze, associate professor in the Department of Electrical Engineering and Computer Science (EECS) and a member of the Research Laboratory of Electronics (RLE); and Sertac Karaman, associate professor of aeronautics and astronautics and director of the Laboratory for Information and Decision Systems (LIDS). The research appears in the January-February issue of IEEE Micro.
Modeling emissions
The researchers created a framework to investigate the operating emissions from computers in a large fleet of completely autonomous electric vehicles that operate all over the world.
The model depends on the size of the worldwide fleet of cars, the power drawn by each car’s computer, the hours each car is driven, and the carbon intensity of the electricity that powers each computer.
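To make the structure of that calculation concrete, here is a minimal sketch in Python; every number in it is an illustrative placeholder, not a value from the study.

```python
# Minimal sketch of the operating-emissions calculation described above.
# All values are illustrative placeholders, not numbers from the study.

fleet_size = 1e9                    # autonomous vehicles on the road worldwide
computer_power_w = 840              # watts drawn by each vehicle's onboard computer
hours_driven_per_day = 1.0          # hours each vehicle operates per day
carbon_intensity_kg_per_kwh = 0.5   # assumed carbon intensity of the electricity

# Energy consumed by onboard computing across the fleet in a year (kWh)
annual_energy_kwh = fleet_size * (computer_power_w / 1000) * hours_driven_per_day * 365

# Operating emissions from that computing (metric tons of CO2 per year)
annual_emissions_t = annual_energy_kwh * carbon_intensity_kg_per_kwh / 1000

print(f"{annual_emissions_t:.2e} metric tons of CO2 per year")
```

With these placeholder numbers, the fleet’s computing works out to roughly 150 million metric tons of CO2 a year, on the same order as the data center emissions cited above.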
“On its own, that looks like a deceptively simple equation. But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet,” Sudhakar says.
For instance, some research suggests that people may spend more time in cars because they can multitask while an autonomous vehicle drives, and that younger and older populations could drive more. Other research suggests that driving time may decrease because algorithms can find optimal routes that get people to their destinations faster.
In addition to accounting for these uncertainties, the researchers also had to model future advances in computing hardware and software.
They did this by modeling the workload of a popular multitask deep neural network for autonomous vehicles, an algorithm that can perform many tasks at once, and exploring how much power it would consume to process many high-resolution inputs from numerous cameras at high frame rates.
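A rough way to see how such a workload translates into power draw is to multiply the rate of inferences by the energy each inference costs; both figures below are assumed placeholders, not values from the study.

```python
# Rough relation between an inference workload and continuous power draw.
# Both numbers are assumed placeholders for illustration, not study values.

inferences_per_second = 2000    # forward passes the onboard computer runs each second
joules_per_inference = 0.4      # assumed energy cost of one forward pass on automotive hardware

power_watts = inferences_per_second * joules_per_inference
print(f"~{power_watts:.0f} W of continuous computing power")  # ~800 W with these assumptions
```

With these placeholder figures, the draw lands in the same neighborhood as the 840-watt figure cited earlier; sustained high-rate inference adds up to a substantial continuous electrical load.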
When they used the probabilistic model to explore different scenarios, Sudhakar was surprised by how quickly the algorithms’ workload added up.
For instance, if an autonomous vehicle has 10 deep neural networks processing data from 10 cameras and drives for one hour a day, it will make 21.6 million inferences each day. One billion such vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
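That arithmetic can be reproduced directly. In the sketch below, the frame rate and the assumption that every network processes every camera feed are illustrative choices that make the per-vehicle figure come out to the value quoted, not details from the study.

```python
# Reproducing the inference count quoted above. The frame rate and the
# assumption that each network processes every camera feed are illustrative.

cameras = 10
networks = 10                   # deep neural networks onboard each vehicle
frames_per_second = 60          # assumed camera frame rate
driving_seconds_per_day = 3600  # one hour of driving per day

per_vehicle_per_day = cameras * networks * frames_per_second * driving_seconds_per_day
fleet_per_day = per_vehicle_per_day * 1_000_000_000   # one billion vehicles

print(f"{per_vehicle_per_day:,} inferences per vehicle per day")  # 21,600,000
print(f"{fleet_per_day:,} inferences across the fleet per day")   # 21.6 quadrillion
```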
“After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people’s radar. These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time,” Karaman says.
“Autonomous vehicles would be used for moving goods, as well as people, so there could be a massive amount of computing power distributed along global supply chains,” he says. And their model only considers computing; it doesn’t take into account the energy consumed by vehicle sensors or the emissions generated during manufacturing.
Keeping emissions in check
To keep emissions from spiraling out of control, the researchers found that each autonomous vehicle needs to consume less than 1.2 kilowatts of power for computing. For that to be possible, computing hardware must become more efficient at a significantly faster pace, doubling in efficiency about every 1.1 years.
One way to boost that efficiency could be to use more specialized hardware, which is designed to run specific driving algorithms.
“Because researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks,” Sudhakar says. But vehicles tend to have 10 or 20-year lifespans, so one challenge in developing specialized hardware would be to “future-proof” it so it can run new algorithms.
Researchers may improve the algorithms in the future so that they consume less computational power. This is difficult, though, because increasing efficiency at the expense of precision could compromise vehicle safety.
Now that they have established this framework, the researchers plan to keep investigating hardware efficiency and algorithm improvements. They also say their model could be refined by characterizing the embodied carbon of autonomous vehicles, that is, the emissions generated by manufacturing the vehicles, as well as the emissions from a vehicle’s sensors.
While there are still many scenarios to explore, the researchers hope that this work sheds light on a potential problem people may not have considered.
“We are hoping that people will think of emissions and carbon efficiency as important metrics to consider in their designs. The energy consumption of an autonomous vehicle is really critical, not just for extending the battery life, but also for sustainability,” says Sze.
This research was funded, in part, by the National Science Foundation and the MIT-Accenture Fellowship.