SDS 722: AI Emits Far Less Carbon Than Humans (Doing the Same Task)

Podcast Host: Jon Krohn

October 13, 2023

This week, Jon Krohn unpacks a research paper comparing the carbon footprint of AI-driven writing and illustrating to that of humans performing the same tasks. Surprisingly, AI emerges as the lower carbon emitter, prompting a discussion of its broader ecological implications.

Ever wondered how AI-driven writing and illustrating measure up against human carbon footprints?
When it comes to writing, BLOOM, an open-source LLM, produces around 1.6 g of CO2 per query, whereas GPT-3 edges slightly higher at 2.2 g per query. For context, using a laptop in the US to write a page of text emits about 27 g of CO2, roughly 12 times the emissions of GPT-3; switch to a desktop, and emissions surge to approximately 72 g, about 32 times GPT-3's output. As for illustration, Midjourney clocks in at 1.9 g of CO2 per image, and DALL-E 2 matches GPT-3 at 2.2 g per image. But here's the surprising part: a human illustrator working for 3.2 hours on a laptop in the US would release about 100 g of CO2, roughly 45 times DALL-E 2's figure, and on a desktop that number climbs to 280 g, around 127 times DALL-E 2's output.
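The multiples quoted above follow directly from dividing the human per-task estimates by the AI per-query estimates. As a quick sketch (figures taken from the episode; truncating to whole numbers reproduces the episode's rounding):

```python
# Per-task CO2 estimates from the episode, in grams of CO2.
AI_WRITE = {"BLOOM": 1.6, "GPT-3": 2.2}          # per query
AI_ILLUSTRATE = {"Midjourney": 1.9, "DALL-E 2": 2.2}  # per image

HUMAN = {
    "write (laptop, US)": 27.0,        # ~0.8 h to write one page
    "write (desktop, US)": 72.0,
    "illustrate (laptop, US)": 100.0,  # ~3.2 h per illustration
    "illustrate (desktop, US)": 280.0,
}

def multiple(human_g: float, ai_g: float) -> int:
    """How many times more CO2 the human workflow emits than the AI one
    (truncated, matching the episode's quoted multiples)."""
    return int(human_g / ai_g)

for task, human_g in HUMAN.items():
    # Compare against the higher-emitting AI model in each category,
    # as the episode does (GPT-3 for writing, DALL-E 2 for illustration).
    ai_g = AI_WRITE["GPT-3"] if task.startswith("write") else AI_ILLUSTRATE["DALL-E 2"]
    print(f"{task}: {multiple(human_g, ai_g)}x the AI estimate")
```

Running this prints the 12x, 32x, 45x, and 127x multiples discussed in the episode.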
Jon, however, steers the conversation to a broader perspective, urging listeners to consider what humans might do with their spare time if they weren’t occupied with writing or illustrating. If they engage in high-emission activities like driving, then the carbon footprint dynamics might shift. Regardless, Jon’s initial reservations about the high energy consumption of top-tier LLMs were fundamentally changed by this research. Consequently, he’s more inclined to utilize AI in his daily tasks, with the aspiration of wrapping up his day early to take in the great outdoors. 

ITEMS MENTIONED IN THIS PODCAST:

  • “The Carbon Emissions of Writing and Illustrating Are Lower for AI than for Humans” (arXiv preprint)

DID YOU ENJOY THE PODCAST?

  • If AI continues to be more energy-efficient, how might this change the landscape of professions in the next decade?

Podcast Transcript

(00:05):
This is episode number 722 on AI's relatively low carbon emissions.

(00:19):
Welcome back to the Super Data Science Podcast. I’m your host, Jon Krohn. Today’s episode is a quick one on how “The Carbon Emissions of Writing and Illustrating Are Lower for AI than for Humans”. This was a surprising thing for me, though once I started to dig into it, it started to seem pretty obvious. So everything in today’s episode comes from an arXiv preprint paper by researchers from UC Irvine, MIT, and other universities around the world. Yeah, so this article is called “The Carbon Emissions of Writing and Illustrating Are Lower for AI than for Humans”. And in this article, they compared how much energy is used by people as well as by large language models, popular large language models, for writing and for illustrating, as the title suggests.
(01:10):
And so for writing, for example, they used BLOOM, which is a big open-source large language model. And for comparison, they also used GPT-3, which was, at the time of their publication in March, the state-of-the-art proprietary large language model from, of course, OpenAI, and available within the ChatGPT user interface. So yeah, they looked at BLOOM and they looked at GPT-3, open-source and proprietary large language models respectively. And they estimated that, including training, the BLOOM model uses just 1.6 grams of carbon dioxide per query, per request that you make to BLOOM. GPT-3 uses, they estimate, 2.2 grams of carbon dioxide per query. Now what does that mean? I don’t myself have any sense of whether that’s a lot or a little. And so in comparison, they estimated that it takes a human about four fifths, 0.8, of an hour to write a page of text. And so if you’re going to write a page of text on a laptop in the US, based on the mix of energy sources in the US, that corresponds to about 27 grams of carbon dioxide per page. That’s 12 times what you were getting with GPT-3.
(02:46):
And if you were to use a desktop instead of a laptop, it’s even worse: with a desktop, on average, for that same amount of time, four fifths of an hour, you would be emitting 32 times as much, so about 72 grams of carbon dioxide, relative to using GPT-3. If you are watching the YouTube version of this podcast, I am actually including the figures from this paper as I describe them. And there’s something really funny about this paper, which is that they also give you a comparison of just how much CO2 a human body itself emits in India and in the US over that same amount of time writing one page. I guess it kind of serves as a comparison point, but it seems like an odd one. They make a point in the article that the human’s own bodily emissions should also be considered during the writing time, but that doesn’t make any sense to me, because you can’t get rid of the human; the human’s going to be emitting CO2 whether they are writing a page or not.
(03:53):
So yeah, I don’t know. It’s kind of odd to me that they included that at all. But yeah, humans emit way more CO2 than laptops or desktops, and of course also more than these large language models, while writing that page of text. But I think that isn’t so interesting. Again, you can’t displace humans, or you’re giving an artificial superintelligence of the future a really bad idea when it’s like, oh, I’ve got to reduce carbon emissions, let’s just get rid of all the humans. Yikes. Okay, so I’ve given you all the data on writing that they dug up and computed for this article. So again, if you’re using a laptop, on average you’re going to be emitting 12 times as much as GPT-3, or on a desktop, 32 times as much as GPT-3, to write that page of text.
(04:46):
Now, they also did this for illustration, where the results are even starker. In this case, they compared Midjourney and DALL-E 2 to a human using a laptop or desktop. For Midjourney, they estimated 1.9 grams of carbon dioxide per illustration created. And for DALL-E 2, they estimated 2.2 grams, which is the same as what they estimated for GPT-3, actually. Then they estimated that it would take a human 3.2 hours on average to create an illustration manually on a computer. That corresponds to a hundred grams of carbon dioxide on a laptop, or 280 grams of carbon dioxide on a desktop, in the US. The laptop figure is a 45x multiple relative to DALL-E 2, and the desktop figure is about 127x. DALL-E 2, again, uses more energy than Midjourney, so I’m using it as the more challenging comparison.
(06:08):
So overall, you’re getting big reductions in energy usage by using these generative AI models as opposed to a human on a laptop or desktop, ranging from 12x if you’re writing a page of text on a laptop, up to 127x if you’re illustrating on a desktop. So yeah, again, these are all estimates, and there are lots of complexities here, such as what humans do with their time instead of writing or illustrating. If you were to use DALL-E 2 to create an image really quickly and then spend all of that spare time driving around in your car, well then the net impact could obviously be even worse. But overall, I found this article really interesting and eye-opening as someone who would love to see the world at net-negative carbon emissions as soon as possible through innovations like nuclear fusion and carbon capture.
(07:08):
I have been getting antsy about how much energy state-of-the-art large language models use. But this short, simple article, which of course I’ve included for you in the show notes, turned my perspective on its head. So I’ll continue to use AI to augment my work wherever I can and hopefully get my workday done earlier so I can get away from my machine and enjoy some time outdoors.
(07:30):
All right, well that’s it for today. Until next time, my friend, keep on rocking it out there and I’m looking forward to enjoying another round of the Super Data Science Podcast with you very soon.