Artificial intelligence’s energy consumption has become a hot topic as the technology has exploded in popularity. It’s no secret that generative AI chatbots are power-hungry, and raising awareness of just how much energy they consume is important to making sure these tools are used responsibly. One way to do that is to show people just how much each of their interactions costs.
That’s where this new tool comes in: Hugging Face engineer Julien Delavande and a team of collaborators built an interface that estimates how much energy each interaction with a chatbot uses. The tool works with Chat UI, an open-source front end for GenAI models such as Google’s Gemma 3 and Meta’s Llama 3.3 70B. Whenever you enter a prompt through the tool, it provides an estimate of how much energy the model used to answer your query.
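To make the idea concrete, here is a minimal sketch of one way a chat front end could attach an energy figure to each response: time the model call and multiply by an assumed average hardware power draw. This is only an illustration under stated assumptions (the 300 W figure and the timing approach are placeholders), not a description of how the Hugging Face tool actually computes its numbers.

```python
import time

# Hypothetical average power draw of the serving hardware while generating, in watts.
ASSUMED_GPU_POWER_WATTS = 300.0


def estimate_energy_wh(generate_fn, prompt: str) -> tuple[str, float]:
    """Time a model call and convert the elapsed time into an energy estimate in watt-hours."""
    start = time.perf_counter()
    response = generate_fn(prompt)  # any callable that returns the chatbot's reply
    elapsed_seconds = time.perf_counter() - start
    # Energy (Wh) = power (W) x time (h)
    energy_wh = ASSUMED_GPU_POWER_WATTS * elapsed_seconds / 3600.0
    return response, energy_wh


# Example with a stand-in generator function:
reply, wh = estimate_energy_wh(lambda p: "Sure, here's a draft email...", "Write an email")
print(f"Estimated energy for this response: {wh:.4f} Wh")
```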
For example, the tool estimates that asking Llama 3.3 70B to write an email uses roughly 0.1841 watt-hours, which it says is the equivalent of running a microwave for 0.12 seconds. These are only estimates and may not be perfectly precise, but putting a number on each interaction is a good way to help users be more conscious of their usage. “Even small energy savings can scale up across millions of queries — model choice or output length can lead to major environmental impact,” the team behind the tool said.
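The appliance comparison itself is simple arithmetic: a watt-hour figure can be converted into seconds of running any device once you assume a wattage for it. The sketch below uses the article’s 0.1841 Wh example with assumed round-number appliance ratings; because the tool’s own reference wattages aren’t stated here, the resulting equivalents won’t necessarily match its comparisons.

```python
# Convert a watt-hour estimate into "seconds of running an everyday appliance".
WATT_HOURS = 0.1841  # the article's example estimate for drafting an email

# Assumed typical power ratings in watts (placeholders, not values from the tool).
APPLIANCE_WATTS = {
    "microwave": 1000,
    "LED bulb": 10,
}

joules = WATT_HOURS * 3600  # 1 Wh = 3,600 joules
for name, watts in APPLIANCE_WATTS.items():
    seconds = joules / watts  # energy divided by power gives time in seconds
    print(f"~{seconds:.2f} s of running a {name}")
```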