Q: I don’t really understand how I can have AI running on my phone or tablet without any impact on battery life but keep reading how companies like OpenAI, Perplexity, and Google need zillion watt power plants and half the Earth’s water to run their AI systems. What’s the explanation?
Dave Taylor / Technology

A: There’s a lot of confusion about this dichotomy of power needs in the world of generative artificial intelligence! It’s more nuanced too, because sometimes what you think is running locally on your device is still sneaking online for processing oomph.
One way to tell is to see what AI features work when you’re in airplane mode. You might be surprised what fails!
It’s really a difference of power and capabilities. Online AI systems like ChatGPT, Gemini and Grok have massive data and processing available, allowing millions of simultaneous queries and faster responses. These data centers are running thousands of servers to achieve these results, requiring massive cooling systems and lots of electricity.
Transcribing voice, creating simple images, even language translation can be managed locally on the most modern AI-ready devices. These are distinguished by having a Neural Processing Unit, or NPU. The latest Google Pixel 10 units feature this, as does the Apple iPhone 17 lineup.
Local generative AI language models can range from under a gigabyte to 5GB or more of data “memory.” Online cloud-based LLM systems, however, can be 200GB, 500GB, even 1TB or larger, representing significantly broader and deeper knowledge.
That’s why task-constrained AI is such a winner: If you’re just going to be asking for help with high school algebra, it doesn’t need to know every factoid about world history, obscure spoken languages, or the cast and crew of thousands of movies.
I also want to clarify that heavy AI usage on your device will use more battery power, just as gaming on your phone or tablet will run through battery faster than if you’re just scrolling your social media feed.
Quantifying power requirements
It’s worth noting that individual queries to the large online LLM systems like ChatGPT typically require around 0.001 kWh of electricity, the equivalent of a 10W bulb running for six minutes. Image generation requires 0.01-0.1 kWh per image, roughly the same as charging your smartphone twice.
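If you want to sanity-check those equivalences yourself, here’s a quick back-of-envelope sketch. The per-query and per-image figures are the ones cited above; the 10W bulb and the assumed 15 Wh smartphone battery are illustrative values, not measurements.

```python
# Back-of-envelope check of the energy figures in the column.
QUERY_KWH = 0.001      # one text query to a large cloud LLM (figure from the column)
IMAGE_KWH = 0.03       # one generated image, within the 0.01-0.1 kWh range cited
BULB_WATTS = 10        # a small bulb, per the column's comparison
PHONE_WH = 15          # assumed smartphone battery capacity in watt-hours

# Convert the query cost to minutes of bulb runtime: kWh -> Wh -> minutes.
query_wh = QUERY_KWH * 1000                  # 1.0 Wh
bulb_minutes = query_wh / BULB_WATTS * 60    # 6.0 minutes

# Convert the image cost to full smartphone charges.
image_wh = IMAGE_KWH * 1000                  # 30.0 Wh
phone_charges = image_wh / PHONE_WH          # 2.0 charges

print(f"One text query ≈ a {BULB_WATTS}W bulb for {bulb_minutes:.0f} minutes")
print(f"One image ≈ {phone_charges:.0f} smartphone charges")
```

Both results line up with the comparisons above: a text query equals about six bulb-minutes, and a mid-range image generation equals roughly two phone charges.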
Water usage is all about cooling those busy processors and is a bit harder to quantify. The best figures I could find suggest approximately 500 mL per text query and 2-3 liters per image request. Of course, once water has cooled down, it can be cycled again to cool the data center, so it’s not all consumed, per se.
The more you can use local AI tools instead of centralized systems, the lower your impact. As with so much in the tech world, it’s a trade-off of power, capabilities and costs.
Dave Taylor has been involved with the online world since the beginning of the Internet. He runs the popular AskDaveTaylor.com tech Q&A site and invites you to subscribe to his weekly email newsletter at AskDaveTaylor.com/subscribe/. You can also find his entertaining gadget reviews on YouTube at YouTube.com/AskDaveTaylor.
