On the edge

Maybe computation should happen on your devices as much as possible, to lower the demand on power infrastructure
25th January 2026, Less than one minute to read

Moving ‘per-user’ computation to the cloud feels like a net negative for energy consumption: is the reduction in on-device resource usage really greater than the increase in data centre usage?
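To make the question concrete, here is a back-of-envelope sketch of the comparison. Every figure below is a hypothetical placeholder I've invented for illustration, not a measurement; the point is the shape of the accounting (server compute is scaled up by data centre overhead, and the network transfer has to be charged to the cloud side too), not the numbers.

```python
# Illustrative only: all figures are hypothetical placeholders, not measurements.

# Assumed energy to do one unit of work on the user's own device:
device_joules_per_task = 5.0

# Assumed energy to do the same work in the cloud:
server_joules_per_task = 2.0    # servers are often more efficient per task
datacentre_overhead = 1.4       # assumed PUE-style multiplier (cooling, power delivery)
network_joules_per_task = 4.0   # assumed cost of moving the request and response

cloud_joules_per_task = (server_joules_per_task * datacentre_overhead
                         + network_joules_per_task)

print(f"local: {device_joules_per_task:.1f} J/task, "
      f"cloud: {cloud_joules_per_task:.1f} J/task")
```

With these made-up inputs the local device wins, but flip the network cost or the server efficiency and the answer flips too, which is exactly why the aggregate pattern is hard to reason about from the armchair.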

When coupled with the transfer of wealth not to the service but to the infrastructure, I wonder if we’ve got this all the wrong way around. Not only does local compute give power to the user — what happens on your local device stays on your local device — but it also distributes the power consumption more broadly.

Are SaaS vendors already off-loading work to the client today, running the calculations there and only storing the results in cloud storage? Does that end up trading computation costs for network egress costs?
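That trade can also be sketched in the same back-of-envelope style. Again, every price and size below is a hypothetical placeholder, not a real tariff: the vendor either pays for CPU time server-side, or pays egress to ship the input data down to a client that computes for free.

```python
# Illustrative only: all prices and sizes are hypothetical placeholders.

server_compute_cost = 0.00002  # assumed $ of server CPU time per task
egress_price_per_gb = 0.09     # assumed $ per GB of cloud egress
input_size_gb = 0.001          # assumed 1 MB of input data shipped per task

# If the client computes, the vendor's cost is the egress for the input data:
client_side_cost = egress_price_per_gb * input_size_gb

print(f"server-side: ${server_compute_cost:.5f}/task")
print(f"client-side: ${client_side_cost:.5f}/task in egress")
```

Under these invented numbers the egress dwarfs the compute, which would argue for shipping small results up rather than large inputs down — but the ratio depends entirely on how data-heavy the task is.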

Maybe I just don’t understand the aggregate power usage patterns?