No-power sensors surface due to computational energy efficiency trends

[Figure: Koomey's law graph, made by Koomey (cc)]

I read an article today in MIT's Technology Review, The computing trend that will change everything, about the trend in energy consumption per unit of computation.

Along with Moore’s law, which dictates that transistor density doubles every 18 to 24 months, there is Koomey’s law, which states that computational energy efficiency, or computations per watt, doubles every ~1.57 years.

Koomey’s law has made today’s smartphones and tablets possible.  If your current laptop were computing at the power efficiency of 1991 computers, its battery would last ~2.5 seconds.
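That ~2.5 second figure can be roughly checked with Koomey's doubling rate. The sketch below assumes a modern laptop battery life of about 7.4 hours (my own illustrative number, not from the article):

```python
# Rough check of the "~2.5 seconds" claim, assuming Koomey's law:
# computations per joule double roughly every 1.57 years.
DOUBLING_PERIOD_YEARS = 1.57

def efficiency_ratio(years_elapsed):
    """Factor by which computations-per-watt improved over the period."""
    return 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

ratio = efficiency_ratio(2012 - 1991)       # ~21 years of doublings
battery_life_today_s = 7.4 * 3600           # assumed modern battery life
battery_life_1991_s = battery_life_today_s / ratio

print(f"Efficiency improvement since 1991: ~{ratio:,.0f}x")
print(f"Battery life at 1991 efficiency: ~{battery_life_1991_s:.1f} s")
```

With those assumptions, the efficiency gap works out to roughly four orders of magnitude, which lands the battery life right around a couple of seconds.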

No-power sensors?!

But this computing efficiency trend is giving rise to no-power sensors/devices, or computational sensors without batteries.  These new sensors gather electrical energy from “ambient radio waves” in the air, harvesting enough electricity to power their computations, and as such don’t need batteries.

Such devices can gather ~50μwatts of power from a TV transmitter just 2.5 miles away.  Most calculators use only ~5μwatts and digital thermometers around 1μwatt, so 50μwatts is enough to do a reasonable amount of sensing work.
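Putting those figures side by side shows the power margin a harvesting sensor would have; the device names and the margin arithmetic below are just illustration:

```python
# Energy budget sketch for a no-power sensor, using the figures above.
HARVESTED_UW = 50.0   # ~50 microwatts from a TV transmitter ~2.5 miles away

LOADS_UW = {
    "calculator-class device": 5.0,   # ~5 microwatts
    "digital thermometer": 1.0,       # ~1 microwatt
}

for name, draw_uw in LOADS_UW.items():
    margin = HARVESTED_UW / draw_uw
    print(f"{name}: draws {draw_uw} uW -> {margin:.0f}x power margin")
```

A 10x to 50x margin over today's low-power loads is what leaves room for "reasonable amounts of sensing work."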

But the exciting part is that as Koomey’s law continues, the amount of work that 50μwatts, or even 5μwatts, supports doubles again every 1.6 years.  For example, the computations of today’s laptops will consume only infinitesimal amounts of power in ~two decades’ time.  Thus, the no-power sensors of 2034 will be very smart indeed.
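Compounding the post's rounded 1.6-year doubling period over two decades gives a sense of how much smarter a fixed power budget gets:

```python
# Projecting Koomey's law ~20 years out, using the post's rounded figure.
DOUBLING_PERIOD_YEARS = 1.6
years = 20

factor = 2 ** (years / DOUBLING_PERIOD_YEARS)   # 12.5 doublings
print(f"In {years} years, the same power budget supports ~{factor:,.0f}x the work")
```

Twelve and a half doublings is a factor of several thousand, which is why today's laptop-class workloads become near-zero-power in that timeframe.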

“Any sufficiently advanced technology is indistinguishable from magic”, Arthur C. Clarke

Data transmission efficiency not keeping up

Nonetheless, the fact that computational efficiency is doubling every 1.6 years doesn’t mean that data transmission efficiency is doing the same.  This means that for the foreseeable future, data transmission may remain a crucial bottleneck for no-power sensors.

However, computational increases can somewhat compensate for data transmission limitations through more efficient encoding, compression, etc.  But there are limits to what can be accomplished within any given data transmission technology.
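The trade is spending (cheap, ever-cheaper) computation to shrink the (expensive) bytes a sensor must transmit. A minimal sketch using Python's standard zlib compression, with made-up sensor readings:

```python
# Spending computation (compression) to reduce bytes transmitted.
# The temperature readings below are fabricated for illustration.
import zlib

# A day's worth of slowly varying temperature readings, one per minute.
readings = ",".join(f"{20 + (i % 10) * 0.1:.1f}" for i in range(1440))
raw = readings.encode()

compressed = zlib.compress(raw, level=9)   # max effort: more CPU, fewer bytes
print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```

Highly redundant sensor data compresses dramatically, but as the paragraph above notes, compression only stretches a transmission budget; it can't remove the bottleneck.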


Thus, for the foreseeable future, although sensors will be able to do many more computations, what they transmit to the outside world may remain limited, giving rise to smart, no-power sensors that provide very minuscule data packages.

One term coined to describe such limited external data transmission from no-power, computationally intense sensors is nanodata.   Because of their ability to exist off the power grid, it is very likely that the future sensor cloud or internet of things will be primarily composed of such nanodata devices.

I was at SNW last week, where there was some discussion of “little data”, or data in corporate databases, in contrast with big data.  But nanodata is something I had never heard of before today.

So now we have big data, little data, and nanodata.  Seems like we’re missing a few steps here…