Internet of Tires

Read an article a couple of weeks back (An internet of tires?… IEEE Spectrum) and can’t seem to get it out of my head. Pirelli, a European tire manufacturer, was demonstrating a smart tire, or as they call it, their new Cyber Tyre.

The Cyber Tyre includes accelerometer(s) embedded in its rubber that can be used to sense pavement/road surface conditions. The Cyber Tyre can communicate surface conditions to the car and, using the car’s 5G connection, to other cars (of the same make) to warn them of problems with surface adhesion (hydroplaning, ice, or other traction issues).

Presumably the accelerometers in the Cyber Tyre measure acceleration changes of individual tires as they rotate. Any rapid acceleration change could potentially be used to determine whether the car has lost traction, and why.
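Pirelli hasn’t said how its detection actually works, but as a minimal sketch, here’s roughly how a sudden acceleration change on one wheel might be flagged in software. The window size, threshold, and function name are all hypothetical illustration values, not Pirelli’s:

```python
# A minimal sketch (not Pirelli's algorithm) of flagging a possible
# traction event from per-wheel accelerometer samples. The window size
# and threshold below are hypothetical illustration values.

from statistics import mean

G = 9.81  # m/s^2

def traction_alert(samples, window=10, threshold=0.5 * G):
    """samples: time-ordered tangential acceleration readings (m/s^2)
    for one wheel, at a fixed sample rate. Returns True if mean
    acceleration jumps sharply between consecutive windows, which
    could indicate the wheel spinning up or locking (lost traction)."""
    if len(samples) < 2 * window:
        return False
    prior = mean(samples[-2 * window:-window])
    recent = mean(samples[-window:])
    return abs(recent - prior) > threshold

# A wheel that suddenly spins up (e.g., hydroplaning) shows a jump:
readings = [0.2] * 20 + [6.0] * 10
print(traction_alert(readings))  # True
```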

They tested the new tires out at a (1/3rd-mile) test track on top of a Fiat factory, using Audi A8 automobiles and 5G. It’s unclear why this had to wait for 5G, but with 5G the Cyber Tyre and the car could also log and transmit such information back to the car or tire manufacturer.

Accelerometers have become dirt cheap over the last decade as smart phones have taken off. So it was only a matter of time before they found use in new and interesting applications, and the Cyber Tyre is just the latest.

Internet of Vehicles

Presumably the car, with Cyber Tyres on it, communicates road hazard information to other cars using 5G and vehicle-to-vehicle (V2V) communication protocols, or perhaps to municipal or state authorities. This way, highway signage could display hazardous conditions ahead.
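Real V2V messages follow standards (e.g., SAE J2735 in the US), but as a hedged sketch, a road-hazard payload might carry something like the following; every field name here is an illustrative assumption:

```python
# A hedged sketch of the kind of road-hazard payload a car might
# broadcast; real V2V messages follow standards like SAE J2735, and
# every field name here is an illustrative assumption.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class HazardReport:
    vehicle_id: str    # anonymized sender ID
    lat: float         # hazard location
    lon: float
    hazard: str        # e.g. "ice", "hydroplaning", "pothole"
    confidence: float  # 0.0 - 1.0, from the tyre's traction sensing
    timestamp: float   # seconds since epoch

report = HazardReport("anon-1234", 37.3875, -122.0575, "ice", 0.9, time.time())
payload = json.dumps(asdict(report))  # what would go out over 5G/V2V
print(payload)
```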

Audi has a website devoted to Car-to-X communications. Certain Audi vehicles (A4, A5 & Q7) are equipped with cellular communications, cameras, and other sensors used to recognize signage, hazards, and other information and communicate this data to other Audi vehicles. This way, owning an Audi plugs you into this information flow.

Pirelli’s Cyber Car Concept

Prior to the Cyber Tyre, Pirelli introduced a Cyber Car concept that is supposedly rolling out this year. This version has tyres with real-time pressure, temperature, and (static) vertical load sensing, plus a Tyre ID. Pirelli has been working with car manufacturers to roll out Cyber Car functionality.

The Tyre ID seems to be a file that can include anything the tyre or automobile manufacturer wants. It sort of reminds me of blockchain data blocks that could be used to validate tyre manufacturing provenance.
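To make that comparison concrete, here’s a toy sketch of hash-chained provenance records. Pirelli hasn’t published the Tyre ID format; every field and event name below is a guess for illustration:

```python
# A toy sketch of hash-chained records, along the lines of the
# blockchain comparison above. Pirelli hasn't published the Tyre ID
# format; every field and event name here is a guess for illustration.

import hashlib
import json

def add_record(chain, data):
    """Append a record whose hash covers both its data and the
    previous record's hash, so earlier history can't be silently
    rewritten without invalidating everything after it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)

tyre_id = []
add_record(tyre_id, {"event": "manufactured", "plant": "Milan", "lot": "A17"})
add_record(tyre_id, {"event": "installed", "odometer_km": 12})
print(tyre_id[-1]["hash"])
```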

The vertical load sensor seems more important to car and tire manufacturers than to consumers. But for electric car owners, knowing vehicle weight could help determine the current load on the battery and thereby more precisely estimate how much charge (and range) is left.
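A back-of-the-envelope sketch of the idea, with made-up consumption numbers (not from any manufacturer):

```python
# A back-of-the-envelope sketch of why vehicle load matters for range
# estimates. Both consumption constants are rough illustrative numbers,
# not from any manufacturer.

BASE_WH_PER_KM = 160           # consumption at curb weight
EXTRA_WH_PER_KM_PER_100KG = 7  # added consumption per 100 kg of load

def est_range_km(battery_kwh_remaining, load_kg):
    wh_per_km = BASE_WH_PER_KM + EXTRA_WH_PER_KM_PER_100KG * load_kg / 100
    return battery_kwh_remaining * 1000 / wh_per_km

print(round(est_range_km(40, 0)))    # ~250 km unloaded
print(round(est_range_km(40, 400)))  # ~213 km with 400 kg of load
```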

Pirelli uses a proprietary algorithm to determine tread wear. It makes use of the other tyre sensors to predict wear, perhaps using an AI deep learning (DL) model to do so.
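Since the algorithm is proprietary, here’s only a stand-in sketch showing how the other tyre sensors could feed a wear estimate; all coefficients are invented for illustration:

```python
# Pirelli's wear algorithm is proprietary; this is a stand-in linear
# model showing how the other tyre sensors *could* feed a wear
# estimate. All coefficients are made-up illustration values.

def tread_depth_mm(km_driven, avg_load_kg, avg_temp_c, avg_pressure_kpa,
                   new_depth_mm=8.0):
    wear = (0.00004 * km_driven                                       # base wear
            + 0.000002 * km_driven * max(0, avg_load_kg - 400) / 100  # overload
            + 0.000001 * km_driven * max(0, avg_temp_c - 25)          # heat
            + 0.000003 * km_driven * max(0, 220 - avg_pressure_kpa))  # underinflation
    return max(0.0, new_depth_mm - wear)

# 40,000 km, heavily loaded, hot, slightly underinflated:
print(round(tread_depth_mm(40_000, 500, 30, 210), 2))  # ~4.92 mm left
```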

~~~

ABS has been around for decades now, and tire pressure sensors for over 10 years. My latest car has enough sensors to pretty much drive itself on the highway, but not quite park itself as of yet. So it was only a matter of time before something like smart tires showed up.

But given their integration with car electronics systems, it would seem that this would only make sense for new cars that include a full set of Cyber Tyres. That is, until all tire AND car manufacturers agree on a standard protocol to communicate such information. When that happens, consumers could choose any tire manufacturer and get similar, if not the same, functionality from them.

I suppose someone had to be first to identify just what could be done with the electronics available today. Pirelli just happens to be it for now in the tire industry.

I just don’t want to have to upgrade tires every 24 months. And, if I have to wait a long time for my car to boot up and establish communications with my tires, I may just take a (dumb) bike.


Cloudlets at the edge

Read an article (Never heard of Edge Computing…) this week on ATT’s presentations at their Spark Conference. Apparently, ATT is saying that the problem with AR, VR/immersive gaming, self-driving cars, drones, etc. has been twofold: lack of bandwidth and processing latency.

The long-latency issue comes from processing for these devices being done mostly at cloud data centers, hundreds of miles away from the device doing the work.

The upcoming 5G rollout should hopefully solve the bandwidth problem (for now, at least), but the processing latency issue can only be dealt with by moving compute closer to where it’s needed.
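A rough feel for the numbers: even ignoring routing hops and server time, fiber propagation delay alone scales with distance, since light in fiber travels at roughly 2/3 the speed of light (about 200 km/ms):

```python
# A rough sketch of why distance alone matters: fiber propagation delay,
# assuming light in fiber travels at roughly 2/3 of c (~200 km/ms),
# before counting any routing hops or server time.

C_FIBER_KM_PER_MS = 200

def propagation_rtt_ms(distance_km):
    return 2 * distance_km / C_FIBER_KM_PER_MS

print(propagation_rtt_ms(800))  # cloud DC ~500 miles away: 8.0 ms
print(propagation_rtt_ms(8))    # neighborhood cloudlet:    0.08 ms
```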

A couple of weeks back I was at VMworld, and one of the big announcements there was vSphere support for 64-bit ARM processors. Pat (Gelsinger) and others talked up the coming edge processing tsunami that will overtake IT as we know it today and bring significant benefits to everything from traffic management, to infrastructure maintenance, to better security for all. Windows Server has been ported to ARM for Azure apps for a while now, but I don’t know if it’s been slated for external release.

The new edge

Up until this point, I had always considered edge devices to be sensors and other equipment embedded in buildings, land, sea, air, machinery, etc., that provided useful, real-world information/status about their environments and, when something’s gone wrong, about what has to be fixed. I hadn’t really seen AR and VR/immersive gaming as an edge issue. However, drones and self-driving cars are edge devices.

AR seems to rely on smart phone levels of computation and VR today is usually tethered to a desktop PC or Mac. But to take AR and VR to the next level, processing requirements need to go up.

Self-driving cars have their own army of compute processing and sensors to deal with real-time road recognition and accident avoidance. Drones have smart phone levels of compute onboard and a nearby laptop for additional processing and control support. I’m not sure that edge processing requirements for these devices are increasing, but I’m no expert.

But they all need more low-latency computation to become more effective, they all require lots of bandwidth, and some of them, at least, can only perform well if both of these requirements are met.

CloudLets

ATT has been experimenting with neighborhood mini data centers, which it calls test zones or cloudlets, to supply this new low-latency processing.

These are apparently local (edge) mini-datacenters that host edge electronics gear for low-latency processing. ATT has one current test zone (or cloudlet) set up in Silicon Valley and has plans to roll out more across the US.

Up until this point, I thought edge processing would be solved by moving AI and other compute resources out to the devices themselves (see my AI processing at the edge post). Moore’s law would allow today’s compute capabilities to be embedded in low-power edge devices in a decade or so.

But why wait? If you can set up a mini (ARM-based) data center in a neighborhood cell-phone/telephone/cable/electrical cabinet, running vSphere or Windows virtualization, with high-speed network connections to edge devices and the cloud, you can get by with less compute at the edge devices, enjoy low-latency responsiveness, and use fewer cloud resources to boot.
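As a hedged sketch of that trade-off, here’s a toy placement decision between device, cloudlet, and cloud; all the tier numbers are illustrative assumptions, not ATT’s:

```python
# A hedged sketch of a placement decision: run a task on-device, at a
# nearby cloudlet, or in the cloud. All the tier numbers below are
# illustrative assumptions, not ATT's.

TIERS = [  # (name, network RTT in ms, compute slowdown vs. a cloud server)
    ("device",   0.0, 8.0),   # no network hop, but a weak local CPU
    ("cloudlet", 1.0, 1.5),
    ("cloud",   40.0, 1.0),
]

def place(compute_ms_on_cloud, latency_budget_ms):
    """Pick the most centralized tier that still meets the latency
    budget (assuming more centralized generally means cheaper)."""
    for name, rtt_ms, slowdown in reversed(TIERS):
        if rtt_ms + slowdown * compute_ms_on_cloud <= latency_budget_ms:
            return name
    return "infeasible"

print(place(5, 20))   # "cloudlet": the cloud misses a 20 ms budget
print(place(5, 100))  # "cloud": a loose budget lets the cloud win
```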

~~~

Doesn’t this mean we need mini-racklets to stack our mini cloudlets’ compute resources, something like 9.5″-wide, 0.5U shelving?

Just when I thought (edge) decentralization would take over compute again, cloudlets come along to take it back.

Photo Credit(s): L10000901-Edit|Guide van Nispen

Augmented Reality RFid Cup|JeanBaptisteParis

The Great Escape|Edward Webb