
RE: Inference Compute: More Important Than Neural Networks

in LeoFinance last year

All Teslas come with a SIM card, so the question is connectivity. Some owners opt not to subscribe and instead rely on WiFi. For those who do connect, it is $9.99 per month, which provides over-the-air updates and the infotainment package while on the move.

As for inference, it depends on the hardware, I would imagine. Each generation of hardware can handle more. Hardware 5 will likely be powerful enough to run inference while the vehicle is moving, even on FSD. I am not sure of the potential on the earlier versions; there, it might have to be done while the vehicle is not being utilized.


I totally understand that the hardware might be sufficient for inference... but the way I understand it, the compute power is potentially restricted by how much data can be uploaded to and downloaded from those computers. If data can't be transferred quickly, then the compute power of each device becomes far less important (as I understand it).
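A quick back-of-envelope calculation shows the point: if moving the data takes far longer than crunching it, the chip sits idle and the link is the real limit. All numbers below are illustrative assumptions (link speeds, batch size, chip throughput), not actual Tesla specs.

```python
# Back-of-envelope: when does the network link, not the chip,
# become the bottleneck? All figures are hypothetical.

def transfer_seconds(data_gb, link_mbps):
    """Time to move data_gb gigabytes over a link_mbps connection."""
    return (data_gb * 8 * 1000) / link_mbps  # GB -> megabits, then divide by rate

def compute_seconds(data_gb, gb_per_second):
    """Time for an onboard chip to process the same data."""
    return data_gb / gb_per_second

data = 10.0  # assume a 10 GB batch of inference work
lte = transfer_seconds(data, link_mbps=30)       # assumed ~30 Mbps cellular
wifi = transfer_seconds(data, link_mbps=300)     # assumed ~300 Mbps home WiFi
crunch = compute_seconds(data, gb_per_second=5)  # assumed chip throughput

print(f"LTE transfer:  {lte:.0f} s")   # ~2667 s
print(f"WiFi transfer: {wifi:.0f} s")  # ~267 s
print(f"Compute:       {crunch:.0f} s")  # 2 s
# Transfer time dwarfs compute time here, so the link is the bottleneck.
```

Under these assumed numbers the data transfer takes hundreds of times longer than the computation, which is why connectivity would matter more than raw chip power.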

You can imagine that Amazon and xAI would have incredible data connections, whereas Tesla might be relying on SIM cards and suboptimal WiFi connections.