I totally understand that the hardware might be sufficient for inference... but as I understand it, effective compute is potentially limited by how quickly data can be uploaded to and downloaded from those machines. If data can't be transferred quickly, the raw compute power of each device matters far less.
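To make that concrete, here's a toy roofline-style calculation. All the numbers are made-up assumptions purely for illustration (the function name and figures are mine, not anyone's real specs): a step is bandwidth-bound whenever moving the data takes longer than computing on it.

```python
def step_time(flops, bytes_moved, peak_flops, bandwidth_bytes_per_s):
    """Lower bound on step time: the slower of compute and data transfer.

    Returns (time_seconds, is_bandwidth_bound).
    """
    compute_time = flops / peak_flops
    transfer_time = bytes_moved / bandwidth_bytes_per_s
    return max(compute_time, transfer_time), transfer_time > compute_time

# Hypothetical fast accelerator stuck behind a slow (e.g. cellular) link:
t, bw_bound = step_time(
    flops=1e12,                  # 1 TFLOP of work per step (assumed)
    bytes_moved=1e9,             # 1 GB must cross the network (assumed)
    peak_flops=100e12,           # 100 TFLOP/s chip (assumed)
    bandwidth_bytes_per_s=10e6,  # ~10 MB/s link (assumed)
)
# Compute takes 0.01 s but the transfer takes 100 s, so the chip sits
# idle ~99.99% of the time: the connection, not the silicon, sets the pace.
```

Under these (invented) numbers, a 10x faster chip changes nothing, while a 10x faster link speeds the step up almost 10x.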
You can imagine that Amazon and xAI would have incredible data connections, whereas Tesla might be relying on SIM cards and suboptimal WiFi connections.