The Plummeting Cost of AI Inference And What It Means

in LeoFinance · 5 months ago

▶️ Watch on 3Speak


We are seeing the cost of inference compute drop very rapidly. Some projections suggest GPT-4o-level quality could be attained on a smartphone by the end of 2026.

In this video I discuss how this will localize AI and how it could change things. It opens up the potential for many agents to run locally, avoiding cloud processing altogether.



That's great news. If they are able to localize AI faster, then the training it can get will be significantly greater. Being able to 'converse' with more humans can give it varied data, and the developers will have a better idea of what they need to work on.