Very interesting. I've been looking for an easy way to run a large LLM locally that could use the 64 GB of unified memory on my M1 Max chip.
I don't like interacting with cloud-based AI that collects my data and isn't private.
I was surprised to learn that Apple has released an M3 Max with 128 GB of unified memory. That would be really powerful and could run huge models.
I'll let you know how it goes.