Google also has AI models optimized to run on phones, and they're in a much better position to build purpose-built LLMs for phones.
It's not clear to me why certain classes of tasks still end up farmed out to the cloud (is this one of them?). Maybe their LLM hasn't been built in a very pluggable fashion.