Hacker News

There is a reason I kept emphasizing the ChatGPT product. The (paid) ChatGPT product is not just a text-based LLM. It can interpret images, has a built-in Python runtime for offloading queries that LLMs aren't good at (like math), and integrates web search, image generation, and a couple of other tools.
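The offloading pattern described above can be sketched roughly like this. To be clear, the tool name and dispatch logic here are hypothetical illustrations of the general technique, not OpenAI's actual implementation:

```python
# Minimal sketch of tool offloading: the model emits a structured tool
# call instead of answering directly, and the host app executes it.
# (Hypothetical tool names and schema; not OpenAI's real API surface.)

def run_python(code: str) -> str:
    """Execute model-generated code and return its `result` variable."""
    scope: dict = {}
    exec(code, scope)  # a real product would sandbox this heavily
    return str(scope.get("result"))

def dispatch(tool_call: dict) -> str:
    """Route a model's tool call to the matching integration."""
    if tool_call["tool"] == "python":
        return run_python(tool_call["args"]["code"])
    raise ValueError(f"unknown tool: {tool_call['tool']}")

# Instead of doing arithmetic in-weights, the model offloads it:
call = {"tool": "python", "args": {"code": "result = 12345 * 6789"}}
print(dispatch(call))  # prints 83810205 -- exact, unlike sampled tokens
```

The point is that the product's quality comes from this orchestration layer as much as from the model itself, which is part of why a bare on-device LLM compares poorly.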

The local LLMs on iPhones are literally 1% as powerful as server-based models like 4o.

That's not even considering battery life.



> The local LLMs on iPhones are literally 1% as powerful as server-based models like 4o.

Currently, yes. That's why this is a compelling advance - it makes local LLMs much more feasible, especially if this is just the first of many breakthroughs.

A lot of the hype around OpenAI has been due to the fact that buying enough compute to run these models wasn't feasible for competitors. Now it is, potentially even at the local level.



