
I don't know what you mean by this at all, tbh.

I don't think the researchers at the top believe LLMs are AGI.

DeepMind and co are already working on world models.

The biggest bottleneck right now is compute, compute, and compute. If an experiment takes a MONTH to train, you want a lot more compute. You need compute to optimize what you already have, like LLMs, and then a lot more compute on top to try out new things.
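To make the "a MONTH to train" point concrete, here's a rough back-of-envelope sketch using the well-known 6·N·D FLOPs approximation for dense transformer training (N = parameters, D = training tokens). All the specific numbers (model size, token count, per-GPU throughput, utilization) are illustrative assumptions, not claims about any real system:

```python
# Back-of-envelope training time for a dense transformer,
# using the common 6 * N * D total-FLOPs approximation.
# Every concrete number here is an illustrative assumption.

def training_days(params: float, tokens: float, gpus: int,
                  flops_per_gpu: float = 1e15,   # ~1 PFLOP/s per GPU (assumed)
                  utilization: float = 0.4) -> float:
    """Estimated days to train, given cluster size and hardware utilization."""
    total_flops = 6 * params * tokens
    effective_flops_per_sec = gpus * flops_per_gpu * utilization
    seconds = total_flops / effective_flops_per_sec
    return seconds / 86400  # seconds per day

# e.g. a hypothetical 70B-parameter model on 2T tokens with 1,000 GPUs:
# roughly 24 days -- and halving that means doubling the cluster.
print(round(training_days(70e9, 2e12, gpus=1000), 1))
```

Since the estimate scales inversely with GPU count, cutting a month-long run down to a week means roughly 4x the compute, which is the whole argument in miniature.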

All of the compute in these datacenters isn't LLM-specific. The GPUs are general ML-capable GPUs.
