broknbottle on Nov 30, 2024 | on: Llama.cpp guide – Running LLMs locally on any hard...
The irony: we prevent and kill something that is actually useful, while letting crowdcrap hum along, consuming tons of memory and bottlenecking I/O so it can do its snake-oil things...