
I can imagine Apple shipping Mac Pros with add-ons that allow running local inference with minimal setup. "Look, just spend $50k on this machine and you get a usable LLM server that can be shared by a team." But they don't seem particularly interested in that market.

