Legit feels like Nvidia just buying out competition to maintain their position and power in the industry. I sincerely hope they fall flat on their face.
> Legit feels like Nvidia just buying out competition to maintain their position and power
Well, I mean, isn't that exactly what they should be doing? (I'm not talking about whether or not it benefits society; this is more along the lines of how they're incentivized.)
Put yourself in their shoes. If you had all that cash, and you're hearing people talk of an "AI Bubble" on a daily basis, and you want to try and ensure that you ride the wave without ever crashing... the only rational thing to do is use the money to try and cover all your bases. This means buying competitors and it also means diversifying a little bit.
Dunno, I thought AGI was supposed to make everything obsolete and it's just around the corner? It looks more like it's dawning on everyone that transformers won't bring salvation. It's a show of weakness.
What a business should do, sure. Businesses should - and do - do a lot of really shitty things that benefit them while harming plenty of others. I don't feel that's a good justification for arguing this way, though.
In this case, they're removing a competitor, absorbing their IP, and maintaining their ability to dictate the direction of an entire industry. They're hurting the industry itself by removing competition, since competition is good for consumers and good for moving the field forward.
Businesses with a monopoly of some sort often stop innovating in the space and end up slowing the entire thing down. Often, they do their best to block anything and anyone that tries to do better, and effectively keep progress back in doing so, simply to maintain their position.
They're selfish, self-preserving entities, often driven by the same kinds of people, disregarding the harm they do in the name of profits and shareholder "value" - at least until someone disrupts that (or they get bought out and dissolved).
The bottleneck in training and inference isn't matmul, and once a chip isn't a kindergarten toy you don't go from FPGA to tape-out by clicking a button. For local memory he's going to have to learn to either stack DRAM (not "3000 lines of Verilog", and it requires a supply chain which OpenAI just destroyed) or use diffused block RAM / SRAM like Groq, which is astronomically expensive bit for bit and torpedoes yields, compounding the issue. Then comes interconnect.
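To make the "bottleneck isn't matmul" point concrete, here's a rough roofline-style back-of-envelope check (not from the original comment; all hardware numbers are assumed round figures for a hypothetical accelerator, not specs for any real chip):

```python
# Back-of-envelope roofline check: is single-stream LLM decode compute-bound or memory-bound?
# All numbers below are illustrative assumptions, not specs for any real chip.

peak_flops = 1000e12       # assumed peak matmul throughput: 1000 TFLOP/s
mem_bandwidth = 3e12       # assumed off-chip memory bandwidth: 3 TB/s

# Machine balance: FLOPs the chip can execute per byte it can fetch from memory.
machine_balance = peak_flops / mem_bandwidth          # ~333 FLOP/byte

# Decoding one token at batch size 1 streams every weight once and does
# roughly 2 FLOPs (multiply + add) per weight.
bytes_per_weight = 2                                  # fp16/bf16 weights
arithmetic_intensity = 2 / bytes_per_weight           # ~1 FLOP/byte

if arithmetic_intensity < machine_balance:
    print(f"memory-bound: workload needs {arithmetic_intensity:.1f} FLOP/B, "
          f"chip can only feed {machine_balance:.0f} FLOP/B from memory")
else:
    print("compute-bound: matmul throughput is the limiter")
```

Under these assumed numbers the chip idles waiting on memory most of the time, which is why memory capacity/bandwidth (stacked DRAM or on-die SRAM) and interconnect dominate the design problem rather than the multiply units.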
There's this curious experience of people bringing up geohot / tinygrad and you can tell they've been sold into a personality cult.
I don't mean that pejoratively, I apologize for the bluntness. It's just I've been dealing with his nonsense since iPhone OS 1.0 x jailbreaking, and I hate seeing people taken advantage of.
(Nvidia x Macs x Thunderbolt has been a thing for years and years and years, well before geohot.) (The tweet is a non sequitur beyond bog-standard geohot tells: an odd obsession with LoC, and we're 2 years away from Changing The Game, just like we were 2 years ago.)
My deepest apologies, I can't parse this and I earnestly tried: 5 minutes of my own thinking, then 3 LLMs, then a 10-minute timer of my own thinking over the whole thing.
My guess is you're trying to communicate "tinygrad doesn't need GPU drivers", which maybe gets transmuted into "tinygrad replaces CUDA", and you think "CUDA means other GPUs can't be used for LLMs, thus Nvidia has a stranglehold".
I know George has pushed this idea for years now, but, you have to look no further than AMD/Google making massive deals to understand how it works on the ground.
I hope he doesn't victimize you further with his rants. It's cruel of him to use people to assuage his own ego and make them look silly in public.
Look dude, this guy failed his Twitter internship and is not about to take on Jensen Huang. He isn't some young guy anymore, and this isn't the 2000s where he's about to have another iPhone / Sony moment.