Intel has spent the last 25 years training consumers to look for the "intel inside" sticker. It's the only thing that's kept intel afloat while they butcher their engineering talent.
PyTorch is very firmly GLUED to CUDA. It will probably NEVER support anything else beyond token inference on mobile devices. The only reason PyTorch supports AMD at all is because of AMD's "I can't believe it's not CUDA" HIP translation layer.
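You can even see the masquerade from Python. On an official ROCm build of PyTorch, AMD GPUs surface through the torch.cuda namespace, because HIP apes the CUDA API closely enough that the backend barely has to care which vendor it's driving. A quick sketch (what prints depends on which build you have installed):

    import torch

    # On a ROCm (HIP) build, torch.version.hip is set and torch.version.cuda
    # is None -- yet AMD GPUs still appear under torch.cuda.
    print("cuda toolkit:", torch.version.cuda)                 # None on ROCm builds
    print("hip/rocm:", getattr(torch.version, "hip", None))    # None on CUDA builds

    if torch.cuda.is_available():
        # "cuda" here may actually mean an AMD GPU reached through HIP.
        x = torch.randn(4096, 4096, device="cuda")
        print(torch.cuda.get_device_name(0))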
OpenCL is a real cross-platform API, and with 3.0 it's finally "good." Coincidentally, intel is... half-heartedly interested in it, except they're shooting themselves in the foot by trying to also cover useless CPUs for inference/training and spreading themselves too thin (oneAPI). Because all intel can think about is CPUs. Everything must drive sales of CPUs.
At this rate, just about the only thing that might save us from CUDA is rusticl. If a real, full-fat, high-quality OpenCL 3.0 driver suddenly popped into existence on every GPU platform under the sun, maybe PyTorch et al. could finally be convinced to give a shit about an API other than CUDA.
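The "every GPU under the sun" scenario is easy to picture, because OpenCL host code never names a vendor. A sketch with pyopencl (assuming it's installed): a rusticl-backed driver would simply show up as one more platform in this loop, next to NVIDIA's, AMD's, and Intel's.

    import pyopencl as cl

    # List every OpenCL driver ("platform") installed on this machine.
    # NVIDIA, AMD, Intel, and Mesa's rusticl each appear as just another
    # platform; the calling code is identical regardless of vendor.
    for platform in cl.get_platforms():
        print(f"{platform.name} -- {platform.version}")
        for device in platform.get_devices():
            print(f"    {device.name}")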
They are only moving because AMD is finally providing that competition, but it's not fair to say that "if only AMD had gotten their act together sooner, intel would be a better company."
Ultimately, intel chose to let itself rot while AMD was out of the picture. (Ignoring that intel's backroom deals with Dell and co. are a big part of what pushed AMD out.)
AMD and ARM-based solutions too. I can't picture wanting to run Windows on ARM for a few product generations, but for server use I'd have no problem w/ Linux on ARM.
Intel tried to get away from x86 over and over again with radically overengineered architectures that didn't work (as well as mainstream architectures) and failed: the iAPX 432, the i860, and Itanium (IA-64) all come to mind.
Honestly, they'd probably be better off if they ditched all the sed/awk/macro BS and just went back to bash scripts (or Perl/Tcl, if you don't like weird syntax issues) that spat out C code. Rust saw the writing on the wall and implemented proc macros.
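To make "scripts that spat out C code" concrete, here's a toy generator. Python stands in for the bash/Perl, and the errno table contents are purely illustrative; the point is that a declarative table plus a real language beats assembling the same thing out of sed/awk passes and macro expansion.

    # Toy code generator: turn a declarative table into a C lookup table.
    ERRORS = [
        ("EPERM", 1, "Operation not permitted"),
        ("ENOENT", 2, "No such file or directory"),
        ("EIO", 5, "I/O error"),
    ]

    def emit_c(errors):
        lines = [
            "/* Generated file -- do not edit by hand. */",
            "static const struct err_desc {",
            "    int code;",
            "    const char *name;",
            "    const char *msg;",
            "} err_table[] = {",
        ]
        for name, code, msg in errors:
            lines.append(f'    {{ {code}, "{name}", "{msg}" }},')
        lines.append("};")
        return "\n".join(lines)

    if __name__ == "__main__":
        print(emit_c(ERRORS))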
Continue forever? No. Be performed once at no cost? Maybe.
I'm kinda sceptical that a computation to which the 2nd law is indifferent would occur spontaneously without immediately reversing. The 2nd law is what determines the direction in which things typically progress.
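For reference, the "once at no cost" intuition is Landauer's bound: the 2nd law only charges for logically irreversible steps (erasing a bit), while a reversible step leaves total entropy unchanged, which is exactly the sense in which the law is "indifferent" to it and supplies no arrow either way:

    E_{\text{erase}} \;\ge\; k_B T \ln 2 \quad \text{(minimum dissipation per irreversibly erased bit)}
    \Delta S = 0 \quad \text{(logically reversible step: no entropy bill, no preferred direction)}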
consumer, at the store: "Why doesn't this thing have <magic MacGuffin>?! Bring me something that has <magic MacGuffin>!!!"