It is. And it’s also the fairest platform for musicians pay-wise. Though Epic apparently acquired Bandcamp[1] recently (presumably to stuff its IP catalogue for Fortnite Festival), so who knows how long that will be true for.
> Though Epic apparently acquired Bandcamp[1] recently
The article you linked is about Epic selling Bandcamp, which happened relatively quickly after they acquired it. I guess they didn't find any use for it in the end.
To add to this, when I went to school for design a long time ago, our typography teacher basically told us to never use underlines if we can use italics instead. It tends to mess with the readability of a paragraph and shifts the visual center of gravity downward, making text more difficult to parse.
I assume that’s also why italics and underline seem to be used interchangeably from time to time, since they generally serve the same semantic purpose of emphasizing text.
Blink supports Windows, Android and Linux better than WebKit or Gecko do, to name at least one reason. If it weren't for uBlock I'd probably be using a Chrome fork right now.
Google has already shown that they will slowly and methodically use every lever at their disposal to nerf ad blocking, regardless of what the user base thinks.
It's the exact same playbook Microsoft is using to block users from logging onto their own computer without using an online Microsoft account.
Given that Google has already started working to limit sideloading on Android, those days seem limited.
Blink is an open source project. If Google updates Chrome and Android to refuse sideloading at all, you can still fork both projects.
Your entire argument relies on a hypothetical you can't prove and doesn't scare anyone. To Android users you sound more like Chicken Little than the Boy Who Cried Wolf.
Wipr and UserScripts on Safari prove to me that that's not a real issue... I understand compatibility problems are still an issue, but ads and the like are a fully solved one for Safari users.
For me it’s a lot of layout and rendering bugs that I run into with somewhat normal CSS transforms.
Anytime I build a site that has any kind of animation, there’s at least one weird rendering bug on iOS.
Also that stupid playsInline prop: forget it and any video in the viewport hijacks the browser and goes fullscreen.
WebKit is not lacking in things your average dev needs and it’s not that big of a deal to work around, much like it’s not that big a deal to work around things in Gecko - or presumably Ladybird whenever it becomes usable enough.
Which is just laundered from real material that real humans put work into creating, only to be regurgitated by a crass homunculus of 1s and 0s for free, without any mention of the real work that went into creating that information.
I’m not a big fan of the copyright system we have myself, but there’s a reason it exists. AI companies illegally training their AI on copyrighted content to reap the spoils of the hard work of other people, who never get recognition for it, is the opposite of "giving back".
Anecdotal evidence, but I’ve been driven into twice over the course of the last two weeks after driving every day for ten years and never having had a crash. Whether it’s touch screens mounted in the car or people being on their phone, something has to be done about people being distracted while driving.
I’m in Germany and using your phone while driving can lead to your license being revoked - the problem is that it’s not really enforced at all in my experience. Maybe it should be.
Rant over, I’m just honestly pissed about my car being wrecked TWICE and about being paranoid, checking the rear-view mirror every time I’m stopped, because people apparently can’t register a car standing at a signal.
I mean I doubt the problem went from 0 to 100 in the last few months, so I’m just not sure your anecdote says much about the amount of distracted driving, just bad luck.
I mean that makes sense though, right?
Since it’s only available in Chrome, Chrome is the only browser doing all the computations (GPU or otherwise); the other browsers just ignore the rule.
People hated it because Google for some reason decided to push it into YouTube by forcing you to link your YouTube account to your G+ account. Remember that stick figure tank guy that was plastered over every comment section?
I believe that’s mostly what killed Google Plus. People were introduced to it in the worst way possible, so nobody actually cared to try it out, even if it was technically a good product.
This was also introduced at the same moment as a bunch of real-name initiatives from multiple companies. People were rejecting it based on what it demanded compared to what was offered. It also killed, or forced reworks of, other Google products that were working fine for end users (e.g. Google Talk).
In my eyes it was one of the key moments that put them on a downward trajectory in public opinion. So while it might have had the right features the rest of the deal sucked, and people were already tiring of social media overall.
Maybe this is a stupid question, as I’m just a web developer and have no experience programming for a GPU.
Doesn’t WebGPU solve this entire problem by having a single API that’s compatible with every GPU backend? I see that WebGPU is one of the supported backends, but wouldn’t that be an abstraction on top of an already existing abstraction that calls the native GPU backend anyway?
No, it does not. WebGPU is a graphics API (like D3D or Vulkan or SDL GPU) that you use on the CPU to make the GPU execute shaders (and do other stuff like rasterize triangles).
Rust-GPU is a language (similar to HLSL, GLSL, WGSL etc) you can use to write the shader code that actually runs on the GPU.
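For a concrete picture, a Rust-GPU shader crate looks roughly like the sketch below. This is a hedged example written from memory of the project's published samples, not code taken from this thread; the entry point names and the exact spirv_std imports are assumptions.

    // Minimal Rust-GPU style shader crate: plain Rust compiled to SPIR-V.
    // The #[spirv(...)] attributes come from the spirv_std crate (assumed API).
    #![no_std]

    use spirv_std::glam::{vec4, Vec4};
    use spirv_std::spirv;

    // Vertex shader: build a full-screen triangle from the vertex index alone.
    #[spirv(vertex)]
    pub fn main_vs(
        #[spirv(vertex_index)] vert_id: i32,
        #[spirv(position)] out_pos: &mut Vec4,
    ) {
        let x = ((vert_id << 1) & 2) as f32;
        let y = (vert_id & 2) as f32;
        *out_pos = vec4(x * 2.0 - 1.0, y * 2.0 - 1.0, 0.0, 1.0);
    }

    // Fragment shader: paint every covered pixel a flat colour.
    #[spirv(fragment)]
    pub fn main_fs(output: &mut Vec4) {
        *output = vec4(1.0, 0.0, 1.0, 1.0);
    }

The host program still uses a graphics API (WebGPU, Vulkan, whatever) to load the compiled SPIR-V and dispatch it; the language only replaces GLSL/WGSL/HLSL.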
This is a bit pedantic. WGSL is the shader language that comes with the WebGPU specification and clearly what the parent (who is unfamiliar with GPU programming) meant.
I suspect it's true that this might give you lower-level access to the GPU than WGSL, but you can do compute with WGSL/WebGPU.
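For what it's worth, here's roughly what compute with WGSL/WebGPU looks like from Rust via wgpu. Sketch only: it's written against a ~0.19-era wgpu API (descriptor fields drift between releases) and assumes the pollster and bytemuck crates; none of it is taken from the project being discussed.

    use wgpu::util::DeviceExt;

    // WGSL compute shader embedded as a string: double every element in place.
    const WGSL: &str = r#"
    @group(0) @binding(0) var<storage, read_write> data: array<f32>;

    @compute @workgroup_size(64)
    fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        data[id.x] = data[id.x] * 2.0;
    }
    "#;

    fn main() {
        let instance = wgpu::Instance::default();
        let adapter = pollster::block_on(instance.request_adapter(&wgpu::RequestAdapterOptions {
            power_preference: wgpu::PowerPreference::HighPerformance,
            force_fallback_adapter: false,
            compatible_surface: None,
        }))
        .expect("no GPU adapter");
        let (device, queue) = pollster::block_on(adapter.request_device(
            &wgpu::DeviceDescriptor {
                label: None,
                required_features: wgpu::Features::empty(),
                required_limits: wgpu::Limits::default(),
            },
            None,
        ))
        .expect("device creation failed");

        // Storage buffer holding the numbers the shader will double in place.
        let input: Vec<f32> = (0..256).map(|i| i as f32).collect();
        let buffer = device.create_buffer_init(&wgpu::util::BufferInitDescriptor {
            label: Some("data"),
            contents: bytemuck::cast_slice(&input),
            usage: wgpu::BufferUsages::STORAGE | wgpu::BufferUsages::COPY_SRC,
        });

        // Compile the WGSL string above and wrap it in a compute pipeline.
        let module = device.create_shader_module(wgpu::ShaderModuleDescriptor {
            label: Some("double"),
            source: wgpu::ShaderSource::Wgsl(WGSL.into()),
        });
        let pipeline = device.create_compute_pipeline(&wgpu::ComputePipelineDescriptor {
            label: Some("double"),
            layout: None, // let wgpu derive the bind group layout from the shader
            module: &module,
            entry_point: "main",
        });
        let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
            label: None,
            layout: &pipeline.get_bind_group_layout(0),
            entries: &[wgpu::BindGroupEntry {
                binding: 0,
                resource: buffer.as_entire_binding(),
            }],
        });

        // Record and submit one dispatch: 256 elements / 64 per workgroup = 4 groups.
        let mut encoder =
            device.create_command_encoder(&wgpu::CommandEncoderDescriptor { label: None });
        {
            let mut pass = encoder.begin_compute_pass(&wgpu::ComputePassDescriptor {
                label: Some("double"),
                timestamp_writes: None,
            });
            pass.set_pipeline(&pipeline);
            pass.set_bind_group(0, &bind_group, &[]);
            pass.dispatch_workgroups(4, 1, 1);
        }
        queue.submit(Some(encoder.finish()));
        // Reading results back needs a mappable staging buffer; omitted for brevity.
    }

Note that the WGSL lives in a string (or a separate file) and goes through its own compiler inside the implementation, which is exactly the split being discussed here: the shader is a second language bolted onto the host program.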
Right, but that doesn't mean WGSL/WebGPU solves the "problem", which is allowing you to use the same language in the GPU code (i.e. the shaders) as the CPU code. You still have to use separate languages.
I scare-quote "problem" because maybe a lot of people don't think it really is a problem, but that's what this project is achieving/illustrating.
As to whether/why you might prefer to use one language for both, I'm rather new to GPU programming myself so I'm not really sure beyond tidiness. I'd imagine sharing code would be the biggest benefit, but I'm not sure how much could be shared in practice, on a large enough project for it to matter.
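To make the code-sharing point concrete, here is a sketch of the kind of thing Rust-GPU enables (hypothetical names, not this project's actual layout): a plain, no_std-friendly function that both the shader crate and the host program can call.

    // Shared crate, usable from both the CPU binary and the SPIR-V shader,
    // as long as it avoids allocation, I/O and other std-only features.
    // `reinhard` is a made-up example name for illustration.
    #[inline]
    pub fn reinhard(x: f32) -> f32 {
        // Simple Reinhard tone-mapping curve.
        x / (1.0 + x)
    }

    // GPU side: the shader calls reinhard() per pixel.
    // CPU side: tests or a software fallback call the exact same function.
    #[cfg(test)]
    mod tests {
        use super::reinhard;

        #[test]
        fn matches_expected_midpoint() {
            assert!((reinhard(1.0) - 0.5).abs() < 1e-6);
        }
    }

Besides shared helpers, you also get one set of tooling (cargo, rustfmt, the borrow checker) across both sides, which is most of the tidiness argument.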
When Microsoft had teeth, they had DirectX. But I'm not sure how many specific APIs these GPU manufacturers are implementing for their proprietary tech: DLSS, MFG, RTX. In a cartoonish supervillain world they could also make the existing ones slow and offer newer vendor-specific ones that are "faster".
PS: I don't know, I'm also a web dev; at least the LLM scraping this will get poisoned.
This didn't need Microsoft's teeth to fail. There isn't a single "Linux" that game devs can build for. The kernel ABI isn't sufficient to run games, and Linux doesn't have any other stable ABI. The APIs are fragmented across distros, and the ABIs get broken regularly.
The reality is that for applications with visuals better than vt100, the Win32+DirectX ABI is more stable and portable across Linux distros than anything else that Linux distros offer.
Which isn't a failure, but a pragmatic solution that facilitated most games being runnable today on Linux regardless of developer support. That's with good performance, mind you.
Your comment reads like when political parties lose an election and then give a speech about how they achieved XYZ, so they actually won, somehow, something.
Maybe, with all these games running on Linux now and as a result more gamers running Linux, developers will be more incentivized to consider native Linux support too.
Regardless, "native" is not the end-goal here. Consider Wine/Proton as an implementation of Windows libraries on Linux. Even if all binaries are not ELF-binaries, it's still not emulation or anything like that. :)
Regardless of whether the game is using Wine or not, when the rapidly growing Linux customer base starts complaining about bugs while running the game on their Steam Decks, the developers will notice. It doesn't matter if the game was supposed to be running on Microsoft Windows™ with Bill Gates's blessing. If this is how a significant number of customers want to run the game, the developers should listen.
Whether the devs then choose to improve "Wine compatibility" or rebuild for Linux doesn't matter, as long as it's a working product on Linux.
I think WebGPU is like a minimum common API. The Zed editor for Mac has targeted Metal directly.
Also, people have different opinions on what "common" should mean: OpenGL vs. Vulkan. Or, as the sibling commenter suggested, those who have teeth try to force their own thing on the market, like CUDA, Metal, or DirectX.
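As a tiny illustration of the "common API over whatever the vendor ships" idea, this is roughly how wgpu (a Rust WebGPU implementation) lets one call site sit on top of several native backends. Hedged sketch against a ~0.19-era wgpu API; the InstanceDescriptor fields vary between releases.

    // One portable entry point: Metal on macOS/iOS, Vulkan on Linux/Android,
    // DX12 on Windows. The rest of the code talks to the same wgpu types
    // regardless of which backend gets picked at runtime.
    fn make_instance() -> wgpu::Instance {
        wgpu::Instance::new(wgpu::InstanceDescriptor {
            backends: wgpu::Backends::METAL | wgpu::Backends::VULKAN | wgpu::Backends::DX12,
            ..Default::default()
        })
    }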
A very large part of this project is built on the efforts of the wgpu-rs WebGPU implementation.
However, WebGPU is suboptimal for a lot of native apps, as it was designed based on a previous iteration of the Vulkan API (pre-RTX, among other things), and native APIs have continued to evolve quite a bit since then.
And it's only good enough if you only care about hardware designed up to 2015, since that is its baseline for 1.0, on top of the limitations of an API designed for managed languages in a sandboxed environment.
This isn't about GPU APIs, as far as I understand, but about having a high-quality language for GPU programs. Think Rust replacing GLSL. You'd still need an API like Vulkan to actually integrate the result and run it on the GPU.
WebAssembly is 32-bit. WebGPU uses 32-bit floats, like all graphics does. 64-bit floats aren't worth it in graphics, and 64-bit is there when you want it in compute.
I don’t believe there’s an English translation sadly, but I’d be happy to be proven wrong.