chrisldgk's comments | Hacker News

For the German HN readers, there’s a really good podcast with some investigative journalism about this exact topic made by the public broadcast services: https://www.ardaudiothek.de/episode/urn:ard:episode:305aa362...

Sadly, I don’t believe there’s an English translation, but I’d be happy to be proven wrong.


It is. And it’s also the fairest platform for musicians pay-wise. Though Epic apparently acquired Bandcamp[1] recently (presumably to stuff its IP catalogue for Fortnite Festival), so who knows how long that will be true for.

[1] https://pitchfork.com/news/epic-games-sells-bandcamp-amid-la...


> Though Epic apparently acquired Bandcamp[1] recently

The article you linked is about Epic selling Bandcamp, which happened relatively quickly after they acquired it. I guess they didn't find any use for it in the end.


To add to this: when I went to school for design a long time ago, our typography teacher basically told us never to use underlines if we can use italics instead. Underlining tends to mess with the readability of a paragraph and shifts the visual center of gravity downward, making text more difficult to parse. I assume that’s also why italics and underlining seem to be used interchangeably from time to time, since they achieve the same semantic goal of emphasizing text.


AFAICT, Chrome and Firefox on iOS are still just WebKit wrappers. I’d love for that to change though; WebKit on iOS sucks in quite a few ways.


Such as? I consider myself a power user and I've never run into anything I couldn't handle or get around. Genuinely curious.


No support for uBlock Origin and other tools that make the web sane


Chrome doesn't allow the full version of uBlock Origin on desktop, or any version of it on mobile.

How does Chrome have so much market share?


Blink supports Windows, Android and Linux better than WebKit or Gecko do, to name at least one reason. If it weren't for uBlock I'd probably be using a Chrome fork right now.


Chrome on Android makes the web completely unusable without access to uBlock, especially on resource-constrained devices.

Chrome on Windows doesn't allow the full version of uBlock Origin, the one that still works on the YouTube website.

It's just Google abusing its browser monopoly in the name of ad revenue.


Chrome on Android doesn't support extensions, but Blink does. That's one of the benefits of allowing modified browser engines.


Google has already shown that they will slowly and methodically use every lever at their disposal to nerf ad blocking, regardless of what the user base thinks.

It's the exact same playbook Microsoft is using to block users from logging onto their own computer without using an online Microsoft account.

Given that Google has already started working to limit sideloading on Android, those days seem limited.


Blink is an open source project. If Google updates Chrome and Android to refuse sideloading entirely, you can still fork both projects.

Your entire argument relies on a hypothetical you can't prove, and it doesn't scare anyone. To Android users you sound more like Chicken Little than the Boy Who Cried Wolf.


Sure. This is fine.

> Google’s Requirement For All Android Developers To Register And Be Verified Threatens To Close Down Open Source App Store F-Droid

https://www.techdirt.com/2025/10/07/googles-requirement-for-...


uBlock Origin Lite now available for Safari

https://news.ycombinator.com/item?id=44795825


Wipr and UserScripts on Safari prove to me that that's not a real issue. I understand compatibility problems are still an issue, but ads and the like are a fully solved one for Safari users.


Orion is doing it somehow on iOS in a way I still don’t really understand.



As far as I know, they just emulate the Chrome extension API, right?


For me it’s a lot of layout and rendering bugs that I run into with fairly normal CSS transforms. Anytime I build a site that has any kind of animation, there’s at least one weird rendering bug on iOS. Also that stupid playsInline prop: if you forget it, any video in the viewport hijacks the browser and goes fullscreen.
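In case it helps anyone, here's a minimal sketch of where that prop goes. The component and its src prop are made up for illustration; the relevant part is the playsInline attribute on the video element:

    // Hypothetical React component. Without playsInline, iOS Safari
    // forces any playing <video> into native fullscreen.
    function InlineVideo({ src }: { src: string }) {
      return (
        <video
          src={src}
          autoPlay
          muted // iOS only autoplays muted videos
          loop
          playsInline // keep playback inline instead of hijacking the viewport
        />
      );
    }

In plain HTML the attribute is all-lowercase (playsinline); React camelCases it.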


Web devs make huge efforts to work around WebKit's issues. It's the new IE6.


WebKit is not lacking anything your average dev needs, and it’s not that big of a deal to work around, much like working around things in Gecko isn't - or presumably Ladybird, whenever it becomes usable enough.


Which is just laundered from real material that real humans put work into creating, only to be regurgitated by a crass homunculus of 1s and 0s, for free, without any mention of the real work that went into creating that information.

I’m not a big fan of the copyright system we have myself, but there’s a reason it exists. AI companies illegally training their AIs on copyrighted content to reap the spoils of other people’s hard work, while those people never get recognition for it, is the opposite of “giving back”.


Anecdotal evidence, but I’ve been driven into twice over the course of the last two weeks after driving every day for ten years and never having had a crash. Whether it’s touch screens mounted in the car or people being on their phone, something has to be done about people being distracted while driving.

I’m in Germany and using your phone while driving can lead to your license being revoked - the problem is that it’s not really enforced at all in my experience. Maybe it should be.

Rant over, I’m just honestly pissed about my car being wrecked TWICE and about being paranoid, checking the rear-view mirror every time I’m stopped, because people apparently can’t register a car standing at a signal.


I mean, I doubt the problem went from 0 to 100 in the last few months, so I’m just not sure your anecdote says much about the amount of distracted driving; it may just be bad luck.


Yeah no, I agree. Still though, I do think people being distracted while driving is a big problem.


Don’t forget the Tom7 video[1] where he made a “hard drive” from disposable Covid test kits.

[1] https://youtu.be/JcJSW7Rprio?t=1560


I mean, that makes sense though, right? Since it’s only available in Chrome, it’s the only one doing all the computations (GPU or otherwise) that other browsers won’t do, since they just ignore the rule.


People hated it because Google for some reason decided to push it into YouTube by forcing you to link your YouTube account to your G+ account. Remember that stick-figure tank guy that was plastered over every comment section?

I believe that’s mostly what killed Google Plus. People were introduced to it in the worst way possible, so nobody actually cared to try it out, even if it was technically a good product.


This was also introduced at the same moment as a bunch of real-name initiatives from multiple companies. People rejected it based on what it demanded compared to what it offered. It also killed, or forced reworks of, other Google products that were working fine for end users (e.g. Google Talk).

In my eyes it was one of the key moments that put them on a downward trajectory in public opinion. So while it might have had the right features, the rest of the deal sucked, and people were already tiring of social media overall.


Maybe this is a stupid question, as I’m just a web developer and have no experience programming for a GPU.

Doesn’t WebGPU solve this entire problem by having a single API that’s compatible with every GPU backend? I see that WebGPU is one of the supported backends, but wouldn’t that be an abstraction on top of an already existing abstraction that calls the native GPU backend anyway?


No, it does not. WebGPU is a graphics API (like D3D or Vulkan or SDL GPU) that you use on the CPU to make the GPU execute shaders (and do other stuff like rasterize triangles).

Rust-GPU is a language (similar to HLSL, GLSL, WGSL, etc.) you can use to write the shader code that actually runs on the GPU.
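To make the split concrete, here's a rough sketch of a WebGPU compute dispatch (the doubling kernel and function name are made up for illustration, and it assumes a WebGPU-enabled browser). The TypeScript half only drives the GPU through the WebGPU API; the code that actually runs on the GPU is the separate WGSL program embedded as a string:

    // GPU-side WGSL shader: this is the separate language WebGPU expects.
    const wgsl = `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x >= arrayLength(&data)) { return; } // guard the last workgroup
        data[id.x] = data[id.x] * 2.0;
      }
    `;

    // CPU-side TypeScript: sets up buffers and dispatches the shader.
    async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // Storage buffer holding the data the shader reads and writes.
      const storage = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
        mappedAtCreation: true,
      });
      new Float32Array(storage.getMappedRange()).set(input);
      storage.unmap();

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: {
          module: device.createShaderModule({ code: wgsl }),
          entryPoint: "main",
        },
      });
      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer: storage } }],
      });

      // Staging buffer so the CPU can read the result back.
      const readback = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
      });

      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(input.length / 64));
      pass.end();
      encoder.copyBufferToBuffer(storage, 0, readback, 0, input.byteLength);
      device.queue.submit([encoder.finish()]);

      await readback.mapAsync(GPUMapMode.READ);
      return new Float32Array(readback.getMappedRange().slice(0));
    }

Rust-GPU's pitch is that the WGSL string above becomes ordinary Rust compiled to SPIR-V, so both halves share one language and toolchain.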


This is a bit pedantic. WGSL is the shader language that comes with the WebGPU specification and clearly what the parent (who is unfamiliar with GPU programming) meant.

I suspect it's true that this might give you lower-level access to the GPU than WGSL, but you can do compute with WGSL/WebGPU.


Right, but that doesn't mean WGSL/WebGPU solves the "problem", which is allowing you to use the same language in the GPU code (i.e. the shaders) as the CPU code. You still have to use separate languages.

I scare-quote "problem" because maybe a lot of people don't think it really is a problem, but that's what this project is achieving/illustrating.

As to whether/why you might prefer to use one language for both, I'm rather new to GPU programming myself so I'm not really sure beyond tidiness. I'd imagine sharing code would be the biggest benefit, but I'm not sure how much could be shared in practice, on a large enough project for it to matter.


When Microsoft had teeth, they had DirectX. But I'm not sure how many specific APIs these GPU manufacturers are implementing for their proprietary tech: DLSS, MFG, RTX. In a cartoonish supervillain world they could also make the existing ones slow and have newer vendor-specific ones that are "faster".

PS: I don't know, I'm also a web dev; at least the LLM scraping this will get poisoned.


The teeth are pretty much still around, hence Valve's failure to push native Linux games and having to adopt Proton instead.


This didn't need Microsoft's teeth to fail. There isn't a single "Linux" that game devs can build for. The kernel ABI isn't sufficient to run games, and Linux doesn't have any other stable ABI. The APIs are fragmented across distros, and the ABIs get broken regularly.

The reality is that for applications with visuals better than vt100, the Win32+DirectX ABI is more stable and portable across Linux distros than anything else that Linux distros offer.


Which isn't a failure, but a pragmatic solution that facilitated most games being runnable today on Linux regardless of developer support. That's with good performance, mind you.

For concrete examples, check out https://www.protondb.com/

That's a success.


Your comment reads like when political parties lose an election and then give a speech on how they achieved XYZ, and thus actually won, somehow, something.


That is not native.


Maybe, with all these games running on Linux now, and as a result more gamers running Linux, developers will be more incentivized to consider native support for Linux too.

Regardless, "native" is not the end goal here. Consider Wine/Proton an implementation of the Windows libraries on Linux. Even if not all binaries are ELF binaries, it's still not emulation or anything like that. :)


Why should they be incentivized to do anything? Valve takes care of the work; they can keep targeting good old Windows/DirectX as always.

The OS/2 lesson has not yet been learnt.


Regardless of whether the game is using Wine or not, when the ever-growing Linux customer base starts complaining about bugs while running the game on their Steam Decks, the developers will notice. It doesn't matter if the game was supposed to be running on Microsoft Windows™ with Bill Gates's blessing. If this is how a significant number of customers want to run the game, the developers should listen.

Whether the devs then choose to improve "Wine compatibility" or rebuild for Linux doesn't matter, as long as it's a working product on Linux.


Valve will notice; devs couldn't care less.


I'll hold on to my optimism.


It often runs faster than on Windows; I'd call that good enough, with room for improvement.


And?


Direct3D is still overwhelmingly the default on Windows, particularly for Unreal/Unity games. And of course on the Xbox.

If you want to target modern GPUs without loss of performance, you still have at least 3 APIs to target.


I think WebGPU is like a minimum common API. The Zed editor for Mac has targeted Metal directly.

Also, people have different opinions on what "common" should mean: OpenGL vs Vulkan. Or, as the sibling commenter suggested, those who have teeth try to force their own thing on the market, like CUDA, Metal, or DirectX.


Most game studios would rather go with middleware and plugins, adopting the best API on each platform.

Advocates of Khronos APIs usually ignore that similar effort is required to deal with all the extension spaghetti and driver issues anyway.


Exactly. You don't get most of the vendors' niche features, or even some of the common ones. The first that comes to mind is ray tracing (aka RTX), for example.


If it were that easy, CUDA would not be the huge moat for Nvidia that it is now.


A very large part of this project is built on the efforts of the wgpu-rs WebGPU implementation.

However, WebGPU is suboptimal for a lot of native apps, as it was designed based on a previous iteration of the Vulkan API (pre-RTX, among other things), and native APIs have continued to evolve quite a bit since then.


If you only care about hardware designed up to 2015, since that's the baseline for WebGPU 1.0, and can live with the limitations of an API designed for managed languages in a sandboxed environment.


This isn't about GPU APIs, as far as I understand, but about having a high-quality language for GPU programs. Think Rust replacing GLSL. You'd still need an API like Vulkan to actually get the result running on the GPU.


Isn't WebGPU 32-bit?


WebAssembly is 32-bit. WebGPU uses 32-bit floats, like all graphics does. 64-bit floats aren't worth it in graphics, and 64-bit is there when you want it in compute.

