Hacker News | randomint64's comments

Indeed, Rust's supply-chain story is an absolute horror, and there are countless articles explaining what should be done instead (e.g. https://kerkour.com/rust-stdx)

TL;DR: ditch crates.io and copy Go, with decentralized packages imported directly by URL and an extended standard library.

Centralized package managers only add a layer of obfuscation that attackers can use to their advantage.

On the other hand, C/C++-style dependency management is even worse than Rust's, both in terms of development velocity and of dependencies that never get updated.


> countless articles explaining what should be done instead (e.g. https://kerkour.com/rust-stdx)

Don't make me tap the sign: https://news.ycombinator.com/item?id=41727085#41727410

> Centralized package managers only add a layer of obfuscation that attackers can use to their advantage.

They add a layer of convenience. C/C++ are missing that convenience because they aren't as composable and have a long tail of pre-package-manager projects.

Java didn't start with packages, but today we have packages. Same with JS, etc.


Or from another angle, dpkg/apt is the package manager for C/C++ ...


Yeah, but it's not immune to supply chain attacks. Counting on the dpkg maintainers is not that different from counting on the maintainers of a random crate.


WebAuthn may be one of the most important security technologies of the decade. It's a revolution in key management, which may be the hardest part of applied cryptography.

Passkeys enable phishing-resistant and 1-click authentication.

The PRF extension discussed here enables end-to-end encryption of data (with envelope encryption). Think secure chat backups, two-factor disk encryption (password + security key), and much more.
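To make the envelope-encryption idea concrete, here is a toy Rust sketch. Everything in it is illustrative: the XOR "cipher" stands in for a real AEAD/key-wrap (e.g. AES-GCM or AES-KW), and the fixed byte arrays stand in for a genuinely random data key and a real WebAuthn PRF output.

```rust
// Toy envelope encryption keyed by a WebAuthn PRF output.
// XOR is a stand-in for a real cipher; do NOT use this for actual crypto.

fn xor(data: &[u8], key: &[u8]) -> Vec<u8> {
    // XOR each data byte with the (cycled) key; applying it twice is identity.
    data.iter().zip(key.iter().cycle()).map(|(d, k)| d ^ k).collect()
}

fn main() {
    let prf_output = [0x11u8; 32]; // KEK: re-derivable by the authenticator per credential
    let data_key = [0x42u8; 32];   // DEK: random key that actually encrypts the data

    // Envelope: data is encrypted under the DEK, and the DEK is wrapped under the KEK.
    let ciphertext = xor(b"chat backup", &data_key);
    let wrapped_dek = xor(&data_key, &prf_output);

    // Recovery: a fresh WebAuthn assertion re-derives the same PRF output,
    // which unwraps the DEK, which decrypts the data.
    let dek = xor(&wrapped_dek, &prf_output);
    let plaintext = xor(&ciphertext, &dek);
    assert_eq!(plaintext, b"chat backup");
}
```

The point of the envelope is that only the small wrapped DEK depends on the PRF output, so re-encrypting data or rotating credentials never requires re-encrypting the whole backup.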

Soon we will be able to sign app bundles (APKs, IPAs) with hardware security keys.

Great times are ahead for those who care about securing their users' data.


No, it won't.

We've had this technology for decades, TLS client auth with X.509 certificates has been in browsers for a very long time. There just never was any interest in it, and never any investment into making the UI/UX usable beyond the most trivial use cases.

Passkeys are trumped-up certificates with a maybe-optional (depending on attestation status) hardware keystore, and lots of vendor lock-in for Google, Apple and Microsoft. The only reason there is a push right now is that big-vendor interest in lock-in.


A solution that’s perfect except for onboarding (people usually need to pay to get a client X.509 cert!), UX, and authenticating to the completely wrong entity (the TLS terminating load balancer instead of the application or authentication server holding user public key credentials).

Surprising how that didn’t become a slam dunk replacing passwords!


Nope. Browsers even used to have a built-in mechanism (the <keygen> element) to create a keypair and submit the public part to the website in question to register or sign in for access privileges. Exactly what passkeys do nowadays. You never had to pay for your client cert if you didn't need it for mail signatures or something.


UI/UX is extremely important to the impact of technology, doubly so for security technologies which often are held back by the difficulty of using them correctly.


That is correct. What I mean is that if there had been any widespread interest, then browser-makers would probably have fixed their UI/UX long ago. But since there never was any interest, nothing was fixed.


There is tremendous interest. TLS client certificates are just a categorically wrong solution to user authentication.


It is basically the same solution as Passkeys. CA involvement in TLS client auth is totally optional.


Rust is a serious contender in this space, and closing the gap quickly.


Including introducing new features on a six-week basis; just wait until Rust also gets 40 years of history.


It's true, improvements to Rust ship on a six-week cycle; the next release will be Rust 1.69. Nice. I was inspired to improve a compiler diagnostic earlier this year†, and I already benefit from that improvement in today's stable compiler. Whereas if you "miss the train" with standard C++ you've got three years to wait each time, and of course the Powers That Be can ensure that, oops, you just missed the train again...

Of course Rust's improvements are actually compatible, not only by fiat, but because Rust's automation extensively tests each of these six-weekly releases against the vast field of Free Software out there written in Rust. Now maybe this is secretly happening for C++ and they're just very bad at it. Or, as seems more likely, it's not done, and the results are the same either way: new C++ versions require extensive manual testing before you can upgrade your software without too much fear.

† Rust knows that characters like 'A' aren't necessarily one byte, and it deliberately doesn't coerce them to fit in a byte; you'd need to convert them, so let ch: u8 = 'A'; won't compile. But ASCII characters do fit in a byte, so there is byte-literal syntax for that: b'A'. My change means the compiler will explicitly suggest rewriting the earlier mistake as let ch: u8 = b'A';, which works. However, it knows not to recommend nonsense like let ch: u8 = b'£'; the pound currency symbol isn't in ASCII, so you keep the same diagnostic just explaining what's wrong, with no suggestion.
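A minimal reproduction of what that footnote describes; the commented-out lines are the forms rustc rejects, and the live line is the accepted byte-literal fix:

```rust
fn main() {
    // let ch: u8 = 'A';   // error[E0308]: mismatched types; rustc suggests b'A'
    let ch: u8 = b'A';     // byte literal: fine, because 'A' is ASCII
    assert_eq!(ch, 65);    // b'A' is just the byte 0x41

    // let bad: u8 = b'£'; // still rejected: '£' is not ASCII, so no b'...' suggestion
    println!("{ch}");
}
```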


Again, wait until Rust gets 40 years of history and deployments, distributed from tiny 8-bit CPUs to HPC workloads and FPGAs, or stuff running on Mars.

I doubt very much that Rust's editions and backwards-compatibility story will be able to survive 40 years of such diverse use cases without accumulating accidental complexity and corner cases along the way.

This is assuming we can still use Rust and not Crab, as if Rust doesn't have its own share of politics.


Rust is not designed for 8-bit CPUs like the Tiny8. The smallest that usize is allowed to be is 16 bits.

In practice, on these very tiny devices, high-level languages are total overkill. Grace Hopper's original "compiler" concept makes sense, but today's assemblers are more than sufficiently capable. You can literally memorise what all the individual memory locations are used for (actually the Tiny8 just admits they're registers; it's not as if it would make sense to also have separate registers when you only have 256 bytes of RAM), which means even the idea of variable names is of doubtful value.

I don't know if it's practical to write a conforming C++ "freestanding" compiler for Tiny8, but I can't imagine it'd be any more useful than Rust would be if you did.

The reason there isn't stuff on Mars running Rust is mostly that it takes a long time both to get stuff approved for that kind of application and to send things to Mars. Still I'm sure in 40 years there will have been Rust on Mars because why not and I doubt it'll have significant impact on Rust syntax.

There already are inelegant decisions which cannot (for compatibility) be revoked, but they're much less numerous and egregious at this point in Rust's life than similar problems were in standard C++. If you want one to point at, for some reason, I suggest comparing ASCII predicates like char::is_ascii_lowercase(&self) -> bool with the non-ASCII ones like char::is_lowercase(self) -> bool

Because char is Copy, the latter design would be more elegant, and allows e.g. "C++".contains(char::is_uppercase) which is true, whereas the ASCII variant means we need the more awkward looking "C++".contains(|c: char| c.is_ascii_uppercase()) going via a lambda but alas the way we got here didn't allow that to happen.
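Concretely, the asymmetry plays out like this; both lines compile today, and the point is only the extra ceremony the &self signature forces in the ASCII case:

```rust
fn main() {
    // `char::is_uppercase` takes `self` (char is Copy), so the method path
    // itself is already an `FnMut(char) -> bool` and works directly as a pattern:
    assert!("C++".contains(char::is_uppercase));

    // `char::is_ascii_uppercase` takes `&self`, so it needs a wrapping closure:
    assert!("C++".contains(|c: char| c.is_ascii_uppercase()));

    println!("both hold");
}
```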


Wait, maybe you meant one of the other "8-bit" CPUs which actually have 16-bit address bus? That's kinda cheating but yes now we might actually want a programming language, we've got all this RAM to play with, we can make a stack, we can invent data structures, sure, Rust is fine with that setup. Or well, it's crippled, but not in any surprising ways you care about.

But it doesn't seem like there are interesting lessons here? Running the compiler on this sort of hardware was torment (I know, I'm old, I wrote my first software in the 1980s for a Commodore Vic 20, my program source code didn't fit in RAM so my parents had to buy a RAM expansion) but we just wouldn't do that today, we can cross compile from say, a Raspberry Pi, or even a real computer.


You know this is a red herring. The frequency of the release cycle is orthogonal to the amount of change, or even to how long the changes were in development.


Just wait until Rust gets 40 years old.

Pity I'll no longer be around to check on it, given average human life expectancy.


I can't wait for C++64.

But "just wait until Rust will repeat C++'s mistakes" is just pure speculation. Language evolution doesn't have to make the language worse. Java, JS, C#, or Ada are pretty old now, and have been doing fine. Rust is well prepared for a 40-year lifespan with its edition system.


You are quite clearly unaware of the evolution pain points of moving past Java 8 and .NET Framework 4.8, of how the Java community is embracing Java 20 or the C# one is receiving C# 12, and of the rate at which they are adding new features.

As for JS, everyone knows the mess of the Web ecosystem and frontend development.

Ada is "doing just fine" in the sense that most vendors are still adopting Ada 2012. AdaCore and PTC are the only ones of the 7 remaining vendors shipping the latest version.


But the ecosystem lagging years behind the latest version is a separate problem, and one that, ironically, Rust's 6-week release cycle helps with: there are no major upgrades to fear, and small, frequent releases make the ecosystem move with the compiler instead of giving it time to ossify and choose to stay on an old version (the same way nobody chooses to stay on an old Chrome, but people used to stick to good ol' versions of IE and Netscape).


Lagging behind is only one issue. I explicitly mentioned the drama of newer updates that make many uncomfortable, given the rate at which changes for the sake of it are now arriving; just go read the comments on the C# 12 feature announcements for a taste of it.


The language is moving too fast!

Too many features are being added, and seeing proposals like [0] doesn't make me confident about its future. Is it a general-purpose programming language, or is it a research project?

[0] https://blog.rust-lang.org/inside-rust/2023/02/23/keyword-ge...


As an avid reader, I'm disappointed by Amazon's decision to discontinue individual newspaper and magazine subscriptions in favor of Kindle Unlimited. This change reduces my ability to support specific publications I value, potentially leading to less variety in the market.

It diminishes my sense of ownership, as content becomes more ephemeral within a subscription service.

I really can't wait for competitors to shake up this predatory monopoly.

