> now you're getting convicted instead of having an opportunity to correct the problem
You have completely ignored the fact that this is literally the opposite of what has (likely) happened in this case, and is provably almost the norm in countless other cases like this. There was an opportunity to make things right by the leadership. They did not choose to do so. At every step along the way.
Even assuming that these were honest mistakes (I don't believe so), the actions taken by leadership in most of these situations tend to wring the complainant out of the system. If there's a "free market" way to resolve these problems, this certainly ain't it, which calls for regulations. If those regulations aren't strong enough to coerce the intended result (if you find out one of your employees acted in this manner, immediately contact law enforcement and let them figure it out, otherwise you're in trouble as well), then you are actively encouraging an extrajudicial mechanism for resolving these matters. Specifically, you're encouraging civil remedies vs. criminal ones.
Failure to report isn't the problem here. It's that the failure to report has no consequences on people with adequate power, who get to create their own kangaroo court of dispute and certainly resolve cases to their own benefit.
> You have completely ignored the fact that this is literally the opposite of what has (likely) happened in this case
What happens in a particular case is not what motivates behavior in the aggregate. People have incomplete information and use heuristics. If you set up a system that causes people to loathe interacting with the government, the prevailing heuristics come to be about what you would expect.
> There was an opportunity to make things right by the leadership. They did not choose to do so. At every step along the way.
And that's what the system we have encourages. The prevailing rules and political climate empirically lead to this result.
> If those regulations aren't strong enough to coerce the intended result (if you find out one of your employees acted in this manner, immediately contact law enforcement and let them figure it out, otherwise you're in trouble as well), then you are actively encouraging an extrajudicial mechanism for resolving these matters.
Consider the alternatives you've laid out here.
Their first option is to go to law enforcement right away, but that immediately leads to a scandal, bad PR, legal expenses, etc. Unless you can significantly mitigate these deterrents, people will inherently have a disinclination to do this, whether it's "required" or not.
Their second option is to try to arbitrate the situation internally. This has an intrinsic advantage because you get an attempt to handle things by people who know the parties and their circumstances, and if it turns out to be a false accusation or some other shenanigans you don't get a public scandal. And if that doesn't work the first option is still available afterwards. So people are going to want to start here.
Now suppose you say that they're not allowed to start there. People are still going to want to and a lot of times they're still going to do it anyway. But once they have, they're now under much more pressure to make sure it goes away even if it turns out not to be a false accusation, because the initially-innocent people who were just trying to avoid a scandal are now regarded as co-conspirators who could be charged if they don't engage in an effective coverup. That is a helluva perverse incentive to create.
> Specifically, you're encouraging civil remedies vs. criminal ones.
Which is a trade-off, but not always the worst one.
> It's that the failure to report has no consequences on people with adequate power, who get to create their own kangaroo court of dispute and certainly resolve cases to their own benefit.
"People with power will use their power to their own advantage" is nearly a tautology. The question is, how do you create a system that produces reasonable outcomes in that context?
A system in which intuitive human responses land people in situations that only an exercise of power can extricate them from will both encourage that result and disadvantage people without influence, which is bad. Ideally you want a system that doesn't harm any innocent people because it's efficient and reasonable, so people don't expect to be unjustly damaged by interacting with it.
The IEA has done a pretty terrible job of predicting the mix of energy sources over the recent past, largely being overly biased towards fossil fuels. I honestly wouldn't put a whole lot of stock in their predictions, even when the claim is that energy-related CO2 will peak in 2025.
For those of us chuckling at it now, this is likely to be far, far bigger than Y2K. The reason is that software running critical aspects of human life will have proliferated far more by 2038 than it had by 2000.
Before 2000, we had software running our systems, yes. But it was not as distributed, and not as ubiquitous, and not as deeply ingrained into human culture as it is today. This proliferation will obviously continue past today, and while hardware and low-level OS/software mitigations (as well as a herculean effort to clean up the mess) will make up the gap, it's not hard to see that this is likely to be much more impactful upon failure because of the "embeddedness" of these systems.
A box that has quietly been doing its thing for 40, 50, or 60 years and suddenly fails is likely to be more impactful than one that was only 20 or 30 years old.
Y2K38 is a "standard Unix functions" problem. I don't think the surprises this time are going to be mission critical software doing date math, I think the surprises are going to be "non-mission critical apps" that "don't do date math" where it is not that "no one wants to risk touching it" but more "no one has thought to touch it in years because it isn't mission critical and it's just some random thing in the stack".
Case in point, this article's pointing a finger at fontconfig. Who considers fontconfig mission critical software? fontconfig has been open source forever, is it just "too boring" that no one has bothered to do a Y2K38 audit on it? It probably doesn't even really care about dates for the most part, so maybe no one even realizes it needs a date math audit? Multiply that across the very long tail of Unix apps and libraries since 1970. That's the weirder risk of Y2K38 than Y2K: the huge amount of "non-mission critical"/"non-date math" code that potentially exists in every Unix-derived tool. (With all non-Windows OSes in common usage today themselves being Unix-derived, that's a lot of surface area.)
Y2K was looking for everything that did date math with varchar(2) or 2-digit BCD. Those were needles in haystacks, certainly, but the needles were sharp enough that you knew when you found one. Y2K38 is looking for subtle differences in (mostly) C macros and C library function calls, and making sure that time_t values are appropriately sized on modern platforms. That almost sounds to me more like looking for particularly colored straws in a haystack.
Why would that be true in any alternate versions of these contests? I understand the enormous incentive mechanisms involved, and yet I cannot see this being obviously true.
Edit to add: if the authorship of the submitters is as above reproach as we are led to assume, why can that not be the case for the NIST decision panel itself?
While I largely agree with you, and respect your opinion on these matters, the truth is that Dual EC was indeed a NIST standard, and therefore, as a practical matter, it did get deployed by the public in whatever ill-informed manner, without the benefit of your elite cadre's advice (yes, joking).
I appreciate the point about trust in the authorship of those presenting these algorithms, and I personally do accept it, but there's a lack of trust broadly (in the very community that these standards are intended for) in the process that your comments don't account for in this instance.
Great news. While this whole saga has built a sense of a trainwreck waiting to happen, I think this is the best result that everyone involved could have walked away with.
I have no doubt about Elon's intentions about what he plans to fix about Twitter the product, but I certainly don't expect things to go smoothly with wrangling Twitter the organization to actually achieve those results.
Tesla and SpaceX were built from the ground up with his personal principles in mind, and I think it will be quite the roller coaster for him to bring in the same sense of passion, discipline, and efficiency to Twitter. Twitter's inability to address many of its own longstanding issues is surely caused by a lack of will to do so. Turning around this ship will be no easy feat.
About 15 years ago, while working on Windows at Microsoft, a test machine sitting at my desk hit a kernel panic (BSOD). As was standard on the test team, the machine was already set up for kernel debugging, so I set out to debug it a bit in order to file a decent bug report.
Hours later, I couldn't make sense of it (I wasn't super experienced at this point). A few of the nearby devs couldn't either, and a small troop of us curious enough about the puzzler eventually escalated to the resident wizard, Raymond Chen[1]. Within 15 minutes of checking our work and poking at the machine, he traced the root cause down to a bit flip.
Honestly, I'm glad you brought up the reference to the awful censorship of this hilariously terrible movie. It made a mark somehow, and I still remembered it (and waited for a reference to it).