
Your argument falls apart when you remember that the U.S. federal, state, and local governments are critically dependent on open source software – not just directly in things like Linux servers or Chrome/Edge/Firefox, but also in open source components used in appliances or compiled into commercial software. It is quite reasonable to argue that even a narrow approach of improving only the components they run would be justified on those grounds, and it would be a tiny part of, say, NIST's budget to fund developers directly or to pay a group like the Linux or Apache foundations to support an open source maintainership team.


Organizations of all types -- government and otherwise -- are dependent on a wide variety of externally-sourced solutions for mission-critical operations. They can and do develop their own processes for testing and vetting potential solutions against their own criteria for performance, reliability, maintainability, and security.

Government orgs can and do contribute the results of the work they do in this regard upstream to FOSS projects. That has always been the case, and when government-employed developers release the work they do to meet their own security requirements to the broader community, everyone benefits.

But this is drastically different from the scenario that the preceding poster was proposing, in which government officials would assume effective responsibility for the entire project, not just act as participants in the FOSS community.

That proposal would invert the situation, and change it from government devs adhering to the norms and conventions of the community to the community adhering to the rules and priorities defined by the government, which is where the negatives I outlined above would come into play.


I fail to see how you addressed the previous comment here. You ignored 80% of the points made.

Donations are fine, but they need to be “no strings attached”; otherwise I agree with the GP that the risk of weaponising FOSS may become even greater.

I do agree that the US government is critically dependent on FOSS by now, though. But “why throw money at it if it ain’t broken?” is the prevailing mentality, especially when everyone can have their own definition of “broken”…


The first sentence was a category error. It is a national security issue as long as systems that national security depends on are running the code in question. I totally agree on the FOSS side, but consider that the proposal could be as simple as having, say, NIST give the Linux Foundation money annually to pay a supporting team of developers rather than the government taking over maintenance of anything.


> It is a national security issue as long as systems that national security depends on are running the code in question.

This is itself a category error. By this standard, anything and everything upstream of certain government agencies implicates "national security": you might as well say that the supply chain for staplers is a national security issue because government officials staple documents together, or that the manufacture of socks is a national security issue because government agents wear socks.

Naturally, it's up to organizations themselves to make sure that their particular usage of any specific resource meets their exceptional security needs, not to expand the definition of "national security" to encompass the entire upstream supply chain independently of their use cases.

> NIST give the Linux Foundation money annually to pay a supporting team of developers rather than the government taking over maintenance of anything.

This in itself would create a nexus of influence that could ultimately function as a vector for social engineering attacks, including from factions within our own government. It's not nearly cut-and-dried enough to presume that government money is some sort of magic solution that wouldn't itself actually make things worse.


> you might as well say that the supply chain for staplers is a national security issue because government officials staple documents together, or that the manufacture of socks is a national security issue because government agents wear socks.

You’re leaving out the key part: this only works if there’s a way for a flaw in those staplers or socks to impact national security functions. Once you make the analogy correctly, you can easily see why that’s true for a server operating system but not for either of your examples.


What if a threat actor embedded secret listening devices into staplers? What if wool socks specifically designed to maximize ESD discharge were used to disable sensitive equipment?

Organizations that are concerned with outlandish risks like these implement their own policies and procedures to safeguard against them. They might x-ray office supplies before allowing their use in secure facilities; they might maintain a short list of approved fabrics in an ESD-sensitive environment. The point is that organizations with exceptional security requirements apply their own policies and procedures to mitigate risk, and don't expect parties upstream of them to do so for them.


Socks and staplers are not part of the security infrastructure. Computers are, and that means it’s in the interests of everyone to keep them secure.


"Computers" is too broad to be meaningful. Compression tools are no more a part of the security infrastructure per se than socks and staplers are.



