There seems to be strong lobbying for insects as human food, in particular from companies that would be happy to feed us their own shit, as long as it's cheap and they could get away with it.
The green-left seems to enjoy the idea. Exactly why is hard to tell - especially on HN - but let's say I don't think it's rational.
The why is not that hard to understand - insects provide a lot of protein relative to how much food they consume over their lifetime.
But yes, the obvious place to start is feeding them to chickens, not humans. Why chickens? Because insects are part of their natural diet when they range freely. There are just a bunch of infrastructure problems to solve first, since insects pose rather different challenges than other parts of the food production chain.
None of which requires startups, science or factories.
If you put cows on a field for a day, wait three days for insects to infest their shit, then put chickens on the field, the chickens scratch through the cow shit and eat the bugs. The cow shit gets nicely spread out and fertilises the soil more quickly.
The problem with this system is that it doesn't allow rich people to screw mega bucks out of the government for doing no work at all.
No, but there is a non-green left. And the greens do get most of their policy influence by associating with the rest of the left, since there are very few green parties that govern directly, or at least alone. So it's fair to say that such initiatives are successful because a subset of the broader left, the green-left, likes the idea.
> As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people.
GPG is terrible at that.
0. Alice's GPG trusts Alice's key tautologically.
1. Alice's GPG can trust Bob's key because Bob's key carries Alice's signature.
2. Alice's GPG can trust Carol's key because Alice has Bob's key, and Carol's key is signed by Bob.
After that, things break. GPG has no tools for finding longer paths like Alice -> Bob -> ??? -> signature on some .tar.gz.
I'm in the "strong set", I can find a path to damn near anything, but only with a lot of effort.
The good way used to be the path finder, some random website maintained by some random guy, which disappeared years ago. The bad way is downloading a .tar.gz, checking the signature, fetching the key, then fetching every key that signed it, in the hope somebody you know signed one of those, and so on.
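What the manual process above amounts to is a graph search GPG never shipped: keys are nodes and signatures are edges. A minimal sketch, with a made-up keyring (real tooling would work on fingerprints, not names):

```python
# Toy model of trust-path finding over a web of trust: breadth-first
# search from your own key to the key that signed the .tar.gz.
# All names and the `web` dict are hypothetical.
from collections import deque

def find_trust_path(signatures, start, target):
    """Shortest chain of signatures from `start` to `target`.

    `signatures` maps each key to the set of keys it has signed.
    Returns the path as a list of keys, or None if no chain exists.
    """
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for signed in signatures.get(path[-1], ()):
            if signed not in seen:
                seen.add(signed)
                queue.append(path + [signed])
    return None

# Alice signed Bob, Bob signed Carol, Carol signed the maintainer's key.
web = {
    "alice": {"bob"},
    "bob": {"carol"},
    "carol": {"maintainer"},
}
print(find_trust_path(web, "alice", "maintainer"))
# -> ['alice', 'bob', 'carol', 'maintainer']
```

The search itself is trivial; the hard part in practice is getting the signature graph into memory at all, which is exactly where GPG's keyring handling falls over.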
And GPG is terrible at dealing with that, it hates having tens of thousands of keys in your keyring from such experiments.
GPG never grew into the modern era. It was made for people who mostly know each other directly. Verifying the keys of random free software developers on the other side of the world isn't something it ever did well.
What's funny about this is that the whole idea of the "web of trust" was (and, as you demonstrate, is) literally PGP punting on this problem. That's how they talked about it at the time, in the 90s, when the concept was introduced! But now the precise mechanics of that punt have become a critically important PGP feature.
I don't think it punted so much as it never had that as an intended use case.
I vaguely recall the PGP manuals talking about scenarios like a woman secretly communicating with her lover, or Bob introducing Carol to Alice, and people reading fingerprints over the phone. I don't think long trust chains and the use case of finding a trust path to some random software maintainer on the other side of the planet were part of the intended design.
I think to the extent the Web of Trust was supposed to work, it was assumed you'd have some familiarity with everyone along the chain and work through it step by step. Alice would know Bob, who'd introduce his friend Carol, who'd introduce her friend Dave.
I think that the important conclusion to make of this is that publicly available code is not created or even curated by humans anymore, and it will be fed back into data sets for training.
It's not clear what the consequences are. Maybe not much, but there's not that much actual emergent intelligence in LLMs, so without culling by actually running the code, there seems to be a risk that the end result is a world with even more nonsense than today.
This already happened a couple of years ago for research on word frequency in published texts. I think the consensus is that there's no point in collecting anymore since all available material is tainted by machine generated content and doesn't reflect human communication.
I think we'll be fine. AIs definitely generate a lot of garbage, but then they have us monkeys sifting through it, looking for gems, and occasionally they do drop some.
My point is, AI generated code still has a human directing it the majority of the time (I would hope!). It's not all bad.
But yeah, if you're 12 and just type "yolo 3d game now" into Claude Code - I'd say I'd be worried about that, but then I immediately realized no... that'd be awesome.
Old English and Old Norse are mutually intelligible (especially after you realize the precise correspondences like un- = o-). Gunnlaugs Saga explicitly says the English and Norse are of one tongue and features a Norse poet singing to an English king. As another example, Ohthere of Hålogaland (Norway) visited King Alfred's 9th century English court and simply spoke to them in his own language:
> Whoever preserved this story was also curious about Ohthere’s descriptions of where the Angles had lived ‘before they came into this land’ (England). Members of Alfred's court remembered that their ancestors came from mainland Europe, and they wanted to learn more about the lands which they identified as their own places of origin.
The scribe explicitly wrote things like "he said krán which we call crein", showing they were each speaking their own language. It's even clearer if you consider that our standard Old English is West Saxon from around 850, while our standard Old Norse is from 1250 in Iceland (further removed than the Danish variety most Scandinavians in England spoke). Compared at the same point in time, they would have had even more in common (8th-century Danish still had wír before w turned to v).
I think there was a limit of something like 60 days. At least my bank apparently refused all transactions that were settled too late.
I had a job that involved a lot of taxi trips, and when I cross-checked, 30% of the trips were never charged to my account. I suppose they just filled the glove box with old slips until it wouldn't shut. Hotels never failed.
I once worked on translating an application to Polish, and found out we had to have separate placeholders for "name": one for persons (nazwisko) and one for things (nazwa).
Which is a simple example why you need context.
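This is the problem message contexts (gettext's msgctxt) solve: the same source string gets distinct catalog entries per context. A minimal sketch with a hand-rolled catalog; the helper name `tr` and the dict layout are made up for illustration:

```python
# Context-aware string lookup: the English msgid "name" maps to two
# different Polish strings depending on context. The catalog format
# here is a toy; real apps would use gettext .po/.mo files.
CATALOG = {
    ("person", "name"): "nazwisko",  # a person's name
    ("thing", "name"): "nazwa",      # the name of a thing
}

def tr(context, message):
    """Look up `message` under `context`, falling back to the message itself."""
    return CATALOG.get((context, message), message)

print(tr("person", "name"))  # -> nazwisko
print(tr("thing", "name"))   # -> nazwa
print(tr("thing", "size"))   # -> size (untranslated, falls back)
```

The key design point is that the lookup key is the pair (context, msgid), not the msgid alone, so translators can diverge where English happens to overload one word.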
All UI frameworks should have a "translate" mode, where all labels and static text can be right-clicked and modified...
I think the current estimate (as of 2025) is that there are almost half a billion more Christians than Muslims.
One reason is that the number of Christians in Sub-Saharan Africa is growing.
But extrapolating the trends, yes Islam will probably become the largest religion in the coming decades.
Or at least maybe - looking at birth rates, it seems that second-generation Muslim immigrants to Western countries have even lower birth rates than the native population. The same might happen in regions like, say, Pakistan, Indonesia, and other fast-growing places, depending on economic and other changes.
So I guess, successful lobbying?