Hacker News | cedilla's comments

Is this a new thing or do you think that most professors were always unable to do their job? Why do you think you are an exception?

I don't believe that your argument is more than an ad-hoc value judgment lacking justification. And it's obvious that if you think so little of your colleagues, they would also struggle to implement AI tests.


I assume that TVs have bad sound because better speakers just don't fit into their form factor.


Well, when fascists are in power, paper won't help anyone. But at this point, as a European I enjoy enumerated human and civil rights from multiple constitutions and several international treaties, which are directly enforceable by courts at the state, national, and European level.

The human and civil rights guaranteed by the US constitution are a complete joke in comparison, and most of them are not guaranteed directly by the constitution, but by Supreme Court interpretation of vague 18th century law that can change at any time.


You seem to have missed the Bill of Rights. Which is odd, because whenever we tell you during online arguments that our rights are guaranteed, you all say that absolute rights are dumb and it's actually more sophisticated and European to not have them.

Not that courts, legislators, and administrations haven't tried and succeeded in abridging them somewhat in any number of different ways for shorter or longer periods, but the text remains, and can always be referred to in the end. They have to abuse the language in order to abridge the Bill of Rights, and eventually that passes the point of absurdity.

No such challenge in Europe. Every "right" is the right to do something unless it is not allowed.


Deutsche Bahn is anything but wasteful; it's underfunded to the tune of tens of billions per year. Clearing trees up to ten meters away from the tracks was seen as too expensive, so now 6m tall trees fall onto them all the time during storms. Having two rail lines running next to each other was seen as unnecessary, so now we have no backup when one fails.

The Swiss railway is seen as the ideal DB should strive for, but the fact is that Switzerland invests more than double per capita in its rail infrastructure. German stinginess has now compounded over decades, and that's not the fault of management.


Underfunded and wasteful are not opposites. When there is not enough money to do things properly, there is often a lot of duct taping going on, which wastes available resources without fixing anything. There is scarcely anything more expensive in government than saving money.


I'll never understand the amount of vitriol Wikipedia volunteers must receive. Why is the deletion (or even deletion proposal) regarded as such a heinous act that people feel the need to attack and bully others?

I find this kind of behaviour and rhetoric wholly unacceptable.


> Why is the deletion (or even deletion proposal) regarded as such a heinous act that people feel the need to attack and bully others?

FWIW I don't see this as an attack (with, perhaps, the exception of a couple of comments in the linked thread) and posted the link to the reddit thread as I see it more as an interesting observation around the myriad issues facing "legacy" languages and communities. To wit:

* Google appears to be canon for finding secondary sources, according to the various arguments in the deletion proposals, yet we're all aware of how abysmal Google's search has been for a while now.

* What's the future of this policy given the fractured nature of the web these days, walled gardens, and now LLMs?

* An article's history appears to be irrelevant in the deletion discussion: the CPAN page (now kept) had 24 years of history on Wikipedia, with dozens of sources, yet was nominated for deletion.

* Link rot is pervasive, we all knew this, but just how much of Wikipedia is being held up by the Wayback Machine?

* Doesn't this become a negative feedback cycle? Few sources exist, therefore we remove sources, therefore fewer sources exist.


> Google appears to be canon for finding secondary sources, according to the various arguments in the deletion proposals, yet we're all aware of how abysmal Google's search has been for a while now.

Nobody is forcing you to use Google. If you can provide an acceptable source without the help of Google, go ahead. But the burden of proof is on the one who claims sources exist.

> An article's history appears to be irrelevant in the deletion discussion: the CPAN page (now kept) had 24 years of history on Wikipedia, with dozens of sources, yet was nominated for deletion.

Such is life when anyone can nominate anything at any moment... and when many articles that should never have been submitted in the first place slip through the cracks of haphazard volunteer quality control. (Stack Overflow also suffers from the latter.)

The sources are the only part that matters. And they sufficed to keep the CPAN article on site, so the system works.

> Doesn't this become a negative feedback cycle? Few sources exist, therefore we remove sources, therefore fewer sources exist.

It was wrong to submit the article without sourcing in the first place. Circular sourcing is not allowed.


> The sources are the only part that matters. And they sufficed to keep the CPAN article on site, so the system works.

The system works if the sources remain available, and in an environment predisposed to link rot that can be a problem. Imagine the hypothetical situation of archive.org disappearing overnight. Should we then delete all pages with it as their sole source if they're not updated within a week?

And the system works if intentions are pure - it seems the user who suggested the deletion of several Perl-related pages is a fan of film festivals[1] and clearly wasn't happy that the "White Camel Award" has been a Perl award since the late 90s, rather than a film festival award (which only dates to the early 00s). At least according to Google. So they went on a bit of a rampage against Perl articles on Wikipedia.

You could argue "editor doing their job", but I would argue "conflict of interest".

[1]: https://en.wikipedia.org/w/index.php?title=Sahara_Internatio... # amongst many in their edit history


These are all bad-faith takes. What are you doing?

24 years ago, some people wrote on Wikipedia instead of elsewhere. So the wiki page itself became a primary source.

"The page shouldn't have been submitted..." This was a Wiki! If you're unfamiliar with the origin of the term, it was a site mechanism designed to lean in to quick capture and interweaving of documents. Volunteers wrote; the organization of the text arose through thousands of hands shaping it. Most of them were software developers at the time. At a minimum, the software-oriented pages should get special treatment for that alone.

You're acting as though this is producing the next edition of Encyclopedia Britannica, held to a pale imitation of its standards circa the 1980s. The thing is, Britannica employed people to go do research for its articles.

Wikipedia is not Britannica, and this retroactive "shame on them" is unbelievable nonsense.


Verifiability is a core policy on Wikipedia, and with time, citing your sources has become more and more important. Wikipedia isn't what it once was in 2001. Articles can't survive on being verified by their own primary sources, for the same reason we don't want Wikipedia to become a dumping ground for advertisers who then cite their own site in an attempt to gain legitimacy. Secondary sources provide a solid ground truth that the subject in question has gained recognition and thus notability. If those secondary sources don't exist, we can't assume notability based on nothing.

Wikipedia isn't Britannica, because by this point it's probably a lot better than Britannica. They were comparable already in 2005,[1] and I have little reason to believe that Wikipedia is doing much worse on that front nowadays, even though they have vastly more content than Britannica.

[1] https://www.cnet.com/tech/tech-industry/study-wikipedia-as-a...


Some of the deleted pages never had the "sources missing" tag set for any significant time. They went straight to deletion.

Some pages that survived the deletion (e.g. TPRF) had the "missing sources" tag set for 15 years... which, I have to admit, can justify some action. But it was not the case for the PerlMonks and Perl Mongers pages: those just got deleted on extremely short notice, making it impossible for the community to attempt any improvement.


7 days is policy for a deletion proposal,[1] which I can agree is not really enough time, although it's usually extended if talks are still ongoing.

There aren't really any rules about putting up notices and such before proposing deletion, and if you can't find anything other than primary sources, it doesn't seem unreasonable to propose deletion rather than propose a fix which can't be implemented. Thankfully, someone did find reliable sources for some of the articles.

[1] https://en.wikipedia.org/wiki/Wikipedia:Deletion_policy#Prop...


> If you can provide an acceptable source....

https://arstechnica.com/gadgets/2021/08/the-perl-foundation-...

https://www.theregister.com/2021/04/13/perl_dev_quits/

20 seconds.

If I ran Wikipedia I would ban everyone involved in this spectacle.


> And they sufficed to keep the CPAN article on site, so the system works.

This is such an absurd take. “In this one example the system worked, so clearly it’s fine.”


People get extremely frustrated and upset about arbitrary rules, especially when they are imposed inconsistently.

From the talk page it seems like exactly three people were involved in deciding if this was worth deleting and they indicated they could not find evidence of notability. Meanwhile I found a Register article about PerlMonks in minutes and there are pointers here to Google Scholar references as well.

When the bar for deletion is “a couple of people who didn’t try very hard didn’t find notability” is it any wonder that there’s pushback? This feels entirely arbitrary.


Consider the other perspective: how should Perl programmers feel when Google's index becomes the main criterion for what is considered important or not? This creates a circular dependency that can erase genuine technical contributions from the historical record.


The Google index is tailored to each individual. People with an interest in breeding cats won't be served Perl results.

If the Google index becomes a criterion of notability, we are in deep, deep shit.


Because it puts the history of the article behind a lock

I wonder if there are any privileged Wikipedia accounts who have defected and are doing a sci-hub thing.


> Why is the deletion (or even deletion proposal) regarded as such a heinous act

"Those who control the past, control the future"


I completely agree. I struggle to think of any legitimate business that would allow only gift cards. Maybe some privacy oriented VPN providers?

In any case, I think this is almost a willful misunderstanding. Not only does it attack the straw man of "no one ever gets legitimately paid in gift cards", but literally the first counterexample, Paysafecard, isn't a gift card!


I work with someone who does payments for adult sites etc., and even though they do offer Paysafecard, not a ton of revenue is generated through it, because the fees for the creators are quite high and I guess it's just inconvenient.

Most people who want to spend their money just do it using credit card, bank transfer, whatever.


And note that in the VPN situation the customer is the one initiating the transaction. I want X, and the only acceptable payment is a gift card; the person buying the gift card knows that it's specifically being done to make the transaction very hard to track. That's a very different thing than someone demanding a bill be paid via a gift card.


Blizzard runs several popular games where you need to buy their currency before you can buy anything. I don't know if it's the case anymore, but Microsoft used to require Xbox Gold to purchase games. Usually this means locking up more than the purchaser intended to spend.


AFAIK in most games or storefronts with a real-money exchange pipeline, the resulting units are simply not gift-able. Being unable to exchange value with other users makes it qualitatively different.

In other words, you spend regular money for company-points, but thereafter you can only spend the company points on things that cannot be transferred. While there is certainly a cynical aspect to locking up customer funds, it makes it a lot easier to handle things like fluctuating currency exchange rates, and simplifies refunds within the points-store.


OTOH even as someone who played a popular online Blizzard game for years (and realistically spent a decent amount of money on it), maybe it's not the worst thing in the world if this sort of thing becomes considered more "scam" than "legitimate business model". There's almost never a direct 1:1 ratio between the real money you convert into the currency and the price of the thing you want to get (which isn't that surprising, as it would probably be pointless to ask someone to put in $20 to convert to widgets only to immediately ask for all of their widgets for the item they want rather than ask for the $20 directly), which means you either buy more than you intended or hold onto it in the hopes that you can put it towards something else you want later.

What percentage of people who have ever bought one of these currencies do you think don't currently still hold some left over from their most recent purchase? What percentage of people who bought some and later stopped playing the game managed to spend all of it before they stopped? All of those people have basically been taken advantage of IMO (probably knowingly, but that's hardly an excuse when there's a power imbalance). Even if the relative injustice is small compared to other things in the world, it would still probably be better for business models like that not to exist.

(edit: in retrospect "OTOH" was a poor choice of words since this isn't really a different point than the parent comment is making)


More forget than fire.

The longer certificates were valid, the more often we'd have breakage due to admins forgetting the renewal, or forgetting how to install the new certificates. It was a daily occurrence, often with hours or days of downtime.

Today, it's so rare I don't even remember when I last encountered an expired certificate. And I'm pretty sure it's not because of better observability...
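
For anyone who'd rather automate the check than rely on memory, here's a minimal Python sketch; the host and the 14-day threshold are placeholders, not anything from this thread:

    import socket
    import ssl
    import time

    def days_until_expiry(host: str, port: int = 443) -> float:
        # Open a verified TLS connection and read the peer certificate's notAfter field.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        return (expires - time.time()) / 86400

    # Alert well before the (nowadays short) lifetime runs out.
    if days_until_expiry("example.com") < 14:
        print("renew now")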


I don't think anyone made the claim that requiring identification while providing German phone numbers would do anything about abuse from Russian botnets or abuse from non-German phone numbers.


For some reason, many people think that gifting money is gauche, but gift cards are somehow okay.


What incompatible versions of Python do you mean? I'm entirely unaware of any forks, and the youngest version I have to supply at the moment is 3.9, which is over 5 years old and available on all supported platforms.


Try to run any random Python program of moderate dep use on your Python 3.9 system interpreter without using containers. Most likely you'll have to use a venv or the like and set up a special version of Python just for that application. It's the standard now because system Python can't do it. In practice, pragmatically, there is no Python. Only pythons. And that's not even getting into the major breakages in point version upgrades or the whole Python 2 to 3 language switch.


> Most likely you'll have to use a venv or the like and set up a special version of Python just for that application.

Using venvs is trivial (and orders of magnitude more lightweight than a container). And virtually every popular package has a policy of supporting at least all currently supported Python versions with each new release.
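
To make "trivial" concrete, and since venv ships in the standard library, here is a stdlib-only sketch of the usual two-step dance (the directory name and package are placeholders, not anything from this thread):

    import subprocess
    import venv

    # Create ./demo-env with its own site-packages and its own pip.
    venv.create("demo-env", with_pip=True)

    # Install into the venv by invoking *its* interpreter, not the system one.
    venv_python = "demo-env/bin/python"  # demo-env\Scripts\python.exe on Windows
    subprocess.run([venv_python, "-m", "pip", "install", "requests"], check=True)
    subprocess.run([venv_python, "-c", "import requests; print(requests.__version__)"],
                   check=True)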

You need to set up a venv because of how the language is designed, and how it has always worked since the beginning. Python doesn't accommodate multiple versions of a package in the same runtime environment, full stop. The syntax doesn't provide for version numbers on imports. Imports are cached by symbolic name and everyone is explicitly expected to rely on this for program correctness (i.e., your library can have global state and the client will get a singleton module object). People just didn't notice/care because the entire "ecosystem" concept didn't exist yet.
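
A self-contained sketch of that singleton-module behaviour (the module name is made up and registered by hand so the snippet runs without extra files):

    import sys
    import types

    # Stand-in for a real settings.py, injected into the import cache by hand.
    settings = types.ModuleType("settings")
    settings.DEBUG = False
    sys.modules["settings"] = settings

    import settings as s1
    s1.DEBUG = True          # mutate module-level (global) state

    import settings as s2    # second import: no re-execution, cached by name
    print(s2.DEBUG)                              # True - both names see the change
    print(s1 is s2 is sys.modules["settings"])   # True - one shared module object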

I have at least one local from-source build of every Python from 3.3-3.14 inclusive (plus 2.7); it's easy to do. But I have them explicitly for testing, not because using someone else's project forces me to. The ecosystem is just not like that unless perhaps you are specifically using some sort of PyTorch/CUDA/Tensorflow related stack.

> It's the standard now because system Python can't do it.

Your system Python absolutely can have packages installed into it. The restrictions are because your Linux distro wants to be able to manage the system environment. The system package manager shouldn't have to grok files that it didn't put there, and system tools shouldn't have to risk picking up a dependency you put there. Please read https://peps.python.org/pep-0668/, especially the motivation and rationale sections.
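
For anyone curious how that gating actually works: as I read PEP 668, the distro drops a marker file named EXTERNALLY-MANAGED next to the standard library, and pip refuses to install into that environment when it sees it. A small sketch to check whether the interpreter you're running is marked that way:

    import os
    import sysconfig

    # Per PEP 668, the marker lives in the directory sysconfig reports as "stdlib".
    marker = os.path.join(sysconfig.get_path("stdlib"), "EXTERNALLY-MANAGED")
    print("externally managed:", os.path.exists(marker))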

> major breakages in point version upgrades

I can think of exactly one (`async` becoming a keyword, breaking Tensorflow that was using it as a parameter name). And they responded to that by introducing the concept of soft keywords. Beyond that, it's just not a thing for your code to become syntactically invalid or to change in semantics because of a 3.x point version change. It's just the standard library that has changes or removals. You can trivially fix this by vendoring the old code.
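
To make that concrete (the function name below is made up, not Tensorflow's actual API):

    # Valid on Python 3.5/3.6, a SyntaxError from 3.7 onwards, because `async`
    # was promoted to a reserved keyword:
    #
    #     def fetch(url, async=False): ...
    #
    # Soft keywords were the fix for repeats of this: `match` (3.10+) only acts
    # as a keyword at the start of a match statement, so this still runs fine:
    match = ["still", "a", "perfectly", "valid", "identifier"]
    print(match)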


> And that's not even getting into the major breakages in point version upgrades or the whole Python 2 to 3 language switch.

Python doesn't use semver and never claimed to do so, but it's probably worth treating "x.y" releases as major versions in their own right (so 2.7 -> 3.0 is a major version bump and so is 3.10 -> 3.11). If you do that, the versioning makes a bit more sense.
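
In practice that reading also matches how people gate their code: compare on the first two components of the version, treating the minor number as part of the "major" version. The required floor below is just an example, not a recommendation:

    import sys

    REQUIRED = (3, 11)  # arbitrary example floor
    if sys.version_info[:2] < REQUIRED:
        raise SystemExit(
            f"needs Python {REQUIRED[0]}.{REQUIRED[1]}+, "
            f"running {sys.version_info.major}.{sys.version_info.minor}"
        )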

