Hacker News | superice's comments

It’s especially frustrating that dependency hell seems to be embedded in the Python culture. The amount of “oh no this lib will only work with Python 3.10+ and a slew of other libs at random versions we won’t bother to tell you” while some other lib that it depends on will only work on “3.8.56z but not if you look at it funny and only if you say pretty please” is maddening. Semver is apparently not standard practice either.

I am probably biased against Python, so take this opinion with a grain of salt, but it feels to me like a whole ecosystem of amateur software devs (but professional ML-engineers, data scientists etc) cobbling together something that barely works.

I’m old enough at this point that I remember the whole old guard of software engineers falling over themselves to hate on JS and Node, call the ecosystem immature, and be quick to point out how that is not “real” software. But in the past 10-15 years it appears JS and Node got their shit together, while Python is still completely and utterly stuck in managing dependencies and environments like it is 2012. And if you ask professional Pythonistas about this, you always get an answer like “oh it’s actually really easy, you must not have taken the time to really look at it, because Python is easy, it’s just pseudocode look how amazing it all is”

I really wish ML hadn't standardized on Python. As a user of ML tools and frameworks, but not a full-time ML engineer, it's just constant pain.


>It’s especially frustrating that dependency hell seems to be embedded in the Python culture. The amount of “oh no this lib will only work with Python 3.10+ and a slew of other libs at random versions we won’t bother to tell you” while some other lib that it depends on will only work on “3.8.56z but not if you look at it funny and only if you say pretty please” is maddening. Semver is apparently not standard practice either.

There are no problems with this in modern Python if you just use the right tooling.


In the best case scenario, professionally writing software is treated like a craft. You produce something useful while applying your skills, with the tools at your disposal. You can write software as an art form, just like woodworking can be both a craft and an art form. A woodworker in an assembly line doing the same thing over and over again to me is not a craftsperson, it is the attitude towards the job that makes it art, craft, or assembly line work.

Too many software projects treat programmers as factory workers, where their primary value is measured in the amount of story points or Jira tickets finished. Don't get me wrong, you can be a craftsperson and use an issue tracker of course, but if quantity is the only thing management cares about instead of quality, the craft gets lost in the process. Quantity is easy to measure, quality is not.

At the same time treating software like an art is probably not very useful. That code is (typically) not written to be looked at, but to make the computer do something useful.

It's a shame "artisanal software" sounds so weird, because that precisely describes the level of caring I'd like to see applied to the software I use.


> At the same time treating software like an art is probably not very useful. That code is (typically) not written to be looked at, but to make the computer do something useful.

FWIW you can argue the same for woodworking: a chair is typically not made to be looked at but for people to sit on. I tried to think of what inherently makes writing software less of a craft than woodworking, but couldn't think of anything.


depends on the kind of software, the visuals are part of the ux and ergonomics. ratio, spacing, how to convey state with just the right amount of visual cues. a chair is not a tool you interact with to create. now there is some software that is like this, you want to start it and forget about it as long as it does its thing while you think about your goal.


Sometimes with software the real art has nothing to do with anything the user sees.


that is true, a lot of gems are forever hidden under the hood


I suppose that's true, in much the same way that chemotherapy treats the symptom of cells spontaneously deciding to replicate in your body. That does not mean we judge people battling a cancer diagnosis and tell them to pursue non-medicated approaches because it's "just treating symptoms".

If you encounter a bit of bitterness from the ADHD community online, let me provide some perspective: I have been called lazy my entire life, I have wondered why everybody could just do stuff and apply themselves. Why couldn't I just clean my house, do my homework, keep on top of chores, or even find the energy to play games after a day of work? I only found out as an adult I have a disability which makes all of that an uphill battle for me, INCLUDING finding the motivation for the fun stuff. There is an easy fix for this, some meds that take care of SOME of the problem. They don't fix it in much the same way that a wheelchair doesn't fix the legs of a crippled person, but it sure is like playing life on easy mode if you're used to dragging yourself around by your arms. And now I'm stuck explaining this to people who have done the barest minimum of research and who say 'oh it only is treating symptoms'. They have the audacity of calling me lazy (again!) for not training my arms more to overcome my disability that way. And my response is simple: You can take my metaphorical wheelchair over my dead body, and if you were in my position you would feel exactly the same way.


i understand your perspective viscerally and as such i understand the push back ... the argument is that there is nothing wrong with someone labelled 'adhd' , rather that the modern western system both a) does not handle adhd behaviour properly and b) exhibits conditions where non-adhd individuals exhibit adhd behaviour ... when taking into account that speed will motivate anybody (both adhd and non adhd) , and that demotivation is a natural response to a hopeless scenario , i do not see adhd as a disability in and of itself ... recommend to look up the effect of hope on drowning rats ...


Even if you were 100% correct and the world is broken, fully causing ADHD as a disorder: Please fix the world FIRST, and only once it's proven that ADHD was caused by what was once the shape of Western society and no longer applies, THEN you get to take the metaphorical wheelchair away.

The alternative is that you prevent millions of people from managing their disability while asking them to bet on your view of the world AND on our collective ability to change it. In the best case scenario where we manage that shift, that's what, 10 years of my life gone while society adjusts? Will you write my kids a nice letter explaining to them that their dad is going to be a deadbeat for the next 10 years while we fix society, because somebody on the internet thinks daddy shouldn't be on stimulant medication?

You're just not presenting an attractive deal to anyone, whilst very politely telling disabled people making the best of their shit situation that their crutches should not exist. Hell, maybe they shouldn't need to exist, but how is that my fault? I can't tell if you stand on the side of 'using meds to manage ADHD is a failure of self-discipline and morality', but if you do: I promise you most people with ADHD have more self-discipline in their little toe than others do in their entire body. But self-discipline doesn't make a cripple walk, just as it doesn't make my brain produce the chemicals I need to put my body into action. I've spent enough of my life flogging myself into action, believing I was a fundamentally lazy human. I'll take the meds.


> Even if you were 100% correct and the world is broken, fully causing ADHD as a disorder: Please fix the world FIRST and only once proven ADHD is caused by what once was the shape of western society and no longer applies, THEN you get to take the metaphorical wheelchair away. [...] You're just not presenting an attractive deal to anyone.

Thanks for writing this succinct counterpoint!

The argument of the commenter you're responding to reminds me of the "ADHD is a superpower!" vibe, which I perceive as toxic positivity, but couldn't rebut quite as clearly [1].

"There's nothing wrong with you, it's literally the entire human society that's broken" has the same implication ("don't take meds, you're nOt bRoKeN”).

Of course it's the environment that causes our symptoms. Just like cold weather makes one feel cold.

It'd be rather silly to argue that winter clothes should be abandoned, that they exist only to make money for clothing manufacturers.

Some of us live in cold places. We need winter clothes. And we don't make the weather.

[1] https://romankogan.net/adhd/#Superpower


I don't like the superpower vibe either. I do not view my ADHD entirely negatively, but then again, to what degree it is my personality and to what degree it is the disorder, I don't know. If others want to view their ADHD as a superpower, I'm all for it, but pushing this narrative too hard doesn't feel constructive.

For me personally, I tend to view myself as a human with some characteristics which in most contexts are not helpful, but make me uniquely suited for other situations. I try to avoid the word 'broken', but for a long time I thought of myself as 'broken in a sometimes useful way'. My burden in life is to manage and control the characteristics where they disadvantage me, and position myself as much as I can in situations where these characteristics are useful assets. To hook into your winter metaphor: if you always feel cold, working at a tropical resort where others would feel continually hot, sweaty and uncomfortable might just be perfect for you.

Where it crosses the line for me is neurotypical people saying 'oh I don't need winter clothes, and anyone who does must be taking a shortcut and should just jog themselves warm instead'. Or even ADHD people saying: 'My feeling cold all the time is a superpower, you should just go work in your local sauna and you'll have no problem at all!'. Great, happy that you guys don't have this issue or figured out a workaround, but I do, and I need my damn winter clothes. I wish I didn't, and I'll look for warmer situations, but I need them right now.


yeah my point is that the western society should be changed and i dont know why you think that i think that speed should be taken away entirely but i dont think that


Apologies for drawing that conclusion, but usually the argument presented is something like "amphetamines / stimulants bad, especially in healthy adults -> usage on the rise due to increased ADHD diagnoses -> diagnoses potentially fraudulent? -> pharmaceutical companies are incentivized to sell stimulants -> ban stimulants entirely before it gets out of hand and celebrate our win over late-stage capitalism and druggies".

I might've misunderstood the point you were trying to make, but saying 'society causes the issue' usually is followed by 'therefore treating with meds is silly and we shouldn't do that'. The latter part is what I take issue with, not the former, and if you weren't advocating for that then we have no issue and are in agreement.

I'd love to find a constructive way to change society for the better so ADHD is not as much of an issue, but personally, I don't see that happening. And I do see anti-intellectualism and puritanism on the rise, and with it, a movement that wants to take my crutches away to deal with society as it is. Maybe you weren't one of them, but I saw you making similar arguments, which is what prompted me to respond.

Calling the group of all stimulant medications "speed" is not a great sign either if I have to judge where somebody stands in this debate, btw. Dextroamphetamine, one of the ingredients in Adderall, is similar to other amphetamines as found in speed, but methylphenidate aka Ritalin, for instance, only partially shares the same working mechanisms: it blocks the reuptake of dopamine and norepinephrine but does not trigger extra release of those neurotransmitters. And that comes with a different profile in terms of addiction risks and whatnot. It is not helpful to call all of these medications 'speed', as if all ADHD people are buying potentially contaminated stuff on street corners from shady dealers who manufacture it Breaking Bad-style in their shed. Managing ADHD is done with clean, medical-grade, typically less potent versions, at tightly controlled doses, closely managed by a licensed medical professional. Framing ADHD patients taking their prescription medication as speed users is not helpful. Even if you don't have any ill intent, reinforcing the belief that we are all essentially the same as junkies just provides ammo for people who do want to take this away.


would love to continue this discussion but i feel like it will involve lots of this error correction so i shall leave it here , all the best


To provide a European/Dutch perspective: I’m pretty sure that as a small employer myself, I am very much disallowed from using those mechanisms to actually inspect what employees are doing. Automated threat/virus scanning may be a legal gray zone, but monitoring-by-default is very much illegal, and there have been plenty of court cases about this. It is treated similarly to logging and reading all email, Slack messages, constantly screenrecording, or putting security cameras aimed at employees all day long. There may be exceptions for if specific fraud or abuse is suspected, but burden of proof is on the employer and just monitoring everyone is not justifiable even when working with sensitive data or goods.

So to echo a sister comment: while sadly it is common in some jurisdictions, it is definitely not normal.


I'm literally working for a local govt agency (via a contracting company). I'm not sure that anything is being actively monitored so much as it's blocking a number of sites (anything AI), upload sites, etc., as well as blocking POST actions to non-whitelisted sites/apps.

I've also seen similar configurations in Banking environments having done work for three major banking establishments over the years. The exception was when I was on a platform security team that managed access controls. Similarly at a couple of large airlines.


Minor pedantic correction: 2.5gbit, 5gbit and 10gbit RJ45 is getting more affordable and more common, and for short runs should work fine over CAT 6 and CAT 6a, with plenty of reports that it does OK on short runs even over CAT 5e. With devices like the USW Flex Mini 2.5 at ~50-60 EUR / USD, you can affordably outfit your home for higher-than-gigabit speeds without rewiring everything with new CAT cable or fiber.

Over here in NL we now get more and more access to >1gbps speeds, the office of my small business for instance has a 4gbps connection, and the ISP offers up to 8gbps on a standard consumer / small business package. We're in the process of upgrading our gear to take advantage of that. With WiFi 7 we've seen some real world throughput speeds of 1800-2000mbps going through a Ubiquiti U7 Pro straight to the ISP supplied router.

I wasn't really keeping up with networking gear, so I was pleasantly surprised when I looked into this stuff recently and figured out the gear has just magically gotten better and running 2.5gbit everywhere is surprisingly easy.


Something nonobvious to consider, 10G copper/RJ45 SFP modules run hot, to the point where our Mikrotik switch's manual mentioned that we could use them, but they strongly recommended only populating every other port, if we did. Heat wasn't a problem at all with the fiber ones.


> 2.5gbit, 5gbit and 10gbit RJ45 is getting more affordable and more common

Still, compared to the SFP+ gear it's ridiculously overpriced. NICs are <$20 on ebay and an 8x10G port managed switch is $120 on aliexpress.

> Over here in NL we now get more and more access to >1gbps speeds

Same in France, yet the main "geek" ISP (free) has an 8Gbps symmetric ISP router with a 10G SFP+ cage for full bandwidth to the LAN. RJ45 ports are 2.5G.

And it's hard to fault them, as customers that are likely to even hardwire stuff to the router and moreso at 10Gbps are usually enthusiasts that do prefer SFP+ due to the abundance of hardware on the used market. Oh, and their team designing the router are a bunch of nerds that most likely all have a 10Gbps network.


There’s an ISP in Switzerland offering 25Gbps, they provide a Mikrotik. They’re called init7.


Yup, that's pretty nice. I sold a couple of XXV710s to a friend that moved over there.


I've wondered this as well, I'd love to hear from the mods as to how many false positives vs true positives this generates. Us, the lowly users, only spot it when it mangles a title, but does it actually provide some tangible benefit?

I don't want to judge this 'feature' too harshly without that data, but couldn't 80% of the value of this be achieved by putting the text 'please don't editorialize titles of submissions except to de-clickbaitify them' in the submission form?


Or just use the <title> of the page verbatim? If it's clickbaity, it'll get downvoted, as it should.


Shipping has always been a strong point for Ikea. Just look at how efficiently most of their stuff is packaged, usually into very neat rectangular packages. They can fit a ton of stuff into a small volume, which improves shipping cost a lot. The last mile will always be relatively expensive, but shipping a truckload or a 40ft shipping container is relatively cheap. In the end it all comes down to cost per unit of volume, compared to the raw product cost per volume. A dense product is affected far less by high shipping prices, and because Ikea is insanely good at packaging, I expect they are outcompeting the rest of the market at this game by a decent margin.


Starting from 'what looks good' is putting the cart before the horse. Making a UI usable and well laid out first is key. Practical UI and Refactoring UI are great resources, as long as you read them through a lens of 'what works well?' instead of 'what looks pretty?'. The author is absolutely right in that alignment and consistency are important, but that should really be your starting point.

Building a good user interface is fundamentally an engineering challenge. I see roughly two camps in building UIs, one designing a pretty picture and then tweaking the CSS until it looks like the picture, the other treating the CSS as rules of how the UI should behave. A simple example would be using display: flex; gap: 32px; on a parent of two elements instead of margin-right: 32px; on the left-most element. While the end result is identical, specifying the gap on the parent is better, because it puts the responsibility for spacing in the correct place. This also goes for the way you define CSS classes and rules, if two values are linked, like the height of your buttons and the height of your input fields, then try and capture that in a single rule, or extract it out to a variable.
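To make the gap-versus-margin point concrete, here is a minimal sketch of the two approaches (the class names are illustrative, not from any particular codebase):

```css
/* Child-owned spacing: the left element has to know about its sibling */
.left-item {
  margin-right: 32px;
}

/* Parent-owned spacing: the container states the rule
   "my children are 32px apart" */
.toolbar {
  display: flex;
  gap: 32px;
}
```

With `gap`, adding or removing a child keeps the spacing consistent automatically; the margin version needs every child except the last to carry the rule.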

A lot of building good UIs becomes much easier once you adopt the engineering approach. Consistency is almost built-in at that point, and that automatically makes your UIs better and easier to understand. It keeps your CSS more maintainable too.

While I'm sure there are ways to achieve this with Tailwind, generally I tend to see developers do the exact opposite when they use tools like that: just define everything with atomic classes inline, and forget all about the relations of styling rules to each other. Tailwind has some great concepts, like defining a limited set of values to choose from, but be careful to keep the engineering, rules-based way of building UIs.

There are so many times we've gone in a direction in our products only to figure out that while we could make the page look pretty, it would never work well. It always ends up being some version of 'if we go direction X, then features Y and Z will have to be shoe-horned in and it'll look ugly'. When you get that feeling, take a step back, come up with some different approaches, and go with a better one.

The "make it pretty"-step should really be the last thing you do. If you design your UI with heavily visually simplified components and in black and white, it should still work and feel right. Make it work right, and the pretty will come.


> The "make it pretty"-step should really be the last thing you do. If you design your UI with heavily visually simplified components and in black and white, it should still work and feel right. Make it work right, and the pretty will come.

Totally agree. I’ve fallen into the trap of chasing “cool” or fancy visuals first, but the most beautiful UI is the one that actually solves the user’s problem. A slick-looking interface that doesn’t work well isn’t really beautiful at all.


I don't understand when people see Tailwind as anything other than a syntax for CSS.

> forget all about the relations of styling rules to eachother

This is a hot take, but the more cascading your styles are, the harder they are to read and debug. I've never been upset to find classes that just correspond 1:1 with DOM elements. And that's what Tailwind is doing.


I'm not necessarily arguing in favor of deep cascading, but I don't see how something like this:

.form-element, .button { height: 32px; }

is something that can be easily achieved with Tailwind, without either using @apply all over the place (effectively doing regular old CSS but with Tailwind syntax) or using JS/TS variables extensively, which makes the styling pretty hard to read. Either way, I'm not saying Tailwind makes it impossible, but it sure doesn't make it easier. And while Tailwind has a bunch of benefits, especially where it becomes a design system rather than a syntax for CSS, you can achieve ~90% of that by just defining a bunch of color and size variables at the root of your CSS and using those.

I don't mind Tailwind too much for 1:1 mapping with DOM elements, but I also don't really see why inline styles for that case would be bad.


> Is something that can be easily achieved with Tailwind without either using @apply all over the place, effectively now doing regular old CSS but with Tailwind syntax, or by using JS/TS variables extensively making the styling pretty hard to read.

Define an html component, say MyButton:

    <button class="h-8 other-inline-styles-here">{MyButtonText}</button>
And use that component everywhere and the styles carry through:

   <MyButton MyButtonText="Click here!" />
People worry about regurgitating the same CSS with Tailwind while continuing to regurgitate the same HTML structures all over the place. Inline styling nudges you to stop repeating your HTML instead, which is a better approach, even for a design system.


The equivalent tailwind is to just put h-8 (or whatever it is) on whichever form elements and buttons you want to be that height. Tailwind recommends you never use `@apply` - it’s basically an escape hatch for weird, niche interop requirements if you’re not all-in on Tailwind.

What is it about your example that the tailwind approach (h-8) doesn’t achieve?


>And that's what Tailwind is doing.

What DOM element does "shrink-0 w-6 h-6 mr-2 -ml-1" correspond to?


the one that you put that class name on


Tailwind's default sizes add constraints, which makes Tailwind a bit of a design system. Also, Tailwind's default colors are just very pleasant to work with. Even on projects where I don't use Tailwind, I look up the color table in the docs.


Same, I really have nothing bad to say about the Tailwind design system.

I suspect there is a lot of halo effect going on: if people were asked to judge the pure code aspect of Tailwind and the design system separately, they'd be much more negative about the code aspect, but because the two are conflated and one is very good, Tailwind as a whole is judged to be much better than I personally think it deserves to be. It doesn't help either that forming an opinion on a code architecture is difficult, while forming an opinion on a designed web page in front of you is easy.

My personal poison is just defining the Tailwind color system and a very basic system of sizes in CSS variables and then using that all over the place. It leaves an escape hatch in place of just using 5px when you really need to, but most of the time we use var(--col-grey-200) and var(--u-4) or var(--u-8)
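As a hedged sketch of that setup (the variable names mirror the ones in the comment; the color value is borrowed from Tailwind's palette, the sizes are assumptions):

```css
:root {
  /* a color lifted from the Tailwind palette */
  --col-grey-200: #e5e7eb;
  /* a small spacing scale, Tailwind-style */
  --u-4: 1rem;
  --u-8: 2rem;
}

.card {
  background: var(--col-grey-200);
  padding: var(--u-4) var(--u-8);
  /* the escape hatch: a plain value when you really need one */
  border-radius: 5px;
}
```

Because everything references the same handful of variables, changing the palette or the spacing rhythm is a one-line edit at the root, which is most of what a design system buys you.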


"Starting from 'what looks good' is putting the cart before the horse. Making a UI usable and well laid out first is key." This is first principles yet is so often ignored.


Yes, and I want FireWire. Oh, and I'd really prefer 16 bit real mode CPUs. While we're at it, why not go for support for serial connection mice?

This reads like such an arbitrary wish without a reasoning WHY you would want this. I'm sure OP has a reason for preferring it, but what makes the 80x25 superior in their opinion?


I think the author is making the argument for consistency.

I actually always disliked the modeset that the author remembers fondly, but it is always sad to lose part of our history for arbitrary reasons, and especially so if it breaks an ungoverned consistency.

To use your example: real mode still exists and you can use it, and FireWire is effectively the father of Thunderbolt (and granddaddy to Thunderbolt 3-4); so its removal really does feel unnecessary without additional context.

Serial mice are masochism, but people do dislike that PS/2 is gone, for good reasons.


One reason it could be nice is what I experienced a decade or so ago, the damn machine kept changing video modes during boot and the LCD couldn't keep up so an important screen was missed when diagnosing a boot issue.

Had to get a CRT to see what the hell was going on.


This even applies to remote-viewing software that wants to "follow" the resolution changes, flipping your remote window size around a lot. Super annoying.


I don't really get the 80x25 thing, but using dumb terminals to write code is great. Zero distractions.

More than half the code I've been paid to write in the past 2-3 years has been written in vim running on a vtty with no X and no mouse. It's my favorite way to work, although occasionally it's impractical.


You can still plug in a FireWire PCIe card and have it work - I still use one for an old 35mm film scanner.

I think serial mice should still work as well - https://wiki.alpinelinux.org/wiki/Serial_mouse


Apple has finally deprecated FireWire with macOS 26, sadly.

Serial mice still also just work in Windows, too! If you attach a serial mouse to a USB-serial adapter, then attach the adapter, Windows should pick up and load sermouse.sys. On the flipside, if you’ve got a weird serial device attached screaming garbage out the wire, Windows might pick THAT up and load the mouse driver, too… “hey, why is my cursor freaking out?”


The author listed several reasons why they want it.

Also, it should still be possible to connect a serial mouse to a modern system thanks to adapters. I still have serial to PS/2 and PS/2 to USB adapters floating around in a tackle box.


> PS/2 to USB adapters floating around in a tackle box.

Heh. [Most] PS/2 to USB adapters aren't.

They don't actually adapt the PS/2 protocol to USB, they just adapt the pins. The USB _hardware_ on the host does the emulation. However, the new generations of USB chips stopped bothering with the PS/2 emulation so these adapters are now useless.


Actually it's the device that supports both PS/2 and USB mode. The host doesn't.


Sure. It is also a possibility, but a lot of PS/2 devices predate the USB.


This is my understanding too


Wonder if this is part of why my Unicomp keyboards don't work for getting into the BIOS on my current desktop; they don't work until inside the OS.


Damn, in that case I'm certain that the ones I have gathering dust, all of which came from various packaged mice/keyboards, won't be up to the task. I've got enough old hardware they might still come in handy one day, but I'm pretty sure they're now just relics.


You can buy real PS/2 to USB adapters. E.g. this one from StarTech: https://www.amazon.com/dp/B00028OP2Y

I bought a used slideable rack-mounted LCD last year for my home server rack, and its keyboard with touchpad use PS/2. That's how I found that out.


> The author listed several reasons why they want it.

To be fair, they listed reasons to need a 80x25 terminal, but not reasons to need a 80x25 console. I'm a bit unclear as to why they could not use a regular 80x25 term in their graphical session.


They specifically want 8×16 characters in 9×16 character cells on a 720×400 display that has an overall 4:3 aspect ratio. There’s no way to achieve that on anything other than a real hardware CRT. No amount of fiddling with fonts in either X11 or Wayland or any other display manager will change the size of the pixels on your LCD.


There may be something I don't get then. What can you _not_ do in a graphical terminal that you can in a pure HW console?

Why would you not setup your graphical terminal to be full-screen on whatever column/row count, what's the difference ? Surely the rasterizable screen size is the same whatever mode your screen is in?


See my longer reply <https://news.ycombinator.com/item?id=45275565>, but the short version is that 720×400 is a 9:5 aspect ratio, while the CRT display was 4:3. The CRT compensated by scaling the vertical height of each line by 135%. An LCD simply cannot do that. The best anyone can do is to scale the 400 lines up by the same non-integer scaling factor to 540 lines. This gives you an ugly image where some lines are 1px tall but others are 2px. It does get less ugly if you scale to 1440×1080 (on an HD display), or to 2880×2160 (on a 4K display), but the artifacts are still obvious and undesirable.


How would a CRT scale the vertical height? Maybe you get a bit more space between lines, but the beam is only as tall as it is, right?

I would think the scaling would come horizontally ... you can put as many pixels horizontally as your RAMDAC can manage, but screen width per pixel depends on the pixel time vs the line time.


Yes, you’re correct. I left out all of those details to try to simplify the explanation as much as possible. The reality is that the pixels we were drawing got narrower as our display hardware got better. In this VGA text mode the width of the pixels is 20⁄27ths of the line height. But you obviously don’t want to scale the 720 pixels of width down to display in just 533⅓ LCD pixels, so instead you have to scale the height up. It was just easier to start by saying that the height of the CRT’s pixels was 27⁄20ths of their width instead. :)


Yes, you do get more space between the lines, an effect known as "scan lines". It is especially noticeable when forcing a 240p mode like 8-bit and 16-bit systems used to do, which gives old games a distinctive look that some retrogamers crave.

https://en.wikipedia.org/wiki/Scan_line


That still really isn't a reason. I think I can extrapolate where they are going with this, which will be something like 'I have a ton of 720x400 hardware CRTs sitting around that I need to use/support/deal with'. But that is never explicitly stated, you can completely read it as 'oh it's neat that 80x25 matches up with the number of image lines on old CRT displays and here is the math to show it'.


Perhaps you don’t know that on a CRT the number of lines that can be displayed is variable. Any CRT can display 400 lines or 399 lines or 1000 lines or however many lines you need or want. On an LCD there is always a fixed number of pixels, no more and no less. You can leave some of those pixels blank if you don’t need them but that’s about it.

720 pixels by 400 pixels is a 9:5 aspect ratio, but the monitor is a 4:3 aspect ratio display. On a CRT the result was an image made up of pixels that were taller than they were wide. 35% taller, to be specific.

To reproduce this on an LCD you need to scale the image up to 720×540 pixels which results in every line being drawn as either one or two lines of LCD pixels. Some lines are literally double the height of others. This is super ugly! Of course you could scale it up to 1440×1080, but now you’re just scaling the lines up by a factor of 2.7 instead of 1.35. Some lines are 2 pixels tall and others are 3, which still makes some lines 50% taller than the rest. On a 4K monitor you could scale it up by a factor of 5.4 to 2880×2160 making some lines 5 pixels tall and others 6. This is certainly better but you’ll still be able to tell the difference and it’s still ugly.
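To put numbers on how uneven the scaled lines get, here's a small sketch (plain nearest-boundary rounding; real scalers may filter instead, but the row counts work out the same):

```python
from collections import Counter

def line_heights(src_lines, dst_lines):
    """Count how many destination rows each source scanline covers."""
    heights = Counter()
    for y in range(src_lines):
        top = y * dst_lines // src_lines
        bottom = (y + 1) * dst_lines // src_lines
        heights[bottom - top] += 1
    return dict(heights)

print(line_heights(400, 540))    # {1: 260, 2: 140} -- 1.35x scale
print(line_heights(400, 1080))   # {2: 120, 3: 280} -- 2.7x scale
print(line_heights(400, 2160))   # {5: 240, 6: 160} -- 5.4x scale
```

So even at 4K, 240 of the 400 lines end up 5 pixels tall while the other 160 are 6 pixels tall.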

When you scale an image taken from the real world, such as one from a television program or a movie, then nobody will notice the artifacts. But when you scale pixel graphics, and especially text, then the artifacts spoil the whole thing.

There are two other routes you could take.

You could scale the text display instead. You could have an 80×33 text display using the 9×16 character cell. This gives you 720×528 pixels, which is close enough to the right ratio that you can just scale it up by a nice integer ratio and just ignore the few wasted pixels at the top and bottom of the screen. But now you’ve squashed the aspect ratio of the characters!

Ok, so you could stretch the character cell to 9×22 pixels, redrawing all of the characters by hand to approximate the original shapes. You’ll only have room for 80×24 characters in 720×528 pixels, but that’s much less disappointing than mucking about with the original font. People _grew up_ with that font. They _like_ it.
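The cell-size arithmetic for these options is just multiplication; as a sketch (the layout names are mine, the numbers are the ones discussed in this comment):

```python
# (columns, rows, cell_width, cell_height) -> resulting pixel resolution
layouts = {
    "classic VGA text":  (80, 25, 9, 16),    # 720x400
    "extra rows":        (80, 33, 9, 16),    # 720x528, squashed glyphs
    "stretched glyphs":  (80, 24, 9, 22),    # 720x528, redrawn 9x22 font
    "modern 1080p take": (120, 30, 16, 36),  # 1920x1080, no scaling
}
for name, (cols, rows, cw, ch) in layouts.items():
    print(f"{name}: {cols * cw}x{rows * ch}")
```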

Of course neither of these options can take advantage of real VGA hardware. One of the advantages of VGA was that the CPU only had to manage a 4-kilobyte text buffer (2,000 character/attribute pairs) while the VGA hardware composited the character data from the font and created the video signal that went to the display. It could do this in real time, meaning latency was actually lower than a whole video frame. If you emulate this on a CPU it’ll be much, much slower than that. If you farm it out to a GPU instead then it’ll be far more expensive. A modern GPU needs tens of billions of transistors to run the shader that emulates what probably took a few thousand transistors on original VGA hardware.
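For a sense of how little work that compositing step is, here's a rough software sketch of it (not register-accurate VGA; this assumes the common one-byte-per-scanline glyph format and omits attributes, colours, the 9th column, the cursor, and blinking):

```python
def render_text_mode(text_buf, font, cols=80, rows=25, cell_w=8, cell_h=16):
    """Expand a text buffer (one char code per cell) into a 1-bit framebuffer.

    font[c] is a list of cell_h bytes; bit 7 is the leftmost pixel column.
    """
    fb = [[0] * (cols * cell_w) for _ in range(rows * cell_h)]
    for row in range(rows):
        for col in range(cols):
            glyph = font[text_buf[row * cols + col]]
            for gy in range(cell_h):
                bits = glyph[gy]
                for gx in range(cell_w):
                    fb[row * cell_h + gy][col * cell_w + gx] = (bits >> (7 - gx)) & 1
    return fb

# Tiny demo: a 1x1 "screen" with a 2x2 cell and one made-up glyph.
font = {0: [0b11000000, 0b01000000]}  # two scanlines, top bits used
fb = render_text_mode([0], font, cols=1, rows=1, cell_w=2, cell_h=2)
print(fb)  # [[1, 1], [0, 1]]
```

The hardware version of this is a handful of counters, a shift register, and a font ROM lookup per character cell, which is why it fit in so few transistors.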

A completely modern take on a console would lean into the new ratios and might have a 120×30 text display and a 16×36 character cell, creating a 1920×1080 pixel display that doesn’t need any scaling on most LCD panels. Instead of trying to support the original VGA character set (CP437 as it is sometimes called) and disappointing its fans, it would support Unicode, text shaping, combining characters, BiDi text, emoji, etc, etc. And the compositing would be done in hardware, but not in a shader on a $500 GPU. Or even a $100 GPU.


Except that still locks you to 1920×1080, which a LOT of displays, laptops, etc. already break: 16:10 aspect ratios, 3440×1440 ultrawides, and so on are already outside the ratio you outline. The thing with the 80×25 VGA text mode was that it was both fast and consistent in a lot of ways. You just cannot have a predictable full-screen text interface today in the same way you could back then.

That said, there were differences back then too... you had the original CGA and EGA as precursors to VGA. I used an EGA 386 system for several years and have a similar fondness for that, especially since it was the native mode for RIPterm/RIPscrip in the early-to-mid 90s. Which was still different from the much more common VGA modes.

Hell, my biggest niggle to this day is that so many terminal apps don't match the default DOS colors you got with CGA/EGA/VGA. I often configure that myself, so I can at least output classic ANSi art to my terminal and have it look closer to correct. I keep working through ideas for a "modern" text-mode BBS that works well in modern terminals. On the flip side, I'm thinking of adapting a door server model that maps CP437 output to UTF-8 and maps the colors to RGB color codes so modern terminals show the right colors.
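That last mapping is pretty tractable, for what it's worth: Python even ships a cp437 codec, and the 16-colour text-mode palette has well-known RGB values. A sketch (the palette constants are the canonical CGA values; terminal support for 24-bit SGR escapes is assumed):

```python
# The canonical 16-colour CGA/EGA/VGA text palette as RGB triples.
DOS_PALETTE = [
    (0x00, 0x00, 0x00), (0x00, 0x00, 0xAA), (0x00, 0xAA, 0x00), (0x00, 0xAA, 0xAA),
    (0xAA, 0x00, 0x00), (0xAA, 0x00, 0xAA), (0xAA, 0x55, 0x00), (0xAA, 0xAA, 0xAA),
    (0x55, 0x55, 0x55), (0x55, 0x55, 0xFF), (0x55, 0xFF, 0x55), (0x55, 0xFF, 0xFF),
    (0xFF, 0x55, 0x55), (0xFF, 0x55, 0xFF), (0xFF, 0xFF, 0x55), (0xFF, 0xFF, 0xFF),
]

def dos_to_terminal(raw, color_index):
    """Translate CP437 bytes + a DOS colour index to a 24-bit terminal string."""
    r, g, b = DOS_PALETTE[color_index]
    return f"\x1b[38;2;{r};{g};{b}m" + raw.decode("cp437") + "\x1b[0m"

# A shaded bar (CP437 0xB1) in DOS bright yellow (colour 14):
print(dos_to_terminal(b"\xb1" * 10, 14))
```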


Sorry, I thought that part went without saying. A modern console leans into the actual resolution of your _current display_ and gives you nice integer ratios by default while letting you configure whatever character cell size you prefer if you want something different.


> Perhaps you don’t know that on a CRT the number of lines that can be displayed is variable. Any CRT can display 400 lines or 399 lines or 1000 lines or however many lines you need or want.

This isn't true. Besides there being only a finite number of distinct phosphors / grid holes, the electronics and logic driving the display can also only handle a limited range of frequencies.

As LCD pixel densities increase the situation becomes effectively the same.


Yea, I was deliberately simplifying so that I didn’t have to spend pages and pages on the subject. Suffice it to say that the display on your desk could handle quite a large range of display resolutions with equal ease. The portable TV you took to the beach or whatever might not be so flexible.


Chill out and stop judging people with such unnecessary histrionics.


The number of kilometers driven is also up massively compared to 1985, meaning that per kilometer the safety record has actually held up fine.

Stop trying to make cars safer, and instead reduce the amount of driving you need to do. There is a way to have more liberties and improve safety, rather than fewer. The liberty to commute, do groceries, and go to the gym by bike is a huge life improvement, while taking nothing away in terms of car liberties.

