BenjiWiebe's comments

I was born in the 90s, but I'm nostalgic for the 80s/90s, at least w.r.t. computers.

I haven't noticed slowness in the right-click menu on Windows 10 or earlier. On Windows 11 it's slower simply because I have to open a second menu to get to what I want.

You've not had WinRAR, 7-Zip, TortoiseSVN, et al. as COM extensions, I see ;-)

7-Zip yes, the others no.

How many IDEs run everything in one process? As far as I know, they still launch lots of processes.

Visual Studio creates worker threads for compiling C# that are reused across builds. This can be seen in Task Manager. I ran into a bug where the VS compiler would only work properly for one build and then required VS to be restarted.

Visual Studio is like Windows: each version just seems to get worse, with more bugs.


Visual Studio launches just one process (msbuild) for almost all of the compilation, perhaps half a dozen processes at most. Certainly not the hundreds to thousands typically seen under Linux.
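If you want to verify this yourself, here's a rough Python sketch (using the third-party psutil library) that polls a build command and reports the peak number of live child processes. The "make -j8" command is just a placeholder; substitute msbuild or whatever your build actually uses.

    import subprocess
    import time

    import psutil  # third-party: pip install psutil

    # Placeholder build command; substitute msbuild, ninja, etc.
    build = subprocess.Popen(["make", "-j8"])
    parent = psutil.Process(build.pid)

    peak = 0
    while build.poll() is None:
        try:
            # Count all live descendants of the build process.
            peak = max(peak, len(parent.children(recursive=True)))
        except psutil.NoSuchProcess:
            break
        time.sleep(0.1)

    print(f"peak concurrent child processes: {peak}")

Note that polling only sees processes alive at each sample, so very short-lived compiler invocations will be undercounted.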

I hate the time change, but I love daylight saving time.

I'll put up with the switch if that's the only way to get DST.


Verizon even does something like that with physical SIMs. My father got a new phone and we moved the SIM card to it. Some things worked, some didn't. We called customer service, and they said you can't just move the SIM like that!

Ended up switching to AT&T.


Not everyone uses the roads the same amount. It would disincentivize having a job with a 40-mile commute.

Also trains/subways are obviously another non-road transport option.


Re landscape photography: if it actually looked like that in person 1% of the time, I'd argue it's still accurate to reality.

There are a whole lot of landscape photographs out there whose realism I can vouch for, at least for that 1% of the time, because I do a lot of landscape photography myself and tend to get out at dawn and dusk a lot. There are lots of shots I've gotten where the sky looked a certain way for a grand total of 2 minutes before sunrise, and I can recognize similar lighting in other people's shots as real.

A lot of armchair critics on the internet who only go out to their local park at high noon will say they look fake, but they're not.

There are other elements where I can spot realism but the armchair critic will call it a "bad photoshop". For example, a moon close to the horizon usually looks jagged and squashed due to atmospheric effects. That's the sign of a real moon. If it looks perfectly round and white at the horizon, I'd call it fake.
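To put a rough number on that squashing: refraction lifts the lower limb of the moon more than the upper limb. Here's a quick sketch using Bennett's standard approximation for atmospheric refraction (arcminutes as a function of apparent altitude in degrees); the exact flattening varies with temperature and pressure.

    import math

    def refraction_arcmin(h_deg):
        # Bennett's formula: refraction in arcminutes at apparent altitude h (degrees).
        return 1.0 / math.tan(math.radians(h_deg + 7.31 / (h_deg + 4.4)))

    lower = refraction_arcmin(0.0)  # lower limb on the horizon: ~34'
    upper = refraction_arcmin(0.5)  # upper limb ~30' (one moon diameter) higher: ~29'
    squash = lower - upper
    print(f"vertical squash: {squash:.1f}' out of a ~30' disc (~{100 * squash / 30:.0f}%)")

That works out to roughly a fifth of the disc's height, which is why a real horizon moon looks noticeably flattened.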


Wouldn't be a genuine version of what my eyes would've seen, had I been the one looking instead of the camera.

I can't see infrared.


Perhaps interestingly, many/most digital cameras are sensitive to IR and can record, for example, the LEDs of an infrared TV remote.

But they don't see it as IR. Instead, this infrared information just kind of irrevocably leaks into the RGB channels that we do perceive. With the unmodified camera on my Samsung phone, IR shows up kind of purple-ish. Which is... well... it's fake. Making invisible IR into visible purple is an artificially-produced artifact of the process that results in me being able to see things that are normally ~impossible for me to observe with my eyeballs.

When you generate your own "genuine" images using your digital camera(s), do you use an external IR filter? Or are you satisfied with knowing that the results are fake?


Silicon sensors (which is what you'll get in all visible-light cameras, as far as I know) are all very sensitive to near-IR; their peak sensitivity is around 900 nm. The difference between cameras that can and can't see IR is the quality of their IR-cut filter.

The green filter of your Samsung phone's Bayer matrix probably blocks IR better than the blue and red ones do.

Here's a random spectral sensitivity for a silicon sensor:

https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRkffHX...
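As a toy illustration of why that leak reads as purple rather than some other color (the numbers below are made up; they only encode the claim that the green filter blocks IR better than red and blue):

    # Assumed relative responses of the R/G/B channels to an 850 nm IR
    # source after an imperfect IR-cut filter -- illustrative, not measured.
    ir_response = {"R": 0.9, "G": 0.3, "B": 0.7}

    # Naive normalization, standing in for white balance / tone mapping.
    peak = max(ir_response.values())
    rgb = tuple(round(255 * ir_response[c] / peak) for c in "RGB")
    print(rgb)  # (255, 85, 198): strong red and blue, weak green -> purple-ish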


But the camera is trying to emulate how it would look if your eyes were seeing it. In order for it to be 'genuine' you would need not only the camera to be genuine, but also the OS, the video driver, the viewing app, the display, and the image format/compression. They all do things to the image that are not genuine.

I know both C and Ruby, and Ruby is far more than syntactic sugar over C.

Like not even close.


Fixing problems when they appear is ethical.

Refusing to fix a problem that hasn't appeared yet, but has been/can be foreseen - that's different. I personally wouldn't call it unethical, but I'd consider it a negative.


The problem is that popularity is governed by power laws.

Literally anybody could foresee that, _if_ something scales to millions of users, there will be issues. Some of the people who foresee that could even fix it. But they might spend their time optimizing for something that will never hit 1000 users.

Also, the problems discussed here aren't that things don't work; it's that they get slow and consume too many resources.

So there is certainly an optimal time to fix such problems, which is, yes, OK, _before_ things get _too_ slow and consume _too_ many resources, but is most assuredly _after_ you have a couple of thousand users.
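As a toy version of the power-law argument (the Pareto draw and its exponent are arbitrary assumptions, not real data):

    import random

    random.seed(0)

    # Draw "eventual user counts" for 10,000 hypothetical projects from a
    # heavy-tailed Pareto distribution.
    users = [int(random.paretovariate(1.2)) for _ in range(10_000)]

    print(sum(u >= 1_000 for u in users), "of 10,000 projects ever pass 1,000 users")
    print(sum(u >= 1_000_000 for u in users), "pass 1,000,000")

Under numbers like these, almost every project stays tiny, so effort spent making all of them scale to millions of users is mostly wasted.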

