I've never had rice with burgers nor do I have an "Asian eating expectation/culture", but I absolutely do avoid McDonald's and the like because I feel hungry and lethargic shortly after eating there.
However, after a nice home-made burger I won't feel hungry again until the next meal, and I'm full of energy. This isn't a tiny burger, either; I'll usually slap an egg on a 150g patty with some cheese for good measure. Since this is an "I'm too lazy to actually cook" meal, it tends to go with some kind of potatoes. I think the only difference between the two is the quality of the ingredients (added sugar in ketchup = bad, tomatoes are plenty sweet).
I think the difference absolutely comes down to what I eat. I don't put sugar syrup or whatever makes the McDonald's sauces so sweet in my burger, just a basic boiled tomato sauce (so that it's thicker and doesn't make a mess). And it's not only the typical fast-food places that are guilty of this: I've had similar outcomes after eating in "regular" brasseries around Paris that, on the face of it, wouldn't be considered "fast food".
I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.
On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a PC, whereas if I switch it to "PC" mode the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.
Ironically, when I first turned it on, all the "smart" things were off.
> public would expect two photos taken at same time with same model camera should look identical
But this is wrong. My not-too-exotic 9-year-old camera has a bunch of settings that affect the resulting image quite a bit. Without going into "picture styles", or "recipes", or whatever they're called these days, I can alter saturation, contrast, and white balance (I can even tell it to add a fixed shift to the auto WB and tell it to "keep warm colors"). And all these settings will alter how the JPEG produced in camera looks, no external editing required at all.
So if two people are sitting in the same spot with the same camera, who's to say they both set them up identically? And if they didn't, which produces the "non-processed" one?
I think the point is that the public doesn't really understand how these things work. Even without going to the lengths described by another commenter (local adjustments so that there appears to be a ray of light in a particular spot, removing things, etc.), just playing with the curves will make people think "it's processed". And what I described above is precisely what the camera itself does. So why is there a difference if I do it manually after the fact or if I tell the camera to do it for me?
Well, isn't this compositor-related? I've never had any window placement issues running Sway (i3 for Wayland). I never used night light on that machine, so I can't comment on that particular point, but the thing seems to work just as well as i3.
The only problem I have is with JetBrains IDEs, which seem to have shaky support. They're usable (meaning you can code), but the experience is so wonky that I basically consider them not to support Wayland.
The reason I switched from i3/X11 is that we've got some 27" 5k screens at work that are basically useless at 100% scaling, and Sway handles different scaling settings flawlessly (except for IntelliJ, which seems lost).
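For reference, per-output scaling in Sway is just a couple of lines in the config; something like the sketch below (the output names are only examples, list yours with swaymsg -t get_outputs):

  # ~/.config/sway/config
  # scale each output independently (names vary per machine)
  output DP-1 scale 2     # e.g. the 27" 5k screen
  output eDP-1 scale 1    # e.g. the built-in laptop panel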
IntelliJ kinda supports Wayland, but it gets confused when using scaling, at least when not all screens have the same factor. It's not blurry or anything, but it's slower, and the menus sometimes appear in random places.
When I only use a scaling factor of 1 on all the screens, it's usable enough, although it still feels sluggish.
You are not wrong. But there is a new extension protocol on the way (dunno if it's done; very probably not rolled out if it is) that lets programs solve this problem.
> buying top of the line laptops and trying to use them longer is never going to be cheaper or better than buying medium grade laptops and upgrading more often.
I think this is much less general than you make it out to be and has an extremely strong dependency on how you use the thing and on your preferences. It makes me think of the boot theory.
Personally, for the type of work I do, I rarely need the latest ludicrously fast CPU. But I use it a lot and love to do so comfortably. To me, that means a great screen, a quiet fan, and a nice keyboard and touchpad.
Buying a mediocre computer and changing it more often means you'll always have a mediocre experience. A case in point: at work we have HP Elitebooks. The brand-new 2025 models I see people receive have worse screens and trackpads than my 2013 MBP. Sure, that box was quite a bit pricier even in nominal terms, but it had the same amount of RAM (16 GB) and SSD (512 GB) as these new computers. I'll also grant that the new ones have a faster CPU, but the SSDs are somehow absurdly slow. I haven't seen a single one of these HP machines last more than 10 years fully functional. My mom still uses that MBP.
But the experience is sub-par. Over the 2013-2025 period, we never got to experience a nice laptop. For the office work these people do, that 12-year-old Mac would be an all-around better experience.
The HP screens at the time were truly horrendous. They're leagues better now, but still poor and clearly worse than the 2013 Mac. They are relatively contrasty, but the colors are all weird.
The trackpads have also improved a lot, but there still is some kind of odd lag when you use them [0]. They're horrible enough that many people still prefer carrying a mouse when using them away from their desks, and the mice we're provided aren't some Rolls-Royce ultra-premium affair, just a crappy, laggy Bluetooth Dell.
They also degrade from daily use: the screen hinge loosens so it moves if you look at it wrong, barrel power connectors from older models somehow become unreliable, and USB ports start to get loose (although when new they tend to be extremely tight). USB-C ports tend to become mushy.
Newer models tend to be quieter, but up until a few models ago, the fan would go wild for no reason (I work with many "non-tech" people, so they basically use Outlook and browse a few random websites).
Now, if you only ever use your laptop tethered to a big screen and whatnot, and it's basically a very compact and easy-to-cart-around desktop, then sure, I can understand not caring one bit about all this: you never go out in the rain, so you never get wet feet!
---
[0] This is possibly a Windows driver issue, since on my lower-end Elitebook (840 vs 1040) from 2020 running Linux, this doesn't happen.
> I think this is much less general than you make it out to be and has an extremely strong dependency on how you use the thing and on your preferences. It makes me think of the boot theory.
The boot theory is different: it is about buying something not crafted from proper materials, which will quickly fall apart and cost more in the long run. However, unlike with technology, there is no cowhide 2.0 coming out 6 months later, with all leather made the cowhide 1.0 way instantly dropping in value and fading into irrelevancy.
Low-end laptops tend to be built around older SKUs which are no longer of interest but function no worse than they did when launched, and mid-tier laptops tend to be made with current SKUs in the more reasonable binning categories.
At the same time, the replacement rate also means that any high-end laptop will soon have its rear handed to it by a mid-tier machine from the next chip generation. So it's outdated within a year or two. Of course you won't replace it that soon; rather, to justify the price you might stretch your upgrade schedule by 50% - from, say, 4 years to 6. That's more years spent being generations behind.
Keeping a Framework around for longer also only makes sense when considering upgradability.
> A case in point: at work we have HP Elitebooks. The brand-new 2025 models I see people receive have worse screens and trackpads than my 2013 MBP
HP Elitebooks are expensive, high-end machines. The issues you see are because HP, Dell and Lenovo all cater to IT department buying strategies, and therefore all offer a bargain screen option, as the machines are bought in bulk and mostly used docked. You'll find that they also have a mid-tier screen option (usually a color-accurate 1080p or 1440p panel), as well as a high-end option (say, a 4K OLED touchscreen with a Wacom digitizer).
The trackpad is a different story entirely. If you're used to a Mac trackpad, things are a bit grim on the PC side. There are some nice ones coming out, though.
> They also degrade from daily use: the screen hinge loosens so it moves if you look at it wrong
Everything degrades from daily use. The hinge loosened on my last Mac, the screen damaged itself because the 15" panel did not have the necessary rigidity to maintain the intended 0.5mm gap to the keyboard keys (confirmed, as the second panel did the same), the shitty MagSafe port overheated because using pogo pins for high power transfer is a terrible idea, all I/O on one side died, and the battery had inflated at least once...
My Dell XPS 13, costing a third as much, never had any problems, and when I replaced it, it just felt a bit slow and had a somewhat aged battery.
Price is not an indicator of quality or expected reliability, nor is the brand itself a reliable indicator. Use common sense, take a look at the product and avoid the bottom of the bargain bin.
---
Do get a good mouse and keyboard though; they have a much more direct impact on your user experience, and a mouse that costs twice as much isn't as much of an issue as a laptop costing twice as much. The keyboard and mouse also last longer if kept well.
I'm typing this on my company's Azure AD-integrated Windows 11 machine. The system info says it's Windows 11 Enterprise 25H2.
My Start menu still has multiple pieces of random Xbox crap in there: Game Bar (what even is that?!), "Game Mode", "Solitaire and Casual Games". It shows random ads in the weather app. It invites me to do more with a Microsoft account, even though the computer is fully Azure AD-joined and my Windows session is an Azure AD account with some expensive Office 365 licence attached.
Before reinstalling the other day for unrelated reasons, I had actually tried to add that account. Turns out it doesn't work with a "work or school" account; it requires the personal one, but it doesn't say that clearly, only that "something went wrong".
I honestly don't see any difference when compared to my personal windows install I use for the occasional game and Lightroom / Photoshop.
> Apple was a great option before they moved to ARM. Now they can't run much of anything (and it sucks).
I used to love my 2013 MBP. ARM Macs run pretty much everything I need, and run some things better than Windows does (such as Lightroom and PS, which don't run at all on Linux).
But what kills it for me is the absolutely bonkers window management, and the Fisher-Price interface filling up half the screen with empty space around huge widgets.
I really hope they dial back Liquid Glass after the guy who ruined it all left. I love the glass effect itself, but the rest of the design could use a lot of work.
When you tap one of those fields it bounces you to a contact card. If it is an existing contact (for example, yourself), you just get the full contact card. If that contact card has multiple addresses (my contact card lists ten), you get no indication of which one it was sent to.
At some point in time the actual email address used was flagged with a little "recent" badge - by itself a confusingly worded tag - but even that doesn't show up consistently.
It’s stupid because there’s really no reason to play hide and seek with the email address - that’s an identifier that people should generally be familiar with (since you have to use it reasonably often), and lots of people have multiple addresses that they can receive mail at.