
I think this is a very good example of how Windows is different in its goals and design from Linux. I have a feeling this is the reason Linux has had a hard time catching on on the desktop. It's easy to complain about a slow filesystem, but Microsoft lives in a different world, where other priorities exist. For someone building a server running a database serving loads of people, Linux is a no-brainer: you can pick the parts you want, shrink-wrap it, and fire it up without any fat. On a desktop machine, you want to be able to update drivers in the background without rebooting, you want virus scanners, and you want to have a driver ready the moment the user plugs in a new device. Both Windows and Linux are for the most part very well engineered, but with very different priorities.


I’m very confused by your post. You start off talking about desktop machines, but NT was actually engineered for servers and only later ported to the desktop. You then describe a bunch of features Linux does better than Windows (e.g. updating drivers without a reboot).

I think a more reasonable argument to make is just that Windows is engineered differently from Linux. There are definitely advantages and disadvantages to each approach, but ultimately it’s a question of personal preference.


NT is engineered for a different category of servers, though - it's a workgroup server first (originally its chief competitor was NetWare), and a Web/Internet server second. That drives a different set of priorities.

For example, as someone elsewhere in the comments pointed out, NT does file access in a way that works very well when accessing network shares. That's a pretty core use case for Windows on business workstations, where it's common for people to store all the most important files they work with on a network share, for easier collaboration with other team members.


NT was architected to handle high-end workstations from day one — there’s a reason why running the GUI was mandatory even when the resource costs were fairly substantial.

Check out e.g. https://en.wikipedia.org/wiki/Windows_NT_3.1 for the history of that era. The big selling point was that your business could code against one API everywhere, rather than having DOS PCs and expensive Unix, VAX, etc. hardware which was completely different and only a few people on staff were comfortable with.


OS/2 was a high-end desktop OS, but NT diverged a little and took some heavy design principles from VMS (hence its name, WNT), and was thus pivoted towards back-office systems rather than desktop usage.

At that time Microsoft’s server offering was a UNIX platform, Xenix, but it was becoming clear that there needed to be a platform to serve workstations that wasn’t a full-blown mainframe. So Microsoft handed Xenix over to SCO to focus on their collaboration with IBM; the intent there was always to build something more than just a high-end workstation OS. And given it was intended to be administered by people who were Windows users rather than UNIX greybeards (like myself), it clearly made sense to make the GUI a first-class citizen; but that doesn’t mean it was sold as a desktop OS.


My point was that it is misleading to say it was billed as a server OS when all of their messaging was that it was both — maybe not as far down as low-end desktops but they were very clear that any serious work could be done on NT, going after the higher end PC and lower end workstation business.


In that era workstations weren’t the same thing as desktops. They were an entirely different class of computers, and often workstations just ran server OSs with a nicer UI (NeXT, SGI, etc). So your point about workstations doesn’t invalidate what I was saying about NT not originally targeting desktops.


Drivers in Linux live in the kernel. Whenever the kernel is updated, a reboot is required (in most distros). Hence your assertion that Linux updates drivers without a reboot better than Windows does is questionable.


You only need to restart if there has been a kernel update (on any platform, not just “some distros”). For regular driver updates against the same kernel ABI you can use modprobe to unload and reload the drivers. This works because while drivers share the same kernel memory space (as you alluded to), they aren’t (generally) part of the same kernel binary. They normally get bundled in the same compressed archive but are separate files with a .ko extension.
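As a rough illustration (a sketch, not production code; the module name and .ko path here are made up), this is approximately what an unload-and-reload does under the hood, via the raw Linux module syscalls that modprobe ultimately relies on:

```c
/* Sketch of "modprobe -r example_drv && modprobe example_drv" using the
 * raw syscalls (there are no glibc wrappers). Hypothetical module name
 * and path; needs CAP_SYS_MODULE, i.e. run as root. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/syscall.h>
#include <unistd.h>

int main(void) {
    /* Unload the old driver; fails with EBUSY if the module is still in use. */
    if (syscall(SYS_delete_module, "example_drv", O_NONBLOCK) != 0) {
        perror("delete_module");
        return 1;
    }
    /* Load the updated .ko straight from disk (finit_module, kernel 3.8+). */
    int fd = open("/lib/modules/example/example_drv.ko", O_RDONLY);
    if (fd < 0 || syscall(SYS_finit_module, fd, "", 0) != 0) {
        perror("finit_module");
        return 1;
    }
    close(fd);
    return 0;
}
```

Note that the unload step fails with EBUSY while the module is in use, which is part of why tooling often just tells users to reboot instead.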

This isn’t a system that is unique to Linux either. Many UNIX platforms adopt similar mechanisms and Windows obviously also has its drivers as separate executables too.

It just so happens that rebooting is an easier instruction to give users than “unload and reload the device driver”; which might also potentially be dangerous for some devices to “hot-unload” while in use. So a reboot tends to be common practice on all platforms. But at least on Linux, it’s not mandatory like it is on Windows (for reasons other than the ability to reload drivers on a live system)


It's not mandatory on Windows either. I've updated various drivers for a wide range of devices over the years without needing a reboot. From what you describe, it seems the situation on Windows is similar to that on Linux.


Windows is a little different: to unload a driver in Windows it needs to support an unload method, which not all drivers do. And without that you cannot even write the updates to disk (due to the file locking vs inode differences which have already been discussed in this thread) let alone load them into the kernel.
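For concreteness, here is a minimal sketch of that optional unload method in a bare-bones WDM-style driver (illustrative only; the names are hypothetical). The kernel can only unload a driver that registers a DriverUnload routine in DriverEntry:

```c
#include <ntddk.h>

/* Optional unload routine; the kernel can only unload drivers that set one. */
VOID ExampleUnload(PDRIVER_OBJECT DriverObject)
{
    UNREFERENCED_PARAMETER(DriverObject);
    /* Release resources here: delete device objects, free pool, etc. */
}

NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
{
    UNREFERENCED_PARAMETER(RegistryPath);
    /* Opting in to runtime unload; drivers that skip this line stay
       pinned in memory until reboot. */
    DriverObject->DriverUnload = ExampleUnload;
    return STATUS_SUCCESS;
}
```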

That said, if a kernel module is in use on Linux then it’s sometimes no easy task finding the right procedure to follow to do an unload and reload of it.

Ultimately this is all academic though. These things mattered more on monolithic systems with little redundancy, but these days it’s pretty trivial to spin up new nodes for most types of services, so you wouldn’t get downtime (generally speaking; there are notable exceptions, like DB servers on smaller setups where live replication isn’t cost-beneficial).


> unload method, which not all drivers do

That raises the question of why Microsoft did not mandate a standard convention (not an API) for unloading a driver. They could have said a driver does this and that on unload, and we perform an as-if-shutdown unload if you fail to follow our convention.


> On a desktop machine, you want to be able to update drivers in the background without rebooting, you want virus scanners, and you want to have a driver ready the moment the user plugs in a new device.

With the exception of the virus scanner these actually sound like arguments in favour of Linux, in my experience.

(Although there are also excellent virus scanners available for Linux anyway)


I’m pretty confused by this post. What would you identify their priorities as?

Regardless, as a person who is not a fan of Windows (though not enough to have found a Unix alternative worth learning), I would argue it’s the polish that makes the experience worth it, not some better-engineered experience. For instance: working with variable DPI seems to be trivial on Windows, whereas it still seems years off on Linux. Same with printers and notifications and internet and almost everything. These aren’t feats of engineering per se, but they do indicate forethought I deeply appreciate when I do use Windows.


I would hesitate to ascribe too much purpose to everything you see. Microsoft is a huge company with conflicting political factions and a deep ethos of maintaining backwards compatibility so there are plenty of things which people didn’t anticipate ending up where they are but which are risky to change.

One big factor is the lost era under Ballmer. Stack ranking meant that the top n% of workers got bonuses and the bottom n% were fired, and management reportedly heavily favored new features over maintenance. Since the future was WinFS and touching something core like NTFS would be a compatibility risk, you really wouldn’t have an incentive to make a change without a lot of customer demand.


As a C# dev, I am constantly annoyed that Windows updates and sometimes installs require reboots or stopping all user activity, while I've never had to reboot or block during an upgrade on Ubuntu.


To be fair, a lot of Linux updates require a reboot or at least a logout to properly take effect, too. Windows is just very aggressive about forcing you to upgrade and reboot, which does undeniably have security benefits when you consider that Windows has a lot of non-technical users and a huge attack surface. At least they have relaxed it a bit; the frequent forced reboots caused me some serious problems on a Windows machine I had controlling a CNC machine.


Windows also requires rebooting for the actual upgrading process. A Linux update might need a reboot to take effect, but the reboot is still a normal reboot; it won't take longer because it's trying to install something.

Both Windows and macOS suffer from this. Big updates to both systems can render the computer unusable for 30 minutes while they are installing.


This is true. Fedora now only has a Reboot and Update button in the GNOME Software GUI because some software like Firefox and some GNOME components crash if you update them while they're running (although this seems to happen more often with Wayland than Xorg for some reason). At least Linux and the BSDs give you a choice whether to do offline or online updates.


Interesting... I did an upgrade from Ubuntu 18.04 to 18.10 at the same time that I was playing Stellaris. Zero issues.


I can report I've had Firefox go weird on Linux if something like a font is updated while it's running.


Most of these things are coincidental byproducts of how Windows (NT) is designed, not carefully envisioned trade-offs that are what make Windows Ready for the Desktop (tm).

For some counterexamples of how those designs make things harder and more irritating, look at file locking and how essentially every Windows update forces a reboot; that is pretty damn user-unfriendly.


Even without file locking, how would live updates work when processes communicate with each other and potentially share files/libraries? I feel like file locking isn't really the core problem here.


Everything that is running keeps using the old libraries. The directory entries for the shared libraries or executables are removed, but as long as a task holds a live file descriptor the actual shared library or executable is not deleted from the disk. New processes will have the dynamic linker read the new binaries for the updated libraries. Unless the ABI or API somehow changes during the update (and they don't; big updates usually bump the library version), things work pretty much fine.
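A tiny C program demonstrates the mechanism described above (the filename is hypothetical): an unlinked file stays readable through any descriptor that was already open, and the disk space is only reclaimed when the last descriptor closes.

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    int fd = open("libexample.so.1", O_RDONLY);   /* the "old" library */
    if (fd < 0) { perror("open"); return 1; }

    /* A package manager replacing the library removes this directory
       entry, but the inode and its data survive while fd stays open. */
    unlink("libexample.so.1");

    char buf[64];
    ssize_t n = read(fd, buf, sizeof buf);        /* still succeeds */
    printf("read %zd bytes from the deleted file\n", n);

    close(fd);   /* only now can the kernel free the old inode */
    return 0;
}
```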


Do they work fine though?

1. On the one hand I see folks accessing files over and over by paths/names, and on the other hand they demand features that would break unless they switched their fundamental approach to handles/descriptors. Which is it? You can't claim descriptors would fix a problem and simultaneously insist on path-based approaches being perfectly fine. Most programs use paths to access everything (and this goes beyond shared libraries) and assume files won't have changed in between. You can blame it on the program not using fds if that makes you feel better, but the question is how do you magically fix this for the end user?

2. Do you actually see this working smoothly on a Linux desktop environment in practice, or do you just mean this is possible in a theoretical sense? Do you not e.g. get errors/crashes after an apt-get upgrade that presumably upgraded a package your desktop environment depended on (say GTK or whatever)? That happens to me frequently (and I'm practically guaranteed to see a problem if I open a new window in some program in the middle of an update), and it scares me what might be getting corrupted along the way -- it makes me wish the system would just reboot instead of crashing and throwing errors at me.


1. In general, updating the same files at the same time is not that much of a common problem in any practical sense. The user generally won't be editing the document in two different editors at the same time. Programs use flock(2) or something similar if they have to update a shared file, or they have a directory structure that allows different instances of the program to update simultaneously by using lots of small files instead of having mutually exclusive access to a single file.
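For reference, a minimal sketch of that flock(2) pattern (the filename is made up): each process takes an exclusive advisory lock before the read-modify-write.

```c
#include <fcntl.h>
#include <stdio.h>
#include <sys/file.h>
#include <unistd.h>

int main(void) {
    int fd = open("shared.state", O_RDWR | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* Blocks until no other process holds the exclusive lock. */
    if (flock(fd, LOCK_EX) != 0) { perror("flock"); return 1; }

    /* ... read-modify-write the shared file safely here ... */

    /* Advisory only: uncooperative processes can still write, unlike
       the mandatory sharing restrictions on Windows. */
    flock(fd, LOCK_UN);
    close(fd);
    return 0;
}
```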

I think the most common real-life problem is editing a shell script while it's still running: this happens often during development if the shell script takes a bit longer to run. You edit the file and hit save before the previous run has finished. The on-disk data changes, which the still-running shell sees because it reads the script file incrementally as it executes, and eventually the contents that changed or shifted will break the shell's parser.

2. I have 106 days of uptime on my laptop. It has gone through several apt upgrades and I don't think I've shut down my X11 session for 106 days either. Firefox sometimes restarts itself after an update because it apparently knows it needs to do that but other than that I generally never restart X or reboot the machine because of updates. This has basically been the case for years, even decades. The scheme probably has to break eventually but I generally bump into other stuff such as important kernel updates before that. Fair enough for me, never really ran into any issues because of it.


1. User opens a document. User moves a higher-level directory not realizing it was an ancestor of that file. Then user goes back to the program and it can no longer find the file because it was using file paths. What should happen? Should the OS play any role?

2. You manage to keep X11 open, but that's hardly the point I was making. Do you also keep everything open and use your GUI programs as normal when going through an upgrade, or do you change your behavior in some way to avoid it messing up what you're running? And/or are you selective about which updates you apply to minimize their effects on what you're running?

Furthermore, are you familiar at all with the kinds of errors I referred to? Or have you never seen them in your life/don't know what I'm even talking about? If you don't think I'm just making things up when I say updates frequently cause me to get crash and error messages ("Report to Canonical?" etc.), then in your mind, why does this happen right when I update? Is it just some random cosmic bit flip or disk corruption on my new computer that pops up exactly when I update? Is it not possible that the update is changing files when programs didn't expect them to change?


1) No, I think the assumption has to be that the user should know what s/he's doing. However, how would using handles even help there? If you close the file you will need to access it by a path even on Windows, and the very path has changed. Or instead, if you keep the file open and do not try to reopen it, even Linux lets you keep the file descriptor and have the program access the file as before even if it's moved around in the directory tree.

2) Yes, I generally keep stuff running as usual. I don't screen any updates, I just run them whenever I remember to. I think I've seen similar things to what you described. They're a rare exception though.

Obviously, doing something like a major update to a new Ubuntu version would make me close all programs and reboot the machine after the update. But any normal updates I just let through without thinking twice.

There will be problems eventually, but the version mismatches become rather evident at that point. A configuration file format has changed, or some scripts have moved, or a library has moved. I've seen Gnome Panel get messed up a couple of times as Gnome got notified of configuration file changes and the old Panel tried to load stuff meant for a new version of the Panel. I keep Emacs running all the time and I've seen it fail to find its lazy-loading lisp files some time after an update, being unable to enter a major or minor mode. I've seen Nautilus go wonky and unresponsive some hours/days after an update, but killing the process fixed it. Chrome doesn't seem to mind, but it gets a bit slow after a few weeks of use so it tends to get an occasional restart even without updates. I've seen crash dialogs which don't come back after I restart the program, but again those are a handful across several years and were mostly about some long-running panel items like calendars or notification tickers.

However, all these are rare enough that I don't really feel any particular pain. It's quite indistinguishable from these complex programs occasionally crashing on their own, even without updates.

It generally takes a really long-running session, with enough accumulated big updates that majorly change things underneath, before you can't just keep running the old binaries as they are. When something eventually misbehaves or crashes after the tenth or so update, I'll just restart that particular program. Most of the time the desktop itself keeps running like before. I don't recall ever losing data because of live updates, and I've used Linux since 1994 or so.

The live updates are much more convenient than restarting the whole system after each and every update just to make sure. I only restart one program when that program stops working, and like I said above even that is quite rare indeed.

As for you, you probably run programs that do suffer from this more.

I have the Gnome desktop with all its stuffses running in the background, I keep Emacs running continuously, a couple of browsers (but their uptimes are generally around 1-2 weeks anyway), a tmux session, then a lot of other programs which I don't keep open all the time.

But as most of the desktop still likely churns along as usual, it's pretty easy to quit and relaunch a single program.


1. Reopening a file pretty much always marks a point where it's safe for the contents to change.

2. I don't think I've ever had a problem caused by continued use between update and reboot.


In answer to 2, not the GP but I've not experienced problems doing that. Maybe I'm just lucky, though.


You can always restart processes; on Windows it is fundamentally impossible to overwrite a running DLL or EXE file. So for example if some services are needed to apply updates, they can never be updated without a reboot.
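A short sketch of what that looks like from a program's point of view (the path is hypothetical): opening a running .exe for writing typically fails with a sharing violation, because the memory manager keeps the mapped image file open without write sharing.

```c
#include <stdio.h>
#include <windows.h>

int main(void) {
    /* Try to open a (hypothetical) currently-running executable for writing. */
    HANDLE h = CreateFileW(L"C:\\Program Files\\Example\\running.exe",
                           GENERIC_WRITE,
                           0,                      /* no sharing */
                           NULL, OPEN_EXISTING,
                           FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        /* While the image is mapped, this is typically
           ERROR_SHARING_VIOLATION (32). */
        printf("open for write failed: error %lu\n", GetLastError());
        return 1;
    }
    CloseHandle(h);
    return 0;
}
```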


Yes, I'm aware of how Windows file locking works -- in fact you can sometimes rename running executables -- it depends.

Your solution to rebooting the system being user-unfriendly is... restarting processes? How would that be so much more user-friendly? That's almost the same from a user standpoint; you might as well actually lock down the system and reboot to make sure the user doesn't try to mess with the system during the update.

And on top of all that, if you're actually willing to kill processes, then they won't be locking files anymore in the first place, so now you can update the files normally...

So yeah, I really don't understand how file locking is the actual problem here, despite Linux folks always trying to blame the lack of live updates on it. I know I for one easily get errors after updating libraries on e.g. Ubuntu that make programs or the desktop crash constantly until I reboot... if anything, that's far less user-friendly.


Not all applications need to restart; most updates will affect things that are not the running application (office suite/web browser/game/whatever). Meanwhile your entire system has to restart.


"Most updates" won't affect running applications? What DLLs do you imagine "most updates" affect that are not in use by MS Office, Chrome, games, etc.? Pretty much everything I can imagine would be used all over the system, not merely coincidentally by desktop applications, but especially by desktop applications... if anything, it'd usually be the other way around, where some background services wouldn't need to be killed (because they sometimes only depend on a handful of DLLs), but many applications would (which can have insane dependency graphs). But both applications and background services also use IPC to interact with other processes (sometimes internally through Windows DLLs, not necessarily explicitly coded by them) which could well mean that they would need to be restarted if those processes need to be updated...


> What DLLs do you imagine "most updates" affect that are not in use by MS Office, Chrome, games, etc.?

Yeah, you can't update libc this way.

But outside of a short list of DLLs that are used by everything, files are mostly specific to a single program, and 90% of my programs are trivial to update by virtue of the fact that they aren't running.

And most of the background services on both Linux and Windows can be restarted transparently.


> But outside of a short list of DLLs that are used by everything, files are mostly specific to a single program, and 90% of my programs are trivial to update by virtue of the fact that they aren't running.

Are we talking about the same thing? We're talking about Windows updates, not Chrome updates or something. Windows doesn't force you to reboot when programs update themselves. It forces you to reboot when it updates itself. Which generally involves updating system DLLs that are used all over the place.


I don't think most updates touch those DLLs. Most have a modified date of my last reinstall. Some updates do, but a whole lot more could install without a restart if Microsoft cared at all (like if it cost them ten cents).


>So for example if some services are needed to apply updates, they can never be updated without a reboot

I wouldn't say never. Hotpatching was introduced in Windows Server 2003[1]. However, it's seldom available for Windows Update patches, and even when it is available, you have to opt in (using a command-line flag) to actually use it.

[1] https://jpassing.com/2011/05/01/windows-hotpatching/


IIRC this is because, under memory pressure, pages of a running executable can simply be discarded and later re-read from the image's existing location on disk, rather than taking up extra space in the pagefile. That only works if the on-disk file is guaranteed not to change underneath the running process.


> On a desktop machine, you want to be able to update drivers in the background without rebooting

Which is exactly what Linux does and Windows doesn't.


Windows can update many drivers without rebooting - even graphics drivers (try that with Linux and X!).


Yeah, it's amazing there is just a brief flash and everything is up and running again.

When I had slightly more unstable drivers, Windows could recover from that as well. The driver would crash, screen goes black, and then back up and running again without most apps noticing (excluding games and video playback).


Indeed, I’ve updated many a driver on Windows (including graphics, as you mention) without a reboot required. I’ve always needed a reboot to do the equivalent kind of updates under Linux.



