This is interesting, and to some extent, expected. I'd expect that emulating one OS on top of another is going to have performance challenges and constraints in general.
But the deep dive into aspects of Windows and Linux I/O reminded me that I'd love to see a new generation of clean-sheet operating systems. Not another Unix-like OS. Not Windows. Something actually new and awesome. It's been a long time since Unix/BSD, Linux, and Windows NT/XP were introduced.
A few years ago, Microsoft developed a new operating system called Midori, but sadly they killed it. MS, Apple, and Google are each sitting on Carl Sagan money – billions and billions in cash. Would it hurt MS to spend a billion to build an awesome, clean-sheet OS? They could still support and update Windows 10 for as many years as they deemed optimal for their business, while also offering the new OS.
Would it cost more than a billion dollars to assemble an elite team of a few hundred people to build a new OS? Would Apple, Microsoft, or Google notice the billion dollar cost? Or even two billion?
If you think there's no point in a new OS: oh, there's a lot of room for improvement right now. For one thing, we really ought to have Instant Computing in 2018. Just about everything we do on a computer should be instant, like 200 ms, maybe 100 ms. Opening an application and having it be fully ready for further action should be instant. Everything we do in an application should be instant, except for things like video transcoding or compressing a 1 GiB folder. OSes today are way, way too slow, which comes back to the file access performance issue on Windows. Even a Mac using a PCIe SSD fails to be instant for all sorts of tasks. They're all too damn slow.
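To make that budget concrete, here's a minimal sketch (Rust, purely illustrative; the current-directory scan and the 200 ms cutoff are my assumptions, not a real benchmark) of holding an everyday filesystem operation to an "instant" budget:

```rust
use std::fs;
use std::time::Instant;

fn main() -> std::io::Result<()> {
    const BUDGET_MS: u128 = 200; // assumed upper bound of "instant"

    let start = Instant::now();
    // Stand-in for an everyday task: stat every entry in the current directory.
    for entry in fs::read_dir(".")? {
        let _ = entry?.metadata()?;
    }
    let elapsed = start.elapsed().as_millis();

    println!(
        "{} ms: {}",
        elapsed,
        if elapsed <= BUDGET_MS { "instant" } else { "too slow" }
    );
    Ok(())
}
```

Per-file overhead in loops like this is exactly where the Windows file-access issue above bites.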
We also need a fundamentally new security model. There's no way data should be leaving a user's computer as freely and opaquely as is the case now with all consumer OSes. Users should have much more control and insight into data leaving their machine. A good OS should also differentiate actual human users from software – the user model on *nix is inadequate. Computers should be able to know when an action or authorization was physically committed by a human. That's just basic, and we don't have it. And I'm just scratching the surface of how much better security could be on general purpose OSes. There's so much more we could do.
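As a sketch of what I mean (entirely hypothetical; no shipping OS exposes anything like this), an outbound transfer could be a first-class, user-auditable object that records whether a physical human action was behind it:

```rust
use std::net::IpAddr;

// Hypothetical types only: sketching the shape of a user-auditable
// egress policy, not any real OS interface.

/// What the OS would record (and show the user) for every outbound flow.
struct EgressRecord {
    app: String,           // which program initiated the transfer
    destination: IpAddr,   // where the data is going
    bytes: u64,            // how much is leaving the machine
    human_initiated: bool, // traceable to a physical user action?
}

/// What a per-app policy could return before any data leaves.
enum EgressDecision {
    Allow,
    Deny,
    AskUser, // surface a prompt tied to verified human presence
}

fn decide(record: &EgressRecord) -> EgressDecision {
    // Example policy: software-initiated transfers always require consent.
    if record.human_initiated {
        EgressDecision::Allow
    } else {
        EgressDecision::AskUser
    }
}

fn main() {
    let r = EgressRecord {
        app: "photo_uploader".into(),
        destination: "93.184.216.34".parse().unwrap(),
        bytes: 4_096,
        human_initiated: false,
    };
    match decide(&r) {
        EgressDecision::Allow => println!("allowed"),
        EgressDecision::Deny => println!("blocked"),
        EgressDecision::AskUser => {
            println!("ask the user: {} wants to send {} bytes to {}",
                     r.app, r.bytes, r.destination)
        }
    }
}
```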
Google is doing this, very publicly, with Fuchsia. Brand new kernel, not even POSIX compliant.
Microsoft is also doing this, in a different and substantially more expensive way [1]. Over the past several years they've been rewriting and unifying their disparate operating systems (Windows, Phone (before the fall), Xbox, etc.) into a single modular kernel they're calling OneCore. It's more than likely that this work is based on, if not largely contains, much of the NT kernel, but it's the same line of thinking.
There is one massive rule when it comes to engineering management we see repeated over and over, yet no one listens: Do Not Rewrite. Period.
Apple is exemplary in this. We don't know how many changes they've made to iOS since its fork from macOS long ago, which was itself based on BSD even longer ago. But have you used an iPad in recent history? Instant app starts. No lag. No-stutter rendering at 120 fps. When HFS started giving them issues, they swapped it out for APFS. Apps are sandboxed completely from one another, and have no way to break their sandbox even if the user wants them to. Etc. Comparing an iPad Pro's performance to most brand-new Windows laptops is like fighting a low-orbit laser cannon with a Civil War-era musket. They've succeeded, however they managed to do that.
Point being, you don't rewrite. You learn, you adapt, and you iterate. We'll get there.
(And if you've read all this and then wondered "but isn't Fuchsia a rewrite?" you'd be right, and we should all have serious concerns about that OS ever seeing the light of day on a real product, and about its quality once that happens. It won't be good. They can't even release a passable ChromeOS device [2].)
> Would it cost more than a billion dollars to assemble an elite team of a few hundred people to build a new OS? Would Apple, Microsoft, or Google notice the billion dollar cost? Or even two billion?
It's not a matter of money, resources, or talent. For reference: The Mythical Man-Month, waterfall process, Windows Vista.
Building something of this scale and complexity from scratch will take a lot longer than even the most conservative estimate and will take many more years to shake out the bugs. Again, remember Windows Vista? And that was not nearly as revolutionary as the things you suggest.
Consider also that basically every single ground-up "we are rethinking everything, and doing it right this time" OS rebuild from scratch has been a failure. There have been dozens, if not hundreds, of examples.
I would argue that BeOS was not a failure as an OS, but it was a failure in the market: it couldn't find a clear place for itself, and it was attempting to break into a market pretty much totally dominated by MS at the time.
Remember that in the era of home computers there were many good systems that were pretty much from-scratch implementations (Amiga, GEM, and Archimedes), so the idea of creating a totally new OS to challenge the incumbents (Windows and OSX) is not totally pointless.
I think the biggest problem is building the application ecosystem. iOS and Android were able to attract developers because they were the first movers on the computer phone platform. Convincing people to use a new OS for desktop (or any existing) computing without any apps would be difficult, and vice-versa convincing devs to support a platform with no users is also tough.
I think it's actually much easier now to get people onto a new OS than it was a decade ago, or in the 90s, or the 80s. The web is so central that a new OS with a great browser has won half the battle. (I could write five pages on how massively better a browser could be than Chrome, Edge, Firefox, Safari, Opera, etc., and I love Firefox and Opera and Vivaldi and Brave.)
ChromeOS has partly proved the point, especially among college students and younger. Anyway, a serious effort at a new OS, a whole new stack, along with a new office suite and other applications, could succeed with the right (large) team and funding. People aren't very loyal to Microsoft – they're not loved. And Apple users have no idea how much better computers could be, but they would if someone showed them.
> Would it cost more than a billion dollars to assemble an elite team of a few hundred people to build a new OS? Would Apple, Microsoft, or Google notice the billion dollar cost? Or even two billion?
This gives some idea of what is required to keep Linux moving forward. It would be nice if we could see similar stats for MS and Apple.
Yes, I like them a lot. A formally verified OS should be the standard these days (and formally verified compilers). But for that to be so, we'll need new programming languages (not Rust) and toolchains I guess.
So I'm coming at this from a social science perspective. Why did I say "not Rust"? I'm working on a research program to get at what happens when people engage with programming for the first time, aspects of language design that attract or repel learners, and as sort of a related offshoot, why smart women are so disproportionately less interested in programming.
One of my working hypotheses is that programming is terrible. More precisely, the majority of (smart) people who are introduced to programming have a terrible experience and do not pursue it. I think many smart people walk away thinking that it's stupid – that the design and workings of prevailing programming languages are absurd, tedious, and nonsensical. (Note that most men aren't interested in a programming career either. So it's not just smart women.)
Most intro CS or programming courses seem to use Java these days. That's an unfortunate choice. To veteran programmers, Rust likely seems very different from other languages. Its default memory safety, ownership and borrowing, and other features are innovative. (Though the amazing performance of Go's garbage collector seems to weaken the case for Rust's approach somewhat.) However, to new learners, I'm pretty sure Rust will be awful. It's fully traditional in its mass of punctuation noise, just saturated with bizarre (to outsiders) curly braces and semicolons. It also has, in my opinion, extremely unintuitive syntax. I see it as somewhat worse than Java in that respect.
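To see what a newcomer faces, here's a tiny, perfectly ordinary Rust function (my own illustration): before writing a line of logic, you're parsing `&`, `'a`, `::`, `->`, braces, and semicolons, plus the borrowing rules themselves.

```rust
// A lifetime-annotated borrow: legal, idiomatic Rust,
// and a wall of punctuation to a first-time programmer.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let s = String::from("hello");
    let t = String::from("world!");
    // `&s` lends the string rather than moving it...
    println!("{}", longest(&s, &t));
    // ...so `s` and `t` are still usable afterward.
    println!("{} {}", s, t);
}
```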
Honestly, Rust is brilliant in more ways than one. I just think they punted on syntax and punctuation noise. I think our timeline is just weird and tragic when it comes to programming languages. They're almost all terrible and just so poorly designed. There's hardly any real scientific research on PL design from a human factors perspective, and what research there is is not applied by PL designers. PLs are just the result of some dude's preferences, and no one seems to be thinking outside of the box when it comes to PL design. In the end, to really make progress in computing and programming, we might need outsiders and non-programmers to design PLs. I'll say that the now-canceled Eve programming language project is a notable exception to the consistent horribleness of modern PLs. We need a few dozen Chris Grangers.
Is it really important how easy it is to learn a language when we are talking about formally verified operating system kernels? Beginners are probably not going to write formally verified code, let alone formally verified system code, so this area is already only accessible to veteran programmers who may not have a problem with these aspects of Rust.
I don't think it's a contradiction to introduce people to programming with more "elegant" or uniform languages that are closer to math (like Scheme or Haskell), or "easy" languages like Python, and then have them move on to Rust if they want to do system-level programming.
I see your point on beginners and formally verified systems code. However, I'm imagining a somewhat different universe.
Formally verified systems code is a nightmare right now. It's almost impossible, and hardly anyone does it for pay. Obscure tools like Isabelle or Coq are needed. People try to shoehorn formal verification onto C, which reflects a terrible lack of imagination and a poor use of time. Maybe it's easier with Ada or Haskell or something – not sure.
I want a revolution. I want a whole suite of new, clean-sheet programming languages that are the result of furiously brilliant multifaceted design. I want formal verification to be easy, or easier, and baked in to systems languages. I want new operating systems built in these new languages. Some of the dissenters in this thread are assuming that building a new operating system involves firing up your C compiler of choice, and dismiss the whole enterprise as a multiyear bug-ridden nightmare ("never rewrite!" they say) – that is definitely not what I'm thinking about. To hell with C.
It's time to move forward. This is supposed to be a technology industry. It's supposed to be about technological progress. PL design is not progressing. It's awful and it repels newcomers, especially women. We need to drop all these assumptions about programming being done in primitive plain text editors on primitive steampunk Unix-like OSes using primitive command line interfaces with languages as intractable as Linear A. That whole package, which is often what people are introduced to in intro CS courses, is about the worst way to introduce people to the power and wonder of computing that we could have devised.
We need to think more deeply about how to handle and represent various abstractions, functions, and imperative code. Functions should not have bizarre void statements and empty parens, for example. That's so unintuitive. At this point, functions should probably be like equation editors. The pure mathematical function in its normal book representation could be what we see in our IDEs. Visual/graphical programming never really got out of the gate because those systems just converted code to boxes and arrows. There's an opportunity to rethink that whole approach.
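Here's the gap, concretely (my own toy example): in a textbook the function is just f(x) = x² + 2x + 1, while the executable version drags in keywords, type annotations, arrows, braces, and semicolons:

```rust
// The textbook form:  f(x) = x^2 + 2x + 1
// The code form below says the same thing, with far more ceremony.
fn f(x: f64) -> f64 {
    x * x + 2.0 * x + 1.0
}

fn main() {
    println!("f(3) = {}", f(3.0)); // prints: f(3) = 16
}
```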
But if we don't get people interested in programming, if we don't make it less terrible, they never get to systems work. And systems work should not look like Rust or C++. There is no reason why a modern IDE or even text editor should need curly braces or semicolons to understand or compile code at this point in the history of the software industry. Text is a solved problem. Indentation and spacing and line breaks are all trivial to parse. Colors could be semantic. Emoji could be code. Icons, badges like we see for build passing/failing could also be used as code. There's a lot we could do to make programming make a lot more sense.
One of the reasons I want to get more women into programming is that I think they'll be better programmers than men in some respects. Think of archetypes or stereotypes like Hermione vs Harry Potter, Lisa Simpson vs Bart, Wendy Testaburger vs Stan, Kim Wexler vs Jimmy McGill. There is anecdotal, cultural, and scientific evidence to suggest that women – or some subset of women – are more responsible and meticulous than some of the male personality types we often see in tech.
I'll have more on this this week in my report on Medium on Google's firing of developer James Damore for citing well-established social and personality psychology research (my field) on sex differences. I get into some of the reasons programming is generally terrible, and why women might be disproportionately repelled by it. Dart is my victim/example in this report, but I think pretty much all languages are awful. (Eiffel and F# should be the worst case, the bare minimum starting points...)
We did think a lot about the syntax, but we wanted to choose something that'd be familiar to existing systems programmers. It's largely based on C++ and Java, with a little bit of ML thrown in. This is because the language is, in many senses, C++ with some ML thrown in.
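For what it's worth, a small illustration of that blend (mine, not anything official): C-family braces, semicolons, and explicit types on the outside, ML-style exhaustive pattern matching and expression orientation on the inside.

```rust
// C++/Java surface: braces, semicolons, ::, explicit types.
enum Shape {
    Circle(f64),
    Rect(f64, f64),
}

// ML underneath: `match` is an expression and must cover every case.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle(r) => std::f64::consts::PI * r * r,
        Shape::Rect(w, h) => w * h,
    }
}

fn main() {
    println!("{}", area(&Shape::Rect(3.0, 4.0))); // prints 12
}
```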
AIUI, proper program verification is still a very long-term goal for Rust (and most of the related short-term effort is currently going into building a formal model for the language, including its "unsafe" fragment). So, "not Rust" in that Rust simply doesn't cut it at present, whereas other solutions (Idris, Coq, even Liquid Haskell) might be closer to what's needed.