I don't think this does a good job of clarifying concurrency and parallelism - rather it muddies the waters, or perhaps I should say will have the result of confusing many people.
While people are generally clear on what parallelism is, or at least find themselves in very much the same ballpark, there seem to be two separate and completely different common usages of 'concurrency' in vogue as part of current technical vernacular.
Articles like this draw hackles and controversy because two groups of people who've defined the same word differently refuse to realize that they're having two completely different conversations.
While there may be a 'correct' usage, it's functionally meaningless if large segments of people refuse to recognize the 'correct' definition.
...meh. Though the article actually had some interesting things to say, it seems sure to have them overlooked just for the sake of begetting another round of terminology war.
So which terms do you suggest? What I was talking about was event handling systems needing different things than computational systems, and misapplications of tools designed for the former in the latter. The way I use the terms "concurrency" and "parallelism" is largely consistent with the way language designers involved in Haskell, Rust and ParaSail use these terms.
What other widespread uses of these terms conflict with my usage, and what should I have said to avoid "another round of terminology war"?
Your usage is slightly different than mine. My usage comes from what I learned way back in my operating systems class in undergrad. I've been doing work in parallel computing of various forms for a bit now, and this is how I think about it (cribbing from what I wrote here a few months back):
Concurrency is a statement about a duration of time. Parallelism is a statement about a moment in time.
During a one second window, did both of the tasks execute? If yes, then they are concurrent. If no, then they are not concurrent in that window.
Was there any moment in time when both tasks were executing simultaneously? If yes, then they executed in parallel. If no, then they did not execute in parallel.
From this, it's easy to see that if something is parallel, it must also be concurrent - in order for two tasks to execute during the same moment in time, they must also both have executed during a window of time.
Concurrency without parallelism happens on machines with a single core. Processes and threads must time-share the only core. Frequently, some of these processes will have multiple threads, so the threads need to synchronize with each other, even though they don't execute at the same time.
Concurrency and parallelism imply non-deterministic interleavings of threads and processes, but whether or not the result is deterministic is up to how we program.
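To make the single-core case concrete, here's a minimal sketch (my own illustration, not from the article) of concurrency without parallelism: two tasks cooperatively sharing one thread with Python's asyncio. Both tasks execute during the same window of time, so they're concurrent, but only one is ever running at any given moment, so they're never parallel. The task names and log list are just for illustration.

```python
import asyncio

async def task(name, log):
    # Each step appends a record, then yields control to the event loop,
    # letting the other task run - cooperative time-sharing on one thread.
    for i in range(3):
        log.append(f"{name}-{i}")
        await asyncio.sleep(0)  # bare yield; no actual delay

async def main():
    log = []
    # Both tasks execute within the same window (concurrent), but the
    # single-threaded event loop runs only one at any instant (not parallel).
    await asyncio.gather(task("A", log), task("B", log))
    return log

log = asyncio.run(main())
print(log)  # the two tasks' steps interleave, e.g. A-0, B-0, A-1, B-1, ...
```

Swap `asyncio.sleep(0)` for real blocking work and the interleaving changes, which is exactly the non-determinism mentioned above: the schedule is up to the runtime unless we synchronize explicitly.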
I don't really know what terms are best to use, all I know is I've met a fair number of people who don't think context switching is a true form of concurrency.
I think a large segment of this group sees parallelism as many backhoes digging one hole (or working as a team on a few holes) and concurrency as many backhoes all digging their own holes, roughly speaking.
I also encounter another (smaller) group of people who blur/conflate concurrency with distributed systems, probably because of things like websites bragging about how many 'concurrent connections' they can serve.
These two groups can blur with each other fairly easily because they share what I'd term an 'intuition overlap.'
Oddly enough, I think this helps explain why many people are often so confused about things like NodeJS. Because it's touted for its concurrency, a lot of people first using it seem to think that it's somehow parallel by nature.
I think the reason this kind of confusion is so prevalent is that in vernacular English, concurrent is essentially a synonym for 'simultaneous,' while as used in programming (along the lines of how it was used in this article) it comes to mean something more like 'how many balls can a juggler keep in the air,' with highly concurrent systems being able to juggle many more balls with the same two hands, so to speak. And in fact, in a way, all the balls are concurrently in the air, but it's not (I think) the most intuitive thing, hence why many vehemently disagree about usage.
When you find yourself arguing about whether a tree falling in the forest causes a sound, the right thing to do is to start talking about vibrations and audio sensations instead. Ambiguous words can almost always be bypassed by using more words to nail down meanings properly.