
Your usage is slightly different from mine. Mine comes from what I learned way back in my undergraduate operating systems class. I've been doing work in various forms of parallel computing for a while now, and this is how I think about it (cribbing from what I wrote here a few months back):

Concurrency is a statement about a duration of time. Parallelism is a statement about a moment in time.

During a one-second window, did both of the tasks execute? If yes, then they are concurrent. If no, then they are not concurrent in that window.

Was there any moment in time when both tasks were executing simultaneously? If yes, then they executed in parallel. If no, then they did not execute in parallel.

From this, it's easy to see that if something is parallel, it must also be concurrent - in order for two tasks to execute during the same moment in time, they must also both have executed during a window of time.
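To make that concrete, here's a minimal Go sketch (Go is just my choice for illustration here): two goroutines that both execute within the same window of time are concurrent by the definition above, while whether any of their instructions run at the same moment depends on the runtime and the hardware.

    package main

    import (
        "fmt"
        "runtime"
        "sync"
    )

    func main() {
        var wg sync.WaitGroup
        wg.Add(2)

        // Both goroutines execute within the same window of
        // time, so they are concurrent by the definition above.
        for _, name := range []string{"task A", "task B"} {
            go func(name string) {
                defer wg.Done()
                fmt.Println(name, "ran")
            }(name)
        }
        wg.Wait()

        // Parallelism is about moments: it requires more than one
        // OS thread actually running at once, on more than one core.
        fmt.Println("threads that may run simultaneously:", runtime.GOMAXPROCS(0))
    }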

Concurrency without parallelism happens on machines with a single core. Processes and threads must time-share the only core. Frequently, some of these processes will have multiple threads, so the threads need to synchronize with each other, even though they don't execute at the same time.
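You can simulate the single-core case in Go by pinning the runtime to one OS thread (again, just an illustrative sketch): the goroutines only ever time-share that thread, so they are concurrent but never parallel, yet they still need a lock, because the scheduler can preempt one of them in the middle of an update.

    package main

    import (
        "fmt"
        "runtime"
        "sync"
    )

    func main() {
        // Limit the runtime to one OS thread: goroutines time-share
        // it, like processes and threads on a single-core machine.
        runtime.GOMAXPROCS(1)

        var mu sync.Mutex
        var wg sync.WaitGroup
        counter := 0

        for i := 0; i < 2; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for j := 0; j < 1000; j++ {
                    // The scheduler can preempt between the read and
                    // the write of counter, so a lock is still needed
                    // even though the goroutines never run at the
                    // same moment.
                    mu.Lock()
                    counter++
                    mu.Unlock()
                }
            }()
        }
        wg.Wait()
        fmt.Println("counter:", counter) // always 2000 with the mutex
    }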

Concurrency and parallelism imply non-deterministic interleavings of threads and processes, but whether or not the result is deterministic is up to how we program.
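For example (one more hedged sketch): the print order below varies from run to run because the interleaving is non-deterministic, but the final sum is deterministic, since the channel synchronizes access to the results and addition is commutative.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var wg sync.WaitGroup
        results := make(chan int, 3)

        for i := 1; i <= 3; i++ {
            wg.Add(1)
            go func(n int) {
                defer wg.Done()
                fmt.Println("worker", n) // this order varies run to run
                results <- n * n
            }(i)
        }
        wg.Wait()
        close(results)

        // The sum is always 1 + 4 + 9 = 14, regardless of
        // which interleaving the scheduler happened to pick.
        sum := 0
        for v := range results {
            sum += v
        }
        fmt.Println("sum:", sum)
    }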


