ios · swift · grand-central-dispatch

Why doesn’t calling serialQueue.sync() result in a deadlock?


While studying DispatchQueue, I came across something I don’t quite understand.

I created a custom serial queue and executed a synchronous block on it. When I printed the thread that was running the block, it turned out to be the main thread. This confused me, because the queue wasn’t the main queue, yet the block still executed on the main thread.

import Foundation

let serialQueue = DispatchQueue(label: "serial.queue")

serialQueue.sync {
    print("current thread = \(Thread.current)") // current thread = _NSMainThread
}

In the documentation for sync(), I found the following statement:

As a performance optimization, this function executes blocks on the current thread whenever possible, with one exception: Blocks submitted to the main dispatch queue always run on the main thread.

Based on this description, I assumed that if the block was indeed executed on the current thread (the main thread in this case), then a deadlock should have occurred.

Here’s the reasoning I had:

  1. The main thread calls serialQueue.sync(). Because it’s synchronous, the main thread becomes blocked until the block completes.
  2. The block is submitted to the serial queue, which then attempts to execute it on the main thread (the current thread).
  3. However, since the main thread is already blocked, it wouldn’t be able to process the task, which should cause a deadlock.

But contrary to my expectation, no deadlock occurred and the code executed just fine. I can’t figure out why.

Questions

  1. Why is a block executed on the main thread even though it was submitted to a custom serial queue?
  2. If, as the documentation suggests, sync() executes on the current thread as an optimization, why doesn’t this lead to a deadlock in this case?
  3. Can the main thread still execute other tasks while it is blocked? I would like to understand how exactly the main thread operates.

Solution

  • There are two very different issues at play here:

    1. Will it deadlock?

      It will deadlock if you dispatch synchronously from a serial queue to itself. But you are dispatching from one queue (the main queue in your example) to a different queue, so there is no deadlock.

    2. On which thread will the synchronously dispatched closure run?

      GCD has a clever optimization: It is smart enough to know that if you dispatch synchronously from one queue to another, the thread from which you called is not available to do other work for the duration of the sync closure and therefore would (in the absence of any optimizations) just be sitting idle. As a result, in many cases sync can avail itself of this idle thread, avoiding an unnecessary and expensive context switch to a different thread. As the sync documentation says:

      As a performance optimization, this function executes blocks on the current thread whenever possible…

    But it is important not to conflate these two questions: When you call sync, it first determines whether the queue is blocked (i.e., a serial queue that is already running something), and if not, it will run the closure. But when it goes to run it, it avails itself of the caller’s thread if it can, as the sketch below illustrates.
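
    A minimal sketch contrasting the two situations (the queue label is just an example):

    import Foundation

    let serialQueue = DispatchQueue(label: "serial.queue")

    // Case 1: dispatching synchronously from one queue (here, the main queue)
    // to a *different* queue. No deadlock, and as an optimization GCD may run
    // the closure on the calling thread.
    serialQueue.sync {
        print("thread = \(Thread.current)") // typically the main thread
    }

    // Case 2: dispatching synchronously from a serial queue to itself.
    serialQueue.sync {
        // serialQueue.sync { }   // never do this: the inner sync would wait on
        // a queue that is already busy running this outer closure, i.e. deadlock
    }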


    It’s not relevant to your example, but for the sake of completeness, we should observe that there is a notable exception to the optimization I talk about in point two. Namely, the documentation goes on to say (emphasis added):

    … with one exception: Blocks submitted to the main dispatch queue always run on the main thread.

    So, if you dispatch synchronously from your serial queue to the main queue, that will always run the dispatched code on the main thread. But if you dispatch synchronously from one queue to any other queue (other than the main queue), then it may avail itself of this optimization and just run the dispatched code on the caller’s thread.
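
    For example, here is a rough sketch of both behaviors (the queue labels are arbitrary, and it assumes an app whose main run loop is running so the main queue can service the sync block):

    import Foundation

    let workerQueue = DispatchQueue(label: "worker.queue")
    let otherQueue = DispatchQueue(label: "other.queue")

    workerQueue.async {
        // The exception: a block dispatched synchronously to the main queue
        // always runs on the main thread.
        DispatchQueue.main.sync {
            print("isMainThread = \(Thread.isMainThread)") // true
        }

        // The optimization: sync from one background queue to another may
        // simply reuse the calling thread rather than switching threads.
        otherQueue.sync {
            print("thread = \(Thread.current)") // often the worker's own thread
        }
    }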


    Hopefully the above answered your first two questions. But you also asked:

    3. Can the main thread still execute other tasks while it is blocked? I would like to understand how exactly the main thread operates.

    No, it can’t. And we never want to block the main thread. In the context of your question, this means one must avoid synchronously dispatching anything from the main thread that takes more than a few milliseconds. The same is true for locks or anything else that might block. It’s fine for something incredibly fast, say, synchronizing access to some shared mutable state, but anything slow should be done asynchronously.
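
    For instance, a serial queue guarding a tiny critical section is a reasonable use of sync, because each closure finishes almost immediately (the Counter type and its label here are just an illustration):

    import Foundation

    final class Counter {
        private var value = 0
        private let queue = DispatchQueue(label: "counter.queue")

        func increment() {
            queue.sync { value += 1 }   // fast, so briefly blocking the caller is fine
        }

        var current: Int {
            queue.sync { value }
        }
    }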

    Needless to say, nowadays we generally wouldn’t even use GCD, but would favor Swift concurrency’s async-await patterns.
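
    As a rough sketch of that direction (the CounterActor name is illustrative, not from your code), an actor serializes access to its state without blocking any thread:

    actor CounterActor {
        private var value = 0

        func increment() { value += 1 }
        func read() -> Int { value }
    }

    // From an async context:
    // let counter = CounterActor()
    // await counter.increment()
    // let n = await counter.read()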