I am Boris Dobretsov, and this is the fifth part of a series Understanding Parallel Programming: A Guide for Beginners.
If you haven’t read the first four parts, have a look at Understanding Parallel Programming: A Guide for Beginners, Understanding Parallel Programming: A Guide for Beginners, Part II, Understanding Threads to Better Manage Threading in iOS, Understanding Parallel Programming: Thread Management for Beginners.
In our previous lesson, we explored key concepts like threads, processes, event loops, asynchronous programming, and multithreading. These fundamentals are enough to start splitting code into threads. However, managing threads directly with the Thread class is heavy going: you need to track the number of threads, divide work across them, add tasks, synchronise data, and monitor thread completion, all without standard tools to streamline the process.
To address these challenges, Apple introduced Grand Central Dispatch (GCD). This library encapsulates thread management entirely, offering a simple interface based on queues and tasks instead.
GCD operates on the principle of organising blocks of code (tasks) into queues. These tasks can run in parallel or sequentially, depending on the type of queue. The execution speed of queues varies. Tasks enqueued in the main queue always run on the main thread, while others are distributed across available threads by GCD.
Here’s an example of executing a block of code on a different thread using GCD:
DispatchQueue.global().async {
    for _ in 0..<10 {
        print("😈")
    }
}
Here, we fetch a global queue, call its async method, and pass a closure to be executed on that queue. At first glance, this might seem similar to Thread. However, as we'll see, GCD offers much more.
It's crucial to understand that a queue is not equivalent to a thread. Tasks in the same queue may execute on different threads, and tasks from different queues might share the same thread.
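You can observe this yourself by printing the current thread from tasks submitted to the same queue (a quick experiment, for instance in a playground; the exact thread descriptions will vary from run to run):

```swift
import Foundation

// Two tasks on the same global queue may land on different pooled
// threads, and tasks from different queues may share a thread.
DispatchQueue.global(qos: .utility).async {
    print("first utility task runs on:", Thread.current)
}
DispatchQueue.global(qos: .utility).async {
    print("second utility task runs on:", Thread.current)
}

// Keep the process alive long enough for the tasks to finish.
Thread.sleep(forTimeInterval: 1)
```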
GCD provides six standard queues in every iOS app: one main queue for UI-related tasks and five additional global queues with different priority levels.
DispatchQueue.main
DispatchQueue.global(qos: .userInteractive)
DispatchQueue.global(qos: .userInitiated)
DispatchQueue.global(qos: .utility)
DispatchQueue.global(qos: .background)
DispatchQueue.global(qos: .default)
The default queue's priority sits between userInitiated and utility. If you call DispatchQueue.global() without specifying a QoS, this is the queue you get.
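A quick way to confirm this is to compare the two queues directly (a small sketch; global queues are cached, so requesting the same QoS twice yields the same object):

```swift
import Foundation

// Asking for a global queue without a QoS is the same as
// asking for the .default one explicitly.
let unspecified = DispatchQueue.global()              // no QoS given
let explicitDefault = DispatchQueue.global(qos: .default)

print(unspecified === explicitDefault)  // true — one and the same queue
```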
Let's see these queues in action:
DispatchQueue.global(qos: .userInteractive).async {
    for _ in 0..<10 { print("😇") }
}
DispatchQueue.main.async {
    for _ in 0..<10 { print("😈") }
}
In this example, system resources are sufficient to run both tasks in parallel, so their output interleaves almost evenly:
😇
😈
😇
😈
...
Now, let’s add two more queues with different priorities:
DispatchQueue.global(qos: .userInteractive).async {
    for _ in 0..<10 { print("😇") }
}
DispatchQueue.main.async {
    for _ in 0..<10 { print("😈") }
}
DispatchQueue.global(qos: .userInitiated).async {
    for _ in 0..<10 { print("👻") }
}
DispatchQueue.global(qos: .utility).async {
    for _ in 0..<10 { print("👽") }
}
Output:
😇
😈
👻
👽
😇
😈
...
👽
Here, tasks from higher-priority queues dominate, while lower-priority tasks execute later.
Example of serial execution:
let serialQueue = DispatchQueue(label: "mySerialQueue")
serialQueue.async { print("A") }
serialQueue.async { print("B") }
serialQueue.async { print("C") }
Output:
A
B
C
Example of concurrent execution:
let concurrentQueue = DispatchQueue(label: "myConcurrentQueue", attributes: .concurrent)
concurrentQueue.async { print("A") }
concurrentQueue.async { print("B") }
concurrentQueue.async { print("C") }
Output (may vary):
B
A
C
The sync and async methods control how tasks are executed relative to the current queue:
DispatchQueue.global().async {
    print("😈")
}
print("😇")
Output:
😇
😈
Here, the main queue continues its own execution while the background task runs asynchronously. Compare this with sync, which blocks the calling thread until the closure has finished:
DispatchQueue.global().sync {
    print("😈")
}
print("😇")
Output:
😈
😇
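One caveat worth knowing: calling sync on the queue you are currently running on makes that queue wait for a task that can never start, which deadlocks. A sketch of the pattern to avoid (the dangerous call is left commented out so the snippet stays runnable):

```swift
import Foundation

// Running on the main queue:
// DispatchQueue.main.sync {     // ⚠️ never call sync on the current queue
//     print("this never runs")  // main waits for itself → deadlock
// }

// Dispatching sync to a *different* queue is fine:
DispatchQueue.global().sync {
    print("😈")  // the main thread waits here until this finishes
}
print("😇")
```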
Beyond the built-in queues, you can create your own:
let myQueue = DispatchQueue(label: "myQueue")
myQueue.async {
    for _ in 0..<10 { print("😈") }
}
Custom queues are serial by default but can be configured for concurrency:
let myConcurrentQueue = DispatchQueue(
    label: "myConcurrentQueue",
    qos: .utility,
    attributes: .concurrent
)
Custom queues allow better organization of complex multithreaded code.
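For instance, a dedicated serial queue is a common way to group all work for one subsystem, such as logging. Here is a minimal sketch (the Logger type and queue label are illustrative, not part of any framework):

```swift
import Foundation

// A hypothetical logger that funnels all writes through one serial
// queue, so messages are appended in the order they were submitted.
final class Logger {
    private let queue = DispatchQueue(label: "com.example.logger")
    private var lines: [String] = []

    func log(_ message: String) {
        queue.async { self.lines.append(message) }
    }

    func dump() -> [String] {
        queue.sync { lines }  // a sync read gives a consistent snapshot
    }
}

let logger = Logger()
logger.log("started")
logger.log("finished")
print(logger.dump())  // ["started", "finished"]
```

Because the queue is serial, the sync read in dump() is guaranteed to run after the earlier async appends, so the snapshot is always in submission order.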
Today we explored how GCD simplifies task management in iOS, enabling efficient multithreading with queues and tasks. Understanding the difference between global and custom queues, serial and concurrent execution, and synchronous versus asynchronous calls helps you optimise the app’s performance.
Next time, we’ll tackle synchronisation challenges like race conditions and deadlocks, uncovering best practices to ensure smooth, error-free multithreaded operations. Stay tuned!