While developing an iOS app, one cannot escape concurrency and threading issues. Many scenarios require the concepts of threads, queues, and concurrency. These concepts are vast at the generic level, but while developing apps on Apple platforms, the system manages a lot of the threading work by itself.
Let's first talk about concurrency: what it really is and why it's a big deal. To be precise, concurrency is the handling of more than one task by your CPU. That's it, no big confusing words :). Earlier Apple devices shipped with a single CPU that context-switched among multiple tasks. Later, in 2011, the iPad and iPhone were upgraded to dual-core processors, and that was a big move. Apple provides GCD (Grand Central Dispatch), which can be used to perform simple tasks, and Operation queues, which are usually used to perform complex tasks. One of the reasons expertise in concurrency is needed is to keep the UI responsive. This can be understood with an example: scrolling images in a UITableView while downloading and transforming them takes time and is a bit slow, so it is advisable to run those tasks on threads other than the main thread.
Note: By default, all UI-updating tasks are performed on the main thread. If we perform heavy or complex computation on the main thread, it blocks the UI of your app and makes it unresponsive.
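The usual pattern is to push the heavy work onto a background queue and hop back to the main queue only to update the UI. A minimal sketch, where `loadTitle` and its completion handler are hypothetical names rather than any Apple API:

```swift
import Foundation
import Dispatch

// Hypothetical helper: heavy work runs on a background queue; the result is
// delivered back on the main queue, where UI updates belong.
func loadTitle(completion: @escaping (String) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // Simulate slow, non-UI work (e.g. downloading/transforming an image).
        let charCount = (1...1_000).map(String.init).joined().count

        DispatchQueue.main.async {
            // Never touch UIKit from a background thread.
            completion("rendered \(charCount) characters")
        }
    }
}
```

In a real app the completion closure would assign to something like a label's `text`; the key point is only the two-queue hop.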
- Tasks modifying or updating the same resource must not run at the same time; we need to make sure the resource is thread-safe.
- The terms ‘task’ and ‘process’ are often used interchangeably. Unfortunately, the term ‘multitasking’ is generally used to mean the ability to manage multiple processes at once, while ‘multiprocessing’ refers to a system with multiple processors (CPUs).
We can create our own threads, but using this option without expertise can lead to a lot of issues. When we assign a task to GCD or Operation queues, the system manages threads by itself. We can create multiple tasks and assign them to a queue. Each task is divided into multiple units, and each unit is executed by an available thread. The breaking of a task into multiple units and assigning them to threads is handled by the system. In the diagram given below, we have a queue with 6 tasks divided between 2 threads: 'Thread 1' executes 4 tasks (purple, yellow, red and green) and 'Thread 2' executes 2 tasks (blue, orange).
Choosing between GCD and Operation queues depends on how we want tasks to communicate among themselves or with the main thread, as both provide various ways to make synchronous functions behave asynchronously.
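One thing Operation queues make particularly easy is expressing dependencies between tasks. A minimal sketch, with made-up operation names standing in for the download/transform example above:

```swift
import Foundation

// Two dependent pieces of work modeled as operations.
let queue = OperationQueue()

var log: [String] = []
let lock = NSLock()   // operations may run on different threads, so guard `log`

let download = BlockOperation {
    lock.lock(); log.append("download"); lock.unlock()
}
let transform = BlockOperation {
    lock.lock(); log.append("transform"); lock.unlock()
}
// `transform` must not start until `download` finishes.
transform.addDependency(download)

queue.addOperations([download, transform], waitUntilFinished: true)
print(log)  // ["download", "transform"]
```

With plain GCD you would express the same ordering by dispatching the second block from inside the first, or by using a serial queue.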
There are 3 main potential problems one can face while dealing with concurrency, and the first two can be avoided by using GCD or Operation queues:
- Race Condition
- Priority Inversion
- Dead Lock
Race Condition: when two threads try to access or change the same resource at the same time. Whether this happens depends entirely on how the threads are scheduled and when they start, sleep and resume. Xcode 8 ships the Thread Sanitizer, or TSan, to flag code with potential race conditions. The ideal scenario for two threads accessing a resource is depicted in the diagram given below, where the value is accessed for read/write by the threads on different clock cycles.
But there is a possibility that thread 2's initialization and read happen before thread 1's write, which gives an inconsistent result after thread 2's write statement. This can be understood from the diagram given below.
The race condition can be dealt with either by locking the value, which makes the other thread wait until the first completes its work, or by using serial queues, which restrict a particular resource to a single task at a time.
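The serial-queue approach can be sketched as follows; the queue label and the `Counter` type are just illustrative names. All mutations go through one private serial queue, so no two tasks ever touch the value at the same time:

```swift
import Dispatch

// Serialize access to a shared counter through a private serial queue.
final class Counter {
    private var value = 0
    private let queue = DispatchQueue(label: "counter.serial")  // serial by default

    func increment() {
        queue.async { self.value += 1 }   // writes are funneled through one queue
    }

    func current() -> Int {
        queue.sync { value }   // a sync read gives a consistent snapshot
    }
}

let counter = Counter()
// Hammer the counter from many threads at once.
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    counter.increment()
}
// With a plain `var value` and no queue, some of these increments could be lost.
print(counter.current())  // 1000
```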
A priority inversion occurs when high-priority work becomes dependent on lower-priority work, or becomes the result of lower-priority work. As a result, blocking, spinning, and polling may occur.
In the case of synchronous work, the system will try to resolve the priority inversion automatically by raising the QoS (Quality of Service) of the lower-priority work for the duration of the inversion. This will occur in the following situations:
- dispatch_wait() is called for a block on a serial queue.
- pthread_mutex_lock() is called while the mutex is held by a thread with lower QoS. In this situation, the thread holding the lock is raised to the QoS of the caller. However, this QoS promotion does not occur across multiple locks.
In the case of asynchronous work, the system will attempt to resolve priority inversions occurring on a serial queue.
Let's consider an example with 3 tasks, task1, task2 and task3, having low, medium and high priority respectively, all needing a common resource. First, task1 starts executing and locks the resource. Meanwhile, task2 starts and preempts task1, but it cannot access the common resource because task1 has locked it. Then task3 starts, preempts task2 and begins executing because it has high priority, but when it needs the common resource it gets blocked, as the resource is still locked by task1. At this point the low- and medium-priority tasks are stopped by the high-priority task, while the high-priority task that needs the common resource is stopped by the low-priority task. This is the classic case of priority inversion, as depicted below:
The scheduler then runs task2, the medium-priority task, ahead of the blocked task1 while the high-priority task3 is still waiting; this is the priority inversion. When task2 completes, task1 resumes (task3 is still waiting), and when task1 is finally done with the resource, task3 starts using it. This is well explained below:
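The practical defense is to state priorities up front: give each queue an explicit QoS so high-priority work is never submitted behind work the system assumed was equally important. A minimal sketch with arbitrary queue labels:

```swift
import Dispatch

// Declare the importance of each queue's work when creating it.
let housekeeping = DispatchQueue(label: "app.cleanup", qos: .utility)
let userWork = DispatchQueue(label: "app.user-work", qos: .userInitiated)

var cleanedUp = false
var prepared = false

let group = DispatchGroup()
housekeeping.async(group: group) { cleanedUp = true }  // runs whenever the CPU is free
userWork.async(group: group) { prepared = true }       // scheduled ahead of .utility work
group.wait()   // both blocks have finished once wait() returns
```

Keeping the QoS honest at submission time means the system rarely needs to boost anything later.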
Note: Developers should try to ensure that priority inversions don’t occur in the first place, so the system isn’t forced to attempt a resolution.
A deadlock is a situation where at least two threads each hold a lock on a different resource, and each is waiting for the other's resource to complete its task. Neither can release the lock on the resource it is holding. (See image below)
In the diagram above, Thread 1 accesses resource 1 and locks it; similarly, Thread 2 accesses resource 2 and locks it. Thread 1 then wants resource 2, which is already in use by Thread 2, so Thread 1 waits for it to become free. The problem occurs when Thread 2 also wants resource 1, which is already locked by Thread 1. This creates a deadlock, as both threads are waiting for the other to finish.
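The classic fix for this two-lock deadlock is lock ordering: every thread acquires the locks in the same agreed order, so the circular wait can never form. A sketch, with placeholder resource names:

```swift
import Foundation
import Dispatch

let resource1 = NSLock()
let resource2 = NSLock()

func withBothResources(_ work: () -> Void) {
    // Always lock resource1 first, then resource2 — never the reverse order.
    resource1.lock(); defer { resource1.unlock() }
    resource2.lock(); defer { resource2.unlock() }
    work()
}

var total = 0
DispatchQueue.concurrentPerform(iterations: 100) { _ in
    // Every thread uses the same acquisition order, so no thread can hold
    // resource2 while waiting on resource1.
    withBothResources { total += 1 }
}
print(total)  // 100
```

If one caller locked resource2 first, the deadlock from the diagram becomes possible again; the ordering rule is only as good as its weakest caller.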
Concurrency — Jargon
Most people get confused between concurrent and asynchronous, so for the record: Apple has confirmed that concurrent does not mean asynchronous by introducing the Operation property isAsynchronous and deprecating isConcurrent. From the current queue we can dispatch a task synchronously or asynchronously, and we can run asynchronous tasks on both serial and concurrent queues.
The main queue is a serial queue by default, and it is also responsible for executing UI tasks. If we have a heavy/expensive task on the main thread, it will execute that task and make the other tasks in the queue wait, including UI tasks, which may make your UI freeze for a moment.
Concurrent programming relies on performing those heavy/expensive tasks on another queue.
Serial queues run one task at a time, waiting until each task completes. So if tasks have a linear dependency, a serial queue is the optimal solution. One can use serial queues to take the load of non-UI tasks off the main thread. They also provide mutual exclusion over resources without using thread locks.
Similarly, if we run tasks on a concurrent queue, it can execute them on more than one thread. Here we have to be careful, as it doesn't provide thread safety, and a lot of threading issues can arise if not handled properly.
- Synchronous and asynchronous describe the source of a task.
- Serial and concurrent describe the destination queue of a task.
- A synchronous call waits for the task to complete, whereas an asynchronous call doesn't wait.
- A concurrent queue runs tasks on more than one thread, whereas a serial queue executes on only a single thread.
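The source/destination distinction above can be sketched in a few lines; the queue label is arbitrary:

```swift
import Dispatch

// sync vs async is about the caller; serial vs concurrent is about the queue.
let serial = DispatchQueue(label: "demo.serial")

var order: [String] = []

serial.async { order.append("first") }   // async: the caller doesn't wait
serial.sync  { order.append("second") }  // sync: the caller blocks until this runs

// The serial queue runs its tasks one at a time in FIFO order, so the sync
// call cannot return before "first" has already executed.
print(order)  // ["first", "second"]
```

Swap `serial` for a queue created with `attributes: .concurrent` and the two appends could run on different threads at once, which is exactly when `order` would need the protection discussed in the race-condition section.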
Please feel free to ask any questions. I would love to read your comments.💚💚💚💚