Understanding the Pros and Cons of Concurrency

Concurrency in software engineering is the execution of multiple sequences of instructions at the same time. From the operating system's perspective, it appears as multiple process threads running in parallel, with those threads communicating with each other by passing messages or through shared memory.

Even though concurrency facilitates better performance and utilisation of resources, the extensive sharing of system resources can introduce a range of errors and problems.

However, even with these drawbacks, concurrency in an OS allows multiple applications to run simultaneously, which usually makes up for the potential optimisation and allocation errors. Concurrency can also produce challenging situations such as deadlocks, where sub-systems or units wait for resources to become free or for other units to finish.

Let us understand why concurrency is important and what benefits it offers, while also touching upon the problems we can face with it.


Principles of Concurrency

When discussing the principles of concurrency, we must first look at a few examples. Overlapped and interleaved processes, for instance, can both be identified as forms of concurrency. Overlapping in operating systems refers to I/O devices operating in parallel with, and thus overlapping, other CPU functions. This leads to better utilisation of the CPU and provides seamless data transfer.

Interleaving, on the other hand, greatly improves the performance of storage and data-access processes. It does this by allowing sequentially accessed data to be arranged in sectors that are non-sequential in nature. We can now see concurrency all around us, especially with most systems having shifted to multi-core processors that support parallel processing.

This fundamentally allows multiple processes or threads (which access the same sector, the same declared variable or the same region of memory) to run at the same time. It even enables actions such as reading and writing the same file, or using the same unit for two different purposes. However, we cannot predict when each process will complete, so we must use scheduling algorithms to estimate the time taken by concurrent processes and then decide on an execution order.
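The shared-variable hazard described above can be sketched in Python: two or more threads updating the same counter must serialise their read-modify-write steps (here with a `threading.Lock`), or updates can be lost. A minimal illustrative sketch, with thread and iteration counts chosen arbitrarily:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Add 1 to the shared counter n times."""
    global counter
    for _ in range(n):
        with lock:          # serialise the read-modify-write sequence
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 — deterministic because the lock serialises updates
```

Without the lock, the final value would depend on how the threads happened to interleave, which is exactly the unpredictability the paragraph above describes.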

For instance, I/O devices can perform two-way functions where each direction requires a different amount of time. If this turnaround time (TAT) can be determined, we can prepare the operating system or program to carry out a function that relies on both in the most effective way possible. Take printers, for example, where input and output speeds differ vastly; a well-defined buffer is required to carry out both sides of the transfer simultaneously. With proper planning, one can avoid common concurrency issues.
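The printer scenario can be sketched as a bounded buffer: a fast producer and a slower consumer communicate through a fixed-size queue, so neither side has to match the other's speed. A minimal Python sketch (the job names, buffer size and job count are all illustrative):

```python
import queue
import threading

buffer = queue.Queue(maxsize=8)   # bounded buffer smooths the speed mismatch

def producer():
    """Fast input side: blocks automatically when the buffer is full."""
    for i in range(20):
        buffer.put(f"job-{i}")
    buffer.put(None)              # sentinel: no more jobs

printed = []

def printer():
    """Slow output side: drains the buffer at its own pace."""
    while True:
        job = buffer.get()
        if job is None:
            break
        printed.append(job)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=printer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(printed))  # 20
```

Because `queue.Queue` handles its own locking, the producer and consumer never corrupt the shared buffer even though they run concurrently.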

Here are a few factors that determine the time required to finish concurrent processes:

  • The activities the other processes are involved with
  • The operating system and how the OS handles ‘interrupts’, overlapping or resource starvation
  • The scheduling policy of the OS and default prioritisation setups


Advantages of Concurrency

Here are the main advantages of concurrency:

  • It allows multiple applications to run simultaneously, increasing efficiency and the total output of workstations.
  • It promotes better utilisation of resources and allows unused assets or data to be accessed by other applications in an organised manner. It also improves average response time by reducing the waiting time between threads: applications do not need to wait for the active operation or for other applications to finish, because threads pick up free resources whenever the opportunity arises.
  • It also improves the operating system's performance, since separate applications or threads can access different hardware resources simultaneously. This differs from the point above in that it covers the parallel use of different resources as well as the simultaneous use of the same ones. It can also help integrate different resources or applications seamlessly so that the main objective is finished as fast as possible.
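The response-time benefit above is easy to demonstrate: when work is I/O-bound, a thread pool overlaps the waiting periods instead of serialising them. A minimal sketch, with `time.sleep` standing in for a blocking I/O call and the task count chosen arbitrarily:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(task_id):
    """Stand-in for a blocking I/O operation (disk, network, printer)."""
    time.sleep(0.1)
    return task_id

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(fake_io, range(5)))
elapsed = time.perf_counter() - start

print(results)              # the five tasks all complete, in order
print(elapsed < 0.5)        # the five 0.1 s waits overlap instead of summing
```

Run sequentially, the five waits would take roughly 0.5 s; overlapped in the pool, the wall-clock time stays close to a single wait.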


Disadvantages of Concurrency

Here are some drawbacks of concurrency that we should always consider before planning processes:

  • Concurrently running applications must be protected from one another so that they interfere as little as possible.
  • Applications running in parallel must be coordinated, synchronised and scheduled in a highly organised manner, with special attention paid to allocation and execution order.
  • Additional systems must be designed to coordinate the applications.
  • Switching between applications adds complexity and performance overhead to the operating system.
  • Performance can even degrade when too many processes are executed at the same time.

Problems Faced

Here are some common problems that can be seen during concurrency:

  • Massive errors can occur when multiple processes use common variables or change their values, especially when the order of the changes is mixed up.
  • Resources are not allocated optimally without manual intervention, which in turn harms the affected processes; resource starvation is one example.
  • Simultaneous read and write operations, or multiple threads accessing the same sector, can cause numerous problems in running code.
  • It becomes tough to identify errors or reproduce findings, owing to the many combinations of shared components and the different states they generate.
  • When the operating system dedicates a resource to one process, other processes are prevented from using it, which wastes time and puts them on standby.
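The shared-variable problem in the first bullet comes down to non-atomicity: a statement like `counter += 1` compiles to several separate steps, and a thread switch between the load and the store loses an update. One way to see this in Python is to list the bytecode instructions with the standard `dis` module:

```python
import dis

counter = 0

def increment():
    global counter
    counter += 1   # looks atomic, but compiles to load / add / store

# List the opcode names: the load and the store are distinct
# instructions, so a thread can be preempted between them.
ops = [ins.opname for ins in dis.get_instructions(increment)]
print(ops)
```

The exact opcode list varies by Python version, but the separate `LOAD_GLOBAL` and `STORE_GLOBAL` steps are always there, which is why two interleaved increments can produce a total of one.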

Here are the main issues with concurrency:

  • Non-atomic operations can be interrupted by other processes.
  • Outcomes depend on the order in which processes execute, creating race conditions.
  • Resources can remain blocked for long periods while a process waits for input from the terminal or for a resource to become available, which causes major problems for processes that are periodic in nature.
  • Processes may be temporarily unable to access data or obtain a service, which stops them from progressing towards their final objective.
  • Deadlocks can affect multiple processes and allow none of the involved processes to proceed with their operation.
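The deadlock in the last bullet typically arises when two threads acquire the same pair of locks in opposite orders, so each holds one lock and waits forever for the other. A common remedy is a global lock ordering; a minimal Python sketch of that remedy (thread count and names are illustrative):

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(name, results):
    # Both threads take the locks in the SAME order (a, then b).
    # If one thread took b first instead, each thread could end up
    # holding one lock while waiting for the other — a classic deadlock.
    with lock_a:
        with lock_b:
            results.append(name)

results = []
threads = [threading.Thread(target=worker, args=(i, results)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # [0, 1] — both workers completed
```

Because the acquisition order is consistent, neither thread can hold a lock the other needs while waiting, so both always finish.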

Frequently Asked Questions

What is a concurrency example?

There is concurrency in DBMS and concurrency in OS. However, a common example of concurrency would be the read and write operations being run on the same file simultaneously.

What is concurrency in programming?

Concurrency in programming refers to multiple computations running in parallel. In modern programming, concurrency is everywhere, because programming languages run multiple computations or processes on the same processor or across multiple processors. For example, concurrency in Java can be seen when the language is used to coordinate work across multiple computers on a network.

What is concurrency in database?

Concurrency in a database refers to multiple users performing data transactions on the same database at the same time. Fundamentally, this means that the database is accessed and updated concurrently in a multi-user setting.

How do you describe concurrency?

Concurrency can be described as the execution of multiple processes or instructions at the same time.

Why is concurrency needed?

Concurrency is needed so that multiple processes or applications can run at the same time, and so that multiple units, users or applications can share the same hardware or data resources.

What causes concurrency?

Concurrency is caused by simultaneous transactions or processes which lead to common resources being allocated and utilised in operating systems.

What is the goal of concurrency vs parallelism?

Fundamentally, concurrency is multiple processes or computations making progress while using common or shared resources, while parallelism refers to multiple computations executing at the same time, generally without sharing the same assets.

Why is concurrency important?

Concurrency is important because it allows multiple computations to access the same assets through the operating system. It is also important for improving turnaround time (TAT) and overall performance.

Why is concurrency risky sometimes?

Concurrency risks degrading performance through incorrect allocation, resource starvation and other ordering problems.

Key Takeaway

Concurrency is very useful, and even with its shortcomings it is used extensively in modern computing and operating systems. It opens up a world of coordinated process execution, efficient allocation of memory and the ability to run multiple applications on a single system, maximising the efficiency of computers for both basic users and advanced workloads.