Complexity of Algorithm

The term algorithm complexity refers to how many steps an algorithm requires to solve a given problem. It evaluates the order of the number of operations executed by the algorithm as a function of the size of the input data.

To assess the complexity, only the order (an approximation) of the number of operations is considered, rather than the exact step count.

The complexity of an algorithm is represented in O(f) notation, also called asymptotic notation or "Big O" notation. Here f is a function of the size of the input data. The asymptotic complexity O(f) describes the order in which resources such as CPU time and memory are consumed by the algorithm, expressed as a function of the size of the input data.

The complexity can take many forms, such as constant, logarithmic, linear, N*log(N), quadratic, cubic, exponential, etc. In other words, it is the order (constant, logarithmic, linear and so on) of the number of steps needed to complete a particular algorithm. We often informally call the complexity of an algorithm its "running time".
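
As a concrete illustration (a minimal Python sketch; the function and the exact operation counts are illustrative assumptions, not taken from the text above), the function below performs roughly 2*N + 1 elementary operations for an input of size N, and that count is reported simply by its order, O(N):

    def count_even(numbers):
        # One assignment, then for each of the N elements one comparison
        # and possibly one increment: roughly 2*N + 1 elementary operations.
        count = 0
        for x in numbers:        # runs N times
            if x % 2 == 0:       # one comparison per element
                count += 1       # at most one increment per element
        return count

    # The exact count (about 2*N + 1) is approximated by its order: O(N).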

Typical Complexities of an Algorithm

  • Constant Complexity:
    It imposes a complexity of O(1). The algorithm executes a constant number of steps, such as 1, 5 or 10, for solving a given problem; the count of operations is independent of the input data size (see the sketch after this list).
  • Logarithmic Complexity:
    It imposes a complexity of O(log(N)). The algorithm executes on the order of log(N) steps to process N elements, with the logarithm usually taken to base 2.
    For N = 1,000,000, an algorithm with complexity O(log(N)) would take about 20 steps (up to a constant factor). The base of the logarithm does not affect the order of the operation count, so it is usually omitted. Binary search (sketched after this list) is a typical example.
  • Linear Complexity:
    It imposes a complexity of O(N). The algorithm takes about the same number of steps as there are elements, so the number of steps depends linearly on the number of elements. For example, if there exist 500 elements, it will take about 500 steps; for N elements the step count may be N, N/2 or 3*N (a linear sketch follows this list).
  • N*log(N) Complexity:
    It imposes a run time of O(N*log(N)). The algorithm executes on the order of N*log(N) steps on N elements to solve the given problem.
    For 1,000 elements, an N*log(N) algorithm will execute about 10,000 steps, since 1,000 * log2(1,000) is roughly 1,000 * 10. Merge sort (sketched after this list) is a common example.
  • Quadratic Complexity: It imposes a complexity of O(N^2). For an input of size N, the algorithm performs on the order of N^2 operations to solve the problem.
    If N = 100, it will take about 10,000 steps. In other words, whenever the number of operations grows quadratically with the input data size, the algorithm has quadratic complexity. For example, for N elements the step count may be on the order of 3*N^2/2 (see the nested-loop sketch after this list).
  • Cubic Complexity: It imposes a complexity of O(N^3). For an input of size N, the algorithm executes on the order of N^3 steps on N elements to solve the problem.
    For example, if there exist 100 elements, it will execute about 1,000,000 steps (a triple-loop sketch follows this list).
  • Exponential Complexity: It imposes a complexity such as O(2^N) or O(N!). For N elements, the number of operations grows exponentially with the input data size.
    For example, if N = 10, then 2^N is 1,024. Similarly, if N = 20, it is 1,048,576, and if N = 100, it is a 31-digit number. The function N! grows even faster: N = 5 gives 120, N = 10 gives 3,628,800, and so on. A naive recursive sketch with exponential behaviour follows this list.
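
The following short Python sketches illustrate the classes above; the function names and sample data are illustrative assumptions, not part of the original text. Constant complexity: looking up an element of a list by index takes the same number of steps no matter how large the list is.

    def get_first(items):
        # A single index operation: the cost does not depend on len(items).
        return items[0]

    print(get_first([7, 3, 9]))              # 7
    print(get_first(list(range(1000000))))   # 0, still one step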
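
Logarithmic complexity: binary search halves the remaining range at every step, so a sorted list of N elements needs about log2(N) iterations (a sketch that assumes the input list is already sorted).

    def binary_search(sorted_items, target):
        # Each iteration halves the search range, so the loop runs
        # about log2(N) times for N elements.
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1  # not found

    print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3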
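
Linear complexity: summing N elements visits each element exactly once, so the step count grows in direct proportion to N.

    def total(items):
        # One addition per element: about N steps for N elements.
        s = 0
        for x in items:
            s += x
        return s

    print(total([2, 4, 6, 8]))   # 20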
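
N*log(N) complexity: merge sort splits the list in half about log2(N) times and does linear merging work at each level, giving roughly N*log2(N) steps in total (a sketch, not a tuned implementation).

    def merge_sort(items):
        # About log2(N) levels of splitting, with O(N) merge work per level.
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]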
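
Quadratic complexity: comparing every pair of elements needs two nested loops, so for N elements the step count is on the order of N^2 (the pair-counting task is only an illustration).

    def count_duplicate_pairs(items):
        # Nested loops: about N*(N-1)/2 comparisons, i.e. O(N^2).
        pairs = 0
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    pairs += 1
        return pairs

    print(count_duplicate_pairs([1, 2, 1, 2, 1]))   # 4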
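
Cubic complexity: multiplying two N x N matrices with the schoolbook method uses three nested loops, so the step count is on the order of N^3.

    def matrix_multiply(a, b):
        # Three nested loops over N x N matrices: O(N^3) multiplications.
        n = len(a)
        result = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    result[i][j] += a[i][k] * b[k][j]
        return result

    print(matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
    # [[19, 22], [43, 50]]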
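
Exponential complexity: the naive recursive Fibonacci computation makes two recursive calls per step, so the number of calls grows roughly like 2^N (a sketch only; it is far too slow for large N).

    def fib(n):
        # Two recursive calls per level: the call count grows
        # roughly exponentially in n.
        if n <= 1:
            return n
        return fib(n - 1) + fib(n - 2)

    print(fib(10))   # 55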

Since constants do not significantly affect the order of the operation count, it is better to ignore them. Thus, algorithms that take N, N/2 or 3*N operations on the same number of elements are all considered linear and roughly equally efficient.


PERFORMANCE ANALYSIS

What is Performance Analysis of an algorithm?

If we want to go from city "A" to city "B", there can be many ways of doing this: by flight, by bus, by train or by bicycle. Depending on availability and convenience, we choose the one that suits us best. Similarly, in computer science there are often multiple algorithms to solve a problem. When there is more than one algorithm, performance analysis helps us analyse them and select the one that best suits our requirements. The formal definition is as follows...

Performance of an algorithm is the process of making an evaluative judgement about the algorithm.

It can also be defined as follows...


Performance of an algorithm means predicting the resources required by the algorithm to perform its task.

That means that when we have multiple algorithms to solve a problem, we need to select the most suitable one.

We compare algorithms that solve the same problem in order to select the best one. To compare them, we use a set of parameters such as the memory the algorithm requires, its execution speed, how easy it is to understand, how easy it is to implement, etc.

Generally, the performance of an algorithm depends on the following elements...

1.     Does the algorithm provide the exact solution to the problem?

2.     Is it easy to understand?

3.     Is it easy to implement?

4.     How much space (memory) does it require to solve the problem?

5.     How much time does it take to solve the problem? Etc.

When we want to analyse an algorithm, we consider only the space and time required by that particular algorithm and we ignore all the remaining elements.

Based on this information, performance analysis of an algorithm can also be defined as follows...

Performance analysis of an algorithm is the process of calculating space and time required by that algorithm.

Performance analysis of an algorithm is performed by using the following measures...

  • Space required to complete the task of that algorithm (Space Complexity). It includes program space and data space.
  • Time required to complete the task of that algorithm (Time Complexity).
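
As a small illustration of both measures (a sketch; the function is an assumed example, not from the original text), the function below runs in O(N) time and uses only O(1) extra space beyond its input.

    def find_max(items):
        # Time complexity: O(N) - one comparison per element.
        # Space complexity: O(1) extra - only a couple of variables
        # are used, regardless of the input size.
        best = items[0]
        for i in range(1, len(items)):
            if items[i] > best:
                best = items[i]
        return best

    print(find_max([4, 9, 2, 7]))   # 9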
