How to calculate algorithm complexity

Table of contents:

What is the complexity of an algorithm?
The asymptotic complexity of algorithms
Big O notation
Complexity classes
Worst case complexity

What is the complexity of an algorithm?

Defining the complexity of an algorithm is not easy. Intuitively, the complexity of an algorithm indicates how difficult it is to solve the problem the algorithm addresses. But this intuition is not easy to quantify. We will therefore define complexity as a function that, depending on the inputs to a program, gives the consumption of a certain resource.

There are two types of algorithm complexity:

space complexity: quantifies memory usage.

time complexity: quantifies execution speed.
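To make the distinction concrete, here is a minimal sketch in Python (the function names are illustrative, not from any library) of two ways to reverse a list: the first allocates a second list, the second swaps elements in place. Both take linear time; they differ in how much extra memory they consume.

def reverse_copy(items):
    # Builds a second list of the same size: O(n) extra space.
    result = []
    for x in reversed(items):
        result.append(x)
    return result

def reverse_in_place(items):
    # Swaps elements pairwise inside the input: O(1) extra space.
    i, j = 0, len(items) - 1
    while i < j:
        items[i], items[j] = items[j], items[i]
        i += 1
        j -= 1
    return items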

The asymptotic complexity of algorithms

In this section, we will look at an extremely important aspect of an algorithm, namely its performance. Even though the speed of computers is steadily increasing, it will never be fast enough: people want to solve increasingly complex problems, and for many algorithms, doubling the speed of the computer does not mean that you can solve a problem twice as large.

It is therefore important for a computer scientist to choose the fastest algorithm. There is a difficult problem here: the speed of execution of an algorithm depends on the computer (processor, type and amount of memory, organization and size of the cache, and possibly the speed of the disk). How can we compare algorithms from a performance point of view if we do not know the computer on which the algorithm will run?

Moreover, the execution time of an algorithm naturally depends on the size of the problem to be solved. To compare two algorithms from a performance point of view, it would therefore be necessary to specify the size of the problem, and the result of the comparison may vary with that size.

These questions are studied by a branch of computer science that we call complexity theory. We are particularly interested in the concept of asymptotic complexity.

The basic idea is not to measure the execution time of an algorithm directly, but to count the number of elementary operations (additions, subtractions, comparisons, etc.) its execution requires, as a function of the size of the problem. We will call this function the time function of the algorithm; for a problem of size n, we will denote it t(n). Unfortunately, the time function alone does not necessarily let us compare the execution speed of two algorithms, because an elementary operation can run faster or slower depending on the computer in question.
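As a concrete illustration, here is a minimal Python sketch whose elementary operations can be counted by hand (the counts are a simplified cost model, since what counts as an elementary operation is itself a modelling choice):

def sum_list(values):
    total = 0             # 1 assignment
    for v in values:      # n loop iterations
        total = total + v # 1 addition and 1 assignment per iteration
    return total          # 1 return

For an input of size n, this performs roughly t(n) = 2n + 2 elementary operations.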

To get around these problems, computer scientists invented the notion of asymptotic complexity. With asymptotic complexity, we do not give the exact time function, but only an approximation of it. This approximation is given in the form of a well-known function that captures the general shape of the time function.
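Returning to the sketch above: for t(n) = 2n + 2, the constant terms matter less and less as n grows, and t(n) ≤ 4n for all n ≥ 1. The general shape of the time function is therefore linear, and we approximate it by the well-known function n.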

Big O notation

Big O notation is a way of describing the execution time of algorithms.

The version of the notation on which the computer industry has agreed is a customized version of the mathematical asymptotic comparison, in effect a fusion of Big Theta and Big O.
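For reference, the strict mathematical definition is the following: f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀, i.e. g eventually bounds f from above, up to a constant factor.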

Algorithmic complexity is the execution time of an algorithm measured in terms of how it grows with the size of the input data. We write O(n), where n is a variable representing the size of the input data. It is possible to have several inputs; depending on the logic of the algorithm, one would then write, for example, O(ab) or O(a + b), as the sketch below shows.
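A minimal Python sketch (illustrative function names) of where those two forms come from: nested loops over two inputs multiply their sizes, while sequential loops add them.

def count_pairs(a_items, b_items):
    # Nested loops: every element of a_items is paired with every
    # element of b_items, so a * b iterations in total: O(ab).
    count = 0
    for a in a_items:
        for b in b_items:
            count += 1
    return count

def count_all(a_items, b_items):
    # Sequential loops: a iterations followed by b iterations: O(a + b).
    count = 0
    for a in a_items:
        count += 1
    for b in b_items:
        count += 1
    return count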

This execution time can be represented on a graph where the x-axis is the size of the input data, and the y-axis is time.

Complexity classes

O              Complexity type
O(1)           constant
O(log(n))      logarithmic
O(n)           linear
O(n×log(n))    quasi-linear
O(n²)          quadratic
O(n³)          cubic
O(2ⁿ)          exponential
O(n!)          factorial
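A few of these classes as minimal Python sketches (simplified to show only the dominant loop structure):

def get_first(items):
    # O(1): one step, independent of the input size.
    return items[0]

def find_max(items):
    # O(n): a single pass over the input.
    best = items[0]
    for x in items:
        if x > best:
            best = x
    return best

def has_duplicate(items):
    # O(n²): compares every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def subsets(items):
    # O(2ⁿ): every element is either included or excluded.
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]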

Worst case complexity

Worst-case complexity measures the time or space complexity of an algorithm over the worst possible execution. It is expressed as a function of the size of the algorithm's input. Implicitly, we seek to build algorithms that run using the fewest possible resources, so the worst-case complexity is an upper bound on the resources the algorithm requires.

For example, the worst-case time complexity corresponds to the longest execution time the algorithm can have, and a finite bound on it guarantees that the algorithm terminates.
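A classic illustration is linear search, sketched below in Python. In the best case the target sits in the first position and one comparison suffices; in the worst case the target is absent and all n elements are examined, so the worst-case time complexity is O(n).

def linear_search(items, target):
    # Worst case: target is absent, so every element is examined -> O(n).
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1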

The worst-case complexity makes it possible to compare the efficiency of two algorithms.
