What is Big O?
This is such an important concept. Big O time is the language and metric we use to describe the efficiency of algorithms. Mastering it is essential both for interviews and for your day-to-day work as a programmer.
Let's imagine
You've got a file on a hard drive, and you need to send it to a friend who lives in a different city as fast as possible. How should you send it?
Most people's first thought would be email, FTP, or some other form of electronic transfer. That's a reasonable answer, but not the whole story.
If it's a small file, electronic transfer is certainly the way to go; it could take 5–10 hours to get to an airport, hop on a flight, and then deliver the file to your friend in person.
But what if the file is really large? Is it possible that it's faster to physically deliver it via plane?
Yes, actually, it is. A 1 TB file could take more than a day to transfer electronically. It would be much faster to send it by plane.
And what if there were no flights or cars available? Then electronic transfer would be the only option, no matter how long it takes.
Time complexity
This is the concept of asymptotic runtime or big O time.
- Electronic Transfer: O(s), where s is the size of the file. As the file size increases, the transfer time increases linearly.
- Airplane Transfer: O(1). As the file size increases, the delivery time doesn't change; it is the same for files of any size.
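A quick back-of-the-envelope sketch makes this concrete. The bandwidth and flight time below are made-up numbers, just for illustration:

```javascript
// Rough sketch: transfer time as a function of file size.
// The link speed and flight time here are assumed values, not real data.
const MBPS = 100;           // assumed link speed: 100 megabits per second
const FLIGHT_HOURS = 10;    // assumed door-to-door delivery time by plane

// O(s): time grows linearly with the file size s (in gigabytes).
function electronicHours(sizeGB) {
  const megabits = sizeGB * 8 * 1000; // GB -> megabits
  return megabits / MBPS / 3600;      // seconds -> hours
}

// O(1): the plane takes the same time no matter the file size.
function airplaneHours(sizeGB) {
  return FLIGHT_HOURS;
}

console.log(electronicHours(1).toFixed(2));    // small file: a fraction of an hour
console.log(electronicHours(1000).toFixed(2)); // 1 TB: roughly a day
console.log(airplaneHours(1000));              // still 10 hours
```

With these assumed numbers, the crossover happens somewhere in the hundreds of gigabytes: below it, the wire wins; above it, the plane wins.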
Some common runtimes are O(1), O(log N), O(N), O(N log N), O(N²), and O(2^N).
In academia, people use big O, big Θ (theta), and big Ω (omega) to describe runtimes.
In industry (and in interviews), people tend to use only big O.
Best Case, Worst Case, and Expected Case
We can describe the runtime of an algorithm in three different ways.
The classic example is quicksort. Quicksort picks a pivot, partitions the array so that smaller elements come before it and larger ones after, and then recursively sorts the two sides.
Best case: If all elements are equal, a typical quicksort just traverses the array once per level without doing much work. This is O(N).
Expected case: Usually the pivot splits the array reasonably evenly, and we expect a runtime of O(N log N).
Worst case: What if, unluckily, the pivot is always the largest remaining element? This can happen on an already-sorted array when the last element is chosen as the pivot. The recursion then shrinks by only one element at a time, and the runtime is O(N²).
We rarely discuss best case time complexity, because it's not a very useful concept: almost any algorithm can hit a best case on some special input.
Note that there is no particular relationship between best/worst/expected case and big O/big Θ/big Ω.
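To make the quicksort example concrete, here is a minimal sketch (last element as pivot; deliberately simple, not an optimized in-place implementation):

```javascript
// Minimal quicksort sketch: pivot on the last element, partition,
// recurse on each side. On random input the splits are roughly even
// (O(N log N)); on already-sorted input the pivot is always the
// largest element, so the recursion shrinks by one each time (O(N²)).
function quicksort(arr) {
  if (arr.length <= 1) return arr;
  const pivot = arr[arr.length - 1];
  const rest = arr.slice(0, -1);
  const smaller = rest.filter(x => x <= pivot);
  const larger = rest.filter(x => x > pivot);
  return [...quicksort(smaller), pivot, ...quicksort(larger)];
}

console.log(quicksort([3, 1, 4, 1, 5])); // [1, 1, 3, 4, 5]
```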
Space Complexity
Time is not the only thing that matters in an algorithm; the amount of memory (space) it needs matters too. We won't go deep into space complexity here.
If we need to create an array of size n, this will require O(N) space. If the array is two-dimensional (n × n), it will require O(N * N), i.e. O(N²), space.
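Stack space from recursion counts too. A short sketch contrasting O(N) space (an allocated array, or N stacked recursive calls) with O(1) extra space (a plain loop):

```javascript
// O(N) space: the array itself holds N numbers.
function makeArray(n) {
  return new Array(n).fill(0);
}

// Also O(N) space, but on the call stack:
// each recursive call adds a stack frame, and N of them pile up.
function sumTo(n) {
  if (n <= 0) return 0;
  return n + sumTo(n - 1);
}

// O(1) extra space: the loop reuses the same two variables.
function sumToIterative(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}
```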
Drop the Constants
It is very possible for O(N) code to run faster than O(1) code for specific inputs; big O just describes the rate of increase. For this reason, we drop the constants in a runtime: an algorithm that one might describe as O(2N) is actually just O(N).
Examples :
- O (500) = O (1)
- O (13 N²) = O (N²)
- O ( N + 10) = O (N)
- O (1000N + 50) = O (N)
- O (N² + 5N + 8) = O (N²)
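A common illustration of why O(2N) is just O(N): finding both the minimum and maximum of an array in two passes versus one. The two-pass version does roughly 2N comparisons and the one-pass version roughly N, but both are O(N):

```javascript
// Two separate passes over the array (~2N comparisons) — still O(N).
function minMaxTwoPass(arr) {
  let min = arr[0], max = arr[0];
  for (const x of arr) if (x < min) min = x; // pass 1
  for (const x of arr) if (x > max) max = x; // pass 2
  return [min, max];
}

// One pass, checking both in the same loop (~N iterations) — also O(N).
function minMaxOnePass(arr) {
  let min = arr[0], max = arr[0];
  for (const x of arr) {
    if (x < min) min = x;
    if (x > max) max = x;
  }
  return [min, max];
}
```

Whether one is faster in practice depends on the details; big O treats them as the same.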
Always remember:
Arithmetic operations are constants.
e.g. C = A + B
O (1)
Variable assignments are constant time.
e.g. int a = 2;
O (1)
Accessing an element in an array or an object is constant time.
In a loop, the complexity is the length of the loop times the complexity of whatever happens inside the loop.
for (let i = 0; i < N; i++) {
    console.log(i);
}
Time complexity : O (N)
Drop the Non-dominant Terms
What do you do about an expression such as O(N² + N)? That second N isn't exactly a constant, but it's not especially important either.
We already said that we drop constants: O(N² + N²), for example, would be O(N²). And if we don't care about that second N² term, why would we care about the much smaller N?
You should drop the non-dominant terms.
O (N² + N) becomes O (N²).
O (N + log N) becomes O (N).
O(5*2^N + 1000N¹⁰⁰) becomes O (2^N).
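In code, dropping the non-dominant term looks like this: a nested O(N²) loop followed by a single O(N) pass does N² + N steps in total, but the whole function is simply O(N²):

```javascript
// N² + N steps in total, which is O(N²): the nested loop dominates.
function pairAndSingle(arr) {
  let count = 0;
  // O(N²) part: visit every ordered pair (i, j).
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      count++;
    }
  }
  // O(N) part: one extra pass — dropped as non-dominant.
  for (let i = 0; i < arr.length; i++) {
    count++;
  }
  return count; // exactly N*N + N
}

console.log(pairAndSingle([1, 2, 3])); // 12  (9 + 3)
```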
Multi-Part Algorithms
Suppose we have an algorithm that has two steps. When do we add the runtimes, and when do we multiply them?
Add the runtime: O (A + B)
for (int a : arrA) {
    print(a);
}
for (int b : arrB) {
    print(b);
}
Multiply the runtimes: O (A * B)
for (int a : arrA) {
    for (int b : arrB) {
        print(a + "," + b);
    }
}
Tips:
- If the algorithm is in the form "do this, then, when you're all done, do that," then you add the runtimes.
- If the algorithm is in the form "do this for each time you do that," then you multiply the runtimes.
Log N Runtimes
We commonly see O (log N) in runtimes. Where does this come from?
When you see a problem where the number of elements in the problem space gets halved (or doubled) each time, that will likely be an O(log N) runtime.
e.g.
for (int i = 1; i < N; i = i * 2) {
    // do something
}
Here i doubles on every iteration (note it must start at 1, not 0, or it would never grow):
1, 2, 4, 8, ... so after k iterations, i = 2^k.
The loop stops when 2^k >= N, i.e. when k = log₂ N. That is O(log N) iterations.
The same holds when we halve instead of double:
for (int i = N; i > 1; i = i / 2) {
    // do something
}
Here assume N = 16. On each iteration the value is divided by 2:
16 → 8 → 4 → 2 → 1
That is 4 halvings, and log₂ 16 = 4, so again O(log N).
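The classic algorithm with this halving pattern is binary search on a sorted array; each comparison discards half of the remaining search space, giving O(log N):

```javascript
// Binary search on a sorted array: the search space [lo, hi] is halved
// on every iteration, so at most about log₂(N) steps are needed.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found: return its index
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3
```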
More learning resources online:
https://www.geeksforgeeks.org/understanding-time-complexity-simple-examples/