Understanding Algorithm Complexity: A Deep Dive into Big O Notation With JavaScript
Hi there!! It's been a while since I published an article, due to events, lectures, work and so on.
But I'll be coming back slowly! And, as I always ask: how are you? All good? Everything at peace? I hope you are well!
Today I want to present you with some quite different content. It can be a little complex at first, but these questions come up frequently in interviews (especially at big tech companies): algorithm complexity analysis!
This is such an important concept! If you've never covered algorithm complexity analysis or big O in an academic setting, rest assured: I'll try to address it in a simple way that you can understand. But I'm not going to delve too deeply into the topic, as that would take an entire college year.
Big O
Big O is the language and the metric we use to describe the efficiency of algorithms. Sometimes we need to describe whether our algorithm is efficient, and this analysis matters whenever we care about performance or want to evaluate how an algorithm behaves as its input grows.
Imagine the following scenario: you've got a file on a hard drive and you need to send it to your friend who lives across the country. You need to get the file to your friend as fast as possible. How should you send it?
Some people will suggest: FTP? Email? Google Drive?
If it's a small file, you are certainly right. It would probably take 5-10 hours to get to an airport, catch a plane and hand-deliver it to your friend.
Buuut what if the file were really large, for example, 1TB? The file could take more than a day to transfer electronically. It would probably be much faster to just fly it across the country.
This is what the concept of asymptotic runtime, or big O time, means.
O(1) would be the plane delivery: it is O(1) with respect to the size of the file, because the time is constant.
O(n) would be the electronic transfer, where n is the size of the file. This means that the time to transfer the file increases linearly with the size of the file.
[Chart: the green line is O(n), the blue line is O(1).]
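To make the difference concrete, here is a minimal JavaScript sketch (the function names are my own, purely for illustration): one function does a constant amount of work no matter how large the array is, while the other has to visit every element.

// O(1): constant time, regardless of the array size
function getFirstItem(array) {
  return array[0];
}

// O(n): linear time, the work grows with the size of the array
function sumItems(array) {
  let total = 0;
  for (const item of array) {
    total += item;
  }
  return total;
}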
There are many more runtimes than this. Some of the most common ones are O(log n), O(n * log n), O(n), O(n^2) and O(2^n).
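One of these deserves a quick illustration: O(log n) usually shows up when an algorithm halves the problem at every step. Here is a minimal binary search sketch of my own, assuming the input array is already sorted.

// O(log n): each iteration halves the search range
function binarySearch(sortedArray, target) {
  let low = 0;
  let high = sortedArray.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArray[mid] === target) return mid;
    if (sortedArray[mid] < target) low = mid + 1;
    else high = mid - 1;
  }
  return -1; // not found
}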
Best Case, Worst Case and Expected Case
We can actually describe our runtime for an algorithm in three different ways:
Imagine the quick sort algorithm:
function quickSort(arr) {
  // Base case: arrays with 0 or 1 elements are already sorted
  if (arr.length <= 1) {
    return arr;
  }

  // Pick the last element as the pivot
  const pivot = arr[arr.length - 1];
  const left = [];
  const right = [];

  // Partition: elements smaller than the pivot go left, the rest go right
  for (let i = 0; i < arr.length - 1; i++) {
    if (arr[i] < pivot) {
      left.push(arr[i]);
    } else {
      right.push(arr[i]);
    }
  }

  // Recursively sort both partitions and join them around the pivot
  return [...quickSort(left), pivot, ...quickSort(right)];
}
For now, don't be scared of the recursion. Quick sort picks an element as a "pivot" (this implementation uses the last element) and then partitions the values of the array so that elements less than the pivot appear before elements greater than or equal to the pivot. This gives a "partial sort". Then it recursively sorts the left and right sides using the same process.
Best Case: if all elements are equal, then (depending on the partitioning implementation) quick sort will, on average, just traverse through the array once. This is O(n).
Worst Case: what if we get really unlucky and the pivot is repeatedly the biggest element in the array? In this case, our recursion doesn't divide the array in half and recurse on each half. It just shrinks the subarray by one element. This degenerates to an O(n^2) runtime.
Expected Case: usually neither of these extremes happens. We can expect the pivot to split the array reasonably well on average, which gives a runtime of O(n log n).
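With the quickSort implementation above, which always picks the last element as the pivot, an already sorted array hits exactly that worst case, because every partition leaves one side empty. A quick way to see both situations (these example calls are my own):

// Both calls sort correctly, but the second one degenerates to O(n^2),
// because the last-element pivot never splits the sorted input evenly.
console.log(quickSort([3, 1, 4, 1, 5, 9, 2, 6])); // unordered input: closer to the expected case
console.log(quickSort([1, 2, 3, 4, 5, 6, 7, 8])); // already sorted: worst case for this pivot choice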
Space Complexity
Time is not the only thing that matters in an algorithm. We might also care about the amount of memory, or space, it requires. Imagine working with low-memory devices, like IoT hardware.
Space complexity is a parallel concept to time complexity. If we need to create an array of size n, this will require O(n) space. If we need a two-dimensional n x n array (a matrix), this will require O(n^2) space.
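In JavaScript, that could look like this minimal sketch (the function names are just for illustration):

// O(n) space: a single array with n slots
function createList(n) {
  return new Array(n).fill(0);
}

// O(n^2) space: an n x n matrix holds n * n values
function createMatrix(n) {
  return Array.from({ length: n }, () => new Array(n).fill(0));
}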
You have probably already seen these errors 👀:
- Stack overflow
- Memory overflow
Stack space used by recursive calls counts, too:
function sum(n) {
  // Each recursive call adds a new frame to the call stack
  if (n <= 0) {
    return 0;
  }
  return n + sum(n - 1);
}
Code like this takes O(n) time and O(n) space: the recursion goes n levels deep, and each of those calls stays on the stack until the innermost one returns.
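For comparison, an iterative version of the same sum (a sketch of my own) still takes O(n) time but only O(1) extra space, because no stack of recursive calls builds up. Calling the recursive version with a very large n, such as sum(1e6), can even trigger the "Maximum call stack size exceeded" error in many JavaScript engines.

// O(n) time, O(1) extra space: a single loop, no recursive stack frames
function sumIterative(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}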
Recursive Runtimes
You have probably already heard about the Fibonacci sequence in mathematics.
The Fibonacci sequence is a series of numbers where each number is the sum of the two preceding ones. It usually starts with 0 and 1. The sequence looks like this:
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...
Now the algorithm:
function fibonacci(n) {
  // Base cases: fibonacci(0) = 0 and fibonacci(1) = 1
  if (n <= 1) {
    return n;
  }
  // Each call branches into two more recursive calls
  return fibonacci(n - 1) + fibonacci(n - 2);
}
A lot of people will, for some reason, see the two calls to fibonacci and jump to O(n^2). This is completely wrong. The runtime is roughly O(2^n): the recursion tree is about n levels deep and every call branches into two more calls, so the number of calls grows exponentially with n.
The space complexity of this algorithm will be O(n). Although we have O(2^n) nodes in the tree total, only O(n) exist at any given time.
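If you want to convince yourself, a small experiment (a sketch of my own, using a hypothetical counter helper) is to count how many calls happen as n grows:

// Counts how many times the recursive fibonacci above would be invoked for a given n
function countFibonacciCalls(n) {
  let calls = 0;
  function fib(k) {
    calls++;
    if (k <= 1) return k;
    return fib(k - 1) + fib(k - 2);
  }
  fib(n);
  return calls;
}

console.log(countFibonacciCalls(10)); // 177
console.log(countFibonacciCalls(20)); // 21891: the call count explodes exponentially with n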
Now, what is the runtime of the code below?
function foo(array) {
  let sum = 0;
  let product = 1;

  // First pass over the array: O(n)
  array.forEach(item => {
    sum += item;
  });

  // Second pass over the array: another O(n)
  array.forEach(item => {
    product *= item;
  });

  console.log(`${sum}, ${product}`);
}
If you answered O(n), yep, you are right! The fact that we iterate through the array twice doesn't matter: O(2n) is still O(n), because constant factors are dropped.
Now another example:
function printPairs(array) {
  // The inner loop runs n times for each of the n outer iterations, so n * n pairs are printed
  for (let i = 0; i < array.length; i++) {
    for (let j = 0; j < array.length; j++) {
      console.log(`${array[i]} , ${array[j]}`);
    }
  }
}
The runtime is O(n^2).
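A quick sanity check (this example call is my own): for an array of 3 elements, the function prints 3 * 3 = 9 pairs.

printPairs([1, 2, 3]); // prints 9 lines: 3 outer iterations * 3 inner iterations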
So now you are able to understand the concepts of time and space complexity analysis of algorithms! Of course, there are many other concepts I didn't cover here, such as Big Theta, Big Omega, amortized time, trees and more complex recursion.
But this is a hard topic, and now you will understand what is being asked when someone brings up the time or space complexity of an algorithm.
That's all folks! I hope you liked it!
If this helped you, like this post and follow me on my social media!
Linkedin: https://www.linkedin.com/in/kevin-uehara/
Instagram: https://www.instagram.com/uehara_kevin/
Twitter: https://x.com/ueharaDev
Github: https://github.com/kevinuehara
dev.to: https://dev.to/kevin-uehara
Youtube: https://www.youtube.com/@ueharakevin/