You can shortcut 90% of the course by learning this one simple concept:

Big O Notation.

Learning about Big O helped me write faster code, understand data structures, and save my cat from a tree. 🐱

## 📝 Here it is. In one paragraph.

Big O tells you how the cost of your code grows as your data grows.

My shortcut to Big O:

How many times does the computer touch each piece of data?

Lower is always better.

The best cheat sheet out there is here:

## 🦾 How does it work exactly?

Have a list `x`

and know where the item is?

`x[5]`

touches exactly one item, so it's O(1)
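A minimal sketch in Python (the list contents are made up; the name `x` follows the text above):

```python
x = [10, 20, 30, 40, 50, 60]

# Indexing jumps straight to the slot — one touch, O(1),
# no matter how long the list is.
item = x[5]
print(item)  # → 60
```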

`for a in x:`

touches every item once, so it's O(n)
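For example (again with made-up contents), summing the list touches each item exactly once:

```python
x = [10, 20, 30]

# The loop body runs once per item — n touches, O(n).
total = 0
for a in x:
    total += a
print(total)  # → 60
```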

Two nested loops? O(n²)

You can do worse, but O(n²) is normally considered pretty bad already.
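A quick sketch of why nested loops blow up — for every item, the inner loop touches every item again (list contents are illustrative):

```python
x = [1, 2, 3]

# Outer loop: n passes. Inner loop: n touches per pass.
# n * n touches total, so O(n²).
pairs = []
for a in x:
    for b in x:
        pairs.append((a, b))
print(len(pairs))  # → 9, i.e. n * n
```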

## 🧮 Why data structures?

The right structure cuts down on computation cost / reduces Big O!

Looking for the number 123 in a `list()`

?

`[15, 42, 123]`

searching the list takes O(n), because you touch every item.

A `dict()`

instead?

`{15: "a", 42: "b", 123: "c"}`

the hashing algorithm makes the lookup O(1) on average!
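Side by side, using the same values as above (the dict's string values are just placeholders, as in the text):

```python
as_list = [15, 42, 123]
as_dict = {15: "a", 42: "b", 123: "c"}

# O(n): Python walks the list item by item until it finds 123.
print(123 in as_list)  # → True

# O(1) on average: Python hashes 123 and jumps straight to its slot.
print(123 in as_dict)  # → True
```

With three items you won't feel the difference — with three million you will.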

## 🤖 Combining it all

Two nested for loops are O(n²) when they both go through the list once.

O(n) * O(n) => O(n²)

Similarly, when you multiply by a constant O(1) operation, we drop it:

O(1) * O(n) => O(n)

Have multiple O(1) operations chained together?

O(1) * O(1) * O(1) => O(1)

There's no O(3), just O(1) — constant factors are always dropped.
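A small sketch of the O(1) * O(n) rule — a constant-time dict lookup inside a single loop stays O(n) overall (the names `prices` and `basket` are made up for illustration):

```python
prices = {"apple": 3, "pear": 2, "plum": 1}
basket = ["apple", "plum", "apple"]

# One O(n) loop over the basket; each dict lookup inside it is O(1).
# O(1) * O(n) => O(n) total.
total = 0
for fruit in basket:
    total += prices[fruit]
print(total)  # → 7
```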

## ⚛️ Just like Einstein

We have Space and Time.

Most people care about Time complexity. How many times does the computer touch the data?

There is also Space complexity. How much does the data take up in memory?

A `list()` takes up O(n) space. Every copy you make in your code adds another O(n) of memory.
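A quick sketch of that memory cost (the list contents are made up):

```python
x = list(range(1000))

# The list itself is O(n) space.
copy_1 = x[:]       # another O(n)
copy_2 = list(x)    # and another O(n)

# Three full copies of the data now live in memory — still O(n)
# in Big O terms, but real memory use has tripled.
print(len(x) + len(copy_1) + len(copy_2))  # → 3000
```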

## TL;DR

- Learn Big O notation
- O(1) is blazing fast down to O(n²) or worse
- Count how many times the computer touches the items in the data structure — that gives you "n"
- Different data structures have different Big O for different tasks