The algorithms we've discussed in this stage are very well-known, and some job interviewers are going to expect you to know their Big O runtimes. So let's look at them!
Quicksort Run Time (Worst Case)
O(n²)
Quicksort Run Time (Average Case)
O(n log n)
Merge Sort Run Time
O(n log n)
We had to learn a lot of details for each
algorithm we've covered in this course.
0:00
Developers who need to implement their
own algorithms often need to choose
0:04
an algorithm for each and
every problem they need to solve.
0:08
And they often need to discuss their
decisions with other developers.
0:11
Can you imagine needing to describe all
the algorithms at this same level of
0:14
detail all the time?
0:19
You'd spend all your time in
meetings rather than programming.
0:20
That's why Big O Notation was created,
as a way of quickly describing how
0:24
an algorithm performs as the data set
it's working on increases in size.
0:28
Big O Notation lets you quickly compare
several algorithms to choose the best one
0:33
for your problem.
0:37
The algorithms we've discussed in
this course are very well-known.
0:39
Some job interviewers are going to
expect you to know their Big O runtimes.
0:42
So let's look at them.
0:46
Remember that the n in Big O Notation
refers to the number of elements you're
0:48
operating on.
0:52
With selection sort, you need to check
each item on the list to see if it's
0:53
the lowest, so
you can move it over to the sorted list.
0:57
So that's n operations.
0:59
Suppose you're doing selection
sort on a list of 5 items, and
1:02
in this case, n would be 5.
1:06
So that's 5 operations before you
can move an item to the sorted list.
1:08
But with selection sort,
you have to loop over the entire list for
1:11
each item you want to move.
1:15
There are 5 items in the list, and you
have to do 5 comparisons to move each one.
1:17
So it's more like 5 times 5 operations, or
1:21
if we replace 5 with n,
it's n times n, or n squared.
1:25
But wait, you might say, half of that
5 by 5 grid of operations is missing,
1:30
because we're testing one fewer item
in the unsorted list with each pass.
1:34
So isn't it more like,
1/2 times n times n?
1:38
And this is true, we are not doing
a full n squared operations.
1:42
But remember, in Big O Notation,
1:46
as the value of n gets really big,
constants like 1/2 become insignificant.
1:48
And so we discard them.
1:53
The Big O runtime of selection sort is
widely recognized as being O(n squared).
1:55
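As a rough sketch (not necessarily the course's exact code; the names here are ours), a selection sort in Python makes those nested n-by-n passes visible:

    def selection_sort(values):
        unsorted = list(values)
        sorted_values = []
        while unsorted:  # n passes: one per item moved to the sorted list
            # One full scan of the unsorted list to find the minimum:
            # this inner loop is what costs roughly n operations per pass.
            index_of_min = 0
            for i in range(len(unsorted)):
                if unsorted[i] < unsorted[index_of_min]:
                    index_of_min = i
            sorted_values.append(unsorted.pop(index_of_min))
        return sorted_values

n passes times roughly n comparisons per pass is where the n squared comes from.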
Quicksort requires one operation for
each element of the list it's sorting.
2:02
It needs to select a pivot first, and then
it needs to sort elements into lists that
2:06
are less than or greater than the pivot.
2:10
So that's n operations.
2:12
To put that another way, if you have
a list of 8 items, then n is 8.
2:14
So it will take 8 operations to
split the list around the pivot.
2:18
But of course, the list isn't sorted after
splitting it around the pivot just once.
2:22
You have to repeat those n
operations several times.
2:26
In the best case you'll pick a pivot
that's right in the middle of the list,
2:29
so that you're dividing
the list exactly in half.
2:33
Then you keep dividing
the list in half until you
2:36
have a list with a length of one.
2:38
The number of times you need
to divide n in half until
2:40
you reach one is expressed as log n.
2:44
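For example, a list of 8 items can be halved 3 times (8, then 4, then 2, then 1), and log base 2 of 8 is 3. A quick sanity check in Python (purely illustrative):

    import math

    n = 8
    halvings = 0
    while n > 1:
        n //= 2        # divide the list size in half
        halvings += 1
    print(halvings)        # 3
    print(math.log2(8))    # 3.0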
So you need to repeat n sorting
operations log n times.
2:47
That leaves us with a best-case
runtime for Quicksort of O(n log n).
2:51
But that's the best case.
2:57
What about the worst case?
2:59
Well, if you pick the wrong pivot,
3:00
you won't be dividing
the list exactly in half.
3:02
If you pick a really bad pivot,
3:04
the next recursive call to Quicksort
will only reduce the list length by 1.
3:06
Since our Quicksort function simply
picks the first item to use as a pivot,
3:10
we can make it pick the worst
possible pivot repeatedly,
3:14
simply by giving it a list
that's sorted in reverse order.
3:18
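To make that concrete, here's a minimal quicksort sketch that, like the course's function, uses the first item as the pivot (an illustration with our own variable names; it may differ from the course's exact code):

    def quicksort(values):
        if len(values) <= 1:
            return values  # base case: 0 or 1 items are already sorted
        pivot = values[0]  # first item as pivot: bad on reverse-sorted input
        less_than_pivot = [v for v in values[1:] if v <= pivot]
        greater_than_pivot = [v for v in values[1:] if v > pivot]
        return quicksort(less_than_pivot) + [pivot] + quicksort(greater_than_pivot)

Called on the reverse-sorted list [5, 4, 3, 2, 1], every remaining element lands in less_than_pivot, so each recursive call shrinks the list by only one item.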
If we pick the worst possible pivot every
time, we'll have to split the list once
3:21
for every item it contains, and
then do n sorting operations on it.
3:26
You already know another sorting algorithm
that only manages to reduce the list by
3:30
one element with each pass:
selection sort.
3:35
Selection sort has
a runtime of O(n squared).
3:37
And in the worst case,
that's the runtime for Quicksort as well.
3:40
So which do we consider when trying
to decide whether to use Quicksort,
3:44
the best case or the worst case?
3:48
Well, as long as your implementation
doesn't just pick the first item as
3:50
a pivot, which we did so
we could demonstrate this issue,
3:54
it turns out that on average, Quicksort
performs closer to the best case.
3:57
Many Quicksort implementations accomplish
this simply by picking a pivot at random
4:01
on each recursive loop.
4:06
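A sketch of that variation, assuming Python's standard random module (again illustrative, not necessarily the course's exact code):

    import random

    def quicksort(values):
        if len(values) <= 1:
            return values
        # A random pivot means a reverse-sorted input no longer
        # triggers the worst case on every recursive call.
        pivot_index = random.randrange(len(values))
        pivot = values[pivot_index]
        rest = values[:pivot_index] + values[pivot_index + 1:]
        less_than_pivot = [v for v in rest if v <= pivot]
        greater_than_pivot = [v for v in rest if v > pivot]
        return quicksort(less_than_pivot) + [pivot] + quicksort(greater_than_pivot)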
Here we are sorting our reverse sorted
data again, but this time we picked pivots
4:08
at random, which reduces the number
of recursive operations needed.
4:12
Sure, random pivots sometimes
give you the best case and
4:17
sometimes you'll randomly
get the worst case.
4:19
But it all averages out over multiple
calls to the Quicksort function.
4:21
Now, with Merge Sort,
there is no pivot to pick.
4:26
Your list of n items always gets
divided in half log n times.
4:28
That means Merge Sort always has
a Big O runtime of O(n log n).
4:33
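A typical merge sort sketch shows why (illustrative; not necessarily the course's exact code):

    def merge_sort(values):
        if len(values) <= 1:
            return values
        # No pivot to choose: always split exactly in half,
        # so the list is divided log n times no matter what.
        middle = len(values) // 2
        left = merge_sort(values[:middle])
        right = merge_sort(values[middle:])
        return merge(left, right)

    def merge(left, right):
        merged = []
        i = j = 0
        # Merging touches every element once: n work per level of splitting.
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged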
Contrast that with Quicksort,
4:38
which only has a runtime of
O(n log n) in the best case.
4:40
In the worst case,
Quicksort's runtime is O(n squared).
4:43
And yet, out in the real world, Quicksort
is more commonly used than Merge Sort.
4:47
Now, why is that, if Quicksort's Big O
runtime can sometimes be worse than
4:51
Merge Sort's?
4:55
This is one of those situations where
Big O Notation doesn't tell you
4:56
the whole story.
5:00
All Big O can tell you is the number
of times an operation is performed.
5:01
It doesn't describe how
long that operation takes.
5:05
And the operation Merge Sort
performs repeatedly
5:08
takes longer than the operation
Quicksort performs repeatedly.
5:11
Big O is a useful tool for
quickly describing how the runtime
5:15
of an algorithm increases as the data set
it's operating on gets really, really big.
5:18
But you can't always choose between two
algorithms based just on their Big O
5:23
runtimes.
5:27
Sometimes there is additional info
you need to know about an algorithm
5:28
to make a good decision.
5:31
Now that we can sort a list of items,
5:33
we're well on our way to being able
to search a list efficiently as well.
5:35
We'll look at how to do
that in the next stage.
5:39