Uploaded by lcc0612 on 01.02.2010

Transcript:

Hello YouTube, and welcome to Sorting Algorithms!

Our sorting algorithm for the day is "Quicksort".

This algorithm is a liiitle bit difficult to grasp, so do pay attention!

Oh, and there are a couple of variations of Quicksort going under the same name

and working in almost the same way.

I'll explain the one I understand best.

Quicksort works on the idea that a value, any value, is chosen as a "pivot".

Quicksort then runs once, and at the end of the pass, the pivot is put in place

and generally, two lists are created around it.

When I say the pivot value is "in place", I do mean that it has been put into its correct position

and by "correct" I mean that the slot it occupies

will be its actual position when the entire list is sorted.

There is something else to mention about the two sublists created on either side of the pivot

Each list is sorted with respect to the other.

That means no value smaller than the pivot will find itself on the right of the pivot

just like how no value larger than the pivot will ever be on the left.
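This partitioning invariant is easy to check directly. As a small sketch with a hypothetical list, suppose a pass with pivot value 4 has just finished:

```python
# Hypothetical state of a list after one quicksort pass with pivot value 4.
after_pass = [2, 1, 3, 4, 7, 6, 5]

p = after_pass.index(4)  # position of the pivot after the pass

# No value larger than the pivot sits on its left...
assert all(x < 4 for x in after_pass[:p])
# ...and no value smaller than the pivot sits on its right.
assert all(x > 4 for x in after_pass[p + 1:])
```

Note that the two sublists themselves are not yet sorted; they are only placed on the correct side of the pivot.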

Now, how exactly does quicksort do this?

Well, it's a pretty long story, so I'll run the demonstration while explaining.

Now, there are three pointers involved in quicksort

The "Left" pointer, the "Right" pointer, as well as one that identifies the Pivot.

If you're not a programming person and you're not familiar with the term "Pointer"

you can think of it in this context as simply a tag that can move independently of data items

though the "Pivot" pointer always sticks to a particular item in a single pass.

So, we start off by placing "Left" and "Right" on the two extreme ends of our unsorted list.

Even though any random value can be chosen as a pivot, our method uses the leftmost value.

The right pointer moves towards the left.

Any time the "Left" value is larger than the "Right" value, the two are swapped.

Because the "Pivot" pointer must stay with the same item throughout a pass, it moves over to the right along with its value.

Notice that we are only swapping the two values. The left and right pointers remain as they are.

Now, notice that before this, we were moving the right pointer towards the left.

Now, after we have made a swap, we will have to move the left pointer towards the right instead.

Confused? You can think of it this way

The pivot has to be stuck to either the left or the right pointer. So, whichever pointer has the pivot stuck to it cannot move.

Instead, the other pointer must move towards the pivot.

The right pointer is moving towards the left again.

The entire process repeats.

If you don't get the picture please rewind and take another look.

The concept does take getting used to.

So when does the pass end?

Well, with the two pointers moving towards each other, they're eventually going to meet.

Now once left and right pointers meet, that seals the pivot in place.

Yes, that's right. The pivot item is now in its correct position

Any items to its left are smaller than it, while the items to its right are larger

To be clear, we will color the items that have been put in place blue.

Now, Quicksort needs to continue, and it utilizes a technique called "Recursion" to do this.

Basically, the Quicksort algorithm needs to run itself. Yes, the session of quicksort that is running will pause itself,

and create a new session of quicksort to carry out further sorting.

This new session of quicksort can then do the same, and spawn further copies of quicksort.

What happens at this point of the algorithm is that once an item is put in place, Quicksort calls itself twice

to sort the left and right sublists separately

That's right, we start the process over, but this time with smaller lists.

Quicksort always works on the left sublist first.

If that left sublist creates more left sublists, those will be sorted as soon as they are encountered.

Only when there are no left sublists does Quicksort begin to consider right sublists.

And that's it. Quicksort continues to do this until the list is sorted.

Note that sometimes, a sublist of one item is created, like right now, with "1"

It is automatically considered sorted, since the left and right pointers end up on the same item

fulfilling the finishing condition for the algorithm.
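The whole procedure described above can be sketched in code. This is only one way to write the variant the video demonstrates (leftmost value as pivot, the pivot pointer travelling with its value on every swap); the names `partition` and `quicksort` are my own, not from the video.

```python
def partition(items, lo, hi):
    """One pass: seal the leftmost value (the pivot) into its final slot."""
    pivot = lo                 # the pivot starts stuck to the left pointer
    left, right = lo, hi
    while left < right:
        # Whenever the left value is larger than the right value, swap them.
        if items[left] > items[right]:
            items[left], items[right] = items[right], items[left]
            # The pivot stays with its value, so it jumps to the other pointer.
            pivot = right if pivot == left else left
        # The pointer the pivot is stuck to cannot move;
        # the other pointer moves towards it.
        if pivot == left:
            right -= 1
        else:
            left += 1
    return pivot               # left and right have met: the pivot is in place


def quicksort(items, lo=0, hi=None):
    """Recursively sort in place: left sublist first, then the right one."""
    if hi is None:
        hi = len(items) - 1
    if lo >= hi:               # zero or one item: automatically sorted
        return
    p = partition(items, lo, hi)
    quicksort(items, lo, p - 1)   # left sublist first
    quicksort(items, p + 1, hi)   # only then the right sublist


data = [3, 7, 1, 4, 2, 6, 5]
quicksort(data)
print(data)  # [1, 2, 3, 4, 5, 6, 7]
```

The `lo >= hi` check is the single-item finishing condition just mentioned: a sublist of one (or zero) items is considered sorted without any work.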

That's all to be said for now. Let the high-speed demonstration commence!

So, it's time to consider the efficiency of the algorithm in Big O notation.

Quicksort is an efficient algorithm because it uses a divide and conquer strategy

The list is repeatedly split, ideally into halves, and each part is sorted.

Many algorithms that work on this principle, including Quicksort, have an average time complexity of O(n log n).

Yes, that's log for logarithm, to base 2 (though inside Big O notation, the base of the logarithm makes no difference).
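To get a feel for how much better n log n is than n², here is a quick comparison of the two growth rates for a few list sizes:

```python
import math

# How n * log2(n) compares with n * n as the list grows.
for n in (10, 1_000, 1_000_000):
    print(n, round(n * math.log2(n)), n * n)
```

For a million items, n log n is on the order of twenty million steps, while n² is a trillion.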

Now, right before we go, there is actually one more quirk here.

The O(n log n) time complexity is the average case one.

So how does Quicksort fare in the worst case?

Well, you thought you'd never see this appear on your screen again, but Quicksort is actually O(n²) in the worst case.

Yes, that's right. Quicksort is pretty efficient for normal data

but if you tell quicksort to sort a list that's in the inverse of the order you actually want (or, with a leftmost pivot, one that's already sorted)

it'll take roughly n² time, because it can't split the list into two evenly sized sublists: each pass seals only one item in place.
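The arithmetic behind that worst case is straightforward. If every pass seals only one item, the pass lengths shrink by one each time, so the comparison counts are roughly n-1, n-2, ..., 1. A quick sketch of the total:

```python
# Worst case: each pass compares against every remaining item, so the
# comparison counts per pass are n-1, n-2, ..., 1.
n = 1000
worst_case_comparisons = sum(range(1, n))    # 1 + 2 + ... + (n - 1)
print(worst_case_comparisons)                # 499500, close to n*n/2 = 500000
```

That sum works out to n(n-1)/2, which is why the worst case is quoted as O(n²).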

That's all for this episode of Sorting Algorithms.

Thanks for bearing with me through this extremely lengthy episode

Please hang around, the next episode will be a bonus episode, where I quicksort in real life!

You're watching 0612 TV.
