Introduction to Algorithms. Lecture 10. Prof. Constantinos Daskalakis. CLRS.


6.006 Introduction to Algorithms. Lecture 10. Prof. Constantinos Daskalakis. CLRS 8.1-8.4.

Menu
- Show that Θ(n lg n) is the best possible running time for a sorting algorithm.
- Design an algorithm that sorts in Θ(n) time.
- Hint: maybe the models are different?

Comparison sort All the sorting algorithms we have seen so far are comparison sorts: they use only comparisons to determine the relative order of elements. E.g., merge sort, heapsort. The best running time that we've seen for comparison sorting is O(n lg n). Is O(n lg n) the best we can do? Decision trees can help us answer this question.

Decision-tree A recipe for sorting n numbers a_1, a_2, …, a_n:
- Nodes are suggested comparisons: i:j means compare a_i to a_j, for i, j ∈ {1, 2, …, n}.
- Branching direction depends on the outcome of the comparison.
- Leaves are labeled with the permutations corresponding to the outcome of the sorting.
(The figure shows the tree for n = 3: the root is 1:2, its left child is 2:3 and its right child is 1:3. On the left, 2:3 has leaf 123 and a node 1:3 with leaves 132 and 312; on the right, 1:3 has leaf 213 and a node 2:3 with leaves 231 and 321.)

Decision-tree example Sort ⟨a_1, a_2, a_3⟩ = ⟨9, 4, 6⟩. Each internal node is labeled i:j for i, j ∈ {1, 2, …, n}. The left subtree shows the subsequent comparisons if a_i ≤ a_j; the right subtree shows the subsequent comparisons if a_i > a_j.
- At node 1:2, compare a_1 = 9 to a_2 = 4: since 9 > 4, branch right.
- At node 1:3, compare a_1 = 9 to a_3 = 6: since 9 > 6, branch right.
- At node 2:3, compare a_2 = 4 to a_3 = 6: since 4 ≤ 6, branch left.
We arrive at the leaf ⟨2, 3, 1⟩. Each leaf contains a permutation ⟨π(1), π(2), …, π(n)⟩ to indicate that the ordering a_π(1) ≤ a_π(2) ≤ … ≤ a_π(n) has been established; here a_2 ≤ a_3 ≤ a_1, i.e., 4 ≤ 6 ≤ 9.
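The n = 3 tree traced above can be written out directly as nested comparisons. A minimal Python sketch (the function name sort3 is my own), where each branch corresponds to one node of the tree:

```python
def sort3(a):
    """Sort a 3-element list by walking the n = 3 decision tree."""
    a1, a2, a3 = a
    if a1 <= a2:                      # node 1:2, left subtree
        if a2 <= a3:                  # node 2:3
            return [a1, a2, a3]       # leaf <1,2,3>
        elif a1 <= a3:                # node 1:3
            return [a1, a3, a2]       # leaf <1,3,2>
        else:
            return [a3, a1, a2]       # leaf <3,1,2>
    else:                             # node 1:2, right subtree
        if a1 <= a3:                  # node 1:3
            return [a2, a1, a3]       # leaf <2,1,3>
        elif a2 <= a3:                # node 2:3
            return [a2, a3, a1]       # leaf <2,3,1>
        else:
            return [a3, a2, a1]       # leaf <3,2,1>
```

On the slide's input, sort3([9, 4, 6]) follows the path 1:2 → 1:3 → 2:3 and returns [4, 6, 9], i.e., it lands on leaf ⟨2, 3, 1⟩.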

Decision-tree model A decision tree can model the execution of any comparison sort:
- One tree for each input size n.
- A path from the root to a leaf represents a trace of comparisons that the algorithm may perform.
- The running time of the algorithm = the length of the path taken.
- Worst-case running time = height of the tree.

Lower bound for decision-tree sorting
Theorem. Any decision tree that can sort n elements must have height Ω(n lg n).
Proof. (Hint: how many leaves are there?)
The tree must contain at least n! leaves, since there are n! possible permutations. A binary tree of height h has at most 2^h leaves. Thus
2^h ≥ n!
h ≥ lg(n!)          (lg is monotonically increasing)
  ≥ lg((n/e)^n)     (Stirling's formula)
  = n lg n − n lg e
  = Ω(n lg n).
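The inequality chain can be sanity-checked numerically. A quick sketch (my own, using the standard-library lgamma function to compute lg(n!) without overflow) confirming lg(n!) ≥ n lg n − n lg e for a few values of n:

```python
import math

def lg_factorial(n):
    """lg(n!) via the log-gamma function: lgamma(n + 1) = ln(n!)."""
    return math.lgamma(n + 1) / math.log(2)

# Stirling-style lower bound used in the proof: lg(n!) >= n lg n - n lg e.
for n in [4, 16, 256, 4096]:
    assert lg_factorial(n) >= n * math.log2(n) - n * math.log2(math.e)
```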

Sorting in linear time Counting sort: no comparisons between elements.
Input: A[1 .. n], where A[j] ∈ {1, 2, …, k}.
Output: B[1 .. n], a sorted permutation of A.
Auxiliary storage: C[1 .. k].

Counting sort
for i ← 1 to k                      ⊳ initialize C
    do C[i] ← 0
for j ← 1 to n                      ⊳ store in C the frequencies of the different
    do C[A[j]] ← C[A[j]] + 1        ⊳ keys in A, i.e. C[i] = |{j : A[j] = i}|
for i ← 2 to k                      ⊳ now C contains the cumulative frequencies of
    do C[i] ← C[i] + C[i − 1]       ⊳ the keys in A, i.e. C[i] = |{j : A[j] ≤ i}|
for j ← n downto 1                  ⊳ using the cumulative frequencies,
    do B[C[A[j]]] ← A[j]            ⊳ build the sorted permutation
       C[A[j]] ← C[A[j]] − 1
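A direct Python transcription of the pseudocode may make the index bookkeeping easier to follow (0-indexed lists replace the 1-indexed arrays; the function name counting_sort is mine):

```python
def counting_sort(A, k):
    """Stable counting sort of A, whose keys lie in {1, ..., k}."""
    n = len(A)
    C = [0] * (k + 1)                 # C[1..k]; C[0] is unused padding
    for j in range(n):                # loop 2: count frequencies
        C[A[j]] += 1
    for i in range(2, k + 1):         # loop 3: cumulative frequencies
        C[i] += C[i - 1]
    B = [None] * n
    for j in range(n - 1, -1, -1):    # loop 4: scan A right to left
        B[C[A[j]] - 1] = A[j]         # 1-based position -> 0-based index
        C[A[j]] -= 1
    return B
```

For the slide's example, counting_sort([4, 1, 3, 4, 3], 4) returns [1, 3, 3, 4, 4].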

Counting-sort example (n = 5, k = 4)
A: 4 1 3 4 3
C: one index for each possible key stored in A
B: (empty output array)

Loop 1: initialization
for i ← 1 to k do C[i] ← 0
A: 4 1 3 4 3    C: 0 0 0 0    B: (empty)

Loop 2: count frequencies
for j ← 1 to n do C[A[j]] ← C[A[j]] + 1    ⊳ C[i] = |{key = i}|
j = 1 (A[1] = 4): C: 0 0 0 1
j = 2 (A[2] = 1): C: 1 0 0 1
j = 3 (A[3] = 3): C: 1 0 1 1
j = 4 (A[4] = 4): C: 1 0 1 2
j = 5 (A[5] = 3): C: 1 0 2 2

[A parenthesis: a quick finish. With the frequencies C: 1 0 2 2 in hand, we could simply walk through the frequency array and place the appropriate number of copies of each key in the output array: one 1, no 2s, two 3s, two 4s, giving B: 1 3 3 4 4. B is sorted! But it is not stably sorted: the output is rebuilt from the counts, so nothing ties each output key back to a particular input element.]
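The quick finish itself fits in a few lines (a sketch; names are mine). It rebuilds keys from the counts, which is exactly why no stability information survives:

```python
def quick_finish(C):
    """Given C[i] = frequency of key i (C[0] unused), emit each key C[i] times."""
    B = []
    for key, freq in enumerate(C):
        B.extend([key] * freq)
    return B

# Frequencies of keys 1..4 in A = [4, 1, 3, 4, 3]; C[0] is the unused slot.
C = [0, 1, 0, 2, 2]
```

quick_finish(C) yields [1, 3, 3, 4, 4]: sorted, but assembled from counts rather than moved from A.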


Loop 3: cumulative frequencies
for i ← 2 to k do C[i] ← C[i] + C[i − 1]    ⊳ C[i] = |{key ≤ i}|
Starting from C: 1 0 2 2:
i = 2: C': 1 1 2 2
i = 3: C': 1 1 3 2
i = 4: C': 1 1 3 5

Loop 4: permute elements of A
for j ← n downto 1
    do B[C[A[j]]] ← A[j]
       C[A[j]] ← C[A[j]] − 1
Starting from A: 4 1 3 4 3 and C: 1 1 3 5:
j = 5 (A[5] = 3): there are exactly 3 elements ≤ A[5], so place A[5] at B[3]; one 3 is used up, so update the counter. B: _ _ 3 _ _    C: 1 1 2 5
j = 4 (A[4] = 4): there are exactly 5 elements ≤ A[4], so place A[4] at B[5]. B: _ _ 3 _ 4    C: 1 1 2 4
j = 3 (A[3] = 3): place A[3] at B[2]. B: _ 3 3 _ 4    C: 1 1 1 4
j = 2 (A[2] = 1): place A[2] at B[1]. B: 1 3 3 _ 4    C: 0 1 1 4
j = 1 (A[1] = 4): place A[1] at B[4]. B: 1 3 3 4 4    C: 0 1 1 3

Analysis
Θ(k):  for i ← 1 to k do C[i] ← 0
Θ(n):  for j ← 1 to n do C[A[j]] ← C[A[j]] + 1
Θ(k):  for i ← 2 to k do C[i] ← C[i] + C[i − 1]
Θ(n):  for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] − 1
Total: Θ(n + k)

Running time If k = O(n), then counting sort takes Θ(n) time. But sorting takes Ω(n lg n) time! Where's the fallacy? Answer: comparison sorting takes Ω(n lg n) time; counting sort is not a comparison sort. In fact, not a single comparison between elements occurs!

Stable sorting Counting sort is a stable sort: it preserves the input order among equal elements.
A: 4 1 3 4 3  →  B: 1 3 3 4 4
(The two 3s appear in B in the same relative order as in A, and likewise the two 4s.)
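Stability is easiest to see when elements carry something besides the key. The sketch below (the helper name counting_sort_by, the letter tags, and the key-function interface are mine) runs counting sort on tagged records and shows that equal keys retain their input order:

```python
def counting_sort_by(items, key, k):
    """Stable counting sort of items by key(item), key values in {1, ..., k}."""
    C = [0] * (k + 1)
    for it in items:                  # count frequencies of each key
        C[key(it)] += 1
    for i in range(2, k + 1):         # cumulative frequencies
        C[i] += C[i - 1]
    B = [None] * len(items)
    for it in reversed(items):        # right-to-left pass preserves ties' order
        C[key(it)] -= 1
        B[C[key(it)]] = it
    return B

# Same keys as the slide (4 1 3 4 3), each tagged with a letter.
records = [("a", 4), ("b", 1), ("c", 3), ("d", 4), ("e", 3)]
out = counting_sort_by(records, key=lambda r: r[1], k=4)
# "c" stays before "e", and "a" before "d": their input order among equal keys
```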

Radix sort Origin: Herman Hollerith's card-sorting machine for the 1890 U.S. Census. (See Appendix.) Digit-by-digit sort. Hollerith's original (bad) idea: sort on the most-significant digit first. Good idea: sort on the least-significant digit first with an auxiliary stable sort.

Operation of radix sort
input   1s digit   10s digit   100s digit
329     720        720         329
457     355        329         355
657     436        436         436
839     457        839         457
436     657        355         657
720     329        457         720
355     839        657         839
(Each column shows the list after a stable sort on the named digit.)

Correctness of radix sort Induction on the digit position t:
- Assume that the numbers are sorted by their low-order t − 1 digits.
- Sort on digit t with a stable sort.
- Two numbers that differ in digit t are correctly ordered by the sort on digit t.
- Two numbers equal in digit t are kept, by stability, in the same order as in the input, which is the correct order by the inductive hypothesis.
(The figure shows the last pass of the example above: sorting on the 100s digit.)

Runtime analysis of radix sort Assume counting sort is the auxiliary stable sort. Sort n computer words of b bits each. Each word can be viewed as having b/r base-2^r digits. Example: a 32-bit word viewed as four 8-bit digits (r = 8). If each b-bit word is broken into r-bit pieces, each pass of counting sort takes Θ(n + 2^r) time, and there are b/r passes. Setting r = lg n gives Θ(n) time per pass, or Θ(n b / lg n) total.
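Combining the pieces: a sketch of LSD radix sort for nonnegative integers that runs one stable counting-sort pass per r-bit digit (the defaults b = 32, r = 8 mirror the slide's example; the function and parameter names are mine):

```python
def radix_sort(words, b=32, r=8):
    """Sort nonnegative b-bit integers, r bits per digit, least-significant first."""
    mask = (1 << r) - 1
    for shift in range(0, b, r):           # one pass per r-bit digit
        C = [0] * (1 << r)                 # counting sort on this digit
        for w in words:
            C[(w >> shift) & mask] += 1
        for i in range(1, 1 << r):         # cumulative counts
            C[i] += C[i - 1]
        B = [None] * len(words)
        for w in reversed(words):          # right-to-left pass keeps it stable
            d = (w >> shift) & mask
            C[d] -= 1
            B[C[d]] = w
        words = B
    return words
```

On the slide's example, radix_sort([329, 457, 657, 839, 436, 720, 355]) returns [329, 355, 436, 457, 657, 720, 839].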