Remember, "the answer" is only half of it! Also, make sure everyone in your group can explain why.

We will use this Gradescope Online Assignment as our worksheet for in-class work. These problems are marked as being worth points for simplicity, but they will not impact your grade in the course.

Question 2

Question 2.1

Consider the worst-case situation where all elements hash to a single bucket, leading to a LinkedList of length $n$. In the best case, all elements would hash to their own unique buckets, avoiding all collisions.
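
As a concrete (and purely hypothetical) illustration of how the worst case can arise, consider a key type whose hashCode always returns the same value; in a chaining hash map, every such key lands in the same bucket, so the table degenerates into one long chain:

```java
// Hypothetical key class used only to illustrate the worst case:
// because hashCode() is constant, a chaining hash map puts every
// BadKey into the same bucket, forming a single chain of length n.
class BadKey {
    private final String name;

    BadKey(String name) {
        this.name = name;
    }

    @Override
    public int hashCode() {
        return 42;  // constant hash code: every key collides
    }

    @Override
    public boolean equals(Object other) {
        return other instanceof BadKey && ((BadKey) other).name.equals(this.name);
    }
}
```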

Question 2.2

Get, put, and find would all be $O(n)$ in the worst case and $O(1)$ in the best case.
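
To make those runtimes concrete, here is a minimal sketch of a chaining hash map with just get and put. The class, field, and method names are assumptions made up for illustration (this is not the course's actual ChainingHashMap code), but it shows where the $O(n)$ worst case and $O(1)$ best case come from:

```java
import java.util.LinkedList;

// Toy chaining hash map sketch (names are illustrative, not the course's code).
public class TinyChainingMap<K, V> {
    private static class Entry<K, V> {
        final K key;
        V value;
        Entry(K key, V value) { this.key = key; this.value = value; }
    }

    private LinkedList<Entry<K, V>>[] buckets;
    private int size;

    @SuppressWarnings("unchecked")
    public TinyChainingMap(int numBuckets) {
        buckets = new LinkedList[numBuckets];
        for (int i = 0; i < numBuckets; i++) {
            buckets[i] = new LinkedList<>();
        }
    }

    private int indexOf(K key) {
        return Math.abs(key.hashCode() % buckets.length);
    }

    // Worst case: every entry is chained in this one bucket, so the scan is O(n).
    // Best case: the bucket holds at most one entry, so the scan is O(1).
    public V get(K key) {
        for (Entry<K, V> entry : buckets[indexOf(key)]) {
            if (entry.key.equals(key)) {
                return entry.value;
            }
        }
        return null;  // key not present
    }

    public void put(K key, V value) {
        for (Entry<K, V> entry : buckets[indexOf(key)]) {
            if (entry.key.equals(key)) {
                entry.value = value;  // overwrite existing mapping
                return;
            }
        }
        buckets[indexOf(key)].add(new Entry<>(key, value));
        size++;
    }
}
```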

Question 2.3

Even though they have the same best- and worst-case runtimes, we would still probably use ChainingHashMap over ArrayMap, since, as we said, in practice the ChainingHashMap will run much closer to its best case than its worst case.

Question 2.4

Recall that the load factor is our estimate of the average number of elements per bucket:

$\displaystyle\lambda = \frac{\#\text{ elements}}{\#\text{ buckets}}$

Since we want our hash table to have near constant-time lookups, we want to keep this at some constant value like 1 (other constant values are also totally fine). While it technically works to use any of those load factors, since the hash table won't break with any of them, it won't have our optimal constant-time access if we let $\lambda$ scale with $n$.
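
For instance, one common way to keep $\lambda$ constant is to track the element count and double the bucket array whenever $\lambda$ crosses a fixed threshold. This is a hedged sketch continuing the hypothetical TinyChainingMap toy from Question 2.2; the threshold of 1.0 and all names are assumptions, not the course's actual resizing policy:

```java
// Continuing the TinyChainingMap sketch: keep the load factor bounded by a
// constant so the average bucket stays short and lookups stay near O(1).
private static final double MAX_LOAD_FACTOR = 1.0;  // assumed threshold

private double loadFactor() {
    // lambda = (# elements) / (# buckets)
    return (double) size / buckets.length;
}

@SuppressWarnings("unchecked")
private void resizeIfNeeded() {
    if (loadFactor() <= MAX_LOAD_FACTOR) {
        return;  // lambda is still a small constant; nothing to do
    }
    LinkedList<Entry<K, V>>[] oldBuckets = buckets;
    buckets = new LinkedList[2 * oldBuckets.length];
    for (int i = 0; i < buckets.length; i++) {
        buckets[i] = new LinkedList<>();
    }
    // Rehash every existing entry into the larger table so that, on average,
    // each bucket again holds only a constant number of entries.
    for (LinkedList<Entry<K, V>> bucket : oldBuckets) {
        for (Entry<K, V> entry : bucket) {
            buckets[indexOf(entry.key)].add(entry);
        }
    }
}
```

In this sketch, put would call resizeIfNeeded() after inserting a new entry; because the table doubles each time, resizes happen rarely enough that their cost amortizes away.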

Question 3

Question 3.1

Answer diagram: https://s3-us-west-2.amazonaws.com/secure.notion-static.com/6764f7e5-b1e5-467f-b5c9-2fe6ae1071e8/Untitled.png

Question 3.2

1.4

The load factor $\lambda$ is defined as:

$\displaystyle\lambda = \frac{\#\text{ elements}}{\#\text{ buckets}}$

Plugging the number of elements and the number of buckets from the diagram in Question 3.1 into this definition gives $\lambda = 1.4$.
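
As a purely illustrative computation (the counts below are an assumption consistent with the stated answer, not read from the diagram), a table holding 7 elements across 5 buckets would have

$\displaystyle\lambda = \frac{7}{5} = 1.4$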