Remember, "the answer" is only half of it! Also, make sure everyone in your group can explain why.

We will use this Gradescope Online Assignment as our worksheet for in-class work. These problems are set up as being worth points for simplicity on our end, but they will not impact your grade in the course.

Question 2

Question 2.1

$\mathcal{O}(n)$.

Code Modeling: Working from the inside out, we count the (constant) amount of work done on each iteration of the loop, multiply by the $n$ iterations, and add the constant work done outside the loop. This gives a total of roughly $52n + 1$ steps.

Recall that the specific constants 52 and 1 are not going to end up being important! We will throw them away in asymptotic analysis, so it's totally fine if you had slightly different numbers there. Remember, we are a bit hand-wavy about the definition of "one step".

Asymptotic Analysis: We start by identifying the dominating term of $52n + 1$. Since the largest power of $n$ here is 1, we can tell the dominating term is $52n$. We then drop constant factors to report our answer $\mathcal{O}(n)$.
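The original code for this part isn't reproduced in these solutions, but as a purely hypothetical illustration (the method name and body below are invented, not the worksheet's code), a method of this shape would model to something of the form $cn + 1$ steps: a constant amount of work per iteration, times $n$ iterations, plus a constant amount of work outside the loop.

```java
// Hypothetical sketch: one loop doing constant work per iteration
// models to roughly (constant) * n + (constant) steps.
public static int example(int[] data) {
    int total = 0;                       // constant setup work
    for (int i = 0; i < data.length; i++) {
        total += data[i] * 2 + 7;        // some constant number of steps per iteration
    }
    return total;                        // the exact constants get dropped in asymptotic analysis
}
```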

Question 2.2

$\mathcal{O}(1)$.

This one is a bit tough!

Code Modeling: This one is a lot harder to come up with a code model for since the run-time changes pretty drastically depending on $n$. Using our counting rules: if $n < 1000$, the number of steps would be something like $n \cdot n$ (being lazy with constants since we would drop them later); if $n \ge 1000$, it would be about 1 step. This is a perfectly fine result from our code modeling! It's a function that takes $n$ as an input and tells us the number of steps.

Asymptotic Analysis: It's tempting to think that this will be $\mathcal{O}(n^2)$ because of that $n \cdot n$ loop in the if statement. Notice, though, that Big-Oh is a statement about the function as $n$ gets large ("for all $n \ge n_0$"). So the case we should consider is the else branch, which does constant work and is therefore bounded by $\mathcal{O}(1)$.

Note: since Big-Oh is an upper bound, saying the function is $\mathcal{O}(n^2)$ is technically also true whenever $\mathcal{O}(1)$ is true. However, this problem asked for the tightest possible bound, so $\mathcal{O}(1)$ is the better answer.
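As a concrete illustration, a method with the behavior described in this part might look like the following sketch (the name and details are invented here, not taken from the worksheet).

```java
// Hypothetical sketch: quadratic work only for small n, constant work otherwise.
public static int mystery(int n) {
    int count = 0;
    if (n < 1000) {
        // roughly n * n steps, but this branch only ever runs when n < 1000
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;
            }
        }
    } else {
        // a single constant-time step for every large n
        count = n;
    }
    return count;
}
```

Because every $n \ge 1000$ takes the else branch, the quadratic work never happens once $n$ is large, which is why the tight bound is $\mathcal{O}(1)$.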

Question 3

This reasoning does not work because it somewhat contradicts itself. Big-Oh is a statement about a function as $n$ gets large ("for all $n \ge n_0$"). It doesn't work to make a claim about a small, constant-size $n$ in the same breath as a claim about $n$ growing large. The key takeaway is that $\mathcal{O}$ makes statements about behavior as the input size continues to grow.
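One standard way to state the definition makes this explicit: the bound only needs to hold past some cutoff $n_0$, so nothing about small, fixed inputs can factor into a Big-Oh claim.

$$f(n) \text{ is } \mathcal{O}(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0.$$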

Since removing from the front will require looping over all the data, it is more appropriate to say the function is $\mathcal{O}(n)$.
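For instance, if the data is stored in a plain array, removing the front element means shifting every remaining element left by one slot, which is linear in the number of elements. The helper below is a hypothetical sketch (not the assignment's code) of that behavior.

```java
// Hypothetical sketch: removing index 0 from an array of `size` elements
// shifts the remaining size - 1 elements, so it takes O(n) time.
public static int removeFront(int[] elements, int size) {
    int removed = elements[0];
    for (int i = 0; i < size - 1; i++) {
        elements[i] = elements[i + 1];   // shift each element left by one
    }
    return removed;
}
```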