A grammar is formally a tuple $G=(V, \Sigma, R, S)$, where
$V=$ the finite set of variables/non-terminals
$\Sigma=$ the finite alphabet of terminals
$R=$ the finite set of production rules/productions
<aside> ⚒️ $\alpha \overset{*}{\rightarrow} \beta$ means $\alpha$ can be rewritten as $\beta$ by applying a finite number of production rules in succession.
</aside>
$S \in V$ is known as the start variable
Let the grammar $G=(\{S\}, \{0,1\}, R, S)$, where $R$ consists of the productions
$$ S \rightarrow 0S1 \mid \epsilon $$
<aside> 💡 We replaced $S$ with $0S1$ as many times as we wanted (by the rules $R$) and then finished by replacing it with $\epsilon$.
</aside>
And we can say the language of this grammar is
$$ L(G) = \{0^n1^n : n \geq 0\} $$
In general, the language of a grammar $G$ is
$$ L(G) = \{w \in \Sigma^* : S \overset{*}{\rightarrow} w\} $$
<aside> ⚒️ That is, all strings in $\Sigma^*$ which can be derived from $S$ using finitely many applications of the production rules $R$ for $G$.
</aside>
A leftmost derivation is one where each rule is applied to the leftmost non-terminal/variable of the current sentential form; a rightmost derivation applies each rule to the rightmost one.
An example of a leftmost derivation:

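The grammar above has only one variable, so every derivation of it is automatically leftmost; to make the distinction visible, here is a sketch using the hypothetical productions $S \rightarrow SS \mid 0S1 \mid \epsilon$ (my own example, not from the notes), underlining the variable rewritten at each step:

$$ \underline{S} \Rightarrow \underline{S}S \Rightarrow 0\underline{S}1S \Rightarrow 01\underline{S} \Rightarrow 010\underline{S}1 \Rightarrow 0101 $$

At every step the leftmost remaining $S$ is the one replaced; a rightmost derivation of the same string would instead expand the second $S$ of $SS$ first.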