This phase checks whether the learner understands how learning actually happens inside PyTorch, not just how to write code.
Question Description:
Explain what automatic differentiation means in PyTorch and why it is essential for training machine learning models. Focus on how gradients are computed automatically rather than being derived and coded by hand.
Sample Input:
Model parameters need gradients
Sample Output:
PyTorch automatically computes gradients of the loss with respect to the parameters using autograd.
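Illustrative Code (not part of the required answer; the parameter value, input, and squared-error loss below are made up purely to show autograd in action):

import torch

# A single learnable parameter; requires_grad=True tells autograd to track it.
w = torch.tensor([3.0], requires_grad=True)

# A tiny "model": prediction = w * x, compared against a target with squared error.
x = torch.tensor([2.0])
target = torch.tensor([10.0])
loss = (w * x - target) ** 2

# backward() walks the recorded computation graph and fills w.grad with d(loss)/d(w).
loss.backward()
print(w.grad)  # tensor([-16.]) : 2 * (w*x - target) * x = 2 * (6 - 10) * 2

No derivative formula is written by hand; autograd records the forward operations and applies the chain rule when backward() is called.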
requires_grad
Question Description:
Explain why the requires_grad flag is needed in PyTorch tensors and what happens when it is set to True.
Sample Input:
torch.tensor([2.0], requires_grad=True)
Sample Output:
PyTorch starts tracking operations on this tensor so that gradients can be computed during the backward pass.
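Illustrative Code (a minimal sketch; the expression y = x**2 + 3*x is an arbitrary example chosen only to show what the flag changes):

import torch

# With requires_grad=True, autograd records every operation on the tensor.
x = torch.tensor([2.0], requires_grad=True)
y = x ** 2 + 3 * x        # y is now part of a computation graph
print(y.requires_grad)    # True
print(y.grad_fn)          # an AddBackward0 node: the recorded operation

y.backward()              # computes dy/dx = 2*x + 3 = 7 at x = 2
print(x.grad)             # tensor([7.])

# Without requires_grad, no graph is built and gradients cannot be computed.
z = torch.tensor([2.0])
print(z.requires_grad)    # False
print((z ** 2).grad_fn)   # None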