Estimated time to completion: 1.5-2 hours.

General guidelines for completing the assignment:

Feel free to use Stack Overflow, Google, and any other resources to research questions or issues you run into. You are also welcome to use any programming language. The only thing we ask is that you complete this work independently, without help from family or friends. If the assignment requires coding, please include the documentation needed for others to run your code.

When you have completed the assignment, please name the file according to this pattern: [YOUR FULL NAME]-Assignment for [NAME OF THE ROLE THE ASSIGNMENT IS FOR], and upload it to **this dropbox**.

Question 1: MobileNet and Batch-Normalization (1.5-2 hours)

  1. If you didn't know about MobileNet before, please quickly skim the MobileNet paper, which builds the network by factorizing standard convolutions into depthwise-separable convolutions: https://arxiv.org/abs/1704.04861

  2. For this question, you can decide which ML framework (e.g. TensorFlow, PyTorch, Keras) you want to use. Find a MobileNet-v1 model pre-trained on ImageNet classification (or MobileNet-v2 if MobileNet-v1 is not available in the official model zoo) and download the weights.

  3. Given the weights, write a test script that runs image classification on the image below. Find the top 5 labels and their respective probabilities (a sketch covering this step and the previous one appears after this list).

    https://s3-us-west-2.amazonaws.com/secure.notion-static.com/700c66f5-bdbd-4d31-8405-a18a778f606d/1_Whats_New_Hero.png

  4. Print the first three layers of the network and the shapes of the corresponding parameters (weights, biases, ...), and save them to a .txt file (see the second sketch after this list).

  5. Instead of performing separate convolution and batch-normalization steps for each layer, we would like to combine the two so that the convolution itself takes care of batch normalization. Combine the batch-normalization parameters with the convolution-layer weights to calculate the batch-normalized weights and biases for the first convolutional layer, and specify how you obtained the new weights and biases. Flatten the weight and bias tensors and print the first 5 elements of each. (Reference: the batch normalization paper, https://arxiv.org/pdf/1502.03167.pdf.) A folding sketch appears after this list.
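
To illustrate steps 2-3, here is a minimal sketch using PyTorch/torchvision, one possible framework choice (torchvision's official model zoo ships MobileNet-v2 rather than v1). It assumes torchvision ≥ 0.13 for the `MobileNet_V2_Weights` API and network access to fetch the test image; the filename `test_image.png` is arbitrary.

```python
import urllib.request

import torch
from PIL import Image
from torchvision import models

# Download the test image (URL from the assignment).
IMG_URL = ("https://s3-us-west-2.amazonaws.com/secure.notion-static.com/"
           "700c66f5-bdbd-4d31-8405-a18a778f606d/1_Whats_New_Hero.png")
urllib.request.urlretrieve(IMG_URL, "test_image.png")

# Load ImageNet-pretrained MobileNet-v2 together with its matching preprocessing.
weights = models.MobileNet_V2_Weights.IMAGENET1K_V1
model = models.mobilenet_v2(weights=weights)
model.eval()
preprocess = weights.transforms()

img = Image.open("test_image.png").convert("RGB")
batch = preprocess(img).unsqueeze(0)  # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)[0]

# Top-5 labels and their probabilities.
top5 = torch.topk(probs, k=5)
for p, idx in zip(top5.values, top5.indices):
    print(f"{weights.meta['categories'][idx]}: {p.item():.4f}")
```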
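
For step 4, one way to dump the first three layers and their parameter shapes, continuing the torchvision sketch above (what counts as a "layer" — a single module vs. a Conv-BN-ReLU block — is your call, and module names will differ in other frameworks):

```python
# Write the first three feature blocks and their parameter shapes to a text file.
with open("first_three_layers.txt", "w") as f:
    for name, module in list(model.features.named_children())[:3]:
        f.write(f"features.{name}: {module}\n")
        for pname, param in module.named_parameters():
            f.write(f"  {pname}: shape={tuple(param.shape)}\n")
        f.write("\n")
```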
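
For step 5, recall that at inference time batch normalization applies y = gamma * (x - mean) / sqrt(var + eps) + beta per output channel. Since convolution is linear, the scale can be folded into the convolution weights and the shift becomes a bias. A sketch under the same torchvision MobileNet-v2 assumption, where `model.features[0]` is a Conv-BN-ReLU block and the convolution has no bias (other frameworks or MobileNet-v1 may lay the layers out differently):

```python
import torch

# First block of torchvision's MobileNet-v2: [Conv2d (bias=False), BatchNorm2d, ReLU6].
conv = model.features[0][0]
bn = model.features[0][1]

# Folding BN into the convolution:
#   W' = W * gamma / sqrt(var + eps)   (scaled per output channel)
#   b' = beta - gamma * mean / sqrt(var + eps)   (this conv has no original bias)
with torch.no_grad():
    std = torch.sqrt(bn.running_var + bn.eps)
    scale = bn.weight / std                       # gamma / sqrt(var + eps)
    W_folded = conv.weight * scale.reshape(-1, 1, 1, 1)
    b_folded = bn.bias - bn.running_mean * scale

print("first 5 folded weights:", W_folded.flatten()[:5])
print("first 5 folded biases: ", b_folded.flatten()[:5])
```

As a sanity check, a convolution using `W_folded` and `b_folded` should reproduce `bn(conv(x))` in eval mode up to floating-point error.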