Friday, September 11, 2020

Assignment 02 – Course COMPUTER VISION II: APPLICATIONS (PYTHON)

 

Instructions:

 

Improving CNN Training

This assignment is aimed at reinforcing your knowledge about the training process for a CNN. Given below is the task:

 

Situation

We have provided the code for training a CNN using the SGD optimizer. On running the training for 5 epochs, we found that the Training accuracy we get is only 10%.

 

Your Task

Your task is to improve the overall training process so that we get a higher Training accuracy.

You can change any parameter you may find suitable (we have provided some hints in Section 4).

 

Section 4 is the only place where you need to make changes.

 

The distribution of marks is as follows:

 

1. Training Accuracy > 30% - 5 marks

2. Training Accuracy > 50% - 10 marks

3. Training Accuracy > 65% - 15 marks

P.S. We were able to achieve 68% Training accuracy by changing a few things in the current configuration.

 

NOTE: You can also download the notebook and run your experiments on Google Colab or Kaggle, and when you think you have the solution, you can add the code in Section 4 and submit.

The assignment carries 30 marks and you will have a total of 5 attempts. The grading will be done manually.

# 4. TODO

"""
Currently, the model and optimizer are configured such that the model gives very low accuracy (~10%).

Your task is to explore options, by modifying the model and optimizer, to get to more than 65% Training accuracy.

Here are a few hints on what changes can help increase the accuracy in just 5 epochs:

Changing model parameters like the activation type, dropout ratio, etc.
Changing the optimizer
Changing optimizer parameters
"""
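As a concrete illustration, here is a minimal sketch of what a reworked Section 4 could look like. The architecture and the 32x32 RGB input shape are assumptions for illustration, not the course notebook's exact layers; the key changes are ReLU activations, a moderate dropout ratio, and the Adam optimizer in place of plain SGD:

from tensorflow.keras import layers, models, optimizers

# Assumed small CNN for 10-class 32x32 RGB images; the notebook's
# actual layers may differ.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),  # moderate dropout ratio
    layers.Dense(10, activation="softmax"),
])

# Adam with its default learning rate usually converges much faster
# than un-tuned SGD within only 5 epochs.
model.compile(
    optimizer=optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)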

What did I do?

I wrote a list of possible values for each parameter I could change.

Activation = [relu, sigmoid, softmax, softplus, softsign, tanh, selu, elu, exponential]

Dropout_rate = [0.2, 0.5]

Possible_keras_optimizers = [SGD, RMSprop, Adam, Adadelta, Adagrad, Adamax, Nadam, Ftrl]

Optimizers I actually imported and tried = [SGD, Adam, Adagrad, RMSprop]

Learning_rate = [0.001, 0.0001]

Then I made only one change at a time and re-ran the Jupyter notebook. The biggest improvement in accuracy came from changing the optimizer.
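To compare optimizers fairly, each run should start from a freshly built model so no weights carry over between experiments. Here is a sketch of that one-change-at-a-time loop; build_model, x_train, and y_train are hypothetical helpers standing in for the notebook's model-construction code and training data:

from tensorflow.keras import optimizers

# One candidate optimizer per run, everything else held fixed.
candidates = {
    "SGD": lambda: optimizers.SGD(learning_rate=0.001),
    "Adagrad": lambda: optimizers.Adagrad(learning_rate=0.001),
    "RMSprop": lambda: optimizers.RMSprop(learning_rate=0.001),
    "Adam": lambda: optimizers.Adam(learning_rate=0.001),
}

for name, make_opt in candidates.items():
    model = build_model()  # hypothetical helper that rebuilds the same CNN
    model.compile(optimizer=make_opt(),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, epochs=5, verbose=0)
    print(f"{name}: final Training accuracy = {history.history['accuracy'][-1]:.3f}")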


This is my output with the best accuracy.

 
