Why do I see a drop (or jump) in my final validation accuracy when training a deep learning network?

Technical Source
Apr 1, 2022




If the network contains batch normalization layers, the final validation metrics often differ from the validation metrics evaluated during training. This is because the network undergoes a ‘finalization’ step after the last training iteration, in which the batch normalization statistics are recomputed over the entire training data set, whereas during training those statistics are computed from each mini-batch.
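As a concrete illustration, here is a minimal sketch assuming Deep Learning Toolbox and its bundled digits data set; the architecture and training options are placeholder choices, not a prescribed setup:

```matlab
% Minimal sketch, assuming Deep Learning Toolbox and its bundled digits
% data set; the architecture and training options are placeholder choices.
[XTrain, YTrain] = digitTrain4DArrayData;
[XVal,   YVal]   = digitTest4DArrayData;

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer   % statistics are finalized after the last iteration
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs', 5, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationFrequency', 30, ...
    'Plots', 'training-progress', ...
    'Verbose', false);

% The last validation point in the training-progress plot uses mini-batch
% statistics; the accuracy computed below uses the finalized statistics,
% so the two values can differ.
net   = trainNetwork(XTrain, YTrain, layers, options);
YPred = classify(net, XVal);
finalAccuracy = mean(YPred == YVal)
```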

If, in addition to batch normalization layers, the network contains dropout layers, the interaction between these two layer types can aggravate the issue, as described in “Understanding the Disharmony between Dropout and Batch Normalization by Variance Shift” (https://arxiv.org/abs/1801.05134).
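For illustration only, the arrangement that paper analyzes is dropout applied before batch normalization; a sketch of such a layer array (same placeholder architecture as above) might look like:

```matlab
% Illustration only: the ordering the paper analyzes is dropout applied
% before batch normalization. Dropout rescales activations only at
% training time, so the variance seen by the batch normalization layer
% shifts between training and inference ('variance shift').
layersWithDropout = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    dropoutLayer(0.5)         % active during training only
    batchNormalizationLayer   % finalized statistics assume training-time variance
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```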

If one removes the batch normalization (and dropout) layers from the network, the ‘final’ validation accuracy should match the accuracy reported at the last training iteration.
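As a sanity check, one could train the same placeholder architecture with those layers removed; a hedged sketch, again using the bundled digits data set:

```matlab
% Sanity-check sketch: the same placeholder architecture with the batch
% normalization (and dropout) layers removed. With no finalization step,
% the 'final' validation accuracy should match the last-iteration value.
[XTrain, YTrain] = digitTrain4DArrayData;
[XVal,   YVal]   = digitTest4DArrayData;

layersNoBN = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs', 5, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationFrequency', 30, ...
    'Verbose', false);

netNoBN = trainNetwork(XTrain, YTrain, layersNoBN, options);
```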


