Presentation Information

[D-20-14]CNN Model Compression by Distillation using Regions of Interest in the Teacher Model

○Takumi Shirahama1, Keisuke Kameyama1 (1. Univ. Tsukuba)

Keywords:

neural networks, knowledge distillation, transfer learning, image recognition, class activation mapping