Presentation Information
[D-20-14]CNN Model Compression by Distillation using Regions of Interest in the Teacher Model
○Takumi Shirahama1, Keisuke Kameyama1 (1. Univ. Tsukuba)
Keywords:
neural networks, knowledge distillation, transfer learning, image recognition, class activation mapping