MXNet pre-trained model Full ImageNet Network inception-21k.tar.gz
dmlc

inception-21k.tar.gz 125.14MB
Type: Dataset
Tags:

Bibtex:
@article{,
title= {MXNet pre-trained model Full ImageNet Network inception-21k.tar.gz},
keywords= {},
journal= {},
author= {dmlc},
year= {},
url= {https://github.com/dmlc/mxnet-model-gallery/blob/master/imagenet-21k-inception.md},
license= {},
abstract= {# Full ImageNet Network

This model is pretrained on the full ImageNet dataset [1], which contains 14,197,087 images in 21,841 classes. The model is trained with only random-crop and mirror augmentation.

The network is based on the Inception-BN network [2] with added capacity. It runs roughly 2 times slower than the standard Inception-BN network.

We trained this network on a machine with 4 GeForce GTX 980 GPUs. Each training round takes 23 hours; the released model is from round 9.

Training Top-1 accuracy over 21,841 classes: 37.19%

Single image prediction memory requirement: 15MB

ILSVRC2012 Validation Performance:

|        | Over 1,000 classes | Over 21,841 classes |
| ------ | ------------------ | ------------------- |
| Top-1  | 68.3%              | 41.9%               |
| Top-5  | 89.0%              | 69.6%               |
| Top-20 | 96.0%              | 83.6%               |


Note: Directly using the full 21k prediction may lose diversity in the output. You may choose a subset of the 21k classes to make predictions more reasonable.
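One simple way to restrict prediction to a chosen subset, as suggested above, is to zero out the probabilities of all classes outside the subset and renormalize. A minimal NumPy sketch (the 5-class vector and subset indices below are made up for illustration; in practice the vector would have 21,841 entries and the subset would be chosen from `synset.txt`):

```python
import numpy as np

def restrict_to_subset(probs, subset_idx):
    """Keep only the probabilities of the chosen classes and renormalize to sum to 1."""
    mask = np.zeros_like(probs)
    mask[subset_idx] = 1.0
    restricted = probs * mask
    return restricted / restricted.sum()

# Toy example with 5 classes instead of 21,841.
probs = np.array([0.1, 0.4, 0.05, 0.3, 0.15])
subset = [1, 3]
print(restrict_to_subset(probs, subset))
```

The top-k prediction over the restricted vector then only ever returns classes from the chosen subset.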

The compressed file contains:
- ```Inception-symbol.json```: symbolic network
- ```Inception-0009.params```: network parameters
- ```synset.txt```: prediction label/text mapping

There is no mean image file for this model. We use ```mean_r=117```, ```mean_g=117``` and ```mean_b=117``` to normalize the image.
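The per-channel mean subtraction described above can be sketched as follows. This is a hand-rolled NumPy version, assuming the usual convention of an HxWx3 RGB input transposed to a 1x3xHxW batch; the exact resize/crop step that produces the input is omitted:

```python
import numpy as np

MEAN_RGB = 117.0  # mean_r = mean_g = mean_b = 117, as stated above

def preprocess(img):
    """img: HxWx3 uint8 RGB array, already resized/cropped to the network input size.
    Returns a 1x3xHxW float32 batch with the channel means subtracted."""
    x = img.astype(np.float32) - MEAN_RGB  # subtract the constant per-channel mean
    x = x.transpose(2, 0, 1)               # HWC -> CHW
    return x[np.newaxis, :]                # add batch dimension
```

Because all three channel means are the same constant, a single scalar subtraction suffices instead of a mean image file.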

##### Reference:

[1] Deng, Jia, et al. "Imagenet: A large-scale hierarchical image database." *Computer Vision and Pattern Recognition*, 2009. CVPR 2009. IEEE Conference on. IEEE, 2009.

[2] Ioffe, Sergey, and Christian Szegedy. "Batch normalization: Accelerating deep network training by reducing internal covariate shift." *arXiv preprint arXiv:1502.03167* (2015).},
superseded= {},
terms= {}
}
