Notice: This repo is no longer actively maintained. You are very welcome to use it, but I am unable to respond to issues or provide support.
This repo contains a collection of common MatConvNet functions and DagNN layers which are shared across a number of classification and object detection frameworks.
- vl_nnmax - element-wise maximum across tensors
- vl_nnsum - element-wise sum across tensors
- vl_nninterp - a wrapper for bilinear interpolation
- vl_nnslice - slicing along a given dimension
- vl_nnspatialsoftmax - spatial application of the softmax operator
- vl_nnreshape - tensor reshaping
- vl_nnchannelshuffle - channel shuffling (introduced in ShuffleNet)
- vl_nnflatten - flatten along a given dimension
- vl_nnglobalpool - global pooling
- vl_nnsoftmaxt - softmax along a given dimension
- vl_nncrop_wrapper - autonn function wrapper for vl_nncrop.m
- vl_nnaxpy - vector op y <- a*x + y (BLAS Level One style naming convention)
- vl_nngnorm - group normalization (an alternative to batch norm)
- vl_nnhuberloss - computation of the Huber (smooth L1) loss
- vl_nneuclidenaloss - computation of the Euclidean (L2) loss
- vl_nntukeyloss - computation of Tukey's biweight (robust) loss
- vl_nnsoftmaxceloss - soft-target cross-entropy loss (operates on logits)
- vl_nncaffepool - "caffe-style" pooling (applies padding before the pooling kernel)
- vl_nnl2norm - L2 feature normalisation
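As a rough illustration, these functions follow the usual MatConvNet vl_nn* calling convention: a forward call returns the output, and a backward call takes the derivative of the loss with respect to the output as a trailing argument. The snippet below is a minimal sketch only; the exact interfaces (in particular any options accepted by vl_nnglobalpool and vl_nnl2norm) are assumptions, so check each function's help text before relying on it.

```matlab
% Minimal usage sketch (assumes the module has been set up with
% vl_contrib, as described in the install section below).
x = randn(7, 7, 512, 4, 'single') ;   % an HxWxCxN feature map

% forward calls (interfaces assumed to follow the vl_nn* convention)
y = vl_nnglobalpool(x) ;              % global pooling -> 1x1x512x4
f = vl_nnl2norm(x) ;                  % l2-normalised features

% backward pass: pass the derivative w.r.t. the output as an extra argument
dzdy = randn(size(y), 'single') ;
dzdx = vl_nnglobalpool(x, dzdy) ;
```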
mcnExtraLayers requires the following modules:
- autonn - automatic differentiation
The module also contains some additional utilities which may be useful during network training:
- findBestCheckpoint - ranks and prunes the network checkpoints saved during training (useful for automatically saving space at the end of a training run)
- checkLearningParams - compares the learning parameters of an mcn network against a caffe prototxt
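A hypothetical invocation of the checkpoint pruning utility is sketched below. The option names ('priorityMetric', 'prune') are assumptions for illustration, not the documented interface; consult the help text of findBestCheckpoint for the real arguments.

```matlab
% Hypothetical usage sketch -- the option names below are assumptions;
% see "help findBestCheckpoint" for the actual interface.
expDir = fullfile(vl_rootnn, 'data', 'my-experiment') ;  % training directory
findBestCheckpoint(expDir, 'priorityMetric', 'top1error', 'prune', true) ;
```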
The module is easiest to install with the vl_contrib package manager:
```matlab
vl_contrib('install', 'mcnExtraLayers') ;
vl_contrib('setup', 'mcnExtraLayers') ;
vl_contrib('test', 'mcnExtraLayers') ; % optional
```