Quantized Grads

This subproject can be used to emulate training in Python with quantized gradients.

Python

Layers

Right now this package supports the following base layers (see the usage sketch after the list):

  • Batchnorm2d

  • Conv1d

  • Conv2d

  • Linear

  • Sigmoid

  • ReLU
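As an illustration of how such a layer can be emulated, here is a minimal PyTorch sketch; `QuantizedLinear`, `_QuantizeFwdBwd`, and `fixed_point_quantize` are hypothetical names chosen for this example, not this package's actual API.

```python
import torch
import torch.nn as nn


def fixed_point_quantize(x: torch.Tensor, frac_bits: int = 8) -> torch.Tensor:
    """Round-to-nearest fixed-point quantization with frac_bits fractional bits."""
    scale = 2.0 ** frac_bits
    return torch.round(x * scale) / scale


class _QuantizeFwdBwd(torch.autograd.Function):
    """Quantizes activations on the forward pass and gradients on the backward pass."""

    @staticmethod
    def forward(ctx, x):
        return fixed_point_quantize(x)

    @staticmethod
    def backward(ctx, grad_output):
        return fixed_point_quantize(grad_output)


class QuantizedLinear(nn.Linear):
    """nn.Linear whose output and incoming gradient are quantized."""

    def forward(self, x):
        return _QuantizeFwdBwd.apply(super().forward(x))


layer = QuantizedLinear(4, 2)
out = layer(torch.randn(3, 4))
out.sum().backward()  # the gradient flowing back through the layer is quantized
```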

Each quantization scheme can be applied to the forward and backward pass independently. In my opinion, only the following three combinations are valid (a configuration sketch follows the list):

  1. Forward full resolution / Backward full resolution

  2. Forward quantized / Backward full resolution

  3. Forward quantized / Backward quantized
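Because the two passes are configured independently, a layer can, for example, quantize only its forward pass (combination 2). The sketch below shows one way to express this with a custom autograd function; the names and signatures are assumptions for illustration, not this package's API.

```python
import torch


def fixed_point_quantize(x, frac_bits=8):
    """Round-to-nearest fixed-point quantization (as in the sketch above)."""
    scale = 2.0 ** frac_bits
    return torch.round(x * scale) / scale


class _ConfigurableQuant(torch.autograd.Function):
    """Applies independent (possibly absent) quantizers to forward and backward."""

    @staticmethod
    def forward(ctx, x, quant_fwd, quant_bwd):
        ctx.quant_bwd = quant_bwd
        return x if quant_fwd is None else quant_fwd(x)

    @staticmethod
    def backward(ctx, grad_output):
        if ctx.quant_bwd is not None:
            grad_output = ctx.quant_bwd(grad_output)
        # None gradients for the two non-tensor inputs (the quantizer callables)
        return grad_output, None, None


x = torch.randn(5, requires_grad=True)

# Combination 2: forward quantized / backward full resolution
y = _ConfigurableQuant.apply(x, fixed_point_quantize, None)
y.sum().backward()
```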

Quantization

Right now we support the following quantization schemes (illustrated in the sketch after the list):

  • FixedPoint

  • FixedPointStochastic

  • BlockFloatingPoint

  • BlockFloatingPointStochastic
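The sketch below illustrates the idea behind all four schemes, assuming their conventional definitions: fixed point rescales by a single global power of two, block floating point shares one exponent per block of values, and the stochastic variants round up or down with probability proportional to the fractional part. Function names and signatures are illustrative, not this package's API.

```python
import torch
import torch.nn.functional as F


def fixed_point(x, frac_bits=8, stochastic=False):
    """FixedPoint / FixedPointStochastic: one global power-of-two scale."""
    scale = 2.0 ** frac_bits
    scaled = x * scale
    if stochastic:
        # Stochastic rounding: floor(x + u) with u ~ U[0, 1)
        rounded = torch.floor(scaled + torch.rand_like(scaled))
    else:
        rounded = torch.round(scaled)
    return rounded / scale


def block_floating_point(x, mantissa_bits=8, block_size=16, stochastic=False):
    """BlockFloatingPoint / BlockFloatingPointStochastic: shared exponent per block."""
    flat = x.flatten()
    pad = (-flat.numel()) % block_size
    blocks = F.pad(flat, (0, pad)).view(-1, block_size)
    # Shared exponent derived from the largest magnitude in each block.
    max_abs = blocks.abs().amax(dim=1, keepdim=True).clamp_min(1e-30)
    exponent = torch.floor(torch.log2(max_abs))
    scale = 2.0 ** (exponent - (mantissa_bits - 1))
    scaled = blocks / scale
    if stochastic:
        mantissa = torch.floor(scaled + torch.rand_like(scaled))
    else:
        mantissa = torch.round(scaled)
    return (mantissa * scale).flatten()[: x.numel()].view_as(x)


x = torch.randn(4, 8)
fixed_point(x)                             # FixedPoint
fixed_point(x, stochastic=True)            # FixedPointStochastic
block_floating_point(x)                    # BlockFloatingPoint
block_floating_point(x, stochastic=True)   # BlockFloatingPointStochastic
```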