# Quantized Grads
This subproject can be used to emulate training with quantized gradients in Python.
## Python

### Layers
Right now this package supports the following base layers:
- Batchnorm2d
- Conv1d
- Conv2d
- Linear
- Sigmoid
- ReLU
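
The layer names mirror their `torch.nn` counterparts. As a rough sketch of how a quantized forward pass can be emulated (the `QuantizedLinear` name and the `quantize` helper below are illustrative assumptions, not this package's actual API), the parameters are quantized before the standard layer computation:

```python
import torch
import torch.nn.functional as F
from torch import nn


def quantize(x: torch.Tensor, frac_bits: int = 8) -> torch.Tensor:
    """Round x to the nearest fixed-point value with `frac_bits` fractional bits."""
    scale = 2.0 ** frac_bits
    return torch.round(x * scale) / scale


class QuantizedLinear(nn.Linear):
    """Linear layer whose parameters are quantized during the forward pass."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight = quantize(self.weight)
        bias = quantize(self.bias) if self.bias is not None else None
        return F.linear(x, weight, bias)
```

The other layers follow the same pattern: quantize the parameters (and, where desired, the activations) and fall back to the standard computation.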
Quantization can be applied to the forward and backward pass independently. In my opinion, only the following three combinations are valid:
- Forward Fullresolution / Backward Fullresolution
- Forward Quantized / Backward Fullresolution
- Forward Quantized / Backward Quantized
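
This independence can be emulated with a custom `torch.autograd.Function` that handles the two passes separately. A minimal sketch, reusing the `quantize` helper from above; the class name and boolean flags are illustrative, not this package's API:

```python
import torch


def quantize(x: torch.Tensor, frac_bits: int = 8) -> torch.Tensor:
    scale = 2.0 ** frac_bits
    return torch.round(x * scale) / scale


class IndependentQuantize(torch.autograd.Function):
    """Quantizes the activation and/or its gradient, controlled independently."""

    @staticmethod
    def forward(ctx, x, quantize_forward: bool, quantize_backward: bool):
        ctx.quantize_backward = quantize_backward
        return quantize(x) if quantize_forward else x

    @staticmethod
    def backward(ctx, grad_output):
        grad = quantize(grad_output) if ctx.quantize_backward else grad_output
        # No gradients for the two boolean flags.
        return grad, None, None


# Forward Quantized / Backward Fullresolution:
y = IndependentQuantize.apply(torch.randn(4, requires_grad=True), True, False)
```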
### Quantization
Right now we support the following quantization schemes:
- FixedPoint
- FixedPointStochastic
- BlockFloatingPoint
- BlockFloatingPointStochastic
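
As a simplified sketch of the two underlying ideas (function names, signatures, and default bit widths are illustrative, not the package's actual API): fixed point rounds values onto a uniform grid, and the `Stochastic` variants replace deterministic nearest rounding with stochastic rounding, which is unbiased in expectation. Block floating point instead shares one exponent across a whole tensor block:

```python
import torch


def fixed_point_stochastic(x: torch.Tensor, frac_bits: int = 8) -> torch.Tensor:
    """Fixed-point quantization with stochastic rounding: round up with
    probability equal to the fractional remainder."""
    scaled = x * 2.0 ** frac_bits
    floor = torch.floor(scaled)
    round_up = (torch.rand_like(x) < (scaled - floor)).to(x.dtype)
    return (floor + round_up) / 2.0 ** frac_bits


def block_floating_point(x: torch.Tensor, mantissa_bits: int = 8) -> torch.Tensor:
    """Block floating point: every value in the block shares one exponent,
    derived from the largest magnitude in the block."""
    max_abs = x.abs().max()
    if max_abs == 0:
        return x
    shared_exponent = torch.floor(torch.log2(max_abs))
    step = 2.0 ** (shared_exponent - (mantissa_bits - 1))
    return torch.round(x / step) * step
```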