Argmax Softmax
Created by: bo3z
## Description
📝 A new implementation of Softmax activation for Vivado and Quartus.
- Softmax is a monotonically increasing function; therefore, in classification problems, the predicted class is the same as the largest output before Softmax activation (i.e. Softmax preserves the position of the maximum; a quick numerical check follows this list).
- The default implementation remains stable, as we are sometimes interested in the normalized output probabilities.
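
For reference, a minimal NumPy check of this property (illustrative only, not part of the hls4ml code):

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability; this does not change the result.
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

logits = np.array([1.3, -0.2, 4.1, 0.7])
probs = softmax(logits)

# Softmax is monotonically increasing, so the index of the largest logit
# and the index of the largest probability are identical.
assert np.argmax(logits) == np.argmax(probs)
print(np.argmax(logits), probs)
```
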
Two implementations are added:
- ArgMax - returns a one-hot encoded vector. This is set through the hls4ml config, for example hls_config['LayerName']['softmax']['strategy'] = 'argmax' (very similar to the existing stable and latency implementations of Softmax).
- Logits - removes the Softmax layer. Again handled through the hls4ml config, via an optional boolean attribute skip (defaults to False), for example hls_config['LayerName']['softmax']['skip'] = True. An optimizer removes the Softmax node from the model graph and rewires the network. A configuration sketch for both options is shown after this list.
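
A minimal sketch of selecting the two options, assuming a Keras model whose final layer is a Softmax named 'softmax'; the strategy/skip keys follow the description above, and the model and file names are placeholders:

```python
import hls4ml
from tensorflow.keras.models import load_model

# Hypothetical Keras classifier whose final layer is a Softmax named 'softmax'.
model = load_model('my_classifier.h5')

# Generate a per-layer config and pick the new behaviour for the Softmax layer.
# The 'strategy' / 'skip' keys below follow this PR's description.
hls_config = hls4ml.utils.config_from_keras_model(model, granularity='name')
hls_config['LayerName']['softmax']['strategy'] = 'argmax'   # one-hot output
# or, to drop the layer entirely and emit raw logits:
# hls_config['LayerName']['softmax']['skip'] = True

hls_model = hls4ml.converters.convert_from_keras_model(
    model, hls_config=hls_config, output_dir='hls_prj', backend='Vivado'
)
hls_model.compile()
```
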
## Type of change

- [x] New feature (non-breaking change which adds functionality)
## Tests

- Expanded test/pytest/test_softmax.py with the new implementations.
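
A hedged sketch of the kind of parametrized check such a test might perform (the actual test contents may differ; the tiny model, names, and sizes below are illustrative only):

```python
import numpy as np
import pytest
import tensorflow as tf
import hls4ml

@pytest.mark.parametrize('strategy', ['stable', 'argmax'])
def test_softmax_strategy(strategy, tmp_path):
    # Minimal model ending in a Softmax layer named 'softmax'.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(8,)),
        tf.keras.layers.Activation('softmax', name='softmax'),
    ])

    config = hls4ml.utils.config_from_keras_model(model, granularity='name')
    config['LayerName']['softmax']['strategy'] = strategy
    hls_model = hls4ml.converters.convert_from_keras_model(
        model, hls_config=config, output_dir=str(tmp_path), backend='Vivado'
    )
    hls_model.compile()

    x = np.random.rand(100, 8).astype(np.float32)
    y_hls = hls_model.predict(x)
    # Whether the output is normalized probabilities or a one-hot vector,
    # the predicted class should match the Keras reference.
    assert np.array_equal(np.argmax(y_hls, axis=1),
                          np.argmax(model.predict(x), axis=1))
```
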
## Checklist
- [x] I have read the guidelines for contributing.
- [ ] I have commented my code, particularly in hard-to-understand areas.
- [ ] I have made corresponding changes to the documentation.
- [x] My changes generate no new warnings.
- [x] I have added tests that prove my fix is effective or that my feature works.