Multiple layers running in parallel on the same input
Created by: benjaminkreis
@sergojin has a use case for multiple dense layers that run in parallel on the same input and produce multiple outputs.
Two example keras models are here: https://github.com/hls-fpga-machine-learning/hls4ml/tree/multiple_layers/keras-to-hls/fromSergo
A working HLS project, made by hand from the ".5" model, is here: https://github.com/hls-fpga-machine-learning/hls4ml/tree/multiple_layers/keras-to-hls/my-hls-test-modified

In that project, the final layers run in parallel on the output of the previous layer, and their outputs are merged to form the result.
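For illustration only, here is a minimal Keras functional-API sketch of that kind of topology (this is not the actual ".5" model; the layer sizes and names are made up):

```python
# Two Dense layers run in parallel on the same input, and their
# outputs are merged (concatenated) into a single result.
from keras.layers import Input, Dense, concatenate
from keras.models import Model

inputs = Input(shape=(16,))
shared = Dense(32, activation='relu')(inputs)    # common upstream layer
branch_a = Dense(8, activation='relu')(shared)   # parallel branch A
branch_b = Dense(8, activation='relu')(shared)   # parallel branch B
merged = concatenate([branch_a, branch_b])       # merge the parallel outputs
model = Model(inputs=inputs, outputs=merged)

print(model.to_json())  # the JSON that the hls4ml translation would need to parse
```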
What is still needed is the hls4ml translation. Right now we assume the output of each layer is the input to only one layer, with the ordering taken from the order of the layers in the JSON file. @sergojin and @nhanvtran found that we can use the `inbound_nodes` field of the JSON to map the layers to each other.
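As a rough sketch of what that mapping could look like (the file name is illustrative, and the JSON layout is assumed to follow Keras 2's functional-model `to_json()` output, where each `inbound_nodes` entry is a list of `[layer_name, node_index, tensor_index, ...]` items):

```python
import json

# Build a layer -> parent-layers map from inbound_nodes, instead of
# assuming the sequential order of layers in the JSON file.
with open('model.json') as f:  # illustrative path
    arch = json.load(f)

inputs_of = {}
for layer in arch['config']['layers']:
    name = layer['name']
    parents = []
    for node in layer['inbound_nodes']:
        for inbound in node:
            parents.append(inbound[0])  # first entry is the inbound layer's name
    inputs_of[name] = parents

for name, parents in inputs_of.items():
    print(name, '<-', parents)
```

With such a map, two layers that share the same parent can be recognized as parallel branches, and a layer with multiple parents can be recognized as a merge point.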