Compilation failure when `result_t` and `data_t` of `compute_layer` point to the same data type
Created by: benjaminkreis
In this branch I've added a model from Sergo that's failing to compile. If you run the usual `python keras-to-hls.py -c keras-config.yml`, you'll pick up the model and see the error when you try to build the project.

The error is:
```
INFO: [SIM 211-2] *************** CSIM start ***************
INFO: [SIM 211-4] CSIM will launch GCC as the compiler.
Compiling ../../../../myproject_test.cpp in debug mode
Compiling ../../../../firmware/myproject.cpp in debug mode
In file included from ../../../../firmware/parameters.h:7:0,
                 from ../../../../firmware/myproject.cpp:21:
/home/kreis/muon/hls4ml/nnet_utils/nnet_layer.h: In function ‘void nnet::compute_layer(data_T*, res_T*, typename CONFIG_T::weight_t (*)[CONFIG_T::n_out], typename CONFIG_T::bias_t*) [with data_T = ap_fixed<18, 8>, res_T = ap_fixed<18, 8>, CONFIG_T = config4, typename CONFIG_T::weight_t = ap_fixed<18, 8>, typename CONFIG_T::bias_t = ap_fixed<18, 8>]’:
../../../../firmware/myproject.cpp:82:77: instantiated from here
/home/kreis/muon/hls4ml/nnet_utils/nnet_layer.h:100:13: error: invalid use of incomplete type ‘class ap_fixed<18, 8>’
/data/xilinx/Vivado_HLS/2017.2/include/ap_int.h:318:7: error: declaration of ‘class ap_fixed<18, 8>’
/home/kreis/muon/hls4ml/nnet_utils/nnet_layer.h:100: confused by earlier errors, bailing out
make: *** [obj/myproject.o] Error 1
ERROR: [SIM 211-100] 'csim_design' failed: compilation error(s).
INFO: [SIM 211-3] *************** CSIM finish ***************
4
    while executing
"source [lindex $::argv 1] "
    ("uplevel" body line 1)
    invoked from within
"uplevel \#0 { source [lindex $::argv 1] } "
```
I've definitely seen this one before, but I can't remember the previous causes.
In any case, what's special about this model is that there is no activation on the final layer, and it seems that `compute_layer` does not like it when the `data_t` and `result_t` typedefs both point to the same type. If I change `result_t` to `ap_fixed<19,8>`, it works; it also works if I add an activation after the last layer, which is why we haven't seen this before. (And it presumably only matters for the last layer, whose output is `res`?)
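A minimal sketch of the failing condition, assuming the typedefs live in `firmware/parameters.h` as the include chain in the log suggests (the exact file contents are a guess on my part):

```cpp
#include "ap_fixed.h"

// Failing configuration: the final layer has no activation, so its input
// and result typedefs resolve to the same type, and compute_layer is
// instantiated with data_T == res_T.
typedef ap_fixed<18,8> data_t;
typedef ap_fixed<18,8> result_t;  // same type as data_t -> the error above

// Workaround described above: widening result_t by one bit makes it a
// distinct type, and the project compiles.
// typedef ap_fixed<19,8> result_t;
```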