WIP: serial mode for conv1d

Javier Duarte requested to merge bjk/conv_ser into master

Created by: benjaminkreis

Changes similar to #45 for conv1d

I'm seeing that the Latency and Initiation Interval have different minimum and maximum values because of the conv layers:

    +------+------+------+------+---------+
    |   Latency   |   Interval  | Pipeline|
    |  min |  max |  min |  max |   Type  |
    +------+------+------+------+---------+
    |  6600|  8920|  6601|  8921|   none  |
    +------+------+------+------+---------+
        + Detail:
        * Instance: 
        +--------------------------------+---------------------+------+------+------+------+---------+
        |                                |                     |   Latency   |   Interval  | Pipeline|
        |            Instance            |        Module       |  min |  max |  min |  max |   Type  |
        +--------------------------------+---------------------+------+------+------+------+---------+
        |grp_conv_1d_0_0_2_fu_206        |conv_1d_0_0_2        |  3204|  4644|  3204|  4644|   none  |
        |grp_conv_1d_0_0_1_fu_214        |conv_1d_0_0_1        |  1684|  2404|  1684|  2404|   none  |
        |grp_unflatten_1_fu_250          |unflatten_1          |    61|    61|    61|    61|   none  |
        |grp_softmax_fu_274              |softmax              |   202|   202|   202|   202|   none  |
        |grp_conv_1d_0_0_fu_288          |conv_1d_0_0          |   554|   714|   554|   714|   none  |
        |grp_relu_fu_314                 |relu                 |    21|    21|    21|    21|   none  |
        |grp_unflatten_fu_328            |unflatten            |    51|    51|    51|    51|   none  |
        |grp_relu_2_fu_362               |relu_2               |    31|    31|    31|    31|   none  |
        |grp_flatten_1_fu_396            |flatten_1            |    51|    51|    51|    51|   none  |
        |grp_compute_layer_0_0_fu_430    |compute_layer_0_0    |   419|   419|   419|   419|   none  |
        |grp_flatten_fu_448              |flatten              |    11|    11|    11|    11|   none  |
        |grp_compute_layer_0_0_1_fu_462  |compute_layer_0_0_1  |   224|   224|   224|   224|   none  |
        |grp_relu_1_fu_480               |relu_1               |    21|    21|    21|    21|   none  |
        |grp_flatten_2_fu_504            |flatten_2            |    41|    41|    41|    41|   none  |
        +--------------------------------+---------------------+------+------+------+------+---------+

Because of this, I'm keeping this as a separate "work in progress" PR for now.
