Implementation and optimizations linked to Simple-RNN and LSTM for Quartus

Created by: nemerchiedde

This PR explores the integration of two RNN layer types for the Quartus backend. The current implementation integrates the Simple RNN and the LSTM. Data can be passed in two different formats (single-cell and sliding-window); a small sketch of both follows the list:

  • Single-cell: the Simple RNN or LSTM cell is invoked once per time step; the same cell operation is applied to each incoming value, combining the previous state with the new value, and this is repeated until the end of the sequence.

  • Sliding-window: the full sequence is split into overlapping subsequences with a sliding window, which is useful for time-series data acquisition.
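
A minimal sketch of the two input formats, assuming NumPy arrays of shape (time_steps, features); the `cell_fn` callable is a hypothetical stand-in for the actual RNN/LSTM cell, not code from this PR:

```python
import numpy as np

def run_single_cell(x_seq, h0, cell_fn):
    """Single-cell mode: one cell invocation per time step, carrying the
    hidden state forward until the end of the sequence."""
    state = h0
    for x_t in x_seq:                 # x_seq: (time_steps, n_features)
        state = cell_fn(x_t, state)   # same cell operation reused each step
    return state

def sliding_windows(x_seq, window, stride=1):
    """Sliding-window mode: split the full sequence into overlapping
    fixed-length subsequences, each fed to the network as one input."""
    return np.stack([x_seq[i:i + window]
                     for i in range(0, len(x_seq) - window + 1, stride)])

# toy example: 10 time steps, 1 feature
x = np.arange(10, dtype=float).reshape(-1, 1)
print(sliding_windows(x, window=4).shape)   # (7, 4, 1)
```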

The activation functions for the Simple RNN and LSTM are recovered directly from the trained model and automatically applied to the generated RNN, so no manual code changes are necessary. The same holds for the recurrent activation function of the LSTM.
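
A minimal sketch of how the activations can be read back from a trained Keras model (the file name `trained_rnn.h5` is illustrative; this is not necessarily the exact code used in the PR):

```python
from tensorflow.keras.models import load_model

model = load_model("trained_rnn.h5")   # illustrative path

for layer in model.layers:
    cfg = layer.get_config()
    if layer.__class__.__name__ == "SimpleRNN":
        print("SimpleRNN activation:", cfg["activation"])
    elif layer.__class__.__name__ == "LSTM":
        print("LSTM activation:", cfg["activation"])
        print("LSTM recurrent activation:", cfg["recurrent_activation"])
```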

Concerning the LSTM, the weights (kernel, recurrent kernel and bias) have been subdivided into their respective gates, so that the bit width can be chosen independently for each gate.
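
A minimal sketch of the gate splitting, assuming the Keras weight layout where the kernel, recurrent kernel and bias concatenate the i, f, c, o gates along their last axis; the helper name `split_lstm_gates` is hypothetical:

```python
import numpy as np

def split_lstm_gates(kernel, recurrent_kernel, bias, units):
    """Split concatenated LSTM weights into per-gate arrays (Keras stores
    them in i, f, c, o order), so each gate can later be quantized with
    its own bit width."""
    gates = {}
    for idx, name in enumerate(["i", "f", "c", "o"]):
        sl = slice(idx * units, (idx + 1) * units)
        gates[name] = {
            "kernel": kernel[:, sl],
            "recurrent_kernel": recurrent_kernel[:, sl],
            "bias": bias[sl],
        }
    return gates

# toy shapes: 3 input features, 8 LSTM units
units = 8
g = split_lstm_gates(np.zeros((3, 4 * units)),
                     np.zeros((units, 4 * units)),
                     np.zeros(4 * units), units)
print(g["f"]["kernel"].shape)   # (3, 8)
```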
