This repository contains the code for a new kind of RNN model for processing sequential data. The model computes a recurrent weighted average (RWA) over every previous processing step. With this approach, the model can form direct connections anywhere along a sequence. This stands in contrast to traditional RNN architectures, which only use the previous processing step. A detailed description of the RWA model has been published in a manuscript at https://arxiv.org/pdf/1703.01253.pdf.

Because the RWA can be computed as a running average, it does not need to be completely recomputed with each processing step. The numerator and denominator can be saved from the previous step. Consequently, the model scales like other RNN architectures such as the LSTM.
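The running-average update can be sketched as follows. This is an illustrative simplification in NumPy, not the paper's full parameterization: the weight names (`W_u`, `W_g`, `U_g`, `W_a`, `U_a`) and the exact gating are assumptions for the sake of the example, and a production implementation would also subtract a running maximum from the attention term before exponentiating for numerical stability.

```python
import numpy as np

def rwa_step(x, h_prev, num, den, params):
    """One step of a simplified recurrent weighted average (RWA) cell.

    Carrying the running numerator and denominator forward lets the
    weighted average be updated incrementally instead of being
    recomputed over the entire history at every step.
    """
    W_u, W_g, U_g, W_a, U_a = params
    u = x @ W_u                           # feature of the current input
    g = np.tanh(x @ W_g + h_prev @ U_g)   # gate conditioned on input and state
    z = u * g                             # candidate value to be averaged
    a = x @ W_a + h_prev @ U_a            # attention score (log-weight)
    w = np.exp(a)                         # positive weight for this step
    num = num + z * w                     # update running numerator
    den = den + w                         # update running denominator
    h = np.tanh(num / den)                # new state: the weighted average
    return h, num, den
```

Each call costs the same as an ordinary RNN step, which is why the RWA scales like an LSTM despite averaging over the whole sequence.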

In each folder, the RWA model is evaluated on a different task. The performance of the RWA model is compared against an LSTM model. The RWA is found to train faster and/or generalize better on each task. See the above manuscript for additional details about each result.
