The field of artificial intelligence has found wide application and recorded success in diverse facets of life, from finance to medicine. Machine learning is an important aspect of artificial intelligence, and various architectures within it are disrupting the status quo. Recurrent neural networks (RNNs) are an important class of architecture that has proven useful in time-series problems, speech processing, natural language processing, and more. Recurrent neural networks comprise various architectures, and one important and simple example is the echo state network (ESN).
We will review various RNN architectures, including ESNs. ESNs provide both an architecture and a supervised learning principle for recurrent neural networks. In some respects ESNs are preferable to other known RNN architectures: their training is fast, does not suffer from bifurcations, and is very easy to implement. We hope to focus on applications in time-series generation and/or cryptography.
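To make concrete why ESN training is fast and bifurcation-free, the sketch below implements a minimal echo state network in NumPy: the recurrent weights are fixed at random and only the linear readout is fitted, by ridge regression, on a toy sine-prediction task. The reservoir size, spectral radius, washout length, and ridge parameter are illustrative assumptions, not choices taken from this project.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100  # input and reservoir sizes (illustrative)

# Fixed random input and reservoir weights: these are never trained,
# which is what removes gradient-through-time issues such as bifurcations.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
states = run_reservoir(u[:-1])
washout = 100  # drop the initial transient states
X, Y = states[washout:], u[1:][washout:]

# Ridge-regression readout: W_out = Y^T X (X^T X + beta I)^{-1}.
# This single linear solve is the entire training step.
beta = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + beta * np.eye(n_res))

pred = X @ W_out.T
print("train MSE:", np.mean((pred - Y) ** 2))
```

The contrast with gradient-based RNN training is the point of the sketch: because only `W_out` is learned, training reduces to one least-squares problem.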
The aim of this project is to implement an efficient RNN, most likely an ESN, on an FPGA. This will be achieved by using a computationally efficient and effective activation function. Furthermore, the learning algorithm of the echo state network will be investigated in order to further reduce its computational complexity. Finally, the occupied chip area will be reduced by exploring quantization of some echo state network parameters.
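The two hardware-oriented reductions above can be prototyped in software before committing to an FPGA design. The sketch below illustrates one common option for each: a clip-based piecewise-linear tanh approximation (no exponentials, so it maps to a compare-and-clamp in logic) and signed fixed-point weight quantization. The specific approximation and bit widths are assumptions for demonstration, not this project's final choices.

```python
import numpy as np

def hard_tanh(x):
    """Piecewise-linear tanh approximation ("hard tanh"): just a clamp,
    so it needs no exponentials or dividers in hardware (assumed scheme)."""
    return np.clip(x, -1.0, 1.0)

def quantize(w, bits, w_max=1.0):
    """Round weights to signed fixed-point with the given total bit width."""
    scale = (2 ** (bits - 1) - 1) / w_max
    q = np.clip(np.round(w * scale), -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return q / scale

# Worst-case error of the activation approximation over a sample grid.
x = np.linspace(-3, 3, 601)
print("max |tanh - hard_tanh| error:", np.max(np.abs(np.tanh(x) - hard_tanh(x))))

# Quantization error for random reservoir-style weights at several bit widths.
rng = np.random.default_rng(1)
W = rng.uniform(-1, 1, (100, 100))
for bits in (16, 8, 4):
    err = np.max(np.abs(quantize(W, bits) - W))
    print(f"{bits}-bit weights, max quantization error: {err:.5f}")
```

Measuring how such approximations change task accuracy, before synthesis, is one way to pick bit widths that trade area against precision.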
Lab allocations have not been finalised.