Using Deep Reservoir Computing to Solve Sequential Tasks
Grade Level at Time of Presentation
Senior
Major
Data Science
Minor
Honors
Institution
Northern Kentucky University
KY House District #
4
KY Senate District #
4
Faculty Advisor/ Mentor
Dr. Kevin Kirby
Department
Department of Computer Science
Abstract
In recent years, artificial intelligence has been dominated by neural networks. These systems can provide unparalleled accuracy on tasks once thought to be unsolvable, and they are now being applied in technologies such as self-driving cars and conversational software such as Alexa or Siri. Reservoir Computing is a more efficient form of recurrent neural network that can learn to solve hard problems quickly and with minimal computational power. In recent years, however, it was set aside in favor of more sophisticated "deep learning" models as computational speeds increased. In this project, we investigate Reservoir Computing in light of newer technological developments to determine the situations in which it works well, with particular attention to new Deep Reservoirs, which rely on special high-performance computing architectures. Beginning with simple code written in Python to explore these networks, we then scale up to Google's state-of-the-art TensorFlow software. We benchmark Deep Reservoir Computing (DRC) models on sequence data from the life sciences, compare them with results from other popular models such as Gated Recurrent Units and Long Short-Term Memory networks, and characterize the types of problems that are well suited to these efficient DRC models.
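As an illustrative sketch of the Reservoir Computing idea described above (not the project's actual code), the snippet below builds a minimal echo state network in NumPy: a fixed random recurrent reservoir is driven by an input sequence, and only a linear readout is trained. The sizes, scaling constants, and the toy sine-prediction task are hypothetical placeholders.

```python
import numpy as np

# Minimal echo state network sketch: a fixed random recurrent "reservoir"
# whose states are read out by a single trained linear layer.
rng = np.random.default_rng(0)

n_in, n_res = 1, 200                        # input and reservoir sizes (hypothetical)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # keep spectral radius < 1 (echo state property)

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave one step ahead.
t = np.linspace(0, 8 * np.pi, 1000)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # only the readout is trained
pred = X @ W_out
print("training MSE:", np.mean((pred - y) ** 2))
```

Because only the readout weights are fit (here by a single least-squares solve), training avoids iterative backpropagation through time entirely, which is the source of the efficiency the abstract refers to.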
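For the deep-learning baselines named in the abstract, the sketch below shows one way the Gated Recurrent Unit and Long Short-Term Memory comparison models might be assembled in TensorFlow/Keras; the layer sizes, sequence shape, and (commented-out) training call are assumptions for illustration, not the configuration used in the study.

```python
import tensorflow as tf

# Hypothetical baseline setup for the recurrent models that the deep
# reservoir is benchmarked against. Shapes and hyperparameters are placeholders.
seq_len, n_features, n_classes = 100, 8, 2

def make_baseline(cell):
    """Build a small recurrent classifier around a GRU or LSTM layer."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(seq_len, n_features)),
        cell(64),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

gru_model = make_baseline(tf.keras.layers.GRU)
lstm_model = make_baseline(tf.keras.layers.LSTM)

for model in (gru_model, lstm_model):
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=10)
```

Both baselines share the same dense readout, so any difference in accuracy or training cost comes from the recurrent cell itself, mirroring the comparison made against the DRC models.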