MIT & Google Quantum Algorithm Trains Wide and Deep Neural Networks
+ Quantum algorithms for training wide and deep classical neural networks have become one of the most promising research areas for quantum computer applications. While neural networks have achieved state-of-the-art results across many benchmark tasks, existing quantum neural networks have yet to clearly demonstrate quantum speedups for tasks involving classical datasets.
Given deep learning’s ever-rising computational requirements, the use of quantum computers to efficiently train deep neural networks is a research field that could greatly benefit from further exploration.
+ Motivated by the success of classical deep neural networks (DNNs), a team from the Massachusetts Institute of Technology and Google Quantum AI has proposed a quantum algorithm designed to train such networks in logarithmic time. The team provides compelling evidence of their proposed method’s efficiency on the standard MNIST image dataset.
+ The team summarizes their proposed full quantum algorithm's approach with respect to the approximate neural tangent kernel (NTK) as: 1) Assume the existence of a quantum random access memory (QRAM) to store and access any necessary quantum states; 2) Use amplitude estimation and median evaluation to evaluate inner products between data examples to compute NTK elements; 3) Post-select to prepare the NTK between the test data point and the training set.
+ The main contribution of this work is a quantum algorithm designed to train wide and deep neural networks under an approximation of the NTK, estimating the trained neural network output with vanishing error as the size of the training set increases.
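To make the NTK-based prediction concrete, the classical computation that the quantum routine is designed to accelerate can be sketched as kernel regression: build the NTK matrix over the training set from pairwise inner products, then predict the test output as f(x*) = k*ᵀ(K + εI)⁻¹y. The sketch below is illustrative only; the `ntk_relu` kernel (a first-order arc-cosine form associated with infinite-width single-hidden-layer ReLU networks) and all parameter names are assumptions, not the paper's exact approximate NTK.

```python
import numpy as np

def ntk_relu(x1, x2):
    """Illustrative NTK-style kernel: first-order arc-cosine kernel,
    the infinite-width building block for a ReLU network (assumption,
    not the paper's exact approximate NTK)."""
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    cos = np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return n1 * n2 * (np.sin(theta) + (np.pi - theta) * cos) / np.pi

def ntk_predict(X_train, y_train, x_test, reg=1e-6):
    """Kernel-regression prediction f(x*) = k*^T (K + reg*I)^{-1} y.
    The kernel elements are exactly the pairwise inner-product
    quantities that step 2 above estimates on the quantum side."""
    n = len(X_train)
    K = np.array([[ntk_relu(a, b) for b in X_train] for a in X_train])
    k_star = np.array([ntk_relu(x_test, a) for a in X_train])
    alpha = np.linalg.solve(K + reg * np.eye(n), y_train)
    return k_star @ alpha

# Toy usage on synthetic data: labels given by the sign of the
# first coordinate; predicting at a training point should roughly
# recover its label, since small-reg kernel regression interpolates.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))
y = np.sign(X[:, 0])
pred = ntk_predict(X, y, X[0])
```

Classically, building K costs O(n²) kernel evaluations and solving the linear system O(n³); the paper's claimed speedup targets precisely this scaling with the training-set size.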
Source: Synced, MIT & Google Quantum Algorithm Trains Wide and Deep Neural Networks…
Content may have been edited for style and clarity. The “+” to the left of paragraphs or other statements indicates quoted material from “Source:” document. Boldface title is original title from “Source:” Italicized statements are directly quoted from “Source:” document. Image sources are indicated as applicable.