Quantum Computing’s Influence on Artificial Intelligence, Machine Learning
How may quantum computing affect Artificial Intelligence?
Excerpts and salient points ~
+ The processing power required to extract value from the unmanageable swaths of data currently being collected, and especially to apply artificial intelligence techniques such as machine learning, keeps increasing. Researchers have been trying to expedite these processes by applying quantum computing algorithms to artificial intelligence techniques, giving rise in the process to a new discipline dubbed Quantum Machine Learning (QML).
The use of quantum algorithms in artificial intelligence techniques will boost machines' learning abilities. This will lead to improvements in the development of, among other things, prediction systems, including those used in the financial industry. However, we'll have to wait before these improvements start being rolled out.
+ Machine learning and artificial intelligence are the two key areas of research in the application of quantum computing algorithms. One particularity of this computing paradigm is that it allows representing several states at the same time, which is especially convenient when applying AI techniques. For example, as noted by Intel, voice assistants could benefit greatly from this capability, as quantum computing could help improve their accuracy, boosting both their processing power and the amount of data they can handle. Quantum computing increases the number of variables machines can juggle at once, allowing them to provide faster answers, much as a person would.
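The "several states at the same time" property can be illustrated with a small classical simulation in NumPy. This is only a sketch of the underlying linear algebra, not real quantum hardware or any library's API: an n-qubit register is a vector of 2^n amplitudes, and applying a Hadamard gate to each qubit of |0…0⟩ puts the register into an equal superposition over all 2^n basis states. The helper name `equal_superposition` is hypothetical.

```python
import numpy as np

def equal_superposition(n_qubits):
    """Return the state vector after applying a Hadamard to each qubit of |0...0>."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard gate
    single = H @ np.array([1.0, 0.0])              # H|0> = (|0> + |1>)/sqrt(2)
    full = single
    for _ in range(n_qubits - 1):
        full = np.kron(full, single)               # tensor product across qubits
    return full

state = equal_superposition(3)
print(len(state))                        # 8 basis states held at once in one register
print(np.allclose(state, 1 / np.sqrt(8)))  # every amplitude equal: 1/sqrt(8)
```

The register size grows exponentially with the number of qubits, which is exactly why simulating this classically becomes infeasible and why quantum hardware could, in principle, handle far more "calculation variables" at once.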
+ Currently, most industrial applications of artificial intelligence come from so-called supervised learning, used in tasks such as image recognition or consumption forecasting. "In this area, based on the different QML proposals that have already been set forth, it is likely that we'll start seeing acceleration – which, in some cases, could be exponential – in some of the most popular algorithms in the field, such as support vector machines and certain types of neural networks," explains Fernández Lorenzo. A less-trodden path, but one that shows great promise, is unsupervised learning. "Dimensionality reduction algorithms are a particular case. These algorithms are used to represent our original data in a more limited space while preserving most of the properties of the original dataset." On this point, the researcher notes that quantum computing will come in particularly handy for pinpointing certain global properties of a dataset, rather than specific details.
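To make the dimensionality-reduction idea concrete, here is a minimal classical sketch using principal component analysis via NumPy's SVD (the classical analogue of the quantum proposals mentioned above, not a QML algorithm itself). The synthetic data and variable names are illustrative assumptions: five features, two of which are nearly redundant, so three components capture almost all of the dataset's variance.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Make two features near-deterministic combinations of the others,
# so the data effectively lives in a 3-dimensional subspace.
X[:, 3] = 2 * X[:, 0] + 0.01 * rng.normal(size=200)
X[:, 4] = X[:, 1] - X[:, 2] + 0.01 * rng.normal(size=200)

Xc = X - X.mean(axis=0)                 # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3
X_reduced = Xc @ Vt[:k].T               # represent the data in only 3 dimensions
explained = (S[:k] ** 2).sum() / (S ** 2).sum()

print(X_reduced.shape)                  # (200, 3)
print(explained > 0.99)                 # True: >99% of the variance is preserved
```

The reduced representation keeps the global structure (directions of greatest variance) while discarding fine detail, which matches the researcher's point that quantum approaches are expected to excel at recovering global properties of a dataset rather than specific details.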
Content may have been edited for style and clarity.