
Quantum on the money: fintech is banking on the future of computing.

The financial services sector thinks quantum computing can help supercharge its products and profits, but what impact could it have on cyber-risk exposure?

Opinion is divided over when quantum computers will enter the commercial mainstream: IBM reckons four to five years; more conservative forecasts say at least a decade. Whichever timeframe proves correct, the financial services sector is likely to be among the first to adopt quantum tech for line-of-business applications – and to bear the brunt of the challenges that early adopters always face.

Quantum promises pros but also cons, especially with respect to the cyber-security liabilities that its power gives rise to. Quantum computing will "redefine" banking, says Elisabetta Zaccaria, chairman at Secure Chorus, an organisation that promotes multi-stakeholder cooperation in information security. "Yet while quantum computers present an opportunity to solve many challenges, they also create problems for data security," she says.

Analysts do largely agree, however, about the prospective market opportunities. According to BCC Research's Quantum Computing: Technologies and Global Markets, quantum computing's expansion into key industries will boost demand for quantum-scale solutions. The industry anticipates a compound annual growth rate (CAGR) of 37.3 per cent to 2022, when the market could be worth $161m – a date that falls before even the earliest estimate for mainstream penetration. The report further estimates that between 2022 and 2027 the market will grow at a CAGR approaching 53 per cent, to be worth $1.3bn. By sector, financial services applications will see a CAGR of 62.6 per cent over 2022-2027.

Such profitability is likely to derive from a quantum computing industry formed along similar lines to established IT and computing markets – hardware, software and services. The hardware has already started to arrive. IBM consolidated its prediction at the start of 2019 with the launch of its Q System One, a standalone 20-qubit platform that claims to be the first integrated general-purpose quantum computing system designed for commercial use.

Start-ups like D-Wave and Rigetti Computing, meanwhile, have been marketing more task-specific machines. D-Wave sells self-contained quantum computers based on a variant form of quantum technology known as quantum annealing. There are also incumbent technology leaders like Fujitsu, whose Digital Annealer offering is based on a "quantum-inspired" architecture.

IBM points to further evidence that quantum will arrive sooner rather than later: global engagement with its Q Network, an initiative set up to advance a quantum computing development framework. The network provides cloud-based access to quantum compute resources and simulators, so that like-minded parties can try their own ideas in an 'approximate' quantum environment. Banks JPMorgan Chase and Barclays are two of its biggest commercial members. They are engaged on a range of development projects designed to determine which of their financial services operations are most likely to benefit from being scaled to quantum environments.

Their projects are described as exploratory but, from their public positioning statements, it’s clear that both financial institutions plan to make quantum integral to their commercial strategies, as well as their IT agendas, for the 2020s. JPMorgan Chase’s senior engineer and quantum project lead Constantin Gonciulea is investigating a limited range of financial services areas where quantifiable operational benefit could be derived from a phased transition to quantum enablement.

Areas under investigation include the redefinition of trading strategies, optimisation of portfolio management, asset pricing and risk analysis. “Fund managers have to choose among an infinite number of investment combinations for their portfolios, based on the level of risk they want to take,” Gonciulea explains. “Imagine if they had a computer that could detect potential risk – based on their appetite for risk – within seconds of the change taking place in a particular market.”

On the information security front, enhanced risk-detection capability could also be applied to fraud. Gonciulea adds: “JPMorgan Chase has thousands of employees who detect fraud as part of their job. The application of quantum computing to the assessment of potential fraudulent payments… would be very beneficial for our merchant services clients.”

The IT security industry itself has been reviewing quantum’s potential to enhance its own value proposition for some time, both as a counteractive to cyber threats – to conduct cyber-risk analysis – and to drive better enterprise threat intelligence.

Ridgeback Network Defense, for example, is developing algorithms that can leverage quantum systems to evaluate risk within an organisation’s IT infrastructure and identify possible changes in system state, based on different attack or fault scenarios. “This will allow a financial organisation to optimise the allocation of resources and provide an organisation with a near real-time analysis of likely what-if scenarios for both information security and for operations,” says the company’s CTO Thomas Phillips. “The same technology should, by the way, be applicable also to market analysis.”

It has also been recognised that as quantum compute capability becomes available for mainstream commerce, it also falls within reach of malevolent nation states and even cyber criminals. Indeed, according to Frederik Kerling, senior quantum expert at Atos, the biggest impact on financial services will not be the deployment of quantum computers; it will be preparing for cyber criminals to have them. But "quantum vulnerabilities [can] be countered by quantum solutions", he suggests.

According to Atos’s 2018 report Quantum Computing in Financial Services, as well as the desire to attain operational efficiencies and superior decision-making, there’s another driving factor for financial services firms’ take-up of quantum: to make it integral to their digital transformation programmes as they move away from aged platforms that, in an era of cryptocurrencies and blockchain, now embody unacceptable degrees of risk exposure.

Cryptocurrencies such as Bitcoin are likely targets for quantum computing, given its ability to break the security technology that underpins them. Current encryption standards will be particularly vulnerable to quantum-powered attacks, which makes cyber security a critical aspect of quantum computing.


Banking on security by decree

For cyber security, there are encouraging signs of growing recognition of the need to secure quantum. In 2018, the Accredited Standards Committee X9 established a study group to review the risk posed to the financial services industry by quantum computing. The group will review the status of quantum computing and assess the effects large-scale, fault-tolerant, general-purpose quantum computers will have on the cryptography used by the industry. The group will also review efforts by companies and government agencies to identify both at-risk and quantum-safe algorithms.

“Several banks are performing quantum risk assessments, implementing proofs of concept, and looking at deploying quantum-resistant algorithms in select systems,” says the Global Risk Institute’s Michele Mosca. “For years, some have also used quantum cryptography in their link encryptors to protect the link to their off-site backups.”

Secure Chorus is collaborating with ISARA Corporation to evolve a cryptography standard called MIKEY-SAKKE (Multimedia Internet KEYing-Sakai-Kasahara Key Encryption) to become quantum-safe. Developed by the UK’s National Technical Authority for Information Assurance, now part of the National Cyber Security Centre, MIKEY-SAKKE was standardised by the Internet Engineering Task Force.

Beyond encryption and cyber security, quantum computing will have an impact throughout the financial services industry, from retail-banking customer relationship management (CRM) and investment portfolio risk analysis to high-frequency trading (HFT) and capital markets.

Analytics-driven CRM is a hot topic among financial services players, for both incumbents and fintech arrivistes. In retail banking and insurance, CRM will be improved by the automation of more ‘tightly targeted’ services. Atos suggests that quantum technology can provide “greater accuracy in simulating customer purchasing preferences based on demographic data, whether that’s for an insurance policy or a mortgage. Customer personal information, meanwhile, can be protected more effectively through simultaneous automation and analytics of pending threats.”

In HFT, ultra-fast computers and algorithms on advanced, highly specified trading platforms can execute thousands of trades per second at nanosecond speeds. According to TABB Group, HFT accounts for around 50 per cent of all US equity trades – even though some analysts and traders attribute to HFT spates of stock market volatility that date back to the 'Flash Crash' incident of 2010 (see E&T Vol 5, Issue 13 and Vol 8, Issue 6). Quantum computing will accelerate HFT further still and enhance it with artificial intelligence and predictive capabilities.

Risk analysis is "the most promising – and exciting – application of quantum computing at this time. Practical quantum algorithms will be developed to evaluate risk much more accurately and rapidly," says Phillips at Ridgeback Network Defense. "A paper published in the journal Physical Review A showed how a quantum system could dramatically speed up the Monte Carlo algorithm for pricing financial derivatives." A Monte Carlo algorithm simulates the behaviour of another system. It is not an exact method but a heuristic one, typically using randomness and statistics to obtain results.
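To make the technique concrete, here is an illustrative classical Monte Carlo pricer for a European call option under standard Black-Scholes assumptions – this is the kind of baseline the quantum speed-up would accelerate, and all parameter values below are invented for demonstration, not drawn from the article.

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths=100_000, seed=42):
    """Estimate a European call price by simulating terminal asset prices
    under geometric Brownian motion and averaging discounted payoffs."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)                # one random market outcome
        s_t = s0 * math.exp(drift + vol * z)   # simulated terminal price
        total += max(s_t - k, 0.0)             # call option payoff
    return math.exp(-r * t) * total / n_paths  # discounted average

# The estimate converges to the analytic price (about 10.45 for these inputs),
# but only at a rate of roughly 1/sqrt(n_paths).
price = mc_european_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0)
```

The slow 1/sqrt(n) convergence is the point: quantum amplitude estimation promises a roughly quadratic improvement in that rate, which is where the speed-up Phillips cites comes from.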

Before it can exploit these opportunities, the financial industry will have to resolve the questions of information security and data integrity that come in tandem with the benefits of 'quantumisation'. As noted, along with driving supercharged growth, quantum could power a new generation of advanced persistent threats, according to the Global Risk Institute.

The threats stem from the power of quantum computing to execute tasks far beyond the reach of conventional computers, which use long strings of bits that each encode either a 0 or a 1. By contrast, a quantum bit – a qubit – can embody the 0 and 1 states simultaneously. By manipulating a large collection of qubits, quantum computers can process countless configurations of 0s and 1s at the same time.
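A toy classical bookkeeping exercise (not a real quantum simulation – the function and structure here are invented for illustration) shows why this matters: merely writing down an n-qubit superposition requires tracking 2^n amplitudes, so the state space rapidly outgrows conventional machines.

```python
import itertools
import math

def uniform_superposition(n):
    """Classically tabulate the state a Hadamard gate on each of n qubits
    would produce: all 2**n bit strings carry equal amplitude."""
    amp = 1.0 / math.sqrt(2 ** n)
    return {bits: amp for bits in itertools.product('01', repeat=n)}

state = uniform_superposition(3)
n_states = len(state)                          # 2**3 = 8 configurations at once
norm = sum(a * a for a in state.values())      # squared amplitudes sum to 1
```

Doubling to 4 qubits doubles the table to 16 entries; at 300 qubits the table would exceed the number of atoms in the observable universe, which is the intuition behind quantum computing's claimed power.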

“Quantum computers pose a systemic threat to cyber security, as they will break the current public-key cryptography that underpins the security of most technology platforms,” warns Michele Mosca, special advisor on cyber security at the Global Risk Institute. “The integrity of software validation, identity verification, and information encryption will all be affected.”

Successful attacks could result in a lack of confidence and trust in the tools and institutions underpinning our digital economy dependent on quantum-vulnerable cryptography for security, Mosca adds: “Financial institutions depend on the strength of their reputation and the trust placed in them by individuals and organisations to keep their assets safe and their transactions secure. The emergence of quantum computers will challenge this trust, especially since it will be difficult to evaluate the extent of the threat before an attack happens.”

Further warning also comes from Zaccaria at Secure Chorus. “Quantum computing will solve complex mathematical problems faster and better – including those used at the core of modern cryptography,” she says. “Moreover, quantum computers deployed by malicious attackers will be able to decrypt the data protected by many public key cryptography methods now used by governments and corporations to protect sensitive data. And they will be able to do this relatively quickly.”

Concerns about crypto-key exchange integrity are growing throughout the fintech sector and will have to be addressed if quantum growth is to proceed as forecast, says Zaccaria. To this end, Secure Chorus is steering a quantum-safe initiative in public key cryptography: the development of a new set of public key cryptosystems for conventional computers that resist quantum computer attack. These cryptosystems are called 'quantum-safe' or 'post-quantum' cryptography. The underlying principle is the use of mathematical problems believed to be too hard even for quantum computers to solve.
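That principle can be made concrete with a toy version of the learning-with-errors (LWE) problem, which underlies several lattice-based post-quantum schemes. The sketch below is an insecure miniature for illustration only – the parameters are far too small for real use, and the scheme is not any standardised algorithm.

```python
import random

# Toy learning-with-errors (LWE) encryption of a single bit. NOT secure:
# real schemes use dimensions in the hundreds, not n=4.
q, n, m = 97, 4, 8
rng = random.Random(1)

def keygen():
    s = [rng.randrange(q) for _ in range(n)]            # secret vector
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [rng.choice([-1, 0, 1]) for _ in range(m)]      # small noise terms
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]                      # noisy products A.s + e
    return s, (A, b)

def encrypt(pub, bit):
    A, b = pub
    rows = [i for i in range(m) if rng.random() < 0.5]  # random row subset
    u = [sum(A[i][j] for i in rows) % q for j in range(n)]
    v = (sum(b[i] for i in rows) + bit * (q // 2)) % q  # hide bit near q/2
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(ui * si for ui, si in zip(u, s))) % q  # noise + bit*(q//2)
    return 1 if q // 4 < d < 3 * q // 4 else 0          # closer to q/2 or 0?

s, pub = keygen()
assert all(decrypt(s, encrypt(pub, bit)) == bit for bit in (0, 1))
```

The security intuition: recovering s from the noisy products (A, b) is a lattice problem for which no efficient quantum algorithm is known, unlike the factoring and discrete-log problems that Shor's algorithm breaks.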

“The wallets protecting crypto-assets are generally vulnerable to quantum attacks, although the consensus mechanisms based on proof-of-work – e.g. Bitcoin, Ethereum, and suchlike – seem less vulnerable,” says Mosca at the Global Risk Institute. “Nevertheless, asymmetric distribution of full-scale quantum computing platforms could pose a threat to the network that should be taken into account when designing quantum-resistant blockchains.”

Such defensive initiatives notwithstanding, Ridgeback Network Defense’s Phillips expects standard due diligence to kick in long before risk exposure reaches critical levels. “If companies choose to depend on blockchain technology – public or private implementations – then I expect them to stay abreast of weaknesses. Quantum technology will march along at a steady pace. There will be fair warning before we see any quantum advances that could threaten existing cryptographic implementations used by financial institutions.”

Jacky Fox, director of cyber risk at Deloitte Ireland, agrees that concerns about the level of quantum cyber-risk should not be over-estimated – for the time being. "Controlled, as opposed to virtual, qubits [quantum binary digits] are currently measured in the tens rather than the thousands or millions required for cyber-threat activity by any actor," she explains. "Quantum-computing-as-a-service – QCaaS – will be a likely delivery model, with single- and multi-tenancy offerings."

Fox continues: “How could these services be restricted from criminals? Should commercial organisations be regulated to vet and monitor their clients for criminal cyber-threat activity? State-sponsored activity against financial services seems more probable but, given some of the current challenges in relation to bandwidth, error correction and environmental requirements for quantum computing, 2025 sounds ambitious to have scale general-purpose quantum solutions available to anyone.”

Case study

NatWest accelerates understanding of risk

Financial institutions face an ongoing challenge in creating and maintaining optimally balanced portfolios of assets, selected from thousands of options. Ideally, these incorporate various liquid assets that deliver the maximum possible return while keeping risk at an acceptable level. Yet while liquidity is of critical importance to financial institutions, calculating the best combination of assets is an expensive, time-consuming manual task, and so is undertaken only infrequently.
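The combinatorial nature of the problem is easy to see in miniature. The asset names, figures, and helper function below are invented for illustration; real optimisers work with full covariance matrices and dedicated solvers (or annealing hardware) rather than brute force.

```python
import itertools

# Illustrative-only figures: (expected annual return, risk score) per asset.
assets = {'bonds': (0.030, 0.010),
          'cash':  (0.010, 0.000),
          'gilts': (0.025, 0.005)}

def best_portfolio(assets, risk_cap, step=0.1):
    """Brute-force search over weight grids summing to 1: return the mix
    with the highest expected return whose total risk stays under risk_cap."""
    names = list(assets)
    grid = [round(i * step, 10) for i in range(int(1 / step) + 1)]
    best = None
    for w in itertools.product(grid, repeat=len(names)):
        if abs(sum(w) - 1.0) > 1e-9:
            continue                                   # weights must total 100%
        ret = sum(wi * assets[nm][0] for wi, nm in zip(w, names))
        risk = sum(wi * assets[nm][1] for wi, nm in zip(w, names))
        if risk <= risk_cap + 1e-12 and (best is None or ret > best[0]):
            best = (ret, dict(zip(names, w)))
    return best

ret, weights = best_portfolio(assets, risk_cap=0.006)
```

The candidate count grows exponentially with the number of assets (grid^n combinations), which is exactly the search structure that annealing approaches such as Fujitsu's Digital Annealer are pitched at.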

Retail bank NatWest has partnered with Fujitsu on a proof-of-concept project that aims to optimise its mix of high-quality liquid assets, including bonds, cash and government securities. Fujitsu’s Digital Annealer compute platform processes this type of complex scenario and provides results “orders of magnitude faster” than conventional compute resources can deliver, NatWest reports.

NatWest’s quantum technology team has completed highly complex calculations on the bank’s £120bn-value High Quality Liquid Assets portfolio at 300 times the speed of conventional cloud-based compute resources, says director of innovation Kevin Hanley, and with a higher degree of accuracy.

“NatWest can complete a comprehensive risk assessment for its portfolio much faster, as well as gaining access to a far wider range of results and permutations, therefore helping to ensure an optimised spread and reduced risk,” explains Hanley. “The technology could be applied to other calculations and [risk-based] problems the bank faces on a daily basis.”

Quantum timeline

1927: German physicist Werner Heisenberg introduces the ‘uncertainty principle’, which says that you can’t know everything about a quantum mechanical particle.

1935: Albert Einstein, Boris Podolsky and Nathan Rosen argue that a complete theory must be local and realistic. No particle should affect another faster than the speed of light, and every measurable quantity of nature has to be accurately represented by theory. The ‘EPR paradox’ says that either quantum mechanics is incomplete or the requirements for a complete theory are too strong.

1964: John Bell formulates a local hidden variable model and shows the bound on how strongly things can be correlated. Quantum mechanics can violate this bound and has measurement outcomes with greater correlation than any local hidden theory. Quantum mechanics allows for stronger correlations with entangled states.

1970: Notes from Stephen Wiesner and Charles Bennett contain possibly the first use of the phrase 'quantum information theory' and the first idea of entanglement as a communication resource.

1981: Nobel Prize winner Richard Feynman challenges a group of computer scientists to develop a new breed of computer based on quantum physics. Almost 40 years later, scientists around the world are getting closer to this.

1994: MIT’s Peter Shor shows you can factor a number into its primes on a quantum computer – a problem that takes classical computers “an exponentially long time”. His algorithm launches an explosion of interest in the field of quantum computing.

1995: Quantum error correction emerges, showing that it’s possible to use a subtle redundancy to protect against environmental noise, making the physical realisation of quantum computing more tenable.

1996: IBM's David DiVincenzo sets out five requirements for a quantum computer: a well-defined, scalable qubit array; the ability to initialise the qubits to a simple fiducial state; a 'universal' set of quantum gates; coherence times much longer than the gate-operation time; and single-qubit measurement.

1997: The first topological quantum error correcting code – the surface code – is proposed by California Institute of Technology's Alexei Kitaev. The surface code is considered the most promising platform for realising a scalable, fault-tolerant quantum computer.

2001: Shor's algorithm is demonstrated in a quantum computing experiment, albeit on a very pedestrian problem: 15 = 3×5. The IBM system employed qubits encoded in nuclear spins, similar to an MRI machine.

2004: Robert Schoelkopf and his collaborators at Yale University invent circuit QED to study the interaction of a photon and an artificial quantum object on a chip. This established the standard for coupling and reading out superconducting qubits.

2007: Schoelkopf and his collaborators invent the transmon, a superconducting qubit with reduced sensitivity to charge noise – a major obstacle to long coherence. The design has since been adopted by many superconducting quantum groups.

2012: Several important parameters for quantum information processing with transmon qubits are improved. IBM extends the coherence time, the duration that a qubit retains its quantum state, up to 100 microseconds.

2015: IBM demonstrates the smallest complete quantum error detection code: with a single quantum state stabilised on a four-qubit lattice, it is possible to detect both bit-flips and phase-flips – a building block for quantum computing systems.

2016: IBM scientists build the IBM Q experience, a quantum computing platform delivered via the IBM cloud. It enables users to run experiments on IBM’s quantum processor, work with individual qubits, and explore tutorials and simulations of quantum computing.

Source: E&T, James Hayes, Quantum on the money: fintech is banking on the future of computing…