The Cyber Age, Artificial Intelligence, and Quantum Computing Should Have Made the World a Better Place
‘Future tense, it will be war of technologies next’
+ Cyber Age The leaps and bounds in technology should have made the world a better place. Instead, they have given rise to a hybrid system of warfare and security threats, “where the lines between the physical, digital and biological realms are being highly blurred.” Technologies such as big data, modern analytics, machine learning and quantum computing can do immense damage in the hands of an opponent, “one unrestrained by any notions of law or morality.”
Military strength and economic power are no longer the sole factors that dictate leadership, as IT, biotechnology, nanotechnology and cyber take on increasingly important roles, even as the threats of military conflict and nuclear war grow less likely, Narayanan says.
+ Artificial Intelligence “AI has the potential to solve the problem, but it doesn’t provide an all-encompassing solution.” Badly designed AI, Narayanan believes, can cause more harm than anything else. “Decision makers don’t possess adequate knowledge of AI processes, as attackers tunnel effortlessly through digital networks, penetrating our defenses.” AI-enabled warfare, he says, will be a game-changer, and “an extremely dangerous cyber arms race is already on the anvil.”
+ Quantum Computing “The rise of quantum computers could be even more radical.” Existing cryptographic algorithms could be broken in realistic time periods. These computers will also be able to produce “hyper-realistic images,” making veracity a scarce resource. “Passwords are easier to crack; even biometric devices that use fingerprints, facial recognition or retinal signatures can be bypassed.” “Artificial Intelligence algorithms are dangerous also because they are opaque, and as they learn on their own, learning from bad data could wreak havoc.”
Source: The Asian. Darshana Ramdev, ‘Future tense, it will be war of technologies next’…
Content may have been edited for style and clarity.