Quantum Computing — The CTO Perspective
By Juan Orlandini / 8 May 2025 / Topics: Cybersecurity Data and AI Data protection
Until recently, the common wisdom was that quantum computing was years away from being practical. However, recent announcements hint that it is more imminent than anticipated.
Google, IBM, Microsoft, and several other players have all stated that they have made independent breakthroughs in their approaches to building stable and scalable quantum computing systems. In this blog, I explore some high-level concepts pertaining to quantum computing and make a few recommendations for organizations to rally around immediately.
Quantum computing always seemed a long way off … until suddenly, it didn’t. What changed?
One of the fundamental factors limiting this technology has been the need to build platforms with a large number of stable qubits. A qubit is a quantum bit; a stable qubit is one that maintains its state long enough for a quantum computer to complete its intended computation.
In traditional computing, the more complex the computing is, the more bits (short for binary digits) are needed for a function. The same is true for quantum computing. The more complex the quantum computing — and quantum computing is inherently complex — the more qubits you need.
As such, a very large number of qubits is needed to implement the algorithms developed for quantum computing technology. The exact overhead varies by implementation, but a roughly 10:1 ratio is a common working figure.
What does this mean? It means that for every single error-corrected (stable) qubit, you need about 10 unstable qubits. Keep this in mind when you hear announcements about the number of qubits a new technology enables.
To date, those announcements have all been for unstable qubits. And for the quantum algorithms everyone is talking about, the number of stable qubits needed is easily in the thousands — which means the system requires tens of thousands of unstable qubits to compute.
So, while we’re getting closer to quantum computing becoming a reality, the main challenge that continues to hold back progress is current technology’s inability to make so many stable qubits.
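The qubit arithmetic above can be sketched in a few lines of Python. This is back-of-the-envelope only: the 10:1 ratio is the simplified figure used in this post, and real error-correction overheads vary by hardware and can be far higher.

```python
# Back-of-the-envelope only: the 10:1 ratio is this post's simplified figure.
# Real error-correction overheads vary by hardware and can be far higher.
PHYSICAL_PER_LOGICAL = 10  # assumed unstable qubits per stable (error-corrected) qubit

def physical_qubits_needed(logical_qubits: int) -> int:
    """Estimate the raw (unstable) qubits behind a given stable-qubit budget."""
    return logical_qubits * PHYSICAL_PER_LOGICAL

# An algorithm needing thousands of stable qubits implies tens of thousands of raw ones:
print(physical_qubits_needed(4000))  # -> 40000
```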
Quantum computing is a radical new way of using fundamental computing building blocks. In traditional computing, we only worry about two things: 0s and 1s.
To be fair, this isn’t as simple as it seems. There’s a ton that goes into making our modern computers able to handle huge volumes of 0s and 1s. Still, when it comes down to it, traditional computing is “just” 0s and 1s.
But in quantum computing, a qubit can exist as a 0, as a 1, or, to put it simply, in a state known as superposition, where it represents 0 and 1 at the same time.
It gets more complicated: Qubits also exhibit entanglement (once referred to by Albert Einstein as “spooky action at a distance”) and quantum interference.
On the face of it, these complicated traits might seem like obstacles that would block the advancement of quantum computing. However, through some novel algorithms and usage of these properties — which, frankly, I don’t comprehend — there have been breakthroughs in parallelizing computation in ways that are impossible in traditional computing.
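For readers who like to see the idea in code, here is a toy classical simulation of superposition: a single qubit represented as a two-entry vector of amplitudes, put into equal superposition by a Hadamard gate (a standard quantum gate). This is an illustration only, not how real quantum hardware operates.

```python
import math

# Toy classical simulation: a single qubit as a 2-entry vector of amplitudes
# for the basis states |0> and |1>. Illustration only, not real hardware.
zero = [1.0, 0.0]  # the |0> state

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

plus = hadamard(zero)                          # equal superposition of |0> and |1>
probs = [round(amp * amp, 10) for amp in plus]  # squared amplitudes = probabilities
print(probs)  # -> [0.5, 0.5]: a measurement yields 0 or 1 with equal probability
```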
The prime example of this is Shor’s factoring algorithm. In theory, this algorithm can factor enormous numbers into their constituent factors. You might be thinking, “So what?”
Well, most of our modern cryptography accepts the idea that factoring enormous numbers is extremely difficult and time-consuming for traditional computing technology. With quantum computing and Shor’s algorithm, traditional encryptions based on factoring could be broken exponentially faster.
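A quick way to feel the asymmetry Shor's algorithm threatens: multiplying two primes is instant, while recovering them classically by naive trial division takes work that grows with the size of the factors. A minimal Python sketch (the primes below are arbitrary illustrative values, not real key material):

```python
def trial_division(n: int) -> list[int]:
    """Naive factoring by trial division; cost grows with n's smallest prime factor."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# Multiplying two primes is instant; recovering them is the hard direction.
# (Arbitrary example primes; real RSA moduli are hundreds of digits long.)
p, q = 104729, 1299709
n = p * q
print(trial_division(n))  # -> [104729, 1299709]
```

Even for these small inputs the search loop runs over a hundred thousand candidates; at real key sizes, classical factoring becomes infeasible, which is exactly the assumption Shor's algorithm would break.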
The security implications of quantum computing are so concerning that entirely new kinds of encryption have already been developed. Not only are these post-quantum encryption methods available, but the National Institute of Standards and Technology (NIST) finalized its first post-quantum cryptography standards in 2024 and recommends transitioning to them immediately.
And here’s the rub: Not only do we need to start using these new encryption methods in the future, but we also need to re-encrypt things that were already encrypted using traditional encryption methods.
Why? Imagine this scenario: Quantum computing is still in the future, but your organization gets hacked. The hackers pilfer a huge amount of your sensitive records, but you — a diligent steward of your data — are not concerned because those records are encrypted. The hackers make demands, but you ignore them because you and your data are safe.
Then suddenly, quantum computing is practical and cheaply available (remember, cloud providers are working on quantum computing, too). Now, hackers who previously hoarded your data can decrypt it, leaving you with a big mess on your hands. This attack pattern is commonly called "harvest now, decrypt later."
Unfortunately, the threats created by quantum computing won’t stop there. All of our secure communications — think of bank transactions, eCommerce, etc. — are built on classical encryption. This even extends to things like printers, Wi-Fi access points, IoT sensors, and processors.
When quantum computing becomes real, these all become vulnerable — creating new entry points for bad actors looking to infiltrate your environment. To keep everything protected, everything must be updated.
Organizations across industries must start preparing for quantum computing. The best way to get started is by creating a dedicated team to focus on it. One early, concrete item for your quantum computing preparedness team: deploy quantum random number generators (QRNGs).
QRNGs eliminate one of the threat vectors in current encryption and do so relatively cheaply. In traditional systems, encryption algorithms are partially built on random numbers generated by an algorithm.
However, algorithms are repeatable processes by definition. That means a bad actor with a powerful enough computer can figure out that process and start generating the same “random” numbers, which makes it possible to decrypt your secure data. QRNGs generate truly random numbers, eliminating this possibility.
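This repeatability is easy to demonstrate: seed Python's standard pseudorandom generator with the same value twice and it produces an identical "random" stream. (The stdlib `random` module is used here purely as an illustration; real encryption uses stronger generators, but any deterministic algorithm shares this fundamental property.)

```python
import random

# A pseudorandom generator is a deterministic algorithm: anyone who recovers
# its internal state (here, the seed) reproduces the exact same "random" stream.
random.seed(42)
first_run = [random.randint(0, 255) for _ in range(5)]

random.seed(42)  # an attacker who learns the seed repeats the computation...
second_run = [random.randint(0, 255) for _ in range(5)]

print(first_run == second_run)  # -> True: fully predictable, not truly random
# A QRNG samples a physical quantum process, so there is no seed to recover.
```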
Rest assured, many positive outcomes will come from quantum computing as well — like the potential for major advancements in AI, drug discovery, weather forecasting, path optimization, and much more. Still, it’s imperative for organizations to get started on safeguarding their data against quantum computing’s potential threats ASAP.
Keep an eye out for more announcements as this field evolves. The day of quantum computers is likely closer than we think, and exciting things are ahead.
Chief Technology Officer, North America and Distinguished Technologist, Insight
Juan is Insight’s chief technology officer, North America, and one of Insight’s distinguished technologists. He is a 30-plus-year veteran of the IT industry and has designed and deployed enterprise computing, storage, data protection, virtualization and hybrid cloud solutions. Juan evaluates next-generation technologies for Insight and works with enterprise clients, assisting them in architecting and selecting strategic roadmaps. In his current role, Juan designates champions of the technology community within Insight and drives events that promote thought leadership, professional development and knowledge sharing.