I saw this headline earlier today...I don't know much about the issue, but my impression is that a functioning quantum computer would make our most powerful computer today look like a rickshaw compared to a Bugatti.
I think there's a better analogy. I would say it makes our current systems look like a prop plane compared to an F-16. The F-16 is obviously way more powerful but it's not useful for every task. Sometimes, what you need is a prop plane. Let me explain it to the best of my ability.
1. Quantum computing is basically a way of massively parallelizing a computation. The key is entanglement. If you have 50 qubits working together, they can represent 2^50 values simultaneously and essentially run a series of logic operations against all of those values at the same time. In other words, it's sort of like chaining 2^50 classical processors together (I think that many actual processors would be more powerful than the quantum machine, but 2^50 is huge -- it's about a quadrillion -- so the comparison is mostly academic).
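To see where that 2^50 comes from, here's a toy classical simulation -- just numpy, nothing quantum about it -- showing how the number of amplitudes you'd have to track doubles with every qubit you add:

```python
# Toy classical simulation, just to show why the state space blows up.
# An n-qubit register is described by 2^n complex amplitudes, so fully
# simulating 50 qubits would mean tracking about a quadrillion numbers.
import numpy as np

def uniform_superposition(n_qubits):
    """State vector for n qubits in an equal superposition of all values."""
    dim = 2 ** n_qubits                      # one amplitude per basis state
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 10, 20):
    print(n, "qubits ->", len(uniform_superposition(n)), "amplitudes")
print("50 qubits ->", 2 ** 50, "amplitudes")  # ~1.1 quadrillion; too big to simulate classically
```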
The way it works, essentially, is that the "program" is represented as a series of logic gates (this is the part I don't really understand). Entangle the qubits, and press go. The quantum system will, per thermodynamics, spontaneously settle into its lowest energy state. If you've defined your logic gates correctly, the lowest energy state should encode the answer to the problem you're trying to solve.
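For the gates part, the closest I can get to concrete is a toy simulation of two standard gates (a Hadamard and a CNOT) entangling two qubits. This is the gate-model picture done with plain linear algebra on a laptop, not real hardware, and it doesn't capture the lowest-energy-state part at all:

```python
# Toy gate-model sketch: entangle two qubits with two standard gates,
# simulated classically. Illustrates "program = a sequence of gates";
# it says nothing about how real hardware settles into an answer.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # controlled-NOT gate
I = np.eye(2)

state = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>
state = np.kron(H, I) @ state                   # Hadamard on the first qubit
state = CNOT @ state                            # entangle it with the second
print(state)  # [0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```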
But typically, in quantum operations, the mere act of reading information out of the system destroys it. This is called wavefunction collapse or, depending on the context, decoherence (which is basically the opposite of entangling). Which is to say, the system has to be reloaded, almost like an old musket. You can't just pull the trigger over and over again.
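Here's a sketch of why the reload is unavoidable: measurement picks one outcome at random, weighted by the squared amplitudes, and wipes out the rest of the superposition, so you have to prepare the state all over again before the next shot. Toy simulation again, not hardware:

```python
# Why you have to "reload": measuring samples one outcome with
# probability |amplitude|^2 and collapses the state, so the
# superposition is gone and must be prepared again for the next shot.
import numpy as np

def measure(state):
    probs = np.abs(state) ** 2
    outcome = np.random.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0          # every other amplitude is destroyed
    return outcome, collapsed

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
outcome, collapsed = measure(bell)
print(outcome)    # 0 (|00>) or 3 (|11>), 50/50 -- never 01 or 10
print(collapsed)  # the superposition is gone; re-prepare it to measure again
```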
Now, the reloading process is the other part I don't really understand, and I don't know how a chip fits into that. But I'm fairly confident that reloading is not computationally trivial. It might be automatable, but it's not going to be fast. Which doesn't really matter that much. If you decode something, and then you have to spend five minutes reloading before decoding the next thing -- you're still really, really fucking far ahead of where you were. And airlines could potentially use quantum computing to find the optimal flight routes that would minimize costs (a task that is currently intractable at any real scale; it's a version of the traveling salesman problem). Again, if you can optimize your routes in 10 minutes, even an hour -- hell, even a few days -- of reloading means nothing.
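To give a sense of why route optimization is so brutal classically, here's a brute-force traveling-salesman toy. The cities and distances are made up; the point is just that the number of possible orderings grows factorially:

```python
# Brute-force traveling salesman on made-up data: check every ordering
# of the stops and keep the shortest round trip. Fine for 4 stops,
# hopeless at airline scale because orderings grow factorially.
from itertools import permutations
import math

dist = {("A", "B"): 3, ("A", "C"): 4, ("A", "D"): 2,
        ("B", "C"): 5, ("B", "D"): 6, ("C", "D"): 1}

def d(x, y):
    return dist.get((x, y), dist.get((y, x)))   # distances are symmetric

def tour_length(order):
    legs = zip(order, order[1:] + order[:1])    # include the return leg
    return sum(d(x, y) for x, y in legs)

cities = ("A", "B", "C", "D")
best = min(permutations(cities), key=tour_length)
print(best, tour_length(best))
print(math.factorial(20))   # ~2.4e18 orderings for just 20 stops
```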
2. But if I'm right about the reloading, it makes quantum computers not very useful for sequential tasks. That is, you'd never use them to render graphics. You'd still use a GPU for that because, while the rendering problem is computationally complex, it's not that complex and it needs to be done millions of times per second in sequence.
Hence the analogy I offered above. Incredibly powerful, but not general use.
3. Caveat #1: Jensen Huang at Nvidia has said it's going to be decades before quantum computing can compete with his processors on AI tasks. Others have said it's not decades but years. Either way, everyone seems to agree that quantum would be useful in AI, and that's certainly in tension with what I wrote above, since AI involves lots and lots of sequential tasks. That said, it might work like this: Microsoft says you could conceivably put 1 million qubits on one of its chips (not yet, but that's the scalability claim). I cannot imagine there's any problem in the universe that would require 1M qubits for a single computation; 1,000 qubits would surely suffice. But if you had 1M qubits, you could divide them into processing units of, say, 1K qubits each. The first 1K solve a problem. While they are reloading, the chip uses the next 1K qubits on a new problem. So you could solve something like 1K problems in the time it takes any one unit to reload.
I don't know if that's the architecture they have in mind exactly, but the reloading is an issue, and it will limit how useful quantum computers are for sequential tasks.
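Just to put the arithmetic of that pipelining idea on paper -- and to be clear, this is my back-of-the-envelope model, not anything Microsoft has described -- here's what throughput looks like if one bank of qubits solves while the others reload:

```python
# Back-of-the-envelope model of the pipelining idea above -- NOT a real
# architecture, just the arithmetic of overlapping one bank's "solve"
# time with the other banks' "reload" time. Assumes only one bank is
# actively solving at any moment while the rest are reloading.
def problems_per_hour(solve_min, reload_min, n_banks):
    """Throughput if each bank alternates between solving and reloading."""
    cycle = solve_min + reload_min            # one bank's full cycle
    # With enough banks, a fresh bank is always ready, so the chip is
    # limited by solve time alone; otherwise it's limited by the cycle.
    busy_fraction = min(1.0, n_banks * solve_min / cycle)
    return 60 * busy_fraction / solve_min

print(problems_per_hour(solve_min=1, reload_min=5, n_banks=1))     # 10 per hour
print(problems_per_hour(solve_min=1, reload_min=5, n_banks=6))     # 60 per hour
print(problems_per_hour(solve_min=1, reload_min=5, n_banks=1000))  # still 60 per hour
```

The last line is the point: once you have enough banks to cover the reload time, adding more doesn't buy you anything for a single stream of sequential problems, which is why I don't think this gets you a quantum GPU.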
4. Caveat #2: as I said, I find quantum computing very difficult to understand. I say that as someone who a) was a professional computer programmer and did about 3/4 of a computer science degree in college, and b) was a physics major who got an A in my 500-level quantum mechanics course. Now, I was never great at quantum mechanics because it requires you to think about the world spatially, which is not my strength, and it was a bit of a weak A because it was curve-aided. I did not think I had sufficient talent in quantum mechanics to justify going to grad school. Still, my background is more conducive to understanding quantum computing than that of 99% of people out there. If I find it hard, it's going to be really fucking hard for most people.