I understand the basics of quantum computing (qubits, superposition, probability distributions, noise, how measurement/observation works, etc.), but I’m struggling to understand what operations actually go on within an algorithm. I know you essentially do linear algebra to transform a state vector closer and closer to the “true” outcome, but that’s very abstract. What’s actually going on when one does that? How does an operation “know” what the “true” outcome is?
Apologies if my terminology is way off—I’m only just looking into this, lol.
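To make my question concrete, here’s a rough numpy sketch of the kind of thing I mean (one Grover iteration on 2 qubits; the MARKED index is just a made-up example, not any real library’s API):

```python
import numpy as np

N = 4                 # 2 qubits -> 4 basis states
MARKED = 2            # hypothetical "true" answer that the oracle's condition picks out

# Start in an equal superposition over all basis states.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: a diagonal unitary that flips the sign of the marked state's amplitude.
# It only encodes the *condition* ("is this the marked index?"), applied to every
# basis state at once; it doesn't single out the answer ahead of time.
oracle = np.eye(N)
oracle[MARKED, MARKED] = -1

# Diffusion ("inversion about the mean"): a fixed reflection 2|s><s| - I,
# where |s> is the uniform superposition. It knows nothing about MARKED.
s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)

# One Grover iteration is enough for N = 4.
state = diffusion @ (oracle @ state)

print(np.round(np.abs(state) ** 2, 3))   # ~[0, 0, 1, 0]: probability piles up on MARKED
```

From what I can tell, the gates only encode a check (“does this state satisfy the condition?”) rather than the answer itself, but I’d love a clearer explanation of why repeating those steps homes in on the right outcome.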
There are zero. Zero applications for quantum computing. A very few heavily controlled (faked) proofs of concept have been done, but they’re useless.