About how far does this leave us from a usable quantum processor? How far from all current cryptographic algorithms being junk?
The latest versions of TLS already support post-quantum crypto, so no, it's not all of them. For the algorithms that are vulnerable, we're still way, way off from that. It may not even be possible to get enough qubits to break them at all.
Things like simulating medicines, folding proteins, and logistics are much closer, very useful, and more likely to be practical in the medium term.
Is there government money in folding proteins, though? I assume there are a lot of three-letter agencies that want decryption, with a lot more funding.
There's plenty of publicly funded research for that, yes.
Three letter agencies also want to protect their own nation's secrets. They have as much interest in breaking it as they do protecting against it.
At least a week, probably more
Just in time for the fall of American democracy. What could possibly go wrong.
Maybe they can use the same techniques for keeping their product management and feature roadmap stable for more than an hour.
108 qubits, but some of them are on error-correction duty?
What size RSA key can it factor "instantly"?
Seeing quantum computers work will be like seeing mathemagics at work, doing it all behind the scenes. Physically (for the small ones) it looks the same, but abstractly it can perform all kinds of deep mathematics.
There's a dime stuck in the road behind our local store, tails side up, for over 15 years. And that doesn't even need error correction.
Why does it sound like technology is going backwards more and more each day?
Someone please explain to me how anything implementing error correction is even useful if it only lasts about an hour?
I mean, that's literally how research works. You make small discoveries and use them to move forward.
As stable as that dime is, it's utterly useless for all practical purposes.
What Google is talking about is making a stable qubit, the basic unit of a quantum computer. It's extremely difficult to keep a qubit stable, and since qubits underpin how a quantum computer works, instability introduces noise and errors into the calculations it makes.
Stabilising a qubit the way Google's researchers have done shows that, in principle, if you scale up a quantum computer it will get more stable and accurate. That's been a major aim in the development of quantum computing for some time.
Current quantum computers are small and error prone. The researchers have added another stepping stone on the way to useful quantum computers in the real world.
Do you have any idea how much error correction is needed to get a regular desktop computer to do its thing? Between the peripheral bus and the CPU, inside your RAM if you have ECC, between the USB host controller and your printer, between your network card and your network switch/router, and so on and so forth. It's amazing that something as complex, and using signalling as fast, as a modern PC can function at all. At the frequencies being used to transfer data around the system, the copper traces behave more like radio frequency waveguides than wires. They are just "suggestions" for the signals to follow, so there's tons of crosstalk/bleed-over and external interference that must be taken into account.
Basically, if you want to send high speed signals more than a couple centimeters and have them arrive in a way that makes sense to the receiving entity, you're going to need error correction. Having "error correction" doesn't mean something is bad. We use it all the time. CRC, checksums, parity bits, and many other techniques exist to detect and correct for errors in data.
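To make the idea concrete, here's a toy sketch (not any specific protocol) of the simplest of those techniques, a single parity bit, which is the same basic idea behind the CRCs and checksums mentioned above:

```python
# Even-parity error detection: the sender appends one bit so the total
# number of 1s is even; the receiver can then detect any single flipped bit.

def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if no single-bit error is detected."""
    return sum(bits) % 2 == 0

frame = add_parity([1, 0, 1, 1])   # sender side
assert check_parity(frame)         # arrives intact

frame[2] ^= 1                      # a single bit flips in transit
assert not check_parity(frame)     # receiver detects the corruption
```

A lone parity bit can only detect (not fix) an odd number of flipped bits, which is why real links layer on heavier schemes like CRCs and ECC.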
It can be useful if they build enough of these that they can run programs that regular computers can't run at this scale, in less than an hour.
Quantum computers aren't a replacement for regular computers, because they're much slower and can't do normal calculations efficiently. But for the kind of problem where you'd have to guess-and-check too many answers to be feasible on a regular computer, they can get there in many fewer steps.
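The "fewer guess-and-check steps" point can be illustrated by classically simulating Grover's search on just 2 qubits (4 states): a classical search needs up to 4 checks, while Grover finds the marked item here in one iteration. (This is only a simulation of the math; real quantum hardware is what would make it beat classical search at scale.)

```python
# Toy statevector simulation of one Grover iteration on 2 qubits.

MARKED = 2                       # the answer the "oracle" recognizes
N = 4                            # 2 qubits -> 4 basis states

amps = [1 / N**0.5] * N          # uniform superposition over all states

# Oracle step: flip the sign of the marked state's amplitude.
amps[MARKED] = -amps[MARKED]

# Diffusion step: reflect every amplitude about the mean.
mean = sum(amps) / N
amps = [2 * mean - a for a in amps]

probs = [a * a for a in amps]
print(probs)                     # the marked state now has probability ~1.0
```

For larger problems you need roughly the square root of the number of classical guesses, which is where the speedup comes from.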
Because quantum physics. A qubit isn't 0 or 1, it's both and everything in between. You get a result as a distribution, not as distinct values.
Qubits are represented as (for example) quantumly entangled electron spins. And due to the nature of quantum physics, they are not very stable, and you cannot measure a value without influencing it.
Granted, my knowledge of quantum computing is very hand-wavy.
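Matching that hand-wavy level: the "result as a distribution" bit can be mimicked classically. A qubit in superposition a|0⟩ + b|1⟩ doesn't read out as one fixed value; each measurement gives 0 or 1 with probabilities |a|² and |b|², so you only see the distribution over many runs:

```python
# Sampling sketch: repeatedly "measure" a qubit with amplitudes a, b.
import random

a, b = 0.6, 0.8                  # example amplitudes; |a|^2 + |b|^2 = 1
shots = 100_000

ones = sum(random.random() < b**2 for _ in range(shots))
print(ones / shots)              # ~0.64, i.e. |b|^2
```

(And, per the comment above, on real hardware the act of measuring also collapses the state, which a classical sketch like this can't capture.)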
it only lasts for an hour
"Only"? The "industry standard" is less than a millisecond.
Show the academic world how many computational tasks the physical structure of that coin has solved in those 15 years.