this post was submitted on 10 Dec 2024
329 points (100.0% liked)
Technology
you are viewing a single comment's thread
There's a dime stuck in the road behind our local store, tails side up, for over 15 years. And that doesn't even need error correction.
Why does it sound like technology is going backwards more and more each day?
Someone please explain to me how anything implementing error correction is even useful if it only lasts about an hour?
I mean, that's literally how research works. You make small discoveries and use them to move forward.
What's to research? A fucking abacus can hold data longer than a goddamn hour.
Are you really comparing a fucking abacus to quantum mechanics and computing?
they are baiting honest people with absurd statements
Russian election interference money dried up and now they're bored
They are shaking some informative answers out of literate people so I’m getting something out of it :D
Absurdly ignorant questions that, when answered, will likely result in people knowing more about quantum computing than they did before.
And if it stopped there, we'd all be the better for it :)
Don’t feed the trolls.
Are you aware that the RAM in your computing devices loses its information when you read a bit?
Why don't you switch from smartphone to abacus and dwell in the anti-science reality of medieval times?
And that it loses data after merely a few milliseconds if left alone? To account for that, DDR5 reads and rewrites unused data every 32 ms.
You're describing how ancient magnetic core memory works; that's not how modern DRAM (Dynamic RAM) works. DRAM uses a constant pulsing refresh cycle to recharge the micro-capacitors of each cell.
And on top of that, SRAM (Static RAM) doesn't even need the refresh circuitry; it just works and holds its data as long as it remains powered. It only takes 2 discrete transistors, 2 resistors, 2 buttons and 2 LEDs to demonstrate this on a simple breadboard.
I'm taking a wild guess that you've never built any circuits yourself.
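To make the refresh idea concrete, here's a minimal toy simulation of leaky DRAM cells with a periodic refresh loop. The class name, the 50 ms retention figure, and the 1 ms time step are all made up for illustration; only the 32 ms refresh cadence comes from the comment above.

```python
import random

RETENTION_MS = 50         # hypothetical retention time before charge decays
REFRESH_INTERVAL_MS = 32  # DDR5-style refresh cadence, per the comment above

class DramCell:
    """Toy model of one DRAM cell: a bit plus the time since its last refresh."""
    def __init__(self, bit: int):
        self.bit = bit
        self.age_ms = 0.0  # time since last write/refresh

    def tick(self, dt_ms: float):
        self.age_ms += dt_ms
        if self.age_ms > RETENTION_MS:
            # Charge has leaked away: the stored value is now garbage.
            self.bit = random.randint(0, 1)

    def refresh(self):
        # Read the still-valid bit and rewrite it, restoring full charge.
        self.age_ms = 0.0

cells = [DramCell(b) for b in (1, 0, 1, 1)]
for t in range(200):              # simulate 200 ms in 1 ms steps
    for c in cells:
        c.tick(1.0)
    if t % REFRESH_INTERVAL_MS == 0:
        for c in cells:
            c.refresh()

# Because refresh (every 32 ms) always beats decay (50 ms), the data survives.
assert [c.bit for c in cells] == [1, 0, 1, 1]
```

Drop the refresh loop and the cells scramble within a few dozen simulated milliseconds, which is the whole point of the refresh circuitry.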
I'm taking a wild guess that you completely ignored the subject of the thread to start an electronics engineering pissing contest?
And you would have been there shitting on magnetic core memory when it came out. But without that we wouldn't have the more advanced successors we have now.
Obvious troll is obvious
Must be the dumbest take on QC I've seen yet. You expect a lot of people to focus on how it'll break crypto. There's a great deal of nuance around that and people should probably shut up about it. But "dime stuck in the road is a stable datapoint" sounds like a late 19th century op-ed about how airplanes are impossible.
The internet is pointless, because you can transmit information by shouting. /s
AND I can shout while the power is out. So there!
Welp, quantum computers have certain advantages (searching an unsorted list in O(√n) steps with Grover's algorithm, factoring large numbers into primes with Shor's algorithm, etc.). The difficulty is actually keeping everything stable, because these machines are pretty complex.
https://en.wikipedia.org/wiki/Quantum_supremacy
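The O(√n) search advantage mentioned above can be simulated classically for tiny inputs. This is a sketch of Grover's algorithm as plain amplitude arithmetic (the function name and sizes are mine, not from any library): after about (π/4)·√N iterations, nearly all probability piles up on the marked item, even though a classical search would need ~N/2 checks on average.

```python
import numpy as np

def grover_search(n_items: int, marked: int) -> np.ndarray:
    """Classically simulate Grover's algorithm over n_items basis states."""
    # Start in the uniform superposition: equal amplitude on every item.
    state = np.full(n_items, 1.0 / np.sqrt(n_items))
    # The optimal number of iterations is about (pi/4) * sqrt(N).
    iterations = round(np.pi / 4 * np.sqrt(n_items))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        state[marked] *= -1
        # Diffusion: reflect every amplitude about the mean amplitude.
        state = 2 * state.mean() - state
    return state

# 64 items, answer hidden at index 42: only ~6 iterations needed.
probs = grover_search(64, marked=42) ** 2
assert probs.argmax() == 42 and probs[42] > 0.9
```

Of course this simulation takes O(n) classical work per step; the speedup only exists on actual quantum hardware, which is exactly why the stability problem matters.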
As stable as that dime is, it's utterly useless for all practical purposes.
What Google is talking about is making a stable qubit - the basic unit of a quantum computer. It's extremely difficult to make a qubit stable - and since it underpins how a quantum computer works, instability introduces noise and errors into the calculations a quantum computer would make.
Stabilising a qubit in the way Google's researchers have done shows that, in principle, if you scale up a quantum computer it will get more stable and accurate. That's been a major aim in the development of quantum computing for some time.
Current quantum computers are small and error prone. The researchers have added another stepping stone on the way to useful quantum computers in the real world.
Do you have any idea the amount of error correction needed to get a regular desktop computer to do its thing? Between the peripheral bus and the CPU, inside your RAM if you have ECC, between the USB host controller and your printer, between your network card and your network switch/router, and so on and so forth. It's amazing that something as complex and using such fast signalling as a modern PC does can function at all. At the frequencies that are being used to transfer data around the system, the copper traces behave more like radio frequency waveguides than they do wires. They are just "suggestions" for the signals to follow. So there's tons of crosstalk/bleed over and external interference that must be taken into account.
Basically, if you want to send high speed signals more than a couple centimeters and have them arrive in a way that makes sense to the receiving entity, you're going to need error correction. Having "error correction" doesn't mean something is bad. We use it all the time. CRC, checksums, parity bits, and many other techniques exist to detect and correct for errors in data.
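As a tiny example of the parity-bit technique named above (the function names here are made up for illustration): the sender attaches one extra bit recording whether the payload has an odd number of 1 bits, and the receiver recomputes it to detect any single-bit flip.

```python
def parity_bit(data: bytes) -> int:
    """Even-parity bit: 1 if the data contains an odd number of 1 bits."""
    ones = sum(bin(b).count("1") for b in data)
    return ones % 2

def frame(data: bytes) -> tuple[bytes, int]:
    """Sender side: transmit the data together with its parity bit."""
    return data, parity_bit(data)

def check(data: bytes, parity: int) -> bool:
    """Receiver side: any single-bit flip changes the parity and is caught."""
    return parity_bit(data) == parity

payload, p = frame(b"hello")
assert check(payload, p)                        # clean transmission passes
corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]
assert not check(corrupted, p)                  # single-bit error is detected
```

A lone parity bit can only detect (not fix) errors, and an even number of flips slips through; that's why real links layer on CRCs and full error-correcting codes, and why quantum error correction needs so much more machinery again.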
It can be useful if they build enough of these that they can run programs that regular computers can't run at this scale, in less than an hour.
Quantum computers aren't a replacement for regular computers, since they're much slower at ordinary calculations. But for the type of problem where you'd have to guess-and-check more answers than is feasible with regular computers, they can get there in far fewer steps.
Because quantum physics. A qubit isn't just 0 or 1; it's a superposition of both at once. You get a result as a probability distribution, not as distinct values.
Qubits are represented by (for example) quantum-entangled electron spins. And due to the nature of quantum physics, they are not very stable, and you cannot measure a value without influencing it.
Granted, my knowledge of quantum computing is very hand-wavy.
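The "result as a distribution" point from that comment can be sketched numerically. This toy (variable names and the shot count are mine) puts one simulated qubit in an equal superposition and "measures" it repeatedly: each shot collapses to 0 or 1 with probability equal to the squared amplitude, so only the histogram over many shots reveals the state.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit in an equal superposition: amplitude 1/sqrt(2) on |0> and |1>.
state = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: each measurement yields outcome k with probability |amplitude_k|^2.
probs = np.abs(state) ** 2

# One shot gives a single 0 or 1; repeating gives the distribution.
shots = rng.choice([0, 1], size=10_000, p=probs)
counts = np.bincount(shots, minlength=2)

# Roughly a 50/50 split between |0> and |1> across the 10,000 shots.
assert abs(counts[0] / 10_000 - 0.5) < 0.05
```

This is why quantum algorithms are designed to steer the amplitudes so that the *useful* answer ends up with high probability before you measure, as in the Grover example elsewhere in this thread.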
"Only"? The "industry standard" is less than a millisecond.
Show the academic world how many computational tasks the physical structure of that coin has solved in those 15 years.
How many calculations can your computer do in an hour? The answer is a lot.