
Until a company can demonstrate a quantum volume of even just 2^16... their computer is just about worthless for any sort of half-real quantum computing.

I pay zero attention to any technical information and marketing speak coming from a quantum computing company until they demonstrate decent quantum volume.

Most companies' computers can't even hit 2^16, so they are finding ways to distract the market from the poor fidelity of their systems.





Until a company can demonstrate that a motor vehicle can be operated by a layperson, it's just about worthless for any sort of half-real transportation task.

I pay zero attention to any technical information and marketing speak coming from a motor vehicle company until they demonstrate decent improvements in operating their machines.

Most companies' motor vehicles have to be hand cranked and kick-started by a youthful person with good vitality who doesn't fear being run over by their motor vehicle.

-- Some guy in 1925 who worked in the horse and buggy industry...


The better analogy here, in my opinion, is until someone builds a car that actually drives forward...

A lot of these quantum computing papers are just loud engine revving.

But you are right... "zero attention" is hyperbole... I shouldn't have said "I pay zero attention"... I should have said, "I am underwhelmed and don't give much credence to systems until they demonstrate the ability to perform high-fidelity calculations of reasonable complexity and depth."

I do think superconducting qubit approaches will continue to improve in fidelity and capability... there are just too many brilliant people working on these challenges to count them out.


The Ford Model T had been on the market for almost two decades by then. Over 15 million were sold, mostly to laypeople.

In contrast, no quantum computer today actually quantum computes. They just approximate quantum computing in specific scenarios and have yet to match, let alone exceed, the performance of regular computers.


The Model T went into production in 1908, and normal people being able to operate (though not necessarily afford) cars was already common before that.

It took only 30 years to go from Otto and Benz going "Look at this neat contraption" to "Mass produced, semi-affordable personal cars".


Quantum volume is a good metric, but that's kind of a one-dimensional take. Almost no interesting circuit requires all-to-all connectivity, and superconducting QCs are bad at all-to-all connected circuits, so we can have interesting NISQ experiments without a particularly large QV.

It is not a one-dimensional take... it is a stress test of qubit gate fidelity [across all qubits involved in the circuit], state prep and measurement, lifetime (coherence), memory errors, etc.

Now I agree that there are other great stress tests of quantum computer systems... but most of the industry agreed several years ago that quantum volume was a great metric. As many companies' systems have been unable to hit a decent QV, those companies have pivoted away from QV to other metrics... many of them half baloney.
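For readers unfamiliar with the metric: in the quantum volume protocol, random square circuits (width n, depth n) are run, and the device "passes" width n if its heavy-output probability stays above 2/3; QV is then 2^n for the largest passing width. A minimal sketch of that final scoring step, using made-up measurement numbers (the function name and the example probabilities are illustrative, not real hardware data):

```python
# Sketch of quantum volume scoring: given heavy-output probabilities
# measured for each circuit width n, QV = 2**n for the largest width
# that passes the 2/3 threshold (widths must pass consecutively).

def quantum_volume(heavy_output_prob: dict, threshold: float = 2 / 3) -> int:
    """Return 2**n for the largest consecutively-passing width n."""
    qv = 1
    for n in sorted(heavy_output_prob):
        if heavy_output_prob[n] > threshold:
            qv = 2 ** n
        else:
            break  # a failed width ends the run of passing widths
    return qv

# Hypothetical measurements: heavy-output probability per circuit width.
measured = {2: 0.78, 3: 0.74, 4: 0.71, 5: 0.69, 6: 0.64}
print(quantum_volume(measured))  # width 6 fails, so QV = 2**5 = 32
```

The real protocol also requires statistical confidence on the 2/3 threshold across many random circuits; this sketch only shows why a single weak link (any width failing) caps the reported QV.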


> fidelity [across all qubits involved in the circuit]

I don't see a scenario in which the fidelity of a 2QG between two faraway qubits matters. Stress tests should be somehow related to the real tasks the system is intended to solve.

In the case of quantum computers, the tasks are either NISQ circuits or fault-tolerant computation, and in both cases you can run them just fine without applying 2QGs between faraway qubits, which translate into a large number of swaps.
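To make the swap cost concrete, here is a toy sketch (my own illustrative model, not from the thread) for a device with linear nearest-neighbor connectivity: a two-qubit gate between qubits i and j first needs |i - j| - 1 SWAPs to bring the operands adjacent, and each SWAP decomposes into 3 CNOTs, so errors compound quickly with distance:

```python
# Toy model: routing cost of a two-qubit gate on a 1D nearest-neighbor
# chain. Assumes 1 SWAP = 3 CNOTs and that gate fidelities simply
# multiply (a rough depolarizing-style approximation).

def swap_cost(i: int, j: int) -> int:
    """SWAPs needed to make qubits i and j adjacent on a line."""
    return max(abs(i - j) - 1, 0)

def effective_fidelity(i: int, j: int, cnot_fidelity: float = 0.99) -> float:
    """Approximate fidelity of one logical 2Q gate after routing:
    3 CNOTs per SWAP, plus the gate itself."""
    n_cnots = 3 * swap_cost(i, j) + 1
    return cnot_fidelity ** n_cnots

print(swap_cost(0, 1))                      # adjacent qubits: 0 SWAPs
print(swap_cost(0, 9))                      # distant qubits: 8 SWAPs
print(round(effective_fidelity(0, 9), 3))   # fidelity after 25 CNOTs
```

Under these assumptions, a single logical gate between qubits 0 and 9 costs 25 physical CNOTs and drops the effective gate fidelity from 0.99 to roughly 0.78, which is why compilers avoid such gates and why QV's random all-to-all circuits penalize limited-connectivity devices so heavily.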

If you're interested in applying Haar-random unitaries, then QV is surely an amazing metric, and systems with all-to-all connectivity are your best shot (coincidentally, Quantinuum keeps publishing their quantum volume results). It's just not that interesting a task.



