This is the same group that already had their publication retracted by Nature the last time they made such a claim. Maybe Nature's reviewers got better and this time it is legit, but I wouldn't count on that. So there's nothing to see here until this gets independently replicated. They totally deserve all the scrutiny and resistance they seem to be getting.
> a handful of experts spotted unusual patterns in the data
> a year later ... they detailed an unusual and complicated method [for processing the data]
regarding which a team claims:
> [We] proved basically mathematically that the raw data are not measured in the laboratory; they are fabricated
So multiple different peers have pointed out multiple different times that this team is publishing "exactly the results you'd want to see to confirm superconductivity", and then a year later publishing fake-looking raw data or overly-complex "data processing" that magically gives them the exact result they needed to get published. These guys don't even seem particularly good at their fraud:
"We just made the most important discovery in the history of superconductivity!"
[Immediately]: Okay but what about all this bullshit here?
"Oh that? (nervous laughter) ha ha, well, see, the thing about that is ... [a year later] ... there, see, here's the raw data; we were just processing the data in a way we never mentioned before. We didn't tell you about it because, um, well, um, we ... forgot?"
[Immediately]: Okay but this raw data is obviously fake
"Oh, well, um, psych! We were just kidding about that result! Ha ha ha, take-backsies. Um yeah, we take it back."
[A year later]
"Hey we did the result again! This time it's for real serious realsies. We have the result here. We did the result thing again. It's the exact same result but better this time. Guys?"
Why is anyone taking them seriously?
Lying in a professional context is basically never OK. If someone is shown to be lying, they should never work in that field again, nor any field that bestows upon them any sort of trust. Lying is a big deal; it should absolutely be career-killing. It's not something people just do once and then get over. A few weeks ago people on Hacker News were talking about how to help their children cheat at school, like it's just a normal thing. It absolutely blows my mind that this would be seen as OK.
> I was helping my son with his homework. He had to write an essay about why the gender of the protagonist might be female (although this is never mentioned in the short story). Fortunately for him, ChatGPT knew the story and was happy to write an essay with arguments.
It goes on like that. The parent helps their son use ChatGPT to cheat at school, and the responses are all positive, like:
> I am delighted to hear that children are already adopting the new tech
And:
> This gives your son a competitive advantage over the kids who will simply turn up what the AI wrote (and get a 0) or spend too long writing their own essays.
They're celebrating it and talking about how learning to cheat well will give them a "competitive advantage" (zero-sum bullshit) over the kids who cheat badly, and then making a mockery of the children who actually "spend too long" doing the work themselves.
It's insane. It's bizarro world. It's infuriating. They're teaching their children to grow up to be liars and cheaters. They're knowingly punishing the children who choose not to cheat. They sound like toxic psychopaths, which is what I got downvoted (obviously) for calling them.
Sorry, but I don't agree. The form of this argument is indistinguishable to me from those who argued back in the 20th century that using calculators to do arithmetic was cheating.
Writing essays is on the same road as doing arithmetic by hand, calligraphy, blacksmithing, and buggy-whip manufacturing. All of these used to be useful skills but technology has displaced them. That's the reality, and no amount of railing at the wind will change that.
Is it cheating? No. Are there now plenty of people who are incapable of doing basic math without their phones? Yes. It'll be interesting to see what effects this has once everyone outsources their long-form thinking and communication to their AI friend.
There isn't enough evidence to say they are lying rather than making a mistake. It is remarkably hard for many scientists to ensure that the presented "raw data" isn't actually the output of a synthetic analysis. Also, if the new work repros and is valid, it means that the original work may well have been right.
If it doesn't repro, this guy's career is probably over.
> there isn't enough evidence to say they are lying compared to making a mistake
You're right, of course. I would not be jumping to conclusions if I had actual authority over the consideration of their work; I would be much more measured. But here in my armchair, I'm free to express my strong gut feeling that these guys are obvious liars -- but that's just, like, my opinion, man. This is one of those cases where I'd be very happy to be proven wrong.
I tried to go through the "Room-temperature superconductivity — or not?" paper related to the "[We] proved basically mathematically that the raw data are not measured in the laboratory; they are fabricated" comment. I'll admit, the paper is well over my head. Long time no college science classes.
I'm curious about the methods used to determine that the data was fabricated, though. Would someone mathier than me help answer this: was the fabrication determined through a mathematical analysis of the actual data numbers, a la Benford's law? Or was it determined through a scientific analysis of the experiment showing that the numbers weren't physically possible?
Checks like Benford's law or the central limit theorem are trivial observations that anyone with basic statistics knowledge could have called out. But that also means anyone with some education in the field could easily falsify plots to avoid them, so you will not catch real physicists with those methods. The problems in the paper that was eventually retracted were only visible to experts who actually work on this very topic. Here they talk about some of the abnormalities: https://www.nature.com/articles/s41586-021-03595-z
This write up of it [0] tries to address that a little but I agree that all the scrutiny is deserved and should continue.
"The previous paper has been resubmitted to Nature with new data that validates the earlier work, Dias says. The new data was collected outside the lab, at the Argonne and Brookhaven National Laboratories in front of an audience of scientists who saw the superconducting transition live. A similar approach has been taken with the new paper. "
The good news is that 1 GPa (10 kbar) is a lot easier to achieve than hundreds of GPa, which means that replication will be much easier. The previous "room temperature" claim involved around 2 Mbar or 200 GPa. For comparison, the detonation pressure of a high explosive is around 50 GPa. So, hopefully, this time, we won't have to wait as long.
> However, outside access may fall short of the community’s hopes. Dias and Salamat have founded a startup, Unearthly Materials, which, Dias said, has already raised over $20 million in funding from investors including the CEOs of Spotify and OpenAI. They’ve also recently applied for a patent on the lutetium hydride material, which would deter them from mailing out samples. “We have clear, detailed instructions on how to make our samples,” Dias said. “We are not going to distribute this material, considering the proprietary nature of our processes and the intellectual property rights that exist.” He suggested that “certain methodologies and processes” are also off the table.
This is exactly the kind of thing the patent system was intended to prevent. You have to share enough details of your secret sauce so others can replicate it, even if you intend to not license it or charge outrageous licensing fees. That way the knowledge is out there today, can be expanded on by other researchers/inventors, and at a specific date in the future anyone can use it.
It also helps reduce fraudulent claims of new discoveries.
I don't understand, and if you could illuminate it would be very helpful, how are such pressures achieved and maintained even for the shortest time without destroying the apparatus that holds it?
I understand that but it sort of dodges the issue that such enormous pressures must exceed the strength of any material, however small that pressure is contained in.
The compressive strength of typical diamond is about 470 GPa, so no, this is well within the strength of the material. The current record for the highest pressure achieved in a lab is 770 GPa, using a special form of diamond.
> Reactions by 10 independent experts contacted by Quanta ranged from unbridled excitement to outright dismissal, with many of the experts expressing some version of cautious optimism.
This is exactly how an arbitrary group of scientists should react to extraordinary claims like this. Not 100% dismissal because of the history or provenance of the researchers, not 100% acceptance without independent replication.
I think this is working as intended. We often joke that if something is published in Nature, it's almost certainly wrong.
But realistically, at any time in any field there are only a small number of people who are truly pushing the state of the art (I'm referring to discovery science, not reference science, i.e. a focus on adding 1% additional knowledge to our already copious understanding). And I think those people shouldn't have to follow the normal rules about scientific publishing: I want them to push the limits as much as possible. That means occasionally publishing something that contains a mistake (not a falsification), and then being willing to have it retracted (without consequence to future publication).
These sorts of fields tend to self-correct, because something that's wrong isn't reproducible, and all these scientists are working to reproduce each other's results (note: both of the people who think Dias faked his results said they were going to take the new protocol home and attempt to repro it in their own hands ASAP).
In a sense, it's being willing to accept a higher false positive/false negative rate so you don't filter out some true positives.
Hmm, very interesting. I don't think it works like that anymore. The next step has got to be commercial availability, or else investors would lose money or, worse, some Chinese company could "steal" the future profits from this valuable novel technology painstakingly developed by the publishing group.
Even if it turns out to be true, this result is still very far from commercial applications. It would be a fantastic thread for academic research groups at universities to continue, but pretty much no one else will care about it.
“We are not going to distribute this material, considering the proprietary nature of our processes and the intellectual property rights that exist.”
The fact that they are going to make independent verification impossible makes me highly, highly suspicious. Most scientists I know in their position would be sharing the material widely (even ahead of publication) to get replication from other parties and bolster the claim. Until there is replication, I trust NOTHING from this group, which has outright fabricated data previously.
Extraordinary claims, from those who have previously faked data, require independent third-party verification.
The article ends with a warning that full reproducibility may be difficult due to IP issues.
Considering the academic (?) origin and allegedly shady history of the research, this drawing of a commercial veil might encourage scepticism as to how revolutionary and successful it really is.
Definitely sets off my BS detector. They're under no obligation to publicize this so if they want it to be a secret they could keep it a secret. This means they do want publicity, probably so they can raise money. But there's an official way to protect your IP and get publicity: file a patent that details your process.
Yeah, this sounds like they are using Nature as an advertising platform to hype up their work.
I didn't realise it was possible to publish a paper with the premise that details required to reproduce it are omitted in some sort of weird guessing game... It makes sense that the allegations focus on analysing data manipulation. Maybe it's real, maybe it's fake, but the extreme scepticism seems deserved if they aren't going to be completely open and have a history of doing the same thing over and over.
Room temperature, but 1GPa pressure (or 145037 psi).
An exciting step forward if it's reproducible, but not useful commercially. I'd rather see breakthroughs in theoretical understanding; superconductivity in these regimes is still very poorly understood to my knowledge.
A woodworking clamp with a handle in line with the screw will generate about 400lbs of force. That’s a reasonable model for how much axial force you will be able to exert with a screwdriver handle and beefy threaded screws. To get 150000 psi you have to squeeze something .0027 square inches, or 1.7mm^2, leave some room for edges and let’s work with 1mm^2.
I’m not sure what interesting things can be done with this superconductor in 1mm^2, but squeezing the anvil isn’t outrageous.
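The clamp arithmetic above can be sanity-checked in a few lines; the 400 lbf clamp force and the 150,000 psi target are the comment's own rough estimates, not measurements:

```python
# Largest area a ~400 lbf screw clamp can hold at ~1 GPa (150,000 psi).
# Input figures are the comment's own estimates, not measured data.
force_lbf = 400          # axial force from a screw clamp with an in-line handle
target_psi = 150_000     # roughly 1 GPa expressed in psi

area_in2 = force_lbf / target_psi   # largest area you can pressurize
area_mm2 = area_in2 * 25.4 ** 2     # square inches -> square millimeters

print(f"{area_in2:.4f} in^2 ~= {area_mm2:.2f} mm^2")
```

This reproduces the ~0.0027 in^2 / ~1.7 mm^2 figures in the comment.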
For perspective, TSMC will put 150 million transistors in that area. I kind of suspect that integrated circuits might flow at this pressure, so that might not be the application, but a superconducting ground plane and power plane would be interesting.
Edit: Wait, maybe that’s just a giant capacitive load on all the signals. Do something smarter.
I'm beyond my expertise, but silicon crystal's compressive strength is 3GPa, so maybe? Plus I assume one would surround everything with a "fluid" (or at least flows at that pressure) to distribute the forces. Silicon chip fluid suspended inside carbide anvil with a steel clamping structure and tightening bolts should work. Interconnects are left as an exercise for the reader.
I have enough faith that this would work out that I'd put it in a sci-fi story, and not enough that I would invest in a startup.
A thin 1 mm^2 speck of high explosive is a few milligrams; estimating from YouTube vids, that's not much. It might punch a tiny hole in the underlying PCB at worst.
This is about 10,000x standard pressure. CRTs operate at 0.0000001x standard pressure, for reference -- they implode.
For example if superconductivity is desired from a 1 square inch (6.5e-4 m^2) footprint of this material, a weight of 6.5e5 N must fully rest on that footprint. Near the surface of earth that translates to a mass of approximately 146,000 lbs. Something like the mass of 3 tanks.
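A minimal check of those figures, taking g = 9.81 m/s^2, lands within rounding of the comment's ~146,000 lbs:

```python
# Mass whose weight alone exerts 1 GPa over one square inch.
PRESSURE_PA = 1.0e9           # 1 GPa
AREA_M2 = 0.0254 ** 2         # one square inch in m^2 (~6.45e-4)
G = 9.81                      # gravitational acceleration, m/s^2

force_n = PRESSURE_PA * AREA_M2     # ~6.5e5 N of clamping force
mass_kg = force_n / G               # mass needed to supply it by weight
mass_lbs = mass_kg * 2.20462

print(f"force {force_n:.3g} N -> mass {mass_lbs:,.0f} lbs")
```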
You'll have to ask a smart guy what deposition, subtraction, injection, or teleportation process he's going to use to make a permanent diamond container that leverages the high compressive strength because I'm stupid.
I wonder if it's possible to come up with a crystal or other material that has that much internal pressure - and isn't prone to unscheduled pressure release
After a bit of googling, the yield strength of some common materials is within striking distance of 1GPa (many steels are 0.25 GPa, silicon carbide is 3.4 GPa [1]). So perhaps it could be possible to embed this superconductor in a highly stressed supporting structure to provide the pressure.
Not useful for cabling, but this could conceivably be used for room temperature SQUIDs [2], paving the way to cheap MRIs.
Wolfram Alpha tells me that to bring 10 sq. cm of the material to 1GPa would require 18x the bite force of a T-Rex. I don't consider the inside of a dinosaur's mouth "room pressure" either, but I could imagine a reasonably-sized housing capable of producing that force.
Another comparison for scale: It is 1/6th the pressure at which artificial diamonds are manufactured.
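The T-Rex comparison comes out to roughly 18x if one assumes a bite force of about 57 kN, which is one published estimate (treat that figure as an assumption):

```python
# How many T-Rex bites of clamping force to hold 10 cm^2 at 1 GPa.
pressure_pa = 1.0e9        # 1 GPa
area_m2 = 10 * 1e-4        # 10 cm^2 in m^2
trex_bite_n = 57_000       # one published bite-force estimate (assumption)

force_n = pressure_pa * area_m2    # 1e6 N total force required
bites = force_n / trex_bite_n

print(f"{bites:.1f} T-Rex bites")
```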
> Yeah, why is 1GPa being referred to as "near-room pressure"?
Because it's _very_ close to room pressure compared to other results. For example this team's last result (that's ... very questionable, but ignoring that), needed around 250 GPa.
Maybe a problem keeping it in a pressure vessel, but it might be possible to produce such high pressures at very small scales in solid state devices without use of a pressure vessel. In that case it would be more about changing the shape or bond angle of some nearby material to distort the electric field and induce the desired pressure.
1 GPa is less than half the yield strength of the best steels. All you would need to maintain the superconducting state is to put the material in a steel pipe.
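A thin-wall hoop-stress sketch suggests the pipe idea is at least dimensionally plausible, though the thin-wall formula is only a rough guide at these pressures; the bore radius and steel strength below are illustrative assumptions, not a pressure-vessel design:

```python
# Thin-wall hoop stress: sigma = p * r / t, so the wall thickness needed
# is t = p * r / sigma_allow. Rough guide only: the thin-wall
# approximation degrades once t becomes comparable to r, as it does here.
p = 1.0e9             # internal pressure, 1 GPa
sigma_allow = 2.0e9   # assumed yield strength of a very strong steel, Pa
r = 0.005             # assumed 5 mm bore radius

t = p * r / sigma_allow
print(f"required wall thickness ~ {t * 1000:.1f} mm")
```

So with a steel twice as strong as the contained pressure, the wall only needs to be about half the bore radius, ignoring thick-wall corrections.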
Peer review exists to make sure that the scientific methodology is sound. It doesn't protect from systematic incompetence or malice. So if you can make something sound reasonable, there's no reason they wouldn't publish you, especially for something so spectacular. On the other hand, your downfall will be even harder if the inevitable replication confirms that you screwed up. I can't imagine why this group would risk their entire career, because they already screwed this up once. If they did so again, they're done.
Looking at existing precedent, people who get caught in even the most blatant fraud (e.g. J. H. Schon) end up doing alright. One scapegoat will lose their position and credentials and go on to work in industry for about the same wage they would have expected prior to the "revolutionary work" that was exposed as fraud. Everyone else will not feel any repercussions at all and find positions (e.g. tenure) about as easily as they would have given the prestige of the group prior to being found out.
Was the previous paper retracted due to honest experimental errors or the inability to produce results? Or was it something more concerning? Experimental errors happen to everyone, no need to crush careers over them.
If you look at Pons & Fleischmann, their fraud was so bad that it essentially killed the entire field. You can't publish anything in that area anymore without getting dismissed immediately.
Fascinating. This material alone would be revolutionary if legitimate, although I’m sure there’d be further improvements.
Question for any experts - what’s the relative difficulty of keeping something under sustained high pressure in a piece of hardware vs keeping it very cold?
Our ultra cold usages work decently well. Would it be any easier to keep a hardware component under pressure like what this new material requires?
Keeping things cold is effectively an energy vs space tradeoff.
For example, if you wanted to keep an underground superconducting wire cold, then you would send coolant pipes along it, and wrap it in an insulator. You need to put energy into chilling the coolant, inversely proportional to the thickness of the insulation.
However, typically for most things humans want to do, the cooling cost works out higher than the energy lost to resistance in a non-superconductor, so, apart from a few specialist use cases (MRI machines, particle accelerators), superconductors have seen no use.
The next-best thing to the holy grail is superconductivity above liquid-nitrogen temperature at around normal pressure, with a cheap and easy-to-obtain coolant. It would enable big things like really high voltage intercontinental transmission lines, etc.
We've already got there with YBCO. But just barely, which means useful amounts of current and magnetic field density bring it out of the critical region.
Is there some napkin math available on the net for a transmission line with a nitrogen-cooled high-temperature superconductor (Tc > 90 K) and thick thermal insulation? I mean the energy required per km to keep it cooled below Tc.
Big high voltage transmission lines lose ~200 watts per meter in resistive losses when under full load.
The electrical energy to keep something 1 meter long at liquid nitrogen temperatures is also ~200 watts, assuming 8 inches of insulation.
The resistive losses go down with the square of the power transmitted - so they fall to zero rapidly when not under full load. Cooling losses stay approximately constant.
Therefore, I suspect a liquid nitrogen cooled superconducting cable wouldn't work out financially.
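Under the thread's own numbers the break-even only happens at full load; a quick sketch (both 200 W/m figures are the thread's estimates, not measured values):

```python
# Resistive loss scales with the square of the load; the cooling budget
# is roughly constant. Both 200 W/m figures are the thread's estimates.
full_load_loss_w_per_m = 200.0   # resistive loss at 100% load
cooling_w_per_m = 200.0          # constant cooling cost per meter

for load in (1.00, 0.75, 0.50, 0.25):
    resistive = full_load_loss_w_per_m * load ** 2
    winner = "superconductor" if resistive > cooling_w_per_m else "plain conductor"
    print(f"load {load:4.0%}: resistive {resistive:6.1f} W/m -> {winner} cheaper")
```

At anything below full load the constant cooling cost dominates, which is the commenter's point.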
This is my own ignorance, but what determines the power carrying capacity of a wire besides melting from the resistance? Could you transmit a lot more wattage through that same line when superconducting?
The energy loss (in the sense of where the heat ends up) is at the refrigeration plant. The cable itself extracts heat from the environment. So SC cables make sense for underground cables, where heat buildup is a problem.
I do seem to recall seeing a story about using computers to look at the space of all chemical formulas for conductors and filter/deprioritize the implausible ones. Basically AI driving task queues for humans, hoping to make the queue more effective.
I had a teacher who, back in the '80s or so, ran an experiment where he would drop lots of "good candidate" elements/composites into a melting pot and hope that some superconducting material would come out of it.
He made some important discoveries there, but no "room temperature" superconductor came out of it. I wonder how AI could be used in the "same way", with data and much less effort.
Are we really sure we understand how superconductivity happens? At one point there were theories, materials that didn't match the theories, and new theories to try to explain the anomalies. But I don't know how that all played out in the end.
If we actually understand the process then you could simulate the materials, the way we do fluid dynamics or protein folding. But without that understanding we could end up flagging the eventual winners as unlikely and reduce progress instead of increasing it.
And if we can do that, why haven't we used the same thing for battery chemistries? Give me all of the low-dendrite battery formulas sorted by simulated capacity, weight, longevity and engineering difficulty, please.
I don’t think a little light-hearted pun thread turns HN into Reddit, especially when the title of the article itself is a pun.
Not justification, simply observation: I would hope that anyone reading the title would get the reference but I’ve been surprised before. This thread is an additional clue to the uninitiated that they may have missed something.