Costs of Creation 2: The Verification of LK99

2023/08/16

Tags: verification rambles

This blog now supports annotations! Highlight a passage to comment, and if I'm wrong, I will pay you.

“A little lie can travel halfway ’round the world while Truth is still lacing up her boots.” – attributed to Mark Twain

Recent news about the potential superconductor LK-99, or rather the fact that it isn’t one, is quite timely for me (the Nature article gives a great summary). It’s just another example of the difficulty and importance of verification.

There are many mixed opinions on how the whole process was handled. Hacker News threads, Twitter threads, and scathing YouTube videos abound (by the way, funny how the YouTube videos get released after the claim has been rather solidly debunked, instead of before, to show confidence in their “bad science” call).

I do think there are many positive takeaways from the whole ordeal. One is that tons of people (myself included) gained a little more knowledge about the world of materials science. Two is that, at some level, science works. I am happy that a roughly two-week turnaround was enough to debunk the media storm.

But to me, that’s just the silver lining. The storm clouds are still incredibly massive. Just looking at the pre-print on arXiv, the language used in the abstract is begging to become clickbait. The same thing happened all the time during the height of Covid: researchers would throw something up on arXiv, mostly as a pre-emptive measure to avoid getting scooped, journalists would basically set up Google Alerts for buzzwords, and soon borderline misinformation was going around. The only response is to fight fire with fire: a flood of follow-up work, the huge majority of which is essentially debunks.

I think people learned their lesson a bit more this time with respect to news outlets and science reporting. People are a bit more inoculated against misinformation and better at tempering their expectations. But when clickbaity YouTube videos, attempts to invest on the news, and entire subreddits spring up almost immediately, it’s hard not to be cynical.

I don’t fault any particular person; rather, these episodes highlight the flaws of a system that heavily rewards positive results and creating stuff, combined with how biased humans are towards hype.

Again, sure, one might argue that “science works”, but I think it was fortunate that LK-99 was relatively easy to verify. The authors were reasonably detailed about their methodology in the paper. The potential damage and speculation between the paper’s publication and its verification was rather minimal (at least, I consider ~1 month minimal). But who’s to say this wasn’t just another Theranos situation in the making? What if this had been a private party, not as motivated by academic prestige, with a charismatic CEO who could stall investors and snowball the hype even more, such that by the time “the truth” came out, they had already made off with millions?

I’m sure this will end up on Retraction Watch. Within academia, it definitely feels like the evidence is building toward an unsustainable system, but then again, people have been complaining since 1903.

Beyond academia, this goes with the general point I make in my other post: it costs money to do things, and it possibly costs even more to verify those things. Without that verification, the cost of mishaps and of undoing the damage of being wrong may be too much.

Case Study: Intel and FDIV

No field knows the need for verification better than hardware. When tapeouts cost millions of dollars and bugs aren’t patchable like software, you had better get it right. People really trust their hardware, and finding a hardware bug in the wild is rarer than finding a compiler bug, which is already pretty rare.

The Intel FDIV bug is where I’d argue verification became real. A handful of missing entries in the lookup table of the Pentium’s floating-point division unit made certain operand pairs return slightly wrong quotients. The resulting recall cost Intel $475 million in 1994 dollars (almost $1 billion in 2023, thanks to inflation), constituting more than 20% of its net income at the time.
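As a quick illustration, here’s a minimal sketch of the classic sanity check that circulated at the time, shown in Python for convenience (back then people ran it in spreadsheets and C programs): divide, multiply back, and look at what’s left over.

```python
# Classic FDIV sanity check: divide, multiply back, compare.
x = 4195835.0
y = 3145727.0

residual = x - (x / y) * y
print(residual)

# On a correct FPU this prints 0.0 (give or take a rounding error).
# A flawed Pentium computed the quotient wrong from about the fourth
# decimal digit onward, yielding a residual of 256.
```

On affected chips, the quotient came out as roughly 1.33374 instead of 1.33382, an error of about one part in ten thousand, which is more than enough to wreck a numerical workload.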

Many people cite this incident as what pushed the hardware industry to start caring more about testing. The funny thing is, Intel gained insane market share in the 90s, as the Pentium was, by and large, a resounding success from a business standpoint. Their stock had only a little blip in 1995, despite almost 30% YoY revenue growth. But even with that insane top-line growth for an already-giant company, the bottom-line costs were scary. The cost of a failure in hardware (bad PR aside) is potentially company-ruining.

Quote from the article (emphasis mine):

We dramatically improved our validation methodology to quickly capture and fix errata, and investigated innovative ways to design products that are error-free right from the beginning. We set up permanent phone support teams and web-based discussion groups to listen to and respond to consumer needs. We found we could shorten our response time from days to minutes on urgent matters.

The quote touches on aspects of organizational diligence as well. On the engineering side, feedback is of course critical, but short loops and clear communication with customers matter just as much.¹ Having a framework, and not just a technical one, is critical to keeping costs sane.

These frameworks, though, are real money-making machines. EDA vendors Synopsys and Cadence dominate the space, and for good reason (though, as anyone in the industry will know, they definitely have some annoyances too).


Like I said in the last part, there’s a great deal of money to be made in providing validation tools, because verification is only a growing problem.


  1. Clearly Intel has been so successful at concise explanation that their software optimization manual runs to 10 volumes and 12,000 pages.