

I simply pointed out that nobody expects LLMs to validate the solutions they come up with on their own, or trusts them to arrive at a correct solution independently.
I want to point you toward (gestures broadly at everything) the way LLMs are being sold as a panacea while things like hallucinations and overconfidence are minimized. The AI industry is claiming these tools are trustworthy and a way to remove the human from the loop.
You may understand that an LLM is a starting point, not an end, but that is not how most people are being sold on them, and that is dangerous. Articles like the one you posted are problematic: they downplay the mistakes the model made (“Gemini made some minor numerical errors…”) while suggesting it made a novel discovery about the source material. How many people read just the headline, or even the whole article, and now assume the data presented is fact, when it is an unconfirmed opinion at best and made up at worst?
LLMs are really good at producing “smart-sounding” text that reads like someone intelligent wrote it, but we have plenty of examples where smart-sounding and factual are two circles in a Venn diagram that don’t overlap.

You may not realize that AI evangelism extends beyond the scientific community. Articles like this one, claiming AI has done something amazing, something humans have been unable to do, are propaganda.
Peer review is great. But the average reader of that article assumes peer review is already complete by the time they read it, even when it isn’t. The takeaway for them is not “this annotation was likely made by a German scribe at XYZ date”; the takeaway is “Gemini figured out something that stumped human researchers.”
And I dispute the original premise: the notes are not inscrutable, and I doubt no human has ever translated the numerals or analyzed the script to estimate their age. It just wasn’t important enough to write an article about until it made AI look good.
The article is propaganda. It’s neat, but if you read it as “look at this cool thing LLMs can do…” then you fell for it.