
Hallucinations in code are the least dangerous form of LLM mistakes

The moment you run LLM-generated code, any hallucinated methods will be instantly obvious: you’ll get an error. You can fix that yourself, or you can feed the error back into the LLM and watch it correct itself.

Compare this to hallucinations in regular prose, where you need a critical eye, strong intuitions and well-developed fact-checking skills to avoid sharing information that’s incorrect and directly harmful to your reputation.

With code you get a powerful form of fact checking for free. Run the code, see if it works.
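The feedback loop is immediate. A minimal Python sketch of the idea (the `slugify` method here is invented, standing in for the kind of plausible-but-nonexistent API an LLM might hallucinate):

```python
# "slugify" is not a real str method, so the hallucination
# surfaces as an error the instant the code runs.
try:
    "Hello World".slugify()
except AttributeError as e:
    # This error message is exactly what you would paste back
    # into the LLM to get a corrected version.
    print(f"Caught hallucination: {e}")
```

Running it fails on the very first call, which is the point: no careful reading required, the interpreter does the fact-checking for you.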

Living In A Lucid Dream

I love the way that Claire L. Evans writes.

Why isn’t the internet more fun and weird?

On the internet of 2006, consumer products let anyone edit CSS. It was a beautiful mess. As the internet grew up, consumer products stopped trusting their users, and the internet lost its soul.

The internet of 2019 is vital societal infrastructure. We depend on it to keep in touch with family, to pay for things, and so much more.

Just because it got serious doesn’t mean it can’t be fun and weird.

Cooper Journal: One free interaction

Small interactions that serve no useful purpose but are nonetheless satisfying. "Design this interaction such that: it's 'free,' i.e. having no significance to the task or content; it's discoverable in ordinary use of the product; it's quick and repeatable (less than half a second); it's pleasant."

Cruciforum: crucially simple

A super simple lightweight piece of forum software from Stuart in just one PHP file. Drop it in a directory and you're done.