How AI Iterates


Or, when AI tries again.

As coders, we try many solutions. Sometimes it's because we're stupid and haven't learned any better yet. Sometimes it's because we're not happy with the result yet. Maybe something gnaws at us. Maybe the thrill of betterment invites us. We care about certain principles related to our craft. We have taste and values. At our best, we seek the true and beautiful.

How many times does AI iterate?

Well, it's really fast. It can "try" lots of things at the speed of a computer. Its reply to your query isn't its "first" solution. Before you can think, it has "thought" -- many times -- and started streaming the bytes back.

But does that mean that it iterates?

Not from v1. The operator could push the AI further. Again, he types more details or different parameters into the nice little gray-bordered box. The AI won't do that by itself. What kind of person would do that?

An "LLM kiddie" (like a script kiddie) wouldn't. If the thing that plopped out of the vending machine works, he'll use it. He's only browsing scripts long enough to download one and move on to the lolz.

Would AI push for more simplicity?

A push in a direction or for a reason is a certain kind of iteration. Simplicity isn't a part of an LLM's programming. AI's overriding principle seems to be "most likely to not be wrong" -- the statistical average of words, based on input.

Does an LLM kiddie have programming principles? No. "Get it done, get paid," is his motto.

Is a creative person needed, then, once an AI gets involved?

Only if you care about the kind of thing that comes out of the AI.

Is it categorically what you want? Is it of sufficient quality? Does it integrate well with your other assets and vision? Etc.

If all you care about is whether it works, you just need someone who knows how to ask AI questions in a way that makes the output even less likely to be wrong.

And how will you know if it works? You'll copy-paste the code and try it out. Or you'll copy-paste the AI test suite and save it so it runs in CI.

There is no vetting here. There are no value judgments made on the code produced. There is no taste. There is a very narrow creativity. It all fits into that gray input box. It is utilitarian and brutalist.

We're not even talking about creating something fresh that the AI (or the world) hasn't seen.

But do we ever produce anything really original?

What about "nothing new under the sun"? Fair, I guess. We rehash stuff. And we can be surprised to learn we're not as original as we think. When it comes down to it, if we're uninspired, without a vision of what we really want, then it doesn't matter how it's done, as long as it's done.

But what about "where there is no vision, the people perish"? A coder turned LLM kiddie will lose his principles, taste and creativity. The act of "coding" seems likely to become a dull existence.

I think it's possible for a creative coder to keep and grow in his creativity using AI, but the temptation toward his baser tendencies is strong, and it is being made stronger.

An AI will not produce art. It produces factory code. Sure, this factory is more flexible than most widget factories. This one takes average-color, average-consistency, average-strength grey goo and just 3D-prints something approximating your desire.

An LLM kiddie will not be a sculptor of software. He will be a factory widget inspector, dizzied and drained by the conveyor belts.