We spent 18 months hearing about how Generative AI was going to “10x” coding, improving programmer productivity by a factor of 10.
The data are coming in – and it’s not.
§
Influencers have been endlessly posting stuff like this:
Here’s the hypiest AI Influencer of them all, reasoning by analogy with putative coding gains:
§
Data, though, is where hype goes to die. Two recent studies show nothing like “10x” improvement:
-
Yet another, earlier study (with earlier models) showed evidence of users writing less secure code, which of course could lead to a net loss of productivity long term.
No individual study is perfect. More will come. But to paraphrase from memory one of my childhood heroes, the baseball sabermetrician Bill James: if an elephant has passed through the snow, we ought to be able to see its tracks. The tracks here point to modest improvements, with some potential costs in security and technical debt, not 10x improvement.
A good IDE (versus none) is probably a much bigger, much less expensive, much less hyped improvement that helps more people more reliably.
§
Why aren’t we seeing huge gains? In many ways, AI researcher Francois Chollet nailed this years ago, long before GenAI became well known:
10x-ing requires profound conceptual understanding – exactly what GenAI lacks. Writing lines of code without that understanding can only help so much.
Use it to type faster, not as a substitute for clear thinking about algorithms + data structures.
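To make that concrete, here’s a toy sketch (mine, not from the post or either study): a one-line conceptual choice of data structure, using a set instead of a list for membership tests, buys far more than any plausible speedup in how fast the lines get typed.

```python
# Toy illustration (not from the original post): a conceptual choice of data
# structure dwarfs any speedup from merely typing code faster.
import timeit

items = list(range(100_000))
as_list = items           # membership test is O(n)
as_set = set(items)       # membership test is O(1) on average

lookups = [99_999] * 100  # worst case for the list: the last element

t_list = timeit.timeit(lambda: [x in as_list for x in lookups], number=10)
t_set = timeit.timeit(lambda: [x in as_set for x in lookups], number=10)

print(f"list membership: {t_list:.4f}s  set membership: {t_set:.4f}s")
# Typical result: the set version is orders of magnitude faster, a gain that
# comes from clarity about data structures, not from generating lines faster.
```

The point is not the benchmark itself; it’s that the big wins come from knowing which structure to reach for, which is exactly the understanding Chollet is pointing at.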
And, as ever, don’t believe the hype.
Gary Marcus has been coding since he was 8 years old, and felt very self-important when his undergrad professor said he coded an order of magnitude faster than his peers. Just as Chollet might have predicted, most of the gain came from clarity: about task, algorithms, and data structures.