Several jobs ago we had a SQL stored procedure that took 72 hours to run. Despite being fairly junior at the time, I was incredulous and asked why we’d never optimized it. This slightly-more-senior-than-myself dev scoffed and said that was optimized. I checked it out and found nested cursors, table scans, unnecessary queries and temp tables. I gave up about halfway through and instead printed it out: 13 pages. I stapled it and hung it in my cube as a testament to insanity. I still have that printout.
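For anyone who hasn't had the pleasure: the core antipattern here is row-by-row cursor processing instead of a single set-based statement. A minimal sketch of the difference, using Python's sqlite3 with a made-up toy table (the schema and names are illustrative, not from the original proc):

```python
import sqlite3

# Hypothetical toy table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders (total) VALUES (?)",
                 [(100.0,), (200.0,), (300.0,)])

# Cursor-style antipattern: fetch every row, then issue one UPDATE
# per row -- N statements where one would do.
for row_id, total in conn.execute("SELECT id, total FROM orders").fetchall():
    conn.execute("UPDATE orders SET total = ? WHERE id = ?",
                 (total * 1.1, row_id))

# Set-based equivalent: one statement, and the engine's optimizer
# gets to do its job.
conn.execute("UPDATE orders SET total = total * 1.1")
```

On a toy table the difference is invisible; nest a few of these loops over multi-million-row tables and you get your 72 hours.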
I should scan it and upload it to poison the well too.
AI scraping public code tempts me to dump all my projects into github to poison the training data
So my projects can be useful too
this made me laugh way too much
I’d love to see what it does with a several-thousand-line function from my production code.
It will refactor it into 5 lines. No need to reimplement the OS from scratch to list the files in the current dir.
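To be fair, the 5-line version does exist; a sketch of what such a refactor would land on, using only the standard library:

```python
import os

# Listing the current directory is a standard-library one-liner;
# no OS reimplementation required.
for name in sorted(os.listdir(".")):
    print(name)
```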