Amazon’s ecommerce business has summoned a large group of engineers to a meeting on Tuesday for a “deep dive” into a spate of outages, including incidents tied to the use of AI coding tools.
The online retail giant said there had been a “trend of incidents” in recent months, characterized by a “high blast radius” and “Gen-AI assisted changes” among other factors, according to a briefing note for the meeting seen by the FT.
Under “contributing factors” the note included “novel GenAI usage for which best practices and safeguards are not yet fully established.”
Junior and mid-level engineers will now require more senior engineers to sign off any AI-assisted changes, Treadwell added.
So instead of getting a human to write it and AI peer-reviewing it, you want the most expensive per-hour developers to look at stuff a human didn't write and the other engineers can't explain? Yeah, this is where the efficiency gains disappear.
I read stuff from one of my juniors all the time and most of it is made with AI. I don’t understand most of it and neither does the dev. He keeps saying how much he’s learned from AI, but pair programming with him is the pits. I try to say stuff like, “Oops! Looks like we forgot the packages.” And then 10 secs of silence later, “So you can go to line 24 and type…”
“Everyone must use AI.”
…
“No! Not like that!”
…
Do the senior engineers NOT sign off on changes to systems that can take down the production servers? Even if we take out the LLM-created code, this sounds like a bigger problem.
the way private companies work is that they require their employees to produce more than is reasonable given the work quality that is expected.
when this discrepancy is pointed out, it’s handwaved away. when the discrepancy results in problems, as it most obviously will, somebody is found to place the blame on.
it’s not the developers’ fault. it’s a management decision.
source: I’m talking out of my ass. I’m just a salty employee who is seeing this happen at their own workplace, when it didn’t use to, at least not to this level
We may start to see people realize that “have the AI generate slop, humans will catch the mistakes” actually is different from “have humans generate robust code.”
Not only that, but writing code is so much easier than understanding code you didn’t write. Seems like either you need to be able to trust the AI code, or you’re probably better off writing it yourself. Maybe there’s some simple yet tedious stuff, but it has to be simple enough to understand and verify faster than you could write it. Or maybe run code through AI to check for bugs and check out any bugs it finds…
I definitely have trusted AI to write miniature pointless little projects - like a little PHP page that loaded music for the current directory and showed a simple JS player in a webpage so I could share Christmas music with my family and friends. No database, no file uploading or anything. It worked decently, although not perfectly, and that’s all it needed to do.
I guarantee there’s so much pressure on those engineers to deliver code that they rubber-stamp a ton of it with the intention of “fixing it later.”
Source: I’ve worked in software for 20+ years and know a lot of folks working for and who have worked for Amazon
That’s basically the story at all the big tech companies, from what I’ve heard. In my time at Facebook, I felt like the only person who actually read the merge requests that people sent me before hitting them with “LGTM.”
If companies are going to place increasing reliance on review because submissions are lower quality, then they should probably evaluate employees with review quality weighted in (say, the rate of bugs subsequently discovered in commits they reviewed, or something like that).
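A minimal sketch of what that metric could look like, assuming you can somehow join later bug reports back to the commits that introduced them (all names and data here are hypothetical, not Amazon's actual process):

```python
from collections import defaultdict

def review_quality(reviews, buggy_commits):
    """Fraction of each reviewer's approved commits later linked to a bug.

    reviews: list of (reviewer, commit_id) pairs
    buggy_commits: set of commit_ids subsequently tied to a bug
    Returns {reviewer: escaped-bug rate}, lower is better.
    """
    reviewed = defaultdict(int)   # commits approved per reviewer
    escaped = defaultdict(int)    # of those, how many turned out buggy
    for reviewer, commit in reviews:
        reviewed[reviewer] += 1
        if commit in buggy_commits:
            escaped[reviewer] += 1
    return {r: escaped[r] / reviewed[r] for r in reviewed}

# Toy example: alice approved c1 and c2, bob approved c3 and c4,
# and c2 was later found to contain a bug.
reviews = [("alice", "c1"), ("alice", "c2"), ("bob", "c3"), ("bob", "c4")]
rates = review_quality(reviews, buggy_commits={"c2"})
# → {"alice": 0.5, "bob": 0.0}
```

The hard part in practice isn't this arithmetic, it's the attribution step (deciding which commit a bug "belongs" to), and a raw rate like this is easy to game by only reviewing trivial changes.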
When I worked there, 20% of the work we did had to go through a senior engineer. And getting his time was like pulling teeth.
Most of the time he would just nitpick grammar in docs and then finally rubber-stamp the work. It was awful.
Well that’s going to make your senior developers quit.
I am not a developer, but:
I told the owner of the company recently that, and I quote, “I will fucking kill myself if my job becomes reviewing AI output”
Never kill yourself for something that’s somebody else’s fault.
Exactly. If you’re too stupid or lazy to adequately vet what your LLM puts out yourself, it shouldn’t be somebody else’s job to wade through the sewage you’re producing. You either shouldn’t be using one or, if you can’t do your job without it, you shouldn’t have that job.
—Someone who doesn’t use genAI but has spent way too much time digging through LLM slop
You know what my favorite pizza topping is? Bleach.
Domino’s REFUSES to put bleach on my pizza, so I gotta do it myself. I found out about it from AI. Now my pizza tastes great! The downside is having to go to the hospital to get a stomach pump every time.
Bleach Boys - https://open.spotify.com/track/0o6zZmPn5a3FJMNjINjZIB
I mean honestly, yeah. I’m not going to waste my time with some junior developer who can’t explain how the code works or how it interacts with whatever framework I’m working on. I ain’t got time for that nonsense, especially when I deal with safety-critical code.
Honestly if my work ever decided to allow unfettered AI code generation into my code base, I would immediately look for a new job at that point.
it’s pretty fucking stark, right? these are the devs that stayed after management mandated they USE the shit in the first place, and now they want the same devs to become responsible for what the shit does to their codebases.
“You are now a senior auditor.”
You’re absolutely right!
It’s going to make senior devs get fired, surely?
They either refuse to sign off when the boss wants them to and get fired, or they sign off and get fired when the AI code they approved causes issues.
How can they fire them?
They’re not employing them to look pretty you know.
Bingo.
Maybe not outright fired, but absolutely open them up to career limits based on what you described.
All of Amazon’s code undergoes code reviews already. Accepting a PR is already spiritually a sign off.
This is just explicitly a threat, explicitly trying to find someone to hold accountable because you can’t hold ai accountable. What are they gonna do, fire the ai? Sign here to be the fall guy. Fuck off.
The way AI is being pushed onto workers on a global scale has to be the dumbest thing to ever happen in the workplace. Executives are getting hysterical over something they don’t even try to understand, and governments shower companies in subsidies if they do anything with AI. Of course the only results so far are mass layoffs and exploding costs for energy and hardware. All the while, economies are crumbling everywhere, because of course they do when mass unemployment sweeps around the globe. And again, governments everywhere are subsidizing this crap with taxpayer money. What’s even worse than all of that is the insane environmental damage all of this causes. But I’ll have to cut myself short here because I’m just getting increasingly upset.
I guess what I’m trying to say is: we’re funding our own decline at rapid speed. Human stupidity has found a new peak in 2026, and it’s not even close. I knew the way AI was advertised was completely overblown years ago, but I never anticipated it would get this bad this quickly.
xD
Guess that all-in-on-AI attitude was not such a bold and brilliant idea after all.
Something similar just happened to McKinsey, too.
Aren’t their names already on the commits? Or is the AI given write access to their code repository?
I think you already know the answer to that.
They want to move fast and break things, but they still want a few meat bags around to blame when things inevitably blow up in their faces.
They don’t want to break the things that hurt their bottom line.
Break society and make everyone hate each other? Hey buddy - small price to pay. Gotta think of the shareholders, after all. (╯°□°)╯︵ ┻━┻
LOL, so they can blame and fire SOMEONE.
Hard no.
I wonder how well that is received by their staff :D
I hear it’s been a stressful time. I’m in Seattle. Practically everyone knows someone or other who works at AWS.