I have a boss who tells us weekly that everything we do should start with AI. Researching? Ask ChatGPT first. Writing an email or a document? Get ChatGPT to do it.
They send me documents they “put together” that are clearly ChatGPT-generated, with no shame. They tell us that if we aren’t doing these things, our careers will be dead. And their boss is bought into AI just as much, and so on.
I feel like I am living in a nightmare.
My company added an AI chatbot to our website, but beyond that we are generally anti-AI.
Disclaimer: I only started working at this company about three weeks ago, so this info may not be as accurate as I currently think it is.
I work in quality management and recently asked my boss what the current stance on AI is, since he mentioned quite early that he and his colleagues sometimes use ChatGPT and Copilot in conjunction to write up some text for process descriptions or info pages. They use it in research tasks, or, for example, to summarize large documents like government regulations, and they very often use it to rephrase texts when they can’t think of a good way to word something. From his explanation, the company consensus seems to be that everyone has access to Copilot via our computers and if someone has, for example, a Kagi or Gemini or whatever subscription, we are absolutely allowed and encouraged to utilize it to its full potential.
The only rules seem to be: never blindly trust the AI output, and never feed it company-sensitive information (or that of our suppliers/customers).
We had a discussion about AI at work. Our consensus was that it doesn’t matter how you want to do your work. What matters is the result, not the process. Are you writing clean code and finishing tasks on time? That’s the metric. How you get there is up to you.
I work in social work; I would say about 60 percent of what I do is paperwork. My agency has told us not to use LLMs, as that would be a massive HIPAA nightmare. That being said, we use “secure” corporate email. It runs on the Microsoft 365 Office suite, which is Copilot-enabled. That means TL;DR summaries at the top before you even look at the email, predictive text… and not much else.
Would I love a bot that could spit out a plan based on my notes or specifications? Absolutely. Do I trust them not to make shit up? Absolutely not.
Apparently a hospital in my network is trialing a tool that generates assessment flowsheets from an audio recording of a nurse talking aloud while doing a head-to-toe assessment. So if they say you’ve got a little swelling in your legs, it’ll mark down bilateral edema under the peripheral vascular section. You have to review it before submitting, but it seems nice.
You’re right, that does seem very nice.
Everyone in my office hates it, including my director, who occasionally goes on a rant because Microsoft and Adobe keep pushing their AI on us when we don’t want it.
Sometimes I’m very thankful I work for a nonprofit. They’re still p shitty to us employees, but our focus is first and foremost on doing the job right, something AI has no chance at.
My boss loves it and keeps suggesting we try it. Luckily, there isn’t much use for it in our line of work.
I’m in an environment with various levels of sensitive data, including very sensitive data. Think GDPR-type stuff you really don’t want to accidentally leak.
One day when we started up our company laptops, Copilot was just installed and auto-launched on startup. Nobody warned us. No indication of how to use it, or whether to use it at all. That lasted about three months. Eventually they limited some ways of using it and gave a little guidance on not putting the most sensitive data in there. Then they enabled Copilot in most of the apps we use to actually process the sensitive data. It’s in everything. We are actively encouraged to learn more about it and use it.
I overheard a colleague recently trying to get it to create a whole PowerPoint presentation. From what I heard, the results were less than ideal.
The scary thing is that I’m in a notoriously risk averse industry. Yet they do this. It’s… a choice.
They just hopped onto the bandwagon pushing for Copilot and SharePoint. Just in time, as some states are switching to open source.
I’m a consultant, so I’m doing a lot of different things day to day. We use the Copilot facilitator to track meetings, generate recaps, and capture next steps. It is pretty helpful in that regard and often matches the tasks I write for myself during the meeting.
I also have to support a wide range of different systems, and I can’t be an expert in all of them, so it is helpful for generating short scripts and workflows, whether it’s PowerShell one day, Bash the next, Exchange management, etc. I do know PowerShell and Bash scripting decently well, and the scripts often need to be fixed, but it is good at generating templates and starter scripts I flesh out as the need arises. At this point I’ve collected many of the useful ones in my repos and reuse them pretty often.
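For what it’s worth, most of what I keep around are just skeletons along these lines (a made-up example, the service names are placeholders I swap out per client):

# Starter skeleton: report any Windows services from a list that aren't running.
param(
    [string[]]$Services = @('Spooler', 'W32Time')  # placeholder service names
)

foreach ($name in $Services) {
    try {
        # -ErrorAction Stop so a missing service lands in the catch block
        $svc = Get-Service -Name $name -ErrorAction Stop
        if ($svc.Status -ne 'Running') {
            Write-Warning "$name is $($svc.Status)"
        }
        else {
            Write-Output "$name is running"
        }
    }
    catch {
        Write-Warning "Service '$name' not found"
    }
}

Nothing clever, but it’s a lot faster to flesh out something like that than to start from a blank file.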
Lastly, one of the companies I consult for uses machine learning to design medical implants and to develop and test novel materials and designs. That is pretty cool, and I don’t think they could do some of the stuff they’re doing without machine learning. While still AI, it isn’t really GPT-style generative AI though, so I’m not sure if that is what you’re asking.
The order is:
Use whatever tool that isn’t malicious and doesn’t attack customer data.

Most people use (IMO) way too much AI. The first result (the Google AI answer) is trusted and done. No research done beyond that.

I purposefully blocked the AI answer in uBlock. I don’t want any of that.
Besides that, I use it on occasion to look for a word or reword my search query if I can’t find or don’t know what I am looking for.
Very useful for the “What was the name of X again? It does Y and Z” queries.
Also for PowerShell scripting, because it can give me examples of how to use it. But every answer is double- and triple-checked for accuracy. Seen too much news about made-up answers.

At home I usually only use it for bash scripting, because I can’t be bothered to learn that.
Our company is forcing us to do everything with AI. Hell, they developed a “tool” that generates simple apps with AI for our customers to use in their enterprise applications, and we are forced to generate a minimum of two a week to “experiment” with the constant new features being added by the dev teams behind it (but we’re basically training it).
The director uses AI spy bots to tell him who read and who didn’t read his emails.
Can’t even commit code to our corporate GitHub unless Copilot gives it the thumbs up, and it’s constantly nitpicking things like how we wrote our comments and asking us to replace pronouns or word them a different way, which I always reply to with “no”, because the code is what matters here.
We are told to be positive about AI and to push the use of AI into every facet of our work lives, but I just feel my career as a developer ending because of AI. We’re a bespoke software company, and our customers are starting to ask if they should use AI to build their software instead of paying us, which I then have to spend hours explaining to them why it would be a disaster due to the sheer complexity of what they want to build.
Most, if not all, executives I talk to are idiots who don’t understand web development; shit, some don’t even understand the basics of technology, but they think they can design an app.
After being a senior dev and writing code for 15 years I’m starting to look at other careers to switch to… Maybe becoming an AI evangelist? I hear companies are desperately looking for them… Lol, what a fucking disaster this shit is becoming.
That work environment sounds like hell. Literally. If I woke up one day and had to work like this, I would think I never woke up at all and Lucifer finally started torturing me.
AI is ruining my ability to think and sucks the fun out of writing code. I am so happy our boss doesn’t force us to use it.
I am very, very concerned at how widely it is used by my superiors.
We have an AI committee. When ChatGPT went down, I overheard people freaking out about it. When our paid subscription had a glitch, IT sent out emails very quickly to let them know they were working to resolve it ASAP.
It’s a bit upsetting because many of them are using it to basically automate their jobs (writing reports and emails). I do a lot of work to ensure that our data, which comes from manual entry by a lot of people, is accurate… and they just toss it into an LLM to convert it into an email… and they make like 30k more than me.
The head of my agency is a gullible rube who is terrified of being “left behind”, and the head of my department is a grown-up with a family and a career who spends his days off sending AI videos and memes into the work chat.
I’ve been called into meetings and told I have to be positive about AI. I’ve been told to stop coding and generate (very important) things with AI.
It’s disheartening. My career is over, because I have no interest in generating mountains of no-intention code rather than putting in the effort to build reliable, good, useful things for our clients. AI dorks can’t fathom human effort and hard work being important.
I’m working to pay off my debts, and then I’m done. I strongly want to get a job that allows me to be offline.
It almost sounds like we’re both at the same company.
I work for a small advertising agency as a web developer. I’d say it’s mixed. The writing team is pissed about AI, because of the SEO-optimized slop garbage that is ruining enjoyable articles on the internet. The video team enjoys it, because it’s really easy to generate good (enough) looking VFX with it. I use it rarely. Mostly for mundane tasks and boilerplate code. I enjoy using my actual brain to solve coding problems.
Customers don’t have a fucking clue, of course. If we told them that they need AI for some stupid reason, they would probably believe us.
The boss is letting us decide and not forcing anything upon us. If we believe our work is done better with it, we can go for it, but we don’t have to. Good boss.
How does one AI for SFX?
VFX, not SFX. In our company, the team shoots real-life videos and then puts effects on top. The most recent project I saw was a movie for a manufacturer of paper colors. The artists made a big tower in one of their factories explode into a wave of paint, it looked pretty (but it was only a few seconds long).
Wrote the wrong abbr. :p
Sounds cool!
it doesn’t exist. but i work for a company that does real work. it doesn’t bullshit.
what does your company do?