It doesn’t need to be good to replace jobs, as long as there are no consequences for the people making those decisions.
I’ve lost count of how many “oops, it was AI’s fault, not my fault!” stories I’ve heard, even within highly regulated fields. Like, lawyers submitting documents with completely fake citations, and then…no real consequences. Seems to me like that should be cause for immediate disbarment, but no, apparently not.
The lack of consequences has been a problem for quite a while now, from before LLMs. In my opinion it’s been caused by a widespread increase in professional incompetence, together with a mutually protective network of incompetent people. “I won’t point out that you’re incompetent and won’t blame you for your mistakes, if you do me the same favour”.
They call it “imposter syndrome”, but it isn’t a syndrome: it’s a symptom.
I’m a licensed professional engineer and was working on evaluating some claims for a client. When I was done, my results were sent to the PR arm of the company. The PR people tried to use Copilot to redo my work and draft the claims language. Their results were full of hallucinated errors with no supporting evidence. The project manager was not happy.
Indeed: Everything was already AI
This has been a very long project — separating conduct from consequences, in order to maximize profit. AI is just a breakthrough tool for doing it.
This roughly mirrors my experience in corporate America.