I tried it and it was interesting but I don’t think people are out of work yet.
My main problem is that it lacks the ability to truly understand what it is saying. It may look correct on the surface, but often when I fact-checked it, the details were wrong. It is confidently incorrect, which isn't a benefit for me.
I have had people tell me that they used it to do part of their jobs and then rewrote and reorganized what it said. Did that save time? Perhaps, but in my opinion the flow is always compromised.
One of the tests I did was asking it to rewrite my resume, and it did a terrible job. I was so unimpressed that I stopped it while it was working on the answer. I think it makes the most sense for facts that are not in question. If something needs a nuanced approach, I think it fails and isn't useful.
I wouldn't trust any summary that it does. I asked it to summarize things that I had read and understood, and the summaries were poor. There is an essential philosophical element of what humans do that machines just can't capture. You might also call it spiritual, but I don't want to introduce things that I can't define well. ChatGPT's writing is lacking, and its attempts at humor are pathetic.
Is it a useful tool for bad writers? Sure. It gives you an outline that you may not be able to come up with yourself. Can an intermediate writer do better? Yes! As for classic literature, I doubt a machine will ever reach those heights. ChatGPT can imitate the form of a human being, but it can't illuminate the human spirit.