
It turns out the Copilot Prompt Coach isn't perfect.
I tried to use it today to refine a complex query, and it suggested something. Great! I tried the suggestion, and it didn't do what I asked. In fact, I had to tell it a second time to output to an Excel file, and it's still thinking about it now.
If the prompts aren't getting you the results you need, then there's some mystery factor in between. Yes, this is new technology, but at the very minimum it should do what it's told, or explain why it can't. Instead, it just ignores instructions and leaves you wondering why.
Even if this eventually gives me an Excel output, I have no faith in its contents. I am so frustrated by Copilot's inability to be trustworthy. It should be bending over backwards to double-check its answers so people can trust it.
If you want to go bankrupt, trust Copilot/ChatGPT implicitly.