I have been using Google Search Console every day for the past 3 months and it is so inconsistent.
The most glaring inconsistency: a page sits in a category that says it is not available, but when you try to remove it, GSC tells you it is online. You check, and sure enough, it is online. So why was it repeatedly classified as unreachable? And why did the live test throw an error before you attempted the removal? It feels as though Google built this tool and then just ignored it.

Every day I try to accomplish two things in GSC: get new articles indexed that Google has not picked up on its own, and get pages I have deleted from the site removed from the SERP results. You would think this would be straightforward. It is not. To add a new page I have to request indexing for each URL individually, and each request can take 15 seconds. There is a limit of 10 requests per day, so if you have a huge site, this is a terrible way to try to get it indexed.
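As far as I can tell, the Request Indexing button itself has no general-purpose API (the separate Indexing API is officially limited to job-posting and livestream pages), but the checking half of this chore can be scripted with the URL Inspection API. Here is a minimal sketch, assuming a Google Cloud service account that has been added as a user on the GSC property, with its key stored in service-account.json; the property and article URLs below are placeholders.

```python
# Sketch: check index status for a batch of URLs via the URL Inspection API.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://example.com/"  # the verified GSC property (placeholder)
urls = [
    "https://example.com/articles/new-post-1",
    "https://example.com/articles/new-post-2",
]

for url in urls:
    result = (
        service.urlInspection()
        .index()
        .inspect(body={"inspectionUrl": url, "siteUrl": SITE})
        .execute()
    )
    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState is a human-readable string such as
    # "Submitted and indexed" or "Discovered - currently not indexed".
    print(url, "->", status.get("coverageState"))
```

It does not request anything, but at least it tells you which new articles are still sitting unindexed without clicking through the UI one URL at a time.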
I have already mentioned that I have a sitemap.xml file, which in theory should get these pages indexed, but it does not always. It is not a lack of traffic either: these articles get traffic from LinkedIn, and I see that traffic in Google Analytics. Google knows they are popular and being read, but apparently does not consider them important enough to index.
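For what it's worth, the sitemap side can also be poked programmatically: the Search Console API lets you resubmit a sitemap and read back what Google last reported about it. A minimal sketch, with the same assumed service-account setup and placeholder URLs:

```python
# Sketch: resubmit a sitemap and read back Google's stored status for it.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://example.com/"              # placeholder property
SITEMAP = "https://example.com/sitemap.xml"  # placeholder sitemap URL

# Ask Google to (re)fetch the sitemap.
service.sitemaps().submit(siteUrl=SITE, feedpath=SITEMAP).execute()

# Read back the stored status: last download time, errors, warnings.
info = service.sitemaps().get(siteUrl=SITE, feedpath=SITEMAP).execute()
print("last downloaded:", info.get("lastDownloaded"))
print("errors:", info.get("errors"), "warnings:", info.get("warnings"))
```

It won't force anything into the index, but it at least confirms Google has fetched the file and found nothing wrong with it.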
Yet Google happily indexes things I never asked it to and that make no sense to index, like parts of how security apps work, or temporary staging sites. I understand why it does that, but it isn't very smart.
Now, you would think deleting a page would be straightforward. No. You have to request removals manually, one URL at a time. If you ask for too many at once, they all get ignored; I tried, and my requests were silently dropped. And once you have requested a removal, GSC yells at you if you request it again, even while it still shows the page as existing. So either the page exists and is a dead link, or it doesn't exist and doesn't need to be removed. Make up your mind, Google, and tell me!
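The Removals tool has no public API that I'm aware of, so the requests themselves stay manual, but you can at least script the sanity check of what each old URL actually returns before you bother. A quick sketch with the requests library; the URL list is a placeholder:

```python
# Sketch: check the live HTTP status of URLs you think are gone.
import requests

old_urls = [
    "https://example.com/articles/deleted-post-1",
    "https://example.com/articles/deleted-post-2",
]

for url in old_urls:
    try:
        # Some servers mishandle HEAD; switch to requests.get if results look odd.
        resp = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR {url}: {exc}")
```

A 404 or 410 means the page really is gone and should eventually drop out on its own; a 200 means it is still live, which would explain why GSC insists the page exists.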
In the past 3 months, by logging in daily and grinding through this, I have grown the number of my pages in SERP results to 1,000 URLs. It has also taught me that GSC is poorly designed and not trustworthy. If you feel frustrated with it, know that many others are frustrated as well. Google should automatically index the pages listed in a sitemap.xml and, if it can't, explain why they can't be added. It should automatically drop references that have been unavailable for 3 months instead of asking site owners to do this garbage cleanup, which you can't even do properly because Google itself doesn't know what is going on with the page.
I'm not mad; I just find it unprofessional that when a company is a monopoly, it doesn't feel the need to build a good product.