Google’s John Mueller said that the quota for Google Search Console’s URL Inspection tool, which lets you submit a page to be crawled, will probably not be increased anytime soon. He said on Mastodon, “given how much junk we get submitted there, I don’t see us increasing those limits.”
David Iwanow asked, “the GSC team has to lift the quotas; this was from 7 different domains and 20-30 requests to re-index several websites that were completely blocking Google.”
John replied, “Usually our systems pick up on bigger changes with regards to indexability fairly quickly, and recrawl a bit faster. Given how much junk we get submitted there, I don’t see us increasing those limits (or if so, then ignoring more submissions). I’d recommend focusing on making things well-crawlable, and awesome in a way that Googlebot goes out of its way to crawl it well. I realize that’s not as simple as a pushbutton though.”
Do you use this feature often? I rarely ever use it, like almost never. But this site does get crawled and indexed insanely quickly.
As a reminder, Google changed the quotas for this back in 2018 or so. Back then, John said something similar: people were using the tool to abuse Google and searchers, submitting hacked content, spam, and other forms of content that Google did not want in its index. So Google had to change how the quota system worked in order to balance the good and bad uses of the tool.
So don’t expect much to change with this feature anytime soon.
Forum discussion at Mastodon.