Googlebot 404 errors due to page numbers


(Jay Pfaffman) #1

I’ve got a site that’s gotten the “Googlebot identified a significant increase in the number of URLs” error.

It looks like the bad URLs (the ones returning 404s) are all of the form https://site/t/slug/id?page=XXX.

And changing XXX to XXX − y makes the URLs work fine. (I thought it was an off-by-one error at first, but sometimes y needs to be significantly more than 1.)
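
For anyone who wants to reproduce this, a minimal probe like the one below (Python, with the topic URL as a placeholder) walks ?page=N upward and reports the last page that still returns 200:

```python
# Sketch only: probe ?page=N until the server starts returning 404s.
# TOPIC_URL is a placeholder, not a real topic.
import requests

TOPIC_URL = "https://site/t/slug/id"

def last_valid_page(max_page=200):
    """Return the highest ?page=N that still returns HTTP 200."""
    last_ok = 1
    for n in range(1, max_page + 1):
        resp = requests.get(f"{TOPIC_URL}?page={n}", allow_redirects=False)
        if resp.status_code == 200:
            last_ok = n
        elif resp.status_code == 404:
            break  # pages past the end 404, matching what Googlebot reports
    return last_ok

if __name__ == "__main__":
    print(last_valid_page())
```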

It’s not immediately apparent that it’s due to deleted posts, which was my next guess.

Perhaps if the page number is greater than the number of pages, there should be a 301 to page 1 or something? (I don’t pretend to know anything about SEO.) Or should I tell them just not to worry about those 404s?
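
To make the suggestion concrete, a rough sketch of the clamp-and-redirect idea might look like this (Flask purely for illustration; this is not Discourse code, and page_count_for() is a hypothetical lookup):

```python
# Illustrative sketch: redirect out-of-range page numbers instead of 404ing.
from flask import Flask, request, redirect

app = Flask(__name__)

def page_count_for(topic_id):
    # Hypothetical: look up how many pages the topic currently has.
    return 10

@app.route("/t/<slug>/<int:topic_id>")
def topic(slug, topic_id):
    page = request.args.get("page", default=1, type=int)
    total = page_count_for(topic_id)
    if page > total:
        # 301 to the last existing page (or to page 1, as suggested above)
        return redirect(f"/t/{slug}/{topic_id}?page={total}", code=301)
    return f"topic {topic_id}, page {page} of {total}"
```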


(Sam Saffron) #2

This is only an issue we can address if you can find actual cases on the forum where the meta tags link to non-existent pages.
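
A rough way to hunt for such cases (a sketch only; the topic URL is a placeholder, and it assumes paginated pages expose rel="next"/"prev" link tags) is to fetch a page and verify those links resolve:

```python
# Sketch: fetch a topic page and check that any rel="next"/"prev"
# <link> tags in its <head> actually resolve rather than 404ing.
import re
import requests

def check_pagination_links(url):
    html = requests.get(url).text
    # Crude extraction of <link rel="next|prev" href="..."> tags.
    for rel, href in re.findall(
            r'<link\s+rel="(next|prev)"\s+href="([^"]+)"', html):
        status = requests.get(href, allow_redirects=False).status_code
        print(f"{rel}: {href} -> {status}")
        if status == 404:
            print("  ^ meta tag points at a non-existent page")

check_pagination_links("https://site/t/slug/id")  # placeholder URL
```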

If the content is looking good and Google is just having an adventure adjusting to the fact that a book once had 100 pages and now has 10, there is not much we can do. A 301 or 302 here is not ideal. Perhaps we could add it for deletions… it’s a giant edge case, though.