Paginated Headache for SEOs

Pagination is usually overlooked because… well, I'm not sure why. My best guess is that the default system works as is, so why bother?

What are you trying to achieve?

The two obvious goals are:

  • Help users browse efficiently through a very large website (e.g. a forum)
  • Decrease your website depth (i.e. distance in clicks from a starting point, usually the homepage, to other pages – it usually makes sense to have your most important pages within the first levels)

Pagination is not boring

At least it doesn’t have to be; some solutions can be very creative. One of the best examples is ghostblock pagination, probably the best solution to decrease the depth of a site. (I strongly suggest reading the full article; it was an eye-opener for me. It might be dated but it’s still relevant.)

source: https://audisto.com/guides/pagination/#ghostblock
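To make the idea concrete, here is a minimal sketch of how a ghostblock-style link set could be computed for a given page. The exact block layout described in the Audisto guide may differ, and the powers-of-ten spacing and example numbers below are my own illustrative assumptions; the point is simply that every page in the series stays reachable in a handful of clicks.

```python
# A minimal sketch of a ghostblock-style link set, assuming a scheme where each
# paginated page links to its neighbours plus "blocks" of pages at increasing
# distances (10s, 100s, 1000s...). The exact layout in the Audisto guide may
# differ; the idea is that depth grows logarithmically instead of linearly.

def ghostblock_links(current: int, total: int) -> list[int]:
    """Return the sorted list of page numbers that page `current` should link to."""
    targets = {1, total, current - 1, current + 1}  # first, last, prev, next

    step = 10
    while step < total:
        # Round the current page down to the nearest multiple of `step`, then
        # link a small block of pages at that granularity in both directions.
        base = (current // step) * step
        for i in range(1, 4):
            targets.add(base + i * step)
            targets.add(base - i * step)
        step *= 10

    return sorted(t for t in targets if 1 <= t <= total and t != current)


if __name__ == "__main__":
    # Example: the links emitted on page 8,995 of a 20,000-page series.
    print(ghostblock_links(8995, 20000))
```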

In my in-house SEO career, I’ve been dealing with a lot of large websites (>100k pages to several million), and pagination offers interesting challenges if you want to improve UX and optimize your crawl. I often used Ghostblocks as a one-size-fits-all solution with great success, until recently… I had a very SEO kind of revelation:

IT DEPENDS.

For years now, I’ve been working as an SEO for big publishers, and by definition, a publisher publishes… A LOT.

see also: How to fix index bloating issues (slideshare)

Most of the content stays under control thanks to a predictable calendar for publication and updates (e.g. product reviews, event coverage, buying guides, etc.)


The rest of the website (e.g. the news) is a lot less predictable.

There’s no one-size-fits-all solution (aka it depends)

The usual mistake is to think we can have one pagination system for the whole website, across all content types. We should change that approach because not all pages are equal, and your pagination should depend on one very much overlooked criterion: your page lifespan.

The question you should try to answer is: once a page is indexed, how often does it need to be re-crawled?

Below are 3 of the most frequent cases + suggested solutions:

1 – An evergreen page that gets frequent updates will naturally be pushed back into the first levels every time it’s updated (e.g. ‘Best Shows on Netflix Now’).

Nothing to do here, yay \o/

2 – A page that doesn’t require updates and has a short lifespan gains nothing from being crawled often once it’s indexed (e.g. news).

  • Infinite scroll: users can still browse, but it’s much harder for a crawler to follow
  • Archive section that limits your crawlable paginated content to X days or X posts. It gives you better control of the crawl frequency for your older content by “burying” it deeper on the site. You can use a different URL structure (e.g. an /archive/ directory) + noindex to discourage crawlers from spending too much time there (see the sketch below).
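As an illustration of that second option, here is a minimal sketch, assuming a hypothetical CMS hook that lets you control the URL and robots meta of each paginated page. The path patterns and the 10-page threshold are made up for the example.

```python
# A minimal sketch: pages 1..CRAWLABLE_PAGES stay in the normal section path and
# are indexable; anything older is served under /archive/ with a noindex,follow
# robots meta so crawlers don't waste budget there. URL patterns are assumptions.

CRAWLABLE_PAGES = 10  # keep only the latest 10 paginated pages in the main path

def paginated_page_meta(section: str, page: int) -> dict:
    if page <= CRAWLABLE_PAGES:
        return {
            "url": f"/{section}/page/{page}/",
            "robots": "index,follow",
        }
    return {
        "url": f"/{section}/archive/page/{page}/",
        "robots": "noindex,follow",  # still followable, but kept out of the index
    }

# Example
print(paginated_page_meta("news", 3))   # {'url': '/news/page/3/', 'robots': 'index,follow'}
print(paginated_page_meta("news", 42))  # {'url': '/news/archive/page/42/', 'robots': 'noindex,follow'}
```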

3 – A page that doesn’t require updates (or only rarely) but is evergreen (e.g. ‘How to clean your oven with baking soda’)

  • Ghostblocks: this is a typical case where they are very efficient at keeping content within a few clicks (see the decision sketch below)
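Summed up as code, the three cases boil down to a small decision. The attribute names below are hypothetical, but the mapping follows the cases above.

```python
# A minimal sketch of the decision above, assuming two hypothetical attributes
# per content type: whether it is evergreen and whether it gets frequent updates.

def pagination_strategy(evergreen: bool, frequently_updated: bool) -> str:
    if frequently_updated:
        # Case 1: updates push the page back into the first levels by themselves.
        return "default pagination (nothing to do)"
    if not evergreen:
        # Case 2: short lifespan, no updates -> stop inviting crawlers back.
        return "infinite scroll + noindexed /archive/ section"
    # Case 3: evergreen but rarely updated -> keep it a few clicks away.
    return "ghostblocks"

print(pagination_strategy(evergreen=True, frequently_updated=False))  # ghostblocks
```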

A rule of thumb…

I would keep this in mind: The goal is to keep the important pages within the first levels, not ALL pages.


Some caveats and tactics I didn’t cover above, to keep the article focused on pagination:

  • Search engines probably don’t blindly follow deep pages even when they’re linked via Ghostblocks; they’re smart enough to understand that page-899522 is less interesting to recrawl than page-2. Still, a link vs. no link makes a difference.
  • Sitemaps may or may not reflect that strategy: making a page harder or impossible to find on your website will make little difference (from a crawl frequency standpoint) if it still appears in your sitemap (see the sitemap sketch after this list).
  • Leverage internal linking: if a page is important, it needs to be ‘promoted’ more than the others to prevent it from being buried too deep: related content, contextual navigation, etc.
  • Internal search: at some point, the best user experience for finding older content might be your internal search. It is rare to see a website with a performant one and advanced filter options; it might be something to look at in some cases (e.g. a forum).
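For the sitemap caveat, a minimal sketch of keeping the sitemap consistent with that strategy might look like this, assuming a hypothetical list of (url, robots, lastmod) tuples coming from your CMS: pages you deliberately buried and noindexed simply don’t get listed.

```python
# A minimal sketch: only pages we actually want recrawled make it into the
# sitemap. `pages` is a hypothetical iterable of (url, robots, lastmod) tuples.

from xml.sax.saxutils import escape

def build_sitemap(pages) -> str:
    entries = []
    for url, robots, lastmod in pages:
        if "noindex" in robots:
            continue  # don't invite crawlers back to pages we buried on purpose
        entries.append(
            f"  <url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(build_sitemap([
    ("https://example.com/news/page/2/", "index,follow", "2023-01-10"),
    ("https://example.com/news/archive/page/42/", "noindex,follow", "2019-06-01"),
]))
```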

IRL examples:

  • NYTimes doesn’t paginate its sections (it uses infinite scroll, so not crawlable but still user-friendly)
  • Techradar has an archive for all sections. It arbitrarily starts after the 10th page and is indexed and crawlable, which seems suboptimal, but the idea is there.
  • Stackoverflow links automatically to the last page, surfacing 2008 content within the first level (in that example)

