Pagination is usually overlooked, and I'm not sure why. My best guess: the default system works as is, so why bother?
What are you trying to achieve?
The two obvious goals are:
- Help users browse efficiently through a very large website (e.g. a forum)
- Decrease your website depth (i.e. the distance in clicks from a starting point, usually the homepage, to other pages – it usually makes sense to keep your most important pages within the first few levels)
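Depth here is just the shortest click path from your starting point. If you have a crawl of your link graph, you can measure it with a breadth-first search; a minimal sketch (the mini-site below is hypothetical):

```python
from collections import deque

def page_depths(links, start="/"):
    """Compute click depth (shortest path in clicks) from a start page.

    `links` maps each URL to the list of URLs it links to.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical mini-site: homepage -> category -> articles
site = {
    "/": ["/category"],
    "/category": ["/article-1", "/article-2"],
    "/article-1": [],
    "/article-2": [],
}
print(page_depths(site))  # {'/': 0, '/category': 1, '/article-1': 2, '/article-2': 2}
```

Run this on a real crawl export and the pages with the biggest depth numbers are your candidates for better pagination or internal linking.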
Pagination is not boring
At least it doesn’t have to be: some solutions can be very creative. One of the best examples is ghostblocks pagination, probably the best solution for decreasing a site's depth. (I strongly suggest reading the full article; it was an eye-opener for me. It might be dated, but it's still relevant.)
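The idea is easy to sketch: besides its close neighbours, each paginated page also links to a block of evenly spaced distant pages, so no page in the series is ever more than a few clicks away. Here's a minimal sketch in Python – my own simplified take on the pattern, not the original article's implementation; the function name and parameters are hypothetical:

```python
def ghostblock_links(current, last, near=2, blocks=5):
    """Return the page numbers to link from `current` in a paginated series.

    Combines close neighbours (current +/- `near`) with `blocks` evenly
    spaced jump links across the whole series, so even deep pages stay
    only a few clicks away from page 1.
    """
    links = {1, last}
    # Neighbouring pages, for normal browsing
    links.update(p for p in range(current - near, current + near + 1) if 1 <= p <= last)
    # Evenly spaced "ghost block" jump links across the series
    step = max(last // blocks, 1)
    links.update(range(step, last + 1, step))
    return sorted(links)

print(ghostblock_links(current=3, last=1000))
# [1, 2, 3, 4, 5, 200, 400, 600, 800, 1000]
```

The jump links act like a coarse index over the series: from any page you can land within `step` pages of your target, then refine with the neighbour links.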
In my in-house SEO career, I’ve dealt with a lot of large websites (100k to several million pages), and pagination offers interesting challenges if you want to improve UX and optimize your crawl. I often used ghostblocks as a one-size-fits-all solution, with great success, until recently, when I had a very SEO kind of revelation:
For years now, I’ve been working as an SEO for big publishers, and by definition, a publisher publishes… A LOT.
see also: How to fix index bloating issues (slideshare)
Most of the content stays under control thanks to a predictable publication and update calendar (e.g. product reviews, event coverage, buying guides, etc.)
The rest of the website (e.g. the news) feels more like:
There’s no one-size-fits-all solution. (aka "it depends")
The mistake is usually to think we can have one pagination system for the whole website, across all content types. We should change that approach: not all pages are equal, and your pagination should depend on one very much overlooked criterion – your page lifespan.
"Once it’s indexed, how often do you need a page to be re-crawled?" is the question you should try to answer.
Below are the three most frequent cases, with suggested solutions:
1 – An evergreen page that gets frequent updates will naturally be pushed again within the first levels every time it’s updated. (e.g. ‘Best Shows on Netflix Now’)
Nothing to do here, yay \o/
2 – A page that doesn’t require updates and has a short lifespan gains nothing from being crawled often once indexed. (e.g. news)
- Infinite scroll: users can still browse, but it's harder for a crawler to follow
- Archive section that limits your crawlable paginated content to X days or X posts. It gives you better control over the crawl frequency of your older content by “burying” it deeper in the site. You can use a different URL structure (e.g. an /archive/ directory) + noindex to discourage crawlers from spending too much time there.
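One way to sketch that cutoff logic (a hypothetical helper; the URL patterns, robots values, and 90-day threshold are assumptions for illustration, not a prescription):

```python
from datetime import date, timedelta

def crawl_policy(posts, max_age_days=90):
    """Split posts into the main (crawlable) section and the archive.

    Posts older than `max_age_days` get an /archive/ URL and a noindex
    robots directive, so crawlers spend less time on them once indexed.
    """
    cutoff = date.today() - timedelta(days=max_age_days)
    policy = []
    for slug, published in posts:
        if published >= cutoff:
            policy.append((f"/news/{slug}", "index,follow"))
        else:
            policy.append((f"/archive/{slug}", "noindex,follow"))
    return policy

posts = [
    ("fresh-story", date.today()),
    ("old-story", date.today() - timedelta(days=365)),
]
print(crawl_policy(posts))
# [('/news/fresh-story', 'index,follow'), ('/archive/old-story', 'noindex,follow')]
```

Keeping `follow` on the archived pages means link equity still flows through them, even though they're discouraged from the index.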
3 – A page that requires no updates (or only infrequent ones) but is evergreen (e.g. "How to clean your oven with baking soda")
- Ghostblocks: this is a typical case where they are very efficient at keeping content within a few clicks' reach
A rule of thumb…
I would keep this in mind: The goal is to keep the important pages within the first levels, not ALL pages.
Some caveats and tactics not mentioned here for the purpose of the article (i.e. pagination):
- Search engines probably don't blindly follow deep pages, even when linked via ghostblocks, because they're smart enough to understand that page-899522 will be less interesting to recrawl than page-2 – still, a link vs. no link makes a difference
- Sitemaps may or may not reflect that strategy: making a page harder or impossible to find on your website makes little difference (from a crawl-frequency standpoint) if it still appears in your sitemap
- Leverage internal linking: if a page is important, it needs to be 'promoted' more than the others to prevent it from being buried too deep: related content, contextual navigation, etc.
- Internal search: at some point, the best user experience for finding older content might be your internal search. It's rare to see a website with a performant one plus advanced search filter options; it might be something to look at in some cases (e.g. a forum)