FAQ: What is the best way to get new pages indexed in Google?

1) Encourage your pages to be crawled naturally

Ensure your content can be accessed by crawlers (also called spiders or search engine robots) such as Googlebot: link to your pages from your homepage, main menus and other major pages on your site, and use code that crawlers can actually follow (for example, plain HTML links rather than navigation that only works through scripts).
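
One easy crawl-accessibility check is whether your own robots.txt is blocking Googlebot from the page. Here is a minimal sketch using Python’s standard library; the domain and page URL are placeholders for your own.

    from urllib.robotparser import RobotFileParser

    # Hypothetical site and page; replace with your own URLs.
    robots_url = "https://www.example.com/robots.txt"
    page_url = "https://www.example.com/new-article/"

    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches and parses the live robots.txt

    if parser.can_fetch("Googlebot", page_url):
        print("robots.txt allows Googlebot to crawl this page")
    else:
        print("robots.txt is blocking Googlebot from this page")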

This depends on Google knowing about your site in the first place, which itself can be ensured in the same kinds of ways: either add your site to Search Console, or link to it from other sites that Google is already aware of and crawling often.

This method also depends on your pages meeting basic quality standards. Google will be more reluctant to index your page if it seems spammy, and there’s no guarantee you’ll get indexed, whichever method you use.

2) Submit pages through Google Search Console

The other main way of getting pages indexed in Google is to submit new pages through Search Console, either by using the ‘fetch’ tool in the Crawl section (which can also be used to identify crawl errors that keep your pages from being indexed and ranking), or by uploading sitemap files that contain a current list of all your site’s pages.
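
For illustration, here is a minimal sketch of what a sitemap file contains, generated with Python’s standard library. The URLs and dates are placeholders; a real sitemap would list every page you want Google to know about.

    import xml.etree.ElementTree as ET

    # Placeholder pages; a real sitemap lists every indexable URL on the site.
    pages = [
        ("https://www.example.com/", "2024-01-15"),
        ("https://www.example.com/new-article/", "2024-01-20"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # optional, treated as a hint

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

You then submit the sitemap’s URL in Search Console’s Sitemaps report (or reference it in robots.txt) so Google knows where to find it.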

Pros and cons of URL submission vs link-building for indexing

Manual submission is fast and avoids link-building spam signals

The main pro of getting your pages indexed by submitting them through Search Console is that it’s about as instant as possible (unless you overdo it). It also avoids the risk of spam penalties associated with rapid, unnatural backlinking, which is often done to get content ranking quickly on sites that are not well established or frequently updated enough for quick natural pickup.

The risk of spam signals via the link-building method exists even if you only build one high-quality link from a partner site: because that link lacks natural growth, there won’t be additional similar links to follow, so Google will eventually sense its role in an unnatural growth pattern.

Manual submission matters most when link equity is absent, which is a problem in itself

A con of submitting pages manually is that it leads to pages being indexed that may entirely lack link equity.

This isn’t always a problem; it depends on how competitive the niche is, and how good the content is at avoiding bounces (which matters more than link-building in the long term). However, when link equity really is needed for high rankings, which is most of the time, you’re not going to get significant results by having a page indexed through a manual ‘tip-off’. Worse, the act of manually tipping off Google can stunt the momentum you’d get from having the content found via endorsement (linking) from other sites.

Risk of submissions being correlated with spammy campaign activity

The manual intervention of submitting URLs via Search Console could potentially be used by Google as a signal that you are ‘interfering’ too much in the natural ranking of your site, which could work against you, depending on how often you do it and whether it correlates with other detected spam signals. This last point is worth heeding especially if you participate in campaigns that could fall under the bracket of spam; in that case you probably don’t want to verify all your sites under the same active Google account (in case they all get penalised together), nor verify them under fake or dummy accounts that Google knows have minimal activity (another thing that could work against you).

Sitemaps usually contain misinformation

A con of submitting URLs via sitemaps is that, while a sitemap can simply be an up-to-date list of all the pages on your site, sitemaps are usually a lot more complicated than that, with ‘link equity modifiers’ (the optional priority and change-frequency values) that don’t work as directives and are at best treated as hints. Google takes this data with a huge pinch of salt because sitemaps are usually auto-generated, and its own natural link equity distribution is far more meaningfully founded and useful. People also often leave some URLs out of sitemaps; in theory this would make Google less keen to index them, but as mentioned, Google only treats it as a hint because errors are so common.

So, all in all, sitemaps are a messy business. They are really only needed by sites that have technological barriers to letting Google crawl up-to-date pages and deindex defunct pages as it normally would; for those sites, a sitemap can mean the difference between zero traffic and a thriving business. For 99.99% of sites, however, sitemaps are not actually helpful. Google still pushes them for the little value they have, but heavily devalues every feature except basic discovery of new content: finding URLs it doesn’t yet know about.
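
If you do publish a sitemap, it’s worth checking it against the live site occasionally, since stale or broken entries are exactly the kind of misinformation described above. A rough sketch, assuming the sitemap lives at the conventional /sitemap.xml location on a placeholder domain:

    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # assumed location
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    for loc in tree.findall(".//sm:loc", NS):
        url = (loc.text or "").strip()
        if not url:
            continue
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request) as page:
                status = page.status
        except urllib.error.HTTPError as err:
            status = err.code
        if status != 200:
            print(f"Sitemap entry no longer healthy: {url} ({status})")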

So, what is the best way to get pages indexed in Google?

The main way to get your pages indexed is to start by manually submitting your site in Google Search Console (which is a useful tool for various reasons). Once your site is indexed by Google, ensure the homepage links to inner pages and that there is always a path from the homepage to any other page on your site. To increase Google’s interest in crawling your site often, keep updating it and consider building links to the homepage and other important pages using white-hat methods, but be careful to avoid spam penalties (which can be imposed on people who update their site too often with junk content, and on people who build too many spammy links).
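
To check that every page really is reachable from the homepage through ordinary links, a simple crawl of your own site can help. This is a rough sketch using only Python’s standard library, with a hypothetical homepage URL; it only follows plain <a href> links on the same host, much as a basic crawler would.

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import urllib.request

    START = "https://www.example.com/"  # hypothetical homepage

    class LinkCollector(HTMLParser):
        """Collects href values from plain <a> tags, as a crawler would."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    seen, queue = {START}, [START]
    while queue:
        page = queue.pop(0)
        try:
            with urllib.request.urlopen(page) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # unreachable page; a real audit would log this
        collector = LinkCollector()
        collector.feed(body)
        for href in collector.links:
            absolute = urljoin(page, href).split("#")[0]
            # Stay on the same host so the crawl doesn't wander off-site.
            if urlparse(absolute).netloc == urlparse(START).netloc and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    print(f"{len(seen)} pages reachable from the homepage via plain links")

Any page that doesn’t show up in that crawl is one Googlebot is unlikely to find on its own.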

Ranking is another story

This basic approach gets your pages indexed efficiently, but it is not necessarily enough to get them ranking highly for competitive terms; it only gets them initially featured in the index. Keep working to improve the quality and reputation of your content to make it rank as highly as possible.