How can I get pages indexed in Google?
Submitting pages through Search Console
Generally, submitting new pages to Google is done through Search Console, either by using the ‘fetch’ tool under Crawling (which can also be used to identify crawling errors that prevent your pages from ranking), or by uploading sitemap files that contain a current list of all your site’s pages.
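As an illustration (the format is defined by the sitemaps.org protocol; the URLs below are placeholders), a minimal sitemap file is just a list of your site’s current pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap: a plain, up-to-date list of the site's URLs -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

You would then submit the sitemap’s URL in Search Console’s sitemaps section so Google knows where to fetch it.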
Ensure content can be crawled
The second way of getting pages indexed in Google is to ensure they can be reached by Google’s crawlers: link to the pages from your homepage, main menus, or other prominent pages. This depends on Google knowing about your site in the first place, which can be achieved in similar ways – by adding your site to Search Console, or by linking to it from other sites that Google already knows about and crawls often.
What are the pros and cons of submitting pages through Search Console vs. building links quickly to get content ranking?
The main pro of getting your pages indexed by submitting them through Search Console is that it is nearly instant, and it avoids the spam penalties associated with the rapid, unnatural backlinking needed to get content ranking quickly through link building. Those risks exist even if you only build one high-quality link from a partner site: because that link lacks natural growth, there will be no additional similar links to follow, so Google will eventually detect the unnatural growth pattern.
A con of submitting pages manually is the lack of link equity you would otherwise gain by building links from other indexed sites. This isn’t always a problem; it depends on how competitive the niche is and how good the content is at avoiding bounces (which matters more than link building in the long term). However, when link equity really is needed for high rankings, which is most of the time, you are unlikely to get significant results by having content indexed through a manual ‘tip-off’ alone. The act of manually tipping off Google can also stunt the momentum you would get from having the content found via the endorsement (links) of other sites. Google could potentially treat this manual intervention as a signal that you are ‘interfering’ too much in the natural ranking of your site, which could work against you. This last point is worth heeding especially if you participate in campaigns that could fall under the bracket of spam: in that case you probably don’t want to verify all your sites under the same active Google account (in case they all get penalised together), or to verify them under fake/dummy accounts that Google knows have minimal activity (another thing that could work against you).
Another con of submitting URLs via sitemaps is that while a sitemap can be a simple, up-to-date list of all the pages on your site, sitemaps are usually more complicated than that, with ‘link equity modifiers’ that don’t really work but can be used as a hint. Google takes this data with a huge pinch of salt because sitemaps are usually auto-generated, so Google’s natural link equity distribution is far more useful. People often leave some URLs out of sitemaps; in theory this should make Google less keen to index them, but as mentioned, Google treats this only as a hint because errors are so common. All in all, sitemaps are a messy business, and they are really only needed by sites with technical barriers that prevent Google from crawling up-to-date pages and deindexing defunct pages as it normally would. For those sites, a sitemap can mean the difference between zero traffic and a thriving business; for the vast majority of sites, however, sitemaps are not actually helpful. Google still promotes them for the little value they have, but heavily devalues all their features except simple discovery of URLs it doesn’t yet know about.
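For reference, the ‘modifiers’ in question are optional fields in the sitemap protocol such as <lastmod>, <changefreq> and <priority>, which as noted above Google treats as hints at best. A sketch of a sitemap entry using them (the URL, date and values are placeholders):

```xml
<url>
  <loc>https://www.example.com/blog/post-1</loc>
  <!-- All three fields below are optional hints, not directives -->
  <lastmod>2016-05-01</lastmod>      <!-- date the page last changed -->
  <changefreq>monthly</changefreq>   <!-- expected update frequency -->
  <priority>0.8</priority>           <!-- importance relative to other pages on this site -->
</url>
```

Since these values are usually auto-generated by CMS plugins rather than curated by hand, it is easy to see why Google discounts them.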
So what is the best way to get pages indexed in Google?
Under Google’s recommendations, the main way to get your pages indexed is to manually submit your homepage and initial set of pages in Search Console. Once your site is indexed by Google, links should then be earned using white-hat methods to avoid spam penalties.
However, in competitive niches this alone is not enough to get pages ranking strongly. The approach suitable for most sites is therefore to ensure all your pages are accessible to crawlers by linking to them from your homepage and menus, and to support them by building a small number of links from sites already indexed by Google.
It is therefore recommended that you decide your ranking strategy first (high-volume competitive terms vs. niche markets); your indexing strategy will follow from it.