Obviously, the faster you can get your pages indexed, the faster you can appear for relevant terms and drive traffic to your web properties.
Most SEOs simply let the search engines “do their thing”, not realizing there are some reliable ways to get their pages into the index far sooner.
Setting aside the obvious steps like creating an XML sitemap and a robots.txt file, let’s look at some of those ways:
1. Get on the Search Consoles
It surprises me how many webmasters miss this critical step. Sure, it may be the curse of knowledge at play, but step one of any SEO campaign is letting the search engines know about your site via the webmaster consoles they provide for exactly this purpose.
Google, Bing and Yandex each have a dedicated console ready for you to take advantage of. If you haven’t already, get yourself set up on:
- Google Search Console (https://search.google.com/search-console)
- Bing Webmaster Tools (https://www.bing.com/webmasters)
- Yandex Webmaster (https://webmaster.yandex.com)
From here, you can request indexing, submit a sitemap and see the status of your indexed and non-indexed pages.
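If you’d rather automate this step, Google also exposes a Search Console API. Below is a minimal sketch, assuming the google-api-python-client package and a service-account key file (the "credentials.json" path is a hypothetical placeholder) that has already been added as a user on the verified property:

```python
# A minimal sketch of submitting a sitemap through the Google Search Console API.
# Assumes google-api-python-client is installed and the service account behind
# "credentials.json" (a hypothetical path) has access to the verified property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
creds = service_account.Credentials.from_service_account_file(
    "credentials.json", scopes=SCOPES)

# Build a client for the Search Console API (v1).
service = build("searchconsole", "v1", credentials=creds)

# Submit (or resubmit) the sitemap for the verified site.
service.sitemaps().submit(
    siteUrl="https://yourwebsite.com/",
    feedpath="https://yourwebsite.com/sitemap.xml",
).execute()
```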
2. Ping the Spiders
If you don’t get crawled, you won’t get indexed. Spiders (also known as “robots”) are the programs tasked with crawling the web, and they can be pinged to re-crawl your site. In times past you could ping the robots directly from the console; these days you would use services such as:
There’s no direct evidence that these services make a significant impact, nor any way to tell which one is the most effective, but since they’re free and take only a few moments, it can’t hurt to try them all.
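If you want to script these submissions, the pattern is usually just an HTTP GET with your URL as a parameter. Here is a rough sketch using Python’s requests library; the ping endpoints are purely hypothetical placeholders for whichever services you settle on:

```python
# A rough sketch of pinging several crawl/ping services in one go.
# The endpoints below are hypothetical placeholders; substitute the actual
# submission URLs of whichever ping services you decide to use.
import requests

SITE_URL = "https://yourwebsite.com/"

PING_ENDPOINTS = [
    "https://example-ping-service-one.com/ping",    # hypothetical
    "https://example-ping-service-two.com/submit",  # hypothetical
]

for endpoint in PING_ENDPOINTS:
    try:
        response = requests.get(endpoint, params={"url": SITE_URL}, timeout=10)
        print(f"{endpoint}: HTTP {response.status_code}")
    except requests.RequestException as exc:
        print(f"{endpoint}: failed ({exc})")
```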
3. Fetch the Sitemap
Once you’ve created an XML sitemap, beyond submitting it in your search console(s), you can also notify Google whenever the sitemap changes by forcing a fetch of the following URL:
https://www.google.com/ping?sitemap=https://YourWebsite.com/sitemap.xml
Replace the sitemap parameter with the absolute URL of your own sitemap.xml file.
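If you’d rather not paste that into a browser every time, a few lines of Python can fire the same request whenever your sitemap is regenerated. This sketch simply issues a GET to the ping URL shown above:

```python
# A small sketch that pings Google with the sitemap URL shown above.
# Replace SITEMAP_URL with the absolute URL of your own sitemap.xml.
import requests
from urllib.parse import urlencode

SITEMAP_URL = "https://YourWebsite.com/sitemap.xml"

ping_url = "https://www.google.com/ping?" + urlencode({"sitemap": SITEMAP_URL})
response = requests.get(ping_url, timeout=10)
print(f"Pinged {ping_url} -> HTTP {response.status_code}")
```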
4. Mega Internal Linking
Most blogs and sites are naturally set up to display the newest content first. This benefits not only returning visitors but also the robots crawling for the latest pages. Many sites, however, don’t work this way for a variety of reasons: a new page could be buried two, three or even four layers deep, where a spider may not reach it for a while (if at all).
Therefore, if it is feasible, it’s always a great idea to have every page accessible directly from the homepage. Whether it is in the footer or found on a mega menu, this will dramatically increase the chances of having your newer pages indexed much quicker than if they were to simply be published and forgotten about.
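How you surface those links depends on your platform, but the idea is simple enough to sketch. For example, a template or site generator could render a homepage footer that always links to the most recently published pages; the Post structure and sample posts below are hypothetical stand-ins for whatever your CMS actually provides:

```python
# A sketch of rendering a homepage footer that links to the newest pages,
# so fresh content is always one click (and one crawl hop) from the homepage.
# The Post structure and the sample posts are hypothetical stand-ins for
# whatever your CMS or static site generator actually provides.
from dataclasses import dataclass
from datetime import date

@dataclass
class Post:
    title: str
    url: str
    published: date

def footer_links(posts: list[Post], limit: int = 10) -> str:
    """Return an HTML list of the most recently published posts."""
    newest = sorted(posts, key=lambda p: p.published, reverse=True)[:limit]
    items = "\n".join(f'  <li><a href="{p.url}">{p.title}</a></li>' for p in newest)
    return f'<ul class="footer-latest">\n{items}\n</ul>'

posts = [
    Post("Older Guide", "/older-guide/", date(2021, 3, 1)),
    Post("Brand New Post", "/brand-new-post/", date(2021, 6, 15)),
]
print(footer_links(posts))
```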
5. Build Single Purpose Links
While building backlinks is part of any SEO strategy, creating links for each individual new page is often out of scope, or simply overlooked, on larger projects where resources can’t be spent on one-off link building efforts.
Given how quick it is to do, and the increased likelihood of faster indexing, it’s worth taking a few moments to promote each new page in a variety of ways, such as:
- Repost on Medium
- Link to it from your Twitter bio
- Share with an online community (Reddit, Facebook group etc.)
- Public forum signature
Depending on your niche, you could also submit to aggregators such as Growth Hackers and Hacker News.
Final Thoughts
There are other, sneakier tactics some SEOs use that I won’t discuss here; the above should be more than enough to get your pages crawled and indexed faster than you once imagined.
To the index!