There is nothing more satisfying than finishing a new website. By the time the creative and engineering work is done, chances are you’re ready to take a break and watch traffic flow to your site.
At this stage, you can shift your focus to nurturing an audience. With billions of searches happening on Google every day, it’s a reliable channel for getting your message out to new people.
The popular line from Kevin Costner’s Field of Dreams, “If you build it, they will come,” doesn’t apply to search engines.
In order to show up in Google search results, your site needs to be indexed by Google. I put together this introductory guide to indexation for anyone who has a new website — after all, you’ve put in all the work so don’t stop now!
When you type a search into the Google search bar, you aren’t actually searching the web. You’re searching Google’s index of the web. You can think of the index like a giant library that stores all the books, or websites, that Google can find.
In a traditional library, librarians add new books to the collection and remove old ones. A manual process like that would take forever given that there are over 30 trillion pages on the web.
How does Google get websites (the books) and webpages (the pages inside each book) into its massive library? Using spiders.
A web spider starts from a few known webpages and follows their links to other pages, then the links on those pages, and so on. If Google can’t find your page because no links point to it, your site won’t be indexed, which means it won’t show up when someone types a keyword into Google.
Google eventually indexes most websites, but you don’t want to wait around for a spider to come by because it could take weeks or months. Google’s spiders are really busy, so they prioritize pages that are trustworthy. Since you have a new site, you don’t have much street cred as far as Google is concerned.
You don’t have to wait for your site to be indexed. Below are two methods for getting your site indexed fast, plus a couple of other recommendations to make sure you’re optimizing your time with Google’s spider.
1) Create a Sitemap
There are different ways to implement a sitemap, but if you think of Google as a library and a website as a book, then a sitemap is a combination of details about where your book is located and what’s inside.
There isn’t a one-size-fits-all solution for sitemaps; it depends on the type of website. For publishers, an RSS feed might make sense, but if you have an eCommerce site or one with thousands of pages, then you might want to create an XML sitemap that is regularly updated.
Google has all the details you need on creating a sitemap, regardless of your site’s size. Once you’ve created your sitemap, you’ll need to submit it through Search Console.
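For reference, a minimal XML sitemap following the sitemaps.org protocol has the shape below. The URLs and dates are placeholders — swap in your own pages.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2018-01-10</lastmod>
  </url>
</urlset>
```

Save it as something like sitemap.xml at the root of your site, and keep the lastmod dates current so Google knows when a page has changed.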
2) Fetch & Render
A sitemap is the best solution if you’re regularly producing new content. But let’s say you’re a small blogger or business owner who updates your blog two or three times a week and rarely adds new pages to your website.
A sitemap might be more than you need in that case. Since your site is small, you can simply use the Fetch as Google feature that’s available through Search Console.
3) Robots.txt
Where a sitemap tells Google which parts of your website to visit, your robots.txt file can tell Google which areas are off-limits, kind of like the Restricted Section of the Hogwarts library in Harry Potter.
You’ll want to block any admin pages from being crawled, but none of your customer-facing pages should appear in your robots.txt file. Unless, of course, you have a reason for excluding some pages: you can use the robots.txt file to prevent duplicate content and to keep low-value pages out of the index.
The directives in your robots.txt file are guidelines, not hard-and-fast rules. Googlebot, Google’s spider, will most likely follow your directives, but the same cannot be said for other crawlers.
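To make this concrete, here is a minimal robots.txt for the setup described above. The /admin/ path is just an example — block whatever back-office paths your site actually uses, and point the Sitemap line at your real sitemap URL.

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/

# Help crawlers find your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Any path not matched by a Disallow rule stays crawlable, so your customer-facing pages need no entries at all.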
Check your robots.txt file by navigating to it in your browser (it lives at the root of your domain, e.g. http(s)://yourdomain.com/robots.txt) to make sure your entire website isn’t blocked.
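If you’d rather check programmatically than eyeball the file, Python’s standard library can parse robots.txt rules for you. This is a quick sketch — the rules and URLs below are illustrative, so substitute your own before relying on it.

```python
from urllib.robotparser import RobotFileParser

# The contents of a hypothetical robots.txt file (illustrative only)
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Customer-facing pages should be crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/blog/new-post"))  # True
# ...while admin pages stay off-limits.
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))    # False
```

In practice you would call `parser.set_url("https://yourdomain.com/robots.txt")` followed by `parser.read()` to fetch the live file instead of parsing an inline string.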
4) Internal Linking Structure
If you find that you’ve implemented these best practices and your new pages and posts are still slow to show up in search results, then make sure there are links on your site pointing to your new content.
Cross-link relevant articles, create a logical site navigation, or even add a sidebar widget that links to your most valuable pages.
Indexing is the first step to ranking in search results. If Google, or any other search engine, can’t find your website, then optimizing your pages won’t do you much good.
To check on your site’s index health, visit the Crawl Errors and Crawl Stats reports in Search Console.
If you have other questions about indexing, I’m happy to answer them in the comments below!