Google has a "spider" called Googlebot that crawls the internet and notifies the algorithm when a new page appears or an existing page changes. But what do you do if the spider can't find your page?
Everything that has to do with algorithms and programming falls under technical SEO. Technical SEO is not concerned with whether your text is readable, unique and searchable, but with whether the page loads quickly, has good contrast, and so on. And with whether the page can be found in search at all. When people talk about SEO optimization, they often focus on the first category, while technical SEO is easily forgotten. In this article we will focus on how to get Google to find your page as quickly as possible.
First of all: check whether a page on your domain is indexed, preferably one you have published recently. To do so, paste this into Google's search field:
site:mittdomene.no important-search-term-that-the-page-should-appear-on
Replace the placeholder text with your own URL and search terms. This tells Google that you only want to search within your own domain. For example, if I search for:
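If you check many pages regularly, you can build the same site-restricted search programmatically instead of typing it by hand. A minimal Python sketch; the domain and query below are placeholders for your own values:

```python
from urllib.parse import quote_plus

def site_search_url(domain: str, query: str) -> str:
    """Build a Google search URL restricted to a single domain.

    `domain` and `query` are placeholders -- substitute your own.
    """
    return "https://www.google.com/search?q=" + quote_plus(f"site:{domain} {query}")

print(site_search_url("mittdomene.no", "important search term"))
# https://www.google.com/search?q=site%3Amittdomene.no+important+search+term
```

Opening that URL in a browser runs the same `site:` search described above.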
site:frontkom.com non-working keywords
I can only find one article on frontkom.com that mentions non-working keywords:
If I instead search for:
site:frontkom.com bad marketing tips
I get no results at all. That means this blog post has not been indexed yet.
Google and all other search engines have a "spider", also called a "crawler": a program that constantly scans the web for pages and useful content to display in search results. Google's spider is simply called Googlebot. Googlebot follows links on your website and in any sitemap you provide, and builds an overview of how the site is structured. It then analyzes the content of the pages and adds them to its library. This is called indexing.
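A sitemap is simply an XML list of the URLs you want crawled. As a rough sketch of what Googlebot reads, here is how you could list the URLs in a sitemap using only Python's standard library (the sitemap content and URLs below are made-up examples):

```python
import xml.etree.ElementTree as ET

# Example sitemap snippet; in practice you'd fetch https://yourdomain/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://frontkom.com/</loc></url>
  <url><loc>https://frontkom.com/blog/bad-marketing-tips</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> entry from a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP_XML))
```

If a new page is missing from this list, the crawler has one less way to discover it.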
When a page is indexed, it is added to Google's search results (the Search Engine Results Page, or SERP) so that it can be found by people searching for what the website offers.
Google's algorithms are becoming more and more sophisticated, and are constantly evolving towards rewarding the most user-friendly and relevant content. But since we're still a long way from quantum machines that can run the same process a thousand times at once, even super-relevant content sometimes won't show up in search results until weeks or months after it's published. Googlebot has a limited crawl capacity, which means that reviewing pages takes a certain amount of time. At worst, Google will keep linking to pages on your website that now return 404 errors.
For businesses that have a lot of traffic, operate in a market with high search volume, or simply want to appear credible, this can be a problem.
Fortunately, Google is aware of this problem and has developed a tool to deal with it: Google Search Console. Search Console can do a lot of useful things, like helping you find performance issues, analyzing keywords, submitting sitemaps, and more.
The function we need is the one that tells Googlebot to add a page to its worklist, rather than waiting for it to discover the page organically. Here's how to do it:
Open Search Console and select the correct domain. Then paste the page's address into the URL inspection field at the top ("Inspect any URL").
We get a few different fields with information about the page, but what we're looking for is this:
Here we see that the blog post has not yet been indexed at the time of writing. This means that if you search Google for the keywords we used in this article, our article will not appear in the search results.
If Google can't find the article, customers using those search terms won't find us either, and we risk losing potential business. We need to fix that, ASAP.
The solution is simple from here: Just press the “Request indexing” button.
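For those who want to automate this step, Google also exposes an Indexing API with a `urlNotifications:publish` endpoint. Note that Google officially supports the Indexing API only for certain page types (such as job postings), so the Search Console button remains the general-purpose route. A hedged sketch that only assembles the request, assuming you already have an OAuth 2.0 access token for a suitably scoped service account:

```python
import json

INDEXING_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_index_request(url: str, access_token: str):
    """Assemble endpoint, headers and body for an Indexing API call.

    Assumes `access_token` is an OAuth 2.0 token for a service account
    with the https://www.googleapis.com/auth/indexing scope. Actually
    sending the POST (e.g. with urllib.request) is left out of this sketch.
    """
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {access_token}",
    }
    body = json.dumps({"url": url, "type": "URL_UPDATED"})
    return INDEXING_ENDPOINT, headers, body
```

The `URL_UPDATED` notification tells Google the page is new or changed and should be recrawled; the URL and token in any call you make are, of course, your own.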
After a few seconds of waiting, you will get this dialog box:
That's it: you've asked Google to prioritize this page. It may still take a few days before it is indexed, but at least you've done everything you can to speed up the process.
If you constantly find that pages are not being indexed and that you have to go in and add them manually, you should take a closer look at whether something is wrong with the website or the sitemap. A good place to start is Google's own troubleshooting documentation.
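One classic culprit is a robots.txt rule that accidentally blocks the pages you want crawled. Python's standard library can check this directly; the robots.txt content and URLs below are illustrative examples:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content -- fetch your own from https://yourdomain/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot won't crawl URLs blocked by robots.txt, so they rarely get indexed
print(rp.can_fetch("Googlebot", "https://frontkom.com/blog/new-post"))  # True
print(rp.can_fetch("Googlebot", "https://frontkom.com/private/draft"))  # False
```

If a page you expect in search comes back blocked here, the fix belongs in robots.txt, not in Search Console.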
When you update a landing page or an article, you also want to know whether the update has had any effect on reach, SERP ranking, and so on. To keep track of which pages were indexed when, you can use an annotation tool, for example in Google Analytics or in an SEO tool such as Semrush.
Enter the Search Console again and click on “Pages”:
Then go to "See data about indexed pages":
At the bottom of that page you will see an overview of which pages are indexed on which date. I recommend checking this at least once a week.
Then go into your annotation tool of choice and write a note on the pages you want to monitor. In Semrush, we do this on the main keyword page (Position tracking -> Overview) to keep a quick overview:
The annotation will then tell you the reason for any sudden jump (or drop!) you see in the overview of the keywords you are tracking.
Good luck! And as usual: if you need help with SEO-related work or want an SEO audit of your website, we are always available for a chat via our contact form.