First, let’s understand the basic terminology and how the search engine works.
Google's search engine works like a spider: it crawls the World Wide Web, finding new web pages and adding them to its database. When a user enters a query, the search engine sorts the resources so that the top results are those whose content most closely matches the search phrase.
Now for the terms:
- Indexing – adding new pages to the search engine's database; in Google's case, the collection and storage of information about a resource's content.
- Crawling – the process of following hyperlinks in order to discover new content.
- Ranking – the ordering of pages by relevance to a given query.
Indexing time
How long does it take to index a Google page?
There is no definite answer to this question. However, webmasters' experience shows that it can take anywhere from several days to several months for a new site to be added to the search engine's index. In this article, we will look at the factors that speed up or slow down this process and try to answer whether the real time of Google page indexing can be estimated.
Every site owner sooner or later runs into problems. One of the most common is getting one's own site indexed. If you want to know how to quickly optimize a site for mobile devices, check your sitemap, or change a domain, these tips are for you.
SipkoSeo specialists have prepared information that you will definitely need. From the article, you will learn how to fix common problems so that Google starts indexing the pages of the site again.
1. You Don’t Have A Domain Name
So, the first and probably most common reason why Google does not index your site is the lack of your own domain name. This usually happens because you are using the wrong URL for the content, or because the domain is not configured correctly in WordPress.
*A domain name is a unique alphanumeric string that identifies a website to search engines and visitors. In other words, a domain name is the name of a website on the Internet.
2. Your Site Is Not Mobile-Friendly
Ease of use on mobile devices is crucial for Google indexing, since Google introduced "mobile-first indexing".
No matter how good the content on your site is, if it is not optimized for viewing on a smartphone or tablet, you will lose rankings and a great deal of traffic, both of which are fundamental components of successful indexing. Many people imagine mobile optimization as something like Dante's nine circles of Hell, but in fact it is not. It does not have to be complicated, and it should take minimal time and resources: applying simple responsive design principles, such as fluid layouts and CSS media queries, can make a big difference in ensuring that users find what they need without any navigation problems.
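As a rough illustration, one basic signal of mobile-friendliness is a responsive viewport meta tag. The sketch below is a simplified check using Python's standard library, not Google's actual mobile-first test:

```python
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Collects the content of any <meta name="viewport"> tag."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "viewport":
                self.viewport = attrs.get("content", "")

def has_responsive_viewport(html: str) -> bool:
    """A responsive page normally declares width=device-width in its viewport."""
    finder = ViewportFinder()
    finder.feed(html)
    return bool(finder.viewport and "device-width" in finder.viewport)
```

This only catches one signal out of many; Google's own Mobile-Friendly Test looks at far more than the viewport tag.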
3. Make sure that the web page is unique, useful, not an “orphan”, not a duplicate.
Google does not pay attention to pages with non-unique content. Therefore, if there are no technical problems, the problem may be in the content itself. Try to look at the content through the eyes of an ordinary person, and make it more interesting and useful.
It is important to remember that longer content, for example articles over 1,000 words, tends to perform much better than content well under that length.
When writing articles for the site, take into account both your own position on the topic and that of your readers. Choosing the right topic is also a very important aspect of successful indexing: pick subjects that match your readers' interests. After all, well-written content is the key to success in a competitive niche. If you wonder why a website does not rank high in Google search results for certain keywords, despite following leading SEO techniques such as adding relevant keywords throughout the text, one common culprit is "thin content": pages that contain barely 100 words where there really should be far more.
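The thin-content check described above can be sketched in a few lines; the 100-word and 1,000-word thresholds below simply echo the figures mentioned in this article, not any official Google limit:

```python
def classify_content(text: str, thin_limit: int = 100, strong_limit: int = 1000) -> str:
    """Classify a page body by word count, using this article's rough thresholds."""
    words = len(text.split())
    if words <= thin_limit:
        return "thin"      # likely too little text to rank
    if words >= strong_limit:
        return "strong"    # long-form content
    return "ok"
```

Word count is only a proxy; a 1,200-word page of filler is still thin content in the sense that matters to readers.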
Search engines also dislike sites with non-unique content that sell links. Such sites sooner or later (and recently this happens very often) fall under a search engine filter from which it is almost impossible to escape, and their pages disappear from search results forever. If a site had non-unique content but did not sell links, there is a chance it will be indexed and ranked normally once unique articles appear on it.
As for orphan pages, their defining feature is the absence of any incoming links, either from within your own resource or from third-party web platforms. Since search engine bots discover new content by moving from page to page, they, just like users, cannot find orphan pages. You can check for "orphans", as well as duplicates, with a crawler (parser). For orphans, compare the list of URLs the crawler actually reached with the full list of URLs exported from your CMS: pages missing from the crawl are the orphans you are looking for.
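That comparison boils down to a set difference. The sketch below assumes you have already exported both URL lists; the helper name and URLs are made-up examples:

```python
def find_orphans(crawled_urls, cms_urls):
    """Pages known to the CMS but never reached by the crawler are orphans."""
    return sorted(set(cms_urls) - set(crawled_urls))

# Example: "/old-landing" exists in the CMS but no page links to it.
crawled = ["/", "/about", "/blog", "/blog/post-1"]
cms = ["/", "/about", "/blog", "/blog/post-1", "/old-landing"]
orphans = find_orphans(crawled, cms)
```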
4. The site loads slowly
Websites that load too slowly greatly reduce the likelihood that Google, or any other search engine, will rank them near the top of the results. If your site takes a long time to load, many different factors can be responsible.
Use Google PageSpeed Insights – it is definitely one of the best tools for working on websites. It helps determine which sections of the site need urgent attention to improve speed. The tool analyzes a web page against the top five performance recommendations that are crucial for faster-loading sites (such as minimizing connections, reducing payload size, and using browser caching) and gives you suggestions on how to improve each aspect.
The second such tool is webpagetest.org. This tool will inform you if the site is loading fast enough or too slow. It will also let you see specific elements on the site that are causing particular loading problems. This allows you to identify serious speed problems before a page is dropped from the index.
5. Optimize robots.txt, sitemap.xml and .htaccess files
Another, no less important problem is an incorrectly composed robots.txt file. Disallow directives in it can prevent the bot from processing the entire resource or individual web pages. It is important to remember that every website has pages that should be blocked from indexing: technical pages, search results, URLs with GET parameters, login and admin pages, the shopping cart, "trash", and so on.
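You can verify what a given robots.txt actually blocks with Python's standard urllib.robotparser; the rules and paths below are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# An example robots.txt blocking typical technical sections.
robots_txt = """User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Technical pages should be blocked; regular content stays crawlable.
blocked = not parser.can_fetch("Googlebot", "/admin/settings")
allowed = parser.can_fetch("Googlebot", "/blog/new-post")
```

Running a check like this before deploying a new robots.txt helps catch a Disallow rule that accidentally covers real content.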
The sitemap.xml file is necessary for your online resource's interaction with the search engine: it tells it which pages are important and how often they are expected to change.
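For illustration, a minimal sitemap.xml can be generated with Python's standard library; the URLs below are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/blog/"])
```

A real sitemap may also carry optional tags such as lastmod; most CMSs and SEO plugins generate the file for you.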
6. Your site is not user-friendly and not attractive to visitors
Having a user-friendly and attractive website is essential for effective SEO. When visitors can easily find what they are looking for and browse your site without getting frustrated or annoyed, Google will rank your site higher in search results.
Google doesn’t want users to spend too much time on pages that take too long to load, have confusing navigation, or are difficult to use because of too many distractions, such as ads at the top of the page.
If you only list one product in each category instead of several, that could be the reason your content is not ranking well on Google! It is important not only to target keywords in each article, but also to make sure that all related articles link to the other articles and pages on that topic.
7. If you use a complex coding language, Google will not index your site.
If you are having problems, we recommend running Google's Mobile-Friendly Test to check how mobile-friendly your site really is (and make any necessary fixes). If your website is not yet up to par, there are many resources to help with the design issues that arise when developing a responsive web page.
Google wants to be able to crawl all of your JS and CSS. If any of these files are blocked, unblock them and allow a full crawl to give Google proper insight into your site.
9. Your meta tags are set to Noindex, Nofollow
Usually, such meta tags are set in only two cases: deliberately, to keep a page out of the index, or completely by accident. For example, a page may already have been indexed by Google's crawler and then deleted or changed before noindex, nofollow was correctly configured on the server side of your site. As a result, the page may not have been re-indexed, and if you also use a plugin that blocks Google's crawlers, the page may fall off the "list" and never be indexed.
You should always pay attention to such trivial things, because the performance of your site depends directly on them.
10. Make a competent internal linking of the site
It involves placing (and sometimes moving or rewriting) links from one page of the resource to another. A site structure with well-planned internal linking not only improves usability but also greatly helps users navigate quickly, simplifying the search for what they need within a large site. Competent internal linking will also significantly speed up the indexing of new material and its appearance in search engine listings.
11. Your SEO is not effective enough
People are lazy by nature, so they often make the wrong choice on the principle that cheap and fast means quality. However, high-quality work on a site, and SEO in general, cannot be cheap. If you want the site not just to appear in the index but to start generating income, increasing traffic and attracting new users, you need to work on it regularly and provide it with everything necessary for successful, high-quality operation. Proper technical SEO is another key to the growth of your Internet resources. After all, who wants to return to a site that never finishes loading, crashes, or is full of information that is useless or, worse, cannot solve your problem or answer your questions?
12. You use plugins that block Googlebot from crawling your site (Indexing ban is enabled)
If you created a new site and have been waiting months for Google to index it, the reason may be a ban on indexing. To check this, go to the root folder of the site and open robots.txt, and see whether a Disallow directive is written there. In addition, as described above, indexing can be prohibited by noindex and nofollow tags in the page source.
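To spot an accidental ban in page source, you can scan for the robots meta tag; the sketch below is a simplified check that also looks at the X-Robots-Tag response header, which can carry the same directive:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html: str, headers: dict) -> bool:
    """True if either the meta tag or the X-Robots-Tag header says noindex."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)
```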
13. Check for duplicates and correct use of the rel=”canonical” attribute
Duplicate content can be another reason for slow or zero indexing. If a page duplicates another, or its content is 90-99% similar to another page's, Google is unlikely to index it, and as a result it simply will not appear in search results; often such pages do not even reach the top 50. So make sure there are no duplicate pages on the site. If there are, either specify the canonical version with the rel="canonical" attribute or delete such pages, as Google will consider their content non-unique.
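One rough way to flag near-duplicates like the "90-99% similar" pages mentioned above is a text-similarity ratio; Python's standard difflib gives a quick approximation (the 0.9 threshold echoes this article's figure, not anything Google publishes):

```python
from difflib import SequenceMatcher

def near_duplicate(text_a: str, text_b: str, threshold: float = 0.9) -> bool:
    """Flag two page bodies whose similarity ratio meets the threshold."""
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    return ratio >= threshold
```

Pages flagged by a check like this are candidates for a rel="canonical" tag or deletion.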
The last but not least important issue related to indexing is backlinks.
If you want your website to work as it should, you definitely need quality backlinks. Their presence shows Google that the page they point to carries weight; Google considers such resources more important, so it crawls them more often.
It is worth remembering that in the pursuit of quality backlinks, you should first of all pay attention to the authority and reputation of the resource where your link is placed.
How quickly can a search engine index a page?
In most cases it takes a few days. Sometimes it can happen within 24 hours, although it certainly does not happen every time. Webmasters admit that results vary: a page can appear in the index the same day, a week later, or even a month later.
How long does it take to index changes on pages?
There is no definite answer to this question. Google works at its own pace, and it is known that in addition to the frequency of updates it takes other factors into account. For the search robot to visit a site more often, the site must be updated regularly. After making changes, for example deleting or adding pages, send a recrawl request through the webmaster panel.
Slow indexing of new pages.
It may take from 24 hours to several months for a new page to appear in the search results. This is usually because the site is new and has no inbound links yet.
Waiting a few weeks for an address to appear in search results is a natural process. As Internet users, we are used to getting instant answers to our queries, but for webmasters, on the other hand, indexing does not happen as fast as we would like.
Slow indexing of changed pages.
It should be understood that regular and frequent content updates speed up indexing and, therefore, increase the chance to rise in the search engine rankings. The more often something happens on the site, the more likely Google will consider it active and send its crawler to analyze the changes more often.