When you see the message “Detected, not indexed” in Google Search Console (current versions of Search Console report this as “Discovered – currently not indexed” or “Crawled – currently not indexed”), it means that Google has found the pages on your website, and may have crawled them, but has not added them to its index. This can occur for various reasons, and it’s important to address the underlying issues to ensure that your web pages get properly indexed and show up in Google search results.
Here are some common reasons why Google might mark pages as “Detected, not indexed” in Search Console:
Noindex Meta Tag: The page may contain a meta tag with the noindex directive, which tells search engines not to index the page. This is often used for pages you don’t want to appear in search results, like privacy policy pages or thank-you pages.
Example: You have a “Thank You” page for users who have completed a form on your website. You want to acknowledge their submission but don’t want this page to appear in search results. You add the following meta tag to the HTML of the page:
<meta name="robots" content="noindex">
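If you need to audit many pages for this directive, a small script can scan the HTML for you. The sketch below uses only Python’s standard library; the RobotsMetaParser and has_noindex names are illustrative helpers, not part of any Google tooling:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_directives.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.robots_directives)
```

Running has_noindex over the HTML of each important URL is a quick way to catch pages that are accidentally excluded.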
Robots.txt Blocking: The page may be blocked by your website’s robots.txt file, which instructs search engines not to crawl or index certain pages or directories.
Example: You have a development or staging area on your website that you don’t want search engines to index. In your robots.txt file, you specify that search engines should not crawl this directory:
User-agent: *
Disallow: /dev/
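You can sanity-check rules like these before deploying them with Python’s standard-library urllib.robotparser; the URLs below are illustrative ones matching this example:

```python
import urllib.robotparser

# Parse the robots.txt rules from the example above without fetching anything.
rules = [
    "User-agent: *",
    "Disallow: /dev/",
]
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Pages under /dev/ are blocked for all crawlers; other paths remain crawlable.
print(rp.can_fetch("*", "https://www.example.com/dev/staging-page"))  # False
print(rp.can_fetch("*", "https://www.example.com/blog/post-1"))       # True
```

A check like this helps confirm that a rule meant for a staging directory is not accidentally blocking pages you want indexed.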
Canonical Issues: If a page carries a canonical tag (rel="canonical") pointing to a different URL, Google may treat that other URL as the primary version and decline to index the page itself.
Example: You have a product page that exists in multiple versions with different query parameters. For instance, the same product page can be accessed using URLs like:
https://www.example.com/products/product-1?color=red
https://www.example.com/products/product-1?color=blue
To address this, you add a canonical tag to specify the preferred version:
<link rel="canonical" href="https://www.example.com/products/product-1">
Low-Quality Content: Google may choose not to index pages with low-quality content, thin or duplicate content, or content that violates its guidelines.
Example: You have a blog post that is poorly written and doesn’t offer valuable information to your audience. The content may contain numerous spelling and grammar errors, and it doesn’t provide any unique insights or solutions to a common problem.
Crawl Errors: If Google’s crawler encounters issues accessing the page due to server errors, redirects, or other technical problems, it may not index the page.
Example: Your server experiences frequent outages, leading to intermittent accessibility issues. When Googlebot attempts to crawl your site, it encounters server errors, which prevent it from accessing certain pages.
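When diagnosing this, it helps to distinguish the response classes: 5xx codes point at server problems, 3xx at redirects, and 404 at missing pages. The sketch below is a hedged standard-library example; classify_status and check_url are illustrative names, and check_url performs a live network request:

```python
import urllib.request
import urllib.error

def classify_status(code: int) -> str:
    """Map an HTTP status code to a rough crawlability category."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if code == 404:
        return "not found"
    if 500 <= code < 600:
        return "server error"  # intermittent 5xx responses can block crawling
    return "other"

def check_url(url: str) -> str:
    """Fetch a URL and classify the response (makes a real network call)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as err:
        return classify_status(err.code)
    except urllib.error.URLError:
        return "unreachable"
```

Running a check like this on a schedule can surface the intermittent server errors described above before they cost you indexed pages.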
Page Speed and Mobile Friendliness: Slow-loading pages or pages that aren’t mobile-friendly may not get indexed because Google aims to provide a good user experience to its searchers.
Example: Your website has large image and video files that are not optimized, causing pages to load very slowly, especially on mobile devices. The pages are also not responsive, making them difficult to use on mobile phones.
Manual Actions: In some cases, Google may have taken manual action against your site due to violations of its quality guidelines, which can result in some or all of your pages not being indexed.
Example: Your website has been engaging in manipulative link-building practices, such as buying low-quality backlinks from link farms. As a result, Google’s manual actions team has reviewed your site and issued a penalty. In response, they may have deindexed or demoted certain pages.
To address the issue, you should:
Review your page content: Ensure that the content is valuable, unique, and not of low quality.
Example: You run an online gardening blog, and you have a post titled “10 Tips for a Beautiful Garden.” Upon review, you find that the content consists of generic advice found on many other gardening websites. To improve it, you add personal anecdotes, unique tips, and images from your own garden to make the content more valuable and distinctive.
Check for noindex tags: Make sure your pages don’t have a meta tag with a noindex directive.
Example: You discover that one of your important product pages, “Product XYZ,” is not appearing in search results. Upon inspecting the page’s HTML, you notice a meta tag like this:
<meta name="robots" content="noindex">
You remove this tag to allow the page to be indexed by search engines.
Review your robots.txt file: Ensure that important pages are not blocked by the robots.txt file.
Example: You realize that your “Contact Us” page is not appearing in search results, even though it’s essential for users to find. After reviewing your robots.txt file, you find that the entire “/contact/” directory is disallowed for crawlers. You adjust your robots.txt file to allow crawling of this directory:
User-agent: *
Allow: /contact/
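As with the /dev/ example earlier, you can verify the corrected rules with Python’s urllib.robotparser before deploying them; the combined rule set below assumes the site keeps its /dev/ block alongside the new Allow line:

```python
import urllib.robotparser

# Assumed combined rules: staging stays blocked, the contact page is allowed.
rules = [
    "User-agent: *",
    "Disallow: /dev/",
    "Allow: /contact/",
]
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://www.example.com/contact/"))      # True
print(rp.can_fetch("*", "https://www.example.com/dev/staging"))   # False
```

Confirming both outcomes in one pass guards against a fix for one directory silently breaking another rule.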
Check for canonicalization: Ensure that canonical tags are set correctly and do not point to unintended URLs.
Example: You have a category page for “Running Shoes,” and you notice that there’s a canonical tag pointing to a different URL. In the HTML of the page, you find this tag:
<link rel="canonical" href="https://www.example.com/shoes/sports-shoes">
You update the canonical tag to correctly point to the “Running Shoes” category page.
Address technical issues: Resolve any crawl errors, server issues, or problems with page load speed and mobile-friendliness.
Example: Users have been reporting that your website loads slowly on mobile devices. You investigate and find that large, unoptimized images are causing slow load times. You compress and optimize the images, implement mobile-friendly design changes, and address other technical issues to improve the site’s performance and mobile-friendliness.
Verify manual actions: Check if Google has taken manual action against your site and address any issues accordingly.
Example: You notice that a significant portion of your website’s pages has been deindexed, and your organic search traffic has plummeted. You check Google Search Console and find a notification stating that your website has been penalized for unnatural link-building practices. You take immediate action by removing or disavowing the problematic backlinks and submit a reconsideration request to Google, explaining the steps you’ve taken to rectify the issue.
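For reference, the disavow file that Google’s disavow tool accepts is a plain UTF-8 text file with one entry per line: either a full URL or a domain-wide entry using the domain: prefix, with # marking comments. The domains below are placeholders, not real sites:

```text
# Backlinks identified as part of a paid link scheme (example entries)
domain:spammy-link-farm.example
domain:low-quality-directory.example
https://another-site.example/paid-links-page.html
```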
Regularly monitoring Google Search Console and addressing issues that prevent pages from being indexed is crucial for maintaining good visibility in Google’s search results.