It typically takes around a week for a new page to be indexed by Google after you submit a sitemap or request a crawl in Google Search Console. If you’ve waited this long, or even longer, and there’s still no sign of your page or website, then there could be other issues going on.
Having spent over a decade working in SEO in and around Liverpool, I’ve come across many sites and pages that, for some reason, just won’t appear, or at least won’t stay, in Google’s index.
Some of the most common causes of this include the following:
1. Noindex Tags
Noindex tags tell search engines not to index a page. If your page has a noindex tag, Google will not index it under any circumstances.
Noindex tags are sometimes used by developers on staging sites and could have been left on the site when the new site was put live.
A noindex tag will typically look something like this:
<meta name="robots" content="noindex">
To check if a page has a noindex tag, right-click on the page and select ‘View page source’.
Then press Ctrl+F to open the search box and type in ‘noindex’.
If you find a noindex tag, remove it so that the page can be indexed again.
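If you have more than a handful of pages to check, this is easy to script. Here’s a minimal Python sketch (standard library only, run against a made-up HTML snippet rather than a live site) that parses a page’s source and reports whether a robots meta tag contains a noindex directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in the page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())


def has_noindex(html: str) -> bool:
    """True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)


# Hypothetical page source for illustration:
page = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
print(has_noindex(page))  # True
```

In practice you would fetch each page’s HTML first (for example with `urllib.request`) and feed it into `has_noindex`, but checking the blocked directive itself works the same either way.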
One of the most common ways that the noindex tag is added to a page is via the Yoast plugin.
If you are using Yoast follow these steps to fix it:
- Log in to WordPress.
- Click ‘edit’ on the page where the noindex tag was found.
- Scroll down to the bottom and open up the Yoast drop-down section.
- Find the ‘Advanced’ section and make sure that where it says “Allow search engines to show this Post in search results?” it is set to “Yes”.

2. Your Robots.txt File Prevents Search Engines From Crawling Your Site
Another common reason for a website not appearing in Google is that the robots.txt file tells bots not to crawl it.
To check if this is the case on your site go to yourwebsite.com/robots.txt.
You want to look for code that looks like this:
User-agent: *
Disallow: /
This is telling all bots that they are disallowed from crawling any of your website.
Also look out for variations of that code such as:
User-agent: Googlebot
Disallow: /
This specifically prevents Google from crawling your site while allowing other bots to crawl it.
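If you’d rather test this programmatically, Python’s standard-library `urllib.robotparser` can evaluate robots.txt rules for a given user agent. A minimal sketch, with the example rules above pasted in as a string (in practice you would load your own site’s robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Example rules: block Googlebot from the whole site, leave other bots alone.
robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is disallowed everywhere; bots with no matching rules are unaffected.
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/some-page"))    # True
```

Swapping the first line for `User-agent: *` would make `can_fetch` return False for every bot, which is the all-bots block shown earlier.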
The most common way that this happens is when the ‘Discourage search engines from indexing this site’ check box has been selected in your WordPress settings.
To fix this follow these steps:
- Log into WordPress.
- Navigate to ‘Settings’ in the sidebar menu
- Under ‘Settings’ click ‘Reading’.
- Where it says ‘Search Engine Visibility’ make sure that the check box for ‘Discourage search engines from indexing this site’ is not selected.

3. Poor Internal Linking
Internal linking is an easy way to help your content be found on your own website.
Do it well and it can transform your rankings.
However, do it poorly and it can create confusion and have the opposite effect.
While poor internal linking shouldn’t cause a page to be deindexed, it will certainly cause the page to rank far worse than it otherwise could, making it much harder to find in the search results.
Here are four tips on how to do your internal links well in a way that will help improve the search visibility of your content:
- Don’t link to a page more than once from the same post; multiple links from the same page with different anchor text can make it confusing for bots to work out exactly what you want the target page to rank for.
- Use clear anchor text that makes it obvious to users what type of page they will land on if they click it.
- Only link from pages where it is relevant to do so, for example, don’t stuff links to all of your services in the conclusion of an article.
- Place links prominently in your content, ideally above the fold so that users are more likely to click them.
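The first tip above is easy to audit automatically: list every link and its anchor text on a page, then flag any URL that is linked more than once. A rough standard-library Python sketch, run here against a made-up HTML snippet:

```python
from collections import Counter
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects (href, anchor text) pairs from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []           # list of (href, anchor_text) tuples
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append((self._current_href, "".join(self._text_parts).strip()))
            self._current_href = None


# Hypothetical post content linking to the same URL twice with different anchors:
html = (
    '<p>Read our <a href="/services/seo">SEO services</a> page, '
    'or our <a href="/services/seo">guide to SEO</a>.</p>'
)
collector = LinkCollector()
collector.feed(html)

repeated = [href for href, n in Counter(h for h, _ in collector.links).items() if n > 1]
print(repeated)  # ['/services/seo']
```

Any URL in `repeated` is being linked multiple times from the same page, which is exactly the mixed-anchor-text situation the first tip warns against.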
4. Your Content Doesn’t Match The Search Intent
Search intent means understanding what exactly a user wants when they type a keyword into Google.
The easiest way to understand this is to test it out: simply type your target keyword into Google before you plan any content for it.
If you try to write an informational article for a search term that brings up eCommerce results, or vice versa, then you are going to struggle to rank.
Here’s an example:
Suppose you run a web store selling dresses and you want to rank a specific product page for the search term “blue and gold dress”.
You’re going to struggle to rank because most people searching for this want to see images of the blue and gold dress meme that circulated a few years back.

Google knows this and consequently serves results relating directly to that rather than serving eCommerce pages.
5. Your Content Is Not Good Enough
Good quality content is essential for success in search results.
If your content is clearly nowhere near as useful as whatever is ranking on page one then you are likely to have trouble ranking.
The pages and posts on your site need to be useful and informative.
Make sure to answer all user questions, provide all the information they may be looking for and where possible find ways to differentiate yourself from other sites trying to rank for the same search term.
This doesn’t mean all of your content needs to meet some arbitrary length; it does mean that you should strive to make your content as useful as possible for the user.
If Google decides that it is not particularly useful then you will struggle to rank.
6. Lack Of External Links
The internet is essentially a collection of linked documents. Search engine bots crawl these links, and following them around the web is how they find new content.
In theory, the more links that point to a page, the more often it will be crawled.
Conversely, if you have zero links to your website then it is not going to get crawled at all, making it far less likely that search bots will find it and index it.
To fix this you don’t need to invest tons of effort into running a big link acquisition campaign right from the get-go.
Simply securing one or two relevant links from trusted websites in your niche can be enough to get your content found and indexed.
7. Your Site Has Been Penalised
A final reason why you may be struggling to get your site to appear on Google is that it may have received a manual penalty.
Manual penalties occur when someone at Google has manually reviewed your site and found it to be breaching its guidelines.
To check if your site has received a manual penalty, open the Manual Actions report in your Google Search Console account.

If you see the ‘No issues detected’ message in there then you haven’t received a manual penalty.

However, if you see a different message, you should also be given guidance on what needs to be done to resolve the penalty. Once you have taken the necessary steps, you can request that Google re-review your site.