Why Is My Website Not Showing Up on Google: 15 Possible Causes

People often ask Google what they can do to get their website’s organic rankings onto the first page of search results. This article outlines 15 possible causes for why your site is not showing up in Google, and how you might fix each one.


Before you go through the list of reasons why your site might not be showing up, go over this brief pre-checklist of things to do and things you should have in place. While they may not solve the problem outright, they can help you narrow it down and avoid having to go through the checklist item by item to find the underlying cause.

Pre-Audit Checklist

  • Are you using Google Search Console?

  • Have you set up Bing Webmaster Tools? 

  • Is Google Analytics up and running and gathering data? If you’re not sure, put it to the test first.

  • Use the SEMrush Audit tool and/or your preferred crawler to crawl your site.

If you are more advanced, you should also consider additional checks beyond this list.

Now that you’ve completed those tasks, we’ll look at the issues that may be making your site invisible in Google.

1. Search Engine Visibility Is Set to “Discourage” in WordPress

This is a WordPress option that is often switched on during development and then forgotten. When the box next to “Discourage search engines from indexing this site” is checked, WordPress adjusts your robots.txt and instructs search engines like Bing and Google to ignore your site.


This option is the first thing I look at when I hear a site owner complain about having trouble getting into search engines, whether their site has been open for days, weeks, or years.

2. Crawling Is Disallowed in Robots.txt

Go to http://yourdomain.com/robots.txt or https://www.yourdomain.com/robots.txt to see whether this is the case. If you see the following code in your robots.txt file, you are instructing search engines not to crawl your site:

User-agent: *
Disallow: /

What causes this? It happens for a variety of reasons and in a variety of ways:

  • In WordPress, the Search Engine Visibility option was chosen (see #1).

  • A Dev or Staging version of the code was pushed to Production along with a robots.txt file that included the disallow rule.

  • A typo crept into the file, or someone misunderstood how robots.txt works.

The following is an example that often shocks me:

User-agent: *
Disallow:

While this indicates that you want ALL search engines to crawl your site, it is also only one character away from blocking ALL engines. Normally, I would advise against placing this in your robots.txt file, since I am always afraid that someone may accidentally add a “/” and remove the site from Google and Bing.

Tools for Testing
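Beyond loading the file in your browser, you can test it programmatically. Here is a minimal sketch using Python’s standard-library urllib.robotparser; “example.com” is a placeholder you would swap for your own domain:

```python
# Check whether robots.txt blocks major crawlers (standard library only).
from urllib.robotparser import RobotFileParser

site = "https://www.example.com"  # placeholder: replace with your domain
rp = RobotFileParser(site + "/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# can_fetch() answers: may this user-agent crawl this URL?
for agent in ("Googlebot", "Bingbot", "*"):
    print(agent, "may crawl the homepage:", rp.can_fetch(agent, site + "/"))
```

If this prints False across the board, your robots.txt is the culprit.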

3. The “noindex” Meta Tag

This, like #1 and #2, is a sure-fire approach to remove a page or an entire site from Google. This may be hard-coded into a header/site template or introduced to a site using a WordPress plugin.


<meta name="robots" content="noindex"/>

To see whether this is the problem, view a page’s source in your browser and search for “noindex”.

Another option is to scan your site with your preferred crawler or run an audit using the SEMrush site audit tool to see which pages include the meta tag. Both should provide you with a list of pages, after which you may go further into your CMS or code to identify the problem.
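If you would rather script the check, here is a hedged sketch using only Python’s standard library to fetch a page and look for a robots meta tag containing “noindex”; the URL is a placeholder:

```python
# Detect a noindex robots meta tag on a single page (standard library only).
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

url = "https://www.example.com/"  # placeholder: the page to check
html = urlopen(url).read().decode("utf-8", errors="ignore")
finder = RobotsMetaFinder()
finder.feed(html)
print("noindex meta tag found:", finder.noindex)
```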

4. Site or Host Migration

Have you recently switched to a new CMS, theme, or web host, moved from HTTP to HTTPS, or made any other major changes to your site? Did you update a lot of URLs and use a lot of redirects, or did you use no redirects at all? Any of these changes might be hurting your site’s ranking and leaving search engines quite confused. 

There are checklists to assist you with HTTP -> HTTPS moves and website migrations; even if you’ve already completed the switch, they should help you troubleshoot.

If you’ve previously completed a migration, it’s quite likely that you overlooked something on one of these checks.

5. JavaScript Navigation and Mega Menus

These are separate concerns, yet they often appear together. Because crawling is difficult for engines when all of your navigation is in JS, your deeper internal pages may become islands or orphan pages.

If your site is entirely constructed in JavaScript, you may face much more serious problems than simply a Mega Menu or a JavaScript Menu. You may want to look at Bartosz and the Onely team’s study on Javascript SEO, as well as a Google series on JavaScript SEO.

Even if you installed a Mega Menu in an SEO-friendly manner, they might still cause SEO and crawling issues in other ways. A conversation initiated by John Mueller on Twitter in which a lot of individuals support, despise, and applaud Mega Menus could be of interest to you.

Mega Menus, on the other hand, may be problematic in their own right and deserve to be included on this list. If you’re experiencing problems with crawlers, check your menu code to see if it’s generating 1,000s or 10,000s of lines of code, which might be the source of your problems.

6. The Use of Redirects

Redirects are one of the key reasons why #4 is so often a problem; the issue is typically either a lack of redirects or a chain of redirects. If you move your site to a new domain or move all of your pages to new URLs without redirects, crawlers will only know about your previous site and pages. Between missing redirects and redirect chains, I’ve seen redirects be the most common obstacle to getting sites and pages into search engines.

If you’re not sure what a “redirect chain” is, let me explain. It’s a series of page hops, each answered with a 301 or 302 response. A browser or crawler might see something like this:

Land on site.com/page.php -> 301 redirect -> site.com/productpage.php -> 302 redirect -> site.com/productpage5.php -> 302 redirect -> site.com/page3.php -> 200 response, finally.
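To trace a chain yourself, here is a minimal sketch using the third-party requests library; it follows each hop manually (allow_redirects=False) so every status code is visible. The starting URL is a placeholder:

```python
# Print every hop in a redirect chain, one status code per line.
import requests  # third-party: pip install requests

url = "http://site.com/page.php"  # placeholder: the URL to trace
for _ in range(10):  # cap the hops so a redirect loop cannot run forever
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    if resp.status_code in (301, 302, 303, 307, 308):
        # The Location header may be relative; resolve it against the current URL.
        url = requests.compat.urljoin(url, resp.headers["Location"])
    else:
        break  # a 200, 404, etc. ends the chain
```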

There has been some discussion of John Mueller’s statements concerning redirects, page rank, and why you should attempt to prevent redirect chains (from late 2018). 

7. WWW or Non-WWW Redirects

While this might technically go under #6, I felt it was important to separate it out. I had a customer, a SaaS firm, that of course had a login page for the service. They also had white-label partners and an extensive number of code checks to ensure the correct login page skin was shown depending on cookies, referrer, and other factors.

What a typical user never saw was that the page, no matter what, would go through a series of redirects. The statuses of these hops included 404, 302, 301, 301, and a few more; in short, it was a jumble. Meanwhile, the product team was worried that, no matter what we did, the login page would never appear in Google.

As you can imagine, this is because Google had no idea what the proper login page URL was. What do your www, non-www, HTTPS www, and HTTPS non-www homepages do? Do they all arrive at one destination, or do they bounce around like a tennis ball during a match?

If you’re unsure, I suggest checking https://httpstatus.io/ for a quick answer. Simply enter your domain, choose “Canonical domain check,” and click Check Status.

What you want to see is three of your domain’s four versions redirecting to the fourth, as seen below:

[Image: httpstatus.io canonical domain check, with three versions of the domain redirecting to the fourth]
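If you would like to reproduce that canonical domain check in a script, here is a rough sketch with the requests library; “example.com” stands in for your domain:

```python
# Request all four versions of a homepage and report where each one lands.
import requests  # third-party: pip install requests

variants = (
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
)
for start in variants:
    resp = requests.get(start, timeout=10)  # follows redirects by default
    print(f"{start} -> {resp.url} ({resp.status_code})")
```

Ideally, all four lines end at the same final URL with a 200 status.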

8. Click Depth or Deep Content

Click depth, sometimes referred to as “clicks from the homepage,” measures how many clicks it takes to reach a page, and pagination is one of the factors that drives it up. Deep content, or material that is many clicks away from the homepage, is most often seen on huge e-commerce sites and blogs.

You can easily monitor your click depth, and see which content is currently affected, using your site crawler or the SEMrush Audit Tool. Both will help you locate pages that are 5, 6, 7, or even 10 clicks away from the homepage.


Here’s an example of pagination from the SEMrush Blog, which is often a source of deep content:

[Image: paginated archive pages on the SEMrush Blog]

Internal links may be found in a variety of places on the SEMrush Blog, including tags, user/author pages, internal links in blog entries, and more. However, on many sites these additional ways of internal linking are not as effectively implemented, and the site instead relies on the blog homepage and maybe a few categories to link to hundreds, thousands, or even millions of items and pages. Read Arsen’s article “Pagination: You’re Doing it Wrong” for a more in-depth look into pagination.

So, if the majority of your content isn’t in Google or Bing but is 5, 6, or even 10 clicks away from the homepage, I recommend doing an audit and looking for ways to improve internal linking to those pages.
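As a rough illustration of how a crawler measures click depth, here is a simplified breadth-first crawl in Python (using requests plus the standard-library HTML parser). It ignores robots.txt, rate limits, and URL normalization, so treat it as a sketch rather than a production crawler; the homepage URL is a placeholder:

```python
# Breadth-first crawl that records each internal page's click depth.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests  # third-party: pip install requests

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

start = "https://www.example.com/"  # placeholder: your homepage
host = urlparse(start).netloc
depths = {start: 0}                 # page URL -> clicks from the homepage
queue = deque([start])

while queue and len(depths) < 200:  # small page cap for the sketch
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        url = urljoin(page, href).split("#")[0]
        if urlparse(url).netloc == host and url not in depths:
            depths[url] = depths[page] + 1
            queue.append(url)

# The deepest pages are the ones that need better internal links.
for url, depth in sorted(depths.items(), key=lambda kv: -kv[1])[:10]:
    print(depth, url)
```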

9. Page Load Time / Page Speed

While I’ve never seen this prevent a whole site from appearing in Google/Bing, I have seen it prevent certain pages or page categories from appearing. Just because I haven’t personally seen the effect of sluggish page performance on a whole site doesn’t imply it hasn’t occurred.

If your server does not reply or responds slowly to search engines, it will seem as if your site is sluggish (which it is technically), and your site’s ability to rank highly or at all may be jeopardized. Perhaps more crucially, I’ve seen sluggish pages affect a page’s ability to convert people into leads and purchases, and the effect on your bottom line is the most critical factor to consider.

You may use a variety of tools to evaluate your site’s performance, including Google Search Console’s new speed report, Google Analytics, Lighthouse in Chrome, or any of the numerous free testers accessible online. 
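For a very rough first check of server response time (not full page render time), a single request with the requests library is enough; the URL is a placeholder:

```python
# Measure roughly how long the server takes to return the HTML.
import requests  # third-party: pip install requests

resp = requests.get("https://www.example.com/", timeout=30)
# .elapsed covers the time from sending the request until the response
# arrived, i.e., roughly server response time, not full page rendering.
print(f"Time to fetch HTML: {resp.elapsed.total_seconds():.2f}s")
```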

[Image: the Speed report in Google Search Console]

Image source: https://webmasters.googleblog.com/2019/11/search-console-speed-report.html

10. Faceted Navigation

If you’re still reading this list and your site hasn’t been flagged for Duplicate or Thin content, it’s possible that the problem is with your eCommerce site’s faceted navigation. While this isn’t exclusive to e-commerce sites, it’s where it’s most often seen.

So, what exactly is faceted navigation, and why might it be causing problems for search engines? Spider traps. Crawl budget. Duplicate content. These are only a few of the harmful consequences of sloppy faceted navigation.

Overall, faceted navigation may be beneficial to both users and crawlers, but there are a few challenges your site may be experiencing that affect its ability to be crawled and ranked well:

  • URL parameters appear in different orders, making the same page look new to Google. The URLs below all display the same list, yet they appear to be three separate pages (see the sketch after this list).

    • /mens/?type=shoes&color=red&size=10

    • /mens/?color=red&size=10&type=shoes

    • /mens/?type=shoes&size=10&color=red

  • Multiple paths lead to the same content, for example site navigation vs. site search:

    • When you drill down to a Mountable & Smart Capable TV through Best Buy’s category navigation, you get a long faceted URL full of encoded filter parameters.

    • However, a search URL (https://www.bestbuy.com/site/searchpage.jsp followed by a long string of encoded parameters) is the version doing quite well in Google.

  • Crawl depth limitation: sometimes restricting which facets can be crawled also limits the pages that can rank, so you’re not putting your best foot forward. Example: 

    • Drill down to https://www.grainger.com/category/machining/drilling-and-holemaking/counterbores-port-tools/counterbores-with-built-in-pilots?attrs=Material+-+Machining%7CHigh+Speed+Steel&filters=attrs

    • When searching Google for “High-Speed Steel Counterbores with Built-In Pilots,” https://www.grainger.com/category/machining/drilling-and-holemaking/counterbores-port-tools/counterbores-with-built-in-pilots returns a result that is close but not identical.
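Here is the sketch promised above: a few lines of Python showing why parameter order alone makes identical pages look distinct, and how sorting the query string collapses them into one canonical form:

```python
# Normalize URLs by sorting their query parameters.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def normalize(url: str) -> str:
    parts = urlparse(url)
    query = urlencode(sorted(parse_qsl(parts.query)))  # sort key=value pairs
    return urlunparse(parts._replace(query=query))

urls = [
    "/mens/?type=shoes&color=red&size=10",
    "/mens/?color=red&size=10&type=shoes",
    "/mens/?type=shoes&size=10&color=red",
]
print({normalize(u) for u in urls})  # all three collapse to a single URL
```

This is the same idea a canonical tag expresses for search engines: many parameter orderings, one authoritative URL.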

While each instance is different, the following are common solutions:

  • Canonical tags

  • Allowing or disallowing indexing via meta tags

  • Allowing or disallowing crawling via robots.txt

  • Nofollow on internal links

  • Hiding links with JavaScript and other methods

11. You Have a Manual or Algorithmic Penalty on Your Website

In some ways, this is the polar opposite of #12. Instead of having no links, your site most likely has a lot of them, but they are harmful and questionable. You might have been hacked and removed from the index, but the most likely reason is that you have too many bad links.

So, how can you know whether you’re facing a penalty? Google and Bing should have informed you. To double-check, open these tools and look for further information:

  • Open Google Search Console.

  • Open Bing Webmaster Tools.

  • Open the email account linked with Google Search Console (or the email address from which the notifications are received). See whether you missed or mistakenly deleted a Google notification by searching for “google” and “penalty.”

I won’t get into what constitutes a bad vs. a good link, since it has already been addressed on the site, but if you do have a penalty due to links, you should read Ross Tavendale’s Weekly Wisdom on Evaluating Links.

12. Your Website Is Missing Links

It’s highly likely that if you have a new site, you have no or very few links. Building them may take some time, since no site begins with hundreds of thousands of links. To see how many links you have, go to Google Search Console or SEMrush Backlink Analytics and look up your site.


Google and Bing will crawl your site even if it has no links, but you are unlikely to appear for many competitive keywords without them. So, if your site is technically sound, you have findable content, and your onsite SEO is generally in order, the possibility that you lack links and that search engines haven’t prioritized your site is very real.

I’ve ranked sites with no or few links on a regular basis; however, this isn’t the case for every site and sector.

13. Your Website Lacks Content (& Context)

When it comes to visuals vs. content, the visual side usually wins. This leads to numerous sites that are aesthetically stunning but leave little to no room for information and context. Consider an art gallery where the flooring and walls are white or extremely plain, allowing the work to take center stage, with perhaps a tiny plaque next to each piece giving the artist’s name and the title of the work. If your site looks the same, there is no context, and search engines rely on context to learn about the art, the creator, and why it’s in the gallery.

Frequently, websites put all of their copy inside an image, use only scant text on a page, or never actually spell out in words what the company does. It’s crucial to identify your viewpoint and convey your story, as my cohost explained in our Podcasting Guide.

Test your site and pages by copying just the primary text and showing it to someone: ask them what the page is about, what questions they have, and have them answer questions about your company, services, or product based on that text alone. If you have thin content and little context, they will most likely do poorly on all of those tasks.
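To run that text-only test quickly, here is a rough sketch using Python’s standard-library HTML parser to strip tags (and script/style content) from a page; the sample HTML is a stand-in for your own page source:

```python
# Extract only the visible text from an HTML document.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = False    # True while inside <script> or <style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

extractor = TextExtractor()
extractor.feed("<html><body><h1>Fresh Seafood</h1><p>Daily catch.</p></body></html>")
print(" ".join(extractor.chunks))  # -> Fresh Seafood Daily catch.
```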

How would a consumer or search engine know what your site is about if you are a seafood restaurant but never mention seafood or any of the other varieties of fish? How would a consumer or a search engine realize that you are a lawn care provider if you don’t include any of your services?

How would a consumer or a search engine comprehend what you’re offering if your product description is the same as everyone else’s and is just 30 words long?

14. You Have a TON of Duplicate Content

If you have a ten-page site and six of those pages target cities around you, all containing 80-99% of the same material, you’re going to have trouble ranking. The same goes for a site with 10-30 blog entries but 10 category pages and 100 tag pages. With that much thin and duplicate material, your site will have a poor perceived value.

Consider how a person searching for something on your site might feel about those thin pages, and ask why a search engine would want to rank, let alone crawl, them.

You may learn more about recognizing duplicate material in How to Identify Duplicate Content, which will take you through the process of scraping your website.

The SEMrush Site Audit tool is another alternative for locating duplicate material. If the SEMrush Site Audit bot detects numerous pages with 80% content similarity, it will mark them as duplicate content. Furthermore, pages with insufficient material may be considered duplicate content. After you’ve completed an audit, you’ll get the following screen:

[Image: duplicate content flagged in the SEMrush Site Audit tool]

The problem in the example above was that Page 2 was meant to be redirected in a recent upgrade but was not.

So, if you’re having trouble improving your search engine rankings, look at your internal pages to determine whether they’re too similar to one another, and compare your content to that of other sites. When it comes to affiliate and e-commerce sites, it’s not uncommon for tens, if not hundreds, of them to use the identical product description. If you have 100 products on your site and none of their descriptions is unique, getting ranked for the keywords you want will be difficult.
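To make the similarity idea concrete, here is a minimal sketch using Python’s standard-library difflib on two hypothetical product descriptions; real audit tools use more robust comparison methods, but the principle is the same:

```python
# Compare two blocks of page text and report how similar they are.
from difflib import SequenceMatcher

page_a = "High speed steel counterbore with built-in pilot. Ships in 2 days."
page_b = "High speed steel counterbore with built-in pilot. Ships in 5 days."

ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {ratio:.0%}")  # anything near 80%+ deserves a closer look
```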

15. Your Keywords Are In Demand

This is one of the first things I’d suggest looking into, but it may also be the last. Frequently, the problem isn’t that a site isn’t showing up at all, but that it isn’t showing up for the industry and target keywords that a CEO or someone else wants it to rank for.

Take a look at this post by Nikolai Boroda on keyword research, and try to better define your target keywords based on it (and on your boss’s expectations).

This isn’t to suggest you won’t fight for the most competitive keywords in your sector one day, but if your site is new, you’ll need to work your way up to them, starting with your brand and longer-tail keywords.

Have you ever had to deal with any of these issues? In the comments, tell us how you coped with them. 


