Webinomy Study: Most Common SEO Mistakes and How To Fix Them

Nobody wants to rank lower than their competitors in Google searches. To make your company’s website and online presence more visible, there are steps you should take to ensure the site is optimized for search engine visibility.


It’s essential to have a basic understanding of search engine optimization (SEO) if you want your websites to rank well in search engines. But the reality is that many people don’t, and as a result they get stuck when SEO issues arise. According to SEMrush’s study, website owners most often struggle with technical challenges. That’s hardly surprising, given that the typical SEO checklist has more than a dozen items that must be addressed before a site can be considered successful.

So which SEO issues do you actually need to concentrate on?

We decided to use actual data to find out. Using SEMrush’s Site Audit tool, we gathered anonymous data on 100,000 websites and 450 million pages to determine:

  • the top SEO issues
  • how many sites are affected by each issue

This article lists the most frequent on-page and technical SEO issues, along with information on how they may impact your search engine rankings. Our primary results are summarized in the infographic below, but you can discover considerably more if you continue reading.



1. Duplicate Content


Google defines duplicate content as “substantive blocks of content within or across domains that either completely match other content or are appreciably similar.”

According to our research, the most common SEO issue affecting websites is duplicate content, which we found on 50 percent of the sites we analyzed. During a recent Google Q&A session, Andrey Lipattsev, Search Quality Senior Strategist at Google, stated that there is no such thing as a duplicate content penalty. But that doesn’t mean it’s something you can just turn a blind eye to.

To begin with, having duplicate content on your website takes away your ability to choose which page you want to rank. Because search engines have no way of knowing which pages you intend as landing pages in the SERPs, duplicate pages may begin to compete with one another. Second, search engines are expressly built to make the web a better place for users, and both search engines and users value quality, original material.
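As a rough illustration of how an audit tool can surface exact duplicates, the sketch below fingerprints each page’s visible text and groups URLs that share a fingerprint. The crude tag-stripping regex and the helper names are our own, not taken from any particular tool:

```python
import hashlib
import re

def content_fingerprint(html_text: str) -> str:
    """Strip tags, collapse whitespace, and hash the remaining text so
    that pages with identical body copy produce the same digest."""
    text = re.sub(r"<[^>]+>", " ", html_text)         # crude tag removal
    text = re.sub(r"\s+", " ", text).strip().lower()  # normalize case/spacing
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    """Group page URLs by content fingerprint; groups of 2+ are duplicates."""
    groups = {}
    for url, html_text in pages.items():
        groups.setdefault(content_fingerprint(html_text), []).append(url)
    return {fp: urls for fp, urls in groups.items() if len(urls) > 1}
```

Real audit tools also catch *near*-duplicates; exact-match hashing is only the simplest version of the idea.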


2. Problems with Images


Images are a crucial component of content marketing, but they can also cause serious search engine optimization problems. According to our findings, 45 percent of websites contain images with missing alt tags, and another 10% have broken internal images. Both of these hurt.

Let’s begin with alt tags. Even though search engines have grown quite savvy, alt tags still play a crucial role in image search by helping them understand what images are about.

In other words, alt tags give textual descriptions of images, allowing search engines to classify them. This is one reason your image alt tags should include your SEO keyword phrases.

Alt tags are also important for visually impaired people who use screen readers, since the readers use the information in alt tags to describe images to web visitors. Images without alt tags can hardly be regarded as a sign that a website values its users, and search engines care deeply about user experience. They may result in a higher bounce rate, which can be one reason for poor search engine performance.
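A minimal alt-tag audit can be written with Python’s standard-library HTML parser. The class and function names here are illustrative, not from any specific tool:

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Both a missing alt and an empty alt="" are flagged here.
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

def audit_alt_tags(html_text: str) -> list:
    auditor = AltAuditor()
    auditor.feed(html_text)
    return auditor.missing_alt
```

Note that an empty `alt=""` is legitimate for purely decorative images, so in practice you would review the flagged list rather than blindly fill every entry.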

Broken images cause the same problems as broken links, which we’ll discuss further below. Both are dead ends for users and search engines, and the poor user experience they create may lead search engines to downgrade your website.

3. Problems with Title Tags


Title tags (page titles) are used by search engines to establish what pages are about. Because title tags display at the top of search results, they aid web visitors in deciding whether or not to click on your link. Title tags are also one of the most crucial SEO aspects on your website; properly designed title tags may have a significant beneficial influence on your results.

At SEMrush, we’ve seen four key SEO issues with title tags:

  • 35% of websites have duplicate title tags.
  • 15% of websites have title tags that are too long.
  • 8% of websites have missing title tags.
  • 4% of websites have title tags that are too short.

As previously said, Google strives to provide users with unique material. Missing or duplicate title tags convey no meaningful information about a page’s content to visitors or search engines, nor do they signal that a page is valuable.

The length of your title tag matters because it determines how much of your title appears in search results. According to the latest reports, Google may display 70-71 characters depending on the device being used, so it’s a good idea to keep the crucial information (including your chosen key phrase) within that range.

The majority of decent SEO tools will assist you in detecting duplicate, lengthy, short, or missing title tags. Check out our latest guide on title tag optimization for more information.
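The four checks above are easy to script yourself. This hypothetical helper classifies a crawl’s title tags into the four problem buckets, assuming you already have a URL-to-title mapping. The 70-character ceiling follows the guideline mentioned above; the 10-character floor is an arbitrary choice for the sketch:

```python
def audit_titles(titles: dict, max_len: int = 70, min_len: int = 10) -> dict:
    """Classify title-tag problems across a site.
    `titles` maps URL -> title text (None for a missing <title>)."""
    issues = {"missing": [], "too_long": [], "too_short": [], "duplicate": []}
    seen = {}  # title text -> list of URLs using it
    for url, title in titles.items():
        if not title:
            issues["missing"].append(url)
            continue
        if len(title) > max_len:
            issues["too_long"].append(url)
        elif len(title) < min_len:
            issues["too_short"].append(url)
        seen.setdefault(title, []).append(url)
    # Any title shared by two or more URLs is a duplicate group.
    issues["duplicate"] = [urls for urls in seen.values() if len(urls) > 1]
    return issues
```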



4. Problems with Meta Descriptions


Meta descriptions that appear in search results help visitors decide whether or not to visit your site. Although the relevance of the meta description has no direct impact on page ranking, it has a significant impact on page CTR.

According to our findings, 30% of websites contain duplicate meta descriptions, while 25% have no meta descriptions at all.

Our Site Audit tool will assist you in identifying both problems, but it’s critical that you go in and manually correct them if required. If you’re using WordPress, the correct SEO plugin may assist you in creating rules to guarantee that all of your content includes meta descriptions.
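For a quick spot check, a page’s meta description can be pulled out with the standard-library parser; a `None` return means the tag is absent. The names below are illustrative:

```python
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Capture the content of <meta name="description" content="...">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content")

def get_meta_description(html_text: str):
    finder = MetaDescriptionFinder()
    finder.feed(html_text)
    return finder.description
```

Run this over a crawl and the pages returning `None` are your missing-description list; collecting the returned strings and looking for repeats catches the duplicates.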


5. Internal and external links that are broken


Broken links are a major on-site SEO issue. As your site grows and content is updated, one or two broken links are bound to appear, and they’re not a problem if your 404 page is correctly set up. But what if you have hundreds of them?

Broken links pose a risk for several reasons. The first is that traffic drops when a visitor is sent to a 404 page rather than the relevant content they sought. Furthermore, users will come to consider your website to be of poor quality.

Second, broken links waste crawl budget. When search engine bots visit your website, they crawl only a portion of it rather than the full site. A lot of broken links risks diverting the bots away from your pages, which is bad because those pages won’t be crawled and indexed.

According to our findings, 35% of the websites we scanned contain broken internal links that return error HTTP status codes (70 percent of those return a 4xx – page not found or similar – code).

We found broken external links on 25% of the sites we looked at. This problem can limit the number of pages that appear in search results and lower page authority, so it’s something you should address.

You may always use our Site Audit tool or a link checker plugin to find broken links and then repair them all. You may also look for broken links and contact webmasters to recommend a new resource on your site for them to connect to.
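A simple broken-link audit follows the same shape as those tools: extract every `<a href>`, fetch each target’s status, and split the failures into internal and external. In this sketch the HTTP lookup is injected as a callable so it runs without a network; in practice it would be a HEAD request:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def audit_links(html_text: str, base_host: str, status_of) -> dict:
    """Report links whose status is 4xx/5xx, split internal vs external.
    `status_of` is a callable URL -> HTTP status code."""
    extractor = LinkExtractor()
    extractor.feed(html_text)
    broken = {"internal": [], "external": []}
    for href in extractor.links:
        if status_of(href) >= 400:
            host = urlparse(href).netloc
            kind = "internal" if (not host or host == base_host) else "external"
            broken[kind].append(href)
    return broken
```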

Check out 5 Steps to Get Your Website Crawled Faster and the definitive list of crawlability faults – 18 Reasons Your Website is Crawler-Unfriendly: Guide to Crawlability Issues for more information on crawl errors and broken links.

6. Text-to-HTML Ratio Is Low


On 28% of the sites we looked at, the text-to-HTML ratio was alarmingly low. This means these pages carry more back-end HTML code than text that users can actually read. We suggest 20 percent as a reasonable lower limit. This alert is often a hint that you have other SEO ranking concerns that need to be addressed. A low text-to-HTML ratio, for example, might indicate:

  • A badly coded page (invalid code and excessive JavaScript, Flash, and inline styling)
  • Hidden text, which is a red flag for search engines since it is a spammer tactic
  • A sluggish site: the more code and script on a page, the longer it takes to load, and page speed is a key SEO factor

Check any pages where this notice occurs and fix the problem by:

  • Removing unnecessary code to make the page smaller and faster
  • Moving inline scripts and styling into separate files
  • Adding pertinent on-page content where appropriate
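The ratio itself is straightforward to approximate: strip script and style bodies, strip tags, and compare the remaining visible text against the full page size. This is only a rough stand-in for how audit tools compute it:

```python
import re

def text_to_html_ratio(html_text: str) -> float:
    """Visible-text characters divided by total page characters.
    Script and style bodies count as markup, not readable text."""
    stripped = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html_text,
                      flags=re.DOTALL | re.IGNORECASE)
    text = re.sub(r"<[^>]+>", " ", stripped)      # drop remaining tags
    text = re.sub(r"\s+", " ", text).strip()      # collapse whitespace
    return len(text) / len(html_text) if html_text else 0.0
```

Against the 20 percent guideline above, a page scoring below `0.20` would trigger the warning.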

7. Problems with the H1 Tag


Header tags have long been a crucial aspect of SEO, since they indicate the most significant material on a website. On every page, there should only be one H1 element, which is usually the content title. Despite the fact that HTML5 has altered the way header tags are used (you may now have more than one H1 on a page), header tags continue to provide a valuable hierarchy for both search engines and web visitors.

According to our findings, 20% of the sites we examined had several H1 tags, 20% had no H1 tags, and 15% had duplicate content in their title tag and H1.

It’s crucial to understand the distinction between title and header tags. Your title tag’s content appears in search results, whereas header tags are what your reader sees on your website. 

As previously stated, multiple H1 tags may be used on a page, but only if the appropriate HTML5 sectioning syntax is used to differentiate between sections of equal weight; this Tuts+ lesson covers the details. In all other circumstances, stick to a single H1. Also, make sure your H1s are similar to, but not identical to, your title tags (for SEO purposes).
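A page’s H1 hygiene can be checked with a few lines of standard-library Python; the issue labels and function names here are our own:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Record the text of every <h1> on the page."""
    def __init__(self):
        super().__init__()
        self.h1s = []
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self._in_h1 = True
            self.h1s.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_h1:
            self.h1s[-1] += data

def audit_h1(html_text: str, title: str) -> list:
    """Return labels for the three H1 problems discussed above."""
    counter = H1Counter()
    counter.feed(html_text)
    issues = []
    if not counter.h1s:
        issues.append("missing_h1")
    elif len(counter.h1s) > 1:
        issues.append("multiple_h1")
    if any(h.strip() == title for h in counter.h1s):
        issues.append("h1_duplicates_title")
    return issues
</antml>```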

8. Word Count Is Low


We found that 18% of the websites we crawled had pages with a low word count. On the one hand, there is no minimum word count for a page, which makes word count a difficult SEO statistic. On the other hand, Google is renowned for favoring material with more depth, and lengthier content is one sign of depth, particularly if fluff is avoided.

We discussed the necessity of incorporating useful on-page content wherever feasible in onsite issue number six. You should do everything possible to provide your material depth and value to readers. Consider this: don’t you love it when an infographic’s designer goes out of their way to give more context? We do, and your readers will as well.

9. An Excessive Number of On-Page Links


Linking is a difficult skill to master, which is presumably why 15% of the sites we looked at had pages with an excessive number of on-page links. While Google no longer insists that the number of links on a page stay under a set limit, good SEO still means a natural link profile with relevant, high-quality links.

Too many links may dilute the value of your website and drive away the majority of your visitors. However, if the links are good and relevant, your site will still rank highly.

To resolve an on-page link problem, perform a link audit and make sure every link on the page in question adds value; get rid of the ones that don’t. This will boost your SEO and create a better user experience.

10. Incorrect Language Declaration


Our online audience is international. That’s why it’s critical to include a language declaration that specifies the page’s default language. According to our analysis, 12% of websites get this wrong. A language declaration is a useful tool that allows you to:

  • Notify browsers of the content’s language (useful for translation and page display)
  • Let text-to-speech converters read your content in the right dialect (for example, Castilian vs. Latin American Spanish)
  • Aid geolocation and international SEO

Make sure you get the language declaration right by using this list of language codes. Google will use the declaration to ensure that the right people see the right content. While this may not directly affect your rankings, it helps increase the relevance of your pages, which is an essential aspect of SEO.
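Checking for the declaration is nearly a one-liner against the `<html>` tag; this sketch only handles the common `lang="…"` attribute form, not `xml:lang` or HTTP headers:

```python
import re

def get_declared_language(html_text: str):
    """Return the lang attribute of the <html> tag, or None if absent."""
    match = re.search(r"<html[^>]*\blang=[\"']([^\"']+)[\"']", html_text,
                      re.IGNORECASE)
    return match.group(1) if match else None
```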

11. Temporary Redirects


Redirects are a great way to tell search engines that a page has moved so you don’t lose its page authority. In terms of SEO, however, there is a significant difference between permanent (301) and temporary (302) redirects. According to our research, temporary redirects were found on 10% of the sites we looked at.

If you use a 302 redirect, search engines may continue to index the old URL while disregarding the page it redirects to. So if the change is permanent, implement a permanent redirect; according to Moz, that is the best strategy.

Of course, Google may eventually recognize that a 302 redirect is actually permanent and treat it as a 301, but it’s best to take control of the process yourself to prevent bad SEO and traffic loss.
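To see whether a redirect chain hides a temporary hop, you can follow it and record each status code. The `fetch` callable below stands in for a HEAD request so the sketch is self-contained and testable without a network:

```python
def audit_redirect_chain(start_url: str, fetch, max_hops: int = 10):
    """Follow a redirect chain and flag temporary (302/307) hops.
    `fetch` maps URL -> (status_code, location); location is None
    when the response is not a redirect."""
    hops, temporary = [], []
    url = start_url
    for _ in range(max_hops):  # guard against redirect loops
        status, location = fetch(url)
        hops.append((url, status))
        if status in (302, 307):
            temporary.append(url)
        if status in (301, 302, 307, 308) and location:
            url = location
        else:
            break
    return hops, temporary
```

Any URL in the `temporary` list whose move is actually permanent is a candidate for conversion to a 301.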


These 11 common flaws are harming many sites’ SEO efforts. According to our findings, many websites have major on-page SEO issues, though not all of them influence your rankings equally. Bad SEO does not necessarily arise from a clear violation of search engine guidelines; it can also emerge from a failure to properly care for your website and its visitors.

Yes, a temporary redirect may not hurt your rankings as much as other on-page errors, and no, there is no such thing as a duplicate content or title tag penalty. However, you should be aware of these issues, since they affect your traffic and user trust, and as a consequence your revenue. And there’s always the possibility that a series of little errors adds up to a pile of SEO issues that devastates your website’s SERP rankings.

Are you entirely aware of all of the difficulties with your website? If you’re forewarned, you’re forearmed. We hope you find this information useful and informative as you work your way to the top of the SERPs!



Frequently Asked Questions

What are the most common SEO mistakes?

A: One of the most common mistakes website owners make is not being sure what their target audience wants and needs. Fix this by determining which type of content best suits your site and creating it accordingly, to avoid confusing readers or potential customers.

What are some common SEO mistakes you see companies make?

A: Many people throughout the web claim that search engine optimization is an art form. That isn’t entirely true, as it has much more to do with science and mathematics than creativity or art. There are many mistakes companies make when they try to optimize their website for better rankings on Google.

What are the common SEO mistakes to be avoided?

A: The biggest mistake that marketers make is putting a lot of focus on keywords and meta tags, making sure they have the best ones possible. They also often forget about how important it is to use internal linking between your pages.

