
7 SEO Crawling Tool Warnings & Errors You Can Safely Ignore

In many cases, what an SEO crawler marks as a fatal error needs immediate attention. Sometimes, though, it's not an error at all.

This can happen even with the most popular SEO crawling tools, such as Semrush Site Audit, Ahrefs Site Audit, Sitebulb, and Screaming Frog.

How can you tell the difference, so that you avoid prioritizing a fix that doesn't need to be made?

Here are a few real-life examples of such warnings and errors, together with explanations as to why they may or may not be an issue for your website.

1. Indexability Issues (Noindex Pages on the Site)

Any SEO crawler will highlight and warn you about non-indexable pages on the site. Depending on the crawler, noindex pages may be marked as warnings, errors, or insights.

Here's how this issue is marked in Ahrefs Site Audit:

Noindex page issue details from Ahrefs Site Audit. Screenshot from Ahrefs Site Audit, September 2021.

The Google Search Console Coverage report may also mark non-indexable pages as Errors (if the site has non-indexable pages in the submitted sitemap) or Excluded, even though they are not actual issues.


This is, again, only information saying that these URLs cannot be indexed.

Here's what it looks like in GSC:

Non-indexable pages marked as Errors in the Google Search Console Coverage report. Screenshot from Google Search Console, September 2021.

The fact that a URL has a "noindex" tag on it doesn't necessarily mean that this is an error. It only means that the page cannot be indexed by Google and other search engines.

The "noindex" tag is one of two possible directives for crawlers, the other one being to index the page.


Virtually every website contains URLs that shouldn't be indexed by Google.

These may include, for example, tag pages (and sometimes category pages as well), login pages, password reset pages, or a thank-you page.

Your task, as an SEO professional, is to review the noindex pages on the site and decide whether they should indeed be blocked from indexing, or whether the "noindex" tag may have been added by accident.
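If the crawler flags a long list of URLs, you can spot-check them programmatically rather than opening each one. Below is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages are installed; the URLs in the list are hypothetical placeholders for pages your own crawler flagged:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical list of URLs your crawler flagged as non-indexable.
FLAGGED_URLS = [
    "https://example.com/tag/news/",
    "https://example.com/blog/important-post/",
]

for url in FLAGGED_URLS:
    response = requests.get(url, timeout=10)
    # A noindex directive can live in the HTTP response header...
    header_directive = response.headers.get("X-Robots-Tag", "")
    # ...or in a <meta name="robots"> tag in the HTML.
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_directive = meta["content"] if meta and meta.has_attr("content") else ""
    if "noindex" in header_directive.lower() or "noindex" in meta_directive.lower():
        print(f"{url}: noindex found - confirm this page should stay out of the index")
    else:
        print(f"{url}: no noindex directive detected")
```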

2. Meta Description Too Short or Empty

SEO crawlers will also check the meta elements of the site, including meta description elements. If the site doesn't have meta descriptions, or if they are too short (usually under 110 characters), the crawler will mark that as an issue.

Here's what that looks like in Ahrefs:

Meta description element issue in Ahrefs. Screenshot from Ahrefs Site Audit, September 2021.

Here is how Screaming Frog displays it:

Meta element issue in the Screaming Frog report. Screenshot from Screaming Frog, September 2021.

Depending on the size of the site, it isn't always possible and/or feasible to create unique meta descriptions for all of its webpages. You may not need them, either.

A good example of a website where it may not make sense is a huge ecommerce site with millions of URLs.

In fact, the bigger the site is, the less important this element becomes.

The content of the meta description element, in contrast to the content of the title tag, isn't taken into account by Google and doesn't influence rankings.

Search snippets sometimes use the meta description but are often rewritten by Google.

Here's what Google has to say about it in their Advanced SEO documentation:

"Snippets are automatically created from page content. Snippets are designed to emphasize and preview the page content that best relates to a user's specific search: this means that a page might show different snippets for different searches."

What you as an SEO need to do is recognize that every website is different. Use your common SEO sense when deciding whether meta descriptions are indeed an issue for that specific website, or whether you can safely ignore the warning.
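If you want to verify lengths yourself rather than take the crawler's word for it, the check is simple. A minimal sketch, again assuming requests and beautifulsoup4, with a hypothetical URL standing in for one of your own pages:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-page/"  # hypothetical page to check
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

meta = soup.find("meta", attrs={"name": "description"})
description = meta.get("content", "").strip() if meta else ""

# Crawlers typically flag descriptions under ~110 characters as "too short."
if not description:
    print("No meta description - decide whether this page actually needs one.")
elif len(description) < 110:
    print(f"Short meta description ({len(description)} chars): {description}")
else:
    print(f"OK ({len(description)} chars)")
```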


3. Meta Keywords Missing

Meta keywords were used 20+ years ago as a way to indicate to search engines such as AltaVista what keywords a given URL wanted to rank for.

This was, however, heavily abused. Meta keywords were a kind of "spam magnet," so the majority of search engines dropped support for this element.

Screaming Frog, by default, always checks whether there are meta keywords on the site.

Since this is an obsolete SEO element, 99% of sites don't use meta keywords anymore.

Here's what it looks like in Screaming Frog:

Screaming Frog highlighting that meta keywords are missing on the site. Screenshot from Screaming Frog, September 2021.

New SEO pros or clients may get confused, thinking that if a crawler marks something as missing, then this element should actually be added to the site. But that's not the case here!


If meta keywords are missing on the site you're auditing, it's a waste of time to recommend adding them.

4. Images Over 100 KB

It's important to optimize and compress the images used on the site so that a gigantic PNG logo that weighs 10 MB doesn't have to be loaded on every webpage.

However, it's not always possible to compress all images to under 100 KB.

Screaming Frog will always highlight and warn you about images that are over 100 KB. This is what it looks like in the tool:

Screaming Frog highlighting images that are over 100 KB. Screenshot from Screaming Frog, September 2021.

The fact that the site has images over 100 KB doesn't necessarily mean that it has image optimization issues or is very slow.


When you see this error, make sure to check the website's overall speed and performance in Google PageSpeed Insights and the Google Search Console Core Web Vitals report.

If the site is doing okay and passes the Core Web Vitals assessment, then there is usually no need to compress the images further.

Tip: What you can do with this Screaming Frog report is sort the images by size, from heaviest to lightest, to check whether there are any really big images on specific webpages.
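You can replicate that heaviest-first sorting outside the tool as well. A minimal sketch using HEAD requests, so no image bytes are actually downloaded; the image URLs are hypothetical, and it assumes the server reports a Content-Length header:

```python
import requests

# Hypothetical list of image URLs exported from your crawler.
IMAGE_URLS = [
    "https://example.com/images/logo.png",
    "https://example.com/images/hero-banner.jpg",
]

sizes = []
for url in IMAGE_URLS:
    # A HEAD request returns headers only, so no image data is transferred.
    response = requests.head(url, allow_redirects=True, timeout=10)
    size_kb = int(response.headers.get("Content-Length", 0)) / 1024
    sizes.append((size_kb, url))

# Sort from heaviest to lightest, mirroring the Screaming Frog tip above.
for size_kb, url in sorted(sizes, reverse=True):
    flag = "  <- over 100 KB" if size_kb > 100 else ""
    print(f"{size_kb:8.1f} KB  {url}{flag}")
```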

5. Low Content or Low Word Count Pages

Depending on the settings of the SEO crawler, most SEO auditing tools will highlight pages that are under 50-100 words as low content pages.

Here's what this issue looks like in Ahrefs:

Low word count issue in Ahrefs. Screenshot from Ahrefs Site Audit, September 2021.

Screaming Frog, on the other hand, considers pages under 200 words to be low content pages by default (you can change that setting when configuring the crawl).


Here is how Screaming Frog reports on that:

Screaming Frog low content pages report. Screenshot from Screaming Frog, September 2021.

Just because a webpage has few words doesn't mean that it's an issue or an error.

There are many types of pages that are supposed to have a low word count, including some login pages, password reset pages, tag pages, or a contact page.

The crawler will mark these pages as low content, but this isn't an issue that will prevent the site from ranking well in Google.


What the tool is trying to tell you is that if you want a given webpage to rank highly in Google and bring in a lot of organic traffic, then that webpage may need to be quite detailed and in-depth.

This often means, among other things, a high word count. But there are many kinds of search intent, and content depth isn't always what users are looking for to satisfy their needs.

When reviewing the low word count pages flagged by the crawler, always consider whether those pages are really meant to have a lot of content. In many cases, they aren't.
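A quick way to reproduce the word count for a flagged page and sanity-check the tool's threshold is sketched below; the URL is a hypothetical placeholder, and requests and beautifulsoup4 are assumed. Note that each tool counts slightly differently, so expect small discrepancies:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/contact/"  # hypothetical flagged page
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Drop script and style contents so only visible text is counted.
for tag in soup(["script", "style"]):
    tag.decompose()

word_count = len(soup.get_text(separator=" ").split())
# Ahrefs flags pages under roughly 50-100 words; Screaming Frog defaults to 200.
print(f"{url}: {word_count} words")
```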

6. Low Text-HTML Ratio

Semrush Site Audit will also warn you about pages that have a low text-HTML ratio.

This is how Semrush reports on that:

Semrush Site Audit report about low text-HTML ratio. Screenshot from Semrush Site Audit, September 2021.

This alert is meant to show you:

Commercial

Proceed Studying Beneath

  • Pages that may have a low word count.
  • Pages that are potentially built in a complex way and have a huge HTML source file.

This warning often confuses less experienced or new SEO professionals, and you may need an experienced technical SEO pro to determine whether it's actually something to worry about.

There are many variables that can affect the text-HTML ratio, and a low (or high) ratio is not always a problem in itself. There is no such thing as an optimal text-HTML ratio.
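For context, the metric itself is trivial to compute: it is roughly the length of the visible text divided by the length of the raw HTML source. A minimal sketch follows (hypothetical URL, requests and beautifulsoup4 assumed); the exact formula varies from tool to tool, so treat this as an approximation:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # hypothetical page
html = requests.get(url, timeout=10).text

soup = BeautifulSoup(html, "html.parser")
# Remove script and style blocks so only visible text remains.
for tag in soup(["script", "style"]):
    tag.decompose()
visible_text = soup.get_text(separator=" ")

# Visible text length divided by total HTML length.
ratio = len(visible_text) / len(html)
print(f"Text-to-HTML ratio: {ratio:.1%}")
```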

What you as an SEO pro may want to focus on instead is making sure that the site's speed and performance are optimal.

7. XML Sitemap Not Indicated in robots.txt

Robots.txt, in addition to being the file with crawler directives, is also the place where you can specify the URL of the XML sitemap so that Google can crawl and index that content easily.

SEO crawlers such as Semrush Site Audit will notify you if the XML sitemap isn't indicated in robots.txt.


This is how Semrush reports on that:

Semrush Site Audit report about sitemap.xml not indicated in robots.txt. Screenshot from Semrush Site Audit, September 2021.

At a glance, this looks like a serious issue, even though in most cases it isn't, because:

  • Google usually doesn't have problems crawling and indexing smaller sites (under 10,000 pages).
  • Google won't have problems crawling and indexing big sites if they have a good internal linking structure.
  • An XML sitemap doesn't need to be indicated in robots.txt if it has been correctly submitted in Google Search Console.
  • An XML sitemap doesn't need to be indicated in robots.txt if it sits in the standard location, i.e., /sitemap.xml (usually).

Before you mark this as a high-priority issue in your SEO audit, make sure that none of the above is true for the site you're auditing.
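Checking the robots.txt part of that list takes only a few lines with Python's standard library (3.8+ for site_maps()); example.com below is a placeholder for the site being audited:

```python
from urllib.robotparser import RobotFileParser

site = "https://example.com"  # hypothetical site being audited

parser = RobotFileParser()
parser.set_url(f"{site}/robots.txt")
parser.read()

# site_maps() returns the URLs from any "Sitemap:" lines, or None if there are none.
sitemaps = parser.site_maps()
if sitemaps:
    print("Sitemaps declared in robots.txt:", sitemaps)
else:
    print("No Sitemap: line in robots.txt - check the GSC submission "
          "and the default /sitemap.xml location before flagging this.")
```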

Bonus: The Tool Reports a Critical Error That Relates to a Few Unimportant URLs

Even when the tool is showing a real issue, such as a 404 page on the site, it may not be a serious problem if only one out of millions of webpages returns status 404, or if no links point to that 404 page.


That's why, when assessing the issues detected by the crawler, you should always check how many webpages they relate to, and which ones.

You need to put the error in context.

Sitebulb, for example, will show you the percentage of URLs that a given error relates to.

Here is an example of an internal URL redirecting to a broken URL returning 4XX or 5XX, as reported by Sitebulb:

Example of a report about an internal URL redirecting to a broken URL. Screenshot from Sitebulb Website Crawler, September 2021.

It looks like a pretty serious issue, but it relates to only one unimportant webpage, so it's definitely not a high-priority problem.
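To confirm such a finding yourself, you can follow the redirect chain and look at the final status code. A minimal sketch with a hypothetical URL, assuming the requests package:

```python
import requests

# Hypothetical redirecting URLs exported from the crawler report.
REDIRECT_URLS = [
    "https://example.com/old-page/",
]

for url in REDIRECT_URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each redirect hop; response is the final destination.
    chain = [r.url for r in response.history] + [response.url]
    print(" -> ".join(chain), f"(final status: {response.status_code})")
```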


Final Thoughts & Tips

SEO crawlers are indispensable tools for technical SEO professionals. However, what they reveal must always be interpreted within the context of the website and your goals for the business.

It takes time and experience to be able to tell the difference between a pseudo-issue and a real one. Fortunately, most crawlers offer extensive explanations of the errors and warnings they display.

That's why it's always a good idea, especially for beginner SEO professionals, to read these explanations and the crawler documentation. Make sure you really understand what a given issue means and whether it's indeed worth escalating into a fix.

Featured image: Pro Symbols/Shutterstock

