Don't underestimate the power of the basics. Even SEO experts sometimes miss these simple yet crucial SEO limits, and respecting them can boost your website's ranking.
In this article, I will walk you through 24 often overlooked yet critically important SEO limits that can affect your website's performance and visibility. From technical aspects like title length and meta descriptions to Google crawl limits and beyond, I'll cover them all.
Before we get started, download the All SEO Limits cheat sheet for a quick reference whenever needed.
Limit: 60-70 characters for desktop and 70-76 for mobile for full visibility in search results.
Search engines typically display only the first 60-70 characters in desktop and 70-76 in mobile search results.
Titles that exceed the limit might be cut off in the SERP and appear incomplete. Such titles may lack clarity and fail to capture user attention. That leads to decreased click-through rates, which you don’t want.
Plus, in some cases, Google might rewrite excessively long titles, which could alter your intended message. If you want to have at least some feeling of control over your snippet, consider sticking to this limit.
Use Content Editor when editing your copy to check how your title will look in the SERP. To do that, go to WebSite Auditor > Page Audit > Content Editor and enter your URL.
When everything is set up, click on your title to start editing it and immediately see how it will look in the SERP.
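If you'd rather script a quick check, here's a minimal Python sketch that flags titles likely to be truncated. It assumes the requests and beautifulsoup4 packages are installed, and the URL is a placeholder:

```python
# A minimal title-length check; the limits mirror the character
# windows discussed above, and the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

DESKTOP_LIMIT = 70  # upper bound of the 60-70 character desktop window
MOBILE_LIMIT = 76   # upper bound of the 70-76 character mobile window

def check_title(url: str) -> None:
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").find("title")
    title = title_tag.get_text(strip=True) if title_tag else ""
    print(f"Title ({len(title)} chars): {title!r}")
    if len(title) > DESKTOP_LIMIT:
        print("Warning: may be truncated in desktop SERPs")
    if len(title) > MOBILE_LIMIT:
        print("Warning: may be truncated in mobile SERPs")

check_title("https://example.com/")
```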
Limit: One title tag and one H1 heading per page.
According to Google, there are no limits to how many title and H1 tags you can add. However, since only one will be taken into account anyway, multiple tags can confuse search engines and dilute the focus of your content.
Having only one title and one H1 will ensure that you send clear signals about your page's main topic. However, note that your title tag may differ from the H1 tag.
In WebSite Auditor, you can check if there are multiple title and H1 tags.
Go to Site Structure > Pages and add the H1 Count and Multiple Title Tag columns to your workspace.
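You can also script this check. Here's a rough sketch, again assuming requests and beautifulsoup4 are installed, that flags pages with more than one title or H1 tag:

```python
# Flags pages that don't have exactly one <title> and one <h1> tag.
import requests
from bs4 import BeautifulSoup

def check_single_title_and_h1(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    titles = soup.find_all("title")
    h1s = soup.find_all("h1")
    if len(titles) != 1:
        print(f"{url}: {len(titles)} <title> tags found (expected 1)")
    if len(h1s) != 1:
        print(f"{url}: {len(h1s)} <h1> tags found (expected 1)")

check_single_title_and_h1("https://example.com/")  # placeholder URL
```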
Limit: under 160 characters for desktop and 120 for mobile for optimal user engagement.
The reason for these limits is the same as for title tag length. Search engines commonly truncate meta descriptions beyond 160 characters in desktop search results and 120 characters in mobile. If your message isn't fully visible, it loses clarity, and click-through rates drop as a result.
So, craft concise and descriptive meta descriptions that entice users to click. Of course, don't forget to use relevant keywords. However, note that Google rewrites meta descriptions very often (about 70% of the time) to match search queries with pages more accurately.
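For a quick preview of where truncation would hit, here's a small Python sketch (assuming requests and beautifulsoup4; the URL is a placeholder) that measures your meta description and prints the visible portion:

```python
# Previews how a meta description may be cut off at a given limit.
import requests
from bs4 import BeautifulSoup

def preview_meta_description(url: str, limit: int = 160) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = (tag.get("content") or "").strip() if tag else ""
    print(f"Length: {len(desc)} characters")
    if len(desc) > limit:
        print(f"Truncated preview: {desc[:limit].rstrip()}…")

preview_meta_description("https://example.com/", limit=160)  # use 120 for mobile
```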
Limit: 200KB for faster loading speeds and a better user experience.
Large image file sizes can slow down page loading times. This, in turn, leads to higher bounce rates. Nobody wants to wait around for your page to fully load; they'll go to your competitor instead. Plus, image size impacts Core Web Vitals and, consequently, your rankings. So, sticking to the 200KB rule is important.
Compress images without sacrificing quality with tools like JPEG Optimizer or TinyPNG to improve page speed performance. Also, read our comprehensive guide on image SEO to learn all the peculiarities of image optimization.
You can check if there are issues with your image sizes in WebSite Auditor.
Just go to Site Structure > Site Audit and find the Page Speed section with the Properly size images factor.
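If you want to catch heavy images locally before they ever go live, here's a minimal sketch using only the Python standard library; the static/images path and the extension list are assumptions to adjust for your project:

```python
# Lists local image files exceeding the 200KB guideline.
from pathlib import Path

LIMIT_BYTES = 200 * 1024  # the 200KB guideline from above
EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".webp", ".avif"}

for path in Path("static/images").rglob("*"):
    if path.suffix.lower() in EXTENSIONS and path.stat().st_size > LIMIT_BYTES:
        print(f"{path}: {path.stat().st_size / 1024:.0f}KB (consider compressing)")
```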
Limit: 50MB max size (uncompressed), 50,000 URLs per sitemap.
Sitemaps act as roadmaps for search engines, helping them discover and index your website's content effectively. If you adhere to the abovementioned limits, search engines can efficiently process your sitemaps and index all relevant pages.
If your website contains more than 50,000 URLs, consider splitting your sitemap into multiple ones. In this case, create a sitemap index file that lists them all (you can submit up to 500 sitemap index files for each site in your Search Console account). This helps search engines discover and access all your sitemaps.
Remember to update your sitemaps whenever you add or remove important content from your website.
You can compile an error-free sitemap file right in WebSite Auditor. To do that, go to Site Structure > Pages and click the Website Tools button in your workspace.
Choose the Sitemap option and start building up your sitemap.
Once done, choose your publish option and ta-dah. You’re all set.
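If you prefer to generate sitemaps programmatically, here's a rough Python sketch that splits a URL list into 50,000-URL files and writes a sitemap index referencing them; the example.com base URL and the file names are placeholders:

```python
# Splits a URL list into sitemaps that respect the 50,000-URL limit,
# then writes a sitemap index file pointing to each of them.
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000
XML_HEADER = '<?xml version="1.0" encoding="UTF-8"?>'

def write_sitemaps(urls, base="https://example.com"):
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write(f'{XML_HEADER}<urlset xmlns="{NS}">{entries}</urlset>')
    # The index file lists every sitemap created above
    refs = "".join(f"<sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
                   for n in range(1, len(chunks) + 1))
    with open("sitemap_index.xml", "w", encoding="utf-8") as f:
        f.write(f'{XML_HEADER}<sitemapindex xmlns="{NS}">{refs}</sitemapindex>')
```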
Limit: 2MB and 100,000 URLs.
A disavow file lets you inform Google about backlinks you consider low-quality or spammy. Large disavow files might take longer for Google to process. Besides, if your file exceeds the URL limit, not all intended backlinks might be disavowed. All of this may potentially affect your SEO performance.
However, don't abuse disavowing: submit your file to Google only when it's really needed, for example, when you've been hit with a Google penalty.
You can easily and quickly generate your disavow file with SEO SpyGlass. For that, go to the project's Preferences > Disavow/Blacklist backlinks and add your list of toxic backlinks there.
Alternatively, you can select the bad backlinks right in your Backlinks module workspace and right-click to choose the Disavow Backlinks option.
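For reference, a disavow file is a plain UTF-8 text file with one entry per line: a full URL to disavow a single page, or a domain: prefix to disavow an entire domain, with # marking comment lines. The entries below are illustrative placeholders:

```
# Disavowed after a link audit
https://spammy.example.com/bad-page.html
domain:spammy-link-network.example
```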
Limit: 500KB.
The robots.txt file instructs search engine crawlers on how to access and navigate your website. It acts as a set of rules for crawlers, specifying which pages they can access and which ones they should avoid. However, this file has a size limit to ensure search engines process it efficiently.
Search engines may simply ignore the entire robots.txt file if it's too large. Besides, the crawling process may be slowed down. What's more, if the file size exceeds the limit, parts of the instructions might be cut off. This may lead to misinterpretations, and search engines may block access to important content.
To compile the robots.txt file correctly, use the built-in WebSite Auditor's Robots.txt tool.
Find it in Site Structure > Pages, then choose Website Tools > Robots.txt.
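To keep an eye on the file's size, a tiny Python check like this sketch does the job (assuming requests is installed; the URL is a placeholder):

```python
# Checks robots.txt size against the 500KB processing limit.
import requests

LIMIT_BYTES = 500 * 1024

resp = requests.get("https://example.com/robots.txt", timeout=10)
size = len(resp.content)
print(f"robots.txt: {size / 1024:.1f}KB")
if size > LIMIT_BYTES:
    print("Warning: rules beyond the limit may be ignored by crawlers")
```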
Limit: 120 characters.
Alt text plays a crucial role in website accessibility and SEO. It provides a textual description of images for users who are visually impaired or rely on screen readers. It also helps search engines understand the context and content of your images.
Don't make your alt text too long: describe the image accurately and include relevant keywords. However, prioritize clarity and usefulness for those who can't see the image.
Watch out for empty alt texts as they mean missed opportunities for ranking in Google Images and insufficient accessibility.
In WebSite Auditor, you can check if there are empty alt texts on your site. To do that, find the Site Audit module and Images section with the Empty Alt Text factor.
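Alternatively, here's a small Python sketch (assuming requests and beautifulsoup4; the URL is a placeholder) that flags images with empty or overly long alt text on a single page:

```python
# Flags <img> tags with missing, empty, or overly long alt text.
import requests
from bs4 import BeautifulSoup

ALT_LIMIT = 120  # the character guideline from above

def audit_alt_text(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        src = img.get("src", "<no src>")
        if not alt:
            print(f"Empty/missing alt: {src}")
        elif len(alt) > ALT_LIMIT:
            print(f"Alt over {ALT_LIMIT} chars ({len(alt)}): {src}")

audit_alt_text("https://example.com/")
```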
Limit: 6-8 words, or 55-60 characters.
Anchor text is the clickable text of a link, usually underlined and blue. Like most limits on this list, this one isn't "official" either. However, for user experience's sake, keeping your anchor text concise is essential.
Concise anchor text enhances user experience and avoids clutter. It also helps search engines understand the context of linked pages.
Limit: 2,000 characters.
While URLs technically have no strict character limit, excessively long ones can pose several challenges.
First of all, long URLs can be difficult for users to read and understand, and they may confuse users and search engines alike. Second, they look suspicious: users may hesitate to click on overly long URLs, fearing they might lead to irrelevant or misleading content.
So, keep URLs concise and descriptive, use relevant keywords, and avoid unnecessary characters or parameters.
Limit: ideally, below five hops from the original URL.
Redirects are essential for managing website changes, like moving content to a new URL or fixing broken links.
However, you need to remember that each redirect adds an extra HTTP request, which can significantly increase page load times. Plus, with each redirect, some link value is lost. This means the destination page may get very little or no PageRank, and its performance may be worse than expected.
Besides, search engines may struggle to follow long redirect chains, which may cause indexing and crawling issues for your website.
Audit your website's redirect chains regularly to minimize unnecessary hops and improve user experience and search engine crawling efficiency.
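A simple way to audit a single URL is to trace its redirect chain in code. Here's a minimal Python sketch, assuming requests is installed; the hop threshold mirrors the five-hop guideline above, and the URL is a placeholder:

```python
# Traces a redirect chain and counts hops on the way to the final URL.
import requests

def trace_redirects(url: str, max_hops: int = 5) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(resp.history)
    for r in resp.history:
        print(f"{r.status_code}: {r.url} -> {r.headers.get('Location')}")
    print(f"Final ({resp.status_code}): {resp.url} after {hops} hop(s)")
    if hops >= max_hops:
        print("Warning: chain is at or above the recommended hop limit")

trace_redirects("http://example.com/")
```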
Limit: 3 seconds for desktop and 2 seconds for mobile.
Page load time is the time it takes for a web page to load and become fully interactive for users. It's a crucial factor that significantly impacts both UX and SEO.
Users expect websites to load quickly. Slow loading times cause frustration, impatience, and, ultimately, bounces. Plus, pages that load quickly are believed to get a boost in Google search rankings.
Limit: LCP < 2.5 seconds, CLS ≤ 0.1, INP < 200 milliseconds (INP replaced FID and its 100-millisecond threshold as a Core Web Vital in March 2024).
Core Web Vitals (CWV) are a set of metrics introduced by Google that measure key aspects of user experience on web pages. These metrics focus on loading performance, interactivity, and visual stability. Here's a breakdown of these metrics:
LCP (Largest Contentful Paint) measures loading performance: the time until the largest content element in the viewport finishes rendering.
INP (Interaction to Next Paint) measures responsiveness: how quickly the page reacts to user interactions throughout the visit. It replaced FID, which only measured the delay of the first interaction.
CLS (Cumulative Layout Shift) measures visual stability: how much page content unexpectedly shifts around while loading.
Optimizing CWV is crucial for a more enjoyable and frustration-free experience for users. This, in turn, can lead to increased engagement, conversions, and overall website success.
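If you want to pull real-user CWV numbers into a script, the Chrome UX Report API exposes them. Here's a rough sketch assuming the requests package and your own API key; the metric names follow the public CrUX schema, so double-check them against the current documentation:

```python
# Queries the CrUX API for field Core Web Vitals data (p75 values).
import requests

API = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
payload = {"url": "https://example.com/", "formFactor": "PHONE"}
resp = requests.post(API, params={"key": "YOUR_API_KEY"},
                     json=payload, timeout=10)
metrics = resp.json().get("record", {}).get("metrics", {})
for name, data in metrics.items():
    print(name, data.get("percentiles", {}).get("p75"))
```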
Limit: under 3MB in size to ensure fast loading times across devices and various network conditions.
Page size is the total amount of data a web page needs to download and render in a user's browser. Large files take longer to download, especially on slower internet connections or mobile devices. Obviously, a heavy page can lead to user frustration and increased bounce rates.
So, compress images, minify CSS and JavaScript, and optimize multimedia content to reduce page size and improve your page performance.
Limit: 3-4 from the homepage.
Click depth is the number of clicks it takes a user to reach a specific page from the homepage. While there's no strict limit, it's recommended to strive for a lower click depth. Why?
Well, users expect to find the information they need quickly and easily. Excessive clicks can lead to frustration. People tend to interact more with content that is easy to find and doesn't require too much effort to navigate. Moreover, a large click depth will eat up your crawl budget, which can ultimately lead to indexing problems.
Three is considered the maximum acceptable click depth, as it allows users to reach most pages within a few intuitive clicks from the homepage.
So, keep your most important pages as close to the homepage as possible.
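To estimate click depth at a small scale, a breadth-first crawl from the homepage works. Here's a rough Python sketch, assuming requests and beautifulsoup4; the page cap roughly limits the crawl so it stays polite, and the homepage URL is a placeholder:

```python
# BFS from the homepage: each page's depth is its parent's depth + 1.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def click_depths(home: str, max_pages: int = 200) -> dict:
    site = urlparse(home).netloc
    depths, queue = {home: 0}, deque([home])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        except requests.RequestException:
            continue
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

for url, depth in click_depths("https://example.com/").items():
    if depth > 3:
        print(f"Depth {depth}: {url}")
```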
Limit: 1-3% without keyword stuffing.
Keyword density is the percentage of times a specific keyword appears in your copy in relation to the total number of words. If it's too high, it's called keyword stuffing and can harm your website's ranking and user experience. Search engines may even penalize websites that engage in such practices.
Strive for a keyword density between 1-3% for optimal balance. However, your primary focus should always be on creating valuable and engaging content for users. Use keywords naturally only where they're relevant and appropriate.
To calculate your keyword density, use the formula: (number of keyword occurrences / total word count) * 100
For example, if your target keyword appears 10 times and the total word count of your page is 500 words, your keyword density = (10 / 500) * 100 = 2%.
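Here's the same formula as a tiny Python function. Note the tokenization is naive (a simple whitespace split with punctuation stripped), so treat the result as an approximation; multi-word keywords would need extra handling:

```python
# Keyword density: (keyword occurrences / total word count) * 100
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    total = len(words)
    hits = sum(1 for w in words if w.strip(".,!?;:()") == keyword.lower())
    return (hits / total) * 100 if total else 0.0

copy = "SEO limits matter. Respect SEO limits and your SEO improves."
print(f"{keyword_density(copy, 'seo'):.1f}%")  # 30.0%, far too high!
```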
Don't just focus on one single keyword. Explore related keywords and long-tail phrases that users might search for. And, of course, maintain a natural reading flow. Don't sacrifice readability for keyword placement – your readers will be thankful.
You can check if your pages are stuffed with keywords in WebSite Auditor. To do that, go to Page Audit > Content Audit, enter your page URL, and then proceed to the Body section. Here, you will find info on the number of keywords in your copy and whether keyword stuffing takes place.
Limit: at least 300 words.
In SEO, there's a term called thin content: pages that offer visitors little or no added value, usually because there's little to no content on them. Thin content is also still believed to trigger Google penalties.
That's why the more quality content you offer, the more value you provide. While longer content may create more opportunities for keyword inclusion and depth of coverage, quality and relevance should always be paramount.
To come up with the ideal word count, run a SERP analysis before you start writing any copy. Check what word count your top competitors stick to.
Or, what's even more convenient, use Content Editor. It will automatically generate recommendations on word count based on your SERP competitors.
Limit: 100-150 internal links per page.
While Google previously mentioned that too many internal links can do more harm than good, it's important to understand that this 100-150 links rule is not strict. Always focus on creating a well-structured and user-friendly internal linking strategy, don’t just adhere to a certain number.
Consider the user's perspective when creating internal links. Ensure they are placed logically and contribute to a smooth navigation experience. And remember: internal linking distributes PageRank. So, the more you link, the more equity you share.
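If you want a quick count for a given page, here's a small Python sketch, assuming requests and beautifulsoup4; the 150 threshold mirrors the soft ceiling above, and the URL is a placeholder:

```python
# Counts links on a page that point back to the same host.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def count_internal_links(url: str) -> int:
    site = urlparse(url).netloc
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return sum(
        1 for a in soup.find_all("a", href=True)
        if urlparse(urljoin(url, a["href"])).netloc == site
    )

n = count_internal_links("https://example.com/")
print(f"{n} internal links" + (" (consider trimming)" if n > 150 else ""))
```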
Limit: 15MB.
A page of over 15MB is a rare thing. Most likely, your pages are much smaller (remember we discussed the page size limit previously?).
Googlebot only processes the first 15MB of a page; anything beyond that threshold is ignored. So while exceeding the limit might not directly prevent indexing, it can negatively impact your website's crawl efficiency and overall SEO. To stay under the limit:
Minify and optimize your website's code (HTML, CSS, JavaScript).
Compress images to reduce file size without sacrificing quality. Consider using next-generation image formats like WebP or AVIF.
Limit the use of external scripts, fonts, and other resources that can contribute to page size and potentially slow down crawling.
Avoid excessive use of irrelevant elements that bloat page size.
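To see where a page currently stands, a raw HTML size check is a reasonable first approximation (the limit applies per fetched resource, so external CSS, JS, and images are counted separately). A minimal sketch, assuming requests is installed and with a placeholder URL:

```python
# Measures raw HTML size against the 15MB per-file processing limit.
import requests

LIMIT_BYTES = 15 * 1024 * 1024  # 15MB

resp = requests.get("https://example.com/", timeout=10)
size = len(resp.content)
print(f"HTML size: {size / 1024:.0f}KB")
if size > LIMIT_BYTES:
    print("Warning: Googlebot will only process the first 15MB")
```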
Limit: a few thousand pages per day for larger sites.
Search engine crawlers allocate a limited budget for crawling for each website, focusing on pages with higher importance and freshness. That’s why you should always focus on creating a well-structured, efficient, and user-friendly website.
Limit: 1,000 properties, 1,000 rows, up to 500 sitemap index files, 10,000 pixels of rendering.
Google Search Console (GSC) offers valuable tools for website owners and SEO professionals. However, it's essential to be aware of the limitations listed above.
Prioritize essential data – focus on the most critical metrics and reports relevant to your SEO goals. And if you manage a large number of websites, consider creating separate GSC accounts or utilizing team features to share access efficiently.
Besides, if your website requires more than 500 sitemap index files, consider consolidating or restructuring them for better manageability.
Limit: 100 characters for a business name, among others.
Google Business Profile is a free tool that allows businesses to manage their online presence across Google Search and Maps. And, of course, it comes with some technical limits of its own.
While some fields have character limits, prioritize providing accurate and concise information.
Don't forget to add relevant keywords to your business description and pick accurate categories to improve discoverability in search results. Your business name, though, should match your real-world name: stuffing it with keywords goes against Google's guidelines.
Limit: 10 million hits per month per property, among others.
While Google Analytics is a powerful platform for website analytics, there are certain limitations that ensure efficient data processing and system functionality.
One such limit is the 10 million hits per month per property. If your website surpasses the 10 million hit limit, data processing might slow down or even be temporarily suspended. In this case, you may experience difficulty accessing certain reports or see some inconsistencies in data visualization.
Google Analytics has a number of other limits as well.
Focus on collecting and analyzing data that directly contributes to your marketing goals and strategic decisions. Segment your data to analyze specific website sections, reducing the overall data volume processed for each report.
If needed, leverage data sampling for high-traffic periods to gain insights without exceeding processing limitations.
Limit: 700 keywords per search.
Google Keyword Planner is a free tool offered by Google Ads that helps users discover relevant keywords for their content strategy or advertising campaigns.
The 700 keyword limit applies to the number of keywords you can add to a single Keyword Ideas search. It also applies to the number of keywords you can download when using the "Get search volume and forecasts" feature.
Phew, we've covered a lot of ground, haven't we? While these limits may seem like small details, they can have a big impact on your website's success. So, pay attention to the details, stay up to date with the latest best practices in SEO, and watch your site climb the ranks in search engine results.
And I want to remind you to download the All SEO Limits PDF file. Share it with friends and colleagues. Let’s do beautiful SEO together.