3 SEO Metrics Overlooked by Web Developers

Each day, businesses spend large amounts of time and money on SEO campaigns only to find that their sites are not improving in organic rankings. Most website owners put the emphasis on building a strong link profile and making on-page content changes based on ongoing keyword research. While both are very important to the success of an SEO campaign, many developers and site owners overlook aspects of a site’s structure and backend that can actually hurt organic keyword rankings. With the algorithm changes that have taken place (like the Panda, Penguin and Hummingbird updates), search engines like Google have become better at crawling sites the same way people view them, much like the mobile-first indexing that Google recently rolled out.


Page Speed

Page speed is fundamentally about good usability. Since Google factors both desktop and mobile speed into your rankings, it is important to make your site as fast as possible: the faster your pages load, the less time it takes Googlebot to crawl them and the more opportunity you have to outrank slower competitors. From a conversion standpoint, page speed is also very important. A user is much more inclined to keep browsing a page that loads quickly and works properly, while slow-loading pages routinely cost you visitors.

Bloated, heavily coded pages are a common cause of slow load times. There are a few techniques you can implement to increase the speed of your site and make the experience much friendlier for your users:

  • External CSS: Your site’s CSS should live in a separate file on your server, not inline on every page. Combining all of your CSS into one external style sheet reduces HTTP requests, and the browser can cache that single file across pages so each page loads faster (see the sketch after this list).
  • Caching and Compression: Browser caching lets the browser load previously downloaded material from your machine rather than over the network, and text compression shrinks the resources sent over the network, reducing download time. Two of the most popular compression methods are GZIP and Deflate. How you implement text compression and leverage browser caching will depend on your hosting and CMS; where they are available, these tools can dramatically increase your site’s speed (an Apache example follows this list).
  • Optimized Images: Reducing the file size of your images will dramatically increase the speed of your site’s pages. The trick is to reduce file size without losing too much image quality. There are many free and paid tools available for image compression and optimization.
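
As a rough sketch, assuming an Apache server (the directives differ on Nginx and other platforms, and the file names are placeholders), an external style sheet plus GZIP compression and browser caching might look like this:

<!-- In each page's <head>: one external, cacheable style sheet -->
<link rel="stylesheet" href="/css/styles.css">

# .htaccess (Apache): compress text resources and cache static files
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>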


Indexation and Crawling

Insufficient crawling and incomplete indexation are a major pitfall for many sites. It is important to ask yourself, through every stage of the development process, whether your site can be crawled properly by search engines. There are ways to ensure Google, or any other search engine, will be able to see your site properly and crawl it thoroughly:

  • XML Sitemap: An XML Sitemap helps search engines find all of the pages of your site that you want crawled. It should include every page that you want indexed and should be updated often. For larger sites, it is common practice to break the pages up into several smaller sitemaps. The sitemap should be stored at the top level of your server, and a reference to it should be included in your robots.txt file as well (see below):

Sitemap: http://www.example.com/sitemap.xml
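
The sitemap file itself is plain XML. A minimal sketch with a single URL (swap in your own pages and last-modified dates) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2018-01-01</lastmod>
  </url>
</urlset>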

  • Canonicalization: Canonical URLs ensure that your link equity isn’t split among different variations of the same content. For instance, if your site’s homepage can be accessed at www.yoursite.com, yoursite.com, and www.yoursite.com/index.html, a user sees no difference between them, but search engines treat these as three completely different pages. You lose page authority because the value passed by links is divided among three URLs instead of flowing to the one canonical version. Using the correct canonical URL means using the same URL structure in all inbound and outbound links site-wide, and declaring the preferred version on each page (see the example below).
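
As a simple illustration (www.yoursite.com stands in for your preferred domain), the canonical version of the homepage can be declared in each variation’s <head> like this:

<link rel="canonical" href="http://www.yoursite.com/">
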
  • Robots.txt: This file is stored at the top level of your site’s server and tells search engines whether or not certain files and/or directories should be crawled and indexed. Depending on your setup, there are particular server directories that should not be crawled. Below is an example of a robots.txt with two rules; take notice of the XML Sitemap reference!
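
A sketch of such a file (the blocked directories are placeholders; adjust the rules to your own server setup):

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

Sitemap: http://www.example.com/sitemap.xml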

Page Structure and Hierarchy

A common SEO mistake is the misuse of HTML heading tags on a site’s pages. Heading tags (<h1>, <h2>, <h3>) mark the headlines of a page and tell search engines what the page is actually about. The H1 tag of every page should be as descriptive of, and keyword-relevant to, the page’s content as possible. Sub-headings within the content under the H1 tag should be tagged as H2s, sub-headings within the content of the H2 tags should be tagged as H3s, and so forth.

The most common misapplications of heading tags are:

  • Using heading tags for blocks of text (they are only for headings, which should be short and descriptive)
  • Using the H1 tag multiple times per page (it should only be used once per page as the main heading for that page)
  • Hiding the H1 tag or placing the logo inside of the H1 tag
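
In contrast to the mistakes above, a quick sketch of a correct hierarchy (the heading text is just illustrative) looks like this:

<h1>Overlooked SEO Metrics</h1>
  <h2>Page Speed</h2>
    <h3>Optimized Images</h3>
  <h2>Indexation and Crawling</h2>
    <h3>XML Sitemaps</h3>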

The examples above are just a handful of the practices site owners should follow when developing with SEO in mind and adhering to Google’s Quality Guidelines. You can avoid ranking problems and penalties from Google’s Panda and Penguin algorithm updates if every aspect of your site’s optimization is taken into account. Don’t forget to use our website analyzer to check all of these components and ensure your website checks out.

