
SEO Audit Analysis: Technical Checklist in Digital Marketing

Technical SEO Checklist

  • Set up Google Analytics for the website
  • Use Ahrefs.com to measure keyword rankings and link building
  • Set up Google Search Console and verify the website
  • Run Google's Mobile-Friendly Test - https://search.google.com/test/mobile-friendly
  • Google PageSpeed Insights - https://pagespeed.web.dev/

URL Campaign builder

https://ga-dev-tools.web.app/campaign-url-builder/

Audit

A thorough audit requires at least a little planning to ensure nothing slips through the cracks.

Crawl Before You Walk

Before we can diagnose problems with the site, we have to know exactly what we're dealing with. Therefore, the first (and most important) preparation step is to crawl the entire website.

Crawling Tools

Screaming Frog's SEO Spider is a good tool for performing the site crawl (it's free for the first 500 URLs and £99/year after that).

Alternatively, if you want a truly free tool, you can use Xenu's Link Sleuth; however, be forewarned that this tool was designed to crawl a site to find broken links. It displays a site's page titles and meta descriptions, but it was not created to perform the level of analysis we're going to discuss.
https://home.snafu.de/tilman/xenulink.html
https://www.screamingfrog.co.uk/seo-spider/

Crawling Configuration
By default, I suggest disabling cookies, JavaScript, and CSS when crawling a site. If you can diagnose and correct the problems encountered by dumb crawlers, that work can also be applied to most of the problems experienced by smarter crawlers. Then, for situations where a dumb crawler just won't cut it (e.g., pages that are heavily reliant on AJAX), you can switch to a smarter crawler.

SEO Audit Analysis

The actual analysis is broken down into five large sections:
  1. Accessibility
  2. Indexability
  3. On-Page Ranking Factors
  4. Off-Page Ranking Factors
  5. Competitive Analysis

1. Accessibility

If search engines and users can't access your site, it might as well not exist. With that in mind, let's make sure your site's pages are accessible.

Robots.txt

The robots.txt file is used to restrict search engine crawlers from accessing sections of your website. Although the file is very useful, it's also an easy way to inadvertently block crawlers.
As an extreme example, the following robots.txt entry restricts all crawlers from accessing any part of your site:
User-agent: *
Disallow: /
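By contrast, a deliberately scoped robots.txt blocks only what you intend. In this hypothetical example, the /admin/ path and the Sitemap location are placeholders:

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```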

Robots Meta Tags

The robots meta tag is used to tell search engine crawlers if they are allowed to index a specific page and follow its links.
When analyzing your site's accessibility, you want to identify pages that are inadvertently blocking crawlers. Here is an example of a robots meta tag that prevents crawlers from indexing a page and following its links:
<head>
<meta name="robots" content="noindex, nofollow" />
</head>

HTTP Status Codes

Search engines and users are unable to access your site's content if you have URLs that return errors (i.e., 4xx and 5xx HTTP status codes). During your site crawl, you should identify these URLs and either fix them or redirect them to more relevant pages.

Speaking of redirection, this is also a great opportunity to inventory your site's redirection techniques. Be sure the site is using 301 HTTP redirects (and not 302 HTTP redirects, meta refresh redirects, or JavaScript-based redirects) because they pass the most link juice to their destination pages.
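As an illustration, a crawl export can be triaged with a small script like this. The URL/status pairs below are hypothetical; in practice they would come from your crawling tool:

```python
# Sketch: bucket HTTP status codes from a crawl export for audit reporting.
def classify_status(code):
    """Bucket an HTTP status code for audit reporting."""
    if 200 <= code < 300:
        return "ok"
    if code == 301:
        return "permanent redirect"   # preferred: passes the most link juice
    if code == 302:
        return "temporary redirect"   # consider replacing with a 301
    if 400 <= code < 500:
        return "client error"         # fix or redirect these URLs
    if 500 <= code < 600:
        return "server error"         # investigate server-side
    return "other"

# Hypothetical (url, status) pairs from a site crawl
crawl = [("/", 200), ("/old-page", 302), ("/missing", 404)]
issues = [(url, classify_status(code)) for url, code in crawl
          if classify_status(code) != "ok"]
print(issues)  # -> [('/old-page', 'temporary redirect'), ('/missing', 'client error')]
```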

XML Sitemap

Your site's XML Sitemap provides a roadmap for search engine crawlers to ensure they can easily find all of your site's pages.

Here are a few important questions to answer about your Sitemap:

Is the Sitemap a well-formed XML document? Does it follow the Sitemap protocol (http://www.sitemaps.org/protocol.html)? If yours doesn't conform to this format, it might not be processed correctly.
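For reference, a minimal Sitemap that conforms to the protocol looks like this (the example.com URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```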

Has the Sitemap been submitted to your webmaster tools accounts? It's possible for search engines to find the Sitemap without your assistance, but you should explicitly notify them about its location.

Did you find pages in the site crawl that do not appear in the Sitemap? You want to make sure the Sitemap presents an up-to-date view of the website.

Are there pages listed in the Sitemap that do not appear in the site crawl? If these pages still exist on the site, they are currently orphaned. Find an appropriate location for them in the site architecture, and make sure they receive at least one internal backlink.

Site Architecture

Your site architecture defines the overall structure of your website, including its vertical depth (how many levels it has) as well as its horizontal breadth at each level.

When evaluating your site architecture, identify how many clicks it takes to get from the homepage to other important pages. Also, evaluate how well pages are linking to others in the site's hierarchy, and make sure the most important pages are prioritized in the architecture.

Ideally, you want to strive for a flatter site architecture that takes advantage of both vertical and horizontal linking opportunities.
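The click-depth evaluation above can be sketched as a breadth-first search over the site's internal-link graph. The adjacency list here is hypothetical; in practice it would be built from a crawl export:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/products/widget": [],
    "/blog/post-1": ["/products/widget"],
}

def click_depth(graph, start="/"):
    """Breadth-first search: fewest clicks from the homepage to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(click_depth(links))
# Pages deeper than roughly three clicks deserve better internal linking.
```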

JavaScript Navigation

The best site architecture in the world can be undermined by navigational elements that are inaccessible to search engines. Although search engine crawlers have become much more intelligent over the years, it is still safer to avoid JavaScript navigation.

Site Performance

Users have a very limited attention span, and if your site takes too long to load, they will leave. Similarly, search engine crawlers have a limited amount of time that they can allocate to each site on the Internet. Consequently, sites that load quickly are crawled more thoroughly and more consistently than slower ones.

2. Indexability

Next, we need to determine how many of those pages are actually being indexed by the search engines.

Site: Command

Most search engines offer a "site:" command that allows you to search for content on a specific website. For example, searching for site:example.com returns the pages the engine has indexed for example.com, which gives you a very rough estimate of how many pages are being indexed.

Index Sanity Checks

Make sure the search engines are indexing the site's most important pages.

Brand Searches

After you check whether your important pages have been indexed, you should check if your website is ranking well for your company's name (or your brand's name).

Just search for your company or brand name. If your website appears at the top of the results, all is well with the universe.

3. On-Page Ranking Factors

On-page factors can be analyzed at both the page level and the domain level. In general, the page level analysis is useful for identifying specific examples of optimization opportunities, and the domain level analysis helps define the level of effort necessary to make site-wide corrections.

URLs

Since a URL is the entry point to a page's content, it's a logical place to begin our on-page analysis.

When analyzing the URL for a given page, here are a few important questions to ask:
  • Is the URL short and user-friendly? A common rule of thumb is to keep URLs less than 115 characters.
  • Does the URL include relevant keywords? It's important to use a URL that effectively describes its corresponding content.
  • Is the URL using subfolders instead of subdomains? Subdomains are mostly treated as unique domains when it comes to passing link juice. Subfolders don't have this problem, and as a result, they are typically preferred over subdomains.
  • Does the URL avoid using excessive parameters? If possible, use static URLs. If you simply can't avoid using parameters, at least register them with your Google Search Console account.
  • Is the URL using hyphens to separate words? Underscores have a very checkered past with certain search engines. To be on the safe side, just use hyphens.
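Taken together, the questions above favor URLs like the second one below (both URLs are hypothetical):

```
Avoid:  https://widgets.example.com/catalog.php?id=1432&sort=asc
Prefer: https://www.example.com/widgets/blue-ceramic-widget/
```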

Content

We all know content is king, so let's give your site the royal treatment.

To investigate a page's content, you have various tools at your disposal. The simplest approach is to view Google's cached copy of the page (the text-only version). Alternatively, you can use SEO Browser or Browseo. https://www.browseo.net/
  • Does the page contain substantive content? There's no hard and fast rule for how much content a page should contain, but using at least 300 words is a good rule of thumb.
  • Is the content valuable to its audience? This is obviously somewhat subjective, but you can approximate the answer with metrics such as bounce rate and time spent on the page.
  • Does the content contain targeted keywords? Do they appear in the first few paragraphs? If you want to rank for a keyword, it really helps to use it in your content.
  • Is the content spammy (e.g., keyword stuffing)? You want to include keywords in your content, but you don't want to go overboard.
  • Does the content minimize spelling and grammatical errors? Your content loses professional credibility if it contains glaring mistakes. Spell check is your friend; I promise.
  • Is the content easily readable? Various metrics exist for quantifying the readability of content (e.g., Flesch Reading Ease, Fog Index, etc.).
  • Are search engines able to process the content? Don't trap your content inside Flash, overly complex JavaScript, or images.
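One of the readability metrics mentioned above, Flesch Reading Ease, is straightforward to approximate. This sketch uses a rough vowel-group heuristic for syllable counting rather than a dictionary lookup, so treat the score as an estimate:

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

score = flesch_reading_ease("The cat sat on the mat. It was a warm day.")
# Higher scores (roughly 60-70 and above) indicate copy that is easy to read.
```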

When analyzing the content across your entire site, you want to focus on 3 main areas:
  1. Information Architecture
    Your site's information architecture defines how information is laid out on the site. It is the blueprint for how your site presents information (and how you expect visitors to consume that information).
    During the audit, you should ensure that each of your site's pages has a purpose. You should also verify that each of your targeted keywords is being represented by a page on your site.
  2. Keyword Cannibalism
    Keyword cannibalism describes the situation where your site has multiple pages that target the same keyword. When multiple pages target a keyword, it creates confusion for the search engines, and more importantly, it creates confusion for visitors. To identify cannibalism, you can create a keyword index that maps keywords to pages on your site. Then, when you identify collisions (i.e., multiple pages associated with a particular keyword), you can merge the pages or repurpose the competing pages to target alternate (and unique) keywords.
  3. Duplicate Content
    Your site has duplicate content if multiple pages contain the same (or nearly the same) content. Unfortunately, these pages can be both internal and external (i.e. hosted on a different domain).
    To identify duplicate content on external pages, you can use Copyscape; for internal duplicates, Siteliner is a useful option.
    https://copyscape.com/
    https://www.siteliner.com/
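The keyword index described under Keyword Cannibalism can be sketched like this (the page-to-keyword mapping is hypothetical; in practice it would come from your keyword research spreadsheet):

```python
from collections import defaultdict

# Hypothetical mapping of pages to the keyword each one targets
pages = {
    "/widgets/": "blue widgets",
    "/blog/best-blue-widgets/": "blue widgets",
    "/gadgets/": "red gadgets",
}

# Invert the mapping: keyword -> list of pages targeting it
index = defaultdict(list)
for url, keyword in pages.items():
    index[keyword].append(url)

# Collisions are keywords targeted by more than one page
collisions = {kw: urls for kw, urls in index.items() if len(urls) > 1}
print(collisions)  # pages competing for the same keyword -> merge or retarget
```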

HTML Markup

It's hard to overstate the value of your site's HTML because it contains a few of the most important on-page ranking factors.

W3C offers a markup validator to help you find standards violations in your HTML markup. They also offer a CSS validator to help you check your site's CSS.
http://validator.w3.org/
http://jigsaw.w3.org/css-validator/

Titles

A page's title is its single most identifying characteristic. It's what appears first in the search engine results, and it's often the first thing people notice in social media. Thus, it's extremely important to evaluate the titles on your site.

When evaluating an individual page's title, you should consider the following questions:

Is the title succinct? A commonly used guideline is to make titles no more than 70 characters. Longer titles will get cut off in the search engine results, and they also make it difficult for people to add commentary on Twitter.
Does the title effectively describe the page's content? Don't pull the bait and switch on your audience; use a compelling title that directly relates to your content's subject matter.
Does the title contain a targeted keyword? Is the keyword at the front of the title? A page's title is one of the strongest on-page ranking factors so make sure it includes a targeted keyword.

Meta Descriptions

A page's meta description doesn't explicitly act as a ranking factor, but it does affect the page's click-through rate in the search engine results.

The meta description best practices are almost identical to those described for titles. In your page level analysis, you're looking for succinct (no more than 155 characters) and relevant meta descriptions that have not been over-optimized.

In your domain level analysis, you want to ensure that each page has a unique meta description. A site crawl (for example, with Screaming Frog) will surface duplicate meta descriptions across the site.
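The title and meta description length guidelines above are easy to check in bulk. This sketch uses the limits named in this checklist (70 and 155 characters); the page data is hypothetical:

```python
# Limits taken from this checklist's guidelines
TITLE_LIMIT, META_LIMIT = 70, 155

def length_issues(url, title, meta):
    """Return a list of over-length warnings for one page."""
    issues = []
    if len(title) > TITLE_LIMIT:
        issues.append(f"{url}: title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(meta) > META_LIMIT:
        issues.append(f"{url}: meta description is {len(meta)} chars (limit {META_LIMIT})")
    return issues

print(length_issues("/widgets/", "Blue Widgets | Acme", "Shop hand-made blue widgets."))  # -> []
print(length_issues("/blog/", "A" * 80, "B" * 200))  # two warnings reported
```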

Note
  • Do any pages contain a rel="canonical" link? This link element is used to help avoid duplicate content issues. Make sure your site is using it correctly.
  • Are any pages in a paginated series? Are they using rel="next" and rel="prev" link elements? These link elements help inform search engines how to handle pagination on your site.
  • Does the page use an H1 tag? Does the tag include a targeted keyword? Heading tags aren't as powerful as titles, but they're still an important place to include keywords.
  • Is the page avoiding frames and iframes? When you use a frame to embed content, search engines do not associate the content with your page (it is associated with the frame's source page).
  • Does the page have an appropriate content-to-ads ratio? If your site uses ads as a revenue source, that's fine. Just make sure they don't overpower your site's content.
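To illustrate the canonical and pagination link elements mentioned above (the example.com URLs are placeholders, shown as they would appear on page 2 of a paginated series):

```html
<head>
  <link rel="canonical" href="https://www.example.com/widgets/" />
  <link rel="prev" href="https://www.example.com/widgets/?page=1" />
  <link rel="next" href="https://www.example.com/widgets/?page=3" />
</head>
```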

Images

A picture might say a thousand words to users, but for search engines, pictures are mute. Therefore, your site needs to provide image metadata so that search engines can participate in the conversation.

When analyzing an image, the two most important attributes are the image's alt text and the image's filename. Both attributes should include relevant descriptions of the image, and ideally, they'll also contain targeted keywords.
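For example, an image with a descriptive filename and alt text might look like this (the filename and description are hypothetical):

```html
<img src="blue-ceramic-widget.jpg" alt="Blue ceramic widget on a workbench" />
```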

4. Off-Page Ranking Factors

Popularity

  • Is your site gaining traffic? Your analytics package is your best source for traffic-based information.
  • How does your site's popularity compare against similar sites? Third-party services such as Alexa (http://www.alexa.com/) can help you benchmark it.
  • Is your site receiving backlinks from popular sites? Link-based popularity metrics such as mozRank are useful for monitoring your site's popularity as well as the popularity of the sites linking to yours.

Authority

A site's authority is determined by a combination of factors (e.g., the quality and quantity of its backlinks, its popularity, its trustworthiness, etc.).

To help evaluate your site's authority, SEOmoz provides two important metrics: Page Authority and Domain Authority. Page Authority predicts how well a specific page will perform in the search engine rankings, and Domain Authority predicts the performance for an entire domain.

Both metrics aggregate numerous link-based features (e.g., mozRank, mozTrust, etc.) to give you an easy way to compare the relative strengths of various pages and domains. For more information, watch the corresponding Whiteboard Friday video about these metrics: Domain Authority & Page Authority Metrics.
http://smallseotools.com/page-authority-checker/

Social Engagement

As the Web becomes more and more social, the success of your website depends more and more on its ability to attract social mentions and create social conversations.

More SEO resources
http://www.semrush.com/
Visit Moz for more details
