
SEO audit


An SEO audit is a comprehensive process for assessing how well a website is optimized for search engines. It examines the technical infrastructure, various on-page and off-page factors, the performance on social media and the SERP positions of competitors. The aim of an SEO audit is to enable targeted optimization of a website's visibility.

General information on the topic

During an SEO audit, websites are usually checked against checklists, and recommendations for improvement are derived from the individual list items. SEO audits can be regarded as a quality management measure: the current state of the website is assessed and compared with the standards of the search engines. The target state of an SEO audit depends on the website (its size and goals) and on the standards of the various search engine providers (Google, Bing or Yahoo).

As a rule, SEO audits are carried out by external agencies, consultants or in-house SEO staff. Most audits start with a conversation between the auditor and the website owner, covering questions such as:[1]

  • What is the website's goal or business model?
  • Which key areas of the website are particularly important to the customer?
  • Which SEO measures have already been implemented (or are still in place)?
  • How can changes be made to the server, the file management, the CMS and the source code?
  • What access is available to Google Search Console and similar tools?
  • What results can realistically be expected based on experience?

Aspects of an SEO audit

The basis of an audit is, first of all, data in the form of worksheets or Excel spreadsheets, along with a document that the customer can use as a set of recommendations for action. On the basis of the data obtained, recommendations are formulated and marked with various symbols (e.g. red symbols for changes that are absolutely necessary). These files also serve as documentation that makes the individual work steps transparent for the customer.

Server and infrastructure

To assess the state of the technical infrastructure, the website is either read with the help of a crawler or checked indirectly via the webmaster tools of the search engine providers (e.g. Google Search Console and Bing Webmaster Tools). The crawler detects, for example, whether certain URLs return error messages. Other important technical factors are (a minimal automated check of some of them is sketched after the list):[2]

  • Robots.txt and Robots Meta Tags: Is the crawler allowed to read the website? Are individual pages blocked by robots meta tags?
  • HTTP Status Code: Do certain URLs output an error code (4xx, 5xx and also Soft 404) when they are called? How are redirects handled?
  • XML Sitemaps: Is there a sitemap available to the search engine? Is it a valid XML document? Do the URLs listed in the sitemap match the results of the crawler?
  • Page speed and time to first byte: How fast does the website load when a user calls it up? How much time passes before the first byte of data is loaded from the server?
  • Frames, Flash and JavaScript: Are iFrames, Flash or JavaScript used to implement content and menus?
  • Dynamic, canonical URLs and paginated pages: Are dynamically generated URLs rewritten with mod_rewrite? Is there a canonical URL that serves as the main domain? Are paginated pages correctly marked up with rel="next" and rel="prev"?
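
The following Python sketch illustrates how a few of these factors could be checked automatically. It is a minimal illustration rather than a full crawler; the URLs and the user agent string are placeholders that would be replaced by the pages of the audited website.

```python
# A minimal sketch: for a small list of URLs it checks whether robots.txt
# allows crawling, records the HTTP status code and measures an approximate
# time to first byte. URLs and user agent are placeholders.
import time
import urllib.error
import urllib.request
import urllib.robotparser

URLS = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]
USER_AGENT = "SEO-Audit-Sketch"

# Load and parse robots.txt once for the host
robots = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
robots.read()

for url in URLS:
    allowed = robots.can_fetch(USER_AGENT, url)
    request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT}, method="HEAD")
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            ttfb_ms = (time.perf_counter() - start) * 1000
            status = response.status
    except urllib.error.HTTPError as error:
        ttfb_ms = (time.perf_counter() - start) * 1000
        status = error.code  # e.g. 404 or 500
    print(f"{url}  robots allowed: {allowed}  status: {status}  ~TTFB: {ttfb_ms:.0f} ms")
```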


Indexing and findability

How many URLs of a domain are listed in the index of the respective search engine? Are there, or have there been, penalties or manual actions from Google for individual URLs? These questions can be answered by comparing the index with the results of the crawl. The Google index can be queried with commands such as "site:www.example.de". Ideally, the two data sets match.

If this is not the case, it points either to errors during crawling (URLs are not read and therefore not indexed) or to duplicate content (pages with the same content exist in the index).[3] Entering relevant keywords (brand, company name, services, products) into the search box provides an additional view of visibility for specific key terms. Individual URLs can also be checked for their presence in the index by entering their complete address.
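
A simple way to spot such discrepancies is to compare the URL sets from the crawl and from the XML sitemap. The following sketch assumes that both lists have been exported to plain text files (one URL per line); the file names are placeholders.

```python
# A minimal sketch, assuming the URLs found by the crawler and the URLs
# listed in the XML sitemap have been exported to two text files (one URL
# per line). Differences between the two sets point to pages the crawler
# cannot reach or to URLs that may never end up in the index.
def load_urls(path: str) -> set[str]:
    with open(path, encoding="utf-8") as f:
        # Normalise trailing slashes so /page and /page/ count as one URL
        return {line.strip().rstrip("/") for line in f if line.strip()}

crawled = load_urls("crawl_urls.txt")       # hypothetical export from the crawler
in_sitemap = load_urls("sitemap_urls.txt")  # hypothetical export from the XML sitemap

print("Only in sitemap (crawler could not reach them):")
for url in sorted(in_sitemap - crawled):
    print("  ", url)

print("Only in crawl (missing from the sitemap):")
for url in sorted(crawled - in_sitemap):
    print("  ", url)
```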

Information architecture

The information architecture is the vertical and horizontal structure of a website, which can be represented as an inverted tree. How many clicks does a user need to reach the information they are looking for on the website? How many levels deep does the website go? How many horizontal website elements are there? The most important sub-pages should be reachable within about three clicks. A relatively flat overall structure is also important, as it benefits the customer journey.

The website architecture also affects the linking options within the hierarchy. The link juice should be distributed evenly across the structure in order to avoid so-called silos, i.e. structures that only pass on link juice horizontally.[4]
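
The click depth mentioned above can be approximated with a breadth-first search over the internal link graph. The following sketch uses a small, invented adjacency mapping for illustration; in a real audit this mapping would be built from the crawl data.

```python
# A minimal sketch: breadth-first search over a hand-made internal link graph
# to compute how many clicks each page is away from the home page. The graph
# below is invented for illustration; in practice it would come from a crawl.
from collections import deque

internal_links = {
    "/": ["/products/", "/blog/", "/about/"],
    "/products/": ["/products/shoes/", "/products/bags/"],
    "/blog/": ["/blog/post-1/", "/blog/post-2/"],
    "/about/": [],
    "/products/shoes/": [],
    "/products/bags/": [],
    "/blog/post-1/": ["/products/shoes/"],
    "/blog/post-2/": [],
}

# Breadth-first search starting from the home page
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in internal_links.get(page, []):
        if target not in depth:          # first time this page is reached
            depth[target] = depth[page] + 1
            queue.append(target)

for page, clicks in sorted(depth.items(), key=lambda item: item[1]):
    print(f"{clicks} clicks: {page}")

# Pages deeper than three clicks (or missing from `depth` entirely) are
# candidates for better internal linking.
```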

On Page

  • URLs: URLs should be short, meaningful and therefore user-friendly. In addition, they should contain the relevant keywords for the individual page and, if possible, no special characters. Parameters are often used for dynamically generated URLs; in this case, the URLs should be rewritten or the parameters registered in the Search Console.
  • Duplicate Content: Some URLs generate duplicate content. This is to be avoided at all costs.
  • Content: The content can be inspected using a text-based browser such as SEO Browser or Browseo. Each page should provide substantial information for the user, be easy to read and contain the most important keywords for that page. A clear structure with H1-H6 headings and other text markup is recommended. Keyword stuffing and grammatical errors should be avoided.
  • Meta Title: The title tag is one of the most important ranking factors. Ideally, it contains relevant keywords and describes the page content.
  • Meta Description: Good meta descriptions can attract users to the website and increase the click-through rate. Again, one or two keywords should describe the content of the website.
  • Images: Images, logos and graphics should be described concisely. The ALT attribute provides an alternative description of the image for screen readers and similar programs. Here, too, keywords can be used in moderation.
  • Outbound and internal links: Hyperlinks act as recommendations for the quality of a website. Every outgoing link should be checked for the trustworthiness of the target site, its relevance to the site's own content and its anchor text. Error messages and unnecessary redirects should be avoided. Ideally, most of the links are "dofollow" (i.e. without rel="nofollow") so that the link juice can spread.[5] A simple automated check of titles, descriptions, headings and image ALT texts is sketched below.
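
The following sketch checks a few of these on-page elements with the third-party library BeautifulSoup (installed via pip install beautifulsoup4). The URL is a placeholder, and the checks are deliberately basic; a full on-page audit would cover far more signals.

```python
# A minimal on-page check: reports the title tag, the meta description,
# the H1 headings and any images without an ALT text. The URL is a placeholder.
import urllib.request
from bs4 import BeautifulSoup

url = "https://www.example.com/"
with urllib.request.urlopen(url, timeout=10) as response:
    soup = BeautifulSoup(response.read(), "html.parser")

title = soup.title.get_text(strip=True) if soup.title else None
description_tag = soup.find("meta", attrs={"name": "description"})
description = description_tag.get("content") if description_tag else None
h1_headings = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]
images_without_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]

print("Title:", title, f"({len(title or '')} characters)")
print("Meta description:", description)
print("H1 headings:", h1_headings)
print("Images without ALT text:", images_without_alt)
```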

Off Page

  • Popularity: How does the popularity compare to the competition? Is the website linked from other popular websites?
  • Trustworthiness: Has keyword stuffing been carried out on the website? Is there hidden text that is invisible to users? Is cloaking used?
  • Backlinks: An organic link profile is one of the most important criteria in an SEO audit. How many links point to the website, and from how many different domains? Does the link profile contain nofollow links (a profile without any nofollow links would look unnatural)? Are the backlinks thematically relevant and of high quality? Are there websites among them that represent earned media? A small evaluation of an exported backlink list is sketched after this list.
  • Authority: Both the authority of individual subpages and that of the entire domain can have an impact on the ranking.
  • Public Relations: Is the company, brand or website mentioned in different media?
  • Social media: There are different social signals for every social network. Most of the time, the focus is on interactions between users and a profile. Is content distributed? Is the website mentioned by users on social media? Are there influencers who distribute content virally? Is the social profile SEO optimized?
  • Competition: Information about competitors can help customers understand their competitors' strengths and weaknesses and improve their own portfolio. To obtain such data, an SEO audit can also be carried out for competitors, with certain modifications, since some data is not available. WDF*IDF analyses of texts can also be used to identify successful competitors with regard to particular keywords.[6]
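
The following sketch summarizes a backlink profile that has been exported to a CSV file, for example from a backlink tool. The file name and the column names (source_url, anchor_text, nofollow) are assumptions and would need to be adapted to the actual export.

```python
# A minimal sketch, assuming the backlink profile was exported to a CSV file.
# The column names (source_url, anchor_text, nofollow) are assumptions.
import csv
from collections import Counter
from urllib.parse import urlparse

referring_domains = Counter()
anchor_texts = Counter()
nofollow_count = 0
total = 0

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total += 1
        referring_domains[urlparse(row["source_url"]).netloc] += 1
        anchor_texts[row["anchor_text"].strip().lower()] += 1
        if row.get("nofollow", "").lower() in ("true", "1", "yes"):
            nofollow_count += 1

print("Total backlinks:", total)
print("Referring domains:", len(referring_domains))
print("Share of nofollow links:", f"{nofollow_count / total:.1%}" if total else "n/a")
print("Most common anchor texts:", anchor_texts.most_common(5))
```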

Importance for search engine optimization

An SEO audit reveals the strengths and weaknesses of a website and shows potential for improvement. In addition to the basic aspects such as crawlability, indexing and on- and off-page factors, the user-friendliness of a website is an important signal. User behavior changes only slightly over longer periods of time, but the ranking factors and standards of search engines are constantly changing. This is one of the main reasons why an SEO audit should take place on a regular basis.[7]

The SEO audit not only aims to optimize a website for search engines, but also to offer users high-quality content and a good customer journey. However, this makes the SEO audit very extensive. Some agencies therefore split the areas and offer content audits, on-page audits, off-page audits and technical SEO audits.

For companies with limited budgets, free tools and audit software can be useful. Examples include MySiteAuditor, ScreamingFrog, ZadroWeb, Found, SEO Report Card, WooRank or Marketing Grabber.[8] There are also numerous tools that are useful for specific tasks, such as Xenu as a crawler, Google Page Speed and Pingdom for loading time tests, SEMRush for traffic analysis or Opensiteexplorer as a link checker. When using free tools, however, sound knowledge is often required to interpret the data correctly.

References

  1. SEO Site Audits: Getting Started. moz.com. Retrieved June 2, 2015.
  2. How To Perform an SEO Audit - FREE $5000 Template Included. quicksprout.com. Retrieved June 2, 2015.
  3. How to Perform the World's Greatest SEO Audit. moz.com. Retrieved June 2, 2015.
  4. Successful Site Architecture for SEO. moz.com. Retrieved June 2, 2015.
  5. SEO Audit: What is it and do I Need One? linkedin.com. Retrieved June 2, 2015.
  6. How to perform A SEO audit of your web site (checklist included). reliablesoft.net. Retrieved June 2, 2015.
  7. The Perennial SEO Audit - Creating an Effective Framework for Keeping Your Campaign Running at Peak Performance. searchenginewatch.com. Retrieved June 2, 2015.
  8. Top 5 Free Website Audit Tools For Agencies. business2community.com. Retrieved June 2, 2015.
