Data-Driven SEO: Using Free Tools to Get Detailed Insights About Your SEO Performance
Jan-Willem Bobbink is an international freelance SEO consultant with a main focus on the financial and travel markets, a brand ambassador for Majestic, and a speaker at numerous international SEO events. His blog can be found at Notprovided.eu.
Research on data effectiveness shows that at a certain point more data leads to poorer results; there is no need to collect data for the sake of collecting data.
Data-driven SEO methodology should be simple:
- Define questions before collecting data;
- Select your tools;
- Make your data visual (not just spreadsheets).
Facts to know before you start:
- Top three ranking pages and why (content, links, or something else?);
- Link data (best linked and potential pages);
- Quality of content.
- Identify every URL on the website (master the VLOOKUP function in Excel). Crawl the website (Screaming Frog, free for up to 500 URLs; Visual SEO Studio; the IIS SEO Toolkit by Microsoft; Xenu). 95 percent of SEO tools are built for Windows, so it's better not to use a Mac (Mac options: Integrity; Java-based spiders; the Python-based LinkChecker). If you have no access to crawling tools, use sitemap.xml files.
Result: list of live URLs.
- Enrich the URL dataset. Download SeoTools for Excel.
Result: page-specific data. Word count shows the number of words per page (the better the content, the higher the chance a page starts ranking). Inlinks and outlinks show the number of links per page (useful for building an internal linking profile).
- Know URL-based performance data. Download data from Google Search Console (specifically, desktop vs. mobile data and page performance data), but be aware it is limited to 90 days. For big websites, verify individual folders to get more data per section.
- Look at SEO-specific data. Links are one of the most important ranking factors now and will be for at least five more years. Use Majestic for free link data (download the data and add it to your Excel sheet).
- Think of the real user: have social value. Download the overall performance of your site; look at your data per channel (social vs. overall). Socialcrawlytics.com is a great tool for getting social data.
- Put all data together and start analyzing.
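The merging step above is usually done with VLOOKUP in Excel, but the same join can be sketched in a few lines of Python. Everything here is illustrative: the URLs, field names, and numbers are invented, not real exports from any of the tools mentioned.

```python
# Sketch: merge crawl data, Search Console data, and link data on the
# URL key - the equivalent of chained VLOOKUPs in Excel.
# All URLs, field names, and numbers below are made-up examples.
crawl = {
    "/pricing": {"word_count": 180, "inlinks": 2},
    "/blog/guide": {"word_count": 2400, "inlinks": 35},
}
search_console = {
    "/pricing": {"impressions": 5400, "clicks": 40},
    "/blog/guide": {"impressions": 900, "clicks": 120},
}
majestic_links = {
    "/pricing": {"external_links": 12},
}

def merge_on_url(*datasets):
    """Combine every dataset into one row per URL; missing fields stay absent."""
    merged = {}
    for data in datasets:
        for url, fields in data.items():
            merged.setdefault(url, {}).update(fields)
    return merged

master = merge_on_url(crawl, search_console, majestic_links)
```

Once every source is keyed on the same (normalized) URL, the analysis questions in the rest of this piece become simple filters over `master`.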
Onpage data shows pages that need more content.
Webmaster Tools data shows pages that need optimizing.
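Those two buckets can be pulled straight from the combined dataset. The thresholds below (300 words, 1% CTR) are arbitrary placeholders for illustration, not recommendations from the talk.

```python
# Sketch: flag pages that need more content (thin onpage data) and
# pages that need optimizing (impressions but poor CTR in Search
# Console data). Thresholds and numbers are assumed examples.
pages = [
    {"url": "/pricing", "word_count": 180, "impressions": 5400, "clicks": 40},
    {"url": "/blog/guide", "word_count": 2400, "impressions": 900, "clicks": 120},
]

needs_content = [p["url"] for p in pages if p["word_count"] < 300]
needs_optimizing = [
    p["url"] for p in pages
    if p["impressions"] > 0 and p["clicks"] / p["impressions"] < 0.01
]
```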
Quick wins from your data:
- Create a list of URLs with established visibility in Google but few internal links. Add more links to them.
- Create a list of pages with a good amount of organic traffic but a high bounce rate. Check whether the content is relevant to the queries; update, remove, or enrich content accordingly.
- Find pages with poor Google rankings but high traffic from other sources. Try to get those pages ranked better.
- Find visible pages with good metrics but not in the top 3. Get more links to them.
- Find pages in the top 10 with few internal links but lots of external links. Add internal links and content; move them into the top 3.
- Create an overview of pages that rank well but have a low click-through rate. Optimize the metadata; rebuild the page; change keywords.
- Create list of URLs with impressions and visibility but no great content targeting a specific query. Optimize onpage content.
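The first quick win on the list above (ranking URLs with few internal links) is a one-line filter once the data is merged. The field names and the threshold of 3 inlinks are assumptions for the sketch, and the URLs and numbers are invented.

```python
# Sketch: find URLs with established visibility in Google (organic
# clicks) but few internal links - the first quick win above.
# Thresholds, field names, and all numbers are assumed examples.
pages = [
    {"url": "/guides/loans", "organic_clicks": 300, "inlinks": 1},
    {"url": "/about", "organic_clicks": 5, "inlinks": 40},
    {"url": "/blog/travel-tips", "organic_clicks": 120, "inlinks": 2},
]

quick_wins = sorted(
    (p for p in pages if p["organic_clicks"] >= 100 and p["inlinks"] < 3),
    key=lambda p: p["organic_clicks"],
    reverse=True,
)
# quick_wins lists the pages most worth pointing internal links at
```

Each of the other quick wins follows the same pattern: pick two columns from the merged sheet, filter on one, and sort on the other.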
Make your data understandable and visual to convince your stakeholders/clients (example of page links below)!
More content might bring better rankings in Google. Compare your volume of content with that of high-ranking websites.
- Find pages with good traffic and low conversion rate. Test new layout, different content, or different pricing.
- Look at the average value per user. Prioritize according to the best-performing pages/average number of conversions per user.
- Find pages with no organic traffic. Remove old and irrelevant pages to avoid a Panda penalty.
- Cluster pages into themes (do it automatically with monkeylearn.com). If there are folders without traffic, this might signal a link penalty, or that Google finds them irrelevant to the queries.
- Find pages that perform well in AdWords but have poor organic results. Push them harder organically.
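A crude stand-in for the automatic theme clustering mentioned above is to group URLs by their top-level folder and sum traffic per folder, which makes zero-traffic folders stand out immediately. The URLs and session counts below are invented examples, not real data.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Sketch: group URLs into "themes" by top-level folder and sum organic
# sessions, so folders with no traffic surface for a closer look.
# URLs and session numbers are made-up illustrations.
traffic_by_url = {
    "https://example.com/loans/personal": 900,
    "https://example.com/loans/car": 400,
    "https://example.com/press/2014-award": 0,
}

def traffic_per_theme(traffic):
    themes = defaultdict(int)
    for url, sessions in traffic.items():
        path = urlparse(url).path.strip("/")
        theme = path.split("/")[0] if path else "(root)"
        themes[theme] += sessions
    return dict(themes)

themes = traffic_per_theme(traffic_by_url)
dead_themes = [t for t, s in themes.items() if s == 0]
```

Folder-based grouping is obviously rougher than content-based clustering, but it needs no external service and is often enough to spot a section-wide problem.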