Last week our Head of SEO, Jamie White, attended the BrightonSEO conference and has put together this round-up of his favourite and most insightful talks.

Aleyda Solis – Goodbye SEO f*ck ups! Learn to set an SEO Quality Assurance Framework

The first talk of the day was by Aleyda Solis, who spoke about Quality Assurance Frameworks in SEO. Her message resonated with almost everyone in the audience: too much time is spent fixing broken technical SEO, rather than building or executing new initiatives.

She has built a framework that helps technical SEO improvements get pushed through faster and more accurately. The framework is built on three core components:

  • Education
  • Validation
  • Monitoring

The whole process ensures that, from the very beginning, you’re earning the buy-in of client teams by educating them on the purpose of SEO, why certain changes will be recommended, and the expected impact of those changes.

Next, validate those changes by gaining access to different site environments, such as staging, development, and UAT. This helps ensure that whatever is implemented matches the suggested recommendations.
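
To make that concrete, here is a minimal sketch (my own illustration, not part of Aleyda's framework) of the kind of validation check you could run against a staging/production pair. The URLs and the list of tags checked are assumptions:

```python
# Validation sketch: compare key SEO elements between the staging and
# production versions of a page. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def seo_snapshot(url: str) -> dict:
    """Fetch a URL and extract the SEO elements we want to validate."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "canonical": canonical.get("href") if canonical else None,
        "meta_robots": robots.get("content") if robots else None,
    }

# Hypothetical staging/production pair - swap in your own environments.
staging = seo_snapshot("https://staging.example.com/some-page/")
production = seo_snapshot("https://www.example.com/some-page/")

for field in staging:
    # Note: canonicals may legitimately differ on staging - judge accordingly.
    status = "OK" if staging[field] == production[field] else "MISMATCH"
    print(f"{field}: {status} (staging={staging[field]!r}, prod={production[field]!r})")
```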

Finally, monitor the impact of those changes and report back to the client on whether the desired impact was seen. This might happen instantly, or over the course of a few months, but it is crucial for retaining the buy-in that you earned at the beginning of the process.
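
For the monitoring step, one option (again my sketch, not something prescribed in the talk) is to pull performance data for the affected pages from the Google Search Console API. The property, dates, page URL, and key file below are placeholders:

```python
# Monitoring sketch: track clicks/impressions for a page after a change ships,
# via the Search Console API. Requires: pip install google-api-python-client
# plus a service account with access to the property (setup not shown).
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # hypothetical property
    body={
        "startDate": "2022-09-01",
        "endDate": "2022-10-01",
        "dimensions": ["date"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": "https://www.example.com/some-page/",
            }]
        }],
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```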

You can view Aleyda’s slides here.

Fili Wiese – Mastering robots.txt: SEO insights by an ex-Google engineer

The Crawling and Indexation session, run by ex-Google engineer Fili Wiese, gave great insight into mastering the robots.txt file for better crawl management. This talk was very much “hints and tips”, so I will do my best to list some of the lesser-known ones:

  • Every origin (domain, subdomain) can – and should – have its own robots.txt file.
  • Currently, numbers are not recognised in the robots.txt file, so any user agents or commands containing a number are considered invalid.
  • Disallow commands are case-sensitive, whereas user agents are not.
  • Certain characters, such as the plus symbol, need to be percent-encoded (for example, + becomes %2B) to be recognised.
  • Where Allow and Disallow rules conflict for the same URL, the most specific (longest) matching rule wins, and Google falls back to the least restrictive rule in a tie – so a broad Allow doesn’t stop a more specific Disallow from blocking a particular folder.
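
To see one of these behaviours in action, here is a quick check using Python’s built-in robots.txt parser (my own illustration – note this is urllib’s parser, not Google’s production one, so edge-case handling can differ):

```python
# Quick sanity check of robots.txt behaviour using Python's built-in parser.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /Private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Disallow paths are case-sensitive: /Private/ is blocked, /private/ is not.
print(parser.can_fetch("MyBot", "https://example.com/Private/page"))  # False
print(parser.can_fetch("MyBot", "https://example.com/private/page"))  # True
```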

Mark Williams-Cook – Effective zero-volume keyword research and why it’s important

One of the later talks, led by Mark Williams-Cook, focused on harnessing the power of zero-volume search queries. This is a problem that plagues SEOs all the time – keywords for which search volume data is unavailable or reported as very low.

This makes it hard to prioritise certain keywords for optimisation, or even to know whether targeting them is worthwhile at all. Mark suggested that we should think in terms of “intent volume” rather than “search volume”, bucketing together keywords with a similar intent to create a more holistic view of a topic instead of focusing on individual keywords.

This was a really great approach because it suddenly turns a handful of low-volume keywords into a bucket with high intent volume.
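
As a made-up illustration of how the numbers roll up (the keywords and volumes below are invented, not Mark’s data):

```python
# Toy illustration of "intent volume": sum reported search volumes across
# keyword variants that share one underlying intent.
from collections import defaultdict

# (keyword, intent bucket, reported monthly volume) - all invented values.
keywords = [
    ("how to fix a squeaky door hinge", "fix squeaky hinge", 10),
    ("squeaky door hinge remedy", "fix squeaky hinge", 0),
    ("stop door hinge squeaking", "fix squeaky hinge", 20),
    ("wd40 on squeaky hinge", "fix squeaky hinge", 10),
]

intent_volume = defaultdict(int)
for keyword, intent, volume in keywords:
    intent_volume[intent] += volume

# Four "zero/low-volume" keywords roll up into one bucket worth targeting.
for intent, volume in intent_volume.items():
    variants = sum(1 for k in keywords if k[1] == intent)
    print(f"{intent}: ~{volume} searches/month across {variants} variants")
```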

In terms of compiling these intent buckets, Mark introduced us to Custom Search Engines (available at https://cse.google.com), which can be set up to return results from specific websites only. Those results can then be extracted via API and used alongside tools such as Semrush, AlsoAsked, or Google Keyword Planner.
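
As a rough sketch of that extraction step, assuming you query the engine through the Custom Search JSON API (the API key and engine ID below are placeholders):

```python
# Sketch: pull results from a Custom Search Engine via the Custom Search
# JSON API. Requires: pip install requests
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical key from the Google Cloud console
CSE_ID = "YOUR_CSE_ID"    # the "cx" ID of the engine built at cse.google.com

def search(query: str, num: int = 10) -> list[dict]:
    """Return result items for a query against the custom engine."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CSE_ID, "q": query, "num": num},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

# Titles and URLs from the isolated sites can then feed further research.
for item in search("zero volume keyword example"):
    print(item["title"], item["link"])
```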

Mark’s slides are available at this link.

Thank you for reading and we hope you enjoyed our coverage! For more information on our SEO services, visit here.