Robots: Txt, Meta & X – Brighton SEO September 2017

3 min read · by Tom · 13 Sep 2017

Methods of blocking or restricting Google's crawl of your site are often confused or incorrectly selected. In this presentation I describe a deceptively simple process for picking the right tool for the job.
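
For reference, here's roughly what each of the tools in question looks like (illustrative snippets of my own, not taken from the slides). A robots.txt rule stops compliant crawlers from fetching matching URLs:

    User-agent: *
    Disallow: /private/

A meta robots tag sits in a page's <head> and controls indexing; note the page must stay crawlable for Google to see the directive:

    <meta name="robots" content="noindex, follow">

And the X-Robots-Tag HTTP response header carries the same directives for resources that can't contain a meta tag, such as PDFs:

    X-Robots-Tag: noindex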

The BrightonSEO slides can be found here:

I’ve also written more on the subject on State of Digital recently. It covers much of the same ground as the presentation itself, but for those who didn’t see it on the day, it may plug some of the gaps and answer any questions you may have. Read it here!

The Flowchart Process

There’s a lot to fit into a 20-minute presentation, so I’ve opted to include a flowchart to give more of a step-by-step process for you to follow should you need it! Click the image to download the file itself.

Flowchart: my robots.txt/meta robots process
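
If you can’t grab the image right now, the core decision runs roughly like this (a simplified text outline of the usual logic, not a substitute for the full chart):

    Do you need to stop Google fetching the URL at all?
    ├─ Yes -> robots.txt Disallow
    │         (the URL can still end up indexed if others link to it)
    └─ No  -> Do you need to keep the URL out of the index?
              ├─ Yes -> Is it an HTML page?
              │         ├─ Yes -> meta robots noindex
              │         └─ No  -> X-Robots-Tag: noindex header
              └─ No  -> no directive needed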

Within the presentation I’ve done what I can to credit all my sources, although in the process of pulling this together, much of the support and insight I’ve received risks going uncredited. Thanks to Will O’Hara for sanity-checking much of my thought process, and to Jon Myers, who put off working on his own presentation to lend me a hand.

Do You Have Any More Questions?

Did I miss something? Is there more you want to know, or do you think you’ve spotted an error? Keep the conversation going over on Twitter.
