
31 March 2016


What is robots.txt and how can it help with SEO?


When it comes to SEO, the most technical-sounding techniques often seem like the most complicated. That isn't always true, and robots.txt is a good example: behind the technical name sits a pretty simple concept.

What is robots.txt?

Robots.txt is a plain text file that tells search engines not to crawl particular pages of your website. It must sit in your site's "root directory", which is the top-level folder your domain points to. In practice, that means the file has to be reachable directly under your domain name: for www.nameofcompany.co.uk, it would live at www.nameofcompany.co.uk/robots.txt.
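As a minimal sketch (the paths are purely illustrative), a robots.txt file is made up of groups of directives: each group names the crawlers it applies to with a User-agent line, followed by Disallow lines listing the paths those crawlers should stay out of:

    User-agent: *
    Disallow: /drafts/
    Disallow: /internal-search/

Here the asterisk means "all crawlers", and each Disallow line blocks one path; a group with an empty Disallow line allows everything.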

So, now that we’ve got all the technical jargon out of the way, let’s explore the main question:

Why would you use a robots.txt file and how can it help with SEO?

Logically, it may not make sense immediately. After all, the whole point of SEO is to have your website crawled more often, so why would you use a robots.txt file to stop a search engine from crawling certain pages? Well, let's put it this way: SEO is not just about having your website crawled more often, it is about having the relevant pages of your website crawled more often.

Because search engines such as Google make user experience their priority, you should too. This means taking measures to ensure that only relevant pages come up in the search results. So if you have a page that adds nothing for searchers, you can use robots.txt to stop crawlers from fetching it. (Strictly speaking, blocking crawling does not guarantee a URL will never appear in results; the noindex alternative covered below is more reliable for that.)
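For instance, if you decided an internal "thank you" confirmation page was irrelevant to searchers, a single rule would keep all crawlers away from it (the path is hypothetical):

    User-agent: *
    Disallow: /thank-you/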

However, bear in mind that robots.txt should never be used to hide sensitive or confidential information. The file itself is publicly readable (anyone can open www.nameofcompany.co.uk/robots.txt), so listing a private path actually advertises its existence, and someone could still reach the page by typing the URL or following a link to it from elsewhere. Situations like these call for extra measures, such as password-protecting the page. For more information on how to protect confidential information, contact us here at FDC for a free consultation.

How can you use robots.txt?

Google Search Console (formerly Google Webmaster Tools) provides tooling to help you create and test a robots.txt file. One thing to watch: a robots.txt file only covers the host it sits on. A path such as www.nameofcompany.co.uk/home is a subdirectory handled by the main file, but a genuine subdomain such as blog.nameofcompany.co.uk would need its own separate robots.txt file at its own root. If that sounds fiddly, the better and safer alternative is to contact a web design agency so that they can help.
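As a sketch of how the pieces fit together (all paths and hostnames hypothetical), a file for the main site might block a private directory while still allowing one page inside it, and point crawlers at the sitemap:

    User-agent: *
    Disallow: /private/
    Allow: /private/press-kit.html

    Sitemap: https://www.nameofcompany.co.uk/sitemap.xml

A blog subdomain would need its own file at blog.nameofcompany.co.uk/robots.txt; the main site's file does not apply to it.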

Alternatives

If you don’t want to use robots.txt for whatever reason, you can instead add a “noindex” robots meta tag to the pages you want kept out of search results, and use Google Search Console to request removal of pages that have already been indexed. Again, this is something that would need the attention of a web design agency, unless you yourself are very familiar with the technicalities.
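As a sketch, the tag sits in the <head> of each page you want excluded:

    <meta name="robots" content="noindex">

One caveat worth knowing: a crawler has to fetch the page to see this tag, so don't also block the same URL in robots.txt, or the noindex instruction may never be read.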

So, now that you know the basics of robots.txt, you can decide for yourself whether you want or need it. Chances are, if your website has more than a few pages, you will want to keep crawlers away from at least one or two of them, so that searchers only see relevant, useful content about your business.

