WordPress Robots.txt For SEO

The robots.txt file tells search engine robots which pages on your website should be crawled and, consequently, indexed. Most websites have files and folders that are not relevant to search engines (such as images or admin files), so creating a robots.txt file can actually improve your site's indexation.

Implementing an effective SEO robots.txt file for WordPress will help your blog rank higher in search engines, receive more relevant ads, and increase your traffic.

Here is my robots.txt file, which also protects WordPress from duplicate content issues: feeds, trackbacks, comment pages, and paginated archives all expose the same posts at multiple URLs.

# Rules for all crawlers
User-agent: *
# Block WordPress core paths (/wp-admin/, /wp-includes/, /wp-content/, ...)
Disallow: /wp-
# Block feed, trackback, comment, and paginated-archive URLs (duplicate content)
Disallow: /feed/
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /page/
Disallow: /comments/

After you have created the robots.txt file, just upload it to the root directory of your site and you are done!
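
If you want to confirm the rules behave as intended, you can test a few URLs against the live file with Python's standard-library robots.txt parser. This is only a quick sanity-check sketch; the example.com domain and the sample paths are placeholders for your own blog's URLs.

from urllib.robotparser import RobotFileParser

# Placeholder domain: replace with your own blog's address.
SITE = "https://example.com"

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# Paths the rules above are meant to block, plus one normal post URL
# that should stay crawlable. "*" means the check applies to any crawler.
checks = ["/wp-admin/", "/feed/", "/page/2/", "/my-first-post/"]

for path in checks:
    allowed = parser.can_fetch("*", SITE + path)
    print(path, "allowed" if allowed else "blocked")

If a path you expect to be blocked prints "allowed", double-check that the file sits at the root of your domain (yoursite.com/robots.txt) and that the directive prefixes match your permalink structure.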

