WordPress Robots.txt For SEO

The robots.txt file instructs search engine robots about which pages on your website should be crawled and, consequently, indexed. Most websites have files and folders that are not relevant to search engines (like images or admin files), so creating a robots.txt file can actually improve how your website is indexed.

Implementing an effective SEO robots.txt file for WordPress will help your blog rank higher in search engines, receive more relevant (and higher-paying) ads, and increase your blog traffic.

Here is my robots.txt file, which can further protect WordPress from duplicate content issues.


User-agent: *               # the rules below apply to all crawlers
Disallow: /wp-              # core, admin and plugin paths (anything starting with /wp-)
Disallow: /feed/            # the main RSS feed
Disallow: /trackback/       # trackback URLs
Disallow: /comments/feed/   # the comments feed
Disallow: /page/            # paginated archives (duplicate content)
Disallow: /comments/        # comment pages
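
To see how a crawler would apply these rules, here is a minimal sketch using Python's standard urllib.robotparser module; the sample paths are only illustrations and are not part of the original file.

import urllib.robotparser

# The rules from the file above, parsed from a string.
rules = """\
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /page/
Disallow: /comments/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Disallow rules match by prefix, so /wp- also blocks /wp-admin/, /wp-includes/, etc.
for path in ("/wp-admin/", "/feed/", "/page/2/", "/my-first-post/"):
    print(path, "blocked" if not parser.can_fetch("*", path) else "allowed")

Only the last path, an ordinary post URL, should come back as allowed.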

After you have created the robots.txt file, just upload it to your site's root directory and you are done!
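
Once the file is uploaded, it is worth confirming that it is actually being served from the root URL and behaves as intended. The sketch below assumes a placeholder domain, example.com, and a hypothetical post slug; substitute your own blog's addresses.

import urllib.robotparser

# example.com and the post slug are placeholders; use your own domain and URLs.
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # downloads and parses the live file

print(parser.can_fetch("*", "https://example.com/wp-admin/"))    # should print False
print(parser.can_fetch("*", "https://example.com/hello-world/")) # should print True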
