Bing launches a tool to analyze Robots.txt

When talking about SEO and SEO tools, the natural tendency is to focus on Google and on tools for ranking in Google. However, SEO applies to all search engines, including Bing, Microsoft's search engine, which has launched a new tool to analyze Robots.txt files and look for mistakes.

Bing has released a tool that allows you to analyze and edit the Robots.txt file to avoid common mistakes.

In July 2020, Bing announced the release of a tool to analyze the Robots.txt files of websites, but it was not until September 4th that the company officially announced the launch on its blog. The new tool, which emerged from the merging of Bingbot and BingAdsBot, was developed to analyze and configure the Robots.txt files of one or more websites. It is free and can be found on the Bing Webmaster Tools page.

It is a full Robots.txt editor that allows users to modify access for the different robots and configure the file to the webmaster's needs. Once the changes have been made, the tool performs a full scan of the file for errors at the touch of a button.
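
Bing has not documented how this error scan works internally, but a minimal sketch of the idea in Python, assuming a small set of known directives and an illustrative sample file, might look like this:

```python
# Minimal sketch of a robots.txt error scan; real tools check much more.
# The directive list and sample file below are illustrative assumptions.
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def scan_robots_txt(text: str) -> list[str]:
    errors = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        stripped = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not stripped:
            continue  # blank or comment-only line
        if ":" not in stripped:
            errors.append(f"line {lineno}: missing ':' separator")
            continue
        directive = stripped.split(":", 1)[0].strip().lower()
        if directive not in KNOWN_DIRECTIVES:
            errors.append(f"line {lineno}: unknown directive '{directive}'")
    return errors

sample = """\
User-agent: *
Disalow: /admin/    # typo: should be 'Disallow'
Allow: /
"""
print(scan_robots_txt(sample))  # flags line 2 as an unknown directive
```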

It is able to check the allow and disallow directives, that is, the permissions and blocks for all user agents, the crawlers of the different search engines (Google, Bing, Yandex, etc.). It also shows four versions of the same file, for http://, https://, http://www, and https://www, so that the user can see how the file behaves on any version of a web page.
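
As a rough approximation of these checks, Python's standard library can evaluate allow/disallow decisions per user agent. The rules, crawler names, and host below are illustrative assumptions, and RobotFileParser applies standard robots.txt semantics rather than Bing's exact implementation:

```python
from urllib.robotparser import RobotFileParser

# Illustrative per-bot rules, similar to what the tool lets you edit.
ROBOTS_TXT = """\
User-agent: Bingbot
Disallow: /private/

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Allow/disallow verdict per user agent: Bingbot matches its own group,
# so only /private/ is blocked for it; other bots fall back to '*'.
for agent in ("Bingbot", "Googlebot", "YandexBot"):
    for path in ("/private/a.html", "/admin/login", "/blog/post"):
        verdict = "allowed" if parser.can_fetch(agent, path) else "blocked"
        print(f"{agent:10} {path:18} {verdict}")

# The four versions of the file the tool displays, for a hypothetical host:
host = "example.com"
variants = [f"{scheme}://{www}{host}/robots.txt"
            for scheme in ("http", "https")
            for www in ("", "www.")]
print(variants)
```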

Webmasters can use the tool to update and edit the file directly, or download it for offline editing. When you download the file, step-by-step instructions appear on screen covering the entire update process, from downloading and editing the file to uploading the final version to the server. If you edit the file yourself, you can reload it and click the “Check recent changes” button so that the tool only analyzes the most recent changes made to the file.
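
The download step itself amounts to an HTTP fetch of the live file. A minimal sketch, assuming a hypothetical example.com site (uploading the edited file back to the server happens separately, for example via FTP or a hosting panel):

```python
from urllib.request import urlopen

# Hypothetical address; in practice this is your own site's robots.txt.
URL = "https://www.example.com/robots.txt"

# Fetch the live file so it can be edited offline,
# mirroring the tool's download step.
with urlopen(URL) as response:
    content = response.read().decode("utf-8")

# Save a local copy for editing.
with open("robots.txt", "w", encoding="utf-8") as local_file:
    local_file.write(content)
```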
