
ChatBot Blocker by CellarWeb

You can block ChatGPT, Bard, and other chatbots from using your site content by adding specific commands to your site’s virtual robots.txt file. Note that an actual robots.txt file in your site root will override the virtual commands.

The plugin also adds your sitemap.xml file to the virtual robots.txt file.

BEGIN – Added by ChatBot Blocker by CellarWeb plugin

    #  Blocks ChatGPT bot scanning
    User-agent: GPTBot
    Disallow: /
    #  Blocks Bard bot scanning
    User-agent: Bard
    Disallow: /
    #  Blocks Bing bot scanning
    User-agent: bingbot-chat/2.0
    Disallow: /
    #  Blocks Common Crawl bot scanning
    User-agent: CCBot
    Disallow: /
    #  Blocks omgili bot scanning
    User-agent: Omgili
    Disallow: /
    #  Blocks omgilibot bot scanning
    User-agent: Omgili Bot
    Disallow: /

END - Added by ChatBot Blocker by CellarWeb plugin

What about other chatbot scanners?

We’ll add them to the plugin as we find them, and the latest list will be enabled automatically by the plugin.

If you want to add your own, use the same format as shown above. You can determine the ‘User-agent’ value by looking at your access logs – the User-agent will usually have ‘bot’ as part of its value. Add additional entries at the end of the list in the settings input box.
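For example, to block a hypothetical crawler that identifies itself as ‘ExampleBot’ (an illustrative name only; substitute the exact User-agent string you see in your access logs), you would append an entry like this:

    #  Blocks ExampleBot scanning
    User-agent: ExampleBot
    Disallow: /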

Where do I add more settings?

We recommend adding them at the end of the default settings. Use the ‘#’ character to document your settings.

How do I restore a default virtual robots.txt file?

Remove all text from the box that displays the current virtual robots.txt file. Then click the ‘Save Changes’ button.

How do I see the actual virtual robots.txt file?

Use the URL www.yoursite.com?robots=1, replacing ‘yoursite.com’ with your actual site domain name. There is also a link on the Settings, Reading screen under the virtual robots.txt box created by the plugin.
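If you prefer to check it programmatically, here is a minimal Python sketch (example.com is a placeholder domain, and the ?robots=1 query string is simply the preview URL described above):

    from urllib.request import urlopen

    # Placeholder domain; replace with your actual site.
    url = "https://www.example.com/?robots=1"

    # Fetch and print the virtual robots.txt exposed at the preview URL.
    with urlopen(url) as response:
        print(response.read().decode("utf-8"))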

What if I have an actual (not virtual) robots.txt file in the root of my site?

The actual file will override any virtual settings. The screen will show a warning message below the display of the virtual settings if an actual file is found in the site’s root folder.

Why do I want a robots.txt file?

The robots.txt file is used by responsible bots to scan only the files that you want to be scanned. The plugin adds commands to block the ChatGPT and Bard bots from scanning your site.

There might be some bots that ignore the robots.txt directives. This is not common, but there is no easy way to block those irresponsible bots.
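To see how a well-behaved crawler consults these directives, here is a minimal Python sketch using the standard urllib.robotparser module (example.com is a placeholder domain, and the results noted in the comments assume the plugin’s GPTBot block is in place):

    from urllib.robotparser import RobotFileParser

    # Placeholder domain; replace with your actual site.
    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()  # download and parse the robots.txt directives

    # A responsible bot checks before fetching any page.
    print(rp.can_fetch("GPTBot", "https://www.example.com/a-post/"))     # False when GPTBot is disallowed
    print(rp.can_fetch("Googlebot", "https://www.example.com/a-post/"))  # True unless otherwise blocked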

Should I put SEO Optimization commands in the robots.txt file?

You could, but they won’t be effective. There are better ways to handle SEO.

Where can I learn more about how site crawlers work and how they use the robots.txt file?

Here is a general guide by Google and here is the WordPress SEO documentation.

Installation
  1. Download the plugin
  2. Unzip it
  3. Upload the unzipped folder to the wp-content/plugins directory
  4. Activate and enjoy!

Or you can simply install it through the admin area plugin installer.

Where do I find the settings for this plugin?

It’s all automatic, but you can see the current virtual robots.txt settings via the Admin, Settings, Reading screen. The virtual file is generated automatically by WordPress each time it is requested, so there is no physical file to edit.

What are the default settings created with this plugin?

Default settings are:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://cellarweb.com/development/wp-sitemap.xml

Changelog

1.01 (1 Nov 2023)

  • Minor changes to plugin header area for links to plugin

1.00 (28 Oct 2023)

  • Initial version release