mauricerenck/darkvisitors

Kirby robots.txt plugin for blocking AI Crawlers and Bots

1.0.1 2024-04-19 09:54 UTC



README

Dark Visitors is a plugin for Kirby 3 and 4 that blocks unwanted AI crawlers from your website via its robots.txt. It uses the Dark Visitors API to identify which crawlers to block.

It also allows you to add custom rules and your sitemaps to your robots.txt file.

Installation

composer require mauricerenck/darkvisitors

Or download the latest release, unzip it, and copy it to site/plugins/dark-visitors.

Get the access token

You need a Dark Visitors access token to use this plugin. Go to https://darkvisitors.com/, create an account, and create a project. Open the project and copy your access token from its settings.

Usage

Edit your config.php and add the following line:

'mauricerenck.dark-visitors.token' => 'YOUR TOKEN'
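
For reference, a minimal site/config/config.php with the token in place might look like this ('YOUR TOKEN' is a placeholder for your own token):

<?php

return [
    'mauricerenck.dark-visitors.token' => 'YOUR TOKEN',
];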

AI crawlers

Set which types of AI crawlers you want to block:

'mauricerenck.dark-visitors.aiTypes' => ['AI Assistant', 'AI Data Scraper', 'AI Search Crawler'],
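
With these types configured, the plugin fetches the matching agents from the Dark Visitors API and writes a blocking rule for each of them. The exact agents depend on the current API response; the generated output will look roughly like this (agent names are illustrative):

User-agent: GPTBot
Disallow: /

User-agent: anthropic-ai
Disallow: /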

Your custom rules

Add your custom rules to the robots.txt file:

'mauricerenck.dark-visitors.agents' => [
    [
        'userAgents' => ['Googlebot', 'Bingbot'],
        'disallow' => ['/admin'],
    ],
    [
        'userAgents' => ['Bingbot'],
        'allow' => ['/microsoft'],
    ],
],
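
Assuming the plugin writes one group per entry, the configuration above should produce roughly these robots.txt rules (the exact formatting may differ):

User-agent: Googlebot
User-agent: Bingbot
Disallow: /admin

User-agent: Bingbot
Allow: /microsoft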

Setting custom rules replaces the default rule, which is:

[
    'userAgents' => ['*'],
    'disallow' => ['/kirby', '/site'],
];

Sitemaps

Add your sitemaps to the robots.txt file:

'mauricerenck.dark-visitors.sitemaps' => [
    'Sitemap: https://your-site.tld/sitemap.xml',
    'Sitemap: https://your-site.tld/sitemap2.xml',
],
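
Each entry already contains the Sitemap: prefix, so the strings are presumably written to robots.txt verbatim. Combined with the default rule (and no AI types configured), the generated file would look roughly like this:

User-agent: *
Disallow: /kirby
Disallow: /site

Sitemap: https://your-site.tld/sitemap.xml
Sitemap: https://your-site.tld/sitemap2.xml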

Learn more about robots.txt and AI crawlers