mauricerenck / darkvisitors
Kirby robots.txt plugin for blocking AI Crawlers and Bots
Installs: 220
Dependents: 0
Suggesters: 0
Security: 0
Stars: 14
Watchers: 2
Forks: 0
Open Issues: 0
Type: kirby-plugin
Requires
- php: >=8.0.0
- amphp/amp: ^3.0
- getkirby/composer-installer: ^1.2
Requires (Dev)
- getkirby/cms: ^4
- phpunit/phpunit: ^9.5
README
Dark Visitors is a plugin for Kirby 3 and 4 that blocks unwanted AI Crawlers from your website using robots.txt. It uses the Dark Visitors API to identify and block unwanted visitors.
It also allows you to add custom rules and your sitemaps to your robots.txt file.
Installation
composer require mauricerenck/darkvisitors
Or download the latest release, unzip it, and copy it to site/plugins/dark-visitors
Get the access token
You need a Dark Visitors access token to use this plugin. Go to https://darkvisitors.com/, create an account, and create a project. Open your project and find your access token under Settings.
Usage
Edit your config.php and add the following line:
'mauricerenck.dark-visitors.token' => 'YOUR TOKEN'
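A minimal config.php might look like this (a sketch assuming a standard Kirby setup; 'YOUR TOKEN' is a placeholder for the token from your Dark Visitors project):

<?php

return [
    // Access token from your Dark Visitors project settings
    'mauricerenck.dark-visitors.token' => 'YOUR TOKEN',
];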
AI crawlers
Set which types of AI crawlers you want to block:
'mauricerenck.dark-visitors.aiTypes' => ['AI Assistant', 'AI Data Scraper', 'AI Search Crawler'],
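With these types enabled, the plugin fetches the matching agents from the Dark Visitors API and renders block rules into your robots.txt. An illustrative excerpt might look like the following; the actual agent names come from the API and change over time:

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /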
Your custom rules
Add your custom rules to the robots.txt file:
'mauricerenck.dark-visitors.agents' => [
    [
        'userAgents' => ['Googlebot', 'Bingbot'],
        'disallow' => ['/admin'],
    ],
    [
        'userAgents' => ['Bingbot'],
        'allow' => ['/microsoft'],
    ],
],
Setting your custom rules will overwrite the default rules, which are:
[
    'userAgents' => ['*'],
    'disallow' => ['/kirby', '/site'],
];
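For the custom rules above, the rendered robots.txt section would look roughly like this (standard robots.txt syntax; the exact grouping and ordering may differ):

User-agent: Googlebot
User-agent: Bingbot
Disallow: /admin

User-agent: Bingbot
Allow: /microsoft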
Sitemaps
Add your sitemaps to the robots.txt file:
'mauricerenck.dark-visitors.sitemaps' => [
    'Sitemap: https://your-site.tld/sitemap.xml',
    'Sitemap: https://your-site.tld/sitemap2.xml',
],
Tracking/Analytics
Dark Visitors offers a tracking feature. If you want to use it, enable it in the config:
'mauricerenck.dark-visitors.analytics' => true,
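Putting it all together, a config.php using all of the options above might look like this (a sketch; the URLs, rules, and token are placeholders to adapt to your site):

<?php

return [
    // Access token from your Dark Visitors project settings
    'mauricerenck.dark-visitors.token' => 'YOUR TOKEN',

    // Categories of AI crawlers to block via the Dark Visitors API
    'mauricerenck.dark-visitors.aiTypes' => ['AI Assistant', 'AI Data Scraper', 'AI Search Crawler'],

    // Custom rules (these replace the default rules)
    'mauricerenck.dark-visitors.agents' => [
        [
            'userAgents' => ['Googlebot', 'Bingbot'],
            'disallow' => ['/admin'],
        ],
    ],

    // Sitemap entries appended to robots.txt
    'mauricerenck.dark-visitors.sitemaps' => [
        'Sitemap: https://your-site.tld/sitemap.xml',
    ],

    // Optional Dark Visitors tracking/analytics
    'mauricerenck.dark-visitors.analytics' => true,
];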