mediawiki/crawlable-all-pages

Extension to remove robot restrictions from Special:AllPages in MediaWiki



This extension overrides Special:AllPages by removing a robots restriction from the page's HTML head. This is a relatively easy way to allow search engine crawlers to index all the pages in your wiki.

The HTML removed is simply:

<meta name="robots" content="noindex,nofollow"/>
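If you want to check whether your wiki currently emits this tag, one quick way (sketched here with a placeholder URL; substitute your wiki's actual base URL) is to fetch Special:AllPages and grep for the robots meta element:

curl -s 'https://example.org/wiki/Special:AllPages' | grep -io '<meta name="robots"[^>]*>'

If the command prints the noindex,nofollow tag, your wiki is subject to the default restriction that this extension removes.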

Installation without composer

  • Download and place the files in a directory called CrawlableAllPages in your extensions/ folder (see the sketch after this list).
  • Add the following code at the bottom of your LocalSettings.php:
wfLoadExtension( 'CrawlableAllPages' );
  • ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
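A minimal sketch of the download step, assuming you fetch the code with git; <repository-url> stands for wherever you obtain the extension and /path/to/mediawiki for your installation path:

cd /path/to/mediawiki/extensions
git clone <repository-url> CrawlableAllPages

Any other way of obtaining the files (for example a release tarball) works just as well, as long as they end up in extensions/CrawlableAllPages.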

Installation with composer

  • If you do not have a composer.local.json file in your MediaWiki installation, create one:
echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' > composer.local.json
  • If you have jq and sponge (from moreutils) installed and an existing composer.local.json, you can use the following command to add this extension to your composer.local.json file:
jq '.require += { "mediawiki/crawlable-all-pages": "dev-master" }' \
   composer.local.json | sponge composer.local.json
  • Run composer update in your MediaWiki installation directory (the directory containing composer.local.json):
composer update
  • ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
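Once the extension is active, a quick spot check (again with a placeholder URL) is to fetch Special:AllPages and confirm the robots meta tag is no longer present:

curl -s 'https://example.org/wiki/Special:AllPages' | grep -i 'name="robots"' || echo 'No robots meta tag found.'

If the tag still appears, server-side caches may need to be purged before the change becomes visible.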