miisieq / robots-txt-bundle
Robots.txt generator for Symfony 2, 3 & 4
Installs: 2 494
Dependents: 0
Suggesters: 0
Security: 0
Stars: 3
Watchers: 1
Forks: 1
Open Issues: 0
Type: symfony-bundle
Requires
- php: ^7.1
- symfony/config: ~2.0|~3.0|~4.0
- symfony/http-kernel: ~2.0|~3.0|~4.0
Requires (Dev)
- phpunit/phpunit: ~5.7 | ^6 | ^7
- symfony/dependency-injection: ~2.0|~3.0|~4.0
- symfony/yaml: ~2.0|~3.0|~4.0
README
The problem
It's a pretty common workflow: we work on our projects in a local
environment, then deploy the code to a preproduction
or staging
server for our client to approve the work, and finally push to the production
environment.
While we absolutely want crawlers to index our production
environment, we don't want to see our test servers in search results.
How does it work?
Depending on the Symfony environment, the application returns a robots.txt
file whose rules allow the whole site to be indexed only when running in the prod
environment. In any other environment, the generated file blocks the whole site from indexing.
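For illustration, assuming the conventional allow-all and deny-all directives (the exact output the bundle emits may differ slightly), the response in prod would look like:

```
User-agent: *
Disallow:
```

while any other environment would get:

```
User-agent: *
Disallow: /
```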
Installation
Step 1: Install the bundle
First, open a command console, enter your project directory and execute the following command to download the latest version of this bundle:
```bash
composer require miisieq/robots-txt-bundle
```
Step 2: Register the bundle in your kernel
Then add the bundle to your kernel:
```php
// app/AppKernel.php

class AppKernel extends Kernel
{
    public function registerBundles()
    {
        $bundles = [
            // ...
            new Miisieq\RobotsTxtBundle\MiisieqRobotsTxtBundle(),
        ];

        // ...
    }
}
```
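If you're on Symfony 4 with Symfony Flex, there is no AppKernel; bundles are registered in config/bundles.php instead. Assuming Flex didn't already add the entry for you, a minimal sketch would be:

```php
// config/bundles.php
return [
    // ...
    // Enable the bundle in all environments:
    Miisieq\RobotsTxtBundle\MiisieqRobotsTxtBundle::class => ['all' => true],
];
```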
Step 3: Configure the bundle
Add the following to your config file:
```yaml
# app/config/config.yml
miisieq_robots_txt: ~
```
You can easily add links to your sitemaps:
```yaml
# app/config/config.yml
miisieq_robots_txt:
    host: http://example.com
    production_environment: prod
    sitemaps:
        - "/sitemap.xml"
        - "/catalog/sitemap.xml"
```
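Assuming the host value is prepended to each sitemap path (a sketch, not the bundle's guaranteed output format), the prod response would then also advertise the sitemaps:

```
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml
Sitemap: http://example.com/catalog/sitemap.xml
```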
Step 4: Register the routes
To expose your robots.txt
file, register the following route:

```yaml
# app/config/routing.yml
miisieq_robots_txt:
    resource: "@MiisieqRobotsTxtBundle/Resources/config/routes.yaml"
    prefix: /
```
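To verify the route is in place, you can list the router and fetch the file (assuming the Symfony 3+ directory structure with bin/console and a dev server on localhost:8000; on Symfony 2 the command is router:debug via app/console):

```bash
php bin/console debug:router | grep robots
curl http://localhost:8000/robots.txt
```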
Step 5: Remove the static robots.txt file (if it exists)

```bash
rm web/robots.txt
```
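On Symfony 4 the document root is public/ rather than web/, so the equivalent cleanup would be:

```bash
rm public/robots.txt
```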