fuelviews / laravel-robots-txt
Fuelviews Laravel robots.txt package
Requires
- php: ^8.2
- illuminate/http: ^10.0.0
- illuminate/routing: ^10.0.0
- illuminate/support: ^10.0.0
- spatie/laravel-package-tools: ^1.14.0
Requires (Dev)
- laravel/pint: ^1.0
- nunomaduro/collision: ^7.8
- orchestra/testbench: ^8.8
- pestphp/pest: ^2.20
- pestphp/pest-plugin-arch: ^2.5
- pestphp/pest-plugin-laravel: ^2.0
This package is auto-updated.
Last update: 2024-05-01 17:00:04 UTC
README
The Fuelviews Laravel robots.txt package streamlines the creation and management of robots.txt files for Laravel applications. It integrates into your Laravel project and lets you control how search engines interact with your website directly from the application's configuration. With its focus on rapid configuration and deployment, managing your site's search engine visibility becomes a quick, hassle-free part of your development workflow, so you can focus on building and optimizing your application.
Installation
You can require the package and its dependencies via Composer:
composer require fuelviews/laravel-robots-txt
You can manually publish the config file with:
php artisan vendor:publish --provider="Fuelviews\RobotsTxt\RobotsTxtServiceProvider" --tag="robots-txt-config"
This is the contents of the published config file:
<?php

/**
 * Configuration File: robots-txt.php
 *
 * This file contains configuration options for the robots.txt generation.
 */
return [
    /**
     * The disk where the robots.txt file will be saved.
     */
    'disk' => 'public',

    /**
     * User agent rules for different paths.
     */
    'user_agents' => [
        '*' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
                '/dashboard',
            ],
        ],
    ],

    /**
     * Sitemaps to include in robots.txt.
     */
    'sitemap' => [
        'sitemap.xml',
    ],
];
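As an illustration, a robots.txt built from the default configuration above would look roughly like the following (the exact sitemap URL depends on your application's host; this is a sketch of the expected shape, not output copied from the package):

```text
User-agent: *
Allow: /
Disallow: /admin
Disallow: /dashboard

Sitemap: http://example.com/sitemap.xml
```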
Usage
To access the robots.txt, navigate to your application's URL and append /robots.txt to it.
For example, if your application is hosted at http://example.com, the robots.txt file can be found at http://example.com/robots.txt.
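You can also inspect the generated file from the command line. A minimal sketch, assuming your application is reachable at http://example.com (a placeholder for your own host):

```shell
# Fetch and display the generated robots.txt
# (replace example.com with your application's host)
curl -s http://example.com/robots.txt
```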
Testing
composer test
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
If you discover a security vulnerability, please email support@fuelviews.com instead of using the issue tracker.
Credits
Support us
Fuelviews is a web development agency based in Portland, Maine. You'll find an overview of all our projects on our website.
License
The MIT License (MIT). Please see License File for more information.