# fixer112/laravel-sanitizer

Laravel middleware to sanitize inputs and block malicious bots and code.

**Requires**
- php: ^8.0
- illuminate/http: ^10.0|^11.0|^12.0
- illuminate/support: ^10.0|^11.0|^12.0
A Laravel middleware package that sanitizes all incoming request data by stripping out potentially malicious scripts, SQL keywords, and dangerous shell command input. It also blocks known bots and crawlers based on the `User-Agent` header.
## ✨ Features

- Filters out common XSS/JS/HTML injections
- Removes SQL injection keywords
- Removes shell command patterns like `cmd`, `powershell`, `shutdown`
- Sanitizes all fields except `password` and `confirm_password`
- Blocks basic bot `User-Agent` patterns
- Lightweight and auto-runs on every request (if configured)
## 🚀 Installation

```bash
composer require fixer112/sanitizer
```
## ⚙️ Configuration

To publish the configuration file:

```bash
php artisan vendor:publish --tag=config --provider="Fixer112\Sanitizer\SanitizerServiceProvider"
```

This will create `config/sanitizer.php` with:

```php
return [
    'global' => true, // Automatically apply to all web and API routes
];
```

If `global` is `true`, the sanitizer middleware is added to both the `web` and `api` middleware groups automatically.
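For orientation, here is roughly how a service provider can wire up that global behavior. This is a hedged sketch, not necessarily the package's exact code; it assumes the provider's `boot()` reads the published config and uses Laravel's `pushMiddlewareToGroup()`:

```php
use Fixer112\Sanitizer\Middleware\Sanitizer;

public function boot(): void
{
    // Sketch: only attach the middleware when config('sanitizer.global') is true.
    if (config('sanitizer.global', true)) {
        $router = $this->app['router'];

        // Push the sanitizer into both route middleware groups.
        $router->pushMiddlewareToGroup('web', Sanitizer::class);
        $router->pushMiddlewareToGroup('api', Sanitizer::class);
    }
}
```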
## 🛡️ What It Sanitizes

It removes the following (a sketch of the stripping logic follows the list):

- `<script>`, `<iframe>`, `<style>`, `<svg>`, etc.
- `onerror=`, `onclick=`, `javascript:` URIs
- `data:text/html;base64,` patterns
- Dangerous SQL keywords: `select`, `update`, `drop`, `exec`, etc.
- Shell/OS commands like `cmd`, `powershell`, `shutdown`, etc.
- Characters like `&`, `|`, `;`, `<`, `>` that can trigger shell execution
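The following is a minimal sketch of that kind of pattern stripping, not the package's actual implementation; `sanitizeValue()` is a hypothetical helper name:

```php
// Illustrative only: strip the categories of patterns listed above.
function sanitizeValue(string $value): string
{
    $patterns = [
        '/<\/?(script|iframe|style|svg)[^>]*>/i',   // dangerous HTML tags
        '/on\w+\s*=/i',                             // inline handlers: onerror=, onclick=
        '/javascript\s*:/i',                        // javascript: URIs
        '/data:text\/html;base64,/i',               // base64-encoded HTML payloads
        '/\b(select|update|drop|exec)\b/i',         // SQL keywords
        '/\b(cmd|powershell|shutdown)\b/i',         // shell/OS commands
        '/[&|;<>]/',                                // shell metacharacters
    ];

    return preg_replace($patterns, '', $value);
}
```

Note that blanket keyword removal like this can mangle legitimate input (e.g. a sentence containing "select"), which is one reason sensitive fields such as passwords are skipped entirely.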
## 🧪 Usage

No additional setup is required if `global => true` in the config.

If not, register the middleware manually in your `app/Http/Kernel.php`:

```php
protected $middleware = [
    \Fixer112\Sanitizer\Middleware\Sanitizer::class,
];
```

Or add it only to certain routes:

```php
Route::middleware(['sanitizer'])->group(function () {
    // routes
});
```
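Since the package targets illuminate `^10.0|^11.0|^12.0`, note that Laravel 11+ ships without `app/Http/Kernel.php` by default. In that case you can append the middleware in `bootstrap/app.php` instead; a sketch, assuming the standard Laravel 11 bootstrap:

```php
// bootstrap/app.php (Laravel 11+)
use Fixer112\Sanitizer\Middleware\Sanitizer;
use Illuminate\Foundation\Application;
use Illuminate\Foundation\Configuration\Middleware;

return Application::configure(basePath: dirname(__DIR__))
    ->withMiddleware(function (Middleware $middleware) {
        // Append the sanitizer to the global middleware stack.
        $middleware->append(Sanitizer::class);
    })
    ->create();
```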
## 🧼 Skipped Fields

By default, these fields are not sanitized:

- `password`
- `confirm_password`

You can customize this list inside the package or fork it to suit your needs. A sketch of how the skipping might work follows.
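Illustrative only: a flat (non-recursive) pass over request input that leaves the skipped fields untouched. The field names come from the README; `sanitizeValue()` is the hypothetical helper sketched earlier.

```php
$skip = ['password', 'confirm_password'];

// Map over the input, sanitizing only string values outside the skip list.
$clean = collect($request->all())
    ->map(fn ($value, $key) => in_array($key, $skip, true) || !is_string($value)
        ? $value
        : sanitizeValue($value))
    ->all();

$request->merge($clean);
```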
## 🤖 Bot Protection

Rejects requests with suspicious or missing `User-Agent` headers, such as those containing (a sketch follows the list):

- `bot`
- `crawler`
- `spider`
- `curl`
- `httpclient`
- `scrapy`
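A minimal sketch of that check, assuming the middleware inspects the `User-Agent` substring by substring; this is illustrative, not the package's exact code:

```php
use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Str;

// Hypothetical handle() body illustrating the User-Agent check described above.
public function handle(Request $request, Closure $next)
{
    $ua = strtolower((string) $request->userAgent());
    $blocked = ['bot', 'crawler', 'spider', 'curl', 'httpclient', 'scrapy'];

    // Reject missing or suspicious User-Agent headers.
    if ($ua === '' || Str::contains($ua, $blocked)) {
        abort(403, 'Forbidden');
    }

    return $next($request);
}
```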