iprodev / sitemap-generator-pro
A professional PHP XML sitemap generator (library + CLI tool) with concurrency, robots.txt respect, and sitemap index support.
Installs: 0
Dependents: 0
Suggesters: 0
Security: 0
Stars: 84
Watchers: 10
Forks: 52
Open Issues: 7
pkg:composer/iprodev/sitemap-generator-pro
Requires
- php: >=8.0
- guzzlehttp/guzzle: ^7.8
- psr/log: ^1.1
Requires (Dev)
- phpunit/phpunit: ^9.6
- squizlabs/php_codesniffer: ^3.9
This package is not auto-updated.
Last update: 2025-10-09 12:13:48 UTC
README
A professional PHP sitemap generator by iprodev — it crawls concurrently, respects robots.txt, and supports gzip compression and sitemap index files.
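To give a sense of what "respects robots.txt" involves, here is a minimal, standalone sketch of prefix-based Disallow matching. This is illustrative only, not the library's RobotsTxt class; the function names are hypothetical, and real parsers also handle Allow rules, wildcards, and longest-match precedence.

```php
<?php
// Collect Disallow rules that apply to User-agent "*" from robots.txt text,
// then decide whether a path may be crawled via simple prefix matching.
function parseDisallowRules(string $robotsTxt): array
{
    $rules = [];
    $applies = false;
    foreach (preg_split('/\R/', $robotsTxt) as $line) {
        $line = trim(preg_replace('/#.*/', '', $line)); // strip comments
        if ($line === '') continue;
        [$field, $value] = array_map('trim', explode(':', $line, 2) + [1 => '']);
        $field = strtolower($field);
        if ($field === 'user-agent') {
            $applies = ($value === '*');
        } elseif ($applies && $field === 'disallow' && $value !== '') {
            $rules[] = $value;
        }
    }
    return $rules;
}

function isAllowed(array $disallowRules, string $path): bool
{
    foreach ($disallowRules as $prefix) {
        if (str_starts_with($path, $prefix)) return false;
    }
    return true;
}

$rules = parseDisallowRules("User-agent: *\nDisallow: /admin/\nDisallow: /tmp/\n");
var_dump(isAllowed($rules, '/blog/post')); // bool(true)
var_dump(isAllowed($rules, '/admin/x'));   // bool(false)
```

`str_starts_with` requires PHP 8.0, which matches the package's `php: >=8.0` constraint.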
Install
composer require iprodev/sitemap-generator-pro
CLI Usage
php bin/sitemap --url=https://www.iprodev.com --out=./sitemaps --concurrency=20 --max-pages=10000 --max-depth=5 --public-base=https://www.iprodev.com
Options
- --url (required): Start URL (crawling is restricted to the same host)
- --out (default: ./output): Output directory
- --concurrency (default: 10): Number of concurrent HTTP requests
- --max-pages (default: 50000): Crawl limit
- --max-depth (default: 5): Maximum link depth
- --public-base (optional): Public base URL used for sitemap file URLs in sitemap-index.xml
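To make the interaction of --max-pages and --max-depth concrete, here is a hedged sketch of a breadth-first crawl bounded by both limits. This is hypothetical standalone code, not the package's Crawler; the $linkExtractor callable stands in for the real HTTP fetch and HTML link parsing.

```php
<?php
// Breadth-first crawl bounded by a page budget ($maxPages) and a link
// depth ($maxDepth). Pages at the depth limit are recorded but not expanded.
function boundedCrawl(string $start, int $maxPages, int $maxDepth, callable $linkExtractor): array
{
    $visited = [];
    $queue = [[$start, 0]]; // pairs of [url, depth]
    while ($queue && count($visited) < $maxPages) {
        [$url, $depth] = array_shift($queue);
        if (isset($visited[$url])) continue;
        $visited[$url] = true;
        if ($depth >= $maxDepth) continue; // do not follow links past the depth limit
        foreach ($linkExtractor($url) as $next) {
            if (!isset($visited[$next])) {
                $queue[] = [$next, $depth + 1];
            }
        }
    }
    return array_keys($visited);
}

// Toy link graph standing in for a site.
$graph = [
    '/'    => ['/a', '/b'],
    '/a'   => ['/a/1'],
    '/b'   => [],
    '/a/1' => [],
];
$pages = boundedCrawl('/', 10, 1, fn($u) => $graph[$u] ?? []);
// With a depth limit of 1, only '/' is expanded: ['/', '/a', '/b']
```

The page budget caps total work on large sites, while the depth limit keeps the crawler from descending into deep pagination or calendar-style link traps.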
Programmatic Usage
use IProDev\Sitemap\Fetcher;
use IProDev\Sitemap\Crawler;
use IProDev\Sitemap\SitemapWriter;
use IProDev\Sitemap\RobotsTxt;

$fetcher = new Fetcher(['concurrency' => 10]);
$robots  = RobotsTxt::fromUrl('https://www.iprodev.com', $fetcher);
$crawler = new Crawler($fetcher, $robots);
$pages   = $crawler->crawl('https://www.iprodev.com', 10000, 5);

SitemapWriter::write($pages, __DIR__ . '/sitemaps', 50000, 'https://www.iprodev.com');
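The 50000 passed to write() above is the per-file limit from the sitemap protocol: a single sitemap file may contain at most 50,000 URLs, so larger crawls are split across files tied together by sitemap-index.xml. Here is a hedged sketch of that splitting logic using hypothetical standalone functions, not the package's SitemapWriter (a real writer would also emit <lastmod> entries, gzip the output, and write to disk):

```php
<?php
// Build a <urlset> document for one chunk of URLs.
function buildSitemapXml(array $urls): string
{
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($urls as $url) {
        $xml .= '  <url><loc>' . htmlspecialchars($url, ENT_XML1) . "</loc></url>\n";
    }
    return $xml . "</urlset>\n";
}

// Split $urls into files of at most $perFile entries and build the matching
// <sitemapindex>, whose <loc> entries are rooted at $publicBase.
function buildSitemapIndex(array $urls, int $perFile, string $publicBase): array
{
    $files = [];
    foreach (array_chunk($urls, $perFile) as $i => $chunk) {
        $files[sprintf('sitemap-%d.xml', $i + 1)] = buildSitemapXml($chunk);
    }
    $index = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $index .= "<sitemapindex xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach (array_keys($files) as $name) {
        $index .= '  <sitemap><loc>' . $publicBase . '/' . $name . "</loc></sitemap>\n";
    }
    $files['sitemap-index.xml'] = $index . "</sitemapindex>\n";
    return $files; // filename => XML content
}

$files = buildSitemapIndex(
    ['https://example.com/', 'https://example.com/a', 'https://example.com/b'],
    2,
    'https://example.com'
);
// Produces sitemap-1.xml (2 URLs), sitemap-2.xml (1 URL), and sitemap-index.xml
```

This is also why --public-base exists: the index file must reference the sitemap files by the URL where they will be served, not by their local output path.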
Tests & Lint
composer install
vendor/bin/phpunit
vendor/bin/phpcs --standard=PSR12 src/ tests/
Docker
docker build -t sitemap-generator-pro .
docker run --rm -v $(pwd)/sitemaps:/app/output sitemap-generator-pro --url=https://www.iprodev.com --out=/app/output --concurrency=20 --max-pages=10000 --public-base=https://www.iprodev.com
License
MIT