telescopeai / autodebug
AI-powered auto-debug & auto-fix system for Laravel. Monitors Telescope exceptions, analyzes them with AI, generates fixes, and creates GitHub PRs automatically.
dev-main
2026-03-12 19:34 UTC
Requires
- php: ^8.1
- illuminate/console: ^10.0|^11.0|^12.0
- illuminate/database: ^10.0|^11.0|^12.0
- illuminate/http: ^10.0|^11.0|^12.0
- illuminate/routing: ^10.0|^11.0|^12.0
- illuminate/support: ^10.0|^11.0|^12.0
Requires (Dev)
- orchestra/testbench: ^8.0|^9.0|^10.0
- phpunit/phpunit: ^10.0|^11.0
This package is auto-updated.
Last update: 2026-04-12 19:49:05 UTC
README
AI-powered auto-debug & auto-fix for Laravel. Monitors Telescope exceptions, analyzes them with multiple AI providers (OpenAI, Claude, Gemini, or local Ollama), generates code fixes, and creates GitHub PRs automatically.
Features
- Automatic Exception Detection – Polls Telescope for new exceptions
- Multi-AI Engine – Supports OpenAI, Anthropic (Claude), Google Gemini, and Ollama (local/free)
- Auto-Fix Generation – AI suggests code patches with search/replace
- GitHub PR Creation – Pushes fixes as PRs with detailed descriptions
- Terminal Diff Preview – View suggested changes directly in your console
- Web Dashboard – View exceptions, AI analysis, confidence scores, and PR status
- Notifications – Slack, email, and database logging
- Safety Guards – Protected paths, deduplication, rate limiting, confidence thresholds
Installation

1. Require the package

```bash
composer require telescopeai/autodebug
```

2. Run the install command

```bash
php artisan autodebug:install
```

3. Configure your .env
Option A: Local Ollama (Free & Private)
Best for internal development. No API keys required.
```env
AUTODEBUG_AI_PROVIDER=ollama
AUTODEBUG_OLLAMA_BASE_URL=http://localhost:11434
AUTODEBUG_OLLAMA_MODEL=deepseek-coder:6.7b
```
Option B: Google Gemini
Highly capable with generous free tiers.
```env
AUTODEBUG_AI_PROVIDER=google
AUTODEBUG_GOOGLE_API_KEY=your-gemini-key
AUTODEBUG_GOOGLE_MODEL=gemini-2.0-flash
```
Option C: OpenAI / Anthropic (Claude)
Professional-grade models.

```env
# For OpenAI
AUTODEBUG_AI_PROVIDER=openai
AUTODEBUG_OPENAI_API_KEY=sk-your-key-here

# For Anthropic
AUTODEBUG_AI_PROVIDER=anthropic
AUTODEBUG_ANTHROPIC_API_KEY=sk-ant-your-key-here
```
4. GitHub Configuration
Required only if you want automatic PR creation. The token must be allowed to create branches and pull requests (for classic tokens, the `repo` scope).
```env
AUTODEBUG_GITHUB_ENABLED=true
AUTODEBUG_GITHUB_TOKEN=ghp_your-github-token
AUTODEBUG_GITHUB_OWNER=your-org-or-username
AUTODEBUG_GITHUB_REPO=your-repo-name
```
Usage

CLI Commands

```bash
# Run analysis (dry run does not create PRs)
php artisan autodebug:analyze --dry-run

# See the file changes in the terminal
php artisan autodebug:analyze --dry-run --diff

# Force analysis even if recently analyzed
php artisan autodebug:analyze --force
```
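To run the analyzer continuously, you can register the command with Laravel's scheduler. A minimal sketch, assuming Laravel 11/12 (`routes/console.php`); on Laravel 10, the same call goes in `app/Console/Kernel.php`. The hourly cadence and the `--dry-run` flag are illustrative choices, not package defaults:

```php
<?php

// routes/console.php (Laravel 11/12)
use Illuminate\Support\Facades\Schedule;

// Run the analyzer hourly; keep --dry-run until you trust the
// generated fixes, then drop it to enable real PR creation.
Schedule::command('autodebug:analyze --dry-run')
    ->hourly()
    ->withoutOverlapping();
```

`withoutOverlapping()` prevents a slow AI analysis run from stacking up with the next scheduled invocation.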
Configuration

| Option | Default | Description |
|---|---|---|
| `ai.provider` | `openai` | `openai`, `anthropic`, `google`, or `ollama` |
| `analysis.min_confidence_for_pr` | `75` | Minimum AI confidence to create a PR |
| `analysis.max_calls_per_hour` | `10` | Rate limit for AI API calls |
| `analysis.dry_run` | `false` | Global dry-run mode |
| `safety.protected_paths` | `[...]` | Files the AI is never allowed to touch |
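Assuming `autodebug:install` publishes a config file (taken here to be `config/autodebug.php`; the actual published structure may differ), the options in the table would map to nested keys along these lines:

```php
<?php

// config/autodebug.php — a sketch based on the options table above;
// verify the keys against the file actually published by the package.
return [
    'ai' => [
        'provider' => env('AUTODEBUG_AI_PROVIDER', 'openai'),
    ],
    'analysis' => [
        'min_confidence_for_pr' => 75,    // skip PRs below this confidence
        'max_calls_per_hour'    => 10,    // AI API rate limit
        'dry_run'               => false, // global dry-run switch
    ],
    'safety' => [
        // Files the AI is never allowed to touch.
        'protected_paths' => [
            // e.g. '.env', 'config/',
        ],
    ],
];
```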
Prerequisites
- PHP 8.1+
- Laravel 10, 11, or 12
- Laravel Telescope
- GitHub Personal Access Token (for PRs)
License
MIT License. See LICENSE for details.