- Add multi-stage Dockerfile with Alpine Linux for minimal image size (sketched after this list)
- Include docker-compose.yml for simplified deployment with volume persistence
- Add .dockerignore to exclude unnecessary files from build context
- Update main.go to support DATABASE_PATH environment variable
- Update README to promote Docker as the recommended deployment method
- Use non-root user in container for improved security
- Final image size ~38MB with all dependencies
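A minimal sketch of what the multi-stage Dockerfile might look like; the Go and Alpine versions, user name, and paths are assumptions, not the committed file:

    # Build stage: compile the binary (CGO_ENABLED=0 assumes a pure-Go
    # SQLite driver; with mattn/go-sqlite3, keep CGO on and install gcc/musl-dev)
    FROM golang:1.22-alpine AS build
    WORKDIR /src
    COPY go.mod go.sum ./
    RUN go mod download
    COPY . .
    RUN CGO_ENABLED=0 go build -o /alpenqueue .

    # Runtime stage: minimal Alpine image, non-root user, persistent data dir
    FROM alpine:3.20
    RUN adduser -D app && mkdir -p /app/data && chown -R app /app
    COPY --from=build /alpenqueue /usr/local/bin/alpenqueue
    USER app
    ENV DATABASE_PATH=/app/data/alpenqueue.db
    VOLUME /app/data
    EXPOSE 8080
    ENTRYPOINT ["alpenqueue"]

The ENV line is how the DATABASE_PATH variable read by main.go gets pointed at the mounted volume.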
Docker deployment is now available via:

    docker-compose up -d

or:

    docker build -t alpenqueue .
    docker run -d -p 8080:8080 -v alpenqueue-data:/app/data alpenqueue
Add complete documentation covering features, a quick start guide, API reference, usage examples, architecture diagrams, and configuration options. Fix a typo in the title (lightweigt → lightweight).
Add frequency_minutes field to schedule recurring jobs. Jobs with frequency > 0 run repeatedly at specified intervals, automatically rescheduling after each execution. One-time jobs (frequency = 0) remain unchanged. Status transitions from pending to active for recurring jobs.
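A sketch of the rescheduling step, assuming a database/sql-backed jobs table; the run_at column and the helper's signature are illustrative, not the actual code:

    package main

    import (
        "database/sql"
        "time"
    )

    // reschedule pushes a recurring job's next run forward instead of
    // marking it done; one-time jobs finish exactly as before.
    func reschedule(db *sql.DB, jobID int64, frequencyMinutes int) error {
        if frequencyMinutes <= 0 {
            // One-time job: mark done, no further runs.
            _, err := db.Exec(`UPDATE jobs SET status = 'done' WHERE id = ?`, jobID)
            return err
        }
        // Recurring job: stay active and schedule the next execution.
        next := time.Now().Add(time.Duration(frequencyMinutes) * time.Minute)
        _, err := db.Exec(`UPDATE jobs SET status = 'active', run_at = ? WHERE id = ?`,
            next, jobID)
        return err
    }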
Store complete HTML response in raw_html column before extraction. Enables re-running selectors on historical scrapes when sites change their DOM structure or CSS classes.
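A sketch of the store-before-extract order; the scrapes table and its columns are assumed names:

    package main

    import "database/sql"

    // saveRawHTML persists the full response body before any selector
    // runs, so extraction can be re-run later against the stored DOM.
    func saveRawHTML(db *sql.DB, jobID int64, rawHTML string) error {
        _, err := db.Exec(
            `INSERT INTO scrapes (job_id, raw_html) VALUES (?, ?)`,
            jobID, rawHTML)
        return err
    }

Because extraction reads raw_html back instead of refetching, old scrapes can be re-processed with new selectors after a site changes its markup.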
Replace hardcoded title extraction with user-defined CSS selectors using goquery. Users specify a selector in the job JSON to extract arbitrary HTML elements. Worker extracts text content plus src/href attributes. Webhook payload includes extracted content and URL.
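A sketch of the extraction step with goquery (the library named above); the Extracted type and field names are illustrative:

    package main

    import (
        "strings"

        "github.com/PuerkitoBio/goquery"
    )

    // Extracted holds what the worker collects for one matched element.
    type Extracted struct {
        Text string `json:"text"`
        Src  string `json:"src,omitempty"`
        Href string `json:"href,omitempty"`
    }

    // extract runs a user-supplied CSS selector over raw HTML and returns
    // the text content plus src/href attributes of every match.
    func extract(rawHTML, selector string) ([]Extracted, error) {
        doc, err := goquery.NewDocumentFromReader(strings.NewReader(rawHTML))
        if err != nil {
            return nil, err
        }
        var out []Extracted
        doc.Find(selector).Each(func(_ int, s *goquery.Selection) {
            e := Extracted{Text: strings.TrimSpace(s.Text())}
            e.Src, _ = s.Attr("src")
            e.Href, _ = s.Attr("href")
            out = append(out, e)
        })
        return out, nil
    }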
Replace sleep with actual URL fetching. Worker scrapes HTML title from URLs, respects robots.txt, and includes proper User-Agent headers. Scraped titles stored in SQLite and sent via webhook callback.
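A sketch of the fetch-and-parse path using golang.org/x/net/html; the User-Agent string is a placeholder, and the robots.txt check that precedes the fetch is omitted here:

    package main

    import (
        "net/http"
        "strings"

        "golang.org/x/net/html"
    )

    // fetchTitle downloads a page with an explicit User-Agent header and
    // walks the parsed DOM for the first <title> element's text.
    func fetchTitle(url string) (string, error) {
        req, err := http.NewRequest(http.MethodGet, url, nil)
        if err != nil {
            return "", err
        }
        req.Header.Set("User-Agent", "alpenqueue/1.0") // placeholder UA
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            return "", err
        }
        defer resp.Body.Close()

        doc, err := html.Parse(resp.Body)
        if err != nil {
            return "", err
        }
        var title string
        var walk func(*html.Node)
        walk = func(n *html.Node) {
            if title != "" {
                return // keep only the first <title>
            }
            if n.Type == html.ElementNode && n.Data == "title" && n.FirstChild != nil {
                title = strings.TrimSpace(n.FirstChild.Data)
                return
            }
            for c := n.FirstChild; c != nil; c = c.NextSibling {
                walk(c)
            }
        }
        walk(doc)
        return title, nil
    }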
Add webhook_url column to jobs table. POST /jobs endpoint accepts JSON payload with optional webhook_url. After job completion, worker POSTs to webhook with status and duration.
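A sketch of the completion callback; the payload field names are assumptions about the JSON shape:

    package main

    import (
        "bytes"
        "encoding/json"
        "net/http"
        "time"
    )

    // notify POSTs status and duration to the job's webhook_url, if one
    // was supplied when the job was created.
    func notify(webhookURL string, jobID int64, status string, took time.Duration) error {
        if webhookURL == "" {
            return nil // job was created without a webhook
        }
        payload := map[string]any{
            "job_id":   jobID,
            "status":   status,
            "duration": took.String(),
        }
        body, err := json.Marshal(payload)
        if err != nil {
            return err
        }
        resp, err := http.Post(webhookURL, "application/json", bytes.NewReader(body))
        if err != nil {
            return err
        }
        return resp.Body.Close()
    }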
Add jobs table with ID, status, and created_at fields. POST /jobs endpoint creates pending jobs in SQLite. Worker polls every 5s for pending jobs, processes them with a 2s delay, and marks them as done.
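A sketch of the polling worker described above, assuming database/sql with a SQLite driver registered elsewhere; table and column names follow the commit message:

    package main

    import (
        "database/sql"
        "log"
        "time"
    )

    // pollLoop checks for pending jobs every 5s, simulates work with a
    // 2s delay, and marks each processed job as done.
    func pollLoop(db *sql.DB) {
        for range time.Tick(5 * time.Second) {
            rows, err := db.Query(`SELECT id FROM jobs WHERE status = 'pending'`)
            if err != nil {
                log.Println("poll:", err)
                continue
            }
            var ids []int64
            for rows.Next() {
                var id int64
                if err := rows.Scan(&id); err == nil {
                    ids = append(ids, id)
                }
            }
            rows.Close()
            for _, id := range ids {
                time.Sleep(2 * time.Second) // stand-in for real work
                if _, err := db.Exec(`UPDATE jobs SET status = 'done' WHERE id = ?`, id); err != nil {
                    log.Println("update:", err)
                }
            }
        }
    }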