Per review §1 — verified no callers before each deletion:
- _next_scrape_utc (context dict key never read by any template)
- ALERT_SCRAPE_INTERVAL_SECONDS settings import (only _next_scrape_utc read it)
- alert/paths.py (imported by nothing)
- alert/settings.py LANGUAGE (alert doesn't use translations.toml)
- alert/main.py: the vestigial `c = {}` connectivity dict, the comment
about re-enabling it, and the entire connectivity block in
_flat_payload — the web-side columns stay NULL on insert now
- alert/maps.py: DESTINATIONS, calculate_score, _get_next_weekday,
_calculate_transfers (only geocode is used in the scraper)
- alert/flat.py: connectivity + display_address properties,
_connectivity field, unused datetime import
- apply/utils.py str_to_preview (no callers) — file removed
- web/matching.py: max_morning_commute + commute check
- web/app.py: don't pass connectivity dict into flat_matches_filter,
don't write email_address through update_notifications
- web/db.py: get_error (no callers); drop kill_switch,
max_morning_commute, email_address from their allowed-sets so they're
not writable through update_* anymore
- web/settings.py + docker-compose.yml: SMTP_HOST/PORT/USERNAME/PASSWORD/
FROM/STARTTLS (notifications.py is telegram-only now)
The DB columns themselves (kill_switch, email_address, max_morning_commute,
connectivity_morning_time, connectivity_night_time) stay in the schema;
SQLite can't drop them cheaply, and they're harmless.
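Since the retired columns are simply never named in INSERT statements anymore, they default to NULL on every new row. A minimal sqlite3 sketch; the table name and the grouping of all five columns into one table are illustrative, not the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE flats (
        id INTEGER PRIMARY KEY,
        title TEXT,
        -- retired columns: kept in the schema, never written anymore
        kill_switch INTEGER,
        email_address TEXT,
        max_morning_commute INTEGER,
        connectivity_morning_time TEXT,
        connectivity_night_time TEXT
    )"""
)
# Inserts name only the live columns, so the retired ones stay NULL.
conn.execute("INSERT INTO flats (title) VALUES (?)", ("2-Zi. Altbau",))
row = conn.execute(
    "SELECT kill_switch, email_address, max_morning_commute FROM flats"
).fetchone()
print(row)  # (None, None, None)
```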
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
1. Admin → Geheimnisse (Secrets) sub-tab lets you edit ANTHROPIC_API_KEY +
BERLIN_WOHNEN_USERNAME/PASSWORD at runtime. Migration v7 adds a
secrets(key,value,updated_at) table; startup seeds missing keys from
env (idempotent). web reads secrets DB-first (env fallback) via
llm._api_key(); alert fetches them from web /internal/secrets on each
scan and passes them into Scraper(). Rotating creds no longer needs a
redeploy.
Masked display: 6 leading + 4 trailing chars, "…" in the middle.
Blank form fields leave the stored value untouched.
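A sketch of the two form behaviors; the helper names (mask_secret, merge_form) and the fully-masked fallback for short values are assumptions, only the 6+4 masking rule and the blank-leaves-stored rule come from the change itself:

```python
def mask_secret(value: str, lead: int = 6, tail: int = 4) -> str:
    """Display mask: first 6 and last 4 chars, '…' in the middle."""
    # Assumption: values too short to hide anything are masked fully.
    if len(value) <= lead + tail:
        return "…"
    return f"{value[:lead]}…{value[-tail:]}"

def merge_form(stored: dict, submitted: dict) -> dict:
    """Blank form fields leave the stored value untouched."""
    return {key: (submitted.get(key) or old) for key, old in stored.items()}
```

Posting an empty field is then a no-op for that key, so the masked value in the form can't accidentally overwrite the real secret.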
2. Drop the max_morning_commute filter from UI + server + FILTER_KEYS +
filter summary (the underlying Maps.calculate_score code stays in case
it's re-enabled later).
3. /static/didi.webp wired as favicon via <link rel="icon"> in base.html.
4. apply.open_page wraps page.goto in try/except so a failed load still
produces a "goto.failed" step + screenshot instead of returning an
empty forensics blob. networkidle + post-submission sleep are also
made best-effort. The error ZIP export already writes screenshot+HTML
per step and final_html — with this change every apply run leaves a
reconstructable trail even when the listing is already offline.
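The recovery pattern can be sketched without Playwright; `best_effort_step` and the step-dict shape below are hypothetical stand-ins for what open_page does around `page.goto`:

```python
from typing import Any, Callable, Optional

def best_effort_step(
    steps: list,
    name: str,
    action: Callable[[], Any],
    screenshot: Optional[Callable[[], bytes]] = None,
) -> Any:
    """Run one apply step; on failure append a '<name>.failed' record
    (with a screenshot if one can be taken) instead of aborting the run."""
    try:
        result = action()
        steps.append({"step": name, "ok": True})
        return result
    except Exception as exc:  # offline listing, timeout, DNS failure, ...
        steps.append({
            "step": f"{name}.failed",
            "ok": False,
            "error": repr(exc),
            "screenshot": screenshot() if screenshot else None,
        })
        return None

# Usage sketch: even a dead listing leaves a forensics trail.
steps: list = []

def _fail():
    raise TimeoutError("page load timed out")

best_effort_step(steps, "goto", _fail, screenshot=lambda: b"\x89PNG")
```

The same wrapper shape covers the networkidle wait and the post-submission sleep: each becomes one recorded step that can fail without taking the run down.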
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Three isolated services (alert scraper, apply HTTP worker, web UI+DB)
with argon2 auth, signed cookies, CSRF protection, rate-limited login,
kill switch, apply circuit breaker, audit log, and strict CSP.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>