Per review §2:
- web/db.py: new _tx() context manager wraps multi-statement writers in
BEGIN IMMEDIATE … COMMIT/ROLLBACK (our connections run in autocommit
mode, so plain `with _lock:` doesn't give atomicity). partnership_accept
(UPDATE + DELETE) and cleanup_retention (3 deletes/updates) now use it
(see the _tx sketch after this list).
- Fire-and-forget tasks: add module-level _bg_tasks sets in web/app.py and
web/enrichment.py. A _spawn() helper holds a strong ref until the task
finishes so the GC can't drop it mid-flight (CPython's event loop only
weakly references pending tasks). See the _spawn sketch after this list.
- apply/main.py: require_api_key uses hmac.compare_digest, matching web's
check. Also imports now use explicit names instead of `from settings import *`.
- apply/language.py: replace `from settings import *` + `from paths import *`
with explicit imports — this is the pattern that caused the LANGUAGE
NameError earlier.
- alert/utils.py: pickle-based hash_any_object → deterministic JSON+sha256.
Cheaper, portable across Python versions, no pickle attack surface
(see the hashing sketch after this list).
- web/notifications.py: /fehler links repointed to /bewerbungen (the
/fehler page no longer exists).
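A minimal sketch of what the _tx() helper can look like, assuming the
sqlite3 connections are opened with isolation_level=None (autocommit);
the real web/db.py version may differ:

```python
from contextlib import contextmanager

@contextmanager
def _tx(conn):
    # Explicit transaction around multi-statement writes; BEGIN IMMEDIATE
    # takes the write lock up front so concurrent writers serialise here.
    conn.execute("BEGIN IMMEDIATE")
    try:
        yield conn
        conn.execute("COMMIT")
    except BaseException:
        conn.execute("ROLLBACK")
        raise
```

Callers would combine it with the existing lock, e.g.
`with _lock, _tx(conn):` around the UPDATE + DELETE pair in
partnership_accept.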
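A sketch of the _spawn() pattern (the standard strong-reference idiom
from the asyncio docs); exact naming in web/app.py / web/enrichment.py
may differ:

```python
import asyncio

_bg_tasks: set[asyncio.Task] = set()

def _spawn(coro) -> asyncio.Task:
    # Keep a strong reference until the task finishes; the event loop
    # itself holds only weak references to pending tasks.
    task = asyncio.get_running_loop().create_task(coro)
    _bg_tasks.add(task)
    task.add_done_callback(_bg_tasks.discard)
    return task
```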
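A sketch of the deterministic JSON+sha256 replacement for
hash_any_object (the default=str handling of non-JSON types is an
assumption):

```python
import hashlib
import json

def hash_any_object(obj) -> str:
    # sort_keys + compact separators give a canonical serialisation,
    # so equal objects hash equally across runs and Python versions.
    payload = json.dumps(obj, sort_keys=True, separators=(",", ":"),
                         default=str, ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```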
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Per review §1 — verified no callers before each deletion:
- _next_scrape_utc (context dict key never read by any template)
- ALERT_SCRAPE_INTERVAL_SECONDS settings import (only _next_scrape_utc read it)
- alert/paths.py (imported by nothing)
- alert/settings.py LANGUAGE (alert doesn't use translations.toml)
- alert/main.py: the vestigial `c = {}` connectivity dict, the comment
about re-enabling it, and the entire connectivity block in
_flat_payload — the web-side columns stay NULL on insert now
- alert/maps.py: DESTINATIONS, calculate_score, _get_next_weekday,
_calculate_transfers (only geocode is used in the scraper)
- alert/flat.py: connectivity + display_address properties,
_connectivity field, unused datetime import
- apply/utils.py str_to_preview (no callers) — file removed
- web/matching.py: max_morning_commute + commute check
- web/app.py: don't pass connectivity dict into flat_matches_filter,
don't write email_address through update_notifications
- web/db.py: get_error (no callers); drop kill_switch,
max_morning_commute, email_address from their allowed-sets so they're
not writable through update_* anymore
- web/settings.py + docker-compose.yml: SMTP_HOST/PORT/USERNAME/PASSWORD/
FROM/STARTTLS (notifications.py is Telegram-only now)
DB columns themselves (kill_switch, email_address, max_morning_commute,
connectivity_morning_time, connectivity_night_time) stay in the schema
— SQLite can't drop them cheaply and they're harmless.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
1. Admin → Geheimnisse sub-tab lets you edit ANTHROPIC_API_KEY +
BERLIN_WOHNEN_USERNAME/PASSWORD at runtime. Migration v7 adds a
secrets(key,value,updated_at) table; startup seeds missing keys from
env (idempotent). web reads secrets DB-first (env fallback) via
llm._api_key(); alert fetches them from web /internal/secrets on each
scan and passes them into Scraper(). Rotating creds no longer needs a
redeploy (see the secrets sketch after this list).
Masked display: 6 leading + 4 trailing chars, "…" in the middle.
Blank form fields leave the stored value untouched.
2. Drop the max_morning_commute filter from UI + server + FILTER_KEYS +
filter summary (the underlying Maps.calculate_score code stays for
potential future re-enable).
3. /static/didi.webp wired as favicon via <link rel="icon"> in base.html.
4. apply.open_page wraps page.goto in try/except so a failed load still
produces a "goto.failed" step + screenshot instead of returning an
empty forensics blob. networkidle + post-submission sleep are also
made best-effort. The error ZIP export already writes screenshot+HTML
per step and final_html — with this change every apply run leaves a
reconstructable trail even when the listing is already offline (see the
navigation sketch after this list).
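Secrets sketch for item 1: DB-first lookup with env fallback plus the
masked display rule; the helper names (get_secret, mask_secret) are
illustrative, not the actual ones:

```python
import os
import sqlite3

def get_secret(conn: sqlite3.Connection, key: str) -> str | None:
    # DB value wins; fall back to the environment if the row is missing/empty.
    row = conn.execute("SELECT value FROM secrets WHERE key = ?", (key,)).fetchone()
    if row and row[0]:
        return row[0]
    return os.getenv(key)

def mask_secret(value: str) -> str:
    # 6 leading + 4 trailing characters, "…" in the middle.
    if len(value) <= 10:
        return "…"
    return f"{value[:6]}…{value[-4:]}"
```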
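Navigation sketch for item 4: best-effort page.goto with a snapshot
either way; the recorder/step names and wait_until value are assumptions
based on the existing forensics recorder:

```python
from playwright.sync_api import Error as PlaywrightError, Page

def open_page(page: Page, url: str, recorder) -> bool:
    try:
        page.goto(url, wait_until="domcontentloaded")
        recorder.step_snap(page, "goto")
        return True
    except PlaywrightError:
        # Offline/failed listing: still record a step + screenshot so the
        # forensics blob is never empty.
        recorder.step_snap(page, "goto.failed")
        return False
```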
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
apply/language.py references LANGUAGE via `from settings import *`, but
the constant was never added to apply/settings.py. Python only errored at
the first runtime use — which is `str(ApplicationResult)` in the /apply
handler. Any outcome that didn't short-circuit before the final `return
ApplyResponse(message=str(result), …)` blew up with NameError → 500.
Add LANGUAGE = getenv("LANGUAGE", "de") so existing translations.toml
keys resolve. Reproduced live inside the apply container: NameError:
name 'LANGUAGE' is not defined at language.py:15.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- App is now called "wohnungsdidi" everywhere user-facing (page title,
nav brand, login header, notification subjects, report filename,
FastAPI titles, log messages)
- Brand dot replaced with an image of Didi (web/static/didi.webp),
rendered as a round 2.25rem avatar in _layout + login
- "Programmiert für Annika ♥" footer now shows for every logged-in user,
not only Annika
- Count-up shows only seconds ("vor 73 s") regardless of age — no
rollover to minutes/hours
- Data continuity: DB file stays /data/lazyflat.sqlite and the Docker
volume stays lazyflat_data so the rename doesn't strand existing data
- Session cookie renamed to wohnungsdidi_session (one-time logout on
rollout)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
apply service
- POST /internal/fetch-listing: headless Playwright fetch of a listing URL,
returns {html, image_urls[], final_url}. Uses the same browser
fingerprint/profile as the apply run so bot guards don't kick in
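A rough sketch of the fetch itself (ignoring the shared
fingerprint/profile wiring that the real endpoint reuses from the apply
run):

```python
from playwright.sync_api import sync_playwright

def fetch_listing(url: str) -> dict:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="domcontentloaded")
        image_urls = page.eval_on_selector_all(
            "img[src]", "els => els.map(e => e.src)")
        result = {
            "html": page.content(),
            "image_urls": image_urls,
            "final_url": page.url,  # after any redirects
        }
        browser.close()
        return result
```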
web service
- New enrichment pipeline (web/enrichment.py):
/internal/flats → upsert → kick() enrichment in a background thread
1. POST /internal/fetch-listing on apply
2. llm.extract_flat_details(html, url) — Haiku tool-use call returns
structured JSON (address, rooms, rent, description, pros/cons, etc.)
3. Download each image directly to /data/flats/<slug>/NN.<ext>
4. Persist enrichment_json + image_count + enrichment_status on the flat
- llm.py: minimal Anthropic /v1/messages wrapper, no SDK (see the sketch
after this list)
- DB migration v5 adds enrichment_json/_status/_updated_at + image_count
- Admin "Altbestand anreichern" button (POST /actions/enrich-all) queues
backfill for all pending/failed rows; runs in a detached task
- GET /partials/wohnung/<id> renders _wohnung_detail.html
- GET /flat-images/<slug>/<n> serves the downloaded image
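Sketch of the SDK-free /v1/messages call behind
llm.extract_flat_details; the tool schema and prompt are illustrative,
only the endpoint, headers and tool_use response shape follow the
Anthropic Messages API:

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def extract_flat_details(html: str, url: str, api_key: str, model: str) -> dict:
    body = {
        "model": model,
        "max_tokens": 1024,
        "tools": [{
            "name": "record_flat",
            "description": "Structured details of one rental listing",
            "input_schema": {
                "type": "object",
                "properties": {
                    "address": {"type": "string"},
                    "rooms": {"type": "number"},
                    "rent_eur": {"type": "number"},
                },
            },
        }],
        # Force the model to answer via the tool, i.e. as structured JSON.
        "tool_choice": {"type": "tool", "name": "record_flat"},
        "messages": [{
            "role": "user",
            "content": f"Extract the listing at {url} from this HTML:\n{html}",
        }],
    }
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    # The forced tool call comes back as a tool_use content block.
    block = next(b for b in data["content"] if b["type"] == "tool_use")
    return block["input"]
```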
UI
- Chevron on each list row toggles an inline detail pane (HTMX fetch on
first open, hx-preserve keeps it open across the 3–30 s polls)
- CSS .flat-gallery normalises image tiles to a 4/3 aspect with object-fit:
cover so different source sizes align cleanly
- "analysiert…" / "?" chips on the list reflect enrichment_status
Config
- ANTHROPIC_API_KEY + ANTHROPIC_MODEL wired into docker-compose's web
service (default model: claude-haiku-4-5-20251001)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* apply: Recorder.step_snap(page, name) captures both a JPEG screenshot and
the page HTML for every major moment; every provider now calls step_snap at
each logical step so failure reports contain the exact DOM and rendered
state at every stage of the flow (see the step_snap sketch after this list)
* ZIP report: each snapshot becomes snapshots/NN_<label>.jpg +
snapshots/NN_<label>.html for AI-assisted debugging
* web: the Wohnungen list now only shows flats that match the user's own
filters; the match chip is removed (the list is now implicitly matching)
* UI fully in German: Protokoll instead of Logs, Administrator instead of
admin, Trockenmodus instead of dry-run, Automatik pausiert instead of
circuit open, Alarm instead of Alert, Abmelden instead of Logout
* Wohnungen header: row 1 carries info (Alarm + filters), row 2 carries
the switches as real An/Aus radio pairs for Automatisch bewerben and
Trockenmodus; hx-confirm on the critical radios; per-form CSS for a
visible checked state
* Protokoll: from/to date filter (Berlin time) + CSV download
(/logs/export.csv) with both UTC and local time
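A sketch of the Recorder.step_snap mechanics (attribute names and the
JPEG quality are assumptions):

```python
from pathlib import Path

class Recorder:
    def __init__(self, out_dir: str):
        self.out_dir = Path(out_dir)
        self.out_dir.mkdir(parents=True, exist_ok=True)
        self.steps: list[str] = []

    def step_snap(self, page, name: str) -> None:
        # One JPEG + one HTML dump per logical step, numbered so the ZIP
        # report lines up as snapshots/NN_<label>.jpg / .html.
        idx = len(self.steps)
        jpg = self.out_dir / f"{idx:02d}_{name}.jpg"
        html = self.out_dir / f"{idx:02d}_{name}.html"
        page.screenshot(path=str(jpg), type="jpeg", quality=60)
        html.write_text(page.content(), encoding="utf-8")
        self.steps.append(name)
```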
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
* DB: users + user_profiles/filters/notifications/preferences; applications gets
user_id + forensics_json + profile_snapshot_json; new errors table
with 14d retention; schema versioning via MIGRATIONS list (see the
sketch after this list)
* auth: password hashes in DB (argon2); env vars seed first admin; per-user
sessions; CSRF bound to user id
* apply: personal info/WBS moved out of env into the request body; providers
take an ApplyContext with Profile + submit_forms; full Playwright recorder
(step log, console, page errors, network, screenshots, final HTML)
* web: five top-level tabs (Wohnungen/Bewerbungen/Logs/Fehler/Einstellungen);
settings sub-tabs profil/filter/benachrichtigungen/account/benutzer;
per-user matching, auto-apply and notifications (UI/Telegram/SMTP); red
auto-apply switch on Wohnungen tab; forensics detail view for bewerbungen
and fehler; retention background thread
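Sketch of the MIGRATIONS-list versioning; the PRAGMA user_version
bookkeeping is an assumption and the statements shown are placeholders:

```python
import sqlite3

# Each entry moves the schema from version i to i + 1.
MIGRATIONS = [
    "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT)",
    "ALTER TABLE applications ADD COLUMN user_id INTEGER",
    # ... one entry per schema version
]

def migrate(conn: sqlite3.Connection) -> None:
    # Apply only the migrations newer than the stored schema version.
    (version,) = conn.execute("PRAGMA user_version").fetchone()
    for i, statement in enumerate(MIGRATIONS[version:], start=version):
        conn.executescript(statement)
        conn.execute(f"PRAGMA user_version = {i + 1}")
        conn.commit()
```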
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Three isolated services (alert scraper, apply HTTP worker, web UI+DB)
with argon2 auth, signed cookies, CSRF, rate-limited login, kill switch,
apply circuit breaker, audit log, and strict CSP.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>