1. Admin → Geheimnisse sub-tab lets you edit ANTHROPIC_API_KEY +
BERLIN_WOHNEN_USERNAME/PASSWORD at runtime. Migration v7 adds a
secrets(key,value,updated_at) table; startup seeds missing keys from
env (idempotent). web reads secrets DB-first (env fallback) via
llm._api_key(); alert fetches them from web's /internal/secrets on each
scan and passes them into Scraper(). Rotating credentials no longer
requires a redeploy.
Masked display: 6 leading + 4 trailing chars, "…" in the middle.
Blank form fields leave the stored value untouched.
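The masking rule and the DB-first read can be sketched as follows. mask_secret/get_secret are illustrative names, not the project's; the full-mask behavior for short values is an assumption on top of the stated 6+4 rule.

```python
import os

def mask_secret(value: str) -> str:
    """Masked display: 6 leading + 4 trailing chars, '…' in the middle.
    Values too short to mask safely collapse to '…' entirely (assumed)."""
    if len(value) <= 10:
        return "…"
    return value[:6] + "…" + value[-4:]

def get_secret(conn, key: str) -> "str | None":
    """DB-first read with env fallback, mirroring llm._api_key()."""
    row = conn.execute(
        "SELECT value FROM secrets WHERE key = ?", (key,)
    ).fetchone()
    if row and row[0]:
        return row[0]
    return os.environ.get(key)
```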
2. Drop the max_morning_commute filter from UI + server + FILTER_KEYS +
   filter summary (the underlying Maps.calculate_score code stays in case
   the filter is re-enabled later).
3. /static/didi.webp wired as favicon via <link rel="icon"> in base.html.
4. apply.open_page wraps page.goto in try/except so a failed load still
produces a "goto.failed" step + screenshot instead of returning an
empty forensics blob. networkidle + post-submission sleep are also
made best-effort. The error ZIP export already writes screenshot+HTML
per step and final_html — with this change every apply run leaves a
reconstructable trail even when the listing is already offline.
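A minimal sketch of the best-effort navigation, assuming a forensics step list of dicts; the project's actual step-recording API and step names beyond "goto.failed" may differ.

```python
def open_page(page, url: str, steps: list) -> bool:
    """Wrap page.goto so a failed load still produces a 'goto.failed'
    step + screenshot instead of an empty forensics blob."""
    try:
        page.goto(url, wait_until="networkidle", timeout=30_000)
        return True
    except Exception as exc:
        steps.append({"name": "goto.failed", "error": str(exc)})
        try:
            steps[-1]["screenshot"] = page.screenshot()
        except Exception:
            pass  # the screenshot itself is best-effort too
        return False
```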
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Two issues surfaced on HOWOGE and similar sites:
1. Tiny icons/1x1 tracking pixels leaked through (e.g. image #5, 1.8 KB).
Added MIN_IMAGE_BYTES = 15_000 and MIN_IMAGE_DIMENSION = 400 px on the
short side; files below either threshold are dropped before saving.
Pillow already gives us the dims as part of the phash pass, so the
check is free.
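The two thresholds reduce to a pure check; in the real code the width/height come from Pillow during the phash pass, so here they are plain parameters (keep_image is a hypothetical name).

```python
MIN_IMAGE_BYTES = 15_000
MIN_IMAGE_DIMENSION = 400  # px on the short side

def keep_image(n_bytes: int, width: int, height: int) -> bool:
    """Drop tiny icons / 1x1 tracking pixels before saving:
    a file must clear both the byte and the short-side threshold."""
    if n_bytes < MIN_IMAGE_BYTES:
        return False
    return min(width, height) >= MIN_IMAGE_DIMENSION
```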
2. Listings whose image URLs are opaque CDN hashes
(.../fileadmin/_processed_/2/3/xcsm_<hash>.webp.pagespeed.ic.<hash>.webp)
caused the LLM URL picker to reject every candidate, yielding 0 images
for legit flats. Fixes: (a) prompt now explicitly instructs Haiku to
keep same-host /fileadmin/_processed_/ style URLs even when the filename
is illegible, (b) if the model still returns an empty set we fall back
to the unfiltered Playwright candidates, trusting the pre-filter instead
of erasing the gallery.
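Fallback (b) amounts to the following; pick_photo_urls and llm_pick are illustrative stand-ins for the Haiku call, which may raise or return an empty set.

```python
def pick_photo_urls(candidates: list, llm_pick) -> list:
    """Trust the LLM's selection when it returns anything, but never let
    an empty or failed answer erase the gallery — fall back to the
    unfiltered Playwright candidates and rely on the pre-filter."""
    try:
        picked = llm_pick(candidates)
    except Exception:
        picked = []
    return picked if picked else candidates
```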
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Per user request, the LLM is no longer asked to extract rooms/size/rent/WBS —
those come from the inberlinwohnen.de scraper, which is reliable. Haiku is now
used for one narrow job: pick which <img> URLs from the listing page are
actual flat photos (vs. logos, badges, ads, employee portraits). On any LLM
failure the unfiltered candidate list passes through.
Image dedup runs in two tiers:
1. SHA256 of bytes — drops different URLs that point to byte-identical files
2. Perceptual hash (Pillow + imagehash, Hamming distance ≤ 5) — drops the
"same image at a different resolution" duplicates from srcset / CDN
variants that were filling galleries with 2–4× copies
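Both tiers can be sketched over (bytes, phash) pairs. In the real code the perceptual hash comes from imagehash/Pillow; here it is treated as a plain int so the Hamming-distance check stays explicit and stdlib-only.

```python
import hashlib

PHASH_MAX_DISTANCE = 5  # Hamming distance threshold from the commit

def dedup_images(images: list) -> list:
    """Two-tier dedup: SHA256 drops byte-identical files behind different
    URLs; phash Hamming distance <= 5 drops resized/recompressed CDN
    variants of an already-kept image."""
    kept, seen_sha = [], set()
    for data, phash in images:
        digest = hashlib.sha256(data).hexdigest()
        if digest in seen_sha:          # tier 1: exact bytes
            continue
        seen_sha.add(digest)
        if any(bin(phash ^ k).count("1") <= PHASH_MAX_DISTANCE
               for _, k in kept):       # tier 2: perceptual near-dup
            continue
        kept.append((data, phash))
    return kept
```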
UI:
- Wohnungsliste falls back to scraper-only display (rooms/size/rent/wbs)
- Detail panel only shows images + "Zur Original-Anzeige →"; description /
features / pros & cons / kv table are gone
- Per-row "erneut versuchen" link + the "analysiert…/?" status chips were
tied to LLM extraction and are removed; the header "Bilder nachladen (N)"
button still surfaces pending/failed batches for admins
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
apply service
- POST /internal/fetch-listing: headless Playwright fetch of a listing URL,
returns {html, image_urls[], final_url}. Uses the same browser
fingerprint/profile as the apply run so bot guards don't kick in
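Roughly, the endpoint body looks like this — a sketch against Playwright's sync Page API, with the shared fingerprint/profile assumed to live in the browser context set up by the apply run; function and selector details are illustrative.

```python
def fetch_listing(page, url: str) -> dict:
    """Headless fetch of a listing URL, returning the
    {html, image_urls[], final_url} shape described above."""
    page.goto(url, wait_until="networkidle")
    html = page.content()
    image_urls = page.eval_on_selector_all(
        "img[src]", "els => els.map(e => e.src)"
    )
    return {"html": html, "image_urls": image_urls, "final_url": page.url}
```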
web service
- New enrichment pipeline (web/enrichment.py):
/internal/flats → upsert → kick() enrichment in a background thread
  1. POST /internal/fetch-listing on the apply service
2. llm.extract_flat_details(html, url) — Haiku tool-use call returns
structured JSON (address, rooms, rent, description, pros/cons, etc.)
3. Download each image directly to /data/flats/<slug>/NN.<ext>
4. Persist enrichment_json + image_count + enrichment_status on the flat
- llm.py: minimal Anthropic /v1/messages wrapper, no SDK
- DB migration v5 adds enrichment_json/_status/_updated_at + image_count
- Admin "Altbestand anreichern" button (POST /actions/enrich-all) queues
backfill for all pending/failed rows; runs in a detached task
- GET /partials/wohnung/<id> renders _wohnung_detail.html
- GET /flat-images/<slug>/<n> serves the downloaded image
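The no-SDK wrapper in llm.py boils down to a plain HTTP POST; a stdlib-only sketch (urllib here regardless of what the project actually uses, headers per Anthropic's documented Messages API, function names illustrative):

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str, model: str, prompt: str,
                  max_tokens: int = 1024) -> urllib.request.Request:
    """Assemble the raw /v1/messages request: JSON body plus the
    x-api-key and anthropic-version headers the API requires."""
    body = json.dumps({
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(API_URL, data=body, headers={
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    })

def messages(api_key: str, model: str, prompt: str) -> dict:
    """Send the request and decode the JSON response."""
    req = build_request(api_key, model, prompt)
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())
```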
UI
- Chevron on each list row toggles an inline detail pane (HTMX fetch on
first open, hx-preserve keeps it open across the 3–30 s polls)
- CSS .flat-gallery normalises image tiles to a 4/3 aspect with object-fit:
cover so different source sizes align cleanly
- "analysiert…" / "?" chips on the list reflect enrichment_status
Config
- ANTHROPIC_API_KEY + ANTHROPIC_MODEL wired into docker-compose's web
service (default model: claude-haiku-4-5-20251001)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>