{"id":42,"date":"2026-05-05T15:27:36","date_gmt":"2026-05-05T13:27:36","guid":{"rendered":"https:\/\/stefan.boeck.name\/en\/?p=42"},"modified":"2026-05-05T15:27:36","modified_gmt":"2026-05-05T13:27:36","slug":"saklam-keeping-client-data-out-of-chatgpt","status":"publish","type":"post","link":"https:\/\/stefan.boeck.name\/en\/2026\/05\/05\/saklam-keeping-client-data-out-of-chatgpt\/","title":{"rendered":"Saklam: Keeping Client Data Out of ChatGPT"},"content":{"rendered":"<p><em>\ud83c\udde9\ud83c\uddea <a href=\"\/de\/saklam-mandantendaten-aus-chatgpt-raushalten\/\">Auf Deutsch lesen<\/a><\/em><\/p>\n<p>A few months ago, a lawyer told me he drafts pleadings with ChatGPT \u2013 including client names, case numbers, diagnoses. It hit me: that&#8217;s a \u00a7203 problem in plain sight.<\/p>\n<p>In Germany, anyone bound by professional confidentiality \u2013 lawyers, doctors, tax advisors, notaries \u2013 who sends client data to ChatGPT\/Claude\/Gemini violates \u00a7203 of the Criminal Code. Cloudflare&#8217;s AI Gateway just blocks such requests. But &#8220;blocking&#8221; isn&#8217;t a solution for someone who needs the AI. The right answer is &#8220;YES, but safely&#8221;.<\/p>\n<p>The only place this is cleanly solvable is the browser. Before the data leaves the machine.<\/p>\n<p><strong>Saklam<\/strong> detects personal data in the browser and replaces it with tokens \u2013 <code>[NAME_1]<\/code>, <code>[CASE_2]<\/code>, <code>[ADDRESS_1]<\/code>. The LLM only ever sees tokens. The response comes back and is reassembled with the real values in the browser. No clear data hits the server.<\/p>\n<p>The stack:<\/p>\n<ul>\n<li>GLiNER PII models, running locally in the browser (~200 MB cache)<\/li>\n<li>ONNX Runtime Web for inference<\/li>\n<li>LiteLLM as proxy (auth, routing, audit log)<\/li>\n<li>Provider-agnostic \u2013 OpenAI, Anthropic, Google<\/li>\n<\/ul>\n<p>Available as web chat, desktop app, JavaScript SDK, and Docker for on-premise. 
The privacy level matches the sensitivity: web chat for daily work, desktop for confidential matters, Docker for firms with their own servers.<\/p>\n<p>\u2192 <a href=\"https:\/\/saklam.com\">saklam.com<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Why GDPR-compliant AI for professionals bound by confidentiality has to start in the browser \u2014 and how Saklam solves it with GLiNER PII models.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[10],"tags":[],"class_list":["post-42","post","type-post","status-publish","format-standard","hentry","category-saklam"],"_links":{"self":[{"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/posts\/42","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/comments?post=42"}],"version-history":[{"count":1,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/posts\/42\/revisions"}],"predecessor-version":[{"id":44,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/posts\/42\/revisions\/44"}],"wp:attachment":[{"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/media?parent=42"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/categories?post=42"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/stefan.boeck.name\/en\/wp-json\/wp\/v2\/tags?post=42"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
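The detect–tokenize–reassemble roundtrip described above can be sketched in a few lines. This is a minimal illustration, not Saklam&#8217;s actual SDK: the function names are hypothetical, and the entity list stands in for what a GLiNER model would detect.

```javascript
// Replace detected PII spans with numbered tokens, remembering the mapping.
// `entities` stands in for GLiNER output: [{ value, type }, ...].
function redact(text, entities) {
  const mapping = {};   // token -> original value
  const counters = {};  // entity type -> running index
  let redacted = text;
  for (const { value, type } of entities) {
    counters[type] = (counters[type] || 0) + 1;
    const token = `[${type}_${counters[type]}]`;
    mapping[token] = value;
    redacted = redacted.split(value).join(token);
  }
  return { redacted, mapping };
}

// Put the real values back into the LLM's response, in the browser.
function restore(text, mapping) {
  let restored = text;
  for (const [token, value] of Object.entries(mapping)) {
    restored = restored.split(token).join(value);
  }
  return restored;
}

// Example: the LLM only ever sees tokens.
const entities = [
  { value: "Max Mustermann", type: "NAME" },
  { value: "4 C 123/26", type: "CASE" },
];
const { redacted, mapping } = redact(
  "Draft a reply for Max Mustermann in case 4 C 123/26.",
  entities
);
// redacted === "Draft a reply for [NAME_1] in case [CASE_1]."

const llmReply = "Dear [NAME_1], regarding case [CASE_1] ...";
const clear = restore(llmReply, mapping);
// clear === "Dear Max Mustermann, regarding case 4 C 123/26 ..."
```

A real implementation has to handle overlapping spans, repeated mentions of the same entity, and tokens the model mangles in its reply – but the data-flow guarantee is exactly this: only `redacted` leaves the machine.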