LLM assistant

Ask plain-English questions about this database, ICAC-style investigations, definitions (e.g. what CSAM means), or related topics. Answers are generated on the server with Groq; when your question needs numbers or filters from the corpus, the model runs read-only SQL against the live cases table. For browser-only API experiments, use the Query page.

What this page does

Sensitive data. Your question and the model's reply may be sent from this server to external LLM APIs. Do not paste secrets or privileged content. Operators can disable the feature by setting CASLINKER_DISABLE_LLM_CHAT=1.
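
A minimal sketch of how that kill-switch might be checked server-side (the helper name is hypothetical; only the variable name comes from this page):

import os

def llm_chat_enabled() -> bool:
    # Operators set CASLINKER_DISABLE_LLM_CHAT=1 to turn chat off.
    return os.environ.get("CASLINKER_DISABLE_LLM_CHAT") != "1"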
Inference: Groq (Gemini fallback if configured)
Database: live read-only SQL on the cases table

The default model is Groq's llama-3.3-70b-versatile. If the override field starts with gemini (for example gemini-2.5-flash), that model name is used for inference instead.
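
A minimal sketch of that routing rule, assuming the server picks the provider from the model-name prefix (the function name is hypothetical):

def pick_model(override: str | None) -> tuple[str, str]:
    # Return (provider, model). An override starting with "gemini"
    # switches inference to Gemini; otherwise Groq's default is used.
    if override and override.startswith("gemini"):
        return ("gemini", override)  # e.g. gemini-2.5-flash
    return ("groq", override or "llama-3.3-70b-versatile")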

API: endpoint and body

POST /api/llm/chat with JSON:

{ "question": "...", "model": null, "provider": "groq" }

Requests are rate-limited per IP. GROQ_API_KEY is never sent to the browser.
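
For example, the endpoint can be called with Python's requests library. The base URL is a placeholder and the response shape is an assumption; only the method, path, and body fields come from this page:

import requests

resp = requests.post(
    "https://caslinker.example.org/api/llm/chat",  # placeholder host
    json={"question": "How many cases mention CSAM?",
          "model": None,
          "provider": "groq"},
    timeout=60,
)
resp.raise_for_status()  # an HTTP 429 raised here means the per-IP rate limit was hit
print(resp.json())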