
Assistance disclaimer

Explicit declaration that an artificial intelligence system acts as a support tool for the human professional and not as a diagnostic device, clinical evaluation system, or substitute for professional judgment.

Definition

The "assistance, not diagnosis" disclaimer is the formal declaration through which a provider of clinical assistance technology establishes that its system is not authorized or designed to make diagnoses, assess psychological competencies, make clinical decisions, or substitute for the judgment of a qualified human professional. The system is a tool; clinical responsibility remains entirely with the professional who uses it. This disclaimer is not merely a legal formality: it is an ethical stance on the appropriate role of AI in contexts with high human impact.

How it's used

The disclaimer appears in three places: in the platform's terms of service (in legal language), in the professional's onboarding (in clear, operational language), and in automatically generated reports (as a reminder in each document).

In practice, this means that when CauceOS detects a linguistic signal associated with crisis risk, it does not say "this client has suicidal ideation" — it says "risk-associated language was detected at minute 23:14" and it is up to the professional to evaluate whether that signal has clinical meaning in the specific context. The system signals; the professional evaluates.
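The signal-versus-diagnosis distinction can be made concrete in how alert text is generated. The sketch below is illustrative only: `RiskSignal` and `format_alert` are assumed names, not part of any documented CauceOS API. The point is that the output describes an observation and its location in the session, never a clinical conclusion.

```python
from dataclasses import dataclass

@dataclass
class RiskSignal:
    timestamp: str   # position in the session, e.g. "23:14"
    category: str    # internal signal category, e.g. "crisis_language"

def format_alert(signal: RiskSignal) -> str:
    # Describe what was observed and where; leave interpretation
    # to the professional. No diagnostic labels are ever emitted.
    return (
        f"Risk-associated language was detected at minute {signal.timestamp}. "
        "Clinical interpretation is reserved for the treating professional."
    )

print(format_alert(RiskSignal("23:14", "crisis_language")))
```

The design choice here is that the alert vocabulary is fixed at the formatting layer, so no code path can surface a diagnostic claim to the professional.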

Similarly, post-session reports are structured drafts for the professional to review, correct, and sign — never auto-generated clinical documents that the professional simply countersigns without review.
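One way to enforce the draft-review-sign workflow is to make the signed state unreachable without an explicit review confirmation. This is a minimal sketch under assumed names (`SessionReport`, `sign`); it is not a documented CauceOS interface.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SessionReport:
    body: str                        # AI-generated draft text
    status: str = "DRAFT"            # never "SIGNED" without human review
    signed_by: Optional[str] = None

    def sign(self, professional_id: str, reviewed: bool) -> None:
        # The professional must explicitly confirm review before signing;
        # there is no path from DRAFT to SIGNED that skips this check.
        if not reviewed:
            raise ValueError("report must be reviewed before it can be signed")
        self.signed_by = professional_id
        self.status = "SIGNED"
```

A report constructed this way starts as a draft and only a reviewing professional can move it to a signed state, mirroring the "review, correct, and sign" requirement above.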

When to apply

This framework operates in any context where AI processes information of a clinical, psychological, or wellbeing nature: telehealth platforms, digital coaching tools, session text analysis systems. The regulatory boundary between "assistance device" and "medical device" varies by jurisdiction and is in active evolution in many countries.

Historical origin

The debate about the appropriate role of AI in medicine and mental health intensified with the emergence of AI-assisted diagnostic systems in the 2010s. The conceptual framework of "professional assistance" as an alternative to "professional replacement" emerged as the dominant stance in AI health ethics forums. The FDA in the US and the CE-marking regime in the EU have established regulatory frameworks for "Software as a Medical Device" that determine when a system crosses from assistance into a regulated medical device.

How CauceOS supports it

CauceOS incorporates the disclaimer in all automatically generated reports with the text: "This report is an AI-generated draft as a support tool. It does not replace the clinical judgment of the professional. All notes must be reviewed, modified if necessary, and signed by the professional responsible for the session." The system does not suggest diagnoses or evaluate the professional's clinical competence.
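Injecting the disclaimer at the rendering layer guarantees it cannot be omitted from any generated document. The sketch below assumes a hypothetical `render_report` helper; the disclaimer string itself is taken verbatim from the text above.

```python
DISCLAIMER = (
    "This report is an AI-generated draft as a support tool. "
    "It does not replace the clinical judgment of the professional. "
    "All notes must be reviewed, modified if necessary, and signed by the "
    "professional responsible for the session."
)

def render_report(draft_body: str) -> str:
    # Prepend the disclaimer to every document; callers cannot opt out.
    return f"{DISCLAIMER}\n\n{draft_body}"
```

Because the disclaimer is concatenated inside the single rendering function rather than passed in by callers, every report carries it by construction.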
