HPCSA AI Compliance Policy

ASCLEPIUS Alignment with HPCSA Booklet 20 (September 2025)
Ethical Guidelines on the Use of Artificial Intelligence

Compliance Summary

ASCLEPIUS has been designed in alignment with the Health Professions Council of South Africa (HPCSA) Ethical Guidelines on the Use of Artificial Intelligence (Booklet 20, September 2025). This document outlines how each AI feature in ASCLEPIUS complies with HPCSA requirements.

All AI features in ASCLEPIUS serve exclusively as clinical decision support tools. No AI output is used as a final clinical decision without explicit health practitioner review and approval.
Section 3 - Ethical Principles
HPCSA Requirement | ASCLEPIUS Compliance
3.1 Patient's best interests remain primary concern | All AI features are designed to assist health practitioners in providing better patient care. Patient confidentiality, privacy, and dignity are protected through POPIA-compliant data handling.
3.3 AI serves as a tool to support, not replace, clinical decision-making | Every AI feature includes a "Doctor's Assessment" section where the practitioner makes the final clinical decision. AI outputs are clearly labelled as suggestions requiring verification.
3.5 Health practitioners shall always make the final decision on patient care | All AI-generated content (fracture classifications, image analyses, discharge summaries, clinical notes) requires explicit practitioner review before being finalised. No AI output is automatically applied to patient records without practitioner approval.
3.7 Express accountability for errors | All AI outputs are attributed to the reviewing health practitioner who approves them. The system maintains an audit trail of AI-generated content and practitioner modifications.
3.8 Protection of patient health information privacy | Patient data is processed within the application's secure environment. AI processing uses encrypted API connections. No patient data is stored by third-party AI providers.
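The audit-trail requirement in 3.7 can be illustrated with a minimal sketch. The names `AuditEntry` and `record_review` are hypothetical, not ASCLEPIUS's actual implementation; the point is that the AI draft, the practitioner's modifications, and the approving practitioner are all captured in an append-only record.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit record linking an AI output to its reviewer."""
    practitioner_id: str  # who reviewed and approved (accountability, 3.7)
    ai_model: str         # which model produced the draft (disclosure, 4.1)
    ai_draft: str         # the unmodified AI suggestion
    final_text: str       # what the practitioner actually approved
    timestamp: str        # when the review happened (UTC, ISO 8601)

def record_review(trail: list, practitioner_id: str, ai_model: str,
                  ai_draft: str, final_text: str) -> AuditEntry:
    """Append a review record; the trail is append-only, never edited."""
    entry = AuditEntry(
        practitioner_id=practitioner_id,
        ai_model=ai_model,
        ai_draft=ai_draft,
        final_text=final_text,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    trail.append(entry)
    return entry

trail: list = []
entry = record_review(trail, "MP0123456", "claude",
                      "AO/OTA 32A1 suggested", "AO/OTA 32A2 (revised on review)")
assert entry.ai_draft != entry.final_text  # practitioner modification is preserved
```

Because each entry stores both the draft and the approved text, any divergence between the AI suggestion and the final record is attributable to the named reviewer.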
Section 4 - Disclosure
HPCSA Requirement | ASCLEPIUS Compliance
4.1 AI tool must not be secret; limitations disclosed upfront | ASCLEPIUS openly discloses AI usage on every page (footer banner) and on each AI-powered feature with specific disclaimers. AI model names, capabilities, and limitations are transparently documented.
4.3-4.4 Disclosure of learning type (continuous vs locked) | ASCLEPIUS uses "locked learner" AI models (Claude, Gemini) that do not automatically update with new patient data. The models are pre-trained and do not learn from individual patient interactions within the application.
4.5 Understanding training data quality, safety, and bias | AI providers' models are trained on diverse medical literature. ASCLEPIUS prompts include South African context-specific considerations (SA-prevalent conditions, local guidelines) to reduce bias in outputs.
Section 5 - Accountability
HPCSA Requirement | ASCLEPIUS Compliance
5.2 Health practitioners ultimately responsible for AI tool use | Every AI output requires practitioner review. The treating doctor's name and credentials are associated with all finalised documents. AI results cannot bypass the practitioner approval workflow.
5.3-5.4 Practitioners should not over-rely on AI | Prominent warnings on all AI features remind practitioners to exercise independent clinical judgment. AI outputs are presented as suggestions, not directives.
Section 6 - Equity & Transparency
HPCSA Requirement | ASCLEPIUS Compliance
6.1 AI must not exacerbate healthcare disparities | ASCLEPIUS provides the same AI-powered clinical support regardless of patient demographics. AI prompts include South African context to ensure relevance across diverse populations.
6.2 Patients must be informed when AI assists diagnosis/treatment | All AI-generated documents (PDFs, reports) clearly state that AI was used in the assessment process. Practitioners are reminded to inform patients per HPCSA requirements.
6.4-6.5 Informed consent is meaningful dialogue | ASCLEPIUS supports (not replaces) the informed consent process. AI outputs provide information to help practitioners have more informed discussions with patients.
6.7 Transparency about AI capabilities, limitations, safeguards | Each AI feature includes contextual information about what the AI can and cannot do. This policy page provides comprehensive transparency about all AI components.
Section 7 & 9 - Safety, Quality & Clinical Decision Making
HPCSA Requirement | ASCLEPIUS Compliance
7.1 Interventions must be in the patient's interest and safe | AI features include safety checks, red flag alerts, and urgency indicators. Practitioners retain override authority on all AI suggestions.
9.1 Responsibility for decision making must remain with the practitioner | ASCLEPIUS enforces this through its workflow design: AI generates suggestions, the practitioner reviews, modifies if needed, and explicitly approves before any clinical action.
9.2 AI cannot be the final decision maker | No AI output in ASCLEPIUS is automatically applied. Every feature requires the health practitioner to actively confirm or modify AI suggestions before they become part of the clinical record.
Section 10 - Data Privacy & Protection
HPCSA Requirement | ASCLEPIUS Compliance
10.1 Patient information protected against improper disclosure | All data is encrypted in transit (TLS). Database access is restricted to authenticated, authorised practitioners. AI API calls use encrypted connections and do not store patient data.
10.2 Safeguard against unauthorised access | Role-based access control ensures practitioners only access their own patients' data. Session management, secure authentication, and POPIA compliance measures are implemented throughout.
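The access rule in 10.2 can be sketched as a simple ownership check: a record is returned only to the practitioner responsible for that patient. The mapping and function names below are assumptions for illustration, not the actual ASCLEPIUS schema.

```python
class AccessDenied(Exception):
    """Raised when a practitioner requests another practitioner's patient."""

# Hypothetical mapping of patient records to their treating practitioner.
PATIENT_OWNER = {
    "patient-001": "MP0123456",
    "patient-002": "MP0654321",
}

def load_patient(requesting_practitioner: str, patient_id: str) -> str:
    """Allow access only to the practitioner responsible for the patient."""
    owner = PATIENT_OWNER.get(patient_id)
    if owner != requesting_practitioner:
        raise AccessDenied(f"{requesting_practitioner} may not view {patient_id}")
    return f"record for {patient_id}"

assert load_patient("MP0123456", "patient-001") == "record for patient-001"
try:
    load_patient("MP0123456", "patient-002")  # another practitioner's patient
    raise AssertionError("expected AccessDenied")
except AccessDenied:
    pass
```

In a production system this check would sit server-side behind authentication, so it cannot be bypassed by the client.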
AI Features in ASCLEPIUS

The following AI-powered features are available in ASCLEPIUS. Each operates as a clinical decision support tool with practitioner oversight:

Feature | AI Provider | Purpose | Practitioner Oversight
AI Doc Personal Assistant | Claude (Anthropic) | Clinical query assistance, medical information retrieval | Conversational tool; practitioner evaluates all responses
Fracture Classification | Claude / Gemini | AO/OTA fracture classification suggestions | Practitioner reviews classification, modifies treatment plan
Image Analysis (Ophthalmology & Dermatology) | Gemini Vision | Clinical image analysis with diagnostic suggestions | Doctor's Assessment form for final diagnosis
Discharge Summary Generator | Claude / Gemini | Automated discharge summary drafting | Full edit/review workflow before finalisation
Theatre Operation Notes | Claude / Gemini | Procedure description assistance | Practitioner edits and approves all content
Smart Clinical Notes | Claude / Gemini | AI auto-complete and note expansion | Suggestions only; practitioner accepts or rejects
Clinical Motivation Letters | Claude / Gemini | Motivation letter drafting for funders | Practitioner reviews and signs final document
Referral Letters | Claude / Gemini | Specialist referral letter generation | Practitioner reviews, edits, and approves
Voice Dictation | Speech Recognition | Clinical documentation transcription | Practitioner reviews and corrects transcription
Important Notices
  • AI Model Type: All AI models used in ASCLEPIUS are "locked learner" systems. They do not learn from or retain individual patient data from interactions within this application.
  • No Autonomous Decision-Making: No AI feature in ASCLEPIUS autonomously makes clinical decisions, prescribes treatments, or modifies patient records without explicit practitioner approval.
  • Patient Notification: Health practitioners are reminded of their obligation under HPCSA Booklet 20, Section 6.2 to inform patients when AI has been used to assist in their diagnosis or treatment planning.
  • Professional Responsibility: Per Section 9.1, the treating health practitioner retains full professional responsibility and accountability for all clinical decisions, regardless of AI involvement.
  • Limitations: AI systems may produce inaccurate, incomplete, or biased outputs. Practitioners must apply their clinical expertise and judgment to evaluate all AI-generated content.
  • Data Protection: All patient data processing complies with POPIA (Protection of Personal Information Act) and HPCSA Booklet 5 (Confidentiality) requirements.

Reference: Health Professions Council of South Africa. Ethical Guidelines on the Use of Artificial Intelligence (Booklet 20). Pretoria, September 2025.

This policy is subject to review as HPCSA guidelines evolve. Last updated: February 2026.
