
What Practitioners Need to Know

The Health Professions Council of South Africa (HPCSA) has issued revised ethical guidelines affecting two central areas of practice:

  1. The use of artificial intelligence (AI) in care (Booklet 20); and
  2. The withholding/withdrawing of life-prolonging treatment (Booklet 7).

In both sets of rules, the emphasis is the same: patient welfare comes first, decision-making must be transparent, and accountability remains with the practitioner.

AI in Clinical Practice (Booklet 20)

The HPCSA supports innovation, but it also sets clear guardrails. In essence, AI systems may assist with documentation, triage, image analysis and clinical decision support, but they cannot replace clinical judgement. Practitioners remain ultimately responsible for assessing the patient, interpreting any AI output, and making (and documenting) the final decision. The guidelines highlight three pillars:

  • Ethical use (respecting autonomy and confidentiality)
  • Legal compliance (including POPIA)
  • Technical assurance (validated, safe and reliable tools suited to South African patients).

The use of AI must be transparent: patients should be told when AI systems are used to inform their care.

Relatedly, SAHPRA has published companion guidance for AI- and machine learning-enabled medical devices, signalling expectations around validation, quality control and ongoing oversight where regulated tools are used in clinical pathways.

End-of-life Decisions (Booklet 7)

The updated guidance confirms that decisions to withhold or withdraw life-prolonging treatment must be made in the patient’s best interests, with proper consultation where the patient has capacity. A notable development is the formal recognition of a “patient representative” - an individual chosen by the patient to speak on their behalf if they cannot make decisions. Where a patient lacks capacity and no representative has been appointed, the guidelines clarify the order of persons who may consent, and they offer clearer pathways to resolve disputes. Throughout the process, dignity, comfort and good documentation are central.

Practical Steps for Healthcare Organisations and Practitioners

It may be prudent, in consultation with legal advisors, to confirm that internal policies reflect the revised guidance and that clinical records adequately explain how any AI outputs were weighed and how final decisions were reached.

Practices must also ensure that patients are informed where AI plays a material role in care, and that AI-enabled tools are appropriately validated and aligned with SAHPRA expectations. For end-of-life care, a review of consent processes is important - particularly noting any appointed patient representative - and practices may benefit from clarity on escalation and dispute-resolution pathways under the updated booklet.

Next Steps

Viewed together, the revised booklets do not prohibit innovation or constrain clinical practice; rather, they articulate clearer standards for safe, transparent and accountable care. It may be sensible - working with your legal advisors - to test policies, consent templates and record-keeping against the updated expectations, and to reaffirm that clinical judgement rests with the practitioner even where AI tools are used.

If you would value structured support, the Thomson Wilks Medical Law & Healthcare team can assist with policy updates, AI-use protocols, end-of-life decision pathways, risk assessments and management, and staff training.