# Sector Analysis: Healthcare & AI Risk
Based on Oxford Martin School research (Frey & Osborne, 2013)
Healthcare is among the most protected sectors, though administrative and diagnostic support roles face growing pressure.
- Average risk: 7%
- Jobs analyzed: 14
- Highest-risk role: Pharmacist (29%)
## Jobs in this sector
| Job title | Risk score (2026) | Risk level |
|---|---|---|
| Pharmacist | 29% | Low |
| Social Worker | 13% | Low |
| Radiologist | 12% | Low |
| Personal Trainer | 11% | Low |
| Dental Hygienist | 8% | Low |
| Nurse | 7% | Low |
| Veterinarian | 7% | Low |
| Doctor (GP) | 5% | Low |
| Psychologist / Therapist | 4% | Low |
| Dentist | 2% | Low |
| Paramedic / EMT | 2% | Low |
| Speech Therapist | 0% | Low |
| Occupational Therapist | 0% | Low |
| Orthodontist | 0% | Low |
## Analysis
Healthcare occupations dominate the low-risk end of the Oxford dataset. Surgeons, therapists, and physicians score below 5% automation probability, and nurses sit only slightly higher, reflecting the physical dexterity, social perception, and real-time judgment these roles require.
The exception is administrative healthcare work. Medical secretaries, billing clerks, and appointment schedulers score above 80%. These roles process structured information in predictable patterns — exactly what automation does best.
What the 2013 research did not anticipate: the rise of AI diagnostic tools. Radiology, pathology, and clinical decision support have seen significant AI advancement since publication. Diagnostic roles may face more pressure than the original scores suggest.
The human core of healthcare — physical presence, emotional support, ethical judgment in end-of-life decisions — remains deeply resistant to automation.
All risk scores are based on Frey & Osborne (2013), Oxford Martin School. Note: this study predates generative AI, so actual risk may be higher than shown.