Consider the case of connected people. Hugo Campos, for example, has an implantable cardiac defibrillator (ICD) that sends his data to Medtronic before it goes to the hospital, where it might go to a doctor. Finally, only after years of struggle, Hugo got to see a degraded version of his own data as an offline file. (Activities that trigger the ICD, and the resulting "tuning," are obviously a prime concern for the patient.) Hugo has been very public about this issue. Another ICD patient I advise lost track of her data when she lost her health insurance; the alarms from her device were not being monitored by anyone.
From my perspective, "privacy by design" is too vague. The design framework needs to be based on Fair Information Practice (FIP). Oversimplified, FIP requires consent, data minimization, and transparency. All three criteria require the patient to have convenient access to the ICD data _before_ it's sent to the vendor or the hospital. Without such access, consent is coerced, data minimization cannot be audited, and transparency is more or less absent.
This brings us to the SIM card, or the equivalent private key associated with the device. That key needs to be entirely in the control of the patient. In some cases the key may be associated with a certificate, which could be used for identification and encryption (although there's a case for insisting that the encryption also support perfect forward secrecy). In many cases, a trusted certificate is not required. For my ICD patients, a self-signed certificate plus in-person authentication with their physician should be sufficient.
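To make the patient-controlled key concrete, here is a minimal sketch of what generating such a key and self-signed certificate could look like using standard OpenSSL commands. The filenames, key size, validity period, and the `patient-device` common name are all illustrative assumptions, not part of any actual device workflow; the point is that the private key is created and held by the patient, and the certificate's fingerprint can be verified in person.

```shell
# Patient generates a private key that never leaves their control
# (assumption: RSA 2048; an elliptic-curve key would work equally well)
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out patient_key.pem

# Patient issues a self-signed certificate from that key
# (hypothetical subject name; one-year validity chosen arbitrarily)
openssl req -new -x509 -key patient_key.pem -out patient_cert.pem \
  -days 365 -subj "/CN=patient-device"

# The certificate's fingerprint can be read aloud or compared in person
# with the physician, standing in for a trusted certificate authority
openssl x509 -in patient_cert.pem -noout -fingerprint -sha256
```

In-person comparison of the fingerprint is what substitutes for a certificate authority here: once the physician's system records that fingerprint, the self-signed certificate is as trustworthy for this one relationship as any purchased one.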