There are at least three distinct meanings of the word "privacy".
The first is compliance.
This one is the realm of contracts and regulations that affect the enterprise.
This is the one that legal departments think about the most.
The second is anti-surveillance.
This uses tools like end-to-end encryption, defense in depth, confidential computing, on-device computation, and so on.
This is the one that cryptographers and engineers tend to think about the most.
The last is contextual integrity.
This is Helen Nissenbaum's concept.
Roughly: "my data is used in line with my interests and intent."
This is the one that UXR tends to talk about the most.
This is the one that most closely aligns with what users intuitively want.
Note that the first two are of a different kind than the last.
The last one is, roughly, the end of privacy.
The first two are particular means to achieve some aspect of that end.
Contextual integrity is the platonic ideal.
You can never fully reach it in all cases, but you can clear a good-enough bar for most cases, and you can keep improving.
The only way to have true contextual integrity is for the user to have the agency to decide which code can run on their data.
That requires the user to run the code on their turf, where they call the shots about what happens.