
This pattern ensures GenAI applications clearly convey how user data is collected, stored, processed, and protected.
GenAI systems often rely on sensitive, contextual, or behavioral data. Mishandling this data can lead to user distrust, legal risk, or unintended misuse. Clear communication around privacy safeguards helps users feel safe, respected, and in control.
For example, Slack AI clearly communicates that customer data remains owned and controlled by the customer and is not used to train Slack’s or any third-party AI models.
How to use this pattern
Show transparency: When a GenAI feature accesses user data, display an explanation of what is being accessed and why (see the consent sketch after this list).
Design opt-in and opt-out flows: Allow users to easily toggle their data-sharing preferences.
Enable data review and deletion: Allow users to view, download, or delete their data history, giving them ongoing control (see the history sketch after this list).
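
As a rough illustration of the first two steps, the TypeScript sketch below pairs each data access with a plain-language purpose and keeps every category opted out until the user enables it. The names here (DataCategory, DataAccessDisclosure, ConsentStore) are hypothetical and not drawn from any particular product or framework, including the Slack AI example above.

```ts
// Sketch: surface what is accessed and why, and gate access on explicit consent.
type DataCategory = "messages" | "files" | "usage_metrics";

interface DataAccessDisclosure {
  category: DataCategory; // what is being accessed
  purpose: string;        // why it is needed, shown to the user
  optional: boolean;      // whether the feature works without it
}

// Disclosures shown in the UI before the GenAI feature reads any user data.
const disclosures: DataAccessDisclosure[] = [
  { category: "messages", purpose: "Summarize recent conversations", optional: false },
  { category: "usage_metrics", purpose: "Improve suggestion relevance", optional: true },
];

// In-memory consent store; a real app would persist this per user
// and re-check it on every request.
class ConsentStore {
  private consent = new Map<DataCategory, boolean>();

  setConsent(category: DataCategory, allowed: boolean): void {
    this.consent.set(category, allowed);
  }

  isAllowed(category: DataCategory): boolean {
    return this.consent.get(category) ?? false; // default: opted out
  }
}

const store = new ConsentStore();
disclosures.forEach((d) =>
  console.log(`${d.category}: ${d.purpose} (${d.optional ? "optional" : "required"})`)
);
store.setConsent("usage_metrics", true);  // user toggles sharing on
console.log(store.isAllowed("messages")); // false until explicitly enabled
```

Defaulting to opted out keeps the flow consistent with the opt-in principle: no category is shared until the user switches it on.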
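
For the review-and-deletion step, here is a minimal sketch along the same lines. HistoryService and its in-memory storage are illustrative stand-ins for whatever persistence a real application uses; the view, export, and delete methods map to the "view, download, or delete" controls described above.

```ts
// Sketch: user-facing review, download, and deletion of GenAI interaction history.
interface HistoryEntry {
  id: string;
  timestamp: string;
  prompt: string;
}

class HistoryService {
  private entries: HistoryEntry[] = [
    { id: "1", timestamp: "2024-05-01T10:00:00Z", prompt: "Summarize this thread" },
  ];

  // View: return the stored history for display in the UI.
  list(): HistoryEntry[] {
    return [...this.entries];
  }

  // Download: export everything as JSON the user can keep.
  export(): string {
    return JSON.stringify(this.entries, null, 2);
  }

  // Delete: remove a single entry by id, or clear everything.
  delete(id?: string): void {
    this.entries = id ? this.entries.filter((e) => e.id !== id) : [];
  }
}

const history = new HistoryService();
console.log(history.export()); // user downloads their data
history.delete();              // user clears their history
console.log(history.list());   // []
```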