Here are a few instances of how companies handled user data differently than the perception they created.
The company is mission-driven and a champion of privacy.
DuckDuckGo is well known as a search engine that champions user privacy. They are also known for their anti-tracking stance (see their tweet on Google's Topics tracking methodology).
While running a security audit of DuckDuckGo's privacy browser, a researcher was surprised to discover that although the browser blocked Google and Facebook trackers, it allowed Microsoft trackers to run, essentially letting Microsoft collect data on users who thought they were not being tracked.
They don’t sell my data. I pay for their service.
Fi offers GPS tracking collars for pets; users purchase the device and pay a monthly subscription for data services (a standard business model for hardware companies). The collar requires owners to pair it with their phones and grant access to their precise location (the company claims this is to conserve battery). So essentially, the service knows where you are even when you are not with your pet.
My data is being shared, but it is anonymous.
Differential privacy is a common technique companies use to share aggregated user information while maintaining user anonymity. The technique calls for adding random noise to real data; the level of anonymity correlates with the amount of noise. Think needle in a haystack vs. a needle among 5 strands of hay: the odds of finding the needle change substantially.
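To make the idea concrete, here is a minimal sketch of the classic Laplace mechanism, one standard way differential privacy is implemented (this is an illustration, not any particular company's implementation). Noise is drawn from a Laplace distribution whose scale depends on a privacy parameter epsilon: a smaller epsilon means more noise, i.e. more "hay" around the needle.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent exponential draws
    # follows a Laplace distribution with the given scale.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: one person joining or
    # leaving the dataset changes the result by at most 1. The
    # Laplace mechanism adds noise with scale = sensitivity / epsilon.
    return true_count + laplace_noise(1.0 / epsilon)

# A strict privacy budget (small epsilon) buries the real count in
# noise; a loose one (large epsilon) barely perturbs it.
print(private_count(100, 0.1))   # heavy noise
print(private_count(100, 10.0))  # light noise
```

Each individual answer is noisy, but averages over many queries or many users remain accurate, which is what lets a company publish aggregate statistics without exposing any single person's data.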
Apple is a significant user of this technique. However, on closer inspection, the information Apple captures and shares turns out to be considerably less private than users might assume.