It knows when you’ve been sleeping. It knows when you’re awake. It knows when your sleep has been good or bad. But for goodness sake, in order for the product to work well, consumers may have to bend on digital privacy.
Approximately 40% of adult men and 24% of adult women are habitual snorers, according to the American Academy of Sleep Medicine. And Tempur Sealy International created a bed that tried to take care of that problem. (Forbes reported that it was the first bed to readjust without someone being awake or using a remote.) With the anti-snoring bed came a morning report from a mobile app, which provided a detailed analysis of a snorer’s sleeping habits and personalized advice for sleep improvement. In order to receive the analysis, the snorer would have to allow certain settings to track private sleeping behavior.
Paired sensors on the bed confirmed whether it should be readjusted to stop snoring. On the plus side, bedfellows of these snoring sleepers didn't have to do anything but be their usual selves.
While some would say this is a win-win, especially for the party who wakes up nonstop because of the snorer, others still worry about where else this information goes. Will the information be shared with third-party websites? Will banner ads start popping up about alternate sleep accessories?
Even if a little less privacy helps to reduce low oxygen levels, strain on the heart and chronic headaches — all of which are associated with snoring — that do-gooder retailer still may look like Big Brother to some. While consumers want to buy useful technology and other kinds of products, they have mixed feelings about what they have to give up to get those results.
When retail companies aren’t transparent either
In the example above, a user who purchases that mattress has a fair idea of things that are being monitored. But what about those moments when computer users, smartphone users and even in-person shoppers have no idea what information is being shared?
Using a shopping mall (what's left of them after COVID-19 shutdowns) as another example, how often does one read the disclaimer before logging into its Wi-Fi? An alert comes up on a smartphone, coaxing a shopper to use that connection. The shopper clicks the "accept" option. Meanwhile, that Wi-Fi connection is keeping track of every single move the smartphone user makes at each store. Is that OK too?
“If you don’t pay for the product you’re using, you most likely are the product yourselves,” said Ida Tin, the co-founder of CLUE.
The mall says so. The disclaimer was there, in much smaller font than the Wi-Fi promo and the "accept" button. Or maybe there was no disclaimer at all. For some users, transparency may be all it takes to put consumers and retailers on more trusting ground.
Some consumers may opt out of returning to that store again, solely on principle. They don't like being tracked without their knowledge. Would a more obvious notification (like a pop-up) bluntly stating that retailers are tracking their customers be a better idea? Even then, would shoppers read it?
In a July 2018 study*, First Data reported that Internet users trusted financial institutions (46%) and healthcare companies (39%) more than retailers (8%) to keep their private data secure. Eleven percent said they flat out would never shop at a store again if there was a data breach. Another 43% would continue to shop at these risky retailers but would use cash only, making it more difficult for retailers to track loyalty points, credit card usage, items purchased, potential e-marketing opportunities and more.
Middle ground between marketing and consumer privacy
There are some state laws related to Internet privacy that attempt to keep consumers’ minds at ease. For example, in California, companies cannot automatically operate voice recognition software during the initial setup or installation of a TV “without prominently informing” the person beforehand.
In Illinois, the Biometric Information Privacy Act (BIPA) sets rules for companies that collect biometric identifiers, such as using someone's voiceprint to verify an individual's identity. There must be informed consent prior to collecting and storing this information.
Why do topics like these matter? Voice recognition software doesn't just record; sometimes it has an awkward way of sending information on. Just ask the Oregon couple whose conversation about hardwood floors was recorded by Alexa and involuntarily sent to one of the husband's employees. (If that couple started seeing or hearing new ads marketing hardwood floors, that was no accident either.)
Although the hardwood floor incident is a rare example of how technology worked against a consumer, it's a reminder to retailers, digital marketers and other companies to be trustworthy. In turn, consumers will feel more comfortable believing that information related to the software or product will be used legally. Consumers are not in denial about retailers (and other companies) wanting to be profitable, but limits on behavioral data, along with giving consumers the option to weigh in, could go a long way toward helping them sleep (quietly) at night, too.
(Note: This post was originally published as an Upwork freelancer for RETHINK Retail.)
Did you enjoy this post? You’re also welcome to check out my Substack columns “Black Girl In a Doggone World,” “BlackTechLogy,” “Homegrown Tales,” “I Do See Color,” “One Black Woman’s Vote” and “Window Shopping” too. Subscribe to this newsletter for the monthly post on the third Thursday.
If you’re not ready to subscribe but want to support my writing, you’re welcome to tip me for this post! I’ll buy a dark hot chocolate on you. Thanks for reading!