CES 2024: Will the Coolest New AI Gadgets Protect Your Privacy?

Published: 23/11/2024   Category: security




Consumer electronics manufacturers are innovating fast. Regulators are slow to keep up. Data privacy hangs in the balance.



Phones and computers host some of the most private information about us — our financial records, photos, and text histories. Hardly any of it compares, though, with the kind of data that'd be gathered by your future, AI-integrated bathroom mirror.
Amid all of the other latest and greatest innovations at CES 2024 in Las Vegas this week, the Bmind Smart Mirror stands out. It combines natural language processing (NLP), generative AI, and computer vision to interpret your expressions, gestures, and speech. Marketed as a mental health product, it promises to reduce stress and even insomnia by providing you with words of encouragement, light therapy, guided meditations, and mood-boosting exercises.
All that, purportedly, plus the promise that your morning hair, blackheads, and most unflattering angles will be kept secure.
In today's world of consumer electronics, privacy and security are increasingly a selling point. But that may not be enough to counterbalance the troves of new data your AI-enabled car, robot, and now mirror have to collect about you to function properly, and all the bad actors (including some vendors themselves) who'd like to get their hands on it.
Even prior to the AI revolution, companies were facing challenges in building adequate data protections into their gadgets. Now it's even harder, and the dearth of relevant laws and regulations in the US means that there's little authorities can do to force the issue.
"Stealing private data, we know, has been a threat to devices for a long time," says Sylvain Guilley, co-founder and CTO at Secure-IC. Data-heavy AI products are particularly attractive to bad actors and, of course, they house threats like "[the potential to build] botnets with other AI devices, to turn them into a spying network."
Meanwhile, there are plenty of good reasons why consumer electronics manufacturers struggle to meet modern standards for data protection (beyond all of the known, cynical reasons). There are resource constraints — many of these devices are built on lighter components than your average PC — that are accentuated by the demands of AI, as well as variation in what customers expect by way of protections.
"You have to be super careful about even enabling people to utilize AI," warns Nick Amundsen, head of product for Keeper Security, "because the model is, of course, trained on everything you're putting into it. That's not something people think about when they start using it."
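Amundsen's point lends itself to a concrete illustration. The sketch below is a hedged example, not any vendor's API: the scrub function and its regex patterns are illustrative assumptions, showing how a client could redact obvious identifiers before a prompt ever reaches a third-party model.

```python
import re

# Illustrative, assumption-laden sketch: strip obvious identifiers before
# a prompt leaves the device for a third-party AI service. Real PII
# detection is far harder; these patterns catch only low-hanging fruit.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace matches with type tags so the model never sees the raw value."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

print(scrub("Email me at jane.doe@example.com or call 555-867-5309."))
# Prints: Email me at [EMAIL REDACTED] or call [PHONE REDACTED].
```

A production-grade filter would need far more than regexes (named-entity recognition and contextual detection, at minimum), but the principle stands: anything a user doesn't scrub should be assumed to end up in the model.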
To assuage its half-naked users' concerns, Baracoda explained in a promotional blog post on Jan. 6 that its smart mirror gathers information "without any invasive technology," and that its underlying operating system — aptly named CareOS — is a "privacy-by-design" platform that stores health and personal data locally and never shares it with any party without the user's explicit request and consent.
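Baracoda hasn't published implementation details, but the pattern its post describes, local-only storage with an explicit consent gate in front of any sharing, is simple enough to sketch. The class, file path, and method names below are illustrative assumptions, not CareOS APIs:

```python
import json
from pathlib import Path

class LocalHealthStore:
    """Sketch of the pattern Baracoda describes: health data stays
    on-device, and sharing requires explicit, per-request consent."""

    def __init__(self, path: str = "~/.mirror-sketch/health.json"):
        self.path = Path(path).expanduser()
        self.path.parent.mkdir(parents=True, exist_ok=True)

    def _load(self) -> list:
        return json.loads(self.path.read_text()) if self.path.exists() else []

    def record(self, entry: dict) -> None:
        # Append locally; nothing leaves the device in this code path.
        data = self._load()
        data.append(entry)
        self.path.write_text(json.dumps(data))

    def share(self, recipient: str, ask_consent) -> list | None:
        # No default-on sharing: the consent callback must approve this
        # specific recipient, every time, or nothing is released.
        if ask_consent(f"Share your health data with {recipient}?"):
            return self._load()
        return None

# Usage: consent is a live, per-recipient prompt, not a buried notice.
store = LocalHealthStore()
store.record({"mood": "stressed", "sleep_hours": 5.5})
store.share("acme-analytics", ask_consent=lambda q: input(q + " [y/N] ") == "y")
```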
Dark Reading reached out to Baracoda for more detailed information about CareOS, but hasn't received a reply.
However, not all gadgets on display at this year's event promise privacy-by-design. The fact is, they simply don't have to, as legal experts are quick to point out.
"In the US, there are privacy laws for health data (HIPAA), financial data (GLBA), and government data (the Privacy Act of 1974). But there is no direct statute that regulates the general consumer Internet of Things (IoT) or AI," points out Charlotte Tschider, associate professor at Loyola University Chicago School of Law and author of multiple papers exploring what such guardrails might look like.
Instead, there's a patchwork of semi-related and state-level laws, as well as actions from regulators, which, in the gestalt, might start to look like a guidebook for consumer devices.
Last July, for one thing, the White House announced a cybersecurity labeling program for smart devices. Though far from mandatory, its aim is to encourage manufacturers to build better security into their gadgets from the outset.
The IoT Cybersecurity Improvement Act of 2020 and Senate Bill 327 in California set a course for security in connected devices, and Illinois' Biometric Information Privacy Act (BIPA) takes direct aim at your average iPhone or smart mirror. And, perhaps most relevant of all, is the Children's Online Privacy Protection Act (COPPA).
COPPA was designed to help parents control what information companies can gather about their children. "COPPA's a big one," Amundsen says. "Companies might not realize that they're entering into the scope of that law when they're releasing some of these products and some of these AI capabilities, but certainly they'll be held accountable to it."
The first IoT electronics company to learn that lesson was VTech, a Hong Kong-based consumer electronics manufacturer. For collecting personal information from children without providing direct notice and obtaining their parents' consent, and for failing to take reasonable steps to secure the data it collected in its Kid Connect app, the Federal Trade Commission (FTC) ordered VTech to pay a fine of $650,000 in 2018.
The fine was a drop in the bucket for the $1.5 billion company, but it sent a message that this quarter-century-old law is America's most effective tool for regulating data privacy in modern consumer devices. Of course, it's only relevant for consumers under the age of 13, and it's far from flawless.
As Tschider points out, "COPPA doesn't have any cybersecurity requirements to actually reinforce its privacy obligations. This issue is only magnified in contemporary AI-enabled IoT, because compromising a large number of devices simultaneously only requires pwning the cloud or the AI model driving the function of hundreds or thousands of devices. Many products don't have the kind of robust protections they actually need."
She adds, "Additionally, it relies primarily on a consent model. Because most consumers don't read privacy notices (and it would take well over a hundred days a year to read every privacy notice presented to you), this model is not really ideal."
For Tschider, a superior legal framework for consumer electronics might take bits of inspiration from HIPAA or New York State's cybersecurity law for financial services. But really, one need only look across the water for an off-the-shelf model of how to do it right.
"For cybersecurity, the NIS 2 Directive out of the EU is broadly useful," Tschider says, adding that there are many good takeaways from both the General Data Protection Regulation and the AI Act in the EU.
However, she laments, they likely will not work as well for the US: "The US legal system is in part based on freedom to contract and the ability of companies to negotiate the terms of their relationship directly with consumers. Regulations designed like the EU's laws place substantial restrictions on business operation, which would likely be heavily opposed by many lawmakers and could interfere with profit maximization."
