California voters this week approved Proposition 24, the California Privacy Rights Act, with 56% of voters supporting the measure. EPIC previously published an analysis of Proposition 24, noting that the measure “would make some important improvements to privacy protections for California residents, particularly through the establishment of a California Privacy Protection Agency.” In 2018, the State of California enacted the California Consumer Privacy Act of 2018 (“CCPA”), the first comprehensive consumer privacy law enacted in the United States. Proposition 24 significantly amends the CCPA. EPIC has also published a resource to help California residents exercise their rights under the CCPA.
The FTC has reached a settlement with Zoom that requires the company to address data security but fails to address user privacy. Writing in dissent, Commissioner Slaughter said, “When companies offer services with serious security and privacy implications for their users, the Commission must make sure that its orders address not only security but also privacy.” Commissioner Chopra, also dissenting, wrote “The FTC’s status quo approach to privacy, security, and other data protection law violations is ineffective.” In July 2019, EPIC sent a detailed complaint to the FTC citing flaws in Zoom’s service and warning that the company had “exposed users to the risk of remote surveillance, unwanted video calls, and denial-of-service attack.” In April 2020, EPIC wrote to Chairman Simons urging the FTC to open an investigation. EPIC has long advocated for the creation of a U.S. data protection agency.
EPIC and a coalition of privacy, civil rights, and consumer organizations have released a policy framework for the Biden Administration to protect privacy and digital rights for all Americans. “Without laws that limit how companies can collect, use, and share personal data, we end up with an information and power asymmetry that harms consumers and society at large,” the groups said. “Individual, group and societal interests are diminished, and our privacy and other basic rights and freedoms are at risk.” The ten recommendations include: 1) recognizing privacy and surveillance as racial justice issues; 2) establishing algorithmic governance and accountability to advance fair and just data practices; 3) encouraging enactment of a baseline comprehensive federal privacy law; 4) establishing a U.S. Data Protection Agency; and 5) bringing consumer, privacy, and civil rights experts into key government positions.
TikTok, responding to a recent letter from EPIC, said that user privacy “will remain a priority for TikTok” if and when a deal with Oracle is finalized—but stopped short of agreeing to EPIC’s full demands. Last month, after Oracle reached a tentative agreement to serve as TikTok’s U.S. partner and “independently process TikTok’s U.S. data,” EPIC sent letters to both companies warning them of their legal obligation to protect the privacy of TikTok users. The deal would pair one of the largest brokers of personal data with a social network of 800 million users, posing grave privacy and legal risks. Although TikTok responded that it was “committed to helping ensure that any transfer and processing of personal data . . . complies with applicable law” and the company’s privacy policies, TikTok did not agree to other EPIC demands, including a commitment not to merge user data with Oracle products.
The Department of Health and Human Services finalized rules that require insurers and healthcare companies to provide patients access to their medical data in a format suitable for cellphones and other electronic devices. However, federal privacy protections under HIPAA no longer apply once patients transfer their data to consumer apps, creating serious risks to medical privacy. The CEO of the American Medical Association warned regulators that “These practices jeopardize patient privacy, commoditize an individual’s most sensitive information, and threaten patient willingness to utilize technology to manage their health.” Tech firms pushed for these changes. Last year, the Wall Street Journal reported that Google’s ‘Project Nightingale’ aims to amass health data on millions of Americans. The rules take effect after a six-month period.
Proskauer Lex Blog: Media and Technology Blog, Jeffrey Neuburger, March 8, 2020
Continuing its push to enforce its terms and policies against developers that engage in unauthorized collection or scraping of user data, Facebook brought suit last month against mobile marketing and data analytics firm OneAudience LLC. (Facebook, Inc. v. OneAudience LLC, No. 20-01461 (N.D. Cal. Complaint filed Feb. 27, 2020)). Facebook alleges that OneAudience harvested Facebook users’ profile data and device data in contravention of Facebook’s terms and developer policies. OneAudience purportedly gathered this data by paying app developers to bundle OneAudience’s software development kit (SDK) into their apps, then harvesting data from users who logged into those apps with their Facebook credentials.
Facebook users, including developers and page administrators, are required to assent to Facebook’s terms and various platform policies when a Facebook account is created. According to Facebook’s Complaint, . . .
In its Complaint, Facebook alleged that around September 2019, OneAudience offered to pay app developers to bundle its SDK into their apps. The SDK allegedly allowed OneAudience to collect data about users’ devices and their Facebook (and some other social media) accounts in instances where the user logged into the particular app using their Facebook credentials (e.g., the “Sign in with Facebook” option). The data included user names, email addresses, country, time zone, Facebook ID, and, in limited instances, gender, all of which were allegedly used by OneAudience for targeted marketing services. OneAudience also allegedly collected device data such as call logs, cell tower and other geolocation data, contacts, browser information, email, and information about installed apps.
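The mechanism the Complaint describes is worth unpacking: once a user completes a “Sign in with Facebook” login, the resulting access token is held as process-wide state inside the host app, so any third-party SDK bundled into that app can read it and call the Graph API. The Kotlin sketch below illustrates this, assuming the standard Facebook Android SDK; the function name and the data-forwarding step are hypothetical, and this is not OneAudience’s actual code.

```kotlin
// Hypothetical sketch: how any SDK bundled into a host app could read
// Facebook profile data after the user signs in with Facebook. Assumes
// the standard Facebook Android SDK; NOT OneAudience's actual code.
import android.os.Bundle
import com.facebook.AccessToken
import com.facebook.GraphRequest

fun harvestProfileIfLoggedIn() {
    // The login token is app-process-wide state, readable by any
    // library the developer bundled in, not just the app's own code.
    val token: AccessToken = AccessToken.getCurrentAccessToken() ?: return

    // Query the Graph API /me endpoint for fields like those named in
    // the Complaint (Facebook ID, name, email).
    val request = GraphRequest.newMeRequest(token) { profile, _ ->
        val id = profile?.optString("id")
        val name = profile?.optString("name")
        val email = profile?.optString("email")
        // A data-analytics SDK could ship these values, plus device
        // data, to its own servers from here.
    }
    request.parameters = Bundle().apply { putString("fields", "id,name,email") }
    request.executeAsync()
}
```

The point of the sketch is that nothing technical stops a bundled SDK from using the host app’s login session; Facebook’s terms and developer policies are the barrier, which is why Facebook frames the conduct as a breach of those terms.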
Today the FCC announced proposed fines against T-Mobile, AT&T, Verizon, and Sprint for selling customers’ location information. FCC Chairman Ajit Pai said: “This FCC will not tolerate phone companies putting Americans’ privacy at risk.” The companies are given an opportunity to respond to the FCC before the Commission makes a final decision.
[ed: some pundits note that the amounts, when divided among the four companies, amount to little more than a slap on the wrist. All four companies have appealed the proposed fines and, as of September 2020, have not paid penalties that are yet to be finalized by the FCC]
Hackers have stolen the entire client database of facial recognition company Clearview AI. Clearview AI scraped over three billion images from the internet to build its facial recognition database. The company sells facial recognition services to law enforcement agencies.
In a Pew Research Center survey, about half (52%) of U.S. adults said they had recently decided not to use a product or service because they were worried about how much personal information would be collected about them.
As with most internet of broken things products, we’ve noted how “smart” devices quite often aren’t all that smart. More than a few times we’ve written about smart lock owners getting locked out of their own homes without much recourse. Other times we’ve noted that the devices simply aren’t that secure: one study found that 12 of 16 smart locks tested could be hacked relatively easily thanks to flimsy security standards, a defining trait of the category.
“Smart” doorbells aren’t much better. A new Consumer Reports study examined 24 popular smart doorbell brands and found substantial security problems with at least five of the models. Many of these flaws exposed user account information, WiFi network information, or, in some cases, user passwords. Consumer Reports avoids getting too specific so as not to advertise the flaws while vendors work to fix them.