California Voters Pass California Privacy Rights Act

17 11 2020
EPIC
Nov. 5, 2020
California voters this week approved Proposition 24, the California Privacy Rights Act, with 56% of voters supporting the measure. EPIC previously published an analysis of Proposition 24, noting that the measure “would make some important improvements to privacy protections for California residents, particularly through the establishment of a California Privacy Protection Agency.” In 2018, the State of California enacted the California Consumer Privacy Act of 2018 (“CCPA”), the first comprehensive consumer privacy law enacted in the United States. Proposition 24 significantly changes the CCPA. EPIC has also published a resource to help California residents exercise their rights under the CCPA.

The content in this post was found at:
<https://epic.org/2020/11/california-voters-pass-califor.html>

Clicking the title or link will take you to the source of the post, which was not authored by the moderators of privacynnewmedia.com.



FTC Fails to Address Privacy in Settlement with Zoom

17 11 2020
EPIC
Nov. 9, 2020
 
The FTC has reached a settlement with Zoom that requires the company to address data security but fails to address user privacy. Writing in dissent, Commissioner Slaughter said, “When companies offer services with serious security and privacy implications for their users, the Commission must make sure that its orders address not only security but also privacy.” Commissioner Chopra, also dissenting, wrote, “The FTC’s status quo approach to privacy, security, and other data protection law violations is ineffective.” In July 2019, EPIC sent a detailed complaint to the FTC citing the flaws with Zoom and warning that the company had “exposed users to the risk of remote surveillance, unwanted video calls, and denial-of-service attack.” In April 2020, EPIC wrote to Chairman Simons urging the FTC to open an investigation. EPIC has long advocated for the creation of a U.S. data protection agency.


The content in this post was found at:

https://epic.org/2020/11/ftc-fails-to-address-privacy-i.html




EPIC, Coalition Release Data Protection Plan for Biden Administration

17 11 2020
EPIC
Nov. 10, 2020
EPIC and a coalition of privacy, civil rights, and consumer organizations have released a policy framework for the Biden Administration to protect privacy and digital rights for all Americans. “Without laws that limit how companies can collect, use, and share personal data, we end up with an information and power asymmetry that harms consumers and society at large,” the groups said. “Individual, group and societal interests are diminished, and our privacy and other basic rights and freedoms are at risk.” The ten recommendations include: 1) recognizing privacy and surveillance as racial justice issues; 2) establishing algorithmic governance and accountability to advance fair and just data practices; 3) encouraging enactment of a baseline comprehensive federal privacy law; 4) establishing a U.S. Data Protection Agency; and 5) bringing consumer, privacy, and civil rights experts into key government positions.

The content in this post was found at:
<https://epic.org/2020/11/epic-coalition-release-data-pr.html>




New Housing Regulation Limits Disparate Impact Housing Claims Based on Algorithms

20 10 2020
EPIC
Sept. 29, 2020
Individuals alleging that a landlord discriminated against them by using a tenant-screening algorithm will face a higher burden of proof under a new rule that went into effect last Thursday. The rule creates a defense to a discrimination claim under the Fair Housing Act where the “predictive analysis” tools used were not “overly restrictive on a protected class” or where they “accurately assessed risk.” Last October, EPIC and several others warned the federal housing agency that providing such a safe harbor for the use of algorithms in housing without imposing transparency, accountability, or data protection regulations would exacerbate harms to individuals subject to discrimination. The agency did modify its rule following comments from EPIC and others, removing a complete defense based on use of an “industry standard” algorithm or where the algorithm was not the “actual cause” of the disparate impact. But the final rule simply replaces the word “algorithm” with “predictive analysis” and includes vague “overly restrictive” and “accurate assessment” standards. The Alliance for Housing Justice called the rule “a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination.”

The content in this post was found at:

<https://epic.org/2020/09/new-housing-regulation-limits-.html>




Tech Companies Block Washington State Privacy Law

21 09 2020
EPIC
March 13, 2020
Last-minute lobbying by big tech companies blocked passage of the Washington Privacy Act. The state privacy law would have given consumers the right to access, correct, and delete their personal data held by tech firms. EPIC and a broad coalition of privacy groups backed a comprehensive bill that would have included, as privacy laws typically do, the right of consumers to bring legal action, but industry groups opposed that provision. The Washington legislature did pass a modest bill limiting government use of facial recognition technology. EPIC has long supported federal baseline legislation and the creation of a data protection agency. EPIC has also called for a moratorium on face surveillance. The EPIC State Policy Project monitors privacy bills nationwide.

The content in this post was found at:

<https://epic.org/2020/03/tech-companies-block-washingto.html>




Financial Data Aggregator Faces Consumer Privacy Suit over “Surreptitious” Collection of Banking Information

1 09 2020

Proskauer New Media and Technology Blog

Last week, a putative privacy-related class action was filed in California district court against financial analytics firm Envestnet, Inc. (“Envestnet”), which operates Yodlee, Inc. (“Yodlee”). (Wesch v. Yodlee Inc., No. 20-05991 (N.D. Cal. filed Aug. 25, 2020)). According to the complaint, Yodlee is one of the largest financial data aggregators in the world and through its software platforms, which are built into various fintech products offered by financial institutions, it aggregates financial data such as bank balances and credit card transaction histories from individuals in the United States. The crux of the suit is that Yodlee collects and then sells access to such anonymized financial data without meaningful notice to consumers, and stores or transmits such data without adequate security, all in violation of California and federal privacy laws.

The content in this post was found at:

Financial Data Aggregator Faces Consumer Privacy Suit over “Surreptitious” Collection of Banking Information




Most Americans support right to have some personal info removed from online searches

26 08 2020

Brooke Auxier
Pew Research Center
January 27, 2020

Americans prefer to keep certain information about themselves outside the purview of online searches, according to a Pew Research Center survey conducted in June 2019. Given the option, 74% of U.S. adults say it is more important to be able to “keep things about themselves from being searchable online,” while 23% say it is more important to be able to “discover potentially useful information about others.”

The content in this post was found at:




Before we use digital contact tracing, we must weigh the costs

20 08 2020

Washington Post
Editorial Board
May 1, 2020

THE PING of a smartphone usually means a text from a friend or a news story from a favorite publication. Soon enough, it could instead signal that it’s time to stay inside for 14 days. Technologists are coding furiously to create a plan for digital contact tracing that, paired with traditional manual methods and widespread testing capability, could ease the country out of lockdowns. But before the United States bets on Silicon Valley to solve its problems, leaders ought to ask themselves two questions: How well does it work, and how high is the cost?

The content in this post was found at:

https://www.washingtonpost.com/opinions/tech-firms-must-prove-that-digital-contact-tracing-is-worth-the-privacy-intrusion/2020/05/01/cbf19b8e-7dc7-11ea-9040-68981f488eed_story.html




How My Boss Monitors Me While I Work From Home

20 08 2020

New York Times
Adam Satariano
May 6, 2020

On April 23, I started work at 8:49 a.m., reading and responding to emails, browsing the news and scrolling Twitter. At 9:14 a.m., I made changes to an upcoming story and read through interview notes. By 10:09 a.m., work momentum lost, I read about the Irish village where Matt Damon was living out the quarantine.

All of these details — from the websites I visited to my GPS coordinates — were available for my boss to review.

Here’s why: With millions of us working from home in the coronavirus pandemic, companies are hunting for ways to ensure that we are doing what we are supposed to. Demand has surged for software that can monitor employees, with programs tracking the words we type, snapping pictures with our computer cameras and giving our managers rankings of who is spending too much time on Facebook and not enough on Excel.

The technology raises thorny privacy questions about where employers draw the line between maintaining productivity from a homebound work force and creepy surveillance. To try to answer them, I turned the spylike software on myself.

The content in this post was found at:
https://www.nytimes.com/2020/05/06/technology/employee-monitoring-work-from-home-virus.html




Commerce Dept. Petitions FCC to Issue Rules Clarifying CDA Section 230

18 08 2020

Proskauer
New Media and Technology Law Blog

The currents around the Communications Decency Act just got a little more turbulent as the White House and executive branch try to reel in the big fish of CDA reform.

On July 27, 2020, the Commerce Department submitted a petition requesting the FCC initiate a rulemaking to clarify the provisions of Section 230 of the Communications Decency Act (CDA). . .

While a deep dive into the 57-page Commerce Department petition (or whether the FCC even has the legal authority to issue such rules in this area) is beyond the scope of this post, its reform proposals can be broken down into several areas. In brief, the Commerce Dept. has asked the FCC to:

  • Clarify the relationship between the more well-known §230(c)(1) “publisher” immunity for hosting third-party content and the lesser-utilized §230(c)(2) “Good Samaritan” immunity for filtering of objectionable content, lest they be read and applied in a manner that renders §230(c)(2) superfluous.
  • Amend the statute to specify that §230(c)(1) has no application to any provider's decision to restrict access to content or terminate user accounts.
  • Provide clearer guidance on what content would be deemed “objectionable content” within §230(c)(2) and when removals are done in “good faith” (including proposing that filtering decisions taken contrary to terms of service or without adequate notice or process should fall outside the CDA).
  • Modify the language that defines under what circumstances a provider becomes an “information content provider” as per 47 U.S.C. § 230(f)(3) (“responsible, in whole or in part, for the creation or development of information”), and clarify when a provider's content moderation practices take it outside of the protections of the CDA. The proposal requests that such providers should lose CDA protection when, for example, they make editorial decisions that modify or alter content, “including but not limited to substantively contributing to, commenting upon, editorializing about, or presenting with a discernible viewpoint content provided by another information content provider.”
  • Mandate disclosure for internet transparency similar to that required of other internet companies, such as ISPs.

The content in this post was found at:
https://newmedialaw.proskauer.com/2020/07/30/commerce-dept-petitions-fcc-to-issue-rules-clarifying-cda-section-230/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+NewMediaAndTechnologyLaw+%28New+Media+and+Technology+Law%29
