Privacy

Adopting cloud computing can mean entrusting data to a third-party vendor. For agencies responsible for personally identifiable information or mission-critical applications, this raises a host of privacy concerns, chief among them data sovereignty and the question of what constitutes appropriate government and commercial use of private citizens’ data. This section of the SafeGov.org site analyzes the privacy risks associated with cloud adoption and explores ongoing efforts to mitigate them.

Our Day in Court?

by Bryan Cunningham, Cunningham Levy LLP
Friday, September 12, 2014

Something unusual happened in an Oakland federal court this summer. The U.S. Government, concerned that classified national security information had been disclosed in a courtroom crowded with reporters and spectators, asked the court to modify the public record as though the words had never been said at all; the government later decided no classified information had been disclosed, and the issue became moot. In a separate federal case, a private company asked another federal judge to remove from the public record possible trade secret information that the company’s witness had publicly disclosed, even though a verbatim transcript of the statements had already been published on the court’s website. That judge forcefully rejected the company’s attempt to make the information disappear. The parties in these two cases would likely differ over whether the judges adequately protected their interests. But both cases involved alleged intrusive mass surveillance of our private communications, the first by the National Security Agency (NSA) and the second by Google.

9/11 Vs. Snowden: My Students' Surprising Debate About Privacy And Government

Michael Murphy, Forbes, Thursday, September 11, 2014

“If it means preventing another 9/11,” one student said, “I’m willing to give up some of my privacy.” A long, thoughtful debate continued about the personal need to encrypt versus the larger question of the right to privacy. Whether it was a typical multi-perspective conversation for a journalism class, I don’t know, but the less-privacy-more-security side seemed to be in the majority.

Mobile Apps and Privacy for Federal Users: Drawing the Line on “App-Appropriate”

by Julie Anderson, Civitas Group
Friday, August 15, 2014

The federal mobile device landscape is evolving faster than ever before. Budget realities have accelerated the adoption of federal telework initiatives and lowered agencies’ reluctance toward bring-your-own-device (BYOD) policies – thanks to promising cost savings coupled with growing demand from employees. As a result, agencies today face the daunting task of overseeing a wider assortment of devices, operating systems, and applications – all of which require heightened security and privacy considerations. Within this realm, mobile apps are a promising contribution toward improving productivity, efficiency, and customer service in the federal workforce. Some agencies have already begun rolling out or approving mobile app tools for their employees to use for job-related functions. Other agencies are leveraging public-facing apps, such as emergency alerts or newsfeeds, to engage constituents. But as agencies approve the use of apps hosted on common commercial operating systems – such as Google’s Android or Apple’s iOS – the way these larger consumer-focused companies handle and use application data should be of increasing concern. The federal shift to BYOD and mobile apps must not come at the expense of privacy.

How Should the Law Handle Privacy and Data Security Harms? (Part Four)

by Daniel Solove, TeachPrivacy
Tuesday, July 22, 2014

In this post, I will discuss how the law should handle privacy and security harms. One potential solution is for the law to have statutory damages – a set minimum amount of damages for privacy/security violations. A few privacy statutes have them, such as the Electronic Communications Privacy Act (ECPA). The nice thing about statutory damage provisions is that they obviate the need to prove harm. Victims can often prove additional harm above the fixed amount, but if they can’t, they can still get the fixed amount.

Data Brokers, Cloud Providers, and Responsible Use

by Paul Rosenzweig, The Chertoff Group
Wednesday, July 16, 2014

Data brokers may soon become the pariahs of cyberspace if they don’t adopt principles of “responsible use.” And, if cloud service providers don’t watch out, they risk becoming tarred with the same brush.

Do Privacy Violations and Data Breaches Cause Harm? (Part Three)

by Daniel Solove, TeachPrivacy
Tuesday, July 08, 2014

In this post, I want to explore two issues that frequently emerge in privacy and data security cases: (a) the future risk of harm; and (b) individual vs. social harm.

Why the Law Often Doesn’t Recognize Privacy and Data Security Harms (Part Two)

by Daniel Solove, TeachPrivacy
Tuesday, July 01, 2014

In my previous post, I explained how the law is struggling to deal with privacy and data security harms. In this post, I will explore why. One of the challenges with data harms is that they are often created by the aggregation of many dispersed actors over a long period of time.

Privacy and Data Security Violations: What’s the Harm? (Part One)

by Daniel Solove, TeachPrivacy
Tuesday, June 24, 2014

Courts have struggled greatly with the issue of harms for data violations, and not much progress has been made. We desperately need a better understanding and approach to these harms. I am going to explore the issue and explain why it is so difficult. Both theoretical and practical considerations are intertwined here, and there is tremendous incoherence in the law as well as fogginess in thinking about the issue of data harms. I have a lot to say here and will tackle the issue in a series of posts. In this post, I will focus on how courts currently approach privacy/security harm.

Google’s Admission to Data Mining of Student and Government Emails Demands Further Scrutiny

by Jeff Gould, SafeGov.org
Thursday, May 15, 2014

In a surprise announcement on April 30, 2014, Google stated on its company blog that it would no longer “collect or use student data in Apps for Education services for advertising purposes.” Google also noted that it would make similar changes to its Google Apps for Government products. The announcement suggests that Google has been scanning, storing, and monetizing student, business, and government emails for years, which raises concerns about the company’s past privacy practices and its future policies. This is a significant violation of the trust placed in the company by the schools and government agencies that signed contracts with the assurance that there would be “no ad-related scanning or processing” in Google Apps – language that Google once posted on its website.

Trust But Verify Big Datamining Claims

by Bryan Cunningham, Cunningham Levy LLP
Thursday, May 08, 2014

Much has been written in recent years about the benefits and risks of “free” cloud services monetized by providers mining the private data of users. These risks are particularly acute in some government cases, e.g., education applications mining the data of students, and applications used by law enforcement and national security agencies. I, along with others, have recommended that government entities include clauses in contracts with cloud providers prohibiting data mining. Some governmental contracting authorities have embraced this remedy.