The Federal Trade Commission Wants In On Big Data Regulation

The FTC has a tendency to see a role anywhere it wants to see a role.

As described on its website, the FTC has a very broad mandate. Among other things, the FTC has the authority to investigate and prosecute cases of “unfair or deceptive acts or practices in or affecting commerce.”  As a practical matter, “unfair or deceptive acts or practices” means whatever the FTC says it means.

Apparently the FTC is now looking for “unfair or deceptive acts or practices” in the Big Data world.  

The FTC gave clues about where it is looking at its recent event – “Big Data: A Tool for Inclusion or Exclusion?”

The FTC Chairwoman explained the purpose of the event as follows:

“A growing number of companies are increasingly using big data analytics techniques to categorize consumers and make predictions about their behavior,” said FTC Chairwoman Edith Ramirez. “As part of the FTC’s ongoing work to shed light on the full scope of big data practices, our workshop will examine the potentially positive and negative effects of big data on low income and underserved populations.”

It’s not clear what “underserved” means in this context, but certainly Big Data can be used to discriminate (unfairly) in a number of different ways. Examples are discussed here and here. On the other hand, as suggested in the quote above (and as discussed here), Big Data might be useful in combating discrimination.

These are real concerns, and it’s great that the FTC wants to keep up with changing technology. Unfortunately, if this event is any indication, the FTC has a lot of catching up to do.

Consider this from the FTC’s event notice:

The FTC has found that, in some cases, companies are targeting ads based on racial or other assumptions, said Latanya Sweeney, the agency’s CTO. At a website for members of Omega Psi Phi, an African-American fraternity, the agency found ads for defense lawyers and for users to check their own criminal backgrounds, she said. The site also had a large number of ads for poorly rated credit cards, she said.

This is not a particularly current issue. Targeted advertising has been talked about for a long time – certainly long before we started talking about Big Data. 

That said, it is true that the FTC has a role to play in privacy law and in some data-related matters. To the extent that role extends to Big Data when the data includes personal information, the FTC has a role to play in Big Data.

However – the FTC is not very good at recognizing the limits of its jurisdiction. Consider data security breach issues.

FTC Chairwoman Edith Ramirez stated in testimony before Congress:

Under current laws, the FTC only has the authority to seek civil penalties for data security violations involving companies that fail to protect children’s information provided online in violation of the COPPA Rule or credit report information in violation of the FCRA. The Commission also recommends data security legislation that would provide the agency with jurisdiction over non-profits, which have been the source of a substantial number of breaches.

In light of this statement, it’s difficult to understand how the FTC decided it has the authority to prosecute enforcement actions with respect to data breaches.

(Note also that the FTC’s self-defined jurisdiction in this regard, as well as the manner in which the FTC presumes to exercise that jurisdiction, has been challenged in currently active litigation, although not successfully yet.)

Obviously, it’s good that the FTC wants to increase its understanding of relevant issues. But a government agency should not go looking for a role in the latest cool thing just because it wants one.

Posted in Big Data, Federal Trade Commission, Regulation

Crossing Borders with Big Data in the Cloud – and the Law

An article in Gigaom lists some of the geographical issues you might want to consider when choosing a Cloud service provider. The focus of the article is on technical issues – latency and redundancy – and how locating data centers in different countries might affect the significance of those issues.

What about data location and the law?

Although it’s not really called one, there is a reference in the article to one legal issue. Specifically, the article states:

Data protection: Different types of data have different locality requirements, e.g. requiring personal data to remain within the EU.

That is true.  Bringing your data across a border might be a privacy law problem. 

But there are others of course.  Export control regulation is a particularly important example.  (We touched on this in a post here at Big Data and the Law when we talked about the guy who made guns by printing them.)

Export Control – the Basics

The essential thing is that, under U.S. law, some data can’t be exported from the U.S. without the permission of the U.S. government. U.S. law also provides that some data can’t be moved outside the U.S. at all, and pretty much nothing can be moved to certain countries.

Certainly not all data gives you an export control problem, and not all export destinations give you an export control problem.  You’re probably OK moving your data to Canada.  (That’s probably – not definitely.  See disclaimer of legal advice.)  Don’t plan to send anything to North Korea though.

The application of U.S. export control is complicated and, in some cases, a little counter-intuitive. For example, sometimes you can have a U.S. export control problem with data you bring into the country and then move out again. That’s right – you can receive data from someone outside the U.S. and not be permitted to send the same data back to the person you got it from. This has been an issue with encryption technology.

While we’re at it, you should know that under export control law location isn’t just a question of geography.  Location also means a place where certain people have access to your data.  Even if your data is located in a nice secure facility in the United States, you have a potential problem if people who are not U.S. citizens can access your data in that nice secure facility in the United States.

Export Control and the Cloud

There are two basic facts you need to know to determine whether, in your case, moving your data to the Cloud is a problem under U.S. export control regulations: what is the data that you’re putting in the Cloud, and where is the data going to be stored?

Easier said than known.

Your first problem is – how do you know where your data is going? Unless you get a commitment from the service provider to keep your data in the countries you designate, you don’t know. Vendors have their reasons for choosing where they locate their data centers, and those reasons might not be consistent with your expectations or assumptions.

You also have the problem of change. 

For starters, in all likelihood the substance of your data will change.  Today’s data might not present you with an export control problem but tomorrow’s data might be a different matter entirely. Yet another reason for good data management practices.

Your data might get moved – or replicated – to systems in other data centers, whether for disaster recovery, archiving or other reasons. Those new data centers might be in places the law doesn’t want your data to be – and don’t forget the thing about who has access to your data.
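The two questions above – what is the data, and where will it go – can be sketched as a simple pre-transfer check. Everything below is a hypothetical illustration: the classifications and country lists are made up, and a real export determination requires counsel, not a lookup table.

```python
# Hypothetical pre-transfer export check. The classifications and
# country sets below are illustrative only -- real determinations
# depend on the actual regulations and change over time.
EMBARGOED = {"North Korea"}                               # assumed: no data at all
RESTRICTED_CLASSES = {"encryption", "defense-technical"}  # assumed: license needed

def export_check(data_class: str, destination: str) -> str:
    """Return 'blocked', 'license-required', or 'ok' for a planned transfer."""
    if destination in EMBARGOED:
        return "blocked"
    if data_class in RESTRICTED_CLASSES:
        return "license-required"
    return "ok"

# A cloud move implies one check per destination data center -- and,
# per the access rule discussed above, per person with access.
print(export_check("marketing", "Canada"))    # ok
print(export_check("encryption", "Germany"))  # license-required
```

The point of the sketch is structural: the check has to run every time the data or its destination changes, which is exactly what makes the Cloud hard.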

Export Control – the Risks

A number of different things can happen to you if you violate U.S. export control law. You can be fined. You can be barred from doing business with the U.S. government. And you can be sent to prison. Here’s an example of a criminal prosecution from the U.S. Department of Justice March 2014 Summary of Major U.S. Export Enforcement, Economic Espionage, Trade Secret and Embargo-Related Criminal Cases:

Hwa obtained contracts to supply circuit boards to the U.S. Navy, by falsely claiming the boards would be manufactured in the United States. Instead, Hwa illegally sent restricted information to a company in Taiwan for the boards to be manufactured there.

That’s data. 

Export Control in Other Countries

Finally, don’t assume that the United States is the only country with export control law. Among others, the Federal Republic of Germany has one. You can find some information about it here in the 2013 Brief Outline on Export Controls prepared by the Bundesamt für Wirtschaft und Ausfuhrkontrolle. Note this snippet from that document:

The provision of software and technology in the companies’ intranets or in the internet is also subject to licensing if the access to software and technology is possible from third states. Please note that a licensing requirement does not presuppose that the access took place or not.

Very much like U.S. export control law.

You can find links to the export control laws of other countries here at the U.S. Department of State website. 

Posted in Big Data, Cloud, Export Control

Follow-up on Data Collection and Personal Responsibility – More about the Google Glass Debate

As we discussed in our last post, people are collecting lots and lots of the personal information that ends up being publicly disclosed.  A couple of interesting articles have been published since then that shed further light on the problem.

In this article in Business Insider, we see more evidence that Google Glass is more than just a potential annoyance; it’s a potential vehicle for significant personal data disclosure. The article notes some of the things that can be recorded by Google Glass that are scarier than your behavior at that party the other night. Standing behind you at an ATM, for example, gives the Google Glass wearer a chance to record your PIN.

The key point of the article, however, is less the kinds of data that can be collected with Google Glass than the fact that Google Glass can be hacked – with the potential result that data collected with no bad intent can be accessed by people who do have bad intent.  So merely collecting data with Google Glass is a matter of personal responsibility no matter the circumstances.  And yes, that is true now with respect to any connected device.  That doesn’t let the Google Glass user off the hook; rather it should put the Google Glass user on notice of the risks.

One more thing I haven’t seen in the Google Glass debate –  if you believe that the NSA is collecting all data that you send through any network, and if you don’t like that, think about what happens to all those images you’re collecting.  Think about NSA facial recognition programs.

Maybe you’re collecting data for the Man.

On the other side of the issue, an article in Mashable discusses the possibility of jamming Google Glass network access:

The technology, called Glassholes.sh, detects Glass devices on a Wi-Fi network by their media access control (MAC) addresses and blocks their access. The program, which runs on Raspberry Pi and BeagleBone mini computers, can also “emit a beep to signal the Glass-wearer’s presence to anyone nearby,” according to Wired. The program works via a USB network antenna. It uses Aircrack-NG to impersonate the network and send a deauthorization command, which blocks Glass’s Wi-Fi connection.
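The detection step described above comes down to matching the first three bytes of a device’s MAC address – the vendor-assigned OUI. A minimal sketch of just that matching step (the OUI below is a placeholder, not a verified Google assignment, and the deauthentication step is omitted):

```python
# OUI matching as used by tools like Glassholes.sh: the first three
# bytes of a MAC address identify the hardware vendor.
# "F8:8F:CA" is a placeholder here, not a verified Google OUI.
FLAGGED_OUIS = {"F8:8F:CA"}

def is_flagged_device(mac: str) -> bool:
    """True if the MAC's vendor prefix matches a flagged OUI."""
    oui = mac.upper().replace("-", ":")[:8]
    return oui in FLAGGED_OUIS

print(is_flagged_device("f8:8f:ca:24:2a:51"))  # True
print(is_flagged_device("00:11:22:33:44:55"))  # False
```

Note the limitation this implies: the detection only works on devices whose vendor prefix is known, and a randomized MAC defeats it.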

One cultural note about the Mashable article.  This is the first sentence:

A Berlin artist may have escalated the persecution of Google Glass enthusiasts with a program that lets you jam their Wi-Fi reception

Persecution?

An article on TweakTown also deals with the Google Glass jamming issue.

The TweakTown article will point you to Julian Oliver of Stop the Cyborgs.  Whatever your perspective on the Google Glass debate and the general discussion of privacy and data collection, Stop the Cyborgs is a place to find thoughtful and informative reading.  If you are at all interested in the subject, you should go there.

Considering these issues in the abstract is easier than considering them in the context of real events.  Here’s an example: 

A summary of the facts as reported in this article in the StarTribune:

    • Two paramedics examined a man outside his apartment building. We’ll call him the “Patient.”
    • Andrew J. Henderson happened to be there and happened to have a camera with him.
    • Mr. Henderson filmed the paramedics as they went about their work.
    • At some point, the paramedics asked the Patient about his medical history.
    • One of the paramedics asked Mr. Henderson to stop filming.
    • The paramedic says he was concerned for the Patient’s privacy.
    • Deputy Sheriff Jacqueline Muellner was also present at the scene.
    • Deputy Muellner took Mr. Henderson’s camera.
    • The camera was returned three weeks later.
    • Mr. Henderson says his film of the incident was gone when the camera was returned.

So – what’s right in this case?

    • What’s more important – privacy or freedom to film?
    • Does it make a difference that personal medical information was involved?
    • Does it matter that, in this situation, the Patient wasn’t in a public space voluntarily?
    • Should Mr. Henderson have asked for permission to film?
    • If so, who should he have asked?
    • Would the answer to any of these questions be different depending on whether the film was posted online?

Perhaps most importantly, who gets to decide what’s right?

Posted in Big Data, Confidentiality, Data Collection, Ethics, NSA

Data Collection, Personal Responsibility and the Law

In a recent article in the Huffington Post, John Whitehead asks this question:

What would happen if the most powerful technology company in the world and the largest clandestine spying agency in the world joined forces?

Similarly, a recent article in Politico asks – Who watches the watchers?

It’s perfectly reasonable to be concerned about the perceived Google – NSA axis of evil. However, our discussions about privacy seem to overlook the role of the individual.

Mr. Whitehead notes that: 

 … we’re helping Corporate America build a dossier for its government counterparts on who we know, what we think, how we spend our money, and how we spend our time.

In saying so, he’s really referring to the information we volunteer about ourselves.

What about information that we collect and disclose – information about other people? We’re collecting and disclosing a lot of that – in a lot of ways. You do it when you post a picture of your friends. You do it when you post a video that you take in public. Intentional or not, in many such cases we’re sharing personal information about those other people. (Remember, we don’t say personally identifiable information here at Big Data and the Law.)

Geo-location metadata is one way we might disclose people’s personal information – specifically, where people are when their images are captured in a photo or video. This is how John McAfee (of the security software company McAfee) was found when he was a fugitive a few years ago. He let his picture be taken and posted online – with metadata that gave away his whereabouts. You would think he would know better.

Apparently he does now. McAfee (the company) later blogged about the risks of geo-location metadata, saying this:

For instance, you may not mind sharing your exact location with friends and family, but what about with people you don’t know? When your location is broadcast on social networks such as Facebook and Twitter, you lose control of the information. Anyone can see it. Say you check into a hotel while on vacation. A thief could see your check-in, do an online search for your home address and rob you while you’re away.

Or you could get arrested.

Face recognition technology provides the means to connect names and faces, and possibly places and times as well. It seems that the NSA is up on this technology. But then so is Facebook.

Now consider the many ways people are collecting images – not just with our mobile devices and cameras, but with Google Glass, for example.

In an article in The New York Times, Dr. Joseph Atick gets right at the consequences of individuals collecting people’s pictures and the use of face recognition technology:

Dr. Atick sees convenience in these kinds of uses as well. But he provides a cautionary counterexample to make his case. Just a few months back, he heard about NameTag, an app that, according to its news release, was available in an early form to people trying out Google Glass. Users had only to glance at a stranger and NameTag would instantly return a match complete with that stranger’s name, occupation and public Facebook profile information. “We are basically allowing our fellow citizens to surveil us,” Dr. Atick told me on the trade-show floor.

On the other hand, there is at least one Google Glass app that you might conclude has an intended use that, on balance, justifies the potential privacy loss.  You can get it here.

You can download the source code for face recognition software from CNET, by the way. A face recognition app for Google Glass is also under development, although it is reportedly unauthorized.

People are using their own personal drones now as well. (Drone lawyers are now a thing by the way – one lawyer has a Drone Law Journal.)

With this in mind, it’s important to know what information we collect and disclose. Not everyone knows if or when their phones or cameras collect geo-location metadata. Not everyone knows that geo-location metadata sometimes travels with their pictures when they are posted or otherwise shared.
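How precise is that metadata? EXIF stores GPS coordinates as degree/minute/second values plus a hemisphere reference. A short sketch of the conversion to the decimal degrees a mapping service expects – which is all it takes to turn a shared photo into a street address:

```python
# EXIF GPS tags store latitude/longitude as degree, minute and second
# rationals, plus "N"/"S" or "E"/"W" reference tags. Converting them
# to decimal degrees pins the photo to a point on a map.
def dms_to_decimal(degrees, minutes, seconds, ref):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# e.g. 37 deg 46' 30.0" N  ->  approximately 37.775
print(dms_to_decimal(37, 46, 30.0, "N"))
```

That is roughly meter-level precision from a single photo, with no hacking required.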

Some people don’t want their image posted on Facebook or YouTube. Do we get consent from everyone before we do that?

In the end, the problem with relying on personal responsibility is that there are so many different perspectives about what is right – as we have discussed here in the past.

So that’s the personal responsibility part. So what about the law?

When it comes to people collecting and disclosing personal information, the law is pretty spotty. For the most part, the law regarding privacy is focused upon businesses and their collection and disclosure (and use) practices. One reason for this is jurisdiction. For example, the Federal Trade Commission is the principal regulator of data privacy issues in the U.S. Federal Government. The FTC’s jurisdiction is very broad in some ways, but its jurisdiction is limited to business practices so the FTC can’t take on issues with respect to individuals.

So the law in this regard has developed on an issue-by-issue basis. Online bullying has received attention from lawmakers, as have revenge porn and some other things. But there is no broad – hey, you can’t do that – kind of regulation of individual conduct.

People can be as angry as they want about corporate collection, use and disclosure – but this is where Mr. Whitehead is correct – that information comes from somewhere.

There is no reason why individual collection, use and disclosure of personal information can’t be regulated in some way. Intellectual property law applies to individual conduct, and intellectual property rights certainly get enforced.

When people, legislatures and regulators figure that out, there could be some movement on this issue. But it’s a reasonable assumption that it’s not going to happen soon. That kind of thing gets dealt with on an issue-by-issue basis, because it typically takes a lot of issue-specific public outrage for anything to get done. So we’ll probably be waiting forever for the behavior of individual people to become subject to anything close to the scrutiny applied to the Googles of this world.

In the meantime, the next time you see someone wearing Google Glass – ask that person to disclose their privacy policy.

Posted in Big Data, Data Collection, Internet of Things, NSA, Privacy

The Federal Trade Commission Wants Transparency in Data Practices – But FTC Commissioners Are Not Transparent about Their Activities

It appears that we’ve got a problem with FTC Commissioners having undisclosed meetings with business representatives and business groups. At a time when the FTC is seeking to expand its authority over data and privacy matters, this is particularly troubling.

On its website, the Electronic Privacy Information Center describes one specific instance of the problem:

FTC Commissioner Wright Meets with Industry Lobbyists, Not Consumer Representatives: Through a Freedom of Information Act request, EPIC obtained the appointment calendar of FTC Commissioner Wright. The Commissioner’s calendar reveals many meetings with corporate presentatives [sic] but no meetings with public interest organizations representing consumers. One of FTC’s primary missions is to protect consumers from unfair and deceptive business practices. Commissioner Wright became an FTC Commissioner in January 2013. Since then he has met with representatives from Apple, Microsoft, Verizon, Qualcomm, the Network Advertising Initiative, and the Consumer Data Industry Association. He has attended industry conferences and given talks at trade association meetings.

Commissioner Wright isn’t the only Commissioner who has not been forthcoming about meetings with corporate interests. Big Data and the Law has stumbled across another one.

It appears that Commissioner Julie Brill has had at least two meetings with industry representatives that do not appear on the published list of her “Speeches, Articles, and Statements” found on the FTC website. 

Commissioner Brill was scheduled to take part in presentations to the Centre for Information Policy Leadership Annual Membership Retreat on June 12-13, 2012 and June 13, 2013.  If you believe in transparency in government, the Centre is a particularly troubling group.

To begin with, Centre membership is expensive enough to be pretty exclusive. Depending upon the level of membership, the Centre charges either $50,000 or $30,000 annually. Not very many companies have the budget for that, which is probably why there are only 36 Centre members, and why they are companies like American Express, Apple, Bank of America, Boeing, Facebook, Google, MasterCard, Oracle, Verizon, Visa, Wal-Mart and Yahoo.

The Centre also seems to think that transparency is a bad thing.

In 2012 the National Telecommunications and Information Administration proposed a “Multistakeholder Process to Develop Consumer Data Privacy Codes of Conduct.” In a letter to the NTIA about that process, the previous President of the Centre had this to say:

When applied to the actual development of enforceable best practices, however, the Centre believes an effective process requires that the business community serve as the primary drafter of codes and be allowed a closed forum in which to probe sensitive questions and test the workability of proposed approaches.

The Centre is concerned that the Administration’s multi-stakeholder process…  While any process to create industry codes or guidance should be open and inclusive, it must also include ample opportunity for testing ideas, candidly airing points of concern and disagreement, and discussing matters that may involve proprietary information or controversial data practices.  Such robust debate would be discouraged by the presence of media or the possibility of a written transcript of discussions.  This is particularly important when best practices will likely have a direct effect on the internal processes of companies, and will be enforceable against companies that voluntarily adopt them.

This does not mean that policymakers, experts, advocates and the public have no role in the development of best practices.  The Centre urges their full engagement.  Once the initial draft of best practices is developed by industry….

The Centre’s position, then, is that industry should create the Codes of Conduct in secret, because otherwise industry might have to publicly disclose its “controversial data practices.” Policymakers (presumably regulators and legislators) can say what they think after the secret drafting is done.

If that’s the Centre’s view of things, how is it appropriate for Commissioner Brill to meet with the Centre’s members in private?  At a minimum it looks bad.  It looks worse when you see the Centre’s description of the 2013 presentation in which Commissioner Brill is listed as a participant:

Legitimacy, Fairness and Big Data

In the big data context, how should companies reconcile requirements of legitimate processing in data protection regimes with fair processing concerns in the U.S.? Have legitimate process requirements in places like Europe and fair processing concerns in the U.S. begun to coalesce over time? How should regulators approach enforcement in this context?

Let’s summarize. 

We have an industry group with a very exclusive membership. In the recent past, the principal executive of that group has expressed hostility to transparency in the development of enforceable codes of business practices – codes that presumably would be applicable to its members. We have a regulator meeting in private with that group, notwithstanding that the group’s members are subject to the authority of that regulator’s agency. And it appears that in at least one such meeting the topic was how such regulators should “approach enforcement” within their jurisdiction.

One more thing – providing a detailed and lengthy list of appearances and presentations gives the impression that there aren’t any other appearances or presentations to list. EPIC had to resort to a Freedom of Information Act request to get full disclosure of Commissioner Wright’s meetings with industry. Big Data and the Law learned about Commissioner Brill’s meetings with the Centre by happenstance. We have no idea whether there are any other such undisclosed meetings.

This all seems inconsistent with Commissioner Brill’s stated views on enforcement and transparency, which we here at Big Data and the Law have no reason to doubt.  

However, at a time when the FTC wants more authority over data privacy and security matters, it is more important than ever that we know about the relationships between the FTC and those subject to regulation by the FTC.  It’s clear we haven’t been told everything we should know.

Posted in Big Data, Federal Trade Commission, Policy, Privacy, Regulation

Paper is the New Black

It’s Friday, it’s April and it’s snowing – this is no time for serious stuff.  We’ll do the Big Data thing next week.  Today let’s talk more about paper instead.

As we said earlier this week, paper has many advantages not found in the digital world – security being one. Now we find that it’s possible to sync paper to the cloud. All you need is the Mod. It’s an old school notebook that lets you think analog and store digital.

If you need instruction in pencil sharpening, go here and order How To Sharpen Pencils: A Practical & Theoretical Treatise on the Artisanal Craft of Pencil Sharpening for Writers, Artists, Contractors, Flange Turners, Anglesmiths, & Civil Servants.

Yes Fast Company, paper is awesome.

Posted in Big Data, Friday Fun, Fun Facts, Old School

For the Ultimate in Bitcoin Security – Use Paper

You can’t make up this stuff.

From an Ars Technica article:

Carlson continues to store bitcoins both on hosted platforms like Coinbase and Blockchain.info. He also keeps bitcoins on his own hardware and uses paper wallets stored in a bank vault as a sort of low-tech backup. Bitcoin paper wallets contain a wallet’s Bitcoin address and private key, and, if secured properly, it can be one of the safest ways to store bitcoins.

From Blockchain.info – self-described as “Bitcoin’s most popular bitcoin wallet and block explorer.”

Paper Wallet Advantages

  • Protection from malware and keyloggers.
  • Maintain 100% ownership of your private keys. You own the coins not a 3rd party service.
  • No dependence on the security of any website.
  • Keeping a piece of paper safe is easier than keeping your computer secure.

That’s right – keeping a piece of paper safe is easier than keeping your computer secure.
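For the curious, what typically gets printed on a paper wallet is the private key in Wallet Import Format: base58check encoding of a version byte, the 32-byte key, and a 4-byte double-SHA256 checksum. A minimal sketch of the encoding (educational only – never roll your own wallet code for real funds):

```python
import hashlib

B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check(payload: bytes) -> str:
    """Base58check-encode: append a 4-byte double-SHA256 checksum, then base58."""
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    data = payload + checksum
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58[r] + out
    # leading zero bytes are encoded as leading '1' characters
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

# Mainnet uncompressed WIF: version byte 0x80 + 32-byte private key.
# Uncompressed mainnet WIFs come out 51 characters long, starting with '5'.
wif = base58check(b"\x80" + (1).to_bytes(32, "big"))
print(wif)
```

The checksum is what makes paper practical: a typo when re-entering the key from paper is detected rather than silently producing the wrong wallet.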

Here’s another idea:

Want to communicate with no risk of interception by the NSA?  Use the mail – the one with stamps and envelopes. The only code you’ll need is a zip code.  Plus – it’s open source technology.  Get some stationery and ask someone over 50 how it works.

Neither snow nor rain nor heat nor gloom of night can stop it!

This is fun.

Posted in Big Data, Bitcoin, Data Security, Fun Facts, Privacy, Technology

Data Governance and the Law – Facebook’s Acquisition of WhatsApp Might Bring Some Needed Clarity

By now it should be clear to everyone that when you have a privacy policy you are expected to abide by the terms of that policy. So, for example, if your privacy policy says you will not disclose any personal information that you collect, you should not disclose any personal information that you collect.

But what if you change your privacy policy?  What if your new privacy policy says that you have the right to disclose personal information that you collect after, let’s say, December 31, 2014?  Presumably that means that you can disclose any personal information that is collected after December 31, 2014.  But that should not give you the right to disclose personal information that you collected before January 1, 2015. 

This raises a data governance problem. How do you separate (and keep separate) two bodies of information that are collected at two different times under two different rules?
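One way to keep them separate, sketched below: stamp each record at collection time with the privacy policy version then in force, and make disclosure logic filter on that stamp. The version labels, cutoff date, and record layout here are all hypothetical.

```python
from datetime import date

# Hypothetical: policy v2 (which permits disclosure) takes effect
# 2015-01-01; anything collected earlier stays under v1 rules.
POLICY_V2_EFFECTIVE = date(2015, 1, 1)

def policy_version(collected_on: date) -> str:
    """Policy version in force on the collection date."""
    return "v2" if collected_on >= POLICY_V2_EFFECTIVE else "v1"

def disclosable(records):
    """Only records stamped with the newer policy may be disclosed."""
    return [r for r in records if r["policy"] == "v2"]

records = [
    {"user": "a", "collected": date(2014, 6, 1)},
    {"user": "b", "collected": date(2015, 2, 1)},
]
for r in records:
    r["policy"] = policy_version(r["collected"])

print([r["user"] for r in disclosable(records)])  # ['b']
```

The stamp has to be applied at collection time and preserved through every downstream copy – which is exactly where governance practices tend to break down.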

Here at Big Data and the Law we assume this happens more frequently than we hear about. Perhaps in some cases the changes in privacy policy aren’t significant and don’t require a change in information practices. In other cases, the collecting parties might have data governance practices that can handle any problems that result from privacy policy changes. It’s likely, however, that in some cases the issue is just ignored and no one notices.

In this case — people noticed.  The Electronic Privacy Information Center and The Center for Digital Democracy filed a complaint with the Federal Trade Commission (FTC) in which they assert:

    • Facebook routinely incorporates data from companies it has acquired.
    • WhatsApp’s privacy policies and official blog posts reflect a strong commitment to user privacy.
    • WhatsApp’s messaging service regularly collects and stores virtually all available user data.
    • The Commission has previously found that a company may not repurpose user data for a use other than the one for which the user’s data was collected without first obtaining the user’s “express affirmative consent.”
    • By failing to make special provisions to protect user data in the event of an acquisition, WhatsApp “unreasonably creates or takes advantage of an obstacle to the free exercise of consumer decisionmaking.”
    • Specifically, WhatsApp users could not reasonably have anticipated that by selecting a pro-privacy messaging service, they would subject their data to Facebook’s data collection practices.
    • Therefore, WhatsApp’s inadequate disclosures constitute unfair acts or practices in violation of Section 5 of the FTC Act, 15 U.S.C. § 45(n).

We’re skipping around a bit here, but we invoke the blogger’s right to summarize and generalize for the sake of brevity.

The complaint makes these requests for FTC action:

EPIC urges the Commission to investigate WhatsApp, Inc., and enjoin its unfair and deceptive data collection practices for any future changes to its privacy policy.

Specifically, EPIC requests the Commission to:

a. Initiate an investigation of the proposed acquisition of WhatsApp by Facebook, specifically with regard to the ability of Facebook to access WhatsApp’s store of user mobile phone numbers and metadata;

b. Until the issues identified in this Complaint are adequately resolved, use the Commission’s authority to review mergers to halt Facebook’s proposed acquisition of WhatsApp;

c. In the event that the acquisition proceeds, order Facebook to insulate WhatsApp users’ information from access by Facebook’s data collection practices; and

d. Provide such other relief as the Commission finds necessary and appropriate.

Which brings us to the point – what might we learn from the FTC addressing the EPIC/CDD complaint?  We’re hoping the FTC answers these questions:

1. Can information collected under the terms of a privacy policy be used in a manner that is inconsistent with those terms?

2. Can the FTC intervene in a situation where there is only a possibility or risk of commingling information collected under two or more different rules or assumptions?

3. What remedies can the FTC impose if it finds that possibility or risk?

Justice Oliver Wendell Holmes, Jr. said, “Great cases, like hard cases, make bad law.”  We’re hopeful for good law to come out of this case, because the facts of the case are clear – even if the issues are not. 

The complaint can be found here: http://www.centerfordigitaldemocracy.org/epic-and-cdd-file-unfair-and-deceptive-practices-complaint-ftc-facebookwhatsapp-deal-whatsapp-users


Maybe Edward Snowden Works for Google, or Facebook, or Microsoft, or One of those Guys

Ok, probably not. 

But consider who benefits from all the attention the NSA thing is getting.  Answer: all those other organizations that collect information about you.  We discussed this previously here at Big Data and the Law.  The NSA gets some of its information from those folks – in fact the NSA has gotten information from all of these folks:

Microsoft – Yahoo – Google – Facebook – PalTalk – AOL – Skype – YouTube – Apple

Some of them have joined the protest against the NSA.  We here at Big Data and the Law would never question their motives.  But, for the time being, the focus on the NSA seems to have taken attention away from them.  So they are benefiting.

Let’s consider what we’re ignoring while there is such an all-consuming focus on the NSA.

Well, first of all, our friends in the private sector continue to collect lots of information about us.  Sometimes we agree to give it to them.

We don’t always want to volunteer our information.  But in some cases our choice is to agree to give up personal information or lose access to technology we want or need.  Sometimes we have to participate in social media (with the consequent need to disclose some of our personal information) or lose timely access to financial information that might affect our investments.

Sometimes our information is collected without any participation from us.  By these guys for example.

There is another way in which our information can be obtained without our disclosing it.  Information that we did not disclose can be discovered through the analysis of information that we did disclose.
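To make that concrete, here is a toy sketch of how an undisclosed attribute (say, an age bracket) might be inferred from attributes a user did disclose.  Everything here — the profile, the signals, and the weights — is invented purely for illustration; real inference systems use statistical models trained on large datasets, not hand-written tables.

```python
# Hypothetical illustration only: inferring an attribute a user never
# disclosed (an age bracket) from attributes the user did disclose.
# All signals and weights below are invented for this example.

DISCLOSED_PROFILE = {
    "favorite_artists": ["The Beatles", "Fleetwood Mac"],
    "device": "desktop",
}

# Invented correlation table: disclosed signal -> {age bracket: weight}
SIGNAL_WEIGHTS = {
    "The Beatles": {"55+": 2, "35-54": 1},
    "Fleetwood Mac": {"55+": 1, "35-54": 2},
    "desktop": {"55+": 1},
}

def infer_age_bracket(profile):
    """Score each age bracket by summing the weights of matching signals,
    then return the highest-scoring bracket (or None if nothing matched)."""
    scores = {}
    signals = list(profile.get("favorite_artists", [])) + [profile.get("device", "")]
    for signal in signals:
        for bracket, weight in SIGNAL_WEIGHTS.get(signal, {}).items():
            scores[bracket] = scores.get(bracket, 0) + weight
    return max(scores, key=scores.get) if scores else None

print(infer_age_bracket(DISCLOSED_PROFILE))  # prints "55+"
```

The point of the sketch is not the arithmetic — it is that the user never told anyone an age, yet a plausible age falls out of data the user volunteered for entirely different reasons.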

And what about the things that are being done with our information? 

No doubt we don’t really know what the NSA does with the information that it gathers.  But what about the uses of our information that we do know about and that should also concern us?

For example, what about using our on-line information in hiring decisions?

Questions about collection, and questions about use. 

What else are we ignoring?

Well, one big thing we seem to be ignoring is the exposure of our information to data breaches.  For example, the very concerning data breach at Facebook.

And what about public entities other than the NSA?  What about countries other than the United States?

For some reason we seem to have forgotten that the United States is not the only country that gathers personal information. 

Consider this from a former French foreign minister:

“The magnitude of the eavesdropping is what shocked us,” Bernard Kouchner said Tuesday in a radio interview. “Let’s be honest, we eavesdrop too. Everyone is listening to everyone else. But we don’t have the same means as the United States, which makes us jealous.”

We don’t mean to pick on France.  As the man said, “Everyone is listening to everyone else.”  Russia, for example, is expanding Internet surveillance even though many think that new surveillance is not legal under Russian law. 

This week we had “The Day We Fight Back”.  That’s fine. 

We here at Big Data and the Law don’t want to rain on anyone’s parade.  If you have a problem with what the NSA is doing, there is no reason why you shouldn’t make an issue of it. 

But why is it that “The Day We Fight Back” seemed to be focused only on the United States?    Wrong is wrong – yes?

Protest is easy.  It’s harder to reach agreement on some principles and to apply them generally – to both the private sector and the public sector – and to all governments.  Objective principles – not subjective principles.

One more thing.

Let’s look at ourselves a little.  It’s not only business and government that collect information without the consent of those from whom it is collected.  Sometimes it’s individuals acting on their own.

We haven’t heard anything about that yet.


Senate Continues Data Broker Investigation – Talk of Data Privacy and Security Legislation

Senate Investigation

On December 18, 2013, the US Senate Committee on Commerce, Science and Transportation held a hearing titled, “What Information Do Data Brokers Have on Consumers, and How Do They Use It?”   

Following up, Committee Chair Senator Rockefeller has now sent requests for information to six data brokers. 

As described in a press release from the Committee (referencing the December 18, 2013 hearing):

Rockefeller sent letters today to six companies, including two – NextMark, Inc. and MEDbase200 – that were highlighted in testimony presented at the hearing as data brokers that produce lists of consumers exhibiting certain financial and health characteristics, such as “Empty Wallets,” “African American Pay Day Loan Responders,” and “Dementia Sufferers”. Four other letters were issued to Acxiom, Epsilon, Experian, and Lexis Nexis – companies that were part of Rockefeller’s initial inquiry into data brokers that sell products focused on consumers’ financial circumstances.

This might sound familiar.  You might have seen the recent incident in which a man received some junk mail from OfficeMax that was addressed to “Mike Seay, Daughter Killed in Car Crash.”  Mike Seay’s daughter was in fact killed in a car crash.  (To make things worse, the letter was also addressed to “Or Current Business.”)

Hope for Privacy and Data Security Legislation?

There are rumors of momentum toward something getting done this year on data privacy and security legislation.

According to The Hill:

Several lawmakers in Congress are optimistic that a new law to protect consumers’ data from being stolen can be passed quickly, weeks after major hacks dominated the headlines.

So it appears that Congress is interested in data privacy issues.

On the other hand, in the very next sentence in the same article, The Hill notes:

The retail and banking industries have begun to face off over potential new legislation, with each worried that new provisions could unduly affect their businesses.

It’s always something.

Strangely though, The Hill brings hope in the form of Republican Congressman Joe Barton:

“It’s one of the few issues in the next 10 months that the House and the Senate can work with the president on,” he said. “I’ll go out on a limb here and predict that we’ll actually do that.”

Can those three work together on anything though?  Certainly it seems that data privacy hasn’t been one of those “few issues” they can get resolved.  For evidence, we note the failure to enact bills for:

Data Security and Breach Notification Act of 2011

Data Security and Breach Notification Act of 2012

Data Security and Breach Notification Act of 2013

A definite pattern.

But that was before the Target incident.  Maybe that’s enough to get things moving.  Already this year we have proposals for:

Personal Data Privacy and Security Act of 2014

Data Security Act of 2014

Data Security and Breach Notification Act of 2014

Commercial Privacy Bill of Rights

That’s just the Senate stuff, and that’s just as of this writing.

So maybe something can happen.  But then ….

The Problems are the Problem

As we know, there are a lot of privacy and data security problems to solve.  Data breach notification is a problem.  It’s a pretty simple problem though, as privacy and data security problems go.  No doubt that simplicity will make data breach notification a focus (probably the focus) of any successful privacy and data security legislation.

What about all the other data privacy problems?  Here at Big Data and the Law, we’re betting those problems are too hard for Congress to deal with.

Look at the 2011, 2012 and 2013 bills (below).  Note how simple (and similar) they are, and ask yourself why such simple legislation couldn’t get passed.  Then ask yourself whether anything more complex could possibly get passed.

Additional Information on the Senate Data Broker Investigation

You can see an archived webcast of the December 18, 2013 hearing.

This is the Majority Report presented at the hearing:

This is Senator Rockefeller’s letter to Acxiom:

In a post here at Big Data and the Law you can see an example of the scope of personal information that data brokers collect.  In this case, at Versium Analytics – a company with “…billions of records with billions of real life attributes on consumers and businesses.”

Background on the Privacy and Data Breach Legislation

2011, 2012 and 2013 bills

2014 Bills

