On Big Data becoming dangerously big

Technology companies are talking big about encryption and new ways to shield their networks and online customer data. This has the potential to start a cyber war with the NSA, as government and private enterprise get locked in a battle to prevent and gain access to data. All of this is being done in an effort to distance Facebook, Google, and others from the taint of cooperating with government surveillance and, maybe, to protect user data from the unauthorized searching of company networks (but what is “unauthorized”? That’s another question).

These new “super codes” are encrypted so well that developers say they will not be deciphered until 2030. By then, Big Data will probably be called Godzilla Data, encompassing yottabytes of information, or whatever new term we will have to invent to describe our data situation.

Doing ourselves a disservice

The conversations around this topic are growing almost as big as the data itself, but has anyone stopped to think that we might be doing ourselves a Big, Godzilla-sized disservice?

Cue up Gene Rayburn when I say this: Big Data is so big…How big is it?…Big Data is so big, no one will find what they are looking for! This is not so much a joke as a warning that we could be on a very real and dangerous path. Regardless of your politics and opinions, as it states on the NSA website, the agency “exists to protect the Nation.” The government isn’t playing a round of Match Game; there are huge stakes in trying to protect good people from bad people in our digital world.

But I can’t help but think that the more information they collect, the harder it is to find what they are looking for. Is this a data rabbit hole that no one wants to enter?

We have become more fascinated with the size of the data than with ensuring that we are collecting and using the right information. So while Google, Facebook, and Yahoo are making a stand against government overreach, they are also doing the NSA a favor. With the yottabytes of data already in front of the NSA, is it really that big of a deal that they don’t have access to absolutely everything? By all means, I want them to do their job to the best of their ability, but the need to sift through amounts of data the average person cannot begin to comprehend suggests that collecting more information is not the answer.

Imagine a Where’s Waldo book. Waldo is relatively easy to find when the book is of average size. Now imagine that book at the scale of a yottabyte. Suddenly Waldo gets lost in a world of other data points, making the job of locating him much harder.

Size doesn’t matter

The needle-in-a-haystack Big Data problem is only exacerbated when more and more hay is thrown on the pile. It is undeniable that there are far fewer people on the planet trying to do us harm than there are innocent people. With the gargantuan amounts of data being collected, are the statistical anomalies, the things that should stand out and be noticed, being identified, or are they getting lost in the crowd? It’s not about having all the data that is available so much as it is about finding the right data…and quickly.
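To put rough numbers on that intuition, here is a back-of-the-envelope sketch of the base-rate problem. Every figure in it is invented for illustration: suppose one person in a million is a genuine threat, and a screening system catches 99% of real threats while wrongly flagging 1% of innocent people. Even with those generous assumptions, the false alarms bury the true hits, and the pile of false leads grows with every record added.

    # Base-rate arithmetic for a hypothetical screening system.
    # All numbers are invented assumptions, not real statistics.

    threat_rate = 1 / 1_000_000   # assumed: one genuine threat per million people
    true_positive_rate = 0.99     # assumed: 99% of real threats get flagged
    false_positive_rate = 0.01    # assumed: 1% of innocent people get flagged

    for population in (1_000_000, 100_000_000, 1_000_000_000):
        threats = population * threat_rate
        true_hits = threats * true_positive_rate
        false_alarms = (population - threats) * false_positive_rate
        precision = true_hits / (true_hits + false_alarms)
        print(f"population {population:>13,}: "
              f"{true_hits:,.0f} real hits vs {false_alarms:,.0f} false alarms "
              f"({precision:.4%} of flags are real)")

Under these made-up numbers, roughly ten thousand innocent records are flagged for every real one, no matter how big the haystack gets; more data just means more hay to sift for the same needle.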

When we face grey-area issues such as finding the appropriate amount of surveillance and data collection, it would be naive to think less is more, but we should also question whether more is making us less safe.
