The Issue Of Compliance – Big Data and User Privacy

This is a guest post by Aaron Bach. Find more of his great insights on his blog over at:

Earlier this summer, it became public knowledge that during one week in 2012, Facebook subjected nearly 700,000 users (all of them unwitting) to a study of social emotions. The premise was simple: does the inclusion of emotional content – ads, sponsored posts, etc. – on a user’s timeline have a noticeable effect on their emotional state? Put more simply, do users’ emotions swing depending on the “mood” of the content they see?

The answer was simple and resounding: yes. Facebook found that, among other outcomes, one week of doom-and-gloom content on a user’s timeline produced a tangible downward swing in their emotions (evidenced by a heightened number of “negative” posts from those users).

More powerful than the story itself was the outcry that followed. Facebook was criticized for seemingly abandoning ethical testing procedures. The company’s top executives were forced to backtrack. There were even suggestions that the study’s terminology – wherein “social contagion” is equated to “using emotion to influence emotion” – had direct links to Pentagon-backed studies on quelling social unrest. On nearly every news channel and across nearly every blog, outrage was the word of the day.

I didn’t write about this story as it happened because I didn’t want to follow the crowd and proclaim blind outrage. After spending several weeks digesting this study (and its implications), I watched some interesting points – about Facebook and about us – begin to emerge, and from these points, my own opinions followed.

Why So Surprised?

My prevailing thought in the early days of the story was, “Why is everyone so surprised?”

Don’t get me wrong: I was a little perturbed that I might have been one of the lab rats in Facebook’s experiment. Like PRISM one year prior, I didn’t care for being subjected to something against my will. But once I set that emotion aside, the next thing I felt was incredulity: why were the outraged expecting anything different?

Since 2004, countless individuals have poured their guts into Facebook. Pictures, phone numbers, speaking styles, addresses, favorite URLs, music preferences…petabytes of data are freely given to Facebook without a second thought. With each new entry, Facebook’s massive database of useful data points grows.

In choosing to conduct a study of this magnitude, Facebook made a clear declaration: this data is not yours, and we will use, adjust, and modify it as we please.

Society assumes that in using third-party services, applications, etc., it maintains the same sense of control it’s always had. Forgetting all notions of what should be the case, something has become brutally clear: when users choose to bury their data in someone else’s application, they lose control. Those same users must now trust the application not to do anything dastardly with information voluntarily given.

This is a hard perspective to grasp. Society tends to hinge on the “rightness” of things – it places a large amount of focus on how things should work. When events like Facebook’s study are revealed, outrage becomes the outward expression of the wrong things (as compared to the social expectation) occurring.

Perhaps I’m a tad jaded, but my response was markedly more blasé: I didn’t trust Facebook to begin with, so I wasn’t all that surprised. My immediate takeaway was more tactical: stop putting so much stuff into Facebook. I’ve followed through; I rarely put anything into Facebook anymore.

This realization ? so simple and so clear ? astounded me. Somewhere along the way, we became comfortable with dumping our most closely guarded secrets into someone else?s system. We never stopped to ask Facebook if it was safe to do so; we never collectively asked for proof that our information was safe. We merely took for granted that it was.

Facebook isn?t the only system we dump information into. If Twitter wants to (and hasn?t already), it can begin to build a profile on users based on the things they talk about. Foursquare can profile their eating/shopping habits and even their movements. The benefits blind users to the potential risks.

The Flip Side

Earlier this year, I read Outliers. Among its many interesting stories, one stood out: the story of Jay Freireich, Tom Frei, and their quest to defeat leukemia in children. The premise was simple: by shirking medical convention (which advocated monotherapy, the treatment of cancer with a single chemotherapy drug) and using a “cocktail” of three or more drugs, Freireich and Frei developed a cancer treatment so successful that it forms the basis of treatments today.

That happier ending, however, is only a small portion of the story. Along the way, Freireich and Frei faced enormous criticism of a very emotional sort; in advocating the use of several toxic drugs (as opposed to the normal single drug) on children, the doctors were pegged as monsters. Their studies occurred against the backdrop of enormous outrage and opposition.

Reading this story gave me pause. Though I wouldn’t dare equate a Facebook-led study of emotions with the treatment of cancer in children, there was an interesting nugget: had Freireich and Frei crumbled under the weight of social expectation, would their treatment revolution have occurred? What would ultimately have happened had their cancer study been halted in the name of “doing the right thing”?

It is interesting, then, to consider whether Facebook’s study might be pushing through localized social norms toward a bigger goal. From the study:

This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

Perhaps Facebook has grand plans for directly influencing the state of its users’ lives via “positive emotional contagion.” Perhaps the goal is to find a way to reverse a generation steeped in negativity.

Or perhaps the only mission is to make the “Facebook experience” better.

Time will tell whether Facebook deserves the outrage it’s been given. In the meantime, a better question for society to pursue is one of divulgence: at the end of the day, is it in one’s best interest to deliver reams of personal data to a large service?

Note: Don’t like sharing your personal data? MailDeck has you covered. We don’t route any emails through our servers, and all of your data is encrypted using 256-bit AES encryption. We take user privacy seriously and believe heavily in transparency. Check out our privacy policy for more information.
