CJRW@SXSWi: Big Data & Privacy: Are You Protected?


A panel called "What Keeps the Internet’s Leading CPOs Up At Night" at South-by-Southwest Interactive. (Elizabeth Michael/CJRW)

(Editor’s Note: Rob Anderson, director of content strategy for Cranford Johnson Robinson Woods advertising agency in Little Rock, is on the ground at South-by-Southwest Interactive in Austin, Texas. He and his team are filing regular reports throughout the week on what’s hot at this influential annual conference. You can see previous posts here, here, here and here. Today, Elizabeth Michael, CJRW's social media manager, files this post on data and privacy.)

Remember a couple of months ago when friends flooded your newsfeed with long legal posts stating that all your posts and photos were their property and not owned by Facebook? 

First off, that doesn’t protect you from anything. And second, that trend indicates the public’s growing awareness of how companies access and use their data. And while not all access and usage is sinister, it does give people the heebie-jeebies. 

The Internet is widely accessible, and people expect using it to be easy and quick. So no one actually reads that 12-page privacy policy when signing up for Facebook, Gmail or Twitter. And because no one actually reads it, no one actually knows what information they’re allowing into the world, or that most of that information is public.

A major trend at South-by-Southwest this year is privacy standards and practices concerning data collected through social media, apps and other digital touch points, and how the government and businesses access and use this data to learn more about your life. 

Consider this: 90 percent of all data has been collected over the last two years. That is a lot of information about our lives.

Companies collect data for different reasons. While advertising is probably the simplest use of this data, other data gathered from social media can downright get you killed in some countries (for example, information revealing your sexual orientation where it is outlawed). 

I attended three SXSWi sessions concerning privacy, how “Big Data” is collected, what companies and the government do with this data, and how you should protect yourself.  

Here are key takeaways from each session:

Is Privacy a Human Right? 

“Should the right to own one’s data be the 28th amendment? Though it may sound trite, as more of our intimate data makes its way onto easily stolen mobile devices, public clouds and shared networks, criminals, cops and corporations can now monitor your every move more efficiently than before.” 

Did that quote from the SXSW workshop description make you a little nervous? Call me paranoid, but it made me nervous! 

As Thomas Jefferson stated, “I prefer dangerous freedoms over peaceful slavery.” David Gorodyansky, CEO and founder of AnchorFree, says that online privacy is a fundamental human right, and the United Nations agrees. 

Highlights from this session:

  • Love grows in the most private quarters, and companies are now intruding into those quarters. Experts say this breaks down trust and limits the freedom of users’ communications. (For example, a user who lives in a country that outlaws their sexual orientation is at great risk if their social media posts are public.) People should be free to create without someone looking over their shoulder. The thoughts that matter most are the most private. (A good example of this idea is embodied in the "#ChicagoGirl" documentary.)
  • Companies’ privacy policies currently average 12 pages, and companies are aiming to reduce them to six. Will that make any difference? People still won’t read a six-page privacy policy.
  • The Internet of Things is much easier to hack: it is easier to hack a Tesla than a computer.
  • One out of every 10 smartphones has Snapchat. Snapchat’s popularity can be attributed to tech-dependent new generations that are keenly aware of the public nature of the content they share.
  • New apps are in development that cater to growing concerns about the accessibility of data and the freedom to share. One app, named Kaboom, will allow posts on Facebook, Twitter and other platforms to disappear within a certain timeframe (like Snapchat).

(See more about the conversation: #MyRights.)

What Keeps the Internet’s Leading CPOs Up At Night 

First off, a CPO is a chief privacy officer. Attending these privacy sessions, I noticed a weird convergence of legal professionals and social media marketers. Sounds like a fun party, right? This one was particularly interesting because of the panelists. I mean, Google, Facebook and Microsoft. Amirite?

Brendon Lynch, chief privacy officer at Microsoft; Erin Egan, chief privacy officer at Facebook; and Keith Enright, legal director for privacy at Google, laid out what their companies consider when setting privacy policies for new tech product launches, how consumers digest those policies, and the trends they see coming.

  • Privacy and innovation are intertwined. Innovation must be disruptive, and privacy must be considered from the very beginning. When launching a startup or a new tech product that disrupts the marketplace and pioneers a new path, companies are starting from scratch and need to build trust with their consumers. Privacy policies are the beginning of that process.
  • Don’t think about it as a privacy policy; think of it as a trust document that puts people first.
  • Facebook’s Newsfeed feature was hotly debated as a privacy overstep when it was first announced. Because it looked and felt different, it challenged the notion of privacy by obscurity. Facebook decided to go down the route of using data to populate the Newsfeed with content you directly like or content Facebook thinks you will like.
  • A core issue with setting privacy standards is that it is an ecosystem problem: lawmakers, tech companies and the public all need to work together to find the best way to set standards that protect the public’s data. Transparency and user trust are crucial.
  • Privacy regulation lags behind the pace of tech growth. As that gap widens, so does the need for people who can help navigate it.

(See more about the conversation: #PrivacyPro.)

Are You in a Social Media Experiment? 

David Lazer and colleagues stated in their 2009 Science article, “Computational Social Science,” that our online activity “leaves digital traces that can be compiled into comprehensive pictures of both individual and group behavior, with the potential to transform our understanding of our lives, organizations, and societies.”  

Translation: companies and governments are taking this data and performing social experiments on you without your knowledge. These experiments use public information to uncover private information, and you don’t realize what your actions will reveal about you. 

Panelists Jason Baldridge, chief scientist at People Pattern; Jennifer Golbeck, associate professor of information studies and director of the Human-Computer Interaction Lab at the University of Maryland; Michelle Zhou, co-founder and CEO of Juji; and Philip Resnik, professor at the University of Maryland, discussed the ethics of using data collected on social media and Internet usage to make extrapolations and predictions about people’s lives, behavior patterns and more. 

Is it ethical to gather data at scale and compute correlations about people without their knowledge? The panelists expounded upon several eye-opening statements, such as that big data models can predict whether you will get divorced or whether you are depressed. 

Social data can predict not only personality traits (which are often used in targeted social media ads) but your future as well. 

Other key points:

  • Would you say you know yourself very well? Your computer might know you better. Everything you do on your computer is recorded, and companies use that data to make predictions about your patterns and your life.
  • Facebook conducted a study on its users under the premise that it was trying to make the user experience better. In an academic setting, Facebook would have had to get informed consent from participants and debrief them on the findings. Because of its privacy policy, it was able to perform this experiment without participants consenting. In an academic setting, experimenters might have simply been observing, but Facebook may have been manipulating. How does it feel to have been unknowingly emotionally manipulated by Facebook? Do you feel that is ethical? Are you comfortable with unknowingly being part of an experiment that can predict your sexual orientation, political views, happiness level and more? And all for financial gain? Just sayin’.
  • Cambridge researchers conducted a study about intelligence indicators in social media use and found that intelligent Facebook users are interested in The Colbert Report, curly fries, thunderstorms and science. While this may seem random, and while Cambridge’s study was conducted for academic reasons, it illustrates the kinds of inferences companies and governments could extrapolate from big data to target individuals.

(See more about the conversation: #SocialRats.)

Whew. That’s a lot to digest about the current situation of privacy and your public data. 

U.S. policy is failing to keep up with the pace of tech; meanwhile, the European Union is pioneering sweeping privacy rules that would apply to the entire industry. 

One thing is clear: there’s quite a revolution brewing around privacy standards and the public nature of our data, and it must be led by the users of these products.

(For more, follow the CJRW team on Twitter at @RobWAnderson, @WeAreCJRW, @ZackHill, @LizzyMichael and @BryceParker.)

