(Editor’s Note: Rob Anderson, director of content strategy for Cranford Johnson Robinson Woods advertising agency in Little Rock, is on the ground at South-by-Southwest Interactive in Austin, Texas. He and his team are filing regular reports throughout the week on what’s hot at this influential annual conference. You can see previous posts here, here, here and here. Today, Elizabeth Michael, CJRW's social media manager, files this post on data and privacy.)
Remember a couple of months ago when friends flooded your newsfeed with long legal posts stating that all your posts and photos were their property and not owned by Facebook?
First off, that doesn’t protect you from anything. And second, that trend indicates the public’s growing awareness of how companies access and use their data. And while not all access and usage is sinister, it does give people the heebie-jeebies.
A major trend at South-by-Southwest this year is privacy standards and practices concerning data collected through social media, apps and other digital touch points, and how the government and businesses access and use this data to learn more about your life.
Consider this oft-cited statistic: 90 percent of all the world's data has been created in the last two years. That is a lot of information about our lives.
Companies collect data for different reasons. While advertising is probably the most benign use of this data, other data gathered from social media can downright get you killed (for example, by revealing a sexual orientation that is outlawed in some countries).
I attended three SXSWi sessions concerning privacy, how “Big Data” is collected, what companies and the government do with this data, and how you should protect yourself.
Here are key takeaways from each session:
Is Privacy a Human Right?
“Should the right to own one’s data be the 28th amendment? Though it may sound trite, as more of our intimate data makes its way onto easily stolen mobile devices, public clouds and shared networks, criminals, cops and corporations can now monitor your every move more efficiently than before.”
Did that quote from the SXSW workshop description make you a little nervous? Call me paranoid, but it sure made me nervous!
As Thomas Jefferson is said to have put it, "I prefer dangerous freedom over peaceful slavery." David Gorodyansky, CEO and founder of AnchorFree, says that online privacy is a fundamental human right, and the United Nations agrees.
Highlights from this session:
- Love grows in the most private quarters, but companies are now intruding into those quarters. Experts say this breaks down trust and limits the freedom of users' communications. (For example, if a user lives in a country that outlaws that user's sexual orientation, they are at great risk if their social media posts are public.) People should be free to create without someone looking over their shoulder. The thoughts that matter most are the most private. (A good example of this idea is embodied in the "#ChicagoGirl" documentary.)
- The Internet of Things is much easier to hack than a traditional computer. It is easier, for instance, to hack a Tesla than a PC.
- One out of every 10 smartphones has Snapchat. Snapchat’s popularity can be attributed to tech-dependent new generations that are keenly aware of the public nature of the content they share.
- New apps are in development that cater to the growing concern of the accessibility of data and the freedom to share. One app, named Kaboom, will allow posts on Facebook, Twitter and other platforms to disappear within a certain timeframe (like Snapchat).
(See more about the conversation: #MyRights.)
What Keeps the Internet’s Leading CPOs Up At Night
First off, a CPO is a chief privacy officer. Attending these social privacy sessions, I noticed a weird convergence of legal and social media marketers. Sounds like a fun party, right? This one was particularly interesting because of the panelists. I mean Google, Facebook, and Microsoft. Amirite?
Brendon Lynch, chief privacy officer at Microsoft; Erin Egan, chief privacy officer at Facebook; and Keith Enright, legal director of privacy at Google, laid out what their companies consider when setting privacy policies for new tech product launches, how consumers digest those policies, and the trends they see coming.
- Privacy and innovation are intertwined. Innovation must be disruptive, and privacy must be considered from the very beginning. When launching a startup or a new tech product that disrupts the marketplace and pioneers a new path, companies are starting from scratch and need to build trust with their consumers. Privacy policies are the beginning of that process.
- Facebook's News Feed feature was hotly debated as a privacy overstep when it was first announced: the information it surfaced was already public, but collecting it in one feed broke users' sense of "privacy by obscurity." Facebook ultimately went down the route of using data to populate the News Feed with content you directly like or content Facebook thinks you will like.
- A core issue with setting privacy standards is an ecosystem issue. Everyone (lawmakers, tech companies and the public) needs to work together to find the best way to set standards that protect the public's data. Transparency and user trust are crucial.
- Privacy regulation lags behind tech growth. As that gap widens, so does the need for people who can help navigate it.
(See more about the conversation: #PrivacyPro.)
Are You in a Social Media Experiment?
David Lazer and colleagues stated in their 2009 Science article, “Computational Social Science,” that our online activity “leaves digital traces that can be compiled into comprehensive pictures of both individual and group behavior, with the potential to transform our understanding of our lives, organizations, and societies.”
Translation: companies and governments are taking this data and running social experiments on you without your knowledge. These experiments use public information to uncover private information. You don't realize what your actions will reveal about you.
Panelists Jason Baldridge, chief scientist at People Pattern; Jennifer Golbeck, associate professor of information studies and director of the Human-Computer Interactive Lab at the University of Maryland; Michelle Zhou, co-founder and CEO of Juji; and Philip Resnik, professor at University of Maryland, discussed the ethics of using data collected on social media and Internet usage to make extrapolations and predictions about people’s lives, behavior patterns and more.
Is it ethical to gather data at scale and compute correlations about people without their knowledge? The panelists expounded upon several eye-opening claims: big data models, for instance, can predict whether you will get divorced or whether you are depressed.
Social data can predict not only personality traits (which are often used in targeted social media ads) but your future as well.
Other key points:
- Would you say you know yourself very well? Your computer might know you better. Everything you do on your computer is recorded, and companies use that data to make predictions about your patterns and your life.
- Cambridge researchers conducted a study on intelligence indicators in social media use and found that intelligent Facebook users are interested in the Colbert Report, curly fries, thunderstorms and science. While this may seem random, and while Cambridge's study was conducted for academic reasons, it illustrates the kinds of inferences companies and governments could extrapolate from big data to target individuals.
(See more about the conversation: #SocialRats.)
Whew. That’s a lot to digest about the current situation of privacy and your public data.
While U.S. policy is failing to keep up with the pace of tech, the European Union is pioneering sweeping privacy rules that would apply to the entire industry.
One thing is clear: there's quite a revolution brewing over privacy standards and the public nature of our data, and it must be led by the users of these products.