
Facebook Spies on You to Manipulate Your Emotions

Courtesy of ZeroHedge.

Via The Daily Bell

Learning through observation is an important human trait, and it accounts for much of what makes us such an advanced species. Just as our peers’ behavior influences our own, it should come as no surprise that we are also influenced by the emotions of those around us.

Those who inject themselves into our lives also influence our emotions. It is no longer just face-to-face contact that humans deal with on a daily basis, but also a barrage of information from the news media and social media.

A Facebook study performed in 2014 has implications far beyond the social network. The study found that when users were shown fewer positive posts, they were more likely to write negative posts themselves, and vice versa. Simply by having some negativity injected into them, people become more likely to go out and express negativity. And what will those they infect with negativity do? The same thing, spreading the sadness that Facebook primed.

And though Facebook always takes things further, they are still relatively new to the emotional manipulation scene. News media have been doing this for years. During the ’90s, violent crime rates were dropping, yet the news was showing more and more stories about murder. By the end of the decade, people were clamoring for the government to do something about a murder rate that seemed to be skyrocketing. But the world was actually safer than a decade earlier; the only difference was that people were hearing about more negative things in the media.

Just consider how easy it is for social and mainstream media to manipulate people on a broad scale. It is less about inserting a particular belief and more about influencing overall feelings.

They could even induce certain emotions for specific reasons. For instance, if an election is coming up, Facebook could inundate people with negative posts, even ones that have nothing to do with the election. Feeling negative overall, those people would be more likely to vote for change when they go to the polls, since something must be causing them to feel so bad all the time.

Or maybe, when Facebook needs to sell you something, it will spread positive posts all over your feed, putting you in a good, receptive mood where you are more likely to say yes, because you just nodded in agreement to posts about how cute puppies are, how women are being empowered, and how chocolate isn’t so bad for you after all.

As Usual, Facebook Pushes the Privacy Boundaries

But Facebook wants to take it even further: it wants real-time updates on how happy or sad it is making people.

They have filed a patent application to spy on users through their phones’ cameras in order to analyze their facial emotions in real time. They want to see every wrinkle and grimace, smile and chortle. With this information, their big data empire will grow to untold heights. Never before has a company had such immense power to analyze that type of data on such a large scale.

Facebook says it could use the technology to deliver more content that people like, without them having to “like” it. But another possibility is that they could deliver content to change the emotions of the viewer. In a perfect world, they would see the content making someone sad, and instead, deliver happy content until that frown turns upside down. But in the real world… it is possible Facebook would do the opposite.

Of course, this is super creepy. As a private company, Facebook should be able to do what it wishes. But every step it takes seems to make the social networking site less palatable. Will it go too far? Or is its “baby steps” approach to data domination effectively curtailing any exodus?

The other question is: just how much of this information will be made available to employers, schools, governments, and other organizations that want to scoop up specific users’ data in order to vet them for a job or admission into their group?

Recently, it was discovered that a group of students who had been accepted to Harvard were participating in a Facebook group where they posted and joked about offensive memes. When Harvard found out, it revoked the admission offers of some of those students.

Harvard is not to blame for this; you could debate whether or not it went too far, but it is entirely within its rights as a private organization to refuse admission to those it deems immature or a liability to its reputation. But just a few years ago, this was clearly not an issue. Whatever offensive opinions a person privately held had no public avenue through which to damage their reputation.

Right now, it is fairly easy to avoid this type of public shaming: just don’t post or react to offensive material. But with the facial emotion analysis technology Facebook is exploring, will you even be able to hide your true opinions?

Can you imagine a dystopian future where everyone walks around with a deadpan, blank expression to keep their “thought crimes” hidden from companies and governments?

Unfortunately, Facebook could be pioneering exactly that situation. Beliefs you never expressed publicly could come to light simply through how your face reacts to information, memes, and articles.

Think of all the eye rolls they will record every time an article about the social justice warriors’ newest shenanigans is released. Is scoffing a microaggression?
