
Facebook Likes can accurately reveal your personality, says study

A Cambridge University study into online privacy has shown that what someone has liked on Facebook can be used to accurately infer a huge amount of personal information about them.

The study revealed that Facebook users are inadvertently exposing some of their most intimate secrets via the pages they have publicly “liked” on the social media website.

Cambridge University researchers were able to correctly deduce a user’s IQ, race, sexuality, drug use, political views and personality traits using only a record of the subjects, people and brands that person had liked on Facebook, even if those likes were not publicly visible.

Analysing 58,000 Facebook users in the US, the researchers devised an algorithm to predict different personality traits. Gay men were identified 88 per cent of the time, drug use was predicted correctly in 65 per cent of cases, and even whether a user’s parents had divorced before that user turned 21 was determined correctly in 60 per cent of cases.
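The article does not describe how the researchers’ algorithm works, but the general idea of predicting a trait from likes can be illustrated with a toy classifier. The sketch below is not the study’s actual method; it uses a simple naive Bayes approach, and all page names and trait labels are invented for illustration:

```python
# Toy sketch: predicting a binary trait from Facebook-style "likes"
# with a naive Bayes classifier. Not the study's actual algorithm;
# all pages and labels below are hypothetical.
from collections import defaultdict
from math import log

def train(records):
    """records: list of (set_of_liked_pages, trait) pairs, trait in {0, 1}."""
    counts = {0: defaultdict(int), 1: defaultdict(int)}
    totals = {0: 0, 1: 0}
    for likes, trait in records:
        totals[trait] += 1
        for page in likes:
            counts[trait][page] += 1
    return counts, totals

def predict(likes, counts, totals):
    """Return the more likely trait (0 or 1) for a given set of likes."""
    best, best_score = None, float("-inf")
    n = totals[0] + totals[1]
    for trait in (0, 1):
        score = log(totals[trait] / n)  # log of the class prior
        for page in likes:
            # Laplace smoothing so unseen pages don't zero out the score
            p = (counts[trait][page] + 1) / (totals[trait] + 2)
            score += log(p)
        if score > best_score:
            best, best_score = trait, score
    return best

# Hypothetical training data: each user's likes paired with a trait label
data = [
    ({"PageA", "PageB"}, 1),
    ({"PageA", "PageC"}, 1),
    ({"PageD", "PageE"}, 0),
    ({"PageD", "PageF"}, 0),
]
counts, totals = train(data)
print(predict({"PageA"}, counts, totals))  # prints 1
print(predict({"PageD"}, counts, totals))  # prints 0
```

With enough training data, each like shifts the probability estimate a little, which is why seemingly innocuous pages can collectively reveal sensitive traits.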

“The important point is that, on one hand, it is good that people’s behaviour is predictable because it means Facebook can suggest very good stories on your news feed,” said Michal Kosinski, the lead Cambridge University analyst working with Microsoft Research on the study. “But what is shocking is that you can predict your political views or your sexual orientation. This is something most people don’t realise you can do.”

The computer software used to predict the traits could be used by anyone with training in data analysis, meaning that the same information could be collected by more dangerous organisations.

“Everyone carries around their Facebook ‘likes’, their browsing history and their search history, trusting corporations that it will be used to predict their movie or music tastes. But if you ask about governments, I am not sure people would like them to predict things like religion or sexuality, especially in less peaceful or illiberal countries,” added Kosinski.

The findings could reignite concerns about how much private data can be collected by governments and private companies, especially via seemingly innocuous Facebook likes and online habits.

“I hope internet users will change their ways and choose products and services that respect their privacy. Companies like Microsoft and Facebook depend on users being willing to use their service – but this is limited when it comes to Facebook, because 1 billion people use it,” Kosinski said.

The academic added that sites like Facebook and Google should be required to inform users that their private data can be inferred purely from their likes, using the same technology that suggests music, movies and apps.

Do you limit how much information you give out on your Facebook and other social media profiles? Should social media sites warn users about how much personal data can be gathered without your consent? Give us your thoughts on the matter via the TrustedReviews Facebook and Twitter pages or write your comment below.  
 
Via: Guardian
