Social media has become ingrained in our daily lives over the past decade, with more than 3.8 billion users worldwide. Since its creation, people have grown both more connected and more divided by polarization, and many have begun questioning how social media companies use their data as platforms become more personalized.
Amber McCord, assistant professor of practice in media and communications, said algorithms draw on data collected from various sources to find patterns that help companies understand users better, so that they can advertise and promote relevant content.
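The pattern-finding McCord describes can be sketched in a few lines. This is a toy illustration only, not any platform's actual code: it counts a hypothetical user's past interactions by topic and ranks candidate posts by how well they match that profile.

```python
from collections import Counter

def build_interest_profile(interactions):
    """Count how often a user engaged with each topic (toy model)."""
    return Counter(interactions)

def rank_posts(posts, profile):
    """Order candidate posts by how strongly they match the user's profile."""
    return sorted(posts, key=lambda post: profile[post["topic"]], reverse=True)

# Hypothetical interaction history: topics of posts the user liked or shared
history = ["fashion", "fashion", "travel", "fashion", "music"]
profile = build_interest_profile(history)

posts = [
    {"id": 1, "topic": "music"},
    {"id": 2, "topic": "fashion"},
    {"id": 3, "topic": "travel"},
]
ranked = rank_posts(posts, profile)
print([p["topic"] for p in ranked])  # the most-engaged topic ranks first
```

Real recommendation systems use far richer signals (watch time, social graph, embeddings), but the loop is the same: collect behavior, detect a pattern, surface more of what matched.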
“We like to be exposed to content and features that we believe are beneficial for our own sakes,” McCord said. “When something is personalized we get a satisfaction out of that.”
With that satisfaction also come some risks. McCord said there seems to be increased awareness of the risks of data mining, but she is not sure whether those risks will outweigh the satisfaction users feel in the short term.
McCord recalled the case of Fitbit, whose user data ended up in the hands of a third party. The data revealed the location of a military base overseas, posing a serious security threat. She warned that if it can happen to Fitbit, it can happen to other companies too.
“A lot of time we are giving up data for the sake of personalization, and there could be long term effects with that in having our data be used in ways we might not approve of,” McCord said. “A lot of times we sign up for terms and conditions without reading them so our data can be sold to any third parties, and how they use that data is out of our control at that point.”
Some ways to stay protected from data mining include reading the terms and conditions, being careful about what is put online, not sharing locations, not saving passwords or card information, and staying educated about what data is being collected.
Another potential effect of social media is the polarization of people as they consume more and more media personalized to their values.
“Cognitive dissonance is when we see information that is contradictory to us and we feel discomfort. We don’t like to feel our values being contradicted,” McCord said. “Through personalization, if we only ever see opinions and information that aligns with our values, we’re never challenged on that and we don’t see a variety of opinions. That is what makes us become more extreme or engrained in our own beliefs.”
Natalie Buenker, a junior public relations major from Houston, said she uses social media to promote her fashion blog. Social media has offered her many benefits in her professional life, most notably connections.
“I’ve been able to meet other bloggers, and I’ve connected with Houston bloggers as well. I was able to actually meet one,” Buenker said. “I personally like personalized feed because it helped me to connect with people. I do think social media has its way of connecting us even though some people think it’s a big disconnect.”
Buenker said she is personally very engrossed in social media because her job revolves around it. Despite this, it is important for her to take mental breaks from social media, especially when so much is going on in the world.
It is understandable that social media will show people the content they are interested in and promote it in their feeds, but sometimes personalization can get invasive, Buenker said.
“It can be a little creepy when I do a Google search and then my advertisements are geared towards that,” Buenker said. “It’s creepier when it pops up and you weren’t looking for it, or when people say they said something around their phone and an ad will pop up. I personally think that’s an invasion of privacy.”
Kerk Kee, associate professor in media and communications, said he appreciates personalization for the same reason many others do: it is convenient.
Algorithms show that people are very eclectic, Kee said, and they can even remind us of our own eclectic interests.
“The human brain is not as accurate and reliable and fast as a computer. I might forget that I have a particular interest but the computer will not forget,” Kee said. “Some people call it digital breadcrumbs. When we analyze those breadcrumbs, we can detect patterns and have an insight into who you are as a person, and the algorithm can keep feeding you things that you are interested in.”
On the topic of polarization, Kee said he believes it is not really the fault of social media or its algorithms, but rather an extension of the yes-man attitude many people subscribe to, amplified quickly and on a larger scale.
Kee recommends that people actively seek out other viewpoints, using the algorithm to their benefit to develop more balanced and informed perspectives.
“I like to look at the algorithm as neutral. The algorithm itself is not bad, it’s how we use it. If we’re more conscious of that long-term potential, we can be curious and intentionally search for videos that are the opposite of our natural interest or position,” Kee said. “When you expose yourself to content that is biased to your viewpoint that is going to become your reality.”
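The feedback loop Kee describes can be illustrated with a toy simulation (the model and its parameters are assumptions made here for illustration, not drawn from any real platform): if each click on a viewpoint slightly increases the odds of being shown similar content, the feed drifts toward one side unless the user deliberately samples the other.

```python
import random

def simulate_feed(steps, deliberate_balance=False, seed=0):
    """Toy filter-bubble model: engaging with one side of an issue
    makes that side more likely to be shown on the next refresh."""
    rng = random.Random(seed)
    weights = {"A": 1.0, "B": 1.0}  # two opposing viewpoints, equal at first
    shown = []
    for _ in range(steps):
        if deliberate_balance and rng.random() < 0.3:
            # the habit Kee recommends: intentionally pick the rarer view
            side = min(weights, key=weights.get)
        else:
            total = sum(weights.values())
            side = "A" if rng.random() < weights["A"] / total else "B"
        weights[side] += 0.5  # engagement reinforces that side
        shown.append(side)
    return shown

bubble = simulate_feed(200)
balanced = simulate_feed(200, deliberate_balance=True)
print("bubble feed, side A shown:", bubble.count("A"), "of", len(bubble))
print("balanced feed, side A shown:", balanced.count("A"), "of", len(balanced))
```

Running it with different seeds shows the unbalanced feed tends to lock onto whichever side gets an early lead, while the deliberately balanced run stays closer to an even split.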
When active on social media, it is important to remember that users still have some control over how they are affected and what data they give out. As awareness of the long-term effects of data mining and polarization grows, people will look to social media companies for more guidance and action on these issues.
“These are decisions that people behind social media companies need to make. There’s an increased social awareness, but are people going to give up short-term gratification for a less negative society long term? We’ll just observe,” Kee said. “Social media platforms are going to adapt as user behaviors change.”