When a Facebook conference starts with Mark Zuckerberg stating, “We think there’s a really important place for a personalized news feed,” you immediately know that things will never be the same. Even worse, you know that things will never be the same – whether you want them to … or not.
Personalized news feeds have been around for nearly four years now, and almost every social media site uses them; in fact, we have reached a stage where nearly every media site uses them. Yet not everyone knows how much control they have over their own feeds. The internet has become so burdened with the morass of information produced for it that nobody seems to know and, more to the point, nobody seems to care. In the rush to influence and inform, we have turned the online experience into a cascade of white noise that relentlessly clamours for our attention – which, ironically, is the very problem personalized news feeds were invented to solve.
Zuckerberg continued at the launch with, “What we are trying to give everyone is a copy of the best personalized newspaper in the world.” Really? That was the goal of the personalized feed? Then when did it change to limiting the content we see and taking away our choice? The ‘personalized feed’ has turned into another way of silently hijacking our right to choose, of letting us be manipulated by a non-sentient intelligence. What was wrong with the way we used to filter what we read – like using our brains to work out whether we wanted to read something or not? Did we do such a bad job of it before the internet came along that we needed a bot’s help? Why the change?
The personalized feed is run by an artificial intelligence that ‘sees’ the content we interact with and selects more of the same type of content to populate our feed. It sounds simple and effective, but that doesn’t make it foolproof – or ideal.
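At its crudest, that feedback loop can be sketched in a few lines. This is a hypothetical toy model, not Facebook’s actual system (which is far more complex and not public) – every name, topic and weight below is invented purely to illustrate the “interact more, see more of the same” mechanic:

```python
from collections import Counter

# Hypothetical interaction log: (topic, action) pairs the platform has observed.
interactions = [
    ("funny-videos", "like"),
    ("funny-videos", "share"),
    ("tech-news", "like"),
]

# Candidate posts drawn from everything the user follows.
candidates = [
    {"topic": "funny-videos", "text": "Another cat compilation"},
    {"topic": "tech-news", "text": "New phone released"},
    {"topic": "local-business", "text": "Update from a page you liked"},
]

def rank_feed(candidates, interactions):
    """Score each post by how often the user engaged with its topic.

    Posts on topics the user merely *read* without liking, sharing or
    commenting score zero and sink to the bottom of the feed.
    """
    engagement = Counter(topic for topic, _action in interactions)
    return sorted(candidates,
                  key=lambda post: engagement[post["topic"]],
                  reverse=True)

for post in rank_feed(candidates, interactions):
    print(post["topic"], "->", post["text"])
```

Even in this toy version, you can see the bias: whatever you happened to click on yesterday decides what you are shown tomorrow.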
One huge flaw that the innovators and developers of the technology don’t seem to have figured out is that it assumes we never passively observe content – that all the content we want to read, we engage with. That is a massive, and massively wrong, assumption. Look at it this way:
Say you like 20 pages with the intent of keeping informed about what those companies are doing. Instead of getting their posts, your news feed fills up with ‘the same old, same old’ – an endless stream of posts similar to the one funny post that happened to catch your eye. By a process of elimination you miss out on items you would genuinely be interested in, items that could enrich your experience, because they are being hidden from you. The platform demands that if we want to keep seeing content, we have to interact with it – or it gets hidden.
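To make the scenario concrete, here is a hypothetical sketch of that process of elimination – 20 followed pages, one funny post engaged with, the rest quietly dropped. The page names and the all-or-nothing filter are invented for illustration; the real system degrades pages gradually rather than hiding them outright:

```python
# Hypothetical: the user follows 20 pages, but has only ever engaged
# with one funny post, which came from "page-7".
followed_pages = [f"page-{i}" for i in range(1, 21)]
engaged_with = {"page-7"}  # the single post that caught your eye

def visible_feed(followed, engaged):
    """Keep only followed pages the user has interacted with.

    An engagement-only filter never learns about the 19 pages the
    user reads passively -- they are simply eliminated from the feed.
    """
    return [page for page in followed if page in engaged]

print(visible_feed(followed_pages, engaged_with))  # only page-7 survives
print(f"{len(followed_pages) - len(engaged_with)} pages you chose to follow are hidden")
```

The user deliberately followed all 20 pages; the filter only ever sees the clicks, so the other 19 choices count for nothing.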
Greg Marra, Facebook’s News Feed product manager, was asked how users can best curate the content that comes up in their News Feeds. “The easiest way to change what you see? Engage with content,” says Marra.
“The basic interactions of News Feed are some of the most important signals that we get,” he explains. “Unfortunately, those interactions aren’t able to capture everything that we want to know, so we also give people additional controls to tell us things we can’t figure out just from normal usage of News Feed.”
With that news, the situation changes again. Now you know that if you want more of your favourite pages to appear in your feed, you need to engage with them more: if you like 10 pages, you have to interact with all 10 to keep their posts appearing. How much time do they think we have to spend on social media every day? Why do they think we ‘liked’ the page in the first place? You ‘like’ a page because you want it to appear in your feed, and for no other reason. You may just want to read its posts, not interact with them. Engagement is personal, and we need to choose how far we engage – not a machine. Personalised feeds effectively negate the need for ‘liking’ a community at all, which makes a mockery of the whole endorsement system.
Is the answer just to randomly ‘like’ everything we see? One journalist, Mat Honan, tried exactly that, and what happened to him is fascinating. Take the time to follow the link – it’s worth the read – but this is a potted version in his own words:
“I like everything. Or at least I did, for 48 hours. Literally everything Facebook sent my way, I liked—even if I hated it. I decided to embark on a campaign of conscious liking, to see how it would affect what Facebook showed me.”
“Facebook uses algorithms to decide what shows up in your feed. It isn’t just a parade of sequential updates from your friends and the things you’ve expressed an interest in. In 2014 the News Feed is a highly-curated presentation, delivered to you by a complicated formula based on the actions you take on the site, and across the web. I wanted to see how my Facebook experience would change if I constantly rewarded the robots making these decisions for me, if I continually said, “good job, robot, I like this.” I also decided I’d only do this on Facebook itself—trying to hit every Like button I came across on the open web would just be too daunting. But even when I kept the experiment to the site itself, the results were dramatic.”
“I liked one of my cousin’s updates, which he had re-shared from Joe Kennedy, and was subsequently besieged with Kennedys to like (plus a Clinton and a Shriver). I liked Hootsuite. I liked The New York Times, I liked Coupon Clipinista. I liked something from a friend I haven’t spoken to in 20 years—something about her kid, camp and a snake. I liked Amazon. I liked Kohl’s. KOHL’S! I liked Kohl’s for you.”
“My News Feed took on an entirely new character in a surprisingly short amount of time. After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages.”
“As day one rolled into day two, I began dreading going to Facebook. It had become a temple of provocation. Just as my News Feed had drifted further and further right, so too did it drift further and further left. Rachel Maddow, Raw Story, Mother Jones, Daily Kos and all sort of other leftie stuff was interspersed with items that are so far to the right I’m nearly afraid to like them for fear of ending up on some sort of watch list.”
“This is a problem much bigger than Facebook. It reminded me of what can go wrong in society, and why we now often talk at each other instead of to each other. We set up our political and social filter bubbles and they reinforce themselves—the things we read and watch have become hyper-niche and cater to our specific interests. We go down rabbit holes of special interests until we’re lost in the queen’s garden, cursing everyone above ground.”
“While I expected that what I saw might change, what I never expected was the impact my behavior would have on my friends’ feeds. I kept thinking Facebook would rate-limit me, but instead it grew increasingly ravenous. My feed became a cavalcade of brands and politics and as I interacted with them, Facebook dutifully reported this to all my friends and followers.”
“When I posted a status update to Facebook just saying “I like you,” I heard from numerous people that my weirdo activity had been overrunning their feeds. “My newsfeed is 70 percent things Mat has liked,” noted my pal Heather. Eventually, I would hear from someone who worked at Facebook, who had noticed my activity and wanted to connect me with the company’s PR department.”
Social media works. Mat Honan got the attention of Facebook’s PR department – the PR department! The Holy Grail of marketing. But in doing so, he lost touch with his identity – and not just his own identity, but his identity in the eyes of his friends. Pretty much the exact embodiment of the old adage, ‘he sold his soul to the devil’.
The problem rests in our willingness to abdicate our responsibility to machines. When we let machines make decisions for us, we lose our right to choose – and worse, the choices made for us are made by cold-blooded, thoughtless, emotionless androids that have no idea what they are doing. It is not an improvement; as Honan’s experiment showed, it’s insanity.
The other problem is how we opt out of our personalised news feeds. It’s not easy – partly because Facebook does not tell you that robots control what goes into your feed, but also because you have to work for it. Work hard.
At first, a search of Facebook reveals:
“You can adjust what you see in your News Feed by using see first; following or unfollowing people, Pages or groups; or liking Pages. Other things that affect News Feed: hiding stories, using friend lists, and changing how stories are posted on News Feed.”
Helpful-ish, but with more investigation you find:
“Your News Feed preferences help you control what you see on your News Feed.
To view your News Feed preferences:
To adjust your News Feed preferences:
Does that help? Not really. It mostly has you chasing round in circles until you give up and leave things as they are. You could, of course, fix the problem by ‘unfollowing’ everyone, then going to each page yourself, every day, to see what they have posted. Remind me again – in that case, why is there a ‘like’ button in the first place?
One very interesting thing that came out of Honan’s experiment was the difference between the feeds on his devices. He said:
“I was also struck by how different my feeds were on mobile and the desktop, even when viewed at the same time. By the end of day one, I noticed that on mobile, my feed was almost completely devoid of human content. I was only presented with the chance to like stories from various websites, and various other ads. Yet on the desktop—while it’s still mostly branded content—I continue to see things from my friends. On that little bitty screen, where real-estate is so valuable, Facebook’s robots decided that the way to keep my attention is by hiding the people and only showing me the stuff that other machines have pumped out. Weird.”
From that, it would seem Honan was not the only one to ‘sell his soul to the devil’. Facebook was always meant to be a platform for keeping in touch with friends – the human-to-human connection. Now its own bots have hijacked the platform and decided that the human element is irrelevant. The lunatics truly have taken over the asylum.
Has the prioritised news feed improved the user experience of Facebook? It depends on whether you care. It depends on whether you have a curious spirit that wants to learn about all the wild stuff out there that you don’t yet know about. It depends on your inquisitiveness. It depends on whether you want to take responsibility for your own actions. It depends on whether you want to control – or be controlled. It depends on whether you know that you are being controlled. It depends on whether you want to be influenced by a one-sided view – that type of interaction used to be called propaganda; no one uses that word anymore. It depends on how much you value your right to choose.
Whether we like it or not, personalised news feeds are being implemented faster than any other marketing technology because of their marketing power. It makes sense to control what people see – to tap into their psyche and put your products in front of the person most likely to be interested – but should that decision be made by the platform owner, or by a machine?
The more I research the way personalized news feeds work, the more I realise that, man or machine, they’re fast becoming pretty much the same.