One problem is the design of the “Like” button. The target audience - the recipient - of pressing this button is other people: they will see a higher number next to the thumbs up. And if that is the visible result, that is what people will use the button for - to show support for the opinions they agree with. The stronger the opinion - the stronger the support. But the content people agree with is often not the content they actually find useful for themselves.
But what if we designed the “Like” button to be directed not at others, but at yourself? That is, pressing the “Like” button would have consequences for you: if you “Like” useless content, you will get more useless content in the future. Would that change the dynamics of “liking”? Would it make people think more carefully about what they like if their future recommendations depended on it?
When you upvote an item, you don’t simply increment a counter and make that item rank higher for everyone else. Instead, you connect more strongly to the users who upvoted that item before you. The more strongly you connect to someone - the more weight their upvotes carry for you - the higher their other upvoted content ranks in your recommendations. This creates a feedback loop where you use the upvote button not to influence others, but to direct your own future recommendations. It is a “filter bubble” - but one that you form very consciously.
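To make the mechanism concrete, here is a minimal sketch in Python. Everything in it - the TrustGraph name, the additive trust updates, the linear scoring - is an illustrative assumption for this post, not the actual implementation:

```python
from collections import defaultdict

class TrustGraph:
    """Toy model of upvotes that build per-user trust instead of a global counter."""

    def __init__(self):
        self.upvoters = defaultdict(list)   # item -> users who upvoted it, in order
        self.trust = defaultdict(float)     # (user, earlier_user) -> connection weight

    def upvote(self, user, item):
        # Connect more strongly to everyone who upvoted this item before you.
        for earlier_user in self.upvoters[item]:
            self.trust[(user, earlier_user)] += 1.0
        self.upvoters[item].append(user)

    def recommend(self, user, candidates, top_n=10):
        # Rank items by the total trust you place in their upvoters, so your
        # own past upvotes - not a global counter - determine the order.
        def score(item):
            return sum(self.trust[(user, u)] for u in self.upvoters[item])
        return sorted(candidates, key=score, reverse=True)[:top_n]

g = TrustGraph()
g.upvote("alice", "post-1")
g.upvote("bob", "post-1")                        # bob now trusts alice
g.upvote("alice", "post-2")
print(g.recommend("bob", ["post-3", "post-2"]))  # ['post-2', 'post-3']
```

Note that in this sketch an upvote changes nothing for anyone who hasn’t also upvoted something you upvoted - the “counter” effect is gone entirely.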
This may have been true-ish some years ago - it certainly was my expectation, and it's how many others expected these systems to work.
However, I was shocked to find that I would not see the content posted by pages I had liked; instead I would see other stuff that was often outrageous.
I can't remember when Twitter rolled out the 'curated' feed in place of the chronological one - for higher engagement and more advertising value - but at this point that was long ago too.
This was shocking to many people who had created 'pages' for businesses and the like as well. Even Kim Komando railed on her broadcasts and podcasts against the algorithm hiding her posts from people who had liked her Facebook page (while Facebook offered, for high fees, to 'boost' her presence in people's feeds).
So I'm not sure if the systems were originally set up to run that way, but they certainly have not run that way for many years, AFAIK.
Other people I have spoken with have been equally shocked that the system was not working as they expected, wondering why they 'liked' a page if they don't get everything the page posts. Many didn't believe that pages they were really interested in were posting new content that was simply being filtered out of their feed.
Most platforms use a black-box ML model (a deep neural net) to optimize for business objectives such as time spent, which correlates better with the number of ads viewed than "Likes" would.
These systems have far fewer "Likes" to work with than implicit signals such as which posts the user read (i.e., didn't scroll past). So they use all of those signals, and likes are unlikely to have a strong impact on the output of the algorithm.
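For illustration only - the signal names and weights below are invented, not any platform's actual features - here is how a like can end up as just one weak feature among many implicit signals in an engagement-optimizing ranker:

```python
# Invented signals and weights, for illustration: explicit likes are sparse,
# so an engagement-optimizing model leans mostly on implicit signals.
SIGNAL_WEIGHTS = {
    "dwell_time": 0.45,        # normalized time spent on the post
    "read_not_skipped": 0.30,  # didn't scroll past
    "clicked_through": 0.15,
    "commented": 0.07,
    "liked": 0.03,             # rare signal, small learned weight
}

def engagement_score(features):
    """Linear stand-in for the opaque deep model discussed here."""
    return sum(w * features.get(name, 0.0) for name, w in SIGNAL_WEIGHTS.items())

# A post the user read attentively outranks one they liked but skimmed.
print(engagement_score({"dwell_time": 1.0, "read_not_skipped": 1.0}))  # 0.75
print(engagement_score({"liked": 1.0}))                                # 0.03
```

On that scale a like barely moves the ranking, which fits the observation above that liking a page doesn't reliably surface its posts.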
The algorithms are opaque by nature. Even the developers of these deep models can't explain to you why a user's recommendation list is ranked the way it is. Now imagine a user trying to work out what consequences liking a certain item will have on their future recommendations. It's hopeless.
As a result, the developers learn not to rely on these "like" signals, and the users learn to give likes only for their one directly observable effect - incrementing the like counter.
If you know that the content you like will get more visibility, then you are incentivized to upvote what you want other people to see, as opposed to what you yourself want to see.
I’m building just such a system - one where upvotes direct your own future recommendations, as described above - at https://linklonk.com