From its earliest days, digital technology has sought to help us sift through information overload and discover what is relevant. While this promised to broaden our horizons, it has in fact often restricted our exposure to new ideas by trapping us in a “filter bubble”.
The filter bubble, a term coined by Eli Pariser, has long been a concern across social media, online retail, content and information search technologies. Like begets like in a world where AI algorithms and personal confirmation biases lead us into virtual echo chambers, where beliefs are amplified and reinforced by repetition inside a closed system. This can increase social polarisation and lead to partial information blindness, all of which drive cultural tribalism and dogmatism.
While most of the focus on this phenomenon revolves around one-sided ideological exposure and our ability to assess the veracity of information, it also has big implications for customer analytics, specifically for driving next best actions and optimising customer outcomes. There is little robust research and few clear directions for dealing with this.
Behavioural science argues we only perceive and internalise a narrow slice of reality through a thick haze of emotional and cognitive biases. We are naturally distracted and easily misled – which makes it even harder to make decisions and take actions in our own best interests.
Self-fulfilling prophecies
Most recommendation systems and prescriptive analytics for next best action take a similar approach to social media, search engines, retail sites and content platforms; in other words, “people like you did this”, or “people who liked this also liked this”. Clearly these systems are then also vulnerable to perpetuating self-fulfilling prophecies among the very people whose behaviours we seek to modify.
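To make that feedback loop concrete, here is a minimal sketch of the co-occurrence logic behind “people who liked this also liked this” (the customers, products and helper function are all invented for illustration). Notice that it can only ever surface what overlapping peers already do:

```python
from collections import Counter

# Toy interaction data: customer -> set of products held or liked.
# All names here are illustrative, not from any real system.
interactions = {
    "alice": {"savings", "credit_card"},
    "bob": {"savings", "credit_card", "car_loan"},
    "carol": {"savings", "car_loan"},
    "dave": {"pension", "index_fund"},
}

def also_liked(customer, interactions):
    """Recommend items held by customers who share items with `customer`.

    The classic 'people who liked this also liked this' logic: it can
    only ever surface what close peers already do, which is exactly how
    the feedback loop narrows over time.
    """
    own = interactions[customer]
    counts = Counter()
    for other, items in interactions.items():
        if other != customer and own & items:   # any overlap = a "peer"
            counts.update(items - own)          # count their novel items
    return [item for item, _ in counts.most_common()]

print(also_liked("alice", interactions))  # ['car_loan'], never 'index_fund'
```

However broad the catalogue, “alice” is never shown what “dave” does, because no peer bridges the gap.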
This is especially the case when dealing with financial services consumers, where we often seek to ensure we meet their best interests and nudge them towards more positive behaviours (typically a win-win for all).
“We tend to target the more engaged to engage,
ignoring the more pressing need to engage the unengaged.”
For example, we tend to target the more engaged to engage, ignoring the more pressing need to engage the unengaged. We tend to recommend next best actions that reflect either past behaviour or the limited actions taken by closely matched peers. Essentially, this can close off a more progressive growth path, one informed by things they or their peers have never actually considered or done.
Some companies have tried using algorithms to combat these effects. Facebook changed its “Trending” page to show multiple news sources for a topic or event, thereby exposing readers to a variety of viewpoints.
There are also startups building apps with the mission of encouraging users to break out of their cosy echo chambers. Several AI-curated news apps have emerged to encourage exposure to diverse and distinct perspectives, helping users form rational, informed opinions rather than succumb to their biases. They also nudge individuals to read different perspectives if their reading pattern skews towards one side.
Yahoo! Labs also claim to have hit on a way to burst the filter bubble. The approach rests on the observation that although people may hold opposing views on specific, sensitive topics, they often share interests in other areas. They nudge users to read content from people who may hold differing views, while keeping it relevant to those users’ preferred interests.
The result is that individuals are exposed to a much wider range of opinions, ideas and people than they would otherwise experience. It is also claimed that challenging people with new ideas makes them generally more receptive to change.
So what can we do?
Beyond waiting for the technology companies to address this growing conundrum, what can we do to take more control of our own filter bubbles? Here are some quick tips.
Fake it until you make it
Force yourself to turn left when you might naturally turn right. For example, like and comment on posts you would not normally interact with. Look to actively counter your typical choices and watch with glee as the algorithms are thrown into a spin.
“I like to keep an extra profile called ‘Random’ where I will never watch my regular favourite shows and try to seed the algorithm with some off-center choices.”
Start again
Try opening a fresh account or setting up a new profile. You can quickly experiment with this by having a go at retraining your Netflix account. I like to keep an extra profile called ‘Random’ where I will never watch my regular favourite shows and try to seed the algorithm with some off-center choices. A whole world of previously “hidden” content comes to the fore.
Turn on, tune in, drop out
While personal settings tend to be buried, we can all essentially hide from or opt out of the algorithmic yoke. You can clear your Google search history (not just your browser history but your entire history) and browse incognito, anonymously or through a VPN service. You can delete and/or disable tracking cookies, turn off targeted advertisements, and keep social media data private from all but close friends and family.
Implications for marketing and customer experience
What does this mean for those working in marketing and customer experience? What can be done to avoid perpetuating self-fulfilling prophecies? There are some steps we can take.
Wildcards
We can purposefully push the boundaries of experience by looking to challenge the status quo with a combination of randomness and relevancy. The system needs to learn your interests in order to know what is unexpected and surprising for you – yet still relevant.
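One minimal way to sketch this, in the spirit of epsilon-greedy exploration: reserve a fraction of recommendation slots for random picks from outside the customer’s history. The function name and the wildcard_rate parameter below are assumptions for illustration, not an established method:

```python
import random

def wildcard_recommend(ranked_relevant, catalogue, seen,
                       k=5, wildcard_rate=0.2, rng=None):
    """Blend relevance-ranked picks with random 'wildcards' drawn from
    outside the customer's history. wildcard_rate sets the share of
    slots given to surprise: the randomness/relevancy trade-off."""
    rng = rng or random.Random()
    n_wild = max(1, int(k * wildcard_rate))
    picks = [i for i in ranked_relevant if i not in seen][: k - n_wild]
    # Wildcards: anything not yet seen and not already picked.
    pool = [i for i in catalogue if i not in seen and i not in picks]
    picks += rng.sample(pool, min(n_wild, len(pool)))
    rng.shuffle(picks)  # don't signal which slots are the wildcards
    return picks

catalogue = ["isa", "pension", "car_loan", "travel_card", "index_fund"]
print(wildcard_recommend(["isa", "pension"], catalogue,
                         seen={"car_loan"}, k=3, rng=random.Random(42)))
```

The relevance ranking still does most of the work; the wildcard slots are what stop the loop from closing entirely.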
Supervised nudging
Sometimes (well, quite often really) people do not even know what they want themselves. We all assume we want what we have wanted in the past, but we don’t know what we don’t know. We can take a ‘logical leap’ on customers’ behalf, towards things they might find useful and eye-opening but are not yet aware of. We can push the boundaries here by providing more expansive suggestions based on a priori hypotheses and established learnings from other domains.
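As a sketch of what hypothesis-driven nudging could look like in practice, hand-crafted rules can encode established learnings rather than observed peer behaviour. Every rule, field and threshold below is an invented example, not advice:

```python
def supervised_nudges(customer):
    """Suggest actions from a priori hypotheses that the customer's own
    history (and their peers') would never surface."""
    nudges = []
    if customer["age"] < 30 and not customer["has_pension"]:
        # Learning borrowed from another domain: early pension uptake compounds.
        nudges.append("open_pension")
    if customer["savings_balance"] > 3 * customer["monthly_spend"]:
        # Excess idle cash suggests readiness for a first investment.
        nudges.append("intro_to_investing")
    return nudges

print(supervised_nudges({
    "age": 27, "has_pension": False,
    "savings_balance": 9000, "monthly_spend": 2000,
}))  # ['open_pension', 'intro_to_investing']
```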
Triggered
We naturally operate with considerable inertia and status quo bias: we avoid change where possible because it represents uncertainty and effort. However, critical moments of change in our lives can be powerful triggers that open our minds to new perceptions and behaviours. Keeping our fingers on the pulse of these potential trigger events helps us target those most likely to be open to shifting or modifying their “default destiny”; in other words, those most amenable to a recommendation “stretch”.
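A simple sketch of how trigger-event targeting might be operationalised: scan an event log for recent life events and flag those customers as candidates for a recommendation “stretch”. The event names and the 90-day window are assumptions for illustration:

```python
from datetime import date, timedelta

# Hypothetical life events treated as openings for a "stretch".
TRIGGER_EVENTS = {"new_job", "moved_house", "new_child", "large_windfall"}

def stretch_candidates(event_log, window_days=90, today=None):
    """Return customers with a trigger event inside the recent window."""
    today = today or date.today()
    cutoff = today - timedelta(days=window_days)
    return {customer
            for customer, event, when in event_log
            if event in TRIGGER_EVENTS and when >= cutoff}

log = [("alice", "new_job", date(2024, 5, 20)),
       ("bob", "coffee_purchase", date(2024, 6, 1)),
       ("carol", "moved_house", date(2023, 1, 5))]
print(stretch_candidates(log, today=date(2024, 6, 15)))  # {'alice'}
```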
“We can purposefully push the boundaries of experience
by looking to challenge the status quo.”
Seeking minority wisdom
We can seek out clusters of customers who demonstrate relatively unique behaviours and try to tap into the wisdom of this minority. Are they doing things that suggest less established patterns of behaviour that still provide useful next-best-action insights? We can also re-evaluate ‘people that like this like this’ by looking beyond what the majority like and focusing on less common yet significant preferences.
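One way to sketch this: rank actions that are rare across the whole customer base but still taken by a coherent minority, and treat them as candidate next best actions. The data and the max_share threshold are invented for illustration:

```python
from collections import Counter

# Toy data: customer -> set of observed actions. All names invented.
actions = {
    "alice": {"savings", "credit_card"},
    "bob": {"savings", "credit_card"},
    "carol": {"savings", "credit_card"},
    "dana": {"savings", "micro_investing", "round_up_saver"},
    "evan": {"savings", "micro_investing", "round_up_saver"},
}

def minority_signals(actions, max_share=0.5):
    """Actions taken by a minority of customers, rarest first.

    Instead of the majority pattern, surface what the unusual
    clusters do: 'people that like this like this' over the long tail."""
    counts = Counter(a for items in actions.values() for a in items)
    n = len(actions)
    rare = {a: c / n for a, c in counts.items() if c / n <= max_share}
    return sorted(rare, key=lambda a: (rare[a], a))  # tie-break by name

print(minority_signals(actions))
# ['micro_investing', 'round_up_saver'] (2/5 each); the majority
# pattern (savings + credit_card) is filtered out.
```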
Degrees of separation
We can also push the boundaries of our peer or behavioural comparisons. We can recommend a next best action based on others who are like you but still different in critical ways, e.g. the actions of someone who looks a lot like you but is fundamentally ‘doing better’ in some respect and doing things that you are not. This can be highly effective in nudging people towards improved financial wellbeing.
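As a sketch, assuming we hold profile features and a wellbeing score per customer (all fields below are invented), the logic is: restrict to peers strictly “doing better” on the metric, pick the most similar profile among them, and recommend the actions they take that the target does not:

```python
def aspirational_peer_actions(target, peers, feature_keys, metric="wellbeing"):
    """Among peers strictly 'doing better' on the metric, find the most
    similar profile and return the actions the target hasn't taken."""
    better = [p for p in peers if p[metric] > target[metric]]
    if not better:
        return set()
    def distance(p):  # simple L1 distance over the profile features
        return sum(abs(p[k] - target[k]) for k in feature_keys)
    peer = min(better, key=distance)
    return peer["actions"] - target["actions"]

target = {"age": 34, "income": 60, "wellbeing": 0.4,
          "actions": {"savings"}}
peers = [
    {"age": 35, "income": 62, "wellbeing": 0.7,
     "actions": {"savings", "pension", "emergency_fund"}},
    {"age": 55, "income": 150, "wellbeing": 0.9,
     "actions": {"savings", "yacht_loan"}},
]
print(sorted(aspirational_peer_actions(target, peers, ["age", "income"])))
# ['emergency_fund', 'pension']: drawn from the close peer, not the distant one
```

Constraining the stretch to a genuinely similar peer is what keeps the recommendation attainable rather than aspirational noise.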
The worst way to predict the weather
In a world of information overload, AI has long promised to curate our experiences and exposure to ideas. While some believe the threat of virtual echo chambers is overstated, observers such as Sean Schuster-Craig argue these systems assume they can “tell what we want to see based on what we have seen”.
“This is the worst way to predict the weather,” Schuster-Craig says.
“If this mechanism isn’t just used to predict the weather, but actually is the weather, then there is no weather…[it’s] a weatherless world.”
So we need to ask ourselves: are we happy to accept such a “weatherless world”, or do we need to actively take more control of our ability to discover the “new”, embrace some randomness, and forge previously unforeseen paths? This has major implications not only for our information consumption but also for our ability to effect behavioural change in the best interests of consumers.