By Gerrit Van Wyk.
Humans flock like birds.
Our human social world is complex, which means we are interconnected and respond to others around us. It means, as Stacey suggested, in some ways we behave like flocking birds.
We marvel at the sight of starlings murmurating, swirling together in a flock that constantly changes shape. On average, each bird sees the seven birds nearest to it and adjusts its behavior in response to them. No bird knows where the flock is going, and none can by itself significantly change the whole. The logic is circular: the structure of the flock shapes the behavior of the birds, which feeds back to shape the structure, and so on. Information travels through the flock along chains of connections.
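The flocking behavior described here can be sketched as a toy simulation. This is a minimal illustration, not a model of real starlings: it implements only an alignment rule (each bird nudging its heading toward the average of its seven nearest neighbors), and the bird count, step count, and 0.1 alignment weight are arbitrary choices for the sketch.

```python
import math
import random

random.seed(42)

N_NEIGHBORS = 7  # each bird reacts only to its seven nearest neighbors

def nearest_neighbors(i, birds, k=N_NEIGHBORS):
    """Indices of the k birds closest to bird i (excluding itself)."""
    dists = [(math.dist(birds[i][0], birds[j][0]), j)
             for j in range(len(birds)) if j != i]
    return [j for _, j in sorted(dists)[:k]]

def step(birds, align=0.1):
    """One update: every bird shifts its heading toward its neighbors' average."""
    new = []
    for i, (pos, vel) in enumerate(birds):
        nbrs = nearest_neighbors(i, birds)
        avg_vx = sum(birds[j][1][0] for j in nbrs) / len(nbrs)
        avg_vy = sum(birds[j][1][1] for j in nbrs) / len(nbrs)
        vx = (1 - align) * vel[0] + align * avg_vx
        vy = (1 - align) * vel[1] + align * avg_vy
        new.append(((pos[0] + vx, pos[1] + vy), (vx, vy)))
    return new

# 50 birds with random positions and headings; no bird steers the whole,
# yet repeated local adjustment pulls the headings toward a shared direction.
birds = [((random.uniform(0, 100), random.uniform(0, 100)),
          (random.uniform(-1, 1), random.uniform(-1, 1)))
         for _ in range(50)]
for _ in range(200):
    birds = step(birds)
```

No bird in this sketch has a global view; the coordinated pattern emerges entirely from the chained local adjustments, which is the point of the analogy.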
Patterns emerge from the interconnections and interactions of complex entities in unplanned ways, which is how our social world is structured and works as well.
There is evidence that social media works the same way, but what makes it dangerous is that algorithms increasingly determine what information we receive and about whom, or in other words what we pay attention and react to. "Friend" becomes something we do to get more followers; the more followers, the more important and popular you are. Followers attract money: the more followers you have and the more time they spend with you, the more ads they see. Algorithms gather data from activity on a site and use it to suggest tailored content and other people to network with. This grows into groups and online communities with statistically similar interests.
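The ranking systems real platforms use are proprietary; the sketch below only illustrates the principle with made-up activity data and a simple dot-product similarity, which is an assumption for illustration, not any platform's actual method.

```python
from collections import Counter

# Hypothetical engagement logs: who interacted with which topics (made-up data)
activity = {
    "ana":   ["birds", "gardening", "birds", "photography"],
    "ben":   ["birds", "photography", "hiking"],
    "carol": ["politics", "politics", "conspiracy"],
    "dan":   ["politics", "conspiracy", "conspiracy"],
}

def interest_vector(user):
    """Normalized topic counts for one user."""
    counts = Counter(activity[user])
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def similarity(u, v):
    """Overlap between two users' interest vectors (dot product)."""
    a, b = interest_vector(u), interest_vector(v)
    return sum(a[t] * b.get(t, 0) for t in a)

def suggest_connections(user, k=2):
    """Rank other users by interest similarity -- the statistical 'flocking' step."""
    others = [u for u in activity if u != user]
    return sorted(others, key=lambda u: similarity(user, u), reverse=True)[:k]
```

Here `suggest_connections("ana")` ranks "ben" first because of shared bird and photography activity, while "carol" gets steered toward "dan"; the statistics work identically for both pairs, whatever the content of the shared interest.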
Neither humans nor algorithms think about this, so the same process also creates communities of conspiracy theorists, white supremacists, anarchists, and so on. Statistical algorithms create flocks, but they can't tell whether a flock will be a collection of songbirds or of predators. The algorithm picks up an interest, feeds us more of the same, which gets shared with the seven others closest to us, and cascades out from there. And we are unaware it is happening. Eventually, people act on it by storming a parliament, banishing contradictory opinions, and so on. Shut down one set of accounts to stop it and, like a flock of birds, the same people quickly reform elsewhere.
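The cascade mechanism can be sketched as a breadth-first spread over a follower graph. The names and connections below are made up for illustration; the point is only that each share exposes a new layer of followers, so one seed can reach everyone connected to it.

```python
from collections import deque

# Hypothetical follower graph: who sees a user's shares (made-up network)
followers = {
    "seed": ["a", "b", "c"],
    "a": ["d", "e"],
    "b": ["e", "f"],
    "c": [],
    "d": [], "e": ["g"], "f": [], "g": [],
}

def cascade(start):
    """Breadth-first spread: each share exposes that user's followers in turn."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        user = queue.popleft()
        order.append(user)
        for f in followers.get(user, []):
            if f not in seen:
                seen.add(f)
                queue.append(f)
    return order
```

Running `cascade("seed")` reaches the whole connected network layer by layer, which is also why deleting one set of accounts merely removes some nodes: any surviving connected cluster can restart the spread.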
The fundamental problem with today's information overload is that we can only pay attention to a limited amount of it, few people can pick out what matters, and fewer still can fact-check it. Filtering information to reduce the overload introduces bias, and we can't know in advance whether that bias will turn out good or bad. Today, algorithms choose for us what we pay attention to, and algorithms are written by humans with biases of their own.
As Renée DiResta points out, this is not a problem of free speech, as many frame it; it's a problem of technology creating networks with terrible social consequences. Do we change the platforms, or do we try to control them? In something as complex as human society, either is daunting.
There is a deeper issue here. As Sherif's famous Robbers Cave experiments showed, it's in our nature to form groups on the barest of pretexts, and once we form them, we rapidly create a them-us membrane around ourselves. We have an overwhelming need to belong, and once in a group, it becomes part of who we are. Despite contradictions, we'll defend the group, our buddies in it, and our self-image, metaphorically and sometimes literally to the death.
The moment you try to start a dialogue about free speech or controlling online media, you cleave people into those for and against, and both parties mobilize every resource they can lay their hands on to gain an advantage. Lobbying starts, media accusations fly, lawyers get involved, powerful economic forces step in, and so on.
Over generations of human flocking, patterns emerged which we attach ourselves to because they create some sort of stability. We are Americans/Canadians/Russians/Germans/etc., white/black/First Nations, liberals/conservatives, Catholic/Protestant/Muslim/etc., and much more. We pick from each category until we identify someone called John, or Betty, or whoever. One pattern dominating North America is the so-called free market system. This flock is the King Kong of society, and few who take it on survive. That is what lies behind the free speech, media control, and similar conversations, and it explains why they don't get resolved. Too many vested interests are afraid of losing the money generated by ads and algorithms to talk about the collateral damage.
Why do we stay in and defend groups, even if it’s a Nazi death squad? We have a primordial fear of being excluded, since exclusion in early small societies meant death. So, when someone attacks my group, I’m afraid they may destroy it and me with it. That’s a powerful incentive to dig in.
Sherif overcame the warring by presenting the groups with a common threat they could only overcome by cooperating. That remains the best way to heal them-us splits and create community.