An analysis of how information technology changes group behavior
As we react to tragedies around the nation, from the Boston Marathon bombings to the fertilizer plant explosion in West, Texas, our reactions are heightened by our social media communities. Not only are our feelings more public should we choose to share them, but the feelings of others are ever-present. This can change group behavior in interesting and sometimes negative ways. Read on for more from a timely new study on how information technology amplifies irrational group behavior.
Information technologies can mislead us by magnifying social processes that distort facts and make us act contrary to our own interests, suggests new research published in the journal Metaphilosophy. Vincent F. Hendricks, professor of philosophy at the University of Copenhagen, says, “Group behaviour that encourages us to make decisions based on false beliefs has always existed. However, with the advent of the internet and social media, this kind of behaviour is more likely to occur than ever, and on a much larger scale, with possibly severe consequences for the democratic institutions underpinning the information societies we live in.”
In the article, Pelle G. Hansen and Rasmus Rendsvig, together with Hendricks, analyse a number of social information processes that modern information technology amplifies.
A pop culture example
In the movie Sex and the City, Carrie Bradshaw reads a book entitled Love Letters of Great Men, a book that does not actually exist. When fans of the movie searched for it on Amazon, the search engine suggested Love Letters of Great Men and Women instead, leading many of them to buy a book they did not want.
Hendricks says, “This is known as an ‘informational cascade’ in which otherwise rational individuals base their decisions not only on their own private information, but also on the actions of those who act before them. The point is that, in an online context, this can take on massive proportions and result in actions that miss their intended purpose.”
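The mechanics are easy to see in a toy simulation. The sketch below is a simplified, counting-based version of an informational cascade, not the formal model from the article; the function name, parameters, and tie-breaking rule are illustrative assumptions. Each simulated agent gets a noisy private signal about which of two options is better, but also sees the choices made before them, and once the visible majority outweighs any single private signal, everyone falls in line.

```python
import random

def simulate_cascade(n_agents=20, signal_accuracy=0.7, truth="A"):
    """Simulate a simple counting-style informational cascade.

    Each agent receives a noisy private signal about which option is
    better, observes the choices of everyone before them, and picks the
    option favoured by the combined evidence (prior choices count as
    votes, with the private signal breaking ties).
    """
    other = "B" if truth == "A" else "A"
    choices = []
    for _ in range(n_agents):
        # Private signal: correct with probability `signal_accuracy`.
        signal = truth if random.random() < signal_accuracy else other
        # Tally predecessors' visible choices plus the agent's own signal.
        lead = choices.count("A") - choices.count("B")
        lead += 1 if signal == "A" else -1
        if lead > 0:
            choices.append("A")
        elif lead < 0:
            choices.append("B")
        else:
            choices.append(signal)  # tie: fall back on the private signal
    return choices

# Run it a few times: once the first few choices tilt the same way, later
# agents follow the crowd even when their own signals point elsewhere.
print(simulate_cascade())
```

On most runs the group converges on the better option, but every so often a couple of unlucky early signals lock everyone onto the wrong one, which is exactly the cascade dynamic the Amazon example illustrates.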
And the more serious consequences
We observe similar behavior in higher-stakes settings, including our democratic process. In online forums, discussions take place among people who have chosen to interact with others who share their worldview, turning the forum into an echo chamber for their beliefs. Google and Facebook have even designed algorithms that use your click history to serve you “relevant” information, which tends to be information that fits what you already believe. According to Professor Hendricks, this is a problem from a democratic perspective, because you may never encounter views or arguments online that contradict your own.
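To make the filtering mechanism concrete, here is a deliberately crude sketch of click-history-based ranking. It is not Google's or Facebook's actual algorithm; the function, data format, and topic labels are invented for illustration. It simply shows how ranking by past clicks tends to push dissenting material out of view.

```python
from collections import Counter

def rank_by_click_history(click_history, candidate_items, top_k=3):
    """Rank candidate items by how well their topic matches past clicks.

    A crude stand-in for personalised ranking: items on topics the user
    has clicked before float to the top, so unfamiliar or dissenting
    topics quietly drop out of view.
    """
    topic_weights = Counter(item["topic"] for item in click_history)
    ranked = sorted(
        candidate_items,
        key=lambda item: topic_weights.get(item["topic"], 0),
        reverse=True,
    )
    return ranked[:top_k]

# A user who has mostly clicked stories matching their worldview...
history = [{"topic": "policy_X_good"}] * 5 + [{"topic": "sports"}] * 2
candidates = [
    {"title": "Why policy X works", "topic": "policy_X_good"},
    {"title": "Match report", "topic": "sports"},
    {"title": "The case against policy X", "topic": "policy_X_bad"},
]
# ...is shown more of the same, and the dissenting piece falls off the list.
print([item["title"] for item in rank_by_click_history(history, candidates, top_k=2)])
```

Nothing in this toy ranker is malicious; it just rewards familiarity, and that alone is enough to produce the echo chamber Hendricks warns about.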
Food for thought as we all react in different ways to the challenges facing our nation, from terrorism to healthcare to taxation.
Read more about Professor Vincent F. Hendricks’s Initiative for Information Processing and the Analysis of Democracy at vince-inc.com/vincent/?p=1815