There is a general fear about Facebook’s algorithm. I know. My Facebook newsfeed – itself an algorithm – daily shows me my friends’ links to stories about that algorithm. Generally, this fear is about manipulation. Users of Facebook are worried that their emotions, their consumption habits, or possibly even their access to news is being manipulated. They are probably right. The algorithm manipulates what data appears based on a tricky combination of association and prediction. A mention of something can trigger an association and place a news item or product in one’s feed. I guess that is some form of manipulation (though not too far off from other forms, such as a history of catalog association/manipulation based on purchasing). Upset with this, many users want to write about how they manipulated the manipulator. In doing so, they prove that manipulation, as some sort of cultural force, is causing us to think in certain ways or not think at all. Very cultural studies in approach.
But what I find to be more of a form of manipulation than a Facebook algorithm is the shared link. The shared link is not an algorithm (unless you count your ability to see a shared link as being algorithmic). The shared link is what you see when a Facebook friend feels an issue or idea is important, funny, worth reading, reflective of a belief, and so on. Not all shared links are manipulative, of course. But some, particularly those based in politics, very much are. The poster of the link wants to manipulate a general consensus of rage or apathy or some other feeling in order to gather attention. I have done it many times, though mostly with Chronicle of Higher Education pieces that I think are not accurate. Since the Humanities have long paid attention to issues of race, gender, class, and ethnicity, it is no surprise that Humanities Facebook users would want to share links that reflect these aggregated positions (as forms of disciplinary interpellation) to remind or cajole a professional audience not to lose sight of such concerns (and thus be manipulated in the way, I suppose, I am being manipulated to like Toyota because Friend X likes Toyota).
Nowhere is this more clear than in two recent issues: Israel’s war with Hamas and the recent incidents in Ferguson, Missouri. These two incidents have been very much present in my Facebook newsfeed over the last few weeks. They are obviously not the only global conflicts/wars occurring, nor the only incidents of police senselessly killing or assaulting African American women or men. But they are the majority, if not the only, links I see these days reflecting such issues. And that is important. It is important because it reflects how link manipulation focuses attention in odd ways based on aggregated feelings, emotions, positions, beliefs, etc. No doubt, for a certain audience, Ferguson draws on already aggregated frustrations and anger over a man being killed in a chokehold by a police officer or a professor being assaulted by a police officer for crossing in the middle of the street. And more. Aggregations are networks of belief. That was my argument about John Pike.
Whatever one thinks about either issue is not relevant to me here. Instead, I’m interested in link sharing as the building of manipulated discourse via a sense of aggregation. The Facebook newsfeed aggregates. It assembles positions and ideas within one’s Facebook space via a series of patterns and associations. Links are very similar. They are shared as headlines (one does not have to click through to get the gist of the link’s impact, as Buzzfeed and other sites show us). As merely a link, these shared headlines confirm McLuhan’s notion of cool media. I read the headline, I fill in the gaps, I draw a conclusion – thus, in McLuhan fashion, I am involved and engaged (which differs from other types of involvement and engagement). That filling in, though, is often a confirmation of my belief system as it has been aggregated over some period of time (a lifetime, weeks, days, etc.) based on the images and headlines I’ve already internalized as some sense of reality. I wrote about this process in my College English piece via Flusser’s concept of the technical image. The technical image, we can also say, is not even an image at times. At least not a material image. In other words, it is an image because I have an image of what something is (injustice, war crime, abuse, outrage, etc.). I don’t have to click through or even put the link into any other context (what we often call “critical thinking”) because I fill in the details with my past aggregations. In some cases, this is also what I call non-referential critique.
I recall one recent incident where a noted scholar posted a link whose headline said something to the effect of “Israel admits Hamas did not kidnap teens.” If one clicked through, one would see that no Israeli officials were cited in the article, nor was any proof offered beyond hearsay. And, in fact, that statement has proven to be false, as a recent arrest confirms that the murderers were Hamas. But…if one followed the comments after the link was shared, one sees the effect of cool media as commenters fill in the gaps with: “I knew it.” “It was all a ploy.” “See?” These comments reflect an already aggregated position that there must be a Jewish conspiracy here (following the long tradition of so-called Jewish control and plots). “I knew it” means “how could it have been any other way? Those Zionists did it again!” These comments reflect an already aggregated racism, further triggered or even manipulated by the shared link.
I write this partially for myself and my attempt again this semester to teach a large core course on Facebook in some kind of interesting way. I write this not to say: stop posting links. That sounds silly. I write this instead as a short note to reflect that what one audience constantly thinks is “manipulative” (Facebook algorithms) is not really that manipulative, nor is it really outside the network of other forms of manipulation, such as the shared (and often political) Facebook link.