June 22, 2024 - Vol. 2 Issue 28
Welcome to Infophilia, a weekly letter about the human love of information and connections. This is one of the places where I’m developing the infophilia framework: an evolutionary, social, positive psychology of information, and avant-garde research. If this is your first time, I’m glad you’ve joined us. Past posts are listed in the Archive, and open-access Infophilia videos are here and here. Enjoy, and thanks for reading!
Cite this article as: Coleman, Anita S. (2024, June 22). The AI Information Game: Ground Truth. Infophilia, a positive psychology of information, 2 (28). https://infophilia.substack.com/
The AI Information Game: Ground Truth
I am extending the phrase ‘games people play’ to refer to human interactions with AI and information. These ‘games’ include creating misleading content, exploiting biases, and the many manipulations of truth and lies across digital formats, digital cultures, and more.
A recent theoretical research paper by Hicks, Humphries, and Slater (2024) has received attention for its analysis of ChatGPT’s mistakes and for its conclusion that “ChatGPT is a bullshit machine.” [i] According to the authors, ChatGPT’s errors should be called neither ‘hallucinations’ nor ‘confabulations’ (another term that has been proposed by others), but rather “bullshit,” following Frankfurt’s theory. [ii]
One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Harry Frankfurt, On Bullshit (2005), p. 3
Harry G. Frankfurt, in his 2005 book, a NY Times bestseller, presents a philosophical analysis of bullshit in politics and characterizes it as follows:
· Bullshit is not necessarily false or untrue;
· Bullshit is not necessarily intended to deceive;
· But rather, bullshit is an attempt to create the impression that what is being said is true or meaningful.
The core of bullshit, according to Frankfurt, is not its content but rather its intention and tone. Bullshit is characterized by a lack of concern for truth or accuracy, and the two key features of bullshit are:
1. The absence of genuine interest in the truth, and
2. A willingness to sacrifice truth for the sake of appearances.
Bullshitters are more harmful than liars.
Frankfurt claims that people often engage in bullshit unintentionally, as a result of social pressures, conformity, or laziness. However, he also notes that some individuals may intentionally use language to manipulate others or create a false impression. He concludes that recognizing and understanding bullshit is essential for maintaining intellectual integrity, honesty, and critical thinking. Bullshitters are more harmful than liars. Frankfurt recommends that people should strive to eliminate bullshit from their own speech and behavior, as it has negative consequences for personal relationships, public discourse, and societal values.
I digress to share an observation and issue an alert. Over the past few decades, there seems to be a growing acceptance of coarse language in academic scholarship. This trend spans traditional academic fields as well as the writing of bloggers and journalists; Cory Doctorow, for example, favors provocative terms for the degradation of online platforms over neutral descriptors like "platform decay" and "algorithmic attention rents." A recent Wired article employed similarly colorful terminology when reporting on errors made by Perplexity.ai. Given my previous writings on Perplexity, I want to alert you to this development, and naturally, I’ll write more later.
Indifference to the truth is extremely dangerous… Frankfurt (cited in Hicks, Humphries, Slater, 2024, p.38)
The interaction between humans, AI, and information has become a complex game. The stakes are high. The language is increasingly colorful. Appeals that manipulate our emotions and sway our opinions are on the rise.