Infophilia: A Positive Psychology of Information


From Book Bans to Data Demands

Reimagining Intellectual Freedom From Earth to Orbit

Anita Sundaram Coleman
Jan 19, 2026
Infrastructure and its dependents: What kind of future are we building when the platforms mediating access are heading to orbit? Crystal Cove. Photo by author.

“Consensual hallucination.” We see the trade. We accept it anyway. That’s capitalist convenience.

Infophilia, a Positive Psychology of Information | January 19, 2026 | Vol. 4 Issue 5

✨ Welcome to Infophilia, a weekly letter exploring how our love of information and connections can help us all thrive, individually and collectively.

Intellectual Freedom | Civic Infophilia | Technophilia | Toolbox | Wellbeing

📌 Access & Attribution: This serial always offers free previews and occasional fully open-access issues to keep scholarship accessible. Students and those facing financial barriers can request complimentary access. If you find value here, please cite the original work and consider supporting this public scholarship through a subscription. Proper attribution sustains this work and models healthy information engagement.


Cite as: Coleman, A. S. (2026). From book bans to data demands: Reimagining intellectual freedom from earth to orbit. Infophilia: A Positive Psychology of Information, 4(5).


A Note on Timing: This essay follows up on “The Self-censorship They Didn’t Want to Know About,” which opened our January focus on reimagining intellectual freedom. In the second week, we published “Trust in Turbulent Times, An Infophilia Roundup.” Last week, the third in January, brought two bonus issues: “Librarians Missing from America’s Stamps” (what philatelic silence reveals about knowledge work) and “A Joyful Prophet of Freedom” (a tribute to Greg Newby, 1965-2025, builder of the commons and internet freedom advocate). All of these were fully open access.

We publish this essay on Monday, Martin Luther King Jr. Day, honoring his legacy. Dr. King’s work reminds us that freedom—intellectual, civic, economic—requires a constant and collective vision. In that spirit, the main essay (~2,200 words) is fully open access.

About the image: The beach rocks with mollusks and glinting diamond sunlight continue the self-censorship essay’s tidepool imagery: infrastructure (the rock platforms), life that colonizes it (mollusks = us), surveillance/data points (glinting diamonds), and the unreachable horizon (orbit).

From Book Bans to Data Demands

Reimagining Intellectual Freedom from Earth to Orbit

My gold iPhone—part of the flagship release in 2017 and one of Apple’s best-selling generations—has been slowly dying. Apple stopped supporting it in 2025, before it was even ten years old. Preparing for that moment, I knew the choice had finally come: stay in Apple’s ecosystem, switch to Android (and move deeper into Google’s world), or do something harder—degoogle entirely.

Degoogling means systematically replacing Google services with privacy-focused alternatives. It has developed into a visible but niche movement. People often start by changing their default browser and search engine, because the free Gmail, Calendar, and Drive stack is hard to leave. A phone is even harder.

But this isn’t about whether you should degoogle. It’s about what happens when we outsource the infrastructure of intellectual life to platforms we don’t control. It’s about the choice Morpheus offers Neo in The Matrix: take the blue pill and stay comfortable, or take the red pill and “see how deep the rabbit hole goes.”

Degoogling points to something bigger. An iPhone dying after eight years isn’t unique—it’s the economic pattern of our times. Phones designed to die. Software that abandons working devices. “Free” platforms extracting value from every click. Wearables—from Fitbits to Oura rings—harvesting our health data. (Google acquired Fitbit in 2021, and legacy Fitbit accounts will be disabled in February 2026 unless migrated to Google.) Systems that lock you in, then change the rules.

Some people call it platform power. I call it capture. We, the people, are feeling it.

Attention capture. Data capture. Eyeballs capture. People capture.

Are you the person who can leave a platform when its values diverge from yours? Or the person so enmeshed that leaving feels impossible?

Are you practicing intellectual freedom—the capacity to think for yourself—or outsourcing that capacity to feeds and engines optimized for engagement, not enlightenment?

Here’s the reframe we need: Capture is an intellectual freedom problem: attention, data, eyeballs, people. I’ve explored aspects of capture before—digital hoarding, infrastructure literacy, wicked problems, the dead internet. But platform capture of intellectual freedom requires seeing the full pattern. Assuming we can solve platform power with tech solutions—degoogling, better privacy tools, smarter regulation—risks missing what we actually need: a fundamental reimagining of what intellectual freedom means when the infrastructure of thought itself has been privatized.


People Capture: Platform Power Made Visible

“There’s not much point in me becoming chair of the Education sub-committee,” I explained last fall, “when the advocacy group uses Google Groups.” My friend was empathetic as she rationalized “capitalist convenience”—two words sitting on the jagged edges of three recent developments:

First: Publishers are fighting to join the Google Generative AI Copyright Litigation, accusing Google of copying millions of books—often from pirated sources—to train its Gemini AI. Disney sent a cease-and-desist claiming Google trained models on Disney IP and now lets users generate “pristine” Disney characters via features like NanoBanana, which raises privacy and misinformation concerns.

Second: Google announced Project Suncatcher—plans to put data centers in orbit, beyond earthbound regulation. (Google isn’t the only one.)

Third: OpenAI announced ads in ChatGPT. The “free” AI assistant millions use daily will now interrupt queries with advertising, monetizing intellectual vulnerability.

Three platforms. One pattern: Offer convenience. Build dependency. Extract value. Move beyond accountability.

But convenience is not value-neutral. It’s a trade. And most of us haven’t reckoned with what we’re trading.

Here’s what platform power, i.e. people capture, looks like:

When a private company controls infrastructure that could be public—like the railroads, eventually regulated because private control threatened the public good.

When the same company provides your email, storage, calendar, documents, search, and AI—and can change terms unilaterally.

When “free” means “we monetize your behavior.” The product isn’t the service—it’s data about you using it.

When platforms engineer infinite scroll to capture attention, making our K-12 schools compete with TikTok memes and YouTube Shorts for children’s growing focus and wonder.

When people, schools, libraries, and universities become so dependent that leaving means losing years of institutional memory (and individual memory, too).

When companies move infrastructure literally to orbit, beyond regulatory reach.

The capture is designed to be invisible. The tools work. Integration is seamless. Price is “free.” By the time you realize you’re locked in, leaving feels impossible.

Technolust and Technophilia

People capture happens because of technolust, which is very different from technophilia.

Technolust is what librarian Bill Badke calls being seduced by “the great big AI rubbish heap”—uncritically celebrating innovation while ignoring power dynamics.

Technophilia is loving technology enough to ask: Who controls this? What am I trading? What kind of person does this platform want me to be? How does that affect others, our democracy?

Technophilia is librarians fighting Libby’s AI features. It’s asking about Google Groups even when it’s inconvenient. It’s choosing sovereignty over seamlessness.

It’s making people capture visible so we can decide whether to accept it.

Offer convenience. Build dependency. Extract value. Move beyond accountability. A captive people.

The Captured Selves We’re Becoming

In the self-censorship essay, I argued that information health connects to intellectual freedom—that outsourcing consciousness construction to algorithmic feeds creates a Huxleyan threat: erosion, from within, of our capacity to think independently.

Now add economic and financial health to that equation. Platform capture isn’t just about privacy—it’s about who controls the economic infrastructure of knowledge itself. Who profits when students ask questions? Who owns data generated by intellectual exploration? This economic extraction is also intellectual freedom extraction.

In the library world, Sarah Lamdan calls this data cartels: companies that “control and monopolize our information” across multiple layers—not just consumer platforms like Google, but the databases libraries depend on. RELX (owner of LexisNexis and Elsevier), Clarivate (Web of Science), EBSCO—these aren’t just vendors. They’re vertically integrated information oligopolies that control what gets published, indexed, discovered, and priced.

Data cartels control knowledge vertically—Elsevier decides what gets published, Google decides how it’s discovered, OverDrive decides how it’s read. At every layer: extraction.

And now they’re adding AI layers. Libraries are watching OverDrive (Libby reader) integrate AI into the book-lending experience. Some librarians are fighting it, recognizing that AI features in the tools we use to lend books mean platforms are extracting value even from reading—logging what people search for, what they hesitate over, what they abandon.

Platform technologies (not just Google) with their “free” tools are building us into particular kinds of selves:

A self whose intellectual exploration is mediated by algorithms optimized for the platform’s business model, not understanding.

A self whose thoughts aren’t private—every query, draft, pause logged and monetized.

A self who cannot easily leave—so enmeshed that alternatives seem impossible.

A self who has stopped asking hard questions because tools work and thinking about it overwhelms.

A self who inadvertently ensnares others, because opting out means opting out of civic participation.

A self who has become complicit in a K-shaped economy. That advocacy group using Google Groups—fighting book bans and supporting intellectual freedom while unable to see platform capture because of “capitalist convenience.” Fighting for the right to read while using infrastructure that logs every discussion and could shut down their community with a terms-of-service change. Choosing pragmatic inertia, prioritizing mission, preferring seamlessness over sovereignty. Switching requires time and money that advocacy groups lack, perpetuating the capture cycle.

We’re captives.

At every layer: extraction.

In William Gibson’s Neuromancer, cyberspace is “a consensual hallucination experienced daily by billions of legitimate operators.”

Consensual hallucination. We see the trade. We accept it anyway. That’s capitalist convenience.


We’ve Been Watching This Coming

I’ve lived this pattern before.

In the late 1980s, I helped implement NOTIS—the Northwestern Online Total Integrated System—at a multi-campus library in California. The library had signed a contract but hadn’t been able to implement it. I was hired to make it work.

It did work. Brilliantly. NOTIS was the first truly integrated library system. The integration was revolutionary.

I was practicing technolust without knowing it. I paid no attention to who owned NOTIS.

Here’s what I didn’t notice: before I was hired, NOTIS had already become NOTIS Systems, Inc.—a for-profit spun out from Northwestern in 1987. Four years later, Ameritech purchased it.

The pattern: Nonprofit university creates a public good → Spins off as for-profit → Gets acquired → Libraries become dependent customers rather than stakeholders.

Sound familiar?

We welcomed convenience without ensuring governance. Now it’s happening again.

That’s what Google did with Book Search, Gmail, Workspace, Classroom. That’s what’s happening with Gemini, with ChatGPT ads, with Fitbit forcing Google accounts, with OverDrive adding AI to Libby. Offer integration. Build dependency. Extract value. Move control away from public institutions.

When Google proposed to digitize millions of books in the early 2000s, libraries cautiously welcomed it. Associations warned about concentration of power, weak privacy, dependency lock-in. The settlement was rejected, but by then we were already dependent.

We welcomed convenience without ensuring governance. Now it’s happening again—Google training Gemini on potentially pirated books, embedding AI in tools we depend on, planning to move compute to orbit.

The pattern across platforms: Offer convenience. Build dependency. Extract value. Move beyond reach. Monetize everything.


Can This Be Our Red Pill Moment?

At the 2024 Charleston Conference, when a librarian asked “When is growth enough?” another librarian—not a vendor—answered: “Growth is never enough.”

The profession had internalized platform logic.

The profession stewarding intellectual freedom had not only internalized platform logic; it had forgotten the corollary to the fifth law of library science. The law says the library is a growing organism; common sense supplies the corollary: a growing organism experiences growing pains. This is how people capture works—platforms don’t just control infrastructure, they shape what we think is possible, inevitable, professional.

Can this be our red pill moment? Not just individual degoogling choices, but collective recognition of what we’ve traded? Can we practice technophilia—demanding technology serve the selves we want to become?

What we do to our future selves matters. And right now, we’re outsourcing their capacity to think independently, one convenient trade at a time.

Can we see the pattern now? Because seeing it changes what intellectual freedom must mean.


What Intellectual Freedom Means When Platforms Escape to Orbit

You cannot think freely when every query generates profit for someone else. People capture.

Intellectual freedom is “the right to think for yourself.” But what does that mean when:

The AI answering students’ questions was trained on undisclosed, potentially pirated sources?

Every search is logged and monetized, building profiles that shape future behavior?

Infrastructure is moving literally to space, beyond democratic governance?

Data cartels control not just consumer platforms but the scholarly databases libraries depend on?

The tools librarians use to lend books are themselves designed to extract value from reading?

This isn’t separate from intellectual freedom. This is the intellectual freedom crisis of our time.

Intellectual freedom in 2026 isn’t just about what books are on shelves, though that fight continues. It’s about:

  • Who controls platforms shaping what you see, think, become

  • What happens to data when you search, read, explore

  • Whether you can leave without losing years of work

  • Where infrastructure lives and who can regulate it

  • Who profits from intellectual exploration

  • Whether knowledge infrastructure serves public good or private extraction

Platform capture harms economic and financial health, and that harm is intellectual freedom erosion. When knowledge infrastructures are controlled by platforms optimized for extraction, when intellectual exploration generates profit for data cartels and platforms rather than understanding for communities, when the tools librarians use to support reading are themselves designed to monetize it—we’re not just losing privacy. We’re losing the economic foundation that makes independent thought possible.

In The Matrix, Morpheus tells Neo: “You take the red pill—you stay in Wonderland, and I show you how deep the rabbit hole goes.”

The rabbit hole goes to orbit.

The Gemini litigation, Project Suncatcher, ChatGPT ads, data cartels, Libby AI—these reveal that platforms providing “free” tools are training on contested data, embedding dependency, escaping Earth (environmentally cleaner, perhaps, but placing infrastructure beyond regulation), and monetizing intellectual vulnerability.

Platform capture is heading to orbit. A captive people.

Information health. Economic health. Financial health. Intellectual freedom. They’re all connected. You cannot think freely when every query generates profit for someone else. You cannot explore intellectually when the infrastructure mediating that exploration is designed to extract value from your not-knowing. You cannot practice self-rule when the platforms building your consciousness optimize for their shareholders, not your flourishing.

Every “free” tool is a trade. Every platform shapes what kind of self you become. Infrastructure is heading to orbit.

What does intellectual freedom mean when the infrastructure of thought itself has been privatized?

When data cartels control access to knowledge? When AI built on pirated books answers students’ questions? When the tools librarians use to lend books extract value from reading?

These aren’t rhetorical questions. They’re the questions we need to answer before our capacity to ask them has been fully outsourced.

In The Matrix, Morpheus asks: “What’s your place in this epic battle?”


Beyond the preview: Why Bandcamp's stand against AI music is technophilia in action, how the Google Books settlement set the pattern we’re repeating, why OpenAI’s discovery loss matters for IF transparency, what publishers monetizing AI licensing reveals about data cartels. Plus reader responses, notes, and housekeeping.

© 2026 Anita Sundaram Coleman, Irvine, CA | ISSN: 3069-6526