Exclusive: Vauhini Vara wants communion, not "content"
Behind the book with the author of 'Searches: Selfhood in the Digital Age,' one of Esquire's best books of 2025 so far.
Three years ago, on a whim, I picked up a copy of Vauhini Vara’s The Immortal King Rao in the McNally Jackson Seaport (a Lower Manhattan neighborhood which, I must say, feels like a studio backlot). It was one of the best novels I’ve read since 2020, and now, Vauhini’s new nonfiction book, Searches: Selfhood in the Digital Age, is one of my picks for the best books of 2025 so far at Esquire.
A few years ago, Vara experimented with GPT-3 while writing stories about the death of her sister, and the results went viral. Searches is a book-length exploration of humankind’s relationship with technology at this particularly fraught moment in time. It’s moving, fascinating, and alarming.
For The Frontlist, I spoke with Vara about generative AI, Google, online “content,” and what she learned from writing Searches.
What was the most surprising thing you discovered during your research for Searches?
A lot of the time, we describe the internet as having started, in the mid-90s, as this open, almost anarchic space — one that was corrupted only later by big technology companies’ rapaciousness. I became a technology reporter only in the mid-2000s, and I shared that vague sentiment. What surprised me was realizing, through my reporting, that big technology companies and their interest in amassing wealth and power were bound up in the internet and its rise from almost the very start. The first commercial website appeared in 1993; the Microsoft co-founder Paul Allen was using his money to influence politics in Seattle by 1995.
How different would our digital lives be today if Google had kept its "don't be evil" ethos?
It’s so hard for me to imagine that counterfactual, because I think that slogan — to the extent that Google ever actually held this as a top priority — is fundamentally at odds with the capitalist structure in which Google was born and grew. Here’s how I can work my way toward imagining it. These academics at Stanford decide to create a search engine that is meant to provide information more effectively and broadly. It’s important to them not to be evil — and they define in specific terms what that means to them. Is it that they prioritize communal benefit above all else? If so, how do they define that benefit?
They work out those questions and write down the answers — that is, a set of specific goals meant to ensure non-evilness. The idea is that the search engine will be meant to meet those goals. It's expensive to build, which means they need funding — so they seek it from people or institutions who share their goals. Depending on what the goals are, maybe the funders are government entities (the Library of Congress, or some such? is that a thing?) or maybe they’re foundations (that eBay founder, Pierre Omidyar — I bet he'd get behind it?). Or something else entirely: an enormous ragtag group of collectively minded individuals each pitching in a little? Then they go forth from there — with a search engine whose goal, from the start, is not to make a return on investors’ investment.
What happens next, then? Does that change the course of internet and world history — setting the groundwork for a world in which the internet is generally oriented toward collective good? Or does some other startup come along — funded by venture capitalists who need a return on investment and have much more money to spend — and put Google out of business? That is: Do we eventually end up with something like what we ended up with in the real world?
This is to say: To what extent is Google at fault for what it became, and to what extent are we all at fault for having created systems for sustaining human inventiveness that define the value of inventions in financial terms? If there’s an extent to which we’re all at fault, does that open up space for us to consider other, better systems?
Should writers and editors be concerned that AI will soon make them unemployable?
Yes! And, therefore, we should exercise our agency and respond to that concern, individually and collectively.
If Google Search depends on high-quality content from publications, but its AI features starve those publications of traffic, where will the high-quality content come from?
I mean, I think even “high-quality content,” as a concept, is situated inside a framework in which financial gain is the highest value: It characterizes human self-expression and communication merely as material — which can have meaning or not — to fill monetizable space; it assumes that some objective assessment of quality is possible. As a human being, I don't want content, high-quality or not. I want to be in communion with other life, with my planet, with the universe beyond my planet. One way in which I want to do that is through words and images that help us describe ourselves and the universe to one another. I think this is a value that is really widely held — and so I feel confident that, no matter what, we're going to keep doing it and keep doing it in ways that are beautiful and original and that resist the totalizing impulses of big technology companies.
What was the most difficult part about writing Searches?
The form is so unusual — including the use of big technology companies’ products to express, through form, a meta-critique of those companies and their products — that I knew readers wouldn’t really have a frame of reference for it. So I had to figure out how to balance the expository writing about these companies and their totalizing influence (and our complicity in it), so that readers would have some context, with letting some of the more experimental material speak for itself and having faith that readers would see how it’s working in conversation with the argument I make more conventionally.
You weave personal memoir sections with reporting so deftly in Searches. How do you structure that balance so effectively?
I'm glad you thought I did! I don't think of the book as being primarily a memoir of my life, which allowed me to write only about those parts of my life that served the broader narrative at hand. I think of that narrative as being about big technology companies' rise and our complicity (including mine) in it, and so I used myself as a kind of character — thinking about my role almost journalistically — to show how that dynamic operates.
There’s so much in this book that’s unconventional in form, so, honestly, when it came to blending memoiristic sections with reporting, I tried to take a somewhat conventional approach, using chronological narrative to organize the material. It turns out that big technology companies’ influence has been so pervasive that it wasn't hard to show how my own sense of self and of the world developed in the context of big technology companies’ rise.
Forthcoming in The Frontlist
Behind the book with Stephen Graham Jones, author of The Buffalo Hunter Hunter (Saga Press)
Behind the book with Lincoln Michel, author of Metallic Realms (Atria Books, May 13)
June book preview