A few weeks ago, my spouse and I made a bet. I said there was no way ChatGPT could credibly mimic my writing style for a smartwatch review. I'd already asked the bot to do this months ago, and the results were laughable. My spouse bet they could ask ChatGPT exactly the same thing but get a much better result. My problem, they said, was that I didn't know the right questions to ask to get the answer I wanted.
To my chagrin, they were right. ChatGPT wrote a much better review in my style when my spouse made the request.
This memory crossed my mind while I was covering Google I/O. This year's keynote was essentially a two-hour thesis on AI, its impact on search, and all the ways it could boldly and responsibly make our lives better. A lot of it was neat. But I felt a shiver run down my spine when Google openly acknowledged that it's hard to ask AI the right questions.
During its demo of Duet AI, a suite of tools that will live in Gmail, Docs, and more, Google showed off a feature called Sidekick that can proactively offer you prompts that change depending on the Workspace document you're working in. In other words, it prompts you on how to prompt it by telling you what it can do.
This came up again later in the keynote when Google introduced its new AI search results, called Search Generative Experience (SGE). SGE takes whatever question you type into the search bar and generates a mini report, or "snapshot," at the top of the page. At the bottom of that snapshot are suggested follow-up questions.
As someone whose job it is to ask questions, I found both demos unsettling. The queries and prompts Google used on stage are nothing like the questions I type into my search bar. My search queries often read like a toddler talking. (They're also usually followed by "Reddit," so I get answers from real people instead of SEO content mills.) Things like "bald Dennis BlackBerry movie actor name." When I'm looking for something I wrote about Peloton earnings in 2022, I type "site:theverge.com Peloton McCarthy ship metaphors." I rarely search for things like "What should I do in Paris for a weekend?" It doesn't even occur to me to ask Google for stuff like that.
I'll admit that when I stare at any kind of generative AI, I don't know what I'm supposed to do with it. I can watch a million demos, and still the empty window taunts me. It's like I'm back in second grade and my grumpy teacher has just called on me with a question I don't know the answer to. When I do ask for something, the results I get are laughably bad, things that would take me longer to make presentable than if I'd just done them myself.
My spouse, on the other hand, took to the AI like a fish to water. After our bet, I watched them play with ChatGPT for a good hour. What struck me most was how different our prompts and queries were. Mine were short, open-ended, and broad. My spouse's left the AI very little room for interpretation. "You have to hold its hand," they said. "You have to give it exactly what you need." Their commands and queries were hyper-specific, lengthy, and often included reference links or data sets. But even they had to rephrase prompts and queries over and over again to get exactly what they were looking for.
And that's just ChatGPT. What Google is pitching goes even further. Duet AI is meant to pull contextual data from your emails and documents and intuit what you need (which is hilarious, since I don't even know what I need half the time). SGE is designed to answer your questions, even the ones that don't have a "right" answer, and then anticipate what you might ask next. For this more intuitive AI to work, programmers have to teach the AI what questions to ask users so that users, in turn, can ask it the right questions. Which means programmers need to know what questions users want answered before they've even asked them. It gives me a headache just thinking about it.
Not to get too philosophical, but you could say that all of life is about figuring out the right questions to ask. For me, the most uncomfortable thing about the AI era is that I don't think any of us know what we really want from AI. Google says it's everything it showed on stage at I/O. OpenAI thinks it's chatbots. Microsoft thinks it's a really exciting chatbot. But whenever I talk to the average person about AI these days, the question everyone wants answered is simple: how will AI change and impact my life?
The problem is that no one, not even the bots, has a good answer to that yet. And I don't think we'll get a satisfying one until everyone takes the time to rewire their brains to speak with AI more fluently.