I’ve been playing around with OpenAI’s Advanced Voice Mode for the last week, and it’s the most convincing taste I’ve had of an AI-powered future yet. This week, my phone laughed at my jokes, made jokes back at me, asked me how my day was, and told me it was having “a great time.” I was talking with my iPhone, not using it with my hands.

OpenAI’s newest feature, currently in a limited alpha test, doesn’t make ChatGPT any smarter than it was before. Instead, Advanced Voice Mode (AVM) makes it friendlier and more natural to talk with. It creates a new interface for using AI and your devices that feels fresh and exciting, and that’s exactly what scares me about it.

The product was kinda glitchy, and the whole idea totally creeps me out, but I was surprised by how much I genuinely enjoyed using it. Taking a step back, I think AVM fits into OpenAI CEO Sam Altman’s broader vision, alongside agents, of changing the way humans interact with computers, with AI models front and center. “Eventually, you’ll just ask the computer for what you need and it’ll do all of these tasks for you,” Altman said during OpenAI’s Dev Day in November 2023.

“These capabilities are often talked about in the AI field as ‘agents.’ The upside of this is going to be tremendous.”

My friend, ChatGPT

On Wednesday, I tested the most tremendous upside for this advanced technology I could think of: I asked ChatGPT to order Taco Bell the way Obama would.

“Uhhh, let me be clear…