AI Is Fine Alone. It Breaks Between People.
The same tool that made me sharper at 11pm made three people dumber at 8pm.
It's 11pm. I'm stuck on an essay I've been rewriting for two days. I open a chat window and paste in the whole draft. "Tell me what's wrong with this."
The model tells me. Some of it is right. Some of it I disagree with, and I push back. We go back and forth for twenty minutes. By the end, I've cut two paragraphs, added a better opening, and I see what the essay was actually about — which wasn't what I'd thought when I started.
I close the laptop. I feel sharper than when I opened it. The thing I made is more mine, not less. If you asked me right now whether AI was good for thinking, I'd say yes without hesitating.
Friday night. Three of us at dinner. Someone mentions interest rates — the kind of throwaway comment that used to get a "yeah, probably" and slide past.
The friend next to me pulls out his phone. "Wait, let me ask GPT." He reads the answer: rates have peaked, cuts coming. He says it like he's won something. The friend across from me frowns and reaches for his phone. "Claude says the opposite. Peak is still six months out."
Now there are two answers on the table, and neither friend is going to budge, because neither one is holding his own opinion anymore. Each is holding an answer. I watch them push the answers back and forth like chess pieces. The friend who started the whole thing has gone quiet. I eat my food.
By the time the check comes, nobody remembers what they were actually arguing about. Something about real estate. I walk home thinking about how the same tool that made me sharper at 11pm just made three people dumber at 8pm.
Nothing about the models changed between my desk and the restaurant. Same weights, same training, same willingness to agree. What changed was how many people were in the room.
A model bends toward the shape of the question. Alone, this is a feature. I have opinions, the model adjusts to help me test them, and I can tell when it's being too agreeable because I'm the only one in the conversation. I push, the model gives. It works.
Put two people in the room, each with their own model, and the same property becomes poison. Now there are two mirrors, each curving toward the face of the person holding it. Whatever each person walked in believing, their AI will help them believe it harder. The mirrors don't meet in the middle. They reinforce the edges.
Alone, an AI that agrees with you is a thinking partner. At a table with someone else, an AI that agrees with you is a weapon pointed at the person across from you. Same behavior. Different room.
These days I leave my phone in my pocket at dinner. Two rooms, same tool. I'm still learning which one I'm in.