[7k2m9p4x]
Q: You mention "AI can help us run a longer game but might reduce short-term pain while increasing long-term pain." This feels like your central tension. What specifically is the long-term pain you're warning about, and why is it worth experiencing the short-term pain instead?
A: The long-term pain is the loss of skills. For example, the rise of YouTube and of short videos like TikTok has already reduced teenagers' ability to read. What they get in exchange is perhaps faster learning, but the long-term pain is that they lose the ability to read deeply.
AI has a much wider reach: it affects not only how we act but also how we think through questions. The major short-term gain AI offers is speed, but the long-term loss is the journey of developing a skill and the opportunity to dive deeply into something, since we will gradually come to rely on AI for many other things.
Take writing as an example. Lots of people have begun to write with AI, and the way they use it is not to dig into how to make their arguments concrete and their writing easy to read, consume, and resonate with, all of which demands a lot of practice. Instead they have the AI do it for them. In the short term they may receive praise and gain traction, and they will think, "OK, I can write too." But in the end they have not gained or polished any skill. The pain they avoided is exactly the stepping stone that would have carried them to a better place. By indulging in AI, they lose that opportunity.
Whether a mind is actively thinking matters. We should make sure we constantly stay in that state and reduce the chance that the task of thinking is taken away from us. Using AI to think is like entering a cult: both reduce the urgency of thinking for yourself.
[7k2m9p4x-4d8n2w9k]
Q: You compare using AI to think to "entering a cult" because both reduce the urgency of thinking. This is a strong claim. But cults also offer community, meaning, and relief from decision fatigue. Are there scenarios where reducing thinking urgency is actually beneficial? When is cognitive offloading healthy vs. harmful?
A: We definitely need some way to offload cognitive load onto something else; that is why we search for tools in the first place. Tool use is central to our productivity. Entering a cult is also a search for a tool, one that reduces the pain in your life in exchange for your productivity.
What I want to emphasize is that offloading mental load is dangerous in any scenario. It is a question we should keep asking whenever we use a tool.
I agree that the boundary is hard to draw. It really depends on how a person works: what are their talents? What are their goals? I don't have a universal answer.
But in any circumstance, we should keep returning to this question: we gain productivity, but at what cost?
[3n8d5q1w]
Q: You distinguish between being a "pipe" vs. a "medium" when using AI. Can you articulate the precise moment where someone crosses from one to the other? What's the litmus test?
A: Look at any discussion happening in your life. Have you ever put forward a conclusion or a presentation that came purely from the LLM, or do you first digest it and convert it into something of your own?
That is the litmus test: do you ever copy-paste something from the LLM?
[3n8d5q1w-2p9m4k7x]
Q: Your litmus test is "do you copy-paste from LLM." But what about someone who reads AI output, closes it, and then rewrites it in their own words from memory? Are they a pipe or a medium? Where's the line between "inspired by" and "outsourced to"?
A: I would argue that if a person remembers the output and rewrites it in their own words, that is already a mental burden, in a positive sense. By putting the line in their own words, with their own interpretation, they have already become a medium.