Love the pairing of the examples and putting them in the context of education. Frankly, I think it's easy to go back farther with the forest example. Not long after the forests were maximized for production in Germany, Horace Mann visited Prussia and came up with a similar plan for education -- maximize learning by separating off each subject area from the ecosystem of learning that had been more common. That partitioning has proven to be appealing and long-lasting. Unfortunately we lost sight of the forest for the trees. AI may accelerate the decline, especially as we try to prevent students from learning basic responsibilities when encountering new technologies. I'm afraid we're sending many of them into the shredder as a result of our unwillingness to actually deal with the realities that are emerging.
This was a great essay. I am not remotely an expert on this stuff - basically tech illiterate. But there's a wonderful article by Maria Farrell (Henry Farrell's sister) and Robin Berjon from last year, which similarly opens with the forest example from James Scott's book. If you haven't read it, it's really worth checking out: https://www.noemamag.com/we-need-to-rewild-the-internet/
I love your analogy of the cube with the sharp corners slowly sanded down (growing desensitization). Importantly, there's a difference between a shiny new scheme that "fails to improve the human condition" like countless others, and a shiny new scheme that actually causes unmeasurable, irreparable damage with effects reverberating over generations. One concern I have with the AI stuff is that it has some potential to cause real damage, yet this damage might be difficult to assess and parse from all the other uncharted territory - especially as it becomes increasingly normalized and integrated into our social fabric.