At the end of the article, Mr. Chen suggests three rules of thumb for navigating this new world ethically and safely. We challenge your students — perhaps in their small groups — to do the same. What rules would they put in place for using this kind of technology in school? In their personal lives? Why?
After they have come up with some guidelines, invite your students to compare them with what other small groups have devised, and then read what Mr. Chen recommends.
They might then pool and refine their suggestions in future classes, and test them by using them to guide their work for the rest of the semester. Finally, if the guidelines they have developed work well, the students might even present them to their school or district administration, along with some samples of the work that results.
5. Discuss the implications of A.I. for art and for the humanities.
In September, we asked students, “Are A.I.-Generated Pictures Art?” We posed follow-up questions like “Should pictures created with artificial intelligence be considered art — equal to what an artist might create with a pen, a brush or a lump of clay?” and “What is the value of visual art in our world? What might we gain from A.I.-created art? What might we lose?” Over 300 teenagers weighed in.
Now we’re posing similar questions about writing. Someone recently put the question to the musician Nick Cave:
I asked ChatGPT to write a song in the style of Nick Cave and this is what it produced. What do you think?
You can read the A.I.-created lyrics and Mr. Cave’s full response at his blog, The Red Hand Files, but here is some of what he said:
ChatGPT may be able to write a speech or an essay or a sermon or an obituary but it cannot create a genuine song. It could perhaps in time create a song that is, on the surface, indistinguishable from an original, but it will always be a replication, a kind of burlesque.
Songs arise out of suffering, by which I mean they are predicated upon the complex, internal human struggle of creation and, well, as far as I know, algorithms don’t feel. Data doesn’t suffer. ChatGPT has no inner being, it has been nowhere, it has endured nothing, it has not had the audacity to reach beyond its limitations, and hence it doesn’t have the capacity for a shared transcendent experience, as it has no limitations from which to transcend. ChatGPT’s melancholy role is that it is destined to imitate and can never have an authentic human experience, no matter how devalued and inconsequential the human experience may in time become.
In a recent edition of her newsletter, Tressie McMillan Cottom, the Times Opinion columnist, says something similar:
A.I. writes prose the way horror movies play with dolls. Chucky, Megan, the original Frankenstein’s monster. The monster dolls appear human and can even tell stories. But they cannot make stories. Isn’t that why they are monsters? They can only reflect humanity’s vanities back at humans. They don’t make new people or chart new horizons or map new experiences. They are carbon copies of an echo of the human experience.
She continues:
Even when the essays are a good synthesis of other essays, written by humans, they are not human. Frankly, they creep me out precisely because they are so competent and yet so very empty. ChatGPT impersonates sentiment with sophisticated word choice but still there’s no élan. The essay does not invoke curiosity or any other emotion. There is a voice, but it is mechanical. It does not incite, offend or seduce. That’s because real voice is more than grammatical patternmaking.
If this is, as Ms. McMillan Cottom says, “a great time to think about the line between human and machine,” what is that line? How do you see the “humanness” in the art you love? What makes it different from something created by a machine? Will A.I. ever be capable of creating real art?