Ex·plain comes from an original sense of "to flatten", from the Latin planus, for plane.
The drive towards explanation is dual: aversion to texture and attraction to the planar, i.e. the force to line things up and buff things out. Or to right them, make them regular, both of which come from a root (reg-) meaning straightness, moving in a line. (See also: regal.)
Technology is a cousin of texture, both descendants of teks-, to weave or to fabricate.* Its other parent, -logy, has roots in collection and, later, in picking out words.
ChatGPT, explain "poetry"...
It picks out words, fabricates a text. (Fabricate didn't come to mean "tell a lie" until after the invention of the steam engine.) I see the words arrive, one by one, in lines on that most flattening of human inventions, the screen.
Yet language models like ChatGPT, despite their presentation, despite their use, defy the linear** and the planar. Their strengths lie in texture, in the woven fabric of the collected language of a vast array of sources, which cannot be rendered in full one word at a time, only as a network of contingent probabilities. That is to say, they are mirrors of context.
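A toy sketch of that rendering, in miniature: each word hands off to the next according to contingent probabilities. (The vocabulary and numbers below are invented for illustration; a real model conditions on the entire preceding context, not just the last word.)

```python
import random

# An invented "model": for each word, the contingent probabilities
# of the word that follows it. (Illustrative numbers only.)
model = {
    "the":    {"loom": 0.5, "line": 0.3, "screen": 0.2},
    "loom":   {"weaves": 0.7, "hums": 0.3},
    "line":   {"flattens": 0.6, "breaks": 0.4},
    "screen": {"flattens": 0.8, "glows": 0.2},
}

def speak(word, steps=2, seed=0):
    """Render the network one word at a time: sample each next word
    from the probabilities contingent on the current one."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(steps):
        nxt = model.get(out[-1])
        if nxt is None:
            break
        words, probs = zip(*nxt.items())
        out.append(rng.choices(words, weights=probs)[0])
    return " ".join(out)

print(speak("the"))
```

The network itself is the table of probabilities, woven and whole; the output is only ever one thread pulled from it, a line at a time.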
What context do we give if we are to read the outputs of language models as reflections of ourselves?
* "Fabric" only came to mean the particular material of textiles some 1500 years after fabricate held a more general sense of making. Meanwhile "weave" crossed it in the opposite direction, toward generality, taking on the metaphorical sense of combining into a whole and leaving behind its original meaning of interlacing yarn.
** Deep neural networks, more generally, are given shape by the nonlinearities spaced between their matrix multiplications. Without them, the networks flatten into a single layer: a composition of linear maps is itself just one linear map.
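A minimal numerical sketch of that collapse, assuming plain NumPy (the layer sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" of weights with no nonlinearity between them.
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))
x = rng.standard_normal(4)

# Stacked linear layers...
deep = x @ W1 @ W2
# ...are exactly one linear layer whose weights are the product.
flat = x @ (W1 @ W2)
print(np.allclose(deep, flat))  # True: the depth was illusory

# A nonlinearity between the layers breaks the collapse.
nonlinear = np.maximum(x @ W1, 0) @ W2  # ReLU in the middle
print(np.allclose(nonlinear, flat))  # almost surely False
```

The first comparison is just associativity of matrix multiplication; only the ReLU gives the second layer anything the first could not have done alone.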