To me, what GPT-n really tells us is how redundant human text in general is. It doesn't provide much new information (and when it does, that's accidental), but more or less wanders around the given topic. It is the redundancy that gives the system room to churn.
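A crude way to get a feel for that redundancy: run a chunk of English text through a general-purpose compressor and see how much of it melts away. Here's a minimal sketch using Python's zlib (the helper name and the file path are just placeholders of mine; a language model compresses text far harder than zlib, so treat the number as a loose lower bound):

    # Rough proxy for the redundancy claim: compress plain English text
    # with zlib and report what fraction of the raw bytes disappears.
    import zlib

    def redundancy_estimate(text):
        """Fraction of the raw bytes that plain zlib can squeeze out."""
        raw = text.encode("utf-8")
        return 1.0 - len(zlib.compress(raw, 9)) / len(raw)

    # any longish English text works; "sample.txt" is just a placeholder
    sample = open("sample.txt", encoding="utf-8").read()
    print("redundant fraction (zlib proxy): %.0f%%" % (100 * redundancy_estimate(sample)))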
And that redundancy is not low. GPT-* are basically "pattern generators" for our language organ. We have pattern generators for our gait, for example -- all those synchronized movements of so many muscles can be effectively controlled by a well-understood circuit, to the point that even a deafferented animal can still be made to recognizably walk.
GPT is the equivalent, but for language: it can be used to phrase thoughts. Real language, however, carries thoughts; GPT doesn't have any thoughts of its own. People are impressed by how many responses it has learned, but forget that it is just many gigabytes of "compressed" text associations. It needs "something else" to become actually useful.
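To make the "pattern generator" framing concrete, here's a toy sketch: a word-level Markov chain. It is nowhere near GPT, which conditions on a vastly longer context through learned representations, but it shows how pure statistics over word sequences already produce text that wanders around a topic without holding any thought. The corpus file name is just a placeholder for whatever text you feed it:

    # Toy text "pattern generator": a word-level Markov chain that keeps
    # sampling the next word given the previous few, with no model of
    # meaning at all -- just recorded word-sequence statistics.
    import random
    from collections import defaultdict

    def build_chain(words, order=2):
        """Map each tuple of `order` consecutive words to the words that followed it."""
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def generate(chain, length=30):
        """Start from a random known context and keep sampling the next word."""
        state = random.choice(list(chain))
        out = list(state)
        for _ in range(length):
            followers = chain.get(state)
            if not followers:              # dead end: restart from another context
                state = random.choice(list(chain))
                followers = chain[state]
            out.append(random.choice(followers))
            state = tuple(out[-len(state):])
        return " ".join(out)

    # "corpus.txt" is a placeholder; any reasonably long text will do
    corpus = open("corpus.txt", encoding="utf-8").read().split()
    print(generate(build_chain(corpus)))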