Following up on yesterday’s post: one could also ask whether GPT-3 can compose in a made-up human language. For instance, you could take English and modify it a bit:
Henderson’s native language is Gravinic, and when he speaks English, it sounds like this:
“I to the store did go and some cabbage did buy. Now I some dinner am making and I some cake will later eat. I you am not very much liking. I for you to go away am politely asking.”
In other words, the new rules are that verbs go to the end of the clause, every verb takes an auxiliary verb indicating tense, and adverbs go between the auxiliary and the main verb. Is this short text enough for GPT-3 to locate context in a language that isn’t one of the languages in its training set, but rather “English with modifications inferred from the prompt”?
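For concreteness, the three rules can be sketched as a toy word-reordering function. The clause representation below (a hand-tagged subject, verb, tense, objects, and optional adverb) is an assumption made purely for illustration; there is no parsing going on, just the reordering the rules describe.

```python
# Toy sketch of the "Gravinic English" rules inferred from the prompt:
#   (1) the main verb moves to the end of the clause,
#   (2) every verb takes a tense-marking auxiliary,
#   (3) adverbs sit between the auxiliary and the main verb.
# The tense-to-auxiliary table below is read off the sample sentences
# and is an assumption for this sketch, not a full grammar.

AUX_FOR_TENSE = {"past": "did", "present": "am", "future": "will"}

def gravinicize(subject, verb, tense, objects=(), adverb=None):
    """Reorder one hand-tagged clause according to the inferred rules."""
    clause = [subject, *objects]          # verb vacates its English slot
    clause.append(AUX_FOR_TENSE[tense])   # rule 2: tense auxiliary
    if adverb:
        clause.append(adverb)             # rule 3: adverb before main verb
    clause.append(verb)                   # rule 1: main verb goes last
    return " ".join(clause)

# Reproducing clauses from the prompt:
gravinicize("I", "go", "past", objects=("to the store",))
# -> "I to the store did go"
gravinicize("I", "liking", "present", objects=("you",), adverb="not very much")
# -> "I you am not very much liking"
```

Of course, the interesting question is the reverse direction: whether the model can induce these rules from four sentences, not whether we can hard-code them.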
Write more dialogue where Henderson tells you about his day.
GPT-3: “I then went to the park and did some reading. The sun was shining and it made me feel so relaxed. I saw some birds flying about, and it was quite a sight. After that, I went to the beach and enjoyed the sound of the waves. I even went for a swim! Finally, I returned home and watched some television before going to bed.”
It’s totally possible that more prompting would make this work, though – I didn’t have the patience to manually type in a bunch more Gravinic-inflected English. I’m not actually sure what kind of context GPT-3 is placing itself in with this response.
I think the above remark about “context” is profound; context seems to be the key. By the way, I am a great fan of “The Grasshopper King”. Googling the phrase “I to the store did go and some cabbage did buy. Now I some dinner am making and I some cake will later eat. I you am not very much liking. I for you to go away am politely asking.” showed that Google is completely clueless. However, Google Translate handled the phrase reasonably well.
It would be interesting to see how these generative models handle conlangs (e.g. Klingon or Dothraki).
Actually, Dothraki raises an interesting point: can the model extrapolate a coherent language from a few samples, as David Peterson did?
In the words of Turing – “Can machines think?”
[…] The “made-up language” task that GPT-3 couldn’t do is handled much better by GPT-4, or at least the version of GPT-4 built into Bing Chat. The prompt: […]