Request: GPT-2-based text generation
-
There’s a website that already does what I’m suggesting: Hugging Face’s Write With Transformer. The problem is that it only runs in a browser page, not in the editor.
The general idea: you press a button, and the last X words before your cursor are used as a prompt for GPT-2 to generate a small number of tokens, probably stopping once it completes a sentence. Ideally, you could control certain aspects of generation: maximum number of tokens, maximum number of seconds spent generating, maximum prompt size, top_p, temperature, and model size. It would also make sense to be able to select a section of text and use that as the prompt.
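To make that concrete, here is a minimal sketch of the generation step in Python, using the Hugging Face transformers library (my choice for illustration only; an actual NPP plugin would need a .NET equivalent, as discussed below). The function name continue_text and its parameter names are hypothetical; top_p, temperature, max_time, and the model name map onto the knobs listed above.

from transformers import GPT2LMHeadModel, GPT2Tokenizer

# "Model size" knob: gpt2, gpt2-medium, gpt2-large, ...
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def continue_text(text_before_cursor, max_prompt_words=50,
                  max_new_tokens=40, max_seconds=5.0,
                  top_p=0.9, temperature=0.8):
    # Use only the last N words before the cursor as the prompt.
    prompt = " ".join(text_before_cursor.split()[-max_prompt_words:])
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    output_ids = model.generate(
        input_ids,
        max_new_tokens=max_new_tokens,   # cap on generated tokens
        max_time=max_seconds,            # cap on generation time
        do_sample=True,
        top_p=top_p,
        temperature=temperature,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated tokens, not the echoed prompt.
    continuation = tokenizer.decode(output_ids[0][input_ids.shape[1]:])
    # Naive "stop when it completes a sentence": cut at the first terminator.
    for i, ch in enumerate(continuation):
        if ch in ".!?":
            return continuation[:i + 1]
    return continuation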
I gave creating this plugin an honest shot myself; however, GPT-2 is developed in Python, and I don’t understand its code well enough to recreate it using TensorFlowSharp or the like. If that were done, I imagine the rest wouldn’t be that hard…?
I’m fairly certain there’d be a lot of interest in a plugin like this!
-
If it helps, there’s a C# implementation of GPT-2 here, but it relies on a C# adaptation of TensorFlow called Gradient, and I’m not sure whether its license would allow use in an NPP plugin? There are also TensorFlowSharp and TensorFlow.NET, but neither has a GPT-2 adaptation, and I’m not sure whether either is better suited for this task.
-
Hi Zachary,
I am the author of both Gradient and the GPT-2 adaptation for it.
Unless you are planning to sell your plugin, feel free to use Gradient. If you want to make it open source, I’d be glad to provide help. Note, though, that one would have to install Python + TensorFlow to use GPT-2.