Generative Pre-trained Transformer (GPT) is a family of models that produce human-like text from an initial prompt (a fragment of a dialog or a task description). One of the most "hyped" models is GPT-3: when you see what it generates, you feel like the future is already here. GPT-J is a self-hosted, open-source analog of GPT-3, and this article shows how to run it in Docker.
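
Before getting to the Docker setup, here is a minimal sketch of what generating text with GPT-J looks like, assuming the Hugging Face `transformers` library (with PyTorch) and the public `EleutherAI/gpt-j-6B` checkpoint; the containerized setup described later may load the model differently.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library (with PyTorch)
# and the public EleutherAI/gpt-j-6B checkpoint; not the article's exact Docker setup.
from transformers import AutoTokenizer, GPTJForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = GPTJForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

# Feed an initial prompt and let the model continue it.
prompt = "The future of self-hosted language models is"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=60,    # total length (prompt + continuation) in tokens
    do_sample=True,   # sample instead of greedy decoding for livelier text
    temperature=0.9,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that the full-precision 6B-parameter checkpoint needs roughly 24 GB of RAM to load, which is exactly why packaging it in a reproducible Docker environment is attractive.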