Free Service

GPT-3

Large Language Model that can be used for a variety of language related tasks

GPT-3 is a computer program that can write human-like text. It is the third generation of a series of language prediction models created by OpenAI, a San Francisco-based artificial intelligence research laboratory. GPT-3 has a capacity of 175 billion machine learning parameters. It was introduced in May 2020, and as of July 2020, it was in beta testing.

 

Capabilities

The quality of the text generated by GPT-3 is so high that it can be difficult to determine whether it was written by a human, which has both benefits and risks. GPT-3 has also drawn criticism from Google's AI ethics researchers over the environmental impact of training and storing such large models. Additionally, the growing use of automated writing technologies based on GPT-3 and other language generators has raised concerns about academic integrity and plagiarism.
The tool can complete almost any English language task. Because it was trained on a vast corpus of text, it does not require further training for distinct language tasks: a task can usually be specified directly in the prompt. However, because that training data contains occasional toxic language, GPT-3 sometimes generates toxic language itself by mimicking what it was trained on.
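The idea that no further training is needed can be illustrated with few-shot prompting: the task is described by a handful of examples placed in the prompt, and the model continues the pattern. The sketch below builds such a prompt; `build_few_shot_prompt` and the translation examples are illustrative, not part of any official API, and the commented-out call shows how the prompt could then be sent to GPT-3 via OpenAI's legacy Completions endpoint.

```python
# Sketch: few-shot prompting with GPT-3. Because the model was trained on a
# broad corpus, a task (here, English-to-French translation) can often be
# specified entirely in the prompt, with no fine-tuning step.

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt from (input, output) example pairs."""
    lines = [f"English: {src}\nFrench: {tgt}" for src, tgt in examples]
    # Leave the final answer blank for the model to complete.
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(
    [("Hello.", "Bonjour."), ("Thank you.", "Merci.")],
    "Good night.",
)
print(prompt)

# With an API key configured, the prompt could be sent to GPT-3, e.g.:
#
#   import openai
#   completion = openai.Completion.create(
#       engine="davinci", prompt=prompt, max_tokens=16, stop="\n"
#   )
#   print(completion.choices[0].text.strip())
```

The same pattern extends to summarization, question answering, or classification simply by changing the examples, which is what makes a single pretrained model usable across many language tasks.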

Current applications include GitHub Copilot, a code completion and generation tool that integrates with various code editors and IDEs.
GPT-3 has also been used by Andrew Mayne for AI Writer, which allows people to correspond with historical figures via email.
GPT-3 was used by The Guardian to write an article arguing that AI is harmless to human beings. Fed a few ideas as a prompt, it produced eight different essays, which editors ultimately merged into a single article.