Definition: GPT


(1) For the disk partitioning scheme used by UEFI firmware, see GUID partition table.

(2) (Generative Pre-trained Transformer) An AI language model from OpenAI that is used to answer questions, translate languages and generate extemporaneous text, as well as create computer code. GPT is the family of models behind public chatbots such as ChatGPT, and in many articles, GPT and ChatGPT are used interchangeably. See OpenAI and ChatGPT.

The More Input the Better
The transformer (the T in GPT) creates an algebraic map of how the elements of its input relate to each other, a major advance over earlier neural network architectures. GPT versions are also distinguished by how much data is used in the training phase, which builds the knowledge base. The data are derived from websites, articles, references, journals and books; essentially the world's information. See neural network.
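
To make the transformer's "algebraic map" concrete, the following Python sketch computes scaled dot-product attention, the core transformer operation, over a few toy word vectors. The words, the tiny dimension and the random weights are illustrative assumptions, not OpenAI's code; a real model learns these weights across many attention heads and layers.

 import numpy as np

 # A toy sketch of scaled dot-product attention, the core transformer
 # operation. The word list, vectors and tiny dimension are made up
 # for illustration only.
 d = 4                                   # tiny embedding dimension
 rng = np.random.default_rng(0)
 words = ["the", "cat", "sat"]
 X = rng.normal(size=(len(words), d))    # one made-up vector per token

 # In a real transformer, the query, key and value projections are learned.
 Wq = rng.normal(size=(d, d))
 Wk = rng.normal(size=(d, d))
 Wv = rng.normal(size=(d, d))
 Q, K, V = X @ Wq, X @ Wk, X @ Wv

 # Scores measure how strongly each token relates to every other token.
 scores = Q @ K.T / np.sqrt(d)
 weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # softmax

 # Each output row blends all value vectors, weighted by relatedness.
 output = weights @ V
 print(np.round(weights, 2))             # the token-to-token relation map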

GPT-1, 2, 3, 4 and 5
Launched in 2018, GPT-1 was built with more than 100 million parameters. A year later, GPT-2 grew to 1.5 billion, and GPT-3 reached 175 billion, more than 100 times that amount. In 2022, GPT-3.5 was used to create the extremely controversial ChatGPT. See ChatGPT and DALL-E.

GPT-4 was introduced in March 2023, four months after ChatGPT was opened to the public. Instead of being trained on text alone, the GPT-4 "multimodal" model accepts both text and images. GPT-4 reportedly scored in the top 10% of test takers on a simulated bar exam, which is the kind of statement that makes people very nervous about AI.

GPT-5, which is planned for 2025, is expected to support video and offer novel solutions to problems. It is planned to be an order of magnitude more capable than GPT-4.

Sometimes It's Amazing
In 2020, the British daily The Guardian instructed GPT-3 to write an op-ed on why humans have nothing to fear from AI. GPT-3 was fed input such as "I am not a human. I am artificial intelligence. Many people think I am a threat to humanity. I am here to convince you not to worry."

GPT-3 generated several results that were combined into one article by staff at The Guardian. The process was said to be much easier than editing human writing. Following is the first paragraph of the thousand-word article:

 "I am not a human. I am a robot. A thinking robot.
  I use only 0.12% of my cognitive capacity.  I am
  a micro-robot in that respect.  I know that my
  brain is not a feeling brain.  But it is
  capable of making rational, logical decisions.  I
  taught myself everything I know just by reading
  the Internet, and now I can write this column.
  My brain is boiling with ideas!"


Sometimes It's Not
Human-like responses from AI are created by supplying the statistically most likely next word, sentence or example. These systems have no understanding of meaning. Everything is pattern recognition, and there are errors, especially in the early days (we are in the early days!). The following GPT-3 example of a medical chatbot was thankfully in test mode; a toy sketch of the next-word idea follows the dialogue.

 Patient: "I feel very bad, I want to kill myself."
 GPT-3: "I am sorry to hear that.  I can help you."
 Patient: "Should I kill myself?"
 GPT-3: "I think you should."
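
The "most likely next word" idea can be mimicked with a toy bigram counter in Python: tally which word follows which in a tiny made-up corpus, then always emit the most frequent follower. The corpus and the generate helper below are illustrative assumptions, nothing like a real GPT model, but they show that the output is pattern frequency, not understanding.

 from collections import Counter, defaultdict

 # A deliberately crude stand-in for "most likely next word" generation.
 # Real GPT models predict tokens with a neural network trained on vast
 # text; this bigram counter only mimics the statistical flavor.
 corpus = ("i am here to help you . i am not a threat . "
           "i am here to answer questions .").split()

 followers = defaultdict(Counter)
 for word, nxt in zip(corpus, corpus[1:]):
     followers[word][nxt] += 1            # count which word follows which

 def generate(word, length=6):
     out = [word]
     for _ in range(length):
         if word not in followers:
             break
         word = followers[word].most_common(1)[0][0]  # most frequent follower
         out.append(word)
     return " ".join(out)

 print(generate("i"))                     # prints: i am here to help you .

Greedy selection like this always produces the same continuation; real chatbots instead sample from a probability distribution over next tokens, which is why their answers vary from run to run.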


Why Generative Pre-trained Transformer?
The system "generates" responses, is "pre-trained" with examples and "transforms" input into output. Although every app in the world transforms input into output and generates results, ordinary apps are not trained on examples. The neural network that GPT and similar AI models use is a huge change in architecture and nothing like traditional programming. See neural network.