Definition: AI winter


The period roughly from the 1980s until the turn of the century when there was only modest improvement in artificial intelligence (AI) applications. Expert systems, the forerunners of today's AI, were having limited success, and LISP, the preeminent AI programming environment, required expensive workstations. See AI and LISP.

How Did It Change?
Two reasons: power and data. In the late 1990s, GPU power, primarily from NVIDIA, began to increase exponentially, and these specialized GPUs are now used by the thousands in huge datacenters to train the large language models that deliver human-like results. Eventually, AI circuits added to everyday computers and phones enabled them to deliver some AI-generated results locally as well as in combination with an AI cloud. See large language model, H100, A100 and Blackwell.

Second, over the same time frame, Google and other search engines had amassed mammoth amounts of data, essentially the sum total of the world's online information, which is used to train AI systems. The combination of computing power and data enabled AI systems to come into their own and amaze people around the globe. Essentially, AI takes the trillions of comments, questions, answers, blogs, articles and books available online and spits them back out in what appears to be original human dialogue. See AI in a nutshell.