An AI model that has fewer parameters than a large language model (LLM) and is trained on fewer data samples. For example, a small language model (SLM) may have only a few million parameters, compared with tens of billions in an LLM.
SLMs are also trained on more homogeneous datasets that pertain to a particular subject. The advantages of a small model are that it takes less time to train and can generate more precise results within its domain. An SLM may also run locally and not require the cloud. Contrast with
large language model. See inference and AI dataset.
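
As a rough illustration of local execution (not part of the original entry), the sketch below loads a small, openly available model with the Hugging Face transformers library and generates text on the local machine; the model name distilgpt2 is just one example of a model small enough to run without cloud infrastructure.

```python
# Minimal sketch: running a small language model locally with Hugging Face
# transformers. "distilgpt2" (~82M parameters) is an illustrative choice;
# any similarly small model could be substituted.
from transformers import pipeline

# Downloads the model once, then runs entirely on the local machine.
generator = pipeline("text-generation", model="distilgpt2")

result = generator("A small language model is", max_new_tokens=20)
print(result[0]["generated_text"])
```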