Tokenization

The process of breaking text into smaller units (tokens) that an AI model can process. Tokens can be whole words, parts of words (subwords), or punctuation marks.
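
Below is a minimal sketch of the idea in Python, using a simple regular expression to split text into word and punctuation tokens. The function name tokenize is illustrative; production models instead use trained subword tokenizers (for example, byte-pair encoding) rather than rules like this.

import re

def tokenize(text: str) -> list[str]:
    # Match runs of word characters, or any single non-space punctuation mark.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization breaks text into tokens!"))
# ['Tokenization', 'breaks', 'text', 'into', 'tokens', '!']

A subword tokenizer would go further and split rare words into smaller pieces (e.g., "Tokenization" into "Token" and "ization"), which keeps the model's vocabulary small while still covering unseen words.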
