A system that breaks text into smaller units called tokens, which an AI model can process. These units can be whole words, parts of words (subwords), or punctuation marks.
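The idea can be illustrated with a minimal word-and-punctuation tokenizer; this is a simplified sketch, not how production tokenizers work (those typically use learned subword schemes such as byte-pair encoding, which also split rare words into pieces):

```python
import re

def simple_tokenize(text):
    # Match runs of word characters, or any single character that is
    # neither a word character nor whitespace (i.e. punctuation).
    # Whitespace itself is discarded.
    return re.findall(r"\w+|[^\w\s]", text)

print(simple_tokenize("Tokenizers aren't magic!"))
# → ['Tokenizers', 'aren', "'", 't', 'magic', '!']
```

Note how even this toy version splits "aren't" into multiple tokens; learned subword tokenizers make similar splits, but based on statistics of the training corpus rather than fixed rules.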
The ability of an AI to perform tasks it was not specifically trained for, by applying existing knowledge to new situations.