A system that breaks text into smaller units (tokens) that an AI model can process. These can be words, parts of words, or punctuation marks.
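As a minimal illustration of the idea, the sketch below splits text into word and punctuation tokens with a regular expression. Production tokenizers (e.g. BPE-based ones) instead learn subword units from data, so this is only an assumption-level toy, not any real model's tokenizer.

```python
import re

def tokenize(text):
    # Split into word-like tokens and single punctuation marks.
    # Real tokenizers learn subword units; this regex is illustrative only.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("AI models can't read raw text.")
print(tokens)  # ['AI', 'models', 'can', "'", 't', 'read', 'raw', 'text', '.']
```

Note how even a simple contraction like "can't" is broken into several tokens; learned tokenizers make similar, but data-driven, splitting decisions.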
An API (Application Programming Interface) is a standardized way for applications to exchange data. This established technology is widely used in AI, for example to send a prompt to a model and receive its response.
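A typical AI API exchange sends a structured JSON request. The sketch below builds such a request body; the field names ("model", "messages") mirror common conventions but are hypothetical and not tied to any specific provider.

```python
import json

# Hypothetical chat-style API request body; field names are
# illustrative, not a real provider's schema.
request_body = json.dumps({
    "model": "example-model",
    "messages": [{"role": "user", "content": "Hello"}],
})

# The receiving service parses the same standardized format back.
parsed = json.loads(request_body)
print(parsed["messages"][0]["content"])  # Hello
```

The shared, well-defined format is what makes the exchange standardized: both sides agree on the structure, regardless of how each is implemented internally.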
The maximum number of tokens that an AI model can process at once in a single conversation or task.
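When a conversation exceeds this limit, applications commonly drop the oldest tokens so the most recent ones still fit. A minimal sketch of that truncation, assuming the history is already tokenized:

```python
def truncate_to_window(tokens, max_tokens):
    # Keep only the most recent tokens that fit in the context window,
    # as chat applications commonly do with long histories.
    if len(tokens) <= max_tokens:
        return tokens
    return tokens[-max_tokens:]

history = ["t%d" % i for i in range(10)]
print(truncate_to_window(history, 4))  # ['t6', 't7', 't8', 't9']
```

Real systems use more refined strategies (summarizing older turns, keeping system instructions), but the hard token limit is the same.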
An advanced form of machine learning that uses deep neural networks to discover complex patterns in data.
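The "deep" in deep learning refers to stacking multiple layers, each transforming its input before passing it on. The toy forward pass below shows that structure with hand-picked weights; in a real network the weights are learned from data, and layers have many more units.

```python
def relu(x):
    # Nonlinearity applied after each layer's weighted sum.
    return max(0.0, x)

def layer(inputs, weights, biases):
    # One dense layer: weighted sum of inputs plus bias, then ReLU.
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Toy weights chosen by hand purely for illustration.
x = [1.0, 2.0]
h = layer(x, [[0.5, -0.5], [1.0, 1.0]], [0.0, -1.0])  # hidden layer
y = layer(h, [[1.0, 0.5]], [0.0])                     # output layer
print(y)  # [1.0]
```

Chaining such layers, with weights adjusted by training, is what lets deep networks represent complex patterns that a single layer cannot.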
An AI technique where the model learns from just a few examples instead of thousands or millions of data points.
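In practice, few-shot learning with language models often means placing a handful of worked examples directly in the prompt. The sketch below assembles such a prompt; the "Input:/Output:" format is an assumed convention, not a fixed standard.

```python
def few_shot_prompt(examples, query):
    # Each (input, output) pair demonstrates the task format;
    # the model is expected to infer the pattern from these few examples.
    lines = ["Input: %s\nOutput: %s" % pair for pair in examples]
    lines.append("Input: %s\nOutput:" % query)
    return "\n\n".join(lines)

prompt = few_shot_prompt(
    [("cat", "animal"), ("rose", "plant")],
    "oak",
)
print(prompt)
```

With just two demonstrations, a capable model can continue the pattern (here, classifying "oak" as a plant) without any additional training.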
The process of adapting an existing AI model for a specific task or domain through additional training.
An advanced AI model that can understand and generate natural language, trained on massive amounts of text data.