AI Foundations · Chapter 8

Tokens Explained Simply

Understand what AI tokens are, how LLMs process text, and why tokens matter for cost, context, and performance.

Tokens · LLMs · Context Window · AI Basics · Prompting

Introduction

Tokens are the small pieces of text that AI models process internally.

Large Language Models (LLMs) do not read text the way humans do. Instead, they break it into smaller units called tokens.

What is a Token?

A token can be:

  • A complete word
  • Part of a word
  • A punctuation mark
  • A number
  • A code fragment

For example:

  • "Hello" may become one token
  • "Artificial Intelligence" may become multiple tokens
  • Long technical words may be split into smaller pieces
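A toy example can make this concrete. Real LLM tokenizers use learned subword schemes such as byte-pair encoding, so this simple regex split is only an illustration of the idea of breaking text into pieces, not how production tokenizers actually work:

```python
import re

def toy_tokenize(text):
    """Split text into word and punctuation pieces.

    Real LLM tokenizers (e.g. byte-pair encoding) learn subword
    units from data; this regex split is only an illustration.
    """
    return re.findall(r"\w+|[^\w\s]", text)

print(toy_tokenize("Hello, Artificial Intelligence!"))
# ['Hello', ',', 'Artificial', 'Intelligence', '!']
```

Notice that the punctuation marks come out as separate pieces, just as they often do in real tokenizers.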

Why Tokens Matter

Tokens matter because they are the unit AI systems use to measure:

  • Input size
  • Output size
  • Context window usage
  • API pricing
  • Memory usage

Most AI providers charge based on token usage.
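Because token counts drive all of these measurements, it helps to be able to estimate them. A commonly cited rule of thumb is that English text averages roughly four characters per token; the sketch below uses that heuristic, but actual counts depend on the specific tokenizer:

```python
def estimate_tokens(text, chars_per_token=4):
    """Rough token estimate for English text.

    ~4 characters per token is only a rule of thumb; real counts
    vary by tokenizer and by language.
    """
    return max(1, round(len(text) / chars_per_token))

estimate_tokens("Tokens are the basic units AI models use.")  # roughly 10
```

For accurate counts, use the tokenizer that matches your model rather than a heuristic.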

Context Windows

A context window is the amount of information an AI model can consider at one time.

The context window includes:

  • Your prompt
  • Conversation history
  • System instructions
  • Retrieved documents
  • The model’s response

Larger context windows allow AI systems to handle longer conversations and larger documents.
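Because all of these components share one window, you can think of the context as a token budget. This sketch adds up hypothetical token counts for each component and checks how much of a made-up 8,192-token window remains:

```python
def context_usage(counts, window=8192):
    """Sum token counts of everything sharing the context window.

    The 8192-token window and the component counts below are
    made-up numbers for illustration only.
    """
    used = sum(counts.values())
    return used, window - used

used, remaining = context_usage({
    "system_instructions": 300,
    "conversation_history": 2500,
    "retrieved_documents": 1800,
    "prompt": 400,
    "reserved_for_response": 1000,
})
# used = 6000, remaining = 2192
```

If the total exceeds the window, something has to be shortened or dropped before the model can respond.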

Token Limits

Every model has a token limit. If the conversation becomes too long, older information may be removed from context.

This is why AI systems sometimes “forget” earlier parts of a conversation.
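One common strategy behind this "forgetting" is a sliding window: when the conversation exceeds the limit, the oldest messages are dropped first. The sketch below is a simplified version of that idea, assuming each message comes with a precomputed token count:

```python
def trim_history(messages, limit):
    """Drop the oldest messages until the total fits the limit.

    `messages` is a list of (text, token_count) pairs. This is a
    simplified sketch of the sliding-window strategy many chat
    systems use; real systems may also summarize old turns.
    """
    messages = list(messages)
    while sum(n for _, n in messages) > limit and len(messages) > 1:
        messages.pop(0)  # remove the oldest message first
    return messages

history = [("turn 1", 500), ("turn 2", 700), ("turn 3", 900)]
trim_history(history, limit=1700)  # drops "turn 1"
```

After trimming, the model literally never sees "turn 1" again, which is why it cannot recall it.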

Tokens and Pricing

AI APIs usually charge separately for:

  • Input tokens
  • Output tokens

Larger prompts and larger generated responses increase cost.
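A quick calculation shows how this splits out. The prices below are hypothetical (dollars per million tokens); real rates vary by provider and model, and output tokens often cost more than input tokens:

```python
def request_cost(input_tokens, output_tokens,
                 input_price=0.50, output_price=1.50):
    """Cost of one API call in dollars.

    Prices are hypothetical, expressed in dollars per million
    tokens; check your provider's actual pricing.
    """
    return (input_tokens * input_price
            + output_tokens * output_price) / 1_000_000

request_cost(input_tokens=2000, output_tokens=500)
# 0.001 + 0.00075 = 0.00175 dollars
```

Fractions of a cent per request add up quickly at scale, which is why trimming prompts and capping response length matters.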

Efficient prompting becomes important when building large-scale AI systems.

Tokens in Real Work

Understanding tokens becomes very important when:

  • Building AI applications
  • Working with APIs
  • Managing AI costs
  • Designing RAG systems
  • Creating AI agents
  • Handling long documents

Common Misconceptions

Many beginners assume that AI models understand complete documents naturally, the way humans do.

In reality, the model processes tokens mathematically within a limited context window.

Summary

Tokens are the basic units AI models use to process text. They affect context size, pricing, performance, and memory handling.

Understanding tokens is essential for building practical AI systems and working efficiently with LLMs.