
AI Usage is now included in our Enterprise Plans

Written by Support
Updated today

Note: This article is for reference only. AI features are now included. Get in touch to learn more.

What are tokens?

Tokens are the basic units we use to measure AI usage on our platform. Think of a token as a piece of a word: on average, 1 token ≈ 4 characters, or about ¾ of a word. Both the text you send to our AI and the responses you receive consume tokens.
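The 1-token-≈-4-characters rule of thumb can be turned into a quick back-of-the-envelope estimator. This is an illustrative sketch only: actual token counts depend on the specific model's tokenizer and will vary from this estimate.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the 1 token ≈ 4 characters rule of thumb.

    Illustrative only: real tokenizers split text differently per model,
    so treat the result as a ballpark figure, not a billing number.
    """
    return max(1, round(len(text) / 4))

# A 40-character string comes out at roughly 10 tokens.
print(estimate_tokens("Tokens are pieces of words, roughly."))
```

For precise counts, use the tokenizer that matches the model you are calling rather than a character-based estimate.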

Token usage examples

Here's how many tokens common tasks typically use:

Document Processing

  • Reviewing a 10-page document: ~11,000 tokens

  • Summarising a 5-page report: ~6,000 tokens

  • Analysing a 20-page contract: ~22,000 tokens

Data Analysis

  • Analysing a database with 100 rows: ~5,000 tokens

  • Creating a data summary report: ~4,000 tokens

  • Generating insights from data: ~8,000 tokens

Token estimation guide

To estimate your token usage:

  • 1 page of text ≈ 500 words ≈ 1,000-1,200 tokens

  • 1 paragraph ≈ 100 words ≈ 150-200 tokens

  • 1 sentence ≈ 15-20 words ≈ 20-30 tokens
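The estimation guide above can be sketched as a small lookup helper. The unit names and ranges below are taken directly from the guide; the function itself is a hypothetical convenience, not part of the platform, and its output is an estimate rather than a billed amount.

```python
# (low, high) token ranges per unit of text, straight from the guide above.
TOKEN_RANGES = {
    "page": (1_000, 1_200),   # ~500 words
    "paragraph": (150, 200),  # ~100 words
    "sentence": (20, 30),     # ~15-20 words
}

def estimate_usage(unit: str, count: int) -> tuple[int, int]:
    """Return a (low, high) token estimate for `count` units of text."""
    low, high = TOKEN_RANGES[unit]
    return count * low, count * high

low, high = estimate_usage("page", 10)
print(f"A 10-page document: roughly {low:,}-{high:,} tokens")
```

Note that a 10-page document lands at 10,000-12,000 tokens, which matches the ~11,000-token figure in the usage examples above.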

Billing details

  • Pay as you go: Only pay for what you use

  • No hidden fees: Simple, transparent pricing

  • Usage tracking: Real-time token usage monitoring in your billing dashboard
