Choosing AI Models

Understanding AI models and selecting the right one for your needs, hardware, and use cases.

What Are AI Models?

AI models are like different "brains" with varying capabilities. Think of them like different assistants - some are fast and good at simple tasks, others are slower but much smarter. The number in a model's name (like 3b or 7b) indicates its size in billions of parameters - larger models are generally more capable but need more memory and processing power.
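
If you are curious which models are already on your machine and how large they are, you can list them through Ollama directly. This is a minimal sketch, assuming a local Ollama install and the official ollama Python package (pip install ollama):

```python
# List locally downloaded models and their sizes on disk.
# Assumes a local Ollama server is running on the default port.
import ollama

installed = ollama.list()
print(installed)  # shows each downloaded model along with its size
```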

Popular Models

llama3.2:1b (~1.4 GB)

Ultra-lightweight model perfect for getting started. Fast responses with minimal resource usage.

Speed: Very Fast
Quality: Good
Best for: Quick responses, low-end hardware, simple tasks

llama3.2:3b (~2 GB) - Recommended

Perfect balance of speed and quality. Recommended for most users. Great for code generation, explanations, and general AI tasks.

Speed: Fast
Quality: Very Good
Best for: Most users, balanced performance, general tasks

llama3.2-vision:11b (~6.5 GB)

Vision-capable model that can analyze images and screenshots. Required for the img: command.

Speed: Moderate
Quality: Excellent
Best for: Image analysis (img: command), detailed responses
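
For context, image analysis with a vision model through Ollama's own API looks roughly like this. This is a sketch, assuming the official ollama Python package; the image path is just a placeholder:

```python
# Ask a vision-capable model to describe a local image file.
# Assumes llama3.2-vision:11b is already downloaded.
import ollama

response = ollama.chat(
    model="llama3.2-vision:11b",
    messages=[{
        "role": "user",
        "content": "Describe what is in this screenshot.",
        "images": ["screenshot.png"],  # placeholder path to a local image
    }],
)
print(response["message"]["content"])
```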

mistral:7b (~4.1 GB)

Larger model with excellent quality. Slower but provides more sophisticated responses.

Speed: Moderate
Quality: Excellent
Best for: High-quality responses, code generation

codellama:7b (~3.8 GB)

Specialized model trained on code. Best for programming tasks, code explanation, and debugging.

Speed: Moderate
Quality: Excellent (Code)
Best for: Developers, code generation and debugging

How to Choose a Model

Disk Space

Models can range from about 1 GB to more than 20 GB. Ensure you have enough free space. Smaller models are easier to manage.

RAM Requirements

1b models need about 2 GB of RAM, 3b models need 4-6 GB, 7b models need 8-12 GB, and larger models need 16 GB or more. More RAM generally means smoother, faster responses.
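
Before downloading a larger model, it can help to compare your machine against these numbers. The sketch below checks free disk space and installed RAM, assuming the third-party psutil package; the thresholds you pass in are just this guide's rules of thumb, not exact requirements:

```python
# Rough pre-download check: free disk space vs. model size, installed RAM vs.
# the suggested minimum. Assumes `psutil` is installed (pip install psutil).
import shutil
import psutil

def check_fit(model_size_gb: float, ram_needed_gb: float, path: str = ".") -> None:
    free_disk_gb = shutil.disk_usage(path).free / 1e9
    total_ram_gb = psutil.virtual_memory().total / 1e9
    print(f"Disk: need ~{model_size_gb} GB, {free_disk_gb:.1f} GB free")
    print(f"RAM:  need ~{ram_needed_gb} GB, {total_ram_gb:.1f} GB installed")

# Example: llama3.2:3b (~2 GB download, ~4-6 GB RAM suggested by this guide).
check_fit(model_size_gb=2, ram_needed_gb=6)
```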

Speed vs. Quality

Smaller models respond faster but may be less sophisticated. Larger models are smarter but slower. Find your sweet spot.

Task Type

Simple tasks work fine with smaller models. Complex reasoning, code generation, or image analysis benefit from larger models.

Downloading Models

  1. Open Typilot and go to the Models page
  2. Browse available models from the Ollama library
  3. Click the download button next to the model you want
  4. Wait for download to complete (this may take several minutes)
  5. Once downloaded, select it as your default model in Settings

Tip: Start with llama3.2:3b for a great balance. You can always download additional models later for different use cases!
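
The same models can also be pulled from the Ollama library outside the app. Here is a minimal sketch, assuming the official ollama Python package and a running local Ollama server:

```python
# Download a model from the Ollama library (this may take several minutes),
# then confirm it is available locally.
import ollama

ollama.pull("llama3.2:3b")

print(ollama.list())  # the newly pulled model should appear in this list
```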

Switching Models

You can have multiple models downloaded and switch between them in Settings. This lets you use a fast model for quick tasks and a larger model for complex work.

  1. Go to Settings in Typilot
  2. Find the "Default Model" dropdown
  3. Select the model you want to use
  4. Save your changes
  5. All future AI requests will use the selected model
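
For comparison, choosing a model per task when talking to Ollama directly looks roughly like this. This is a sketch, assuming the official ollama Python package and that both example models are already downloaded; the prompts are just examples:

```python
# Pick a different model per request: a small, fast model for quick questions
# and a larger model for heavier work.
import ollama

def ask(model: str, prompt: str) -> str:
    """Send a single-turn prompt to the given model and return its reply."""
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

# Fast model for a quick task...
print(ask("llama3.2:1b", "Give me a one-line summary of what JSON is."))

# ...larger model for more sophisticated responses.
print(ask("mistral:7b", "Explain the trade-offs between model size and response speed."))
```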