@cf/meta/llama-3.2-1b-instruct vs GLM 4.5V

GLM 4.5V includes advanced reasoning and supports vision. Compare full specifications and pricing to choose the best model for your use case.

Quick Overview

@cf/meta/llama-3.2-1b-instruct

Cloudflare Workers AI

60K tokens context • $0.03 input / $0.20 output per 1M tokens


GLM 4.5V

Z.AI Coding Plan

64K tokens context • Free

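To give a sense of how the Cloudflare-hosted model is typically invoked, here is a minimal Workers AI sketch in TypeScript. It assumes a Worker with an AI binding (named AI here) configured in wrangler.toml; the binding name and prompt text are illustrative, not part of the comparison data above.

```typescript
// Minimal Cloudflare Workers AI sketch. Assumption: an AI binding named "AI"
// is configured for this Worker in wrangler.toml. The real binding type `Ai`
// ships with @cloudflare/workers-types; a structural stand-in is used here to
// keep the snippet self-contained.
export interface Env {
  AI: { run(model: string, inputs: Record<string, unknown>): Promise<unknown> };
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Chat-style call to the 1B instruct model hosted on Workers AI.
    const result = await env.AI.run("@cf/meta/llama-3.2-1b-instruct", {
      messages: [
        { role: "system", content: "You are a concise assistant." },
        { role: "user", content: "Explain context window vs max output tokens in two sentences." },
      ],
    });
    return Response.json(result);
  },
};
```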

Detailed Comparison

Specification                       @cf/meta/llama-3.2-1b-instruct   GLM 4.5V
Provider                            Cloudflare Workers AI            Z.AI Coding Plan
Context Window                      60K tokens                       64K tokens
Max Output Tokens                   60K tokens                       16K tokens
Input Pricing (per 1M tokens)       $0.03                            Free
Output Pricing (per 1M tokens)      $0.20                            Free
Release Date                        Sep 2024                         Aug 2025
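
Because only the Cloudflare model is metered, the pricing rows reduce to simple arithmetic. The sketch below works the numbers for a hypothetical workload of 200K input and 50K output tokens per day; the token counts are made up for illustration.

```typescript
// Pricing arithmetic for @cf/meta/llama-3.2-1b-instruct, using the rates from
// the table above ($0.03 input / $0.20 output per 1M tokens). The daily token
// counts are hypothetical.
const INPUT_RATE_PER_MILLION = 0.03;
const OUTPUT_RATE_PER_MILLION = 0.2;

function dailyCostUSD(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_RATE_PER_MILLION +
    (outputTokens / 1_000_000) * OUTPUT_RATE_PER_MILLION
  );
}

// 200,000 input + 50,000 output tokens ≈ $0.006 + $0.010 = $0.016 per day.
// GLM 4.5V is listed as free under the Z.AI Coding Plan, so the comparable
// figure there is $0.
console.log(dailyCostUSD(200_000, 50_000).toFixed(3));
```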

Capabilities

Capability                @cf/meta/llama-3.2-1b-instruct   GLM 4.5V
Text Generation           Yes                              Yes
Function Calling
Vision                    No                               Yes
Video Understanding
Advanced Reasoning        No                               Yes
File Attachments
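
Since the overview notes that GLM 4.5V supports vision, a multimodal request generally pairs text with an image reference in the message body. The sketch below assumes an OpenAI-compatible chat endpoint; the base URL, API key, and model id string are placeholders rather than values confirmed by this page.

```typescript
// Sketch only: assumes the GLM 4.5V provider exposes an OpenAI-compatible
// /chat/completions endpoint. BASE_URL, API_KEY, and the model id string are
// illustrative placeholders, not confirmed values.
const BASE_URL = "https://api.example-provider.invalid/v1"; // placeholder
const API_KEY = "sk-placeholder";                           // placeholder

async function describeImage(imageUrl: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "glm-4.5v", // placeholder id
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: "Describe this image in one sentence." },
            { type: "image_url", image_url: { url: imageUrl } },
          ],
        },
      ],
    }),
  });
  const data: any = await res.json();
  return data?.choices?.[0]?.message?.content ?? "";
}
```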

Which Model Should You Choose?

Choose @cf/meta/llama-3.2-1b-instruct if:

• You need a higher output limit (60K max output tokens vs 16K)
• You are already building on Cloudflare Workers AI

Choose GLM 4.5V if:

• You need a larger context window
• Cost efficiency is a priority
• You need advanced reasoning