@cf/meta/llama-3.2-11b-vision-instruct vs @cf/deepgram/aura-1
@cf/meta/llama-3.2-11b-vision-instruct is a text-and-image instruction model with a 128K-token context window, while @cf/deepgram/aura-1 is Deepgram's text-to-speech model, so a token-based context window does not apply to it. Compare the full specs and pricing below and choose the model that fits your use case.
Quick Overview
@cf/meta/llama-3.2-11b-vision-instruct
Cloudflare Workers AI
128K tokens context • $0.05 input / $0.68 output per 1M tokens
@cf/deepgram/aura-1
Cloudflare Workers AI
Text-to-speech model (no token context window) • $0.01 input / $0.01 output per 1M tokens
Detailed Comparison
| Specification | @cf/meta/llama-3.2-11b-vision-instruct | @cf/deepgram/aura-1 |
| --- | --- | --- |
| Provider | Cloudflare Workers AI | Cloudflare Workers AI |
| Context Window | 128K tokens | 0 tokens |
| Max Output Tokens | 128K tokens | 0 tokens |
| Input Pricing (per 1M tokens) | $0.05 | $0.01 |
| Output Pricing (per 1M tokens) | $0.68 | $0.01 |
| Release Date | Sep 2024 | Aug 2025 |
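To make the pricing rows concrete, here is a small sketch that estimates the cost of a single @cf/meta/llama-3.2-11b-vision-instruct request from the listed per-token prices; the token counts are illustrative placeholders, not measurements.

```typescript
// Rough per-request cost estimate from the listed prices
// ($0.05 / $0.68 per 1M input / output tokens). Token counts are illustrative.
const INPUT_USD_PER_MTOK = 0.05;
const OUTPUT_USD_PER_MTOK = 0.68;

function requestCostUSD(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_USD_PER_MTOK +
    (outputTokens / 1_000_000) * OUTPUT_USD_PER_MTOK
  );
}

// Example: 10,000 prompt tokens and 500 generated tokens
// => 0.01 * $0.05 + 0.0005 * $0.68 = $0.0005 + $0.00034 = $0.00084
console.log(requestCostUSD(10_000, 500).toFixed(5)); // "0.00084"
```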
Capabilities
The capability comparison covers Text Generation and Function Calling. @cf/meta/llama-3.2-11b-vision-instruct generates text from text and image input; @cf/deepgram/aura-1 produces speech audio rather than text, so neither capability applies to it.
Which Model Should You Choose?
Choose @cf/meta/llama-3.2-11b-vision-instruct if:
- You need a larger context window (128K tokens) or image-plus-text prompts (see the sketch below)
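For orientation, a minimal Worker sketch calling the model through the Workers AI binding follows. It assumes an `AI` binding configured in your wrangler config and the `Ai` type from `@cloudflare/workers-types`; the input shape (a `prompt` plus `image` as a byte array) follows Cloudflare's published vision-model examples and may differ for your account or model version, and the image URL is a placeholder.

```typescript
// Hypothetical Worker: describe an image with @cf/meta/llama-3.2-11b-vision-instruct.
export interface Env {
  AI: Ai; // Workers AI binding configured in wrangler
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Placeholder image; replace with your own source.
    const imageResponse = await fetch("https://example.com/photo.jpg");
    const imageBytes = [...new Uint8Array(await imageResponse.arrayBuffer())];

    const result = await env.AI.run("@cf/meta/llama-3.2-11b-vision-instruct", {
      prompt: "Describe this image in one sentence.",
      image: imageBytes, // byte array, following Cloudflare's vision-model examples
    });

    return Response.json(result);
  },
};
```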
Choose @cf/deepgram/aura-1 if:
- You need text-to-speech output and cost efficiency is a priority (see the sketch below)
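A matching sketch for speech synthesis with @cf/deepgram/aura-1, under stated assumptions: the `text` input field and the audio response handling are guesses based on Workers AI's other text-to-speech models, and the model identifier may need a cast if your installed `@cloudflare/workers-types` does not list it yet. Check the model's schema in the Workers AI catalog before relying on this.

```typescript
// Hypothetical Worker: synthesize speech with @cf/deepgram/aura-1.
export interface Env {
  AI: Ai;
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Assumed input shape: a plain `text` field (verify against the model schema).
    const audio = await env.AI.run("@cf/deepgram/aura-1", {
      text: "Hello from Workers AI.",
    });

    // The result may be a stream or an object containing encoded audio
    // depending on the model; here it is passed straight back to the client.
    return new Response(audio as BodyInit, {
      headers: { "Content-Type": "audio/mpeg" },
    });
  },
};
```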