mohamed777
Contributor
Yes, Perplexity limits context to 32k tokens per prompt. I've been using Perplexity, but I find myself going back to claude.ai for coding-related tasks. Even though Perplexity offers the same model (Sonnet 4), it doesn't give the same level of output as claude.ai. It might have something to do with how they truncate the context window in their API calls.