Why Prestige?
Multi-Provider Support
Use OpenAI, Anthropic, Groq, Google, or run locally with LM Studio/Ollama. You control your API costs.
Context Awareness
Prestige is aware of everything happening in your scene and reacts accordingly.
RAG Database
3,000+ indexed docs with semantic search, including extensive handmade examples from professionals.
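To illustrate what semantic search over an indexed doc set means in practice, here is a minimal sketch of a RAG-style retriever. Real systems use learned embeddings from an embedding model; a toy bag-of-words vector stands in here so the example is self-contained. All names and the sample docs are illustrative, not Prestige's actual API or index.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase bag-of-words counts.
    # A real retriever would call an embedding model here.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank every indexed doc by similarity to the query, return top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "VEX snippet: randomize point color by id",
    "SOP workflow: scatter points on a surface",
    "VEX snippet: noise-based point displacement",
]
print(search("vex point color", docs, k=1))
```

The retrieved snippets are then handed to the LLM as extra context, which is what lets hand-curated examples improve its answers.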
How It Works
1
Choose Your LLM Provider
Select from OpenAI, Anthropic, Groq, Google, or run locally with LM Studio/Ollama. Enter your API key and connect.
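Running locally is possible because LM Studio and Ollama expose OpenAI-compatible HTTP endpoints, so a client only needs to swap the base URL. The sketch below shows how such a request is typically assembled; the URL, port, and model name are common defaults used for illustration, not values taken from Prestige.

```python
import json

def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    # Build the endpoint URL and JSON body for an OpenAI-style
    # chat-completions call against any compatible provider.
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

# Local Ollama (default port 11434); no API key needed for local runs.
url, body = chat_request("http://localhost:11434/v1", "llama3",
                         "What's the error in this node?")
```

Pointing the same client at a hosted provider is just a different base URL plus your API key, which is why you stay in control of costs.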
2
Ask in Natural Language
"What's the error in this node?", "Generate a VEX code for this..." or "How can I make this code more efficient?". Prestige will understand the context of your scene and provide relevant suggestions.
3
Coming Soon - Graph Creation
Focus on the output and let Prestige handle the graph creation. Prestige is in active development, with many more features on the way.