Introducing Environments in Vellum: Isolate, Promote, and Deploy with Confidence
6 min

A first-class way to manage your work across Development, Staging, and Production.

Author
Akash Sharma
Jul 17, 2025
Product Updates

Today, we’re launching Environments: a first-class concept in Vellum for managing your work across development, staging, and production with proper isolation and full control.

If you’ve juggled multiple deployments, managed several release tags, struggled with API rate limits from non-production traffic, or wondered which version of a workflow is live in production, this one’s for you.

Now, you can clearly separate environments, promote releases between them, and manage your deployments well so your team moves fast and ships with confidence.

👉 Try Environments now

How It Works

1. Start with Your Environments

Every Vellum Workspace starts with a Production environment. You can create more — Development, QA, Staging, etc. — depending on your team’s setup.

Each environment is isolated. It gets its own:

  • API keys
  • Documents
  • Release history
  • Monitoring data

You can create a new environment from the Workspace settings page.

2. Set environment-scoped API keys

Each environment gets its own API keys. This is especially useful for secrets, which need different values in development, staging, and production.
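One common pattern is to store each environment’s Vellum API key in its own process environment variable and pick the right one at deploy time. A minimal sketch (the variable names here are our own convention, not something Vellum mandates):

```python
import os

# Illustrative naming scheme: one Vellum API key per environment,
# each stored in a separate process environment variable.
KEY_VARS = {
    "development": "VELLUM_API_KEY_DEV",
    "staging": "VELLUM_API_KEY_STAGING",
    "production": "VELLUM_API_KEY_PROD",
}

def api_key_for(environment: str) -> str:
    """Return the API key scoped to the given environment."""
    var = KEY_VARS[environment.lower()]
    key = os.environ.get(var)
    if key is None:
        raise RuntimeError(
            f"{var} is not set; generate a key for the "
            f"{environment} environment in Vellum first."
        )
    return key

# Example: a deploy script selects the key for its target environment.
os.environ["VELLUM_API_KEY_STAGING"] = "sk-staging-example"  # demo value only
print(api_key_for("staging"))  # -> sk-staging-example
```

This keeps production credentials out of development tooling entirely: a dev machine that never has `VELLUM_API_KEY_PROD` set simply cannot hit production.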

3. Deploy to the right Environment

When you’re ready to deploy a Prompt or Workflow, you can choose which environment(s) to deploy to. Just select one or more targets, hit deploy, and a new Release gets cut in each environment you selected.

4. Promote a Release Between Environments

Finished testing in Development? You can promote a release directly to Staging or Production, with no need to manually redeploy. Just click “Promote” on a Release and select the Environment it should be promoted to.
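The point of promotion over redeployment is that the exact artifact you tested is what ships. A toy model of that semantic (illustrative only, not the Vellum SDK):

```python
# Promote-vs-redeploy semantics: promoting copies the already-cut release
# into the target environment's history, so nothing is rebuilt.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Release:
    workflow: str
    version: str

@dataclass
class Environment:
    name: str
    releases: list = field(default_factory=list)

def promote(release: Release, target: Environment) -> None:
    # No rebuild step: the same Release lands in the target's history.
    target.releases.append(release)

dev, prod = Environment("Development"), Environment("Production")
r = Release("support-agent", "1.4.0")
dev.releases.append(r)           # cut in Development
promote(r, prod)                 # promoted, not redeployed
assert prod.releases[-1] is r    # identical artifact now in Production
```

Redeploying from source could pick up untested changes; promotion, by construction, cannot.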

5. Use Environment Variables to Customize Behavior

Environments also support Environment Variables, which let you define constants or reference secrets that differ by environment. Think: a unique FIRECRAWL_API_KEY in dev vs. prod.

You define your variables once, and then reference them in any workflow node. Vellum will automatically resolve the right value based on which environment is executing the workflow.

This allows your AI system’s logic to stay the same, while using a different API key based on which Environment it's running in.
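Conceptually, resolution works like a per-environment lookup table: the workflow references one name, and the executing environment supplies the value. A sketch of that model (illustrative, not Vellum internals; the key values are placeholders):

```python
# Per-environment variable table: one name, different values.
ENV_VARS = {
    "development": {"FIRECRAWL_API_KEY": "fc-dev-key"},
    "production":  {"FIRECRAWL_API_KEY": "fc-prod-key"},
}

def resolve(name: str, environment: str) -> str:
    """Resolve a variable to the value defined for the executing environment."""
    return ENV_VARS[environment][name]

def run_workflow(environment: str) -> str:
    # The workflow logic is identical everywhere; only the resolved
    # value changes with the environment that executes it.
    key = resolve("FIRECRAWL_API_KEY", environment)
    return f"crawling with {key}"

print(run_workflow("development"))  # -> crawling with fc-dev-key
print(run_workflow("production"))   # -> crawling with fc-prod-key
```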

Best Practices

Here’s how we recommend setting up your environment strategy:

  • Development: for building and iterating
  • Staging: for QA and stakeholder testing
  • Production: for live, user-facing workflows

API Key Management

  • Use separate API keys for each environment
  • Regularly rotate API keys, especially for production environments
  • Never use production API keys in development or testing

Release Management

  • Use descriptive Release Tags that follow semantic versioning
  • Test thoroughly in development and staging before promoting to production
  • Maintain clear documentation of what changes are included in each release
  • Use the LATEST tag for most deployments to simplify your process
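Semver-style tags have the nice property that they sort by precedence, which makes “what’s the newest release?” a one-liner in your tooling. A small helper (the `vMAJOR.MINOR.PATCH` format is our convention here, not something Vellum requires):

```python
def parse_tag(tag: str) -> tuple:
    """Turn a semver-style tag like 'v2.1.0' into (2, 1, 0) for comparison."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

# Numeric tuples compare correctly where raw strings would not
# (e.g. "v10.0.0" sorts after "v9.0.0").
tags = ["v2.1.0", "v1.9.3", "v2.0.0"]
latest = max(tags, key=parse_tag)
print(latest)  # -> v2.1.0
```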

Monitoring and Observability

  • Monitor costs and usage patterns separately across environments
  • Use Environment-specific webhooks and integrations
  • Set up alerts for production environments

What’s Next

This is just the beginning. We’ll soon be adding:

  • Model provider API keys per environment
  • CI/CD-friendly tooling for automated promotions and releases

We’re excited to help you scale your AI development with the same rigor and structure as any other production system.

Try it Now

Environments are now generally available to all Vellum users. You now have proper isolation, a clear audit trail, and safer testing and rollout. It’s the easiest way to bring structure to your AI deployment process.

👉 Sign up or log in to get started

Have feedback or want to show us your setup? We’d love to hear from you.

ABOUT THE AUTHOR
Akash Sharma
Co-founder & CEO

Akash Sharma, CEO and co-founder at Vellum (YC W23), is enabling developers to easily start, develop, and evaluate LLM-powered apps. Having talked to over 1,500 people at varying stages of using LLMs in production, he has acquired a unique understanding of the landscape and is actively sharing his learnings with the broader LLM community. Before starting Vellum, Akash completed his undergrad at the University of California, Berkeley, then spent 5 years at McKinsey's Silicon Valley office.
