June 11, 2024

Announcing an Investment from InvestInData

Guest Post

We’re thrilled to announce our investment in Vellum, an AI product development platform that empowers companies to build high-quality AI applications.

At InvestInData, we are always on the lookout for top startups pushing the boundaries of data and AI. When we met the Vellum team, we were immediately impressed by their deep understanding of the challenges faced by companies developing AI-powered features.

The founders, Akash Sharma (CEO), Noa Flaherty (Co-CTO), and Sidd Seethepalli (Co-CTO), bring a wealth of experience from their time at McKinsey & Company, Dover, and Quora, where they saw firsthand the pain points associated with building AI applications.

Inspired by these experiences, the founders designed Vellum to meet these needs, and today many product and engineering teams use its suite of tools—Workflows, Evaluations, and Deployments—to build agentic workflows.


The Opportunity

The rapid adoption of large language models (LLMs) opens up significant opportunities for companies to build with AI, but it also introduces a distinct set of challenges:

  • Prompts are non-deterministic and require extensive evaluation before and after deployment
  • Testing various prompts, models, and retrieval strategies is cumbersome
  • Managing and versioning prompts in code is difficult
  • Collaborating with non-technical stakeholders on prompt development is not straightforward
  • Multi-step LLM applications compound these prompt engineering and evaluation challenges
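The first two pain points—non-determinism and cumbersome testing—are why teams end up building evaluation harnesses. As a rough illustration (this is not Vellum's API; `call_llm` is a hypothetical stand-in for any real model client), a minimal regression suite over a prompt might look like:

```python
# Minimal sketch of a prompt-evaluation harness (hypothetical; not Vellum's API).
# call_llm stands in for any real model client; here it is a deterministic stub.

def call_llm(prompt: str) -> str:
    # Stub: a real implementation would call a model API here.
    return "Paris" if "capital of France" in prompt else "unknown"

def evaluate(prompt_template: str, test_cases: list[dict]) -> float:
    """Run each test case through the model and return the pass rate."""
    passed = 0
    for case in test_cases:
        output = call_llm(prompt_template.format(**case["inputs"]))
        if case["expected"].lower() in output.lower():
            passed += 1
    return passed / len(test_cases)

cases = [
    {"inputs": {"question": "What is the capital of France?"}, "expected": "Paris"},
    {"inputs": {"question": "What is the capital of Peru?"}, "expected": "Lima"},
]
score = evaluate("Answer concisely: {question}", cases)
print(f"pass rate: {score:.0%}")  # → pass rate: 50% with this stub
```

Running a suite like this before and after every prompt or model change surfaces regressions that non-determinism would otherwise hide; in practice each case would be run multiple times and averaged, which is the kind of workflow evaluation tooling automates at scale.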

As practitioners, we’ve felt these pain points firsthand as we’ve tried to balance our desire to rapidly ship new LLM-enabled products with the enterprise-level requirements our customers demand.

The Product

Vellum's platform tackles these issues head-on, providing tools across the AI product development lifecycle. Customers use Vellum for prompt engineering, document indexing, and LLM evaluations, as well as to build multi-step LLM workflows and deploy them to production.

What we found most exciting was that Vellum was built with both technical and non-technical users in mind. Customers spoke about how bringing in non-technical users helped them iterate more quickly while increasing the quality of their LLM apps. 

Why Vellum

What impressed us most about Vellum is the team's deep understanding of their target market and the rapid traction they've achieved. Vellum has already onboarded over 150 paying customers across industries including SaaS, e-commerce, consumer, and fintech, within its first 15 months of operation. The platform is flexible enough to delight customers ranging from two-person startups to public tech companies with more than 10,000 employees.

As Vellum continues to grow, we believe their advantage lies in their ability to embed deeply within a company's workflow, acting as an extension of their backend logic. Customers love the product, and our angel investing collective of 50+ leading data executives is uniquely positioned to support Vellum's growth. Our deep expertise in data, ML/AI, and enterprise sales will be invaluable as Vellum expands upmarket and refines its product offering.

Here's what some of our members had to say about Vellum:

“If your Gen AI experience is similar to mine you will find Vellum useful. Let’s say you want to set up a RAG use case. It’s super easy to set up. You built a chatbot and people start using it. Initial adoption is very fast. But soon you discover that people start asking it questions that you did not anticipate. And the quality of answers is not good. You find yourself constantly updating the documentation. You fix a question and a different question breaks. You want to change embedding models because it’s not retrieving answers from the right location. The tools you have are no good for maintaining a repository of unit test questions and performance of those questions as you iterate. This is what Vellum helps you do. Vellum has identified the real pain points people run into as they deploy Gen AI. And it makes it efficient to test a wide variety of use cases.”

Amit Das, Data @ Klaviyo
“Based on my experience with building Generative AI applications, I am excited to see how Vellum has adopted an innovative approach to streamline and enhance efficiencies through an out-of-the-box development toolkit for some critical steps in the development process. I expect their product to bridge the gap between POC (proof of concept) and production-ready builds, enabling faster deployments for AI-driven solutions.”

Barkha Saxena, Chief Data Officer @ Chime
“I recognize the immense potential of AI in transforming our interactions and communication. Vellum's platform empowers companies to develop AI-powered features swiftly and efficiently, all while maintaining high standards of quality. Their commitment to fostering collaboration between technical and non-technical users truly sets them apart. I am thrilled to witness the innovative applications that will emerge from their platform.”

Khatereh Khodavirdi, VP of AI Product @ PayPal

Looking Ahead

We believe Vellum is well-positioned to become a critical component of every company’s AI stack. As companies increasingly adopt AI, concerns around data privacy, PII masking, audit trails, and prompt injection will become more pressing. Vellum's unique position in the flow of information for AI-powered features puts them in an ideal spot to address these challenges for data and engineering leaders.

At InvestInData, we are committed to supporting startups that are pushing the boundaries of innovation in data and AI. Vellum's vision, strong founding team, and impressive traction made them a natural fit for our portfolio. We look forward to partnering with Akash, Noa, Sidd, and the entire Vellum team as they continue to empower companies to build groundbreaking AI applications.

A note from Vellum's CEO, Akash Sharma:

"We are thrilled to have InvestInData join us on our journey to transform AI product development. Their deep expertise in the data and AI space, combined with their extensive network of industry leaders, makes them an invaluable partner as we scale our business. We look forward to leveraging their insights and support as we continue to empower companies to build cutting-edge AI applications."



Angel Investor Collective

InvestInData (IID) is an angel investor collective composed of 50+ leading data executives from across the United States. Together, we invest in early-stage startups that we believe are pushing the frontiers of innovation in data.
