
A New Era of Sentry


David Cramer


This content is out of date

Since this blog post was published, we've evolved the feature to reduce configuration complexity and automatically store the most valuable transaction data for you. Please see our docs on Performance at Scale for the latest information.


Today we are releasing Dynamic Sampling, available to all new customers and opt-in for existing customers. This is more than a new feature, however: it is an overhaul of the way we package Sentry's Performance Monitoring product. We are saying goodbye to the days of static, magic-number sampling configured within the SDK and moving to a world of flexibility.
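For context, here's the kind of static configuration we're moving away from, shown with Sentry's JavaScript SDK. The tracesSampleRate and tracesSampler options are real SDK options; the DSN and the rates themselves are placeholders.

```typescript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  // A static "magic number" picked up front and rarely revisited as
  // traffic grows or shifts.
  tracesSampleRate: 0.2,
  // Or, for teams micro-managing further, a hand-rolled sampler
  // (when defined, it takes precedence over tracesSampleRate).
  tracesSampler: (ctx) => {
    if (ctx.transactionContext.name === "GET /healthcheck") {
      return 0; // drop noisy health checks entirely
    }
    return 0.2;
  },
});
```

Every one of those numbers is a guess the team has to make, and keep making, as traffic changes.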

Sampling by Design

For a few months now we've been kicking the tires on Dynamic Sampling (DS), letting customers experiment with more volume while we picked up the cost on our side. If we're honest, we needed the time to figure out how to monetize the product, but also to align our longer-term vision with this major addition of new technology. With DS we were fundamentally changing how Sentry works. Our job was shifting from maximizing the value of stored (indexed) events to maximizing the value of all data we see, while making the impact of sampling nearly invisible.

To do that we went back and forth on a few decisions, but eventually settled on the principle that sampling is not something customers should have to manage. We wanted a low-friction, maximum-value product offering. Asking a customer to fine-tune or pick a magic number for their sampling rate did not achieve those goals. At the same time, we knew that some kinds of customers needed the ability to influence how data was sampled: a social network like Twitter might treat sampling decisions very differently than a B2B tool or a financial institution like American Express.

Achieving both flexibility and low friction required us to remove the micro-management of sampling entirely. We'd make an intelligent decision on a baseline sample rate (which, in all honesty, we may not get right on our first attempt), but we'd also find ways to capture more value when it made sense. For example, we're currently experimenting with moving our new performance issue detection to the Edge, meaning we could identify unique issues across your entire stream of data, not just within the smaller set of samples we capture.
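To make that concrete, here's a rough sketch of what detection-before-sampling looks like. Every name here is invented for illustration; this is not our actual pipeline code.

```typescript
interface Span {
  op: string;
  durationMs: number;
}

interface Transaction {
  name: string;
  durationMs: number;
  spans: Span[];
}

// Cheap, aggregate-only bookkeeping we can afford on the full stream.
const issueCounts = new Map<string, number>();
function recordIssueFingerprint(fingerprint: string): void {
  issueCounts.set(fingerprint, (issueCounts.get(fingerprint) ?? 0) + 1);
}

// Toy detector: flag an N+1-style pattern of many repeated DB spans.
function detectPerformanceIssues(txn: Transaction): string[] {
  const dbSpans = txn.spans.filter((s) => s.op === "db.query");
  return dbSpans.length > 20 ? [`n-plus-one:${txn.name}`] : [];
}

// The key ordering: inspect *every* transaction in the stream, and only
// then decide whether the full event is stored.
function handleAtEdge(txn: Transaction, baselineRate: number): Transaction | null {
  for (const fingerprint of detectPerformanceIssues(txn)) {
    recordIssueFingerprint(fingerprint);
  }
  return Math.random() < baselineRate ? txn : null;
}
```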

The most interesting outcome of this was how it affected pricing and packaging. In this world the value of a stored event is enormous, but the value of the total stream of data is minimal. That created complexity in how we would pass through the cost of analyzing the stream: if we only charged you for stored events, we'd never be able to sustain the costs of the sampling chain. It meant we had to fundamentally change our business model.

In this new sampling world, you no longer have to think about micromanaging which data points to sample or how much it will cost you to store data. Sentry will help you achieve the right level of fidelity at a price point that's fair, and also ensure that both you, the customer, and our product team focus on storing the data that's most relevant and actionable.

Opting In

We're rolling this out iteratively given it's such a large change. Today, that means it's only going live for new customers. We'll be actively watching the total volume of data coming into the system to make sure we can maintain service levels and scale up accordingly, which means we are focused on customers sending hundreds of millions of transactions today, rather than some of our larger users who are sending tens of billions.

That said, existing customers will be getting this functionality soon. If you're interested, we may be able to opt you in today; just contact your account rep or customer support.

Once you're on the new model and you've sent more than five million transactions, you'll begin to see Dynamic Sampling kick in. Right now it's fairly naive, and simply attempts to create a decreasing baseline of fidelity: at 100 million transactions the target might be something like 50% fidelity, while at one billion transactions it will be much lower. We won't give you complete control over the sample rate, but we will give you transparency.
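As an illustration, the anchor points below echo the figures above: 100% fidelity up to five million transactions, roughly 50% at 100 million, and much lower at one billion. The log-scale interpolation between them is our invention for this sketch, not the actual algorithm.

```typescript
// [volume, fidelity] anchor points; the 1B value is a guess for the sketch.
const anchors: [volume: number, fidelity: number][] = [
  [5_000_000, 1.0],
  [100_000_000, 0.5],
  [1_000_000_000, 0.1],
];

function baselineFidelity(monthlyVolume: number): number {
  if (monthlyVolume <= anchors[0][0]) return 1.0; // below threshold: keep everything
  for (let i = 0; i < anchors.length - 1; i++) {
    const [v0, f0] = anchors[i];
    const [v1, f1] = anchors[i + 1];
    if (monthlyVolume <= v1) {
      // Interpolate on a log scale, since volume spans orders of magnitude.
      const t = Math.log(monthlyVolume / v0) / Math.log(v1 / v0);
      return f0 + t * (f1 - f0);
    }
  }
  return anchors[anchors.length - 1][1];
}

console.log(baselineFidelity(100_000_000));            // 0.5
console.log(baselineFidelity(500_000_000).toFixed(2)); // between 0.5 and 0.1
```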

Additionally, we are launching with our first set of controls for Dynamic Sampling, allowing you to increase fidelity in several scenarios. This is an area we expect to invest in heavily over the coming months. The team already has some great ideas (sketched in code after this list), including:

  • Automatically increasing fidelity for lower-volume projects to resolve quota and budget challenges

  • Sampling transactions when an error occurs, giving you a complete picture

  • Capturing samples of low-volume transactions that might otherwise be drowned out
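To make these concrete, here's a rough sketch of such controls expressed as per-transaction rules layered on top of the baseline rate. The rule shapes and thresholds are invented for illustration and are not our actual configuration format.

```typescript
interface TxnContext {
  project: string;
  transactionName: string;
  hasError: boolean;            // an error was captured during this transaction
  projectMonthlyVolume: number; // transactions/month for the whole project
  txnMonthlyVolume: number;     // transactions/month for this transaction name
}

function effectiveSampleRate(ctx: TxnContext, baseline: number): number {
  // Transactions tied to an error: always keep, for the complete picture.
  if (ctx.hasError) return 1.0;
  // Low-volume projects: full fidelity, so quota never becomes a concern.
  if (ctx.projectMonthlyVolume < 1_000_000) return 1.0;
  // Low-volume transactions: boost them so hot paths don't drown them out.
  if (ctx.txnMonthlyVolume < 10_000) return Math.max(baseline, 0.5);
  return baseline;
}
```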

We know we still have a lot of work to do on the product side, but we're excited to hear what you think and what you'd like to see. Long term, we expect to bring this functionality to the rest of the Sentry product: we're exploring ways we can improve our error monitoring with sampling, as well as how it will apply to our upcoming Continuous Profiling and Session Replay products.
