Introducing Optimizely Data Lab
As we kick off Opticon19 today, we’d like to share something we’ve been working very hard on. It’s the first step in a direction we’re excited about, and I think many of you will be, too. Ten years ago, Dan Siroker and I founded Optimizely to help people use experiments to make better decisions. That idea got me out of bed in the morning when we started, and it’s still getting me out of bed ten years later.
Back then, when it came to collecting, analyzing, and making sense of their experiment data, most teams were struggling with the basics, so that’s what we focused on. Our experiment results page, built with Stats Engine, was designed to help customers view results in real time and make trusted decisions with data. This helps organizations scale their experimentation programs because it gives everyone a consistent, “self-serve” way of looking at experiment results, and it “just works.”
But that focus on simplicity and scale comes with a trade-off: flexibility. Here’s a view of the data pipeline that powers Optimizely’s results page:
Each of the components in this pipeline is managed by Optimizely. They’re designed to work seamlessly together, and because of that they’re kind of a package deal.
The world has changed
Today, many organizations have replaced off-the-shelf, all-in-one analytics tools with powerful, centralized data stacks in order to provide their teams with a consistent view of their customer data and business metrics. Many have built out centralized data teams tasked with building that infrastructure and drawing insights from that data.
And these data teams LOVE experimentation. They’re experts, and they play a big role in increasing an organization’s testing maturity because they bring structure, energy, and sophisticated techniques to the work of answering hard business questions with experiment data. For data experts like these, flexibility is often more important than simplicity.
We’re changing with it
Today, I’d like to share our vision for the future of Optimizely data, and some things we’ll be releasing over the coming months to give data teams the flexibility they need to answer difficult questions with their experiments.
The core of our vision is to break Optimizely’s results pipeline into interchangeable components for collecting data, measuring metrics, applying statistics, and building and sharing reports. Our goal is to provide data teams with the flexibility to combine these components with their own systems and data so they can answer their most difficult business questions with experiments.
We’re excited to introduce Optimizely Data Lab, a solution for data teams looking to uncover deeper insights from experiments.
Optimizely Data Lab is a powerful toolkit for experiment analysis and report-building. It includes:
- Enriched events, a new queryable experiment dataset with fine-grained event data such as attributes and event tags.
- A set of powerful experiment analysis libraries containing a variety of statistical tests, data visualization tools, and connectors for working with Optimizely and third-party experiment data.
- The experiment lab: a portable environment for performing ad-hoc experiment analysis and building experiment reports with Jupyter Notebooks.
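To make the “queryable experiment dataset” idea concrete, here’s a minimal sketch of the kind of analysis a notebook could run over exported event data. The column names and values below are hypothetical illustrations, not the actual enriched events schema:

```python
import pandas as pd

# Hypothetical sketch: mimic a small slice of exported event data and
# compute a per-variation conversion rate. Column names are illustrative.
events = pd.DataFrame({
    "visitor_id":   ["v1", "v1", "v2", "v3", "v4", "v4", "v5"],
    "variation_id": ["control", "control", "treatment", "control",
                     "treatment", "treatment", "control"],
    "event_name":   ["view", "purchase", "view", "view",
                     "view", "purchase", "view"],
})

# Unique visitors and unique converters per variation
visitors   = events.groupby("variation_id")["visitor_id"].nunique()
converters = (events[events["event_name"] == "purchase"]
              .groupby("variation_id")["visitor_id"].nunique())

conversion_rate = (converters / visitors).fillna(0.0)
print(conversion_rate)
```

In a real notebook, the `events` frame would come from a query against the enriched events dataset rather than an inline literal, but the groupby-and-divide shape of the analysis would be the same.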
What can you do with this?
If you’re a data scientist, engineer, or analyst, you can use Optimizely Data Lab to:
- customize the look and feel of your experiment reports using our analysis libraries and open source technology that many already know and love: Jupyter Notebooks.
- incorporate complex metrics and segments by querying our new enriched events dataset.
- join this event data with critical business data in other platforms, or even rely on another source of truth, whether it’s an off-the-shelf analytics platform like Adobe or a powerful data warehouse like Snowflake or BigQuery.
- use Optimizely’s Stats Engine to estimate experiment effects with first and third-party experiment data.
- incorporate alternative statistical models, like our built-in implementation of the T-test, a new Bayesian inference model, or your own custom models.
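As a sketch of what swapping in an alternative statistical model might look like, here’s a hand-rolled Welch’s t-test over fabricated per-visitor revenue data. This is a plain frequentist t-test, not Optimizely’s Stats Engine, and every name and number in it is illustrative:

```python
import math
import numpy as np

# Fabricated illustration data: per-visitor revenue for two variations.
rng = np.random.default_rng(7)
control   = rng.normal(loc=10.0, scale=3.0, size=500)
treatment = rng.normal(loc=10.6, scale=3.0, size=500)

def welch_t_test(a, b):
    """Two-sample Welch's t statistic and a two-sided p-value.

    With hundreds of samples per arm, the t distribution is close enough
    to normal that we approximate the p-value with the normal CDF."""
    var_a = a.var(ddof=1) / len(a)
    var_b = b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / math.sqrt(var_a + var_b)
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided, normal approximation
    return t, p

t_stat, p_value = welch_t_test(treatment, control)
print(f"estimated lift: {treatment.mean() - control.mean():.2f}, p ≈ {p_value:.4f}")
```

The same analysis shape applies whether the arrays come from Optimizely’s enriched events, a warehouse query, or a third-party analytics export; only the data-loading step changes.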
Optimizely Data Lab will be entering private beta soon, which we’ll begin expanding over the next few months. We’re taking this approach because, just like many of our customers, we’re learning too, and we’d like your help. You can contact us here today and we’ll let you know as soon as we’re ready to onboard you.
Stay tuned, because we’re just getting started.