I'm the founder of Encore, which alerts executives to top opportunities and threats via machine learning (500 Startups company, acquired by Meltwater). I'm sharing tactical lessons I've learned about building companies, products, and teams.

003 Before You Build, Be the Wizard of Oz

"Are you sure the usage numbers are correct?!"

Felipe, Tammy, and I had just spent seven weeks building a prototype of a new platform we thought would be pretty cool--something that collected Instagram photos related to a certain brand, helped the marketer curate those photos into a library, and promoted the best ones as testimonials. In our initial interviews, most of the marketers we had talked to thought it was a great idea--so naturally, we promised them we would build a prototype and come back to them in a few weeks to try it out.

We poured countless hours into making sure the initial prototype worked smoothly and reflected the key value proposition--I remember my eyes lighting up as I saw the first photos flow into the platform, and I knew in my gut that these marketers would no doubt feel the same.

Per usual, I was dead wrong.

When we looked at the behavioral analytics on the prototype, the dark truth descended upon us: hardly anyone was actually using it. Worst of all, most of the logins we did see happened just minutes before our follow-up meetings. They weren't finding the prototype useful to their daily work at all... they were just being nice to us!

After our failed experiment, my co-founders and I threw out our prototype and went back to the drawing board. What could we do differently this time?

My Favorite Kind of Experiment

In my last post, I mentioned that building a simple minimum viable product (MVP) is one of the best ways to gather traction (and potentially find the co-founder of your dreams). But there's a specific type of test that you can (and I would argue, should) run before even building your prototype: the Concierge / Wizard of Oz MVP.

A Concierge MVP basically just replaces any automated parts of your product with human work. Every customer gets an end-to-end white glove treatment. A Wizard of Oz MVP does the same, but adds a technical-looking user interface to hide the fact that humans are working behind the scenes. Either way, this type of MVP can be one of the most effective ways for you to validate your problem and solution. And best of all, you don't need to be technical--anybody can do it.

With our accelerator's Demo Day only six weeks away, however, we didn't have time to code something else just to find out later that it also sucked. Here's how we executed this test with our newest idea:

The New Idea

We had learned that marketers were overwhelmed by social media--it was almost impossible for them to keep up with everything going on around their brand and customers. However, we knew there were a handful of big testimonials and crises each day that they really needed to keep an eye on--the kind of thing that would get them in trouble if they missed it. Our hypothesis was that these marketers would find it useful to get an alert whenever one of these needle-moving events happened on social.

Our Wizard of Oz MVP

To test this hypothesis, Tammy, Felipe, and I set up social media streams (in Hootsuite) around some of the same beta testers' brands, then sat around a table all day--manually reading thousands of tweets, one mind-numbing tweet at a time.

It wasn't that we were masochists (although you may argue, as founders, we basically are). We were specifically looking for testimonials and crises for each brand--the tweets that would be noteworthy, like a major influencer praising their products or somebody reporting a terrible experience--mimicking what the automated version of our platform would do one day. When we found an interesting tweet, we manually embedded the tweet into a MailChimp template we had manually designed, and we'd manually send the official-looking alert to the marketer:

One of the OG manual alerts.

I'm repeating the word manually just to emphasize how freaking annoying and tedious this whole process was. Our testers probably thought it was all automated. But that's the point--if you're truly solving a need, customers initially really don't care if your solution is completely automated or powered by elves (or co-founder slave labor).
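
For the technically curious, here's roughly the screening logic we were simulating by hand each day--a minimal illustrative sketch in Python. The thresholds, word lists, and tweet fields are all hypothetical stand-ins, and none of this code existed at the time (that was the whole point):

```python
# A rough sketch of the manual screening loop. Every threshold, word list,
# and tweet field here is a hypothetical stand-in, not real product logic.

NEGATIVE_WORDS = {"terrible", "worst", "refund", "broken", "scam"}
POSITIVE_WORDS = {"love", "amazing", "best", "recommend"}

def is_needle_mover(tweet):
    """True if a tweet looks like a big testimonial or a brewing crisis."""
    text = tweet["text"].lower()
    big_account = tweet["author_followers"] >= 10_000
    praise = any(word in text for word in POSITIVE_WORDS)
    complaint = any(word in text for word in NEGATIVE_WORDS)
    # A major influencer praising the product, or anyone reporting a terrible experience.
    return (big_account and praise) or complaint

def draft_alert(tweet):
    """Compose the alert body (the step we did by hand in MailChimp)."""
    return f"Heads up--this just happened around your brand: {tweet['url']}"

def triage(tweets):
    """Boil a day's worth of brand mentions down to a handful of alerts."""
    return [draft_alert(t) for t in tweets if is_needle_mover(t)]

# Example: one obvious complaint, one piece of noise.
print(triage([
    {"text": "Worst customer service ever, I want a refund",
     "author_followers": 120, "url": "https://twitter.com/example/status/1"},
    {"text": "just had lunch", "author_followers": 50,
     "url": "https://twitter.com/example/status/2"},
]))
```

For those few days, we were the is_needle_mover function.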

The Results

After sending dozens of these alerts one by one, the data we were looking for finally came in. The MailChimp dashboard showed that our beta testers were opening these emails nearly 100% of the time and clicking through on them almost half the time. Keeping in mind that typical industry open and clickthrough rates are pretty low, we were blown away. We had validated that there was some kind of need for alerts like these, with no code and just a couple of days of simulation.
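
If you run a similar email-based test, the engagement math is dead simple to sanity-check yourself. Here's a tiny illustrative sketch--the event log below is made up, so plug in whatever your email tool actually reports:

```python
# Tiny sketch of the engagement math behind an email-based Wizard of Oz MVP.
# The event log here is fabricated for illustration only.

def engagement_rates(events):
    """Return (open rate, click rate) across all alerts sent."""
    sent = len(events)
    opened = sum(1 for e in events if e["opened"])
    clicked = sum(1 for e in events if e["clicked"])
    return opened / sent, clicked / sent

events = [
    {"opened": True, "clicked": True},
    {"opened": True, "clicked": False},
    {"opened": True, "clicked": True},
    {"opened": True, "clicked": False},
]

open_rate, click_rate = engagement_rates(events)
print(f"open rate: {open_rate:.0%}, click rate: {click_rate:.0%}")
# The question that matters: are these rates meaningfully above your industry's
# typical email benchmarks, and do they hold up as you keep sending alerts?
```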

Key Things to Know

1) Especially if you're thinking about a software idea, consider that most software exists to automate something that would take a human a really long time to do. Almost every idea can be tested with a Concierge / Wizard of Oz MVP first. In some rare cases, though, you can't--your solution may do something that is humanly impossible (in scale, in speed, in processing power) or may fundamentally rely on a context that is difficult to simulate (like virtual reality).

Try to think about how you can break down the automated parts of your solution and simulate them manually. You can try doing it yourself or hiring some outsourced labor to help you out if you can provide them with a playbook. Pay special attention to the touchpoints with your customers--how can you replicate those interactions using email/SMS or face-to-face conversations? Be sure to measure the most important metrics that show true engagement/fulfillment of need, and also collect qualitative feedback from your users.

And finally, be sure to leave room for failure--that is, don't lead customers too hard toward the results you want. For example, there was a real chance that our manual email alerts would see open or click rates drop as we sent one person many emails--which would have been a sign that something was wrong.

2) These types of MVPs are far more effective at validating ideas than customer interviews. Most potential customers will say nice things to make you feel better about your idea, but it's almost impossible to fake genuine engagement with your Concierge / Wizard of Oz experience.

Real talk: most entrepreneurs--especially technical ones--don't want to do a Concierge / Wizard of Oz MVP because they think it's beneath them. After all, if you have the coding skills to build a prototype, why sit there and do something so tedious and non-scalable? The answer is that these MVPs are one of the fastest and most effective ways to get your idea in full contact with the market and gather honest feedback. Most entrepreneurs would rather spend months coding in a cocoon than risk getting their feelings hurt. I don't know about you, but I'd much rather have a validated concept that I can confidently build toward than waste weeks or months building a prototype that nobody will ever use.

Have you ever created a Concierge MVP? Or are you thinking about creating one? Share it with me in the comments below!
