Last time I shared my conversion dashboard and promised some numbers. I don’t have all the numbers yet, but I have enough to start identifying some actionable next steps.
First, some tools discussion is in order.
I had been using Mixpanel and evaluating KISSmetrics for my metrics data. While I still see lots of possibilities using Mixpanel for application level metrics (e.g. what features are really getting used), I find KISSmetrics a lot more aligned with my goals for conversion metrics.
Here’s why… First, some common goodness between the two: both are near-real-time, both are event-driven, and both have the ability to define custom properties on events like plan type, operating system version, browser version, etc.
But here’s where KISSmetrics really shines:
Ad-hoc funnel reports
While both Mixpanel and KISSmetrics use events to construct funnel reports, Mixpanel assumes all funnels are linear and fixed. You have to pre-define the exact sequence of steps upfront, then hardcode them into your pages.
For example, to record a “Created Gallery” event during a “Signup Flow” for funnel analysis, you would generate an event like this:
mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');
where 3 identifies the third step in your funnel. Pre-defining funnels like this is fragile. If the event occurs out of order, it isn’t counted. If you need to add another event to uncover more detail, you have to touch all the events that follow it in the flow.
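To make the fragility concrete, here’s a sketch of what inserting a step forces you to do. The `mpmetrics` object is stubbed out below so the snippet runs standalone; in real pages it comes from Mixpanel’s tracking snippet.

```javascript
// Stub of the mpmetrics object so this sketch runs standalone;
// the real object is provided by Mixpanel's tracking snippet.
var calls = [];
var mpmetrics = {
  track_funnel: function (funnel, step, name) {
    calls.push([funnel, step, name]);
  }
};

// Original funnel: "Created Gallery" is hardcoded as step 3.
mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');

// After inserting a new step (say "Downloaded App") ahead of it,
// every later call's hardcoded step number must be edited by hand:
mpmetrics.track_funnel('Signup Flow', 4, 'Created Gallery');
```

Every page that fires a downstream event has to be touched, which is exactly the maintenance burden described above.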
KISSmetrics on the other hand simply collects events that can be later assembled into one or more ad-hoc funnel reports.
You would code the same “Created Gallery” event as:
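A minimal sketch of the equivalent call, assuming the standard KISSmetrics JavaScript snippet, which queues commands on a global `_kmq` array:

```javascript
// The KISSmetrics snippet defines a global _kmq command queue;
// declared here so the sketch runs standalone.
var _kmq = _kmq || [];

// Record the event -- no step number, no funnel name, just the event.
_kmq.push(['record', 'Created Gallery']);
```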
You don’t need to identify a step number or funnel for this event. This sort of late composition really decouples raw event data from how it is eventually used in ad-hoc reports, which is incredibly flexible.
What I also found, a little to my surprise, is that CloudFire’s conversion funnel is linear but not fixed. People generally move from the top of the funnel toward the bottom, but they can jump in at multiple points. I’ve had users click through directly to the signup page from another site without going through the landing or pricing pages at all. This wouldn’t get recorded at all in a Mixpanel-style pre-defined funnel since it wouldn’t start at Step 1.
Ability to identify users
Another challenge when collecting metrics is tracking users consistently across sessions. Most analytics tools use unique cookies, but those break down across browsers or multiple computers. CloudFire, being a downloaded p2web app, has the additional requirement of multiple-domain support. KISSmetrics offers a simple (almost too simple) solution for tracking users across any of these scenarios.
Before a user’s identity is known, such as pre-signup, KISSmetrics tracks users with a unique cookie (like other analytics tools). But once the identity is known, such as post-login, you can call an identify method and tell it exactly who the user is, using a persistent identifier such as a username or email address. All pre-login data is then merged with the post-login data into a single record.
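A sketch of that identify call, again assuming the standard KISSmetrics `_kmq` command queue (the event names and email address are illustrative):

```javascript
// The KISSmetrics snippet defines a global _kmq command queue;
// declared here so the sketch runs standalone.
var _kmq = _kmq || [];

// Pre-signup: events are tied to an anonymous cookie ID.
_kmq.push(['record', 'Viewed Pricing Page']);

// Post-signup: tie the anonymous ID to a persistent identifier.
// Earlier anonymous events get merged into this user's record.
_kmq.push(['identify', 'user@example.com']);
_kmq.push(['record', 'Signed Up']);
```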
This way, events are tied to actual people and are much more meaningful.
I really like the 1-page report visualization which you’ll see a little later. I do, however, wish they made their dashboard a little more useful.
Mine currently looks like this:
You might recognize it as my AARRR conversion dashboard. All it’s missing are conversion percentages next to each funnel.
On to the numbers
As I mentioned last time, I am laying the foundation for measuring all the metrics but focusing initially on optimizing only Activation and Retention. As each of the AARRR metrics is itself a sub-funnel, I decided to break them out into separate funnel reports rather than build one giant report that would be a nightmare to maintain in the long run.
The ability to create ad-hoc funnel reports, discussed earlier, allows me to start measuring everything at the macro level and add more detail to drill into the sub-funnels as needed for optimization.
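The late-composition idea can be sketched in a few lines: collect raw events with no funnel information, then assemble a report afterwards by counting distinct users at each step of a funnel defined after the fact. The event names, users, and funnel definition below are illustrative, not CloudFire’s actual data.

```javascript
// Assemble an ad-hoc funnel report from a raw event log:
// count distinct users seen at each step, in the order the
// steps are listed -- the funnel is defined at report time.
function funnelReport(events, steps) {
  return steps.map(function (name) {
    var users = new Set();
    events.forEach(function (e) {
      if (e.event === name) users.add(e.user);
    });
    return { step: name, users: users.size };
  });
}

// Illustrative event log: user 'b' jumps straight to sign-up
// (a linear but non-fixed funnel) and is still counted there.
var events = [
  { user: 'a', event: 'Visited Landing Page' },
  { user: 'a', event: 'Viewed Pricing' },
  { user: 'a', event: 'Signed Up' },
  { user: 'b', event: 'Signed Up' }
];

var report = funnelReport(events,
  ['Visited Landing Page', 'Viewed Pricing', 'Signed Up']);
// report: step counts of 1, 1, and 2 users respectively
```

Because the funnel definition lives in the report rather than in the pages, adding or reordering steps never requires touching the event-collection code.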
I define Acquisition (user engagement) as a visitor who doesn’t abandon and visits the pricing page (usually from the landing page). I am collecting KISSmetrics events on my landing and pricing pages and have created a funnel report that looks like this:
A 35% conversion (or 65% bounce rate) isn’t particularly great but it’s good enough to drive meaningful traffic to validate my MVP for now (scaling comes after product/market fit). Apart from my earlier adventures in SEM, I have not spent any more money on paid channels, and am instead investing time building up some viable free channels (SEO, blogs) in parallel.
I define Activation (happy first-user experience) as a sub-funnel made up of the following steps: Sign-up, Download App, Create First Gallery.
The first 2 steps occur on the CloudFire website, while the last is done in the downloaded application. Since I just started using KISSmetrics, I haven’t finished integrating it with the app and am relying on an earlier custom database report I created to measure “Created First Gallery”.
Here’s what the KISSmetrics Activation funnel looks like without the last step:
You’ll see what I meant by CloudFire’s conversion funnel being linear but non-fixed: 11 new people viewed the sign-up form, and 8 new people downloaded the app without starting from the top of the funnel.
Supplementing, as closely as possible, with my own custom report for the number of users that successfully “Created First Gallery” lowers the Activation conversion rate from 11.5% to 5.3%.
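A quick back-of-envelope check on those two numbers shows just how leaky the final step is:

```javascript
// Overall Activation conversion, from the figures above.
var withoutLastStep = 0.115; // Sign-up + Download App only
var withLastStep = 0.053;    // including Created First Gallery

// Implied conversion of the final step alone: of the users who
// downloaded the app, only about 46% created a first gallery.
var lastStepRate = withLastStep / withoutLastStep;
```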
This is where I currently am on the numbers. I will be finishing up data collection for Retention, Referral, and Revenue this week but just the Activation numbers already reveal a number of potential hot spots.
Normalized Conversion Dashboard
I like to visualize my conversion funnel as a percentage of total visitors and have normalized the numbers to reflect that. I’ve also blown up the Activation row to show the full Activation sub-funnel since I’m highly motivated to optimize that right now. As I don’t have the other numbers yet, I’m not bothering with showing them for now.
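The normalization itself is simple division; here’s an illustrative sketch (the counts below are made up for the example, not CloudFire’s actual numbers):

```javascript
// Express each funnel step as a percentage of total top-of-funnel
// visitors, rounded to one decimal, so dashboard rows are comparable.
function normalize(counts, totalVisitors) {
  return counts.map(function (c) {
    return {
      step: c.step,
      pct: Math.round(1000 * c.count / totalVisitors) / 10
    };
  });
}

// Illustrative counts out of 1,000 visitors.
var rows = normalize(
  [{ step: 'Acquisition', count: 350 },
   { step: 'Activation', count: 53 }],
  1000
);
// rows: Acquisition at 35%, Activation at 5.3%
```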
These numbers immediately indicate that the Activation process is NOT healthy.
While all three steps exhibit leaky buckets, of particular concern to me was losing more than half of the people who chose to download the app but couldn’t successfully finish the first task of creating a gallery. That’s where I decided to start.
It’s not that the other steps aren’t as important, but they seem more like an optimization problem (pricing, sign-up form, copy, etc.) than a fundamental failing of the MVP (software) itself. Plus, people who made it to the last step successfully navigated the previous steps, so there is some added rationale in starting from the bottom of the funnel.
Despite there being an easy way to contact us (800 number, email, GetSatisfaction) on every page of the sign-up process, most people did not contact us when things went wrong, which placed the burden of figuring out what went wrong on us.
The first step was being able to identify users who downloaded the app. I used to let users download the application from the website and complete the signup process from within the app. The idea was to reduce friction and make the app self-contained so it could be distributed from other websites (like, say, download.com). However, if a user had a problem with the installation, we had no way of knowing who they were. So I reordered the flow so that users create an account on the website first, then download the app. That way we have their email address and can contact them if needed.
The second step was identifying, as quickly as possible, users who ran into an issue. It was fairly easy to construct a report that found users who signed up but didn’t finish creating their first gallery within a reasonable timeframe. I sent them all personalized emails (this has since been automated) and, happily, many replied. Some downloaded the installer but didn’t know to run it (how do I fix that?). Others had issues with the installation itself, which they shared. No one, so far, has had issues creating a gallery once the app was installed and launched.
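A hypothetical sketch of that stalled-user report. The field names and the 48-hour cutoff are illustrative assumptions, not CloudFire’s actual schema:

```javascript
// Find users who signed up more than 48 hours ago (assumed cutoff)
// but never created their first gallery -- candidates for a
// personalized follow-up email.
function stalledUsers(users, now) {
  var cutoffMs = 48 * 60 * 60 * 1000;
  return users.filter(function (u) {
    return !u.firstGalleryAt && (now - u.signedUpAt) > cutoffMs;
  });
}

// Illustrative data: one stalled user, one activated, one too recent.
var now = Date.now();
var list = stalledUsers([
  { email: 'a@example.com', signedUpAt: now - 3 * 86400000, firstGalleryAt: null },
  { email: 'b@example.com', signedUpAt: now - 3 * 86400000, firstGalleryAt: now - 86400000 },
  { email: 'c@example.com', signedUpAt: now - 3600000, firstGalleryAt: null }
], now);
// list contains only a@example.com
```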
Downloaded apps, in general, are a challenge. The desktop, with multiple operating system versions, java versions, anti-virus programs, NATs and firewalls, is a pretty hostile environment for a new networked application.
One of the issues I uncovered involved a nasty shortcut Apple took when force-migrating everyone from 32-bit Java 5 to 64-bit Java 6 in Snow Leopard, using a symbolic link pointing Java 1.5 -> Java 1.6 (WTF!). This broke CloudFire. Fixing it actually required upgrading a third-party component (Eclipse), which in turn required rewriting the software update process (now using P2/OSGi) and my continuous deployment process (future post). Other issues had to do with 64-bit versus 32-bit on Windows and bad pre-existing Java installations.
What I’ve found is that, in the end, users WILL encounter unanticipated problems, because you can only test so many desktop/browser configurations (until you can afford to run all of them). The key is to identify users who run into problems as quickly as possible and then engage them directly with an offer of help, gift certificates, extended trials… whatever it takes to get them to talk to you, as they hold the answers (actually, they hold the problems; it’s up to you to uncover the answers).
I’ve also started running some more usability tests on the download process, which haven’t revealed anything as significant as the issues already uncovered, so maybe I’ll start seeing some improvement in those numbers soon.
Next up: completing the rest of the conversion dashboard, prioritizing other areas in Activation/Retention that need addressing, A/B testing, usability testing, and customer follow-up interviews.